Revolution Tech Details Emerge (Xbox1+ performance, 128 MB RAM)

fearsomepirate said:
Like someone else said...feature-rich, just not a huge performer. So hopefully it can do all kinds of crap, just not at huge speeds, or fast enough to consistently run 720p games at 30fps or better.
But that requires power. Let's say you compare performance on programmable FLOPS, and Rev reaches 1/5th of XB360. Now let's say they've got all sorts of hard-wired functionality that adds enough extra grunt that the output is comparable. That means the Rev hardware is as powerful. Saying the Rev hardware is less powerful, to me, means it's got less capacity to process data, whether with fixed-function units or programmable ALUs of various sorts. E.g. hardware support for HOS (which Nintendo's traditional cartoony style would benefit from greatly, I guess) is adding power to the console. Unless 'power' is only measured in FLOPS and vertices per second and shader ops per second, and Nintendo can get extra functionality without it being counted as any of these things, 2-3x the power of GC should be... what, a console that costs at least half as much as GC initially cost? Though of course it would be small and cool.
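To make the FLOPS-versus-fixed-function argument concrete, here's a toy calculation. Every number in it is made up purely for illustration; none of them are real console specs:

```python
# Toy numbers only -- illustrating the "fixed function adds effective
# power" argument, not real console specs.
x360_programmable_gflops = 100.0                        # assumed baseline
rev_programmable_gflops = x360_programmable_gflops / 5  # the "1/5th" figure

# Suppose hypothetical hardware HOS/tessellation expands geometry 4x
# on-chip, doing work the programmable units would otherwise have to do:
fixed_function_multiplier = 4

rev_effective_gflops = rev_programmable_gflops * fixed_function_multiplier
print(rev_effective_gflops)  # 80.0 -- much closer to parity than raw FLOPS suggest
```

The point of the sketch is just that a raw-FLOPS comparison undercounts whatever the fixed-function units contribute; whether the multiplier is anywhere near 4x is exactly what nobody knows yet.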
 
Relax; what does this article actually say? It has a quote about Revolution not being much more powerful than Xbox, which is quite subjective. It says it's sort of like a souped-up Xbox, but how souped up? Also, "souped up" generally means the same or very similar hardware but faster, and I can't see Revolution using very similar hardware to Xbox, so this comment is a little strange.

It sounds like this developer is basically saying that Revolution is more powerful than the most powerful last-generation console but less powerful than 360, which is hardly a surprise. He may also just be basing this on performance numbers from Nintendo, as the article suggests (developers being briefed by Nintendo on Revolution's overall power). In which case, let's not forget the "6-12 million polygons per second" number for GameCube, which at the time (well before GameCube's release) made it sound many times weaker than PS2.

The water is further muddied when the article explains that most developers are currently using the first Revolution dev kit (a modified GC with extra RAM) and even the lucky few who aren't still have incomplete development kits. I mean, how much more powerful was 360 compared to the first 360 development kit?

Wow, these specs are a LOT lower than I (or anyone else here it seems) was expecting.

There really aren't any specs there. The only actual spec rumoured here is the amount of main RAM. But we can't put too much stock in that right now, considering most developers out there are using the very first Revolution dev kit (a modified GameCube with more RAM) and even the lucky few who aren't are still using incomplete kits.

I'm glad something is being said by developers though, it sounds like some actual specific specs might start to leak out soon :)
 
Last edited by a moderator:
Guden Oden said:
You're basing this on your own imaginary tea-party figures out of Alice in Wonderland, I have to assume. There was a cost breakdown done by a consulting firm that specializes in that kind of thing; the GPU was estimated at $120+ and the CPU at $100+, as I recall. The RAM chips alone were over $60.

Exactly. Not that bad. The figure I recall for X360 was $375 total system cost. Yes, I know some other outfit claimed $500+, but I don't believe them, for good reasons. $375 doesn't seem unreasonable to me for Nintendo.


Accessories? What accessories? The core system doesn't have any (that wouldn't be included with any other console), and of the ones included in the premium system only the hard drive has any real, substantial cost, and even that doesn't come close to covering the $100 price premium; the hard drive is a god damn cheap-ass SAMSUNG unit. The headset is as cheap and flimsy as they come, for example, and the controller, while wireless, doesn't represent any real dollar manufacturing cost either. You can buy wireless mice for peanuts these days.

MS steadily and consistently raised the price of accessories in a very diabolical way. Where before you got the HD free, and it cost MS say $35, now you don't get it free in the core pack. Not only that, now you must buy the $40 memory card to save games, which is basically pure profit, and MS doesn't allow third-party memory cards. They just created a $75 swing right there on every single Xbox. Or you can now pay an extra $100, through the premium pack or otherwise, for the $35 HD you used to get free. All the system accessory prices were raised too; the HD cables are $40, etc. And third parties now pay royalties on accessories.
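Spelling out the "$75 swing" arithmetic, with the caveat that the $35 and $40 figures are the estimates in the post above, not confirmed bill-of-materials costs:

```python
# Rough per-console arithmetic behind the "$75 swing" claim.
# The dollar figures are the poster's estimates, not confirmed costs.
hd_cost_saved = 35        # HD no longer bundled in the core pack
mem_card_price = 40       # memory card now required to save games
swing_per_console = hd_cost_saved + mem_card_price
print(swing_per_console)  # 75
```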

You add it all up and see that, beyond reduced hardware costs, that's the strategy.

You've been disputed on this count a number of times by actual game developers. Face it, you simply have no clue. MS went TO eDRAM because unified memory was a troublesome design in the original Xbox, as it was in the N64 before that. It's cheap and easy to design, but performance is not particularly consistent, as MS (re)discovered. eDRAM may be limited in space, but performance can be there in oodles. Transparencies forcing read-modify-write accesses kill performance on any design using external RAM, but PS2 with its triple-ported memory barely slows down at all. You could do 20 fullscreen smoke polygon layers on top of each other running at 60 frames/sec and you'd never notice it framerate-wise. That's the power of eDRAM.
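A back-of-envelope check on that 20-layer transparency example. I'm assuming an SDTV framebuffer and 32-bit colour, and counting only colour read-modify-write traffic (no Z, no texture reads):

```python
# Framebuffer traffic for 20 full-screen alpha-blended layers at 60 fps.
# Each blended pixel is a read-modify-write: one read plus one write.
WIDTH, HEIGHT = 640, 480   # assumed SDTV framebuffer
BYTES_PER_PIXEL = 4        # assumed 32-bit colour
LAYERS = 20
FPS = 60

bytes_per_frame = WIDTH * HEIGHT * BYTES_PER_PIXEL * LAYERS * 2  # read + write
gb_per_sec = bytes_per_frame * FPS / 1e9
print(f"{gb_per_sec:.2f} GB/s")  # ~2.95 GB/s of colour traffic alone
```

Against the GS's 48 GB/s of eDRAM bandwidth that barely registers, while on a shared external bus the same traffic has to compete with Z, texture, and CPU accesses, which is the point being made above.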

But after all that, Sony decided against it for PS3. Also, PS2 was the weakest last-gen machine. Also, eDRAM has no computation capabilities at all in and of itself. It is RAM. It can only hold data placed in it, like any RAM. Any possible benefits are bandwidth-savings related only.


Well, considering you still live with your parents and don't have to pay any bills (let alone raise any little kiddies), most people aren't made of money. If Rev is sub-$200 (or even sub-$150) and PS3 is $400 or more, that automatically puts PS3 out of reach of a lot of people. Lucky for us you're not the one calling the shots at any of these companies, though. ;)


Well it's pointless till we have a couple years of sales figures in for PS3 and Revolution, but I'm nearly 100% confident I am correct.
 
Consider me well and truly not shocked. They're banking a lot on that controller, good luck to 'em.

Certainly seems like these devs that IGN are supposed to have talked to don't think that GC is as powerful as Xbox. Settles that question at least (was settled a few years back for most people to be honest).*

*[edit] Point being, this isn't really a big change of strategy for Nintendo, it's just a more visible exaggeration of it in the home console space. Nintendo have always sold their consoles so they can sell their games.[/edit]
 
Well it's pointless till we have a couple years of sales figures in for PS3 and Revolution, but I'm nearly 100% confident I am correct.

Of course you are, a lack of confidence has never been your problem.
 
I don't care what the article says; no way this thing is only a bit more powerful than the Xbox. You could throw a 128 MB GeForce Ti 4200 in there and that would probably own the Xbox GPU, and Hollywood is gonna be way better than that. And how slow can the CPU be? They could get away with a single-core version of the 360 processor, maybe make it in-order instead of out-of-order, and that would own the Xbox CPU by a landslide.

This thing is going to be a lot better than you think.
 
Zeross said:
For the PS3 Sony is using OpenGL ES, but where did you read that developers could program the GPU down to the metal? In fact I've read quite the opposite:

John Carmack sucks. I stopped listening to his bullshit when he started to complain about multi-core CPUs and how hard they are to program. He should shut the hell up and do what every PS2 dev did... just get on with it and learn.
 
fearsomepirate said:
Like someone else said... feature-rich, just not a huge performer. So hopefully it can do all kinds of crap, just not at huge speeds, or fast enough to consistently run 720p games at 30fps or better. I get the impression that Nintendo designs hardware with the philosophy "What's the minimum amount of junk we can put in this box to get this IQ if we design it right?", while the other companies are more like "What's the maximum amount of junk we can put in the box on this budget?" So hopefully the hardware will have a few tricks up its sleeve.

Haven't Ninty already stated they're not going for HD resolution???
 
Really hard to believe, IMO.

A custom ~970FX (58M transistors) + a custom X1300 (~100M) would kill it (even a Gekko at 2 GHz with VMX plus a 9600 would) and would be really cheap. Plus the console needs to process all the data from the remote, so even more power is needed.

IMO they are talking about the early dev kits, and they don't even know it.
 
Bill said:
But after all that, Sony decided against it for PS3. Also, Dreamcast was the weakest last-gen machine. Also, eDRAM has no computation capabilities at all in and of itself. It is RAM. It can only hold data placed in it, like any RAM. Any possible benefits are bandwidth- and latency-savings related only.

Fixed for accuracy. You act like bandwidth is immaterial. But the effects Guden listed (such as layered transparencies) eat bandwidth like a mofo and need as little latency as possible, which was basically his point. Having 3x or more the main bandwidth all to yourself for doing framebuffer effects and transparencies isn't anything to sneeze at.
 
fearsomepirate said:
Fixed for accuracy. You act like bandwidth is immaterial. But the effects Guden listed (such as layered transparencies) eat bandwidth like a mofo and need as little latency as possible, which was basically his point. Having 3x or more the main bandwidth all to yourself for doing framebuffer effects and transparencies isn't anything to sneeze at.

To put it simply: computational power is only good if you have something to use it on. Without eDRAM you may have nothing to calculate, so the power is wasted, which is the same as not having that power at all.
 
Sounds like a machine designed to have all the graphical eye candy running at non-HD resolutions and cost less than $199 (and be profitable to boot). Nintendo's strategy is clear, and on SD televisions my guess is it will be much more difficult to differentiate the Rev from the PS3/360. Let's see if it works in Nintendo's favor.
 
Fafalada said:
What makes you think they can?
Isn't it usually the case that on a console devs are free to hack the hardware directly? This is where the best devs get much of their performance over rivals, isn't it? I can understand that changing, though, if these companies are serious about long-term backwards compatibility. A forced interface would make that a lot easier.
 
Fafalada said:
What makes you think they can?
Who where when what?

If you're talking about my guess on the revolution, well... It's just a guess.

Now, about the X360/PS3 GPUs: last I heard, developers have access to the GPU that isn't as restrictive as it was (officially) on Xbox.
I'm not saying the machines are as documented and as low-level-coding-friendly as they could possibly be, just that it's not "limited" to DirectX or OGLES calls only. At least they shouldn't be, if I understood things correctly*.

*It's not impossible that I understood things incorrectly. But I will never, I say never, admit it!
 
In 1999, Sony showed PlayStation 2's technical specs and Nintendo announced Project Dolphin with better specs than PS2.

PlayStation 2 was the equivalent of a Pentium III with a DirectX 6.0 video card (Riva TNT2, Voodoo3, Matrox G400...) but with geometry power eight times superior.

GameCube appeared on the market in 2001 with a GPU that has the power of a GeForce2 Ultra, when the GeForce3 was already on the PC market.

But a lot of Nintendo ******s make stupid comparisons between RE4 on GCN and RE4 on PS2, and when you show them games like Conker, Far Cry Instincts, Ninja Gaiden, Riddick, Dead or Alive Ultimate and others, they look the other way while typing in the forums "GCN and Xbox are equal in power". You cannot show them the truth because they will just look elsewhere.

Now Nintendo has owned its most stupid fans, and they did it at the last E3 press conference: a press conference for the press and the analysts, never for the megatonian fans of Nintendo.

Please, ******s, stop being ******s over material things and start being normal players who just enjoy videogames. Nintendo only wants to sell its products, and it has seen that you are the minority here; casual gamers and non-gamers are more numerous than you.
 
Synergy34 said:
I don't care what the article says; no way this thing is only a bit more powerful than the Xbox. You could throw a 128 MB GeForce Ti 4200 in there and that would probably own the Xbox GPU, and Hollywood is gonna be way better than that. And how slow can the CPU be? They could get away with a single-core version of the 360 processor, maybe make it in-order instead of out-of-order, and that would own the Xbox CPU by a landslide.

This thing is going to be a lot better than you think.

I somewhat agree with this, and want to believe it.

As for an Xbox-level GPU, an NV25 (~GeForce4 Ti 4200) or R200 (~Radeon 8500) would equal and slightly beat the NV2A. So if IGN's article is to be taken at face value and is really accurate (which is somewhat in doubt), then the current Revolution kits don't even have GPUs as powerful as the R300 (~Radeon 9700), the R300 being several times more powerful than Xbox's NV2A.

R300 is the minimum standard I'd expect from Hollywood.
 
In 1999, Sony showed PlayStation 2's technical specs and Nintendo announced Project Dolphin with better specs than PS2.

I don't agree there; most of PS2's specs were better than GameCube's, GameCube just used its power more efficiently.

Incidentally, I wonder what a developer would have said about GameCube vs PS2 and Xbox had they not used the real system and taken Nintendo's "6-12 million polys per second" spec to heart :)
 
!eVo!-X Ant UK said:
John Carmack sucks. I stopped listening to his bullshit when he started to complain about multi-core CPUs and how hard they are to program.

Well, maybe it's because he at least has some experience with multithreaded games :rolleyes:

He should shut the hell up and do what every PS2 dev did... just get on with it and learn.

I think that YOU should learn before telling other, obviously more knowledgeable, people to shut up: the guy has worked on a lot of machines, even the infamous Jaguar with its one CISC CPU and two custom RISC cores, and he is one of the few on the PC to have threaded his engine, so who are you to judge him?

Even on this board a lot of valuable members (Deano, ERP and others) have expressed their concerns with these new CPUs. History has shown us that consoles with multiple processors were not particularly successful (Jaguar, Saturn) when they had to face fast single-processor machines. PS2 is more the exception than the rule; moreover, VU1 was more a back end for the GS than anything else, a kind of super vertex shader, and VU0 remained largely under-utilised based on Sony's data. Today it's becoming harder to improve the performance of single-threaded applications, and that's why there is a switch to multicore, not because multicore is a silver bullet.
 
Megadrive1988

One thing to bear in mind from that article:

Still, the studios we spoke with are still very intrigued by Revolution and are not ruling out the possibility of additional graphic horsepower. No developer that chatted with us had, or was willing to share, details on the console's GPU, Hollywood. One studio said: "As soon as we find out what it can do then we'll know if Revolution will just be like an Xbox or something a little more."
 