Revolution Price Confirmed (?)

It's not the case that 3x the resolution = 3x the workload. And even if we take it that way, the gap between the supposed Revolution specs and X360/PS3 is still enormous.

It does equal three times the fillrate, three times the pixel shading workload and three times the framebuffer bandwidth. As far as pixel work goes (shading effects etc.), it's three times the workload.
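As a back-of-the-envelope sketch: the clock-linear figures (fillrate, bandwidth) really do scale with the multiplier, all else being equal. The baseline numbers below (Flipper at 162MHz with 4 pixel pipelines) are just the commonly cited GameCube figures, used purely for illustration:

```python
# Illustrative only: per-clock throughput scales linearly with clock speed.
# Baseline figures are the commonly cited GameCube (Flipper) numbers.
BASE_CLOCK_MHZ = 162   # assumed Flipper clock
PIXEL_PIPES = 4        # assumed pixel pipeline count

def peak_fillrate_mpix(clock_mhz, pipes=PIXEL_PIPES):
    """Peak fillrate in Mpixels/s, assuming one pixel per pipe per clock."""
    return clock_mhz * pipes

base = peak_fillrate_mpix(BASE_CLOCK_MHZ)          # 648 Mpix/s
tripled = peak_fillrate_mpix(BASE_CLOCK_MHZ * 3)   # 1944 Mpix/s
print(tripled / base)  # 3.0
```

The same linear scaling applies to framebuffer bandwidth; it says nothing about how the feature set compares, which is the other half of the argument.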

As for those dev kit specs, the bridge between a Pentium 2/Geforce 2 (initial XBox dev kit) and XBox was very big. The difference between a 3.2Ghz G5 and Radeon 9800 (initial 360 dev kit) and the Xbox 360 is also massive.

I believe IGN were referencing final specifications and that this line of thinking is a crutch for some.

Why do you believe that? There's nothing in IGN's article that says any of that info was taken from final specifications. In fact they said the info was taken from the documents that came with the early dev kits. They also said that the developers they'd spoken to admitted to not knowing what Revolution's GPU would turn out to be, so they can't possibly have access to final specs.
 
Last edited by a moderator:
Like I said, it's better to compare the supposed specs to the Xbox, and you can see you won't really be seeing much improvement.

To be honest if you're going to compare those supposed specs to any system then surely it should be GameCube.

What evidence do you have that it's not? I haven't been keeping up on the whole affair but it seemed the original IGN article was pretty decent.

Various people in the industry, basically saying that the system IGN referenced is only being used as a stopgap until they can get real Revolution hardware up and running. As well as logic, really: you don't spend money on developing chips that are the same chips that were developed six years earlier.

I wouldn't call IGN's article decent, more like vague and contradictory.
 
Teasy said:
It does equal three times the fillrate, three times the pixel shader workload and three times the framebuffer bandwidth. As far as pixel work goes (shading effects etc.), it's three times the workload.
In some areas, yeah, but as a sum of its operation, no.

Teasy said:
The bridge between a Pentium 2 and Geforce 2 and a Pentium 3 and Geforce 3 is very big as well, the first was the initial XBox dev kit and the second was the XBox. The difference between a 3.2Ghz G5 and Radeon 9800 and the triple core 3.2Ghz PPE and Xenos GPU is also massive, the first was the initial 360 dev kit and the second is the actual 360.
What is your point? We also had the near-complete specifications of what the final product would be in those cases. Likewise we are being told what the final product of Revolution will contain.

Teasy said:
You believe that based on what? The fact that IGN themselves said the info was taken from documentation that came with early GC-based development kits? There is nothing in IGN's article that says any of that info was taken from final specifications.
IGN was not describing the devkits. They were describing what would be the final Revolution specifications.
IGN said:
In yesterday's article, we wrote that Revolution would include 128MBs of RAM, or possibly less. Developers have clarified the makeup based on officially released Nintendo documentation. Revolution will build on GameCube's configuration of 24MBs 1T-SRAM and 16MBs D-RAM (40MBs) by adding an addition 64MBs of 1T-SRAM. The result is a supply of memory in Revolution that totals 104MBs. That number does not consider either the 512MBs of allegedly accessible (but hardly ideal) Flash RAM or the Hollywood GPU's on-board memory, said to be 3MBs by sources.

Revolution's Broadway CPU, developed by IBM, is an extension of the Gekko CPU in GameCube, according to official Nintendo documentation passed to us by software houses.

[...]

Asked if it was developing for Revolution, one major third party source said that it was well past the experimental stage and was evaluating what types of games might work on the platform. "We are looking at it quite differently. It's like another current generation platform for us. But it's such a nice controller that it opens up a lot of possibilities. It's very different and it's very precise."
The bolded passages all describe what the Revolution will be, not the state of the current devkits.
 
In some areas, yeah, but as a sum of its operation, no

In the majority of areas that affect graphics quality.

What is your point? We also had the near-complete specifications of what the final product would be in those cases. Likewise we are being told what the final product of Revolution will contain.

Then why do we have quotes anywhere from 88MB to 128MB of system RAM? Why do we have developers saying that they have no idea what Revolution's GPU is? We don't really have so much as a single clear specification, which is pretty amazing if these early dev kit documents list the system's final specs.

I'll just backtrack for a second because I'd really like you to answer this bit specifically. How can these documents contain the final system specs when developers have admitted to not knowing what the GPU will be capable of?

The bold all describe what the Revolution will be, not the state of the current devkits

The comments you've bolded only say what the Revolution will be based on the info IGN received. Where does it say that info is actually final system specs? It doesn't. IGN even said that the info was taken from the documents that came with what was described as "early GameCube-based development kits".

Here's a quote from the same IGN article:

Readers are advised to make two notes before continuing with this article. The first is that developers are still working with incomplete Revolution hardware. Most studios are, in fact, developing on "GameCube-based kits," according to major software houses we spoke to, which have asked to remain anonymous. The second is that developers are still without final specifications for Revolution's ATI-developed graphics chip, codenamed Hollywood.

They go on to say that these developers have been "partially briefed by Nintendo". But what that entailed exactly is anyone's guess. One thing's for sure: final specs can't have been part of the briefing, or they would know about the GPU.
 
Teasy said:
But we aren't. Where are these specs then? Why do we have quotes anywhere from 88 to 128MB of system RAM? Why do we have developers saying that they have no idea what Revolution's GPU is? Does that not tell you that Nintendo have not included final specs in these early development kit documents?
They said they have expectations. Nintendo has to inform developers about what visual range to target. They can't have developers making PS3 graphics for the Revolution.

Teasy said:
Again, based on what? You've bolded a couple of comments saying what Revolution will be based on the info IGN received. Where does it say that info is actually final system specs? It doesn't. In fact, IGN said that the info was taken from the documents that came with what was described as an "early GameCube-based development kit".
No, they only say the documents came from Nintendo, and were given to developers as they received developer kits.

In fact, the article says that developers have final specifications for everything except the Hollywood GPU, and expectations are it's little more than an extended version of Flipper.


Teasy said:
You keep saying that this info is describing the final system specs and not the dev kits they have at the moment. But there is nothing to suggest that, in fact everything says otherwise.
Nothing suggests otherwise. Nothing. That IGN article, from the title to the content, is suggesting final, production model Revolution specifications. The word "kit" is mentioned only once, and that is in the context described above.
 
Nintendo has to inform developers about what visual range to target. They can't have developers making PS3 graphics for the Revolution.

Developers can't target anything above the development kits they have anyway. All they have to do is develop on the kit they have and upgrade as the kits are upgraded, same as always.

No, they only say the documents came from Nintendo, and were given to developers as they received developer kits. In fact, the article says that developers have final specifications for everything except the Hollywood GPU, and expectations are it's little more than an extended version of Flipper.

What's the difference between a "document from Nintendo given to developers as they received the early dev kits" and a dev kit document then? :)

Also, where does it say they have final specs for everything but the GPU? The article only says they don't have specs for the GPU; everything else is very vague, so who knows how much they actually know. And what happened to this info being final specs? Now it's final specs apart from the GPU, which they expect might be an upgraded Flipper. Not quite the same thing, is it? This proves conclusively that these documents did not include final specs.

Nothing suggests otherwise. Nothing. That IGN article, from the title to the content, is suggesting final, production model Revolution specifications. The word "kit" is mentioned only once, and that is in the context described above.

The article says nothing even close to final production model specs. Nowhere does the article claim that any of this info is final, and right at the start they even warn that it's based on early development systems. I'd have thought that would be enough to tell you this is not final info. But if you can read that article and conclude that it's based not only on current Revolution specs but final production model specs, then I think I've hit a brick wall.
 
Teasy said:
Developers can't target anything above the development kits they have anyway.

Don't see how that can be true. Of course they will target beyond the development kit. They just have to be a bit conservative with extrapolating final performance.
 
Teasy said:
Various people in the industry, basically saying that the system IGN referenced is only being used as a stopgap until they can get real Revolution hardware up and running.

What various people in the industry? Do you mean your "contacts", or other more established people, say of a Faf or Deano level? Just looking to establish credibility here.

Teasy said:
As well as logic, really: you don't spend money on developing chips that are the same chips that were developed six years earlier.

Logic would have dictated that IMRs would have died 5 years ago and the world would now be run by ImgTec.

As to where the money went, I dunno. Good question. None of us know but it's not logical to use the absence of information to create scenarios. That is, it is illogical to use a negative to prove a positive.

Teasy said:
I wouldn't call IGN's article decent, more like vague and contradictory.

Admittedly I didn't pay much attention to it (I actually didn't read much of it) but what was vague or contradictory?

Teasy said:
Developers can't target anything above the development kits they have anyway. All they have to do is develop on the kit they have and upgrade as the kits are upgraded, same as always.

Huh? Why not. We do. If you don't, then all of your work prior to getting final dev kits is wasted because now what do you do with your models that have low res textures or are now too low poly? Recreate them?
 
I agree with Teasy here. I wouldn't quite say that the article was completely contradictory, but vague certainly. Why would Nintendo add 40MBs of ARAM? (Yes, they refer to it as DRAM, which we know the GC did not possess, but it is listed there regardless.) As stated here:

Revolution will build on GameCube's configuration of 24MBs 1T-SRAM and 16MBs D-RAM (40MBs) by adding an addition 64MBs of 1T-SRAM. The result is a supply of memory in Revolution that totals 104MBs. That number does not consider either the 512MBs of allegedly accessible (but hardly ideal) Flash RAM or the Hollywood GPU's on-board memory, said to be 3MBs by sources.

Why allocate so much RAM to audio? 16MB was overkill in the GC, benefiting the system primarily as a cache for streaming data and for storing executables, as F5 & Retro did. Clearly this is a mistake by IGN's sources, or by IGN themselves.
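For what it's worth, the raw arithmetic in IGN's breakdown is at least internally consistent; a quick check, taking the quoted figures at face value:

```python
# Memory figures (MB) exactly as quoted from the IGN article; unconfirmed.
gc_1t_sram = 24     # GameCube's embedded 1T-SRAM
gc_aram = 16        # the pool IGN labels "D-RAM" (GC's auxiliary RAM)
added_1t_sram = 64  # the claimed Revolution addition

gc_total = gc_1t_sram + gc_aram        # the "40MBs" in the quote
rev_total = gc_total + added_1t_sram   # the "104MBs" in the quote
print(gc_total, rev_total)  # 40 104
```

So the numbers add up; the open question raised here is why a pool as slow as the old ARAM would be carried forward at all.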

Comments like these don't exactly instill confidence either:

The Hollywood GPU, meanwhile, is believed to be an extension of the Flipper GPU in GameCube. Since developers have not gone hands-on with the GPU, they can only go on Nintendo documentation, which is limited.

This translates directly as speculation based upon the limited documentation provided by Nintendo themselves, nothing remotely concrete, as they and IGN have no idea regarding what is generally considered the heart of a system's visual capabilities.

Revolution's Broadway CPU, developed by IBM, is an extension of the Gekko CPU in GameCube, according to official Nintendo documentation passed to us by software houses.

An extension in what ways? Even the mighty 360 took cues from the GC's architecture regarding the L1/L2 caches. Remember that Gekko could keep half of its L1 data cache locked to hold needed information without wasting reads to the L2 cache, and ultimately main memory. The rest of the chip isn't penalized for accesses to the L2 data cache thanks to the non-blocking cache arrangement. Iwata said in an interview over a year and a half ago that certain aspects of both Flipper/Gekko that they deemed innovative, user-friendly and efficient would surface again in the Hollywood/Broadway chipsets. (I'll attempt to find a link if possible.)

"An extension" is simply not specific enough, and runs contrary to what IBM themselves have stated. Also, a sizeable percentage of the R&D went into the interface; it is much more advanced technically than many of you here realize.
 
Ty said:
As to where the money went, I dunno. Good question. None of us know but it's not logical to use the absence of information to create scenarios. That is, it is illogical to use a negative to prove a positive.

So basic to advanced (from really basic to really advanced) military strategy/tactics would be illogical... ;)

Anyway, Li Mu Bai makes a good case for why we shouldn't take the article so seriously.

Anyway, if they made the GC at 180nm and still turned a profit, why not at least update Rev as Moore's Law allows (e.g. it should give room for at least a tri-core updated Gekko with a VMX unit each at ~1GHz, or a 970 plus a Gekko, etc.), without annoying anybody given the price?

Also, ATI said it would have at least DX9 shaders, which "contradicts IGN" (see post above), and I think ATI is more likely to be right than IGN.

More importantly, we can easily think of concepts where more power would be needed. Take physics and animation: if you want the game/controller to react in an intuitive manner, for example with swords or tennis (especially the reaction of the ball/racket/floor), then better AI will be needed too, and the animation needs to represent what we actually did, i.e. something more advanced than five pre-made animations like, say, Top Spin. The same goes for sports in general and games built on direct interaction (like those medical ones). Without that power, one of Rev's big advantages (intuitive controls) is wasted, along with things only Rev could do well, like a really good telekinesis (or gravity gun in real 3D) control. These are the kinds of innovations that need more power to be done right, just as analogue sticks would have been almost useless on a NES/SNES pad (besides a few games like fighters); they needed 3D to reach their full use in Mario 64, etc.

Rev just needs more power. It is a console for the future, one that needs freedom, and Nintendo aren't stupid, so they should have realised that, and its hardware ought to prove it.
 
What various people in the industry? Do you mean your "contacts", or other more established people, say of a Faf or Deano level? Just looking to establish credibility here.

Mostly people you will know yourself on this board. A couple of devs here and someone else in another sector of the business.

Logic would have dictated that IMRs would have died 5 years ago and the world would now be run by ImgTec.

Nintendo give ATI/IBM money to develop chips = Nintendo get different chips than the ones developed six years prior. That's far more direct and obvious logic than "TBRs are more efficient than IMRs = TBRs rule the world by 2006" :)

Admittedly I didn't pay much attention to it (I actually didn't read much of it) but what was vague or contradictory?

I'll just give a few examples off the top of my head. They give three different amounts for system RAM; surely these developers must know which amount is right if they have specs? One dev says "It's not much more powerful than an XBox", which is extremely vague. Another says "It's like a souped up XBox", which doesn't even make sense, since if it's an overclocked Gekko and Flipper then, call me crazy, but wouldn't that make it a souped up GC, not a souped up XBox?

Have a read of both articles; the entire thing is very vague and at times very contradictory. Like when a developer describes the system as "double the speed of Gekko and Flipper and you're pretty much there", then later says "As soon as we find out what the GPU can do then we'll know if Revolution will just be like an Xbox or something a little more". How can they give that estimate of performance for the GPU if they don't even know what the GPU is capable of? To be honest, IMO the contradiction comes from the fact that these developers are only describing current development kits and IGN are presenting some of that info as actual Revolution system specs.

Huh? Why not. We do. If you don't, then all of your work prior to getting final dev kits is wasted because now what do you do with your models that have low res textures or are now too low poly? Recreate them?

For content creation, yes, they would have to make a good guess at what to aim for in that respect (aim higher than they think they need and downgrade if necessary?), but the fact is you can't develop a PS3-quality game (graphically) on a GC-based development kit.
 
Li Mu Bai said:
I agree with Teasy here. I wouldn't quite say that the article was completely contradictory, but vague certainly. Why would Nintendo add 40MBs of ARAM? (Yes, they refer to it as DRAM, which we know the GC did not possess, but it is listed there regardless.) As stated here:

Revolution will build on GameCube's configuration of 24MBs 1T-SRAM and 16MBs D-RAM (40MBs) by adding an addition 64MBs of 1T-SRAM. The result is a supply of memory in Revolution that totals 104MBs. That number does not consider either the 512MBs of allegedly accessible (but hardly ideal) Flash RAM or the Hollywood GPU's on-board memory, said to be 3MBs by sources.

Sorry what? The quote above says they will add _64_ megs, not 40. The "40" in the above quote refers to the 24 PLUS the 16.

Li Mu Bai said:
Why allocate so much RAM to audio? 16MB was overkill in the GC, benefiting the system primarily as a cache for streaming data and for storing executables, as F5 & Retro did. Clearly this is a mistake by IGN's sources, or by IGN themselves.

The article mentioned that so much Revolution RAM was dedicated to audio?

Li Mu Bai said:
Comments like these don't exactly instill confidence either:

The Hollywood GPU, meanwhile, is believed to be an extension of the Flipper GPU in GameCube. Since developers have not gone hands-on with the GPU, they can only go on Nintendo documentation, which is limited.

To be fair to the article it clearly states, "is believed" and then states "can only go on Nintendo documentation" so even the article is quite clear that it's not stating gospel.

<snipped stuff I just agreed with>

Li Mu Bai said:
"An extension" is simply not specific enough, and runs contrary to what IBM themselves have stated. Also, a sizeable percentage of the R&D went into the interface; it is much more advanced technically than many of you here realize.

What has IBM stated? It's quite possible that IGN is overstating/underestimating when they use the word, "extension" but how do you know it is, "much more advanced technically than many of you here realize"?

pc999 said:
So basic to advanced (from really basic to really advanced) military strategy/tactics would be illogical... ;)

What? It's generally considered that one commits a logical fallacy by using a negative to prove a positive. Here's an example. I count cars all day and not once do I see a red car. Therefore I commit a fallacy by coming to the conclusion that red cars don't exist. That is, no proof of red cars should NOT be used to say that "red cars don't exist". That's all I meant.

pc999 said:
Also, ATI said it would have at least DX9 shaders, which "contradicts IGN" (see post above), and I think ATI is more likely to be right than IGN.

Now that is a better argument though perhaps that is what was meant by "extension".

Teasy said:
Mostly people you will know yourself on this board. A couple of devs here and someone else in another sector of the business.

Oh, I must have missed the responses.

Teasy said:
Nintendo give ATI/IBM money to develop chips = Nintendo get different chips than the ones developed six years prior. That's far more direct and obvious logic than "TBRs are more efficient than IMRs = TBRs rule the world by 2006" :)

Well we don't know what was asked of them though.

Teasy said:
I'll just give a few examples off the top of my head. They give three different amounts for system RAM; surely these developers must know which amount is right if they have specs?

That definitely sounds like a major contradiction.

Teasy said:
One dev says "It's not much more powerful than an XBox", which is extremely vague. Another says "It's like a souped up XBox", which doesn't even make sense, since if it's an overclocked Gekko and Flipper then, call me crazy, but wouldn't that make it a souped up GC, not a souped up XBox?

Hah! Fair enough!

<snipped stuff regarding GPU which does sound iffy>

Teasy said:
To be honest IMO the contradiction comes from the fact that these developers are only describing current development kits and IGN are presenting some of that info as actual Revolution system specs.

That very well might be the case. IGN might be overextending the information they got.

Teasy said:
For content creation, yes, they would have to make a good guess at what to aim for in that respect (aim higher than they think they need and downgrade if necessary?), but the fact is you can't develop a PS3-quality game (graphically) on a GC-based development kit.

I still don't understand this. We (my company but not my team) HAVE many PS3 kits RIGHT now. We have an idea of what they supposedly will be capable of. We target that level of performance NOW.

Edit - fixed quoting.
 
Ty said:
What? It's generally considered that one commits a logical fallacy by using a negative to prove a positive. Here's an example. I count cars all day and not once do I see a red car. Therefore I commit a fallacy by coming to the conclusion that red cars don't exist. That is, no proof of red cars should NOT be used to say that "red cars don't exist". That's all I meant.

I know, as I study philosophy (a lot of that 20th-century analytic philosophy that is based on logic), so I know a fair deal about several logics (kinds of logic; the word "kinds" gives you a hint of my argument). So I am not saying you are wrong; it is perfectly right that in logic it is illogical to use a negative to prove a positive. But its application here is wrong, as logic (at least as we know it today) can't be used everywhere, every time, as I will show you.

For example, two generals in the exact same situation: one falls into a trap, the other doesn't. The first thought: no one in sight, so nice, less danger; they are probably waiting somewhere they can get better positions, etc. The second thought the same, but then decided it was a good place for a trap, so let's go around.

So we reach an interesting point: both made equally correct deductions from the same information (premises), but one was better than the other, and logic could not dictate which one. And here you even have the hint, given that they spent all that money and time on the hardware.

Sometimes logic can't be applied, simply because its scope is too narrow or it is impossible to get enough information, as in your IMR example: if you add enough premises about economics, corporate power, etc., one could easily argue that it is only logical that IMRs rule the world.

I know this is hard to accept for a "science person", but "real life" strategy, police investigations and philosophy prove this all the time. (I, and many others, am not even satisfied with set theory, as it is all based on logic but still doesn't explain maths, IMO.) Anyway, I'm in a rush now, so I may review this later to explain myself better, or correct something and my English.

Still, you are right on one point: I should not have used the word "illogical".

Now that is a better argument though perhaps that is what was meant by "extension".

I agree; a definition of "extension" would help us a lot, or at least help us know whether the word is being used correctly, given that the article is bad enough (or there would not be threads about it) to make us wonder.

Could we say that an 8500 is an extension of a 7500 (or 8000?)? I would hardly define it like that, much less if it is more like a 9700, and I doubt they could do DX9 without some drastic differences in architecture (or at least enough additions to change the primary hardware rendering so it can become a DX9-like chip).
 
Well we don't know what was asked of them though.

IBM already had 1GHz 750CXe CPUs years ago, so they wouldn't need to do any R&D to overclock Gekko to twice its clock speed. I don't think ATI need years of R&D to overclock Flipper to 300MHz either, especially with the drop from 180nm to 90nm. It could very well be that Broadway and Hollywood are, to a degree, extensions of Gekko and Flipper. But I certainly don't think it's logical for them to be overclocked versions of those chips.
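The multipliers being argued over are easy to sanity-check. Taking the commonly cited GameCube clocks (Gekko at 485MHz, Flipper at 162MHz) as the baseline, purely for illustration:

```python
# Commonly cited GameCube clocks (MHz); target clocks are the rumored
# figures from this thread, not confirmed hardware numbers.
GEKKO_MHZ = 485
FLIPPER_MHZ = 162

def clock_multiplier(target_mhz, base_mhz):
    """How many times faster the rumored clock is than the baseline."""
    return target_mhz / base_mhz

print(round(clock_multiplier(970, GEKKO_MHZ), 2))    # 2.0  ("double Gekko")
print(round(clock_multiplier(300, FLIPPER_MHZ), 2))  # 1.85 (the 300MHz Flipper rumor)
```

Note that "double the clocks" is well under the 3x figure discussed earlier in the thread, which is part of why the quoted estimates read as contradictory.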

I still don't understand this. We (my company but not my team) HAVE many PS3 kits RIGHT now. We have an idea of what they supposedly will be capable of. We target that level of performance NOW.

Let's say you have a GC as a development kit and you were told to target PS3-quality graphics. How could you achieve that?

Basically, what I was suggesting is that these early dev kits are more for playing with the new controller and thinking up new game ideas (and some very early game development) than for creating great-looking game engines.
 
Teasy said:
IBM already had 1GHz 750CXe CPUs years ago, so they would not need to do any R&D to overclock Gekko to twice its clock speed. I also don't think ATI need years of R&D to overclock Flipper to 300MHz, especially with the drop from 180nm to 90nm. It just doesn't sound right to me.

It doesn't sound right to me either.

But you seem to be stuck on the literal idea that "3x the performance" literally means "design the same chip but at 3x the clock speed." Why?
 
That's not my idea, to be honest. It's my opinion that Revolution won't be an overclocked GC no matter what raw performance it has. However, IGN's article has developer quotes which they claim describe Revolution's hardware as "double the clock speed of Gekko and Flipper and you're pretty much there". That's the kind of thing that doesn't sound right to me.
 
Teasy said:
IBM already had 1GHz 750CXe CPUs years ago, so they wouldn't need to do any R&D to overclock Gekko to twice its clock speed. I don't think ATI need years of R&D to overclock Flipper to 300MHz either, especially with the drop from 180nm to 90nm. It could very well be that Broadway and Hollywood are, to a degree, extensions of Gekko and Flipper. But I certainly don't think it's logical for them to be overclocked versions of those chips.

But maybe they're NOT going to be overclocked versions of the chips but rather have the performance of overclocked GCN chips.

Teasy said:
Let's say you have a GC as a development kit and you were told to target PS3-quality graphics. How could you achieve that?

Let me try another route. What exactly would prevent you from doing what you just said? Certainly if functions are missing, those will be hard to work on but that still doesn't prevent you from creating a "PS3 level engine" upon a "GCN level" dev kit.

Teasy said:
Basically, what I was suggesting is that these early dev kits are more for playing with the new controller and thinking up new game ideas (and some very early game development) than for creating great-looking game engines.

I would think both, actually. And in fact, I don't think you really need a dev kit to come up with game ideas for the controller; that's a bit backwards, actually. You come up with an idea, then run some R&D on it.
 
Teasy said:
However, IGN's article has developer quotes which they claim describe Revolution's hardware as "double the clock speed of Gekko and Flipper and you're pretty much there". That's the kind of thing that doesn't sound right to me.

No, they were describing how to anticipate Rev's performance, not the hardware itself. For example, when they also said "souped up Xbox", they did NOT mean that there's actual Xbox hardware inside. ;)
 
But maybe they're NOT going to be overclocked versions of the chips but rather have the performance of overclocked GCN chips.

Why would IBM design a new chip to achieve the same performance as a 900MHz Gekko?

Let me try another route. What exactly would prevent you from doing what you just said? Certainly if functions are missing, those will be hard to work on but that still doesn't prevent you from creating a "PS3 level engine" upon a "GCN level" dev kit.

I really don't see how you can truly target PS3-level graphics on a GC development kit. Not only would the GC kit have only a tiny fraction of the required performance, but as you say it would also be extremely limited in features. Maybe we're talking about different things, or I'm just not understanding this properly; either way, let's leave it :)

I would think both, actually. And in fact, I don't think you really need a dev kit to come up with game ideas for the controller; that's a bit backwards, actually. You come up with an idea, then run some R&D on it.

But obviously you need to try out the controller to find out exactly what it can and can't do before you can come up with the less obvious ideas. If it's a choice between giving developers these kits now to let them play with the controller, or leaving them without the ability to use the controller at all until real Revolution kits come along, I know which one I'd choose.
 