Revolution Price Confirmed (?)

No, they were describing how to anticipate Rev's performance, not the hardware itself.

If that's what the developer was saying then that doesn't seem to be how IGN portrayed the comment:

Exact clock rates were not disclosed, but one development source we spoke to had this to say of the Revolution CPU and GPU: "Basically, take a GameCube, double the clock rate of the CPU and GPU and you're done."

Either way, it still doesn't make sense to develop chips that have the same performance as double-clocked GC chips. I think it makes more sense that these developers were talking about early development kits. Especially when the same developers say they don't know what Hollywood will be capable of (because they're talking about the overclocked Flipper in the dev kits and not Hollywood itself?).

I suppose this discussion is sort of pointless :D Hopefully we'll hear something more concrete soon.
 
Last edited by a moderator:
Ty said:
But maybe they're NOT going to be overclocked versions of the chips but rather have the performance of overclocked GCN chips.

Well, considering the 750 scaled up pretty well, and is pretty cheap, I don't see why they wouldn't just consider using a later revision on a smaller process.

I would think both actually. And in fact, I don't think you really need a dev kit to come (think) up with game ideas for the controller - that's a bit backwards actually. You come up with an idea, then run some R&D on it.

In some ways, running R&D after is a bit backwards. Some ideas come to you just messing with the equipment and playing with it a bit.
 
Ty said:
I would think both actually. And in fact, I don't think you really need a dev kit to come (think) up with game ideas for the controller - that's a bit backwards actually. You come up with an idea, then run some R&D on it.


Wouldn't they also need the new tools that (supposedly) come with the updated dev kits so they can really work on and try their ideas before starting to make the game (it should be very hard to start a game without a pretty good idea of what they can do)?

But maybe they're NOT going to be overclocked versions of the chips but rather have the performance of overclocked GCN chips.

Wouldn't it be cheaper to just overclock the Gekko?
 
Teasy said:
Why would IBM design a new chip to achieve the same performance as a 900Mhz Gekko?

This gets back to the argument of what is meant by "extension". Perhaps it's roughly the performance listed but with a bunch of new functionality?

Teasy said:
But obviously you need to try out the controller to find out exactly what it can and can't do before you can come up with the less obvious ideas. If it's a choice between giving developers these kits now to let them play with the controller or leaving them without the ability to use the controller at all until real Revolution kits come along, I know which one I'd choose.

Sure, naturally there is always back and forth but looking for major ideas AFTER you have the hardware is probably a mistake in process. That is, you shouldn't form an idea AFTER you know the capabilities because it also takes quite a bit of time to figure those capabilities out.

Cyander said:
Well, considering the 750 scaled up pretty well, and is pretty cheap, I don't see why they wouldn't just consider using a later revision on a smaller process.

See previous reference to "extension".

Cyander said:
In some ways, running R&D after is a bit backwards. Some ideas come to you just messing with the equipment and playing with it a bit.

As mentioned above, there is always some back-and-forth.

pc999 said:
Wouldn't they also need the new tools that (supposedly) come with the updated dev kits so they can really work on and try their ideas before starting to make the game (it should be very hard to start a game without a pretty good idea of what they can do)?

That's why I mentioned "R&D". You have the idea, you R&D the idea ON the dev kits. Therefore this validation process occurs through hopefully many iterations. This is precisely how we're handling our UE3 game (this one is NOT on the PS3 mind you).

pc999 said:
Wouldn't it be cheaper to just overclock the Gekko?

You and I already covered this before when we spoke about "extension".
 
Ty said:
That's why I mentioned "R&D". You have the idea, you R&D the idea ON the dev kits. Therefore this validation process occurs through hopefully many iterations. This is precisely how we're handling our UE3 game (this one is NOT on the PS3 mind you).

OK, I did not express myself well: they do need TOOLS so they can work on the controller, and those tools need to come with the new dev kits; at least that is what has been said over the net. Please correct me if I am wrong. (And in that case, if they can, why not give a bit more powerful dev kits so they can work on new ideas that aren't possible on GC, hence this article?)


You and I already covered this before when we spoke about "extension".

Real question: is there a difference between an overclocked Gekko and a different chip (in this case an extension of Gekko) with equal performance (real world, as it seems that is what they suggest in the article)? At least besides price, power ... things that aren't a problem for Gekko.
 
Ty said:
Sorry what? The quote above says they will add _64_ megs, not 40. The "40" in the above quote refers to the 24 PLUS the 16.

Ty, you're completely missing the correlative comparison IGN is illustrating regarding the Revolution's total ram vs. that of the Gamecube's. 24+16 does indeed equal 40mb, but what does 64mb of 1T-SRAM-Q+40mb of DRAM equal? 104mb which is exactly what the article states. (not including embedded-DRAM)
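For what it's worth, the memory totals being argued over here do check out arithmetically. A quick sketch (the figures are the rumored ones from the IGN article, not confirmed specs):

```python
# Rumored Revolution memory figures from the IGN article (unconfirmed).
gc_1t_sram = 24   # MB, GameCube main 1T-SRAM
gc_aram    = 16   # MB, GameCube auxiliary RAM (IGN labels it DRAM)
added_1t   = 64   # MB, additional 1T-SRAM-Q said to be added for Revolution

gc_total  = gc_1t_sram + gc_aram   # the "40" both posters mention
rev_total = gc_total + added_1t    # the 104MB figure in the article

print(gc_total, rev_total)  # 40 104
```

So both readings are consistent: 40 is the GameCube baseline, and 104 is that baseline plus the alleged 64MB addition.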


The article mentioned that so much Revolution RAM was dedicated to audio?

Ty, the article clearly states that the GC had 16mb of DRAM, which it did not. Or more accurately, the DRAM it housed was useless for anything other than sound, which it was created to accommodate. It possessed 16mb of Auxiliary RAM, used for audio primarily by the Macronix DSP due to its 81mhz speed. I was simply pointing out IGN's mislabelling, or erroneous reporting. They are incorrectly drawing parallels between the 2 systems. That is why I was suggesting that the Revolution's secondary pool of ram would also be used for audio; my point wasn't meant to be taken literally.

To be fair to the article it clearly states, "is believed" and then states "can only go on Nintendo documentation" so even the article is quite clear that it's not stating gospel.

Agreed, but the tone of the article doesn't support that line of thought imo. Nor does the reader come away believing that much of this could be supposition, fueled by incomplete Nintendo documentation, as final specs have yet to be finalized. Remember when the Gekko & Flipper received an upgrade & a downgrade in speed, respectively, after the initial announced specs? (as did XBX) The Gekko went from 405 to 485mhz, & the Flipper from 200 to 162mhz iirc. (or was it 202.5?) IGN seems to be painting the bleakest of scenarios without enough supplemental evidence imo.

What has IBM stated? It's quite possible that IGN is overstating/underestimating when they use the word, "extension" but how do you know it is, "much more advanced technically than many of you here realize"?

From the article "IBM Scores a Hat Trick with Next-Gen CPUs":

Customizable chips also were important to all three game makers because each has a slightly different objective with its machines, Su said.

Microsoft is emphasizing Internet connectivity with its new high-definition Xbox 360, as well as other entertainment features such as the ability to connect to home computers to play music and show movies.

Sony's new PlayStation is expected to introduce a new high-definition DVD technology, called Blu-ray, along with all sorts of ways to connect with other Sony electronics such as MP3 music players and digital cameras.

Nintendo, meanwhile, is sticking fast to the gaming business. It was looking mainly for ways to better display graphics, speed up the processing power of its GameCube successor, and make it more user-friendly, with wireless controller connections and other features.

"All of these [companies] are looking for a way to differentiate themselves from each other," said IBM's Su. "What we offered them is sort of a bag of tricks in terms of processor technology ... that they could pull from to differentiate their products."

This doesn't sound merely like an "extension," but if anyone believes that we're simply getting 2x the clockspeed of both Flipper & Gekko, I have some swampland in Florida to sell you. Strangely, IGN has one & only one source that claims to have any information on Broadway & Hollywood's speed & functionality. 324mhz & 870mhz for the Broadway? So no dedicated vertex shader still? If it's rendering at one third the resolution (480p), then it obviously doesn't need the identical shading performance of the 360 or PS3, as 480p is hardly bandwidth intensive. At 1/3 the resolution, even one third of the pixel shading performance would be sufficient to incorporate advanced effects, while simultaneously never dropping below the 60fps refresh rate.
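The "one third the resolution" claim is easy to sanity-check by raw pixel count (assuming 640x480 for 480p and 1280x720 for 720p frame sizes; this counts pixels only and ignores per-pixel bandwidth or shader cost):

```python
# Pixel counts for the two output resolutions being compared (assumed frame sizes).
pixels_480p = 640 * 480    # standard-definition progressive frame
pixels_720p = 1280 * 720   # high-definition frame targeted by 360/PS3

ratio = pixels_720p / pixels_480p
print(pixels_480p, pixels_720p, ratio)
```

The ratio comes out to exactly 3.0, so a 480p target really does have to shade one third as many pixels per frame as a 720p target.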

Also, regarding the controller interface, calculating the feedback is quite demanding on computational power and bandwidth for sensor data. There must be no perceptible lag between movement & the desired on-screen animation; add force feedback on top of that, and you can see where I'm heading.

Well we don't know what was asked of them though.

Neither does IGN, nor their sources for that matter it seems.

That very well might be the case. IGN might be overextending the information they got.

Ahh.....now you're coming around.

I still don't understand this. We (my company but not my team) HAVE many PS3 kits RIGHT now. We have an idea of what they supposedly will be capable of. We target that level of performance NOW.

I would have to say that's unusual. (as my brother-in-law is deving for the Rev as I post this) Most Revolution devkits going out right now are simply GCs with the new controller interface included, & incomplete documentation as to what to expect from the central processor, & nil on the GPU.
 
Li Mu Bai said:
Ty, you're completely missing the correlative comparison IGN is illustrating regarding the Revolution's total ram vs. that of the Gamecube's. 24+16 does indeed equal 40mb, but what does 64mb of 1T-SRAM-Q+40mb of DRAM equal? 104mb which is exactly what the article states. (not including embedded-DRAM)

I'm still confused. Here's the quote you provided from the article.

Revolution will build on GameCube's configuration of 24MBs 1T-SRAM and 16MBs D-RAM (40MBs) by adding an additional 64MBs of 1T-SRAM. The result is a supply of memory in Revolution that totals 104MBs. That number does not consider either the 512MBs of allegedly accessible (but hardly ideal) Flash RAM or the Hollywood GPU's on-board memory, said to be 3MBs by sources.

Above which you led into with:

Why would Nintendo add 40mbs of ARAM? (Yes, they refer to it as DRAM, which we know the GC did not possess, but it is listed there regardless.) As stated here:

When you said, "Nintendo add 40mbs of ARAM", I read that as "adding 40mbs of RAM to Revolution from what the GCN has" - which of course the article mentions that it's 64. In other words, you said "add 40" but the article said, "add 64" which confused me to all heck.

Li Mu Bai said:
Ty, the article clearly states that the GC had 16mb of DRAM, which it did not. Or more accurately, the DRAM it housed was useless for anything other than sound, which it was created to accommodate. It possessed 16mb of Auxiliary RAM, used for audio primarily by the Macronix DSP due to its 81mhz speed. I was simply pointing out IGN's mislabelling, or erroneous reporting. They are incorrectly drawing parallels between the 2 systems. That is why I was suggesting that the Revolution's secondary pool of ram would also be used for audio; my point wasn't meant to be taken literally.

Ok, so let's agree that IGN got the 16 mb of DRAM wrong. Fine. But did they state that the additional RAM this time would be used for audio, yes or no? Like I said, I just glanced through it.

Li Mu Bai said:
IGN seems to be painting the bleakest of scenarios without enough supplemental evidence imo.

That's very possible. It's also possible some are reading more into it than intended what with communication being a two-way street.

From the article "IBM Scores a Hat Trick with Next-Gen CPUs":

Nintendo, meanwhile, is sticking fast to the gaming business. It was looking mainly for ways to better display graphics, speed up the processing power of its GameCube successor, and make it more user-friendly, with wireless controller connections and other features.

This doesn't sound merely like an "extension," but if anyone believes that we're simply getting 2x the clockspeed of both Flipper & Gekko, I have some swampland in Florida to sell you.

What exactly in that bolded quote doesn't sound like an "extension" - which I admit is an entirely broad term extremely open to interpretation? Secondly, you're missing the point Joe and I are making. We're not saying, "double the clockspeed and that's exactly the chips you're getting". We're saying that maybe what they mean is that "double the clock speed and that's the performance you're looking at."

Li Mu Bai said:
Also, regarding the controller interface, calculating the feedback is quite demanding on computational power and bandwidth for sensor data. There must be no perceptible lag between movement & the desired on-screen animation; add force feedback on top of that, and you can see where I'm heading.

Sure, and I don't disagree with that at all. And just maybe that's where a healthy chunk of the R&D budget went.

Li Mu Bai said:
Ahh.....now you're coming around.

Heh, hardly. I prefer the wait-and-see attitude but in the absence of evidence, you go with what you know.

Li Mu Bai said:
I would have to say that's unusual.

And what exactly is unusual with what I said? Unusual to you and/or the industry?

Li Mu Bai said:
(as my brother-in-law is deving for the Rev as I post this) Most Revolution devkits going out right now are simply GCs with the new controller interface included, & incomplete documentation as to what to expect from the central processor, & nil on the GPU.

What does this have to do with what I said earlier re: developing for a target?

Here's what was originally said.

Developers can't target anything above the development kits they have anyway. All they have to do is develop on the kit they have and upgrade as the kits are upgraded, same as always.

And let me give you an example why this is completely incorrect. Our studio received 1 of the first 20 PS3s in the US. It was an ugly, weird-looking, microwave-ish box. Did it run anything like the final PS3 in terms of performance? No, of course not. In fact most times it wouldn't even run! So if the preceding statement were correct, then logically we wouldn't even be able to develop on it. But of course we and others could.
 
Sure, naturally there is always back and forth but looking for major ideas AFTER you have the hardware is probably a mistake in process. That is, you shouldn't form an idea AFTER you know the capabilities because it also takes quite a bit of time to figure those capabilities out.

As I said, there are the obvious ideas that naturally spring to mind (FPS, RTS, various sports games, sword fighting etc). But there will also be those less obvious ideas that will come from playing with the controller for a bit, and I can't see why that would be a mistake. Even for the obvious ideas it's necessary to actually use the controller to see which of those ideas suit it best.

And let me give you an example why this is completely incorrect. Our studio received 1 of the first 20 PS3s in the US. It was an ugly, weird-looking, microwave-ish box. Did it run anything like the final PS3 in terms of performance? No, of course not. In fact most times it wouldn't even run! So if the preceding statement were correct, then logically we wouldn't even be able to develop on it. But of course we and others could.

Yeah, you can develop for it, but I was talking about creating a certain level of graphics. Nicked was saying that these developers would have to have full details on Revolution's final performance now (something we know they don't have anyway) so they could target that level of graphics. Saying, for instance, what if developers are currently making PS3-level graphics for Revolution and the system might end up not being capable of that. Which is a valid point; however, what I was saying is that on a GC development kit you can't do much more than very early graphics development on a game targeted to have PS3-level graphics anyway. You reach a certain point where the GC just can't push the kind of graphics you're trying to create (even at a very low framerate). So is it really necessary to have full and final system specs when all you have to develop on right now is an overclocked GC with the new controller?
 
Teasy said:
As I said, there are the obvious ideas that naturally spring to mind (FPS, RTS, various sports games, sword fighting etc). But there will also be those less obvious ideas that will come from playing with the controller for a bit, and I can't see why that would be a mistake.

Absolutely. But if your company/team is getting a dev kit and banking on the hope that they'll stumble across an idea for a game as they fiddle with it, they're screwed. That's why it's a mistake. As I said before, naturally you'll discover some interesting things about the device once you have it, but that takes time. Meanwhile you're paying the rest of your team to do nothing, because if there are no axial ideas already in place, they can't work on anything.

Teasy said:
Even for the obvious ideas it's necessary to actually use the controller to see which of those ideas suit it best.

Absolutely, and that falls under the R&D phase I mentioned previously. You have an idea already and then you test it. You shouldn't count on fiddling with it and then have an idea hit you.

Edit spelling.
 
Just out of curiosity, what would you consider an extension of a chip?

Could, e.g., a Pentium 4 D 955 EE be considered an extension of the first Pentium 4 (1Ghz? I don't remember its code name), since they are all P4 (NetBurst architecture), or can it only be considered within the same core (e.g. Willamette, Prescott, Northwood, etc.)?

So we can have a better idea of what they could do with an extension of Gekko or even Flipper.
 
pc999 said:
Just out of curiosity, what would you consider an extension of a chip?

That's a good question, isn't it? We'd first have to ask ourselves, what makes Gekko, Gekko? Would slapping on the latest shader support make it a totally new chip and thus not an extension? I dunno - at some point it becomes a new chip and not an extension, but the line that divides these definitions probably can't be drawn by us.
 
Thanks for the reply; too bad it is only one. Anyway, it seems that there is no canonical concept of what counts as an extension of a processor, because if the above example can be considered one, then with 2-3x the raw speed, overall improvements, 64 bits, SSE instructions, multicore, HT, 4(?)x the cache, maybe an extension of Gekko could do the work if it is extended in a similar fashion.
 
Teasy said:
That's not my idea, to be honest. It's my opinion that Revolution won't be an overclocked GC no matter what raw performance it has. However, IGN's article has developer quotes which they claim describe Revolution's hardware as "double the clock speed of Gekko and Flipper and you're pretty much there". That's the kind of thing that doesn't sound right to me.

In terms of just fillrate, couldn't that describe an X1600?

If it's a choice between giving developers these kits now to let them play with the controller or leaving them without the ability to use the controller at all until real Revolution kits come along, I know which one I'd choose.

That doesn't sound good for a 2nd or 3rd quarter Revolution launch... if Rev ends up delayed till 2007, Nintendo probably could release a $200 system that's more powerful than Xbox 360 or PS3. (at least in graphics, on a 65nm process)
 
Anyone read this interview with ATI?
Apparently Hollywood is not based on any PC chipset and is built from the ground up for the Revolution.

http://www.revolutionreport.com/articles/read/254

Here are some quotes:

"Revolution Report: Is Hollywood based off Flipper, a current or upcoming PC architecture, or built from the ground up?

Swinimer: Hollywood is a specific design and is in no way reflective of PC technology. Even when the Flipper chips came out, people were asking that question: "Is this a spin-off of something done on the PC?", and the answer is no. It is designed the same as the Flipper was -- from the ground up for a specific console. Totally different sort of architecture from what you might find on the PC. Certainly, there are some underlying values—you know, how you get graphics on the screen—that's there. It's not, for example, like we took a PC design and said 'oh, you know what? If we tweak this and test this, it will work in a console.' [That's] not the case.

Revolution Report: Considering the form factor of the Revolution, heating has become a concern. Has this been a challenge for ATI in development of Hollywood?

Swinimer: The form factor design of even some of the newer consumer electronics devices are getting smaller and smaller, and we are taking that into great consideration, all across the board. I don't know if you are aware of this, but ATI has graphics chips in Motorola RAZRs. ATI has graphics chips inside many consumer electronic designs and heat is definitely a consideration, so we definitely take that into consideration when we are designing new chips. There's a lot of technology you can put into the chip now that can reduce the level of [heat] output.

For example, on a totally different side note, on our PC side we are very conscious of that and we try to, over the course of the design of a PC chip, get it to the point where you do not need a fan. These are things we have to take into consideration. Putting a fan on PC cards, retail cards such as ATI’s Radeon cards, adds more cost and complexity to the design of the chip. If we can get the design to the point where you don't need the fan to keep it cool, you've exceeded on a number of different levels. ATI in general is very conscious of this when we are working with not only PC vendors, but also consumer electronics manufacturers.

Revolution Report: A number of Web sites have inferred that Revolution will be significantly inferior graphically. While it certainly seems like Revolution won't output in HD, is it safe to assume that Hollywood will not feature a comparable polygon count or the same amount of graphical effects as the Xbox 360's GPU?

Swinimer: What I can say is that ATI is focused, as is Nintendo, in making [Revolution] a great, gaming entertainment platform. I know that a lot of journalists are very focused on specs. It's the big thing; as a geek, I look for that too. The key thing to keep in mind is that Nintendo, with ATI's help, is trying to create a game console where you don't have to look at [specs].

From a broader perspective, we share in Nintendo's position that this console will be devoted to the general gamer. When you have a game developer developing [for] this, the goal is to ensure that they don't have to worry about the complexity that is required to develop the games by making them "jump through hoops." That was one of the benefits of working on the GameCube; developers were saying that it is quite easy to develop for and there are not a lot of complexities so they could produce titles easily. That being said, we want consumers to look at the game, play the game and be involved in it. We are doing our very best to make this Nintendo gaming experience the very best it can be."
 
Very interesting, although it doesn't give anything really new. It seems that they invested well in the chip (if we assume it is not an overclocked chip).

I wonder if we take this quote:
Totally different sort of architecture from what you might find on the PC. Certainly, there are some underlying values—you know, how you get graphics on the screen—that's there. It's not, for example, like we took a PC design and said 'oh, you know what? If we tweak this and test this, it will work in a console.' [That's] not the case.

And combine it with the comments from the blog No End Soon (assuming it is real), whether we can infer a rendering mode other than IMR.
 
I don't know if you are aware of this, but ATI has graphics chips in Motorola RAZRs.

Gasp, Rev will be as powerful as a RAZR!

Actually, it's interesting to see that Rev isn't using a PC-derived graphics chip. It fits with the knowledge that the former ArtX team is working on Revolution, and that the current PC graphics chips were designed by the Radeon 8500 team. (though based off of R3xx)

I'm really hoping for a TBDR, because then, like Dreamcast, it will allow proponents to claim the system is infinitely powerful.
 
News:

http://www.planetgamecube.com/news.cfm?action=item&id=6735

The most important thing to come out of the interview is that Revolution will most likely launch before Thanksgiving in North America. According to Iwata, Nintendo has "no plans to miss out" on the holiday sales rush, for risk of losing retail support. There was no mention if a Japanese launch would come before or after this time period.

In addition to the more concrete launch date, Iwata also mentioned that a "nearly finished version" of the Revolution hardware will be shown off at E3.


Interview with Fils-Aime
Since you mentioned pricing, I assume the Revolution will be accessible to gamers for substantially less than $700?
Fils-Aime: That's correct. The next-generation console from Nintendo, code-named Revolution, will cost less than $300. Our third resolution is to stop turning away new players.
...
How do you think Revolution will sell?
Fils-Aime: We will sell more units than Xbox 360 did here in the United States in our launch window. I mean, in December, we sold more GameCubes in the United States than Microsoft sold 360s, and Revolution will do better than that.

So nothing new here: Nintendo will launch at least some time after E3. The North American launch is before Thanksgiving and will probably have more launch units allocated than the X360 launch. Rev will cost less than $300.
 
I mean, in December, we sold more GameCubes in the United States than Microsoft sold 360s, and Revolution will do better than that.

Interesting. Of course the 360 was far more expensive than the GCN AND it had some pretty major supply issues, so he's hardly making a fair comparison.
 
The real question is who's selling more this holiday season...

The X360 launch last year is a bonus for MS IMO.
 