"2x the power of the GC," can someone clarify what this means? (ERP)

Well, they didn't do it on Xbox, either. It's not like Battlefront, Prince of Persia, and Mercenaries were replete with bump-mapping, DX8 water effects, and per-pixel lighting in the Xbox versions. I really don't think this has to do with manufacturer support; it's just economics. You've got more to program, more to debug, and more variance in content across the 3 versions, thus increasing development time (Ubi was stretched thin enough with 2 versions of Chaos Theory to not warrant giving the Cube a unique gfx engine as well). Why bother when people will buy your game without those things? Cross-platform games sell fine on the Xbox, and more sfx probably wouldn't help them much. Why take the trouble to do a bunch of extra stuff for the Xbox and Cube versions when it won't make you any more money? I mean, that'd be like making a toilet breakable or something. ;)

The Xbox games that take advantage of the hardware are almost without exception exclusives. Cube doesn't have many exclusives, which is why it seems like hardly anything pushes the Flipper.
 
I actually wonder how many (if any, besides the obvious) hardwired functions Rev will have. Since its primary use will be for exclusive games, and it seems it will indeed have a lot of them, this could be a cheap way (in dev cost, HW price, and performance, relative to shaders) of providing some great fx that are "universal" (e.g. shadows, self-shadowing...). It could let games look much better and in this way make Rev much more capable of "fighting" the other consoles graphically.

This may hurt in terms of third-party support, but if Rev indeed has a lot of success (and they seem certain of this) and is easy/cheap to code for, I don't see a reason not to improve Rev games too.

Although I think this is not the best solution (I don't want only realism; I want new realities too), I wonder if it will be a (good) option/choice.
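
As a purely illustrative sketch of the idea (every name below is invented; nothing here is a real Revolution or GX API), a hardwired effect amounts to a single state toggle with a fixed, known cost, instead of a shader each studio has to write and tune:

```c
/* Purely hypothetical sketch -- none of these names are a real Revolution
 * or GX API. It only illustrates why a hardwired, "universal" effect is
 * cheap for developers: it is a state toggle, not a program to write. */
#include <stdio.h>

enum { FX_SELF_SHADOWING = 1 };               /* made-up feature flag */

static void gpu_enable_fx(int fx)    { printf("hardwired fx %d on\n", fx); }
static void gpu_shadow_bias(float b) { printf("shadow bias %.4f\n", b); }
static void draw_world(void)         { printf("drawing world\n"); }

int main(void)
{
    /* Fixed-function route: one universal switch, same cost in every game. */
    gpu_enable_fx(FX_SELF_SHADOWING);
    gpu_shadow_bias(0.002f);
    draw_world();
    /* A shader route would instead mean writing, debugging, and paying the
     * per-pixel cost of a custom shadow shader in each individual game. */
    return 0;
}
```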

BTW, I just realized that there isn't a Nintendo icon.

Edit:
All this assumes (I am only in speculation mode here) that they will not have something as powerful (for 480p), or even as cheap, done only with shaders.
 
fearsomepirate said:
Well, they didn't do it on Xbox, either.

But we're talking about Nintendo, not MS. Nintendo doesn't care if Xbox cross platform games don't take advantage of the Xbox.

Therefore if Nintendo wants the games on their system to shine a bit more than the competition, they need to be the ones to make it happen, not hope that Developer X will spend the extra resources on their own.
 
Ty, for what it's worth, I actually agree with you. Nintendo needed to take some proactive steps to get more graphical stand-out titles on their system. Merely demonstrating the Cube's technical merits with Metroid Prime, Star Fox Adventures, and Rogue Leader just didn't work.

For next-gen, their system just plain won't be as powerful, so there won't be any technical stand-out titles. But they should look a lot better than average Gamecube fare, since there won't be any motive to port engines over from the PS2.
 
fearsomepirate said:
Nvidia claimed something like 115m polys/sec out of XGPU. In-game, they got nowhere near that. The handful of Xbox games with large geometry rates, such as Jet Set Radio Future, had very little in the way of fancy shader effects. And when you look at games like Halo 2, Thief 3, Deus Ex 2, Chronicles of Riddick, Doom 3, etc that had lots of normal mapping, self-shadowing, etc, polygon rates came crashing down to Voodoo-era levels. So say that Hollywood's max theoretical vertex rate is 130m polys/sec, but in game situations such as those seen in the aforementioned games, you can achieve an actual performance of 30m polys/sec instead of the 3m polys/sec (or whatever, it wasn't a lot) that those games did due to a very targeted, efficient hardware design. It would be a massive visual improvement over current-gen, but not a massive improvement in theoretical performance. And of course, 30m polys/sec would be well-within a "3x improvement," but now we're (in imaginary land) able to apply far more effects to those polygons than Cube or Xbox could.

I'm not saying that's how it'll be, just that there was such a huge gap between theory and implementation in Xbox that you don't have to get your theoretical performance any higher if you can get your in-game performance much greater.


I pretty much agree with this, fearsomepirate. I'm expecting actual Revolution performance to be in the 30M to 50M range. That's within 3x of the 15M~18M-polygon Gamecube.
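
Put in per-frame terms, those rates are easy to sanity-check. This is just a quick sketch; all the rates below are the estimates and claims from this thread, not measurements. 30M polys/sec at 60 fps is a 500k-poly frame, while the 3M/sec heavy-shader case is only 50k:

```c
/* Per-frame budgets behind the polys/sec figures in this thread, at 60 fps.
 * All rates are the thread's estimates and claims, not measured data. */
#include <stdio.h>

int main(void)
{
    const double fps = 60.0;
    const double rate_mpolys[] = { 3.0, 15.0, 18.0, 30.0, 50.0, 115.0 };
    const char  *label[] = {
        "Xbox heavy-shader games (estimate)",
        "Gamecube in-game, low end",
        "Gamecube in-game, high end",
        "Revolution guess, low end",
        "Revolution guess, high end",
        "Nvidia's theoretical XGPU claim",
    };

    for (int i = 0; i < 6; i++)
        printf("%-36s %6.1fM/s -> %9.0f polys per frame\n",
               label[i], rate_mpolys[i], rate_mpolys[i] * 1e6 / fps);
    return 0;
}
```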
 
Based on what we have seen from the GC this gen, where do you tech guys guess Nintendo will place their emphasis graphically? What features do they seem to like, and what features do they not seem to care for, given the cost? Or does ATI decide what Nintendo gets based on a very general request from Nintendo? Or does Nintendo seem to have a very clear, specific idea of what kind of graphics they want to create?
 
Megadrive1988 said:
I pretty much agree with this, fearsomepirate. I'm expecting actual Revolution performance to be in the 30M to 50M range. That's within 3x of the 15M~18M-polygon Gamecube.

Still, that may be completely different (to the user and in "power metrics") depending on the feature set/fx used. If I recall from some synthetic benchmarks, the X1600 can do about 4xM polys with heavy shaders (although I don't remember where I saw that). Can we consider that 2-3x the GC?

ninzel said:
Based on what we have seen from the GC this gen, where do you tech guys guess Nintendo will place their emphasis graphically? What features do they seem to like, and what features do they not seem to care for, given the cost? Or does ATI decide what Nintendo gets based on a very general request from Nintendo? Or does Nintendo seem to have a very clear, specific idea of what kind of graphics they want to create?

I think this will depend on how similar it is (at a HW and power level) to the other two. If similar, I think it will probably have a Xenos-like philosophy in mind (made for an API and for a console too) for easy cross-platform work; if dissimilar, probably something in a Flipper-like way (made to specific Nintendo requests). But this will also be based on other choices like price or dev costs.
 
fearsomepirate said:
Ty, for what it's worth, I actually agree with you. Nintendo needed to take some proactive steps to get more graphical stand-out titles on their system. Merely demonstrating the Cube's technical merits with Metroid Prime, Star Fox Adventures, and Rogue Leader just didn't work.

For next-gen, their system just plain won't be as powerful, so there won't be any technical stand-out titles. But they should look a lot better than average Gamecube fare, since there won't be any motive to port engines over from the PS2.

Well fp, those $2,000.00 dev. kits should allow for plenty of experimentation & manipulation during the R&D stages.
 
ninzel said:
Based on what we have seen from the GC this gen, where do you tech guys guess Nintendo will place their emphasis graphically? What features do they seem to like, and what features do they not seem to care for, given the cost? Or does ATI decide what Nintendo gets based on a very general request from Nintendo? Or does Nintendo seem to have a very clear, specific idea of what kind of graphics they want to create?

In terms of what Nintendo will go for 'performance-hitless' (it's a word, look it up)-wise it's clear that they have a lot of transistors to play with budget-wise for each pixel.

I think they'll aim for a lot of general shader performance per-pixel and let the developers decide how to allocate it. I can't really think of any effects that they would want to hardwire, but then I'm a gamer, not a 3d programmer.
 
Li Mu Bai said:
Well fp, those $2,000.00 dev. kits should allow for plenty of experimentation & manipulation during the R&D stages.

Ugh, I didn't mean to double-post but I really meant to touch on this. The $2000 dev kits are a way of enticing less wealthy Euro and Chinese developers I think.

If I'm not mistaken, and this is kind of off-topic, didn't Nintendo open a studio in Shanghai a couple of years ago? At least, this would fit nicely with Yamauchi's Fund Q policy, underwriting a game's development up to a certain stage to see if it finds a publisher.

I also think that there'll be a lot of portability between GCN/REV/Gameboy's next iteration. The low price of the dev kits tells me that Nintendo's looking to incubate a lot of development - it would only make sense that if Rev isn't bleeding edge, Nintendo's next portable might be able to do a lot of the same things.
 
greg3d said:
The $2000 dev kits are a way of enticing less wealthy Euro and Chinese developers I think.
How do they intend to entice less wealthy US developers? Or are all US developers wealthy and don't bat an eyelid at shelling out tens of thousands on dev kits?

Low-cost devkits have to be a good thing. It's not like, in the grand scheme of things, income from devkits is going to matter much. Appeal to as many devs as possible and make money on the software they produce, not on them buying devkits!
 
I just remembered something curious:

Dreamcast CPU size: 40-45mm^2
Gamecube CPU size: 45mm^2

A lot of people believe that the PowerPC 750GX is going to be Broadway, but the PowerPC 750GX at 130nm has a die size of only 21mm^2.

This is just a curiosity, but perhaps the Revolution CPU is something between the 750GX and 970FX in architecture, something like the equivalent of a Pentium M in a PowerPC architecture.
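
A back-of-the-envelope shrink calculation shows why that gap is suggestive. This is only a sketch: it assumes ideal quadratic area scaling and treats Gekko as a 180 nm part (an assumption), using the die sizes quoted above:

```c
/* Idealized process-shrink arithmetic behind the die-size comparison.
 * Assumes area scales with the square of the feature size (optimistic)
 * and that Gekko was a 180 nm part -- both assumptions, not specs. */
#include <stdio.h>

static double shrink(double area_mm2, double from_nm, double to_nm)
{
    const double s = to_nm / from_nm;
    return area_mm2 * s * s;     /* linear shrink in two dimensions */
}

int main(void)
{
    printf("Gekko (45 mm^2 @ 180 nm) at 90 nm: ~%.1f mm^2\n",
           shrink(45.0, 180.0, 90.0));      /* ~11 mm^2 */
    printf("750GX (21 mm^2 @ 130 nm) at 90 nm: ~%.1f mm^2\n",
           shrink(21.0, 130.0, 90.0));      /* ~10 mm^2 */
    /* If Broadway keeps a Gekko-class area budget on a newer node, there
     * is room for far more than a straight 750GX shrink. */
    return 0;
}
```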
 
Urian said:
I just remembered something curious:

Dreamcast CPU size: 40-45mm^2
Gamecube CPU size: 45mm^2

A lot of people believe that the PowerPC 750GX is going to be Broadway, but the PowerPC 750GX at 130nm has a die size of only 21mm^2.

This is just a curiosity, but perhaps the Revolution CPU is something between the 750GX and 970FX in architecture, something like the equivalent of a Pentium M in a PowerPC architecture.
I had thought the same thing. Why not just cram new technology into a proven design? Reading up on it, the PPC 750 had an incredibly short pipeline, making it hard to reach 1+ GHz. How do you overcome that? Lengthen the pipeline. What do you have when you're done? I dunno, but it might look a lot like a PPC 970.

Maybe it is just a "double-clocked Gekko" as the devs have said. Then it could very well be a 1 GHz 750GX. How would an OoOE processor at 1 GHz compare to a couple of in-order 3.2 GHz behemoths? It'd probably be demolished, in most cases.
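
To make the comparison concrete, here is a rough throughput sketch; both IPC figures are invented purely for illustration (they are not benchmarks of any real chip):

```c
/* Illustrative throughput comparison: useful work ~ IPC x clock.
 * Both IPC figures are pure assumptions picked for the example --
 * they are not benchmarks of any real chip. */
#include <stdio.h>

int main(void)
{
    const double ooo_ipc = 1.2, ooo_mhz = 1000.0;  /* OoOE 750-class @ 1 GHz  */
    const double ino_ipc = 0.5, ino_mhz = 3200.0;  /* in-order core @ 3.2 GHz */

    printf("OoOE 1 GHz      : ~%.0f M instr/s\n", ooo_ipc * ooo_mhz); /* 1200 */
    printf("In-order 3.2 GHz: ~%.0f M instr/s\n", ino_ipc * ino_mhz); /* 1600 */
    /* Even granting the OoOE core 2.4x the per-clock efficiency, the
     * 3.2x clock deficit still wins -- per core, before counting the
     * second core. */
    return 0;
}
```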
 
If they indeed updated the HW, I can't see why they would invest so much money in an outdated 750-based CPU when they already have much better CPUs like the 970. Others like the PPE (a dev-friendly version) should also do a great job, and keep the cost low.
 
OtakingGX said:
I had thought the same thing. Why not just cram new technology into a proven design? Reading up on it, the PPC 750 had an incredibly short pipeline, making it hard to reach 1+ GHz. How do you overcome that? Lengthen the pipeline. What do you have when you're done? I dunno, but it might look a lot like a PPC 970.

Maybe it is just a "double-clocked Gekko" as the devs have said. Then it could very well be a 1 GHz 750GX. How would an OoOE processor at 1 GHz compare to a couple of in-order 3.2 GHz behemoths? It'd probably be demolished, in most cases.

No doubt it'd get demolished.

What's more, the CPU has to reach a certain frequency in order to facilitate the graphics chip's frequency. That's the reason Gekko was at a 3x multiplier over Flipper. I believe the FCC mandated it due to interference.

Unless the per-cycle throughput on Hollywood is extraordinary, based on what you're saying the 750 just might not reach frequencies high enough to get the graphics chip to the performance they'd want.

I'm expecting a relatively large (~150m transistors) graphics chip. CPU-wise, I don't think Nintendo will care too much as long as it hits a high enough frequency to support the graphics chip.
 
greg3d said:
No doubt it'd get demolished.

What's more, the CPU has to reach a certain frequency in order to facilitate the graphics chip's frequency. That's the reason Gekko was at a 3x multiplier over Flipper. I believe the FCC mandated it due to interference.

Unless the per-cycle throughput on Hollywood is extraordinary, based on what you're saying the 750 just might not reach frequencies high enough to get the graphics chip to the performance they'd want.

I'm expecting a relatively large (~150m transistors) graphics chip. CPU-wise, I don't think Nintendo will care too much as long as it hits a high enough frequency to support the graphics chip.
The original Gekko was to be 2x the speed of Flipper. When Flipper only reached 162 MHz, Nintendo had IBM ramp up Gekko's speed to 3x Flipper's. I don't think the whole-number multiplier has anything to do with the FCC; it's just that it's easier for them to run on the same bus, 162 MHz. Then Flipper to 1T-SRAM runs at double that, and Flipper to DRAM is half. All these add up to a harmoniously timed machine.
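
Laid out as arithmetic, using the ratios exactly as described above:

```c
/* The GCN clock relationships described above, all derived from the
 * 162 MHz Flipper/bus clock (figures as given in this thread). */
#include <stdio.h>

int main(void)
{
    const double flipper_mhz = 162.0;

    printf("Flipper / bus : %5.0f MHz (1x)\n",   flipper_mhz);
    printf("Gekko CPU     : %5.0f MHz (3x)\n",   flipper_mhz * 3.0); /* 486 */
    printf("1T-SRAM       : %5.0f MHz (2x)\n",   flipper_mhz * 2.0); /* 324 */
    printf("DRAM          : %5.1f MHz (1/2x)\n", flipper_mhz / 2.0); /*  81 */
    return 0;
}
```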
 
OtakingGX said:
I had thought the same thing. Why not just cram new technology into a proven design? Reading up on it, the PPC 750 had an incredibly short pipeline, making it hard to reach 1+ GHz. How do you overcome that? Lengthen the pipeline. What do you have when you're done? I dunno, but it might look a lot like a PPC 970.

Maybe it is just a "double-clocked Gekko" as the devs have said. Then it could very well be a 1 GHz 750GX. How would an OoOE processor at 1 GHz compare to a couple of in-order 3.2 GHz behemoths? It'd probably be demolished, in most cases.

Dual core 750VX...2 VMX units per core.
 