"2x the power of the GC," can someone clarify what this means? (ERP)

Vysez said:
RE4 didn't push more than 12 Million pps. Not even close.
I mean that would be saying that the game had more than half a Million Polygons per frame... That's more than a lot of next gen games.
Rogue Leader moves around 15 million (and it was a launch game). Rebel Strike looks better and has much more going on at once, so...
 
Vysez said:
RE4 didn't push more than 12 Million pps. Not even close.
I mean that would be saying that the game had more than half a Million Polygons per frame... That's more than a lot of next gen games.
Where did you get that number from?
 
I think the only thing that is clear is that most people can't tell the number of polygons pushed per frame.

RE4 didn't push more than 12 Million pps. Not even close.

But you're one of the few people who can tell?

I mean that would be saying that the game had more than half a Million Polygons per frame...

Over 500,000 polygons per frame at 30fps would equal over 15 million pps.
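Teasy's arithmetic checks out; a quick sketch (plain throughput math, nothing game-specific assumed):

```python
# Polygons per second from a per-frame polygon count and a frame rate.
polys_per_frame = 500_000
fps = 30
pps = polys_per_frame * fps
print(pps)  # 15000000, i.e. 15 million polygons per second
```

At 60 fps the same per-frame count would double to 30 million pps.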
 
Eagle-Vision said:
Rogue Leader moves around 15 million (and it was a launch game). Rebel Strike looks better and has much more going on at once, so...

I can't wait to see how many polygons the PS3 can apparently move in Factor 5's hands!

;)
 
Ingenu said:
Not sure what you're implying there, but people don't expect the Revolution to beat either the PS3 or the Xbox 360, power-wise.

I meant IF IGN had mentioned crystal clear specs, it still would have been debated.

So if they mentioned specs that were uber powerful, you'd have the haters saying it can't be possible.

Or if they mentioned specs that were pitifully low, you'd have fan-boys saying it doesn't make sense.

In either case, even with crystal clear specs, there would still be a ton of useless debate. :)

ERP said:
People will continue to believe what they want regardless of the quality of information to the contrary.

Exactly. It doesn't matter how clear or murky the information is. People would find something to argue about. ;)

And I'm not saying that's a bad thing, I'm just saying they still would argue.
 
Some things never change

Teasy said:
Over 500,000 polygons per frame at 30fps would equal over 15 million pps.
And that would clearly exceed 6-12 million polygons per second...

And, yes, I can tell the difference between a game that pushes less than 6 million pps and one that pushes more than 12 million pps.
 
Ty said:
Exactly. It doesn't matter how clear or murky the information is. People would find something to argue about. ;)

And I'm not saying that's a bad thing, I'm just saying they still would argue.
That's why we need pictures first.

Because no matter what is said now, you know that people will cling to anything that supports their beliefs.

First, let people see the games, and then let's talk hardware specifications.

Also, the problem with the Revolution is that Nintendo has always released a console on par with the competition, so this new strategy is not fully understood by some people.
Well, people do understand that the Revolution won't rival the X360/PS3, but they didn't expect a machine only marginally superior to the GC.
 
pc999 said:
20/30 = 0.666... M polys
I meant, where did he get the info from? Or where have you gotten THAT info from? ;)
So it's around 0.66666666666666666666666666666667 M polygons per frame? :p
20 million polygons per second is a big number there, I must say.
 
Vysez said:
And that would clearly exceed 6-12 million polygons per second...

And, yes, I can tell the difference between a game that pushes less than 6 million pps and one that pushes more than 12 million pps.
Now my brain is tied up in a knot. Didn't you say that RE4 didn't push more than 12 million polygons?
If that's true, I believe Capcom could have kept the same amount of polygons for the PS2 version. But I am sure it's much more.
 
ERP said:
It does remind me of the outrage when Gamecube specs were published out of the developer docs by IGN. The number of people who argued that they must be wrong and from early docs was somewhat amusing. Not that I'm saying IGN is accurate or not this time, just that they do have pretty solid sources, although they can get enthusiastic in their interpretation of those sources.

Good of you to finally come in & comment, ERP; I was beginning to think I was going to have to plug in an ERP bat-signal to illuminate the night sky. I also believe that IGN's comments can be interpreted incorrectly, esp. without a thorough technical background & with still no information or developer commentary on Hollywood's capabilities. I've read your opinions before regarding the Revmote as gimmicky; nice to know its precision is quite accurate, though. It will be up to developers & their various implementations to prove the controller's gameplay worth, however, outside of the FPS genre.

pc999 said:
Things like normal maps are nice as an addition if they do not have a (high?) cost (in terms of content, features, price, time till we get the next one...).

Normal maps should indeed be possible on the Rev., esp. since they were actually possible on the GC. Read here: http://www.beyond3d.com/forum/showthread.php?t=18282

Hollywood will be able to perform the equivalent of DX9-level shaders. This should make normal mapping easily achieved, esp. since XBX games like Riddick: EFBB, SC: Chaos Theory (all 3 console versions), Halo 2, & Matrix: Path of Neo (PS2) utilized them.

ninzel said:
My understanding is that the difference between Nintendo's quoted figures and MS's or Sony's, for that matter, is that Nintendo shows realistic in-game figures whereas Sony and MS use raw numbers.

Yes, posting theoretical maximums unattainable in real-world gaming is not something Nintendo does, but MS is simply following the same line of PR Sony established here. We know the systems are indeed very powerful, but no need to insult our intelligence, Kutaragi/Ballmer.

fearsomepirate said:
pc999, what I'm suggesting is that perhaps there are some unique approaches to hardware in the GPU that we just don't know about. Considering it was a 162 MHz chip with a low fillrate and a very simple T&L unit, Flipper could pull off some very nice graphics thanks to unique approaches like the TEV's indirect texturing, the multiple loopbacks, the ridiculously fast (for the time) framebuffer and texture cache, etc.

I think the most we can assume from the ATI article is that it will have a DX9-like feature set. We know little else about what exactly this feature set will be or how it will be implemented. Based on the engineering of DS, GBA, and Gamecube, I think they have maxima on cost, power consumption, and heat output, and they're trying to maximize image quality and graphical fidelity within those constraints.

Which harkens back to my question regarding knowledge of the Revolution's comprehensive system architecture. Without this, I do not think a clear understanding can be attained from "2-3x as powerful" statements. Though I trust ERP, I suspect it will be no different from this generation: different developers will be able to extract different levels of performance once they fully familiarize themselves with the architecture's strengths & weaknesses, depending on their proficiency in exploiting them.
 
Vysez said:
That's why we need pictures first.

Because no matter what is said now, you know that people will cling to anything that supports their beliefs.

First, let people see the games, and then let's talk hardware specifications.

Also, the problem with the Revolution is that Nintendo has always released a console on par with the competition, so this new strategy is not fully understood by some people.
Well, people do understand that the Revolution won't rival the X360/PS3, but they didn't expect a machine only marginally superior to the GC.

Vysez, you're correct. We do need pictures first & foremost, but we cannot call the Revolution only "marginally more powerful" than the GC, esp. when we know there will be both dedicated pixel & vertex shaders in Hollywood, with DX9-level ability. This alone elevates it beyond those statements, imo. No?
 
It could be an issue of efficiency. Peak GFLOPS aren't terribly interesting and are far from the complete picture of a processor's performance. So what if you can get breathtaking performance in a single tight loop?

The thing that makes the Sony/IBM/Toshiba Cell processor so damned awesome is the efficiency, not just the number of SPUs. The SPUs have NO cache; they have 256K of local SRAM that is wholly their own and has its own address space. A Pentium 4 is frighteningly inefficient because it never knows in advance when it has to go out to RAM and when it doesn't, but with Cell you can just prefetch what you need and then do the computation VERY quickly, not caring that system RAM is hundreds of clock cycles away. For FFTs and similar tasks people have managed to get Cell up to ~50 times faster than a P4.
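The prefetch-then-compute pattern described here amounts to double buffering. A purely illustrative Python analogue (function names and the chunked loop are hypothetical stand-ins, not Cell APIs; on real hardware the fetch would be an asynchronous DMA into the SPU's local store, overlapping the compute):

```python
def fetch(data, start, size):
    """Stand-in for a DMA transfer from main RAM into local store."""
    return data[start:start + size]

def process(chunk):
    """Stand-in for SPU computation on a chunk held in local store."""
    return sum(x * x for x in chunk)

def double_buffered_sum_squares(data, chunk=4):
    """Fetch the next chunk before computing on the current one."""
    total = 0
    current = fetch(data, 0, chunk)              # prime the first buffer
    for start in range(chunk, len(data) + chunk, chunk):
        nxt = fetch(data, start, chunk)          # "DMA" the next chunk early
        total += process(current)                # compute on data already local
        current = nxt
    return total

print(double_buffered_sum_squares(list(range(8))))  # 140
```

The point is the ordering: the fetch for chunk N+1 is issued before the work on chunk N, so hardware with asynchronous DMA can hide memory latency behind computation.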
 
Li Mu Bai said:
Normal maps should indeed be possible on the Rev., esp. since they were actually possible on the GC. Read here: http://www.beyond3d.com/forum/showthread.php?t=18282

Hollywood will be able to perform the equivalent of DX9-level shaders. This should make normal mapping easily achieved, esp. since XBX games like Riddick: EFBB, SC: Chaos Theory (all 3 console versions), Halo 2, & Matrix: Path of Neo (PS2) utilized them.

I know; I am just trying to comment on the importance of capabilities/specs/effects until we get enough power. That is why I complained about AI, physics, and shadows, and not normal maps. But thanks for the link, I had already forgotten a few things.

As for high-detail-only effects like normal maps, I think they are nice as long as we do not sacrifice other things (e.g., between 3 high-detail (UE3 level) models/characters and 9 normal-detail (GC level) ones, I would very probably prefer the latter).

Vysez, you're correct. We do need pictures first & foremost, but we cannot call the Revolution only "marginally more powerful" than the GC, esp. when we know there will be both dedicated pixel & vertex shaders in Hollywood, with DX9-level ability. This alone elevates it beyond those statements, imo. No?

Well, for a DX8.1 (?) console like the XB, which did not push 3x more polygons than the GC (probably not even 1.5x), they stated 80 GFLOPS, while the GC is only capable of 10.5 GFLOPS. So that backs you up, and gives even more importance to what ERP said about interpretation, since, given those examples, "2-3x GC" (up to 31.5 GFLOPS) and DX9 aren't consistent.
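Taking the figures quoted in this thread at face value (these are forum numbers, not official specs), the arithmetic works out as follows:

```python
# Back-of-the-envelope check of the GFLOPS figures quoted above.
gc_gflops = 10.5                   # GameCube figure cited in the thread
xbox_gflops = 80.0                 # Xbox PR figure cited in the thread
rev_range = (2 * gc_gflops, 3 * gc_gflops)

print(rev_range)                           # (21.0, 31.5) -- the "2-3x GC" claim
print(round(xbox_gflops / gc_gflops, 1))   # 7.6 -- Xbox's on-paper ratio vs GC
```

So a "2-3x GC" Revolution would still sit well below the Xbox's paper rating, which is pc999's point about how little such numbers say about in-game polygon counts.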
 
pc999 said:
Well, for a DX8.1 (?) console like the XB, which did not push 3x more polygons than the GC (probably not even 1.5x), they stated 80 GFLOPS, while the GC is only capable of 10.5 GFLOPS. So that backs you up, and gives even more importance to what ERP said about interpretation, since, given those examples, "2-3x GC" (up to 31.5 GFLOPS) and DX9 aren't consistent.

The improvements made since SM1.0 are a good case study to consider when thinking about efficiency. Current DX9 cards show polygon improvements by integer factors in effects-heavy situations when switching from SM1.0 to SM3.0 paths in benchmarks like 3DMark. So you could very well have "somewhat better than Xbox" raw specs, but massively better in-game graphics than Xbox could do, just by making all the right architecture/instruction-set improvements.
 
fearsomepirate said:
The improvements made since SM1.0 are a good case study to consider when thinking about efficiency. Current DX9 cards show polygon improvements by integer factors in effects-heavy situations when switching from SM1.0 to SM3.0 paths in benchmarks like 3DMark. So you could very well have "somewhat better than Xbox" raw specs, but massively better in-game graphics than Xbox could do, just by making all the right architecture/instruction-set improvements.


I really don't know; aren't NVIDIA and ATI about to launch some GPUs for phones with a few SM3.0 shaders, or am I wrong?

It all depends on how many shaders there are, how they are clocked, and how fillrate/bandwidth limited they are.

16 SM2.0 shaders outperform 8 SM3.0 shaders at the same clock.
16 SM2.0 shaders @ 400 MHz outperform 16 SM3.0 shaders @ 250 MHz.

So this means... nothing.

Better efficiency with the same shader count and clock is arguable, but it's still unproven in the real world. I just remember the comparison between Far Cry's SM2.0 and SM3.0 paths: there's an improvement of about 5% sometimes, 0% other times. So I think it is a bit early to talk about the Revolution's SM3.0.

We have to know, at least, the clocks, the shader architecture, the bandwidth, and the fillrate.
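The units-times-clock point can be made concrete with a toy model (purely illustrative numbers; it assumes equal per-clock throughput per shader unit, which real SM2.0/SM3.0 hardware does not guarantee):

```python
def relative_throughput(units, clock_mhz):
    """Crude model: shader ops proportional to unit count times clock."""
    return units * clock_mhz

sm2_16_at_400 = relative_throughput(16, 400)  # 6400
sm3_16_at_250 = relative_throughput(16, 250)  # 4000
sm3_8_at_400 = relative_throughput(8, 400)    # 3200
print(sm2_16_at_400 > sm3_16_at_250 > sm3_8_at_400)  # True
```

Under this model the shader-model version is irrelevant; only unit count and clock matter, which is exactly why a bare "SM3.0" label says nothing about performance.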
 
From my point of view, I believe that ATI can give Nintendo a low-power X850 equivalent based on Flipper and adapted to MoSys 1T-SRAM-Q.

1. The NES CPU sucked against the Master System CPU, and the SNES CPU sucked against the Mega Drive CPU; the only exception was the N64, but the Gamecube CPU sucked against the Emotion Engine.

2. Nintendo never created a CPU from scratch for their systems; they always used existing CPUs for the job.

3. The GPU is another matter; Nintendo loves to build great graphics subsystems.

We don't know if Van Hook, Cheng, and the other people are behind the Revolution design, but I highly doubt that an engineer like Tim Van Hook said years ago "oh well, our next design for Nintendo will suck technically," since they are the people behind Flipper and the R300 series.

Am I the only person who thinks that Hollywood is no more than an adapted PC GPU, like Flipper was an adapted "dual" Aladdin7?
 
I don't think anybody is saying that the GPU will suck.

I think that we are trying to understand the "2x Cube" figure that came from devs.

What I strongly believe is that we have a "Nintendo DS" thing here:

a lot less powerful in 3D
a lot less heat and energy needed to power it
an innovative controller

and, this time, a lot less space taken up by the whole machine.

This drives them to:
a) low clocks, low voltages

This makes sense; just look at how small the Revolution chassis is. It's impossible to put a powerful machine in such a small space, so they are going for something different, as I said.
 