PS3 to have 50x graphics power of PS2

Graphics Synthesizer in PS2:

75 million flat shaded triangles/sec draw rate * 50 = 3.75 billion flat shaded triangles/sec

25 million triangles/sec with texture, g-shade, alpha * 50 = 1.25 billion triangles/sec

2.4 billion pixels/sec raw fillrate * 50 = 120 billion pixels/sec raw fillrate

1.2 billion pixels/sec textured fillrate (+bilinear) * 50 = 60 billion pixels/sec textured fillrate

48 GB/sec bandwidth * 50 = 2.4 TB/sec bandwidth


okay say PS2 games are getting 10-15 million polygons/sec * 50 = 500M to 750M polygons/sec in PS3 games ???
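if anyone wants to double-check the multiplication above, here's a quick throwaway script. the PS2 GS peak figures are the published ones; the 50x outputs are pure back-of-the-envelope hypotheticals, NOT actual PS3 specs:

```python
# naive 50x scaling of the published PS2 Graphics Synthesizer peak figures
# (speculation only -- these outputs are NOT actual PS3 specs)
gs_peaks = {
    "flat-shaded triangles/sec":        75e6,
    "textured/lit/alpha triangles/sec": 25e6,
    "raw fillrate (pixels/sec)":        2.4e9,
    "textured fillrate (pixels/sec)":   1.2e9,
    "eDRAM bandwidth (bytes/sec)":      48e9,
}

for name, peak in gs_peaks.items():
    print(f"{name}: {peak:,.0f} * 50 = {peak * 50:,.0f}")
```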


remember that GSCube (the GSCube 16) was pushing around 65 million textured & *fully featured* (?) polygons per sec in the realtime Antz demo (bar fight scene). also, in-house at Criterion, GSCube was pushing around 300M polygons/sec (fully featured?) in tests. the 300M figure is about 1/4 of the GSCube's max theoretical performance of 1.2 billion polys/sec:
http://groups-beta.google.com/group/rec.games.video.sony/msg/036f50116a3ff28a?dmode=source
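as a quick sanity check on that fraction-of-peak figure (assuming the GSCube 16's 1.2 billion polys/sec peak is simply sixteen GS chips at 75M flat-shaded tris/sec each):

```python
# ratio of Criterion's reported in-house GSCube figure to theoretical peak
gscube16_peak = 16 * 75e6   # sixteen GS chips * 75M flat-shaded tris/sec = 1.2B
measured = 300e6            # the ~300M polygons/sec test figure
print(measured / gscube16_peak)  # 0.25 -- i.e. about a quarter of peak
```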



ok back to PS3

tens of billions of pixels/sec have not been ruled out by this announcement. seems like there is room for it. and notice I did *not* say that PS3 GPU *will* push tens of billions of pixels/sec. :)

hundreds of millions of polygons/sec with textures, shaders, features on, seems to be within reach, hopefully in games. that might put PS3 (GPU) on a similar level as Xenon GPU.

and what I'm really hoping for is that PS2 GS's 48 GB/sec eDRAM bandwidth gets a 50x boost too! to over 1 Terabyte/sec 8)

please note that I gave no absolutes. I really have no idea how PS3 will perform. I'm just comparing info from PS2, GSCube and what Nvidia is saying about PS3 GPU. that is it.
 
Megadrive1988 said:
Graphics Synthesizer in PS2:

75 million flat shaded triangles/sec draw rate * 50 = 3.75 billion flat shaded triangles/sec

25 million triangles/sec with texture, g-shade, alpha * 50 = 1.25 billion triangles/sec

2.4 billion pixels/sec raw fillrate * 50 = 120 billion pixels/sec raw fillrate

1.2 billion pixels/sec textured fillrate (+bilinear) * 50 = 60 billion pixels/sec textured fillrate

48 GB/sec bandwidth * 50 = 2.4 TB/sec bandwidth


okay say PS2 games are getting 10-15 million polygons/sec * 50 = 500M to 750M polygons/sec in PS3 games ???


remember that GSCube 16 was pushing under 100M polygons/sec in realtime demo Antz (bar fight scene) and also around 300M polygons/sec in tests.

Alright...so do you guys expect the ps3 to be more powerful than the GSCube?
 
Alright...so do you guys expect the ps3 to be more powerful than the GSCube?

in processing power? yes, it would seem so

in main memory bandwidth and eDRAM bandwidth? most likely, or maybe

in rendering features, pixel shaders, etc? yes

in amount / size of RAM memory? no
 
wazoo said:
Does it mean Pixar has to start working on Toy Story III to have a decent benchmark for the PS3? :rolleyes:

Actually I thought that their "friends" at Disney already did one, or plan to do one themselves, only to cash in on DVD sales.

I was more thinking about a remake of Final Fantasy. God, I loved Akira ;-)

Or am I saying something wrong here? I remember some graphics company tried to prove you could do stuff like in that movie with some graphics card.... *cough* *cough* FX5800 ?
 
I hope it's not 50x in the shading 'power'!! NG graphics needs much more than that..
GS was quite lacking in that department.. ;)
 
Megadrive1988 said:
Alright...so do you guys expect the ps3 to be more powerful than the GSCube?

in processing power? yes

in main memory bandwidth and eDRAM bandwidth? most likely, or maybe

in rendering features, pixel shaders, etc? yes

in amount / size of RAM memory? no

Uh? Do you think Sony would be silly enough to limit the graphics output to SD resolution, or maybe 480P@60Hz? I think there's no way they can get around supporting at least 720P. More and more people are getting LCD TVs, which mostly (at least in the US and Asia, not in Europe) have a "PC input" (= DVI or VGA connector).
 
well IIRC, it has been said that the Xenon/Xbox2 VPU will have '100x' the shading power of the Xbox GPU. I forget where I read that; maybe the leaked Xenon document.
 
Megadrive1988 said:
well IIRC, it has been said that the Xenon/Xbox2 VPU will have '100x' the shading power of the Xbox GPU. I forget where I read that; maybe the leaked Xenon document.

Wasn't it EA that said that? I read the same thing someplace.
 
Well if true, and if we weight the "1000x PS2" claim to be 1/2 graphics and 1/2 CPU, then CELL only needs to be ~510x more powerful than the PS2 EE. :)
 
loekf2 wrote:
Uh ? Do you think Sony would be silly enough to limit the graphics output to SD resolution or maybe 480P@60Hz ? I think there's no way they can get around 720P... at least. More and more people get LCDTVs, which mostly (at least in the US and Asia, not in Europe) have a "PC input" (= DVI or VGA connector)


what?

I do not understand your reply *in context* to what I said. I was comparing GSCube to the clues we have about PS3.
 
rabidrabbit wrote:
How many GS's were on the GS'Cube? less than 50?

london-boy wrote:
The GSCube 16 had... sixteen.
And the GSCube 64 had... err.. sixty-four.


true.

but the GSs in the GSCube 16 and GSCube 64 were the GS I-32 variant. these had 32 MB of eDRAM each, compared to the 4 MB of eDRAM in the PS2's GS.

thus the GSCube 16 had 512 MB (MegaBytes) of eDRAM from its sixteen GS I-32s, on top of its 2048 MB (2 GB) of RDRAM main memory.

the GSCube 64 had (or would have had) 2048 MB (2 GB) of eDRAM from its sixty-four GS I-32s, on top of its 8192 MB (8 GB) of RDRAM main memory.
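the eDRAM totals above are just chip count times the 32 MB per GS I-32 (a trivial sketch, for completeness):

```python
# total eDRAM in the two GSCube configurations (GS I-32 = 32 MB eDRAM per chip)
EDRAM_PER_GS_MB = 32

gscube16_edram_mb = 16 * EDRAM_PER_GS_MB   # 512 MB
gscube64_edram_mb = 64 * EDRAM_PER_GS_MB   # 2048 MB = 2 GB

print(gscube16_edram_mb, gscube64_edram_mb)  # 512 2048
```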


in other words, the GSCube had a fuckload of memory that PS3 cannot possibly hope to match. where PS3 might be superior to the GSCube (from what I gather) is in on-chip memory bandwidth (LS, eDRAM, image cache), sheer fp & ops performance, rendering features, and HOPEFULLY memory latency. among other things.

now, the more knowledgeable programmers and techies might be so kind as to point out where I am wrong.
 
I believe that they are talking about the geometry side, and about the power of the PS2 doing DOT3-quality graphics (if my memory doesn't fail me, the PS2 loses 4 render cycles to do a DOT3 effect in software mode).

250 million polygons per second seems a more realistic number to me.
 
Megadrive1988 said:
Graphics Synthesizer in PS2:

75 million flat shaded triangles/sec draw rate * 50 = 3.75 billion flat shaded triangles/sec

25 million triangles/sec with texture, g-shade, alpha * 50 = 1.25 billion triangles/sec

2.4 billion pixels/sec raw fillrate * 50 = 120 billion pixels/sec raw fillrate

1.2 billion pixels/sec textured fillrate (+bilinear) * 50 = 60 billion pixels/sec textured fillrate

48 GB/sec bandwidth * 50 = 2.4 TB/sec bandwidth


okay say PS2 games are getting 10-15 million polygons/sec * 50 = 500M to 750M polygons/sec in PS3 games ???


remember that GSCube (the GSCube 16) was pushing around 65 million textured & *fully featured* (?) polygons per sec in the realtime Antz demo (bar fight scene). also, in-house at Criterion, GSCube was pushing around 300M polygons/sec (fully featured?) in tests. the 300M figure is about 1/4 of the GSCube's max theoretical performance of 1.2 billion polys/sec:
http://groups-beta.google.com/group/rec.games.video.sony/msg/036f50116a3ff28a?dmode=source



ok back to PS3

tens of billions of pixels/sec have not been ruled out by this announcement. seems like there is room for it. and notice I did *not* say that PS3 GPU *will* push tens of billions of pixels/sec. :)

hundreds of millions of polygons/sec with textures, shaders, features on, seems to be within reach, hopefully in games. that might put PS3 (GPU) on a similar level as Xenon GPU.

and what I'm really hoping for is that PS2 GS's 48 GB/sec eDRAM bandwidth gets a 50x boost too! to over 1 Terabyte/sec 8)

please note that I gave no absolutes. I really have no idea how PS3 will perform. I'm just comparing info from PS2, GSCube and what Nvidia is saying about PS3 GPU. that is it.

It's a really good thing you weren't serious about that, because if you were, you would be deluding yourself. I have pointed out many times before that it does not matter when a console is released; launching later does not automatically make it superior, nor launching earlier inferior. Consoles could release two years, six months, or even a few days apart and still show little difference, or unexpectedly large ones. It all depends on the company. If timing were everything, the PS2 would never have kept up with the others, and Sony's handheld would perform little better than Nintendo's. Although I do hate to be optimistic, it's well known that Sony understands this area a whole lot better than the others do. Combining their knowledge with outside sources could produce much better results than what is currently on the market.
 
While the PS3 chip will incorporate many elements of nVidia's next-generation PC graphics technology, it is being designed as a standalone unit and is not based on existing PC architecture.
 
The R500 is supposed to have a tri/sec rate 10X that of the X800, about 5.2 billion tri/sec. 5.2 billion versus 3.75 billion. And the Nvidia part will come out much later than the ATI part. What's going on with Nvidia?
 
bbot said:
The R500 is supposed to have a tri/sec rate 10X that of the X800, about 5.2 billion tri/sec. 5.2 billion versus 3.75 billion. And the Nvidia part will come out much later than the ATI part. What's going on with Nvidia?

Little is known about that, I'm sure. But we do know that Sony is helping them build it.

Oh, and I wish they would post links for the topic so that others can get a full view of the story. :rolleyes:
 
it has not actually been confirmed by ATI that the R500 for Xenon has 10x the tri/sec performance of the R420/Radeon X800. I highly doubt that. more like 2~3x... that is more reasonable.
 