Render resolutions of 6th gen consoles - PS2 couldn't render more than 224 lines? *spawn

Best way to think of it is Spiderman on PC.

Remember when the game came out and running textures on Very High on an 8GB GPU would cause the game to use main RAM?

Which slowed the game to a crawl because the PCIe transfer rate was slow, and the GPU had to wait while the required data was swapped in and out of VRAM?

That's essentially what happens on PS2: the game is constantly swapping texture data in and out of VRAM, but it's done by design.

And the connections between the chips are fast enough not to cause the system to slow to a crawl.
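To put rough numbers on that, here is a minimal back-of-the-envelope sketch. The spill size and the bus rates are assumptions (theoretical peaks, not measurements); it just shows why even a modest amount of texture data that doesn't fit in VRAM blows past a 60fps frame budget over PCIe, while a much faster link would make the same spill nearly free.

```python
# Illustrative only: made-up spill size, theoretical peak bus rates.
def transfer_ms(megabytes: float, bus_gb_per_s: float) -> float:
    """Milliseconds to move `megabytes` over a link with `bus_gb_per_s` GB/s of bandwidth."""
    return megabytes / (bus_gb_per_s * 1024) * 1000

frame_budget_ms = 1000 / 60      # 60 fps target
spilled_mb = 512                 # hypothetical textures that didn't fit in VRAM

# PCIe 3.0 x16 peaks around 16 GB/s; sustained rates are lower still.
print(f"PCIe 3.0 x16: {transfer_ms(spilled_mb, 16):.1f} ms vs a {frame_budget_ms:.1f} ms frame budget")
# The same spill over a link an order of magnitude faster barely registers.
print(f"Hypothetical 500 GB/s link: {transfer_ms(spilled_mb, 500):.2f} ms")
```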
 
But that's the thing. For the purposes of illustrating what I'm saying, based on the dev's comment on SOCOM, it's the equivalent of having 10GB of texture data just for Spiderman while running the game on an 8GB VRAM GPU. Spiderman is constantly displayed on screen. It's not like traversing environments, where you would stream environment and other object data in and out as you move through them. Even worse, there is no space to stream in anything else if Spiderman's 10GB of textures has to live on an 8GB VRAM GPU and is constantly required on screen.
 
But that's the thing. For the purposes of illustrating what I'm saying, based on the dev's comment on SOCOM, it's the equivalent of having 10GB of texture data just for Spiderman while running the game on an 8GB VRAM GPU.

That's exactly how it is on PS2.

And like PS2, that 10GB naturally doesn't fit into 8GB of VRAM.

Which means that it would need to sit in main RAM on PC and be streamed over the PCIe bus as and when required.

But that's where PC and PS2 differ.

PS2 has the raw bandwidth on its interconnects to stream a shitload of textures from main RAM into EDRAM without slowing the system down.

The same is not true for PC and its slow-ass PCIe connection. If you suddenly gave the PCIe slot 500GB/s of bandwidth (and had system RAM with high bandwidth), you could also stream from system RAM on PC without slowing the system down, essentially giving you unlimited VRAM.

So with SOCOM, they would send 1MB of character texture data to EDRAM, work on it and write out the results to buffers, then do this over and over and over again... tens if not hundreds of times per frame.

PS2 could and did stream multiple MBs of texture data in and out of main RAM per frame.

PS2 was designed to function this way and had a stupid fast (and complex) memory system, and it was the system interconnect (specifically the GIF interconnect) that was the bottleneck for textures on PS2 and not the physical VRAM it had.
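To put that interconnect-versus-memory-size argument in perspective, here is a minimal sketch using the commonly quoted theoretical peaks (the exact figures are assumptions, and real sustained rates are lower on both sides): the GIF can in theory refill the entire 4MB of EDRAM many times per frame, while PCIe can only refill a tiny fraction of an 8GB card in the same time.

```python
# Assumed, commonly quoted peak figures; real sustained rates are lower on both sides.
def refills_per_frame(link_gb_s: float, video_mem_mb: float, fps: int = 30) -> float:
    """How many times per frame the link could, in theory, refill the whole video memory."""
    per_frame_mb = link_gb_s * 1024 / fps
    return per_frame_mb / video_mem_mb

# PS2: the GIF path is 64-bit at 150MHz, roughly 1.2 GB/s, feeding 4MB of EDRAM.
print(f"PS2, GIF -> EDRAM:     {refills_per_frame(1.2, 4):.1f} full refills per frame")
# PC of that era: ~16 GB/s of PCIe 3.0 x16 feeding an 8GB card.
print(f"PC, PCIe -> 8GB VRAM:  {refills_per_frame(16, 8 * 1024):.3f} full refills per frame")
```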
 
Doesn't PS2 have unified memory? The EDRAM is supposed to be for render targets (though if I understand things correctly, you can store textures there if you want), but GS can just access system memory for textures and other assets. If PS2 was limited to its 4MB of EDRAM, it would never have been able to compete with Xbox because asset quality would have been so low.
 
If one object though is constantly visible and has more than 4MB which is the size of VRAM,
It wouldn't though. 4MB is an obscene amount of data for a PS2 asset! 4bpp on a 256x256 texture would be 32KB.
I don't get how that works. It is loaded and displayed constantly.
All the assets on screen are made out of multiple KBs of data. You load in what you need to draw the object, then flush that out and load in assets for the next object.
but GS can just access system memory for textures and other assets.
No. GS works in EDRAM and doesn't see the rest of the system. The GIF arbitrates transfers of data to GS.
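For a sense of scale on those per-asset sizes, a small sketch: the dimensions below are just illustrative examples of the common 4-bit and 8-bit palettised formats, and the small palette itself is ignored.

```python
# Size of an indexed-colour texture: width * height * bits-per-pixel, ignoring the CLUT.
def texture_kb(width: int, height: int, bits_per_pixel: int) -> float:
    return width * height * bits_per_pixel / 8 / 1024

for w, h, bpp in [(128, 128, 4), (256, 256, 4), (256, 256, 8), (512, 512, 8)]:
    print(f"{w}x{h} @ {bpp}bpp: {texture_kb(w, h, bpp):.0f} KB")
# 256x256 @ 4bpp is the 32KB figure above: dozens of such textures can cycle
# through whatever slice of the 4MB EDRAM is left after the frame buffers.
```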
32MB unified RAM + 4MB Infinity Cache (Or L2)

Indeed, but without direct memory access. The GS can't try to use a texture and, if it's not in EDRAM, fetch it. The VRAM needs to be loaded with whatever the GS wants. There's zero automatic caching and it's entirely up to the devs to balance the GIF and keep the workloads optimised and the data present.
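A toy model of what "zero automatic caching" means in practice. This is not the real PS2 API: the names, the eviction policy and the 2MB texture budget are all invented for illustration. It just shows that the renderer itself has to track what is resident and explicitly upload anything that isn't before drawing with it.

```python
# Toy model only: invented names and policy, not real PS2 code.
EDRAM_TEXTURE_BUDGET_KB = 2048        # assumed slice of the 4MB left over for textures

resident: dict[str, int] = {}         # texture name -> size (KB) currently in EDRAM
uploaded_this_frame_kb = 0

def draw_with_texture(name: str, size_kb: int) -> None:
    """Upload the texture over the GIF if it isn't already resident, then draw."""
    global uploaded_this_frame_kb
    if name not in resident:
        # Evict the oldest entries until the new texture fits (simplest possible policy).
        while resident and sum(resident.values()) + size_kb > EDRAM_TEXTURE_BUDGET_KB:
            resident.pop(next(iter(resident)))
        resident[name] = size_kb      # stand-in for the actual DMA transfer
        uploaded_this_frame_kb += size_kb
    # ...kick off the real draw here...

for tex, kb in [("character", 256), ("weapon", 64), ("ground", 128), ("character", 256)]:
    draw_with_texture(tex, kb)
print(f"uploaded this frame: {uploaded_this_frame_kb} KB")   # 448: the repeated texture was free
```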
 
If I remember correctly, towards the end of PS2's life developers were sending as much data as possible via Path 3, which in turn freed up bandwidth on GIF for more textures.
 
@Shifty Geezer @see colon @davis.anthony

So if I got this correctly, the bandwidth was so fast that, for example in a 30fps game, the textures could be swapped in VRAM significantly faster than 1/30 of a second? So let's say a game was running at 30fps, the scene has 8MB worth of textures and, for simplicity's sake, 4MB of VRAM was free for textures: the texture swap would take 1/60 of a second, thus swapping 4MB of textures twice every frame and constructing an image with 8MB of textures per frame?
 
I don't remember the numbers, but yeah, you could indeed upload each texture before rendering it. Though obviously better sorting strategies will speed things up.
 
Yes @Nesh that's pretty much it.

PS2's texturing ability was ultimately decided by how much of the 32MB RAM you wanted to dedicate to textures, and the available GIF bandwidth you had to send them to the GS.
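Plugging Nesh's example into the commonly quoted ~1.2 GB/s GIF peak (an assumed figure, and a theoretical maximum) suggests that 8MB of textures per 30fps frame is nowhere near saturating the link:

```python
GIF_PEAK_MB_S = 1.2 * 1024     # assumed ~1.2 GB/s theoretical peak for the EE->GS path

scene_textures_mb = 8
fps = 30
needed_mb_s = scene_textures_mb * fps
print(f"needed: {needed_mb_s} MB/s, about {needed_mb_s / GIF_PEAK_MB_S:.0%} of the GIF's peak")
# ~240 MB/s, roughly a fifth of peak: re-uploading the texture area twice per
# frame is plausible bandwidth-wise, provided the uploads are scheduled well.
```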
 
There are various aspects of the PS2 design that meant they were never bottlenecked. RAM BW was basically one of them, as was overdraw. Good optimisation meant keeping these as active as possible.

The RAM situation mirrors GC with its 1T-SRAM. The 3MB VRAM wasn't a problem as it could be populated Just In Time with necessary data from RAM.
 
It sounds like a genius way to keep costs down while keeping performance very high. I wish Sony had used something similar with PS3 so they would have been less bottlenecked with memory, but they totally went the opposite route, and it showed with games that relied heavily on transparencies. It is ironic how Sony went from setting an example in price-to-performance and the ability to scale with PS1 and PS2 to the PS3 costing around $900 per unit to produce while its performance was barely any better than the 360's. They focused on too many goals and lost track of their targets with Cell's design, the choice of GPU, BR availability and costs, etc. In an alternate universe, the PS3 could have had a better-designed Cell that allowed easier optimization and usage, and the GPU could have been properly settled with NVIDIA or AMD early enough for customization choices, borrowing from what they had learned with PS1 and PS2, for ease of development and fewer bottlenecks. The GSCube was supposed to define the roadmap for the PS3, with huge amounts of bandwidth, but it totally underestimated the readiness and evolution of the GPU market, which set the base standards for real-time graphics.

The PS2 was indeed a very, very interesting and genius design considering its unorthodox solutions and fully unique GPU design. They were re-inventing or setting their own proprietary standards in rendering real-time 3D while the other 3 went directly to the GPU manufacturers.

edit: what else in the PS2 design reduced bottlenecks?
 
As fun and interesting as PS2's architecture is, it was the best of a dead end.

The brute force approach it used just wasn't something you could realistically go with for future designs.
 
In a broad sense, I agree, but I do think that an embedded device with a fixed resolution, like a gaming handheld, could benefit from many of the design choices that PS2 had. Having enough EDRAM with enough bandwidth to hit whatever your target resolution and framerate are would help a bunch in that case.
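A quick sanity check of that idea with illustrative numbers (double-buffered 32-bit colour plus a 32-bit Z buffer; plenty of real PS2 games used 16-bit buffers or field rendering to squeeze into less):

```python
# Buffer footprint at a fixed resolution: double-buffered colour + Z, 4 bytes per pixel each.
def buffers_mb(width: int, height: int, bytes_per_pixel: int = 4, num_buffers: int = 3) -> float:
    return width * height * bytes_per_pixel * num_buffers / (1024 * 1024)

print(f"640x448 (common PS2 mode): {buffers_mb(640, 448):.2f} MB of the 4 MB EDRAM")
print(f"480x272 (handheld-style):  {buffers_mb(480, 272):.2f} MB")
```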
 
It sounds like a genius way to keep costs down while keeping performance very high.
I don't think it was cost effective. RDRAM was expensive, as was the EDRAM.
The PS2 was indeed a very, very interesting and genius design considering its unorthodox solutions and fully unique GPU design.
I dunno that I would call it genius, but it did do what it did very well and I'm glad it existed as hardware. I think all that gen had clear strengths and weaknesses and a great spectrum of different approaches. PS2 definitely suffered in IQ due to jaggies and shimmer. Oooh, the shimmer!!
edit: what else in the PS2 design reduced bottlenecks?
Just start googling everything Fafalada posted on Beyond3D back around 2002; you invariably get conversations between him and all the other devs that touch on this and that. Someone would raise a point that PS2 lacked something, such as back-face culling, and they'd explain that it didn't make any difference because the hardware just worked around it.
 
And to think Sony almost did continue with that approach. I'm not totally sure how Sony would've coped with a closely coupled pure 3D accelerator (not a GPU) attached to Cell.

As for the PS2, the GS was definitely highly limited... but it could do some crazy stuff in the right hands, just like the Emotion Engine. I could swear MGS3 uses normal maps on characters, even though I know it doesn't. We all know about Zone of the Enders 2, and I think Final Fantasy X's crazy amount of transparency effects (think the final boss battle) would be very hard on the Xbox and probably the GameCube too. It also goes to show how important developer support was in allowing devs to actually achieve their vision... you can see it in early PS2 vs late PS2 games. Massive difference.

All in all, yeah, it's easy to see the PS2's deficiencies, but how would y'all reasonably change things up with the same transistor budget?
 
It is ironic how Sony went from setting an example in price-to-performance and the ability to scale with PS1 and PS2 to the PS3 costing around $900 per unit to produce while its performance was barely any better than the 360's.
Don't forget one thing: the cost of one PS3 unit was so high not only because of the chips. BD drive, built-in PS2, Wi-Fi, HDMI, a card reader for 3 different card formats, touch-sensitive buttons, Super Audio CD support, Bluetooth: the Xbox 360 had none of these. Add a more advanced and therefore more expensive cooling system, a more expensive console case, a controller with motion sensors and a built-in battery, and a built-in AC power supply, which I think was also more expensive. :)
The PS2 was indeed a very, very interesting and genius design considering its unorthodox solutions and fully unique GPU design. They were re-inventing or setting their own proprietary standards in rendering real-time 3D while the other 3 went directly to the GPU manufacturers.
IMO PS2 was one of the best designs of all time. If we look at the last games from 2008-2010, they were beyond anything anyone would have expected in 2000-2001, when the console was new.
 
Don't forget one thing: the cost of one PS3 unit was so high not only because of the chips. BD drive, built-in PS2, Wi-Fi, HDMI, a card reader for 3 different card formats, touch-sensitive buttons, Super Audio CD support, Bluetooth: the Xbox 360 had none of these.
Don't forget the standard hard drive!
 