PS2 EE question

Very impressive indeed, the best particles in any game that gen (and beyond?).
Don't think the Xbox or GC could have pulled that off, not even close :p
 
Anyone got a link to a video showcasing these particle effects being raved about?
It had them in some explosions.
This boss battle also showcased it in some projectile impacts.
Underwater sections also had particles.
Also, the infamous water particle effects showcased in the first trailers made their way into one of the cutscenes near the end of the game. It is demonstrated in the DF MGS2 analysis (16:35 into the video).
I also suspect that the harsh weather effects were a combination of particles and other effects, but someone more knowledgeable can shed light on this.
 
It's really impressive what a good development team can do, no matter the hardware. It seems the Japanese teams are the most creative/talented, and the PS2 architecture seemed most at home there.
How did VU1 compare to, let's say, a vertex shader in the NV2A? I understand from all the reading of old topics that the VUs were more flexible. To what do the VUs compare in today's GPUs like a GTX 1080, stream processors/CUDA cores?
 
VUs were more flexible compared to shader processors in yesteryears' graphics cards. I wouldn't be so sure that is still true today, since VUs could only access their own private memory (which had to hold both program code and the data to be worked on), and those memories were quite small. Modern GPU shader processors are much less restricted in that regard.
 
VU1 was more flexible than a vertex shader of the NV2A, but was it more powerful as well? I don't know the GFLOPS of both, but that wouldn't say much either :p
 
VU1 was more flexible than a vertex shader of the NV2A, but was it more powerful as well? I don't know the GFLOPS of both, but that wouldn't say much either :p

EE's Vector Units were 4-wide Float32, same goes for the vertex units in the NV2A IIRC. VUs have 4x FMACs + 1x FDIV. Not sure about NV2A VS, but they're likely more simple.

Regardless, the faster clock (294 MHz on the EE vs 233 on the NV2A) and VU0 being tied directly to the CPU execution side of things readily make it useful for non-graphics tasks. To properly fight the Xbox, realistically the EE needed another VU. VU0 was already there to "fight" the X-CPU having its own 4-wide SIMD capability, and then you would need VU1 + a "VU2" to match the overall raw throughput of the NV2A for graphics. The basic FPU tied to the MIPS core in the EE was only single-wide 32-bit, so it was no stand-in.
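For a rough sense of scale, a back-of-the-envelope peak-throughput estimate (a sketch only: it assumes one 4-wide FMAC issue per clock, counts a multiply-add as two FLOPs, and ignores the FDIV unit and all real-world stalls):

```python
# Back-of-the-envelope peak-FLOPS estimate for a 4-wide SIMD unit.
# One FMAC (multiply-add) per lane per clock counted as 2 FLOPs.
def simd_gflops(clock_mhz, lanes=4, flops_per_lane_per_clock=2):
    return clock_mhz * 1e6 * lanes * flops_per_lane_per_clock / 1e9

vu_peak = simd_gflops(294)       # one EE vector unit: ~2.35 GFLOPS
nv2a_vs_peak = simd_gflops(233)  # one NV2A vertex unit, same assumptions: ~1.86 GFLOPS
print(vu_peak, nv2a_vs_peak)
```

Under those assumptions a single VU edges out a single NV2A vertex unit on clock alone, which is why the NV2A's second vertex unit (and the idea of a hypothetical "VU2") matters for the raw-throughput comparison.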

Considering Sony was already flying high with the Emotion Engine, I think the fast pace of GPU development and clock speed improvements caught them off guard, or they were too deep into development to make changes. And it still doesn't account for how limited the Graphics Synthesizer was for the kind of future workloads that the Xbox would absolutely shine with. Adding another VU would be sort of moot with GS mucking things up.
 
So had the NV2A had just one vertex shader, would the PS2 have had the edge with its EE/GS over the Xbox, or did the NV2A have more advantages besides vertex shaders?
 
So had the NV2A had just one vertex shader, would the PS2 have had the edge with its EE/GS over the Xbox, or did the NV2A have more advantages besides vertex shaders?

Ironically the first GeForce 3s had a single vertex shader with the same number of pixel shaders, TMUs, and ROPs (4:1:8:4) as the NV2A (4:2:8:4). The NV2A was essentially based on the GeForce 3 with an extra VS, and I assume a very similar feature set.

NV2A's entire architecture was, for the most part, better than the EE + GS for graphics, and subsequently set the stage for every Nvidia GPU up through the 7000 series. Fully programmable vertex and pixel shaders, proper multi-pass texture mapping units and dedicated ROP units gave the Xbox huge real-world advantages in both the kinds of effects possible and the quality at which they could be used. Having a single vertex shader would've created some kinks, but the geometrical advantages of the average Xbox game still outweighed those of the average PS2 title, owing to developers often not using VU0 to any meaningful degree. Imagine some of the better-looking PS2 and Gamecube games, and I think you have a fair representation of what average Xbox games would've looked like on a single-vertex-shader NV2A.

The PS2's primary advantages were raw pixel fill and insane (48 GB/s total!) supporting bandwidth between the GS and its eDRAM. There were some very good instances of PS2 games using it to their advantage, like the smoke and particle effects in MGS2 and Silent Hill 2. The Xbox versions lack some of these effects. IIRC some of the heat blur effects seen in GT4 were difficult to implement at the same quality on the PS3 when scaling up to the higher resolutions seen on that system. The PS3 didn't have the per-pixel bandwidth enjoyed by the PS2.

As an avid PS2 gamer back then, I got to see a lot of the best looking games in action, but I never really understood how amazing they were. I never had an Xbox or Gamecube until 2005, so it was hard to draw real comparisons aside from what I saw on TechTV and G4 back in the day.
 
So had the NV2A had just one vertex shader, would the PS2 have had the edge with its EE/GS over the Xbox, or did the NV2A have more advantages besides vertex shaders?

Yep, the NV2A could set up something like 116 million tris per second (two cycles per triangle), so it had a much higher peak rate. It could also apply - I think it was four - textures per pass per triangle, as opposed to the one of the GS. So multi-texturing (e.g. base layer, specular, detail texture, bump over the same geometry) didn't incur as much of a hit on the graphics pipeline, as you didn't have to resubmit geometry. Multi-texturing in the shader portion of the pipeline as opposed to in graphics memory would have saved an awful lot of bandwidth... but then again, the PS2 had so much of the stuff that it wouldn't have been a problem.
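The setup-rate and multi-texturing arithmetic can be sketched like this (note the four-textures-per-pass figure is a hedged recollection, not a confirmed spec):

```python
import math

# Peak triangle setup: NV2A at 233 MHz, two cycles per triangle.
setup_rate = 233e6 / 2          # ~116.5 million triangles/s

# Passes (and hence geometry submissions) needed for a multi-layer surface.
def passes_needed(texture_layers, textures_per_pass):
    return math.ceil(texture_layers / textures_per_pass)

# 4 layers: base + specular + detail + bump over the same geometry.
gs_passes = passes_needed(4, 1)    # GS: 1 texture/pass -> 4 submissions
nv2a_passes = passes_needed(4, 4)  # NV2A (assumed 4/pass) -> 1 submission
print(setup_rate, gs_passes, nv2a_passes)
```

So the same 4-layer surface costs the GS four times the geometry traffic, which is the bandwidth/pipeline hit being described.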
 
OK that clears up a lot. Wasn't VU0 best used as a 'co-processor' to help out the main core, and to compete better with the competition (the GC and Xbox CPUs)? As I understand / have read in the many old topics here on B3D, VU0 wasn't that effective when used as a 'vertex shader' or for graphics processing.
If I get it right, the best way to compare the PS2 to the Xbox is CPU: EE + VU0 and GPU: GS + VU1, while for the Xbox it's just the P3 733 + NV2A.

Imagine the best-looking PS2 and Gamecube games, and I think you have a fair representation of what Xbox games would've looked like on a single-vertex-shader NV2A.

With that, do you mean the PS2 would be just as powerful if the Xbox had gone with just a GF3 (just one vertex shader)? Quite a feat for hardware finished in '98/'99.
 
With that, do you mean the PS2 would be just as powerful if the Xbox had gone with just a GF3 (just one vertex shader)? Quite a feat for hardware finished in '98/'99.

I think he's suggesting in terms of geometry only..?

Even with only a single vertex shader, the Xbox would have had some advantage with textured, lit geometry. For much of the time, even in Xbox exclusives with very high polygon workloads, the bottlenecks would have been in places other than the vertex shaders (e.g. ROPs, pixel shaders [or fancy texture combiners, as I think they were considered then], CPU).

NV20 / NV2A / NV25 were amazing graphics processors that were a big part of redefining what constituted the consumer graphics market. EE & GS were impressive but an evolutionary dead end.

It's a testament to Sony's abilities in the console market that the Xbox and DC failed while their graphics technologies went on to define PC and mobile graphics: despite the EE & GS being a technological dead end, Sony absolutely owned that generation of the console market in a way rarely seen before and never seen since.
 
The PS2's primary advantages were raw pixel fill and insane (48 GB/s total!) supporting bandwidth between the GS and eDRAM...It didn't have the per pixel bandwidth enjoyed by the PS2.
And for giggles, the numbers for that BW for relative machines.

PS2 - 640 x 480 = 307,200 pixels @ 48 GB/s
1080p = 6.75x pixels would require 324 GB/s
4K = 4x 1080p pixels would require >1.25 TB/s

PS2's bandwidth would be akin to PS4Pro having a terabyte/s BW. Of course, Sony considered that option in selecting a PS4 solution...
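For giggles again, the same GB/s-per-pixel scaling in code form (a sketch of the arithmetic above, nothing more):

```python
# Bandwidth needed at higher resolutions to match the PS2's
# GB/s-per-pixel ratio between the GS and its eDRAM.
PS2_BW_GBPS = 48          # aggregate GS <-> eDRAM bandwidth
PS2_PIXELS = 640 * 480    # 307,200 pixels

def matching_bw_gbps(width, height):
    return PS2_BW_GBPS * (width * height) / PS2_PIXELS

print(matching_bw_gbps(1920, 1080))  # 324.0 GB/s at 1080p
print(matching_bw_gbps(3840, 2160))  # 1296.0 GB/s (~1.3 TB/s) at 4K
```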
 
I think he's suggesting in terms of geometry only..?

I don't know. But this is what I can read:

Having a single vertex shader would've created some kinks, but the geometrical advantages of the average Xbox game still outweighed those of the average PS2 title, owing to developers often not using VU0 to any meaningful degree. Imagine some of the better-looking PS2 and Gamecube games, and I think you have a fair representation of what average Xbox games would've looked like on a single-vertex-shader NV2A.

Reads like he means geometry being the only Xbox advantage with only one vertex shader, and equal performance for high-end games on both platforms. That's what surprised me, as the PS2's hardware is quite a bit older, in a time when tech was moving fast. Meaning that VU1 must be one hell of a powerful unit, as VU1 + GS would equal or better the NV2A, which was a late 2001 product. VU0 + EE being the CPU for the most part.

@Shifty Geezer
Yes, that's crazy if you think of it. Didn't the PS2 need that fillrate, though, to achieve effects etc.?
I liked the exotic hardware, and somewhere it would be nice to see it on the PS4, but I understand it's much better for you developers the way it went, having impressive graphics from day one. The PS4 has some very impressive titles, I'm not complaining :p

Yes, the PS2 was the most popular. I got one before the other two because I had a PSX, and it was nice to be able to play PSX games and use the controllers etc. Many came from the PSX and therefore went with the PS2; Sony/PlayStation was well known. I had never even heard of the Dreamcast (Europe) back then, it was just PS2 and XB/GC hype all over the place.
 
Reads like he means geometry being the only Xbox advantage with only one vertex shader, and equal performance for high-end games on both platforms. That's what surprised me, as the PS2's hardware is quite a bit older, in a time when tech was moving fast. Meaning that VU1 must be one hell of a powerful unit, as VU1 + GS would equal or better the NV2A, which was a late 2001 product. VU0 + EE being the CPU for the most part.

But that's not true, so I would suggest that you are misinterpreting his comments.

Even with only one vertex shader a la GeForce 3, the NV20 is still massively beyond what the GS and EE can deliver in terms of visuals. It still has the same pixel shading / texture combining / compressed texture / MSAA capabilities.
 
Ok yeah, I must be reading it wrong, but it still means the PS2 was quite nice for its time, since it launched spring 2000 in Japan. One can wonder whether it was more expensive for Sony to develop the PS2 than it was for MS to develop its Xbox, but whether the PS2 in the long run was the better deal because it was mostly in-house? (Not taking into account that the PS2 was the better success.)
Some sources say multiple billions went into the first Xbox.
 
I don't think Microsoft was ever concerned about making money with the Xbox. I don't think they wanted to lose as much as they did, but they certainly didn't seem too concerned about it.

Regarding the second geometry engine in the NV2A, I think it was there because Sony had been boasting polygon numbers since the Dreamcast launched, and it was a way of future-proofing the console a bit. I'm sure it helped, of course, but I struggle to think of a game from that era that ran poorly on my PC at the time, which featured a 1 GHz Athlon paired with a GF3, and later a Radeon 8500. Except maybe Halo. I remember Halo being a bit of a performance disappointment at the time on the GF3, which had moved on to my wife's rig by the time it released on PC. Even then, it ran well enough at 480p, so I think it was either fill, shader or bandwidth limited and not geometry limited.
 
Now that you say that, I had a PC with a Ti4200 and one with a GF3 Ti200. While there was a performance difference between the two PCs, it wasn't much, though of course I didn't really dive into it, as I wasn't into hardware like I am now (I was quite young then, it's 17 years ago :p). But I remember both being able to play games on the highest settings without problems, with the Ti4200 PC giving more frames. I don't remember the CPUs of both, but one was an XP2500 and the other an XP2100, I believe. Both could do Doom 3 better than the Xbox for sure, same for Far Cry; I had to lower settings, but it looked better than what FC Instincts does. Unsure if I tried HL2 on one of those PCs, I think I had a 9600XT by then, which came with HL2 for free. Good times.
 
Ok yeah, I must be reading it wrong, but it still means the PS2 was quite nice for its time, since it launched spring 2000 in Japan. One can wonder whether it was more expensive for Sony to develop the PS2 than it was for MS to develop its Xbox, but whether the PS2 in the long run was the better deal because it was mostly in-house? (Not taking into account that the PS2 was the better success.)
Some sources say multiple billions went into the first Xbox.

Microsoft's Xbox division didn't post a profit until, I think, 3 years into the Xbox 360's lifecycle, aka they didn't make a dime on the original Xbox. Talk about a long-term loss-leading strategy. But I know it had a lot to do with the supply politics with Intel and Nvidia. There's a reason why the Xbox lasted only into late 2006 and MS pretty much cut the cord, aside from a few random games and still supporting Xbox Live.
 
PS2's bandwidth would be akin to PS4Pro having a terabyte/s BW. Of course, Sony considered that option in selecting a PS4 solution...
Then again, a modern GPU like the one found in the PS4 Pro has multiple levels of on-chip storage/caches, with multiple read/write ports (1 kbit-ish typically? maybe wider, I'm not a hardware engineer), so thoroughly vast quantities of bandwidth in sum total. Many, many TB/s all told, no doubt. I've never seen anyone try to sum up a modern GPU's full, total internal bandwidth; it would be very interesting no doubt. :devilish:

I struggle to think of a game from that era that ran poorly on my PC at the time, which featured a 1GHz Athlon paired with a GF3, and later a Radeon 8500.
DOOM 3 ran poorly on the GF3, as it couldn't do the fancy per-pixel lighting in one pass. It also lacked the bandwidth and fillrate for the heavy stencil shadow rendering system.

And World of Warcraft chugged quite badly on GF3 as well, due to lots of overdraw, even when the game had far less view distance than it does today. There will undoubtedly be other games that ran poorly on it as well, especially towards the end of its lifespan... The original Far Cry perhaps? I don't recall when that one came out.
 