Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

Certainly nothing could rival Crysis' vast, super-dense environment and physics-driven world. That game was overkill for years to come. It was a title that belonged to a generation ahead. But I checked the other two games you mentioned and I can't quite grasp what is remarkable about them.
Certainly the Uncharted series and God of War punched above what was expected from the PS3 hardware and there was nothing comparable on PC, even though PCs were certainly more powerful.
I couldn't grasp how those studios pulled off those visuals.
Both were RTS games with high levels of destruction, created all over the map by up to 16 players simultaneously. That, combined with the particle effects found in World in Conflict, would have crippled either console.
 
The real problem is that outside of AMD's APUs, if you're trying to hit a certain price point, there really are very limited alternative options. Intel + Nvidia? ARM and Nvidia? Would either have produced an overall better package technically at the same price point? ¯\_(ツ)_/¯
Even simpler interconnects like Nvidia's NVLink-C2C allow for a 450GB/s+ transfer rate in one direction. There is no reason to stay with one company.
 
The real problem is that outside of AMD's APUs..

An ARM design would've given them more flexibility to mix and match IP and to control more of it in-house. Going x86 basically limited the options to AMD, and if they want backwards compatibility, sticking with x86 is going to be an anchor going forward, which in turn means the GPU IP effectively has to be AMD as well.

Something I've always suspected is that if the ARMv8/Cortex-A57 timing had been slightly further ahead for the PS4 generation, both consoles would likely have gone ARM instead of x86.
 
An ARM design would..
You've skipped over the price point, which is, I think, the factor that most swayed both Microsoft and Sony to go with AMD again. Discrete CPUs and GPUs add complexity, which in turn adds quite a lot of cost. Another AMD design, being an evolution of the previous one, also made backwards compatibility easier, and I'm sure that was a factor too.

Consoles need to be cost-effective, in terms of either making a profit or taking as small a loss for as short a time as possible. When consoles aren't cheap... well, just look at the Neo Geo, 3DO and PlayStation 3.
 
You've skipped over the price point..

Yup, I'm doubtful NV would have been willing to create anything remotely similar in performance to what's in the PS5/XBS consoles at the price Sony and MS needed for machines targeting a 500 USD price point.

Regards,
SB
 
One more time: the 8800 GTX is 345.6 GFLOPS, and I think the most powerful PC CPU in 2006 was not more than 50 GFLOPS, so ~390 GFLOPS total. PS3: Cell 200 + RSX 192 = 392. That is the same or near. Just raw performance. And the X1 and PS4 were 2-3 times behind a top PC at the time they released.

Please don't use GFLOPS as a direct measure of performance like that. Particularly not when adding CPU and GPU GFLOPS together, and even more particularly not when comparing different architectures, one of which is significantly more advanced than the other.

The G80 (the 8800 GTX) could issue 128 FMADs (256 FLOPs) per clock as well as 128 MULs per clock under the right circumstances. That's 384 FLOPs per clock at 1.35GHz = 518.4 GFLOPS (the arithmetic is sketched at the end of this post). The MUL wasn't always usable in that way, which is why you sometimes see the lower number that counts only the FMADs. But regardless, counting GFLOPS like that is pointless when you can simply look at the benchmarks in the PC space. This is how an 8800 GTX typically performed compared to the GPU in the PS3:

[Benchmark chart attachment]


See that 7900 GTX right at the bottom? Yeah, that thing is way faster than the RSX GPU in the PS3. Yes, the Cell could be used to prop up RSX performance somewhat - particularly its lack of vertex shader throughput - but you're not talking anything near the 3x increase it would need to be competitive with the 8800 GTX. If it could do that, Sony would never have swapped out the second Cell for RSX in the first place.

Regarding the CPU comparison, CPU performance isn't even measured in GFLOPS. It's measured in actual performance on real-world, branchy, memory-sensitive, and often single-threaded code. The only part of Cell that could deal with that kind of code (its central PPE) was laughably slow at it. Like, probably an order of magnitude slower than a decent PC single-core CPU of the time. And quad-core CPUs landed in the PC space around the same time as the PS4 launched.
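To make the FLOPs counting above concrete, here's the arithmetic as a quick sketch (Python purely as a calculator; a MAD counts as two FLOPs, a MUL as one):

```python
# Theoretical peak GFLOPS = FLOPs issued per clock x clock speed in GHz.
def peak_gflops(flops_per_clock: int, clock_ghz: float) -> float:
    return flops_per_clock * clock_ghz

# G80 (8800 GTX): 128 shader units at 1.35GHz, each issuing one MAD
# (2 FLOPs) plus, under the right circumstances, a co-issued MUL (1 FLOP).
print(f"{peak_gflops(128 * 2, 1.35):.1f}")  # 345.6 - MADs only
print(f"{peak_gflops(128 * 3, 1.35):.1f}")  # 518.4 - MAD + MUL
```

Both figures quoted in this thread fall out of the same formula; the disagreement is only over whether the co-issued MUL counts.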

Crysis 2, Crysis 3, Metro Last Light, Alien Isolation. And those are just a few. All those games were on 7th-gen consoles, and I can't even find a test on a 7800, but I won't be surprised if the performance wasn't great.
Crysis 3 won't work at all, because that game doesn't support DirectX 10 and below. But that's a different story. :) Metro Last Light was great. When I played it on Xbox 360, there were a lot of amazing effects, high polygon counts and very high texture resolution. For some moments I thought maybe I was playing the game on a high-end PC. Joking. :D I knew that wasn't true, but that game looked amazing. Also, Alien Isolation was great in terms of polygon count.

You can't use an assumption that performance wouldn't have been comparable to the consoles as evidence that it wasn't comparable. I agree that any game requiring DX10 baseline support, or heavily optimised for a unified shader design, obviously wouldn't run well, or at all, on any DX9 GPU, but the games you're quoting above launched in 2013/2014 lol. That's 8-9 years after the 7800 GTX 512 launched and 7-8 years after DX10 and unified shaders arrived in the PC space. Of course games in the PC space wouldn't be targeting such an ancient architecture by that time.

My argument above was very specifically about the DX9 era of games that would have been optimised to run well on the older split-shader GPUs within, say, the first 2-4 years of the card's life. No one in the PC space was worrying about whether a 7800 GTX 512 would run Crysis 3 as well as an Xbox 360 when you could have bought a GTX 670 the year prior, which would have been significantly more powerful than the Xbox One, let alone the Xbox 360. I'm saying the 7800 GTX 512MB was comparable to the 360 during the period in which it was relevant to PC developers. Had PC hardware never evolved beyond that GPU, and had developers continued to target and optimise for it, I'm sure it could have held up fairly well for the life of the console, but that's not the reality of PC hardware or software development.

As for the 8800, true, but that launched a year later than the Xbox 360, so it's not a very fair comparison. That is only my opinion.

It was launched before the PS3 so it's a pretty fair comparison to that console.

Please, if you can, give a source for that info.

Have a look at the GPU notes here:


And it's sad that PC power, and the power of the GPUs released years later, didn't show something proportionally better than the 7th-gen consoles. Like, GPUs were 3-7 times more powerful, but except for resolution and fps they didn't show much.

By whose measure? Please be more specific. Some would argue that Crysis absolutely showed that 2x power differential in the first year or so. And I think many console gamers today would argue that being able to play games at a solid 60fps/1440p vs a shaky 30fps/720p is a great use of those multiples of GPU power. Would you be happy with that performance from the gen-9 consoles in exchange for better core graphics? I doubt many would.

But how many years are you talking about? If you're going up to GPUs that are 7x more powerful than the 360, then you're probably looking at something around 7790-level performance, which is getting pretty damn close to the XBO GPU and hence capable of playing most if not all gen-8 titles at lower resolutions. I'd say RDR2, for example, running on a GPU of that calibre at, say, 720p/30 looks vastly better than anything that released on the gen-7 consoles.

I still haven't found info on how many GFLOPS the Xbox 360 CPU had.

Just under 77 GFLOPS. But that's entirely irrelevant. For reference, the PS4 CPU was about the same, I believe. It means next to nothing in a CPU.
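For reference, the usual derivation of that ~77 GFLOPS figure, as a sketch (it assumes each Xenon core's VMX128 unit sustains a 4-wide single-precision FMA, i.e. 8 FLOPs per cycle):

```python
# Xbox 360 "Xenon": 3 PowerPC cores at 3.2GHz. Assuming each core's
# VMX128 unit sustains a 4-wide vector FMA per cycle
# (4 muls + 4 adds = 8 FLOPs/cycle):
cores, clock_ghz, flops_per_cycle = 3, 3.2, 8
print(f"{cores * clock_ghz * flops_per_cycle:.1f} GFLOPS")  # 76.8 - "just under 77"
```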
 
One more time: the 8800 GTX is 345.6 GFLOPS..

The problem with your comparison is that the SPUs in Cell were used for GPU tasks, whereas a PC CPU was purely used for CPU work.

And factoring that in, PC CPUs slaughtered Cell for that kind of general CPU work.

PC GPUs were more than powerful enough not to need the CPU to help out.

And hasn't PS5 vs XSX taught you anything about comparing flops?

Crysis 2, Crysis 3, Metro Last Light, Alien Isolation..

Those games didn't work on PS3 either, unless you consider 10-15fps 'working'.
 
TL;DR on the Cell processor, simplified

Extremely good at massively parallel tasks. When you could properly leverage the parallel nature of the Cell's SPEs, you could easily crunch more maths than on a traditional CPU. This is why it was quite good at some tasks that were traditionally done on the GPU. It also made it quite good for tasks without a lot of dependencies or branching.

Really poor at general CPU tasks. It's difficult to make use of a massively parallel architecture for heavily branching code, or for anything with a lot of dependencies on previous calculations, both of which dominate typical CPU loads. Here only the PPE was really useful, and it alone wasn't that great at general CPU tasks.
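To make the contrast concrete, here's a hypothetical sketch of the two workload shapes (all names and numbers made up):

```python
# Shape 1: independent, branch-free work. Every element can be processed
# in isolation, so it splits cleanly across many narrow cores like the
# Cell's SPEs (or across a GPU's shader units).
def scale_and_bias(samples: list[float], scale: float, bias: float) -> list[float]:
    return [s * scale + bias for s in samples]  # each iteration independent

# Shape 2: a loop-carried dependency plus data-dependent branching.
# Each step needs the previous result and branches unpredictably, so the
# work can't be spread across cores and lives or dies on single-thread
# speed - exactly the kind of code that crawled on Cell's lone PPE.
def run_sim(state: float, inputs: list[float]) -> float:
    for x in inputs:
        if state > x:            # branch depends on the previous iteration
            state -= x
        else:
            state = state * 0.5 + x
    return state
```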

So, could it be used to take on some GPU tasks? Yes. Could it do them as well as a dedicated GPU using the same silicon area? Unlikely. Was it better to use it to assist with GPU rendering rather than CPU tasks? Obvious in hindsight. But until developers were able to leverage it for GPU tasks, the Cell CPU combined with the RSX GPU meant the PS3 was woefully underpowered and underutilized for the first year-plus of its lifespan compared to the competition's more traditional CPU + GPU setup. Once Sony developers started using Cell to assist the GPU, it could finally trade blows with the competition, better at some things but worse at others.
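As a simplified, hypothetical sketch of what "assisting the GPU" meant in practice (SPE jobs were commonly used to cull geometry before RSX ever saw it), something in this spirit:

```python
# Cull triangles CPU-side so the GPU's weak vertex pipeline only ever
# sees geometry that could actually be visible. Data layout is made up:
# screen_verts is a list of (x, y) tuples, indices a flat triangle list.

def is_backfacing(v0, v1, v2) -> bool:
    # Signed area of the screen-space triangle; <= 0 means it faces away
    # (assuming counter-clockwise front faces).
    ax, ay = v1[0] - v0[0], v1[1] - v0[1]
    bx, by = v2[0] - v0[0], v2[1] - v0[1]
    return ax * by - ay * bx <= 0.0

def cull(screen_verts, indices):
    """Return an index list with back-facing triangles stripped out."""
    kept = []
    for i in range(0, len(indices), 3):
        i0, i1, i2 = indices[i], indices[i + 1], indices[i + 2]
        if not is_backfacing(screen_verts[i0], screen_verts[i1], screen_verts[i2]):
            kept.extend((i0, i1, i2))
    return kept  # the GPU is handed only potentially visible triangles
```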

When played to its strengths it could certainly do some impressive things, but in multiplatform games developers were still far more comfortable with a traditional CPU + GPU rendering path, using Sony-provided code for the Cell processor to assist with some of the GPU work. This meant that multiplatform games generally tended to do better on the competing platform (especially in the first few years of the generation).

However, there were some things the Cell processor could not assist the GPU with, and there you were still stuck with the GPU's limited capabilities (transparencies, for example, would always be lower resolution on PS3 versus the competition).

It's far too complex a situation to use a single number to say whether the PS3's Cell (great at parallel computing) + RSX (weak GPU) was better or worse than the X360's more traditional setup of a powerful conventional CPU (less capable in massively parallel tasks) + a powerful conventional GPU.

It's like asking different people which is the better fruit: apples or oranges?

Regards,
SB
 
TL;DR on the Cell processor, simplified..
It really makes me wonder what the case would have been if Sony had invested properly in a good GPU and still retained Cell. Considering how Cell managed to compensate for the GPU's weaknesses, if Sony hadn't wasted so much silicon and money on non-gaming features, they might have ended up with a beast.
 
Please don't use GFLOPS as a direct measure of performance like that. Particularly not when adding CPU and GPU GFLOPS together, and even more particularly not when comparing different architectures, one of which is significantly more advanced than the other.
I think you and some other guys still don't understand what I said. I compared just raw performance. That's all. I know that CPUs and GPUs can't be compared just using GFLOPS.
Thanks for that info.
Regarding the CPU comparison, CPU performance isn't even measured in GFLOPS. It's measured in actual performance on real-world, branchy, memory-sensitive, and often single-threaded code. The only part of Cell that could deal with that kind of code (its central PPE) was laughably slow at it.
Yes, similar to the PS2 CPU, but as a result there were a lot of amazing games for their time on PS2, and a lot of amazing games for their time on PS3. I think that means the capabilities of different CPUs also can't be compared directly.
And quad-core CPUs landed in the PC space around the same time as the PS4 launched.
Weren't they released years before?
It was launched before the PS3 so it's a pretty fair comparison to that console.
I meant the comparison to the Xbox 360.
By whose measure? Please be more specific. Some would argue that Crysis absolutely showed that 2x power differential in the first year or so. And I think many console gamers today would argue that being able to play games at a solid 60fps/1440p vs a shaky 30fps/720p is a great use of those multiples of GPU power. Would you be happy with that performance from the gen-9 consoles in exchange for better core graphics? I doubt many would.
In my opinion, using the power of GPUs mostly for resolution and fps is the wrong way for the industry. It would have been better to calculate more polygons, better lighting, better effects and better shadows at the same time. It would have been interesting to see what the Xbox One X could do at 1080p/30fps. Same with the 9th-gen consoles. I understand that it's a lot cheaper to raise resolution and fps than to make much more detailed assets. So Microsoft and Sony said that on XSX and PS5 players can play in 4K at 60fps, but they didn't say that the graphics would not be far ahead of the X1 and PS4. Of course that will change later this gen, but I think it will not change a lot. The PS5 has 5 times more power than the PS4, but the resolution is 4 times higher. The XSX has only 2 times more power than the X1X, but at almost the same resolution. Of course improved CPU and GPU architectures, more RAM, the SSD and ray tracing will help, but as I said, I think the difference will not be big. That doesn't mean I will not play new games and enjoy them - I already have an XSX and a PS5 - but the wow effect I got playing 7th-gen games will be lower. :)
Just under 77 GFLOPS. But that's entirely irrelevant. For reference, the PS4 CPU was about the same, I believe. It means next to nothing in a CPU.
I just wanted to know the raw power of Xenon. Thanks.

The problem with your comparison is that the SPUs in Cell were used for GPU tasks, whereas a PC CPU was purely used for CPU work.

And factoring that in, PC CPUs slaughtered Cell for that kind of general CPU work.
Yes, but all games worked, so what advantage did those CPUs have?

And hasn't PS5 vs XSX taught you anything about comparing flops?
And what should they have taught me? Please explain.
Those games didn't work on PS3 either, unless you consider 10-15fps 'working'.
Those games worked at 30fps, or around 30fps, on PS3. There are a lot of tests and comparisons. And on Xbox 360 they worked even better, with higher resolution, higher fps and some better effects.
 
You've skipped over the price point..
Going with low-cost, outdated tech doesn't provide a good example either.
 
As for the fps test, that is not 10-15 fps. Mostly 30 when there is no combat; in combat 25-27, with rare moments below 25. But I'm sure on PC there are also a lot of moments when the fps drops.
 
The PS5 has fewer TFLOPS than the Series X but often matches or beats its performance.

So what lesson do you think you should have learnt from that?
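For reference, here's how those headline figures are derived (standard RDNA 2 arithmetic: 64 ALUs per CU, one FMA = 2 FLOPs per ALU per clock):

```python
def tflops(cus: int, clock_ghz: float, alus_per_cu: int = 64) -> float:
    # peak TFLOPS = CUs x ALUs x 2 FLOPs (FMA) x clock / 1000
    return cus * alus_per_cu * 2 * clock_ghz / 1000.0

print(f"PS5: {tflops(36, 2.23):.2f} TF")   # ~10.28 TF, 36 CUs, higher clock
print(f"XSX: {tflops(52, 1.825):.2f} TF")  # ~12.15 TF, 52 CUs, lower clock
# The PS5's higher clock also speeds up its front end, caches and fill
# rate, which the single TFLOPS figure doesn't capture.
```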



[Attachment 8989: performance comparison]

Interestingly, the 360 and the PS3 are performing at similar levels, even though the PS3 had the worse GPU. Did the devs leverage the Cell effectively?
I owned the game on my 360, btw, and I was impressed with the visuals; the performance didn't bother me much. Most of the time it performed pretty well.
 
As for the fps test, that is not 10-15 fps.

There are multiple segments where it's at or below that level.

Crysis 2, much like the first game, gets more demanding the further into the game you go.

Mostly 30 when there is no combat.

In a game that is combat-based, saying it's 30fps when there's no combat is laughable.

In combat 25-27

I suggest you watch the video properly, to save me having to screenshot the many, many instances where it's well below 25fps.

rare moments below 25

Again, watch the video.

But I'm sure on PC there are also a lot of moments when the fps drops.

I remember having a locked 60fps on PC with ease.

And at console-equivalent settings (but at 2.5x the resolution), even mid-range GPUs offered 2-3x the frame rate.
 
Interestingly, the 360 and the PS3 are performing at similar levels..

Crytek did make heavy use of the SPUs in CryEngine, and in CPU-limited situations that gave the PS3 a higher frame rate than the 360, which typically performed better when the action was less intense.

I also remember them specifically saying to keep draw calls below 2000 on PS3 so as not to kill RSX.
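A draw-call budget like that is usually met by batching: collapsing objects that share a mesh and material into a single submission. A minimal, hypothetical sketch:

```python
from collections import defaultdict

def build_draw_list(objects, budget: int = 2000):
    """Group objects by (mesh, material): each group becomes one batched draw."""
    batches = defaultdict(list)
    for obj in objects:
        batches[(obj.mesh, obj.material)].append(obj.transform)

    draw_list = [(mesh, material, transforms)
                 for (mesh, material), transforms in batches.items()]
    assert len(draw_list) <= budget, "still over the draw-call budget"
    return draw_list
```

Batching wasn't free on that hardware, but the principle of fewer, fatter submissions is the same one the 2000-call guidance was pushing developers toward.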
 
There are multiple segments where it's at or below that level.
Multiple doesn't mean often or mostly.
In a game that is combat-based, saying it's 30fps when there's no combat is laughable.
OK, what is your point? That the PS3 and Xbox 360 are trash?
Again, watch the video.
I watched some moments from the video. And I played Crysis 2 on Xbox 360; there are only some moments when the game runs at low fps. They achieved a great result for consoles that were released 5-6 years before Crysis 2, with 5 times less GPU power and 20 times less RAM.
And at console-equivalent settings (but at 2.5x the resolution), even mid-range GPUs offered 2-3x the frame rate.
Yeah, and what were your PC specs compared to the consoles? :)
Crytek did make heavy use of the SPUs in CryEngine, and in CPU-limited situations that gave the PS3 a higher frame rate than the 360, which typically performed better when the action was less intense.
As I remember, they didn't use the SPUs in Crysis 2 for graphics tasks, only for CPU tasks. Maybe in Crysis 1 Remastered or Crysis 3 they started using the SPUs for graphics tasks as well.
 