> I thought this was the current generation games analysis thread?

It was.
> Certainly nothing could rival Crysis' vast, super-dense environment and physics-driven world. That game was overkill for years to come; it was a title that belonged a generation ahead. But I checked the other two games you mentioned and I can't quite grasp what is remarkable about them.

Both were RTS games with high levels of destruction created all over the map by up to 16 people simultaneously. That, combined with the particle effects found in World in Conflict, would have crippled either console.
Certainly the Uncharted series and God Of War punched above what was expected from the PS3 hardware, and there was nothing comparable on PC, even though PCs were certainly more powerful.
I couldn't grasp how these studios pulled off those visuals.
> The real problem is, outside of AMD's APUs, if you're trying to hit a certain price point then there really are very limited alternative options. Intel + Nvidia? ARM and Nvidia? Would either have produced an overall better package technically at the same price point? ¯\_(ツ)_/¯

Even the simpler interconnects, like nVidia's NVLink "C2C", allow for a 450 GB/s+ transfer rate in one direction. There is no reason to stay with one company.
> An ARM design would..

You've skipped over the price point, which is, I think, the factor that most swayed both Microsoft and Sony to go with AMD again. Discrete CPUs and GPUs add complexity, which in turn adds quite a lot of cost. A further AMD design being an evolution of the previous ones, which made backwards compatibility easier, was also a factor, I am sure.

Consoles need to be cost effective, in terms of either making a profit, or taking as small a loss for as short a time as possible. When consoles aren't cheap.. well, just look at NeoGeo, 3DO and PlayStation 3.
One more time: the 8800 GTX is 345.6 GFLOPS, and I think the most powerful PC CPU in 2006 was not more than 50 GFLOPS, so ~395 GFLOPS in total. The PS3's Cell at 200 plus RSX at 192 = 392. That is the same, or nearly so. Just raw performance. And the X1 and PS4 were 2-3 times behind a top PC at the time they released.
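For what it's worth, here is that raw-performance sum laid out (a minimal Python sketch using the post's own rough theoretical-peak figures, which are not measured performance; the 8800 GTX number gets revisited further down the thread):

```python
# A quick sketch of the totals being compared above, using the post's own
# theoretical-peak figures (rough, commonly quoted numbers only).

pc_2006 = {
    "8800 GTX (GPU)": 345.6,  # MAD-only figure used in the post
    "fastest PC CPU": 50.0,   # the post's estimate
}
ps3 = {
    "Cell (CPU)": 200.0,  # ~204.8 GFLOPS is the usual theoretical peak
    "RSX (GPU)":  192.0,  # commonly quoted programmable-shader peak
}

print(f"2006 PC total: ~{sum(pc_2006.values()):.0f} GFLOPS")  # ~396
print(f"PS3 total:     ~{sum(ps3.values()):.0f} GFLOPS")      # 392
```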
Crysis 2, Crysis 3, Metro: Last Light, Alien: Isolation. And that is just a few. All those games were on 7th gen consoles, and I can't even find a test on the 7800, but I would not be surprised if the performance wasn't great.
Crysis 3 will not work at all, because that game doesn't support DirectX 10 and below. But that is a different story. Metro: Last Light was great. When I played it on Xbox 360, there were a lot of amazing effects, high polygon counts and very high texture resolution. At some moments I thought that maybe I was playing the game on a high-end PC. Joking; I knew that wasn't true, but that game looked amazing. Also, Alien: Isolation was great in terms of polygon count.
As for the 8800, true, but it came a year later than the Xbox 360, so that is not a very fair comparison. That is only my opinion.
Please, if you can, give a source for that info.
And it's sad that PC hardware and the GPUs released years later haven't shown something proportionally better than the 7th gen consoles. GPUs were 3-7 times more powerful, but apart from resolution and fps they haven't shown much.
I still haven't found info on how many GFLOPS the Xbox 360 CPU had.
> It really makes me wonder what the case would have been if Sony had invested properly in a good GPU and still retained Cell. Considering how Cell managed to compensate for the GPU's weaknesses, if Sony hadn't wasted so much silicon and money on non-gaming features, they might have come up with a beast.

TL;DR on the Cell processor, simplified:
Extremely good at doing massively parallel tasks. When you could properly leverage the parallel nature of the Cell's SPEs you could easily crunch more maths than a traditional CPU. This is why it was quite good at doing some tasks that were traditionally done on GPU. It also made it quite good for tasks without a lot of dependencies or branching.
Really poor at general CPU tasks. It's difficult to make use of a massively parallel architecture for heavily branching code, or for anything with a lot of dependencies on previous calculations, both of which dominate typical CPU loads. In that case only the PPE was really useful, and it alone wasn't that great at general CPU tasks.
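To make those two workload shapes concrete, a loose sketch (ordinary Python standing in for the idea; this is not actual Cell/SPE code):

```python
# Loose illustration of the two workload shapes described above.

# SPE-friendly: the same arithmetic applied independently to every element.
# No branches and no dependencies between elements, so the work splits
# cleanly across many parallel units.
def data_parallel(samples):
    return [s * 0.5 + 1.0 for s in samples]

# SPE-hostile: every step branches on, and depends on, the previous result,
# so there is nothing to hand out to parallel units.
def branchy_sequential(n, steps):
    for _ in range(steps):
        if n % 2 == 0:      # branch decided by the previous iteration
            n = n // 2
        else:
            n = 3 * n + 1
    return n

print(data_parallel([1.0, 2.0, 3.0, 4.0]))  # [1.5, 2.0, 2.5, 3.0]
print(branchy_sequential(27, 10))           # 214
```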
So, could it be used to take on some GPU tasks? Yes. Could it do them as well as a dedicated GPU using the same silicon area? Unlikely. Was it better to use it to assist with GPU rendering rather than CPU tasks? Obvious in hindsight. But until developers were able to leverage it for GPU tasks, the Cell CPU combined with the RSX GPU meant that the PS3 was woefully underpowered and underutilized in the first year-plus of its lifespan compared to the competition's more traditional CPU + GPU setup. Once Sony developers started to use the Cell to assist the GPU, it could finally trade blows with the competition, better at some things but worse at others.
When played to its strengths it could certainly do some impressive things, but in multiplatform games, developers were still far more comfortable with doing a traditional CPU + GPU rendering path while using Sony provided code for the Cell processor to assist with some of the GPU code. This meant that in general multiplatform games tended to do better on the competing platform (especially in the first few years of the generation).
However, there were some things the Cell processor could not assist the GPU with, and in those cases you were still stuck with the GPU's limited capabilities (transparencies, for example, would always be lower resolution on PS3 versus the competition).
It's far too complex a situation to use a single number to say whether the PS3's Cell (great at parallel computing) + RSX (weak GPU) was better or worse than the X360's more traditional setup of a powerful conventional CPU (though less powerful at massively parallel tasks) + a powerful conventional GPU.
It's like asking different people what's the better fruit? Apples or Oranges?
Regards,
SB
> Please don't use GFLOPS as a direct measure of performance like that. Particularly not when adding CPU + GPU GFLOPS together, and even more particularly not when comparing between different architectures, one of which is significantly more advanced than the other.

I think you and some other guys still don't understand what I said. I compared just raw performance. That's all. I know that CPUs and GPUs can't be compared just using GFLOPS.
> The G80 (the 8800 GTX) could issue 256 FLOPs per clock from FMADs, as well as 128 FLOPs per clock from MULs under the right circumstances. That's 384 FLOPs per clock, which at 1.35 GHz = 518.4 GFLOPS. The MUL wasn't always usable in that way, which is why you sometimes see the lower number that just counts the FMADs.

Thanks for that info.
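For reference, the quoted arithmetic checks out; a minimal sketch using the standard G80 description (128 scalar shader units at 1.35 GHz):

```python
# Checking the G80 numbers above. Assumed breakdown: 128 scalar shader
# units at 1.35 GHz, each able to issue one MAD (2 FLOPs) plus, under the
# right circumstances, one co-issued MUL (1 FLOP) per clock.

units = 128
clock_ghz = 1.35

mad_flops = units * 2   # 256 FLOPs/clock from the FMADs
mul_flops = units * 1   # 128 FLOPs/clock from the co-issued MUL

print(f"MAD only:  {mad_flops * clock_ghz:.1f} GFLOPS")                # 345.6
print(f"MAD + MUL: {(mad_flops + mul_flops) * clock_ghz:.1f} GFLOPS")  # 518.4
```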
> Regarding the CPU comparison, CPU performance isn't even measured in GFLOPS. It's measured by actual performance on real-world, branchy, memory-sensitive, and often single-threaded code. The only part of Cell that could deal with that kind of code (its central PPE) was laughably slow at this.

Yes, similar to the PS2's CPU, but as a result there were a lot of amazing games for their time on PS2, and a lot of amazing games for their time on PS3. I think that means the capabilities of different CPUs also can't be compared directly.
> And quad core CPUs landed in the PC space around the same time as PS4 launched.

Weren't they released years before?
> It was launched before the PS3, so it's a pretty fair comparison to that console.

I meant in comparison to the Xbox 360.
> By whose measure? Please be more specific. Some would argue that Crysis absolutely showed that 2x power differential in the first year or so. And I think many console gamers today would argue that being able to play games at a solid 60fps/1440p vs a shaky 30fps/720p is a great use of those multiples of GPU power. Would you be happy with that performance from the gen 9 consoles in exchange for better core graphics? I doubt many would.

In my opinion, using the power of GPUs mostly for resolution and fps is the wrong way for the industry. It would have been better to calculate more polygons, better lighting, better effects and better shadows at the same time. It would have been interesting to see what the Xbox One X could do at 1080p 30fps. Same with the 9th gen consoles. I understand that it is a lot cheaper to raise resolution and fps than to make much more detailed assets. So Microsoft and Sony said that on XSX and PS5 players can play at 4K 60fps, but they didn't say that the graphics would not be far ahead of the X1 and PS4. Of course that will change later this gen, but I think not by a lot. The PS5 has 5 times more power than the PS4, but the resolution is 4 times higher. The XSX has only 2 times more power than the X1X, but at least almost the same resolution. Of course the improved CPU and GPU architectures, more RAM, the SSD and ray tracing will help, but as I said, I think the difference will not be big. That doesn't mean I won't play new games and enjoy them (I already have an XSX and PS5), but the wow effect will be lower than it was playing 7th gen games.
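To put rough numbers on where those GPU-power multiples go, here is raw pixel throughput at the frame targets mentioned in this exchange (a simplistic sketch; it counts pixels only and ignores everything else that scales with quality):

```python
# Raw pixels drawn per second at each target, relative to 720p30.
# Only a rough intuition for where a GPU-power multiple can be spent.

targets = {
    "720p  @ 30 fps": (1280, 720, 30),
    "1080p @ 30 fps": (1920, 1080, 30),
    "1440p @ 60 fps": (2560, 1440, 60),
    "4K    @ 60 fps": (3840, 2160, 60),
}

base = 1280 * 720 * 30  # 720p30 baseline
for name, (w, h, fps) in targets.items():
    rate = w * h * fps
    print(f"{name}: {rate / 1e6:6.1f} Mpx/s  ({rate / base:4.1f}x 720p30)")
```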
> Just under 77 GFLOPS. But that's entirely irrelevant. For reference, the PS4 CPU was the same, I believe. It means next to nothing in a CPU.

I just wanted to know the raw power of Xenon. Thanks.
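For anyone else hunting for that number, the usual back-of-the-envelope derivation, under the assumption that Xenon's peak comes entirely from its three 3.2 GHz cores each doing 4-wide VMX fused multiply-adds, lands exactly on the quoted figure:

```python
# Back-of-the-envelope derivation of Xenon's 'just under 77 GFLOPS'
# (assumption: peak comes entirely from the vector units; an FMA counts
# as 2 FLOPs per lane).

cores = 3
clock_ghz = 3.2
simd_lanes = 4       # 4 single-precision lanes per vector unit
flops_per_lane = 2   # fused multiply-add = 2 FLOPs

peak = cores * clock_ghz * simd_lanes * flops_per_lane
print(f"Xenon theoretical peak: {peak:.1f} GFLOPS")  # 76.8
```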
> The problem with your comparison is that the SPUs in Cell were used for GPU tasks, whereas a PC CPU was purely used for CPU work. And factoring that in, PC CPUs slaughtered Cell for that kind of general CPU work.

Yes, but all the games worked, so what advantages did those CPUs have?
> And hasn't PS5 vs XSX taught you anything about comparing flops?

And what should they have taught me? Please explain.
> Those games didn't work on PS3 either, unless you consider 10-15fps 'working'.

Those games worked at 30 fps, or around 30 fps, on PS3; there are a lot of tests and comparisons. And on Xbox 360 they worked even better, with higher resolution, higher fps and some better effects.
> PS5 has less TFLOPS than the Series X but often matches or beats its performance. So what lesson do you think you should have learnt from that?

We're still at the start of this generation; some conclusions will only be possible to draw in 2-3 years.
> When consoles aren't cheap.. well, just look at NeoGeo, 3DO and PlayStation 3.

Going with low-cost and outdated tech doesn't provide a good example either.
[Attachment 8989: frame-rate test]
As for the fps test, that is not 10-15 fps:
Mostly 30 when there is no combat.
In combat, 25-27.
Rare moments below 25.
But I'm sure on PC there are also a lot of moments when the fps drops.
Interestingly, the 360 and the PS3 are performing on similar levels, even though the PS3 had a worse GPU. Did the devs leverage the Cell effectively?
I owned the game on my 360, btw, and I was impressed with the visuals; the performance didn't bother me much. Most of the time it performed pretty well.
> As for the fps test, that is not 10-15 fps. Mostly 30 when there is no combat...

Crysis 1 and 2 definitely run a lot of the combat at more like 15-24 fps.
> There are multiple segments where it's at or below that level.

"Multiple" doesn't mean often or mostly.
> In a game that is combat-based, saying it's 30fps when there's no combat is laughable.

OK, what is your point? That the PS3 and Xbox 360 are trash?
> Again, watch the video.

I watched some moments from the video. And I played Crysis 2 on Xbox 360; there are only some moments when the game runs at a low fps. They achieved a great result on consoles that were released 5-6 years before Crysis 2 and had 5 times less GPU power and 20 times less RAM.
> And at console-equivalent settings (but at 2.5x the resolution), even mid-range GPUs offered 2-3x the frame rate performance.

Yeah, and what were your PC's specs compared to the consoles?
> Crytek did heavily use the SPUs in CryEngine, and in CPU-limited situations that did give the PS3 a higher frame rate than the 360, which typically performed better when the action was not as intense.

As I remember, they didn't use the SPUs in Crysis 2 for graphics tasks, only for CPU tasks. Maybe in Crysis 1 Remastered or Crysis 3 they started using the SPUs for graphics tasks as well.