A Generational Leap in Graphics [2020] *Spawn*

Discussion in 'Console Technology' started by chris1515, Dec 19, 2020.

  1. Shompola

    Newcomer

    Joined:
    Nov 14, 2005
    Messages:
    197
    Likes Received:
    40
    Games have different bottlenecks and perform differently when put on a newer architecture such as PS5/XSX. Not sure what is so shocking about it. Also, we have seen more than a handful of games on Xbox One X performing not only better, but beyond 2x the rendered pixels compared to PS4 Pro. So 8x or whatever is not that far-fetched really!
     
    Shoujoboy likes this.
  2. Flappy Pannus

    Regular

    Joined:
    Jul 4, 2016
    Messages:
    329
    Likes Received:
    567
    Ah my bad, I had the PS4 Pro version on my brain as that's what I was coming from and transposed that over PS4.
     
  3. Flappy Pannus

    Regular

    Joined:
    Jul 4, 2016
    Messages:
    329
    Likes Received:
    567
    ....yes, I agree that a completely different and much older game is less impressive? I mean what is the point of this reply? You do realize that you don't have to respond to every single post in a thread since you last visited it, right?
     
  4. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    9,236
    Likes Received:
    4,259
    Location:
    Guess...
    I agree with this, although I think DLSS bucks this trend in a pretty disruptive way.

    I don't agree with this, though. CP2077 is hugely heavy on hardware, yes. But IMO the visuals warrant the performance cost.
     
    pharma and PSman1700 like this.
  5. techuse

    Veteran

    Joined:
    Feb 19, 2013
    Messages:
    1,426
    Likes Received:
    909
    If games with notably better visuals on hardware over 3x weaker become the norm, are they really warranted? Going back to Crysis, it's important to remember that the consoles never came close to matching it, much less surpassing it.
     
    #545 techuse, Dec 30, 2020
    Last edited: Dec 30, 2020
  6. dobwal

    Legend

    Joined:
    Oct 26, 2005
    Messages:
    5,955
    Likes Received:
    2,324
    Not saying it doesn't. Just saying that it seems CDPR addressed CP77's needs by simply throwing a ton of Tflops at it.

    It's understandable given we are just at the beginning of a new gen, but I don't think CP77-like IQ is going to take a 3090 with DLSS set to Quality in the future.
     
    pjbliverpool likes this.
  7. BillSpencer

    Regular

    Joined:
    Nov 18, 2020
    Messages:
    299
    Likes Received:
    117




    as for CP2077, this is how a console from 7 years ago and over 17 times weaker* handles it:



    *at least going by Nvidia's (fake PR) specs
     
    Shoujoboy likes this.
  8. techuse

    Veteran

    Joined:
    Feb 19, 2013
    Messages:
    1,426
    Likes Received:
    909
    Factoring in DLSS I’d say a 3090 is probably 35-40x more capable than a base PS4. IMO Cyberpunk is a very inefficient use of processing power relative to its visual output.
     
    #548 techuse, Dec 30, 2020
    Last edited: Dec 30, 2020
    Shoujoboy, BillSpencer and Rootax like this.
  9. chris1515

    Legend

    Joined:
    Jul 24, 2005
    Messages:
    7,157
    Likes Received:
    7,965
    Location:
    Barcelona Spain

    Again, it shows you don't understand how streaming works. There is enough memory for streaming assets, but you have some data structures generated at runtime, like the BVH, which can take a good amount of memory (up to 2 GB), and you can have other things generated at runtime too. For the BVH, you can build it during development for static objects, stream it from the SSD, and only update it for dynamic objects. I am not talking about replacing RAM, and it does not mean they did not need more RAM this generation: asset quality will be much better, the CPU is more powerful (meaning more RAM for the systems running on it), and you have data structures for raytracing or other forms of GI (voxel data structures, or maybe one day point data structures).

    In the Unreal Engine 5 demo, they were able to stream film-quality assets for static objects. If the consoles had more RAM, for example 32 GB, they would need to use the streaming system less often. I am not sure next generation we will need to double the memory; maybe 24 GB is enough, and memory bandwidth improvement is more important. The limit for game asset quality is not RAM or the streaming system, it is the size of assets on storage. The situation is not as dramatic as with cartridges, but this is the reality. I think if it were possible, a 10 TB game size limit would be enough for an open world with UE5-demo level of quality. In theory, you can stream all the content of a 100 GB game in about 10 seconds on PS5 (Oodle Texture and Oodle Kraken) using the average 11 GB/s given by RAD Game Tools. On Xbox Series, you can stream all the content in about 20 seconds using the 4.8 GB/s number.
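    As a quick sanity check on those streaming times (a sketch; the 11 GB/s and 4.8 GB/s effective rates are the figures quoted above, not independent measurements):

```python
# Back-of-envelope streaming-time check for the figures quoted above.
# Rates are effective (post-decompression) throughput claims, not measurements.

def stream_time_s(game_size_gb: float, rate_gb_s: float) -> float:
    """Seconds needed to stream an entire game package at a given rate."""
    return game_size_gb / rate_gb_s

print(round(stream_time_s(100, 11.0), 1))  # PS5 w/ Oodle Kraken+Texture: 9.1 s
print(round(stream_time_s(100, 4.8), 1))   # Xbox Series compressed rate: 20.8 s
```

    Both round to the "10 seconds" and "20 seconds" figures quoted in the post.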



    From a guy working at RAD Game Tools (Oodle Kraken / Oodle Texture)

    The leap is the same as other generations for the GPU, and in most areas bigger: the CPU, storage with the SSD, and some new coprocessor components to take workload off the CPU, namely the I/O system in PS5 and Xbox Series X (no I/O operations on the CPU at all on the PS5 side), the Tempest Engine for 3D audio, and some DSPs on Xbox Series. The leap is a bit less for memory bandwidth, but because of compression inside the GPU it is used more efficiently, and it is not like last generation's leap was much bigger: PS3 (48 GB/s) to PS4 (176 GB/s) was a 3.7x increase in memory bandwidth; from PS4 to PS5 (448 GB/s) it is only 2.54x, but like I said, RDNA2 GPUs need less bandwidth than GCN GPUs. The difference is not huge.
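    The bandwidth ratios in that paragraph check out arithmetically (a sketch using only the GB/s figures quoted above):

```python
# Generation-over-generation memory bandwidth ratios, using the quoted figures.
ps3_bw, ps4_bw, ps5_bw = 48.0, 176.0, 448.0  # GB/s

print(round(ps4_bw / ps3_bw, 2))  # PS3 -> PS4: ~3.67x (the "3.7" above)
print(round(ps5_bw / ps4_bw, 2))  # PS4 -> PS5: ~2.55x (the "2.54" above)
```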

    No, there is a reason: in PC benchmarks they use existing titles to compare hardware differences. You need the same workload to compare.
    https://www.guru3d.com/articles_pages/sapphire_radeon_rx_6800_nitro_review,23.html

    And for cross-gen, on PS4 they used the PC path as the base. The PS3/360 generation was long, and at the end those consoles were destroyed by PC. PS3 was the problem, with inferior multiplatform games compared to 360 and PC. This is useful, and the best way to compare two different consoles on the pure performance aspect.

    I know Demon's Souls is not a PS4 game, and it is gorgeous, one of the best looking games, but it is difficult to compare the first year of games when last generation we had no cross-gen titles on the Sony and Microsoft side. A Horizon 2 comparison against Killzone Shadow Fall, for example, will not be interesting because Horizon is cross-gen. This is the reason I think 2022 and 2023 will be a better point of comparison. For example, Unreal Engine 5 titles will begin to arrive in 2022/2023.

    I don't compare next-generation consoles to PC SSDs; again, in two years PCIe 5 will be there and it will go much faster than the PS5 SSD, but beyond loading a bit faster* it will change nothing, because the limit is not on the streaming side but the size of games. This is the same on PC. The console advantage is temporary.

    * If PS5 loads a level in 2 seconds, it will load in 1 second on PC; there is a diminishing-returns effect.
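    That diminishing-returns point can be made concrete with a toy model (all numbers hypothetical): load time is I/O transfer time plus a fixed setup cost that a faster SSD cannot shrink.

```python
# Toy load-time model: total = I/O transfer + fixed CPU/setup overhead.
# All figures below are hypothetical, chosen only to illustrate the shape.

def load_time_s(level_gb: float, ssd_gb_s: float, overhead_s: float) -> float:
    """Seconds to load a level: streaming time plus fixed setup work."""
    return level_gb / ssd_gb_s + overhead_s

# Hypothetical 10 GB level with 1 s of decompression/setup work:
print(round(load_time_s(10, 5.5, 1.0), 1))   # ~2.8 s on a PS5-class drive
print(round(load_time_s(10, 14.0, 1.0), 1))  # ~1.7 s on a PCIe 5-class drive
# 2.5x the raw throughput cuts the total load time by well under half.
```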
     
    #549 chris1515, Dec 30, 2020
    Last edited: Dec 30, 2020
    Shoujoboy and DSoup like this.
  10. BillSpencer

    Regular

    Joined:
    Nov 18, 2020
    Messages:
    299
    Likes Received:
    117
    With all of the praise and hyperbole going around, the base 2013 PS4 version is extremely, extremely underrated.

    With how impressed people are that the Switch can run Witcher 3 or Wolfenstein, the PS4 running Cyberpunk 2077 is really the equivalent of the PS Vita running Crysis.
     
  11. PSman1700

    Legend

    Joined:
    Mar 22, 2019
    Messages:
    7,118
    Likes Received:
    3,092
    Look, I'm not saying 16 GB is paltry, or that it's not enough to drop some jaws down the line. What I mean is that in quantity, it's less of an increase for memory than before. The SSD is going to mitigate some of it, but not all. RAM management is going to be the key there, indeed.

    The jump in performance for the GPU is absolutely not the same as before: it's five times in pure metrics, half of what we got going from PS3 to PS4. Architectural improvements happen with every generational shift. It could be argued that going from G70 to GCN 1.1 was a larger improvement than going from GCN to RDNA2. For that we'd need someone's input, but at least it can be said that architectural or IPC improvements account for both shifts.
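    For what it's worth, the raw-Tflops ratios being argued here can be laid out explicitly (a sketch; the RSX figure is the commonly quoted ~0.19 Tflops programmable-shader estimate, which is itself debatable, so treat it as an assumption):

```python
# Paper-Tflops ratios across generations. The RSX (PS3) figure is the commonly
# quoted ~0.19 Tflops programmable-shader estimate: an assumption, not a spec sheet.
rsx_tf, ps4_tf, ps5_tf = 0.192, 1.84, 10.28

print(round(ps4_tf / rsx_tf, 1))  # PS3 -> PS4: ~9.6x (the "10 times" above)
print(round(ps5_tf / ps4_tf, 1))  # PS4 -> PS5: ~5.6x (the "5 times" above)
```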

    CPU-wise, even with all the credit Cell got on some forums and in articles, it was actually about four times weaker than the 8-core Jaguar in the PS4. That's aside from very bad versus much better efficiency. The Jaguars got a lot of flak, and compared to PC CPUs they were low-end, but compared to what was in the PS3 (Cell) it was actually a great improvement across the board for gaming-related tasks. While the Zen 2 CPU is a much-needed leap, it's not directly more of a leap than going from Cell to Jaguar.

    Yes, the coprocessor is going to offload I/O tasks, but remember that the PS4 didn't really need one, as games didn't rely heavily on fast loading.
    The leap for memory bandwidth is not 'a bit less', it's a lot less. The GPU in the PS3 basically had to make do with about 20 GB/s for its 256 MB allocated to VRAM. That went all the way to 176 GB/s for the PS4. RDNA2 needs less bandwidth than GCN, but who's to say bandwidth efficiency wasn't improved going from G70 to GCN? Nothing happened in those 8 years of development?
    I'd say that even there the difference is quite large; dGPU variants have Infinity Cache for a reason, or in Ampere's case, closing in on 900 GB/s of raw bandwidth.

    Thing is, scaling has improved a lot, and cross-gen sure has been the focus now. Still, I see Shadow Fall and The Order: 1886 as bigger leaps over the previous best graphics than what Demon's Souls and Rift Apart do. But then we go again into this territory where some think GTA3 looks better than 4, etc. :p

    Hm, seems like a disadvantage for the PS4 then, using a PC path on the console? Anyway, scaling has come a long way since then, and the focus on cross-gen has put much more optimization towards that, I'm sure.

    It's an amazing title graphically, and it's up there with the others. But it's not the leap Shadow Fall brought us. I mean, it's something DF has shared the same opinion on. Generational shifts have decreased since the PS2; it's no secret :)

    Then you'd need to compare 2015/2016 PS4 games to 2022/2023 PS5 games or something, not launch titles like Shadow Fall. Now, I say Demon's Souls and Shadow Fall because they are both launch titles and basically what we got for PS5.
    Time will tell if we see the same leap as PS2 to PS3 and PS3 to PS4. I'm still saying leaps have gotten smaller; it's a general thing basically everyone knows.

    The console advantage right now is DirectStorage not being ready yet on PC. Whatever that advantage amounts to anyway; PCIe 4 NVMe setups load games almost as quickly? Would have to do some side-by-side testing with the same games.
     
    pharma likes this.
  12. chris1515

    Legend

    Joined:
    Jul 24, 2005
    Messages:
    7,157
    Likes Received:
    7,965
    Location:
    Barcelona Spain
    That is not how you count; pure metrics mean nothing. What is important is real-world performance, and there we see, depending on the title, 5 to 8 times more pixels, or in Spider-Man Remastered's performance mode, 7 to 8 times more pixels pushed.
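    The "pixels pushed" arithmetic compounds resolution and framerate. As a hypothetical illustration (a generic 1080p30 vs native-4K60 pairing; this ignores dynamic resolution and checkerboarding, and the figures are not measured Spider-Man numbers):

```python
# Pixel-throughput ratio: resolution and framerate multiply together.
# Figures are a hypothetical illustration, not measured game numbers.

def pixels_per_second(width: int, height: int, fps: int) -> int:
    """Raw pixels rendered per second at a given resolution and framerate."""
    return width * height * fps

base = pixels_per_second(1920, 1080, 30)      # a 1080p30 last-gen target
next_gen = pixels_per_second(3840, 2160, 60)  # a native-4K60 target
print(next_gen / base)  # 8.0: a 4K60 vs 1080p30 jump is 8x the pixels pushed
```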

    https://www.techpowerup.com/gpu-specs/geforce-gtx-1080-ti.c2877

    The 1080 Ti's Tflops are higher than the 2070 Super's or the 2080's, but its GPU performance is behind. This is the same thing, and it is better to buy a Turing GPU, with or without raytracing. Stop your bullshit. What you are saying makes no sense at all; this is pure trolling.

    We will not rewrite history: the Jaguar is weak, and it is the main reason we had a regression in gameplay physics. It was not even stagnation, because we had less physics in PS4/XB1 titles than in PS3/360 ones.

    As for G70 to GCN GPUs: memory delta colour compression inside the GPU was a domain where Nvidia was much better than AMD, no chance for you. This is the reason GCN GPUs need so much memory bandwidth and Vega GPUs use HBM2. It improved on the Polaris side and continued to improve with RDNA. You need to learn a bit about PC GPU technology. :wink: The PS3 had an Nvidia GPU. :wink4:


    Again, DF is not an absolute authority; this is a matter of taste. And the work on Demon's Souls* began in 2017 on a GPU without raytracing acceleration, and Bluepoint did not have the time or the workforce to use it. Bluepoint is not GG or ND; they are a tinier studio. It would have been more interesting to compare GG's work on PS4 and PS5 if Horizon 2 were not cross-gen. But we can compare 2015/2016 titles like Uncharted 4 or The Order: 1886 to 2022/2023 titles if you want; I am sure the gap will be wide.

    *IMO it is, along with Flight Simulator 2020 and Cyberpunk 2077, the best looking title of 2020. TLOU2 would be the 4th best title, ahead of Spider-Man MM.
     
    #552 chris1515, Dec 30, 2020
    Last edited: Dec 30, 2020
    Shoujoboy likes this.
  13. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    9,236
    Likes Received:
    4,259
    Location:
    Guess...
    The quality of graphics at any given time is relative to the period they're released in. We're barely scratching the surface of what DX12U-class architectures are capable of, so at this point, very early in the optimisation cycle, I'd say that yes, they are worth it (although see my response below for further context). As time goes on, developers will naturally learn more about the new hardware capabilities and develop better ways to use the available power and features, which will result in better looking games, as is the case every console generation. However, for where we are right now in the optimisation cycle, I'd say that CP2077's visuals warrant their high performance cost vs other games currently available.

    Actually, I do agree with you. My earlier statements about the graphics being worth the performance cost weren't fully thought through; thinking about it more, I was really talking about the non-RT version of the game (as that's what I'm playing) when I made them. The requirements of those graphics seem quite in line with the result IMO, compared with other games available at the moment in my own experience. Adding RT seems to give significant additional benefits, but yes, an argument can certainly be made that the extra graphical fidelity isn't worth losing 2/3rds of your performance, and there will be games later this generation which use that power for more visual impact.

    Despite YT compression underselling the vast difference in image quality and framerate between 4K 40fps+ and 720p 20fps+, I think the CP2077 comparison holds up very well against the other 2 videos. The Crysis difference is probably bigger, but the Crysis 3 difference is smaller IMO. Just look at the scene from 7:14 for an example in the CP2077 comparison. That scene gets a few seconds in the video but could represent hours of gameplay, both in car and on foot within the game, and that's a very clear generational difference between the two platforms. Here it is for reference:
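    That 4K 40fps vs 720p 20fps pairing is a bigger throughput gap than it might sound; a quick check of those specific quoted targets:

```python
# Pixel-throughput gap between the two quoted presentation targets.
high = 3840 * 2160 * 40  # 4K at 40 fps
low = 1280 * 720 * 20    # 720p at 20 fps
print(high / low)  # 18.0: an 18x difference in pixels pushed per second
```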



    This is silly. I assume you're referring to the available TFLOPS, but there is nothing fake or PR about what Nvidia rates the Ampere GPUs at. It's a simple factual report of the architecture's raw float shader throughput that is easily testable. If you're concluding that Nvidia is lying about its FLOPS because there isn't a real-world 17x difference between the 3080 and PS4 (when not using RT or DLSS), then that simply highlights a lack of understanding of the relevance of FLOPS in the overall system architecture context.
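    For reference, the quoted multiplier is just a paper-FLOPS ratio (a sketch using commonly cited spec-sheet figures; Ampere's number counts the doubled FP32 path, which is exactly why paper FLOPS and real-world throughput diverge):

```python
# Paper-FLOPS ratios behind the "over 17 times weaker" claim.
# 29.8 and 35.6 Tflops are the commonly cited RTX 3080/3090 FP32 figures
# (dual-issue FP32 counted); 1.84 Tflops is the PS4 spec. Paper only.
rtx3080_tf, rtx3090_tf, ps4_tf = 29.8, 35.6, 1.84

print(round(rtx3080_tf / ps4_tf, 1))  # ~16.2x on paper for a 3080
print(round(rtx3090_tf / ps4_tf, 1))  # ~19.3x on paper for a 3090
```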
     
  14. techuse

    Veteran

    Joined:
    Feb 19, 2013
    Messages:
    1,426
    Likes Received:
    909
    I agree up to a point. I'd say the performance gap between a 3090+DLSS and a PS5 is clearly bigger than between the G80 and the PS3. You can look at any gen-7 console title throughout their lifetimes and not a single one ever came close to matching Crysis. 7 years of developer improvement and nothing came close. Similar to DX12U being new, there were several new paradigms at the time; learning them improved things dramatically, but not nearly enough to close the gap. GPU-wise, Crysis's performance cost was justified. Compare the best looking games on a base PS4 to Cyberpunk on a 3090 and there is just no way the visuals aren't punching well below their weight with 35-40x more GPU capability available.
     
    #554 techuse, Dec 30, 2020
    Last edited: Dec 30, 2020
  15. troyan

    Regular

    Joined:
    Sep 1, 2015
    Messages:
    605
    Likes Received:
    1,126
    G70 is based on the CineFX architecture and had many problems with pipeline bubbles. And unlike G7x, it doesn't even have twice the pixel shader output per pipeline. GCN 1.1 is so much better that the architecture improvements alone are on the same level as the raw compute performance improvement from PS4 -> PS5.
     
    PSman1700 likes this.
  16. PSman1700

    Legend

    Joined:
    Mar 22, 2019
    Messages:
    7,118
    Likes Received:
    3,092
    I'm talking metrics because that's what we do every generation. Even MS and Sony themselves did so (four times the CPU, twice the GPU power of the One X, etc.). On a forum like this, there will be metrics tossed around.
    I agree on real-world performance, but pushing 'between 5 to 8 times more pixels' doesn't really tell the whole story either. It doesn't flat out mean there's an 8x improvement.

    Aside from that, when a new generation of consoles arrives, it's usually the native games that really show off the capabilities, where graphical fidelity is upped. Shadow Fall did just that, and so does Demon's Souls, but the latter didn't show as huge a leap as Shadow Fall did.

    What does a 1080 Ti vs 2070S have to do with what I said? You're talking about architectural improvements there; of course, TF for TF, Turing is more efficient than Pascal, it should be. I have no clue why you even need to bring that up.
    Going from PS3 to PS4, or G70 to GCN, there were architectural improvements too. TF for TF, the PS4 has an advantage there as well.
    Aside from that, keep things civil. You think the leap is as large as going from PS3 to 4; I think it's not. That's an agreement to disagree; it's how forums work.

    The Jaguar is weak; I have never said it was a strong CPU for its time. What I am saying is that the performance of the Jaguar relative to Cell wasn't all that bad: it was a four-times increase, not counting efficiency. Which means the CPU leap going from PS4 to 5 isn't that much different, despite the Zen 2 being much better than the Jaguar. You can, perhaps, blame the Cell for that.

    Lol, at least you keep it funny with the 'no chance for you' :) Anyway, NV had an advantage over AMD with G70, you say? Is that what you mean? I seriously doubt that, going from an Nvidia G70 (a 7800 GT derivative) to an AMD GCN 7870 derivative, memory efficiency wasn't improved at all, or was even behind.

    DF is not an absolute authority, no; why would it be? They are, though, very highly regarded here. The work on Demon's Souls began three years ago, yeah, I can believe that. But the same would go for Shadow Fall; I doubt they made that game in half a year. Its development time was probably quite close to the three-year development time you talk about.

    GG wasn't back then what they are today either. Also, an advantage for PS4 to 5 is that we stay on the same AMD/x86 platform, whereas PS3 to PS4 went from NV to AMD and from Cell to out-of-order x86.

    Yeah, I can agree on that, it's up there with those (Demon's Souls). Still, I'd rate CP2077 at number 1, and I can see why Alex and John did.

    Sure about that? Uncharted 3 did look very nice graphically. A larger gap than Demon's Souls to CP2077 on PC, perhaps, but don't forget that leaps have gotten smaller on PC too, though to a much lesser degree (hardware-wise). Blame power usage, I'd say.
     
  17. chris1515

    Legend

    Joined:
    Jul 24, 2005
    Messages:
    7,157
    Likes Received:
    7,965
    Location:
    Barcelona Spain
    There is only one place where we will see visual diminishing returns, and that is cutscenes. We can already model good characters, so the gains will be in the details: some very visible, like hair rendering and peach fuzz, and some more subtle effects, like better subsurface scattering.

    Slide 106
    http://advances.realtimerendering.com/s2020/NaughtyDog_TechArt_TLOU2-final.pptx

    Comparison between cutscene model and ground truth offline rendered model.


    This is exactly the same problem: you can't compare GCN 1.1 Tflops to RDNA 2 Tflops, just like you can't compare Pascal Tflops to Turing Tflops. Your comparison is bad. In the end the GPU is probably around 7 to 8 times more powerful than the 2013 PS4's without pushing new features, and that tells a big part of the whole story. The other part of the story is new features, and this is an advantage for the PS5 GPU, not the PS4 GPU. :wink:

    And the same for @troyan: take the CELL SPUs + G70, and they had a compute-shader equivalent in the SPUs for some workloads on the vertex side, or you could do post-processing with the CELL SPUs too. ;)

    Maybe you need to go back to some GDC documents on CELL SPU usage for graphics.
     
    #557 chris1515, Dec 30, 2020
    Last edited: Dec 30, 2020
    DSoup, Shoujoboy and Vega86 like this.
  18. Vega86

    Newcomer

    Joined:
    Sep 25, 2018
    Messages:
    191
    Likes Received:
    131
    Dark Souls 3 to Demon's Souls Remake.
     
    Shoujoboy likes this.
  19. PSman1700

    Legend

    Joined:
    Mar 22, 2019
    Messages:
    7,118
    Likes Received:
    3,092
    Yes. The same goes for G70 to GCN1: you can't compare them TF for TF either. Fact remains that in TF metrics alone, the improvement was a 10-times increase, as opposed to the 5 we got now. That's pure metrics. Counting in the move from the G70 arch to the GCN 1.1 arch, the improvement is even larger than what we see from GCN to RDNA. If the PS5 GPU is 7 times more powerful than the PS4's, the PS4's GPU was close to 20 times more powerful than the PS3's. Architectural changes account for both.
    And no, basing performance increases purely on resolution is faulty to begin with, especially considering last-gen games.

    Maybe use common sense.

    Still doesn't change what we got as the best visuals possible on PS5 at launch vs the PS4 at launch. No matter what, if the leap were as big as chris1515 promises, we surely should have seen it in a game natively designed around the PS5 (Bluepoint claims so). Especially if the development time was three years.
     
  20. Vega86

    Newcomer

    Joined:
    Sep 25, 2018
    Messages:
    191
    Likes Received:
    131
    Sorry, I didn't understand. What was Chris' promise? I just came into the thread, read the title, and replied. :lol:
     


  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.