I'm not making fun of anyone; it just sounds ridiculous. I understand what he's saying: the code doesn't run well on the hardware, therefore the code is more demanding. But I've never heard poorly optimized code being called the most demanding game on the console's hardware. By that logic, Zone of the Enders on PS3, a PS2 game, would at release be called a more demanding game than Uncharted 3 because it ran at 20-30fps at release according to sebbbi. That's his POV as a coder; I'm looking at it in terms of GPU saturation, and frankly that's how most people look at it.
Many ways to view it. An excellent example might be the same game released at native 4K vs. with reconstruction on the same system. A normal person, say, couldn't see the difference in resolution, but they could notice that one had a graphical advantage over the other as a result of freeing up system resources for higher settings.
Which is more demanding? The native 4K, or the reconstruction running higher settings? In your POV the latter should be more demanding, but we see that the hardware struggles with the native 4K presentation.
I.e., look at the PS4 Pro. I can reposition the argument as some form of lower native resolution, like 1440p, vs. CBR 4K. In your mind the latter will look better, and it's running at a higher resolution.
Was the old code unoptimized? Or is the new code simply doing more work for less cost?
It's the same as when graphics moved away from SSAA to other forms of anti-aliasing: they found ways to get approximately the same result for less.
Or what about the fast inverse square root trick (attributed to Carmack, though he likely didn't write it)? Back in the day, and still today, the inverse square root is one of the more taxing operations to perform on hardware natively, but this trick replaces most of that calculation with a simple hexadecimal constant and a bit shift.
You won't notice the difference graphically, but you will notice the performance difference between doing the math the right way vs the trick.
So sebbbi, imo, is right. His POV is extremely valid, and given his experience as a senior rendering engineer for as long as he's been doing it, he's seen tons of this type of stuff come and go. He's one of the few people in the world who actually use high-level math as part of their regular job, and one of the few who are constantly looking to exploit faster ways to do the same thing.
Because if he didn't, if the industry didn't try to find ways to get the same result for less, graphics wouldn't move forward. Since the inception of graphics hardware, we've done more than just increase raw power output; we've changed the way games are coded.
Look at Crysis 1 vs. Crysis 3. 3 looks better than 1 and runs several times better than 1 on the newest hardware. Crysis 1's coders thought that graphics hardware would head in one direction, but it ended up heading in another. And that's why the meme exists: the game ran like a dog on period hardware, and no one could run it properly on ultra until recently. That's how much raw horsepower is required to run Crysis.