Are modern GPUs really offering a worse level of performance than historic GPUs?

I've been looking at reviews of old GPUs that released alongside new consoles, to see if modern GPUs struggle more than older GPUs did.

The results are eye-opening.

PS3 release date: 17th November 2006
Nvidia 8800GTX release date: 8th November 2006


8800 GTX results from TechPowerUp at 2560x1600 (max settings), which was the top resolution available then and what this card was aiming for:

  • F.E.A.R - 30fps
  • Prey - 48.7fps
  • Quake 4 - 34.1fps
  • X3 - 47.8fps
What about PS4?

PS4 release date: 15th November 2013
AMD R9 290x release date: 24th October 2013


R9 290X (Uber BIOS mode) results from TechPowerUp at 2560x1600 (max settings), which wasn't even the top resolution available then, as 4K was starting to gain traction:

  • Assassins Creed 3 - 48.8fps
  • Crysis 3 - 26.3fps
  • Far Cry 3 - 35.1fps
  • Hitman - 36.9fps
  • Metro Last Light - 48.7fps
  • Tomb Raider - 47fps
It doesn't matter which generation you look at: there were always games that crippled PC GPUs and ran at less than 60fps; heck, some games barely reached 30fps.

So are modern GPUs any worse, and is upscaling the crutch people say it is? I believe the answer is no, and that upscaling is actually helping modern GPUs deliver better performance than what has historically been available.

The price of modern hardware is a different matter altogether.
 
Upscaling is a lie to sell bad software. The games look barely any better, but suddenly you should be happy running them at 1080p or under with top-end hardware twice the cost of last gen.
Of course there are diminishing returns and whatnot, but it's become a joke.
 

I think some of it is that lowering quality settings doesn't decrease image quality as noticeably as it once did.

So when running on Ultra quality, performance tanks, when in reality it looks no better than 'High' yet cuts performance by 40%.

In past generations, dropping to High would give you 40% more performance, but it would also look 40% worse.
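
To put rough numbers on that 40% figure (a purely illustrative calculation, assuming a hypothetical 60fps baseline on High):

```python
# Purely illustrative numbers for the "Ultra costs ~40%" claim above.
high_fps = 60.0                        # hypothetical frame rate on High
ultra_fps = high_fps * (1 - 0.40)      # a 40% performance cut -> 36 fps

frame_time_high = 1000.0 / high_fps    # ~16.7 ms per frame on High
frame_time_ultra = 1000.0 / ultra_fps  # ~27.8 ms per frame on Ultra

print(f"High:  {high_fps:.0f} fps ({frame_time_high:.1f} ms/frame)")
print(f"Ultra: {ultra_fps:.0f} fps ({frame_time_ultra:.1f} ms/frame)")
```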
 
In the 8800 GTX results, is the X3 result from X3: Reunion? That game is amazing. I've never seen anyone use the X games in their benchmark suites. Can you link to that review?
 
Anyone who's mad that a game doesn't run well on their PC can now claim the game is unoptimized without providing any technical justification.

You could, in theory, have a hyper-optimized game that runs at 1080p30 on a 4090 and uses the hardware as efficiently as possible.

What people mean when they say unoptimized is that it doesn't scale how they want it to. Unfortunately for those people, upscaling is here to stay, because it actually allows games to run better without compromising the artistic vision of the game.

We’re at a point where you can only lower shadows, lighting and geometry so much before the experience is ruined, so upscaling is used in place of potato options.
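
To make that concrete, here's a minimal sketch (Python, purely illustrative) of the trade upscaling makes: render fewer pixels internally, then reconstruct the output. The scale factors are the commonly cited approximate per-axis ratios for DLSS/FSR-style Quality/Balanced/Performance modes, not exact figures for any particular game.

```python
# Rough sketch: how upscaling modes trade internal render resolution
# for pixels shaded. Scale factors are approximate per-axis ratios
# commonly cited for DLSS/FSR-style modes, not exact values.
modes = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.5}

def internal_resolution(out_w, out_h, scale):
    """Internal render resolution for a given output size and per-axis scale."""
    return int(out_w * scale), int(out_h * scale)

out_w, out_h = 3840, 2160  # 4K output
for name, scale in modes.items():
    w, h = internal_resolution(out_w, out_h, scale)
    saving = 1 - (w * h) / (out_w * out_h)
    print(f"{name}: render {w}x{h}, ~{saving:.0%} fewer pixels shaded per frame")
```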
 
Upscaling is a lie to sell bad software. The games look barely any better, but suddenly you should be happy running them at 1080p or under with top-end hardware twice the cost of last gen.
Of course there are diminishing returns and whatnot, but it's become a joke.

I agree that, in general, perf/IQ is pretty terrible these days. However, don't forget we are paying the one-time cost of moving away from baked lighting. The older games that look good rely heavily on baked, static lighting to do so.
 
1440p/1600p was a VERY high resolution back in 2006. This was back when 1080p TVs were only just starting to gain traction and consoles were only just jumping into the realm of 720p rendering resolutions. Even today, nearly twenty years later, 1080p remains an extremely common resolution for PC gaming.

Pushing presentation to the degree that modern games do is simply inherently less efficient than it used to be. It's not that the hardware isn't great, or that devs are all terrible now; it's just the reality of diminishing returns in terms of bang for buck on tech/features. And now, of course, we have ray tracing, which is the new king of bad bang for buck, but we just kind of have to bear the cost if we want the advancements it provides.
 
Wing Commander 3 was limited to a max framerate of 24fps IIRC. Gotta get that local bus video card and the latest Intel nonsense if you want even that luxury in SVGA mode! :D

I guess I don't really know what I think about upscaling tech. It is nicer than just reducing resolution, but it is also similar to the shader compilation issues in that these big games are not being tuned as well as they could be. It's also because they are multiplatform, though, and if people are still buying the games and are brainwashed enough to pay for $1600 gaming GPUs, then where is the incentive to put in massively more work? Like, Cyberpunk got ripped apart in reviews on all platforms and yet has been very successful. Getting all that free press for your game's problems is just another way to make people more aware of and interested in your product!
 
I suppose I wonder what it would have been like for Crysis in 2007 had upscaling been available then like it is now.
I wonder how well something like FSR would run on the prevalent non-unified shader GPUs of the time. Maybe it's not even feasible on D3D 9/10? G80 is the first GPU with any notable compute capability.
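
For what it's worth, FSR 1.0 is a purely spatial pass, so conceptually it boils down to a resample plus a sharpening step. As a very rough stand-in (this is not the actual EASU/RCAS math, and the file names are made up), a Pillow sketch of the idea:

```python
# Very rough stand-in for a spatial upscaler: resample the low-resolution
# frame to the output size, then sharpen to recover some edge contrast.
# NOT the real FSR EASU/RCAS algorithm; file names are hypothetical.
from PIL import Image, ImageFilter

def naive_spatial_upscale(src_path, dst_path, out_size):
    img = Image.open(src_path)                    # low internal-res frame
    up = img.resize(out_size, Image.LANCZOS)      # resample to output size
    sharpened = up.filter(ImageFilter.UnsharpMask(radius=2, percent=120))
    sharpened.save(dst_path)

# e.g. take a 1280x720 render up to 1920x1080
naive_spatial_upscale("frame_720p.png", "frame_1080p.png", (1920, 1080))
```

The real EASU/RCAS passes are edge-adaptive shaders, so the open question is whether SM3/SM4-era hardware could have run them fast enough to be worth it.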

I do think Crysis has an excessive reputation. There are numerous hot new games that ran awfully on the machines I had at various points in time. Though there was a period, from say 1997 to 2003 or so, when high-end 3D cards could be had relatively cheap. The competition was insane back then. Once the ATI/Nvidia duopoly got into full swing, pricing started heading upward.
 

There has been no game released since Crysis that ran as badly as Crysis did on max settings.

The fastest PC at the time, at not even the top resolution, would barely break 30fps in later levels (especially Sphere) and would often drop down to single digits.

I can't think of any AAA game that's been at that level; even CP2077 with path tracing isn't that bad compared to what Crysis was.
 
Maybe Crysis was just a really poorly managed project? Like who did they want to sell their game to? 1% of the PC market? I remember with Warhead they renamed their detail presets and tuned some LOD down a bit in general. But then with Crysis 2 they went completely nuts with non-visible tessellation for some reason.

I think it's hard to judge though. The PC performance spread is pretty crazy now, and I don't play most of the AAA games. But a lot of them do run terrible. Days Gone stutters along quite badly for a few minutes until the shader cache fills in for example and it's never smooth on the bike. I don't remember that kind of thing with Crysis.
 
Think of it as an advertisement for what your licensable engine can do.
 
Maybe Crysis was just a really poorly managed project? Like who did they want to sell their game to? 1% of the PC market?

They wanted to sell it to most gamers. Remember their performance target was predicated on Intel saying their CPUs would be able to hit 10 GHz. In other words, Crytek were expecting at a minimum a doubling of CPU single threaded IPC/performance (so likely something around 6-8 GHz) by the time the game launched or shortly thereafter.

Considering how frequently CPU IPC performance was doubling back then, that wasn't an unreasonable expectation.

That obviously didn't happen.

Regards,
SB
 
Maybe Crysis was just a really poorly managed project? Like who did they want to sell their game to? 1% of the PC market?

They made their game future-proof so it would scale with time.

Is that any different from CP2077's Overdrive mode?

But building their engine around Intel's predictions certainly didn't help them.

The game hasn't been GPU-limited for well over 10 years now.

I remember with Warhead they renamed their detail presets and tuned some LOD down a bit in general.

Warhead actually has higher LODs and still offers slightly better performance.

Alex from DF showed this in a video he made about Crysis Warhead.

But then with Crysis 2 they went completely nuts with non-visible tessellation for some reason.

This was debunked years ago; again, Alex from DF has explained it.

I think it's hard to judge though. The PC performance spread is pretty crazy now, and I don't play most of the AAA games. But a lot of them do run terrible. Days Gone stutters along quite badly for a few minutes until the shader cache fills in for example and it's never smooth on the bike. I don't remember that kind of thing with Crysis.

It's easy to judge.

At the time of Crysis's release, not even a Tri-SLI setup of 8800 Ultras ($3000 worth of GPUs) could deliver a locked 30fps in Crysis at max settings.

No other game has done that since.

Even a 4090 manages to deliver a more playable experience in CP2077's Overdrive mode than the 8800 Ultras did in Crysis in 2007.
 