davis.anthony
Veteran
Been looking at reviews of old GPUs that released when consoles did, to see whether modern GPUs struggle more than older GPUs did.
The results are eye-opening.
PS3 release date: 17th November 2006
Nvidia 8800GTX release date: 8th November 2006
8800 GTX results from TechPowerUp at 2560x1600 (max settings), which was the top resolution available at the time and what this card was aiming for:
- F.E.A.R. - 30fps
- Prey - 48.7fps
- Quake 4 - 34.1fps
- X3 - 47.8fps
PS4 release date: 15th November 2013
AMD R9 290x release date: 24th October 2013
R9 290X (Uber BIOS mode) results from TechPowerUp at 2560x1600 (max settings), which wasn't even the top resolution available then, as 4K was starting to gain traction:
- Assassin's Creed 3 - 48.8fps
- Crysis 3 - 26.3fps
- Far Cry 3 - 35.1fps
- Hitman - 36.9fps
- Metro Last Light - 48.7fps
- Tomb Raider - 47fps
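As a quick sanity check on the comparison, here's a short Python snippet (my own, not from the reviews) averaging the figures quoted above. The two cards land within a fraction of a frame of each other:

```python
# Averages of the TechPowerUp 2560x1600 results quoted above.
gtx_8800 = {"F.E.A.R.": 30.0, "Prey": 48.7, "Quake 4": 34.1, "X3": 47.8}
r9_290x = {"Assassin's Creed 3": 48.8, "Crysis 3": 26.3, "Far Cry 3": 35.1,
           "Hitman": 36.9, "Metro Last Light": 48.7, "Tomb Raider": 47.0}

for name, results in [("8800 GTX", gtx_8800), ("R9 290X", r9_290x)]:
    avg = sum(results.values()) / len(results)
    print(f"{name}: {avg:.1f} fps average")
```

Both averages come out around 40fps, despite seven years and a console generation between the two launches. Different games, obviously, so take it as a rough gauge rather than a direct comparison.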
So are modern GPUs any worse, and is upscaling the crutch people say it is? I believe the answer is no; if anything, upscaling is helping modern GPUs deliver better performance than has historically been available.
The price of modern hardware is a different matter altogether.