Value of Consoles vs PC *spin off*

According to TPU it was about the same:

https://www.techpowerup.com/review/amd-radeon-rx-vega-64/31.html

But that's beside the point. I don't deny (in fact I've already agreed above) that AMD architectures tend to do a little better over time, largely because of their console links, and I think AMD supports older architectures better than Nvidia does. So yes, Vega moving a few percentage points up relative to Pascal four years after Pascal's launch isn't at all surprising.

But it's a few percentage points. After 4 years.

Again, where is the evidence of Pascal performance "falling off a cliff" two and a half years ago?
In lots of titles it does. How else does the Vega 64 gain 13% performance over a 32-game summary? It's fairly common now for the 1080 Ti to perform like a Vega 64 or just slightly better. That's pretty dire. It's an entire GPU tier down, ~33%.
 
It does look like the 780 Ti fell back a bit compared to the R9 290X, but it's not exactly catastrophic.

Somebody needs to waste a lot of time testing GTX 580 now.

The R9 290X launched just before the 2013 consoles. It outperforms both by a large margin and is probably closer to a PS4 Pro. A 7950 does the job, too. An HD 7870 tags along as well; it's not far off what a base PS4 does.
 
In lots of titles it does. How else does the Vega 64 gain 13% performance over a 32-game summary? It's fairly common now for the 1080 Ti to perform like a Vega 64 or just slightly better. That's pretty dire. It's an entire GPU tier down, ~33%.
On the flip side, it was another AMD (/ATI) GPU that wasn't very competitive until it was obsolete. That goes all the way back to the R5x0 series, with a few exceptions along the way to today.
 
In lots of titles it does. How else does the Vega 64 gain 13% performance over a 32-game summary?

An 8-13% performance loss after 4 years is not performance "falling off a cliff" after 18 months. It's AMD doing slightly better over the long term than Nvidia, which has never been in dispute.

It's fairly common now for the 1080 Ti to perform like a Vega 64 or just slightly better. That's pretty dire. It's an entire GPU tier down, ~33%.

I honestly don't understand the desire to exaggerate in order to try and put a point across. If the point is valid it'll stand on its own without the need to exaggerate.

TPU average performance: 1080Ti is 30% faster than the Vega64

https://www.techpowerup.com/review/amd-radeon-rx-6900-xt/35.html

Tech Spot average performance: 1080Ti is 31% faster

https://www.techspot.com/review/2160-amd-radeon-6900-xt/

Even in CP2077, which we both agree performs unusually badly on Pascal, the 1080Ti is up to 13.6% faster.

https://cdn.mos.cms.futurecdn.net/4xBmzcASVfCwu6LMS9FMeP.png

In fact, if you look at the individual game tests in the above links there is only one where the Vega 64 manages to match the 1080Ti and another where it gets pretty close (i.e. within 10%). And those are the two most heavily AMD-friendly titles on the market right now - Godfall and Dirt 5 - which, incidentally, Ampere and Turing perform just as badly in, while Pascal performs exactly in line with expectations relative to them.
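For reference, here's a rough sketch of the percentage maths being thrown around in this thread. The Python below uses made-up FPS numbers purely for illustration (not figures from the linked reviews), and review sites may aggregate their summary index differently:

```python
# Minimal sketch of the "X% faster" maths used throughout this thread.
# The FPS values below are made-up placeholders, NOT real benchmark data.

def percent_faster(fps_a: float, fps_b: float) -> float:
    """How much faster card A is than card B, in percent."""
    return (fps_a / fps_b - 1.0) * 100.0

# Hypothetical per-game 4K results: (game, card A fps, card B fps)
results = [
    ("Game 1", 60.0, 45.0),   # card A ~33% ahead
    ("Game 2", 80.0, 62.0),   # card A ~29% ahead
    ("Game 3", 55.0, 54.0),   # an outlier where card B nearly matches
]

for game, a, b in results:
    print(f"{game}: card A is {percent_faster(a, b):+.1f}% faster than card B")

# One simple way to summarise: average the per-game deltas. Review sites
# typically normalise per game and average (or geomean) across dozens of
# titles, so one or two outliers barely move the headline figure.
average = sum(percent_faster(a, b) for _, a, b in results) / len(results)
print(f"Average across {len(results)} games: {average:+.1f}%")
```

That's why a couple of strongly AMD-leaning titles can sit alongside an overall gap of around 30% in the same review.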
 
The 'problem' with AMD is focusing too much on next-generation game engines. Nvidia focused on DX11-level games and excelled in them.
Now it is the other way around, with AMD focusing on rasterisation performance (to the point of this: https://www.hardwaretimes.com/nvidi...sterization-more-than-rtx-plus-our-own-story/ ) and often beating Nvidia.

Now it is Nvidia which has focused on RT and DLSS, and out of all the games released this year only a handful support either, or both.
So, testing all the games, you will now see a lot of wins for AMD against Nvidia; however, once more and more game engines support RT (the Nvidia implementation at least), you will in a few years see the 3090 winning more benchmarks than the 6900 XT, for example.
 
I honestly don't understand the desire to exaggerate in order to try and put a point across. If the point is valid it'll stand on its own without the need to exaggerate.

It isn't an exaggeration. I just listed something like 20 mid- to high-profile games from the last two years where the Vega 64 performs at or very close to the 1080 Ti.
 
It isn't an exaggeration. I just listed something like 20 mid- to high-profile games from the last two years where the Vega 64 performs at or very close to the 1080 Ti.


It's simple - you're both right:
you are saying, "if you look at the majority of the games, then...";

he is saying, "if you leave out most or all of the games where the Vega 64 has an advantage, then the 1080 Ti performs better".
 
It isn't an exaggeration. I just listed something like 20 mid- to high-profile games from the last two years where the Vega 64 performs at or very close to the 1080 Ti.

Of course it's an exaggeration. You're trying to create a false narrative that Pascal has lost so much performance since its launch relative to Vega that the 1080Ti now generally performs like a Vega64. This is despite me already presenting average performance scores, across a large range of modern games from very recent reviews at two of the web's most reliable benchmarking sites, showing that the 1080Ti is on average around 30% faster today. You cite those very reviews in your own cherry-picked benchmarks.

And with regard to those hand-picked games, these are the actual results:

Forza - 1080Ti is 12-20% faster than Vega64
BFV - no direct comparison but the 2080 is 19% faster than Vega64 while being only 8% faster than the 1080Ti according to TPU
RDR2 - Vega64 does equal the 1080Ti here
Squadrons - 1080Ti is 17% faster
Godfall - Vega64 does equal the 1080Ti here
Dirt 5 - 1080Ti is 11% faster
Doom Eternal - 1080Ti is 12% faster
WWZ - 1080Ti is 16% faster
Division 2 - no direct comparison, but based on the 1080 comparison it's reasonable to assume the 1080Ti's advantage here would only be in single digits.

So even in your own cherry-picked games, the Vega64 is only able to match the 1080Ti in 3 out of 9, with the 1080Ti being between 11% and 20% faster in the rest. Again, these are your own cherry-picked games designed to prove your point.

And again, circling back to your original point that started this debate - in what way is this evidence of Nvidia performance "falling off a cliff" after 18 months? It's been over 4 years since Pascal launched now.

It's simple - you're both right:
you are saying, "if you look at the majority of the games, then...";

he is saying, "if you leave out most or all of the games where the Vega 64 has an advantage, then the 1080 Ti performs better".

This is literally the exact opposite of what is happening here. I'm ignoring no games - I'm taking the Techspot and TechPowerUp supplied averages, which between them cover over two dozen games, including at least 6 of the 9 games techuse is citing as evidence for his argument. Techuse is the one cherry-picking only those games where the Vega64 performs relatively strongly while ignoring all the other games from those reviews where it doesn't.
 
Of course it's an exaggeration. You're trying to create a false narrative that Pascal has lost so much performance since its launch relative to Vega that the 1080Ti now generally performs like a Vega64.

Some of your math is incorrect. The following are the averages across all benchmarks/resolutions:

Forza - 1080 Ti 7% faster
BFV - Vega 56 is 11% faster than the 1080 in the first benchmark, Vega 64 is 17% faster than the 1080 in the second
Star Wars - 1080 Ti 11% faster
Dirt 5 - Vega 64 29% faster than the 1080 in the first, 1080 Ti 4% faster than the Vega 64 in the second
WWZ - 1080 Ti 10% faster in the first, Vega 64 11% faster in the second
Division - Vega 64 24% faster than the 1080

Across all of these titles the 1080 Ti is rarely more than 10% faster.
 
Some of your math is incorrect.
Across all of these titles the 1080 Ti is rarely more than 10% faster.

I assume you're using lower resolutions, where CPU limits can come into play and cap the 1080Ti's potential.

All my figures were taken at 4K where available, to put the maximum stress on both GPUs. Where you linked directly to lower-resolution results I found the original article and used the 4K version of the same test.

And none of this changes the fact that, when looking at a wider range of games as done in the two articles I linked rather than cherry-picking games that favour AMD, the average difference between the two is still around 30%.
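To make the resolution point concrete, here's a rough sketch with made-up numbers, assuming the frame rate you measure is simply capped by whichever of the GPU or CPU is slower. It shows how a CPU limit at lower resolutions can hide a real GPU gap:

```python
# Illustrative only: how a CPU limit compresses the measured gap between two
# GPUs at lower resolutions. All numbers are hypothetical, not benchmarks.

def measured_fps(gpu_limit: float, cpu_limit: float) -> float:
    """The slower of the two limits decides the frame rate you actually see."""
    return min(gpu_limit, cpu_limit)

CPU_LIMIT = 110.0  # roughly resolution-independent

# (label, fps the faster card could render, fps the slower card could render)
scenarios = [
    ("4K", 65.0, 50.0),      # both cards well below the CPU limit
    ("1080p", 160.0, 123.0), # same ~30% GPU gap, but now the CPU gets in the way
]

for label, fast, slow in scenarios:
    a, b = measured_fps(fast, CPU_LIMIT), measured_fps(slow, CPU_LIMIT)
    print(f"{label}: {a:.0f} vs {b:.0f} fps -> measured gap {(a / b - 1) * 100:.0f}%")

# 4K:    65 vs 50 fps   -> 30% (the true GPU difference shows)
# 1080p: 110 vs 110 fps -> 0%  (both cards hit the CPU wall, the gap disappears)
```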
 
Of course it's an exaggeration. You're trying to create a false narrative that Pascal has lost so much performance since its launch relative to Vega that the 1080Ti now generally performs like a Vega64.

I only used your result/image where you tried to prove that one card was weaker than another. Now you are saying that the very list you posted yourself is inaccurate for showing the difference between the Vega 64 and the 1080 Ti, because... you have other lists for that argument?

Lists are like cherries; they are often picked.

Edit: looks like the list showed the 1080 Ti above the 64 - I was looking at the 1080 (regular), lol
 
Still, some think Pascal GPUs age badly. Recently there was a forum member enjoying CP2077 at really nice fidelity on a 'potato' 1070, a GPU four and a half years old.
Has anyone tried it with a Vega GPU?
 
Still, some think Pascal GPUs age badly. Recently there was a forum member enjoying CP2077 at really nice fidelity on a 'potato' 1070, a GPU four and a half years old.
Has anyone tried it with a Vega GPU?

CP2077 specifically has “bugs” built in that detect whether you are using an AMD CPU and then disable multi-threading...
https://gadgettendency.com/users-fo...s-logical-cores-on-intel-are-completely-used/

It’s not unthinkable that this game is... “optimized” for Nvidia and Intel hardware.
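For what it's worth, the claim amounts to something like the sketch below: purely illustrative Python showing the kind of vendor-gated thread-count logic being described, not CP2077's actual code. The vendor-string check and the "halve the logical cores" heuristic are assumptions made for the example:

```python
# Illustrative only: the *kind* of logic the linked article describes,
# not CP2077's actual code.
import os
import platform

def pick_worker_threads() -> int:
    logical = os.cpu_count() or 1
    # On Windows, platform.processor() usually includes the CPUID vendor
    # string (e.g. "AuthenticAMD" / "GenuineIntel"); it can be empty elsewhere.
    vendor = platform.processor()
    if "AuthenticAMD" in vendor:
        # Assumed heuristic for this sketch: use physical cores only,
        # approximated as half the logical count on SMT-2 CPUs.
        return max(1, logical // 2)
    return logical  # Intel (or unknown): use every logical core

print(f"Worker threads chosen: {pick_worker_threads()}")
```

Reportedly, the community workaround was simply to stop the AMD-specific path from being taken so that all logical cores get used, which lines up with the "changing that line" reports further down.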
 
From what I've seen, most people are actually running CP2077 on Ryzen-based systems without problems.

I'll leave the conspiracy theories to you ;)
 
I only used your result/image where you tried to prove that one card was weaker than another. Now you are saying that the very list you posted yourself is inaccurate for showing the difference between the Vega 64 and the 1080 Ti, because... you have other lists for that argument?

Lists are like cherries; they are often picked.

Edit: looks like the list showed the 1080 Ti above the 64 - I was looking at the 1080 (regular), lol

Yes, these are the two reviews I've consistently referenced since the beginning. Between them they include at least 6 out of the 9 AMD-friendly games that techuse keeps referencing, so I think they're a pretty fair sample.

And I just want to highlight (again) that I'm not saying AMD doesn't gain more performance over the longer term than Nvidia (at least traditionally - every generation is different). It clearly does, and yes, Vega has gained some ground on Pascal since its launch. The point I'm arguing against is very specifically the claim that Nvidia performance "falls off a cliff" after 18 months. And I've seen nothing presented so far to support that. Even if I agreed that the Vega64 is as fast as a 1080Ti today, that still wouldn't support the claim, since Pascal is over 4 years old now.
 
From what I've seen, most people are actually running CP2077 on Ryzen-based systems without problems.

I'll leave the conspiracy theories to you ;)

People have reported getting performance increases after changing that line, but this is a case where I do indeed think it's an honest miss from the developers rather than any kind of intentional sabotage aimed at AMD.

It was actually somewhat similar when TW3 was released: even Nvidia users complained that Kepler performed worse than expected and felt jittery, and Nvidia was quite quick to release a driver that improved it noticeably. One could claim they hadn't even bothered to test it properly on Kepler, but people took it too far when they claimed the newer drivers were intentionally made to cripple Kepler.


I still use a GTX Titan and it's hit-and-miss with modern games. Hitman 2 in DX12 worked quite fine, Gears 5 and Tactics too. Metro Exodus in DX12 was generally above 30 FPS but didn't really feel smooth. Vulkan games, however, seem to be terrible.

I never bothered to run any tools to identify the issue, but when I tried Wolfenstein 2 and RAGE 2 back in the summer, even at the lowest settings they were never able to stay close to 60 FPS, and it felt so far from smooth that they must have had severe frame pacing issues.
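If anyone does want to check, a frame-time capture makes this easy to quantify. Below is a rough sketch of the kind of summary a frame-pacing tool produces; it assumes you already have per-frame times in milliseconds, and the sample numbers are made up purely for illustration:

```python
# Rough sketch of a frame-pacing summary, assuming a list of per-frame times
# in milliseconds (e.g. exported from a capture tool).
import statistics

def pacing_report(frame_times_ms: list[float]) -> None:
    avg_fps = 1000.0 / statistics.mean(frame_times_ms)
    # "1% low": the FPS implied by the slowest 1% of frames
    slowest = sorted(frame_times_ms, reverse=True)
    one_percent = slowest[: max(1, len(slowest) // 100)]
    low_fps = 1000.0 / statistics.mean(one_percent)
    # Crude stutter count: frames taking over 1.5x the median frame time
    median = statistics.median(frame_times_ms)
    stutters = sum(t > 1.5 * median for t in frame_times_ms)
    print(f"avg {avg_fps:.1f} fps, 1% low {low_fps:.1f} fps, "
          f"{stutters} stutter frames out of {len(frame_times_ms)}")

# Made-up example: mostly ~17 ms frames (around 60 fps) with occasional 40 ms
# spikes - a decent average that still feels anything but smooth.
pacing_report([17.0] * 95 + [40.0] * 5)
```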
 
People have reported getting performance increases after changing that line, but this is a case where I do indeed think it's an honest miss from the developers rather than any kind of intentional sabotage aimed at AMD.

The game has tons of bugs. This is probably just another one of them...
 
People have reported getting performance increases after changing that line, but this is a case where I do indeed think it's an honest miss from the developers rather than any kind of intentional sabotage aimed at AMD.

I still use a GTX Titan and it's hit-and-miss with modern games. Hitman 2 in DX12 worked quite fine, Gears 5 and Tactics too. Metro Exodus in DX12 was generally above 30 FPS but didn't really feel smooth. Vulkan games, however, seem to be terrible.

That the bug is there for Ryzen CPUs on PC doesn't mean it's also present on consoles. I haven't bothered to change any values (yet), as I have zero problems with the CPU (3950X), NVMe SSD or RAM. It feels like the 2080 Ti is the limiting factor now; I would have had an even better experience with a 3080 or something.

The first Titan? Also a dinosaur :p It's the Kepler architecture, which didn't focus on GPGPU as much as the generations before it (tech sites even predicted trouble) or AMD's GCN, and therefore didn't age as well in games that take advantage of that. Still, it's an almost 5 TFLOPS GPU packing 6 GB of fast RAM, and it's almost a whole year older than the PS4/Xbox One. It was bad value anyway compared to, say, going for an R9 290X... :)
 