Value of Consoles vs PC *spin off*

Sorry if the 670 did run all the games correctly;
A coworker with an old GTX 770 rig showed me last year how RDR2 ran like garbage on his system, and some other games did as well. I assumed the GTX 670 would be even worse.
Maybe the GTX 670 is much stronger and games are better optimized for it compared to the GTX 770. Maybe Nvidia killed the drivers so that the newer cards appeared to perform better? I honestly don't know.

The 770 is faster than the 670. However, that's not relevant to the point we were discussing, which was how Kepler's performance held up in the four years after its launch, from 2012 to 2016. RDR2 launched two years after that, in 2018, during a period when Kepler performance really had fallen off a cliff in some games.
 

Actually, a GTX 770 will turn in PS4-like (or maybe even better) performance in RDR2 on medium settings, even with high textures if you've got the 4GB version.


One of the problems is that people turn settings up too high for increasingly smaller gains, and then complain about performance. Another is that if you don't cap at 30 fps, variation in the 30 - 40 fps range on a fixed-refresh display can feel worse than a 30 fps cap - especially with the far more responsive mouse-and-keyboard input.
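To put rough numbers on that frame-pacing point: here's a minimal sketch of why ~40 fps on a fixed-refresh display can feel worse than a locked 30. The 60 Hz refresh and the render times are illustrative numbers I've picked, not measurements, and the vsync model is deliberately simplified (double buffering, no triple buffering or frame queuing).

```python
import math

REFRESH = 1000 / 60  # ms per refresh on a 60 Hz fixed-refresh panel

def present_times(render_ms):
    """Vsync presentation times, assuming the next frame starts rendering
    as soon as the previous one is handed off (double-buffered pipeline)."""
    times, done, last = [], 0.0, -REFRESH
    for t in render_ms:
        done += t  # when this frame finishes rendering
        # shown at the first refresh boundary after completion, and at
        # least one refresh after the previous frame (epsilon absorbs
        # floating-point rounding in the boundary test)
        boundary = math.ceil(done / REFRESH - 1e-9) * REFRESH
        last = max(boundary, last + REFRESH)
        times.append(last)
    return times

def pacing(times):
    """How long each frame actually stays on screen, in ms."""
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

uncapped = [25.0] * 8                              # a steady 40 fps render rate
capped = [max(t, 2 * REFRESH) for t in uncapped]   # the same GPU, capped at 30

print(pacing(present_times(uncapped)))  # alternates 16.7 / 33.3 ms -> judder
print(pacing(present_times(capped)))    # steady 33.3 ms -> even, predictable
```

Even though the uncapped run averages more frames per second, every other frame is shown for half as long as its neighbours, which is exactly the uneven cadence people perceive as stutter.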

I really wish I'd bought a 4GB 680 back in the day. I'd still take that over a PS4, and even more so over the Xbox One.
 

Absolutely. Even if it didn't age as well as other GPUs, it's still a good old beater. Even my 670 2GB served me well. But to be honest, a 7950 or 7970 would have done even better. 2012 products, still kicking.
 
Kudos on the effort you put in there. I feel like I should match it now. So here are the results from what you've posted above in terms of Turing vs Pascal performance:

Forza - the 2080 is between 1% and 12% faster than the 1080 Ti depending on resolution, which is perfectly in line with TPU's 8% faster rating in their GPU spec database, presumably taken at the point of launch.

BFV - the 2080 is 33% faster than the 1080, which is in line with the TPU database

RDR2 - the 2080 is 15% - 26% faster than the 1080 Ti here, so this one is definitely above where we would expect Turing to be in relation to Pascal

Star Wars: Squadrons - the 2080S is 14% faster than the 1080 Ti - perfectly in line with the TPU database

Godfall - the 2080 is 15% faster than the 1080 Ti, so a little more than the TPU DB

Dirt 5 - the 2080 is 10% faster than the 1080 Ti, which is in line with the TPU DB

Doom Eternal - the 2080 is 26% faster than the 1080 Ti, so well above where we would expect Turing to be in relation to Pascal

WWZ - the 2080 is 4% - 9% faster than the 1080 Ti, which is in line with the TPU DB

Division 2 - the 2060 is 5% - 7% faster than the 1080, which is a little above the TPU DB, which pegs them as even

So of the 9 games looked at, only 2 show a significant variation from what we'd expect based on launch performance (see the sketch below). And there will of course always be outliers, especially where games take advantage of Turing's newer feature set (something I already called out in the post you were responding to).
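For what it's worth, this kind of comparison is easy to sanity-check in a few lines. The deltas below are the figures from the list above (midpoints where a range was given), the "expected" values are the TPU launch-day ratings as cited there, and the 10-point threshold is just my arbitrary cut-off for "significant":

```python
# Measured Turing-vs-Pascal deltas from the list above (midpoint where a
# range was given) against the launch-day expectation cited from TPU's
# database. The 10-point threshold is an arbitrary cut-off.
games = {
    "Forza (2080 vs 1080 Ti)":         ((1 + 12) / 2,  8),
    "BFV (2080 vs 1080)":              (33,           33),
    "RDR2 (2080 vs 1080 Ti)":          ((15 + 26) / 2, 8),
    "SW Squadrons (2080S vs 1080 Ti)": (14,           14),
    "Godfall (2080 vs 1080 Ti)":       (15,            8),
    "Dirt 5 (2080 vs 1080 Ti)":        (10,            8),
    "Doom Eternal (2080 vs 1080 Ti)":  (26,            8),
    "WWZ (2080 vs 1080 Ti)":           ((4 + 9) / 2,   8),
    "Division 2 (2060 vs 1080)":       ((5 + 7) / 2,   0),
}

THRESHOLD = 10  # percentage points

for name, (measured, expected) in games.items():
    drift = measured - expected
    flag = "  <-- outlier" if abs(drift) >= THRESHOLD else ""
    print(f"{name}: {drift:+.1f} points vs launch expectation{flag}")
# Only RDR2 (+12.5) and Doom Eternal (+18.0) get flagged - the same two
# games called out above as beating expectations.
```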

I get that you're trying to show AMD's growing performance over time vs Nvidia, but if you cherry-pick AMD-friendly games then of course you'll be able to show that. I'm sure the reverse could be shown by cherry-picking Nvidia-friendly games. That said, I won't deny that AMD consistently seems to gain ground over time, but that's different to Nvidia performance falling off a cliff like it did with Kepler (where the equivalent today would be something like the RX 580 performing in line with a 2080 Ti). Those gains can probably be attributed to AMD picking up more console-level optimisations than Nvidia, as opposed to the Kepler situation of its architecture simply being unsuited to modern games.



Let me remind you of what I said earlier in this post:

"It seems to me that developers and Nvidia offer good support for at least n-1 architectures which would give a typical architecture 4 years of well supported life. Pascal for example is still more than capable in any new game now a little over 4 years from it's launch. But I do expect it to start falling behind now that Ampere has launched and it's likely receiving less support from Nvidia"

It seems Cyberpunk fits that description perfectly. Pascal is now 2 generations and more than 4 years old, so we should expect some performance loss - especially in a game known to be using DX12U features, which we know Pascal lacks.

Remember, this discussion started with you claiming that Nvidia performance "falls off a cliff" after 18 months, which is what I'm disputing. I still see no evidence of that.
It's just odd how AMD starts consistently gaining ground as soon as an Nvidia GPU is no longer Nvidia's current product. Battlefield 1 for example ran great on Pascal; Battlefield V comes along and performance drops quite a bit whilst remaining almost identical on GCN. Same situation for Forza Horizon 3-4. What DX12U features is Cyberpunk using outside of DXR? What DX12U features does GCN support that allow it to outperform Pascal? Pascal has more modern feature support than GCN. You'd think that in an Nvidia co-developed game, their own GPUs that most people still own would turn in respectable performance. What on earth is being done that makes a 2080 50% faster than a 1080 Ti?

Edit - several of those games have since seen performance improvements on Turing specifically via driver updates. Forza for example improved by over 20%.
 
Are there any reviews with a 780 Ti in the results anymore?

I have played Star Wars: Squadrons at 4K HDR on my 1080 and it seems to run smoothly. No complaints there! I gasped a little when I saw it in 4K HDR on the OLED vs my washed-out HP Reverb lol. The game has seen some updates, and there have been new drivers since it was last benchmarked, I imagine.
 
This is the only time I've seen the 780 Ti tested in a recent game.

You would probably have to search YouTube for specific games.
 
Kepler hasn't aged the best, but high/ultra settings in modern games aren't always going to go down too well on a 3GB card, especially at 1440p and 4K (lmao).

That said, even the non-Ti version of the 780 (still with 3GB) is clearly ahead of the PS4 and X1 that launched just afterwards. Kepler was what, 20 months old by then?

In terms of gaming on a 780 in 2020, if you're realistic it's still a perfectly okay way to play most games (Doom Eternal is kinda struggling, though).

 
It does look like the 780 Ti fell back a bit compared to the R9 290X, but it's not exactly catastrophic.

Somebody needs to waste a lot of time testing GTX 580 now. ;)
 
Battlefield 1 for example ran great on Pascal; Battlefield V comes along and performance drops quite a bit whilst remaining almost identical on GCN.

Same situation for Forza Horizon 3-4.

I assume you're talking about Vega rather than GCN? Pascal is performing exactly where it should vs the GCN-based RX 580. Vega has indeed stretched its legs quite impressively since its launch vs Nvidia, though - at least in certain AMD-favouring games. However, if you look at the multi-game averages at both TechSpot and TechPowerUp in their recent 6900 XT reviews, you can see that on average Vega is still performing in line with expectations vs Pascal. And even Maxwell!

What DX12U features is Cyberpunk using outside of DXR?

No idea. But it's certainly a massive outlier in terms of relative Turing-to-Pascal performance, as well as being the newest and arguably most technically advanced game on the market. So the idea that it may be using some technical features that Turing has and Pascal lacks isn't that far-fetched, especially given that Pascal is now over 4 years old.

What DX12U features does GCN support that allow it to outperform Pascal? Pascal has more modern feature support than GCN. You'd think that in an Nvidia co-developed game, their own GPUs that most people still own would turn in respectable performance. What on earth is being done that makes a 2080 50% faster than a 1080 Ti?

Again, I assume you mean Vega, because I don't see any evidence of the older GCN cards (i.e. RX 5xx and below) performing better than expected against Pascal here. Vega certainly does, but then I always saw Vega as a very forward-looking architecture. More so than Pascal.

In terms of Nvidia optimising the game specifically for Pascal - why would they? All those Pascal users are potential upgrade targets for Nvidia now. There's no need for them to try and eke out the very best performance from a 4+ year old architecture that Nvidia wants gamers to move on from. Regarding the 2080 vs 1080 Ti comparison, I'm seeing around 35-44% more performance, but yes, that's still far above what we'd expect based on launch reviews, which is more like 8%. So to me this looks like either a lack of driver/developer support and optimisation for a now 2-generation-old, 4-year-old architecture, or the game using a more modern feature set that simply works better on Turing (and Vega), or, most likely, a combination of both. Nevertheless, Pascal is still delivering perfectly acceptable performance in this game for its age, although a more detailed comparison to the mid-gen consoles would be pretty interesting (along with the RX 580).
 
Vega is GCN. It's the direct competitor to the 1070+ lineup of Pascal cards. In what ways do you consider it more forward-looking than Pascal? It hasn't changed much from previous GCN incarnations. The defining improvement it was supposed to bring was broken/unworkable. It's also almost 4 years and 2 generations old. That doesn't stop it from performing well in the majority of modern titles.
 
Vega is GCN.

No... it's not. It's the AMD architecture that succeeded GCN.

It's the direct competitor to the 1070+ lineup of Pascal cards.

Yes. At least we agree on this.

In what ways do you consider it more forward-looking than Pascal? It hasn't changed much from previous GCN incarnations.

It's an entirely new architecture. We had GCN 1.0 through 4.0, with 4.0 being the RX 5xx series. Vega succeeded GCN as the new-generation architecture contemporary with Pascal. If you want to understand it in detail, read the whitepaper. Here's a quote from it:

"With these needs in mind, the Radeon Technologies Group set out to build a new architecture known as “Vega.” “Vega” is the most sweeping change to AMD’s core graphics technology since the introduction of the first GCN-based chips five years ago."

The defining improvement it was supposed to bring was broken/unworkable. It's also almost 4 years and 2 generations old. That doesn't stop it from performing well in the majority of modern titles.

Of course not. No-one said it did. Vega still performs admirably, as does Pascal. That's the point.
 

It was a refinement of GCN, but still clearly GCN, as noted in reviews. It's an entirely new architecture in the same way Pascal was. Vega does perform respectably; I don't agree that Pascal does. TechPowerUp performance summaries are also not useful for older GPUs, as the results are just copied from old data. Even then, performance summaries don't often tell the whole story.

https://static.techspot.com/articles-info/1990/bench/1080p-p.webp
https://static.techspot.com/articles-info/1990/bench/1440p-p.webp

It's only 5-8% but look at the individual experiences and the types of titles where each leads.
 
It was a refinement of GCN, but still clearly GCN, as noted in reviews.

You are literally arguing against the AMD whitepaper, which specifically states that Vega is a new architecture. Of course it borrows heavily from the previous architecture; every new architecture does. Look at RDNA 2 or Ampere - are they actually RDNA 1 and Turing?

It's an entirely new architecture in the same way Pascal was.

That's right, they are both new architectures. That's why they aren't called GCN and Maxwell, or even GCN 5.0 and Maxwell 3.0.

Vega does perform respectably; I don't agree that Pascal does.

Of course you don't.

TechPowerUp performance summaries are also not useful for older GPUs, as the results are just copied from old data. Even then, performance summaries don't often tell the whole story.

https://static.techspot.com/articles-info/1990/bench/1080p-p.webp
https://static.techspot.com/articles-info/1990/bench/1440p-p.webp

It's only 5-8% but look at the individual experiences and the types of titles where each leads.

I'm not sure what you're trying to prove here. They each win in different games, with the overall faster architecture winning a few more by a bit more. So what? That's exactly the pattern that had the 580 pegged as faster than the 1060 when they first launched. The bottom line is this:

[attached chart: 4K multi-game performance summary]


Using the most modern and popular games on the market, on average, Vega is 8% faster than Pascal - which is basically where they were at launch.
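As an aside on where these "on average" figures come from: review sites typically boil per-game results down to one relative-performance number, often via a geometric mean of the per-game fps ratios so that no single high-fps title dominates the average. A minimal sketch, with made-up fps numbers rather than TechSpot's actual data:

```python
import math

# Hypothetical per-game average fps for two cards - placeholder numbers,
# not taken from any review.
vega_64  = {"Game A": 62, "Game B": 88, "Game C": 47, "Game D": 120}
gtx_1080 = {"Game A": 58, "Game B": 85, "Game C": 41, "Game D": 115}

# Geometric mean of per-game ratios: every title contributes equally,
# regardless of its absolute frame rate.
ratios = [vega_64[g] / gtx_1080[g] for g in vega_64]
geomean = math.prod(ratios) ** (1 / len(ratios))

print(f"Relative performance: {100 * (geomean - 1):+.1f}% vs the 1080")
```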

But we're straying from the original point, which was you saying that "Nvidia performance falls off a cliff after 18 months". And yet here we are, 4 years later in Pascal's case, with it still performing well on average and roughly in line with where it was at launch relative to Vega and Turing. Where is the evidence of Pascal performance "falling off a cliff" two and a half years ago?
 
Vega 64 was slower when it launched.

https://www.techspot.com/amp/review/1476-amd-radeon-vega-64/page13.html

5% slower to 8% faster.
 

According to TPU it was about the same:

https://www.techpowerup.com/review/amd-radeon-rx-vega-64/31.html

But that's beside the point. I don't deny (in fact I've already agreed above) that AMD architectures tend to do a little better over time, largely because of their console links, and I think AMD supports older architectures better than Nvidia does. So yes, Vega moving a few percentage points up in relation to Pascal 4 years after Pascal's launch isn't at all surprising.

But it's a few percentage points. After 4 years.

Again, where is the evidence of Pascal performance "falling off a cliff" two and a half years ago?
 