Nvidia Ampere Discussion [2020-05-14]

No Turing support is pretty disappointing.
Planned obsolescence makes money.
This isn't exclusive to nvidia, obviously.
AMD kind of tried to pull a dirty one with the Zen3 motherboard compatibility episode from last year, and Intel used to not give a fuck and change sockets every year to make sure every new CPU had to be bought with a new motherboard.
 
Is this blacklist confirmed somewhere? It doesn't sound right.
I thought AMD said they had one during the RDNA2 launch?
It is also somewhat plausible from benchmarks which show exactly the same results in games with and without SAM - which points to the driver not using the RBAR functionality in those cases.

By default BAR is at 256 MB, and with Resizable BAR it's set to max VRAM, whatever that may be on each card. Surely you can't just turn it on/off at will after the fact, since by default everything assumes there's at least that 256MB BAR available?
The driver can provide only 256 MB of access on a system which allows full VRAM access at the h/w level. This is transparent to the s/w anyway.
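As a rough illustration of why the window size matters (a toy model, not driver code - the chunk arithmetic is mine, not from any actual driver): with the default 256 MiB BAR the driver has to stage uploads through the window in chunks, while a full-VRAM BAR lets a transfer go up in one piece.

```python
# Toy model: number of window-sized staging copies needed to upload an
# asset through a CPU-visible BAR window (all sizes in MiB). Real
# drivers batch and pipeline transfers in far more complex ways; this
# only illustrates the chunk count.

def staging_copies(asset_mib: int, bar_window_mib: int) -> int:
    """How many window-sized chunks an upload must be split into."""
    return -(-asset_mib // bar_window_mib)  # ceiling division

# A 1 GiB stream through the default 256 MiB BAR:
print(staging_copies(1024, 256))    # 4
# The same stream with Resizable BAR exposing 8 GiB of VRAM:
print(staging_copies(1024, 8192))   # 1
```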

Planned obsolescence makes money.
Yeah, NV has obviously been planning for years to add RBAR via vBIOS updates. Just screams "planned" to me.
 
Yeah, NV has obviously been planning for years to add RBAR via vBIOS updates. Just screams "planned" to me.

They planned for the current gen cards to get a vBIOS update, and for the previous-gen cards to not get one.
That is planning for obsolescence on the previous cards.

But you can choose to believe that poor, paragon-of-consumer-friendliness nvidia is doing all it can to keep its RTX 20 owners from ever having to upgrade to RTX 30 cards.
 
everything assumes there's at least that 256MB BAR available?

Nvidia claimed the benefit of the larger pool is that transfers can be larger or run concurrently to reduce the time it takes to transfer data to the GPU.

Does anyone know where exactly this is managed? Doesn’t seem like the game is doing anything different.
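The "larger or concurrent transfers" claim can be sketched with a simple latency model (all numbers below are assumptions for illustration - a 10 µs per-transfer setup cost and a ~16 GiB/s link - not measurements): fewer, larger transfers pay the fixed setup cost fewer times.

```python
# Toy model of the claim above (assumed numbers, not measured): each
# transfer through the BAR window pays a fixed setup cost, so a larger
# window means fewer chunks and less total overhead.

SETUP_MS = 0.01  # assumed 10 us setup cost per transfer

def upload_time_ms(total_mib: int, window_mib: int) -> float:
    chunks = -(-total_mib // window_mib)   # ceiling division
    copy_ms = total_mib / 16384 * 1000     # assumed ~16 GiB/s link speed
    return chunks * SETUP_MS + copy_ms

print(round(upload_time_ms(1024, 256), 2))   # 62.54 (default 256 MiB BAR)
print(round(upload_time_ms(1024, 8192), 2))  # 62.51 (Resizable BAR)
```

With these assumed numbers the saving from fewer transfers is tiny, which fits the modest 5-10% gains discussed in this thread; the larger claimed benefit would come from independent uploads not having to serialize through one small window, which this model doesn't capture.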

They planned for the current gen cards to get a vBIOS update, and for the previous-gen cards to not get one.
That is planning for obsolescence on the previous cards.

But you can choose to believe that poor, paragon-of-consumer-friendliness nvidia is doing all it can to keep its RTX 20 owners from ever having to upgrade to RTX 30 cards.

That’s a strange interpretation of “planning”. I suppose they’re not releasing BIOS updates for Pascal cards either to force people to upgrade.
 
Planned obsolescence makes money.
This isn't exclusive to nvidia, obviously.
AMD kind of tried to pull a dirty one with the Zen3 motherboard compatibility episode from last year, and Intel used to not give a fuck and change sockets every year to make sure every new CPU had to be bought with a new motherboard.
Well, mostly it's just a 5-10% performance increase; you can hardly call that planned obsolescence. If it were, Turing would lack all the important next-gen features like DirectStorage and DX12U support.
 
They planned for the current gen cards to get a vBIOS update, and for the previous-gen cards to not get one.
That is planning for obsolescence on the previous cards.
Yeah, they've planned this for centuries. It is so obvious, just like the whole 5600XT launch a year ago - that was planned back in the 90s too.
As I've said already, this has more to do with what AIBs are/were planning than with NV, because it is the AIBs' task to produce the new vBIOSes for their cards - and BIOSes for their motherboards, for that matter.
I'm sure that NV would be happy to add RBAR support to all their GPUs starting with Fermi or so, but alas that's not something they can do on their own. Which is a pity, tbh.
Arguably though, the slower the GPU is, the less likely it is to be limited by CPU-to-GPU communication and the more likely it is to be limited by actual GPU rendering. So maybe they've decided that it isn't worth it.
 
I thought AMD said they had one during the RDNA2 launch?
Could be, but I don't remember hearing that.
It is also somewhat plausible from benchmarks which show exactly the same results in games with and without SAM - which points to the driver not using the RBAR functionality in those cases.
I'm more inclined to believe that it just makes no difference in those titles, unless someone actually finds source for the blacklist.

The driver can provide only 256 MB of access on a system which allows full VRAM access at the h/w level. This is transparent to the s/w anyway.
Can it, or are you just assuming it can? I mean, there's relatively little information available on the matter currently, and since the size is defined at boot, I wouldn't be so sure the driver could just artificially limit it.
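The general mechanism being debated - software exposing only a small window of a larger underlying resource - is at least possible in principle. A file-backed mmap serves as a stand-in for a PCI BAR in this sketch; whether a GPU driver actually restricts itself this way is exactly the open question above.

```python
# Analogy only: map a 256-byte window of a larger 4 KiB resource.
# The file stands in for the full BAR aperture; the partial mapping
# stands in for a driver handing out less than the hardware allows.
import mmap
import tempfile

with tempfile.TemporaryFile() as f:
    f.write(b"\x00" * 4096)              # pretend this is the full aperture
    f.flush()
    window = mmap.mmap(f.fileno(), 256)  # expose only the first 256 bytes
    print(len(window))                   # 256
    window.close()
```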
 
Well, mostly it's just a 5-10% performance increase; you can hardly call that planned obsolescence. If it were, Turing would lack all the important next-gen features like DirectStorage and DX12U support.

That's 5-10% for now, though. Consoles have had this kind of memory access for a good while, which means that this gen, PC shaders can be compiled with the same sort of optimizations consoles get from it. So we'll be seeing a better performance uplift down the line.

That being said, Turing was on the drawing board many, many years ago. It's not necessarily a case of planned obsolescence; it could easily be that there just wasn't time to add it.
 
Apparently it was AMD who, together with HP, originally suggested the feature to the PCI-SIG. They also mention that AMD was already working on this in 2015 (on Linux):
https://hardwaresfera.com/en/articulos/resizable-bar/
I'm not doubting what he claims, though given his background (PC engineer), supporting links would help his case. I did see somewhere that many companies contributed back when the feature was incorporated into the PCIe spec.
 
What about checking the ECN?

[Attached: screenshot of the Resizable BAR ECN]


https://composter.com.ua/documents/ECN_Resizable_BAR.pdf
 


It's nothing short of impressive that this only saw a real implementation 12 years later. It really puts into perspective how slowly some changes can happen, and for how long some of the features we're seeing in recently launched products have been in development.
This year, someone might be making a request to the PCI-SIG consortium that won't have any impact until 2030. My first thought on that would have been "how the hell do we know if there'll even be a PCI-SIG 10 years from now?!".
 
It only saw a real implementation 12 years later because it didn't help much, if at all, in the 12 years since its inclusion in the standard. Now there are some gains in titles which aren't properly optimized for PC NUMA.
 
It only saw a real implementation 12 years later because it didn't help much, if at all, in the 12 years since its inclusion in the standard. Now there are some gains in titles which aren't properly optimized for PC NUMA.

https://www.techpowerup.com/review/amd-radeon-sam-smart-access-memory-performance/2.html

I'm seeing some pretty old games with measurable boosts in performance, like Civ VI from 2016.
It's definitely something that could have made a difference much earlier than late 2020.
 
Planned obsolescence makes money.
This isn't exclusive to nvidia, obviously.
AMD kind of tried to pull a dirty one with the Zen3 motherboard compatibility episode from last year, and Intel used to not give a fuck and change sockets every year to make sure every new CPU had to be bought with a new motherboard.

I have heard this fallacy a gazillion times, but every time I've asked for evidence... we enter the conspiracy realm, and I suspect this will be the case here too.
I think the last fallacy that got debunked was "planned obsolescence for Kepler"... which was utterly debunked in several articles and made the fallacy posters scurry and run for cover.

I bet this will be no different, but here goes:

Proof of "planned obsolescence"... that is not just fanboi feelings?
 