AMD GPU gets better with 'age'

You can get bank conflicts on reads and writes to LDS.
Not in this case. A startup CB read from memory to LDS would be a linear access = no conflicts. Reading a constant from LDS = all lanes read the same LDS address = no bank conflict. An indexed load from a constant buffer is trickier. GCN has 32 LDS banks (4 bytes each). If the constant buffer array is <= 32 elements, all accesses either hit a different bank or the same address = no bank conflicts.

But GCN already uses scalar loads + scalar registers for constant buffer reads (constant address or coherent index such as SV_GroupID). The same is true for coherent loads from ByteAddressBuffers and StructuredBuffers. Incoherent (per-thread index) loads of CB/SB/BAB use standard vector memory loads (= much slower). LDS could help in these cases. Even if you hit bank conflicts, you are going to beat VMEM + L1 cache handily.
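To make the staging idea concrete, here is a minimal sketch in CUDA, since shared memory there behaves like LDS (also 32 banks of 4 bytes on current hardware). The kernel name and buffers are illustrative, not from any real codebase:

```cuda
#include <cuda_runtime.h>

__constant__ float cb[32]; // "constant buffer": <= 32 four-byte elements

__global__ void indexedCbRead(const int* idx, float* out, int n)
{
    __shared__ float lds[32];

    // Startup copy from memory to LDS: linear access, one element per
    // lane, each lane hits a different bank -> no conflicts.
    if (threadIdx.x < 32)
        lds[threadIdx.x] = cb[threadIdx.x];
    __syncthreads();

    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
    {
        // Incoherent per-thread indexed read. With <= 32 four-byte
        // elements, element e sits in bank e, so two lanes either hit
        // different banks or the same address (a free broadcast):
        // no bank conflicts either way.
        out[i] = lds[idx[i] & 31];
    }
}
```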
 
Right, so we have to wait two to three years for games to be made that run better on AMD hardware, is that it? Let's buy an AMD card and wait two years? Sorry, that just doesn't sell video cards.
Some people buy video cards knowing they will have to last 5+ years
 
Some people buy video cards knowing they will have to last 5+ years

The vast majority don't. Let's look back five years; that is three to four gens back, so for AMD that is the 6xxx series and for nV that is Fermi. Now take today's games and run them at, let's say, 1080p: those cards for the most part won't give acceptable performance unless features are turned down, and that is for the top-end cards.
 
Kepler GTX 680/670 etc. came out May 2012; I haven't seen a game that is unplayable at 1680x1050 on a 670.
The 6950 runs The Witcher 3 OK;
it can even run it at 5292x1050 at 25 fps.

Is The HD 7970 Still Worth Buying? | The 5 Year Old Flagship
 
The 7970 was announced in December 2011 - five years ago. It still runs most games at acceptable levels at 1080p.

Cheers

A person buying a 500-buck card is going to have a much better monitor than a 1080p one. System configs usually go hand in hand with the hardware people buy.

The longest I've ever held a card was two gens: the 9700 and then the G80, so 2.5 years and then 3 years. As for going five years on a graphics card: people buying top-end cards tend to want something more than turning their settings down to play, and resolution is part of that too.

And Davors's video link says it all: a 7970 is in the same league as a 1050 Ti. Will two people buying those cards have similar systems? I highly doubt that. Enthusiast vs. value customers.
 
I just bought a 1050 Ti for my kids' computer that I got them for Xmas, some crappy Dell thing I got really cheap with an i5-6500 in it. I went with the 1050 Ti because of its <75 W power requirement, and it will be perfectly fine for 1080p gaming for the kids (5 & 7). It's the only circumstance in which I'd ever buy a low-end card like that.
 
The 7970 was announced in December 2011 - five years ago. It still runs most games at acceptable levels at 1080p.

Cheers

And the 7970 is still an incredible GPU for computing... incredibly solid for compute tasks, incredibly stable. Not as fast as the last generation, that's right, but incredibly solid on performance (I speak mainly of ray tracing).
 
I don't think people on this forum have a clue what the "vast majority" does. This here is the land of perpetual, edgy speculation, and once something is released it is history. I don't even know if most of you guys play games, judging by the PC games forum excitement factor. Not as exciting to speculate about games, I suppose. Unless it's over in fruity internut Console land.
 
Just popping in to say I'm still running an HIS 7950 Boost; the thing overclocked like crazy and still runs everything fine. The 2600K is still killing it too; I see no reason to upgrade. Maybe Zen and Vega will shake things up!
 
So when Fury cards have lower or equal performance to Polaris (whether due to VRAM problems or geometry performance), they are aging well?
 
I don't think people on this forum have a clue what the "vast majority" does. This here is the land of perpetual, edgy speculation, and once something is released it is history. I don't even know if most of you guys play games, judging by the PC games forum excitement factor. Not as exciting to speculate about games, I suppose. Unless it's over in fruity internut Console land.


Console people tend to upgrade every five years; that was the typical console life span. PC upgrade cycles are a lot faster than that. You just have to look at basic Steam numbers, the uptake of new graphics cards and the slide of older ones, to see that most gamers tend to upgrade every two generations. This also correlates with nV's pattern: every other generation tends to have a spike in total volume sales, and the same held for AMD and ATi when they were doing well.

People tend to buy graphics cards based on the games they play, and they want to keep the same or similar settings as much as possible.

So five years ago, mainstream monitors were 1080p and enthusiasts were using 2K; now the mainstream is still 1080p, while enthusiasts are on 4K.

So an enthusiast five years ago would need a 500-buck card or more to push 2K. Now 2K can be done with a performance-segment card, though that segment is still in the same price range; but if they have upgraded to a 4K monitor, they would likely keep going with enthusiast-level hardware.

People buying mainstream are going to upgrade to a mainstream card. Three gens on, a mainstream card will be below the performance of today's value market; that is pretty much unplayable. Top-end cards from three gens ago are now low end.

Not gaming much is about the only way a person would keep a graphics card for five years. Just look at the minimum specs of games coming out today.

Here is Quantum Break:

NVIDIA GeForce GTX 760 or AMD Radeon R7 260X

Gears of War 4 minimum specs (and this game is optimized damn well for hardware):

GeForce GTX 750 Ti

Dishonored 2 minimum specs:

NVIDIA GTX 660 2GB / AMD Radeon HD 7970 3GB or better.

Just going by the minimum requirements of games coming out now, the 7970 is not something a person who buys a 500-buck card is going to want to play on. They can choose to hold their card for five years, yeah, but they are going to be playing at the lowest settings at the end of those five years.

Typically, the way I've seen developers target systems is this: they take the top-end cards available when development begins and make sure the game runs at 30 FPS on those, so by the time they are done making the game, they have their minimum requirements. The average game takes three to five years to make, and in three to five years that top-end card is the low-end card.
 
With the new console paradigm of forward compatibility from the base PS4/X1 onwards, there is a good chance the latest $150-200 cards will last* another 5 years for the majority* of games, let alone $200-300 cards.

*Run at 1080p & 30 fps at visual settings equivalent to the PS4's.
 
With the new console paradigm of forward compatibility from the base PS4/X1 onwards, there is a good chance the latest $150-200 cards will last* another 5 years for the majority* of games, let alone $200-300 cards.

*Run at 1080p & 30 fps at visual settings equivalent to the PS4's.


Well, we are getting into another subject area here, but I don't really see that happening. This console gen got an interim update after just three years, and even with that, minimum specs on games are already much higher than the graphics horsepower the original PS4 and Xbone had. I expect to see the same thing with next-gen consoles too: in three years, games' minimum requirements will be higher than the consoles' graphics processors. (Console developers can factor in what the ideal resolutions are; they can limit their games to 1080p max, which is perfect for the graphics processor in those consoles. PC developers can't do that; the landscape changes too fast in an open system.)
 