Whole lot of stuff that isn't about AMD Navi *cleanup*

That's just bullcrap and you know it. AMD has clawed back a lot of market share, turned revenue strongly positive, and dropped two-thirds of its debt in three years thanks to Zen.
Like 3% server and some 4-5% client share over three years is about as unimpressive as it gets.
Good margin expansion and debt control, yeah, but still.
Meh.
One cannot kill the mountain; you gotta grind year over year and all.
RDNA2 has potential - budgets might be a bit fatter, the design decisions should be gaming-oriented, etc.
Yeah it's an absolute killer, makes me wish for wider N23 adoption come early next year.
TL;DR AMD has to be the cheaper solution
But they won't be.
 
GCN was better than Nvidia's alternatives at the majority of price points. Higher power consumption doesn't matter to consumers; performance and price do. AMD typically beat Nvidia at both whenever it competed in a given product segment.
 
From a pure engineering standpoint the criticism isn't unwarranted. Navi has a feature deficit and a node advantage over Turing and still didn't make much of a dent. Also, AMD's CPU division has proven that it is possible to topple Goliath with a well-timed shot.

That Goliath had drunk himself senseless on his spoils of victory, passed out, and was still sleeping when David nailed him with the rock. I mean, he was just lying there!
Nvidia is a Goliath who has pretty much given up on fighting; he wants to calculate the orbits of the stars in the sky instead, and there’s just this annoying guy pestering him with slung rocks whom he has to slap down every so often so he can keep tinkering with the stuff he’s really interested in these days.

Totally different situations.
 
GPU A uses 150 watts. GPU B uses 215 watts. If GPU B is faster and will maintain its performance much better going forward, what logical reason would any consumer have to care about power usage?

Look, let's double down.

GPU A uses <150 watts. It can only play most games at medium settings at 1080p and misses the 60 fps mark fairly often.
GPU B uses 215 watts and is 50 times faster. It plays every game maxed out at 16K resolution, 144+ fps.

I would take GPU A every day, because I don't give a fuuuge about how much faster card B is if I'm not going to be able to actually play on it without sweating like a pig nine months a year in my place. As simple as that.
 
Let's triple down...

GPU B can only be bought in one particular place...

[meme image]
 
Look, let's double down.

GPU A uses <150 watts. It can only play most games at medium settings at 1080p and misses the 60 fps mark fairly often.
GPU B uses 215 watts and is 50 times faster. It plays every game maxed out at 16K resolution, 144+ fps.

I would take GPU A every day, because I don't give a fuuuge about how much faster card B is if I'm not going to be able to actually play on it without sweating like a pig nine months a year in my place. As simple as that.
In what reality do AMD GPUs heat up an entire room because they use 50-75 extra watts?

The obvious catch here is that GPU B isn't really faster.

In the case of GCN vs Pascal/Maxwell/Kepler they certainly are.
 
In the case of GCN vs Pascal/Maxwell/Kepler they certainly are.
No, they aren't. In all of these cases the NV parts were on the same performance level or actually faster (980 Ti, etc.) in their launch timeframe - which, coincidentally, is when the majority of PC gamers tend to make an upgrade.
Kepler fell back over time due to architectural issues, but both Maxwell and Pascal are doing just fine even today.
 
No, they aren't. In all of these cases the NV parts were on the same performance level or actually faster (980 Ti, etc.) in their launch timeframe - which, coincidentally, is when the majority of PC gamers tend to make an upgrade.
Kepler fell back over time due to architectural issues, but both Maxwell and Pascal are doing just fine even today.
The 980 Ti was the one price point Nvidia did well at. At the lower price points the AMD offerings were clearly better. Maxwell and Pascal are both exhibiting the same performance drop-off as Kepler, just to a lesser extent.
 
In what reality do AMD GPUs heat up an entire room because they use 50-75 extra watts?

What kind of stupid question is that? All cards do, everything that produces heat does, in our one and only reality. The higher the wattage the harder it becomes to cool the room down. And not everybody can have centralised AC systems you know...
 
The 980 Ti was the one price point Nvidia did well at. At the lower price points the AMD offerings were clearly better.
1060 was faster than 480 at launch.
1080 was faster than Vega 64 at launch (and now actually).
970 was faster than 290X.
2080 was faster than Radeon VII.
Should I continue or do you get the picture?

Maxwell and Pascal are both exhibiting the same performance drop-off as Kepler, just to a lesser extent.
No, they are not exhibiting anything close to what Kepler was showing at this time in its life.
 
Look, let's double down.

GPU A uses <150 watts. It can only play most games at medium settings at 1080p and misses the 60 fps mark fairly often.
GPU B uses 215 watts and is 50 times faster. It plays every game maxed out at 16K resolution, 144+ fps.

I would take GPU A every day, because I don't give a fuuuge about how much faster card B is if I'm not going to be able to actually play on it without sweating like a pig nine months a year in my place. As simple as that.
If you had any sense and enough money, you take B and enable vsync.

What kind of stupid question is that? All cards do, everything that produces heat does, in our one and only reality. The higher the wattage the harder it becomes to cool the room down. And not everybody can have centralised AC systems you know...
Ease up on the stupid. And not too long ago we were using multiple 60 and 100W bulbs in a single room. I don’t think a light bulb was the tipping point for turning on the AC. But I agree that I’d prefer lower power draw, because that usually means quieter fans.
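To put very rough numbers on the bulb comparison, here's a back-of-the-envelope sketch in Python. The 75 W gap between cards, the 160 W of bulbs and the 4-hour session are assumptions for illustration, not measurements; the only physics used is that every watt a card draws ends up as heat in the room.

# Back-of-the-envelope heat arithmetic for the power-draw argument above.
# All wattages and the session length are illustrative assumptions.

def extra_heat_kwh(delta_watts: float, hours: float) -> float:
    """Extra energy (and therefore heat) released into the room, in kWh."""
    return delta_watts * hours / 1000.0

gpu_delta = extra_heat_kwh(delta_watts=75, hours=4)   # assumed 75 W gap between the two cards
bulbs = extra_heat_kwh(delta_watts=160, hours=4)      # assumed 60 W + 100 W bulbs left on

print(f"Extra heat from the GPU delta: {gpu_delta:.2f} kWh")  # 0.30 kWh
print(f"Heat from two old bulbs:       {bulbs:.2f} kWh")      # 0.64 kWh

Under these assumptions the GPU gap adds roughly half the heat of the two bulbs over the same session - noticeable in a small room with no AC, trivial otherwise.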
 
Luckily Nvidia has access to the same process node as AMD, so it's probably not going to be quite as easy as that when it comes to GPUs... but we'll just have to wait and see the difference between Ampere and RDNA2 GPUs. AMD has claimed that they would disrupt the 4K gaming space, so I'm ready either way. Light the fire under Nvidia's butt and let's get a good high-end GPU war going between them!

4K gaming is already here, so a true disruption would require lowering the cost of entry. The 2080 Ti is the only card that consistently, but barely, delivers 4K 60 fps in today’s games. Is AMD going to give us 4K 60 fps in Cyberpunk 2077 for $500?
 
If you had any sense and enough money, you take B and enable vsync.

r/whoosh

Ease up on the stupid. And not too long ago we were using multiple 60 and 100W bulbs in a single room. I don’t think a light bulb was the tipping point for turning on the AC.

It can, actually. When we had bulbs (when was that, 20+ years ago?) we would tend not to turn on the lights in summer unless strictly necessary, or we'd open doors/windows, etc. Which is not an option while doing an activity that has the potential to wake up the entire building (i.e. MP chat).
 
Instead of bickering about acceptable power/heat for highly subjective circumstances, isn't it possible to agree that - as a general rule of thumb - the better the power/performance the better the product for the vast majority? Higher efficiency has a generally positive impact on both the graphics hardware itself and the system required around it.

As it stands, the power/performance difference between AMD and Nvidia isn't extreme, which is well and good so long as node efficiency means little to the average consumer. But node efficiency will come to the fore once Nvidia releases their 7nm products, unless AMD can close the assumed gap that's about to open between RDNA1 and Ampere.
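For anyone who wants that rule of thumb as a number: performance per watt is just average frame rate divided by board power. A minimal sketch in Python, where GPU A and GPU B are the hypothetical cards from earlier in the thread and the fps/wattage figures are made up:

# Minimal perf-per-watt sketch; all figures below are placeholders, not benchmarks.

def perf_per_watt(avg_fps: float, board_power_w: float) -> float:
    """Simple efficiency metric: frames per second per watt of board power."""
    return avg_fps / board_power_w

card_a = perf_per_watt(avg_fps=90, board_power_w=150)    # hypothetical GPU A
card_b = perf_per_watt(avg_fps=110, board_power_w=215)   # hypothetical GPU B

print(f"GPU A: {card_a:.2f} fps/W")  # 0.60 fps/W
print(f"GPU B: {card_b:.2f} fps/W")  # 0.51 fps/W

Higher fps/W means less heat and a smaller PSU/cooler for the same performance, which is exactly the "better power/performance, better product" rule of thumb above.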
 