Bondrewd
Veteran

> Also, AMD's CPU division has proven that it is possible to topple Goliath with a well timed shot.
Idk, AMD isn't making any money while Intel is posting record quarters.
~Nothing changed~

> Idk, AMD isn't making any money while Intel is posting record quarters.
That's just bullcrap and you know it very well. AMD has clawed back a lot of marketshare, went a lot more positive on revenue, and dropped 2/3rds of their debt in 3 years thanks to Zen.

> That's just bullcrap and you know it very well. AMD has clawed back a lot of marketshare, went a lot more positive on revenue, and dropped 2/3rds of their debt in 3 years thanks to Zen.
Like 3% server and some 4-5% client over 3 years is about as unimpressive as it gets.

> RDNA2 has potential - budgets might be a bit fatter, the design decisions should be gaming oriented, etc.
Yeah it's an absolute killer, makes me wish for wider N23 adoption come early next year.

> TL;DR AMD has to be the cheaper solution
But they won't be.

From a pure engineering standpoint the criticism isn't unwarranted. Navi has a feature deficit and a node advantage over Turing and still didn't make much of a dent. Also, AMD's CPU division has proven that it is possible to topple Goliath with a well timed shot.

> Totally different situations.
Yeah, the latter is more of a Qualcomm-tier happening, and we all know what happened to QC's GPU lead.

Higher power consumption doesn't matter for consumers.

> Source?
GPU A uses 150 watts. GPU B uses 215 watts. If GPU B is faster and will maintain its performance much better going forward, what logical reason would any consumer have to care about power usage?

> Look, let's double down.
> GPU A uses <150 watts, can only play most games at medium settings, 1080p, misses the 60fps mark fairly often.
> GPU B uses 215 watts, it's 50x faster. Plays every game maxed, 16k resolution, 144fps+.
> I would take GPU A every day, cause I don't give a fuuuge about how much faster card B is if I'm not going to be able to actually play with it without sweating like a pig 9 months a year in my place. As simple as that.
In what reality do AMD GPUs heat up an entire room because they use 50-75 extra watts?

The obvious catch here is that GPU B isn't really faster.

> In the case of GCN vs Pascal/Maxwell/Kepler they certainly are.
No, they aren't. In all of these cases NV parts were on the same performance level or actually faster (980Ti etc.) at launch framerates - which coincidentally is the moment where the majority of PC gamers tend to make an upgrade. Kepler fell back over time due to architectural issues, but both Maxwell and Pascal are doing just fine even today.

> No, they aren't. In all of these cases NV parts were on the same performance level or actually faster (980Ti etc.) at launch framerates - which coincidentally is the moment where the majority of PC gamers tend to make an upgrade.
980Ti was the one price point Nvidia did well at. At the lower price points the AMD offerings were clearly better. Maxwell and Pascal are both exhibiting the same performance drop-off as Kepler, just to a lesser extent.

> 980Ti was the one price point Nvidia did well at. At the lower price points the AMD offerings were clearly better.
1060 was faster than 480 at launch.

> Maxwell and Pascal are both exhibiting the same performance drop-off as Kepler, just to a lesser extent.
No, they are not exhibiting anything close to what Kepler was showing at this time in its life.

> Look, let's double down.
If you had any sense and enough money, you'd take B and enable vsync.

> What kind of stupid question is that? All cards do, everything that produces heat does, in our one and only reality. The higher the wattage the harder it becomes to cool the room down. And not everybody can have centralised AC systems you know...
Ease up on the stupid. And not too long ago we were using multiple 60 and 100W bulbs in a single room. I don’t think a light bulb was the tipping point for turning on the AC. But I agree that I’d prefer lower power draw, because that usually means quieter fans.
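
For a rough sense of the scale being argued over here, a back-of-the-envelope sketch in Python (all inputs are illustrative assumptions - a 65 W delta between the hypothetical GPU A and GPU B, 4 hours of gaming a day, $0.15/kWh - not measured figures):

# Back-of-the-envelope: cost and heat of a ~65 W difference between two GPUs.
# All inputs are illustrative assumptions, not measured figures.
extra_watts = 65          # assumed draw delta between "GPU A" and "GPU B"
hours_per_day = 4         # assumed daily gaming time
price_per_kwh = 0.15      # assumed electricity price, USD/kWh

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
extra_cost_per_year = extra_kwh_per_year * price_per_kwh

# While gaming, the extra heat dumped into the room equals the extra draw:
# roughly 65 W, i.e. about one old incandescent bulb left on.
print(f"Extra energy per year: {extra_kwh_per_year:.0f} kWh")   # ~95 kWh
print(f"Extra cost per year:   ${extra_cost_per_year:.2f}")     # ~$14

Whether one bulb's worth of heat and roughly $14 a year matters is exactly what the posters above disagree on.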

Luckily Nvidia has access to the same process node as AMD, so it's probably not going to be quite as easy as that when it comes to GPUs... but we'll just have to wait and see the difference between Ampere and RDNA2 GPUs. AMD has claimed that they would disrupt the 4K gaming space, so I'm ready either way. Light the fire under the butt of Nvidia and let's get a good high-end GPU war going on between them!