IHV Preferences, Performance over Quality? *Spawn*

It doesn't necessarily mean that. AMD has preferred performance over quality for quite some time now; that's why they lagged so far behind NV in introducing new visual features on PC. Raja's statement is just an expression of this philosophy.

AMD has no preference for performance over quality, and no features are lagging because of this mythical philosophy.

If anything, it seems NVidia is far more concerned about performance over quality than AMD. At least that's been my experience with the GTX 1070. But then that could just be the games I mostly play (Warframe and GW2 remain my most played games), and AMD's IQ is far better in those games on an R9 290 than on a GTX 1070 (lots of visual anomalies and graphics oddities). And that's despite Warframe actually having NVidia-specific code in the game (some features require the use of an NVidia graphics card) and being a TWIMTBP NVidia partner title.

Putting that aside, NVidia and AMD have the same focus on delivering graphics quality and performance. I doubt either company is actually sacrificing one to boost the other at a hardware level.

Regards,
SB
 
Putting that aside, NVidia and AMD have the same focus on delivering graphics quality and performance. I doubt either company is actually sacrificing one to boost the other at a hardware level.

AMD has no preference for performance over quality, and no features are lagging because of this mythical philosophy.

I am going to post this again here:

Well, it certainly seems this way: PC users feel NV cares for them much more than AMD quality-wise. NV constantly pushes the visual front. They introduced various forms of AO (HBAO+, VXAO) and have the capability to activate AO through drivers for over a dozen games that didn't previously support it. They have various forms of AA (TXAA, MFAA, FXAA). They have FlameWorks, WaveWorks, HairWorks and ShadowWorks (HFTS, PCSS, PCSS+). They still push for PhysX in games to this day. They have Adaptive VSync, Fast Sync, adaptive half refresh rate, etc. They did well with 3D Vision, have 3D Vision Surround and Simultaneous Multi-Projection. They were first with DSR, GeForce Experience, ShadowPlay and G-Sync as well. And these are just off the top of my head. AMD has answers to some of them obviously, but still not all. Hence why they lagged behind and focused on performance-enhancing features like Mantle, for example.
 
Well, both of ya guys are right, just looking at different things. Yeah, no IHV is sacrificing image quality for performance (like they have done in the past), which is great.

But from a technology point of view (software), nV is way ahead with features that can be added by game devs or implemented through their drivers. And this comes down to AMD's lack of resources. The turnaround on that, though: if AMD has competitive products (which they really haven't been able to deliver for two gens), they will gain market share back, and hopefully improve their bottom line too; then they can start pushing for more dev-support software and features. ATi/AMD missed out on their opportunities to do this when they had good products; hopefully they learned their lesson and see the importance of working with devs on these types of products.
 
But from a technology point of view (software), nV is way ahead with features that can be added by game devs or implemented through their drivers.

And what good are those enhancements if the game ends up looking worse because of visual anomalies (bugs) in how the hardware is rendering the game? I point out again that Warframe is part of the NVidia program, and there are a few things I could enable on my 1070 that I can't on the R9 290. However, the game looks far better on the R9 290 because, unlike the 1070, it doesn't render some things in the game incorrectly. And since it isn't a benchmarked game or a recently released hyped game, it gets very little driver support from NVidia, and the visual glitches just continue to annoy.

Which brings me to my point. Everyone goes on and on about how great NVidia driver support is and how they support older hardware, yada yada. Well, what I'd like to see them do is support games as well as AMD does when they aren't the games being benchmarked and they aren't the newest hyped games. Or to put it another way, I wish their drivers didn't break rendering in older games as much as they do. AMD drivers are far better in my experience at rendering older games correctly. Not perfect by any means, but better. At least for the games I play.

I really expected far better, but the 1070 is very much a mixed bag on older games or non-hyped games.

Again, I really like how fast the 1070 is, but it's annoying as hell sometimes to play games on it. OTOH, I could have a more visually stable game using the R9 290, but it's also much slower. And it's in my desktop in the US and not here with me in Japan. :p And before people take this as me hating on the card: I don't. There are a lot of things to like about it, but there are some things that are very much not good about it as well.

Regards,
SB
 
And it's in my desktop in the US and not here with me in Japan. :p


Oh, you are in Japan... I love this country... I have some good friends in Hokkaido, actually.
 
How are proprietary standards that force you to buy certain hardware "caring more about users" than open initiatives that anyone can benefit from and that don't add extra cost?


Most of the GameWorks features work well on both AMD and nV; the only ones that didn't were the ones that use tessellation, and that will be corrected by AMD because of changes in their hardware.
 
I wish their drivers didn't break rendering in older games as much as they do. AMD drivers are far better in my experience at rendering older games correctly.


AMD has their own issues with older games on new cards too; these aren't isolated instances, and they happen with new hardware and old software. But again, it's nV's fault they didn't fix the bugs, yeah; you just have to wait till they get around to it. Irritating, yes, but one game like that shouldn't be the defining characteristic of their entire driver team.

I don't see anyone saying that it's only AMD's problem where these things are concerned, and that is why I stated that both of you guys are correct but are talking about two different things.
 
NV constantly pushes the visual front: HBAO+, VXAO, TXAA, MFAA, FXAA, FlameWorks, WaveWorks, HairWorks, ShadowWorks (HFTS, PCSS, PCSS+), PhysX, G-Sync ... and these are just off the top of my head.

Some are good (HBAO+, Ansel), some are unusable right now (VXAO is DX12-only and a massive fps killer in RotTR), some are too demanding for the minor improvement in IQ (PCSS+), some are becoming irrelevant (PhysX) and some were used only to cripple AMD cards in Witcher 3 (HairWorks). But this thread should be about Vega; wrong section, I presume?
 
what I'd like to see them do is support games as well as AMD does when they aren't the games being benchmarked and they aren't the newest hyped games.
In my experience AMD has as many problems as NV in older games, if not worse. AMD's problems extend to newer games as well, newer unbenchmarked games; it is only natural, too: NV has much bigger resources than AMD and can optimize for a much larger number of games.

Here is a scan of less famous games from the last two years, with AMD vs NV performance comparisons. AMD's performance is significantly lower than NV's in these games (in an unusual way), due to lack of optimizations.

Divinity Original Sin 2
980Ti is 20% faster than FuryX @1080p (and 17% faster @1440p), 980 is almost as fast as FuryX @1080p!
http://gamegpu.com/rpg/роллевые/divinity-original-sin-2-test-gpu

Obduction
980Ti is 55% faster than FuryX @1080p (and 30% faster @1440p), 780Ti is delivering the same fps as FuryX @1080p!
http://gamegpu.com/rpg/роллевые/obduction-test-gpu

ABZU
980Ti is 52% faster than FuryX @1080p (and 30% faster @1440p), 980 is 17% faster than FuryX @1080p as well!
http://gamegpu.com/action-/-fps-/-tps/abzu-test-gpu

War Thunder
980Ti is 15% faster than FuryX @1080p and 25% faster @1440p, 980 is nearly as fast as well @1080p!
http://gamegpu.com/mmorpg-/-онлайн-игры/war-thunder-1-59-test-gpu

The Technomancer
980Ti is 25% faster than FuryX @1080p (and 17% faster @1440p)! 980 is equally fast @1080p!
http://gamegpu.com/rpg/роллевые/the-technomancer-test-gpu

Firewatch
980Ti is 25% faster than FuryX @1080p and @1440p, 980 is almost as fast as well @1080p!
http://gamegpu.com/action-/-fps-/-tps/firewatch-test-gpu.html

Dragons Dogma Dark Arisen
980Ti is 24% faster than FuryX @4K and @1440p, 980 is just as fast @1440p! (@1080p all cards are CPU limited.)
http://gamegpu.com/rpg/rollevye/dragons-dogma-dark-arisen-test-gpu.html

Homeworld: Deserts of Kharak
980Ti is 20% faster than FuryX @4K, though it drops to just 15% @1440p! (@1080p all cards are CPU limited.)
http://gamegpu.com/rts-/-strategii/homeworld-deserts-of-kharak-test-gpu.html

Crossout
980Ti is 46% faster than FuryX @1080p (and 32% faster @1440p), even the 980 is faster @1080p!
http://gamegpu.com/mmorpg-/-онлайн-игры/crossout-test-gpu

Not-so-famous games, but still famous enough! Obviously AMD doesn't care about them that much.

Assassin's Creed Syndicate
980Ti is 24% faster than FuryX @1080p, 21% faster @1440p! 980 is almost as fast!
http://gamegpu.com/action-/-fps-/-tps/assassin-s-creed-syndicate-test-gpu-2015.html

Mad Max
980Ti is 23% faster than FuryX @1080p, 18% faster @1440p!
http://gamegpu.com/action-/-fps-/-tps/mad-max-test-gpu-2015.html

Call Of Duty Modern Warfare Remastered
980Ti is a whopping 72% faster than FuryX @1080p; even a 970 is faster than the FuryX! @1440p the advantage collapses to 25%, and the regular 980 is equal to the FuryX!
http://gamegpu.com/action-/-fps-/-tps/call-of-duty-modern-warfare-remastered-test-gpu

Battleborn
980Ti is 30% faster than FuryX @1080p and @1440p, 980 is as fast as the FuryX!
http://www.pcgameshardware.de/Battleborn-Spiel-54612/Specials/Benchmark-Review-1194406/

Homefront: The Revolution
980Ti is 34% faster than FuryX @1080p, and 23% faster @1440p
http://www.overclock3d.net/reviews/gpu_displays/homefront_the_revolution_pc_performance_review/7
http://www.pcgameshardware.de/Homefront-The-Revolution-Spiel-54406/Tests/Benchmarks-Test-1195960/

Dead Rising
GTX 1060 is 44% faster than RX 480 @4K! (@1080p and @1440p all cards are CPU limited).
http://www.overclock3d.net/reviews/gpu_displays/dead_rising_pc_performance_review/6
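
For reference, the "X% faster" figures in lists like this are just relative average-fps ratios. A minimal sketch of the arithmetic, with made-up fps numbers standing in for the real review data:

Code:
# How "card A is X% faster than card B" is derived from average fps.
# The fps values below are placeholders, not figures from the linked reviews.

def percent_faster(fps_a: float, fps_b: float) -> float:
    """How much faster card A is than card B, in percent."""
    return (fps_a / fps_b - 1.0) * 100.0

gtx_980ti_fps = 72.0  # hypothetical average fps
fury_x_fps = 60.0     # hypothetical average fps
print(f"980Ti is {percent_faster(gtx_980ti_fps, fury_x_fps):.0f}% faster")
# -> 980Ti is 20% faster

Note this also means the percentage shrinks or grows across resolutions purely as the two cards' fps move relative to each other, which is why the same game can show a 72% gap @1080p and only 25% @1440p.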

some are unusable right now (VXAO is DX12-only and a massive fps killer in RotTR)
VXAO is a DX11 feature, not DX12.
some are too demanding for the minor improvement in IQ
Minor or major, they are there for those who would like to take advantage of them, or those who have excess juice to utilize them.
some were used only to cripple AMD cards in Witcher 3
That is debatable at best.
 
Imho NV was right to put so much work into optimizing their DX11 drivers to the current level, even if their DX12 performance might lag at the moment. The reduced CPU overhead and the generally improved performance is something that helps nearly every game out there today. And while the DX12 benches do not look good, there are very few games which actually run smoothly under DX12 on AMD either, AoS being the exception.
 
I don't think either is a reflection of the IHV's preference or philosophy, but rather the differing degree to which they're able to devote man hours to research, coupled with the resources necessary to turn research into technologies and initiatives that last and can be monetized. If you're going to spend money on developing new technology as a value-add to your products, then you own the responsibility of ensuring third parties actually implement it, and Nvidia seems to stand alone in that business. Short of having Jen-Hsun personally fly out to give developers hot stone massages, they seem to do everything in their power to make their investments matter.
 
Hmmm, OK, this is interesting, at least for me. Now that I'm back from Japan, I find I'm extremely reluctant to move the 1070 back into my main desktop rig. While in Japan I'd forgotten just how nice it is to have games rendered correctly. And the thought of going back to the 1070 with its broken rendering in some of the games that I play makes me quite sad.

It's so nice not to see my eyes shining through the back of my head in GW2. A major eyesore since it's a 3rd person game and you're constantly looking at the back of your character's head. Also nice that the lighting in the game is correct on the R9 290. As well, it's quite nice to have correctly rendered lighting in Warframe once again. /sigh.

But the speed. I find I'm faced with the question of what I value more: fast graphics rendering or correct graphics rendering? If all I played were the most recent AAA games, this wouldn't be an issue. NVidia spends a lot of time optimizing their drivers for those games and does a very good job. But since my most played games aren't the most recent AAA games, their driver quality with regards to rendering things correctly is absolutely horrendous.

/sigh. I dearly hope that either NVidia's next generation of hardware doesn't have these rendering issues or that AMD can at least become competitive in the high end again.

Regards,
SB
 
Just have the dedicated GW2 rig? The game shouldn't be that taxing on the GPU (unless supersampling I guess). :p
 
I'm surprised SB bought a NVidia card. He's usually rather negative about the company. ;)

Doesn't GW2 have a history of technical quirks? That's what I remember about it.
 
Just have the dedicated GW2 rig? The game shouldn't be that taxing on the GPU (unless supersampling I guess). :p

I've considered that. However, I do a lot of World vs. World, which has a high CPU load. In that mode there's a very real possibility of hundreds of players being in the same area at once, due to the competition between 3 servers and their respective player armies being funneled into contention with each other. The same goes if you're in a hub city (like Lion's Arch) or anyplace where an event is going on (like daily tasks or world events like Dragon's Stand), albeit with slightly lower player counts.

The machine I put together for Japan just had a lowly i3 2100, so it was a tad sluggish at times in GW2. I'll switch that out whenever I finally upgrade my desktop machine to something newer than an i5 2500k.

The other drawback is having to constantly switch monitor inputs and machines depending on what game I want to play. Bleh.

I'm surprised SB bought a NVidia card. He's usually rather negative about the company. ;)

Doesn't GW2 have a history of technical quirks? That's what I remember about it.

Apparently only on NVidia cards. I never had any graphical rendering problems in it until I got the 1070. A friend of mine with a 970 also has some rendering issues. Not sure if they exist on older NVidia cards or not. The only drawback WRT AMD cards in the game that I can see is that it renders slower on an AMD card than on an equivalent NVidia card.

GW2 was actually what made me super excited to get the 1070 just due to the fact that it renders so much faster on NV hardware. You can imagine my disappointment the first time I tried to do some of the puzzles and couldn't do them without lowering post processing settings or saw my eyes shining through the back of my head.

And yeah, I used to have an extremely negative view of NVidia due to some of their business practices that I didn't agree with at the time. IIRC I put them on a personal purchasing ban for something like 1 or 2 years. Other than that, however, they were just another company that I could choose from to determine what to buy. I've owned a Riva TNT (both 1 and 2), a GeForce 256, and this GTX 1070. I've also had a couple of their professional cards that I used for work and not gaming. When I consolidated my work and gaming machines into one, I stopped buying professional cards.

My personal, completely unfounded, theory is that modern NVidia cards for some reason require extensive per-title optimizations not just to work efficiently, but to render correctly. AMD's do as well, but they appear far less reliant on them, and as a consequence are at a disadvantage in DX11 rendering speed because of it.

In AAA releases, NV does a great job at that and thus provides a great experience. And for older AAA games that won't generally be an issue either, as most games are rarely updated after release. Driver optimizations for other games can sometimes break this, but you always have the option to install an older driver which renders things correctly.

GW2 and Warframe, however, are constantly updated. GW2 got an engine overhaul with the Heart of Thorns expansion, and Warframe is constantly receiving graphical updates with the occasional engine update. So now there's not only the potential for driver optimizations for other games to break things, but updates to the game itself and how it renders might break optimizations that had previously worked for the game. Which means that AMD hardware, which receives (or at least did) far fewer per-game optimizations than NV hardware, doesn't break as often in older games when they are updated, since there are fewer potential optimizations to break. It still happens in some older games, but less frequently, due to there being fewer optimizations in the first place.

In other words, optimizations that NVidia had previously done for these games (at launch, for example, GW2 ran great and without rendering errors on NV hardware) might become a detriment over time rather than a benefit. Not to mention that new hardware (like this 1070) is unlikely to get optimizations or fixes for older games, but might still inherit whatever optimizations existed in the first place. And that would include games that are still being played and updated but aren't talked about on review sites. So GW2 doesn't get new optimizations or fixes, but World of Warcraft does.
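
To make that theory concrete, here's a purely hypothetical sketch of the failure mode I'm imagining. The names, the hash-based matching and the profile table are all my own invention; nothing here reflects either vendor's actual (proprietary) driver internals:

Code:
# Purely hypothetical model of per-title driver optimizations going stale.
# The profile maps a game executable to shader replacements that were
# validated against the game as it existed when the profile was written.
driver_profiles = {
    "gw2.exe": {"lighting_v1_hash": "optimized_lighting_v1"},
}

def pick_shader(game: str, shader_hash: str, original: str) -> str:
    """Use a profiled replacement if one matches, else the game's own shader."""
    return driver_profiles.get(game, {}).get(shader_hash, original)

# At launch the game's shader matches the profile, so the fast,
# validated replacement is used:
assert pick_shader("gw2.exe", "lighting_v1_hash", "lighting_v1") == "optimized_lighting_v1"

# After a game update the shader changes. With strict matching the driver
# falls back to the slower but correct original:
assert pick_shader("gw2.exe", "lighting_v2_hash", "lighting_v2") == "lighting_v2"

# But if matching were looser than an exact hash (e.g. keyed only on the
# executable), a stale replacement could be applied to the updated shader,
# producing exactly the kind of rendering errors described above.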

Regards,
SB
 
Hmmm, well, I have a pair of 1070s myself and I play a lot of old and indie games, and I haven't had any problems. I don't play Warframe or GW2, though. The only new AAA-like games I've played on them are Ashes of the Singularity: Escalation, Dishonored 2 and Resident Evil 7.

Older games like Resident Evil 4 HD, Doom 3 RoE, The Evil Within, Portal 2, HL2, Crysis 1, modded STALKER SoC, Defense Grid 1 and 2, Alien Isolation, Aliens: Colonial Marines with mods, Supreme Commander: Forged Alliance modded, Alpha Prime, Anomaly 2, Nexus: The Jupiter Incident, Homeworld Remastered, and some others have all been fine. The indie games Event[0] and Near Death are fine too.
 