AMD and NVidia’s Developer Partnerships

First off, I’m not against NVidia or how it does business. I’m more concerned about NVidia’s and AMD’s partnership policies that are segmenting gaming. Why should a PC developer cater to one vendor or the other when their audience is strictly PC gamers? What specifically does the developer gain in return for offering certain IP features on one card over the other? I would think PC developers would want a simpler, more unified experience across the vast range of configurations that PC gamers already have to deal with.

Mind you, I’m not against AMD or NVidia having specific hardware features or a decent working relationship with particular developers. What I am against are some of the policies in these partnerships that do not allow the other vendor proper access to the engine, time to optimize drivers, and, more importantly, the chance to figure out the best way to render a particular proprietary feature differently, so as not to hurt gaming performance on their hardware.

Which leads me into this…

Should PC developers (especially multiplatform developers) use AMD GCN-based PCs if their lead platform is the PC? Meaning: the XB1 and PS4 SDKs wouldn’t be getting highly optimized NVidia code that could potentially have ill effects on these AMD GCN-based systems.

P.S. Please, let’s not turn this into an AMD vs. NVidia or Sony vs. Microsoft debate. What I’m truly looking for are thoughtful answers toward a solution of “openness” in developer partnerships… and some observations/opinions on how PC developers should approach this generation of consoles.
 
I don't see how this is particularly relevant to console gaming in general so that may be why there's a dearth of replies. I highly doubt that devs wouldn't target the specific GCN hardware features in the consoles. They do have dev kits that have the actual hardware of the consoles within them so it would make sense to take advantage of whatever can be gotten, especially given the size of the console market. If there are devs out there who are using Nvidia hardware and not taking advantage of the hardware in consoles then that is their disadvantage.

The effects this has on PC games development assuredly belongs in a different forum.
 
Actually, this is quite relevant to both sides. We have seen for years how PC ports can adversely affect console editions (poor performance), and vice versa, where console ports can look worse (IQ-wise) on PC. However, this generation is slightly different, since the XB1 and PS4 are much closer to PC architecture - albeit that of a midrange PC.

That being said, you would expect performance and IQ to scale better across this generation than in previous ones. However, IMHO, it hasn’t. Look at games like Watch Dogs, Alien Isolation, The Evil Within and a few other titles: you would think these would scale quite well across this generation, yet they did not, especially compared to similarly spec’d midrange PCs that have shown better performance than the XB1 and PS4.

I can also make the case that titles developed in partnership with NVidia (on NVidia PCs) have shown compromised performance (lower frame rates) on AMD-based PCs… and perhaps that is a reasonable hypothesis for explaining the XB1/PS4’s poor performance and lower IQ settings. Mind you, I’m talking strictly about cases where the lead platform is an NVidia-based PC, and where porting highly optimized NVidia game code over to the PS4 and XB1 SDKs could be hampering overall performance.

Which leads me back to my original question: “Should PC developers (especially multiplatform developers) use AMD GCN-based PCs if their lead platform is the PC? Meaning: the XB1 and PS4 SDKs wouldn’t be getting highly optimized NVidia code that could potentially have ill effects on these AMD GCN-based systems.”

As for the PC part… it was to see if there are any good solutions when AMD and Nvidia have partnerships in place with certain developers. In other words, allowing the other vendor (regardless of partnerships) proper game engine access to optimize their hardware-specific drivers/subsets/APIs… and, more importantly, to figure out the best way to render around a competitor’s proprietary feature, which more than likely hurts their particular brand.
 
http://www.ign.com/articles/2014/11/12/ubisoft-comments-on-assassins-creed-unity-pc-troubles

"We are aware that the graphics performance of Assassin’s Creed Unity on PC may be adversely affected by certain AMD CPU and GPU configurations," Ubisoft said on their forum. "This should not affect the vast majority of PC players, but rest assured that AMD and Ubisoft are continuing to work together closely to resolve the issue, and will provide more information as soon as it is available."

As I stated before, these partnerships aren't doing anyone any favors. AMD GPUs/CPUs are getting hosed without proper optimization. I'm also willing to believe this is affecting how the PS4/XB1 CPUs and GPUs are being handled in their SDKs. I'm almost dead sure UBISOFT is shoehorning heavily optimized code tailored for Nvidia GameWorks across PS4/XB1. Watch Dogs, anyone?

Edit: Anyone know if Nvidia PhysX is being used in ACU? If so, I'm willing to bet the Nvidia GPU is handling most, if not all, of the NPC crowd data rather than the CPU, giving the CPU more breathing room for other chores.
 
Nvidia-optimized games run terribly on all platforms. AMD-optimized games run pretty well on all, except that time with TressFX, which Nvidia had to patch. It's going to be this way until the end of time. PC gamers don't really look at things like this in detail. I remember the DX11 version of Crysis 2 and how Nvidia obviously just told Crytek to put in as much tessellation as possible while hampering performance on all PCs. I don't know how they don't get sued for stuff like this; the public could file class-action lawsuits over it.
 

If games on PC outsold those on console, that might be a risk worth taking; but given that console game sales generally eclipse those on PC and represent the bigger slice of the profit pie, I can't fathom how Ubisoft would knowingly take such a risk.

I was 50/50 about Unity on both gameplay (old control issues still there) and technical issues, but by far it's the technical issues that put me off buying it. There is simply no way I'll buy an action title that drops to 20fps. I had my fill of poor frame rates last generation, thanks very much, and there are more than enough games to go around where I have a choice :)
 

I would think so as well... however, UBISOFT has a strong partnership with Nvidia, something beyond the typical console promotional partnership.

http://www.gamewatcher.com/news/201...nnounce-continuation-of-pc-gaming-partnership
http://www.brandingmagazine.com/2014/11/06/ubisoft-strengthens-partnership-with-nvidia/
http://www.forbes.com/sites/jasonev...-pc-releases-including-assassins-creed-unity/
http://www.lazygamer.net/general-news/ubisofts-nvidia-partnership-to-continue/
http://www.geforce.com/whats-new/ar...c-gaming-alliance-for-this-falls-hottest-game
http://www.digitaltrends.com/computing/nvidia-and-ubisoft-partner-for-geforce-gpu-game-bundle/

Just a small Google sample of the Nvidia and UBISOFT partnership.
 
Partnerships are one thing, knowingly and willingly compromising your base technology on your biggest platform in a highly competitive market is another thing entirely.
 
I think UBISOFT somewhat doesn't give a damn... the majority of gamers are unaware of IQ and performance issues until they read about them or see them for themselves. UBI and EA can package turds (which, IMHO, they often do) and still get millions of gamers purchasing these awful games, even after getting burned by the last one. For the life of me, I don't understand why!?!
 
I think most gamers can look at the blurry shots of Unity, compare them to the GTA V remaster, and see that one is visibly better - subjective, I know. Unity is competing with GTA, its Ubisoft stablemate Far Cry 4, and Dragon Age.

It's next-gen; presentation matters to some. Certainly there's nothing next-gen about Unity's gameplay.
 
Should PC developers (especially multiplatform developers) use AMD GCN based PCs - if their lead platform is PC?

No. Because A) that'd be an advantage for AMD, B) Nvidia has more market share, so that would disadvantage the larger group of gamers, and C) PC developers should use whatever they happen to be most comfortable with.

The topic starts on a flawed premise and attracts dubious claims about what GPU makers are trying to do and how. I don't care for where this is going.
 

Yes, and people forget that Ubisoft's decision to partner with Nvidia was really meant to save development costs; it was not a creative decision made in order to ship games with better gameplay or better graphics.
 
http://www.eurogamer.net/articles/d...dia-hairworks-really-sabotage-amd-performance

"We've been working with CD Projeckt Red from the beginning. We've been giving them detailed feedback all the way through," AMD's chief gaming scientist Richard Huddy told Ars Technica. "Around two months before release, or thereabouts, the GameWorks code arrived with HairWorks, and it completely sabotaged our performance as far as we're concerned. We were running well before that... it's wrecked our performance, almost as if it was put in to achieve that goal."

*tsk-tsk*

"GameWorks improves the visual quality of games running on GeForce for our customers," GameWorks PR Brian Burke told PC Perspective. "It does not impair performance on competing hardware. GameWorks source code is provided to developers that request it under license, but they can't redistribute our source code to anyone who does not have a license. Most of the time we optimise games based on binary builds, not source code... I believe it is a resource issue. Nvidia spent a lot of artist and engineering resources to help make Witcher 3 better. I would assume that AMD could have done the same thing because our agreements with developers don't prevent them from working with other IHVs."



[UPDATE 22/5/15 6:44pm: Nvidia has been in touch to clarify the situation with regards GameWorks running on non-GeForce platforms: "It's not CD Projekt Red's decision to allow the Nvidia tech to work on AMD GPUs - that is Nvidia's decision and most commonly-used features from us are platform-agnostic. It's the same for CPU-based PhysX and Clothworks as well."]

Yet, if CD Projekt Red had a workaround, or had tailored Nvidia's GameWorks frameworks/libraries to better fit AMD's GPUs, it would have been a no-no! It's a sad state of affairs in PC gaming when contractual paid BS stops developers from providing a better solution (or any solution) for other hardware vendors.
 
Yet, if CD Projekt Red had a workaround, or had tailored Nvidia's GameWorks frameworks/libraries to better fit AMD's GPUs, it would have been a no-no! It's a sad state of affairs in PC gaming when contractual paid BS stops developers from providing a better solution (or any solution) for other hardware vendors.

Some good news then: It didn't stop them. The game also has a generic non-GameWorks hair option.

While there weren't any contractual restrictions preventing other solutions from being put in place, sadly CDPR felt there wasn't enough time left to implement additional alternatives (like, say, TressFX).
Hairworks has low development overhead as Nvidia tends to provide resources to get the work done. The results are worth it.
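For what it's worth, the dual-path setup being described here boils down to a simple settings decision: pick a hair technique based on the detected GPU vendor unless the player overrides it. The sketch below is purely illustrative - the function name, vendor strings, and option labels are all made up, not anything from the actual Witcher 3 code:

```python
def choose_hair_tech(vendor, user_override=None):
    """Pick a hair-rendering path from the GPU vendor, honoring a user override.

    Hypothetical sketch: a real engine would expose this as a graphics-settings
    toggle rather than purely an automatic vendor check.
    """
    options = ("hairworks", "generic", "off")
    if user_override in options:
        return user_override
    # HairWorks leans on heavy tessellation, which at the time ran much
    # better on Nvidia hardware; default other vendors to the generic path.
    return "hairworks" if vendor.strip().lower() == "nvidia" else "generic"
```

The point the thread lands on is that shipping the generic path at all is what keeps non-GeForce players whole - and the override matters too, since nothing stopped AMD owners from enabling HairWorks anyway at a frame-rate cost.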
 
Or sadly AMD didn't contact them much sooner to implement TressFX in addition to HairWorks.
 