HairWorks: apprehensions about closed-source libraries proven beyond reasonable doubt?

Discussion in 'Graphics and Semiconductor Industry' started by MfA, May 20, 2015.

Thread Status:
Not open for further replies.
  1. lanek

    Veteran

    Joined:
    Mar 7, 2012
    Messages:
    2,469
    Likes Received:
    315
    Location:
    Switzerland
    That's effectively what I heard from Nvidia: it seems they don't have to ask for the code. The second possibility is that they just don't care (for various reasons: money, time, resources, etc.).
     
    #61 lanek, May 23, 2015
    Last edited: May 23, 2015
  2. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,379
    Looks like the studio already released a patch with improved HairWorks performance (http://www.pcgamer.com/witcher-3-patch-adds-new-graphics-options-and-improvements/). Give it a few more days and AMD will release a driver (beta, of course) and it will be on to the next temper tantrum. (Though maybe it's in AMD's best interest not to fix anything?)
     
    RecessionCone likes this.
  3. pMax

    Regular Newcomer

    Joined:
    May 14, 2013
    Messages:
    327
    Likes Received:
    22
    Location:
    out of the games
    Explain to me how AMD would optimize for a closed-source DLL that does everything it can to be performant on one specific platform and not on another. I'm curious.
    You made an interesting assertion there.
     
  4. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,379
    The CPU is identical irrespective of the GPU, so whatever code executes on the CPU is irrelevant.

    What matters is the stuff that gets passed on from the DLL via the generic Windows driver to the GPU-specific driver: a shitload of API state-changing calls (texture setup commands etc.), and shader programs.

    That's the stuff the GPU-specific driver can intercept and make specific optimizations on. It could organize memory allocation for optimal performance, it can detect specific shader code sequences that can be mapped to certain instructions, it could do shader detection and replace a shader with a hand-written one, etc.

    Having the source code can help to understand what the code is doing, but it's not going to help one bit to make it faster: it's not as if they have the build environment of each game developer and can replace one DLL with another at will. That's something only the game developer can do.
    So the actual optimization work needs to be done at a lower level no matter what. I believe that's what Nvidia means when they say that they don't need the source code to optimize their driver for a particular game. And I believe that the same is true for AMD as well. Anything else would be unworkable.
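A minimal sketch of the shader-detection-and-replacement idea described above, in C++. Everything here is hypothetical for illustration: real drivers key replacements on compiled shader bytecode rather than source text, but the lookup structure is the same idea.

```cpp
#include <functional>
#include <string>
#include <unordered_map>

// Hypothetical driver-side table: hash of an incoming shader -> hand-tuned
// replacement. Real drivers match on compiled bytecode, not source text.
using ShaderTable = std::unordered_map<std::size_t, std::string>;

// On shader creation, check whether this exact shader is known and swap in
// the optimized version; otherwise pass the original through untouched.
std::string select_shader(const std::string& incoming, const ShaderTable& table) {
    const std::size_t key = std::hash<std::string>{}(incoming);
    auto it = table.find(key);
    return it != table.end() ? it->second : incoming;
}
```

This is why the source code is largely beside the point: the driver only ever sees what crosses the API boundary, and that is where the substitution happens.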
     
    pharma likes this.
  5. OpenGL guy

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,357
    Likes Received:
    28
    So suppose AMD does these optimizations. What then? Do you think they have absolutely no CPU overhead? Many people complain about the apparent increased CPU overhead of AMD's drivers relative to Nvidia, yet they never stop to consider why that might be true.

    Regarding the closed source library vs. open source: What guarantee do you have that the closed source library does the same work on all platforms? With open source you can easily verify that.
     
    Lightman and BRiT like this.
  6. lanek

    Veteran

    Joined:
    Mar 7, 2012
    Messages:
    2,469
    Likes Received:
    315
    Location:
    Switzerland
    So let me explain it more clearly. After the theory that Nvidia is sabotaging performance on AMD GPUs with GameWorks, your theory is that it is AMD who sabotages its own GPU performance in Assassin's Creed, FC4, and the Batman series to make people believe Nvidia is doing it, with the goal of discrediting Nvidia... OK, why not; I'm open to every theory.

    The same theory Forbes ended up pushing, and one that Nvidia will certainly like.

    I think we should call in Mulder and Scully to get the final answer.

    In addition, the implementation can differ a lot from the initial source code, game by game, with specific optimizations.

    As for the overhead, it's clear that if you need to do all the work at the driver level, the CPU overhead it causes will explode, which is why using CoD or FC4 as examples isn't really representative at all. (I don't deny that the AMD driver seems to have larger overhead than the Nvidia one, but consider what that driver has to do compared to the Nvidia one on those titles; there, it is completely normal that this happens.)
     
    #66 lanek, May 23, 2015
    Last edited: May 23, 2015
  7. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    2,930
    Likes Received:
    1,626
    Maybe Mulder and Scully are on to something, since Ars Technica echoes Forbes' thinking:

    http://arstechnica.co.uk/gaming/201...completely-sabotaged-witcher-3-performance/2/
     
  8. lanek

    Veteran

    Joined:
    Mar 7, 2012
    Messages:
    2,469
    Likes Received:
    315
    Location:
    Switzerland

    And then again, the GTA5 point in the preamble... do they even know which tech was "mixed" in GTA5? I didn't know that GTA5 uses PhysX + Bullet or HairWorks + TressFX; instead it uses independent plug-ins that were added to the GameWorks library: HBAO+, HDAO and FXAA...

    Especially when HairWorks was demonstrated in a pre-release demo of The Witcher 3.

    Do they know what kind of contracts Rockstar makes? I can tell you Rockstar's policy toward hardware brands is far different from that of some other studios. I wonder whether those two articles were written by the same person.

    I also like how they go after AMD when, in the end, it is gamers, the consumers on forums, who are complaining, not AMD people (apart from the two responses from Huddy, which are quoted in a loop). Have you seen AMD running a public campaign against it? Huddy merely answered two questions in an interview.

    Things are going out of proportion in every aspect.

    Instead of speaking in half-words, let's just say it: it is AMD who sabotages the performance of their own GPUs in GameWorks titles, in order to discredit Nvidia...
     
    #68 lanek, May 23, 2015
    Last edited: May 23, 2015
    Lightman likes this.
  9. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    2,930
    Likes Received:
    1,626
    Unfortunately, in Witcher 3 it's not just HairWorks as an individual variable but also CD Projekt Red's implementation of their own in-house anti-aliasing techniques, which are probably not open-sourced. It's more difficult to identify potential outcomes when "new" techniques are expected to have an impact on an established GameWorks library.

    GTA5's development may have been different in that no extra in-house techniques interacted with the HairWorks or TressFX implementations, so the risk was much lower.
     
    #69 pharma, May 23, 2015
    Last edited: May 23, 2015
  10. bgroovy

    Regular Newcomer

    Joined:
    Oct 15, 2014
    Messages:
    629
    Likes Received:
    493
    Yeah, if only AMD would stop wasting their time revolutionizing graphics APIs on PC with Mantle and its progenitors, and instead focus on real innovations like black-box graphical prettification plug-ins developed specifically to disadvantage Nvidia hardware.
     
  11. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    16,156
    Likes Received:
    5,091
    Gotcha, so you'd be absolutely happy in a world where, if both IHVs had equal GPU market share, you couldn't play half the games released because they ran horribly on your video card.

    Even better, what if there were 3 IHVs? Then you'd be limited to playing only 1/3 of available games, because the other 2/3 would perform horribly due to closed-source sabotaging of performance. It's great that you applaud an IHV for abusing its monopoly.

    But at least we're seeing some backlash from software developers, at least those that aren't tied monetarily to Nvidia. And we should start seeing the death of GameWorks. Good riddance.

    And this is coming from an Nvidia user who isn't happy with the situation.

    Regards,
    SB
     
  12. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,496
    Likes Received:
    910
    Behaving ethically and pointing out that the competition doesn't is throwing a temper tantrum?
     
  13. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,379
    I'm sure they have some cost, but most of these should be one-time costs.

    Some months ago, somebody posted on a forum about his experiences as an intern or employee at Nvidia: they seem to spend tons of effort optimizing their drivers for specific games, so you'd expect Nvidia drivers to have tons of overhead as well. The opposite is true. But Nvidia seems to be better at using at least 2 CPU cores in their driver. Maybe AMD should have spent more time on that aspect?
    And it's not as if there are that many GameWorks titles with GPU acceleration out there anyway.

    It should be fairly trivial for a company like AMD to put a snooping layer between a game and the Windows API and dump all the transactions. And if they then discovered foul play, we'd hear about it loud and clear. Since we haven't heard any whining of that sort, I'm pretty confident that it's not happening. :wink:
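The snooping layer described above could look something like this in C++ terms: a thin wrapper that sits in front of the real API object, logs every transaction, and forwards it on. The interface and method names here are purely hypothetical, not any actual Windows or driver API.

```cpp
#include <string>
#include <vector>

// Hypothetical minimal "API" that the game talks to.
struct GraphicsApi {
    virtual void setTexture(int) {}
    virtual void draw(int) {}
    virtual ~GraphicsApi() = default;
};

// Snooping layer: records each call before forwarding it to the real API.
struct SnoopingApi : GraphicsApi {
    GraphicsApi* real;
    std::vector<std::string> log;

    explicit SnoopingApi(GraphicsApi* r) : real(r) {}

    void setTexture(int slot) override {
        log.push_back("setTexture " + std::to_string(slot));
        real->setTexture(slot);
    }
    void draw(int vertexCount) override {
        log.push_back("draw " + std::to_string(vertexCount));
        real->draw(vertexCount);
    }
};
```

The game is handed the snooping object instead of the real one; since the wrapper forwards everything, rendering is unaffected while the log captures the full command stream for offline analysis.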
     
  14. lanek

    Veteran

    Joined:
    Mar 7, 2012
    Messages:
    2,469
    Likes Received:
    315
    Location:
    Switzerland

    He was not speaking about Nvidia; he was speaking about how complex drivers are today, and that includes AMD's drivers too. We can assume that a driver optimized for The Witcher, Assassin's Creed, or Batman is far more complex on the AMD side than on Nvidia's.

    That is exactly the situation: an AMD driver for a GameWorks title needs ten times more engineering work and time than one for any other game, with, let's be honest, poor results regardless.

    And this increases the overhead considerably: in the driver, they need to solve problems that are solved on the developer side for Nvidia.

    It's crazy. Can you imagine that Nvidia has put tessellation, ported from CG by ATI and pushed into DX, inside the black box? Tessellation is now part of the GameWorks black box, the "made-by-Nvidia tessellation"...
    What does this mean? It is not an Nvidia-specific feature, but GameWorks uses it inside its box, hiding control from the main API and the graphics engine...

    Officially this new version of tessellation should improve performance by adding LOD to it (LOD is a basic feature of the original tessellation setup in DirectX; it was even part of ATI's 2003 tessellation presentation), but not one game where Nvidia has pushed tessellation uses it.
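The distance-based LOD idea referred to above is straightforward: scale the tessellation factor down as a patch gets farther from the camera, so distant geometry isn't subdivided as finely. A hedged sketch of the usual computation follows; the constants and function name are illustrative only, not taken from any real engine.

```cpp
#include <algorithm>

// Hypothetical distance-based tessellation LOD: full subdivision up close,
// falling linearly to the minimum factor at the far distance. All constants
// are illustrative defaults, not values from any shipping engine.
float tessellationFactor(float distance,
                         float nearDist = 10.0f, float farDist = 100.0f,
                         float minFactor = 1.0f, float maxFactor = 64.0f) {
    // t runs from 0 at nearDist to 1 at farDist, clamped outside that range.
    float t = std::clamp((distance - nearDist) / (farDist - nearDist), 0.0f, 1.0f);
    return maxFactor + t * (minFactor - maxFactor);
}
```

Without a falloff like this, every patch gets the maximum subdivision regardless of how few pixels it covers, which is exactly the pathological case for hardware that is weaker at raw geometry throughput.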
     
    #74 lanek, May 24, 2015
    Last edited: May 24, 2015
  15. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,379
    And here I thought that the whole point of a GPU was to make games look good!

    It's in Nvidia's and AMD's best interest to promote rendering techniques that use up as many GPU cycles as possible; otherwise we'd all still be playing Pac-Man. How many studios out there have the expertise to create a full physics library, new anti-aliasing techniques, fire, hair, smoke, lighting techniques, etc.? Definitely not all of them. So both AMD and Nvidia develop libraries to make this happen. The only difference is that Nvidia probably spends an order of magnitude more than AMD, and that they don't like to give away their code, which, given the millions they've spent on it, is not entirely unreasonable IMO.

    AMD spent probably a pretty penny on Mantle instead, and kept it closed until it became irrelevant in the PC space.

    Enabling games to use state of the art rendering techniques or finding a way around bottlenecks in your driver: both are worthwhile endeavors. The problem with focusing all your effort on bottleneck reduction is that you're solving a one-time thing without lasting competitive benefit, while improving graphics quality is an open-ended problem.
     
    pharma likes this.
  16. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,379
    What makes you so sure about that? Have you studied the shader assembly that gets transferred from the Windows driver to the GPU specific driver and concluded that it maps better to the Nvidia instruction set than the AMD instruction set?

    AFAIK tessellation is not a black box at all: it's supposedly very well spec'ed by Microsoft and a standard component of the rendering pipeline. I don't think tessellation is the real issue, it's that AMD GPUs are worse at dealing with geometry, and that tessellation has the ability to generate tons of geometry.
     
  17. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,379
    Accusing your competition of sabotage because your own driver/hardware can't keep up is. I have yet to see the first proof that anything unethical is going on. The performance on AMD GPUs is usually bad compared to Nvidia. So was the performance of Dirt, but the other way around. Some architectures are better at one thing than another. It's only logical that one's library should optimize for those features.
     
    Florin likes this.
  18. lanek

    Veteran

    Joined:
    Mar 7, 2012
    Messages:
    2,469
    Likes Received:
    315
    Location:
    Switzerland
    I'm pretty sure that an Nvidia driver for Assassin's Creed is far less complex than what AMD has to deliver for it. As for The Witcher, I can't tell; I haven't studied them.


    As for tessellation, Nvidia has moved it into its GameWorks black box, so draw your own conclusion. Initially it is a pure DirectX feature, but if you implement the Nvidia version, it is no longer a DX component; it is part of the GameWorks library. You understand the difference?
     
    #78 lanek, May 24, 2015
    Last edited: May 24, 2015
  19. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,379
    No, I don't understand the difference at all. GameWorks is middleware as much as any other studio specific library is middleware. To the driver, it should make no difference.
     
  20. lanek

    Veteran

    Joined:
    Mar 7, 2012
    Messages:
    2,469
    Likes Received:
    315
    Location:
    Switzerland
    For the Nvidia driver... not for the AMD one.


    I'm a bit sad about this whole meltdown. The Witcher is a great game, released by extremely talented people; the work they have done is phenomenal, and the game is already a cult classic in every respect.
    All these technical questions seem to take up more space than they should on the PC version... That said, it is the latest game in a GameWorks series that has clearly shown how GameWorks is used on the Nvidia side... Maybe it's AMD's fault, as some suggest (though that counter-theory is a bit too easy and has never worked; ask Richard Nixon about it)...
     
    #80 lanek, May 24, 2015
    Last edited: May 24, 2015
