IHV-specific Enhancements in Games

Discussion in 'Politics & Ethics of Technology' started by OlegSH, Feb 26, 2013.

  1. milk

    milk Like Verified
    Veteran Regular

    Joined:
    Jun 6, 2012
    Messages:
    3,632
    Likes Received:
    3,663
My guess is that TressFX is using the same bounding box the original game's ponytail did, but the ponytail's collision mesh consisted of a volume-less string, so Lara's larger bounding box compensated for its lack of thickness. That could have been an effective and very optimised way to get her simplistic, chunky hair mesh working fine, but it broke once that mesh was replaced by a more refined one.
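A minimal sketch of the compensation described above, assuming a simple point-vs-sphere test; all names, radii, and coordinates are illustrative, not from the actual game:

```python
# Hypothetical illustration: a hair-strand point colliding against the
# character's bounding sphere. If the strand has no thickness of its own,
# inflating the body radius compensates; give the strand real thickness
# without shrinking the body radius and it collides too early.

def strand_collides(point, center, body_radius, strand_radius):
    """Point-vs-sphere test with the strand's own thickness added in."""
    offsets = [p - c for p, c in zip(point, center)]
    dist_sq = sum(d * d for d in offsets)
    effective = body_radius + strand_radius
    return dist_sq < effective * effective

# Old setup: volume-less strand (radius 0) against an inflated body sphere.
old = strand_collides((1.1, 0.0, 0.0), (0.0, 0.0, 0.0),
                      body_radius=1.2, strand_radius=0.0)   # collides

# New setup: the same inflated sphere, but the strand now has thickness,
# so a point that previously cleared the body (1.3 > 1.2) now collides.
new = strand_collides((1.3, 0.0, 0.0), (0.0, 0.0, 0.0),
                      body_radius=1.2, strand_radius=0.2)   # collides
```

The point of the sketch is only that the two tuning knobs (body radius, strand radius) were balanced against each other, so swapping in a new hair mesh without retuning the body volume breaks the balance.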
     
  2. AlphaWolf

    AlphaWolf Specious Misanthrope
    Legend

    Joined:
    May 28, 2003
    Messages:
    9,261
    Likes Received:
    1,428
    Location:
    Treading Water
Have you been living in a cave? Tomb Raider isn't setting a trend, it's following one. You can be upset about it if you like, but I have to question why you haven't been railing against this for years.
     
  3. PeterAce

    Regular

    Joined:
    Sep 15, 2003
    Messages:
    489
    Likes Received:
    6
    Location:
    UK, Bedfordshire
    #123 PeterAce, Mar 7, 2013
    Last edited by a moderator: Mar 7, 2013
  4. Richard

    Richard Mord's imaginary friend
    Veteran

    Joined:
    Jan 22, 2004
    Messages:
    3,508
    Likes Received:
    40
    Location:
    PT, EU
    Exactly the same issue with Rage, oh and DOOM 3 for that matter (remember ATI cards being returned because the stencil shadows were stressing them?). In Rage's case, there were 2 issues, the most public of which was ATI screwing up their build process in the rush to get the new driver out once the game was already in the open. The core problem, admitted by John Carmack at QuakeCon 2012 btw, was id Software assuming that every ATI user would have the most recent driver installed by the time the game shipped.

    Likewise with Crystal Dynamics. If indeed older drivers don't help in the case of Tomb Raider then AGAIN a developer assumes its audience will have the most recent drivers that are promised to work when the game is released. Like in the Rage situation this is primarily the developer's fault and a lack of interest in the PC platform. Or at least, not giving the PC its due attention.

If a developer knows the currently mainstream driver has problems, it DOES NOT MATTER that the IHV promises it has beta drivers that work wonderfully, because it still has to release them to users, and users being users will still take a couple of months to install them.

    Crystal Dynamics tested the game on PC? Let's assume so. Did they test with both ATI and nVidia cards? Let's assume they did. Did they test with the most likely driver versions their customers would have at the time of release?

    If you answer yes then they knew the game would be horrible on nVIDIA and it's their fault.
    If you answer no then they have incompetent people running QA and it's their fault.

    Either way you slice it it's their fault first and foremost. No matter what happens next, they signed off on something they either didn't test correctly or knew it would cause problems.

When these problems started waaay back when, I said: this IS what is killing the PC industry, because gamers have to buy BOTH an ATI and an Nvidia card to be sure ALL their games work correctly. As with Rage and ATI, Nvidia has to share some of the blame here too. The more they want to fragment the PC platform, the more frequent situations like this will be.
     
  5. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    16,480
    Likes Received:
    3,794
Shouldn't the game be issuing the same Direct3D and DirectCompute commands to the drivers (isn't that the point of DirectX)?
Has anyone tried one of those tools to change the vendor ID of the card (3DAnalyze, maybe)?
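A toy sketch of why vendor-ID spoofing can change behaviour even though the D3D command stream is identical: games and drivers sometimes branch on the PCI vendor ID. The render-path names are made up; only the vendor ID constants are real:

```python
# Illustrative only -- not real game or driver code. PCI vendor IDs are
# the genuine registered values; the "paths" are hypothetical labels.

VENDOR_AMD = 0x1002
VENDOR_NVIDIA = 0x10DE

def pick_render_path(vendor_id):
    """Return a hypothetical per-vendor tuning profile for an effect."""
    if vendor_id == VENDOR_AMD:
        # Pretend this path was tuned with the IHV during development.
        return "effect_tuned_path"
    # Everyone else gets an untuned generic fallback.
    return "effect_generic_path"

# Spoofing the reported vendor ID flips the game onto the other branch,
# which is why tools like 3DAnalyze can change performance or behaviour.
spoofed = pick_render_path(VENDOR_AMD)       # NV card reporting AMD's ID
genuine = pick_render_path(VENDOR_NVIDIA)
```

So yes, the API calls can be identical; the divergence happens in whichever code (game or driver) keys decisions off the reported device.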
     
  6. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    16,480
    Likes Received:
    3,794
So Rage was id's fault?
     
  7. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    17,555
    Likes Received:
    7,453
Yes. But the theory is that Nvidia put in some optimizations to speed up the rendering based on how they thought the game would be released. And/or it's possible that the D3D instructions or whatever expose a bug in the GPU which requires a driver workaround (on AMD hardware this is why you never wanted to turn off Catalyst AI back when you could).

    Hence reports that Tomb Raider works fine on NVidia hardware using a certain driver set or a certain game profile.

    Regards,
    SB
     
  8. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,266
    Likes Received:
    1,522
    Location:
    London
    Dwarfed by the quantity of console gamers, I expect.
     
  9. Richard

    Richard Mord's imaginary friend
    Veteran

    Joined:
    Jan 22, 2004
    Messages:
    3,508
    Likes Received:
    40
    Location:
    PT, EU
    Ultimately, yes. In that huge Rage thread I even pointed out how surreal it was that nVidia (!) had to publish a blog on how to edit the config files to get the most out of the renderer.

ATI made a huge (criminal, as I called it then) mistake of publishing new drivers with an ancient ICD, which compounded the issues, but the mistake started when id assumed the beta drivers that worked would be mass-installed by 50% of their customers within a few days. Even if ATI hadn't screwed up the build, most people would still have had a terrible first-day experience. Heck, I only played the game roughly six months later and I still had a terrible first experience.

Whether Crystal Dynamics dropped the ball (as John Carmack admitted id did at QuakeCon 2012), whether they did it on purpose, or whether they didn't care is immaterial, and not great solace for (this time) Nvidia users.

WRT your comment about renderers being coded to standards: that's the theory. In practice they rarely are (non-standard extensions, FOURCC format extensions, etc.), game code has bugs, drivers have bugs, games aren't tested properly, etc. Fail to plan -> plan to fail.

Trying to steer this back on topic: if this is another example of the IHVs trying to leave egg on each other's face, we're the ones who lose.
     
  10. Putas

    Regular Newcomer

    Joined:
    Nov 7, 2004
    Messages:
    488
    Likes Received:
    157
As Carmack said a long time ago, the rule is that drivers are always broken. Delaying a game in the hope that IHVs do their job, and that users update, may be futile. ATI had an almost decade-long history of releasing Catalysts with wrong/broken OpenGL for no reason; they just love to shoot themselves in the foot.
     
  11. Richard

    Richard Mord's imaginary friend
    Veteran

    Joined:
    Jan 22, 2004
    Messages:
    3,508
    Likes Received:
    40
    Location:
    PT, EU
Not delaying the game. Rage could have done what BF3 did when it detected a known incompatible driver: the game didn't launch and gave you a URL where the user could download updated drivers. It's _exactly_ because they can't trust IHVs that id should have looked out for this. ATI might have (again) shot themselves in the foot, but it was id who paid the consequences. Unfortunately, the PC version of Rage was not exactly their focus, now was it?
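The BF3-style gate described above can be sketched in a few lines: compare the installed driver version against a known-bad set and a minimum known-good version, and refuse to launch with a pointer to updated drivers. The version numbers and URL here are invented for illustration:

```python
# Hypothetical launch gate, loosely modelled on the BF3 behaviour the post
# describes. Versions are (major, minor, patch) tuples, which Python
# compares lexicographically.

KNOWN_BAD = {(8, 861, 0)}      # made-up broken driver build
MINIMUM = (8, 900, 0)          # made-up first known-good build

def check_driver(version, url="http://example.com/drivers"):
    """Return (ok, message); refuse to launch on a bad or too-old driver."""
    if version in KNOWN_BAD or version < MINIMUM:
        pretty = ".".join(map(str, version))
        return (False,
                "Known-incompatible driver %s detected. "
                "Please update at %s" % (pretty, url))
    return (True, "ok")

ok, msg = check_driver((8, 861, 0))   # gate trips, game refuses to launch
```

The whole point is that the check runs in the game's own launcher, so it works even when the developer can't trust the IHV to have shipped the fixed driver to everyone in time.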
     
  12. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    16,480
    Likes Received:
    3,794
ATI OpenGL drivers are still borked to this day; GLQuake still doesn't work. In fact, none of the updated OpenGL-based Quake engines work.
I had to find a DX port.
     
  13. Putas

    Regular Newcomer

    Joined:
    Nov 7, 2004
    Messages:
    488
    Likes Received:
    157
They had word that a proper driver would be released in time. A link to the bad Catalyst release would not have helped. I am glad you acknowledged id as a victim.

I played GLQuake recently on a 6950. Still dreaming of a driver that would let me enable vsync in Quake Live, though.
     
  14. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    16,480
    Likes Received:
    3,794
It had strange artifacts and flickering for me.
Just tried GLQuake again with the 13.2 beta and it's fine again. Well done, AMD.
     
  15. DSC

    DSC
    Banned

    Joined:
    Jul 12, 2003
    Messages:
    689
    Likes Received:
    3
    http://techreport.com/review/24562/nvidia-geforce-gtx-650-ti-boost-graphics-card-reviewed/3

    CLAP CLAP

    Good work Crystal Dynamics, AMD and AMD employees, you should be very proud of your dirty hands.
     
  16. willardjuice

    willardjuice super willyjuice
    Moderator Veteran Alpha

    Joined:
    May 14, 2005
    Messages:
    1,385
    Likes Received:
    299
    Location:
    NY
    lol well if Cyril says it then it MUST be true...
     
  17. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    11,147
    Likes Received:
    1,647
    Location:
    New York
    Never ascribe to malice that which is adequately explained by incompetence.
     
  18. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    16,480
    Likes Received:
    3,794
And there's no evidence of AMD actively trying to prevent a feature from working on a competitor's card, unlike Nvidia.
AMD are saints compared to NV.
     
  19. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    17,555
    Likes Received:
    7,453
Isn't this what everyone has been saying AMD should be doing for the past few years? If they want good support in games, they need to work with developers to take advantage of their cards' strengths.

    It is exactly what Nvidia have been doing for over a decade now with TWIMTBP.

    That much isn't malicious. Just like Nvidia weren't going to help their partners code to take advantage of their competitor's strengths, AMD isn't going to go out of its way to help their partners to take advantage of their competitor's strengths.

At least AMD haven't stooped to blocking features on their competitor's cards via device IDs despite those features working just fine on those cards. Or having their development partners remove features their competitor helped implement because their own cards didn't support them (DX10.1 acceleration of certain things). But that said, at least Nvidia doesn't appear to be doing those shenanigans anymore, thank goodness.

And even then, I warned that a day would come when, once AMD got into the same business of developer support as Nvidia, we'd start seeing crap like this on Nvidia cards just like what happened on AMD cards in all the years prior. But most people just said that AMD should do what Nvidia does if it wants good support by developers in their games. And here we are.

But as long as AMD doesn't stoop as low as Nvidia did for a couple of years, then at least games will still be able to run with all features enabled. Well, except games that might use CUDA or hardware PhysX. Since AMD will use DirectCompute or OpenCL, at least everything AMD does will work on Nvidia hardware. And developers by now probably realize that using CUDA instead of DirectCompute or OpenCL would be doing their paying customers a disservice, since not everyone runs Nvidia hardware capable of CUDA.

    Regards,
    SB
     
  20. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    11,147
    Likes Received:
    1,647
    Location:
    New York
    Even without IHV involvement games usually prefer one architecture over another anyway. Long term there's really no downside to close collaboration between IHVs and developers.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.