bit-tech Richard Huddy Interview - Good Read

Discussion in 'Graphics and Semiconductor Industry' started by neliz, Jan 6, 2010.

  1. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
  2. Broken Hope

    Regular

    Joined:
    Jul 13, 2004
    Messages:
    483
    Likes Received:
    1
    Location:
    England
    What I don't understand, after the following quote, is this:

    Why do so many new games come out with glaring problems on ATI cards, problems that don't get fixed until months later? By then most gamers are finished with the game that's having them.

    Games like Borderlands had grey-texture bugs for months after release that were only fixed in Catalyst 9.12; Resident Evil 5 had the screen black out during cutscenes; MW2 had bugs with flashbangs; many users have been unable to force AF in Dragon Age since the 9.11s; NFS Shift took months to be fixed, etc.

    ATI's rigid one-driver-per-month release schedule has hurt them in a pretty big way. Users shouldn't have to deal with such huge bugs in new games, waiting months after release before they can play them properly, bugs which users of Nvidia cards don't have.
     
  3. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    Sorry? Have you ever read a "new NV driver" release thread? The land seems too barren for the grass to be green on either side of the fence.
    Unless, of course, you think this is something that's not described as an issue.
     
    #3 neliz, Jan 6, 2010
    Last edited by a moderator: Jan 6, 2010
  4. Broken Hope

    Regular

    Joined:
    Jul 13, 2004
    Messages:
    483
    Likes Received:
    1
    Location:
    England
    I'm not saying Nvidia drivers are 100% bug-free, but for newly released games I see far more ATI users having issues than Nvidia users. Are you saying it's the opposite?
     
  5. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,429
    Likes Received:
    430
    Location:
    New York
    Lol, I'm impressed. He managed to drop quite a few zingers in there :lol:
     
  6. Florin

    Florin Merrily dodgy
    Veteran

    Joined:
    Aug 27, 2003
    Messages:
    1,644
    Likes Received:
    214
    Location:
    The colonies
    I'd be inclined to believe Huddy thinks the question is about the 5870 when he says this, but I've read the same sentiment on this forum before. Why would a Fermi card need to be faster than a dual-GPU Cypress card?
     
  7. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    No, I'm saying both have their own fair share of issues.
     
  8. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    5970 also doesn't make sense to me; it could be an honest typo (at least they didn't write "Firmy").
     
  9. Lightman

    Veteran Subscriber

    Joined:
    Jun 9, 2008
    Messages:
    1,803
    Likes Received:
    474
    Location:
    Torquay, UK
    50% bigger than dual RV870 would be a roughly 1000 mm² die (two ~334 mm² Cypress dies come to ~668 mm², and 1.5× that is ~1000 mm²), so everything points to a typo :smile:

    Very good interview indeed!

    This bit made my day :lol::
     
  10. Sontin

    Banned

    Joined:
    Dec 9, 2009
    Messages:
    399
    Likes Received:
    0
    Two things for Huddy:
    nVidia does not sell CPUs. Why do you expect them to do your job of helping the ISVs?
    A lot of new games are using or will use CPU PhysX. Somebody should tell those ISVs that they will get very bad multi-core CPU support. And how does it help nVidia sell more GPUs when there is no GPU-PhysX support? :roll:
     
    #10 Sontin, Jan 6, 2010
    Last edited by a moderator: Jan 6, 2010
  11. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    Mr. Sontin, how does nVidia run PhysX on an iPhone, which also doesn't have a GPU?
    Doh!
     
  12. Deathlike2

    Regular

    Joined:
    Aug 17, 2003
    Messages:
    542
    Likes Received:
    5
    I think he was talking about PhysX underutilizing multi-core CPUs. HardOCP ran tests on this, and they showed that the CPU was not being used efficiently at all, so what Huddy is saying is correct. The problem is that a supposedly CPU-driven feature such as physics can be done on the GPU, yet no effort is made to have the CPU assist PhysX.

    Here's the link for that site: http://www.hardocp.com/article/2009/10/19/batman_arkham_asylum_physx_gameplay_review/11
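
    To make concrete what "using the CPU efficiently" would look like, here is a minimal sketch in plain C++ threads. It splits a physics integration phase across every available core; the Body/stepWorld names are hypothetical, and this has nothing to do with PhysX's actual internals, it just illustrates the kind of multi-core scaling being discussed.

        // Hypothetical illustration only -- not PhysX's real API. When per-body
        // work is independent, a physics step can use every hardware thread.
        #include <algorithm>
        #include <cstdio>
        #include <thread>
        #include <vector>

        struct Body { float pos[3]; float vel[3]; };

        // Integrate one contiguous slice of bodies; slices are disjoint, so no locks.
        static void integrateSlice(std::vector<Body>& bodies, size_t begin, size_t end,
                                   float dt) {
            for (size_t i = begin; i < end; ++i)
                for (int axis = 0; axis < 3; ++axis)
                    bodies[i].pos[axis] += bodies[i].vel[axis] * dt;
        }

        // Split the integration phase across all the cores the machine reports.
        void stepWorld(std::vector<Body>& bodies, float dt) {
            unsigned workers = std::max(1u, std::thread::hardware_concurrency());
            size_t chunk = (bodies.size() + workers - 1) / workers;
            std::vector<std::thread> pool;
            for (unsigned w = 0; w < workers; ++w) {
                size_t begin = std::min(bodies.size(), size_t(w) * chunk);
                size_t end = std::min(bodies.size(), begin + chunk);
                if (begin < end)
                    pool.emplace_back(integrateSlice, std::ref(bodies), begin, end, dt);
            }
            for (std::thread& t : pool) t.join();
        }

        int main() {
            std::vector<Body> bodies(100000, Body{{0, 0, 0}, {1, 0, 0}});
            stepWorld(bodies, 1.0f / 60.0f);
            std::printf("x after one step: %f\n", bodies[0].pos[0]);
            return 0;
        }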

     
  13. Sontin

    Banned

    Joined:
    Dec 9, 2009
    Messages:
    399
    Likes Received:
    0
    Will it help them sell more GPUs because PhysX runs on the iPhone? :lol:
    Why should anybody use PhysX over Havok or other free physics engines when the CPU support is in such bad shape?

    No, he is speaking about the whole CPU support of PhysX. And I don't see a problem with nVidia not optimizing the GPU-PhysX code for CPUs. It's not their problem that the CPU vendors don't care about physics in games.
     
  14. Deathlike2

    Regular

    Joined:
    Aug 17, 2003
    Messages:
    542
    Likes Received:
    5
    Since when did CPU vendors control physics? They don't. Developers do, and they can do it on a CPU if they so desire. PhysX is a vehicle to make this "easier", offloading work from the CPU as a method of acceleration. It's not supposed to be "the only way" of making physics accessible to all.

    If you consider PhysX as a form of T&L, which was the whole "give the CPU less work to do" deal in the DX7 era, then they've done a very poor job of it. It's not a great analogy, but for the purposes of what PhysX is supposed to offer, making no effort to let the CPU contribute to better performance is not acceptable, if helping accelerate physics is what PhysX was claiming in the first place.

    I don't like the idea that PhysX does absolutely nothing for the CPU... you're supposed to help out the CPU to make your own product look better anyway. To say that it shouldn't help out the CPU is utter nonsense for hardware that is supposed to reduce CPU overhead for physics.
     
    #14 Deathlike2, Jan 6, 2010
    Last edited by a moderator: Jan 6, 2010
  15. Sontin

    Banned

    Joined:
    Dec 9, 2009
    Messages:
    399
    Likes Received:
    0
    So, what is the point of the PhysX bashing from Huddy? There are developers who have used or are using PhysX as their main CPU physics engine. It's not nVidia's fault that those developers don't use all the free cores of a CPU.

    I see the point, but GPU-PhysX is only one part of the whole "GPU physics" thing. As long as no independent developer uses CPU- and GPU-PhysX together in one program, I see no problem with nVidia not optimizing their code.
     
  16. Deathlike2

    Regular

    Joined:
    Aug 17, 2003
    Messages:
    542
    Likes Received:
    5
    He's saying that prior to nVidia acquiring PhysX, the company behind it was doing just dandy with multi-core acceleration. Since tests show that no more than two cores are being used, he's saying the tech is being "held back" intentionally to make you buy a video card rather than a better CPU. You would think that upgrading from a single core to a Core i7 would actually help physics out, but that's not really the case under this scenario.
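
    A back-of-the-envelope way to see why the Core i7 upgrade wouldn't pay off: if the engine clamps its worker count at two threads, extra cores sit idle. The sketch below applies Amdahl's law with such a clamp; the 90% parallel fraction is an assumed number for illustration, not a measured PhysX figure.

        // Hypothetical illustration: Amdahl's law with the engine's worker count
        // clamped at a cap, as alleged in this thread. p = 0.90 is assumed.
        #include <algorithm>
        #include <cstdio>

        // Speedup over one core when a fraction p of the work runs on n threads:
        // S(n) = 1 / ((1 - p) + p / n)
        double amdahl(double p, int n) { return 1.0 / ((1.0 - p) + p / n); }

        int main() {
            const double p = 0.90;    // assumed parallelizable share of physics work
            const int engineCap = 2;  // alleged hard cap on CPU worker threads
            for (int cores : {1, 2, 4, 8}) {
                int used = std::min(cores, engineCap);
                std::printf("%d cores -> engine uses %d -> %.2fx speedup\n",
                            cores, used, amdahl(p, used));
            }
            return 0;
        }

    With the cap in place, four and eight cores print the same ~1.82x as two, which is exactly the "upgrading the CPU doesn't help" behaviour being described.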
     
  17. Sontin

    Banned

    Joined:
    Dec 9, 2009
    Messages:
    399
    Likes Received:
    0
    I don't see any tests; he is just claiming something.
    This came from Futuremark:
    http://www.pcgameshardware.com/aid,...xclusive-and-more-technical-details/Practice/
     
  18. Deathlike2

    Regular

    Joined:
    Aug 17, 2003
    Messages:
    542
    Likes Received:
    5
    I showed you a link with some analysis. Your link doesn't have any.

    You have to at least see the impact with some benchmarks rather than just a Q&A.

    I'm not saying that the Batman game was any good as a comparison (we don't know how optimized it really is), but if you're going to market a marquee game with PhysX, you need to be able to show performance differences that one can properly quantify.
     
    #18 Deathlike2, Jan 6, 2010
    Last edited by a moderator: Jan 6, 2010
  19. Sontin

    Banned

    Joined:
    Dec 9, 2009
    Messages:
    399
    Likes Received:
    0
    I know a lot of other games with physics which only use 2-3 cores. It's wrong to assume bad multi-core CPU support just because games only use 2-3 cores.
    Look at other games which don't use PhysX:
    Dirt2: http://www.hardocp.com/article/2009/12/23/dirt_2_gameplay_performance_image_quality/9
    Resident Evil 5: http://www.hardocp.com/article/2009/11/03/resident_evil_5_gameplay_performance_iq/9
    Wolfenstein 2: http://www.hardocp.com/article/2009/09/01/wolfenstein_gameplay_performance_iq/8

    All three games use two cores of the CPU. What do you think? Do all three game developers want to harm the gamer?
     
  20. Deathlike2

    Regular

    Joined:
    Aug 17, 2003
    Messages:
    542
    Likes Received:
    5
    We're exclusively talking about PhysX here with multi-core processors... since that is the scope of Huddy's comments. Those benchmarks don't show PhysX on and off.
     