Asking Tim Sweeney about NVIDIA and more

Discussion in 'Beyond3D News' started by Reverend, Sep 29, 2003.

  1. Anonymous

    Veteran

    Joined:
    May 12, 1978
    Messages:
    3,263
    Likes Received:
    0
    WTF are you talking about?
     
  2. micron

    micron Diamond Viper 550
    Veteran

    Joined:
    Feb 23, 2003
    Messages:
    1,189
    Likes Received:
    12
    Location:
    U.S.
  3. Rugor

    Newcomer

    Joined:
    May 27, 2003
    Messages:
    221
    Likes Received:
    0
    I have to say I laughed out loud when I read that too.
     
  4. jimbob0i0

    Newcomer

    Joined:
    Jul 26, 2003
    Messages:
    115
    Likes Received:
    0
    Location:
    London, UK
    Hmm, this could well be an interesting point, but I believe the ability of the NV3x to run DX9 code was misrepresented at the time. Gabe Newell stated that during development of HL2, NVIDIA told him their card could do in excess of DX9 and would have no trouble handling it... which is one of the reasons the team there used an R3x0 in development. He said it came as a rude awakening (paraphrased) when they tested the game on the NV3x and immediately contacted NVIDIA... with whom they then worked closely to write a specific NV3x path for all the shaders in the game to try to recoup some speed.

    With this they both succeeded (they got some speed back) and failed (still not as fast as the R3x0 running DX9 natively). It wasn't for lack of trying, and heck, NVIDIA helped write most of that alternate path code, so it can't be because Valve "didn't know what they were doing for NV3x code" or because "they were paid off by ATI". The only logical conclusion left is that the NV3x basically sucks for the coming generation (i.e. next month onwards) of games... and that goes too for the anonymous poster reading Tim's words in odd ways.
     
  5. Anonymous

    Veteran

    Joined:
    May 12, 1978
    Messages:
    3,263
    Likes Received:
    0
    Re: Microsoft/Nvidia Fallout

    Well, maybe DX9 isn't "the bomb"... but isn't the point of DX to make an API standard so we users don't have to worry about whether a game will run on our hardware? To make it a tad closer to the console situation, so to speak, where you know a game will run without major problems on your hardware. That makes it easier and cheaper for programmers to make games, since they don't have to worry about extra coding. It also makes it easier for smaller IHVs to make and sell cards, since they don't have to worry about whether the programming houses will support their API in the game (like back in the days of Glide (3dfx), MeTaL (S3), and Rendition's offering, whose name I don't remember... Redstorm or similar).

    So claiming it's MS's fault for refusing to accept a special path for nVidia is strange; nVidia should have followed the standard, IMHO.

    BTW, "what Unreal was for nVidia" seems wrong to me... "what Unreal was for 3dfx" seems more correct, since Glide was Unreal's API (until, after a couple of UT patches, DX started to work as well and later better... but by then 3dfx's cards had started to fall far behind).

    Finally, yes, I use an ATi card (9600 Pro) and have been using ATi since I bought a Radeon DDR, going via an 8500LE to my present card... but this isn't because I'm a fan; it's for two reasons:
    1. Price (with the Radeon DDR I was going to get a GF2, but the Radeon was on sale, so I saved quite a bit of cash).
    2. 2D quality (swapping from my GF2 MX to the Radeon made Fallout look like it never did, so I can't switch to anything worse in this department; I know nVidia has caught up quite a bit here, but reason #1 simply gave me a lot more in the 9600 Pro than any nVidia offering at the time).

    What my next card will be I haven't got a clue: nVidia, PVR, S3, SiS... I'll decide when it's time. For now I'm very happy with my 9600 Pro.
     
  6. RiotSquad

    Newcomer

    Joined:
    Feb 9, 2002
    Messages:
    38
    Likes Received:
    0
    Oooops

    I forgot to log in... sorry to add another "guest" to the postings; the post above was made by me.
    Not that I'm well-known or anything on the forum (or about 3D architecture), but still. :)
     
  7. Sxotty

    Veteran

    Joined:
    Dec 11, 2002
    Messages:
    4,895
    Likes Received:
    344
    Location:
    PA USA
    No, they said it took five times as long to optimize the NV3x path, which means the default DX9 path (which is the R3xx path) took 20% of that time to optimize. The real question is: did the NV3x benefit from the optimizations to the DX9 path? If it did not, then they should quit talking through their hat and call it an R3xx path.
     
  8. zxern

    Newcomer

    Joined:
    Jun 9, 2003
    Messages:
    2
    Likes Received:
    0
    What a clever boy.
     
  9. Rugor

    Newcomer

    Joined:
    May 27, 2003
    Messages:
    221
    Likes Received:
    0
    The path the R3xx cards run was written to the specs of the DX9 API; Valve didn't make any optimizations on that path for any particular hardware. Both ATI and Nvidia cards are capable of running that path, while the R3xx cards cannot run the NV3x path. Should they ever be released to market, both Volari and DeltaChrome will also be able to run the DX9 path, but not the NV3x path.

    It's called the DX9 path because any DX9-compliant part should be able to run it. It just so happens that R3xx follows the spec so closely that there is no way to optimize for it that doesn't also optimize for the spec. Unfortunately for Nvidia, their NV3x architecture doesn't handle the spec very well, so it needs its own optimization.
     
  10. RussSchultz

    RussSchultz Professional Malcontent
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,855
    Likes Received:
    55
    Location:
    HTTP 404
    That's not really true. API compliance does not dictate optimal usage of the API. Unreal had a really bad time on most Direct3D cards because, although it used the API correctly, it grouped things suboptimally, leading to rather poor performance.

    Each part has things it does best and things it doesn't. Should you group your dispatches by material? Or depth-sort for best performance? Are strips better? Or fans? Or just indexed vertex buffers? All of these are compliant, but each has an impact on performance, and that impact is architecture-dependent.

    What runs best on one architecture might not be best on another--even among DirectX 9 compliant architectures.
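
    As a rough illustration of that trade-off (a hypothetical sketch, not code from any actual engine: the Mesh type and the scene list are invented for the example), here are two equally DX9-compliant ways to submit the same geometry in Direct3D 9:

    Code:
        // Two compliant submission orders with very different costs depending
        // on whether state-change overhead or fill rate dominates on a part.
        #include <d3d9.h>
        #include <algorithm>
        #include <vector>

        struct Mesh {
            IDirect3DTexture9* material;   // texture stands in for full material state
            float              depth;      // view-space distance, used for depth sorting
            UINT               numVerts, startIndex, numTris;
        };

        void DrawScene(IDirect3DDevice9* dev, std::vector<Mesh>& scene, bool byMaterial)
        {
            if (byMaterial)
                // Fewer SetTexture/state changes: wins where batch overhead dominates.
                std::sort(scene.begin(), scene.end(),
                          [](const Mesh& a, const Mesh& b) { return a.material < b.material; });
            else
                // Front-to-back: maximizes early-Z rejection, wins where fill rate dominates.
                std::sort(scene.begin(), scene.end(),
                          [](const Mesh& a, const Mesh& b) { return a.depth < b.depth; });

            for (const Mesh& m : scene) {
                dev->SetTexture(0, m.material);
                // Strips (D3DPT_TRIANGLESTRIP) and fans (D3DPT_TRIANGLEFAN) would be
                // equally compliant choices here, again with hardware-dependent costs.
                dev->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0,
                                          m.numVerts, m.startIndex, m.numTris);
            }
        }

    Both versions draw the exact same frame through the same API; which one is faster depends entirely on the architecture underneath.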
     
  11. Simon F

    Simon F Tea maker
    Moderator Veteran

    Joined:
    Feb 8, 2002
    Messages:
    4,560
    Likes Received:
    157
    Location:
    In the Island of Sodor, where the steam trains lie
    Eloquently put, Russ.
     
  12. Simon F

    Simon F Tea maker
    Moderator Veteran

    Joined:
    Feb 8, 2002
    Messages:
    4,560
    Likes Received:
    157
    Location:
    In the Island of Sodor, where the steam trains lie
    Arghhh No DELETE button for double post.
     
  13. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    It's not a question of "optimizing" a DX9 code path for the game, but rather of *creating* one--which took 1/5 the time Valve spent creating an optimized mixed-mode path for nV3x (that means partial & full precision, as well as *other things*). Note that the mixed-mode, nV3x-specific path for the game is different from either the DX8.x path or the DX9 code path.
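
    For reference, "mixed mode" here means mixing full precision with partial (fp16) precision in the pixel shaders. As a minimal sketch of what that knob looks like to a developer (the HLSL snippet is invented for illustration; the ps_2_0/ps_2_a profiles and the D3DXSHADER_PARTIALPRECISION flag are real D3DX identifiers):

    Code:
        // Compile the same (made-up) shader once for the generic DX9 path and
        // once with partial precision forced, roughly the nV3x-flavoured variant.
        #include <d3dx9.h>
        #include <string.h>

        static const char* kShader =
            "sampler diffuse : register(s0);              \n"
            "float4 main(float2 uv : TEXCOORD0) : COLOR   \n"
            "{ return tex2D(diffuse, uv) * 0.5f; }        \n";

        HRESULT CompileBothWays(LPD3DXBUFFER* full, LPD3DXBUFFER* partial,
                                LPD3DXBUFFER* errors)
        {
            // Generic DX9 path: full precision throughout (ps_2_0).
            HRESULT hr = D3DXCompileShader(kShader, (UINT)strlen(kShader), NULL, NULL,
                                           "main", "ps_2_0", 0, full, errors, NULL);
            if (FAILED(hr)) return hr;

            // nV3x-oriented build: ps_2_a profile with fp16 allowed everywhere.
            // (A real mixed-mode path would instead use HLSL's 'half' type
            // selectively, shader by shader, which is where the extra work goes.)
            return D3DXCompileShader(kShader, (UINT)strlen(kShader), NULL, NULL,
                                     "main", "ps_2_a", D3DXSHADER_PARTIALPRECISION,
                                     partial, errors, NULL);
        }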

    Valve was very clear that although the nV3x path was faster for nV3x than the DX9 path, it still fell fairly far behind the R3x0's performance running Valve's "straight" DX9 path--so much so that Valve recommends running the DX8.x path for all nV3x parts and has made it the default setting for 5200/5600 products. Whether Valve will even retain the nV3x path in the shipping game is not yet decided, as far as I know. As you know, Valve professed that it had wasted its time on the nV3x mixed-mode path and would have been better off putting the nV35 on the DX8.x path from the start. The development time for the DX8.x path would have been even less than for the DX9 path, I would imagine. So much for how nVidia's "real game optimizing" strategy actually works in real life.

    I put the word "straight" above in quotes, because every developer's "DX9 path" is going to be different from another's. It's not because they are hand-optimizing for specific 3d chips--nope, it's because the *level and kind* of DX9 feature support a developer uses in his game will *differ* among DX9 3d games. Not all 3d games will support the same DX9 features in the same way. This has nothing to do with a developer favoring one IHV over another.

    In other words, the DX9 path in Tomb Raider will be distinctly different from the DX9 path in HL2--different engines by different programmers are being used, and different features of DX9, at differing levels, will be used in these games.

    So the interesting thing here, relative to your argument that the HL2 DX9 code path amounts to an "ATi-optimized" path rather than Valve's own generic DX9 code path, is that the same performance differences show up running the DX9 code path in two distinctly different games. We see the same problems with DX9 feature support in TR that we see in HL2.

    In fact, to carry your argument to its logical conclusion, we would have to conclude that every DX9 game engine tested to date, and every synthetic DX9-feature benchmark tested to date--all of which essentially show us the same thing Valve showed us--have been "optimized for ATi" with the express purpose of dishonestly characterizing ATi's hardware as better than nVidia's. Of course, that's ridiculous on the face of it.

    How much money do you think it would take to buy off the honesty of all of these people and companies? I'll tell you I think it would take way more than ATi's got to spend--or at least that ATi *would* spend on something like that. Obviously, ATi's going to put its money into R&D and stay ahead by making superior products if they can--which is a much more sensible way to spend a company's money, IMO.

    But most importantly of all--what's interesting here is that you are accusing ATi of engaging in the same kind of behavior nVidia has been engaging in all year--ever since it originally lied about having an 8x1 pixel-per-clock pipeline--which it started doing last year at the nV3x announcement at Comdex. Ever since then nVidia's conduct has been one long downhill slide into a morass of dishonesty and equivocation.

    Frankly, I've seen nothing out of ATi except excellent products, drivers, support, and community involvement. When called out on a legitimate fault, ATi has been quick to respond to its buying public, self-effacing when required, and, most importantly, has always taken corrective action. This kind of corporate behavior and image projection stands in sharp relief to nVidia's conduct all year (the company admits nothing, promises nothing--apart from magical drivers--corrects nothing, and seems completely unconcerned about its public image).

    Therefore, I see no evidence whatsoever to support your allegation that Valve is lying about the nature of its generic DX9 code path for HL2.
     
  14. Anonymous

    Veteran

    Joined:
    May 12, 1978
    Messages:
    3,263
    Likes Received:
    0
    Walt, that was a lot of talking for a very simple point, which you missed.

    I said:
    "if the optimizing for the directx9 path that they did does not benefit the NV30 when it runs it then they could simply call it a r3xx path"

    Similarly, if other DX9 chips actually reach retail and a developer makes a game and then optimizes the DX9 path, and the Volari sees an increase of 45% while the R3xx sees an increase of 0%, they are effectively optimizing for the Volari. Do you understand? That is all I was saying: if in this case the only cards that benefit from and run the DirectX 9 path are the R3xx parts, it could be called optimizing for the R3xx chips.

    And I am not saying ATI paid all developers off to report this; everyone knows nVidia chips are poor performers in DX9, and I am not arguing that. Although in this case ATI did pay them, I am not saying ATI paid them off, only that they were paid. I think the NV3x is poor; why do you think I have a 9800?
     
  15. Anonymous

    Veteran

    Joined:
    May 12, 1978
    Messages:
    3,263
    Likes Received:
    0
    This is where I step off the train.

    The implication that Valve's benchmarks are influenced by the marketing deal is utter GARBAGE! Gabe went out of his way to explain that it was because of the results they were seeing that they made their choice. Anyone who can't see the obvious truth of that, based on all the available evidence from all DX9 situations, is (IMO) a blind fan***.

    This comment by Sweeney is an utter "crap" statement, and I am pretty confident as to why he made it. :roll: This guy is the one talking from nothing but his own personal bias, favoritism, and marketing agreement with Nvidia.

    Every time I try to start to respect this guy, he says something new that makes me remember why I pretty much think he is offal.
     
  16. Anonymous

    Veteran

    Joined:
    May 12, 1978
    Messages:
    3,263
    Likes Received:
    0
    Sooo... you're saying that using UT2003 as a benchmark is just as wrong as this, and all results should be ignored? :roll:

    Please...
     
  17. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    82
    That's a misnomer. It's Microsoft's DX9, not an R300-optimised path. When other DX9 cards arrive, will we call DX9 the "ATI/XGI path"? Do we call Formula 1 the "Ferrari race" because that one company has been doing better at it than any other for the last few years? Of course not.

    You're implying that ATI got some kind of preferential treatment, when the exact opposite is true. It is Nvidia, with its poor performance, that got *five times* more work to improve its performance by using low precision, and it *still* can't reach the performance levels of its competitors.

    ATI runs the standard DX9 path that every other DX9 card will be expected to run. This means the ATI card (gasp) runs the *standard DX9 path*, not that Valve coded for ATI only. :roll:
     
  18. Rugor

    Newcomer

    Joined:
    May 27, 2003
    Messages:
    221
    Likes Received:
    0
    Both ATI's R3xx and Nvidia's NV3x based parts can run the DX9 path, albeit with very different performance.

    ATI's R3xx parts cannot run the NV3x-specific path.

    Both can run the DX9 path; that's why it's the standard path.
     
  19. RussSchultz

    RussSchultz Professional Malcontent
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,855
    Likes Received:
    55
    Location:
    HTTP 404
    Up to this point, there have only been ATI and NVIDIA. So if they're doing a separate path for NVIDIA, it stands to reason that the only card being tested against in DX9 is the ATI card (though they might have others by now). If the DX9 path has only been tested against ATI, we have absolutely no idea how well it will run on other DX9-capable cards. If their architectures are close, they'll benefit from the "DX9" optimization that's already happened. If they've got different performance characteristics, then the "DX9" path will be suboptimal for them and it will, indeed, be an "R3xx"-optimized path.
     
  20. Anonymous

    Veteran

    Joined:
    May 12, 1978
    Messages:
    3,263
    Likes Received:
    0
    Exactly what Russ said, since you guys don't like how I said it.

    I am not saying they are evil. Everyone wants to jump up and down and pledge allegiance to ATI; I don't really care, and I am not saying Valve and/or ATI is evil. I am just saying that people seem to be a little closed-minded about insisting there are "no optimizations for R3xx".

    On a side note, if Ferraris were the only thing that raced because everyone else was too slow to compete and dropped out, I suppose they would call it the Ferrari race.
     