Interview w/ Nvidia guy David Kirk at FS

Discussion in 'Graphics and Semiconductor Industry' started by beyondhelp, Sep 30, 2003.

  1. beyondhelp

    Newcomer

    Joined:
    May 16, 2003
    Messages:
    109
    Likes Received:
    0
    Location:
    Ruiz Base, Omicron Beta
    For your reading pleasure, this interview w/ David Kirk, Nvidia's Chief Scientist, @ Firing Squad...

    http://firingsquad.gamers.com/hardware/kirk_interview/


    a short quote...

    "FiringSquad: Do you feel that fact that you guys, your hardware came out later -- does that also contribute to the initial performance that’s coming out in terms of the DX9 titles that have been benchmarked with?

    Kirk: Yeah, I would say that one of the issues is that since our hardware came out a little bit later some of the developers started to develop with ATI hardware, and that’s the first time that’s happened for a number of years. So if the game is written to run on the other hardware until they go into beta and start doing testing they may have never tried it on our hardware and it used to be the case that the reverse was true and in this case now it’s the other way around. I think that people are finding that although there are some differences there really isn’t a black and white you know this is faster that is slower between the two pieces of hardware, for an equal amount of time invested in the tuning, I think you’ll see higher performance on our hardware."

    and...

    FiringSquad: Do you feel that in terms of the Half-Life 2 performance numbers that were released recently…do you feel that maybe you guys were, I don’t want to say given a bad rep, but maybe an unfair deal?

    Kirk: Well again, not wanting to start a flame war back and forth, my feeling is if they had issues with speed, it’s really not appropriate to say that it doesn’t run at all. (Our question had mentioned this --FS) It’s just that so far in their state of optimization it doesn’t run fast. But after we’ve had a chance to work together on [inaudible] that will be able to provide a very good game experience with Half-Life on the full GeForce FX family. There’s no question in my mind that we’ll get there, it’s just a matter of time.

    ...Sure, someday... maybe... you just wait, the next Dets will fix everything! The next spin will be Golden! It's just a matter of time.
     
  2. nelg

    Veteran

    Joined:
    Jan 26, 2003
    Messages:
    1,557
    Likes Received:
    42
    Location:
    Toronto
    [satire]
    FiringSquad: One of the things that ATI has kind of said, or least they were suggesting at Shader Day is the fact that they can do more floating-point operations than you guys can. How would you respond to those types of statements?

    Kirk: Well I guess the first response would be of course they would say that. But I don’t really see why you or they would think that they understand our pipeline, because in fact they don't, nobody, not even we understand it. The major issues that cause differing performance between our pipeline and theirs is we’re sensitive to different things in the architecture than they are, like math and stuff, so different aspects of programs that may be fast for us will be slow for them and vice versa. The Shader Day presentation that says they have two or three times the floating point processing that we have is just nonsense, our figures show it to be five times. Why would we do that? [/satire]
     
  3. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    Famous last words..."...it's just a matter of time." Heh...:) Almost as good as, "Just wait for the next Detonators--you'll see--just wait."
     
  4. K.I.L.E.R

    K.I.L.E.R Retarded moron
    Veteran

    Joined:
    Jun 17, 2002
    Messages:
    2,952
    Likes Received:
    50
    Location:
    Australia, Melbourne
    Half Life 2...

    Now with STATIC CLIP PLANES!

    :lol:

    Sorry, couldn't help myself. :D
     
  5. nelg

    Veteran

    Joined:
    Jan 26, 2003
    Messages:
    1,557
    Likes Received:
    42
    Location:
    Toronto
    KILER, I guess that is the new SCP tech everyone is waiting for.
     
  6. Typedef Enum

    Regular

    Joined:
    Feb 10, 2002
    Messages:
    457
    Likes Received:
    0
    Was it just me or...

    Is Kirk's English really that bad?
     
  7. K.I.L.E.R

    K.I.L.E.R Retarded moron
    Veteran

    Joined:
    Jun 17, 2002
    Messages:
    2,952
    Likes Received:
    50
    Location:
    Australia, Melbourne
    w0t engrish!???!!!1 :lol:

    It's common practice to sound like an idiot to avoid media castration about a bad product. :lol: ;)
     
  8. XForce

    Newcomer

    Joined:
    Jun 12, 2003
    Messages:
    58
    Likes Received:
    0
    He SO doesn't get it, does he?
    It's not GFFX vs. Ati, it's GFFX vs. any DX9 capable card.
    Games aren't meant to be developed "for Ati" or "for GFFX", but "for DX9".
    It's rather a nice coincidence (or NOT) that "optimized for DX9" means almost the same as "optimized for Ati" lately.

    Having said that, I'm sure he gets it quite well.
    We just can't expect an Nvidia engineer or representative to really speak their mind on what is so obvious, making public statements that would spread across the Web like wildfire.
    Not gonna happen, PR-FUD is all we got and will get.

    Cheers,
    Mac
     
  9. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,081
    Likes Received:
    651
    Location:
    O Canada!
    This is a running theme in all their presentations lately - they state that ATI and NVIDIA have such different shader architectures that they each need their own separate paths. However, they appear to overlook that in all titles released presently, or known upcoming ones, the ATI path is purely the API default path (i.e. HLSL for TR:AoD & HL2, DX9 for 3DMark03, ARB2 for Doom3). Yes, this may be a different path to NVIDIA's, but it's also the default path that any other DX9/OpenGL board will use, so in essence it's only NVIDIA that requires special paths.
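
    In rough pseudocode (names purely illustrative, not lifted from any actual title), the claim boils down to something like this:

    [code]
    from dataclasses import dataclass

    @dataclass
    class Adapter:
        vendor: str
        supports_dx9: bool

    def pick_shader_path(adapter: Adapter) -> str:
        # The so-called "ATI path" is simply the API default; only the
        # NV3x parts get a special, hand-tuned branch.
        if not adapter.supports_dx9:
            return "dx8_fallback"
        if adapter.vendor == "NVIDIA":
            return "nv3x_special"      # partial precision, reordered shaders
        return "dx9_default"           # plain HLSL / ARB2 - what R3xx and any
                                       # other DX9-capable board will run

    print(pick_shader_path(Adapter("ATI", True)))     # dx9_default
    print(pick_shader_path(Adapter("NVIDIA", True)))  # nv3x_special
    print(pick_shader_path(Adapter("S3", True)))      # dx9_default
    [/code]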
     
  10. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    82
    Jesus Christ on a bicycle! It's back to this "oh poor Nvidia, please take pity on us, the game developers have done something to make us look bad, we don't know why they hate us but they do, and it makes us all cry".

    Pathetic, Nvidia, pathetic. Every time someone from Nvidia opens their mouth, it makes me want to change the channel like I've seen a slimy, lying politician.
     
  11. jimbob0i0

    Newcomer

    Joined:
    Jul 26, 2003
    Messages:
    115
    Likes Received:
    0
    Location:
    London, UK
    Worse, BZB..... you've seen an NV graphics PR rep ;)

    My dad once told me this saying:

    "If ten men tell you you're dead... you had better lie down"

    Now, being a stubborn git at times, my response was:

    "Guess they've seen a ghost"

    Now, Nvidia, please tell me why your graphics department and gfx PR will not believe everyone in the community, but instead just think "_____ ________ have done something to make us look bad, we don't know why they hate us but they do".

    Hehe, it is getting kinda amusing though, in a perverse way - just like SCO's recent press releases damning the GPL and Linux... predictable, slanderous and funny :D
     
  12. andypski

    Regular

    Joined:
    May 20, 2002
    Messages:
    584
    Likes Received:
    28
    Location:
    Santa Clara
    Hmm... perhaps I should remind Mr. Kirk that processors for specialised applications (such as graphics) don't necessarily conform to his arbitrary rules. Intermediate processing sizes occur in applications like DSPs as one example - these may process internally at 24 bits (fixed or float) and output at 16, using the larger number of internal bits to retain accuracy.
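
    A quick toy example of that wider-internal-word idea (assumed Q15-style 16-bit samples; nothing vendor-specific about it):

    [code]
    # Averaging many small Q15 samples: a 16-bit accumulator has to shift
    # low bits away for headroom, while a wider 24-bit internal accumulator
    # keeps them and only rounds once, at the 16-bit output.
    samples = [3] * 200                    # 200 tiny Q15 samples

    # 16-bit accumulator: pre-shift by 8 so 200 worst-case samples still fit
    acc16 = sum(s >> 8 for s in samples)
    print((acc16 << 8) // len(samples))    # -> 0: the small samples vanished

    # 24-bit accumulator: room to add 200 Q15 samples unshifted
    acc24 = sum(samples)                   # worst case ~6.6M, fits in 24 bits
    print(acc24 // len(samples))           # -> 3: accuracy retained at output
    [/code]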

    So it seems that bytes can happen in threes, contrary to his belief. Amazingly enough, those with an understanding of the graphics industry will know that 24-bit screen modes exist as well and were widely used in the past. This is more surprising if anything, since it seems to me that the reliance on bytes coming in twos, fours and eights is more associated with the requirements of external memory than with any internal processing restrictions. Internally to hardware, data paths don't necessarily conform to this rule, and in fact may not even come in byte-sized elements at all...

    The Motorola 68000 only had a 24-bit address bus because this was deemed to be wide enough at the time, and that was quite a highly regarded piece of hardware as I recall.

    Still, I guess it sounds good if you're trying to make it appear that ATI have broken some unwritten law and hence dragged the whole industry down kicking and screaming. He might just as easily have said that GF3 and GF4 were 'funny places to be' because as far as I remember they use 10 bits per component in their pixel ALUs (range of 0->2 and 9 fractional bits).

    I think that what Mr. Kirk thinks here may be a somewhat inaccurate picture of events, but maybe I remember things wrongly.

    However, it seems to me that claims of being the guardians and evangelists of higher-precision processing sound somewhat suspect coming from parties whose hardware prefers lower default precisions.

    "Too much precision for pure colour calculations."

    I would have thought that when dealing with real-world colour elements like exposure, where brightness ratios can easily be in the many 1000s, you might want quite a lot of precision for colour calculations unless you want to risk banding artifacts.
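
    To put rough numbers on it (a quick Python sketch; the 10/16/23-bit mantissas are the usual FP16/FP24/FP32 layouts, so treat the exact figures as ballpark):

    [code]
    import math

    def step(x, mantissa_bits):
        # Spacing between adjacent representable values near x for a float
        # format storing the given number of mantissa bits.
        return 2.0 ** (math.floor(math.log2(abs(x))) - mantissa_bits)

    # Smallest distinguishable colour step near a bright, exposure-scaled
    # value of 1000.0:
    for name, bits in (("FP16", 10), ("FP24", 16), ("FP32", 23)):
        print(f"{name}: ~{step(1000.0, bits):g} "
              f"({step(1000.0, bits) / 1000.0:.1e} relative)")
    [/code]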

    "Not enough precision for geometry"

    Certainly not necessarily the case. It is obviously possible to generate cases where it isn't enough (you can extend this to any arbitrary level of precision, however), but there is plenty of scope for working with geometry creatively within the bounds of 24-bit FP.

    "Not enough precision for normals"

    Hmm... certainly in many applications normals are frequently specified with less than 24 bits of FP (for example, normal maps might have only 8 bits per component). I don't think there are going to be many complaints about the accuracy of normal calculations in 24 bits for some time.
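
    For a feel of the numbers (a little Python sketch; the normal and bit widths are just picked for illustration):

    [code]
    import math

    def angular_error_deg(n, bits):
        # Quantize a unit normal's components to 'bits' bits over [-1, 1],
        # renormalize, and report how far the result swings in degrees.
        levels = (1 << bits) - 1
        q = [round((c * 0.5 + 0.5) * levels) / levels * 2.0 - 1.0 for c in n]
        length = math.sqrt(sum(c * c for c in q))
        q = [c / length for c in q]
        dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(n, q))))
        return math.degrees(math.acos(dot))

    n = (0.267261, 0.534522, 0.801784)          # an arbitrary unit normal
    print(f"8-bit normal map: {angular_error_deg(n, 8):.3f} degrees off")
    print(f"16-bit:           {angular_error_deg(n, 16):.5f} degrees off")
    [/code]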

    "Not enough precision for directions"

    This would seem to be highly dependent on circumstances, just as with the geometry case. In all these cases you can generate examples where any chosen precision will turn out to be insufficient.

    "Not enough precision for shadows"

    Funny, we seem to be able to do plenty of shadow calculations within these limitations.

    "Not enough precision for any kind of real arithmetic work"

    I guess no ALU instructions in pixel shaders should ever use really inferior things like 16-bit FP then? Obviously 16-bit FP must only be any good for fake arithmetic, so it seems insane that anybody would ever use it at all, and yet apparently it manages to be really useful at the same time?

    Feh!

    Exactly what I believe ATI hardware gives them, and up until now it seems that some competitor's hardware may not.

    Can we have a timescale on when we will see this?

    I would prefer a statement like: "I think that through a combination of 16 and 32, we may manage to be about 30% slower, with lower quality, while clocking our hardware 30% faster. Maybe."

    Or perhaps I'm being too harsh - he's entitled to his opinion after all. I personally think that as things stand at the moment he's wrong.

    [edit - added reference to GF3/4]
     
  13. XForce

    Newcomer

    Joined:
    Jun 12, 2003
    Messages:
    58
    Likes Received:
    0
    Each time you read an Nvidia PR statement, a kitten dies... :twisted:
     
  14. parhelia

    Newcomer

    Joined:
    May 15, 2002
    Messages:
    214
    Likes Received:
    0
    Ok, why do they encourage developers to optimize games for "GFFX" with special codepaths, then??
     
  15. jimbob0i0

    Newcomer

    Joined:
    Jul 26, 2003
    Messages:
    115
    Likes Received:
    0
    Location:
    London, UK
    That's funny... as far as I recall NV were touting 32-bit, and since ATI had the 'lowest precision' at 24-bit, that was chosen as the DX9 minimum.... if NV had harped on enough about 16-bit back then they might not be in the mess they are now :roll:
     
  16. XForce

    Newcomer

    Joined:
    Jun 12, 2003
    Messages:
    58
    Likes Received:
    0
    'Cause that, and reduced IQ, are the only ways for NV to stay competitive at the moment.
    Plus big companies don't want their games to look "bad", even on inferior hardware, so they make the effort, IF they have enough money/developers/time.
     
  17. CorwinB

    Regular

    Joined:
    May 27, 2003
    Messages:
    274
    Likes Received:
    0
    I may be wrong, but I thought the R3xx architecture processed geometry in full 32 bits, and that 24 bits was only used in the last part of the pipeline?
     
  18. andypski

    Regular

    Joined:
    May 20, 2002
    Messages:
    584
    Likes Received:
    28
    Location:
    Santa Clara
    That's correct, but I believe he's referring to generating or modifying geometry using the pixel shader in this case.
     
  19. dan2097

    Regular

    Joined:
    May 23, 2003
    Messages:
    323
    Likes Received:
    0
    Don't the S3 DeltaChromeX and/or XGI Volari cards use this? :roll:
     
  20. RussSchultz

    RussSchultz Professional Malcontent
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,855
    Likes Received:
    55
    Location:
    HTTP 404
    Heh. The DSP I use at work is a 24-bit fixed-point processor.

    And I hope to God it dies soon.
     