Interview with ATI's CEO, Dave Orton

Discussion in 'Beyond3D News' started by Dave Baumann, May 5, 2004.

  1. Rugor

    Newcomer

    Joined:
    May 27, 2003
    Messages:
    221
    Likes Received:
    0
    Just as it won't matter how fast Nvidia's chip is if people only have ATI in stock. ATI was first to market with the last generation too. It's going to be harder for Nvidia to get shelf space if all the stores have their money tied up putting ATI cards on the shelves. And as a former retail manager, I can tell you that since product that sells gets re-ordered, there's going to be less money for buyers to spend on NV4x.

    As has been said elsewhere, there are sales NV is going to lose just because people who got stirred up by all the new-card hype went out and bought the first new-generation card they saw: ATI.

    SM3.0 and even FP32 are gambles until we see how quickly the former is adopted and the latter is required. Faster SM2.0 and better AA aren't. We know people want those features. Plus never forget OEM friendliness for making profits.
     
  2. thatdude90210

    Regular

    Joined:
    Aug 9, 2003
    Messages:
    937
    Likes Received:
    6
    Betcha most retail managers would be happy to stock some NV40s around. Because for people who really want one, you might be able to sell them a high-priced power supply as well.
     
  3. Rugor

    Newcomer

    Joined:
    May 27, 2003
    Messages:
    221
    Likes Received:
    0
    I bet they will too -- I'm just saying that because ATI is getting product out the door first, stores are going to stock fewer NV40s than they might have otherwise.
     
  4. Diplo

    Veteran

    Joined:
    Apr 17, 2004
    Messages:
    1,474
    Likes Received:
    64
    Location:
    UK
    There's also an interview with Dave Orton at the despicable, baby-murdering, ATI-hating Tom's Hardware: http://www.tomshardware.com/hardnews/20040507_060001.html

    :shock:
     
  5. Anonymous

    Veteran

    Joined:
    May 12, 1978
    Messages:
    3,263
    Likes Received:
    0
    I think it must be Wintermane (ahh, if only it were Wintermute! j/k!) who is an nV stockholder, as he seems to get so cocky and pissed off about it when he thinks people are nV bashing, and he is under the impression that corporations DON'T LIE TO THEIR STOCKHOLDERS. I swear I almost lost my keyboard to that one! XD

    I respect Wintermane's opinion regardless -- he clearly knows more about the subject than I do, though I don't doubt many here know more than he does. It's just that this argument seems a little silly. Both products look pretty impressive, we really have no idea what the middle and economy tiers are going to look like, and even though ATi has shipped at least the non-XT X800, neither company's ability to produce either chip en masse can really be judged yet.

    *Dons fireproof suit*
     
  6. Anonymous

    Veteran

    Joined:
    May 12, 1978
    Messages:
    3,263
    Likes Received:
    0
    Oh, and just a little gem I picked up a while ago...

    SELL! SELL!!!! :p
     
  7. aaronspink

    Veteran

    Joined:
    Jun 20, 2003
    Messages:
    2,641
    Likes Received:
    64
    Well, let's do the math...
    $400M (from an NV conference call) / 10k units (rumored NV30 shipments) = $40,000 NRE per chip
    + $30 (absolute lower bound for fab cost) = ~$40K per chip.

    If we want to be generous, say $100M dev cost and 100k shipped: that's $1,030 per chip. Let's be really, really nice and say $50M dev cost: $530 per chip.

    For NV40:
    Say $200M dev cost and total shipments of 2 million, plus $40 per-chip fab cost = $140 per chip.

    The NV30 and the whole high end NV3x line had such low volumes that it would be almost inconceivable that a future product could have lower margin without NV immediately going under.

    Another thing to point at is that the boasted development cost numbers from Nvidia are most likely extremely bogus.
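    The amortization logic above is simple enough to sketch. A minimal Python check of the post's arithmetic, using the poster's own speculative dollar figures and unit counts (none of these are confirmed numbers):

```python
# Amortized cost per chip: development (NRE) cost spread over units shipped,
# plus the per-chip fabrication cost. All inputs are the poster's guesses.
def cost_per_chip(dev_cost: float, units_shipped: int, fab_cost: float) -> float:
    return dev_cost / units_shipped + fab_cost

# NV30 scenarios from the post:
print(cost_per_chip(400e6, 10_000, 30))     # 40030.0  (~"$40K per chip")
print(cost_per_chip(100e6, 100_000, 30))    # 1030.0
print(cost_per_chip(50e6, 100_000, 30))     # 530.0

# NV40 scenario:
print(cost_per_chip(200e6, 2_000_000, 40))  # 140.0
```

    The point the arithmetic makes is that NRE dominates at tiny volumes, so a high-volume part can have a far lower unit cost even with a higher per-chip fab cost.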

    The 6800 Ultra apparently uses ~50 watts more power at peak than an X800 Pro and ~40 watts more power peak than an X800 XT, which is roughly 50-70% higher power draw.
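    Those percentages are easy to sanity-check. A quick sketch, where the X800 baseline wattages are assumed round numbers chosen for illustration, not measured figures:

```python
# Percentage increase of one peak power draw over a baseline.
def percent_higher(delta_watts: float, baseline_watts: float) -> float:
    return 100.0 * delta_watts / baseline_watts

# Assumed baselines: ~70 W peak for an X800 Pro, ~80 W for an X800 XT.
print(round(percent_higher(50, 70)))  # 71 -> roughly "70% higher" vs the Pro
print(round(percent_higher(40, 80)))  # 50 -> "50% higher" vs the XT
```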

    you obviously don't understand the pricing models...

    It would appear that they win every test that matters. I don't know about you, but I'm pretty sure you can't buy a monitor that will show the difference between 200 and 300 FPS. I'm pretty sure everyone has a monitor that can show the difference between 45 and 70 FPS.

    Maybe they just went with 520 because that was the sweet spot on the yield curve for them.

    And besides, why is clock-for-clock important? They are both on roughly equivalent processes.

    Oh, I don't think it matters when I'm in an A-10. Or when the tank doesn't have any ammo, or is slow and out of date and can't pierce my armor. WTF was/is the point of that gem? And while quartz is technically a gem, it really isn't that valuable.

    Aaron Spink
    speaking for myself inc
     
  8. Wintermane

    Newcomer

    Joined:
    Aug 5, 2002
    Messages:
    18
    Likes Received:
    0
    The day I invest in either of these companies' stocks is the day flaming transvestite pigs fly out of my ass singing Bavarian opera. On the "don't lie" point: it's not so much that they don't lie as that they can get in deep poo if they do lie to their investors... That doesn't mean they can't waffle, weasel, and just obfuscate like crazy; they just can't out-and-out lie. It's sort of like politics without the hookers.
     
  9. Wintermane

    Newcomer

    Joined:
    Aug 5, 2002
    Messages:
    18
    Likes Received:
    0
    What OEM would use any of these cards? In general, most OEMs use mid- and bottom-range cards, or they charge you $2,700 for a "media" PC with one of the semi-higher-end cards in a system really worth $1,600 at most. As for sales they will lose: yes, maybe they will, but I doubt it really matters, as they will still likely sell every single high-end chip they expected to sell. As for retail space, you mean that little chunk of space between the Xbox games and the please-steal-this-trash discount games? You know, the one that's still filled with GeForce 2 MX 400s selling for $79.95 ;/ and that weird "AMAZING 72.6 GIG HD, ONLY $499.99! Now on sale, only $199.97!!!"
     
  10. Wintermane

    Newcomer

    Joined:
    Aug 5, 2002
    Messages:
    18
    Likes Received:
    0
    Well, poo. Yes, I was hoping you wouldn't notice that was as compared to the NV30 ;/

    On bottlenecks: you do know ATI has better tech for dealing with bandwidth limits than Nvidia does. All else being equal, the ATI card should have won all the tests. The only thing I can figure is that the Nvidia card has a better cache system, or the ATI card has something a bit wonky.

    Anyhoo, as I think I said before, all I give a darn about is the low- to mid-low-range cards, and neither company is spilling the darn beans on them! Which happens to be making me very cranky. At least with the NV30 line we knew by now what the low end was gonna be.
     
  11. Joe DeFuria

    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,994
    Likes Received:
    71
    I do?

    I believe that was the case with R3xx vs. NV3x, but this is a new generation and the jury's still out.

    No, all else being equal, the ATI card should win tests that are fill-rate (but not bandwidth) limited, and many (but not all) shader tests. And that's pretty much what we see. (Look at the aniso scores, non-AA scores in particular, which stress fill rate in a higher proportion to bandwidth than running with AA and no aniso does.)
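    The fill-rate-vs-bandwidth split can be made concrete with back-of-envelope numbers. The clock and memory figures below are the commonly reported launch specs for the two cards and should be treated as approximate:

```python
# Peak pixel fill rate (Gpixels/s) and memory bandwidth (GB/s) from headline specs.
def fillrate_gpix(core_mhz: int, pipes: int) -> float:
    return core_mhz * pipes / 1000.0

def bandwidth_gbs(mem_mhz_effective: int, bus_bits: int) -> float:
    return mem_mhz_effective * bus_bits / 8 / 1000.0

# X800 XT:    520 MHz core, 16 pipes, 1120 MHz effective GDDR3, 256-bit bus
# 6800 Ultra: 400 MHz core, 16 pipes, 1100 MHz effective GDDR3, 256-bit bus
print(fillrate_gpix(520, 16), fillrate_gpix(400, 16))      # 8.32 vs 6.4 Gpix/s
print(bandwidth_gbs(1120, 256), bandwidth_gbs(1100, 256))  # 35.84 vs 35.2 GB/s
```

    On these numbers ATI's fill-rate edge is about 30% while the bandwidth gap is under 2%, which is consistent with the X800 pulling ahead mainly in fill-rate-limited (aniso, non-AA) scenarios.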

    Rumor has it ATI will be spilling the beans pretty soon (fall at the latest, but maybe some or all with the PCI-E launch?), and Nvidia has said "by the end of the year" at this point for the rest of the NV40 line...
     
  12. Anonymous

    Veteran

    Joined:
    May 12, 1978
    Messages:
    3,263
    Likes Received:
    0
    Question?

    Looks to me like this.

    R100 PS 1.0 / VS 1.1

    R200 PS 1.1-1.4 / VS 1.1

    R300/R350/R360/R420 = DX9.0 PS/VS 2.0

    R480 = maybe??? DX9.0+ PS/VS 3.0

    R500 = DX9.0+ PS/VS 3.0

    R600/R700 = DX10 PS/VS 4.0

    R800/R900 = DX11 PS/VS 5.0

    Question: maybe R900 will be the last Radeon name before ATI comes up with a new NAME...
     