NVIDIA GF100 & Friends speculation

Discussion in 'Architecture and Products' started by Arty, Oct 1, 2009.

  1. Squilliam

    Squilliam Beyond3d isn't defined yet
    Veteran

    Joined:
    Jan 11, 2008
    Messages:
    3,495
    Likes Received:
    114
    Location:
    New Zealand
    So the six-DisplayPort Eyefinity edition has a market in security? Wow, I had NOT thought about that one.

    P.S. How many displays would a game developer *want to* run? I can see AMD trying very hard to corner the developer workstation market.
     
  2. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
    Exactly, and then you have a multi-million dollar market for on-air live TV presentations and offline presentations, all of which require professional cards.
     
  3. nagus

    Newcomer

    Joined:
    Aug 24, 2002
    Messages:
    134
    Likes Received:
    5
  4. CRoland

    Newcomer

    Joined:
    Jan 19, 2010
    Messages:
    114
    Likes Received:
    0
    Uh, why can't I edit... I got the 26 months mixed up. It should be 21 months, corresponding to 2.28 times the performance of the 280. Sorry about that.
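    The relationship between the 21-month gap and the 2.28× figure can be sanity-checked with a quick calculation. This sketch is not from the thread; the doubling-cadence framing is an assumption added purely for illustration:

    ```python
    import math

    # Figures quoted in the post above: a 21-month gap, and a claimed
    # 2.28x performance multiple relative to the GTX 280.
    months = 21
    speedup = 2.28

    # If performance compounds as speedup = 2 ** (months / period),
    # the implied doubling period is:
    period = months / math.log2(speedup)
    print(f"implied doubling period: {period:.1f} months")  # ~17.7 months

    # Conversely, a classic 18-month doubling cadence over 21 months gives:
    print(f"18-month cadence: {2 ** (months / 18):.2f}x")  # ~2.24x
    ```

    So under this (hypothetical) compounding model, 2.28× over 21 months is roughly consistent with a doubling period just under 18 months.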
     
  5. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY

    And that's what I'm saying: it's stupid to do that. Do you know what drivers were used? Do you know what motherboard was used? And so on. It's dumb to even look at that, because if we use the "up to 2x" figure from the marketing slides it's slower, but if we use the hardware review that you used before

    http://www.hardware.fr/articles/770-20/dossier-amd-radeon-hd-5870-5850.html

    now it's faster by 10%.

    How the hell can you estimate anything from these numbers?
     
  6. Florin

    Florin Merrily dodgy
    Veteran Subscriber

    Joined:
    Aug 27, 2003
    Messages:
    1,707
    Likes Received:
    345
    Location:
    The colonies
    You don't. You know it, I know it, Aaronspink knows it. But Quadro is the corporate standard and that's not going to change anytime soon.
     
  7. CarstenS

    Legend Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,800
    Likes Received:
    3,920
    Location:
    Germany
    Now you sound just like NV PR (in fact, they say exactly the same thing, almost to the letter, in one of their PDFs).
     
  8. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
    Well, for the NVS that's true, but the market for that is much smaller than the advertising + TV world. And as for the CAD world, well yeah, it's very small.
     
  9. Sontin

    Banned

    Joined:
    Dec 9, 2009
    Messages:
    399
    Likes Received:
    0
  10. aaronspink

    Veteran

    Joined:
    Jun 20, 2003
    Messages:
    2,641
    Likes Received:
    64
    I could easily see how having 3-4 monitors would be a benefit for a game developer:
    1 3D display
    1 debug/perf data display
    1 IDE display
    0/1 Office work display

    Certainly I miss the 3 displays I had back in the day for IC design.

    The reality is, until you've used a multi-display setup in a business environment you don't realize how much it improves efficiency, especially in the technical industries. Certainly for anything that involves code and debugging, 2+ monitors make a world of difference.
     
  11. Psycho

    Regular

    Joined:
    Jun 7, 2008
    Messages:
    746
    Likes Received:
    41
    Location:
    Copenhagen
    From Nvidia's Q2 report, page 23:
    Quadro+Tesla: revenue $116M, income $41M
    GeForce: revenue $372M, loss $144M (the $120M bumpgate charge probably included)

    That's what compares to AMD's graphics division; MCPs ($237M/$53M) are in AMD's computing division.
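    The segment figures above imply very different operating margins; a minimal sketch of that arithmetic (the numbers are taken from the post, the margin calculation is added for illustration):

    ```python
    # Operating margins implied by the segment figures quoted above
    # (in millions of USD, as cited from Nvidia's Q2 report).
    segments = {
        "Quadro+Tesla": {"revenue": 116, "operating_income": 41},
        "GeForce": {"revenue": 372, "operating_income": -144},
    }

    for name, seg in segments.items():
        margin = seg["operating_income"] / seg["revenue"] * 100
        print(f"{name}: {margin:+.1f}% operating margin")
    # Quadro+Tesla: +35.3%, GeForce: -38.7%
    ```

    The much smaller professional segment runs at a strongly positive margin while the far larger consumer segment runs at a loss, which is the point being made in this exchange.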
     
  12. nagus

    Newcomer

    Joined:
    Aug 24, 2002
    Messages:
    134
    Likes Received:
    5
    Forgive me, but that's just plain stupid! Cypress has the same amount of memory...
     
  13. aaronspink

    Veteran

    Joined:
    Jun 20, 2003
    Messages:
    2,641
    Likes Received:
    64
    You either need multiple display cards or a card that supports more than 2 displays. Until the 5xxx series, >2-display graphics cards have almost always been sold as part of the professional display lines. Someone pointed out Matrox, which has subsisted this entire time basically by selling 3+ display cards to markets like these, at prices they couldn't get in any other market, with 5-8 year old ICs. Nvidia and ATI have also had these solutions for quite some time.

    Some of the sub-markets are fairly specialized, like transit/power monitoring stations, and some are more mainstream, like the financial sector, but they all share basically the same requirement: lots of information displayed in a constant and understandable/readable presentation. In almost all cases graphics performance isn't an issue, as they aren't doing anything really taxing by today's standards graphically; they just have a LOT of information to display.
     
  14. aaronspink

    Veteran

    Joined:
    Jun 20, 2003
    Messages:
    2,641
    Likes Received:
    64
    The amount of memory a card has is immaterial if the ways the cards use the memory are different.
     
  15. Sontin

    Banned

    Joined:
    Dec 9, 2009
    Messages:
    399
    Likes Received:
    0
    "ATi" made a lost of 1 million in the second quarter.
     
  16. CarstenS

    Legend Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,800
    Likes Received:
    3,920
    Location:
    Germany
    I'll keep it short, forgive me.
    • You keep ignoring the possibly different contracts at TSMC for AMD and Nvidia
    • You forget the PCIe bridge
    • A 70% yield would, in my book, mean much more stock available than there is right now. IOW: I doubt that number
    • The HSF uses a vapor chamber, which should cost a bit more than your average heatpipe-plus-radial-blower combo
    • The board design should use far more complex circuitry, because it has to switch two chips within microseconds

    edit:
    Probably add to that higher per-unit prices for even the same components, because of volume.
     
    #956 CarstenS, Jan 19, 2010
    Last edited by a moderator: Jan 19, 2010
  17. Florin

    Florin Merrily dodgy
    Veteran Subscriber

    Joined:
    Aug 27, 2003
    Messages:
    1,707
    Likes Received:
    345
    Location:
    The colonies
    I know, and I agree completely, but when you claim that Eyefinity will kill these markets practically overnight, you're not taking into account the purchasing cycles, software compatibility validation, support overhead and, yes, even brand loyalty in these industries.

    Eyefinity is a cost-effective and superior solution. But it's going to take (a lot of) time to make serious inroads. It's really the same situation as with OpenCL and DirectCompute vs. CUDA, or Bullet vs. PhysX. The emergence of a cool new technology doesn't mean the established one will just disappear instantly. Too many people have an investment of time, experience or money in it for that to happen as quickly as we might like from a purely technological perspective.
     
  18. CarstenS

    Legend Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,800
    Likes Received:
    3,920
    Location:
    Germany
    The same way a GeForce FX is similar to, but less useful than, an 8800 GTX, or a Radeon 8500 compared to Juniper?
     
  19. Periander

    Newcomer

    Joined:
    Apr 13, 2007
    Messages:
    24
    Likes Received:
    0
    Thanks for that, I didn't realize they broke it down. No gross margins, but the operating results are quite interesting. They're here for last quarter too:

    http://www.sec.gov/Archives/edgar/data/1045810/000104581009000036/q310form10q.htm

    $129M and $48M for Q3, both still far below where they were in 2008, when the professional segment was providing the large majority of NV's operating income.
     
  20. seahawk

    Regular

    Joined:
    May 18, 2004
    Messages:
    511
    Likes Received:
    141
    Problem is that ATI has yet to reveal its DX11-generation architecture, with RV870 being an evolution of the previous generation. When R900 comes, it might make Fermi look rather lame.

    Given that Fermi is hardly able to beat an evolutionary ATI chip, I have little confidence in the future of that line. Especially if you consider the production problems and the TDP, which mean one Fermi GPU is more expensive than two RV870s. More expensive for NV and more expensive for the user, while lacking Eyefinity.
     