NVIDIA Fermi: Architecture discussion

Discussion in 'Architecture and Products' started by Rys, Sep 30, 2009.

  1. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,428
    Likes Received:
    425
    Location:
    New York
    I'm not sure what Fuad's article has to do with your spoiler but maybe I'm missing those valuable connecting dots. Obviously, you don't have that problem, eh? :D
     
  2. Silus

    Banned

    Joined:
    Nov 17, 2009
    Messages:
    375
    Likes Received:
    0
    Location:
    Portugal
     Heh. The latest fun part was when Fud said Fermi would be delayed until January. Up until then, Charlie (and others) would say that Fud knew nothing about what he was talking about, but as soon as Fud said something similar to their own claims, he became much more credible :)

     But if what Rys speculates is indeed true, NVIDIA doesn't really need a dual-chip card with two "complete" GF100 chips on it. If a single GeForce 380 is able to keep up with the HD 5970, then two GeForce 360 chips on a single PCB should beat the HD 5970 without problems, assuming SLI scales well of course.

     I think it was Fudzilla that said NVIDIA would be launching two single-chip cards at GF100's release, with the dual-GPU card coming a bit later. Assuming that the second single-chip card is the GeForce 360, the GeForce 395 (dual GPU) could be using two of those.
     
  3. Silus

    Banned

    Joined:
    Nov 17, 2009
    Messages:
    375
    Likes Received:
    0
    Location:
    Portugal
     Also, in all this, I don't think I ever saw anyone guesstimating what the GeForce 360 could be.

     I'm guessing it will be what GT212 was supposed to be: a 384 SP part with (given what we know now) a 256-bit or 320-bit memory interface, 2/3 of the ROPs and 3/4 of the TMUs of the full GF100 chip.
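Those fractions work out as follows, assuming the commonly rumoured full-GF100 configuration of 512 SPs, 64 TMUs and 48 ROPs (an assumption based on the rumour mill, not a confirmed spec); a minimal sketch:

```python
# Hypothetical GTX 360 guess, derived from the fractions above.
# Full-GF100 figures (512 SPs, 64 TMUs, 48 ROPs) are rumoured, not confirmed.
full_gf100 = {"sps": 512, "tmus": 64, "rops": 48}

gtx360_guess = {
    "sps": 384,                           # 3/4 of 512, the figure quoted above
    "tmus": full_gf100["tmus"] * 3 // 4,  # 3/4 of the TMUs -> 48
    "rops": full_gf100["rops"] * 2 // 3,  # 2/3 of the ROPs -> 32
}
print(gtx360_guess)  # {'sps': 384, 'tmus': 48, 'rops': 32}
```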
     
  4. A.L.M.

    Newcomer

    Joined:
    Jun 2, 2008
    Messages:
    144
    Likes Received:
    0
    Location:
    Looking for a place to call home
    I don't think so.
    It's the same issue that I was talking about before...
     NVIDIA wasn't able to pull a dual card out of GT200, full stop, because it was too big and too hot to put two of them on the same card. They could have used a GTX 260-class GT200, but that didn't happen, so they had to wait for a new process to get it done.
     GF100 will be a little bit smaller than GT200, but probably bigger than a GT200b, so it seems quite difficult to me that they would be able to put that kind of beast on a dual card.
     
  5. ChrisRay

     ChrisRay R.I.P. 1983-
    Veteran

    Joined:
    Nov 25, 2002
    Messages:
    2,234
    Likes Received:
    26
     He's referring to this. I had to look it up when I got home today, but it wasn't too hard to figure out what he was implying.

    http://twitter.com/NVIDIAGeForce
     
  6. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    See?
    Dots! more dots! dots dots dots! now stop
     
  7. Groo The Wanderer

    Regular

    Joined:
    Jan 23, 2007
    Messages:
    334
    Likes Received:
    2
    Actually, the card I was talking about was canned, the one that was released was a very different model. They also can't make a 2x 280 or 2x 285, they have a 2x 275 instead. Looking at the TDPs, they are cherry picking the hell out of those chips to make the cards.

    Also, if you look at production numbers it was really limited, there aren't many of them out there. $600 graphics cards don't sell much, but do grab a disproportionate share of reviews, mindshare, and fanboi froth.

    -Charlie
     
  8. Groo The Wanderer

    Regular

    Joined:
    Jan 23, 2007
    Messages:
    334
    Likes Received:
    2
     You are missing the most important point for the HPC market: power. It is probably the number one concern for that market. If NV disables a cluster or two for Fermi, that is one method of yield drop-out that they could use.

     If it doesn't clock high enough, that is because of timing problems or power use; one results in a crash, the other in out-of-spec power use. The first problem would work out fine for the HPC cards, but the second one wouldn't. It is in NV's best interest to pick those cards for lower power.

    Also, I am REALLY skeptical that there will be enough demand for $3K+ compute cards to soak up the imperfect GF100s from the desktop market. The market of "well funded HPC customers with money to piss away" is notably smaller than the market for gamers saving up their allowance.

     For the Fermi compute card to be a salvage part, the '360' would also have to be a salvage part for it to be of any real value, and I doubt that there will be enough parts that fit a narrow enough bin to base a product on.

     If I had to bet, I would say that the Fermi compute parts are the cherry-picked GF100s; after all, at 6-7 times the cost, I would suspect that the '380s' are the second-tier parts.

    -Charlie
     
  9. XMAN26

    Banned

    Joined:
    Feb 17, 2003
    Messages:
    702
    Likes Received:
    1
     Right? That's why they have sold more GTX 295s than ATI 4870X2s, and they are STILL in stock and STILL being bought, compared to the 4870X2s. Charlie, for once, WOULD YOU PLEASE JUST ADMIT YOU WERE WRONG ABOUT SOMETHING, or is it beneath you to do such a thing?
     
  10. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,428
    Likes Received:
    425
    Location:
    New York
    All those points apply to Hemlock too Charlie.

     I'm confused. I thought you were trying to scoop Chris on some breaking news. Why on earth does Nvidia think we should care how long the board is?!
     
  11. psolord

    Regular

    Joined:
    Jun 22, 2008
    Messages:
    443
    Likes Received:
    54
    Would M$ have a problem with a product named "360"?
     
  12. FrameBuffer

    Banned

    Joined:
    Aug 7, 2005
    Messages:
    499
    Likes Received:
    3
     Links, please? (GTX 295 vs 4870X2 sales)
     
  13. A.L.M.

    Newcomer

    Joined:
    Jun 2, 2008
    Messages:
    144
    Likes Received:
    0
    Location:
    Looking for a place to call home
     384 SPs, 96 TMUs and a 256-bit bus might not be enough to keep up with an HD 5870 (which should be the main target of a GTX 360), I fear.

    Don't think so, as long as they keep the GTX in front of it. :wink:

     HD 4870X2s are EOL, while GTX 295s aren't, because ATI has something that NVIDIA doesn't: a new lineup of cards. :wink:
     
  14. XMAN26

    Banned

    Joined:
    Feb 17, 2003
    Messages:
    702
    Likes Received:
    1
     Even if the GTX 360 were to be that, I don't see why it couldn't be equal to, if not even faster than, the 5870.
     
  15. Groo The Wanderer

    Regular

    Joined:
    Jan 23, 2007
    Messages:
    334
    Likes Received:
    2
    Totally, but ATI comes out and says it.

    -Charlie
     
  16. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    17,217
    Likes Received:
    1,736
    Location:
    Winfield, IN USA
    Seconded.
     
  17. Tchock

    Regular

    Joined:
    Mar 4, 2008
    Messages:
    849
    Likes Received:
    2
    Location:
    PVG
    :wink:

     I remember XMAN claiming to have no vendor preference a while ago. Where is that now, I wonder... :lol:
     
  18. stevem

    Regular

    Joined:
    Feb 11, 2002
    Messages:
    632
    Likes Received:
    3
     Being shorter than the 5870/5970 allows it to fit in more cases? It may also suggest something about power/heat and, by inference, performance. Of course, Nvidia may prefer to tolerate higher ASIC/board temps and eschew greater surface area, but perhaps they have more efficient all-copper/vapour-chamber cooling, allowing a more compact build at a higher BOM.
     
  19. seahawk

    Regular

    Joined:
    May 18, 2004
    Messages:
    511
    Likes Received:
    141
     Maybe it's too close to the performance chip of the GT100 series. Especially as the launch date of GT100 slips, the gap between the top-of-the-line GPU and the performance GPU should decrease.
     
  20. Reputator

    Newcomer

    Joined:
    Nov 6, 2006
    Messages:
    66
    Likes Received:
    0
    Location:
    Florida, US
    Difference is, you could actually buy a GTX 295.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.