NVIDIA GF100 & Friends speculation

Discussion in 'Architecture and Products' started by Arty, Oct 1, 2009.

  1. Squilliam

    Squilliam Beyond3d isn't defined yet
    Veteran

    Joined:
    Jan 11, 2008
    Messages:
    3,495
    Likes Received:
    114
    Location:
    New Zealand
    It's hard to say; it depends on whether they intended a 32nm refresh this quickly, or whether it is simply a fixed GF100 taking advantage of better yields. If it is indeed the fixed 32nm refresh back-ported to 40nm instead, it could be different again. Obviously it could go either way, and unless you're privy to some insider information it's hard to call.
     
  2. Squilliam

    Squilliam Beyond3d isn't defined yet
    Veteran

    Joined:
    Jan 11, 2008
    Messages:
    3,495
    Likes Received:
    114
    Location:
    New Zealand
    The size of the location would also come into it - whether you lived in an apartment, a flat, or a large house like yours. Noise also relates to the noise from the additional cooling fans required. It is something which has 'it depends' written all over it.
     
  3. DarthShader

    Regular

    Joined:
    Jul 18, 2010
    Messages:
    350
    Likes Received:
    0
    Location:
    Land of Mu
  4. SimBy

    Regular Newcomer

    Joined:
    Jun 21, 2008
    Messages:
    700
    Likes Received:
    389
    Cutting 'HPC features' from a flagship GPU just to beat Cayman at gaming!? Sounds fishy and unlike nV to me.
     
  5. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,983
    Likes Received:
    1,298
    Location:
    New York
    Power consumption matters to me as well from a pure GPU comparison standpoint. It's when people try to imply that the higher power consumption results in significantly higher real-world power bills that I call them on that nonsense.
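    Just to put rough numbers on it (a back-of-the-envelope sketch; the wattage delta, hours per day and electricity rate below are assumptions, not measurements):

        # Assumed: ~100 W extra load power, 3 h/day of gaming, $0.15 per kWh.
        extra_watts   = 100
        hours_per_day = 3
        rate_per_kwh  = 0.15

        kwh_per_year  = extra_watts / 1000 * hours_per_day * 365
        cost_per_year = kwh_per_year * rate_per_kwh
        print(f"{kwh_per_year:.0f} kWh/year -> ${cost_per_year:.2f}/year")
        # ~110 kWh/year -> roughly $16/year, i.e. a dollar or so per month.

    Even if you double those assumptions you only end up at a few dollars a month, which is exactly the point.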
     
  6. CarstenS

    Legend Veteran Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,231
    Likes Received:
    2,836
    Location:
    Germany
    You know you're comparing a graphics chip, which is only one part of a graphics card, against a whole product, clocks and all? Well…

    I can see it being cut down from what the chip would be able to deliver and I don't deny that. Btw, that's why I've opted against buying one, as I already said.

    I cannot see them putting excess voltage on the chip. In fact, I believe most of AMD's high-end GPUs use a higher voltage. But that's not comparable anyway, since different ASICs have different power characteristics wrt voltage.
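    To illustrate why a raw voltage figure doesn't transfer from one ASIC to another (a toy sketch - the capacitance, leakage and clock numbers are made up purely for illustration):

        # Dynamic power scales roughly with C * V^2 * f, plus a leakage term
        # that is itself strongly chip- and voltage-dependent.
        def gpu_power(c_eff_farads, volts, clock_hz, leak_amps):
            dynamic = c_eff_farads * volts**2 * clock_hz
            static  = volts * leak_amps
            return dynamic + static

        # Two hypothetical chips at the same 1.0 V, but with different switched
        # capacitance and leakage, end up with very different power draw.
        print(gpu_power(1.1e-7, 1.0, 700e6, 20))   # ~97 W
        print(gpu_power(1.6e-7, 1.0, 700e6, 45))   # ~157 W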

    I can see it killing their profits, as was the case with GT200 also, after AMD forced them into a price war. But that's not my concern from a consumer standpoint - I go to the shop, I select what I want and then I pay the bill.

    Actually, I've been wondering about that myself. It would make all the sense in the world to have a midlife kicker, as JHH called it, ready. But you'd need to make sure it's excessively priced in order to keep demand low, so you can have at least a few etailers per continent ready to provide links where they are in stock, thus negating any paper-launch accusations. :D

    I would do it! ;)
     
  7. NathansFortune

    Regular

    Joined:
    Mar 3, 2009
    Messages:
    559
    Likes Received:
    0
    By God, I think you've got it. It really makes sense and sounds like something Nvidia would do! Honestly, it wouldn't surprise me if there was no respin or anything like that and the GTX 580 just ended up being held-back GF100 units that had a full configuration. Nvidia have really gone off the boil for some reason, and I don't see them getting their mojo back until they part ways with Jen-Hsun Huang - similar to how AMD had to part ways with Hector, who seemed to be stuck in the Athlon era, not wanting to move on.
     
  8. tannat

    Newcomer

    Joined:
    Dec 26, 2009
    Messages:
    61
    Likes Received:
    0
    Location:
    Malmö
    The thought has crossed my mind too, but I really don't think it is possible. The one thing we know is that it is named GF110, not GF100. A respin or even a die shrink would possibly make it a GF100b, but binned devices are still GF100, any way you look at it.

    The new chip designation GF110 still indicates either added or subtracted functionality at the chip level - or both. Nvidia has never changed the name of the same chip to my knowledge, only the cards.
     
  9. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,528
    Likes Received:
    953
    That really doesn't mean anything. And NVIDIA is quite famous for rebranding products, so why not chips?
     
  10. AnarchX

    Veteran

    Joined:
    Apr 19, 2007
    Messages:
    1,559
    Likes Received:
    34
    AMD was the first of the two to give one GPU two codenames: http://www.ixbt.com/video2/images/rv570/chips.jpg

    Even if the shader core is the same and they just added HDMI 1.4a, it is a different GPU. ;)
     
  11. tannat

    Newcomer

    Joined:
    Dec 26, 2009
    Messages:
    61
    Likes Received:
    0
    Location:
    Malmö
    Think of the possible consequences:

    Rebranding cards really only applies to consumers and may be complained about from a marketing-ethics perspective. So what?

    But claiming that a new chip exists, with the related development and fabrication costs, when they have only reused old ones - that's serious misconduct toward the owners. They can't tell the truth to the owners while keeping it from the consumers, so they will not do something as stupid as this.

    The small possible gain from doing such a thing would only be in recognition of the product.
    The potential loss is a real business scandal with legal consequences. You don't make up products to your owners.
     
  12. tannat

    Newcomer

    Joined:
    Dec 26, 2009
    Messages:
    61
    Likes Received:
    0
    Location:
    Malmö
    Exactly. It's not the amount of change that gives it a new name. It's that it is a different chip.
     
  13. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
    HDMI 1.4a is a software addition.
     
  14. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,528
    Likes Received:
    953
    I think you're over-thinking this. NVIDIA doesn't need to say "here, that's a new chip, with the related development and fabrication costs". All they need to do is say "here is GF110, it has characteristics X, Y, and Z, and it's awesome".

    That would be technically true, except the awesome part, which is subjective anyway.

    I'm not saying this is what's going to happen and that GF110 is just GF100-A3 - actually, I don't think it's likely because, according to rumors, NVIDIA hasn't made more than one batch of GF100s - but it's not impossible.
     
  15. CarstenS

    Legend Veteran Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,231
    Likes Received:
    2,836
    Location:
    Germany
    In HD6800 also? :eek: If yes, any chance we'll be seeing an upgrade for the HD5k series any time soon?
     
  16. Unknown Soldier

    Veteran

    Joined:
    Jul 28, 2002
    Messages:
    3,513
    Likes Received:
    1,001
  17. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
    In case it wasn't clear, Catalyst 10.10 already enabled HDMI 1.4a on Evergreen.

    Evergreen cannot support overlays in windowed mode while displaying 3D stereoscopic content, so MVC playback needs to be fullscreen (and, of course, software decoded), while NI has hardware differences that allow overlays in a windowed mode. However, HDMI 1.4a itself is a small and specific subset of HDMI 1.4 that dictates a few frame-packing modes for 3D stereoscopic content which can be supported by the PHY speeds of most current HDMI receivers and transmitters - that's how and why you see things like the PS3 getting updated with HDMI 1.4a support.
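    To put rough numbers behind the "existing PHY speeds" point (the timings below are the usual 1080p CEA figures, so treat the exact values as an assumption rather than a spec quote): frame-packed 1080p24 simply stacks two 1125-line frame periods, which lands on the same 148.5 MHz pixel clock as plain 2D 1080p60 - comfortably inside what an HDMI 1.3-class link already drives.

        # Frame-packed 1080p24: two stacked 1125-line frame periods on a
        # standard 2750-pixel-wide 1080p line, refreshed 24 times per second.
        h_total    = 2750
        v_total    = 2 * 1125
        refresh_hz = 24

        pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
        print(f"pixel clock: {pixel_clock_mhz:.1f} MHz")            # 148.5 MHz
        print("within a 340 MHz HDMI 1.3 TMDS limit:", pixel_clock_mhz <= 340)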
     
  18. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    83
    Companies do it all the time. Make a small change, and call it a new product. Marketing-led companies need new products to keep their marketing cycles going, so all they need to do is stick to their own interpretation of what they consider "new".

    Pharmaceutical companies do it in particular to keep patents alive beyond their natural lifespan, i.e. change a small thing, call it a "new product" and hey presto, new patent, new marketing campaign. Car companies, electronics companies, etc. - just about every company does it.

    Even if Nvidia just do a rare-binned-GF100s-with-all-512-cores-and-a-different-clock-speed vanity edition, it can be justified by pointing out the differences to show why it gets a new model number. It's even easier if they actually did a respin and a few tweaks to make it work properly.
     
  19. CarstenS

    Legend Veteran Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,231
    Likes Received:
    2,836
    Location:
    Germany
    Thanks for pointing that out (my bold) Dave because I really didn't catch that. :)
     
  20. tannat

    Newcomer

    Joined:
    Dec 26, 2009
    Messages:
    61
    Likes Received:
    0
    Location:
    Malmö
    Small change - new envelope - new product. I know this is how it works, but the chip designations are not model numbers. The chip names are the designations used for communication with the owners and the board.

    You don't do this with products in the spotlight.
    What I'm saying is that if GF110 is binned GF100 chips, JHH will not go to his owners, hold up a card and say: this is our new flagship chip, GF110. This is the top model in the second generation of our Fermi engine. We improved it 20% in performance and energy consumption. We think this will beat the new offerings from AMD.

    Any small change in the architecture is enough to pull it off, but binned chips from old batches that have been sold as GF100 - that's just unnecessary.

    I'm just saying that it would be stupid and give limited gain. I understand if some people think that this is not a reason to believe it would not happen.
     