NVIDIA Maxwell Speculation Thread

Discussion in 'Architecture and Products' started by Arun, Feb 9, 2011.

  1. DSC

    DSC
    Banned

    Joined:
    Jul 12, 2003
    Messages:
    689
    Likes Received:
    3
  2. Blazkowicz

    Legend

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    I hope for DisplayPort 1.3 instead (and HDMI 2.0 won't hurt). I still want 120Hz or similar displays, if not G-Sync/FreeSync ones, and that capability is needed regardless of the resolution.

    H.265 won't be terribly useful for a while; I believe a transition away from H.264, if it happens, would take about a decade, possibly more. The bigger or higher-end streaming services, though, will be able to offer content encoded with both codecs. It certainly needs to get into hardware as soon as possible.
     
  3. lanek

    Veteran

    Joined:
    Mar 7, 2012
    Messages:
    2,469
    Likes Received:
    315
    Location:
    Switzerland

    And that's news? Every piece of hardware coming out will support H.265; it's the normal evolution of things (whether it will actually be used that much, I don't know)... As you mention, the question is whether it will be full hardware or only partial hardware plus software, but I'm not sure how to take this article as information about anything.

    I would be more interested to know whether Maxwell will finally support 10-bit LUT color, now that every 4K monitor out there is using it instead of the old 6-8 bit standard (no Nvidia GPU supports it right now, outside maybe some Quadros since 2006).

    @Blazkowicz: DisplayPort 1.2 has been updated by VESA with the standard needed to support FreeSync (dynamic V-Blank: in effect the monitor stops driving the V-Blank interval itself and lets the GPU control it), so you don't need DP 1.3 for it. Current DP 1.2 already works for it, but to make it work you need either a firmware update implementing the new specification or a new monitor that ships with the updated firmware. It was an important point for AMD to get this made a standard amendment to DP 1.2 rather than DP 1.3...
     
    #1663 lanek, May 8, 2014
    Last edited by a moderator: May 9, 2014
  4. Blazkowicz

    Legend

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    Indeed, everything DP 1.2 coming out should be able to support FreeSync, and DP 1.3 would be a superset of that. If I want adoption of the latter, it's because I fear >90% of monitors will still be limited to 60Hz (at least for input).

    DP 1.3 may be just enough for 4K 120Hz at 8 bits (leaving compression aside). Failing that, 100Hz, 96Hz or even 90Hz would still be useful (if you feel you need multiples of 25, 24 and 30).
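
    To put rough numbers on the "just enough" part: a quick back-of-the-envelope sketch. The 8.1 Gbps HBR3 lane rate and 8b/10b coding are what DP 1.3 is expected to use; the blanking overhead factor is just my assumption.

    Code:
    lanes = 4
    lane_rate_gbps = 8.1          # expected HBR3 per-lane signalling rate
    efficiency = 8 / 10           # 8b/10b line coding overhead
    payload_gbps = lanes * lane_rate_gbps * efficiency   # ~25.9 Gbit/s usable

    h, v, hz, bpp = 3840, 2160, 120, 24                  # 8 bits per RGB channel
    active_gbps = h * v * hz * bpp / 1e9                 # ~23.9 Gbit/s of pixel data
    blanking_factor = 1.08        # assumed reduced-blanking overhead
    needed_gbps = active_gbps * blanking_factor

    print("usable Gbit/s:", round(payload_gbps, 1), "- needed Gbit/s:", round(needed_gbps, 1))
    # -> roughly 25.9 vs 25.8 Gbit/s: it barely fits, hence the 100/96/90Hz fallbacks.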
     
    #1664 Blazkowicz, May 9, 2014
    Last edited by a moderator: May 9, 2014
  5. lanek

    Veteran

    Joined:
    Mar 7, 2012
    Messages:
    2,469
    Likes Received:
    315
    Location:
    Switzerland
    Mmm, you're putting the bar really high for DP 1.3... gaming monitors with 4K and 120Hz... that's really hot stuff... but let's stay honest, it's too much to hope for right now...

    Of course I understand your point of view, but let's be honest: if we want 4K at affordable prices, 120Hz is totally impossible at the moment and out of the question (and somewhat unneeded, since unless you use SLI or CFX for gaming you can forget about getting enough framerate for it, at least at the maximum possible settings).
     
    #1665 lanek, May 9, 2014
    Last edited by a moderator: May 9, 2014
  6. Blazkowicz

    Legend

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    In the year 2000 you could buy an Iiyama 17" CRT and a GeForce 2 MX and play at 100Hz.

    I don't see where it's impossible; it's just stuff that will stay expensive if it isn't mass-produced in sufficient volume. Granted, the industry might go on making hundreds of millions of 60Hz panels and a few million 120Hz ones, so nothing will change.
    That'd be boring; we'd have to wait for the 2020s to get back to what CRT gaming did in the early 00s. Though by then maybe OLED will be replacing LCD, and a high refresh rate would be a feature a vendor needs in order to be attractive.

    A 120Hz refresh improves your effective framerate no matter what your GPU can push, as you can run double buffering with no vsync, so even if you're somewhere in the 30s to 50s you see all your frames and get low latency. Vsync would make it slower/"jumpy", and 60Hz without vsync sometimes gives extreme tearing on big displays (such as 21.5").
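
    A tiny illustration of that, with numbers of my own rather than anything measured: with double buffering and no vsync the buffer swap happens immediately, so new pixels start scanning out right away and a fresh frame is fully on screen within one refresh period at worst.

    Code:
    # Worst-case time for a freshly rendered frame to be fully on screen,
    # assuming double buffering with no vsync (illustrative, not measured).
    for refresh_hz in (60, 120):
        period_ms = 1000.0 / refresh_hz
        print(refresh_hz, "Hz scan-out: fully on screen within", round(period_ms, 1), "ms")
    # With vsync on, a frame that just misses a 60Hz interval waits an extra ~16.7 ms
    # (and the effective rate stutters toward 30 fps); at 120Hz the same miss
    # costs only ~8.3 ms.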

    As for a single GPU, you can get an extremely powerful one (GK110, Hawaii, and later GM204), play at non-ultra settings, play older or lighter games, and, perhaps more relevantly, expect very good quality when playing at non-native resolution if you have a high pixel density.

    If we have to compromise, then at least give us a display that can do 1080p 120Hz and 4K 60Hz (the Seiki TV does 1080p 120Hz and 4K 30Hz), and hope the OS, game and driver are smart enough to handle alt-tabbing out of the game nicely.
     
  7. Erinyes

    Regular

    Joined:
    Mar 25, 2010
    Messages:
    808
    Likes Received:
    276
    Like Ailuros said, I had heard that GM204 and GM206 were on 28nm for sure and that even GM200 was intended for 28nm. However I have since heard conflicting information and I am not sure. There are indications that GM200 is on 20nm.

    20SoC still brings a density improvement and as Ailuros states, Finfets are expensive and the cost/transistor for 16FF will be a fair bit higher than 20SoC. As a result, process selection may differ for chips intended for different segments.
    Yes..the density improvement from 20SoC to 16FF is in the region of ~5%. Thanks for the link, I had not read that news about 16FF+. The extra density increase would be good, but I wonder what the cost implications are, i.e. cost/transistor, compared to 16FF.
    Yes..two chips. Read the last few pages for more info.
    It is a full hardware decoder with 4k support. I mentioned this more than a month back in another thread - http://forum.beyond3d.com/showpost.php?p=1838296&postcount=2201

    I believe Fudo's information is not totally correct though. My information is that only GM206 is getting it.
    I think it was a good move and I can certainly see H.265 being useful in the near future. Even SoCs are integrating it these days. With NAND scaling slowing down and pricing stagnating, lowering file sizes will become ever more important, and even more so for streaming services considering the latest developments in the net neutrality fight.
    Not everything. Only the very high-end SoCs and some specific SoCs for the TV segment will have full hardware H.265 decode in the near future. Even Intel has yet to publicly release any information, and rumours say it will only appear in Skylake at the earliest. So a large part of the market will be left out for now.
     
  8. tviceman

    Newcomer

    Joined:
    Mar 6, 2012
    Messages:
    191
    Likes Received:
    0
    a dumb question

    I have seen comments here (and there) regarding cost per transistor. I know that transistor density affects yields, which affects prices, but doesn't TSMC charge per wafer, regardless of how many transistors are built on that wafer?
     
  9. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,382
    That depends on the contract between the fabless company and the fab.

    But these kinds of contracts have stipulations and corrections regarding expected versus actual yield, and when the two don't match, things get investigated and corrections are made one way or the other.

    No one is going to sign a contract where one party carries all the risk and the other none. There has to be some incentive for the fab to do the best it can and not churn out low-yield crap just because that would increase its volume.
     
  10. CarstenS

    Legend Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,800
    Likes Received:
    3,920
    Location:
    Germany
    In addition to what silent_guy says: orders on a newer manufacturing process are going to be more expensive than those on an old (which is real-world-speak for the marketing term "proven") technology.

    So if, for example, you cram twice as many transistors per area (i.e. per wafer) on 20nm vs. 28nm and it's initially more than twice as expensive, your cost per transistor goes up. This is early-adopter stuff for companies who think they need the features of newer processes earlier than their competition in order to stay competitive, or simply to be able to build their chips in the first place: GF100 would not have been possible on 55nm at all because of size and heat, and I'm pretty sure the same holds true for Tahiti (on 40nm instead of 28nm).
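
    To make that concrete with purely illustrative figures (neither the wafer prices nor the exact density factor are real quotes):

    Code:
    # Toy cost-per-transistor comparison: 2x density on the new node, but an
    # early wafer price more than 2x the mature node's (all numbers assumed).
    wafer_cost_old = 5000.0      # $ per mature 28nm wafer (assumed)
    wafer_cost_new = 11000.0     # $ per early 20nm wafer, "more than twice" (assumed)
    density_gain = 2.0           # transistors per wafer relative to the old node

    relative_cost = (wafer_cost_new / density_gain) / wafer_cost_old
    print("cost per transistor vs. old node:", relative_cost)   # 1.1 -> ~10% more expensive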
     
  11. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,382
    Going forward, it may be quite a long time before processes cross over into being cheaper per transistor than 28nm. So the switch-over will have to be based on performance only instead of performance and cost.

    NVidia already complained about that quite a while ago.
     
  12. MfA

    MfA
    Legend

    Joined:
    Feb 6, 2002
    Messages:
    7,610
    Likes Received:
    825
    Maybe they should do a TSV MCM instead.
     
  13. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,511
    Likes Received:
    224
    Location:
    Chania
    If they use 20SoC, theoretically just for the top dog at the beginning and just for professional markets, high manufacturing costs are more bearable, though of course not ideal.
     
  14. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,511
    Likes Received:
    224
    Location:
    Chania
    That's something I've been wondering about myself; possibly too much risk for now?
     
  15. keldor

    Newcomer

    Joined:
    Dec 22, 2011
    Messages:
    75
    Likes Received:
    113
    How much of the hardware cost is actually the chip? I mean, you have the circuit board, a huge heatsink, memory, etc. contributing to the cost. That would determine what sort of impact higher chip manufacturing costs would have.
     
  16. lanek

    Veteran

    Joined:
    Mar 7, 2012
    Messages:
    2,469
    Likes Received:
    315
    Location:
    Switzerland

    The circuit board costs next to nothing; sometimes it's the materials used for it that cost the most... Heatsinks are just made of metal and follow the price of those materials (gold, copper) on the international markets, and memory doesn't cost that much (thanks, or no thanks, to Samsung).
    No, the real cost is in research and development...
     
  17. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,382
    For as large a die as a big GPU, the silicon cost should exceed everything else. (And no, R&D cost is pretty much irrelevant here; it doesn't really enter the equation when setting the price of a silicon product.) For pretty much everything else, when you're talking millions of units, you can get insanely cheap high-volume pricing: it's hard to ask for high margins on, say, a DVI connector that every little facility in Shenzhen can manufacture. There's only one TSMC...
     
  18. Erinyes

    Regular

    Joined:
    Mar 25, 2010
    Messages:
    808
    Likes Received:
    276
    Interesting article on EE Times - http://www.eetimes.com/author.asp?section_id=36&doc_id=1322399

    Some data which is very relevant to the discussion above:-

    1. Cost of a 28nm wafer - $4,500-$5,000
    2. Cost of a 20nm wafer - $6,000
    3. Cost of a 16/14nm Finfet wafer - $7,270
    There's a lot of other good info..worth a read. Some more key points mentioned were:-

    1. TSMC's 20nm capacity is expected to be 60,000 wafers per month in Q4.
    2. A number of fabless companies will tape out their 16/14 FinFET product designs in the third quarter of 2014 with high-volume production planned for the second or third quarter of 2015.


    And finally...some news on GM200. Seems like there's been a lot of smoke and mirrors stuff going on. What I have again heard is that it is still on 28nm as planned..and should be taping out late this month/early next month.
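
    For a rough feel of what those wafer prices mean per chip, here is a sketch using the figures quoted above; the die size, defect density, yield model and dies-per-wafer approximation are my own assumptions, not from the article:

    Code:
    import math

    def cost_per_good_die(wafer_price, die_area_mm2, defect_density_per_cm2=0.2):
        wafer_diameter_mm = 300
        # Standard approximation for gross dies per wafer, including edge losses.
        gross = (math.pi * (wafer_diameter_mm / 2) ** 2 / die_area_mm2
                 - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))
        yield_fraction = math.exp(-die_area_mm2 / 100.0 * defect_density_per_cm2)  # Poisson
        return wafer_price / (gross * yield_fraction)

    for node, price in (("28nm", 5000), ("20nm", 6000), ("16/14nm FF", 7270)):
        print(node, "~$", round(cost_per_good_die(price, die_area_mm2=400)),
              "per good 400 mm^2 die")

    Holding the die area fixed isolates the wafer-price effect; a real port to a newer node would of course shrink the die, which is exactly the density-versus-price trade-off discussed above.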
     
  19. Dangerman

    Newcomer

    Joined:
    Apr 1, 2014
    Messages:
    43
    Likes Received:
    8
    Just one curious question; do wafer costs go down over time?

    So now it absolutely will be on 28nm? Did Nvidia find that the density increase wasn't enough to justify going with 20nm SoC?
     
  20. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,541
    Likes Received:
    964
    For a given process, yes. Plus, yields tend to increase, so costs go further down on a per die basis.
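
    A small sketch of that second effect, with assumed numbers (die size, gross dies per wafer and the defect densities are mine, not from the thread): as defect density falls over a process's life, the cost per good die drops even if the wafer price stays flat.

    Code:
    import math

    wafer_price = 5000.0     # $ per wafer, held constant to isolate the yield effect
    dies_per_wafer = 140     # assumed gross dies for a ~400 mm^2 chip on a 300 mm wafer
    die_area_cm2 = 4.0

    for d0 in (0.5, 0.25, 0.1):                 # defects per cm^2, improving over time
        y = math.exp(-die_area_cm2 * d0)        # simple Poisson yield model
        print("D0 =", d0, "per cm^2: yield", round(100 * y), "% -> cost per good die $",
              round(wafer_price / (dies_per_wafer * y)))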
     