NVIDIA Maxwell Speculation Thread

I'm hoping for DisplayPort 1.3 instead (and HDMI 2.0 won't hurt). I still want 120Hz or similar displays, if not G-Sync/FreeSync ones, and that stuff is needed regardless of the resolution.

H.265 won't be terribly useful for a while; I believe a transition away from H.264, if it happens at all, would take about a decade, possibly more. The higher-end or biggest streaming services will be able to offer content encoded with both codecs, though. Sure, it needs to get into hardware as soon as possible.
 
http://www.fudzilla.com/home/item/34697-second-generation-maxwell-to-support-h265

Hope this is a full HEVC hardware decoder on the second gen Maxwell and not the hybrid hardware/software solution on the first gen Maxwell.


And that's news? Every piece of hardware coming out will support H.265; it's the normal evolution of things (whether it will actually be used that much, I don't know)... Like you mention, will it be full hardware or only partial hardware plus software? Either way, I'm not sure how to take this article as information about anything.

I'd be more interested to know whether Maxwell will finally support a 10-bit LUT for color, now that every 4K monitor out there uses it instead of the old 6-8 bit standard (as no Nvidia GPU supports it right now, apart from maybe some Quadros from 2006).

@Blazkowicz: DisplayPort 1.2 has been updated with the VESA standard to support FreeSync (dynamic V-blank; in effect it just turns off monitor-controlled V-blank and lets the GPU control it), so you don't need DP 1.3 for it. Current DP 1.2 already works for it, but to make it work you need either a firmware update implementing the new specification or a new monitor that ships with the updated firmware. It was an important point for AMD to make it a standard modification of DP 1.2 rather than DP 1.3...
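To make the "hand V-blank control to the GPU" idea a bit more concrete, here is a minimal toy model (my own sketch, not the actual VESA Adaptive-Sync protocol) comparing a fixed 60Hz scanout, where a finished frame has to wait for the next V-blank, with an adaptive panel that starts scanout as soon as the frame is flipped. The frame times and the panel's 144Hz ceiling are invented purely for illustration.

Code:
# Toy model of a fixed-refresh display vs. an adaptive-refresh ("variable
# V-blank") display. Frame times and refresh limits are invented for
# illustration; this is not the VESA Adaptive-Sync protocol itself, just the
# basic idea: the panel holds its V-blank until the GPU flips a new frame.

FIXED_HZ        = 60.0
MIN_INTERVAL_MS = 1000.0 / 144.0                    # fastest the hypothetical panel refreshes
RENDER_TIMES_MS = [22.0, 25.0, 19.0, 30.0, 21.0]    # GPU time per frame (made up)

def fixed_refresh(render_times, hz=FIXED_HZ):
    """Frame becomes visible at the next fixed scanout tick after it is done
    (vsync back-pressure on the GPU is ignored to keep the model tiny)."""
    tick = 1000.0 / hz
    t, shown = 0.0, []
    for r in render_times:
        t += r                                  # frame finishes rendering
        shown.append((t // tick + 1) * tick)    # wait for the next V-blank
    return shown

def adaptive_refresh(render_times):
    """Panel starts scanout as soon as the frame is ready, no faster than its
    maximum refresh rate (the low-refresh self-refresh case is not modelled)."""
    t, last_scan, shown = 0.0, 0.0, []
    for r in render_times:
        t += r
        last_scan = max(t, last_scan + MIN_INTERVAL_MS)
        shown.append(last_scan)
    return shown

if __name__ == "__main__":
    print("fixed 60 Hz :", [round(x, 1) for x in fixed_refresh(RENDER_TIMES_MS)])
    print("adaptive    :", [round(x, 1) for x in adaptive_refresh(RENDER_TIMES_MS)])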
 
Indeed, everything DP 1.2 coming out should be able to support FreeSync, and DP 1.3 would be a superset of that. If I want adoption of the latter, it's because I fear >90% of monitors will still be limited to 60Hz (at least for input).

DP 1.3 may be just enough for 4K 120Hz 8-bit (leaving compression aside). Failing that, 100Hz, 96Hz, even 90Hz would still be useful (if you feel like needing multiples of 25, 24 and 30).
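As a rough sanity check of the "just enough" claim: the HBR3 link figures below are the published DP 1.3 numbers (8.1 Gbit/s per lane, four lanes, 8b/10b coding), while the ~8% blanking overhead is only a ballpark assumption rather than an exact CVT-R2 timing.

Code:
# Back-of-envelope check of "DP 1.3 may be just enough for 4K 120Hz 8-bit".
# The DP 1.3 link figures (HBR3: 8.1 Gbit/s/lane, 4 lanes, 8b/10b coding) are
# published numbers; the blanking overhead is a rough assumption.

LANES         = 4
HBR3_PER_LANE = 8.1e9          # bit/s raw per lane (DP 1.3)
CODING        = 8 / 10         # 8b/10b line coding
BLANKING      = 1.08           # ~8% overhead for blanking intervals (assumed)

def video_bitrate(h, v, hz, bpc=8, channels=3, blanking=BLANKING):
    """Approximate bit rate a video mode needs on the link."""
    return h * v * hz * bpc * channels * blanking

link = LANES * HBR3_PER_LANE * CODING          # usable payload bandwidth
need = video_bitrate(3840, 2160, 120)

print(f"DP 1.3 payload : {link / 1e9:.1f} Gbit/s")    # ~25.9 Gbit/s
print(f"4K 120Hz 8-bit : {need / 1e9:.1f} Gbit/s")    # ~25.8 Gbit/s with assumed blanking
print("fits" if need <= link else "does not fit")

With those assumptions, 4K 120Hz at 8 bits per channel lands at roughly 25.8 Gbit/s against roughly 25.9 Gbit/s of usable link bandwidth, i.e. just barely within reach without compression.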
 

Mmm, you're setting the bar really high for DP 1.3... gaming monitors with 4K at 120Hz... that would be great, but let's be honest, it's too much to ask for right now...

Of course I understand your point of view, but let's be honest: if we want 4K at affordable prices, 120Hz is totally impossible at the moment and out of the question (and somewhat unnecessary, since unless you use SLI or CrossFire for gaming you can forget about getting enough framerate for it, at least at the maximum possible settings).
 
In the year 2000 you could buy an Iiyama 17" CRT and a GeForce 2 MX and play at 100Hz.

I don't see how it's impossible; it's just stuff that will stay expensive if it isn't mass produced in large enough volume. Granted, the industry might go on making hundreds of millions of 60Hz panels and a few million 120Hz ones, so nothing will change.
That'd be boring; we'd have to wait for the 2020s to get back to what CRT gaming did in the early 00s. Though by then maybe OLED will be replacing LCD, and a high refresh rate would be a feature a vendor needs in order to be attractive.

A 120Hz refresh improves your effective framerate no matter what your GPU can push: you can run double buffering with no vsync, so even if you're somewhere in the 30s to 50s you see all your frames and get low latency. Vsync would make it slower/"jumpy", and 60Hz with no vsync sometimes seems to give extreme tearing on bigger displays (such as 21.5").
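To put rough numbers on the vsync part of that argument: with plain double-buffered vsync the displayed rate snaps down to refresh/N, and the snapping is much coarser on a 60Hz panel than on a 120Hz one. The constant per-frame render times below are a simplifying assumption.

Code:
# Rough sketch of how double-buffered vsync quantizes the displayed frame
# rate to refresh/N, and why the penalty is much smaller on a 120 Hz panel.
# Constant per-frame render times are assumed; triple buffering, adaptive
# sync and frame-time variance are ignored.

import math

def vsynced_fps(render_ms, refresh_hz):
    """Displayed frame rate with strict double-buffered vsync."""
    tick = 1000.0 / refresh_hz
    interval = math.ceil(render_ms / tick) * tick   # wait for the next V-blank
    return 1000.0 / interval

for gpu_fps in (40, 45, 55):
    render_ms = 1000.0 / gpu_fps
    print(f"GPU {gpu_fps} fps -> 60 Hz vsync: {vsynced_fps(render_ms, 60):.0f} fps, "
          f"120 Hz vsync: {vsynced_fps(render_ms, 120):.0f} fps")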

As for a single GPU, you can get an extremely powerful one (GK110, Hawaii, and later GM204), play at non-ultra settings, play older or lighter games, and, perhaps more relevantly, you can expect very good quality when playing at a non-native resolution if you have a high pixel density.

If we have to compromise, then at least give us a display that can do 1080p 120Hz and 4K 60Hz (the Seiki TV does 1080p 120Hz and 4K 30Hz). Hopefully the OS, game and driver would be smart enough to handle alt-tabbing out of the game nicely.
 
Didn't Erinyes say that even the big, proper Maxwell will be on 28nm? By late 2015, 16nmFF will likely be commercially viable for GPUs in mid-to-late 2016, unless they plan to release GP100, GP104, GP106 or whatever the heck those are called very late in 2016 or in 2017. I can see them doing an 870 later this year and then an 880/880 Ti in 1H 2015, with the former on a 256-bit memory bus and the latter on a 384-bit bus. I just cannot see the HPC segment waiting for a 512-bit bus, as they are primarily interested in CUDA cores, but then again I'm likely being ignorant and dumb.

Economics and the laws of physics, from the looks of it, say that 20nmSoC at TSMC is not that good for high-performance components, while TSMC's 16nmFF, which is based on its 20nm process, will be more of a step up from 28nm and more viable.

Like Ailuros said, I had heard that GM204 and GM206 were on 28nm for sure and that even GM200 was intended for 28nm. However I have since heard conflicting information and I am not sure. There are indications that GM200 is on 20nm.

20SoC still brings a density improvement, and as Ailuros states, FinFETs are expensive and the cost/transistor for 16FF will be a fair bit higher than for 20SoC. As a result, process selection may differ for chips intended for different segments.
The density jump from 20SoC to 16FF will be on the order of single-digit percentages, and 16FF+ offers only a 15% density increase beyond that.

http://www.cadence.com/Community/bl...-ahead-for-16nm-finfet-plus-10nm-and-7nm.aspx

My guess is that if there's no 20SoC GPU this year, there won't be one at all; instead everyone is going to go to FinFET as soon as it's viable.

Yes..20SoC to 16FF is in the region of ~5%. Thanks for the link, I had not read that news about 16FF+. The extra density increase would be good, but I wonder what the cost implications are, i.e. cost/transistor, compared to 16FF.
So are we expecting a higher end Maxwell this year?
Yes..two chips. Read the last few pages for more info.
http://www.fudzilla.com/home/item/34697-second-generation-maxwell-to-support-h265

Hope this is a full HEVC hardware decoder on the second gen Maxwell and not the hybrid hardware/software solution on the first gen Maxwell.

It is a full hardware decoder with 4k support. I mentioned this more than a month back in another thread - http://forum.beyond3d.com/showpost.php?p=1838296&postcount=2201

I believe Fudo's information is not totally correct though. My information is that only GM206 is getting it.
I'm hoping for DisplayPort 1.3 instead (and HDMI 2.0 won't hurt). I still want 120Hz or similar displays, if not G-Sync/FreeSync ones, and that stuff is needed regardless of the resolution.

H.265 won't be terribly useful for a while; I believe a transition away from H.264, if it happens at all, would take about a decade, possibly more. The higher-end or biggest streaming services will be able to offer content encoded with both codecs, though. Sure, it needs to get into hardware as soon as possible.

I think it was a good move and I can certainly see H.265 being useful in the near future. Even SoCs are integrating it these days. With NAND scaling slowing down and pricing stagnating, lowering file sizes will become ever more important, and even more so for streaming services considering the latest developments in the net neutrality fight.
And that's news? Every piece of hardware coming out will support H.265; it's the normal evolution of things (whether it will actually be used that much, I don't know)... Like you mention, will it be full hardware or only partial hardware plus software? Either way, I'm not sure how to take this article as information about anything.

Not everything. Only the very high-end SoCs and some specific SoCs for the TV segment will have full hardware H.265 decode in the near future. Even Intel has yet to publicly release any information, and rumours say it will feature only in Skylake at the earliest. So a large part of the market will be left out for now.
 
a dumb question

I have seen comments here (and there) regarding cost per transistor. I know that transistor density affects yields, which affect prices, but doesn't TSMC charge per wafer, regardless of how many transistors are built on that wafer?
 
I have seen comments here (and there) regarding cost per transistor. I know that transistor density affects yields, which affect prices, but doesn't TSMC charge per wafer, regardless of how many transistors are built on that wafer?
That depends on the contract between the fabless company and the fab.

But these kinds of contracts have stipulations with corrections for expected versus actual yield, and when they don't match, things get investigated and corrections are made one way or the other.

No one is going to sign a contract where one party has all the risk and the other has none. There has to be some incentive for the fab to do the best they can and not churn out low-yield crap just because it would increase their volume.
 
I have seen comments here (and there) regarding cost per transistor. I know that transistor density affects yields, which affect prices, but doesn't TSMC charge per wafer, regardless of how many transistors are built on that wafer?

In addition to what silent guy says: orders on a newer manufacturing process are going to be more expensive than those on old (which is real-world speak for the marketing term "proven") technology.

So if, for example, you cram twice as many transistors per area (i.e. per wafer) on 20nm vs. 28nm and the wafer is initially more than twice as expensive, your cost per transistor goes up. This is early-adopter stuff for companies who think they need the features of newer processes earlier than their competition in order to stay competitive, or to be able to build their chips in the first place. GF100 would not have been possible on 55nm at all because of size and heat, and I'm pretty sure the same holds true for Tahiti (on 40nm instead of 28nm).
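A quick numeric version of that example, with purely hypothetical wafer prices and densities, and yield assumed equal on both nodes:

Code:
# Hypothetical numbers only, to illustrate the point above: doubling the
# transistors per wafer only lowers cost per transistor if the new wafer
# costs less than twice as much (yield assumed equal on both nodes).

def dollars_per_mtransistor(wafer_price, mtransistors_per_wafer):
    return wafer_price / mtransistors_per_wafer

old_node = dollars_per_mtransistor(wafer_price=4000, mtransistors_per_wafer=1.0e6)
new_node = dollars_per_mtransistor(wafer_price=9000, mtransistors_per_wafer=2.0e6)  # 2x density, >2x price

print(f"old node: {old_node:.4f} $/Mtransistor")
print(f"new node: {new_node:.4f} $/Mtransistor ({new_node / old_node - 1:+.0%})")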
 
Going forward, it may be quite a long time before processes cross over into being cheaper per transistor than 28nm. So the switch-over will have to be based on performance only instead of performance and cost.

NVidia already complained about that quite a while ago.
 
Going forward, it may be quite a long time before processes cross over into being cheaper per transistor than 28nm. So the switch-over will have to be based on performance only instead of performance and cost.

NVidia already complained about that quite a while ago.

If they theoretically use 20SoC just for the top dog at first, and only for professional markets, high manufacturing costs are more bearable, though of course not ideal.
 
How much of the hardware cost is actually the chip? I mean, you have the circuit board, a huge heatsink, memory, etc. contributing to the cost. That would determine what sort of impact higher chip manufacturing costs would have.
 
How much of the hardware cost is actually the chip? I mean, you have the circuit board, a huge heatsink, memory, etc. contributing to the cost. That would determine what sort of impact higher chip manufacturing costs would have.


The circuit board itself costs next to nothing; sometimes it's the materials used for it that cost the most... Heatsinks are just made of metal and follow the price of those metals (copper, gold) on the international markets, and memory doesn't cost that much (thanks, or no thanks, to Samsung).
No, the real cost is in research and development...
 
How much of the hardware cost is actually the chip? I mean, you have the circuit board, a huge heatsink, memory, etc. contributing to the cost. That would determine what sort of impact higher chip manufacturing costs would have.
For as large a die as a big GPU, the silicon cost should exceed everything else. (And, no, R&D cost is pretty much irrelevant here; it doesn't really enter the equation when setting the price of a silicon product.) For pretty much everything else, when you're talking millions of units, you can get insanely cheap high-volume pricing: it's hard to ask for high margins on, say, a DVI connector that every little facility in Shenzhen can manufacture. There's only one TSMC...
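As a rough sketch of the "silicon dominates" point, here is the usual gross-dies-per-wafer approximation combined with a simple Poisson yield model; the die area, wafer price and defect density are illustrative assumptions, not figures for any real chip.

Code:
# Rough sketch: silicon cost per good die for a big GPU. Die area, wafer
# price and defect density are illustrative assumptions, not real figures.

import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300.0):
    """Common gross dies-per-wafer approximation (ignores reticle/edge details)."""
    r = wafer_diameter_mm / 2.0
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_cm2):
    """Simple Poisson defect-limited yield model (no partial-die salvage)."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)

die_area    = 550.0    # mm^2, "big GPU"-class die (assumed)
wafer_price = 5000.0   # USD per wafer (ballpark 28nm figure from this thread)
defects     = 0.15     # defects per cm^2 (assumed)

gross = dies_per_wafer(die_area)
good  = gross * poisson_yield(die_area, defects)
print(f"gross dies/wafer: {gross}, good dies/wafer: {good:.0f}")
print(f"silicon cost per good die: ${wafer_price / good:.0f}")

Swap in your own numbers; the point stands that a die costing on the order of a hundred dollars or more is hard for the board, cooler or connectors to rival.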
 
I have seen comments here (and there) regarding cost per transistor. I know that transistor density affects yields, which affect prices, but doesn't TSMC charge per wafer, regardless of how many transistors are built on that wafer?
In addition to what silent guy says: orders on a newer manufacturing process are going to be more expensive than those on old (which is real-world speak for the marketing term "proven") technology.

So if, for example, you cram twice as many transistors per area (i.e. per wafer) on 20nm vs. 28nm and the wafer is initially more than twice as expensive, your cost per transistor goes up. This is early-adopter stuff for companies who think they need the features of newer processes earlier than their competition in order to stay competitive, or to be able to build their chips in the first place. GF100 would not have been possible on 55nm at all because of size and heat, and I'm pretty sure the same holds true for Tahiti (on 40nm instead of 28nm).

Interesting article on EE Times - http://www.eetimes.com/author.asp?section_id=36&doc_id=1322399

Some data which is very relevant to the discussion above:-

  1. Cost of a 28nm wafer - $4,500-$5,000
  2. Cost of a 20nm wafer - $6,000
  3. Cost of a 16/14nm Finfet wafer - $7,270
There's a lot of other good info..worth a read. Some more key points mentioned were:-

  1. TSMC's 20nm capacity is expected to be 60,000 Wafers per month in Q4.
  2. A number of fabless companies will tape out their 16/14 FinFET product designs in the third quarter of 2014 with high-volume production planned for the second or third quarter of 2015.


And finally...some news on GM200. Seems like there's been a lot of smoke and mirrors stuff going on. What I have again heard is that it is still on 28nm as planned..and should be taping out late this month/early next month.
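Relating those wafer prices back to the cost-per-transistor discussion: one quick way to read them is as the density scaling each node needs over 28nm just to break even on cost per transistor (yield differences ignored, and the 28nm midpoint price is my assumption).

Code:
# Break-even density scaling implied by the wafer prices quoted above
# (28nm midpoint assumed at $4,750; yield differences ignored). A node only
# lowers cost per transistor if its density gain over 28nm exceeds this ratio.

WAFER_PRICES = {"28nm": 4750.0, "20nm": 6000.0, "16/14nm FinFET": 7270.0}

for node, price in WAFER_PRICES.items():
    ratio = price / WAFER_PRICES["28nm"]
    print(f"{node:>15}: ${price:.0f}/wafer -> needs >= {ratio:.2f}x the "
          f"transistors per wafer of 28nm to break even")

So 20nm would need roughly 1.3x the transistors per wafer of 28nm, and 16/14nm FinFET roughly 1.5x, before either gets any cheaper per transistor.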
 
Interesting article on EE Times - http://www.eetimes.com/author.asp?section_id=36&doc_id=1322399

Some data which is very relevant to the discussion above:-

  1. Cost of a 28nm wafer - $4,500-$5,000
  2. Cost of a 20nm wafer - $6,000
  3. Cost of a 16/14nm Finfet wafer - $7,270
Just one curious question; do wafer costs go down over time?

And finally...some news on GM200. Seems like there's been a lot of smoke and mirrors stuff going on. What I have again heard is that it is still on 28nm as planned..and should be taping out late this month/early next month.
So now it absolutely will be on 28nm? Did Nvidia decide the density improvement wasn't enough to justify going with 20nmSoC?
 