NVIDIA Maxwell Speculation Thread

Do you have knowledge that Keplers will also be in the Desktop 900 series?

That's the way I read the comment. And as this is a speculation thread, I'm going to throw caution to the wind and engage in wildly unsupported speculation ;^> Let's say that NV rebrands something like the 770 as the 960. That would seem nutty, unless there were some other forcing function preventing full maxwell rollout -- something that makes it worth taking the hit. Let's further assume that this forcing function is the imminent release of some other technology (GM200 or a smaller node or something else, take your pick). If this forcing function is 'imminent' (for suitable definitions of the word), one would assume that you wouldn't want the 970 and 980 out there long, so you can do one of two things: create a limited stock and price high, or price stock to move. The former would only be wise if NV could wait, in which case, why release at all? Although the current 970 rumors have risen back to $330, that still seems priced under the 570/470 launches, never mind the 670/770 launch prices. Read this way, it's suspicious....

What if the 980 was originally designed for a smaller node? 400mm^2/(28^2/20^2) ~= 200mm^2. That smells like a 960 to me. NV brings that design forward a node because 20nm is running late (or not at all). Being up a node it runs a little hotter, and maybe you can't price it where a 960 would fall; your competitor's products aren't pushing quite so hard; the 960 was already spec'd for a wider bus in preparation for Pascal's quantum bandwidth leap; so you sell it as a 980/970 and earn a better margin on the cheaper node and higher tier.
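The back-of-the-envelope number above can be sanity-checked with a quick sketch. This assumes ideal area scaling with the square of the linear feature size (real shrinks rarely scale perfectly, and the 400mm^2 figure is the thread's own rough estimate):

```python
# Illustrative die-shrink estimate only: assumes ideal scaling of die
# area with the square of the linear feature size, which real process
# shrinks never quite achieve.
def scaled_area(area_mm2, node_from_nm, node_to_nm):
    """Estimate die area after an ideal shrink between process nodes."""
    return area_mm2 * (node_to_nm / node_from_nm) ** 2

# A ~400 mm^2 die at 28 nm would land near ~200 mm^2 at 20 nm:
print(round(scaled_area(400, 28, 20)))  # -> 204
```

So the ~200mm^2 figure in the post is just 400 divided by (28/20)^2 ≈ 1.96.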

But then a funny thing happens: your newer node comes online sooner than expected. Now you have all this work you need to recoup, so you pass along a little of that better margin to pay off the R&D and move product, and there's your explanation for lower-than-previous pricing.

Likely a crazy theory. Maybe that's too clever by half, sliding sideways dangerously close to conspiracy territory (or never straying far from pure crazy). Sorry, idle hands and all that.... Wish we had benchmarks already....
 
GK104 has no place in a 9xx lineup. It would be a bit silly to advertise power and cooling requirements for a 960 that surpass those for the 980.
 
Crazy theory alright... lol.

28nm Maxwell will be with us for a while. Don't expect any 16/20nm GPUs until Q4 2015 at the earliest.

GM104 and GM107 cover a huge range of the market. There's really no need for a Kepler part in the 900 series.
 
Okay, moving on from crazy :)
Has anyone seen 3 DisplayPorts on a 970? Seems like I'm only seeing that config on the 980s...?

Edit: Found some: the Gainward Phantom and MSI's reference design (not the Twin Frozr).
 
The same reason AMD skipped the 8000 series - they were mobile parts that the OEMs wanted to market as 'new'.

For the record, about the same happened with the GeForce 100 and GeForce 300 series (they're mostly known as OEM desktop cards, and there are desktop Radeon 8000 cards too).
 
I thought GM108 had been advertised as a GPU for ultrabooks (i.e. paired with a 15-watt Intel CPU)? That sounds a bit useless, but the story is that its perf per watt is significantly better than the Haswell GPU's, so you do get an improvement. That makes me think it doesn't even have a display controller and outputs, but I believe I read somewhere that it does have them.

About both of these GPUs, I wonder if nvidia makes some batches of them every now and then; plenty of inventory can be built with not that many wafers, and then they sell a trickle of them. Unless I'm missing something, GK208 doesn't seem to be sold in that many end-user products (computers) and is sold almost stealthily as aftermarket parts in the GeForce 600/700 series. Do they have some inventory?

I would expect nvidia to sell both GK208 and GM108. The latter would be a bit nicer but more expensive. They already mix Fermi (GF108) with Kepler, and Kepler with Maxwell (GM107).
 
Do you have knowledge that Keplers will also be in the Desktop 900 series?

Do you expect a serious answer to that question? His sentence is on the safer side than any random bullshit I read around the net from any rumor mongering ninny. As it stands the 700-series of GeForces already contain both Kepler and Maxwell powered SKUs, meaning that it's neither a sin nor taboo to combine those two theoretically in a lineup until smaller cores like GM206 are ready for prime time.
 
Do you expect a serious answer to that question?

Actually, yes I do.

His sentence is on the safer side than any random bullshit I read around the net from any rumor mongering ninny.

Why is the "safer side" answer not also random, or just plain wrong?

As it stands the 700-series of GeForces already contain both Kepler and Maxwell powered SKUs, meaning that it's neither a sin nor taboo to combine those two theoretically in a lineup until smaller cores like GM206 are ready for prime time.

If the Maxwells contain some extra features that the Keplers don't, then mixing them together is going to be a problem for end users.

I have read articles that Nvidia skipped over the 800 series and went to the 900 series to avoid that issue.

NVIDIA to skip GeForce 800 series
http://videocardz.com/51426/nvidia-...ies-geforce-gtx-980-and-gtx-970-mid-september

So when a poster states that the 900 series might also contain Keplers, I would really like to know if that is true.
 
I don't think there are that many noteworthy new features; all I can think of is hardware decoding of corrupt H.264 streams (from Wikipedia's article on PureVideo) and CUDA compute capability 5.0 instead of 3.x, which is probably significant but means "maybe some specific part of some specific software will be faster in a couple of years".

I'm interested, though, to know about FreeSync support, or in other words full DisplayPort 1.2a support.
 
With so many rumors flying around the last few days, it's hard not to be a bit sceptical. But if it's a fake, it's a very well done one.

And if it's true: how did they do this?

The performance gap narrows going from 1080p to 2560x1440, but even there it is still substantial. Can't wait to see the 4K numbers, but for now, fears that it might not be a suitable card for higher resolutions seem unfounded.

Also: there was a leak a few days ago showing Fire Strike only 5% ahead. That's confirmed here as well: it just happens to be the one benchmark with the lowest improvement.
 