Nvidia Pascal Announcement

I was under the impression from earlier reports that GP102 was a GDDR5X derivative of GP100 with 3840 cores and a 384-bit (512-bit?) bus width. That seems to reconcile a lot better with a Q3/Q4 release, to be honest.
Unfortunately that would mean the same FP64 rate as the P100 and only slightly improved FP32/FP16.
This is not what Nvidia needs for the next step down in the Tesla range, which is also shared with the Quadro and Titan.

Cheers
 
What to watch out for IMO is whether they give the Titan the same complement of SMs, at 56 or 60, and support for FP16x2, along with the same number of FP64 cores as the Quadro card.
Cheers
 
If they're going the BIG GPU route again, it would make sense to put the more traditional-style SMs with 128 ALUs in there (including FP16x2) and add an additional GPC or two instead of the FP64 cores. Instead of HBM2, you could use 512-bit GDDR5X. That way, you'd have a much more complementary line-up.

If they want to cheap out, GP102 could just be GP104×1.5.
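
A quick sanity check of that idea against the earlier rumour; the core and bus figures are GP104's published specs, while the clock is a placeholder assumption of mine:

```cpp
// Back-of-the-envelope check: does "GP104 x 1.5" line up with the rumoured
// GP102 figures (3840 cores, 384-bit bus)? The clock is a placeholder guess.
#include <cstdio>

int main() {
    const int    gp104_cores = 2560;  // 20 SMs x 128 ALUs (published GP104 config)
    const int    gp104_bus   = 256;   // bits
    const double scale       = 1.5;
    const double clock_ghz   = 1.6;   // assumed boost clock, not a known spec

    const double cores = gp104_cores * scale;                   // 3840
    const double bus   = gp104_bus * scale;                     // 384-bit
    const double fp32_tflops = cores * 2 * clock_ghz / 1000.0;  // 2 FLOPs/core/clock (FMA)

    printf("GP104 x 1.5 -> %.0f cores, %.0f-bit bus, ~%.1f FP32 TFLOPS @ %.1f GHz\n",
           cores, bus, fp32_tflops, clock_ghz);
    return 0;
}
```

That lands exactly on the 3840-core / 384-bit configuration from the report above, so the two rumours at least agree with each other.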
 
Think about memory capacity implications.

Can you run in clamshell mode with GDDR5X? If so, you can do 8 GB on a 128-bit bus... but GDDR5X price and/or availability may be constraints. A 192-bit GDDR5 bus gives you similar bandwidth at lower cost.
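
For reference, a rough sketch of the arithmetic behind that comparison; the 10 Gbps GDDR5X and 8 Gbps GDDR5 data rates are assumptions for illustration, not confirmed specs:

```cpp
// Rough bandwidth/capacity arithmetic. The data rates are assumptions:
// ~10 Gbps GDDR5X and ~8 Gbps GDDR5 were typical figures at the time.
#include <cstdio>

// Peak bandwidth in GB/s for a bus width (bits) and per-pin data rate (Gbps).
double bandwidth_gbs(int bus_bits, double gbps_per_pin) {
    return bus_bits * gbps_per_pin / 8.0;  // bits -> bytes
}

int main() {
    printf("128-bit GDDR5X @ 10 Gbps: %.0f GB/s\n", bandwidth_gbs(128, 10.0)); // 160
    printf("192-bit GDDR5  @  8 Gbps: %.0f GB/s\n", bandwidth_gbs(192, 8.0));  // 192

    // Clamshell puts two chips on each 32-bit channel at half width each,
    // doubling capacity for the same bus width.
    const int channels    = 128 / 32;  // 4 channels on a 128-bit bus
    const int gb_per_chip = 1;         // 8 Gb (1 GB) devices
    printf("128-bit clamshell capacity: %d GB\n", channels * 2 * gb_per_chip); // 8 GB
    return 0;
}
```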
This memory size data point might apply to the GTX 1060... 3 GB may indeed be enough for a mid-level GPU. But as ninelven pointed out, 3 GB just feels dirty.

It may be enough today (that is also debatable), but what about a year from now? Two years? I wouldn't buy anything less than 4 or 6 GB, tbh. And besides, the margins on the 3 GB card will be lower, and its main purpose will be to serve as a price anchor (e.g., the magic $199.99).

But one thing to keep in mind is that if a particular card is popular, devs will definitely keep its memory capacity in mind. Considering how popular the GTX 970 is (as per the Steam hardware survey, at least), I'm sure many devs will aim for 3.5 GB ;-)
I was under the impression from earlier reports that GP102 was a GDDR5X derivative of GP100 with 3840 cores and a 384-bit (512-bit?) bus width. That seems to reconcile a lot better with a Q3/Q4 release, to be honest.

384 bit. Q3.
 
Can you run in clamshell mode with GDDR5X? If so, you can do 8 GB on a 128-bit bus... but GDDR5X price and/or availability may be constraints. A 192-bit GDDR5 bus gives you similar bandwidth at lower cost.
I think so, yes. But 5X probably carries another price premium, as you said, so...
 
If they're going the BIG GPU route again, it would make sense to put the more traditional-style SMs with 128 ALUs in there (including FP16x2) and add an additional GPC or two instead of the FP64 cores. Instead of HBM2, you could use 512-bit GDDR5X. That way, you'd have a much more complementary line-up.

If they want to cheap out, GP102 could just be GP104×1.5.

I think a fair few will be disappointed if they did remove the 64 CUDA cores per SM, but this is Nvidia, so who knows :)
While we can expect the GP102 to have fewer FP64 cores, it will also be interesting to see if they provide the option they did back with the GK110 Titan, which enabled switching the FP64 ratio between 1/3 and (was it?) 1/24, at the cost of clock speed.
I am assuming they are going to put a moderate, usable amount of FP64 on the GP102 for the Tesla and Quadro, maybe a quarter of GP100 for now *shrug*.
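
Just to put rough numbers on those ratios (reading "a quarter of GP100" as a 1/4 DP rate; the 3840-core count is the rumoured GP102 figure, and the 1.4 GHz clock is a placeholder of mine):

```cpp
// Illustrative FP64 throughput at different DP ratios. The 3840-core count
// is the rumoured GP102 figure; the 1.4 GHz clock is a placeholder.
#include <cstdio>

double fp64_tflops(int fp32_cores, double clock_ghz, int ratio_divisor) {
    // 2 FLOPs per core per clock (FMA), divided by the FP32:FP64 ratio.
    return fp32_cores * 2.0 * clock_ghz / ratio_divisor / 1000.0;
}

int main() {
    const int    cores = 3840;
    const double ghz   = 1.4;
    printf("1/2  rate (GP100-style):      %.2f TFLOPS\n", fp64_tflops(cores, ghz, 2));
    printf("1/4  rate (speculated above): %.2f TFLOPS\n", fp64_tflops(cores, ghz, 4));
    printf("1/32 rate (consumer Pascal):  %.2f TFLOPS\n", fp64_tflops(cores, ghz, 32));
    return 0;
}
```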

Cheers
 
I got the same information. GP102 = big GP104. Ready for back-to-school.
With GP102 supposedly launching so soon after GP104, one has to wonder what kind of damage to your wallet it is going to cause in order to set itself apart from the (already quite expensive) GP104 products... because NV isn't going to push GP104 down in price, not so soon after launch. Ugh. It's gonna be the worst reaming of all time, I fear.
 
I think a fair few will be disappointed if they did remove the 64 CUDA cores per SM, but this is Nvidia, so who knows :)
While we can expect the GP102 to have fewer FP64 cores, it will also be interesting to see if they provide the option they did back with the GK110 Titan, which enabled switching the FP64 ratio between 1/3 and (was it?) 1/24, at the cost of clock speed.
I am assuming they are going to put a moderate, usable amount of FP64 on the GP102 for the Tesla and Quadro, maybe a quarter of GP100 for now *shrug*.

Cheers
Just to be clear: they're gonna keep their mandatory 1/32 rate or so. I did not mean remove them completely. And those who want half-rate DP on a 10-TFLOP chip can still go for GP100 (or probably Vega).
 
Just to be clear: they're gonna keep their mandatory 1/32 rate or so. I did not mean remove them completely. And those who want half-rate DP on a 10-TFLOP chip can still go for GP100 (or probably Vega).
It is possible, but it would need to be weighed against the fact that at some point they need to update the Keplers still around in the Tesla/Quadro lines that have 'moderate' DP, and also that they need to compete with AMD in the pro-workstation space in terms of FP16 solutions.

If they do not do a GK110-type die (which would be the successor to the K40-K80/K6000), that would mean another GP1xx card, positioned somewhere in those tiers, being designed with DP to replace them.
Ah well, time will tell.
 
I so hope that NV drops SLI from the 1060; it never made sense on this class of cards.

It makes more sense for this range than for any other, especially for people who don't have the money to buy one high-end GPU but can buy a low-end one and then "upgrade" it some months later.

No miracle there: the first SLI-capable GPU was the 6600 GT SLI edition, not the 6800 Ultra.

SLI and CFX have been sold like that since the start (and even with APUs): use low-end GPUs and upgrade them to match the performance of the high end. This is how AMD and Nvidia have sold it from the start. In fact, it is really surprising that Nvidia removed SLI capability from the 1060 (there's no reason, not even a bad reason, to do it).
I don't even understand why you think it is a good thing. If it were SLI-capable and you didn't want to use SLI, it would simply make no difference to you.

I'm a little bit emotional when we're speaking about SLI, as I was surely one of its first users. I even still have the badges that Nvidia and DFI sent me back then. We created the first Guru3D SLI team and wrote the first real user guide; most of the code running SLI in that period was written by us... each day we would recode and mod the drivers to run SLI... That continued with the Tweaksrus drivers (which were shut down by legal action from Nvidia's lawyers)... but in all honesty, we paved the way...


Just to be clear: they're gonna keep their mandatory 1/32 rate or so. I did not mean remove them completely. And those who want half-rate DP on a 10-TFLOP chip can still go for GP100 (or probably Vega).

They can't remove it completely; DX11 and DX12 compliance requires at least being able to do FP16 and FP64, so at 1/32 they are just "capable of doing it" and meet DirectX's mandatory minimum.
 
That's why I wrote "just to be clear". But then, I am not so sure about FP16 right now. A lot of cards still report minimum precision as full precision in DX12 FL12_0/12_1.
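
For anyone who wants to see what their own card reports, a minimal Windows-only sketch querying the standard D3D12 caps (build against d3d12.lib; nothing vendor-specific assumed):

```cpp
// Minimal, Windows-only sketch: see what the driver reports for shader
// min-precision (FP16) support and double-precision (FP64) shader ops.
// Build: cl /EHsc check_caps.cpp d3d12.lib
#include <windows.h>
#include <d3d12.h>
#include <cstdio>

int main() {
    ID3D12Device* device = nullptr;
    // nullptr adapter = default adapter; FL 11_0 is the D3D12 minimum.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        printf("No D3D12 device available.\n");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts));

    // NONE means "min precision" shader code runs at full 32-bit precision,
    // which is exactly what a lot of cards report.
    if (opts.MinPrecisionSupport == D3D12_SHADER_MIN_PRECISION_SUPPORT_NONE)
        printf("Min precision: none (everything runs at full precision)\n");
    if (opts.MinPrecisionSupport & D3D12_SHADER_MIN_PRECISION_SUPPORT_10_BIT)
        printf("Min precision: 10-bit supported\n");
    if (opts.MinPrecisionSupport & D3D12_SHADER_MIN_PRECISION_SUPPORT_16_BIT)
        printf("Min precision: 16-bit supported\n");
    printf("FP64 shader ops: %s\n",
           opts.DoublePrecisionFloatShaderOps ? "yes" : "no");

    device->Release();
    return 0;
}
```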
 
It makes more sense for this range than for any other, especially for people who don't have the money to buy one high-end GPU but can buy a low-end one and then "upgrade" it some months later.

No miracle there: the first SLI-capable GPU was the 6600 GT SLI edition, not the 6800 Ultra.

SLI and CFX have been sold like that since the start (and even with APUs): use low-end GPUs and upgrade them to match the performance of the high end. This is how AMD and Nvidia have sold it from the start. In fact, it is really surprising that Nvidia removed SLI capability from the 1060 (there's no reason, not even a bad reason, to do it).
I don't even understand why you think it is a good thing. If it were SLI-capable and you didn't want to use SLI, it would simply make no difference to you.
They shouldn't be wasting any resources on (or encouraging customers to go with) boner solutions like two 1060s when one 1070 or 1080 is such a better option for all involved. It's a very good move in my book. SLI/Crossfire should be reserved for the high end, or better yet, simply not exist at all.
 
They shouldn't be wasting any resources on (or encouraging customers to go with) boner solutions like two 1060s when one 1070 or 1080 is such a better option for all involved. It's a very good move in my book. SLI/Crossfire should be reserved for the high end, or better yet, simply not exist at all.


Not everyone has the possibility, money-wise, to buy a 1080 or a 1070... but they can buy a 1050 one month out of their pay, and then upgrade to a second 1050 later. That is why SLI and CFX were created. It's not perfect by any means, but it works.

I'm lucky enough to be able to buy a good hundred 1080s with my pay... but I have not forgotten the time when I was young and surely couldn't have done that.
 
Not everyone has the possibility, money-wise, to buy a 1080 or a 1070... but they can buy a 1050 one month out of their pay, and then upgrade to a second 1050 later. That is why SLI and CFX were created. It's not perfect by any means, but it works.

I'm lucky enough to be able to buy a good hundred 1080s with my pay... but I have not forgotten the time when I was young and surely couldn't have done that.


Well, with mGPU the need for SLI kind of goes down, if developers are going to move toward that more; it looks like NV is guiding them toward that as well, or expecting it to happen in the normal course of this generation.
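
("mGPU" here meaning DX12-style explicit multi-adapter, where the application rather than the driver owns each GPU. The first step is just enumerating adapters through DXGI; a minimal Windows-only sketch, linked against dxgi.lib:)

```cpp
// Minimal DXGI sketch: enumerate the adapters a DX12 explicit multi-adapter
// ("mGPU") renderer would have to manage itself. Windows-only.
// Build: cl /EHsc enum_adapters.cpp dxgi.lib
#include <windows.h>
#include <dxgi.h>
#include <cstdio>

int main() {
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        // Unlike SLI, nothing is linked implicitly: the app sees each GPU as
        // a separate device and must split work and resources itself.
        wprintf(L"Adapter %u: %ls (%llu MB VRAM)\n", i, desc.Description,
                (unsigned long long)(desc.DedicatedVideoMemory >> 20));
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```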
 
Not everyone has the possibility, money-wise, to buy a 1080 or a 1070... but they can buy a 1050 one month out of their pay, and then upgrade to a second 1050 later. That is why SLI and CFX were created. It's not perfect by any means, but it works.

I'm lucky enough to be able to buy a good hundred 1080s with my pay... but I have not forgotten the time when I was young and surely couldn't have done that.
But that is such a dumb thing to do when you can save for two months and get the 1070... less power and far fewer headaches. Buying two 1050s is the wrong way to do it, and I'm glad NVIDIA has taken the option away. They never should have offered SLI on mid-to-low-end cards, and it seems they've finally realized that.
 
Not everyone has the possibility, money-wise, to buy a 1080 or a 1070... but they can buy a 1050 one month out of their pay, and then upgrade to a second 1050 later. That is why SLI and CFX were created. It's not perfect by any means, but it works.

I'm lucky enough to be able to buy a good hundred 1080s with my pay... but I have not forgotten the time when I was young and surely couldn't have done that.

Eh, it's not exactly smart though. Obviously it's better to save. The first video card I bought was a Voodoo 2 for $250 back in '98. I was 11 and got something like $10 per week in pocket money. Ditto the 440Ti I bought for $450 when I was earning $8/hr. And again a 9800 Pro for $500 when not earning much more.
 
I don't know how relevant this is to the topic, but Nvidia recently updated their GeForce Experience software; it looks and runs much better than the previous iteration (or the current non-beta version). According to Nvidia, it also has a smaller CPU footprint (for the share and streaming functionality).
 