Nvidia Turing Speculation thread [2018]

Status
Not open for further replies.
From an NVIDIA employee in a German Forum (autotranslated):
"Stupidly increasing raw performance is not the future. The requirements go beyond what conventional process improvements can deliver.

That's why innovative solutions are needed and Turing has some of them that haven't been shown yet.

Such attempts and innovations come, of course, at the expense of large gains in the immediate short term, but we must have the courage to dare real progress in the long term. And the time was right. A shrink generation à la Pascal is more consumer-friendly, but I'm sure all developers are happy about the new chip's large feature catalog. And the people who are happy are mostly the technology enthusiasts."



Personally I think it's good that the GTX 2080 Ti will be released so soon. However, it will probably cost more at launch than its predecessor did.
 
I would absolutely consider buying if such a product were released as I expect Turing's performance to be in the range of what I need, only the VRAM will hold me back. If I have to wait until 7nm here's to hoping the replacement for the 20 series arrives next year.
Perhaps 2080 or 2080Ti x 2 in a NVLink setup would give you the VRAM plus additional performance?
 
Perhaps 2080 or 2080Ti x 2 in a NVLink setup would give you the VRAM plus additional performance?

I think only compute workloads are able to use the VRAM on multiple cards in a contiguous pool like that. Graphics workloads still see each card's VRAM as a separate pool. Which won't help in my case.
 
So the 80ti card is being released the same time as the 80? That isn't typical. That has to mean there isn't enough of a window between this series and the next series of cards (presumably 7nm) to legitimately stagger the 2080ti release, right?

I suspect you are correct. At least, I hope that is the case.
 
The way I read it Turing is a features play and raw performance over Pascal will be underwhelming. Looks like I’m waiting for 7nm.

Also for some reason I’m pretty anal about not buying cutdown chips or cards with ‘weird’ unit counts.
 
I'm still looking for something that can play most games at 4K @ 60fps. The 1080Ti doesn't cut it, and SLI isn't widely (nor well) supported.
The Titan V is kind of close, but it's awfully expensive for something not quite there. Unfortunately, it looks like even the 2080 Ti won't get there.
I hope it proves me wrong, though.
 
The way I read it Turing is a features play and raw performance over Pascal will be underwhelming. Looks like I’m waiting for 7nm.
On the contrary, the 2080Ti packs a lot more TF than the 1080Ti and should be a nice upgrade over it. Also, the gap between the 2080Ti and 2080 should be larger than between the 1080Ti and 1080. And if the 2080 is clearly faster than the 1080Ti (say ~+10%), then the 2080Ti is even more attractive than the 1080Ti was.
 
From an NVIDIA employee in a German Forum (autotranslated):
"Stupidly increasing raw performance is not the future. The requirements go beyond what conventional process improvements can deliver.

That's why innovative solutions are needed and Turing has some of them that haven't been shown yet.

Such attempts and innovations come, of course, at the expense of large gains in the immediate short term, but we must have the courage to dare real progress in the long term. And the time was right. A shrink generation à la Pascal is more consumer-friendly, but I'm sure all developers are happy about the new chip's large feature catalog. And the people who are happy are mostly the technology enthusiasts."



Personally I think it's good that the GTX 2080 Ti will be released so soon. However, it will probably cost more at launch than its predecessor did.

I still suspect the raytracing hardware is more about building it because they can, rather than because it's good for consumers or will even boost their sales much. Let's face it, Vega's FP16 support is perfectly usable in games in a lot of scenarios, and yet that "feature catalog" hasn't helped sales much. Raytracing hardware is easier to show off for oohs and aahs, but none of the current tech demos will show up in games; in the near future it'll be at most a few extra effects thrown into a few games here and there.

It's also cutting into Nvidia's potential profit margins, or sales, depending. $800 isn't particularly affordable even by high-end standards, and a 700+ mm² die is huge. If the rumored performance gains of 20-25% over Pascal are true, well, that's just not as impressive as one would hope for a die that size : /

I'm also sceptical of the way they've done raytracing. Thinking it through, voxel modelling is a lot more developer-friendly than RTX's mesh-only tracing, and cone and SDF tracing are a lot more shading-friendly. That last point is key: the demos show either large areas with simple shading and close to no model variation (memory-bound; you can't fit unique textures for thousands of ultra-high-poly trees into memory, and you definitely can't afford incoherent texture streaming) or very small enclosed scenes (shading-bound, which is going to be encountered a lot).

I kind of feel that, while very neat by itself, the raytracing hardware is a bit narrow in what it does, isn't really balanced with the rest of the hardware for what it's purported to do, and will end up a bit like some of AMD's recent features: very neat in concept, but not immediately applicable to a huge sales boost or to many games (remember, DXR is a Windows API; you can't use it to make a PS5 game). Some form of tracing is certainly in the future of gaming. But the currently shipped games that trace, Claybook, Kingdom Come: Deliverance, and Hunt: Showdown, use SDF tracing and cone tracing for very good reasons.
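Since SDF (signed distance field) tracing keeps coming up, and it's what Claybook is built on, here's a toy sphere-tracing sketch in Python to show the core idea: each step advances the ray by the distance field's value, so no mesh/BVH intersection hardware is needed at all. This is an illustrative sketch only (the function names and parameters are mine, not from any shipped engine):

```python
import math

def sphere_sdf(p, center=(0.0, 0.0, 3.0), radius=1.0):
    # Signed distance from point p to a sphere: negative inside, zero on
    # the surface, positive outside.
    dx, dy, dz = (p[i] - center[i] for i in range(3))
    return math.sqrt(dx * dx + dy * dy + dz * dz) - radius

def sphere_trace(origin, direction, sdf, max_steps=64, eps=1e-4, max_dist=100.0):
    # March along the (unit) direction; the SDF value is a safe step size,
    # since nothing can be closer than that distance. Returns the hit
    # distance t, or None on a miss.
    t = 0.0
    for _ in range(max_steps):
        p = tuple(origin[i] + t * direction[i] for i in range(3))
        d = sdf(p)
        if d < eps:
            return t
        t += d
        if t > max_dist:
            break
    return None
```

A ray from the origin straight down +z hits the unit sphere at z=3 after travelling t≈2.0; a ray pointed away returns None. The appeal for games like Claybook is that the same field drives collision, soft shadows, and ambient occlusion without any triangle intersection.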
 
Where are you pulling $800 from? The only prices I've seen are for the Quadro RTX cards where the middle performance card is $6K, yes that's $6000.
 
We should know more about consumer RTX 2080/2080Ti GeForce card prices on Monday when they go up for pre-order, though they should be similar to their Pascal counterparts.
 
On the contrary, the 2080Ti packs a lot more TF than the 1080Ti and should be a nice upgrade over it. Also, the gap between the 2080Ti and 2080 should be larger than between the 1080Ti and 1080. And if the 2080 is clearly faster than the 1080Ti (say ~+10%), then the 2080Ti is even more attractive than the 1080Ti was.

The 2080 Ti should be a beast, but looking at 2080 vs 1080, the ALU count is only up 15%. Bandwidth is +40%.

Unless clocks or efficiency are significantly higher there could be quite a few embarrassing moments for the 2080.

I wouldn’t be surprised if we never see a 12nm GP106 replacement.
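For what it's worth, those ratios check out against the commonly reported launch specs (2944 FP32 ALUs / 448 GB/s for the 2080 vs 2560 / 320 GB/s for the 1080; these are announcement-time figures, so treat them as provisional). A quick back-of-the-envelope:

```python
# Rough spec comparison, RTX 2080 vs GTX 1080. Paper FP32 throughput is
# cores * 2 FLOPs/clock * boost clock; real performance depends on much more.
def tflops(cores, boost_mhz):
    return cores * 2 * boost_mhz * 1e6 / 1e12

gtx_1080 = {"cores": 2560, "boost_mhz": 1733, "bw_gbs": 320}
rtx_2080 = {"cores": 2944, "boost_mhz": 1710, "bw_gbs": 448}

alu_ratio = rtx_2080["cores"] / gtx_1080["cores"]
bw_ratio = rtx_2080["bw_gbs"] / gtx_1080["bw_gbs"]
print(f"ALU count: +{(alu_ratio - 1) * 100:.0f}%")   # ALU count: +15%
print(f"Bandwidth: +{(bw_ratio - 1) * 100:.0f}%")    # Bandwidth: +40%
print(f"1080: {tflops(2560, 1733):.1f} TF, 2080: {tflops(2944, 1710):.1f} TF")
```

So on paper the 2080 only edges out the 1080 in raw TF, which is why everything hinges on clocks and per-ALU efficiency.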
 
Where are you pulling $800 from? The only prices I've seen are for the Quadro RTX cards where the middle performance card is $6K, yes that's $6000.

Just some rumor. The die is, again, huge (==$$$), and Nvidia's been making noise about the upcoming cards being more expensive. So it seems credible, but it is just a rumor.
 
The 2080 Ti should be a beast, but looking at 2080 vs 1080, the ALU count is only up 15%. Bandwidth is +40%.
The GTX 1070 had only 1920 cores vs 2816 for the 980Ti, and about 25% less bandwidth, yet the 1070 was faster in every scenario.

Unless clocks or efficiency are significantly higher there could be quite a few embarrassing moments for the 2080.
That's the gist of it: from the data we have, there is indeed some combination of clocks/efficiency that propels the 2080 ahead of the 1080Ti.
 
Official link to the RTX 2080Ti from PNY (it could be taken down soon); the important bits are the clocks: 1350 MHz base, 1550 MHz boost, 4352 cores, 285 W TDP, $1K price (might be a placeholder).
http://www.pny.com/RTX-2080-Ti-Overclocked-XLR8-Edition?sku=VCG2080T11TFMPB-O

1350MHz base / 1545MHz boost, to be precise. Guess that whole "8% faster than last gen" rumor was true. I always figured that was for "mid-size" Turing though, not the big part. Pretty sad, given the size of the chip.

A $1000 price for 11GB of RAM and, in all likelihood, only slightly faster than last gen is a hard pass for me. Oh well, guess I get to save my money. Maybe pick up one of those beat-on 1080 Ti cards for cheap.
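For context on the "slightly faster" worry, here's the raw FP32 math using the PNY-listed 4352 cores / 1545 MHz boost against the 1080 Ti's reference 3584 cores / 1582 MHz boost (the 1080 Ti figures are the commonly cited reference specs, so double-check them). Paper TFLOPS only, ignoring any architectural changes:

```python
# Paper FP32 throughput: cores * 2 FLOPs/clock * boost clock.
def tflops(cores, boost_mhz):
    return cores * 2 * boost_mhz * 1e6 / 1e12

tf_1080ti = tflops(3584, 1582)  # ~11.3 TF (reference 1080 Ti specs)
tf_2080ti = tflops(4352, 1545)  # ~13.4 TF (PNY-listed clocks)
print(f"{tf_2080ti / tf_1080ti:.2f}x")  # ~1.19x
```

So even at these conservative listed clocks, the raw ALU math suggests closer to +19% than +8%, before counting any per-clock gains.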
 
Guess that whole "8% faster than last gen" rumor was true. Always figured that was for "mid-size" Turing though, not the big part. Pretty sad, given the size of the chip.
It's definitely NOT 8% faster than the 1080Ti; that's nonsense. Even a Titan Xp is faster than that. Just wait for the performance reveal; you should be satisfied performance-wise. Also, the $1K price is for a custom OC'ed card from PNY, so it could be a placeholder.
 