Nvidia Volta Speculation Thread

Simple speculation: if nVidia releases a series of new gaming cards that offer poor mining bang for the buck, it could create an interesting situation. At current prices, existing owners of GTX 1050s/1060s/1070s who don't mine could sell their cards for what they paid for them, or more, and use that cash to subsidize the upgrade to the new, hypothetical gaming card.

On the other hand, nVidia would be foolish not to sell new cards to miners, so they'd likely release something that would have the effect of lowering the value of existing cards to miners. Interesting times indeed, waiting to see how this all turns out, though interesting in a bad way for those who need or want a new card.

If a card is bad at mining it will also be bad at graphics.
 
They won't be worse than Pascal at mining, unless artificially limited.

However, I also think that whatever gaming-related improvements go into NV's next gen won't generally improve mining. So I'd say next gen just won't be worse than Pascal, but also not better, unless some bottleneck that one mining algo or another was hitting accidentally gets removed.
 
Yes and no. NVIDIA cards were terrible at mining before Pascal compared to their AMD counterparts, despite gaming performance favoring NVIDIA.

They won't be worse than Pascal at mining, unless artificially limited.

However, I also think that whatever gaming-related improvements go into NV's next gen won't generally improve mining. So I'd say next gen just won't be worse than Pascal, but also not better, unless some bottleneck that one mining algo or another was hitting accidentally gets removed.

Let's be honest here: if you do something hardware-wise to cripple mining performance, you are leaving performance on the table. There is little to no chance of new hardware being worse than current hardware at the same workloads.

It will also be far too late to change the hardware.
 
Reported source rumours said similar things about Pascal, that it would not launch anytime soon because of issues; we know how that ended up, with Pascal also surprising most by launching with the largest and most complex GPU die very early for select clients.
One factor that may influence the launch could be mining (more on GeForce rather than Tesla/Quadro), which really is screwing with retail prices in a way Nvidia does not benefit from.

Anyway, one needs to consider that there will always be some synergy between GeForce/Quadro/Tesla outside the flagship mixed-precision/Tensor GPU; it is not possible to fully break that R&D-design-manufacturing/logistics synergy.
 
http://www.tomshardware.com/news/nvidia-turing-graphics-architecture-delayed,36603.html

Tom's says that Ampere is actually the Volta successor for the professional field and Turing the gaming architecture, and that neither would be unveiled at GDC or GTC.
GTC and GDC were never on the table for a new NV consumer graphics series. NV always does its own events, so GDC is out. And GTC is the wrong crowd; it's all HPC devs and the like.

When it's time for an NVIDIA launch, you'll know. It'll leak like a sieve and NV will hold a special event that has every consumer tech reporter and YouTube talking head available.
 
GTC and GDC were never on the table for a new NV consumer graphics series. NV always does its own events, so GDC is out. And GTC is the wrong crowd; it's all HPC devs and the like.

When it's time for an NVIDIA launch, you'll know. It'll leak like a sieve and NV will hold a special event that has every consumer tech reporter and YouTube talking head available.
Wasn't GTC always about gaming until just a couple of years back or so, when they brought AI etc. in?
 
Wasn't GTC always about gaming until just a couple of years back or so, when they brought AI etc. in?
It was kind-of sort-of a consumer event when it was NVISION a decade ago.

Since then it was rebranded as the GPU Technology Conference and has been a professional event ever since. Which isn't to say that gaming is absent, but it's game developers rather than the public. And even that has shifted more towards GDC, leaving GTC as more a venue for ProViz and HPC (with the makeup of the invited press shifting as well).

Edit: For reference, here were the last 4 NV consumer product line launches

10-Series/Pascal: GeForce press briefings + public gaming event in Austin, TX (Edit 2: I should add that all indications are that the public event was highly successful, and that we should expect to see something similar again)
900-Series/Maxwell: GeForce press briefings in Monterey Bay, CA
600-Series/Kepler: GeForce press briefings in San Francisco, CA
400-Series/Fermi: GeForce press briefings after CES, Las Vegas, NV. (Note this didn't actually launch for a few more months)

Now, NVIDIA has done GDC events in multiple years, but those are always a grab-bag of consumer gear: GTX 1080 Ti, Shield STB, etc.
 
Wasn't GTC always about gaming until just a couple of years back or so, when they brought AI etc. in?
GTC 2016 was where they announced the Tesla P100; the gaming model was announced a bit later, I think. It has been like this for a while now, with GTC differentiating the lines.
This is what surprised many people: the spec and size of the first Pascal GPU announced, and how soon it was available to select clients.
GTC may give us some hints about the latest Tesla or Quadro range, which in a way gives us an indicator for GeForce.

Edit:
Sorry Ryan, I posted before reading your response.
 
If true that really sucks for PC gaming in 2018.

Yup this is a real bummer. I’ve had a 1070 since launch and am itching for an upgrade beyond what a 1080ti can do.

The other cynical possibility that crossed my mind is that NVIDIA will go with a Founders Edition series for a longer time window this time and only sell through their store.
 
The other cynical possibility that crossed my mind is that NVIDIA will go with a Founders Edition series for a longer time window this time and only sell through their store.
I’d consider that a win if it meant availability at slightly elevated prices due to volume limits, instead of no availability because miners buy it in bulk at the factory doors.
 
I think what muddies the waters (and creates a sea of conflicting rumors) is that launching chips/architectures and launching cards are not the same thing, a distinction that will be all the more pronounced as the gaming and non-gaming lines further differentiate. For instance, Volta came out of left field, took a long time to come out as a prosumer card, and never made it to gaming.
 
I’d consider that a win if it meant availability at slightly elevated prices due to volume limits, instead of no availability because miners buy it in bulk at the factory doors.
Please no. Founders cards use NV's half-assed noisy blower coolers. I will never go back to that shit.
 
Contacted a local Nvidia guy; it seems that the boost on the Titan V is just that low (1335 MHz for my two cards, and the boost is much less flexible than on the average GeForce, more like the case with Tesla/Quadro, so maybe the Titan V should be renamed Tesla V80 instead), but when playing games the card can boost to 1800 MHz or so.

I suspect that must have something to do with the FP64 thing; the original Titan will downclock significantly when full-speed FP64 is enabled.

It's a shame the driver can no longer disable full-speed FP64 on the Titan V.



I always leave that option enabled.

Somehow I missed the review/report by PCPer.
Seems that the Titan V throttling has more to do with temps (the dynamic boost is more conservative with this large, FP64/FP32/Tensor-designed die), as an accurate power-demand measurement with a scope shows it drawing 210W; so considering the real-world TDP/TBP, the performance is even more impressive.
Worth noting that the real-world clocks are still a bit above the spec'd max dynamic boost, and this was without messing around with fan control.
The conservative temperature management may be down to local die hotspots from the GPU's design/functions rather than the die as a whole; difficult to know for sure.
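
For anyone wanting to check this kind of behaviour on their own card, here is a minimal sketch (my own, not from PCPer or the thread) that polls the SM/memory clocks, temperature and power through the pynvml bindings while a game or compute job runs in another window. The rated ~1335 MHz boost, the ~1800 MHz seen in games, or temperature-driven throttling should all show up in the output.

```python
# Minimal monitoring sketch (assumed setup: pynvml installed, NVIDIA driver present).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    for _ in range(30):  # roughly 30 seconds of one-second samples
        sm_clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_SM)
        mem_clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)
        temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # NVML reports milliwatts
        print(f"SM {sm_clock} MHz | MEM {mem_clock} MHz | {temp} C | {power_w:.0f} W")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```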
 
The first Volta: Nvidia Titan V review @ nl.hardware.info
March 5, 2018
For the highest stable overclock, we set the Titan V at 100% fan speed, 110% power limit, +100 MHz on the GPU and +150 MHz on the HBM2 memory. While running 3DMark, the clock rate remained stable at 1987 MHz, 400 MHz higher than the standard clock speed in the endurance test! The temperature did not rise further than 70 degrees, although the card will undoubtedly become warmer with a longer workload.

What does that overclock provide? We achieved a 3DMark Time Spy score of 13,987 points with a graphics score of 14,429 points. That's still 15 percent faster than standard, while the Titan V was already by far the fastest video card ever tested. To give you an idea of how monstrous this score is: if we had uploaded it to the overclocking community HWBot, it would have been good for tenth place on the world rankings!

https://nl.hardware.info/reviews/7993/de-eerste-volta-nvidia-titan-v-review
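
As a back-of-envelope check on those figures (my own arithmetic, taking the review's numbers at face value): a 400 MHz bump over the implied ~1587 MHz sustained clock is roughly a 25% clock uplift, yet it yields only about 15% more graphics score, so the card isn't scaling purely with core clock.

```python
# Back-of-envelope from the hardware.info figures quoted above (assumed baseline).
oc_clock = 1987                 # MHz, sustained under 3DMark per the review
stock_clock = oc_clock - 400    # "400 MHz higher than the standard clock speed"
oc_graphics_score = 14429       # Time Spy graphics score when overclocked
perf_gain = 0.15                # "15 percent faster than standard"

clock_gain = (oc_clock - stock_clock) / stock_clock
implied_stock_score = oc_graphics_score / (1 + perf_gain)

print(f"Clock uplift: {clock_gain:.1%}")                            # ~25.2%
print(f"Implied stock graphics score: {implied_stock_score:.0f}")   # ~12547
```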
 
I’d consider that a win if it meant availability at slightly elevated prices due to volume limits, instead of no availability because miners buy it in bulk at the factory doors.

Today at 9:21 AM CDT I got an email notice that the GTX 1070 Ti was available for purchase from the store. I purchased two of them at the $449 MSRP at 9:23 AM. By 9:25 AM they were all sold out.
 
Posted in the Post-Volta thread, but also applicable here:

With the latest announcement around DX12 raytracing at GDC 2018, one quote from Nvidia stands out, from Tony Tamasi in an article:
PCGamesn said:
“There’s definitely functionality in Volta that accelerates raytracing,” Tamasi told us, “but I can’t comment on what it is.”

But the AI-happy Tensor cores present inside the Volta chips certainly have something to do with it, as Tamasi explains:
.....
Nvidia’s new Tensor cores are able to bring their AI power to bear on this problem using a technique called de-noising.

“It’s also called reconstruction,” says Tamasi. “What it does is it uses fewer rays, and very intelligent filters or processing, to essentially reconstruct the final picture or pixel. Tensor cores have been used to create, what we call, an AI de-noiser.

“Using artificial intelligence we can train a neural network to reconstruct an image using fewer samples, so in fact Tensor cores can be used to drive this AI denoiser, which can produce a much higher quality image using fewer samples. And that’s one of the key components that helps to unleash the capability of real-time raytracing.”
https://www.pcgamesn.com/nvidia-rtx-microsoft-dxr-raytracing

Yeah, not going to impact gamers for some time as a complete solution, but it will be interesting to see how this unfolds sooner in the professional world, especially from Volta onwards.
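
To make the "fewer rays plus reconstruction" idea concrete, here is a toy sketch of my own (nothing to do with NVIDIA's actual AI denoiser, which is a trained neural network running on Tensor cores): a pixel estimate built from only a handful of Monte Carlo samples is noisy, and even a crude reconstruction filter over neighbouring pixels recovers much of the quality that brute-force sampling would give.

```python
# Toy illustration only: low-sample Monte Carlo estimates are noisy; a simple
# reconstruction filter (standing in for a learned denoiser) recovers quality.
import numpy as np

rng = np.random.default_rng(0)

# "Ground truth" image: a smooth gradient standing in for a fully converged render.
h, w = 128, 128
truth = np.linspace(0.0, 1.0, w)[None, :].repeat(h, axis=0)

def render(samples_per_pixel):
    # Monte Carlo noise shrinks with the square root of the sample count.
    noise = rng.normal(0.0, 0.5 / np.sqrt(samples_per_pixel), size=truth.shape)
    return np.clip(truth + noise, 0.0, 1.0)

def box_filter(img, radius=2):
    # Crude stand-in for the "reconstruction" step: average each pixel with its neighbours.
    padded = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            out += padded[radius + dy : radius + dy + img.shape[0],
                          radius + dx : radius + dx + img.shape[1]]
    return out / (2 * radius + 1) ** 2

rmse = lambda img: np.sqrt(np.mean((img - truth) ** 2))

few = render(samples_per_pixel=4)       # cheap, noisy
many = render(samples_per_pixel=256)    # expensive "brute force" reference
print(f"4 spp raw:      RMSE {rmse(few):.3f}")
print(f"4 spp + filter: RMSE {rmse(box_filter(few)):.3f}")
print(f"256 spp raw:    RMSE {rmse(many):.3f}")
```

The filtered 4-samples-per-pixel image ends up with an error close to the 256-samples-per-pixel reference, which is the whole point of spending rays sparingly and reconstructing the rest.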
 
Posted in the Post-Volta thread, but also applicable here:

With the latest announcement around DX12 raytracing at GDC 2018, one quote from Nvidia stands out, from Tony Tamasi in an article:

https://www.pcgamesn.com/nvidia-rtx-microsoft-dxr-raytracing

Yeah, not going to impact gamers for some time as a complete solution, but it will be interesting to see how this unfolds sooner in the professional world, especially from Volta onwards.
So they solve the core raytracing performance problem by simply rendering fewer rays and guessing the gaps?

That’s certainly a novel way of attacking the problem and it might just work if you mix traditional rendering with ray tracing for lighting only.

Are there any demos out?
 