Nvidia Turing Product Reviews and Previews: (Super, TI, 2080, 2070, 2060, 1660, etc)

It seems NVIDIA improved its baseline performance in Vulkan games with Pascal (Doom and Wolf 2). Turing just picked that up and built on it. Wolf 2 should also receive a variable rate shading patch that is exclusive to Turing.

Don't think it's Vulkan. Turing's advantage over Pascal in Doom is similar to other titles, ~30%. It leaps ahead in Wolf 2 by over 60%.

Would be good to confirm whether it's really using AMD-specific extensions or core Vulkan APIs for FP16.
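One way to check: enumerate the device's extensions and see which FP16 path the driver exposes. A minimal sketch, assuming a VkPhysicalDevice has already been selected (error handling omitted; the two names below are the AMD-specific and the cross-vendor FP16 extensions):

```c
/* List a device's extensions and look for the FP16-related ones, to tell
 * the AMD-specific path (VK_AMD_gpu_shader_half_float) apart from the
 * cross-vendor one (VK_KHR_shader_float16_int8). */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <vulkan/vulkan.h>

void report_fp16_extensions(VkPhysicalDevice dev)
{
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(dev, NULL, &count, NULL);

    VkExtensionProperties *exts = malloc(count * sizeof(*exts));
    vkEnumerateDeviceExtensionProperties(dev, NULL, &count, exts);

    for (uint32_t i = 0; i < count; i++) {
        if (strcmp(exts[i].extensionName, "VK_AMD_gpu_shader_half_float") == 0)
            printf("AMD-specific FP16 extension exposed\n");
        if (strcmp(exts[i].extensionName, "VK_KHR_shader_float16_int8") == 0)
            printf("Cross-vendor KHR FP16 extension exposed\n");
    }
    free(exts);
}
```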
 
It's basically matching Volta; how is that surprising?

True, I just wasn't aware that the Volta/Turing architecture lends itself so well to 3D rendering tasks. This is the first evidence I have seen, but then I haven't followed Volta at all. I think there is no chance that the Blender developers have spent time optimizing for Volta.
 
The good news is that we do have the B3D suite data, but it didn't make the cut in the rush to get everything done on time. I'm thinking we're going to follow this up with a part 2, looking more at synthetics, clockspeed averages, etc.
Great, thank you!
 
Performance/price analysis on Linux: dead heat between the 2080 Ti and Vega 64 in most games tested.
While the GeForce RTX 2080 Ti offers incredible performance potential, the $1,199 USD price-tag on the Founders Edition card is very steep... To little surprise, that puts it at the back of the line for value. Though in many cases that was close to the value offered by the Radeon RX Vega 64 on Linux with the current pricing around $570 USD.

https://www.phoronix.com/scan.php?page=article&item=nvidia-2080ti-linux&num=7
 
I think the 2080 (non-Ti) is really lost right now. The 2080 Ti(tan) at least lives in the "ultra performance at ultra price" tier that, while limited, is at least fairly well understood. The 2080, on the other hand, with its 5-12% performance/power lead over the 1080 Ti, is in purgatory at the current price. If the market were still in the grips of the crypto frenzy, with 1080 Tis sold at or above MSRP, it would be a no-brainer; but with new mid-grade 1080 Tis dipping into the sub-$650 range, the $150 premium is an extremely tough pill to swallow.

Alternatively, if there were one or two top-tier titles available alongside it that fully took advantage of the raytracing capability and really showcased the technological advances Turing brings, the situation would be different, but those are still months away. Thus, what we are left with is a card with advanced tech but no opportunity to showcase it to gamers, and a single-digit performance and efficiency increase that is not commensurate with the asking price premium over existing products. The card is in desperate, immediate need of a $100 price cut, which would actually make it an exciting product worth recommending, because IMO right now it's just treading water.
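To put rough numbers on that premium, here's a back-of-the-envelope perf-per-dollar calculation. The ~$650 street price, the $150 premium, and the 5-12% lead are the assumptions stated above, not measurements:

```c
/* Relative performance-per-dollar of the RTX 2080 vs. a discounted
 * GTX 1080 Ti, using the figures from the post above. */
#include <stdio.h>

int main(void)
{
    double price_1080ti = 650.0;                /* "sub-$650" street price */
    double price_2080   = price_1080ti + 150.0; /* the "$150 premium"      */

    double leads[] = { 1.05, 1.12 };            /* 5% and 12% perf lead    */
    for (int i = 0; i < 2; i++) {
        /* relative perf-per-dollar: values above 1.0 would favor the 2080 */
        double rel = leads[i] / (price_2080 / price_1080ti);
        printf("with a %.0f%% lead, the 2080's perf/$ is %.0f%% of the 1080 Ti's\n",
               (leads[i] - 1.0) * 100.0, rel * 100.0);
    }
    return 0;
}
/* Prints roughly 85% and 91%: even at the optimistic end, the 2080 delivers
 * worse performance per dollar than a discounted 1080 Ti. */
```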
 
NVIDIA GeForce RTX 2080 Ti Shows Very Strong Compute Performance Potential
Besides the new GeForce RTX 2080 series being attractive for developers wanting to make use of new technologies like RTX/ray-tracing, mesh shaders, and DLSS (Deep Learning Super Sampling), CUDA and OpenCL benchmarking so far on the GeForce RTX 2080 Ti is yielding impressive performance -- even outside of the obvious AI / deep learning potential workloads with the Turing tensor cores. Here are some benchmarks looking at the OpenCL/CUDA performance on the high-end Maxwell, Pascal, and Turing cards as well as an AMD Radeon RX Vega 64 for reference.
https://www.phoronix.com/scan.php?page=article&item=nvidia-rtx2080ti-compute&num=1
 
Ethereum Crypto Mining Performance Benchmarks On The GeForce RTX 2080 Ti


With Ethminer 1.6, going from the GTX 1080 Ti to the RTX 2080 Ti yields a 57% performance increase for mining on this hardware, whereas going from the GTX 980 Ti to the GTX 1080 Ti was a 76% increase. But it's also possible that with forthcoming Ethminer updates we could see better performance out of these Turing GPUs. Anyhow, in terms of raw performance it's a nice upgrade if you have the hardware.
...
One of the most important aspects though for cryptocurrency mining is the performance-per-dollar, and that's where the RTX 2080 Ti at $1199+ doesn't fare well... It's possible the RTX 2080 (non-Ti) will deliver better in this department, but unfortunately I have no RTX 2080 for testing at this time. So the RTX 2080 Ti does deliver strong mining performance, just as we saw with the great showing in the OpenCL/CUDA benchmarks (and a follow-up yesterday of the Folding@Home performance benchmarks), but due to the high cost, the RTX 2080 Ti likely won't be a big hit with miners.
https://www.phoronix.com/scan.php?page=article&item=rtx2080ti-crypto-mining&num=2
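A quick sketch of the performance-per-dollar math from those numbers. The ~$650 GTX 1080 Ti street price is an assumption taken from earlier in this thread, not from the article:

```c
/* Rough mining value comparison: the RTX 2080 Ti hashes ~57% faster with
 * Ethminer than a GTX 1080 Ti, but costs $1199 vs. an assumed ~$650
 * street price for the 1080 Ti. */
#include <stdio.h>

int main(void)
{
    double speedup      = 1.57;   /* 2080 Ti vs 1080 Ti, Ethminer 1.6 */
    double price_2080ti = 1199.0;
    double price_1080ti = 650.0;  /* assumed street price */

    double rel_perf_per_dollar = speedup / (price_2080ti / price_1080ti);
    printf("2080 Ti hashes %.0f%% as much per dollar as the 1080 Ti\n",
           rel_perf_per_dollar * 100.0);
    return 0;
}
/* ~85%: the raw hashrate gain doesn't keep up with the price premium,
 * matching the article's conclusion on performance-per-dollar. */
```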
 
You can mine ETH with ASICs now, so it doesn't make much sense at the price. They should look at Cryptonight.

Because of those ETH ASICs, the current Nvidia GPU ETH miners jumped ship to Cryptonight (Monero) and caused a 200 MH/s increase in Monero's network hashrate, resulting in the difficulty jumping from 55G to 70G.

https://bitinfocharts.com/comparison/monero-difficulty.html#3m
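As a rough sanity check on those figures (assuming Monero's ~120-second target block time, so network hashrate is approximately difficulty divided by block time):

```c
/* Implied Monero network hashrate from the difficulty figures quoted
 * above, using difficulty / target block time. */
#include <stdio.h>

int main(void)
{
    double block_time  = 120.0;  /* Monero target block time, seconds */
    double diff_before = 55e9;
    double diff_after  = 70e9;

    double hr_before = diff_before / block_time;  /* hashes per second */
    double hr_after  = diff_after  / block_time;

    printf("implied network hashrate: %.0f -> %.0f MH/s (+%.0f MH/s)\n",
           hr_before / 1e6, hr_after / 1e6, (hr_after - hr_before) / 1e6);
    return 0;
}
/* ~458 -> ~583 MH/s from these endpoints alone; the 200 MH/s figure in
 * the post is the same order of magnitude (peaks exceed the endpoint
 * delta). */
```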

And of course the reduction from 3 to 2 ETH for block rewards also caused the mass exodus from ETH to XMR.

https://cointelegraph.com/news/ethe...difficulty-bomb-reduce-block-rewards-to-2-eth
 

Neaah. ETH mining performance is a datapoint, as good as any. Obviously more algos tested would be nicer, but these are general-purpose sites so they prolly can't justify it.

As to ASICs for ETH, it's not so straightforward. They are just cheaper and just a tiny bit more power efficient. Due to that, many AMD cards may have migrated over time from ETH to XMR because of the slightly higher profits. Not all.

As for Nvidia cards, even today ETH is more profitable than XMR. Hence I doubt that there was a massive transition. We cannot prove it either way, of course.

Also, the ETH block reward is still 3, not 2. It will only change to 2 at the next fork.
 
The new overclocking system introduced as part of the Turing launch is great. I tried it with my 1080 Ti through the EVGA Precision X1 software, and after the scanning process was done the end result was good. There is probably a little more to be gained by hand tuning, but for such an easy way to overclock, this should just happen by default when you first install a GPU.

In essence you get an optimized voltage for every clock speed instead of having to use conservative factory defaults.

Another nice thing is that the tool obeys the power/temperature limits the user sets. So if you like, you could also underclock and find the optimal voltages -> more silent operation.
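For the curious, the idea behind the scanner is a voltage/frequency curve sweep. This is a conceptual sketch only; apply_point() and run_stress_test() are hypothetical stand-ins, not the actual NVIDIA OC Scanner API:

```c
/* Conceptual sketch of an automatic V/F-curve scan: for each voltage point
 * on the curve, push the clock up until a stress test fails, then keep the
 * last stable value. The two helpers are stubs so the sketch compiles; a
 * real scanner would program the GPU and run a render/compute workload. */
#include <stdbool.h>
#include <stdio.h>

#define NUM_POINTS 4
#define STEP_MHZ   15

static bool apply_point(int mv, int mhz) { (void)mv; return mhz < 1950; }
static bool run_stress_test(void)        { return true; }

int main(void)
{
    int voltages_mv[NUM_POINTS]    = {  800,  900, 1000, 1050 };
    int base_clock_mhz[NUM_POINTS] = { 1500, 1650, 1800, 1850 };

    for (int i = 0; i < NUM_POINTS; i++) {
        int clock = base_clock_mhz[i];   /* conservative factory default */

        /* Raise the clock at this voltage until the stress test fails,
         * then record the highest stable value. */
        while (apply_point(voltages_mv[i], clock + STEP_MHZ) &&
               run_stress_test())
            clock += STEP_MHZ;

        printf("%4d mV -> %4d MHz stable\n", voltages_mv[i], clock);
    }
    return 0;
}
```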
 
Will this work on my 970? It still runs most things fine but sometimes I could use a little more.
 