Nvidia Turing Product Reviews and Previews: (Super, Ti, 2080, 2070, 2060, 1660, etc.)

TU102_annotated.jpg


https://cdn.nemez.net/die/Turing/
 
Blender 3.0 Benchmarks - Performance Across 19 Different NVIDIA GPUs - Phoronix
Today's article focuses on NVIDIA GPU render performance with Blender 3.0. Unfortunately, AMD HIP support for Blender on Linux didn't make the v3.0 cut and is now targeted for Blender 3.1 next year. As such, on Linux with Blender 3.0 the only form of GPU acceleration is the NVIDIA proprietary driver stack with Blender's CUDA or OptiX back-ends. OpenCL support was removed as part of the "Cycles X" work, so for now Linux users have either CPU-based rendering or NVIDIA support.
 
NVIDIA GeForce RTX 3080 12GB to feature 8960 CUDA cores - VideoCardz.com
The upcoming RTX 3080 12GB will not only feature more memory but also a wider 384-bit memory bus (required for such a configuration) and, as a result, higher bandwidth (912 GB/s vs 760 GB/s on the 10GB variant). Thanks to the faster memory, the card will also be quicker at ETH crypto mining: the official LHR (Lite Hash Rate) figures claim it will be 20% faster, at 52 MH/s compared to 43 MH/s.
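The quoted figures are easy to sanity-check: peak bandwidth is just the bus width in bytes multiplied by the per-pin data rate. A minimal sketch, assuming both variants run GDDR6X at 19 Gbps per pin (the 10GB card's stock speed):

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width / 8 bytes) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gbs(384, 19))  # RTX 3080 12GB: 912.0 GB/s
print(peak_bandwidth_gbs(320, 19))  # RTX 3080 10GB: 760.0 GB/s
```

Both results match the article's numbers exactly, so the 12GB card's bandwidth gain comes entirely from the wider bus, not a memory speed bump.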

Moving on to the CUDA core count: it was already known that this variant features the GA102-220 GPU, but it wasn't clear whether it offers more cores than the 10GB variant. If it does, why even give this card the same name as the 10GB model rather than calling it the RTX 3085 or RTX 3080 SUPER? Well, as it turns out, the RTX 3080 12GB will indeed have more cores: 8960 CUDA cores, to be exact (256 more, +2.9%).
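The core-count delta checks out against the stock RTX 3080 10GB's 8704 CUDA cores (a widely published spec, not stated in the excerpt above):

```python
cores_12gb = 8960  # GA102-220, per the article
cores_10gb = 8704  # stock RTX 3080 10GB (GA102-200)

extra = cores_12gb - cores_10gb
pct = extra / cores_10gb * 100
print(f"{extra} more cores, +{pct:.1f}%")  # 256 more cores, +2.9%
```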
 
So I found this:
Adding to the previous RTX 20 series laptops, NVIDIA is announcing the GeForce RTX 2050 GPU. The GPU features RT (ray tracing) Cores, Tensor Cores, as well as an NVIDIA Encoder, which enables NVIDIA DLSS, Reflex, Broadcast, and more.

What is an NVIDIA encoder?
 
Also, while named 2050, it's actually based on Ampere.
 
Does the TU106-based GeForce GTX 1650 support RT and DLSS?
No, they're disabled on GTX-series cards (not sure whether the Tensor cores are also disabled, since TU117 outright replaces them with FP16 CUDA cores); only RTX-series cards currently support them. If one of the MX models does support those, it would be a first for a non-RTX GeForce.
 
First time I've heard of GPU heat pipes containing fluid. - MSI RTX 2080 Ventus OC

This GPU has sprung a leak! | KitGuru
A friend of mine called me recently in a bit of a panic as his graphics card was suddenly overheating during any GPU-intensive workload. The GPU core temperature would instantly climb to around 85°C (100°C hotspot), the fans would quickly ramp up to maximum RPM, the system would crash in games, and it generally felt like there was a serious issue. So I got the card over to my workshop to take a closer look.

The card in question is an MSI RTX 2080 Ventus OC, which was purchased approximately three years ago, so it's out of warranty.
...
On the second disassembly of the card I was quite shocked to see a small droplet of liquid on top of the GPU die, and when I flipped the cooler over, more liquid was seeping out of the rough spot on the heat pipe that I had noticed earlier. It seems the heat pipe had corroded and eventually given way completely, leaving a pinhole in the copper that leaked fluid once the heat pipe reached a certain temperature. I was able to replicate the problem in the video by using a heat gun to warm up the cold plate, and sure enough, more liquid seeped out of the corroded heat pipe.
 
I haven't seen one fail, but it does happen. If the working fluid inside is lost, they become completely ineffective.

Maybe that pipe just had a defective area that was thin and it eventually cracked from stresses.
 

There is another path as well: looking for "modded" consumer GPUs specifically designed for AI applications. In this case, an eBay listing shows NVIDIA's GeForce RTX 2080 Ti selling on the platform with a whopping 22 GB of VRAM onboard, double the original model's 11 GB. The increase in memory makes sense since AI workloads require a hefty amount of it: high VRAM acts as a high-speed buffer for the massive datasets, complex models, and intermediate calculations that fuel AI algorithms.
...
Moreover, the eBay seller appears to be a professional GPU modder, having sold similar projects before. The ad's description claims that users won't need to worry about driver support or performance stability with the modded NVIDIA GeForce RTX 2080 Ti since the modders have thoroughly tested it. Interestingly, in their testing the GPU generated a detailed 512 by 768 pixel picture in just 5.4 seconds, which is impressive considering the GPU is almost five years old.

Screenshot-2024-02-15-at-3.52.43 PM.png

Right now, the 22 GB model is listed for $500 on eBay, which is high given that original variants are listed at around $300-$400 on the same marketplace. Still, given the extra memory onboard, it might look worthwhile for those who want decent AI performance. Another option would be a used GeForce RTX 3090 with 24 GB of VRAM, but those cost an extra $100-$200 US based on the same eBay listings. So at $500 US, the modded 22 GB RTX 2080 Ti offers the most VRAM per dollar.
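As a rough way to compare these options on memory alone, here is a quick dollars-per-GB calculation using the eBay prices quoted above (the $600-$700 RTX 3090 range is inferred from the "$100-$200 extra" figure, so treat it as an estimate):

```python
# Dollars per GB of VRAM, from the listed/estimated eBay prices above
modded_2080ti = 500 / 22   # modded RTX 2080 Ti, 22 GB
rtx3090_low = 600 / 24     # used RTX 3090, low price estimate
rtx3090_high = 700 / 24    # used RTX 3090, high price estimate

for name, cost in [("2080 Ti 22GB", modded_2080ti),
                   ("3090 (low est.)", rtx3090_low),
                   ("3090 (high est.)", rtx3090_high)]:
    print(f"{name}: ${cost:.2f}/GB")
```

Even at the low end of the 3090 estimate, the modded 2080 Ti comes out cheaper per gigabyte, though the 3090 still wins on raw compute and total capacity.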
 