Nvidia Ampere Discussion [2020-05-14]

Yep, clock vs power is non-linear. Larger chips can thus be more efficient, up to a point, depending on what the primary cap on performance is.
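A back-of-the-envelope way to see it, using the textbook dynamic-power relation (a rough scaling sketch, not measured Ampere numbers):

```latex
% Switching power: P \approx C V^2 f, and near the top of the V/f curve the
% stable voltage rises roughly with frequency, so crudely taking V \propto f:
P \;\approx\; C\,V^{2}f \;\propto\; f^{3}
% Throughput only grows \propto f, so a die with twice the units at ~0.8x the
% clock delivers ~1.6x the work for roughly 2 \times 0.8^{3} \approx 1.0x the
% power, until leakage, die cost or whatever else is the primary cap takes over.
```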
 
3080 vs 2080 Ti (300W) on Nvidia's Asteroids benchmark, which is a mesh shader demo. The difference is huge at 4K resolution.
That's impressive.
I see the most gains come from compute workloads.
Mesh shaders have the same execution model as compute shaders, but with a direct interface to the rasterizers.
Here are other examples where Ampere scales almost linearly with flops, all compute workloads:
https://www.ixbt.com/img//x1600/r30/00/02/33/56/d3d1210nbodygravity64k.png
https://www.ixbt.com/img/r30/00/02/33/56/vray_668771.png
https://www.pugetsystems.com/pic_disp.php?id=63679&width=800
https://babeltechreviews.com/wp-content/uploads/2020/09/Sandra-2020.jpg
In the case of Sandra, image processing and many other kernels are all ~2x faster than the 2080 Ti.
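For a sense of why tests like these track peak FP32 so closely, here's a minimal CUDA sketch of the kind of brute-force n-body gravity kernel the D3D12 test above exercises (my own toy version, not the benchmark's code). It follows the same threadgroup-style execution model the mesh shader point refers to: each thread runs tens of thousands of fused multiply-adds against data that stays in cache, so it is bound almost entirely by FP32 rate rather than by memory.

```cuda
#include <cuda_runtime.h>

// Toy n-body gravity step: one thread per body, brute-force O(N^2) interactions.
// Roughly 20 flops (FMAs plus one rsqrt) per 16-byte position read, and every
// thread reads the same positions, so it is FP32-throughput bound, not bandwidth bound.
__global__ void nbody_step(const float4* __restrict__ pos, float4* __restrict__ accel, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    float4 pi = pos[i];
    float3 acc = make_float3(0.0f, 0.0f, 0.0f);
    const float softening2 = 1e-4f;

    for (int j = 0; j < n; ++j) {
        float4 pj = pos[j];
        float dx = pj.x - pi.x, dy = pj.y - pi.y, dz = pj.z - pi.z;
        float distSqr = dx * dx + dy * dy + dz * dz + softening2;
        float invDist = rsqrtf(distSqr);
        float s = pj.w * invDist * invDist * invDist;   // m_j / r^3
        acc.x += dx * s;  acc.y += dy * s;  acc.z += dz * s;
    }
    accel[i] = make_float4(acc.x, acc.y, acc.z, 0.0f);
}

int main()
{
    const int n = 65536;  // same particle count as the "64k" test above
    float4 *pos, *accel;
    cudaMallocManaged(&pos, n * sizeof(float4));
    cudaMallocManaged(&accel, n * sizeof(float4));
    for (int i = 0; i < n; ++i)
        pos[i] = make_float4(float(i % 64), float((i / 64) % 64), float(i / 4096), 1.0f);

    nbody_step<<<(n + 255) / 256, 256>>>(pos, accel, n);
    cudaDeviceSynchronize();

    cudaFree(pos);
    cudaFree(accel);
    return 0;
}
```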
 

Thus showing what the arch is really for. It's a great deal for anyone doing rendering, or even tooling around with machine learning. Those matrix multiplication numbers are amazing, especially for $700, though maybe that's why the 24GB model costs $1500. Still a good deal; I can see AI researchers snapping them up when they're available.

I mean, if the CDNA leaks are true, it's going to have a hard time competing with this. Maybe it wasn't Nvidia's intention, but they seem to have produced a compute monster at mass-market gaming prices.
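On the matrix multiplication point: for anyone who wants to sanity-check their own card, a cuBLAS SGEMM timing loop along these lines (a generic FP32 probe I'm sketching here, not what Sandra or the reviews actually run) gives a TFLOPS number to hold against the quoted peaks:

```cuda
#include <cstdio>
#include <cublas_v2.h>
#include <cuda_runtime.h>

// Rough FP32 GEMM throughput probe: time C = A*B for large square matrices and
// report 2*n^3*iters / t as TFLOPS. Matrices are left uninitialised because only
// the timing matters here. Build with: nvcc -O2 gemm_probe.cu -lcublas
int main()
{
    const int n = 8192;
    float *dA, *dB, *dC;
    cudaMalloc(&dA, sizeof(float) * n * n);
    cudaMalloc(&dB, sizeof(float) * n * n);
    cudaMalloc(&dC, sizeof(float) * n * n);

    cublasHandle_t handle;
    cublasCreate(&handle);
    const float alpha = 1.0f, beta = 0.0f;

    // Warm-up so the timed loop doesn't pay for lazy initialisation.
    cublasSgemm(handle, CUBLAS_OP_N, CUBLAS_OP_N, n, n, n, &alpha, dA, n, dB, n, &beta, dC, n);
    cudaDeviceSynchronize();

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    const int iters = 10;
    cudaEventRecord(start);
    for (int i = 0; i < iters; ++i)
        cublasSgemm(handle, CUBLAS_OP_N, CUBLAS_OP_N, n, n, n, &alpha, dA, n, dB, n, &beta, dC, n);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);
    double flops = 2.0 * n * n * n * iters;          // promoted to double, no overflow
    double tflops = flops / (ms * 1e-3) / 1e12;
    printf("SGEMM %dx%d: %.1f TFLOPS\n", n, n, tflops);

    cublasDestroy(handle);
    cudaFree(dA); cudaFree(dB); cudaFree(dC);
    return 0;
}
```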
 
Who knows? Maybe they've already got some inventory of 2GB modules for the 3090 launch. We hadn't even heard about G6X a month ago.
This confirms it though - and it also probably makes the 3080 20GB a 2021 product.
Why wouldn't they just clamshell it like 3090?
 
Why wouldn't they just clamshell it like 3090?

It's probably just because of cost. I'm not completely sure, but it'll likely need more pins to connect more devices in x8 mode, as the command pins would double. NVIDIA probably doesn't want to make a different package for the 3080 just for a 20GB SKU, and would prefer to wait for Micron to make 16Gb devices instead.
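For anyone following the capacity arithmetic, here are my own numbers, assuming the usual single 8Gb (1GB) GDDR6X device per 32-bit channel, or two per channel in clamshell:

```latex
\text{3080, 320-bit bus:}\quad 10 \times 1\,\mathrm{GB} = 10\,\mathrm{GB}
\text{3090, 384-bit bus, clamshell:}\quad 12 \times 2 \times 1\,\mathrm{GB} = 24\,\mathrm{GB}
% A 20GB 3080 therefore needs either clamshell on the 320-bit bus
% (10 \times 2 \times 1\,\mathrm{GB}, i.e. the extra pins discussed above)
% or future 16Gb devices: 10 \times 2\,\mathrm{GB} = 20\,\mathrm{GB}.
```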
 
Initial troubles with LG TVs and RTX Ampere cards?

https://www.forbes.com/sites/johnar...st-nvidia-rtx-30-graphics-cards/#3c616208267a

LG OLED TVs Having Issues With Latest Nvidia RTX 30 Graphics Cards
It’s long been feared that the crazy complications of the latest HDMI 2.1 format coupled with the big forward leap in graphics quality being offered by the next generation of PCs and games consoles would cause serious compatibility issues. And unfortunately it seems that one of the most talked about next-gen combinations, Nvidia’s new RTX 30 graphics cards and LG’s 2019 and 2020 OLED TVs, has fallen at the first hurdle.​

Owners of both LG 9 and X series OLEDs are reporting that the TVs aren’t handling the highest quality outputs properly from their new RTX 30 Series cards - even though, on paper at least, they should. In fact, LG has often talked up the gaming potential of its recent OLED TVs, as opened up by the high bandwidth support of the 48Gbps HDMIs on its 2019 9 series OLEDs, and the 40Gbps HDMIs on its 2020 X series OLEDs.

The two main problems being reported appear to be as follows. First, users of both the LG OLED 9 and X series are reporting a complete loss of picture (a black screen) when attempting to apply Nvidia’s G-Sync variable refresh rate technology at 120Hz frame rates. This occurs regardless of which bit depth or resolution you choose.

The second issue seems to be restricted to X series models, and finds the TVs reducing signals output in RGB/120Hz/4:4:4 to 4:2:2 chroma subsampling. This happens irrespective of whether you have G-Sync active or not, or which output resolution you have selected. And it results in notable image degradation - as shown in the examples I was kindly allowed to reproduce here by Twitter user Sixi82.

...​
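For reference, the rough bandwidth arithmetic behind that 4:2:2 fallback (my figures, assuming the standard 4400x2250 total 4K timing and HDMI 2.1's 16b/18b FRL coding, not numbers from the article):

```latex
\text{Pixel rate at 4K120:}\quad 4400 \times 2250 \times 120\,\mathrm{Hz} \approx 1.19\,\mathrm{Gpixel/s}
\text{10-bit RGB / 4:4:4 (30 bpp):}\quad 1.19 \times 30 \times \tfrac{18}{16} \approx 40.1\,\mathrm{Gbit/s}
\text{10-bit 4:2:2 (20 bpp):}\quad 1.19 \times 20 \times \tfrac{18}{16} \approx 26.7\,\mathrm{Gbit/s}
% Full RGB at 4K120 sits right at the limit of these ports, while 4:2:2 leaves
% plenty of headroom, which is presumably where the fallback behaviour comes from.
```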
 
So 18 months on there is still no video card that can properly drive my LG C9 65" with variable refresh rates at 4K...

LG's decision to not add support for AMD freesync on C9's was bad enough (since they were first advertised as an 'adaptive sync' capable TV well before they were G-Sync certified).
 
So 18 months on there is still no video card that can properly drive my LG C9 65" with variable refresh rates at 4K...

LG's decision to not add support for AMD freesync on C9's was bad enough (since they were first advertised as an 'adaptive sync' capable TV well before they were G-Sync certified).

The C9 supports VRR, right? So it should work OK with AMD cards (even if VRR has other problems: https://www.forbes.com/sites/johnar...uesbut-cant-promise-a-quick-fix/#57fba4ef13af )
 
So 18 months on there is still no video card that can properly drive my LG C9 65" with variable refresh rates at 4K...
I was under the impression that 2000 series GPUs already had G-Sync working with the 2019 LG OLEDs.

LG's decision to not add support for AMD freesync on C9's was bad enough (since they were first advertised as an 'adaptive sync' capable TV well before they were G-Sync certified).
Actually this is AMD's fault, not LG's. AMD promised it would support VRR on HDMI. Freesync and VRR are not precisely the same thing.

On the LG 2020 TVs, Freesync only works when you turn off some other feature on the TV (I can't actually remember what that feature is, sorry) because Freesync conflicts with the "flag" that is normally used for that feature.

The fact that consoles with AMD GPUs have VRR support while PCs with AMD GPUs don't seems to be down to AMD, not LG.
 
I was under the impression that 2000 series GPUs already had G-Sync working with the 2019 LG OLEDs.
There seems to be a driver bug which affects all VRR-enabled GPUs right now. Hopefully it will be fixed on the NV side and won't require firmware updates from LG.
 
That's impressive.
I see the most gains come from compute workloads.
Mesh shaders have the same execution model as compute shaders, but with a direct interface to the rasterizers.
Here are other examples where Ampere scales almost linearly with flops, all compute workloads:
https://www.ixbt.com/img//x1600/r30/00/02/33/56/d3d1210nbodygravity64k.png
https://www.ixbt.com/img/r30/00/02/33/56/vray_668771.png
https://www.pugetsystems.com/pic_disp.php?id=63679&width=800
https://babeltechreviews.com/wp-content/uploads/2020/09/Sandra-2020.jpg
In the case of Sandra, image processing and many other kernels are all ~2x faster than the 2080 Ti.
Damn, I'd totally forgotten about ixbt:

https://www.ixbt.com/3dv/nvidia-geforce-rtx-3080-review-part1.html

Really excellent graphics card reviews, not the weak sauce of pretty much all English language sites. Translation to English is so good these days, too!

Results from the Perlin noise test, which historically was very useful for pure compute comparisons, show a significant problem for Ampere: it's only 12% faster. It's a long shader with a fair amount of instruction-to-instruction dependency; at least that's what the 3DMark06 version shows, which is about 500 instructions. I don't have the source code for the 3DMark Vantage Perlin noise test though...
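To illustrate the kind of bottleneck being suggested (a contrived CUDA sketch of my own, not the 3DMark shader): when every instruction depends on the previous result, per-thread throughput is set by FP32 latency times chain length, so doubling the FP32 units per SM buys little unless there's enough independent work in flight to fill them.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Serial dependency chain: each fmaf needs the previous result, so per-thread
// throughput is latency-bound rather than FP32-unit-bound. The extra Ampere
// FP32 pipes only help when independent warps (or independent accumulators
// within a thread) can run alongside the chain.
__global__ void dependent_chain(float* out, float seed, int iters)
{
    float x = seed + threadIdx.x * 1e-6f;
    for (int i = 0; i < iters; ++i)
        x = fmaf(x, 1.0000001f, 0.5f);   // each step waits on the last
    out[blockIdx.x * blockDim.x + threadIdx.x] = x;
}

int main()
{
    const int blocks = 1024, threads = 256, iters = 1 << 16;
    float* out;
    cudaMalloc(&out, sizeof(float) * blocks * threads);

    dependent_chain<<<blocks, threads>>>(out, 1.0f, iters);
    cudaDeviceSynchronize();

    float sample = 0.0f;
    cudaMemcpy(&sample, out, sizeof(float), cudaMemcpyDeviceToHost);
    printf("sample result: %f\n", sample);

    cudaFree(out);
    return 0;
}
```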
 