Let's wait for the real boost clocks so we can work out the actual IPC increase. It would certainly explain the higher power consumption vs. the RTX 2070. I just don't think Ampere's CUDA core is as huge a jump from Turing as Nvidia wants us to believe; there are just more of them. Nothing wrong with that per se.
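Once real sustained clocks are published, the "more cores vs. more per-core IPC" question is a one-liner to sanity-check: normalize performance by shader count times clock. A minimal sketch; the function is mine and every number in it is a placeholder, not a measured value:

```python
# Normalize performance by (shader count x clock) to isolate per-core,
# per-clock throughput ("IPC" in the loose sense used above).
# All figures below are placeholders, not measured values.

def per_core_per_clock(perf: float, cores: int, clock_ghz: float) -> float:
    """Relative throughput per CUDA core per clock (arbitrary units)."""
    return perf / (cores * clock_ghz)

# Hypothetical: ~70% more performance from ~2.5x the cores at a similar
# clock would mean *lower* per-core throughput, i.e. "just more of them".
old = per_core_per_clock(perf=1.00, cores=2304, clock_ghz=1.80)  # 2070-like
new = per_core_per_clock(perf=1.70, cores=5888, clock_ghz=1.73)  # 3070-like
print(f"per-core, per-clock ratio: {new / old:.2f}x")  # ~0.69x
```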
Mods, has my post been deleted for some reason, or is there a glitch somewhere?

Moved to the DLSS discussion thread. You should see alerts from XenForo about your posts being moved.
Andrew Burnes (Admin, GeForce.com):
Hi. RTX IO is supported on all GeForce RTX Turing and NVIDIA Ampere-architecture GPUs.
It is interesting that ASUS's press release mentions half as many CUDA cores as NVIDIA does.
This: NEW SM – 2X FP32 THROUGHPUT

... might just mean that NVIDIA counts that change as a doubling of the CUDA core count, never mind the SM count. INT32 replaced and quickly forgotten?
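If so, the halved figure falls straight out of the two counting conventions. A back-of-the-envelope sketch, assuming the widely reported SM layout (Turing: 64 FP32 + 64 INT32 lanes per SM; Ampere: 64 dedicated FP32 lanes plus 64 lanes that run FP32 or INT32) and using the RTX 3080's 68 SMs as the example:

```python
# Two ways to count "CUDA cores" on an Ampere SM.
SMS = 68                          # RTX 3080 (GA102-based) as the example

nvidia_count = SMS * (64 + 64)    # both datapaths counted as FP32 cores
turing_style = SMS * 64           # only the dedicated FP32 path counted

print(nvidia_count)   # 8704 -> NVIDIA's spec-sheet figure
print(turing_style)   # 4352 -> exactly half, i.e. an ASUS-style number
```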
Well, that was a fun couple of hours, and it seems to me that a ~70% performance gain for the top two cards is better than anyone was hoping for.

We don't know the real perf or power consumption characteristics, but yeah. The 70% figure is a bit more than the "nV standard 60%" for "node jump" architectures. Maybe that's why the TDP is seemingly higher.
Not a match.

1.9x perf/Watt, though...

2080 Ti was 280W.

Not a match:
2080 Ti: 250 watts.
Same performance 3070: 220 watts.
That's less than the theoretical per-node wattage reduction; it seems efficiency got worse.
The price reduction is another hint that AMD could have a winner with RDNA2.
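Taking the wattages quoted in this exchange at face value, the iso-performance efficiency gain is easy to put a number on, and it is well short of 1.9x; NVIDIA's 1.9x perf/Watt figure reportedly comes from comparing at a fixed, lower performance point on the power curve rather than at the cards' rated TDPs. A quick sketch using only the numbers above and assuming TDP approximates real draw:

```python
# Perf/Watt gain implied by "3070 matches 2080 Ti" at the quoted TDPs.
perf_ratio = 1.0     # "same performance", per the post above
tdp_2080ti = 250     # watts (use 280 if you take the EU product-page figure)
tdp_3070 = 220       # watts

gain = perf_ratio * tdp_2080ti / tdp_3070
print(f"{gain:.2f}x perf/Watt")   # ~1.14x (or ~1.27x with 280 W)
```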
You don't think AMD can make a GPU 50% faster than the 5700 XT?

I don't see AMD even beating the 3070. Going from the 5700 XT to 2080 Ti+ in one generation jump, on the same process? And it will be their first RT GPU...
The 2080 Ti was 280W: https://www.nvidia.com/es-es/geforce/graphics-cards/rtx-2080-ti/
The price reduction is for the 3070 only, which is like a 2080 Ti plus 15% or so.
A winner over that? Sure.
A winner over the 40 TF 3090, especially in cases with RT (and DLSS) where the performance actually counts? I kinda doubt that.
The real question is: will they be able to beat the 3080?