Man Lovelace is going to be such an awesome upgrade from my current 1070. Can't wait!
Ampere is on 8nm; Turing was on 12nm. The RTX 3080 has 68 SMs on Samsung's 8nm node.
It still doesn't look right for an AD104 to be on par with the 3090 while somehow consuming 400W. The wattages are likely completely out of place. I'd bet people are looking at the PCIe 5.0 power limit as if it must be used at 100% from day one just because it's there.
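To illustrate that point, here's a rough sketch of where a scary wattage figure can come from if you add up connector *limits* instead of actual draw. The slot and connector ceilings are the published PCIe specs; the 300W board draw is purely an assumed number for a hypothetical AD104 card:

```python
# Rough sketch: summing power *limits* vs. realistic draw.
# The PCIe slot is specced for 75 W; a single 12VHPWR (PCIe 5.0)
# connector is specced for up to 600 W. These are ceilings, not
# measurements of what a card actually pulls.
slot_limit_w = 75
pcie5_connector_limit_w = 600

max_deliverable_w = slot_limit_w + pcie5_connector_limit_w
print(f"Max deliverable: {max_deliverable_w} W")  # 675 W

# A card can expose that headroom without using it. Assuming a
# hypothetical AD104 board that actually draws ~300 W:
assumed_board_draw_w = 300
print(f"Unused headroom: {max_deliverable_w - assumed_board_draw_w} W")
```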
Performance in rasterized games isn't limited by geometry or compute throughput. So I don't know how you improve performance in these games by 2x when even today Ampere isn't fully utilized. I think NVIDIA should ignore it and go all in on spending transistors on ray tracing, compute, and DL.
It's likely that, going forward, ray tracing will be the main "console+" graphics upgrade we get on the PC.
With a performance roadmap up to 24 Gb/s, GDDR6X is ready for data-hungry applications of the future.
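To put that 24 Gb/s roadmap figure in context, peak bandwidth is just the per-pin rate times the bus width. A quick sketch, where the 384-bit bus is my assumption (a 3090-class configuration), not something from the roadmap:

```python
# Peak memory bandwidth = per-pin data rate * bus width / 8 bits-per-byte.
# 24 Gb/s is the per-pin rate from the GDDR6X roadmap; the 384-bit
# bus width is an assumed 3090-class configuration.
per_pin_gbps = 24
bus_width_bits = 384

bandwidth_gb_s = per_pin_gbps * bus_width_bits / 8
print(f"Peak bandwidth: {bandwidth_gb_s:.0f} GB/s")  # 1152 GB/s
```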
And, while I'm here, I just don't get the power consumption and performance numbers being rumoured. Going from Samsung 8nm to TSMC 5 (or 4) should be a massive boon for both performance and power consumption.
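The intuition behind that is first-order CMOS scaling: dynamic power goes roughly as C·V²·f, so a node that lets you drop voltage even modestly cuts power quadratically while still leaving room to raise clocks. A toy example, where every number is an illustrative assumption rather than an actual node spec:

```python
# First-order dynamic power model: P ~ C * V^2 * f.
# All values below are made-up assumptions for illustration,
# not Samsung 8nm or TSMC 5nm specifications.
def dynamic_power(c_rel, v, f_ghz):
    """Relative dynamic power for relative capacitance, voltage, clock."""
    return c_rel * v**2 * f_ghz

p_old = dynamic_power(c_rel=1.00, v=1.00, f_ghz=1.9)  # assumed 8nm baseline
p_new = dynamic_power(c_rel=0.75, v=0.90, f_ghz=2.5)  # assumed 5nm shrink

print(f"Relative power: {p_new / p_old:.2f}x at {2.5 / 1.9:.2f}x the clock")
# -> roughly 0.80x the power at 1.32x the clock under these assumptions
```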
It'll be interesting to see if the newer GDDR6X speeds reduce power consumption, too.