@trinibwoy I wonder if Nvidia is pushing for 4K and now 8K because it's the easiest way to increase utilization. It does seem weird to essentially double the FP32 units in Ampere when utilization is so low, especially at common resolutions like 1080p and 1440p.
They're also marketing to the high-refresh-rate, low-latency crowd though, so you would think that efficiency at lower resolutions is also a priority.
What about utilization during RT workloads?
Pretty crap based on what I've seen so far, but Nvidia said as much in the Ampere whitepaper. I've only looked at Cold War, Star Wars and Atomic Heart though. Overall SM utilization is usually somewhere around 20% during RT passes, with the INT ALU pipe seeing lots of action.