I was wondering about this after a post on what it'll take to get 4K120 high-quality RT on consoles. My first thought was 'Moore's Law has ended', so I checked die sizes, but they haven't massively increased over the past few generations of GPUs, where I was expecting all the performance to come from bigger silicon (although I might not be comparing like with like). Lithography has still progressed. But then you have these massive cards, so I'm guessing the limiting factor is elsewhere? Like, sure, the dies are 400 mm^2, but the heat is bonkers and the rest of the packaging is huge?

LOL. I mean, if this is the future, where the cost of silicon is so high that GPU manufacturers are pairing high-frequency silicon with massive cooling, I doubt newly released consoles are ever going to come close to modern discrete cards again. Shrinking isn't getting cheaper, while the transition to smaller nodes is taking longer.
Can anyone spell out for me where we are in terms of lithographic progression, and what the limiting factors on GPU cost and power are going forward? Is a 'next gen' console in the next 5 years going to need to be twice the size and draw 4x the power? Is it going to take 10 or 12 years until chip manufacturing can actually shrink 'next gen' performance down into a current-gen-sized box? Or are we going to have to wait for a revolution in computing strategy?
Hmmm, is there a DF investigative article here? Some stats comparing high-end PC GPU die sizes and thermals with consoles over the years could show whether there's a trend.
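You could do a quick-and-dirty version of that comparison yourself. Here's a rough Python sketch using approximate launch figures I recall from public spec sheets (treat the numbers as illustrative, not authoritative; console figures are measured whole-system draw, not chip TDP, so it's not a perfect apples-to-apples comparison):

```python
# Rough trend check: flagship GPU die size and board power vs. console SoCs.
# Figures are approximate launch specs from memory; treat as illustrative.
# Note: console power is whole-system wall draw, GPU power is board TDP,
# so the comparison is loose.
chips = [
    # (year, name, die area in mm^2, power in W, type)
    (2013, "PS4 (Liverpool SoC)", 348, 140, "console"),
    (2014, "GTX 980 (GM204)",     398, 165, "gpu"),
    (2016, "GTX 1080 (GP104)",    314, 180, "gpu"),
    (2018, "RTX 2080 (TU104)",    545, 215, "gpu"),
    (2020, "PS5 (Oberon SoC)",    308, 200, "console"),
    (2020, "RTX 3080 (GA102)",    628, 320, "gpu"),
    (2022, "RTX 4090 (AD102)",    608, 450, "gpu"),
]

for year, name, area, power, kind in sorted(chips):
    # W/mm^2 is a crude proxy for how hard the silicon is being pushed.
    print(f"{year}  {name:<22} {area:>4} mm^2  {power:>3} W  "
          f"{power / area:.2f} W/mm^2  [{kind}]")
```

Even with rough numbers, the watts-per-mm^2 column tells the story: GPU die sizes have bounced around the same 300-650 mm^2 range for a decade, while board power has roughly tripled since Maxwell, which fits the "frequency plus massive cooling" theory above.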