Guess:
5080: 4080 +27%, $1,200 (still slower than a 4090 at 4K, for reference)
5090: 24GB, 520W, 4090 +33%, $1,800
5090 AI: 32GB+, 600W, 4090 +48%, $2,500+
The 5090 AI will mostly be for existing lower-power datacenters to run AI inference workloads, but it'll "technically" be available for consumers to purchase. That's if you can even fit a ginormous quad-slot card into your case without blowing up your power supply on a 600W GPU, assuming you can even cool it, after you've already spent an extra $2,500 on what amounts to a 9% overclock.
Maybe a 24GB 5080 Ti will come out for $1,200 after the 5080 drops to $1,000 sometime in early 2026?
Anyway, they'll claim their new stochastic anisotropic filtering units are the shit, despite the effect costing less than 1ms to replicate on compute shaders (rough sketch at the bottom of this post). I shouldn't have been that mean about the in-RAM neural net compression, but claiming you came up with something cool, when what you're really doing is forcing gamedevs to spend money and time on your platform-exclusive feature so your corporate overlords can save $10 of RAM per unit instead of just upping the minimum to 12GB+, is exactly the time to ask "are we the baddies?" Nvidia is inadvertently pushing toward killing PC gaming because it ups their profit margin, and that's just depressing (not that AMD's "just below Nvidia" pricing strategy is helping).
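For the "replicate it on compute" claim above, here's roughly what I mean: a minimal CUDA sketch, kernel only, host-side texture setup omitted. The helper names are made up, and the UV derivatives are passed per-dispatch for simplicity (a real renderer would get them per-pixel from the rasterizer). The whole trick is taking a few jittered bilinear taps along the pixel footprint's major axis instead of a full fixed-tap loop, and letting TAA eat the leftover noise:

```cuda
#include <cuda_runtime.h>

// Cheap per-pixel hash for the stochastic jitter (hypothetical helper).
__device__ float hash(unsigned x, unsigned y, unsigned frame) {
    unsigned h = x * 374761393u + y * 668265263u + frame * 2246822519u;
    h = (h ^ (h >> 13)) * 1274126177u;
    return (h ^ (h >> 16)) * (1.0f / 4294967295.0f);
}

// One thread per output pixel. 'tex' is a bilinear-filtered texture object.
__global__ void stochasticAniso(cudaTextureObject_t tex,
                                float4* out, int width, int height,
                                float dudx, float dvdx, float dudy, float dvdy,
                                int taps, unsigned frame)
{
    int px = blockIdx.x * blockDim.x + threadIdx.x;
    int py = blockIdx.y * blockDim.y + threadIdx.y;
    if (px >= width || py >= height) return;

    float u = (px + 0.5f) / width;
    float v = (py + 0.5f) / height;

    // Major axis of the pixel footprint in UV space: whichever screen-space
    // derivative vector is longer, i.e. the anisotropy direction.
    float lx = dudx * dudx + dvdx * dvdx;
    float ly = dudy * dudy + dvdy * dvdy;
    float ax = (lx > ly) ? dudx : dudy;
    float ay = (lx > ly) ? dvdx : dvdy;

    // Instead of e.g. 16 fixed taps, take a few jittered bilinear taps
    // along the axis; a TAA pass is assumed to resolve the noise.
    float4 acc = make_float4(0.0f, 0.0f, 0.0f, 0.0f);
    for (int i = 0; i < taps; ++i) {
        float t = (i + hash(px, py, frame + i)) / taps - 0.5f;
        float4 s = tex2D<float4>(tex, u + t * ax, v + t * ay);
        acc.x += s.x; acc.y += s.y; acc.z += s.z; acc.w += s.w;
    }
    float inv = 1.0f / taps;
    out[py * width + px] = make_float4(acc.x * inv, acc.y * inv,
                                       acc.z * inv, acc.w * inv);
}
```

At 2 to 4 taps per pixel this is a trivially cheap full-screen pass on any modern GPU, which is the point: it's not something that obviously needs dedicated hardware units to sell you on.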