Nvidia GeForce RTX 4080 Reviews

A lot of Microcenter stuff is pickup only, especially high value equipment. They don't ship a whole lot of gear.

I just checked our local Microcenter and there's like 40 units of 4080 GPUs available... lol no one wants these things.
Too expensive. I bet they would sell really well at $800–$1k.
 
Microcenters exist only where there's a large enough population to support them. During the crypto craze they couldn't keep any stock, everything sold immediately locally.
This is not true. There are plenty of cities all around the country with more than enough population to support a Microcenter.
 
This is not true. There are plenty of cities all around the country with more than enough population to support a Microcenter.
You're looking at it the wrong way. What Malo said does not conflict with what you said. He did not say they exist in every large city. He said they only exist where the size of the city supports it.

The converse was not claimed.
A implies B.
B does not imply A.
 
You're looking at it the wrong way. What Malo said does not conflict with what you said. He did not say they exist in every large city. He said they only exist where the size of the city supports it.

The converse was not claimed.
A implies B.
B does not imply A.
Fair enough. I took it as an implication that Microcenters exist in all the worthwhile places.
 
Are we sure that the $1,200 price for the 4080 is bad, btw? I mean, if a 7900 XTX at $999 is behind in RT and without DLSS 2/3, you can argue $200 is the price to pay for that performance?
 
Are we sure that the $1,200 price for the 4080 is bad, btw? I mean, if a 7900 XTX at $999 is behind in RT and without DLSS 2/3, you can argue $200 is the price to pay for that performance?

It's priced badly against the RTX 4090, which I would argue is its actual primary competitor.

It's more like the price for the 7900 XT and XTX is just as bad...

Is it? The 7900 XTX seems fine. As long as the numbers even somewhat hold up, it will deliver generational perf improvements and perf/$ against its segment, just like the RTX 4090 does. It also serves as a halo for AMD fans and Nvidia haters (I don't think people should pretend those two groups don't exist).

The 7900 XT, though, runs into the same problem as the 4080: it subverts the expectation of improving perf/$ as you go down the stack, and as a by-product it means perf/$ improvements are lacking compared to the previous gen at the sub-halo price points. It's around 85% of the hardware of the 7900 XTX for 90% of the price of the 7900 XTX. Given that the nominal gap is even smaller, why would you get it over the 7900 XTX if you were given the choice?
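The perf/$ logic above can be sanity-checked with quick arithmetic. This is a rough illustration using the ~85% hardware / ~90% price figures from the post (illustrative numbers, not measured benchmarks):

```python
# Rough perf-per-dollar comparison of the 7900 XT vs the 7900 XTX,
# using the approximate figures quoted in the post above.

XTX_PRICE = 999          # USD MSRP
XT_PRICE = 899           # USD MSRP (~90% of the XTX's price)
XT_RELATIVE_PERF = 0.85  # ~85% of the XTX's hardware/performance

xtx_perf_per_dollar = 1.0 / XTX_PRICE
xt_perf_per_dollar = XT_RELATIVE_PERF / XT_PRICE

# Ratio below 1.0 means the XT delivers worse perf/$ than the card
# above it in the stack, inverting the usual expectation that value
# improves as you move down the product line.
ratio = xt_perf_per_dollar / xtx_perf_per_dollar
print(f"7900 XT perf/$ relative to 7900 XTX: {ratio:.2f}")
```

With those figures the XT comes out at roughly 0.94x the XTX's perf/$, so the cheaper card is actually the worse value.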
 
Raster is very good too...
Sure, but you are paying more to potentially get less rasterisation performance. You could make the same argument against AMD generally for ray tracing, so it does depend on consumer preference. However, last generation Nvidia were able to equal AMD in rasterisation while offering much better ray tracing performance, so in that sense it is a regression.
 
so in that sense it is a regression.
Last gen, AMD didn't have a ray tracing competitor, but it had a semi-competitor to the 3090 in rasterization (the 6900 XT was faster at 1440p but slower at 2160p). This gen, AMD has neither a rasterization competitor to the 4090 nor a ray tracing competitor to the 4080. The regression is on the AMD side this time; they are back to the days of Vega vs Pascal (Vega 64 vs 1080 Ti).
 
Last gen, AMD didn't have a ray tracing competitor, but it had a semi-competitor to the 3090 in rasterization (the 6900 XT was faster at 1440p but slower at 2160p). This gen, AMD has neither a rasterization competitor to the 4090 nor a ray tracing competitor to the 4080. The regression is on the AMD side this time; they are back to the days of Vega vs Pascal (Vega 64 vs 1080 Ti).
Yes, that's right, but we're talking about the value proposition of the 4080, not the 4090.
 
Last gen, AMD didn't have a ray tracing competitor, but it had a semi-competitor to the 3090 in rasterization (the 6900 XT was faster at 1440p but slower at 2160p). This gen, AMD has neither a rasterization competitor to the 4090 nor a ray tracing competitor to the 4080. The regression is on the AMD side this time; they are back to the days of Vega vs Pascal (Vega 64 vs 1080 Ti).
I don't think anyone is going to be silly enough to compare a halo product to something like that.

Then again, these 80-series GPUs from both vendors are slowly becoming halo products over time as well.
 
It's more like the price for the 7900 XT and XTX is just as bad...
A supposed analysis of why these GPUs are relatively expensive was posted by Coreteks. I don't know how accurate it is, but it sounds plausible to a certain degree.

Multi-chip designs in general carry an area overhead compared to a monolithic die, maybe 10% to 15%.

RDNA3 is on an interposer (unlike Zen, which is on an organic substrate), so doubling the size of the compute die would exceed the reticle limit of the silicon, and costs go way up in that case. The problem is that an interposer is needed for an ultra-fast chiplet-to-chiplet interconnect.

Interposers also make it difficult to make the design modular (again unlike Zen, with its organic substrate).

For AMD to use chiplets to offer performance similar to NVIDIA's high end, they would have to build a huge interposer, with the added area overhead of chiplets and a huge added cost, which makes it infeasible for the consumer market (economically speaking).

 
Are we sure that the $1,200 price for the 4080 is bad, btw? I mean, if a 7900 XTX at $999 is behind in RT and without DLSS 2/3, you can argue $200 is the price to pay for that performance?
From the 3080 to the 4080 it's about 1.5x raster perf (a bit more with RT) with the price jumping 1.7x, two years after the 3080. The 4090 has the excuse of being the absolute fastest you can buy right now, and it makes sense for some professional/CUDA applications too (e.g. video editing/processing/upscaling, ML stuff), so even if it's bad for the "average gamer" it makes sense for some others.

The 4080 seems to exist to make people spend more on the 4090, given that the 4090 is about 1.33x faster for 1.33x the price; the 4080 really only makes sense for people who want the fastest Nvidia card but can't quite afford the 4090. I'm not sure I've ever before recommended saving up for the most expensive option if possible, but it's true this gen. If not, get a 3080, or a 3090 if you need 24GB of VRAM.

As for the 7900 XTX, more and more games have upscaling, so native RT perf is less of an issue even with a big architectural RT gap. There are Nvidia features people might want, so again: go 4090, or 3080/3090 for better value, and ignore the 4080 and 7900 XT.
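The gen-on-gen value argument above can be made concrete with the multipliers from the post and the launch MSRPs ($699 for the 3080, $1,199 for the 4080, $1,599 for the 4090). A rough sketch, treating the quoted 1.5x and 1.33x perf figures as given:

```python
# Gen-on-gen and intra-gen value check using the rough multipliers
# from the post above. Perf multipliers are the post's estimates,
# not measured benchmarks; prices are launch MSRPs.

perf_4080_vs_3080 = 1.5
price_4080_vs_3080 = 1199 / 699   # ~1.72x

# Below 1.0 means a perf/$ regression versus the previous generation.
perf_per_dollar_change = perf_4080_vs_3080 / price_4080_vs_3080
print(f"4080 perf/$ vs 3080: {perf_per_dollar_change:.2f}x")

perf_4090_vs_4080 = 1.33
price_4090_vs_4080 = 1599 / 1199  # ~1.33x

# Near 1.0: the 4090 costs 1.33x more for 1.33x the perf, so perf/$
# is roughly flat going up the stack instead of getting worse, which
# is what makes the 4080 look like an upsell to the 4090.
perf_per_dollar_up_stack = perf_4090_vs_4080 / price_4090_vs_4080
print(f"4090 perf/$ vs 4080: {perf_per_dollar_up_stack:.2f}x")
```

On these numbers the 4080 delivers noticeably worse perf/$ than the 3080 it replaces, while stepping up to the 4090 costs essentially nothing in perf/$ terms.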

A supposed analysis of why these GPUs are relatively expensive was posted by Coreteks. I don't know how accurate it is, but it sounds plausible to a certain degree.
I know it's not fair, but if I'm not mistaken they thought big RDNA2 would be 2080 Ti-level performance and championed the "RT co-processor" theory for Ampere because of the fan on both sides of the cooler, so thank you for the video but I'll pass.
 
Are they using an organic substrate then? Or something similar to the Fury X/Vega 64?
I referenced the tech here:


and you can read the page I linked which goes into quite a lot of detail.

The entire point of it is that it solves the physical and cost problems associated with the silicon interposers that have traditionally been used for HBM. Indeed, it's entirely possible to implement HBM with this technology; there appears to be no need for silicon interposers for HBM in the future.
 