Nvidia GeForce RTX 4080 Reviews

I understand that there are outliers (GT200, Hawaii, all the HBM stuff etc.), but this is not one of them. Looking at the memory bus is a useful way to determine where a product belongs in the stack.

Bus width isn’t that useful any more as a price indicator with the humongous caches.

I'm in full agreement with HUB here. The 4080 is just a horribly priced part. It offers lower value than the halo product while likely being terrible value vs the 4070Ti assuming NV don't bollock up the pricing on that too. And that's without considering the AMD parts.

I can see myself potentially holding my nose and getting a 4070Ti in Jan provided it's $799 or lower and we don't get ripped off on the pricing in the UK. It'll still feel dirty spending so much on an x070 class product but at this stage I don't think things are going to get any better than that. AMD is out for me unfortunately as DLSS and RT performance are major selling points for me. FSR2 is actually great in the games I'm currently using it on, but with DLSS being even better and, more importantly, more common, that's the way I have to go.

Yeah it’s quite an accomplishment to have worse perf/$ than the flagship. My question was more in relation to the phenomenon of new products selling out immediately. If they don’t sell out immediately doesn’t it mean it’s priced appropriately based on simple supply/demand economics?

The irony is that we’re basically arguing that if a product sells out and stays out of stock for weeks/months then it’s priced correctly which is academically inaccurate. That’s why I want to see the price trajectory over a few months. It should drop sharply when demand dries up.
 
Bus width isn’t that useful any more as a price indicator with the humongous caches.
This begs the question, why use up that super expensive die space on gigantic caches while simultaneously shrinking the memory bus on the entire stack outside of the very top end? Seems like they are creating a problem (not enough bandwidth) and solving it in the most expensive way possible (bigger die). Or maybe the bigger die is still cheaper than an extra couple of memory channels?
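Purely for the sake of argument, the kind of back-of-envelope comparison I'm imagining looks like the sketch below. Every number in it is a made-up placeholder, not real foundry or DRAM pricing, so plug in real figures and the answer could easily flip; the point is only the structure of the trade-off (cache costs die area, a wider bus costs PHY area plus extra memory chips and board routing):

```python
# Hypothetical back-of-envelope: more on-die cache vs. a wider memory bus.
# Every figure below is a placeholder for illustration only.
COST_PER_MM2 = 0.20        # $ per mm^2 of good die on a leading-edge node (made up)
SRAM_MM2_PER_MB = 1.0      # die area per MB of cache incl. overhead (made up)
PHY_MM2_PER_32BIT = 6.0    # die area for one extra 32-bit GDDR PHY (made up)
GDDR_CHIP_COST = 12.0      # $ per extra GDDR6 package on the board (made up)

def extra_cache_cost(extra_mb: float) -> float:
    """Die cost of adding `extra_mb` of cache."""
    return extra_mb * SRAM_MM2_PER_MB * COST_PER_MM2

def wider_bus_cost(extra_32bit_channels: int) -> float:
    """Die cost of extra PHYs plus the extra memory chips they drive."""
    die = extra_32bit_channels * PHY_MM2_PER_32BIT * COST_PER_MM2
    board = extra_32bit_channels * GDDR_CHIP_COST  # PCB routing/power not counted
    return die + board

print(f"+32 MB cache        : ~${extra_cache_cost(32):.2f}")
print(f"+2 x 32-bit channels: ~${wider_bus_cost(2):.2f}")
```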
 
Exactly, architectural differences matter.
Sorry, I was being kind of a dick. Just saying, the fact that new GPUs are faster than old ones doesn't really invalidate my argument. However the increased cache argument does make sense.
 
This begs the question, why use up that super expensive die space on gigantic caches while simultaneously shrinking the memory bus on the entire stack outside of the very top end? Seems like they are creating a problem (not enough bandwidth) and solving it in the most expensive way possible (bigger die). Or maybe the bigger die is still cheaper than an extra couple of memory channels?

If it's a CPU then the answer would be a no-brainer. Bigger cache is always better, for most workloads anyway.
GPUs used to like more bandwidth because their workloads are massively parallel and less latency sensitive (or more accurately, it's easier to hide latency through parallelism). However, GPU workloads are now more similar to CPU workloads than ever, and they're probably not as parallel as they used to be. The number of pixels and triangles on screen does not increase that much, as many people seem to want better pixels rather than more pixels. Raytracing is also likely to be more latency sensitive.
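To put a rough number on "hiding latency through parallelism" (all figures below are illustrative, not measurements of any particular GPU): by Little's law the data you need in flight to keep a memory bus busy is roughly bandwidth × latency, and a GPU gets that concurrency from running thousands of threads where a CPU core cannot:

```python
# Little's law applied to memory: bytes_in_flight = bandwidth * latency.
# The numbers are illustrative, not taken from any specific GPU.
bandwidth_bytes_per_s = 700e9   # ~700 GB/s of DRAM bandwidth
latency_s = 400e-9              # ~400 ns average memory latency under load
transaction_bytes = 128         # size of one memory transaction

bytes_in_flight = bandwidth_bytes_per_s * latency_s
outstanding_requests = bytes_in_flight / transaction_bytes

print(f"~{bytes_in_flight / 1024:.0f} KiB must be in flight "
      f"(~{outstanding_requests:.0f} outstanding {transaction_bytes}B requests) "
      "to saturate the bus; feasible with thousands of GPU threads, "
      "not with one CPU core.")
```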
 
If it's a CPU then the answer would be a no-brainer. Bigger cache is always better, for most workloads anyway.
GPUs used to like more bandwidth because their workloads are massively parallel and less latency sensitive (or more accurately, it's easier to hide latency through parallelism). However, GPU workloads are now more similar to CPU workloads than ever, and they're probably not as parallel as they used to be. The number of pixels and triangles on screen does not increase that much, as many people seem to want better pixels rather than more pixels. Raytracing is also likely to be more latency sensitive.
So then for a fixed budget, more cache and smaller bus is better? I guess my confusion comes from the fact that in the past, when discussing cache on GPUs it was almost always in the context of increasing effective memory bandwidth (e.g. Infinity Cache).

Also, how much harder is it to drive GDDR6X than GDDR6? Does the cost increase significantly?
 
So then for a fixed budget, more cache and smaller bus is better? I guess my confusion comes from the fact that in the past, when discussing cache on GPUs it was almost always in the context of increasing effective memory bandwidth (e.g. Infinity Cache).

Also, how much harder is it to drive GDDR6X than GDDR6? Does the cost increase significantly?

Well, you can always see cache in light of 'expanding bandwidth' or 'reducing latency' or both. Naturally cache tends to have more internal bandwidth than external memory, but at least for CPU it's much less important than latency. That's why you see measurements of latency in CPU reviews (for cache) but generally not bandwidth.
For GPUs it's of course a bit different, as their workload characteristics are different, but I think they are getting closer. Caches still have more bandwidth (as demonstrated by AMD's presentation for RDNA3), but to benefit from that additional bandwidth you need good data reusability, which there wasn't much of in older generations of GPU workloads (other than textures, of course, which are covered by the texture cache).
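As a rough sketch of the 'expanding bandwidth' view (numbers below are only illustrative): if a fraction h of requests hit in the cache, only (1 - h) of the traffic reaches DRAM, so the same bus behaves as if it were 1/(1 - h) times wider, assuming the cache itself isn't the bottleneck and the hit rate actually holds up for the workload:

```python
# Toy model of a big cache as "bandwidth amplification".
# Bandwidth figures are illustrative round numbers, not any specific SKU.
def effective_bandwidth(dram_bw_gbs: float, hit_rate: float) -> float:
    """With hit rate h, only (1 - h) of requests reach DRAM, so the bus
    looks 1 / (1 - h) times wider (cache bandwidth assumed not limiting)."""
    return dram_bw_gbs / (1.0 - hit_rate)

narrow_bus = 500.0   # GB/s, e.g. a 192-bit bus with fast GDDR6X (illustrative)
wide_bus = 760.0     # GB/s, e.g. a 320-bit bus with plain GDDR6 (illustrative)

for hit_rate in (0.30, 0.45, 0.60):
    eff = effective_bandwidth(narrow_bus, hit_rate)
    print(f"hit rate {hit_rate:.0%}: ~{eff:.0f} GB/s effective on the narrow bus "
          f"vs {wide_bus:.0f} GB/s raw on the wide one")
```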

As for the complexity of GDDR6X, the most important difference is probably that it uses PAM4 signalling instead of NRZ. I'd imagine that PAM4 is not that expensive at current transistor densities, but more error correction is probably required (though GDDR6 also has ECC on the interface). My understanding is that GDDR6X uses less power per unit of bandwidth.
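A minimal sketch of the NRZ vs PAM4 point (figures illustrative): NRZ carries 1 bit per symbol using 2 voltage levels, PAM4 carries 2 bits per symbol using 4 levels, so at the same symbol rate per pin PAM4 doubles the data rate, at the cost of much less voltage margin between adjacent levels and therefore tougher signal integrity:

```python
# NRZ vs PAM4 per-pin data rate at the same symbol rate (illustrative numbers).
NRZ_BITS_PER_SYMBOL = 1    # 2 voltage levels
PAM4_BITS_PER_SYMBOL = 2   # 4 voltage levels

def data_rate_gbps(symbol_rate_gbaud: float, bits_per_symbol: int) -> float:
    return symbol_rate_gbaud * bits_per_symbol

symbol_rate = 10.5  # GBaud per pin, illustrative
print("NRZ :", data_rate_gbps(symbol_rate, NRZ_BITS_PER_SYMBOL), "Gbps/pin")
print("PAM4:", data_rate_gbps(symbol_rate, PAM4_BITS_PER_SYMBOL), "Gbps/pin")
# Squeezing 4 levels into the same voltage swing leaves roughly a third of the
# eye height between adjacent levels, which is why the link needs more care
# with error detection/correction.
```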
 





DLSS Frame Generation doesn't seem to be hardware locked to RTX 40 series. I was able to bypass a software lock by adding a config file to remove the VRAM overhead in Cyberpunk. Doing this causes some instability and frame drops but I'm getting ~80 FPS on an RTX 2070 with the following settings:
2560x1440 res
HDR off
DLSS Balanced
DLSS Frame Generation ON
Ray Tracing Ultra preset
(I get ~35-40 FPS without Frame Generation and DLSS at Quality)
Edit: forgot to add some details
 
Sorry, I was being kind of a dick. Just saying, the fact that new GPUs are faster than old ones doesn't really invalidate my argument. However the increased cache argument does make sense.

I wouldn't look at just the memory bit-width to determine whether a GPU is worth its price or not. Not saying the 4080 is killer value, though it's performing better than a 3090 (Ti?), so the 192-bit bus isn't really a problem. The 960 and 670 were practically one generation/architecture apart, Kepler to Maxwell (the GTX 760 etc. were the same arch?), whilst 7900GT to 3050 is something else entirely.
 
My question was more in relation to the phenomenon of new products selling out immediately. If they don’t sell out immediately doesn’t it mean it’s priced appropriately based on simple supply/demand economics?

The irony is that we’re basically arguing that if a product sells out and stays out of stock for weeks/months then it’s priced correctly which is academically inaccurate. That’s why I want to see the price trajectory over a few months. It should drop sharply when demand dries up.

As with anything where we don't know enough details, we can't determine much without more information.

If a product sells out at launch, you can't really tell much from that. Early adopters are more willing to pay excessive amounts of money for a product than would be the case during, let's say, the first year (generally speaking). So selling out at launch isn't necessarily a sign that it's priced too low, too high, or just right. If the product remains sold out throughout the first year, then it's likely priced too low. If there is a large stockpile in stores (like when you see pallets of product with a price tag attached), that's a sign it's likely priced too high. If all you see are some products on store shelves and a bare spot here or there where one vendor's model has temporarily sold out, that's likely the sweet spot. That indicates it's selling close to stock levels but not over them.

But that only tells us whether pricing may or may not be appropriate for a given supply combined with existing demand. It doesn't say anything WRT how much is being supplied.

OTOH - if a product never sells out, that also doesn't necessarily mean it's priced correctly or priced too high. See above for the too much or just enough supply qualifiers. Same applies here.

An addendum to the last point: if a product doesn't sell out at launch in a product category that typically sells out at launch (like GPUs or game consoles, for example), that's an indication that pricing is too high, as buyers are generally more willing to spend more at launch for a new product than they are a few months later (because those people who would spend more have already bought their product).

Even underwhelming products such as the HD 2900 XT or FX 5800 initially sold out at launch, as there was still an initial wave of people that wanted one regardless of price or performance.

Of course, there are complications that prior GPU launches haven't had to face. IMO, GPU pricing is excessive because GPU makers expect the pricing that was appropriate during the covid lockdown + crypto mining bubble to be the new norm. Combine that with record-high inflation and fears of a recession. Top it off with less free money from the government to spend on luxury items (although in the US, there's a poll showing that most people qualifying for the college loan forgiveness plan are likely to use that money for vacations and luxury items, so that might help with GPU sales. https://www.cnbc.com/2022/11/09/how-student-debt-forgiveness-recipients-plan-to-spend.html :p). Products that people who are really well off can afford without even thinking about the cost (i.e. cost doesn't matter) will still sell well, but everything else in the product stack might suffer.

It's a weird place for PC tech at the moment. The whales who are willing to buy the highest end product at any price are likely unaffected by worries about inflation or a recession with regards to a 2000 USD product (average scalping price for a 4090 on Ebay) that represents just pennies to them compared to your average buyer.

And as you move down from that into more middle class and lower middle class buyers, they're going to be looking at their budget far more closely now.

So, we might be in a weird position where ultra high end sales are unaffected by prevailing economic conditions but everything below that is likely to suffer to some extent.

Regards,
SB
 
The irony is that we’re basically arguing that if a product sells out and stays out of stock for weeks/months then it’s priced correctly which is academically inaccurate. That’s why I want to see the price trajectory over a few months. It should drop sharply when demand dries up.

If they spend the whole year stocking up inventory throughout retail channels, and one month before releasing new products the previous generation is still above MSRP in most cases, while they raise prices on the new products above the older gen's MSRP and above inflation (inflation being nothing but a fake scapegoat excuse to point at to win arguments in forums while scalping consumers indirectly through AIBs), all in the expectation that retail channels will manage to flush the stockpiled inventory, that could be considered evidence it's not priced correctly.

Nvidia (or AMD) may well announce their sales in quarterly reports or YoY, but those "sales" aren't effectively sold to the final consumer if inventory just keeps stacking up quarter after quarter, YoY, and they then resort to accounting tricks to temporarily hide it under the rug for a while longer in hopes they won't have to do a write-off.

They already lost the interest of one long-time AIB partner, and it wouldn't surprise me if that isn't an isolated case if they keep insisting on supposedly ~65% margins while partners on 10% or less are stuck with inventory they can't sell except at a loss, let alone make a profit on.

Another nail in the "priced correctly and selling well" coffin.
 
I'm in full agreement with HUB here. The 4080 is just a horribly priced part. It offers lower value than the halo product while likely being terrible value vs the 4070Ti assuming NV don't bollock up the pricing on that too. And that's without considering the AMD parts.

I can see myself potentially holding my nose and getting a 4070Ti in Jan provided it's $799 or lower and we don't get ripped off on the pricing in the UK. It'll still feel dirty spending so much on an x070 class product but at this stage I don't think things are going to get any better than that. AMD is out for me unfortunately as DLSS and RT performance are major selling points for me. FSR2 is actually great in the games I'm currently using it on, but with DLSS being even better and, more importantly, more common, that's the way I have to go.
Which games out of curiosity? I've only tried it in FH5 and consider it awful.
 
Can I post an unpopular idea. I like initial high cost. It screws scalpers. Let them lose their shirts and drop the prices gradually. Yes it is mining out consumer surplus but scalpers are leeches that do nothing. I can wait to buy shiny new things.
 
Can I post an unpopular idea. I like initial high cost. It screws scalpers. Let them lose their shirts and drop the prices gradually. Yes it is mining out consumer surplus but scalpers are leeches that do nothing. I can wait to buy shiny new things.
Scalpers are a non-factor these days. We have no shortages. The 4090 is in very high demand and I still had no trouble getting mine.
 
Can I post an unpopular idea. I like initial high cost. It screws scalpers. Let them lose their shirts and drop the prices gradually. Yes it is mining out consumer surplus but scalpers are leeches that do nothing. I can wait to buy shiny new things.

I’ve been shopping for furniture recently and one of the stores has a no price match policy and they often raise and drop prices at random on the same items. This wouldn’t work for GPUs as we have been conditioned to expect price consistency over at least a few months.

Even before the pandemic scalpers proved that the market clearing price at launch is much higher than the steady state price. I can’t think of a good way for IHVs to set higher launch prices and then drop them soon after though. It might work once but the next time around people will be a lot more wary of jumping in first.
 
Scalpers are a non-factor these days. We have no shortages. The 4090 is in very high demand and I still had no trouble getting mine.
The high launch price makes scalpers a non-issue. That is what I was saying. The downside is the loss of consumer surplus, but the upside is that instead of scalpers mining out the surplus, it goes to the companies actually producing things.
 
I’ve been shopping for furniture recently and one of the stores has a no price match policy and they often raise and drop prices at random on the same items. This wouldn’t work for GPUs as we have been conditioned to expect price consistency over at least a few months.

Even before the pandemic scalpers proved that the market clearing price at launch is much higher than the steady state price. I can’t think of a good way for IHVs to set higher launch prices and then drop them soon after though. It might work once but the next time around people will be a lot more wary of jumping in first.
If people were leery and did not jump in then scalpers could not find customers either. The problem of mining out the consumer surplus still sucks though. The efficiency of the market is ruined and consumers would get a worse deal. I don't see any way to fix it though without some draconian tracking and restrictions on reselling within a year or something like that. I would have bought a 3080 at launch but it was ridiculous to try and get one. This new economy of reselling around tech, vehicles, even clothing seems incredibly stupid to me. Companies like it as it builds hype but hopefully we've rounded a corner where they realize it alienates customers as well.
 
The high launch price makes scalpers a non-issue. That is what I was saying. The downside is the loss of consumer surplus, but the upside is that instead of scalpers mining out the surplus, it goes to the companies actually producing things.
Scalpers would have been a non-factor regardless. They were only really a problem when there was a shortage, like during the pandemic.
 