NVIDIA discussion [2025]

Nvidia is so obviously upselling us on these GPUs.

I don't want to respond to the comparisons with consoles because this will always end up in the meaningless PC vs console wars, so I'll only respond to this.
Do you have any evidence? I mean, let's not forget that AMD lost market share during this period. If NVIDIA is so "obviously upselling" these GPUs, it should be trivial for AMD to sell their GPUs at much lower prices and gain market share, instead of losing it.
 
@pcchen in the past, when either Nvidia or AMD slipped, the other punished them for it. The fact that both of them are struggling to realize huge gains tells me a lot about how difficult it is. The fact that Sony is now saying that rasterization is nearly a dead end and that CNNs plus ray tracing are needed is just the final confirmation I needed. Nvidia made a bet with the 20 series and it turned out to be correct. People waiting for an easy or cheap fix have their heads buried in the sand.
 
People waiting for an easy or cheap fix have their heads buried in the sand.

I think the proposal here is that these companies should eat the increased cost and give us great perf/$ increases while intentionally reducing their profits for no reason. Fairy tale stuff.

Intel seems to be trying that to some extent but their tech just isn’t competitive. AMD may try it with RDNA 4 to buy market share. We'll see if fairy tales do come true.
 
Intel seems to be trying that to some extent but their tech just isn’t competitive.
Intel is doing it because there are plenty of reasons why their GPUs (or CPUs, for that matter) wouldn't sell at the same prices as the competition's. So it's not really a validation of the idea so much as the opposite: without any solid reason, no GPU maker would reduce its margins to provide the same h/w at lower pricing.

It is also arguably a very destructive idea for the market in general.
Without profits there would be no R&D on future graphics h/w, and perf/price improvements would stagnate even further.
Also, for all vendors, a lack of GPU-division profitability would actually make the argument against those divisions' existence a lot stronger - something which has been popular on the Internet lately in the form of "Nvidia abandoning gamers for the AI market". Well, if they were pushed into making their gaming GPU division margins low or even negative, then guess what - that scenario would actually stop being the usual Internet FUD and start making business sense.
 
How have they gotten more expensive? Do you have to pay more to get the same performance in a newer generation? No. At any dollar amount, every new generation gives you more performance.

What has changed is the amount of “more”. We all understand that. We are all disappointed by it. Can we move the fuck on?

In absolute terms there have been significant performance increases. But the complaint is about “relative” performance. So the comparison point is a moving target with lots of arbitrary variables.

Clearly a $300 card today will absolutely destroy a $300 card from 10 years ago, even before accounting for inflation. Adjusted for inflation, that 10-year-old card would cost about $400 today.
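
For what it's worth, here's a quick back-of-the-envelope check of that inflation figure. A minimal sketch only: the ~2.9% average annual inflation rate and the relative performance numbers are illustrative assumptions, not measured data.

```python
# Back-of-the-envelope check (all numbers here are illustrative assumptions).

def inflation_adjusted(price: float, annual_rate: float, years: int) -> float:
    """Grow a nominal price by a constant average annual inflation rate."""
    return price * (1.0 + annual_rate) ** years

old_price = 300.0                                    # nominal price ~10 years ago
adjusted = inflation_adjusted(old_price, 0.029, 10)  # assume ~2.9%/yr average inflation
print(f"$300 then is roughly ${adjusted:.0f} in today's dollars")  # ~$400

# Relative perf/$ is then (new_perf / new_price) vs (old_perf / adjusted_old_price).
old_perf, new_perf = 1.0, 3.0                        # hypothetical relative performance
new_price = 300.0
gain = (new_perf / new_price) / (old_perf / adjusted)
print(f"perf/$ improved roughly {gain:.1f}x in real terms")
```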

I think this explains why these “terrible value” cards are still selling like crazy. The people who actually play games on them are having a decent experience and aren’t sitting there doing these arbitrary relative comparisons.
 
I don't want to respond to the comparisons with consoles because this will always end up in the meaningless PC vs console wars, so I'll only respond to this.
Do you have any evidence?
Yes, I've pretty much proven it a number of times before, but the response from people here, including a couple of mods, is that I simply need to STOP bringing up such facts that demonstrate exactly what I'm talking about.

It's absurd, but that's the situation. I'm literally not allowed to argue the obvious, demonstrable facts that prove what I'm saying, because people here don't want to hear it and simply don't think 'greed' is a thing that can even exist in a conceptual sense. Any price that is, must be perfectly logical and justified, always. The free market is perfect.
 
Yes, I've pretty much proven it a number of times before, but the response from people here, including a couple of mods, is that I simply need to STOP bringing up such facts that demonstrate exactly what I'm talking about.

It's absurd, but that's the situation. I'm literally not allowed to argue the obvious, demonstrable facts that prove what I'm saying, because people here don't want to hear it and simply don't think 'greed' is a thing that can even exist in a conceptual sense. Any price that is, must be perfectly logical and justified, always. The free market is perfect.

Yet you haven't answered my question. If you have your "evidence", that'd be quite easy.
I don't really want to waste any more time on this topic, but I'll give another example: smartphones. Apple is not willing to lower their prices because they want to preserve their brand value. It's very common, just like Ferrari is not going to sell a US$20k car. However, as you can see, in this case other competitors are willing to sell at lower prices and now they have more market share. Apple's global market share is now less than 20%, although they still make a lot of money.
Now, back to my question: what's your theory on why AMD is not willing to grow their market share by lowering their prices even more?
 
NVIDIA is set to wipe out any hope of competition entirely: the next-gen Rubin architecture is expected to enter "trial production" by H2 2025, as SK Hynix is now focused on supplying HBM4 earlier than scheduled.

If true, why would anyone invest in Blackwell, especially given the reports of delays and overheating, unless Rubin is going to be supply limited for a long time? Hopper got a nice long run.
 
NVIDIA is set to wipe out any hope of competition entirely: the next-gen Rubin architecture is expected to enter "trial production" by H2 2025, as SK Hynix is now focused on supplying HBM4 earlier than scheduled.
If true, why would anyone invest in Blackwell, especially given the reports of delays and overheating, unless Rubin is going to be supply limited for a long time? Hopper got a nice long run.
Not to mention they've yet to release the "Blackwell Ultra", which Jensen just a couple of days ago confirmed again to be their next AI chip.
 
If true, why would anyone invest in Blackwell, especially given the reports of delays and overheating, unless Rubin is going to be supply limited for a long time? Hopper got a nice long run.
They have pulled their plans forward significantly: Blackwell Ultra (B300) is coming about 6 months early, and Rubin is the same. It's the same situation as with H100 and H200. NVIDIA keeps pumping out new hardware, and customers buy what's available according to their budget, order volume and development plans.

As for the reports of delays:
Taiwan suppliers of Nvidia GB200 servers and components responded to fresh reports of overheating issues on GB200 servers by saying, "How many times is this rumor going to get repeated?", according to media reports, adding that GB200 shipments are on schedule and have not been impacted.

 
So what's the deal with this "The Information" outfit? They're just leaking false info to manipulate Nvidia's stock price?
 
In my experience, XeSS gives better results than FSR -- at least in Cyberpunk. I'm not sure how well that holds up in other games...
 
It is decent, but DLSS still has superior upscaling, and it has Ray Reconstruction and Frame Generation.

What's more, the transition to the transformer model will boost DLSS quality in all categories even further.

Yeah, I know, but presumably Intel hasn’t been training their model for 6 years, so maybe it doesn’t take much to get decent results. 80/20 rule.
 