NVIDIA discussion [2025]

But did we though? There's another conversation somewhere on this forum about how high AA, high resolution, and high framerates never really happened in the top AAA titles because there still wasn't enough horsepower then, either.

If my memory serves, it's always been the case that the most demanding settings and resolutions of the day were out of reach of the fastest single-card solutions. Hence the rise of SLI, etc.

I don’t know what it is. Maybe the modern internet has provided the perfect platform for grievance culture and the angriest voices get amplified. Then there’s the rise of clickbait that further riles up the community for clicks and revenue. Whatever the reason, there’s certainly a greater sense of entitlement now than there was in the past. People feel they deserve to own the fastest card for some arbitrary price they think is “fair”.

My statement remains: we are getting more and better hardware for our dollar, even though the absolute maximum prices continue to go up. Why do they continue to go up? Because no amount of money RIGHT NOW could ever buy you the performance you're about to get from a 5090. Does that mean everyone needs a 5090? Does that mean ANYONE needs a 5090? Nope. But they're gonna make a ton of them, and they're gonna sell them all, because people will want them and will be willing to pay for them.

The number of 5090 SKUs is surprising for a $2000 product. Maybe it was the same with the 4090 and I didn’t notice. AIBs are gearing up like they plan to sell a lot of these things.
 
Yeah, just finished watching GN's overly snarky summary of all the 5090 and 5080 SKUs coming out and I agree it does seem like there are quite a few. I'm running an ASUS TUF Gaming 4090 today; it will soon find its way into one of my dedicated folding rigs when I sell off the EVGA FTW3 Gaming 3080 Ti (I'm still very sad EVGA isn't in the biz anymore). A 5090 will replace it in the 9800X3D gaming rig, and it will also get the severe-duty undervolt + overclock treatment along with 24/7 folding duties, just as the 4090 has.
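For anyone curious, the power-cap half of that treatment is scriptable - here's a minimal sketch using the nvidia-ml-py (pynvml) bindings, with the caveat that NVML only exposes the board power limit (a real undervolt means V/F curve offsets in something like Afterburner), and the 70% target is just a made-up example:

```python
# Minimal sketch of power-capping a GPU via NVML (pip install nvidia-ml-py).
# NVML exposes the board power limit, not the V/F curve, so this only
# approximates an undervolt. Needs admin/root; the 0.70 factor is illustrative.
import pynvml

TARGET_FRACTION = 0.70  # assumed fraction of the stock maximum power limit

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
    min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
    current_mw = pynvml.nvmlDeviceGetPowerManagementLimit(handle)
    target_mw = max(min_mw, int(max_mw * TARGET_FRACTION))  # clamp to the floor
    print(f"power limit: {current_mw / 1000:.0f} W -> {target_mw / 1000:.0f} W")
    pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)  # admin required
finally:
    pynvml.nvmlShutdown()
```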

Now I just gotta figure out which one to buy ;)
 
Has something changed to indicate that WoA will be any more successful in 2025 than it was in the past?

Personally I think Microsoft did it the wrong way. Windows on ARM devices should be cheap. They need to attack from below, instead of trying to make high-end devices that compete with the MacBook Pro.
On the other hand, this could just be a preparation step. For some reason they gave Qualcomm one year of exclusivity. Now it's rumored that vendors like MediaTek will be making SoCs for Windows on ARM, and they'll likely be much cheaper.
A cheap Windows on ARM device will have a much better fighting chance IMHO.
 
Yeah, I think you're not wrong here. I gotta give Microsoft credit for doing a great job establishing the Surface range as a premium brand ever since its initial launch. They managed to shake up the Windows OEM landscape quite a bit, with vendors like HP, Dell and Asus now all offering devices that seem to prioritize build quality and elegance a lot more than they used to (I reckon Lenovo always did, in their way). All things considered, Windows laptop no longer means nasty plastic bling.

And I'm just not sure how Snapdragon fits in there. Other than battery life, which is pretty good - though by no means groundbreaking when you consider how far ahead Apple's CPUs are, with Lunar Lake matching it and Strix Point not far behind - I would call the Qualcomm devices... super unexciting. I think I would prefer Strix Halo for a Surface device, or Lunar Lake or its successor, along with bringing back the optional GeForce.
 
Yeah, just finished watching GN's overly snarky summary of all the 5090 and 5080 SKUs coming out and I agree it does seem like there are quite a few.
That's just GN's brand, isn't it? But given the marketing materials, perhaps for once it's not inappropriate :D

I am eyeing a 5090 as well, as I anticipate the 5080 may turn out to be an underwhelming upgrade for my 4080. I'll be looking out for the ASUS ProArt - I love the unassuming wolf-in-sheep's-clothing sleeper-PC aesthetic.
Just give me something black, compact, and devoid of LEDs and I'm happy.
 
Biden’s latest round of AI export restrictions is pretty hardcore. Nvidia isn’t pleased.

"As the first Trump Administration demonstrated, America wins through innovation, competition and by sharing our technologies with the world — not by retreating behind a wall of government overreach," Finkle said. "We look forward to a return to policies that strengthen American leadership, bolster our economy and preserve our competitive edge in AI and beyond."

Nvidia stock is down 3% on the news. I don’t know if appealing to Trump will work as he doesn’t seem to love China.
 
Man, what did Portugal, Austria and Switzerland ever do?
And Greenland is tier 2 while Denmark is tier 1, even though Greenland's foreign policy is managed by Denmark.
 
They’re a lot more expensive if your criterion is “I used to max out games for $600” or “The best card used to cost $700,” whether or not those things are actually true or relevant.
A Titan cost $1000 in 2013. Nobody is against the existence of expensive GPUs. But what Nvidia GPUs are actually offering today at almost any given price point is quite a bit worse than what we used to get in relative terms.

It's completely undeniable. At least to anybody outside this bizarre forum. It's more than four years since the new consoles came out, and people are still having to spend $350-400+ just to get a new GPU where you don't have to lower the settings to below console quality. That's just insane. Nvidia is so obviously upselling us on these GPUs. A 4060 is a tiny low-end GPU. The 4060 Ti is similarly pretty dang small, and in both cases their 8GB of RAM wouldn't be questioned much at all if these were named 4050 and 4050 Ti and priced appropriately. And they've done this upselling across basically the whole range.

That's how Lovelace in particular jacked up prices on us. So yes, people are 100% correct when they say that GPUs have gotten more expensive and worse value. Because they simply have. Whether you think there's some justifiable financial need for them to do this is a different argument, but the fact that we're getting less for more these days is undeniable.
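For what it's worth, the inflation math on that $1000 Titan is easy to sketch - the cumulative CPI factor I'm using is an approximation, not an official figure:

```python
# Rough sketch: the 2013 Titan's $1000 MSRP in today's dollars.
# The cumulative CPI factor below (2013 -> 2025) is an approximation.
CPI_FACTOR = 1.35  # assumed cumulative US inflation over the period

titan_2013 = 1000  # GTX Titan launch price
print(f"$1000 in 2013 is roughly ${titan_2013 * CPI_FACTOR:.0f} today")
# => roughly $1350 -- still well short of a $2000 5090.
```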
 
How have they gotten more expensive? Do you have to pay more to get the same performance in a newer generation? No. At any dollar amount, every new generation gives you more performance.

What has changed is the amount of “more”. We all understand that. We are all disappointed by it. Can we move the fuck on?
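To put the shrinking "more" in concrete terms, a toy sketch (all numbers invented for illustration):

```python
# Toy illustration of the "amount of more" point: each generation still buys
# more performance per dollar, but the generational gain keeps shrinking.
# Every figure below is invented for illustration, not a benchmark.
generations = [
    # (name, price in USD, relative performance)
    ("gen N",     500, 100),
    ("gen N + 1", 550, 140),
    ("gen N + 2", 600, 165),
]

prev = None
for name, price, perf in generations:
    ppd = perf / price  # performance per dollar
    gain = "" if prev is None else f"  (+{100 * (ppd / prev - 1):.0f}% perf/$)"
    print(f"{name}: {ppd:.3f} perf/${gain}")
    prev = ppd
```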
 
How have they gotten more expensive?

Well, he did specify his criteria quite clearly in the very post you quoted:

It's more than four years since the new consoles came out, and people are still having to spend $350-400+ just to get a new GPU where you don't have to lower the settings to below console quality.

Now, it's not a perfect metric - like-for-like comparisons of performance profiles across different platforms are a little harder to make when we have things like DLSS, and far more widespread use of dynamic res on console.

However, I can't see anyone not at least recognizing that halfway through a console's lifespan, still having to spend ~70% of the price of an entire console (think ~$350 against a $500 console) for just a competitive GPU - and one that in some cases will result in an even worse experience due to VRAM limitations on these ~$300 cards - is a new thing.
 
First: in what scenario does a console have more VRAM than literally any of the new cards of this gen? Or how about the graphics cards of last gen? Then, how are we comparing console graphics, which live within a far more restricted memory footprint in terms of both absolute capacity and bandwidth, as somehow "equal to" the graphics output from the same engine burning through notably more graphics memory in a dGPU on the PC platform?

Second, I fail to understand how anyone thinks the consoles are further ahead than the dGPUs at equal graphics settings. Let's ACTUALLY talk about the frametime consistency of a console versus a PC, let's talk about the actual raster resolution (yeah, upscaling has been a thing on consoles for a long time now), let's talk about object density, let's talk about polygon budget, let's talk about particle density, let's talk about dynamic lighting.

And finally, a new dGPU is NOT required (in the slightest) to "match" console performance. A used 1080 Ti can match console performance today, and it's seven years old and costs less than a Switch Lite.

P.S. Are we going to just handwave off how incredibly subsidized consoles are, and the prices you're required to pay for a locked ecosystem of games? Nah? Backwards compatibility? No? Let's not start a PC vs Console war in here, because it's "undeniable" (to borrow seanspeed's word) the consoles are not on equal footing to a PC for a litany of obvious reasons.
 
However, I can't see anyone not at least recognizing that halfway through a console's lifespan, still having to spend ~70% of the price of an entire console for just a competitive GPU is a new thing.
These are all arbitrary metrics. You and I can cook up 10 others to express our disappointment, but that doesn't change the root issue -- that perf/$ for GPUs has stopped increasing at the rate that has been programmed into our heads as a seemingly inalienable birthright.

Console vs. PC comparisons seem to be interesting because PS5 and XBSX rode out the last major slope on Moore's Law's final gasp. Look at the PS5 Pro's price.
 
First: in what scenario does a console have more VRAM than literally any of the new cards of this gen?

In any scenario where its shared 16GB allows texture settings that have to be lowered on the PC version running on 8GB cards. You'll have to lower texture settings on an 8GB card for titles like Spider-Man (when using RT), Hogwarts Legacy, Diablo IV, Ratchet and Clank, Hellblade 2, just off the top of my head.
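A quick back-of-the-envelope on why textures give out first on 8GB cards - the sizes and resident counts here are my own assumptions, not measurements from any of those games:

```python
# Back-of-the-envelope VRAM math for why texture quality is the first thing
# to drop on an 8GB card. Sizes and counts are illustrative assumptions only.
def texture_mb(width, height, bytes_per_texel, mips=True):
    base = width * height * bytes_per_texel / 2**20
    return base * 4 / 3 if mips else base  # a full mip chain adds ~1/3

# e.g. one 4K BC7-compressed texture (1 byte per texel) with mips:
per_texture = texture_mb(4096, 4096, 1)  # ~21.3 MB
pool = 300 * per_texture                 # assume ~300 textures resident at once
print(f"{per_texture:.1f} MB per texture, ~{pool / 1024:.1f} GB texture pool")
# That pool alone eats most of 8GB once the framebuffer, RT structures,
# and everything else are in there too.
```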

Let's ACTUALLY talk about the frametime consistency of a console versus a PC

<cough>#stutterstruggle<cough> We uh, really want to go down this road?

let's talk about the actual raster resolution (yeah, upscaling has been a thing on consoles for a long time now), let's talk about object density, let's talk about polygon budget, let's talk about particle density, let's talk about dynamic lighting.

And finally, a new dGPU is NOT required (in the slightest) to "match" console performance. A used 1080 Ti can match console performance today, and it's seven years old and costs less than a Switch Lite.

Well yeah - considering it has 11GB of VRAM, which is the main sticking point with cards in the $300 range. The very fact that your go-to card for this comparison is a 7-year-old flagship just reinforces seanspeed's point! Yes, that card would be decently competitive, precisely because it addresses the main problem that we have with ~$300 GPUs in the past 2 generations!

Regardless, not sure why used products with no warranty and extremely limited availability are being compared to actual products shipping new from the manufacturer.

P.S. Are we going to just handwave off how incredibly subsidized consoles are,

No one is 'handwaving' it off, which is why the discussion is about the price vs. 4-year-old consoles, not when they first come out. The PC is always at a big disadvantage wrt price/performance against a new console out of the gate, but this long into the gen it is indeed a new development where $300 GPUs aren't just wiping the floor with them, and in some cases require quality compromises to boot.

and the prices you're required to pay for a locked ecosystem of games? Nah? Backwards compatibility? No? Let's not start a PC vs Console war in here,

It seems that's what you were kinda going for with this post? This isn't about the viability of each as a game platform and all their relative strengths and foibles; seanspeed and I have merely noted (as Digital Foundry has done many times) how the ~$350 segment of cards has remained relatively weak compared to launch-day consoles.

that perf/$ for GPUs has stopped increasing at the rate that has been programmed into our heads as a seemingly inalienable birth right.

:rolleyes:
 