Nvidia GeForce RTX 50-series Blackwell reviews

In compute, it also doesn't offer many advances:


[Chart: llama.cpp CUDA, Mistral-7B Instruct v0.3 Q8_0, prompt processing]


The card has roughly the same FLOPS as the 4070 (non-Super) and the same MSRP.
So any compute benchmark where it's faster than the 4070 does "offer advances".
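For reference, here's a back-of-the-envelope FP32 comparison. The core counts and boost clocks below are the commonly listed specs for each card, so treat the exact figures as approximate:

```python
# Rough FP32 throughput: 2 FLOPs (FMA) per CUDA core per clock.
# Core counts / boost clocks are the commonly cited specs; sustained
# clocks vary in practice, so these are ballpark figures only.
def fp32_tflops(cores: int, boost_ghz: float) -> float:
    return 2 * cores * boost_ghz / 1000

rtx_4070 = fp32_tflops(5888, 2.475)   # ~29.1 TFLOPS
rtx_5070 = fp32_tflops(6144, 2.512)   # ~30.9 TFLOPS
print(f"4070: {rtx_4070:.1f} TFLOPS, 5070: {rtx_5070:.1f} TFLOPS, "
      f"delta: {rtx_5070 / rtx_4070 - 1:+.1%}")
# → 4070: 29.1 TFLOPS, 5070: 30.9 TFLOPS, delta: +5.9%
```

On paper the 5070 is only about 6% ahead, which fits the "about similar flops" point.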

Let me know when linear swept spheres actually matter to an end user.
Sure, IJ is using them for hair rendering in Full RT mode right as I'm writing this.
 
It's absolutely a boring release for those of us who champ at the bit for exciting product launches.
This was the point of my original comment.
Sure, IJ is using them for hair rendering in Full RT mode right as I'm writing this.
they're using a feature that isn’t supported on anything prior to a few weeks ago? Doesn’t sound all that important then lol, the game runs fine on Ada and Ampere.
 
they're using a feature that isn’t supported on anything prior to a few weeks ago? Doesn’t sound all that important then lol, the game runs fine on Ada and Ampere.

So instead of being impressed that a brand spanking new feature is already available in a high profile game your take is that new graphics features aren't important? Man the graphics enthusiast scene is bitter AF.
 
Couple of notes from skimming through the reviews:
  1. The results suggest that lower AD104 cards (4070/4070S) were in fact memory bandwidth limited to some degree.
    The bandwidth advantage holds for all Blackwell cards launched thus far, but here it actually translates into consistent wins over the previous cards in this segment - sometimes enough to hit 4070TiS-level performance, even.
    So I'd say this is the first tier where G7 actually makes a difference for average performance, rather than just providing some interesting one-offs.

  2. This also means that the 5070 falls off harder than the 4070/4070S when more math is pushed onto it.
    The most obvious cases are with RT - the 5070's advantage over the 4070 is cut roughly in half when RT is enabled.
    Basically, it leads to the 5070 and 4070S trading places: w/o RT the 5070 is slightly ahead, while w/RT the 4070S ends up slightly faster.
    It's an interesting change of balance which will probably lead to some unexpected results between 5070 and the competition tomorrow.

  3. And again, this bandwidth-induced difference is interesting when applied to presumably math-heavy UE5 - I would expect the card to suffer more there than on average, but the results aren't as uniform as I'd expect. Many UE5 titles seem to be doing just fine and show a similar +20% or so over the 4070, as non-RT titles do. This is a bit unexpected, but it may be related to how these games use the engine.
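As a quick sanity check on the bandwidth point above (memory specs are the commonly listed ones for each card; treat them as assumptions):

```python
# Peak memory bandwidth = (bus width in bytes) * per-pin data rate.
# Memory specs below are the commonly listed ones for each card.
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

cards = {
    "RTX 4070  (GDDR6X 21 Gbps, 192-bit)": bandwidth_gbs(192, 21),  # 504 GB/s
    "RTX 4070S (GDDR6X 21 Gbps, 192-bit)": bandwidth_gbs(192, 21),  # 504 GB/s
    "RTX 5070  (GDDR7  28 Gbps, 192-bit)": bandwidth_gbs(192, 28),  # 672 GB/s
}
for name, bw in cards.items():
    print(f"{name}: {bw:.0f} GB/s")
```

That's a ~33% bandwidth jump at roughly the same FLOPS, which fits the pattern of the 5070 pulling ahead in bandwidth-bound cases and falling back when the workload is math-heavy.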
 
Yes. Because it's falling back to how it used to be done on the h/w which doesn't support the feature.
Looking at comparison shots for RTX Hair on and off it looks almost identical.

So instead of being impressed that a brand spanking new feature is already available in a high profile game your take is that new graphics features aren't important? Man the graphics enthusiast scene is bitter AF.
New graphics features are important, barely perceptible new technologies aren't. Do you think this generation's essentially zero uplift vs last gen is somehow made up by Hairworks 2?
 
Do you think this generation's essentially zero uplift vs last gen is somehow made up by Hairworks 2?
Is it really zero uplift though? What metric are you using to arrive at a claim of zero uplift from last gen? I'm not sure the available data supports a claim of zero uplift.
 
Is it really zero uplift though? What metric are you using to arrive at a claim of zero uplift from last gen? I'm not sure the available data supports a claim of zero uplift.
It's not literally zero, but that's why I said essentially zero: the uplift is small enough to be inconsequential to the customer. The last generation with a real improvement to price/performance across the stack was probably Ampere; the Ada uplifts were good, but the price went up with them.

The 5070 is basically a 4070S, maybe sometimes a 4070 Ti, for $50 less than the 4070S cost at launch a year ago. In reality it will likely cost more depending on supply. This might as well be a 4070S.
 
Is it really zero uplift though? What metric are you using to arrive at a claim of zero uplift from last gen? I'm not sure the available data supports a claim of zero uplift.
Perf/watt seems to be about the same as 4070 Super (https://www.techpowerup.com/review/nvidia-geforce-rtx-5070-founders-edition/44.html) or even worse (https://www.computerbase.de/artikel/grafikkarten/nvidia-geforce-rtx-5070-test.91530/seite-7)

edit:
And for what it's worth, since so many here think only RT matters, 5070 does relatively worse than 4070 Super with it, I think it was the same with AI but not 100 % sure on that
 
And for what it's worth, since so many here think only RT matters, 5070 does relatively worse than 4070 Super with it, I think it was the same with AI but not 100 % sure on that
It doesn't do "worse than 4070 Super" with RT, it does worse *against* 4070 Super with RT than without it.
TPUs numbers are +4/+5/+7% for non-RT average in 1080p/1440p/4K. With RT this turns into +1/+1/+4.
CB.de's numbers are +3/+4% in 1440p/4K w/o RT and -2/-2% with RT.
All in all this is the epitome of a negligible difference.
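Using the 1440p numbers quoted above, one way to quantify "worse against" is to compare the two uplifts as a ratio (the helper name here is just for illustration):

```python
# If the 5070 is +5% vs the 4070S without RT but only +1% with RT,
# the ratio of the two shows how much more of its performance the
# 5070 gives up when RT is enabled, relative to the 4070S.
def rt_scaling_vs_rival(uplift_no_rt: float, uplift_rt: float) -> float:
    return (1 + uplift_rt) / (1 + uplift_no_rt) - 1

print(f"{rt_scaling_vs_rival(0.05, 0.01):+.1%}")  # → -3.8%
```

So the 5070 loses roughly 4% more of its performance to RT than the 4070S does - real, but as noted, negligible.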
 
Looks effectively like a $50 price cut on the 4070 Super, which is welcome. But for the love of god, let's get some cards on the market. I hope AMD opens the floodgates tomorrow.
 
New graphics features are important, barely perceptible new technologies aren't. Do you think this generation's essentially zero uplift vs last gen is somehow made up by Hairworks 2?

No, I don’t think that. I also didn’t see much difference. I was responding to the idea that a brand new feature isn’t important because it’s not in games yet. That implies no new feature that requires game integration will ever be important.

The general malaise in the enthusiast community is getting really tiresome. Can you think of a graphics feature in the past 5 years that people were generally happy about? 10 years even? Nanite is the only one that comes to mind and that’s all software running on old APIs.
 
The general malaise in the enthusiast community is getting really tiresome. Can you think of a graphics feature in the past 5 years that people were generally happy about?
Raytracing? People like the way it looks they just don’t like the performance hit, however pretty much everyone sees it as the future and it’s a big reason people have been buying Nvidia cards.

Blaming enthusiasts for being malaised (not a word but you know what I mean) is absurd, this generation is boring and I see nothing to get excited about.
 
Raytracing? People like the way it looks they just don’t like the performance hit, however pretty much everyone sees it as the future and it’s a big reason people have been buying Nvidia cards.

The vibe around RT has been very negative and is only now turning around as more examples of path traced GI are shipping and RT is slowly making inroads as a base requirement for some titles. Yes, people still buy Nvidia in droves but buying habits and online discourse have long been out of sync with each other.
 
I might be beating a dead horse mentioning this so much but a problem is we don't really have data that separates out DIY Retail. The overall market data we always use factors in everything, and Nvidia basically annihilates AMD in mobile, OEM prebuilts, and likely even SI.

My suspicion is that for the DIY retail only the market share distribution might be higher for AMD, even if it still favours Nvidia. Then if you can further isolate for certain sub demographics that actually matched the online discourse the distribution might be more inline.
 
I might be beating a dead horse mentioning this so much but a problem is we don't really have data that separates out DIY Retail. The overall market data we always use factors in everything, and Nvidia basically annihilates AMD in mobile, OEM prebuilts, and likely even SI. My suspicion is that for the DIY retail only the market share distribution might be higher for AMD, even if it still favours Nvidia.

That's true. We can probably assume Nvidia has close to 100% of pre-builts and mobile. So in order to have 10% of the overall discrete market AMD must be shipping >10% in DIY. The numbers are so incredibly lopsided though that I don't think a few percent here or there matters.
 
That's true. We can probably assume Nvidia has close to 100% of pre-builts and mobile. So in order to have 10% of the overall discrete market AMD must be shipping >10% in DIY. The numbers are so incredibly lopsided though that I don't think a few percent here or there matters.

I'm not so sure. Hypothetically, if say mobile and OEMs were 75% of the market and it were all Nvidia, that would mean the remaining 25% splits 15% vs 10% for Nvidia vs AMD. That translates to 60 vs 40 within that segment, which interestingly is roughly where things were before mobile really took off.

Again, these numbers are just illustrative. The only somewhat concrete number I know of is that mobile is now roughly 50% of discrete GPU shipments, though it's not accurate to say Nvidia has 100% of that segment either.
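The illustrative split above works out like this (all proportions are hypothetical, per the post):

```python
# Illustrative only: if mobile + OEM were 75% of discrete shipments and
# all Nvidia, a 90/10 overall split implies the remaining 25% (DIY
# retail) breaks down 15 vs 10 -- i.e. 60/40 within that segment.
overall_nvidia, overall_amd = 0.90, 0.10
captive = 0.75  # mobile + OEM share of the whole market, assumed all-Nvidia

diy_total = 1 - captive                # 0.25
diy_nvidia = overall_nvidia - captive  # 0.15
diy_amd = overall_amd                  # 0.10
print(f"DIY split: Nvidia {diy_nvidia / diy_total:.0%}, "
      f"AMD {diy_amd / diy_total:.0%}")
# → DIY split: Nvidia 60%, AMD 40%
```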
 
The vibe around RT has been very negative and is only now turning around as more examples of path traced GI are shipping and RT is slowly making inroads as a base requirement for some titles. Yes, people still buy Nvidia in droves but buying habits and online discourse have long been out of sync with each other.
A lot of people don’t think basic RT is worth it (especially with rising costs) but I’ve never seen anyone seriously suggest that it’s inconsequential as a graphics technology. Unlike stuff like RTX Hair, which might as well not exist. It’s fine, but it’s not worth the price they’re asking.

But yeah in general I’d say the last 5 years of the graphics enthusiast community has been downright depressing. Neither vendor is really giving us anything interesting.
 