Nvidia's 3000 Series RTX GPU [3090s with different memory capacity]

I don't get it. Why all the hate for that lovely 8-GByte RTX 3050 Ti just because they misspelled its name? ;)

35% in the worst of their 12 titles, 17% on avg IIRC.
It's "just" 26-ish percent, and that's also the reason for the changed thumbnail: they were talking about a performance degradation, not an increase.
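The 35% and 26% figures describe the same gap measured from opposite baselines: a card that's 26% slower means the other card is about 35% faster (26 / 74 ≈ 35%). A quick sketch, using a made-up 100 fps baseline just for illustration:

```python
# A ~26% performance drop and a ~35% performance gain describe the same
# gap, just measured from opposite baselines (hypothetical fps numbers).
full = 100.0              # faster card, e.g. at 100 fps (made-up baseline)
cut = full * (1 - 0.26)   # slower card: a 26% degradation -> 74 fps

decrease = (full - cut) / full * 100  # gap measured from the fast card down
increase = (full - cut) / cut * 100   # gap measured from the slow card up

print(f"{decrease:.0f}% slower")  # 26% slower
print(f"{increase:.0f}% faster")  # 35% faster (26 / 74 = 35.1%)
```

So whether a headline says "35%" or "26%" depends entirely on which card you pick as the reference point.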
 
If it's slower by as much as those thumbnails suggest then yes, that's obviously BS.

Generally speaking Nvidia are legends from an engineering perspective and total twats from a corporate perspective.

That market share bump must be going to their heads or they're desperate to stay in the black.
 
Unfortunately it doesn't matter what the criticism is, whether it's justified or not. They will defend their beloved company at every turn... for some reason. No idea why anyone gets so obsessed with any company unless they're specifically paid to do so.

If DF call out Nvidia on something then it's crickets here from them.
Why don't you just stop the crap? First you piss on anyone criticizing product shortcomings in the AMD thread, and now you're starting the same in the Nvidia thread when people level criticism against product reviews they don't agree with. Either justify your positions or stop posting your one-liners designed to create conflict.
 
Why don't you just stop the crap? First you piss on anyone criticizing product shortcomings in the AMD thread, and now you're starting the same in the Nvidia thread when people level criticism against product reviews they don't agree with. Either justify your positions or stop posting your one-liners designed to create conflict.
What is your criticism of this review?
 

This isn't even the first time they did something bad like this. They did the same thing with the RTX 2060 (6 GB versus 12 GB), where the difference wasn't only the memory amount but also a lower clock frequency, fewer SMs, etc. All around despicable.

They should have brought back something like their XT/XE/LE suffixes if they were going to pull shit like that. Those allowed for the reuse of the model "number" while clearly distinguishing that one was faster than the other. Sure they have the Ti now to denote that a faster version of a card came out, but they don't have anything letting a consumer know that they just released a significantly slower card with almost the exact same name as the launch card.

Maybe it wouldn't be so despicable that they did this if instead of calling a new faster version of a card a Ti, they just called it the base name. That would at least be amusing and consistent. Then instead of 3060 Ti, 3060, 3060 ... we'd have 3060, 3060, 3060. Now pick the one you want. :D

Also, AMD are just as bad whenever they do something similar, although they usually seem to limit those types of products to the Chinese market.

Regards,
SB
 
I do wonder why we don't see things like this gimped 3060 being called a 3060 XT, or the astonishing Radeon 6400 being a 6400 Ti. That's what they once did.
 
If it's slower by as much as those thumbnails suggest then yes, that's obviously BS.

Generally speaking Nvidia are legends from an engineering perspective and total twats from a corporate perspective.
On the contrary, I’d say they’re experts from a corporate perspective too.

They’re pulling off some Apple-like feats here. Very impressive!
 
The 12 GB actually has more CUDA cores.

Isn't that what I said? The 2060 6 GB has less of most of the important things: fewer cores, lower frequency, fewer SMs, less memory, fewer RT cores, etc.

If someone just went into a retailer and bought one off the shelf, they'd have no idea that the 2060 6 GB would be significantly slower than the 2060 12 GB.

Regards,
SB
 
If someone just went into a retailer and bought one off the shelf, they'd have no idea that the 2060 6 GB would be significantly slower than the 2060 12 GB.
If that someone somehow completely missed the fact that the memory size is plastered all over the box of the card, the price label, the drivers, the utilities, etc., then sure.

If I went and bought myself a very cheap Mercedes because it's a Mercedes, and then was shocked to find out that it's not in fact an S65 AMG, then it would surely be Mercedes' fault. Whose else?
 
Isn't that what I said? The 2060 6 GB has less of most of the important things: fewer cores, lower frequency, fewer SMs, less memory, fewer RT cores, etc.

If someone just went into a retailer and bought one off the shelf, they'd have no idea that the 2060 6 GB would be significantly slower than the 2060 12 GB.
Apologies, I did not realize you were complaining about this kind of disparity. Usually these complaints arise when a card with more memory is slower, which seems (more) scammy.
 
If that someone somehow completely missed the fact that the memory size is plastered all over the box of the card, the price label, the drivers, the utilities, etc., then sure.

If I went and bought myself a very cheap Mercedes because it's a Mercedes, and then was shocked to find out that it's not in fact an S65 AMG, then it would surely be Mercedes' fault. Whose else?

And I suppose it's also plastered all over the box that it has a lower frequency, fewer cores, fewer SMs, etc.?

That's like buying a Mercedes GLA 250 SUV in the base trim (the 6 GB model) versus the GLA 250 SUV with the AMG Line w/Night Package (the 12 GB model). Here the 19-inch AMG wheels might give it a slight advantage in certain specific scenarios (just like 6 GB versus 12 GB of memory), but otherwise both have the same performance, unlike the RTX 2060 6 GB versus 12 GB.

Pretty much what you'd expect from a difference in memory amount, or a difference in trim level on the exact same base car or base graphics card. Except, again, the base 2060 hardware in the 6 GB model is significantly different from the base 2060 hardware in the 12 GB model, unlike the Mercedes. Which is, IMO, criminally deceptive.

Regards,
SB
 
Maybe? Do you buy video cards based on what is plastered on their boxes?

I used to and I know quite a few people who still do. ESPECIALLY for the low end GPUs as they don't often look at reviews for those and expect them to all perform similarly. And at that price level, often they'll see the cheaper card and buy it assuming that for their gaming needs it'll be about the same performance as the more expensive version of the same card.

Regards,
SB
 
NVIDIA needs to stop this practice; they've been doing it for years, first with the 1060 (6 GB and 3 GB with different core counts), then the 2060 (6 GB and 12 GB with a bigger core count difference), and now the 3060 (12 GB and 8 GB with a much bigger core count difference). Even the 3080 got involved, with 10 GB and 12 GB versions with different core counts too!

They need to stop this behavior in all honesty, since they seem to have grown accustomed to it. It does nothing but cause headaches for both the consumer and the corporation.
 