Nvidia's 3000 Series RTX GPU [3090s with different memory capacity]

Hoping the 3060 Super is real and a good sign for the "Super" series, if that's what it's called. Except for the price feeling maybe $50 too high, it's everything the average consumer might need spec-wise for this entire generation. At least assuming ReBAR and DX12 do their job.

Here's hoping the 3070 and 80 Supers are of a similar quality.
 
Hoping the 3060 Super is real and a good sign for the "Super" series
"3060 Ultra" you mean? That's basically the long rumored 3060 12GB model, likely photoshopped with "Ultra" and "12GB" by some bloke bored by lockdowns.
What's good in it? It's doubtful that even 3070 will need more than 8GB of VRAM during its lifespan.
I also fail to see what would be so good about 3070 and 3080 with double the VRAM - it would likely kill off their price/perf and that's about it.
 
"3060 Ultra" you mean? That's basically the long rumored 3060 12GB model, likely photoshopped with "Ultra" and "12GB" by some bloke bored by lockdowns.
What's good in it? It's doubtful that even 3070 will need more than 8GB of VRAM during its lifespan.
I also fail to see what would be so good about 3070 and 3080 with double the VRAM - it would likely kill off their price/perf and that's about it.

Ooops, Ultra, right. Anyway, the "You won't need more than" argument is hilarious, just stop. You already need more than 8GB on some current titles if you run at 4K. As long as it's fractionally above the 10GB limit the Series X has for fast RAM, it should be perfectly acceptable for the mainstream. The argument that devs somehow won't use up all the VRAM they can get for some titles this gen, despite doing it every single previous gen, is bordering on pathetic. Nvidia didn't put enough VRAM in their cards so they could maximize profit margins and sales throughput. Oh look, companies aren't your friends! Get over it already.
 
Ooops, Ultra, right. Anyway, the "You won't need more than" argument is hilarious, just stop. You already need more than 8GB on some current titles if you run at 4K. As long as it's fractionally above the 10GB limit the Series X has for fast RAM, it should be perfectly acceptable for the mainstream. The argument that devs somehow won't use up all the VRAM they can get for some titles this gen, despite doing it every single previous gen, is bordering on pathetic. Nvidia didn't put enough VRAM in their cards so they could maximize profit margins and sales throughput. Oh look, companies aren't your friends! Get over it already.

Using more than 8 GB or merely allocating it....BIG difference.

Because on the laptop at work I see this:
[attachment: upload_2021-1-8_7-32-55.png]

The green is actual use...the blue is cached data that can be re-allocated on the fly.

While it might seem that I only have 7 MB free....the actual number is around 1 GB.
As far as I know, no GPU monitoring software makes this differentiation...everything is being lumped into "used RAM".
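The allocated-vs-used distinction can be sketched with a toy allocator (purely illustrative, not how any real driver works): freed buffers stay resident as evictable cache, so a "used VRAM" counter overstates what's actually needed.

```python
# Toy model of allocated vs. actually-used memory (illustrative only).
class VramPool:
    def __init__(self, capacity_mb):
        self.capacity = capacity_mb
        self.in_use = 0   # actively referenced by the current frame
        self.cached = 0   # resident but evictable (e.g. stale streaming data)

    def allocate(self, mb):
        # Evict just enough cached data if the pool would overflow.
        overflow = self.in_use + self.cached + mb - self.capacity
        if overflow > 0:
            self.cached = max(0, self.cached - overflow)
        self.in_use += mb

    def release(self, mb):
        # Freed buffers stay resident as cache instead of being returned.
        self.in_use -= mb
        self.cached += mb

    def reported_usage(self):
        # Monitoring tools typically lump both together as "used".
        return self.in_use + self.cached

pool = VramPool(8192)    # an "8 GB" card
pool.allocate(6000)
pool.release(3000)       # half the buffers are no longer needed
print(pool.in_use, pool.cached, pool.reported_usage())  # → 3000 3000 6000
```

A tool showing `reported_usage()` reads 6000 MB "used" even though only 3000 MB is actively needed; the cached half can be reclaimed on the fly, like the blue portion in the graph above.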
 
MSI Afterburner has made that distinction since September or October. But you have to dig into the options.
 
Anyway, the "You won't need more than" argument is hilarious, just stop.
Yeah, facts are just hilarious when they go against your agenda.

You already need more than 8GB on some current titles if you run at 4K.
Now this one is actually hilarious because of:
a) You can provide actual data for this claim, right? Like a chart showing the performance and quality differences for the titles which supposedly "need" more than 8GB to run in 4K?
b) We're talking about a 3070, a card which isn't even targeting 4K in the first place, and which generally can't even hold 60Hz in 4K in PC-enhanced titles from the previous console generation. You expect it to magically soar above that mark in future titles somehow?

As long as it's fractionally above the 10GB limit the Series X has for fast RAM, it should be perfectly acceptable for the mainstream.
Perfectly acceptable for what exactly? How many PC gamers are running their games "in 4K" with a mix of medium settings on a mid-range GPU while limiting them to 30 fps to boot? Is this even something anyone does?

The argument that devs somehow won't use up all the VRAM they can get for some titles this gen, despite doing it every single previous gen, is bordering on pathetic.
What's bordering on pathetic at this point is the argument that you "need" more than 8GB for 4K on a card which can't really do 4K anyway due to processing limits, and without any data to actually back up that "need".

Nvidia didn't put enough VRAM in their cards so they could maximize profit margins and sales throughput. Oh look, companies aren't your friends! Get over it already.
Again, this is what's actually hilarious and pathetic, as you can't provide even one example of 8-10GB not being "enough". So yeah, companies are not your friend, and you're just doing AMD PR at the moment. Get over it already.
 
MSI Afterburner has made that distinction since September or October. But you have to dig into the options.

Not a fan of 3rd party apps (seen too many bugs with all the crap some people install...bogging down their hardware); this should be built into Windows/drivers...clarity will bring better debates based on facts.
 
I'd love a 16GB card for hobby rendering.
For the same price sure, I'd love a 20GB one too. It wouldn't be the same price though and would likely be the exact same in gaming performance. And if you actually need 16GB or more for work there are products with such VRAM sizes.

I think it's pretty ballsy of NV to plan to release both 6 and 12 GB versions of 3060 as these will lead to some really interesting comparisons, especially if there will be a way of matching the GPU specs on them. I fully expect there to be exactly zero performance difference between the two in anything below 4K.
 
For the same price sure, I'd love a 20GB one too. It wouldn't be the same price though and would likely be the exact same in gaming performance. And if you actually need 16GB or more for work there are products with such VRAM sizes.
Why would anyone expect it to be the same price? I'd pay $50-100 extra for a 16GB version of a 3070. And yes, if it's for work there are obviously professional cards for that purpose with more VRAM and additional support. I can't justify $1500 for Daz 3D.
 
I play older games at 5K and they never break 4GB. The newer stuff gets closer to 8GB at 4K, and that's allocated, not used. Memory size requirements are growing for sure, but it's primarily marketing at this point. The average gamer isn't running 4K anything (especially on console, where everything is rendered at 1/2 or 1/4 output resolution).

The one example people use is Doom Eternal with Ultra Nightmare textures at 4K. Unfortunately it has zero IQ difference from regular old Ultra textures.
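Some rough numbers back up the point that output resolution itself isn't what fills 8 GB. Assuming 4 bytes per pixel and a generous ten full-resolution render targets (both figures are illustrative assumptions, not measurements from any specific engine):

```python
# Back-of-the-envelope render-target memory at common resolutions.
def target_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 2**20

resolutions = {"1440p": (2560, 1440), "4K": (3840, 2160), "5K": (5120, 2880)}
for name, (w, h) in resolutions.items():
    one = target_mb(w, h)
    # ~10 full-res targets covers a fat G-buffer plus a post-processing chain
    print(f"{name}: {one:5.1f} MB per target, {10 * one:6.1f} MB for ten")
```

Even at 5K that's well under 1 GB for the render targets themselves; the bulk of VRAM goes to textures and streaming pools, which is why texture settings rather than output resolution dominate the totals people quote.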
 
Why would anyone expect it to be the same price? I'd pay $50-100 extra for a 16GB version of a 3070. And yes, if it's for work there are obviously professional cards for that purpose with more VRAM and additional support. I can't justify $1500 for Daz 3D.
Well I dunno, people who are saying that "8GB isn't enough"? What they should be saying instead is that 16GB at $650 is better than 8GB at $500 - which would be factually incorrect by any price/perf metric, so they don't say this at all.

The one example people use is Doom Eternal with Ultra Nightmare textures at 4K. Unfortunately it has zero IQ difference from regular old Ultra textures.
That's because in Doom Eternal's case it's not the textures, it's the size of the streaming buffer, and the faster your storage is, the smaller it can be with zero effect on IQ.
10GB cards are starting to hit their VRAM limit at 5K in this case, and it shows as texture thrashing with noticeable hitching and texture loading issues. No such issues are present on either 8 or 10 GB cards at 4K, which means that even in this purely synthetic case 8GB is in fact "enough" at the moment.
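The streaming-buffer argument can be made concrete with a toy formula (hypothetical numbers, not id Software's actual heuristic): whatever can't be fetched from storage within the prefetch window has to already sit in VRAM.

```python
# Illustrative sizing of a texture streaming buffer vs. storage speed.
def resident_buffer_mb(working_set_mb, storage_mb_per_s, window_ms):
    # Data fetchable within the prefetch window needn't be resident.
    fetchable_mb = storage_mb_per_s * window_ms / 1000.0
    return max(0.0, working_set_mb - fetchable_mb)

# 4 GB worst-case working set, 1-second prefetch window (made-up numbers).
for name, speed in [("HDD", 150), ("SATA SSD", 550), ("NVMe SSD", 3500)]:
    print(f"{name}: {resident_buffer_mb(4096, speed, 1000):.0f} MB resident")
```

The faster the storage, the smaller the buffer can be at identical IQ, which matches the observation that Ultra Nightmare's extra allocation buys no visible quality.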
 
Why would anyone expect it to be the same price? I'd pay $50-100 extra for a 16GB version of a 3070. And yes, if it's for work there are obviously professional cards for that purpose with more VRAM and additional support. I can't justify $1500 for Daz 3D.

$50-100 extra seems pretty optimistic for double the GDDR6. In the past, in my location at least, double-VRAM cards have commanded ridiculous premiums, and I've always felt they were purely there to trick/gouge consumers who don't understand exactly how much they need.
 
I think it's pretty ballsy of NV to plan to release both 6 and 12 GB versions of 3060
It is alleged that the RTX 3060 12GB will use a different GPU variant than previously reported. Basically, the GA106-400 SKU has been replaced with GA106-300, which means that both the RTX 3060 12GB and the 'rumored' RTX 3060 6GB would now offer the same specifications. The 6GB version has been delayed; in fact, some sources claim it might even be canceled.
ASUS GeForce RTX 3060 TUF GAMING with 12GB GDDR6 memory leaked
Well, yeah.
 