Nvidia's 3000 Series RTX GPU [3090s with different memory capacity]

If Big Navi is at the 3080's level while having 16GB of VRAM, IMO there's a good chance Nvidia will release a 20GB 3080 version or a 16GB 3070 Ti/Super.
Is there any reason why more memory is needed? The need for bandwidth is absolutely critical for RT, for compute, for ML, etc. But what is the reasoning behind having more storage? Is there concern about hitting memory limits due to a lack of asset compression at higher resolutions? It appears to be proving itself well at 4K. DLSS should reduce the memory requirements as well, provided you're hanging around 1080p.
 
Is there any reason why more memory is needed? [...] It appears to be proving itself well at 4K.

Doom Eternal already suffers performance drops at 4K on 8GB GPUs. It's not a stretch at all to assume 10GB will become an issue over the next few years.
 
I assume Anandtech are releasing a review eventually with a deep dive?

It's delayed due to wildfire-related issues, I believe. It and the RTX 3080 review are planned to be rolled into the RTX 3090 coverage, so they'll arrive when that embargo lifts.

Is there any reason why more memory is needed? [...] But what is the reasoning behind having more storage?

It wouldn't be a concern for current gaming workloads, but for what people are projecting going forward. You're talking about a usage lifespan of likely roughly 2 years at a minimum, and more realistically in the 2-6 year range for the majority of buyers. So you do need to project requirements out somewhat.

The above is also compounded by the upcoming console cycle: if we look at what happened last generation, if/when games do shift upwards in VRAM usage it'll likely be a broad spike rather than a gradual increase.

The question of whether or not you need more VRAM is also a bit complex. A key issue has always been texture size (quality) and what the expectations are with that. Texture size increases have essentially negligible performance implications aside from VRAM usage. Some people consider max settings, along with max texture settings, a necessity. I'd also guess we might see some outlier games with texture assets designed for 8K resolutions. Some people will consider those assets "needed" for <8K (even for 1080p). So where you fall on that belief will influence your viewpoint on this.
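To put rough numbers on the texture side (my own back-of-the-envelope sketch, not from any engine; the BC7 bytes-per-texel figure and the ~4/3 mip-chain factor are standard, the rest is assumption):

# Rough VRAM cost of a single block-compressed texture, including its full
# mip chain (the chain adds roughly 1/3 on top of the base level).
def texture_vram_mb(width, height, bytes_per_texel=1.0, mipmaps=True):
    size = width * height * bytes_per_texel
    if mipmaps:
        size *= 4 / 3  # geometric series: 1 + 1/4 + 1/16 + ...
    return size / (1024 ** 2)

for res, label in [(2048, "2K"), (4096, "4K"), (8192, "8K")]:
    # BC7 is 1 byte per texel (16 bytes per 4x4 block)
    print(f"{label} texture (BC7, full mips): ~{texture_vram_mb(res, res):.0f} MB")

# Each doubling of texture resolution quadruples the memory, so a material set
# authored for 8K costs ~4x the same set at 4K - which is why the texture
# setting, not render resolution, tends to dominate the VRAM budget.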

In general there are always going to be outliers as well, not just across games but situationally within games. Some people will say they can't accept any compromise (e.g. all games including mods, zero stutters due to VRAM no matter where in the game), some will be fine with lower texture settings and the occasional stutter at some specific point, and there's everything in between.

If I had to guess, I think 10GB will age like a hypothetical 4GB-5GB card released at the onset of the last console generation would have (hypothetical since none existed; for context, mainstream Nvidia VRAM was 2GB/3GB with Kepler, and Maxwell a year later moved that to 2-4GB). Of course, whether or not 4GB-5GB would have been enough is debatable.
 
Lots. Pretty much anyone into esports or competitive gaming on PC.

There's a reason 1080p 360Hz screens are coming out.

HUB claim that Ampere isn't suited for low-res, high-refresh-rate gaming, since it seems to be bottlenecked by geometry workload at lower resolutions and not just by the CPU (see at 27m10s).

 
It wouldn't be a concern for current gaming workloads, but for what people are projecting going forward. [...] if/when games do shift upwards in VRAM usage it'll likely be a broad spike rather than a gradual increase.
I think you will likely see a shift in VRAM usage when consoles are no longer dependent upon a boatload of optimizations for performance/visual quality requirements. When rendering barriers between consoles and PCs no longer exist you might see a spike in VRAM usage (next console cycle, maybe), though I don't necessarily think there will be a need for developers to use more, outside the occasional game that may.
 
OcUK has probably done irreparable harm to its reputation, since many people were being charged an extra £50 between basket and checkout, among other nonsense.

Quite a few orders shipped out today, more shall ship tomorrow, we just cannot disclose numbers, more stock will arrive next week also.
https://www.overclockers.co.uk/foru...tuation-pricing.18899065/page-5#post-33949189

Gibbo said:
LeMson said:
Can you confirm there was no mountain or Fort of boxes because you didn't have stock or enough to build anything of substance, rather than the nvidia won't allow us spiel :p

Shipped a lot more than that today alone for delivery tomorrow. One forum member already has posted his card up and even fitted a water block and several others confirming shipments
https://www.overclockers.co.uk/foru...tuation-pricing.18899065/page-7#post-33949463

Compare with RX 5700 launch:

Sapphire and Powercolor NAVI stock we have 1000 units
https://www.overclockers.co.uk/forums/threads/18858719/page-7#post-32846534

NVidia seems to have been caught out by the plans for October and November: Ryzen, RDNA2, XSX and PS5 over a period of 5 or 6 weeks. It would seem that NVidia did the rational thing in launching ahead of this flurry rather than in the middle of it. It allows NVidia to "control the messaging".

Well, until the bots get to the FE orders. But it looks like it won't take the humans at NVidia long to sort through the botted orders and kill them entirely. So perhaps there'll be a few thousand 3080s appearing on NVidia's store over the next day.

“Our job at Bounce Alerts was to ensure our consumers were able to purchase the product for their needs,” the admin said.

[...]

“When given [the] chance, I’m sure most people would purchase more than 10+ units if they have the capital and look to make upwards of $25,000+ in one single day from [the] secondary market,” the admin said, later adding: “We hope they’re able to get on the next release!”
How a Bot Bought Dozens of RTX 3080 Units Before Consumers Could Grab Them

The people who botted and then eBay'd their cards have widely suffered their own bot attacks: bots have been written to auto-create fake eBay accounts and then bid up the prices on the scalpers' eBay listings :)

ahahahaha.jpg
 
Some people consider max settings, along with max texture settings, a necessity.
I guess my question is whether or not DLSS relieves this issue. If your rendering resolution is 1080p, will your texture quality realistically be 8K? I can't imagine a scenario where setting texture resolution >>> rendering resolution is going to look better with DLSS.

Do we have any comparisons on this type of thing, @Dictator, where we vary texture resolution and see its output with 4K DLSS?
 
I guess my question is whether or not DLSS relieves this issue. [...] I can't imagine a scenario where setting texture resolution >>> rendering resolution is going to look better with DLSS.

I wouldn't think so, not in this context at least for most cases. In most games the texture setting is independent of the resolution setting.

DLSS (or any method of upscaling) will alleviate VRAM usage from resolution increases (although I'm not sure if anyone has looked into how much VRAM DLSS itself uses?); presumably if it's upscaling from 1440p to 4K you'll have VRAM usage closer to 1440p instead of native 4K. But at least my understanding is that the resolution delta itself generally is not the largest contributor to VRAM usage; game settings are (and for the current gen the bulk of that being textures). For instance, here's one example with Shadow of the Tomb Raider (at least in terms of allocation) - https://www.overclock3d.net/reviews/software/shadow_of_the_tomb_raider_pc_performance_review/13

1080p to 4K is roughly a 1GB difference in allocation, while lowest to highest settings is a 3GB difference.
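For a sense of why the resolution side is the smaller contributor, here's an illustrative estimate (assumed buffer counts and formats, not any specific engine's layout):

# Rough render-target memory at different output resolutions, assuming a
# deferred-style setup: several G-buffer/post/history targets averaging
# ~8 bytes per pixel each - purely illustrative numbers.
def render_targets_mb(width, height, targets=6, bytes_per_pixel=8):
    return width * height * targets * bytes_per_pixel / (1024 ** 2)

for w, h, label in [(1920, 1080, "1080p"), (2560, 1440, "1440p"), (3840, 2160, "4K")]:
    print(f"{label}: ~{render_targets_mb(w, h):.0f} MB of render targets")

# Even with generous assumptions this is a few hundred MB at 4K - in the same
# ballpark as the ~1GB allocation delta above once other resolution-scaled
# buffers are counted, and far smaller than the multi-GB swing from settings.
# With DLSS most of these buffers live at the internal (e.g. 1440p) resolution,
# which is why upscaling pulls VRAM usage back toward the lower figure.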

In terms of the texture setting itself, I think some users will feel the need to max the setting regardless (or even go beyond with add-on downloads and mods). I actually don't think most users really consider the real noticeability of higher texture settings relative to resolution. If the odd game, for instance, starts offering assets for 8K resolution, users will still try to max it even if they're playing at 1080p. Those users will then consider 10GB (as I think it would struggle with native 8K texture assets) as maybe not enough for 1080p.

Which of course comes back to it being difficult to generically answer the question of whether 10GB will be enough, even if you could forecast perfectly. Users who are flexible and don't think you need to max everything and/or can deal with the occasional stutter in the occasional game will feel differently from those who can't deal with those "compromises," with everyone else in between having a different opinion.
 
I gather you want to suggest there'll be a lot more GA102 chips than N21 ones, which is a futile discussion because you know neither the yields nor the wafer allocation of either chip.
But you're somehow fine with suggesting the opposite two posts above?

"Lack of volume" or demand which is way higher than even that which we saw back at Pascal launch?
I don't really see any "lack of volume" considering that the cards are already on sale here in Russia even - doubtful that this would be case if there would be a serious lack of volume of production.
What I do see is the absolutely insanely front loaded demand - which is basically impossible to satisfy without going into inventory overstock - which NV has specifically said to be aiming at avoiding in the future after the Pascal inventory surplus at the end of the mining boom in 2018.
I'm expecting this situation to gradually resolve itself over the next month or so - can take more or less depending on a country/territory.
 
I think 10GB will be enough for most things. It's what the XSX (and probably the PS5) has allocated for VRAM. That some games today can fill the whole 10GB doesn't mean future games require that or more. Games tend to fill the whole pool, even if you have 16GB.

The 2GB and 3GB GPUs from 2012 (7870, 7950, etc.) have held up quite well even today (if you match console settings, of course).

Otherwise, wait for the 20GB version or a 30TF 16GB HBM RDNA2 product. Or the 24GB 3090 if you're unsure :p
 
But you're somehow fine with suggesting the opposite two posts above?
I did not. This is a lie. When you pushed me to make comparisons between the RTX 3080 and N21, my answer was "Who knows what to expect".



"Lack of volume" or demand which is way higher than even that which we saw back at Pascal launch?
I don't really see any "lack of volume" considering that the cards are already on sale here in Russia even - doubtful that this would be case if there would be a serious lack of volume of production.

Yeah right.




People everywhere are suspecting the card was never actually in stock beyond a couple of units to make the (mostly paper) launch. There are retail stores that had two cards for sale, and there were Best Buys that received none. Getting two cards, or zero, in a store means a lack of volume, not too much demand.


This isn't a case of demand > volume. It's a $700 card, and there aren't that many people who are capable of/willing to spend that much money on a GPU for games.


https://www.mooreslawisdead.com/post/nvidia-s-ultimate-play

I will start this post by getting straight to the point – I have evidence that suggests Nvidia is trying to have their cake and eat it too when it comes to the perceived price/performance of their new Ampere RTX 30-Series lineup. They are attempting to appear to be launching a lineup that is priced lower than their much maligned Turing generation, but in reality these things will cost far more than they are letting on for the overwhelming majority of shoppers this fall.
 
Yeah right.
Even still, ultimately, this launch saw unprecedented demand. Inventory was similar to the RTX 20 launch, but the demand has been astronomically high for PC parts since March of this year, and that hasn't slowed down; further still, bolstering this, the product was relatively highly hyped from months of leaks, and it also performed objectively well overall. These all combine to lead to angry customers, evidently, and that's something we're talking about today.
Yeah. Right.

This guy is still around? Wow.
 
I wonder if Nvidia has any kind of channel these days with developers to prioritize them regarding supply of new cards?
I remember the days where ATI and Nvidia would actually supply upfront free cards to developers.
I still have a free Radeon 9700 pro and a Fermi GTX 480, and I'm working on some really hot software right now :)
 
No matter how expensive these GPUs are, they're always sold out at launch. People want these things; it's a very big market. Perhaps the same as with the PS5: no matter how expensive they were, people would order them anyway until they're out of stock.

Anyway, I hope AMD can match these 30TF monsters; we need more competition. Fun times ahead.
The 16GB and 20GB 3070 and 3080 models would be nice, too.
 
DLSS (or any method of upscaling) will alleviate VRAM usage from resolution increases [...] But at least my understanding is that the resolution delta itself generally is not the largest contributor to VRAM usage; game settings are (and for the current gen the bulk of that being textures). [...] Which of course comes back to it being difficult to generically answer the question of whether 10GB will be enough, even if you could forecast perfectly.
So I definitely agree with the idea that the raw amount of storage is going to be critical as the years move on. But when I try to parallel what's happening in the console space, with SSDs feeding into VRAM and things like improved sampler feedback tied into tiled resources, it does appear to (a) only bring in the textures that are needed and (b) reduce the amount of VRAM space needed for texture buffering. Combined with DirectStorage, the question for me is whether in the next 7 years we will see that happen in the PC space as well, or whether the adoption time will be too long.
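As a toy illustration of that residency idea (not real D3D12 sampler-feedback code, just the bookkeeping it enables), the win is that only the mip levels the sampler actually touched need to stay in VRAM:

# Toy model of feedback-driven texture residency: instead of keeping every mip
# of every texture resident, stream in only the mips that sampling requested
# last frame (the information sampler feedback reports in hardware).
def resident_bytes(base_bytes, feedback_min_mip, total_mips):
    # Each mip level is 1/4 the size of the one above it.
    return sum(base_bytes / (4 ** m) for m in range(feedback_min_mip, total_mips))

base = 8192 * 8192 * 1          # an 8K BC7 texture: 64 MiB at mip 0
full = resident_bytes(base, 0, 14)
trimmed = resident_bytes(base, 4, 14)  # feedback says mip 4 is detailed enough
print(f"fully resident:    {full / 2**20:.1f} MiB")
print(f"feedback-trimmed:  {trimmed / 2**20:.1f} MiB")

Whether PC titles actually lean on that combination (DX12U sampler feedback + tiled resources + DirectStorage) as aggressively as the consoles will is exactly the open question.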

My expectation is for both AMD and Nvidia to be supported in this way, as we see with DX12U being a baseline feature set across both consoles and the RDNA 2 and Turing+ lineups. I have to question whether it's necessary to get more memory, as this seems to be one of the largest cost components for both GPUs and consoles. Moving from 10GB to 16GB would likely push price points up by at least 100+ on any card, while still having the same amount of bandwidth. Is this desire for expanded capacity rooted in older rendering methods/views? Or would games be very different if operating from a baseline of, say, 24GB?
 
NVIDIA GeForce RTX 3080 Overclocked To 2340 MHz, Achieves 3DMark Time Spy World Record With Over 10,000 Points
However, even with existing GA102 dies, overclockers managed to hit clock speeds north of 2300 MHz. Interestingly, all overclockers have not touched the memory during the overclocks which means that we could get even higher scores in the coming days as more overclockers get access to custom-designed board partner cards.
https://wccftech.com/nvidia-geforce-rtx-3080-overclocked-achieves-3dmark-time-spy-world-record/
 
How Bots Took Over the RTX 3080 Launch

https://www.tomshardware.com/news/how-the-bots-stole-rtx-3080-launch-mass

The Nvidia GeForce RTX 3080 launched yesterday, then promptly sold out in 5 minutes according to stores like Newegg and in 15 seconds according to some very disappointed would-be customers who weren’t able to place an order. The likely culprit? Bots run by scalpers. The evidence is pretty cut and dry, but now thanks to the folks at PC Mag, we have admissions from the people running the bots, as well as an explanation for how exactly they snatched so many units away from folks hoping to get the new best graphics card.

In an article published yesterday, PC Mag spoke to the people behind Bounce Alerts, as well as a few of their customers, about how the company’s automated purchasing bots work. For the uninitiated, Bounce Alerts is a service that members can subscribe to for $75 a month which then gives them access to scripts that monitor store pages and automatically purchase items when they go in stock. The company first sprung up in the sneakers market, but there’s nothing stopping savvy resellers from applying it to tech as well.

Bounce Alerts first came up as a culprit for hogging the 3080 launch when some resellers took to Twitter to publicly thank the company. Some of these posts have since been deleted, probably to avoid orders being cancelled by Nvidia, but others are still up. One particular deleted post, which PC Mag caught in a screenshot before it got taken down, showed a reseller who managed to buy 42 units thanks to the script.
 