Nvidia's 3000 Series RTX GPU [3090s with different memory capacity]

I imagine they will have no problem selling all of these cards. Nvidia will go higher next time. Margins must be awesome.

I got a kick out of the kitchen video and the CEO talking about Fortnite parties or whatever.

AMD ought to do one of their pricing disruption bouts again with their next products to try to gain market share.
 

Yes, it was in the kitchen of the NV boss himself, lol. I thought their show was much better than either MS's or Sony's shows.
 
Was it really? I mean, who has that many spatulas?
[image: 0KVgOJj.jpg]
 
Hum, after watching this particular video from Linus and Luke, I'm starting to wonder if Nvidia is mischaracterizing how they arrived at the 3000 series TF performance numbers. Is it simply based on raw throughput, or is it factoring in [better] DLSS optimizations, inflating TF metrics favoring their next-generation cards? Because Linus seems to be pointing in that direction in a roundabout way (he's under NDA at the moment): that the 3000 series gains over the 2000 series are mostly rooted in [better] optimizations, not purely raw hardware strength.

 
How do they classify the FP32 + [ FP32 | INT32 ] change? Is that "[better] optimizations" or "purely raw hardware strength"?
 

Hopefully someone with more knowledge on this subject can answer that. As Linus pointed out, scene optimizations (or tricks) back in the day were frowned upon when arriving at raw GPU performance numbers.

I remember the battles between Nvidia and ATI over certain hidden image-quality optimizations (e.g., trilinear filtering optimizations) inflating GPU performance/fps numbers.
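
On the TFLOP side of that question: the headline TFLOP figure is just peak arithmetic throughput, so the doubled FP32 datapath shows up in it directly. A rough back-of-the-envelope sketch (Python; the core counts and boost clocks below are the publicly listed spec-sheet figures, used here as assumptions rather than measured numbers):

```python
def peak_fp32_tflops(fp32_lanes: int, boost_clock_ghz: float) -> float:
    """Peak FP32 TFLOPS, assuming one fused multiply-add (2 FLOPs) per lane per clock."""
    return fp32_lanes * 2 * boost_clock_ghz / 1000.0

# Turing RTX 2080: 2944 FP32 cores at ~1.71 GHz reference boost
print(peak_fp32_tflops(2944, 1.71))   # ~10 TFLOPS

# Ampere RTX 3080: 8704 FP32-capable lanes at ~1.71 GHz boost, but half of them
# sit on the shared FP32/INT32 datapath
print(peak_fp32_tflops(8704, 1.71))   # ~30 TFLOPS
```

By that math the bigger number is "raw hardware" in the peak-throughput sense, but it assumes no INT32 work is occupying the shared datapath, so how much of it shows up in games depends on the instruction mix. That would also explain why the TFLOP jump looks so much larger than the claimed in-game gains.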
 
Didn't a certain company push 8- and 16-bit shaders because the other team had 24-bit shaders that ran faster than their 32-bit shaders, ran about the same speed as their 16-bit ones, and looked better in Half-Life 2?
 
Rise from the grave...
It comes down to how much precision is sufficient.
Even when the transition to 32-bit registers began, it was recognized that some workloads did not need that much precision. ATI went with an intermediate 24-bit precision for pixel shaders for a time.
There are processing steps and workloads that are fine with 16 bits or fewer, such as various postprocessing steps on targets that are already lower-precision, neural network training (with inference even going lower), or algorithms that will iteratively approach a sufficiently accurate result.
Some things, such as addressing larger memory spaces, position data, or algorithms that accumulate data (and error) from multiple sources, benefit from more bits in the representation.

That would need to be weighed against the costs of having higher precision: power, hardware, storage, bandwidth.

For a period of time, the benefit in reaching the same level of precision used for other programmable architectures, larger memories, and more advanced algorithms was high enough to justify bumping everything to a consistent and sufficient internal precision. Then limited bandwidth growth, power consumption, and poorer silicon density+cost gains gave designers a reason to revisit that initial trade-off.

We may not necessarily be done at FP16, as some workloads can get away with less and there are low-power concepts with FPUs that dynamically vary their precision to shave things down further.
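
To make the accumulation point concrete, here's a minimal sketch (plain Python/NumPy, nothing GPU-specific, just illustrating the rounding behavior):

```python
import numpy as np

small = np.float16(0.001)
total16 = np.float16(0.0)
for _ in range(100_000):
    total16 += small              # accumulate in 16-bit

total32 = np.float32(0.0)
for _ in range(100_000):
    total32 += np.float32(0.001)  # accumulate in 32-bit

print(total16)  # 4.0: the sum stalls once 0.001 falls below half an FP16 ulp of the running total
print(total32)  # ~100.0: still carries rounding error, but in the right ballpark
```

The same nominal sum lands on very different answers because every addition is rounded to the accumulator's precision, which is exactly the kind of error a wider (or mixed-precision) accumulator avoids.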

More towards your question on Half-Life 2:
One of the presenters at Shader Day was Gabe Newell of Valve, and it was Gabe's presentation that the information we published here yesterday came from. According to Gabe, during the development of Half-Life 2, the development team encountered some very unusual performance numbers. Taken directly from Gabe's slide in the presentation, here's the performance they saw initially:

[benchmark chart from Gabe's slide not included in the quote]

As you can guess, the folks at Valve were quite shocked. With NVIDIA's fastest offering unable to outperform a Radeon 9600 Pro (the Pro suffix was omitted from Gabe's chart), something was wrong, given that in any other game, the GeForce FX 5900 Ultra would be much closer to the Radeon 9800 Pro in performance.

Working closely with NVIDIA (according to Gabe), Valve ended up developing a special codepath for NVIDIA's NV3x architecture that made some tradeoffs in order to improve performance on NVIDIA's FX cards. The tradeoffs, as explained by Gabe, were mainly in using 16-bit precision instead of 32-bit precision for certain floats and defaulting to Pixel Shader 1.4 (DX8.1) shaders instead of newer Pixel Shader 2.0 (DX9) shaders in certain cases. Valve refers to this new NV3x code path as a "mixed mode" of operation, as it is a mixture of full precision (32-bit) and partial precision (16-bit) floats as well as pixel shader 2.0 and 1.4 shader code. There's clearly a visual tradeoff made here, which we will get to shortly, but the tradeoff was necessary in order to improve performance.

The first thing that comes to mind when you see results like this is a cry of foul play; that Valve has unfairly optimized their game for ATI's hardware and thus it does not perform well on NVIDIA's hardware. Although it is the simplest accusation, it is actually one of the less frequent ones we've seen thrown around.
 
inflating TF metrics favoring their next-generation cards.

If they would go so far as to outright lie about performance, why didn't they advertise those numbers in the NV show two weeks ago? Most don't even know about the TF metrics as far as I can see, only that going from a 2080 to a 3080 nets close to a 100% increase in performance; not even DF talked much (if at all) about TF numbers.

If you're gonna lie and make things up, better to use it as PR from the beginning so it gains traction, instead of letting it fly under the radar basically everywhere and getting caught anyway. But we will see soon if they made things up.
 
If they would go so far as to outright lie about performance, why didn't they advertise those numbers in the NV show two weeks ago?

Linus and Luke are talking about that show and the performance metrics stated during it. Linus is currently under NDA and can't speak about performance metrics, especially how Nvidia arrived at them.

Most don't even know about the TF metrics as far as I can see, only that going from a 2080 to a 3080 nets close to a 100% increase in performance; not even DF talked much (if at all) about TF numbers.

Most PC enthusiasts, especially core PC gamers shopping for $400+ GPUs, will know more about the cards' performance and feature sets than your typical console gamer.

If you're gonna lie and make things up, better to use it as PR from the beginning so it gains traction, instead of letting it fly under the radar basically everywhere and getting caught anyway. But we will see soon if they made things up.

It's not about lying, but more about understanding how Nvidia is arriving at these final performance numbers. Is it at the driver level, with image and rendering optimizations such as DLSS, or is it simply raw hardware strength?
 
If they would go so far as to outright lie about performance, why didn't they advertise those numbers in the NV show two weeks ago? Most don't even know about the TF metrics as far as I can see, only that going from a 2080 to a 3080 nets close to a 100% increase in performance; not even DF talked much (if at all) about TF numbers.
What? Nvidia had TFLOP numbers during their presentation and on their announcement page, using the same slides.

[image: geforce-rtx-30-series-2nd-gen-rtx.jpg]
 
Aha, never noticed those. They should have been louder about them if they were made up / not representative, etc. I think in the end most are interested in bench/real performance. Anyway, we will know soon enough whether the TF numbers are indicative of real performance or not.
 
I think in the end most are interested in bench/real performance.
Yes, which is why waiting until after reviews is always best. But Nvidia marketing loves generating hype and inflating numbers (well, any company does) to encourage pre-orders and instill a sense of FOMO.
 
Just like every other corporation ;)
 
Edit: Links Updated - Part 1; more links on Part 2
Sept. 16, 2020 11am EST


Ampere RTX Reviews.

3DMGame NVIDIA GeForce RTX 3080 Founders Edition
4Gamer NVIDIA GeForce RTX 3080 Founders Edition
Adrenaline NVIDIA GeForce RTX 3080 Founders Edition [Video]
arhn.eu NVIDIA GeForce RTX 3080 Founders Edition
Ascii NVIDIA GeForce RTX 3080 Founders Edition
AusGamers NVIDIA GeForce RTX 3080 Founders Edition
Bit-Tech NVIDIA GeForce RTX 3080 Founders Edition
Bitwit NVIDIA GeForce RTX 3080 Founders Edition
Benchmark NVIDIA GeForce RTX 3080 Founders Edition
blackwhite TV NVIDIA GeForce RTX 3080 Founders Edition
ComptoirHardware NVIDIA GeForce RTX 3080 Founders Edition
ComputerBase NVIDIA GeForce RTX 3080 Founders Edition
Cowcotland NVIDIA GeForce RTX 3080 Founders Edition
ElChapuzasInformatico NVIDIA GeForce RTX 3080 Founders Edition
Eurogamer / DigitalFoundry NVIDIA GeForce RTX 3080 Founders Edition [Video]
Expreview NVIDIA GeForce RTX 3080 Founders Edition
GamersNexus NVIDIA GeForce RTX 3080 Founders Edition
GDM NVIDIA GeForce RTX 3080 Founders Edition
Geeknetic NVIDIA GeForce RTX 3080 Founders Edition
Gear Seekers NVIDIA GeForce RTX 3080 Founders Edition
Golem NVIDIA GeForce RTX 3080 Founders Edition
Greg Salazar NVIDIA GeForce RTX 3080 Founders Edition
Guru3D NVIDIA GeForce RTX 3080 Founders Edition
Forbes NVIDIA GeForce RTX 3080 Founders Edition
HardwareBattle NVIDIA GeForce RTX 3080 Founders Edition
HardwareCanucks NVIDIA GeForce RTX 3080 Founders Edition
HardwareLuxx German NVIDIA GeForce RTX 3080 Founders Edition
HardwareLuxx Russian NVIDIA GeForce RTX 3080 Founders Edition
HardwareUnboxed NVIDIA GeForce RTX 3080 Founders Edition
HardwareUpgrade NVIDIA GeForce RTX 3080 Founders Edition
Hot Hardware NVIDIA GeForce RTX 3080 Founders Edition
Hexus NVIDIA GeForce RTX 3080 Founders Edition
igor’sLAB NVIDIA GeForce RTX 3080 Founders Edition [Video]
ITMedia NVIDIA GeForce RTX 3080 Founders Edition

Thanks to Videocardz!
https://videocardz.com/newz/nvidia-geforce-rtx-3080-founders-edition-review-roundup
 