Nvidia Ampere Discussion [2020-05-14]

The scariest part is when they said the main avenue to higher performance is more power. I guess power efficiency is no longer cool in Nvidia land.

Well, what they said is generally true of all components. Performance doesn’t come for free. That said, if a 3080 is 300W, it would have to be a bigger-than-50% performance jump over the 2080, otherwise there’s a problem.
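Quick napkin math on that (the ~215W for the 2080 is my assumption, the 300W figure is the rumour):

# Perf/W sanity check; the 2080 board power is an assumption, not a confirmed spec.
old_power_w = 215.0    # approx. RTX 2080 reference board power (assumed)
new_power_w = 300.0    # rumoured 3080 board power

# Uplift needed just to keep perf/W flat:
print(f"Break-even uplift: {new_power_w / old_power_w - 1:.0%}")                   # ~40%

# A 50% jump at 300W would still be a modest efficiency gain:
print(f"Perf/W change at +50% perf: {1.5 * old_power_w / new_power_w - 1:+.0%}")   # ~+8%

So anything under roughly +40% at 300W would actually be a perf/W regression.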
 
Old engines using outdated APIs. I think once we see true next-gen engines flexing their muscles, they should be using textures much more efficiently. Sampler Feedback, Windows DirectStorage, etc. should really help keep VRAM requirements sane.

Not that I wouldn't mind a doubling of VRAM throughout the entire stack. What's up with Samsung's and Micron's 16 Gbit density modules anyway? Their websites have reported those modules as still "sampling" for almost a year now. Have they fallen asleep or something?

16 Gbit GDDR6 is already in use and has been for a while, whether it's the 5500 XT 8GB using them in a 4x2GB configuration or the Quadro RTX 8000 48GB in a 12x2x2GB configuration. The issue is cost, as memory pricing basically follows a $/bit formula.
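Just to lay out the capacity math behind those two examples (the chip counts are the ones quoted above, nothing new):

# VRAM capacity from GDDR6 chip density and count; illustrating the two configs above.
def vram_gb(chip_count, density_gbit=16):
    return chip_count * density_gbit / 8    # 8 Gbit = 1 GB

print(vram_gb(4))         # 5500 XT: 4 chips x 2 GB = 8.0 GB
print(vram_gb(12 * 2))    # Quadro RTX 8000: 12 channels x 2 chips (clamshell) = 48.0 GB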

Memory bit cost has basically been stagnant due to a variety of industry reasons that are really beyond the control of either GPU IHV: the diminishing economics of manufacturing improvements via process shrinks, industry consolidation, and demand from "new" sources (e.g. smartphones).

The MS XSX Hot Chips presentation, for instance, had a slide regarding this issue: https://images.anandtech.com/doci/15994/202008180215571.jpg
You can just look at how both consoles were much more conservative in memory increase compared to last gen.

GDDR6 per bit is more expensive than GDDR5 was when we last essentially doubled VRAM in the industry.

If we take the 5500 XT as a reference, an extra 10GB of GDDR6 would likely add at least $100 to the price of a GPU, in this case the hypothetical RTX 3080. That can be a hard sell, as it's trickier to market to gamers than pure perf/$ at this point.
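To show where that estimate lands, here's the rough arithmetic; the $/GB figure is my own ballpark for effective cost to the board maker, not a quoted price:

# Rough BOM-cost sketch; the per-GB figure is an assumption, not real pricing data.
gddr6_usd_per_gb = 10.0     # assumed effective cost including margin stacking
extra_capacity_gb = 10      # e.g. a 20 GB card instead of 10 GB
print(f"Added cost: ~${gddr6_usd_per_gb * extra_capacity_gb:.0f}")   # ~$100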

Was hoping for more than 10 GB. We’re at a generational switch in games because of the new console hardware.

Relatively speaking, we're actually in a better state compared to the last console launch, as this new console generation was much less aggressive in its memory increase.

Remember, when the PS4/Xbox One (8GB) launched we were still on the Nvidia 7xx series (2GB and 3GB being standard configs) and the AMD 2xx series (2, 3, 4GB being standard configs). Outside of upped-memory options, only the Titan (6GB) and the R9 290/X (4GB) had half the console memory or more. Whereas currently 8GB availability is pretty wide in PC graphics by comparison. It wasn't really until 2.5 years after the console launch that we moved to today's values (which have been stagnant ever since), with ample VRAM matching (or exceeding) console memory.
 
The 3090 with 24GB is already using the chips which could be used to create a 3080 with 20GB. Or is there any other way for the 3090 to have 24GB?
The 3090, if it's really 24GB, is using 8Gb GDDR6X chips in clamshell mode. 16Gb chips are coming next year if everything goes well.
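Here's the clamshell arithmetic, assuming a 384-bit bus with x32 GDDR6X devices (the bus width is an assumption on my part, not something confirmed):

# Clamshell capacity sketch; bus width and chip width are assumptions for illustration.
bus_width_bits = 384
chip_width_bits = 32                           # each GDDR6X device is x32
chips = bus_width_bits // chip_width_bits      # 12 devices in normal mode

print(chips * 2 * 8 / 8)      # clamshell: 24 x 8Gbit (1 GB) chips = 24.0 GB
print(chips * 16 / 8)         # future 16Gbit chips: 12 x 2 GB = 24.0 GB, no clamshell needed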
 
[Image: NVIDIA-GeForce-RTX-30-Series-Cooling-1030x579.jpeg]


Is the fan on the bottom of the card an intake in a blower config? Does it pull in air and blow it out of the venting in the PCIe bracket? And does the top fan blow air towards the top of the case?
 

I think they are both axial. The one near the bracket works like any other axial fan, and the one near the end blows through the fins (well, actually it's sucking) since it's not blocked by the PCB.

The problem I have with this design is that you are blowing very hot air directly onto the CPU cooler.
 

The air in any PC case is going to be mixed anyway, isn't it? Most air will exhaust from the top or rear fan unless you're using a blower card, no?
 
Well, what they said is generally true of all components. Performance doesn’t come for free. That said, if a 3080 is 300W, it would have to be a bigger-than-50% performance jump over the 2080, otherwise there’s a problem.

Exactly. It would be awesome if Nvidia was doing this because they wanted to and not because they had to. Efficient architecture + more power? Yes please.
 
[Image: Seasonic modular PSU cable with 12-pin connector]

The PSU end for modular PSUs is not standardized, not even within the same manufacturer. It would be rather difficult, logistically, to supply something like that as an included adapter. The above picture is likely something Seasonic (and other PSU manufacturers) may offer as an add-on for their own products.

As for the included adapter with 3090 FEs, I think it's a bit tricky. 3x8-pin to 12-pin? Leveraging EPS connectors?
 
There are three active wires in an 8-pin cable. Two 8-pins would work fine with a 12-pin connector; three would need 18 wires to work.

What is interesting is how much current this 12-pin connector will be able to carry. I suspect it should be able to carry more than the 300W of 2x8-pin, which means that not all PSUs will be compatible through the use of 8-pin cables - maybe not with Ampere specifically, but eventually at some point.
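Following the three-active-wires count above (each +12V wire needs a ground return), a quick tally of wires versus the usual spec wattage - the 12-pin's real rating is unknown, so these are just the standard 8-pin numbers:

# Wire count vs. spec wattage; per-connector rating is the usual PCIe CEM figure.
PCIE_8PIN_SPEC_W = 150    # spec rating per 8-pin connector
WIRES_PER_8PIN = 6        # 3x +12V plus 3x ground actually carry current

for n in (2, 3):
    print(f"{n}x 8-pin: {n * WIRES_PER_8PIN} wires, {n * PCIE_8PIN_SPEC_W} W by spec")
# 2x 8-pin: 12 wires, 300 W  -> maps 1:1 onto a 12-pin's pin count
# 3x 8-pin: 18 wires, 450 W  -> more wires than a 12-pin has pins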
 
The PSU end for modular PSUs is not standardized, not even within the same manufacturer. It would be rather difficult, logistically, to supply something like that as an included adapter. The above picture is likely something Seasonic (and other PSU manufacturers) may offer as an add-on for their own products.
Not sure I get this. 8-pin female connectors are very standard with many PSUs designed to run high end graphics cards.
 
Maybe they've sorted things out by now, but a few years back I recall horror stories of burnt-out components from enthusiasts who accidentally used cables from one PSU on another PSU.
 
Not sure I get this. 8-pin female connectors are very standard with many PSUs designed to run high end graphics cards.

The female 8-pin connector on the PSU side is not a standard key, so it would be hard for Nvidia to include an adapter. Well, unless Nvidia includes a male 12-pin to 2x/3x standardized female 8-pin adapter. That would work.
 
There are three active wires in an 8-pin cable. Two 8-pins would work fine with a 12-pin connector; three would need 18 wires to work.

Wouldn't there be a safety issue with regards to power draw? In theory the maximum for 2x8-pin is 300W (150W each). It's presumed part of the reason here is that the draw will need to exceed that. AIB models using the more conventional PCB are rumoured to use 3x8-pin, I believe.

Then there's also the issue on the PSU end. Given the Seasonic example, it seems like at least they feel you need two PSU-end connectors. Many PSUs have single PSU->2x8-pin cables. You'd run a heavy user-error risk here as well if you simply put in the instructions to use separate cables.

I see your point regarding the wire issue with 3x8-pin though. So perhaps it'll need to be 4x8-pin. Or something else more complex.
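For context, the standard budgets stack up like this (spec numbers only; actual cables can usually deliver more than the spec minimum):

# Board power budget by spec; these are the standard PCIe numbers, not leaked figures.
SLOT_W = 75            # PCIe x16 slot
EIGHT_PIN_W = 150      # per 8-pin connector

for n in (2, 3, 4):
    print(f"{n}x 8-pin + slot: {SLOT_W + n * EIGHT_PIN_W} W")
# 2x: 375 W, 3x: 525 W, 4x: 675 W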

Not sure I get this. 8-pin female connectors are very standard with many PSUs designed to run high end graphics cards.

The Seasonic picture is a modular PSU cable designed to connect directly from the PSU to the 12-pin.

The end at the PSU side for modular PSUs is not standardized; the component-connection end of those modular cables follows existing standards, such as the PCIe 8-pin for GPUs.
 