Nvidia Ampere Discussion [2020-05-14]

I guess they doubled the ALUs and aren't worried about feeding them fully, since they're limited by other things anyway, like bandwidth or power? Maybe they can be used for specific things in RT. It probably doesn't cost that much area to just add them, as long as they don't add too much extra silicon trying to feed them.
 
I'm excited. The 3070 beating the 2080 Ti at a cost of $500. This makes sense. PC gaming is saved :)
Maybe that's a hint of good competition as well...
Just keep in mind it's likely beating the 2080 Ti in RT-limited games, not necessarily across the board. Even if it only comes fairly close in standard FP32, it's still a nice deal, though.
 
According to DigitalFoundry: 2080 to 3080 is 65% to 80% faster, and 90% to 100% faster in RT games.
Impressive, the 3080 is stronger than I had thought it would be. Did a better job of hyping me than the presentation itself.
Just keep in mind it's likely beating the 2080 Ti in RT-limited games, not necessarily across the board. Even if it only comes fairly close in standard FP32, it's still a nice deal, though.
Check out the DF video: the 3080 absolutely crushes the 2080, by 70% or more in non-RT games. Don't underestimate the 3070 in non-RT games, either.
 
I guess they doubled the ALUs and aren't worried about feeding them fully, since they're limited by other things anyway, like bandwidth or power?

I don't really think it's just bandwidth, etc. I still believe what was discussed here before, when the 2xFP rumours arose: each SM can do either 1 FP + 1 INT per clock like Turing does, or 2x FP per clock. Since, according to Nvidia (in the Turing presentation), games on average issue 36 INT ops per 100 FP ops, Turing would execute on average 136 ops per 100 clocks despite in theory being capable of 200. Gaming Ampere does 200 regardless of the FP/INT mix. And if you do the math, 200/136 = 1.47, so about a 50% faster SM on average, though it probably varies wildly from game to game.
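
A minimal back-of-the-envelope sketch of that math (the 36-INT-per-100-FP figure is Nvidia's own Turing-era average; the rest is just arithmetic, not a real pipeline model):

# Turing issues 1 FP32 + 1 INT32 op per clock; gaming Ampere can
# issue either 1 FP32 + 1 INT32 or 2 FP32 per clock instead.
def relative_sm_throughput(int_ops_per_100_fp: float) -> float:
    fp_ops, int_ops = 100.0, int_ops_per_100_fp
    # Turing: the busier pipe sets the clock count; the other runs alongside.
    turing_clocks = max(fp_ops, int_ops)              # 100 clocks for a 100:36 mix
    turing_rate = (fp_ops + int_ops) / turing_clocks  # 1.36 ops/clock
    ampere_rate = 2.0                                 # both pipes usable on any mix
    return ampere_rate / turing_rate

print(relative_sm_throughput(36))  # ~1.47, i.e. the ~50% faster SM quoted above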

It probably doesn't cost that much area to just add them, as long as they don't add too much extra silicon trying to feed them.

Sure, that's the thing. I have no idea how much an extra 16-wide FP32 SIMD would add, but even a 20% SM size increase for a 50% perf boost is pretty good. Though the question I've had since the 2xFP rumours arose is why they didn't do it for Turing already, when it makes so much sense. I mean, they actually had the numbers.
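
Making that trade-off explicit (the 20% area number is only the guess from the paragraph above, not a known figure):

perf_gain = 1.50  # ~200/136 from the FP/INT argument earlier in the thread
area_cost = 1.20  # assumed SM area increase; NOT a confirmed number
print(perf_gain / area_cost)  # ~1.25 -> ~25% more perf per unit of SM area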
 
I wonder how much faster the 3090 will be than the 3080. It seems like it will be power limited, with 82 SMs vs the 68 SMs of the 3080 but only 350W vs 320W, no? The huge amount of VRAM on the 3090 and the huge price gap between the two leave one wanting a 3080 Ti.
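
A rough sanity check on that, assuming a power-limited card can't scale past its power headroom (a crude model, not a prediction):

sm_ratio = 82 / 68       # ~1.21 -> up to ~21% more shader hardware
power_ratio = 350 / 320  # ~1.09 -> only ~9% more board power
print(f"{sm_ratio:.2f}x SMs, {power_ratio:.2f}x power")
# If both cards sit at their power limits, the realized gap should land
# closer to the ~9% power headroom than to the ~21% SM advantage.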
 
Is it? I thought that was also from a non-RT-game perspective.
Well, this is their slide:
307081kha.png


Since we have no breakdown or numbers yet, I think it's safe to assume that "multiple popular graphics-intensive games" includes a mix of RT and non-RT titles, which means that, as an average, it's probably winning in RT and losing in pure FP32.

Check out the DF video: the 3080 absolutely crushes the 2080, by 70% or more in non-RT games. Don't underestimate the 3070 in non-RT games, either.
The RT performance delta vs the non-RT delta isn't as different as you'd expect, though: ~85% vs ~70% respectively, 3080 vs 2080.

The 3080 has a fairly significant bandwidth advantage, whereas the 3070 might be rather starved for bandwidth compared to the 2080 Ti.
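
For reference, the paper-spec bandwidth math behind that (memory configs as announced; treat the 3070 numbers as preliminary):

# Bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps.
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(320, 19.0))  # RTX 3080, GDDR6X: 760 GB/s
print(bandwidth_gbs(256, 14.0))  # RTX 3070 and RTX 2080, GDDR6: 448 GB/s
print(bandwidth_gbs(352, 14.0))  # RTX 2080 Ti, GDDR6: 616 GB/s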

As always, just trying to temper expectations vs marketing :)

For example, their 3070 page has this graph, with BL3 compared to a 2080 Ti, which can easily push 90fps at 1440p. But for $500? Yeah, there are going to be some cheap 2080 Tis on the market soon...

3070-graph.PNG
 
@ShaidarHaran
Maybe this time you shouldn't buy any Nvidia card, and the price will drop fast? That's the way capitalism works. If you buy a 3080, Nvidia wins.

If AMD had a product on the market that would be an upgrade from the 2080 Ti I just sold, I would gladly consider it. That's highly unlikely, though. Also, I have a G-Sync display.
 
But for $500? Yeah, there are going to be some cheap 2080 Tis on the market soon...
Perhaps, but that's usually the case with any new architecture. As long as you're willing to forgo all the "new" features a new architecture brings, I guess you'll be fine.
 
What? How can a 60% increase in perf (roughly 50 to 80) for a 27% increase in SM count at the same clocks in any way confirm no IPC gain?

Let's wait for the real boost clocks before accounting for the actual IPC increase. That would certainly explain the higher power consumption vs the RTX 2070. I just don't think the Ampere CUDA core is as huge a jump from Turing as Nvidia wants us to believe. Just more of them. Nothing wrong with that per se.
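
For what it's worth, the per-SM gain implied by the quoted numbers, if clocks really were equal (boost clocks weren't confirmed yet, so this is only an upper bound):

perf_ratio = 1.60  # "roughly 50 to 80" fps from the post above
sm_ratio = 1.27    # the quoted SM-count increase
print(perf_ratio / sm_ratio)  # ~1.26 -> ~26% more perf per SM at equal clocks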
 