Nvidia Turing Product Reviews and Previews: (Super, Ti, 2080, 2070, 2060, 1660, etc.)

The Tomshardware review is also up.
I see the same frametime spikes in Wolfenstein II, and they also show up in Shadow of the Tomb Raider, both at 1440p.
https://www.tomshardware.com/reviews/nvidia-geforce-rtx-2060-ray-tracing-turing,5960-5.html


It also seems very clear that nvidia gave reviewers strong instructions not to show RTX 2060 results at 4K, probably to avoid graphs like this:

KP912PT.jpg
 
How do prices of 1070 versus 2060 compare? Positioning 2060 as a 1080p raytracing GPU seems reasonable to me so far.
 
How do prices of 1070 versus 2060 compare? Positioning 2060 as a 1080p raytracing GPU seems reasonable to me so far.

Looks like right now the MSRP is 30 USD less than the 1070's, and it's generally much slower than the 1070 at 1080p as well. I haven't seen any benchmarks of it with RT-enabled titles though, so I'm skeptical of how well it'll do in those games.

Might be decent for someone that just wants to dabble in RT though.

Regards,
SB
 
I haven't seen any benchmarks of it with RT-enabled titles though, so I'm skeptical of how well it'll do in those games.
It will likely provide a baseline level of gaming with RT and DLSS enabled; having both features enabled is part of the package for RT games.
 
The 2060 is very close to a 1080, sometimes matching the 1080, and beating it with VRS enabled. That plus RT and the other features, not bad. Still too expensive, though; $100 lower and the 2060 would be the clearly better value.
 
Looks like right now the MSRP is 30 USD less than the 1070's, and it's generally much slower than the 1070 at 1080p as well.
All reviews are saying the 2060 is on par with the 1070 Ti at 1080p and 1440p, am I missing something?
the RTX 2060 (6GB) truly performs like the GTX 1070 Ti. By the numbers, the RTX 2060 (6GB) is 2-3% faster than the GTX 1070 Ti at 1440p and 1080p
https://www.anandtech.com/show/13762/nvidia-geforce-rtx-2060-founders-edition-6gb-review/15
I haven't seen any benchmarks of it with RT-enabled titles though
ComputerBase did some: in Battlefield V it's 6% slower than the 2070 with low DXR at 1080p, and it averaged 77 fps in multiplayer
https://www.computerbase.de/2019-01...est/5/#abschnitt_battlefield_v_mit_raytracing


GamersNexus tested singleplayer, and the 2060 averaged 66 fps with low DXR at 1080p
https://www.gamersnexus.net/hwrevie...-founders-edition-review-benchmark-vs-vega-56

rtx-2060-bfv_1.png
 
@Picao84 will you please find someone else to fixate on? I'm not even slightly interested in this back-and-forth of yours.


So RTX 2060 is faster than GTX 1080 already? Yesterday, when I checked the reviews, average performance was at the level of the GTX 1070 Ti. I'd say the RTX 2060 is faster in high-FPS situations and slower in low-FPS situations. Significantly faster than the GTX 1070, but no way better than the GTX 1080. As for the GTX 1070 Ti, its 8GB of memory and more stable performance seem to make it a better solution.
The RTX 2060 seems to be equal to or slightly faster than the GTX 1080 in compute-heavy games, as long as they're not bottlenecked by fillrate or VRAM amount.
One such example is Wolfenstein II below 4K, especially because it's probably taking advantage of the 2x FP16 throughput of the Volta-derived ALUs.

So until we get more solid info on the impact of 6GB of VRAM on frametimes (which is still scarce), I think it's safe to say the RTX 2060 stands head and shoulders above the GTX 1070 Ti.
As games become more compute-centric and eventually make more use of FP16 pixel shaders, the RTX 2060 could be a safer long-term bet than the GTX 1080.
 

The RTX 2060 seems to be equal to or slightly faster than the GTX 1080 in compute-heavy games, as long as they're not bottlenecked by fillrate or VRAM amount.
One such example is Wolfenstein II below 4K, especially because it's probably taking advantage of the 2x FP16 throughput of the Volta-derived ALUs.

So until we get more solid info on the impact of 6GB of VRAM on frametimes (which is still scarce), I think it's safe to say the RTX 2060 stands head and shoulders above the GTX 1070 Ti.
As games become more compute-centric and eventually make more use of FP16 pixel shaders, the RTX 2060 could be a safer long-term bet than the GTX 1080.
VRAM would be indicative of the target resolution then?
Is 6GB enough for a proper 4K experience? I mean, just looking at the industry at large, 6GB seems like it's cutting it thin. The X1X has 12GB with at least 9GB dedicated to non-OS tasks. Most of the other higher-end cards are operating with at least 8GB.
 
*9GB for Scorpio games, while PC games currently appear to recommend 6-8GB cards for max texture quality (whether that's smart usage or not is another matter). A larger streaming cache is probably desirable for "4K-target" textures versus wasteful uncompressed textures.
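To put some rough numbers on that (my own back-of-the-envelope math, not from any review): a single 4096x4096 texture costs wildly different amounts of VRAM depending on whether it's block-compressed, which is why a streaming cache for "4K-target" assets doesn't have to be as huge as it sounds.

```python
# Back-of-the-envelope texture sizes (illustrative; real pools vary).
# A full mip chain adds roughly one third on top of the base level.

def texture_size_mb(width, height, bytes_per_texel, with_mips=True):
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if with_mips else base
    return total / (1024 ** 2)

print(texture_size_mb(4096, 4096, 4))    # uncompressed RGBA8:   ~85 MB
print(texture_size_mb(4096, 4096, 1))    # BC7 (1 byte/texel):   ~21 MB
print(texture_size_mb(4096, 4096, 0.5))  # BC1 (0.5 byte/texel): ~11 MB
```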

Things might balloon in the future if devs go for more volume texture storage or as the 2020 consoles raise the target bar anyway.
 
VRAM would be indicative of the target resolution then?
Is 6GB enough for a proper 4K experience? I mean, just looking at the industry at large, 6GB seems like it's cutting it thin.
I don't know.
If you look at 4K average framerates on the RTX 2060, save for an odd duck like RE7 it looks like 6GB of VRAM is indeed enough for now. But we still need to look at frame times, because if VRAM fills up then the GPU will have to wait for data from system RAM over the PCIe bus.

I suspect nvidia may be doing a colossal amount of driver work on a game-by-game basis to get the driver to select what is placed in VRAM at 330GB/s and what gets slowly streamed through the PCIe bus at 15GB/s. Like what AMD reportedly did with Fiji, before eventually developing HBCC to take that weight off the driver dev team.

Then again, word is that nvidia already did some of that work on the GTX 970 to avoid putting latency-sensitive data in those last slow 512MB of VRAM.
So maybe the work isn't colossal after all, and they have some automated tools that they only need to tweak for each game.
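To make that idea concrete, here's a toy sketch of what such a per-game placement heuristic might look like. To be clear, nobody outside nvidia knows how the driver actually decides this; the resource names, sizes and access counts below are invented purely for illustration.

```python
# Toy model of a residency heuristic: keep the most frequently accessed
# resources in VRAM (~330 GB/s on the RTX 2060) and demote the rest to
# system memory behind PCIe 3.0 x16 (~15 GB/s). Illustrative only.

VRAM_BUDGET_MB = 6 * 1024 - 512          # leave some headroom for OS/driver

# (name, size in MB, accesses per frame) -- hypothetical numbers
resources = [
    ("g_buffers",        300, 60),
    ("shadow_maps",      256, 30),
    ("hero_textures",   1800, 25),
    ("streaming_pool",  2500, 10),
    ("distant_textures", 1500,  2),
    ("audio_banks",      400,  0),       # never touched by the GPU
]

def place(resources, budget_mb):
    vram, sysram, used = [], [], 0
    # Hottest data first: score = accesses per MB kept resident.
    for name, size, hits in sorted(resources, key=lambda r: r[2] / r[1], reverse=True):
        if hits > 0 and used + size <= budget_mb:
            vram.append(name)
            used += size
        else:
            sysram.append(name)
    return vram, sysram

in_vram, over_pcie = place(resources, VRAM_BUDGET_MB)
print("VRAM:", in_vram)
print("PCIe:", over_pcie)
```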



The X1X has 12GB with at least 9GB dedicated to non-OS tasks. Most of the other higher-end cards are operating with at least 8GB.
Yet not everything in those 9GB needs 320GB/s of bandwidth. Perhaps the XboneX would be much better served with a fast 4GB at 512GB/s plus 16GB at 15GB/s duplex (if it didn't need to do BC with XBOne games, that is).
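Whether a split pool like that works out depends almost entirely on how much of the per-frame traffic you can keep in the fast portion. A quick effective-bandwidth calculation (the hit-rate figures are hypothetical, just to show how sensitive it is):

```python
# Effective bandwidth of a split memory pool, assuming some fraction of
# the traffic hits the fast 4GB and the rest spills to the slow 16GB.
# The hit-rate values are hypothetical, purely to show the sensitivity.

FAST_GBPS = 512.0   # fast 4GB pool
SLOW_GBPS = 15.0    # slow 16GB pool

def effective_bandwidth(fast_fraction):
    # Average time to move one unit of data, weighted by where it lives.
    t = fast_fraction / FAST_GBPS + (1 - fast_fraction) / SLOW_GBPS
    return 1 / t

for frac in (0.99, 0.95, 0.90, 0.80):
    print(f"{frac:.0%} fast-pool traffic -> ~{effective_bandwidth(frac):.0f} GB/s")
# 99% -> ~385, 95% -> ~193, 90% -> ~119, 80% -> ~67 GB/s
```

Misses to the slow pool hurt very quickly, which is presumably why the consoles went with one big uniform pool instead.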
 
I don't know.
If you look at 4K average framerates on the RTX 2060, save for an odd duck like RE7 it looks like 6GB of VRAM is indeed enough for now. But we still need to look at frame times, because if VRAM fills up then the GPU will have to wait for data from system RAM over the PCIe bus.

I suspect nvidia may be doing a colossal amount of driver work on a game-by-game basis to get the driver to select what is placed in VRAM at 330GB/s and what gets slowly streamed through the PCIe bus at 15GB/s. Like what AMD reportedly did with Fiji, before eventually developing HBCC to take that weight off the driver dev team.

Then again, word is that nvidia already did some of that work on the GTX 970 to avoid putting latency-sensitive data in those last slow 512MB of VRAM.
So maybe the work isn't colossal after all, and they have some automated tools that they only need to tweak for each game.

Yet not everything in those 9GB needs 320GB/s of bandwidth. Perhaps the XboneX would be much better served with a fast 4GB at 512GB/s plus 16GB at 15GB/s duplex (if it didn't need to do BC with XBOne games, that is).
I feel as though we've had this discussion somewhere on B3D before. We did talk about the amount of VRAM required for GPUs. I suspect that certain virtual texturing/streaming approaches could get away with less RAM, and there are times where I think more RAM would be required. But how much does resolution actually matter here? I sort of agree that 4K frame buffers can be big, but 6GB should cover it. So are we really talking about 4K at ultra settings, or about high-resolution textures, which is where VRAM amounts matter?

And that driver thing, ugh, sounds so unsustainable.
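On the "how big are 4K frame buffers really" point: some rough numbers, assuming a fairly fat deferred G-buffer layout (my own estimate, not measured from any actual game):

```python
# Rough 4K render-target budget (illustrative; real engines differ a lot).
PIXELS_4K = 3840 * 2160

def mb(bytes_per_pixel, count=1):
    return PIXELS_4K * bytes_per_pixel * count / (1024 ** 2)

targets = {
    "back buffer (RGBA8, double-buffered)": mb(4, 2),
    "HDR scene color (RGBA16F)":            mb(8),
    "depth/stencil (D32S8)":                mb(5),
    "G-buffer (4x RGBA8)":                  mb(4, 4),
    "motion vectors (RG16F)":               mb(4),
}

for name, size in targets.items():
    print(f"{name}: {size:.0f} MB")
print(f"total: ~{sum(targets.values()):.0f} MB")   # roughly 300-350 MB
```

So the render targets themselves only add up to a few hundred MB even at 4K; it's the textures and streaming pools that decide whether 6GB is enough.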
 
So RTX 2060 is faster than GTX 1080 already?

Yes, in Wolfenstein it's actually faster; it's a modern title optimized for modern hardware. This likely won't decrease with future drivers and titles. I would never opt for a 1080 over an RTX 2060 now, and especially not in a year or two.

Things might balloon in the future if devs go for more volume texture storage or as the 2020 consoles raise the target bar anyway.

This. A 2060 can easily handle 4K with its 6GB; I don't think the One X is using that much more for VRAM. It could actually be around that, 6GB.
But it's certain the next consoles are going to sport 16GB if not more, and 11GB of VRAM will be the minimum for PC games then. Almost double the 2060. The 6GB 2060 has a limited future for 4K/highest settings then. I don't think an 8 to 11GB 2060 is going to happen.
 
9GB is possible if they source 12Gbit-density chips (for the same 192-bit bus), or they can do 12GB via clamshell with 8Gbit-density chips.
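For anyone following along, the arithmetic is simple: GDDR6 chips have 32-bit interfaces, so a 192-bit bus takes six chips (twelve in clamshell).

```python
# GDDR6 capacity options on a 192-bit bus (each chip has a 32-bit interface).
BUS_WIDTH = 192
CHIP_INTERFACE = 32
chips = BUS_WIDTH // CHIP_INTERFACE            # 6 chips normally

def capacity_gb(density_gbit, clamshell=False):
    n = chips * (2 if clamshell else 1)        # clamshell doubles chip count
    return n * density_gbit / 8                # Gbit -> GB

print(capacity_gb(8))                  # 6.0  -> the stock RTX 2060
print(capacity_gb(12))                 # 9.0  -> 12Gbit chips, same bus
print(capacity_gb(8, clamshell=True))  # 12.0 -> 8Gbit chips in clamshell
```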
 
Yes, in Wolfenstein it's actually faster; it's a modern title optimized for modern hardware. This likely won't decrease with future drivers and titles.
Sorry, that's cherry-picking. A single title with a quite unusual rendering process. I'm talking about average performance across a broader range of games.

I would never opt for a 1080 over an RTX 2060 now, and especially not in a year or two.
I would prefer even the GTX 1070 Ti. Maybe even the GTX 1070, because these don't suffer from microstuttering like the RTX 2060 does. If the issue is caused by insufficient VRAM capacity (already at launch), then it will only get worse over time.
 
Sorry, that's cherry-picking. A single title with a quite unusual rendering process. I'm talking about average performance across a broader range of games.

Choosing the lower-performing 1070 Ti over a 2060 isn't what many people will do, I'm afraid.
 
The 2060 would be benchmarked at a wide range of resolutions and quality settings, but time constraints mean websites are going to limit their attention to targets they think represent their audience.
Time constraints intentionally forced on reviewers by Nvidia, who know what they're doing and get the cards to them just before CES, when any embargoes are lifted. So reviewers get a day to run tests and record content, knowing they have to get the review out and then attend CES.
 
Regarding memory constraints, it might be fun to see what a GTX 980 Ti can do as part of the stable of cards compared against the RTX 2060. Especially fun if they compare overclock to overclock, as that series really gets a boost from an OC.

It too has "just" 6GB of RAM, and it ran very close to the GTX 1070 in the benchmarks I saw back when that card was popular. IIRC, there was scant mention of it bumping into any kind of memory ceiling that limited it, rather than just the workload.

P.S. I think the RTX 2060 could bump into a memory ceiling in some titles, but in those scenarios it will likely already be running at an iffy framerate. So having to give up 4K wouldn't be much of a sacrifice to most people. They could dial the settings down to get an average of 60 fps, and then I think they'd no longer be getting stuttering from pushing the memory buffer too hard.

That should be the case; otherwise cards with 8GB of RAM would also be hitting their buffer too hard when running at ultra settings at 4K.

It's not like the RTX 2060 has nearly the same oomph as an RTX 2080, albeit with cut-down memory. My RTX 2080 struggles with some titles at 4K, so I drop the resolution to lower (custom) ones. For the RTX 2060, the amount of RAM looks to be in balance with its performance. Giving it, say, 12GB of RAM wouldn't make a difference in any titles (barring an odd exception here or there) for gamers who want a steady fps.

If the memory were free, sure, it would be nice to have for casual 4K gaming of titles that are drop-dead gorgeous with their textures and whatnot, but which can be enjoyed at significantly less than 60 fps.

And now that I think about it, FreeSync has suddenly made having only 6GB of RAM more of an issue. Gaming at around 40 fps can now be silky smooth to many, and they'd be pissed off by micro-stuttering. Lol, I've refuted myself, but I'll post this anyway.

I stand by how it would be neat to see the benchmark results, including frame times, of a GTX 980 Ti.
 
Even at 1080p, Rainbow Six Siege requires more than 6GB of VRAM for the optional ultra texture pack, and that's an oldish game. The future of high-res textures will most likely require more than 6GB. That's the main knock against the card in my mind.

Edit: side note. I wish more games were better at telling me when I'm exceeding VRAM.
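Not in-game, but a crude workaround: on an nvidia card you can poll the driver from outside while you play. This assumes the third-party pynvml package (nvidia-smi on the command line reports the same numbers), and it only shows total allocations across all processes, not whether the game is actually thrashing.

```python
# Crude VRAM monitor for nvidia cards via NVML (pip install pynvml).
# Prints used/total every few seconds while a game runs; the "used"
# figure covers every process on the GPU, not just the game.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
        print(f"VRAM: {mem.used / 1024**3:.2f} / {mem.total / 1024**3:.2f} GB")
        time.sleep(5)
finally:
    pynvml.nvmlShutdown()
```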
 