Nvidia Turing Product Reviews and Previews: (2080TI, 2080, 2070, 2060, 1660, etc)

Discussion in 'Architecture and Products' started by Ike Turner, Aug 21, 2018.

  1. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,409
    Likes Received:
    4,058
    BRiT, Babel-17 and pharma like this.
  2. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,409
    Likes Received:
    4,058
  3. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    39,695
    Likes Received:
    9,744
    Location:
    Under my bridge
    How do prices of 1070 versus 2060 compare? Positioning 2060 as a 1080p raytracing GPU seems reasonable to me so far.
     
  4. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    15,491
    Likes Received:
    4,405
    Looks like right now the MSRP is 30 USD less than the 1070's, and the card is generally much slower than the 1070 (at 1080p) as well. I haven't seen any benchmarks of it in RT-enabled titles though, so I'm skeptical about how well it'll do in those games.

    Might be decent for someone that just wants to dabble in RT though.

    Regards,
    SB
     
  5. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    2,734
    Likes Received:
    1,467
    It will likely provide a minimum level of gaming with RT and DLSS enabled. Having both features enabled for RT games is part of the package.
     
  6. vipa899

    Regular Newcomer

    Joined:
    Mar 31, 2017
    Messages:
    922
    Likes Received:
    352
    Location:
    Sweden
    Digital Foundry's 2060 review is out

     
    Lightman, pharma and eloyc like this.
  7. vipa899

    Regular Newcomer

    Joined:
    Mar 31, 2017
    Messages:
    922
    Likes Received:
    352
    Location:
    Sweden
    The 2060 is very close to a 1080, sometimes matching the 1080, and beating it with VRS enabled. That, with RT and the other features, is not bad. Still too expensive though; at $100 less the 2060 would be the better value.
     
    yuri and Picao84 like this.
  8. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,570
    Likes Received:
    2,121
    All reviews are saying the 2060 is on par with the 1070 Ti @1080p and 1440p, am I missing something?
    https://www.anandtech.com/show/13762/nvidia-geforce-rtx-2060-founders-edition-6gb-review/15
    ComputerBase did some DXR tests; in Battlefield V it's 6% slower than the 2070 @low DXR 1080p, and it averaged 77fps in multiplayer
    https://www.computerbase.de/2019-01...est/5/#abschnitt_battlefield_v_mit_raytracing


    GamersNexus tested singleplayer, and the 2060 averaged 66fps @low DXR 1080p
    https://www.gamersnexus.net/hwrevie...-founders-edition-review-benchmark-vs-vega-56

     
    pharma and yuri like this.
  9. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,409
    Likes Received:
    4,058
    @Picao84 will you please find someone else to fixate on? I'm not even slightly interested in this back&forth of yours.


    The RTX 2060 seems to be equal to or slightly faster than the GTX 1080 in compute-heavy games, as long as they're not bottlenecked by fillrate or VRAM amount.
    One such example is Wolfenstein II below 4K, probably helped by the game taking advantage of the 2x FP16 throughput of the Volta-derived ALUs.

    So until we get more solid info on the impact of 6GB VRAM on frametimes (data which is still scarce), I think it's safe to say the RTX 2060 stands head and shoulders above the GTX 1070 Ti.
    As games become more compute-centric and eventually make more use of FP16 pixel shaders, the RTX 2060 could be a safer long-term bet than the GTX 1080.
     
  10. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    7,038
    Likes Received:
    5,320
    VRAM would be indicative of the target resolution then?
    Is 6GB enough for a proper 4K experience? Just looking at the industry at large, 6 seems like it's cutting it thin. The X1X has 12GB, with at least 9GB dedicated to non-OS tasks, and most of the other higher-end cards are operating with at least 8GB.
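    For a rough sense of how much of that VRAM the render targets themselves occupy at 4K, here's a back-of-envelope sketch in Python. The target list and formats are illustrative assumptions, not any specific engine's layout:

    ```python
    # Rough render-target memory estimate at 4K, assuming a deferred-style
    # pipeline; the layout and formats below are illustrative, not from any game.
    W, H = 3840, 2160

    def mb(bytes_per_pixel, count=1):
        # Size in MiB of `count` full-screen targets at the given pixel size.
        return W * H * bytes_per_pixel * count / (1024 ** 2)

    targets = {
        "color (RGBA16F)":             mb(8),
        "depth/stencil (D32S8)":       mb(5),
        "G-buffer (4x RGBA8)":         mb(4, 4),
        "HDR post chain (2x RGBA16F)": mb(8, 2),
    }
    total = sum(targets.values())
    print(f"total = {total:.0f} MB")  # a few hundred MB at most
    ```

    Even a fat G-buffer setup lands in the hundreds of megabytes, so the bulk of a 6GB budget goes to textures and geometry, which supports the point that texture quality, not resolution alone, is where the VRAM pressure comes from.
    
    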
     
    pharma likes this.
  11. AlBran

    AlBran Just Monika
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    20,081
    Likes Received:
    5,034
    Location:
    ಠ_ಠ
    *9GB for Scorpio games, while PC games currently appear to recommend 6-8GB cards for max texture quality (whether that's smart use or not depends on the game). A larger streaming cache is probably desirable for "4K-target" textures versus wasteful uncompressed textures.

    Things might balloon in the future if devs go for more volume texture storage, or as the 2020 consoles raise the target bar anyway.
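    To illustrate why block compression matters for that streaming cache, a quick sketch of per-texture footprints. The bits-per-pixel figures are the standard ones for these formats; the mip-chain overhead uses the usual ~4/3 approximation:

    ```python
    # Back-of-envelope texture memory: RGBA8 = 32 bpp uncompressed,
    # BC7 = 8 bpp (4:1 ratio), BC1 = 4 bpp (8:1). A full mip chain adds ~1/3.
    def texture_mb(size, bits_per_pixel, mips=True):
        base = size * size * bits_per_pixel / 8 / (1024 ** 2)
        return base * 4 / 3 if mips else base

    for fmt, bpp in [("RGBA8 (uncompressed)", 32), ("BC7", 8), ("BC1", 4)]:
        print(f"4096x4096 {fmt}: {texture_mb(4096, bpp):.1f} MB")
    ```

    A single uncompressed 4096x4096 texture with mips is ~85 MB versus ~21 MB for BC7, which is why "wasteful uncompressed textures" can blow through a streaming budget an order of magnitude faster.
    
    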
     
    vipa899, iroboto and ToTTenTranz like this.
  12. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,409
    Likes Received:
    4,058
    I don't know.
    If you look at 4K average framerates on the RTX 2060, save for an odd duck like RE7, it looks like 6GB of VRAM is indeed enough for now. But we still need to look at frame times, because if VRAM fills up, the GPU will have to wait for system RAM data coming over the PCIe bus.

    I suspect nvidia may be doing a colossal amount of driver work on a game-by-game basis, getting the driver to select what is placed in the VRAM at 336GB/s and what gets slowly streamed through the PCIe bus at ~15GB/s. Like what AMD reportedly did with Fiji, before developing HBCC to take that weight off the driver dev team.

    Then again, word is that nvidia already did some of that work on the GTX 970 to avoid putting latency-sensitive data in those last slow 512MB of VRAM.
    So maybe the work isn't colossal after all, and they have some automated tools that they only need to tweak per game.



    Yet not everything in those 9GB needs 320GB/s of bandwidth. Perhaps the XboneX would be much better served with a fast 4GB at 512GB/s plus 16GB at 15GB/s duplex (if it didn't need to do BC with XBOne games, that is).
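    A quick back-of-envelope shows why spilling into system RAM over PCIe hurts frame times so badly. The bandwidth figures are approximate peaks (RTX 2060 GDDR6 ~336GB/s, PCIe 3.0 x16 ~15.75GB/s) and the overflow size is a hypothetical:

    ```python
    # Time to fetch a working set from VRAM vs. over the PCIe bus.
    # Bandwidths are approximate theoretical peaks, not measured throughput.
    def fetch_ms(megabytes, gb_per_s):
        return megabytes / 1024 / gb_per_s * 1000

    working_set_mb = 256  # hypothetical per-frame overflow into system RAM
    vram_ms = fetch_ms(working_set_mb, 336)    # GDDR6 on a 192-bit bus
    pcie_ms = fetch_ms(working_set_mb, 15.75)  # PCIe 3.0 x16
    print(f"VRAM: {vram_ms:.2f} ms, PCIe: {pcie_ms:.2f} ms "
          f"(60fps frame budget = 16.7 ms)")
    ```

    Touching just 256MB of spilled data eats nearly an entire 16.7ms frame budget over PCIe, versus under a millisecond from VRAM, which is why even a small overflow shows up as stutter in frame-time graphs rather than in average fps.
    
    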
     
  13. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    7,038
    Likes Received:
    5,320
    I feel as though we've had this discussion somewhere on B3D before. We did talk about the amount of VRAM necessary for GPUs. I suspect that certain virtual texturing / streaming options could get away with less RAM, and there are times where I think more RAM would be required. But how much does resolution actually matter here? I sort of agree that 4K frame buffers can be big, but 6GB should cover it. So are we really talking about 4K at ultra settings, or about high-resolution textures, which is where VRAM amounts are important?

    And that driver thing, ugh, sounds so unsustainable.
     
  14. vipa899

    Regular Newcomer

    Joined:
    Mar 31, 2017
    Messages:
    922
    Likes Received:
    352
    Location:
    Sweden
    Yes, in Wolfenstein it's actually faster; it's a modern title optimized for modern hardware. That lead likely won't shrink with future drivers and titles. I would never opt for a 1080 over an RTX 2060 now, and especially not in a year or two.

    This. A 2060 can easily handle 4K with its 6GB; I don't think the One X is using that much more for VRAM. Could be around that actually, 6GB.
    But it's certain the next consoles are going to sport 16GB if not more, so something like 11GB of VRAM will be the minimum for PC games then, almost double the 2060's. The 6GB 2060 has a limited future regarding 4K/highest settings in that case. I don't think an 8 to 11GB 2060 is going to happen.
     
  15. AlBran

    AlBran Just Monika
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    20,081
    Likes Received:
    5,034
    Location:
    ಠ_ಠ
    9GB is possible if they source 12Gbit-density chips (on the same 192-bit bus), or they can do 12GB via clamshell with 8Gbit-density chips.
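    The arithmetic behind those options, assuming standard GDDR6 chips with 32-bit interfaces:

    ```python
    # Capacity options on a 192-bit GDDR6 bus: each chip has a 32-bit interface,
    # so 6 chips normally, or 12 in clamshell (two chips sharing each channel).
    def capacity_gb(bus_bits, gbit_per_chip, clamshell=False):
        chips = bus_bits // 32 * (2 if clamshell else 1)
        return chips * gbit_per_chip / 8  # Gbit -> GB

    print(capacity_gb(192, 8))                  # 6.0  -> the shipping RTX 2060
    print(capacity_gb(192, 12))                 # 9.0  -> with 12Gbit chips
    print(capacity_gb(192, 8, clamshell=True))  # 12.0 -> clamshell 8Gbit
    ```
    
    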
     
    vipa899 likes this.
  16. no-X

    Veteran

    Joined:
    May 28, 2005
    Messages:
    2,251
    Likes Received:
    185
    Sorry, that's cherry-picking: a single title with a quite unusual rendering process. I'm talking about average performance across a broader number of games.

    I would prefer even a GTX 1070 Ti, maybe even a GTX 1070, because those don't suffer from microstuttering like the RTX 2060 does. If the issue is caused by insufficient VRAM capacity (already at launch), then it will only get worse over time.
     
  17. vipa899

    Regular Newcomer

    Joined:
    Mar 31, 2017
    Messages:
    922
    Likes Received:
    352
    Location:
    Sweden
    Choosing the lower-performing 1070 Ti over a 2060 isn't what many people will do, I'm afraid.
     
  18. Malo

    Malo YakTribe.games
    Legend Veteran Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    6,674
    Likes Received:
    2,711
    Location:
    Pennsylvania
    Time constraints intentionally forced upon reviewers by Nvidia, who know what they're doing and get the cards to them just before CES, when any embargoes are lifted. So reviewers get a day to do tests and record content, knowing they have to get the review out and then attend CES.
     
  19. Babel-17

    Regular

    Joined:
    Apr 24, 2002
    Messages:
    992
    Likes Received:
    232
    Regarding memory constraints, it might be fun to see what a GTX 980 Ti can do as part of the stable of cards used to compare against the RTX 2060. Especially fun if they compare overclock to overclock, as that series really gets a boost from an OC.

    It too has "just" 6 GB of RAM, and it ran very close to the GTX 1070 in the benchmarks I saw back when that card was popular. IIRC, there was scant mention of it bumping into any kind of memory ceiling; it was limited by the workload instead.

    P.S. I think that the RTX could bump into a memory ceiling in some titles, but I think it likely that in those scenarios it will only be running at an iffy level of frames per second. So having to sacrifice being able to run at 4k wouldn't be much of a sacrifice to most people. They could dial the settings to get an average of 60 fps, and then I think they'd no longer be getting stuttering from pushing their memory buffer too hard.

    That should be the case, otherwise cards with 8 GB of ram would also be hitting their buffer too hard when running at Ultra, at 4k.

    It's not like the RTX 2060 has nearly the same oomph as an RTX 2080 with cut-down memory. My RTX 2080 struggles with some titles at 4K, so I drop the resolution to lower (custom) ones. For the RTX 2060, the amount of RAM looks to be in balance with its performance. Giving it, say, 12 GB of RAM wouldn't make any difference in any titles (albeit with an odd exception here or there) to gamers who want a steady fps.

    If the memory were free, sure, it would be nice to have for casual gaming at 4k of titles that are drop dead gorgeous with their textures, and what not, but which can be enjoyed at significantly less than 60 fps.

    And now that I think about it, FreeSync has suddenly made having only 6 GB of RAM more of an issue. Gaming at around 40 fps can now be silky smooth to many, and they'd be pissed off by micro-stuttering. Lol, I've refuted myself, but I'll post this anyway.

    I stand by how it would be neat to see the benchmark results, including frame times, of a GTX 980 Ti.
     
    iroboto, pharma and Picao84 like this.
  20. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    13,103
    Likes Received:
    3,403
    Even at 1080p, Rainbow Six Siege requires more than 6GB of VRAM for the optional ultra textures pack, and that's an oldish game. The future of high-res textures will most likely require more than 6GB. That's the main knock against the card in my mind.

    Edit: side note, I wish more games were good at telling me when I'm exceeding VRAM.
     
    iroboto, Picao84 and DavidGraham like this.