Nvidia Turing Product Reviews and Previews: (2080TI, 2080, 2070, 2060, 1660, etc)

Discussion in 'Architecture and Products' started by Ike Turner, Aug 21, 2018.

  1. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,571
    Likes Received:
    2,121
    Exactly. The way I see it, the 2060 serves only one purpose: a cheap entry to ray tracing. It's only mildly slower than the 2070 in ray tracing, and even though it's a capable 1440p card, future prospects dictate it should be used as a 1080p card, as its 6GB framebuffer would last much longer at that resolution, just like the 1060's.
     
    vipa899, pharma and Picao84 like this.
  2. Picao84

    Veteran Regular

    Joined:
    Feb 15, 2010
    Messages:
    1,287
    Likes Received:
    512
    This is exactly why a $349 card should not be presented as capable of running 4K. If Nvidia had said that, we would now be discussing whether they were being too generous with the truth or misleading consumers. It's a glass half full, half empty situation.

    It's the perfect type of argument for whoever has an evil furry cat. <ModEdit>
     
    #462 Picao84, Jan 9, 2019
    Last edited by a moderator: Jan 9, 2019
  3. Picao84

    Veteran Regular

    Joined:
    Feb 15, 2010
    Messages:
    1,287
    Likes Received:
    512
    Did that also happen with the GTX 1060? Because it was also not tested at 4K by most outlets.
     
  4. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    2,734
    Likes Received:
    1,467
    Thread needs cleaning up.
    BS; there were no time constraints forced upon reviewers by Nvidia.
    HH at Guru3D mentioned they received and tested the RTX 2060 more than a week before the published review.
     
    Picao84 likes this.
  5. AlBran

    AlBran Just Monika
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    20,081
    Likes Received:
    5,036
    Location:
    ಠ_ಠ
    There does seem to be a trend in recent AAA games of putting those metrics either in the benchmark screen or in the graphics settings. I hope it continues (and gets more detailed/better). Maybe it's just a sign of the tools getting more streamlined that it's now worthwhile for developers to expose such a seemingly simple metric for QA to test across various hardware.

    Remember kids, fibre is all part of a healthy balanced diet. Source: Bran. This message is brought to you by the letters A and L, the irrationAl number pi, and paid by some Toilet Hardware Company for the low low price of a shilling.
     
    #465 AlBran, Jan 9, 2019
    Last edited: Jan 9, 2019
    Silent_Buddha and Scott_Arm like this.
  6. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    13,103
    Likes Received:
    3,403
    I find it particularly annoying on PC because most optimization guides are written by people testing with the highest-end cards. Turn lighting to ultra because it had very little effect on performance on my 1080 Ti! Games should do a better job of informing a player which settings will have an impact, maybe by profiling cards with micro benchmarks (fillrate, overdraw, shading, etc.). R6 Siege has a benchmark, but you still have to twist the knobs yourself and figure out which settings will have the biggest impact on your particular PC.

    But back to the 2060: the 6GB limit is going to be non-obvious to many gamers. They know that 8GB is better than 6GB, but they might not know why, or what impact that's going to have. At least if I were playing a game that showed me VRAM consumption, I could see that I need at least 8GB if I want to turn up the various knobs. 6GB does not make the 2060 a bad product, but if you expect to use it for 2-4 years, the 6GB limit has real implications, even at 1080p. I think most people buying it are on a budget and would probably hope for that lifetime.
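    The micro-benchmark idea above could be as simple as measuring each setting's isolated frame-time delta and ranking the results against a frame budget. A minimal sketch of that logic, with entirely hypothetical numbers and setting names:

    ```python
    # Toy sketch (all numbers hypothetical): rank graphics settings by the
    # frame-time cost a built-in micro-benchmark could measure per setting,
    # then report which ones still fit within a target frame budget.

    def rank_settings(baseline_ms, deltas_ms, budget_ms):
        """Sort settings by measured cost (ms, ultra vs. low) and flag which fit."""
        ranked = sorted(deltas_ms.items(), key=lambda kv: kv[1], reverse=True)
        report, spent = [], baseline_ms
        for name, cost in ranked:
            fits = spent + cost <= budget_ms
            if fits:
                spent += cost  # "enable" this setting and consume its budget
            report.append((name, cost, fits))
        return report

    # Hypothetical per-setting deltas on a mid-range card at 1080p.
    costs = {"shadows": 4.5, "ambient_occlusion": 2.0,
             "lighting": 0.4, "texture_quality": 0.1}
    for name, cost, fits in rank_settings(8.0, costs, budget_ms=16.6):  # 60 FPS
        print(f"{name:18s} +{cost:.1f} ms  {'fits' if fits else 'skip'}")
    ```

    A real in-game profiler would measure the deltas per card and per scene rather than hard-coding them, but even a crude ranking like this would tell a player which knobs matter.
    
    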
     
    Picao84 likes this.
  7. AlBran

    AlBran Just Monika
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    20,081
    Likes Received:
    5,036
    Location:
    ಠ_ಠ
    Yeah, I guess it'd have to be more like an actual profiling tool to examine the nitty-gritty breakdown of a frame, with snapshots for various scenes. But it might be pretty nice to just see the make-up of a frame, showing which settings affect which part (shadow pass, lighting, each post-process step), which is perhaps something the driver teams already do to some extent :?: It could make troubleshooting a little easier too: "such-and-such settings aren't running properly versus other folks' experience, and this is what the frame breakdown shows as fubar on x-driver," etc.
     
  8. Babel-17

    Regular

    Joined:
    Apr 24, 2002
    Messages:
    992
    Likes Received:
    232
    Anandtech included a GTX 980 Ti in their review of the RTX 2060. I'll give it a look.

    https://www.anandtech.com/show/13762/nvidia-geforce-rtx-2060-founders-edition-6gb-review

    They checked the frame times on some of the games, and both the 980 Ti and the 2060 felt the effect of Wolfenstein 2's Mein Leben! setting when running at 3840 x 2160 (the two cards becoming equally bound). The other games tested for frame times, Battlefield 1 and Ashes of the Singularity: Escalation, appear to have been fine.

    Edit: And GTA 5 was fine as well.
     
    #468 Babel-17, Jan 9, 2019
    Last edited: Jan 9, 2019
    pharma likes this.
  9. Malo

    Malo YakTribe.games
    Legend Veteran Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    6,674
    Likes Received:
    2,711
    Location:
    Pennsylvania
    It's funny that there's contention about 6GB being limiting while there are rumors of 4GB and 3GB variants, with both GDDR5X and GDDR6, coming out for the 2060.
     
  10. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    15,491
    Likes Received:
    4,405
    Oh hell, you're right, I messed up. When I looked at the Anandtech and Tomshardware reviews, I didn't have much time so just glanced at the graphs. For some reason (probably since I was in a hurry), I saw the 1060 as the 2060. Big oops.

    There are some weird results for the card, though. GTA V at 1080p is way slower on the 2060 than on the 1070, but faster at 1440p and 2160p. Not that it matters, since even at 1080p it was pulling 123 FPS.

    https://www.guru3d.com/articles_pages/geforce_rtx_2060_review_(founder),22.html

    Regards,
    SB
     
    #470 Silent_Buddha, Jan 9, 2019
    Last edited: Jan 9, 2019
    vipa899, pharma and DavidGraham like this.
  11. AlBran

    AlBran Just Monika
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    20,081
    Likes Received:
    5,036
    Location:
    ಠ_ಠ
    We're just elite here. :p
     
    Scott_Arm likes this.
  12. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    39,695
    Likes Received:
    9,745
    Location:
    Under my bridge
    Those guidelines don't dictate what sites should test though. If a website wants to test 4K, they can, unless one wants to go into the territory of companies controlling the media by refusing samples etc. There's no point in that discussion in this thread. The benchmarks and reviews are what they are.
     
    pharma and vipa899 like this.
  13. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,409
    Likes Received:
    4,058

    They do.
    https://videocardz.com/77983/nvidia-geforce-rtx-2080-ti-and-rtx-2080-official-performance-unveiled

    I'm happy to leave the conversation as it is, though. It's not important for this thread.
    All I said was the RTX 2060 wasn't tested more thoroughly at 4K because the card's reviewer guide says so. We could discuss the semantics of what constitutes a "suggestion" or an "instruction" to test at resolution X or Y, as well as the repercussions (or lack thereof) for not following said implicit/explicit guidelines, but that's not important for this thread either.


    Point is, neither name nor price is an indicator of what resolution should be used for testing. Performance at time of release is.
    If we take a 40 FPS average as "minimum acceptable performance", the $300 GTX 260 was a card for playing demanding titles in 2008 at 1280*1024. In 2011, the $200 GTX 560 raised those stakes to 1680*1050. In 2013, the $250 GTX 760 raised the resolution to 1920*1080. In 2016, the $300 GTX 1060 drove that resolution threshold up again to 2560*1440.
    There's nothing in the "xx60" name or its $350 price that says it can't/shouldn't run games at whatever resolution. If Nvidia keeps up this naming scheme (and we don't all transition to GaaS), a point in time will come when the xx80 card is meant for dual 5K VR at 90 FPS + reprojection, and it won't even make any sense to test the $99-$999 xx60 of that family at anything lower than 4K.
    Besides, Anandtech's title for the RTX 2060 review even says "Not Quite Mainstream", as they reckon the 2060 marks an even larger departure from the usual target market of the xx60 cards.


    The RTX 2060 looks like a pretty competent card for playing at 4K (or eventually 1440p + DLSS, when/if that ever gets widely adopted), and it'll fit quite nicely in a small HTPC case with a modest 500W power supply. Which is the case for my living room PC.
    Don't like that I'm considering one for myself? Sue me :p
     
  14. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    39,695
    Likes Received:
    9,745
    Location:
    Under my bridge
    It's a guide, not a set of orders. You don't have to follow them, unless I'm mistaken and there's a contract stating what reviews are and are not allowed to report on. What was stopping videocardz.com from benchmarking at resolutions other than the guide's recommendations?
     
  15. Malo

    Malo YakTribe.games
    Legend Veteran Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    6,674
    Likes Received:
    2,711
    Location:
    Pennsylvania
    The increasing fear of being left out of Nvidia's good graces, perhaps. More tech sites than before are being left out of 2060 sampling, for example, due to their less-than-stellar reviews of the initial RTX cards.
     
  16. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    2,734
    Likes Received:
    1,467
    Do you have a link? Or is this more BS, similar to what you posted before regarding the RTX 2060 reviewers' timeline?
     
    vipa899 and Picao84 like this.
  17. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,409
    Likes Received:
    4,058
    Mini-ITX builders rejoice:
    https://www.gigabyte.com/Graphics-Card/GV-N2060IXOC-6GD#kf





    There's no mid-range Turing at the moment, IMO. The TU106 is a 445mm² chip. It's 2.2x larger, with 2.45x more transistors, than the GP106 in the GTX 1060.
    $350 is also the most expensive a xx60 card has ever been at launch. The GTX 970 launched with a $330 MSRP.
    The mid-rangers right now are the Polaris 10/20 and GP106 cards.


    IMO, the Turing chips are just a byproduct of nvidia cancelling a full family of Volta chips (GTX 11-series?) that were planned to launch in mid 2018 using 12FFN. This would be akin to a Kepler -> Maxwell transition and nvidia was ready to implement a tick-tock strategy.
    AMD's inability to compete and regain marketshare, plus the mining boom during 2017, led Nvidia to cancel the development of the whole Volta family except for GV100. With this, Nvidia saved a bunch of money on R&D, marketing, and whatever they were going to spend on replacing Pascal with Volta production lines. This obviously gave them considerable YoY revenue increases which are now impossible to sustain, hence their latest stock value "crash" down to Q2 2017 levels.
    Regardless, with all of the above, nVidia found themselves with the time and money to make a line of dedicated Quadro chips that will be unbeatable at offline rendering for years to come. There's dedicated hardware for raytracing plus tensor units for denoising, plus Volta's new shader modules.
    And why is Turing coming to consumers after all? Because, post mining crash, the second-hand market is being flooded with cheap Pascal cards, and Nvidia needs steady revenue from gaming GPUs (by far their primary source of revenue). So they had to release something with an increased value proposition over those existing Pascal chips.

    That said, I don't think Turing was initially meant for real-time rendering and gaming. The RT hardware isn't fast enough to provide a clear-cut advantage over screen space reflections (and probably never will be) and no one seems to know what to do with the tensor units in games, as DLSS implementations keep being pushed back month after month. I'll be happy to be proven wrong, but at the moment I'll stick to the same opinion as Gamers Nexus on BFV.
    They're very fast at rasterization, sure, but that comes from their Volta heritage and the fact that they're all large chips, the smallest being almost as big as a GP102.

    I don't think for a second that a group of engineers at nvidia thought "what would be a great mid-range chip for 2019?" and came up with a partially disabled TU106.



    Absolutely no one here made such a statement.
    Which is part of the reason why a discussion with you seems so exhausting from the get go. The other part being the completely unnecessary flamebait jabs like this:
    First the accusation was that I had an agenda against the card. Now I'm chasing imaginary windmills because I'm considering the card for myself to play some games at a specific resolution. Next will be...?
    Look, I might've had the patience (maybe even eagerness, I confess) for this in the past, but I certainly don't have it now. I might be better off just hitting the ignore button...



    Here's your link.
    To be honest, I don't think nvidia finds anything inherently wrong with reviewers showing positive or neutral 4K results.
    That Chapuzas Informático graph on the other hand...



    Sigh...
    If the mods could have a dollar for every time this is said for either side, they'd be too busy sipping a 1973 Port on a secluded Hawaii beach to moderate the forum.
     
  18. Benetanegia

    Newcomer

    Joined:
    Sep 4, 2015
    Messages:
    202
    Likes Received:
    123
    Apples and oranges. First of all, GTX 1060 was the full GP106, RTX 2060 isn't. Not to mention the addition of Tensor and RT cores. TU104 and TU102 are also significantly bigger than their Pascal counterparts, so that's hardly of relevance. There's no point comparing die sizes.

    As for price, Turing has seen an increase in all segments, of which the 2060's is the smallest. That's what lack of competition does. People were expecting a Vega 20 consumer part for less than $600, but it's $700 instead, consumes more power, and does not have new features. So the landscape is what it is, sadly.
     
    Picao84, vipa899 and DavidGraham like this.
  19. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    2,734
    Likes Received:
    1,467
    Never considered that review site mainstream, so not terribly surprised.
    Usually when this happens the common excuse for Nvidia and AMD is "lack of cards" to go around. I recall Vega did something along those lines.

    Edit: spelling.
     
    Picao84 and vipa899 like this.
  20. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,571
    Likes Received:
    2,121
    You can clearly see a big IQ difference between SSR and RT reflections; it's obvious from the very get-go. Also, ray tracing isn't only about reflections: there are shadows, lighting, AO, and refractions.
     
    pharma, OlegSH and vipa899 like this.