Nvidia Turing Product Reviews and Previews: (Super, TI, 2080, 2070, 2060, 1660, etc)

Discussion in 'Architecture and Products' started by Ike Turner, Aug 21, 2018.

  1. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,055
    Likes Received:
    3,112
    Location:
    New York
    Don’t think it’s Vulkan. Turing’s advantage over Pascal in Doom is similar to other titles, ~30%. In Wolfenstein II it leaps ahead by over 60%.

    Would be good to confirm whether it’s really using AMD-specific extensions or core Vulkan APIs for FP16.
     
    Lightman and DavidGraham like this.
  2. LeStoffer

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,262
    Likes Received:
    22
    Location:
    Land of the 25% VAT
  3. CarstenS

    Legend Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,800
    Likes Received:
    3,920
    Location:
    Germany
    It's basically matching Volta, how is that surprising?
     
  4. LeStoffer

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,262
    Likes Received:
    22
    Location:
    Land of the 25% VAT
    True, I just wasn’t aware that the Volta/Turing architecture lends itself so well to 3D rendering tasks. It’s the first evidence I have seen, but then I haven’t followed Volta at all. I think there is no chance that the Blender developers have spent time optimizing for Volta.
     
  5. Digidi

    Regular

    Joined:
    Sep 1, 2015
    Messages:
    428
    Likes Received:
    239
    Great, thank you!
     
  6. pharma

    Veteran

    Joined:
    Mar 29, 2004
    Messages:
    4,887
    Likes Received:
    4,534
    Performance/price analysis on Linux: a dead heat between the 2080 Ti and Vega 64 in most games tested.
    https://www.phoronix.com/scan.php?page=article&item=nvidia-2080ti-linux&num=7
     
    Lightman and Geeforcer like this.
  7. pharma

    Veteran

    Joined:
    Mar 29, 2004
    Messages:
    4,887
    Likes Received:
    4,534
    Heinrich4, nnunn, LeStoffer and 2 others like this.
  8. Geeforcer

    Geeforcer Harmlessly Evil
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,320
    Likes Received:
    525
    I think the 2080 (non-Ti) is really lost right now. The 2080 Ti(tan) at least lives in the "ultra performance at ultra price" tier that, while limited, is at least fairly well understood. The 2080, on the other hand, with its 5-12% performance/power lead over the 1080 Ti, is in purgatory at the current price. If the market were still in the grip of the crypto frenzy, with 1080 Tis sold at or above MSRP, it would be a no-brainer, but with new mid-grade 1080 Tis dipping into the sub-$650 range, the $150 premium is an extremely tough pill to swallow.

    Alternatively, if there were even one or two top-tier titles available alongside it that fully took advantage of the raytracing capability and really showcased the technological advances Turing brings, the situation would be different, but those are still months away. Thus, what we are left with is a card with advanced tech but no opportunity to showcase it to gamers, and a single-digit performance and efficiency increase not commensurate with the asking price premium over existing products. The card is in desperate, immediate need of a $100 price cut, which would actually make it an exciting product worth recommending, because IMO right now it's just treading water.
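    The perf-per-dollar argument can be put in rough numbers. A minimal sketch using only the figures quoted in the post (a ~$650 1080 Ti, a $150 premium, a 5-12% performance lead); everything else is arithmetic:

```python
# Rough perf-per-dollar check using the figures quoted above:
# a ~$650 1080 Ti versus a 2080 carrying a $150 premium and a
# 5-12% performance lead. Prices and the lead are the post's numbers.
base_price = 650.0                # new mid-grade 1080 Ti, per the post
price_2080 = base_price + 150.0   # the quoted $150 premium

for perf_gain in (1.05, 1.12):    # low and high end of the quoted lead
    rel_value = perf_gain / (price_2080 / base_price)
    print(f"+{perf_gain - 1:.0%} perf -> {rel_value:.0%} of the 1080 Ti's perf/$")
```

    On those numbers the 2080 delivers only ~85-91% of the 1080 Ti's performance per dollar, which is the "purgatory" being described.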
     
    #288 Geeforcer, Sep 21, 2018
    Last edited: Sep 21, 2018
    BRiT, pharma and Malo like this.
  9. pharma

    Veteran

    Joined:
    Mar 29, 2004
    Messages:
    4,887
    Likes Received:
    4,534
    NVIDIA GeForce RTX 2080 Ti Shows Very Strong Compute Performance Potential

    https://www.phoronix.com/scan.php?page=article&item=nvidia-rtx2080ti-compute&num=1
     
    #289 pharma, Sep 22, 2018
    Last edited: Sep 22, 2018
    trinibwoy, Heinrich4, xpea and 2 others like this.
  10. pharma

    Veteran

    Joined:
    Mar 29, 2004
    Messages:
    4,887
    Likes Received:
    4,534
  11. pharma

    Veteran

    Joined:
    Mar 29, 2004
    Messages:
    4,887
    Likes Received:
    4,534
    Ethereum Crypto Mining Performance Benchmarks On The GeForce RTX 2080 Ti

    https://www.phoronix.com/scan.php?page=article&item=rtx2080ti-crypto-mining&num=2
     
  12. Pressure

    Veteran

    Joined:
    Mar 30, 2004
    Messages:
    1,655
    Likes Received:
    593
    You can mine ETH with ASICs now, so it doesn't make much sense at this price. They should look at CryptoNight.
     
  13. A1xLLcqAgt0qc2RyMz0y

    Veteran

    Joined:
    Feb 6, 2010
    Messages:
    1,589
    Likes Received:
    1,490
    Because of those ETH ASICs, the current Nvidia GPU ETH miners jumped ship to CryptoNight (Monero), adding roughly 200 MH/s to the Monero network and pushing the difficulty from 55G to 70G.

    https://bitinfocharts.com/comparison/monero-difficulty.html#3m

    And of course the reduction of the block reward from 3 to 2 ETH also contributed to the mass exodus from ETH to XMR.

    https://cointelegraph.com/news/ethe...difficulty-bomb-reduce-block-rewards-to-2-eth
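    As a rough sanity check on those difficulty numbers: Monero's difficulty is approximately network hashrate multiplied by the target block time. The 120 s target block time is an assumption (Monero's standard target at the time); the difficulty figures are the ones from the post.

```python
# Back-of-the-envelope: Monero difficulty ~= network hashrate * target block
# time, so a difficulty jump implies a hashrate jump. The 120 s target block
# time is an assumption; the 55G and 70G figures are from the post above.
TARGET_BLOCK_TIME_S = 120

def implied_hashrate(difficulty: float) -> float:
    """Network hashrate (H/s) implied by a given difficulty."""
    return difficulty / TARGET_BLOCK_TIME_S

delta_mhs = (implied_hashrate(70e9) - implied_hashrate(55e9)) / 1e6
print(f"implied hashrate increase: ~{delta_mhs:.0f} MH/s")  # ~125 MH/s
```

    That estimate (~125 MH/s) is on the same order as the 200 MH/s quoted above.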
     
    #293 A1xLLcqAgt0qc2RyMz0y, Sep 24, 2018
    Last edited: Sep 24, 2018
    Lightman likes this.
  14. entity279

    Veteran Subscriber

    Joined:
    May 12, 2008
    Messages:
    1,332
    Likes Received:
    500
    Location:
    Romania
    Nah. ETH mining performance is a data point, as good as any. Obviously more algorithms tested would be nicer, but these are general-purpose sites, so they probably can't justify it.

    As for ASICs for ETH, it's not so straightforward. They are just cheaper and a tiny bit more power efficient. Because of that, many AMD cards may have migrated from ETH to XMR over time for the slightly higher profits. Not all, though.

    As for Nvidia cards, even today ETH is more profitable than XMR, so I doubt there was a massive transition. We can't prove it either way, of course.

    Also, the ETH block reward is currently still 3, not 2. It will only change to 2 at the next fork.
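    The profitability point follows from the basic expected-revenue formula: your share of network hashrate, times blocks per day, times the block reward. A minimal sketch with made-up hashrate and block-time numbers (only the 3 → 2 ETH reward change is from the thread):

```python
# Illustrative only: expected mining revenue is proportional to your share of
# network hashrate times the block reward, so cutting the reward from 3 to 2
# ETH cuts per-card revenue by exactly a third. All inputs are hypothetical.
def daily_revenue_eth(my_hashrate, network_hashrate, block_reward_eth,
                      block_time_s=13.0):
    blocks_per_day = 86400 / block_time_s
    return my_hashrate / network_hashrate * blocks_per_day * block_reward_eth

card = 50e6        # hypothetical 50 MH/s card
network = 250e12   # hypothetical 250 TH/s network total
print(daily_revenue_eth(card, network, 3.0))  # current 3 ETH reward
print(daily_revenue_eth(card, network, 2.0))  # post-fork: exactly 2/3 of above
```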
     
  15. Geeforcer

    Geeforcer Harmlessly Evil
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,320
    Likes Received:
    525
  16. manux

    Veteran

    Joined:
    Sep 7, 2002
    Messages:
    3,034
    Likes Received:
    2,276
    Location:
    Self Imposed Exhile
    The new overclocking system introduced as part of the Turing launch is great. I tried it with my 1080 Ti through the EVGA Precision X1 software. After the scanning process was done, the end result was good. There is probably a little bit more to be gained by hand tuning, but for such an easy way to overclock, this should just happen by default when you first install a GPU.

    In essence you get an optimized voltage for every clock speed instead of having to use conservative factory defaults.

    Another nice thing is that the tool obeys the power/temperature limits the user sets. So, if you like, you could also underclock and find optimal voltages -> quieter operation.
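    The scanner's core idea can be sketched as a loop over voltage points, pushing the clock up at each point until a stability test fails, then keeping the last stable value. A toy illustration only; `is_stable` and all the numbers are made-up stand-ins for the real per-point GPU workload test the scanner runs.

```python
# Toy sketch of a V/F curve scan like the one Precision X1 automates:
# for each voltage point, raise the clock until the (simulated) stability
# test fails, then keep the last stable clock. `is_stable` is a stand-in
# for the real GPU workload test; all figures are made up for illustration.
def is_stable(voltage_mv: int, clock_mhz: int) -> bool:
    # Hypothetical silicon limit: each extra 10 mV buys ~15 MHz of headroom.
    return clock_mhz <= 1500 + (voltage_mv - 800) * 1.5

def scan_vf_curve(voltages_mv, start_mhz=1500, step_mhz=15):
    curve = {}
    for v in voltages_mv:
        clock = start_mhz
        while is_stable(v, clock + step_mhz):
            clock += step_mhz
        curve[v] = clock  # highest clock that passed at this voltage
    return curve

curve = scan_vf_curve(range(800, 1101, 50))
for v, mhz in curve.items():
    print(f"{v} mV -> {mhz} MHz")
```

    The result is a per-voltage maximum clock table, which is what "optimized voltage for every clock speed" amounts to; running the same scan under a lowered power limit gives the underclocking case described above.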
     
    A1xLLcqAgt0qc2RyMz0y and pharma like this.
  17. homerdog

    homerdog donator of the year
    Legend Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,294
    Likes Received:
    1,075
    Location:
    still camping with a mauler
    Will this work on my 970? It still runs most things fine but sometimes I could use a little more.
     
  18. manux

    Veteran

    Joined:
    Sep 7, 2002
    Messages:
    3,034
    Likes Received:
    2,276
    Location:
    Self Imposed Exhile
    I have no idea if it would work for 970.
     
  19. Malo

    Malo Yak Mechanicum
    Legend Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    8,929
    Likes Received:
    5,529
    Location:
    Pennsylvania
    I was surprised it worked on Pascal. I assumed it involved new power management and sensors on Turing.
     
  20. pharma

    Veteran

    Joined:
    Mar 29, 2004
    Messages:
    4,887
    Likes Received:
    4,534
    What version of EVGA Precision X1? I thought the beta was only for RTX and later GTX cards.
     
  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.