Nvidia Turing Product Reviews and Previews: (Super, TI, 2080, 2070, 2060, 1660, etc)

Discussion in 'Architecture and Products' started by Ike Turner, Aug 21, 2018.

  1. w0lfram

    Newcomer

    Joined:
    Aug 7, 2017
    Messages:
    217
    Likes Received:
    38
    Correct, Microsoft's DirectX Raytracing (DXR) and Nvidia's RTX are two different things. And that is my overall point...

    If you want to make use of Turing's capabilities, you will need Nvidia's proprietary API.
     
  2. Malo

    Malo Yak Mechanicum
    Legend Veteran Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    7,619
    Likes Received:
    3,684
    Location:
    Pennsylvania
    No, that's what we're trying to point out, you don't.
     
    DavidGraham, pharma, sonen and 2 others like this.
  3. Osamar

    Newcomer

    Joined:
    Sep 19, 2006
    Messages:
    205
    Likes Received:
    23
    Location:
    40,00ºN - 00,00ºE
    https://developer.nvidia.com/rtx#source=pr
    Just take a look at the image.
     
  4. Ike Turner

    Veteran Regular

    Joined:
    Jul 30, 2005
    Messages:
    1,969
    Likes Received:
    1,993
    That's right. All RT operations go through DXR and are cross-vendor/GPU-architecture compatible, but on Turing GPUs some of the calls are automatically translated to OptiX (CUDA) through the driver and accelerated by the (still mysterious) RT Cores.

    Here's an example with ChaosGroup's (VRay) Project Lavina real-time RT renderer, which interestingly doesn't use OptiX AI denoising but their own cross-vendor AI denoising solution (VRay Next does support both for production rendering):


    https://www.chaosgroup.com/blog/ray-traced-tendering-accelerates-to-real-time-with-project-lavina
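    To make that layering concrete, here's a rough C++ sketch of what "going through DXR" looks like on the application side (the function name and structure are illustrative, not taken from Lavina or any shipped title): the app only queries and calls the vendor-neutral D3D12 raytracing interfaces, and whether the driver maps the work to Turing's RT Cores or to something else is invisible at this level.

```cpp
#include <d3d12.h>

// Illustrative sketch only: a real renderer also builds acceleration
// structures, a raytracing pipeline state object and shader tables.
bool DispatchDxrRays(ID3D12Device5* device,
                     ID3D12GraphicsCommandList4* cmdList,
                     const D3D12_DISPATCH_RAYS_DESC& rays)
{
    // Ask the D3D12 runtime whether the driver exposes any raytracing tier.
    // The answer says nothing about *how* it is implemented underneath.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts, sizeof(opts))) ||
        opts.RaytracingTier == D3D12_RAYTRACING_TIER_NOT_SUPPORTED)
    {
        return false; // no DXR support on this device/driver
    }

    // The vendor-neutral entry point. On Turing the driver can route this
    // work to the RT Cores; the application code is identical either way.
    cmdList->DispatchRays(&rays);
    return true;
}
```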
     
    #104 Ike Turner, Aug 26, 2018
    Last edited: Aug 26, 2018
    Kej, Silent_Buddha, Scott_Arm and 6 others like this.
  5. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    11,031
    Likes Received:
    5,576
    These "complaints other than price" you mention, where are they?

    All I saw was a completely wrong statement that was promptly corrected. Raytracing on BFV didn't take 2 weeks. It took 8 months.
    Where are the posts complaining about 8 months being too long?
     
    BRiT likes this.
  6. BRiT

    BRiT Verified (╯°□°)╯
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    15,543
    Likes Received:
    14,093
    Location:
    Cleveland
    Can everyone behave like adults?
     
    milk likes this.
  7. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,278
    Likes Received:
    3,523
    Lightman, pharma and OCASM like this.
  8. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    3,536
    Likes Received:
    2,220
    I love that graphic preset 'Overkill' in the Cyberpunk menu! :cool2:
     
  9. Frenetic Pony

    Regular Newcomer

    Joined:
    Nov 12, 2011
    Messages:
    480
    Likes Received:
    200
    I haven't checked, but I hope HairWorks has significantly improved since The Witcher 3; weird strands of rope aren't hair : (

    That being said, I do hope raytraced reflections show up in Cyberpunk. It looks great, except for how diffuse everything is and how many reflections are missing. The efficacy of raytraced reflections in smaller enclosed levels without time-of-day changes, like BFV, is questionable, especially given how much they cost (ooh look, a puddle has somewhat better reflections!). But for open-world games they're a good solution to a very difficult problem.
     
  10. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    3,536
    Likes Received:
    2,220
    NVIDIA GeForce RTX 2080 3DMark TimeSpy Score Leaked – Clocked At 2GHz And Beats A GTX 1080 Ti Without AI Cores

    https://wccftech.com/nvidia-geforce...ghz-and-beats-a-gtx-1080-ti-without-ai-cores/
     
    Heinrich4, Lightman and DavidGraham like this.
  11. itaru

    Newcomer

    Joined:
    May 27, 2007
    Messages:
    156
    Likes Received:
    15
  12. McHuj

    Veteran Regular Subscriber

    Joined:
    Jul 1, 2005
    Messages:
    1,555
    Likes Received:
    740
    Location:
    Texas
    If that's the 2080, that's pretty concerning performance. I would expect it to beat the 1080 Ti at stock clocks.

    If that's the 2070, then sure, that seems great.
     
    #112 McHuj, Aug 28, 2018
    Last edited: Aug 28, 2018
  13. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,582
    Likes Received:
    625
    Location:
    New York
    None of those clocks appear to be at stock for any of the cards. Not exactly sure what that pic is showing. Are those max boost clocks?
     
  14. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    9,084
    Likes Received:
    2,952
    Location:
    Finland
    2080 is far more likely; the benches NVIDIA released put the 2080 at around 1080 Ti performance.
     
  15. Malo

    Malo Yak Mechanicum
    Legend Veteran Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    7,619
    Likes Received:
    3,684
    Location:
    Pennsylvania
    Expectations are quite high if people want the 2070 to be faster than a 1080 Ti IMO.
     
    Lightman likes this.
  16. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    3,536
    Likes Received:
    2,220
    You've got to leave some room for the 2070 Ti. :runaway:
     
  17. Rootax

    Veteran Newcomer

    Joined:
    Jan 2, 2006
    Messages:
    1,464
    Likes Received:
    831
    Location:
    France
    Well, Pascal is 2 years old (even if the Ti is only one), and Nvidia basically skipped Volta and is releasing an n+2 gen.
     
  18. Malo

    Malo Yak Mechanicum
    Legend Veteran Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    7,619
    Likes Received:
    3,684
    Location:
    Pennsylvania
    How does Turing qualify as an n+2 gen? Both Volta and Turing would have been in development concurrently, and they're both using the same process and a very similar structure. Just because Volta was released last year doesn't make it a complete generation between Pascal and Turing. What makes a "generation" is a significant architecture change, usually combined with a process advancement.

    Volta should just be ignored, especially since there was no consumer version.
     
  19. Rootax

    Veteran Newcomer

    Joined:
    Jan 2, 2006
    Messages:
    1,464
    Likes Received:
    831
    Location:
    France

    My guess is they delayed Volta / cancelled it for gamers and kept pushing Pascal because Vega was a dud. So they worked on Turing instead, and are delivering it now. In a world where AMD / RTG was competitive, it would have been Vega vs Volta, and they would have waited for 7nm to release Turing against AMD's next-gen thing.
    But without anyone in front of them, they did this.

    And for me Turing is a gen after Volta because of RT; it's a pretty big change...

    Yes of course, it's only my opinion.
     
    #119 Rootax, Aug 28, 2018
    Last edited: Aug 28, 2018
    Heinrich4, pharma and Malo like this.
  20. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,582
    Likes Received:
    625
    Location:
    New York
    Why would Volta be a dud? Turing is basically Volta + RT + Tensor, so Volta rasterization performance with the same number of SMs should be very similar, with a smaller die size.

    If Volta were a dud, then so is Turing.

    It does seem, though, that Nvidia recalibrated its releases after seeing Pascal's competition. This was probably the best time to gamble on RT transistors.

    Thanks @Geeforcer for the fix, meant Volta not Vega :)
     
    #120 trinibwoy, Aug 28, 2018
    Last edited: Aug 28, 2018