Nvidia Turing Speculation thread [2018]

Status
Not open for further replies.
RTX is nothing more than a PR name for driver support of DXR...or propaganda to make it sound like Nvidia GPUs have "special" RT HW...you choose what suits you.
...
That's a load of nothingness...given that the only "RTX" feature (which used Volta's Tensor Cores) is OptiX denoising (which, BTW, also works on non-Volta GPUs...in OptiX)
...
Anyway...there's no magic sauce...as stated by MS, every DX12 GPU should be able to use DXR with the right drivers.
This comment didn’t stand the test of time. ;-)
 
It looks like DLAA could be a good reason to include Tensor Cores in consumer products.
Also, it's important to find a way to incorporate ray tracing into the rendering pipeline of a game, otherwise it'll be difficult for consumer products to use it (the good old chicken-and-egg problem). I can imagine that some global illumination and reflections could be done with ray tracing to improve image quality in a game. That could be a selling point for including some ray tracing hardware in consumer products.
 
I'm confused as to who this is for. Reading through the recent TOG papers on production ray tracing, none of the 4 renderers covered are particularly bound by actual ray-trace times; they face far more memory-bound scenarios, which 96 GB just isn't going to cover. Advancements in parallel ray tracing could easily be set up to use AVX-512, as I think Blue Sky? (the Ice Age studio) already does with their in-house renderer. More specialized hardware that only one vendor produces doesn't seem helpful here (AVX-512 will probably be supported by Ryzen 2 CPUs.)

Tensor cores. RT cores. And it's a Quadro. While there's vague mention of things that could appear in a consumer-facing GPU, none of the highlights could. And the $10k price confirms that. I'm sure memory-bound neural net training will eat it up regardless. But otherwise I'm not sure who or what this architecture is for. Making games is hard enough already, and consoles are still the primary money winners for triple-A titles. "Hey, make use of our proprietary tech we'll never let anyone else use, because we're Nvidia!" isn't an appealing argument when high-end games cost enough to make as it is.
 
"Hey, make use of our proprietary tech we'll never let anyone else use, because we're Nvidia!" isn't an appealing argument when high-end games cost enough to make as it is.
Some titles are already being made with RTX in mind; Metro Exodus is among the first of them. It's gonna start slow, with some AO, GI, or real-time reflections, etc., then scale bigger as hardware becomes more powerful and widespread.
 
Some titles are already being made with RTX in mind; Metro Exodus is among the first of them. It's gonna start slow, with some AO, GI, or real-time reflections, etc., then scale bigger as hardware becomes more powerful and widespread.

Nah, Metro's demo is a hard sell at best; you can barely tell it's there, and there are still games that have HDR on consoles but not on PC. The idea that a bunch of studios will spend money just to cater to people that have newer cards from only one vendor is silly. It reminds me of PhysX (which Nvidia bought). Sure, every once in a while you still see "PhysX effects!" or something in a game. But barely ever.

These extra effects aren't really going to make a difference to sales, and that's what most studios, indie to big, are going to consider far more than throwing a few extra effects in. Hell, you're seeing less and less of an "upgrade" in PC settings on most games as it is. The PC versions of Far Cry 5, AC Origins, Monster Hunter, etc. are hard to distinguish from the Xbox One X versions as it is.

Also, it looks like another week until the consumer variants are revealed.
 
The idea that a bunch of studios will spend money just to cater to people that have newer cards from only one vendor is silly.
Not really silly when you consider the dozens of titles with specific GameWorks implementations: HBAO+, HFTS, VXAO, PCSS+, TXAA, Ansel, HairWorks, WaveWorks, TurfWorks, etc. You also mentioned PhysX. NVIDIA will just adapt the ray tracing stuff into new effects that will be implemented just like the others: RT AO, RT GI, RT Shadows, etc.
 
NVIDIA released a teaser for the GeForce variants, showing some players chatting online; the player names have hidden meanings and hints:

RoyTeX = RTX
Mac-20 = 20 Series
Eight-Tee = 80

NVIDIA RTX 2080

Also one of the players is named: Not 11
There is also the phrase: give me 20

The launch date is 20 August 2018; the numbers of the date appear in a specific order that translates into 2080.

 
NVIDIA released a teaser for the GeForce variants, showing some players chatting online; the player names have hidden meanings and hints:

RoyTeX = RTX
Mac-20 = 20 Series
Eight-Tee = 80

NVIDIA RTX 2080

Also one of the players is named: Not 11
There is also the phrase: give me 20

The launch date is 20 August 2018; the numbers of the date appear in a specific order that translates into 2080.


Yeah, right. I will believe the launch is imminent when I see Nvidia organize a 7-stop townhall tour where pre-screened YouTubers and bloggers get 30 minutes of game time with taped-over FPS counters and then report back how it made them feel; explain how warm and danceable it was and what repressed childhood memories it helped them work through.
 
Going for broke: I would be really happy if the Quadro RTX 5000 is GT104 in workstation form, and its specs end up being close to what we get with the ?TX ??80.
 
I would be really curious to see an NVIDIA RTX vs. PowerVR GR6500 architecture comparison, and what the main differences are between them (on the ray tracing side, not the overall chip).

EDIT: also, do you think ray tracing was planned for Navi? Or will AMD release a chip without that functionality?
 
To get the best out of the new architecture will of course need Nvidia and software partners to work hard on building from the foundations of the Nvidia RTX development platform. Nvidia, for its part, has "a brand new software stack for computer graphics merging rastering and ray tracing, computing and AI," according to Huang. Now that Turing has been officially announced Huang revealed that more than two dozen key ISVs are already working on RTX support.

Software support will be delivered by well known companies such as Adobe, Autodesk, and Pixar - on the content creator side of things. For entertainment and games support Nvidia named EA, Epic Games, and Remedy as early adopters of RTX technology. Probably the most interesting gaming industry assertions came in a testimonial from Epic which said "Just as we saw with the movie business over a decade ago, ray tracing is going to revolutionize the realism of real-time applications, cinematic experiences and high-end games". Epic added that gamers should look forward to content that is "indistinguishable from reality".

https://hexus.net/tech/news/graphics/121055-nvidia-turing-brings-ray-tracing-real-time-graphics/
 
Going for broke: I would be really happy if the Quadro RTX 5000 is GT104 in workstation form, and its specs end up being close to what we get with the ?TX ??80.

I'm going to guess that's what it ends up being. The Quadro P5000 is based on GP104 with 2560 shader cores, and the RTX 5000 has only 3072, so the bump seems to be in a reasonable ballpark.
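For a quick sanity check on that ballpark, using only the two core counts quoted above (everything else is just arithmetic):

```python
# Shader-core bump from Quadro P5000 (GP104) to Quadro RTX 5000,
# using the figures quoted in this thread.
p5000_cores = 2560    # GP104-based Quadro P5000
rtx5000_cores = 3072  # Quadro RTX 5000
ratio = rtx5000_cores / p5000_cores
print(ratio)  # 1.2, i.e. a 20% increase in shader cores
```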


Now the question I have: given that the 2080 will have 50-66% of the big chip's ray tracing performance in gigarays per second, is that enough to do anything meaningful in games? Obviously, I don't expect fully ray-traced games, but what will change?
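For a rough sense of what those numbers buy, here's a back-of-envelope sketch. The ~10 Gigarays/s figure for the big chip and the 50-66% range come from the discussion above; the resolution and frame rate are assumptions picked purely for illustration:

```python
# Back-of-envelope ray budget: how many rays per pixel per frame a given
# ray throughput allows. All input figures are illustrative assumptions.

def rays_per_pixel(gigarays_per_s, fps, width, height):
    """Rays available per pixel per frame at a given throughput."""
    rays_per_frame = gigarays_per_s * 1e9 / fps
    return rays_per_frame / (width * height)

# Hypothetical big-chip figure (~10 Gigarays/s) at 1080p, 60 fps:
full = rays_per_pixel(10, 60, 1920, 1080)    # ~80 rays/pixel/frame

# A cut-down part at 50-66% of that throughput:
low = rays_per_pixel(5.0, 60, 1920, 1080)    # ~40 rays/pixel/frame
high = rays_per_pixel(6.6, 60, 1920, 1080)   # ~53 rays/pixel/frame

print(round(full), round(low), round(high))
```

Even the cut-down budget leaves tens of (theoretical peak) rays per pixel per frame, so a couple of effects at 1-2 rays per pixel each, plus denoising, looks plausible; full path tracing at many samples per pixel clearly does not.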
 
We'll probably see ray tracing effects (shadows, AO, reflections) as slower, higher-quality options alongside their raster counterparts in games.
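As a toy illustration of what one such effect costs in rays, here is a Monte Carlo ambient occlusion estimate for a surface point shadowed by a single sphere. This is plain Python, not RTX/DXR code; the scene, the sample count, and the flat +z surface normal are all made-up assumptions:

```python
import math
import random

def ray_sphere_hit(o, d, c, r):
    """True if the ray from o along unit direction d hits the sphere (c, r)."""
    oc = [o[i] - c[i] for i in range(3)]
    b = sum(oc[i] * d[i] for i in range(3))
    disc = b * b - (sum(x * x for x in oc) - r * r)
    if disc < 0:
        return False
    s = math.sqrt(disc)
    return (-b - s) > 1e-4 or (-b + s) > 1e-4

def ambient_occlusion(p, sphere_c, sphere_r, n_samples=512, seed=0):
    """Fraction of random hemisphere rays from p that escape the sphere.
    Assumes a flat surface with normal +z (a simplification of this toy)."""
    rng = random.Random(seed)
    unoccluded = 0
    for _ in range(n_samples):
        # Rejection-sample a uniform direction on the upper hemisphere.
        while True:
            d = [rng.uniform(-1.0, 1.0) for _ in range(3)]
            l2 = d[0] * d[0] + d[1] * d[1] + d[2] * d[2]
            if 1e-6 < l2 <= 1.0:
                break
        inv = 1.0 / math.sqrt(l2)
        d = [x * inv for x in d]
        if d[2] < 0.0:
            d[2] = -d[2]  # flip into the hemisphere above the surface
        if not ray_sphere_hit(p, d, sphere_c, sphere_r):
            unoccluded += 1
    return unoccluded / n_samples

# A point directly below the sphere is noticeably occluded,
# while a point off to the side is almost fully lit:
print(ambient_occlusion([0, 0, 0], [0, 0, 2], 1.0))   # partly shadowed, below 1.0
print(ambient_occlusion([10, 0, 0], [0, 0, 2], 1.0))  # close to 1.0
```

Note the cost: 512 rays for a single shaded point. Multiply by pixel count and frame rate and you get exactly the kind of budget the gigarays figures describe, which is why these effects would ship as slower, optional upgrades over their screen-space raster equivalents.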
 