Digital Foundry Article Technical Discussion [2020]

Is it the same?
I was under the impression that the version MS showed (DXR) during the XSX reveal is different from the RTX one.
Could be wrong, but I'm sure DF said they were different, even though Nvidia did help out?
3:34 in.



So as I said, the RTX version/branch isn't wasted effort as it's all learning, but DXR is a fork, and my point was I'm surprised they didn't just put all the effort into the DXR one now.
I suspect the RTX version will not run on RDNA2 RT.
Nah both use DXR to run it.
 
My point is that they are branches, when they could focus on one. The 'official' DXR one was based on the RTX one.
Also, I'm not saying this isn't useful to be fed into the DXR one that MS is working on.

But there's no guarantee that Nvidia hasn't made, or won't make, it so it only runs on RTX hardware.
Pretty sure I don't need to go into things like PhysX, which could run on other hardware but they made sure to restrict it. Or is this a new Nvidia and I missed the change? :D

This goes for all their RTX games like Quake 2 RTX.
 
If it runs on DXR, it works on every vendor's hardware that supports DXR in the driver. There is no such thing as "limiting a game's RT to RTX". The worst case scenario is that AMD GPUs do not support Quake 2 RTX or Wolf: Youngblood because the AMD driver team does not offer support for the original vendor Vulkan RT extensions from before the API was codified, or those games do not get updated to just switch out the extension names.

Turing has existed for more than 1.5 years now and still this misnomer or misinformation pops up. RTX is just a marketing name for a HW and driver implementation of certain technologies, not an exclusivity moniker. It carries about as much meaning as the word "GeForce".
 
I know RTX is using DXR.
I'm just not as confident as you that Nvidia wouldn't make RTX-branded games that they funded feature-exclusive to their cards.
It's not about whether it is possible to run on other cards; it's whether they would stop it or not.
 
They are just doing API calls though, so that is not possible. Why fearmonger about something that is not possible?
 
Pretty sure I don't need to go into things like PhysX, which could run on other hardware but they made sure to restrict it. Or is this a new Nvidia and I missed the change? :D

If PhysX was part of DirectX then you would have a point, but PhysX was never part of Microsoft's DX API.

NV can't restrict things that are in the DirectX API, they can only restrict things that aren't in the DirectX API. Hence, anything that uses DXR will run on ALL hardware that supports those DXR features.

Regards,
SB
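
For anyone wondering what that looks like in practice, here is a minimal sketch of the vendor-neutral check: DXR support is queried through the D3D12 API itself, and no vendor ID appears anywhere in the path. The helper name is mine and the device setup is abbreviated; treat it as an illustration, not production code.

```cpp
// Minimal sketch: querying raytracing support through D3D12 itself.
// Whatever GPU the default adapter happens to be, the same check applies;
// there is no vendor field anywhere in this path.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

// Returns true if the device reports any DXR tier.
bool SupportsDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;
    // TIER_1_0 covers DXR 1.0 (DispatchRays etc.); TIER_1_1 adds inline RT.
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

int main()
{
    ComPtr<ID3D12Device> device;
    // nullptr = default adapter, whichever vendor that is.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
        return 1;
    std::printf("DXR supported: %s\n", SupportsDXR(device.Get()) ? "yes" : "no");
    return 0;
}
```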
 

And a summary from ILikeFeet from Era.

  • "what's the difference between ray tracing and path tracing?" not much, RT is more single solutions to a task (RT shadows, reflections, etc) while PT is everything at once
  • started in March 2019, no one remembers who came up with the idea
  • contrary to popular belief, the simple style doesn't make path tracing easier as Minecraft has a lot of polygons on screen and physically based materials
  • RenderDragon already supported DX12 so adding DXR wasn't too much work
  • it took 2-3 weeks for some simple path traced AO
  • uses DXR 1.0; they'll look into 1.1 support, but it might not happen (they might not need it)
  • irradiance caching is in Minecraft while it's not in Quake 2
  • irradiance caching stores ray data on geometry and accelerates secondary rays for more detail and performance
  • it's used to get multiple bounces
  • Metro Exodus stored GI data using spherical harmonics to reconstruct specular data
  • perfect mirrors get 8 bounces, rougher surfaces only do 2 bounces
  • volume fog uses a similar method as rasterized volume fog
  • RT allows them to make colored shadows
  • each transparent surface has a transmission value (rgb) and as the ray passes through, a transmission ray is cast to collect the values of every transparent surface afterwards to determine the color of the non-transparent surface the ray might hit (this one was difficult to summarize, so correct me if I'm wrong; see the first sketch after this list)
  • water is slightly different in that transmission loss is heightened in order to get that fade to darkness with the depth
  • their method for motion vectors "works", will be getting better
  • denoising costs the most (15% cache updates, 40% ray tracing, 45% denoising)
  • GI is noisy AF
  • denoising is very bandwidth heavy
  • Minecraft uses 3 separate denoisers (shadows, speculars, diffuse)
  • diffuse and shadows move differently than specular/reflection so they have to be handled differently
  • screen space denoising via spherical harmonics
  • more emissive surfaces = easier denoising, because of a larger, cleaner signal
  • explicit lights (torches, rods, lamps) are small, so harder to denoise
  • more rays = less noise, but there are diminishing returns
  • noise only falls with the square root of the ray count, so rays have to go up 4x for each halving of noise (see the second sketch after this list)
  • AI denoising is used a lot in offline rendering but not yet for realtime; research is being done
  • performance drop with higher render chunks is a memory issue
  • objects in the distance are shaded at the same level as objects up close (I recall somewhere, probably Beyond3D, that LoD is an issue with ray tracing because of this)
  • primary visibility is done via RT rather than rasterization out of laziness; to rasterize, they'd have needed to modify the render engine to output a g-buffer
  • the lensing effect is caused by casting those primary rays, so changing primary visibility to the rasterizer would lose that effect unless they worked it into the renderer, "they'll see"
  • global illumination data is in the fog volume, but not in the beta build, so emissives will light the fog; they hope to update the beta
  • mesh-based caustics are a crazy idea
  • temporal lag is being worked on, is a denoising issue
  • there's a light leaking issue in caves
  • particles are rasterized
  • path tracing is more general than rasterization (rasterizers can be different between genres due to needs)
  • path tracing allows for a lot of unplanned features like lenses and camera obscura
  • in first person, you don't have a body to be rendered; maybe they'll make something for the final release
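
Two quick sketches to pin down the trickier bullets above. First, the transmission one: a toy CPU-side version of how I read that bullet. This is not Mojang's actual code; the struct names and coefficients are invented for illustration.

```cpp
// Toy sketch of the transmission bullet (my reading of the summary, not
// Mojang's code): a transmission ray multiplies per-channel transmittance
// through every transparent surface it crosses, tinting whatever it reaches.
#include <cstdio>
#include <cmath>
#include <vector>

struct Rgb { float r, g, b; };

// Per-channel fraction of light each transparent surface lets through.
Rgb Transmittance(const std::vector<Rgb>& surfaces)
{
    Rgb t{1.0f, 1.0f, 1.0f};
    for (const Rgb& s : surfaces) {
        t.r *= s.r;
        t.g *= s.g;
        t.b *= s.b;
    }
    return t;
}

// Water heightens the loss with distance travelled (Beer-Lambert style
// absorption), giving the fade to darkness with depth. Coefficients are
// made-up numbers; red dies fastest underwater.
Rgb WaterTransmittance(float depth)
{
    const Rgb absorb{0.45f, 0.10f, 0.05f};
    return {std::exp(-absorb.r * depth),
            std::exp(-absorb.g * depth),
            std::exp(-absorb.b * depth)};
}

int main()
{
    // A shadow ray passing through red stained glass, then plain glass.
    std::vector<Rgb> hits = {{0.9f, 0.2f, 0.2f}, {0.8f, 0.8f, 0.9f}};
    Rgb t = Transmittance(hits);
    std::printf("tint through glass: %.2f %.2f %.2f\n", t.r, t.g, t.b);

    Rgb w = WaterTransmittance(8.0f); // eight blocks deep
    std::printf("tint through water: %.3f %.3f %.3f\n", w.r, w.g, w.b);
    return 0;
}
```

Second, the diminishing-returns one: the standard error of a Monte Carlo estimate falls as 1/sqrt(N), so each halving of noise costs 4x the samples. A tiny standalone demo with an arbitrary integrand, nothing Minecraft-specific:

```cpp
// Demo of the square-root relationship: estimate the integral of x^2
// over [0,1] (exact value 1/3) with N uniform samples and measure the RMS
// error over many trials. Each 4x increase in N roughly halves the error.
#include <cstdio>
#include <cmath>
#include <random>

int main()
{
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> u(0.0, 1.0);
    const int trials = 2000;

    for (int n = 16; n <= 4096; n *= 4) {
        double sqErr = 0.0;
        for (int t = 0; t < trials; ++t) {
            double sum = 0.0;
            for (int i = 0; i < n; ++i) {
                double x = u(rng);
                sum += x * x; // one "ray" = one sample of the integrand
            }
            double err = sum / n - 1.0 / 3.0;
            sqErr += err * err;
        }
        std::printf("N=%5d  rms error=%.5f\n", n, std::sqrt(sqErr / trials));
    }
    return 0;
}
```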
 
If PhysX was part of DirectX then you would have a point, but PhysX was never part of Microsoft's DX API.

NV can't restrict things that are in the DirectX API, they can only restrict things that aren't in the DirectX API. Hence, anything that uses DXR will run on ALL hardware that supports those DXR features.

Regards,
SB
I accept that and fully agree, as I think there's a misunderstanding of what I mean.

I was talking about a driver check to allow enabling of the feature, i.e. enabling an RTX-branded feature, regardless of whether it's on top of DXR.
If it's their funded feature, they can limit it to whatever hardware they want. Even if it would be hacked straight away.
Windows is open; as a developer you can support or not support whatever you want, you don't have to support everything.

And that was a throwaway comment anyway; my post was about there being what looks like two branches, even though this will still be beneficial.

Anyway, I think we've got to the end of this line of discussion as it wasn't the point to begin with.
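
To make the concern concrete, here is a hypothetical sketch of the kind of gate being described: a plain adapter vendor ID check sitting in front of a feature toggle, even though the rendering underneath would still be vendor-neutral DXR calls. No shipped RTX title is confirmed to do this; the code just shows how trivial such a check would be to write.

```cpp
// Hypothetical vendor gate (nothing here is from a real game): check the
// default adapter's PCI vendor ID before flipping a sponsored-feature
// switch. The underlying RT work would still be plain DXR API calls.
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

constexpr UINT kVendorNvidia = 0x10DE; // well-known PCI vendor ID

bool DefaultAdapterIsNvidia()
{
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return false;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) // adapter 0 = default
        return false;

    DXGI_ADAPTER_DESC1 desc = {};
    adapter->GetDesc1(&desc);
    return desc.VendorId == kVendorNvidia;
}

int main()
{
    // The feature itself could run anywhere; the toggle is the restriction.
    std::printf("branded feature: %s\n",
                DefaultAdapterIsNvidia() ? "enabled" : "disabled");
    return 0;
}
```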
 
Please read the entire written DF article too: https://www.eurogamer.net/articles/digitalfoundry-2020-is-stadias-free-tier-1080p60-makes-more-sense

With its free service, Stadia is starting to make sense
1080p gaming with a 60fps focus trumps the 4K Pro experience.

Stadia is now free! Or rather, any Google user can now sign up to the service and access the games library without having to subscribe to the Pro tier or purchase the firm's bespoke controller and Chromecast Ultra 4K HDR receiver. It's a good jumping on point for users interested in the service and as we shall discover, accessing Stadia via Chrome browsers, smartphones or tablets can actually offer a key advantage over the Pro-level 4K Chromecast Ultra experience. In returning to Google's cloud service, we also wanted to take the opportunity to go back to Doom Eternal and revisit our latency metrics - a key point of criticism in prior coverage. Was Stadia just having a bad day when we tested it? Was there something wrong with our network? Could we bring latency back down to the impressive level we saw in our Stadia review?

The good news is that we have managed to reduce latency in our Stadia test set-up, improving the Doom Eternal experience significantly. id Software's port succeeds in pushing an 1800p resolution, excellent visuals and a highly consistent 60fps. However, fast response is a must for a fast-paced first-person shooter and our initial results just weren't good enough. We logged a range of latency results between 79-100ms extra compared to the Xbox One X version of the game - a surprise given the 300Mbps fibre connection behind it. Google itself asked for permission to access our telemetry (which we granted) but our end goal is to give the system and the software the fairest assessment we can, so we spent a lot more time investigating the metrics ourselves and looking to optimise the experience.

...

Meanwhile, the variability in our latency tests is also a concern, even if +45ms was ultimately achievable via a Chrome browser. In asking for our telemetry while using a Chromecast Ultra, Google mentioned to us that they're aware of extreme cases involving a large number of connected devices causing latency issues at the user side. Initial results with all devices stripped out of the network showed a good improvement, but to lose that improvement in identical conditions the next day was dispiriting. Whether latency was +58ms vs Xbox One X or +100ms, the connection was rated as 'excellent' in all cases when the variation in the gameplay experience was easily felt. Stripping out all WiFi from our setup - up to and including the Stadia controller - solved the issue and does suggest a problem at our end (the controller worked fine in the Stadia review sessions, after all) but it would have been useful to get some kind of feedback from Stadia itself when our experience was running under spec by such a margin.

At this point, we've tested all major Stadia ports and there remains the sense that there's a wealth of potential here but the execution isn't entirely right. There's still the excellent accessibility aspect - super-fast loading of our entire library of games isn't to be sniffed at. And it's great to see titles like The Division 2 hitting 60fps when Xbox One X can't, meaning that developers can tap into CPU performance that's much higher than console counterparts. However, this is offset in many cases by quantifiably lower GPU performance from a graphics core that was promoted as being considerably more powerful. Meanwhile, it's difficult to recommend Stadia's premium subscription when 4K - or sub-native 4K - rendering can come with a profound performance penalty in so many games. All of these areas can and should be addressed but the inconsistency in our input lag testing is another concern. On the one hand, it's good that after extended testing and re-testing, we were able to bring latency down to a much more manageable level - but on the other hand, it felt like we were operating in the dark, with the Stadia system itself offering no indications that anything was wrong at any point, nor offering any advice on correcting the lag issues we encountered. Relatively speaking, it's still early days for Stadia - and cloud gaming in general - so hopefully platform stability and proper tools for tuning the experience will be delivered over time.
 
It's like Intel wants to retire from laptop and desktop computers, but they can't compete with Ryzen anymore. So I think they gave up....

Their next generation is all about renaming technology that they already have, which is much less efficient than Ryzen, slower, filled with vulnerabilities, and they can't shrink their chips to 7nm. This looks to me like the final nail in Intel's coffin -in the desktop/laptop market-.

 