https://www.guru3d.com/news-story/download-3dmark-for-windows-v2-11-6846-available.html
Today, they're adding a new option to use a more versatile and sophisticated form of Variable-Rate Shading in the VRS feature test, Tier 2.
...
With Variable-Rate Shading, a single pixel shader operation can be applied to a block of pixels, for example shading a 4×4 block of pixels with one operation rather than 16 separate operations. By applying the technique carefully, VRS can deliver a big performance boost with little impact on visual quality. With VRS, games can run at higher frame rates, in a higher resolution, or with higher quality settings. You need Windows 10 version 1903 or later and a DirectX 12 GPU that supports Variable-Rate Shading to run the 3DMark VRS feature test. Tier 1 VRS is supported by NVIDIA Turing-based GPUs and Intel Ice Lake CPUs.
Tier 2 VRS is currently only available on NVIDIA Turing-based GPUs.
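The savings described above (one shader invocation per block instead of one per pixel) can be illustrated with a small simulation. This is a hedged sketch, not the D3D12 VRS API: the `shade` function and tile loop are hypothetical stand-ins that just count shader invocations when one result is broadcast to a 4×4 block.

```python
# Illustrative sketch (not the D3D12 API): simulate coarse shading where one
# shading result is broadcast to a rate x rate block of pixels, as VRS allows.

def shade(x, y):
    # Hypothetical per-pixel shader: any pure function of pixel coordinates.
    return (x * 31 + y * 17) % 256

def coarse_shade(width, height, rate=4):
    """Shade one sample per rate x rate tile, broadcast it, count invocations."""
    image = [[0] * width for _ in range(height)]
    invocations = 0
    for ty in range(0, height, rate):
        for tx in range(0, width, rate):
            color = shade(tx, ty)  # one shader invocation per tile
            invocations += 1
            for y in range(ty, min(ty + rate, height)):
                for x in range(tx, min(tx + rate, width)):
                    image[y][x] = color
    return image, invocations

image, n = coarse_shade(16, 16)  # 16x16 pixels at a 4x4 coarse rate
print(n)                         # 16 invocations instead of 256
```

With `rate=1` this degenerates to ordinary per-pixel shading (256 invocations for a 16×16 region), which is the 16× reduction the article describes for a 4×4 block.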
3DMark for Windows update v2.11.6846 available - UL adds Variable-Rate Shading Tier 2
December 5, 2019
https://www.guru3d.com/news-story/download-3dmark-for-windows-v2-11-6846-available.html
Nice.
In favor of multi-GPU.
Like a Rage Fury MAXX or a 7950 GX2?
tested using https://www.testufo.com/ghosting#ba...tion=160&pps=960&graphics=bbufo.png&pursuit=1
Recorded via NVENC (GeForce Experience) and NVENC+CPU (OBS), running on a GTX 1660 SUPER, driver 441.41, Windows 10 1909.
Recordings made with NVENC stutter, while the recording made with x264 on the CPU was smooth.
- NVENC OBS
- NVENC GFE
- X264 CPU OBS
Googling around, people have been complaining about NVENC stutter since the first half of 2019:
https://www.google.com/search?q=nvenc+stutter
https://www.google.com/search?q=RTX+nvenc+stutter
Is NVENC on Turing (RTX and GTX 1660 SUPER) defective in some cards? How's yours?
Yup, really nice series. Nvidia has some short videos on RT basics. The 3rd one talks a bit about RT core capabilities.
It looks like Asus is planning on updating the Nvidia RTX 2060 with a full 8GB of 14Gbps GDDR6 video memory. That will push it further ahead of the latest Radeon, the AMD RX 5600 XT, and could even potentially put it on par with the AMD RX 5700. It’s possible that this is coming directly from Nvidia, and there will be 8GB versions of the RTX 2060 coming from all of the green team’s graphics card partners, but so far we’ve only seen details of three different Asus Republic of Gamer cards.
So, where would these new 8GB cards sit? At the moment, thanks to our graphics card comparison charts, you can see that the current RTX 2060 sits in between the RX 5600 XT and the RX 5700. With a bit of a memory upgrade you could see the newer edition pulling alongside the higher spec AMD Navi card and leaving the other in its wake.
https://www.anandtech.com/show/15637/microsoft-intros-directx-12-ultimate-next-gen-feature-set
To be sure, what’s being announced today isn’t a new API – even the features being discussed today technically aren’t new – but rather it’s a newly defined feature set that wraps up several features that Microsoft and its partners have been working on over the past few years. This includes DirectX Raytracing, Variable Rate Shading, Mesh Shaders and Sampler Feedback. Most of these features have been available in some form for a time now as separate features within DirectX 12, but the creation of DirectX 12 Ultimate marks their official promotion from in-development or early adopter status to being ready for the masses at large.
...
All told – and much to the glee of NVIDIA – DirectX 12 Ultimate’s feature set ends up looking a whole heck of a lot like their Turing architecture’s graphics feature set. Ray tracing, mesh shading, and variable rate shading were all introduced for the first time on Turing, and this represents the current cutting edge for GPU graphics functionality. Consequently, it’s no mistake that this new feature level, which Microsoft is internally calling 12_2, follows the Turing blueprint so closely. Feature levels are a collaboration between Microsoft and all of the GPU vendors, with feature levels representing a common set of features that everyone can agree to support.
Ultimately, this collaboration and timing means that there is already current-generation hardware out there that meets the requirements for 12_2 with NVIDIA’s GeForce 16 and 20 series (Turing) products. And while AMD and Intel are a bit farther behind the curve, they’ll get there as well. In fact in a lot of ways AMD’s forthcoming RDNA2 architecture, which has been at the heart of this week’s console announcements, will serve as the counterbalance to Turing as far as 12_2 goes. This is a feature set that crosses PCs and consoles, and while NVIDIA may dominate the PC space, what AMD is doing with RDNA2 is defining an entire generation of consoles for years to come.
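The article frames feature level 12_2 as an all-or-nothing bundle: a GPU qualifies only if it supports every feature in the set. A minimal sketch of that idea, using descriptive names rather than the actual DirectX API (the four features come from the article; the function and feature strings are illustrative):

```python
# Illustrative model of a feature level as a required feature set.
# Names are descriptive placeholders, not DirectX identifiers.
FEATURE_LEVEL_12_2 = {
    "raytracing",             # DirectX Raytracing (DXR)
    "variable_rate_shading",
    "mesh_shaders",
    "sampler_feedback",
}

def supports_12_2(gpu_features):
    """A GPU qualifies for 12_2 only if it reports every required feature."""
    return FEATURE_LEVEL_12_2 <= set(gpu_features)

# Per the article, Turing (GeForce 16/20 series) already has the full set.
turing = ["raytracing", "variable_rate_shading", "mesh_shaders", "sampler_feedback"]
print(supports_12_2(turing))  # True
```

This is why 12_2 "follows the Turing blueprint so closely": the common denominator the vendors agreed on happens to match what Turing already ships, and RDNA2 will report the same set.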
DLSS 2.0 Features:
● Superior Image Quality - DLSS 2.0 offers native resolution image quality using half the pixels. It employs new temporal accumulation techniques for sharper image details and improved stability from frame to frame.
● Customizable Options - DLSS 2.0 offers users 3 image quality modes (Quality, Balanced, Performance) that control render resolution, with Performance mode now enabling up to a 4X super resolution.
● Great Scaling Across All RTX GPUs and Resolutions - a new, faster AI model more efficiently uses Tensor Cores to execute 2X faster than the original, improving frame rates and removing restrictions on supported GPUs, settings, and resolutions.
● One Network for All Games - While the original DLSS required per-game training, DLSS 2.0 offers a generalized AI network that removes the need to train for each specific game. This means faster game integrations and more DLSS titles.
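The resolution arithmetic behind the modes above can be sketched as follows. The Quality ratio ("half the pixels") and the Performance ratio ("up to a 4X super resolution", i.e. one quarter of the pixels) come from the announcement; the Balanced ratio here is an assumption for illustration only, as are the function and dictionary names.

```python
# Hedged sketch: derive render resolution from an output resolution and a
# pixel-count ratio. Quality and Performance ratios follow the announcement;
# the Balanced value is an assumed intermediate, purely for illustration.
import math

PIXEL_RATIOS = {
    "Quality": 0.5,       # half the pixels of native (per the announcement)
    "Balanced": 0.33,     # assumed intermediate ratio (illustration only)
    "Performance": 0.25,  # quarter of the pixels, i.e. up to 4X super resolution
}

def render_resolution(out_w, out_h, mode):
    # A pixel-count ratio r means each axis scales by sqrt(r).
    scale = math.sqrt(PIXEL_RATIOS[mode])
    return round(out_w * scale), round(out_h * scale)

print(render_resolution(3840, 2160, "Performance"))  # (1920, 1080)
```

So Performance mode at 4K output shades a 1080p image (a quarter of the pixels) and the network reconstructs the remaining detail, which is where the frame-rate headroom comes from.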
DLSS 2.0 is officially announced
Control will be retrofitted with the new DLSS version.
DLSS was criticized when it launched, but now it seems really amazing. Is this tech also in the XSX, or is that a long shot since it's NV-based? There are big performance gains to be had.
This specific implementation is NVIDIA-specific, but Microsoft has demonstrated similar super-resolution tech on DirectML, which would run on any compatible hardware, including the XSX.