AMD: Navi Speculation, Rumours and Discussion [2019-2020]

Well, some guy in the comments section mentioned it was showing a difference on his 1080 Ti.

That's not saying much, though, as the original image quality isn't exactly brilliant, but it is definitely sharper with it "on".

OFF: [image: RxMzzkr.png]

ON: [image: yPDNuzQ.png]
 
Maybe a driver thing then, or just no visible difference in some areas? DSOGaming's shots don't really seem to show a difference.
 
Any idea what those bolded features actually are? I don't remember reading anything about them before, curiously enough.
No, sorry, they are new to me as well.
This was covered by Scott Herkelman at the AMD E3 "Next Horizon Gaming" event (rewind to 46m20s):


FidelityFX is an "open-source image quality toolkit" for game developers, available from GPUopen.com - basically a collection of post-processing filters which incur no performance penalty.

Radeon Image Sharpening is a similar effect but comes as a driver feature which can be force enabled in settings (rewind to 49m20s).
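For anyone wondering what a filter like that actually does under the hood, here is a rough CPU-side sketch of the contrast-adaptive sharpening idea, purely for illustration. This is not AMD's CAS code (the real filter ships as a shader on GPUOpen); the weights and the adaptivity rule below are my own assumptions.

Code:
// Rough CPU sketch of a contrast-adaptive sharpening pass (illustrative only,
// NOT AMD's actual CAS shader). Each pixel is pushed away from the average of
// its four neighbours, and the push is scaled down where local contrast is
// already high, to limit ringing around hard edges.
#include <algorithm>
#include <cstdio>
#include <vector>

std::vector<float> sharpen(const std::vector<float>& img, int w, int h,
                           float strength = 0.5f) {
    std::vector<float> out(img);
    for (int y = 1; y < h - 1; ++y) {
        for (int x = 1; x < w - 1; ++x) {
            float c = img[y * w + x];
            float n = img[(y - 1) * w + x];
            float s = img[(y + 1) * w + x];
            float e = img[y * w + x + 1];
            float wv = img[y * w + x - 1];
            float lo = std::min({n, s, e, wv, c});
            float hi = std::max({n, s, e, wv, c});
            // Adapt: the larger the local range, the less extra sharpening.
            float amount = strength * std::max(1.0f - (hi - lo), 0.0f);
            float blur = 0.25f * (n + s + e + wv);
            out[y * w + x] = std::clamp(c + amount * (c - blur), 0.0f, 1.0f);
        }
    }
    return out;
}

int main() {
    // Tiny 4x4 greyscale test image with a soft vertical edge in the middle.
    std::vector<float> img = {0.2f, 0.2f, 0.8f, 0.8f,
                              0.2f, 0.3f, 0.7f, 0.8f,
                              0.2f, 0.3f, 0.7f, 0.8f,
                              0.2f, 0.2f, 0.8f, 0.8f};
    std::vector<float> out = sharpen(img, 4, 4);
    // The two edge pixels move further apart, i.e. the edge gets steeper.
    std::printf("before: %.2f / %.2f   after: %.2f / %.2f\n",
                img[5], img[6], out[5], out[6]);
    return 0;
}

In practice RIS/CAS runs as a single post-process shader pass at the end of the frame, which is presumably why AMD can claim the cost is close to negligible.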
 
We've been able to do that with ReShade for a while...

Granted, it will be easier for novices, but there is nothing specific to Navi here...
 
I dunno, since this is being done at the driver level they should have access to more data than the ReShade injector.

At any rate, it's definitely a step up from the horrible morphological AA, which was slow and smeary and applied itself to everything, including menus, making it worse than nothing in most instances.
 
This may force Nvidia to offer DLSS upscaling without requiring RTX effects to be enabled, which would be a big win for RTX owners as well.
Really looking forward to seeing tests of these new features; they sound very promising.
 
What about the anti-lag feature? Anyone have an idea of how it works? Because it looks way more promising than a sharpening filter.
It's suspected to just set the pre-rendered frame queue to 1 (or can it be 0 without issues?), but there's no confirmation one way or the other yet.
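If that theory is right, the effect would be close to what an application can already request itself through DXGI. A minimal sketch under that assumption (capping the render-ahead queue at one frame); whether AMD's Anti-Lag does exactly this, or something smarter in the driver, is unknown.

Code:
// Minimal sketch of the "pre-rendered frames = 1" theory (unconfirmed):
// an application can cap its own render-ahead queue through DXGI.
#include <d3d11.h>
#include <dxgi.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D11Device> device;
    // Plain hardware device; no swap chain needed for this demonstration.
    HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr,
                                   0, nullptr, 0, D3D11_SDK_VERSION,
                                   &device, nullptr, nullptr);
    if (FAILED(hr)) {
        std::printf("device creation failed\n");
        return 1;
    }

    ComPtr<IDXGIDevice1> dxgiDevice;
    if (SUCCEEDED(device.As(&dxgiDevice))) {
        // Default is 3 queued frames; 1 means the CPU may run at most one
        // frame ahead of the GPU, trading some throughput for lower latency.
        dxgiDevice->SetMaximumFrameLatency(1);
        std::printf("maximum frame latency set to 1\n");
    }
    return 0;
}

The default queue depth is 3 frames, so dropping to 1 trades a little throughput and smoothness for lower input-to-display latency; whether the driver also does something cleverer, like delaying input sampling, is anyone's guess at this point.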
 
It's a good first step towards feature parity with Nvidia.

Just a few days to go. The 2060 Super and 2070 Super are going to be basically neck and neck with the 5700 and 5700 XT, so it will be an interesting match-up.

Will the ray tracing be enough to justify a $50 increase in cost over the 5700 XT? Maybe! I guess we'll have to wait and see. Great to see some competition finally, though.
 
TBH variable rate shading is IMO a bigger selling point and justification for the higher price, at least for now.
 
There are better selling points for Turing right now, and variable rate shading is one of them, but not the biggest. To run next-gen games, say GTA VI and whatever Cyberpunk++ games are rumored to be under development, among all the others, you'd want DXR compatibility, and probably programmable primitive shaders (like mesh shaders; we need a standard there!). Developers just don't want to support two separate graphics paths if they can help it, and since both the PS5 and the next Xbox(?) support some sort of hardware-accelerated ray tracing, odds are there'll be plenty of games that require just that over time. And as the DXR backwards-compatibility patches for Nvidia show, you really don't want to be stuck on software-only emulation.

As for DLSS and the like, it's glorified PR. Inject SMAA or drop some other dll into the exe folder and you get much the same thing. Most people probably don't know that either Nvidia's or AMD's hyped-up driver AA exists, and next to no one should care. What they don't know exists, but should be caring about, is the arbitrarily low RAM specs and the lack of hardware ray tracing on AMD cards. What they do know and care about is the high prices of these cards relative to their performance. I'm very, very grateful, for competition's sake, that Intel will be entering the high-end GPU market next year. True competition, woot woot!
 
TBH variable rate shading is IMO a bigger selling point and justification for the higher price, at least for now.
Variable rate shading is transparent to the user and it "only" boosts performance, and according to the results we've seen from the Ice Lake-U presentations, it's mostly useful for GPUs that lack raw compute power for their segment (i.e. not most AMD solutions).

Lack of DXR will mean a constantly greyed-out option for every AMD user. That has a whole different impact.
In 2018 I think the feature was completely useless. In 2019 it's still not very useful, but it's definitely a long-term feature to pay attention to.
But in 2020 (6 months from now)? Well, in 2020 we'll have Cyberpunk supporting it. And Cyberpunk will most probably be the system seller of the year.

And people spending $450 on a newly released 5700 XT next week will have a greyed-out IQ option in Cyberpunk.
AMD definitely knew DXR/RT would be widely adopted by 2020. They knew because they've been designing the console GPUs that support RT hardware.

Which is yet another reason why Navi was most definitely not an architecture designed to release in H2 2019. These chips should have been released by early Q4 2018 at the latest.
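And the "greyed-out option" isn't a marketing decision; it falls straight out of the capability query a game runs at startup. A minimal sketch of that check, assuming the title uses plain DXR through D3D12 (the enums and structs below are the real D3D12 ones and need a recent Windows 10 SDK; the scaffolding around them is just illustrative):

Code:
// Hedged sketch: the capability query a D3D12 game can run at startup to
// decide whether the DXR and VRS menu options are even selectable.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::printf("no D3D12 device available\n");
        return 1;
    }

    // Ray tracing (DXR) tier lives in OPTIONS5.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opt5 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &opt5, sizeof(opt5));
    std::printf("DXR: %s\n",
                opt5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0
                    ? "supported"
                    : "not supported -> option greyed out");

    // Variable rate shading tier lives in OPTIONS6.
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opt6 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6, &opt6, sizeof(opt6));
    std::printf("VRS: %s\n",
                opt6.VariableShadingRateTier !=
                        D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED
                    ? "supported"
                    : "not supported");
    return 0;
}

On Navi as it launches, the first query comes back with the "not supported" tier, so the engine never even offers the toggle.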
 
Correct.

That is why Dr Su said they will release Ray Tracing when it doesn't impact performance. Everything you've just said applies to Nvidia too, but with a mocking laugh, because Nvidia TRIED to give us ray tracing but failed horribly. Nobody is going to turn ray tracing on when it means taking a massive performance hit. That is why you just bought an $800 card: because you want performance, not eye glitter.

Who is pushing (i.e. paying) for Ray Tracing in games? Not the gamer; they have been pushing for more raw performance and poly crunching. I suspect that in 2020 we will get cards that support real-time ray tracing in hardware.
 
Correct.

That is why Dr Su said they will release Ray Tracing when it doesn't impact performance.
Total nonsense. How is this possible? :rolleyes:

Everything you've just said applies to Nvidia too, but with a mocking laugh, because Nvidia TRIED to give us ray tracing but failed horribly.
No, they didn't. It's a first-generation attempt and, as usual, new tech takes time to mature. Expecting a free ride on such a massive change in rendering is unrealistic. In fact, I think the performance is pretty good when you compare it to the DXR software fallback.

Nobody is going to turn ray tracing on when it means taking a massive performance hit. That is why you just bought an $800 card: because you want performance, not eye glitter.

Who is pushing (i.e. paying) for Ray Tracing in games? Not the gamer; they have been pushing for more raw performance and poly crunching. I suspect that in 2020 we will get cards that support real-time ray tracing in hardware.
You can't speak for everybody. Many gamers enjoy RT now, and with DLSS it works quite well (no more of the blurry mess from the early days).
And how is Turing's dedicated BVH-traversal silicon not hardware ray tracing?

Finally, why so much hate?
 
Total nonsense. How is this possible? :rolleyes:
With further software optimisations, hopefully with minimal visible artefacts?
I tend to agree with that statement, regardless of whether it's from Lisa Su or not. The future lithographic prospects for HP circuitry are rather bleak, and while the industry will increasingly lean on advanced packaging to push the envelope going forward, that's more difficult with chips that emit a lot of heat. Over the last ten years the number of TFLOPS/mm2 in GPUs has increased by a factor of 4. (Yes, I know this is a ballpark number and that other factors play into performance.) I strongly doubt it will climb at a higher rate over the next 10 years.
Lithography is unlikely to solve the performance problems of RT, so unless its efficiency improves to the point where it beats the acceptable alternative methods, the technique faces an uphill battle in the overall graphics market.
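Just to put that ballpark number in annual terms (my own quick arithmetic on the factor-of-4-over-ten-years figure above):

\( 4^{1/10} \approx 1.15 \), i.e. roughly 15% per year compounded.

RT would need efficiency gains well beyond that pace to become effectively "free" through process scaling alone.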
 
because Nvidia TRIED to give us ray tracing but failed horribly. Nobody is going to turn ray tracing on when it means taking a massive performance hit. That is why you just bought an $800 card: because you want performance, not eye glitter.

I still consider myself the harshest critic of DXR on this forum, but I have to disagree with all of this.
RTX is no failure. Adoption seems to be at its peak: Cyberpunk, CoD and everything else are announcing support.
If you want high performance, lower the resolution to something still large enough, or turn RTX off. But if I spend more than $400 just on a GPU, I don't expect it to give me any advantage other than better image quality, and what could serve that better than RT?

Actually my Fury X is toast and I have to upgrade. I always planned to wait for Navi, but for the money they want I can get RT at the same price and performance.
Vega 56 is a great offer - but paying almost twice as much just for the newer architecture with the same features and TFLOPS? No.
I'll wait a bit to see how prices go, but I'm leaning towards NV, monopoly and all. Pricing could be AMD's only argument until they have RT too. FidelityFX on vs. RTX on will not convince even AMD fanboys.
 
That is why Dr Su said they will release Ray Tracing when it doesn't impact performance.
That's the lamest excuse I have ever seen; it's just a ruse to cover up the fact that they are two years behind NVIDIA in ray tracing. Worse yet, they will have it next year in consoles while their current $700 and $500 GPUs lack even the most basic DXR support, which really doesn't earn any points for future-proofing their products.
That is why you just bought an $800 card: because you want performance, not eye glitter.
People buy expensive GPUs to push eye candy to the max as well; if you want performance, you turn the settings down to Medium on a 1060/580 and let your fps fly.
 
And that product segment represents the bulk of discrete desktop GPUs. It is still much stronger than the average discrete laptop GPU, which in turn is stronger than the integrated graphics that make up the bulk of PC graphics.
So when will RT be more than a curiosity switch in games, for those that enjoy tech nerding? Ever?
The upcoming consoles will pretty much define the AAA games market for the next decade or so. It will be interesting to see what they bring to the table.
 
So when will RT be more than a curiosity switch in games, for those that enjoy tech nerding? Ever?
That is the case with every major new API feature: its effective usability trickles down from high-end GPUs through the rest of the stack. It was the case with tessellation, HDR lighting, Shader Model 3, etc.

In fact, most Ultra graphics settings in today's PC games are curiosity switches for 1060/580 owners. That doesn't stop developers from implementing them in EVERY PC title. And the way I see it, RT is no exception to that rule, especially now that consoles, the major APIs, almost all engines, and AAA games are supporting it.
 