AMD Navi Product Reviews and Previews (5500, 5600 XT, 5700, 5700 XT)

I liked the raytracing-based lighting in Metro Exodus enough to deal with 1080p on my 1080 Ti running RTX. I've been waiting forever for better lighting on the scale of that global illumination. Unfortunately Metro Exodus is actually rather boring and crash prone. :) But that sort of tech is probably the future.
 
I liked the raytracing-based lighting in Metro Exodus enough to deal with 1080p on my 1080 Ti running RTX. I've been waiting forever for better lighting on the scale of that global illumination. Unfortunately Metro Exodus is actually rather boring and crash prone. :) But that sort of tech is probably the future.
Maybe when it's applied to more than just the one skylight. It kills the immersion from the better lighting pretty effectively when you realize it doesn't apply to most lights.
 
[Image: MSI's non-reference AMD Radeon RX 5700 XT (via Hardwareluxx)]


MSI's contribution to the non-reference Navi graphics card range. A pretty unusual step from a company whose management thinks AMD products are not worthy of their customer base. Seen from MSI management's perspective, Navi must be a very good product then. ;-)
 
Minimalist, with glossy plastic red accents. ;) It looks like the standard Twin Frozr sort of thing, though, which has always worked pretty well.
 
I didn't feel the immersion was in any way reduced compared to the typically fake-looking non-RTX mode. I didn't even notice what you're talking about.
So you see that RT is great, unlike the fake-looking non-RTX mode, but don't see that every light source except the sun is still that fake-looking non-RTX mode? o_O

MSI's contribution to the non-reference Navi graphics card range. A pretty unusual step from a company whose management thinks AMD products are not worthy of their customer base. Seen from MSI management's perspective, Navi must be a very good product then. ;-)
That's their model one or two steps down; Gaming X is the top of the line.
 
I want a high, consistent frame rate on my main monitor, which only supports 60Hz.
A 5700XT/2070S can't do that in every game at Ultra settings no matter how you slice it.

I always turn off all the "fake film" processing, such as distortion, chromatic aberration, grain filters, motion blur, and DOF effects, other than in cut scenes.
In the unlikely event that I need to turn anything off after that, the first things to get lowered quality are shadows and reflections, because they just don't matter much to how I perceive the world, real or rendered.

If you say you are not convinced by the effect of RT, then your logic applies pretty much the same to most Ultra settings: the difference between Ultra and High is very small in most games, and Ultra settings often come with a large performance impact as well. In fact, RT doesn't look bad in comparison to those Ultra settings, considering the amount of IQ it adds.

both parties have features which the other lacks,
Such as?
 
general gamers don't think that far ahead when planning purchases.
Most informed customers do. Nevertheless, most customers buy NVIDIA anyway; they will hear that NVIDIA supports something called RTX while AMD doesn't, which will probably only reinforce their choice.
And it's not like NVIDIA's RTRT performance is anything to write home about,
Still leaps and bounds better than not even supporting the feature.

we have no clue if their current RTRT implementation will even be relevant a year from now
All RT support on PC must come through DXR, which means any implementation compliant with it will be relevant.
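As a minimal illustration of why that is, here's a sketch (assuming a C++/D3D12 app; the helper function is made up for this example) of how a game detects ray tracing support. The check goes through the DXR feature tier, not through any vendor-specific API, so a compliant implementation from any vendor lights it up the same way:

```cpp
#include <windows.h>
#include <d3d12.h>

// Hypothetical helper: ask D3D12 whether the installed driver/GPU
// exposes DXR. The query is vendor-agnostic -- it reports a DXR tier
// for any compliant implementation, NVIDIA's or otherwise.
bool SupportsDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;
    // Tier 1.0 is the baseline DXR level; a compliant stack reports
    // at least this, and existing DXR titles gate their RT path on it.
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```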
 
Most informed customers do. Nevertheless, most customers buy NVIDIA anyway; they will hear that NVIDIA supports something called RTX while AMD doesn't, which will probably only reinforce their choice.
Informed customers crawling through forums are a drop in the ocean; heck, even people who bother to read more than a review or two are a minority.
Still leaps and bounds better than not even supporting the feature.

All RT support on PC must come through DXR, which means any implementation compliant with it will be relevant.
Compliant for sure, but depending on how high AMD's and NVIDIA's next gen raise the bar, the performance might not be enough. It's barely enough on their top one or two cards in current games, which are hand-tailored for just NVIDIA's first-gen RT implementation.
 
The 5700 is being compared to the almost year-old 2070, I don't know why; probably because it looks better in benches, since the 2060S is around the same performance, while the 5700XT is around the old 2070's range but lacking a truckload of features. If the consoles are going to get mid-range parts like the current gen did, it's sure nice performance compared to the PS4/XOne, but we are getting 2060S performance, which is Nvidia's lowest-end Turing of 2019. Against Nvidia's 2020 7nm parts it won't even be close.

And it's not like NVIDIA's RTRT performance is anything to write home about, until we see what the consoles

I don't think RDNA2's RT will be that much better than Turing's current RT performance, let alone Nvidia's next gen on 7nm.
As an aside, Turing's current ray tracing abilities aren't bad at all. In the beginning things were kinda slow for lower-end Turing GPUs, but right now a 2060 non-S is quite capable in BFV. Real-time ray tracing wasn't even a thing pre-2018, and look where we are now; even consoles seem to be going after (limited) RT, mostly for audio in the PS5's case, but it's a beginning.
 
The 5700 is being compared to the almost year-old 2070, I don't know why; probably because it looks better in benches, since the 2060S is around the same performance, while the 5700XT is around the old 2070's range but lacking a truckload of features.
A whole truckload? I can't even imagine such a quantity. Could you give me about ten examples, please?
 
The 5700 is being compared to the almost year-old 2070, I don't know why; probably because it looks better in benches, since the 2060S is around the same performance, while the 5700XT is around the old 2070's range but lacking a truckload of features. If the consoles are going to get mid-range parts like the current gen did, it's sure nice performance compared to the PS4/XOne, but we are getting 2060S performance, which is Nvidia's lowest-end Turing of 2019. Against Nvidia's 2020 7nm parts it won't even be close.



I don't think RDNA2's RT will be that much better than Turing's current RT performance, let alone Nvidia's next gen on 7nm.
As an aside, Turing's current ray tracing abilities aren't bad at all. In the beginning things were kinda slow for lower-end Turing GPUs, but right now a 2060 non-S is quite capable in BFV. Real-time ray tracing wasn't even a thing pre-2018, and look where we are now; even consoles seem to be going after (limited) RT, mostly for audio in the PS5's case, but it's a beginning.

You're comparing a $350 AMD card to a $400 Nvidia card; why would you, when AMD also has a $400 card?

Then you're pivoting to a lack of features as the reason the $400 card is bad.

When Nvidia releases their 7nm cards, AMD will already have their big Navi out, probably for a while. They will reduce prices on the 5700 to compete when they have to.
 
So you see that RT is great, unlike the fake-looking non-RTX mode, but don't see that every light source except the sun is still that fake-looking non-RTX mode? o_O

Yup, that's one of the things that bothered me when I played Metro Exodus on my friend's PC that has a 2080 Ti: the lighting is extremely inconsistent. The non-RTX lighting isn't anything to write home about compared to really good non-RTX lighting in other games. That, of course, makes the RTX lighting stand out a bit more, but even then the RTX lighting, while very noticeable in some areas, was almost indistinguishable from really good non-RTX lighting in other areas.

Basically, this and other RTX "showcases" are the reason I have absolutely zero desire to get a Turing-based card (especially at their prices), as I'd immediately turn RT off due to the performance impact. While nice in some cases, the inconsistencies in implementation combined with the performance hit mean it just isn't worth it for me.

However, that said, I'm glad the product exists and that it is deemed valuable by some people, as I think RT might become more interesting in the gaming sector in the future. Something that wouldn't be possible if there weren't an RT card available for developers and researchers to purchase.

A 5700XT/2070S can't do that in every game at Ultra settings no matter how you slice it.

Neither can my 1070. And I'd still rather have my 1070 than a 2060, despite them having the same performance, because the 1070 doesn't cost as much. Also, when running games I'd be using exactly the same settings, which would not include RT.

It doesn't bother me that I can't run games full time at Ultra settings. I look at them occasionally to see how it looks, then promptly lower settings until I have a locked 60 FPS.

My most common gaming resolution now is 3200x1800 in a window. If a game won't allow me to choose that custom resolution, then it's 2880x1800, 2880x1620, 2560x1600, or 2560x1440, in that descending order, depending on what the game allows me to choose.

This means that even with a 2080 Ti, I'd likely be turning off RT while playing. The benefits it currently offers aren't any more valuable than some of the Ultra features that I regularly turn off, especially since the application of those RT benefits is so inconsistent that, to me, it actually makes the game look worse. Anything applied inconsistently is, again to me, going to look worse than a technically inferior solution that is applied consistently.

But again, I'm glad there are people who can appreciate the level of RT that Turing brings, as that is needed to keep the product out there so that the possibility of good RT in future games exists. I just wish those people wouldn't assume that everyone appreciates, or should appreciate, the level of RT that Turing brings and thus should be willing to pay a premium for it.

Regards,
SB
 
AMD cards will get closer and closer to Nvidia as their revenue stream gets better, which at this point is guaranteed.

Nvidia is not doing anything out of the ordinary with their cards. They are just battling a company that hasn't put a tenth of the money into its GPU division that Nvidia has over the last 5 years. It's no surprise that before 2014 the two were nip and tuck for more than a decade, but obviously when your company is hanging by a thread for much of a decade, it's hard to battle your competitors on two fronts.
 
Could we not have rampant Nvidia marketing and RTX-is-Jesus in the Navi thread?
And RTX-is-the-devil, and claims that RT offers no benefits in the current games that showcase it, because the feature is missing in Navi?
Edit: Let's get back on topic and leave personal opinion and likes/dislikes for another thread.
 
I agree, RT is a selling point no matter how you slice it. I am personally not interested in completely tanking my frame rate and resolution for a relatively minor effect even on high-end cards, but it's a 100% legitimate talking point.
 
Well, there's not much to be excited about here anyway. It's just a decent midrange card with no future. It's probably a better choice than a GTX 1660, assuming you want to spend more, and assuming the unrefined drivers don't let you down in your games.

Maybe the architecture can go somewhere exciting with a bigger, fancier chip someday. Probably in a console and a new card next year.
 