AMD Radeon RDNA2 Navi (RX 6500, 6600, 6700, 6800, 6900 XT)

I definitely feel the PC frustration that comes with hardware upgrades, windows weirdness, driver bugs and glitchy games. Granted that last one infects console games too. I just don’t have the patience any more.

However the idea of playing an FPS with a controller or being locked into stale console hardware, middling IQ and no opportunity to configure games to my liking is so much worse.
 
I definitely feel the PC frustration that comes with hardware upgrades, windows weirdness, driver bugs and glitchy games. Granted that last one infects console games too. I just don’t have the patience any more.

However the idea of playing an FPS with a controller or being locked into stale console hardware, middling IQ and no opportunity to configure games to my liking is so much worse.

I'm pretty much the same. I frequently find some weird annoyance with Windows or PC software, but ultimately I can get my 120+ fps, play with mouse and keyboard in everything, use Discord, etc.
 
However the idea of playing an FPS with a controller or being locked into stale console hardware, middling IQ and no opportunity to configure games to my liking is so much worse.

For me there is this additional psychological factor of having no control over that black box of a console. I feel reduced to being a consumer only.
Sadly this is creeping into PCs too. I don't want automatic updates, Cortana, the lack of any option to block some things from the internet, etc. But I have no choice. It's somehow like paying rent. It's also like helping hands forcing their aid on me because I'm presumably too dumb to operate my box on my own.
Well, it's probably an issue of getting old and of generations. Younger people mostly have no problem with backups of their lives on some Google server.

but ultimately I can get my 120+ fps, play with mouse and keyboard
I think, as graphics improve, the advantage of higher fps or resolution will shrink and go unnoticed. I'm also optimistic that at some point progress will even start reducing the demands on HW power.
If so, we would finally be talking only about input devices, which is not enough to save or justify a platform, because USB ports are everywhere.
At that point, at the latest, we should see seamless cross-platform development from low-power to high-power devices, until the term 'platform' becomes obsolete. Our geeky tech interests are doomed to become nostalgia, and there will be no more independent client devices which work without their big brothers, the servers. :)

Is Ready Player One a nightmare? Or is it what we wanted all the time? Probably both.
 
Mining explains the current situation, but not the fact that AIBs were already complaining after NVIDIA's announcements, before the cards went on sale and the mining craze hit in full force, that the "MSRPs are unrealistic" and that they possibly couldn't match them.
AMD came in later, when the craze had already started, so their prices could be explained by mining if it weren't for what had already happened on the other side first.
 
Mining explains the current situation, but not the fact that AIBs were already complaining after NVIDIA's announcements, before the cards went on sale and the mining craze hit in full force, that the "MSRPs are unrealistic" and that they possibly couldn't match them.
That was for the 3060, which hasn't been at MSRP for even one second since launch. Mining is the reason. I suspect the AIBs' complaining is also due to mining: why would they sell cards at low MSRPs when people are lining up with trucks to pay 10X more? For them it's a huge loss of profit.
 
No, Capcom listed the 2060 and 6700 XT for minimum RT, then they upgraded to the 2070 for recommended, meaning it is equal to the 6700 XT. They didn't upgrade to the 2080, the 3060 or anything else. Just the 2070.

Anyways, Capcom thinks both RTX 2070 and RX 6700XT are applicable for 4K/45fps with RT.

Only the 3070 and 6900XT can do 4K60 with RT according to Capcom.

https://www.resetera.com/threads/resident-evil-village-system-requirements-released.397822/
And how exactly is this related to how confident AMD is with their RT implementation?
 
Just as I thought. Even in AMD sponsored and console optimized Raytracing titles, Nvidia pulls far ahead of AMD. There is no magic optimization here; Nvidia simply accelerates RT faster thanks to BVH traversal in hardware. If anything, it would be extremely fishy if it were the opposite, because that would be evidence of AMD purposely slowing down the competitor's RT solution, which is thankfully not the case.
 
Even in AMD sponsored and console optimized Raytracing titles, Nvidia pulls far ahead of AMD.
That can't be shown, because PC is the only platform on which we could compare them, and on PC there is no way (yet) to utilize AMD's higher flexibility.
4A Games confirmed in a recent interview that they do custom traversal on console, which is not possible on Windows.
 
That can't be shown, because PC is the only platform on which we could compare them, and on PC there is no way (yet) to utilize AMD's higher flexibility.
4A Games confirmed in a recent interview that they do custom traversal on console, which is not possible on Windows.
They also do RT reflections only on Windows, which should tell you something about the extent of custom traversal benefits on the same h/w.
 
They also do RT reflections only on Windows, which should tell you something about the extent of custom traversal benefits on the same h/w.
Nah, that tells me nothing. My examples would be fine-grained LOD or streaming BVH, as discussed before. I'm sure AMD would perform better, because I'm unsure whether it can work on NV at all.
Of course any DXR title, or any usage close to classical RT, will run faster on NV. And of course there won't be much else than that, making my point a bit rhetorical in practice.
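For what it's worth, the "custom traversal" idea is easy to sketch in software. Below is a toy Python illustration of my own (not 4A's or any engine's actual code, and 2D boxes for brevity): a stack-based BVH traversal where the application owns the loop, so it can choose to stop at an interior node, a crude stand-in for fine-grained LOD. A fixed-function traversal unit that always descends to the leaves can't expose this choice.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Node:
    lo: Tuple[float, float]           # AABB min corner (2D for brevity)
    hi: Tuple[float, float]           # AABB max corner
    left: Optional["Node"] = None
    right: Optional["Node"] = None
    prim: Optional[int] = None        # leaf payload (primitive id)

def hit_aabb(p, d, lo, hi):
    """Slab test: does the ray p + t*d (t >= 0) hit the box?"""
    t0, t1 = 0.0, float("inf")
    for ax in range(2):
        if abs(d[ax]) < 1e-12:        # ray parallel to this slab
            if not (lo[ax] <= p[ax] <= hi[ax]):
                return False
            continue
        a = (lo[ax] - p[ax]) / d[ax]
        b = (hi[ax] - p[ax]) / d[ax]
        t0, t1 = max(t0, min(a, b)), min(t1, max(a, b))
    return t0 <= t1

def traverse(root, p, d, max_depth=64):
    """Stack-based traversal. max_depth is the 'custom' knob: a low value
    stops at coarse interior nodes instead of descending to primitives."""
    hits, stack = [], [(root, 0)]
    while stack:
        node, depth = stack.pop()
        if not hit_aabb(p, d, node.lo, node.hi):
            continue
        if node.prim is not None or depth >= max_depth:
            hits.append(node.prim if node.prim is not None else "coarse")
            continue
        stack.append((node.left, depth + 1))
        stack.append((node.right, depth + 1))
    return hits

# toy tree: two leaf boxes under one root
root = Node((0.0, 0.0), (4.0, 1.0),
            left=Node((0.0, 0.0), (1.0, 1.0), prim=0),
            right=Node((3.0, 0.0), (4.0, 1.0), prim=1))

ray_p, ray_d = (-1.0, 0.5), (1.0, 0.0)
full = sorted(traverse(root, ray_p, ray_d))      # descends to both leaves
cut = traverse(root, ray_p, ray_d, max_depth=0)  # stops at the root node
```

With DXR 1.x the loop above lives in the driver/hardware, so an app can't insert a depth cut like `max_depth`; that's the flexibility argument in a nutshell.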
 
I'm pretty sure that would violate their non-compete agreement with AMD, which no doubt goes something like: "We (AMD) will sell you (MS) these APUs at a super low price as long as you agree not to use them to compete in markets we're already in", i.e. using them in a PC.

I mean, who could compete with a decent 8-core CPU, a higher-end SSD, 16GB of RAM, and a solid GPU for $500?

Well, we don't know the contracts or how they are applied. Also, at the end of the day it's just ink on paper; contracts can be changed if both parties agree.
 
It's not all that interesting compared to what's available today anyway. In a console, though, it's a much more interesting package.

I think it would be very interesting. As a PC it won't be much, but as a gaming platform it would be. Best of both worlds.
 
As a PC it won't be much
Why not? That Ryzen 2700 I have feels like a supercomputer to me :) Though I regret I have only 16GB, but with a fast SSD, out-of-core processing would more than compensate. I think it could even make a nice workstation.

Hmmm... I don't think that's going to happen, but could PCs be cheaper if manufacturing were less fragmented? Meaning, one big company building 3 basic models in huge quantities for the masses?
I don't want that from an economic perspective, but such a company could make sure to get GPUs before those coin miners do.
 
Hmmm... I don't think that's going to happen, but could PCs be cheaper if manufacturing were less fragmented? Meaning, one big company building 3 basic models in huge quantities for the masses?
PCs wouldn't get cheaper; the big company would just make a lot more profit.
 
This is in response to the previous discussion that started with this post:
And where does that post, or AMD, claim their RT performance would be better than NVIDIA's? Being confident in their RT performance means they're happy with how their cards perform in the game with RT on.

I did misread part of your earlier post though, so I'll re-address it.
No, Capcom listed the 2060 and 6700 XT for minimum RT, then they upgraded to the 2070 for recommended, meaning it is equal to the 6700 XT. They didn't upgrade to the 2080, the 3060 or anything else. Just the 2070.
No, upgrading it to the 2070 doesn't mean the 2070 equals the 6700 XT. It might, but it doesn't necessarily mean it.
What it means is that the 2060 (Super) isn't enough for the recommended spec, but the 2070 is. The 6700 XT is also enough for the recommended spec, but since AMD doesn't have slower cards with RT support, that only tells us it's at minimum as fast as the 2070; it could be a lot faster too, maybe 2080 level, maybe 3060 level, maybe something else.
 
No, upgrading it to the 2070 doesn't mean the 2070 equals the 6700 XT. It might, but it doesn't necessarily mean it.
There is no room for this kind of convoluted explanation when Capcom lists exactly what each AMD card is capable of under the "Ray Tracing" preset.

RX 6700XT: 4K/45fps same as RTX 2070
RX 6800: 4K/45fps
RX 6900XT: 4K/60fps same as RTX 3070

[Image: Capcom's Resident Evil Village ray tracing spec chart]
 
However the idea of playing an FPS with a controller or being locked into stale console hardware, middling IQ and no opportunity to configure games to my liking is so much worse.
While I agree with all of the above, I think there's an increasing number of mitigating factors for each point:

- You can use KBM on the consoles. We'll just need 3rd-party FPS developers to increase the number of games that officially support it (Overwatch, Elder Scrolls Online and Call of Duty already support it).
- Console hardware became less stale with the launch of the mid-gens. We're now getting a new console every 3-4 years. Do you upgrade your PC much more often than that?
- Middling IQ was the main point of the discussion. It seems the ROI in IQ is ever decreasing. I wonder, once the consoles start using whatever FSR turns out to be, whether the IQ difference will become moot.
- Some of the conventionally moddable games like Skyrim are already moddable on the consoles too.

It looks like both consoles have been targeting an increasing chunk of the PC crowd. From integrated audio+video streaming support, modding, KBM, mid-gen releases to even native 1440p monitor output, it seems like both platforms are slowly converging into a single "gamer" market.



Just as I thought. Even in AMD sponsored and console optimized Raytracing titles, Nvidia pulls far ahead of AMD.
Can you link to independent and publicly available benchmarks for this AMD-sponsored raytracing title you speak of?
If those assumptions come from the RE8 spec recommendations, are we also supposed to believe the RX 6800 offers the exact same 4K/45fps performance as the RX 6700 XT, despite benchmarks showing 24-32% higher 4K+RT performance on the former? And that the 3070 is only 33% faster in 4K RT than the 2070, when other benchmarks point to a 65-70% difference?
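To make the bucketing point concrete, here's a quick back-of-envelope sketch (my own illustrative numbers and names; only the ~28% gap comes from the benchmarks mentioned above): if a spec chart only has 4K/45 and 4K/60 tiers, a card that is 28% faster can still land in the same tier as the slower one.

```python
def tier(fps, cutoffs=(45, 60)):
    """Map a raw fps figure to the highest spec tier it clears (None if below all)."""
    cleared = [c for c in cutoffs if fps >= c]
    return cleared[-1] if cleared else None

fps_6700xt = 45.0               # assumed baseline: just clears the 45 fps tier
fps_6800 = fps_6700xt * 1.28    # ~28% faster per independent 4K+RT benchmarks
# 45 * 1.28 = 57.6 fps, which still misses the 60 fps cutoff,
# so both cards end up listed in the same 4K/45 bucket.
assert tier(fps_6700xt) == tier(fps_6800) == 45
```

In other words, identical chart entries put a lower bound on a card's performance, not an exact figure.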


And where does that post or AMD claim their RT performance would be better than NVIDIAs?
It (I) did not, as clarified in a later post. Nor did AMD, who just recently stated very clearly that they know their RT performance isn't matching Ampere's and aren't trying to hide it (nor the fact that they think rasterization performance is more important and their RT performance is sufficient).

My problem here is yet another derailment of this thread into a series of "Nvidia has the best raytracing hardware" statements, built on a fake response to a claim they fabricated.
This is a speech that apparently needs to find its way into this thread every handful of pages.


I don't get what's actually being claimed here. Is this just more "Nvidia are gods" crap?
I'm surprised DLSS hasn't been put into the mix yet.
 