AMD Radeon RDNA2 Navi (RX 6500, 6600, 6700, 6800, 6900 XT)

Yeah, like AIBs adding the US import taxes straight to their prices even though the majority of cards will never touch US soil, so they're not actually paying those tariffs themselves.
(No, I don't have any facts to back that up, but unless other electronics manufacturers just decided to eat a huge hit themselves, there's no other explanation for the video card price jump, not to the extent it happened.)
Mining.
 
You're all ignoring the harsh reality of Moore's Law leveling off. We're still getting area shrinks and power reduction, but $/xtor is the problem. Pandemic, mining and tariffs are blips but physics is the long-term issue. Historical GPU perf scaling rates can only be sustained with increased prices.
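
A rough back-of-the-envelope sketch of that argument (every number below is made up for illustration, not real foundry or board pricing): if cost per transistor stops falling between nodes, a GPU with twice the transistors costs roughly twice as much in silicon, so performance per dollar stalls even while performance per chip keeps climbing.

```python
# Toy sketch of the $/transistor argument; every figure here is an
# illustrative assumption, not actual foundry or AIB pricing.

def silicon_cost(transistors_bn, dollars_per_bn_xtors):
    """Raw silicon cost at a fixed price per billion transistors."""
    return transistors_bn * dollars_per_bn_xtors

# Old regime: each node roughly halves cost/transistor,
# so 2x the transistors costs about the same.
gen1 = silicon_cost(10, dollars_per_bn_xtors=8.0)            # 10B xtors at $8/B
gen2_shrinking = silicon_cost(20, dollars_per_bn_xtors=4.0)  # cost/xtor halves

# New regime: cost/transistor is roughly flat, so 2x transistors = 2x cost.
gen2_flat = silicon_cost(20, dollars_per_bn_xtors=8.0)

print(f"gen1 silicon:         ${gen1:.0f}")
print(f"gen2, falling $/xtor: ${gen2_shrinking:.0f} (2x perf, same silicon cost)")
print(f"gen2, flat $/xtor:    ${gen2_flat:.0f} (2x perf, 2x silicon cost)")
```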
 
You're all ignoring the harsh reality of Moore's Law leveling off. We're still getting area shrinks and power reduction, but $/xtor is the problem. Pandemic, mining and tariffs are blips but physics is the long-term issue. Historical GPU perf scaling rates can only be sustained with increased prices.

Too damned right, we're looking at 2nm being the last node ever, even if we get there. There's no roadmap past High-NA EUV, among a thousand other problems. The good news is, some people with funding seem to realize this. There's private company research into carbon nanotube transistors, as well as a really promising-sounding DARPA program funding another project.

CNTs offer much higher clockspeeds for the same current/heat, and while not the insane speeds of graphene, they also seem much closer to practical. I'm particularly interested in the DARPA program, which aims to fund the basic research and industrial processes foundries could then use to create CNT chips, while also aiming to use stacked logic/RAM at the same time. Which makes a lot of sense for clockspeeds measured in the tens of gigahertz. Current processors are already constantly struggling against latency, and often bandwidth, as it is; trying to raise clockspeeds severalfold or more without offering a new solution for latency and bandwidth is asking for severe underutilization.
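
To put some toy numbers on that underutilization worry (the fractions below are assumptions for illustration, not measurements of any real chip): if a fixed share of runtime is memory latency that doesn't improve, a 10x clock only speeds up the compute share, so the overall gain is far smaller unless latency and bandwidth get their own solution.

```python
# Amdahl-style toy model of why multi-10GHz CNT cores need a latency fix.
# The 60/40 compute/memory split is an illustrative assumption.

def overall_speedup(clock_multiplier, compute_fraction):
    """Speedup when only the core-bound share of runtime scales with clock."""
    memory_fraction = 1.0 - compute_fraction  # assumed not to improve
    new_time = compute_fraction / clock_multiplier + memory_fraction
    return 1.0 / new_time

for clk in (2, 5, 10):
    print(f"{clk:>2}x clock -> {overall_speedup(clk, compute_fraction=0.6):.2f}x overall")
# Even a 10x clock only yields ~2.2x here, which is why stacking RAM on the
# logic (cutting latency, adding bandwidth) is part of the same program.
```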
 
Which coincidentally got me thinking: if AMD has chiplets and TSMC has ultra-dense SRAM caches, why not make the big caches even bigger for the CPU? The GPU is arguably already maxed out on cache size relative to its needs, but surely the CPU could use a giant 256MB+ LLC with ease.

Also, most relevant to this thread, I don't see why they're using a cache memory structure for the BVH. I get why the cache memory itself is being used, but treating it as a cache for such a giant working set means latency is severely hurt by sorting through all the tags. Why not use the static analysis of the code that's already assumed to find the BVH, and virtualize the cache as a standard memory address range when it comes to the BVH? That would lessen the latency penalty purely through whatever microcode/drivers control that, without any hardware changes. Also use that same static analysis to see if you can drive whatever separate voltage planes/clocks control the areas that slow things down the most during RT as high as they'll go. If they're redesigning cooling for higher power delivery anyway...
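
Roughly what I mean, as a purely conceptual sketch (the structures and sizes below are invented for illustration; this is not how AMD's Infinity Cache, microcode or drivers actually work): a tagged cache has to locate and compare tags before it knows whether it even holds the line, while a region known up front to hold the BVH can be indexed directly like ordinary memory.

```python
# Conceptual contrast between a tagged cache lookup and a directly addressed
# region pinned to the BVH. Sizes and structures are invented for illustration;
# this is not RDNA2 hardware or driver behaviour.

LINE_SIZE = 64
NUM_SETS = 1024

class TaggedCache:
    def __init__(self):
        # sets[set_index] maps tag -> cached line
        self.sets = [dict() for _ in range(NUM_SETS)]

    def read(self, addr):
        set_index = (addr // LINE_SIZE) % NUM_SETS
        tag = addr // (LINE_SIZE * NUM_SETS)
        # A tag lookup/compare happens on every access before hit/miss is known.
        line = self.sets[set_index].get(tag)
        return ("hit", line) if line is not None else ("miss", None)

class PinnedBvhRegion:
    def __init__(self, size_bytes):
        self.data = bytearray(size_bytes)  # BVH placed here up front

    def read(self, offset):
        # Static analysis already told us this address range is the BVH:
        # no tag compare, just an index into the region.
        return self.data[offset:offset + LINE_SIZE]
```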
 
The good news is, some people with funding seem to realize this. There's private company research into carbon nanotube transistors, as well as a really promising-sounding DARPA program funding another project.

Yeah, Bob Colwell was beating that drum a lot in various EE talks about a decade ago; I assume he's left DARPA by now. He's definitely the polar opposite of Jim Keller's optimism.
 
Which coincidentally got me thinking: if AMD has chiplets and TSMC has ultra-dense SRAM caches, why not make the big caches even bigger for the CPU? The GPU is arguably already maxed out on cache size relative to its needs, but surely the CPU could use a giant 256MB+ LLC with ease.
AMD Rome (2019) features 256MB of L3 in 16 partitions. AMD Milan (2021) has the same amount in 8 partitions. It's not a true LLC, but the amount of SRAM is there. Going forward, with increasing density and advanced packaging techniques, the need for less data movement and more locality exploitation should bring even bigger memory pools.
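
For a rough sense of the silicon a cache that size takes (the ~0.027 um^2/bit figure is the commonly cited TSMC N7 high-density SRAM bitcell; the 2x overhead factor for tags, decoders and routing is just my assumption):

```python
# Rough area estimate for big L3/LLC pools built from 7nm-class SRAM.
# Bitcell density and overhead factor are assumptions, not AMD's real figures.

MIB = 1024 * 1024

def sram_area_mm2(capacity_mib, um2_per_bit=0.027, overhead=2.0):
    """Estimate macro area: raw bitcells times an overhead factor."""
    bits = capacity_mib * MIB * 8
    return bits * um2_per_bit * overhead / 1e6  # um^2 -> mm^2

for cap in (32, 128, 256):
    print(f"{cap:>3} MiB ~ {sram_area_mm2(cap):.0f} mm^2 (with assumed 2x overhead)")
# 256 MiB lands around ~116 mm^2 under these assumptions; big, but plausible
# as stacked or chiplet SRAM rather than area on the compute die itself.
```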
 
Hey, I have a great shady business idea: buy some Xboxes, hack them to install Win10, and resell them as gaming PCs. You know, like what happened with the PC -> Macintosh clones 20 years ago. :D

I would order! No way I'm gonna get a next-gen GPU otherwise this year, it seems :/

I have no idea why MS is not doing this. The Xbox already uses Windows and DirectX... making it a real PC is just a little work.
 
I have no idea why MS is not doing this. The Xbox already uses Windows and DirectX... making it a real PC is just a little work.

Why would they do that? It's not like they have a huge inventory waiting in stock.
 
I have no idea why MS is not doing this. The Xbox already uses Windows and DirectX... making it a real PC is just a little work.

I'm pretty sure that would violate their non-compete agreement with AMD, which no doubt goes something like "We'll (AMD) sell you (MS) these APUs at a super low price as long as you agree not to use them to compete in markets we're already in", AKA using it for a PC.

I mean, who could compete with a decent 8-core CPU, a higher-end SSD, 16GB of RAM, and a solid GPU for $500?
 
It's not all that interesting compared to what's available today anyway. In a console it's a much more interesting package though.
 
It's not all that interesting compared to what's available today anyway. In a console it's a much more interesting package though.
Are you joking? Nothing is available. I could get a 6900XT for the price of a used car. (I'm not going for the 6700 because I either want a 6800 for games or something smaller, which would be enough for dev.)
You sound like PC gaming == high-end master race, but that's not the case. On average, people have always had PC specs similar to the current consoles.
Actually I'm more worried the situation becomes the reverse, because couch gamers get more chips than chair gamers.
Though, who cares. High specs no longer seem to make such a big difference anyway.
 
RE8 requires an RTX 2070 or RX 6700XT as recommended specs to enjoy RT at 1080p60. So much for AMD being confident in their RT implementation.

https://www.dsogaming.com/news/resident-evil-village-pc-requirements/

Well, it's the lowest-end card they've announced that supports RT, so I wouldn't read a lot into it. If you wanted to take a bigger dig, you could also say the minimum is 2060/6700XT.

More curious to me is that the recommended for rasterisation is a 1070 or 5700. The latter's normally ahead of a 1080, isn't it?
 
RE8 requires an RTX 2070 or RX 6700XT as recommended specs to enjoy RT at 1080p60. So much for AMD being confident in their RT implementation.

https://www.dsogaming.com/news/resident-evil-village-pc-requirements/
Erm, how exactly does the RX 6700 XT being the recommended spec suggest AMD isn't confident in their RT implementation?
As @Qesa pointed out above, it's the slowest RT-supporting card they've announced, and even if we disregard that, I simply can't figure out the logic behind your claim.
 
Actually I'm more worried the situation becomes the reverse, because couch gamers get more chips than chair gamers.
This is actually an important point that Scott Herkelman raises in his PCGamer live conversation.

He said their market research found that people start to mentally distance themselves from buying PC gaming hardware - and from PC gaming itself - if they go through a long period of not being able to buy new stuff. They observed it in 2017 and they're observing it now.
I.e. mining is taking PC gamers away for good, as apparently there's a tendency not to come back when the market stabilizes.

I'd guess there's also a cyclical tendency for PC gaming marketshare to decrease for a couple of years whenever a new generation of consoles comes out. However, this time it's a perfect storm:

1 - the new consoles are offering high-end gaming experiences;
2 - PC CPUs and graphics cards that would be equivalent to said consoles are overpriced. For example, the 6700XT released with a similar MSRP to the PS5 and SeriesX, while back in 2013 the HD7870 GHz was going for less than $280;
3 - The CPUs/GPUs that would offer a substantial upgrade over the consoles are nowhere to be found (and/or reaching ridiculous prices).

So while AMD and Nvidia are making record profits out of their GPU sales these quarters, their serviceable available market is shrinking.


Speaking on a personal level, I had planned to do a major upgrade at the beginning of this year (probably a Ryzen 5900 + the best deal between Navi21 and GA102 at around $600), but of course I couldn't get any of those.
Nowadays I'm progressively less inclined to deal with the miner/scalper/availability shitshow, and I certainly don't have the time or patience to follow availability on select Twitter accounts and rush to an e-store just to see the "unavailable" red sign on a product that was already way pricier than my initial budget.
If push comes to shove and they start releasing games I want to play that aren't available on the PS5 (e.g. Elder Scrolls) and don't play well on my old PC hardware, I think I'll just give up on PC hardware and buy an Xbox.


Though, who cares. High specs no longer seem to make such a big difference anyway.
It usually doesn't, at the start of a new generation of consoles (which tends to increase the baseline by ~8x over the previous generation), but I do agree the "graphical ROI" has been going down.
Regardless, it's also a good thing for us consumers that high/top-end graphics cards don't provide a big difference over mid-range offerings, because the price of high-end GPUs has been steadily rising at a pace way above inflation and manufacturing cost.



More curious to me is that the recommended for rasterisation is a 1070 or 5700. The latter's normally ahead of a 1080, isn't it?
Yes, the 5700 is around 25% ahead of the 1070, and even the 5600XT is some 10-15% faster. It's a bit strange they're not mentioning the Vega cards that are contemporaneous with the Pascal models. It's a cross-gen game, so there's probably a very optimized path for GCN GPUs that puts the Vega 56/64 on par with the 1070/1080.

Though as we've been seeing, these system requirements lists often have weird, nonsensical comparisons. Cyberpunk 2077's recommended system requirements list something like a 4-core Skylake from 2015 or a 6-core Zen 2 from 2019.
Don't read too much into it.


I simply can't figure out the logic behind your claim.
I bet you can.
 
This is actually an important point that Scott Herkelman raises in his PCGamer live conversation.

He said their market research found that people start to mentally distance themselves from buying PC gaming hardware - and from PC gaming itself - if they go through a long period of not being able to buy new stuff. They observed it in 2017 and they're observing it now.
I.e. mining is taking PC gamers away for good, as apparently there's a tendency not to come back when the market stabilizes.
Yeah, those are some really good arguments.
In the long run, or looking at it from some distance, there's more to it than that:
* Looking at that huge box under my desk, it already feels old-school. It's even worse if somebody maintains this expensive box only to play games. Considering the trend should be towards smaller boxes, not bigger ones, this argument now hits consoles as well. The PS5 really is a ridiculous, ugly thing. This can't be the future; it doesn't feel modern or efficient.
* Tech-oriented marketing does not help: 'Look! We have RT now! For only $1000!' - 'Meh - can't spot a difference at all. Try harder.' Though I'm not sure the majority of gamers even pays attention, but they of course question the prices and ask what they're getting for them. The arguments of 4K and RT are too weak.
* Most important: PC lacks exclusive games developed for the platform, utilizing its immersion advantages of a close display and mouse controls. I feel PC gamers are dominated by memories of Half-Life or Diablo, now complaining about console ports while not being heard.

This latter point is the one I fail to comprehend. The market is big enough, but only indies target it specifically, which defies the argument that game production is just too expensive, so cross-platform is the only option.
It feels fragile, and the current chip shortages might indeed be enough to hand PC gaming a death sentence after some further years of struggle.

The obvious solution for now can only be to keep minimum specs low and make good games.
One option I think is underutilized is platform-dependent tuning and design. I don't mean graphics options but gameplay itself. The term 'scaling' is currently meant only technically. That's not enough, I think.
 
Erm, how exactly does the RX 6700 XT being the recommended spec suggest AMD isn't confident in their RT implementation?
As @Qesa pointed out above, it's the slowest RT-supporting card they've announced, and even if we disregard that, I simply can't figure out the logic behind your claim.
No, Capcom listed the 2060 and 6700XT for minimum RT, then upgraded to the 2070 for recommended, meaning it's considered equal to the 6700XT. They didn't upgrade to the 2080, the 3060 or anything else. Just the 2070.

Anyway, Capcom thinks both the RTX 2070 and RX 6700XT are suitable for 4K/45fps with RT.

Only the 3070 and 6900XT can do 4K60 with RT according to Capcom.

https://www.resetera.com/threads/resident-evil-village-system-requirements-released.397822/
 
The increasing cost of PC hardware relative to consoles combined with the decreasing level of improvement they offer has to be a concern for PC tech companies.
 