Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

Not to mention the genius idea of Xbox's purchasing scheme: selling games and DLC on the Microsoft Store using points rather than a flat fee. You were forced to buy points packs, often ending up with wasted points. You have 200 and the thing you need costs 800? Well, there's no 600-point package, so you buy the 1000 pack, end up with 1200, and find yourself with 400 left over that you don't need. Brilliant.

Yup. As a gamepad PC player I am glad MS established XInput and the Xbox 360 controller as the standard for PC gamepads, no doubt. That doesn't forgive the multitude of fuck-ups with GFWL overall, though. It was a very poorly thought out service that deserved to die long before it did.
 
You have a very romanticized view of GFWL.
I'm not saying it was perfect, I'm saying it moved PC gaming in the right direction. Basically everything you pointed out as a negative was either fixed (like paid multiplayer) or was abandoned by developers (like crossplay) through no real fault on MS's end. Microsoft gave them the tools; they chose not to use them. Which is really the sad thing about GFWL. The games that used it had great results in many cases, but it sounds as if your issues were with the way some things were implemented at launch, or with features that developers never used.

Halo 2 only on Vista? Yeah, that game had an arbitrary system requirement. That happens all the time. There were plenty of XP games that ran on 98 just fine once you got them installed.
 
Hardware Unboxed Looks at RX 6800 vs 3070

This is largely focused on recent titles regarded as VRAM-constrained, and in particular looks at them after they've received patches that supposedly address that. What I particularly appreciate about this video is that it shows actual gameplay footage through each benchmark run, along with measured frame times, not simply a summary of the 1% lows/average. The recorded runs are particularly important, for reasons you'll discover.
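For anyone unfamiliar with how those summary numbers relate to a frame-time log, here's a minimal sketch of one common way to derive them - my own illustration, not necessarily HUB's exact method, and definitions of "1% low" vary between outlets:

```python
# One common way "1% lows" are derived from a frame-time log; some outlets
# instead use the 99th-percentile frame time, so treat this as illustrative.
import statistics

def summarize(frametimes_ms):
    """frametimes_ms: per-frame render times in milliseconds."""
    avg_fps = 1000.0 / statistics.mean(frametimes_ms)
    slowest = sorted(frametimes_ms, reverse=True)
    worst_1pct = slowest[: max(1, len(slowest) // 100)]
    low_1pct_fps = 1000.0 / statistics.mean(worst_1pct)
    return avg_fps, low_1pct_fps

# Toy data: mostly ~60 FPS frames plus one big stutter spike.
print(summarize([16.7] * 200 + [48.0]))  # avg barely moves, 1% low tanks
```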

Of particular note is that The Last of Us at 1080p/High still sucks with 8GB, even on recent patches that supposedly reduce VRAM usage a bit. While I don't agree with the comment that "if most gamers had 16GB of VRAM there would be little to complain about with performance" (the game is massively CPU- and render-bound regardless of VRAM constraints), there have been arguments presented here that High fits fine within an 8GB buffer, even at 1440p. At least in these tests, that does not appear to be the case. Playable? Yeah, sure, I guess - but that consistency is shit.



Note that they do look at a suite of recent titles in the latter part of the video that perform well within 8GB, to give a more complete picture of recent releases, and they also state that 8GB is perfectly fine and will be for a while - but it's entry level now. The point of contention is that it's being shipped on $500+ GPUs.

Bullshit video, the 6800 is a performance tier above the 3070 and that accounts for a lot of the results in this video, not the VRAM.

A Plague Tale: Requiem, for example, uses ~6GB of VRAM at native 4K on max settings, and the reason his results show the 6800 being faster has nothing to do with VRAM but the fact that it's just a faster GPU.

He should have used the 16GB 6700XT, as that's the same tier as the 3070/Ti.
 
Partially true, but the 3070 dipping into the teens while the 6800 remains above 30 isn't because the 6800 is 3x faster. It's likely a VRAM bottleneck.
 
Bullshit video, the 6800 is a performance tier above the 3070 and that accounts for a lot of the results in this video, not the VRAM.

It absolutely does not. First off, they're comparing the 6800 because that's its actual retail street price vs the 3070; you can easily go to Newegg to confirm this. The 6800 is actually cheaper here.

Secondly, no - the performance data is perfectly clear. It's the VRAM. That's the bottleneck. You can tell by the frame times and the reported VRAM usage. This is not about a sustained performance difference (that's shown in the latter part of the video); it's the massive stuttering, and when a game doesn't stutter, it's because the problem games simply aren't loading in higher-res textures at all.
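To make "you can tell by the frame times and the reported VRAM usage" concrete, this is the kind of check I mean. The thresholds and the sample log are made-up assumptions for illustration, not data from the video:

```python
# Rough sketch: flag frames where a frame-time spike coincides with VRAM
# sitting near the card's limit. Threshold values are assumptions.
CARD_VRAM_MB = 8192
NEAR_FULL_MB = CARD_VRAM_MB - 300  # assume <300 MB slack means "at capacity"

def vram_stutter_events(samples, spike_factor=3.0):
    """samples: list of (frametime_ms, vram_used_mb) pairs from an overlay log."""
    median_ft = sorted(ft for ft, _ in samples)[len(samples) // 2]
    return [s for s in samples
            if s[0] > spike_factor * median_ft and s[1] >= NEAR_FULL_MB]

log = [(16.7, 7950), (17.0, 7980), (95.0, 8110), (16.8, 7940)]
print(vram_stutter_events(log))  # -> [(95.0, 8110)]: spike at full VRAM
```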

A Plague Tale: Requiem, for example, uses ~6GB of VRAM at native 4K on max settings, and the reason his results show the 6800 being faster has nothing to do with VRAM but the fact that it's just a faster GPU.

They're using RT. You can clearly see the VRAM readings from Afterburner in the video; when they go to 1440p it clearly runs out of VRAM and gets massive stutters.

You can argue that maybe the settings they test aren't realistic (and bear in mind, RT should also be hurting the Radeon), but if you're arguing that the performance delta in those games is just due to the power differential of those cards (and again, that's irrelevant when we're talking about actual prices), you clearly didn't watch the video.

He should have used the 16GB 6700XT, as that's the same tier as the 3070/Ti.
There is no such thing as a 16GB 6700XT; it's 12GB. Secondly, in price it's far cheaper than the 3070 - at actual retail, the 3060 is more its competitor.

I mean, this isn't really controversial; most everyone has acknowledged that in raster performance, from the midrange on down, Radeon has been far cheaper on actual shelves for quite a while now. Nvidia still commands a premium due to DLSS and RT - well, provided you're not VRAM-limited, I guess.
 
It absolutely does not.
Yes it does; we were only talking the other day about how well optimised the game is in regard to VRAM use.
First off, they're comparing the 6800 because that's its actual retail street price vs the 3070; you can easily go to Newegg to confirm this. The 6800 is actually cheaper here.
So his video is bullshit then; he needs to isolate the VRAM as the cause of the performance issues and not let the actual processing performance of the core influence the results.
Secondly, no - the performance data is perfectly clear. It's the VRAM.
It's not in A Plague Tale.
That's the bottleneck. You can tell by the frame times and the reported VRAM usage. This is not about a sustained performance difference; it's the massive stuttering, and when a game doesn't stutter, it's because the problem games simply aren't loading in higher-res textures at all.

His whole video is junk, and he even confirms the 6800 had an MSRP that was 16% higher than the 3070, so it's not a fair comparison at all.

An RTX 3080 12GB would have been a better and fairer comparison.
 
Yes it does; we were only talking the other day about how well optimised the game is in regard to VRAM use.

Not using RT.

So his video is bullshit then; he needs to isolate the VRAM as the cause of the performance issues

He did. It's clearly in the video and the resulting data. Again, you obviously didn't watch the video, as they go on to test eight games that aren't VRAM-limited to isolate exactly that!

His whole video is junk, and he even confirms the 6800 had an MSRP that was 16% higher than the 3070.

As was explained at the start of the video, and as I just explained as well: who gives a fuck about MSRP? I care about what I can actually buy the card for. The 6800 is the same price, or cheaper. That has been the case for many months now; the 6800/6700 XT/6600 have all significantly outperformed Nvidia's comparable offerings on raster price/performance since fall of last year. This is not news.

An RTX 3080 12GB would have been a better and fairer comparison.

Show me the links. At Newegg they're often more than double the 6800's price, and the only ones even slightly below that are 3080 12GB models shipping from Hong Kong. Like, what are you doing, man?
 
"I care about what I can actually buy the card for."

...Good for you, but that's irrelevant for a video like this, as 'what people can buy them for' is too wild a variable, so MSRP is a better fit, especially as card prices varied massively by country.

It's not 'too wild a variable'; we can see the prices right now. The price disparity, or lack thereof, is consistent across many regions - UK, US, Canada, Australia. This has been the case for months in this price bracket and below; you're just out of touch on this.

This is not some 24-hour fire sale. Yes, if someone compared against, say, a flash sale at Microcenter that required you to physically pick the card up and expired in two days, that would obviously be disingenuous for a price/performance comparison, as it's a very narrow exception that isn't applicable to the vast majority of consumers. This is not that. Radeons are at this price because Nvidia holds commanding mindshare and still has some significant technical advantages, in spots. AMD should update their MSRPs to match what retailers are discounting the cards to in order to actually get them off the shelves, yes.

But the facts remain - as they have for 6+ months now - if you want something in the 3070 class or below and don't particularly care about RT and DLSS (or care more about VRAM), Radeons have a very compelling price/performance ratio on actual store shelves.
 
In my case, I disabled Steam's hardware acceleration and run a super duper clean system. Even if Steve uses a super duper clean system, he wouldn't really bother with disabling Steam's hardware acceleration. To be fair, I also reduced "visual effects" and "volumetrics" to low. So I took three important steps that increased my 1% lows at 1440p with the critical textures (environment, character and dynamic objects) on high.

A Plague Tale: Requiem has higher VRAM usage at ultra, but it's actually super "tame" at the high preset/high texture setting, and still produces good quality textures.

Ray tracing performance at 1440p DLSS Quality, however, is not that good. Blurrier image quality, 35-45 FPS average... not worth it IMO. The raw performance is simply not there. I personally would enjoy playing at a locked 30/40, but there is a much better alternative. The game is super demanding on rasterization.


So I prefer 4K DLSS Performance + 60 FPS instead. Sharp, pristine, cleaner image quality, 55-65 FPS average... yeah.
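The arithmetic behind that preference, for anyone curious: DLSS renders internally at a fraction of the output resolution, using the commonly documented per-axis factors below (games can override these), so 4K DLSS Performance actually starts from more input pixels than 1440p DLSS Quality:

```python
# Standard DLSS 2.x per-axis render-scale factors (widely documented,
# though individual games can deviate).
DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_res(w, h, mode):
    s = DLSS_SCALE[mode]
    return round(w * s), round(h * s)

print(internal_res(2560, 1440, "Quality"))      # (1707, 960)  -> ~1.64 MP input
print(internal_res(3840, 2160, "Performance"))  # (1920, 1080) -> ~2.07 MP input
```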


Both work anyway. Ultra in this game will break 8GB cards, it's true. I never even bothered with it. The second I saw high textures looking very good and respectable, I never looked back. Gamers like me will be happy with 8GB for a while, as long as devs like Iron Galaxy/ND can figure out what Asobo did to gracefully reduce the VRAM requirements of textures without making them look like PS2 assets.

I will later do a 1440p/native-focused video in TLOU with the 1% metrics visible. I can guarantee you that on my end, with my tweaks and special settings, I do not get 1% lows that drop into the 30s. I really wish I had a capture card though. Recording without impacting performance is nearly impossible on such a tight budget. :/
 

Thanks, appreciated. ShadowPlay is weird for me; my uploaded videos on YouTube often show 'stuttering' that isn't in gameplay and is somewhat less visible in the native MP4, but the combo of YouTube + ShadowPlay-recorded video butchers the frame times for some reason, and always has. Other sources, such as footage from my PS5 uploaded to YT, don't have this problem. It doesn't really matter much for recorded frame times if your GPU budget has ~3-5% to spare, but it's annoying.

I also just upgraded my rig to 32GB of DDR4-3200 (from 16GB at 2800), and I'm a little surprised by the meagre improvement in TLOU at 1440p High/DLSS Performance. It's still CPU-limited so often on my i5-12400F.
 
Thanks. 1440p high I won't even try, nor do I see any reason to. PC gaming is supposed to be scalable, and they actually had the decency to break textures up into categories so you can sacrifice the ones you wouldn't notice much (not many games do that).

Realistically, no PC in existence can have 8057 MB of VRAM free. Quite impossible; DWM alone will always take around 150-300 MB. The configuration of settings below will create stutters and low 1% lows. It is unavoidable.

[screenshot: in-game settings configuration and its VRAM estimate]


Reducing visual effects streaming quality is my first step. Why? Because the effects did not look that bad to me. Even if they did, if it means I get high quality character and environment textures with decent frame times otherwise, it is a worthy sacrifice. Why burn everything down just for... fire effects?

[screenshot: visual effects streaming quality setting]


And volumetrics. Frankly, they look decent even at the low setting, and they also grant you some rasterization performance back, which is welcome.

[screenshot: volumetrics quality setting]


Turning off motion blur also reduces VRAM usage a bit. But I don't like motion blur anyway (hello, LCD screen).

That said, I will admit pushing 7600 MB of game application usage is still extreme. Most 8-gig users, from what I'm seeing, have idle free VRAM of around 6.9 to 7.2 GB. I personally can have 7.5-7.6 GB of free VRAM, so it works for me. Enabling DLSS Quality on top of these settings, however, will put you in the safe haven of 7.2 GB.

So it depends on the user, too. And some scalability is there. Just my 2 cents.
 
As a follow-up, here is my performance at native 1440p with only two settings changed from the High preset (volumetrics and visual effects). After 5 minutes of gameplay, a 54 FPS average and 44 FPS 1% lows were recorded. Overall gameplay was very smooth. Without recording, the averages and lows would be a bit higher, I'd have to guess.


I get much higher 1% lows at native 1440p than Hardware Unboxed gets at 1080p/High (exact same benchmark location).

And of course, 1080p High (no tweaks, just the high preset itself) is a cakewalk on my end:


I'm definitely not getting 1% lows near 30 FPS, or extreme stutters or stalls.

Heck, the in-game meter says 6.5 GB of game application usage. Color me surprised: the game never breaches 6.6 GB and dynamically adjusts itself between 6.4 and 6.6 GB. The fact that total VRAM usage never goes past 7 GB (1 GB of unused, free VRAM) proves that the "total VRAM usage" metric shown by the game is not a parameter you have to take seriously, nor does it determine whether you will get a stable experience. That bar tells me 1080p High puts me around 8089 MB, but total system VRAM usage never goes past 7 GB.

[screenshot: in-game VRAM usage meter]




[screenshot: overlay VRAM usage reading]


I really don't know how they got a "total" of 7.9 GB usage at 1080p/High in this spot. Mine never goes past 7 GB at this preset/resolution...
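If anyone wants to sanity-check the in-game meter against reality themselves, here's a small sketch using the NVML Python bindings (pip install nvidia-ml-py) that reads both the card-wide total and the per-process figures. Per-process numbers can come back as N/A under Windows' WDDM driver model, so treat it as illustrative:

```python
# Compare the card-wide VRAM figure against per-process usage via NVML.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
print(f"card-wide: {mem.used / 2**20:.0f} MB used of {mem.total / 2**20:.0f} MB")

# Everything holding a graphics context (game, DWM, browser, Discord...) shows up here.
for p in pynvml.nvmlDeviceGetGraphicsRunningProcesses(gpu):
    used = "N/A" if p.usedGpuMemory is None else f"{p.usedGpuMemory / 2**20:.0f} MB"
    print(f"pid {p.pid}: {used}")

pynvml.nvmlShutdown()
```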
 

Good to hear. The game says it will use over 13GB on my 4070 Ti at all-Ultra and native 3840x1600, but that includes 2.5GB reserved for the OS/apps lol. I can bring it to about 12GB with DLSS Quality, but I might try running at native and see how it compares.
 
You should take a look at your idle VRAM usage in Task Manager before running the game. I was surprised how much bloat my friend had on his PC; his idle VRAM usage was actually 2.5 GB on his 10 GB 3080. He had:

- Epic games launcher
- A browser that is always open
- Steam
- Discord
- Some Razer software

The list goes on. Literally, his idle VRAM usage was 2.5 GB - more than what the devs assumed in this case. Turning stuff off really did help him; he was unaware that so much of his VRAM was already being used by other apps.

I told him: why let Epic Games run in the background and consume VRAM/resources if you're not going to use it while playing a game on Steam? It really makes no sense. The browser, however, is another beast. Multitasking is an important aspect of PC gaming, so I can't ask people to shut off their browsers. I personally do, not only for performance reasons but also to focus on the game itself. That's the nature of the beast. If you had limited RAM, you would have to cut down on multitasking severely; the same is true of VRAM, it's just that people aren't aware of it yet.

So set your settings according to the free VRAM you have. I'm sure you won't have any major problems. Maybe aim 100-150 MB below what you have free; the game likes to use an extra 100-150 MB of VRAM on top of the game application usage number.
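As a toy example of that budgeting rule - the inputs are just the figures discussed in this thread, not measurements of mine:

```python
# Toy arithmetic for the VRAM budgeting rule described above.
total_vram_mb = 8192
idle_used_mb = 600   # what Task Manager / an overlay shows before launching
headroom_mb = 150    # the game tends to grab ~100-150 MB beyond its own meter

free_mb = total_vram_mb - idle_used_mb   # VRAM actually available to the game
budget_mb = free_mb - headroom_mb        # what to aim the in-game meter at
print(f"free: {free_mb} MB, settings budget: ~{budget_mb} MB")
```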

[screenshot: free VRAM reading]


I have a whopping 7.8 GB of VRAM free, for example, and the game happily uses all of it in my case.
 
How much does the L2 cache matter? I hear they increased it by some crazy amount compared to the 3000 series.
 