Digital Foundry Article Technical Discussion [2020]

One of my PCs is still on a GTX 670 (yes, I know, eight years old :p); it does just about every game better than my base PS4 does. I'd consider that GPU lower than low end in 2020. I tell you, things have been in a much worse state in other generations; people had to upgrade constantly to keep up. My Ti 500 had to be upgraded rather fast to a 9700 Pro, and that 9700 had to be upgraded far too quickly as well.
A family member has a 7970 GHz Edition, and yes, its performance has held up better than my Kepler.
Also, this new SSD craze is quite old; the technology has existed on PC since the invention of the wheel. Load time comparisons are going to be interesting, but this video shows that there's little difference between PCIe 4.0 NVMe, PCIe 3.0 NVMe and a typical SATA SSD. So, I wonder how the consoles fare.
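If anyone wants to sanity-check their own drives before the console comparisons land, here's a rough Python sketch for timing a big sequential read. The file path is just a placeholder, and you'd want a file larger than your RAM (or a cold cache), otherwise the OS page cache flatters the result.

```python
import os
import time

TEST_FILE = r"D:\benchmarks\big_test_file.bin"  # hypothetical path -- point at a large file on the drive under test
CHUNK = 8 * 1024 * 1024                         # 8 MiB reads

def sequential_read_gbps(path: str) -> float:
    size = os.path.getsize(path)
    start = time.perf_counter()
    with open(path, "rb", buffering=0) as f:    # unbuffered at the Python level; the OS cache still applies
        while f.read(CHUNK):
            pass
    elapsed = time.perf_counter() - start
    return size / elapsed / 1e9                 # GB/s

if __name__ == "__main__":
    print(f"~{sequential_read_gbps(TEST_FILE):.2f} GB/s sequential read")
```

Roughly speaking you'd expect around 0.5 GB/s from a SATA SSD versus several GB/s from a decent NVMe drive, which is exactly the gap the video suggests barely shows up in actual game load times.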

 
That's because of all the additional overhead and baggage that comes with PC filesystems and I/O. We've talked a lot about this in the console tech threads. It's also why Microsoft created a new technology area for consoles, called DirectStorage, and will be bringing it to PCs.

So this isn't a like-for-like thing; you can't compare it at all yet, because it doesn't exist for PCs yet and next-gen consoles aren't readily available either.
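To make the "overhead and baggage" point concrete: it's not just raw bandwidth, it's also the cost per request in the traditional file stack. This isn't DirectStorage (there's no public PC API for it yet), just a crude Python illustration with a placeholder path; reading the same file as thousands of tiny requests versus a handful of large ones shows how much time goes to the I/O path itself rather than the drive.

```python
import time

TEST_FILE = r"D:\benchmarks\big_test_file.bin"  # hypothetical path

def timed_read(path: str, chunk_size: int) -> float:
    start = time.perf_counter()
    with open(path, "rb", buffering=0) as f:    # each read() is roughly one request to the OS
        while f.read(chunk_size):
            pass
    return time.perf_counter() - start

if __name__ == "__main__":
    # Same total data each run; only the number of requests changes.
    for chunk in (4 * 1024, 64 * 1024, 8 * 1024 * 1024):   # 4 KiB, 64 KiB, 8 MiB
        print(f"{chunk // 1024:>5} KiB requests: {timed_read(TEST_FILE, chunk):.2f} s")
```

Even with the data sitting in the OS cache, the tiny-request case is far slower, and per-request software cost is a big part of what DirectStorage is supposed to cut down.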
 

Here they compare loading times; it's over 22 seconds on the consoles and basically no loading at all on a PC (assuming an NVMe SSD).
I copied the link/timestamp where they start talking about loading in Doom and how they achieved it. So, we're already there regarding loading times.

That's because of all the additional overhead and baggage that comes with PC filesystems and I/O. We've talked a lot about this in the console tech threads. It's also why Microsoft created a new technology area for consoles, called DirectStorage, and will be bringing it to PCs.

So this isn't a like-for-like thing; you can't compare it at all yet, because it doesn't exist for PCs yet and next-gen consoles aren't readily available either.

Yea, things will only get better from here.
 
DF was right: black levels in RE2 Remake are modified for artistic reasons. If you have a monitor with good contrast/good HDR, you can see it if you get, like I did, the Noir DLC for Claire, which gives her a new costume but also adds a noir filter to the game that you can enable/disable.

That filter shows pitch-dark blacks, and the grey-looking blacks and the occasional banding issues just disappear. It looks soooo good, but the game just becomes black and white; it'd be perfect to have those perfect black and white tones with everything else still in color.
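For anyone curious what a grade like that is doing under the hood, here's a generic sketch (not the actual RE Engine filter, just the idea): convert to Rec. 709 luma, then crush everything below a floor to true black. Mapping all the in-between near-black shades to 0 is also why the grey-ish blacks and the banding in dark scenes vanish.

```python
def noir(rgb, black_floor=16):
    """Generic 'noir' style grade: Rec. 709 luma, then crush near-blacks to 0."""
    r, g, b = rgb
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b   # Rec. 709 weights, 8-bit values
    level = 0 if luma <= black_floor else round(luma)
    return (level, level, level)                   # neutral greyscale output

# Illustrative 8-bit RGB samples, not values captured from the game
samples = [(12, 10, 14), (70, 65, 60), (200, 190, 180)]
print([noir(px) for px in samples])
# -> [(0, 0, 0), (66, 66, 66), (191, 191, 191)]: the murky near-black goes to true black
```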

 
It's not a VRAM issue. It's an architecture deficiency.
Arch deficiency, or a game coded around certain arch types? Obviously Turing is light years more advanced than Pascal, but I think devs lean heavily on compute or async due to the consoles. Pascal is OK at compute, but it basically doesn't have a real, feature-complete async implementation, unlike Turing.
I thought they couldn't properly analyse VR games because of the headset?

I talk about this in my video coming out today. A performance review was doable with HL Alyx, although it requires more subjective input from us in the review due to asynchronous timewarp.
 
Arch deficiency, or a game coded around certain arch types? Obviously Turing is light years more advanced than Pascal, but I think devs lean heavily on compute or async due to the consoles. Pascal is OK at compute, but it basically doesn't have a real, feature-complete async implementation, unlike Turing.

I'd say an arch one, since it's not at all limited to this title. It's become quite common. Why has it taken Nvidia four architectures to surpass GCN?
 
My GTX 670 still outperforms my base PS4 in the games I've tested it against. It's a 2.6 TF GPU, but still, I guess there isn't much optimization for that GPU :p
 
My understanding is that on the PC side of things a lot of the card optimisation is done by the likes of NV and AMD in the drivers.

True, I'm still getting driver updates for that GTX 670 even today (bought it in the summer of 2012), but I doubt much of the work in those drivers goes towards Kepler GPUs. I could be wrong though; last year one of the driver updates improved performance in Apex Legends, so there's that :p
 
Anything with less than 8 GB of VRAM will run horribly at Ultra. You need High settings for that.

Hardware Unboxed published its benchmarks today and tested at Low with a lot of old cards, and it's still worse than what you'd expect. They claim their testing sequence is one of the most demanding in the game. The game apparently sees a very minor performance difference going between the different graphical presets.

The graph is shown after 4:35.
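As an aside, for anyone who wants to pull the same kind of numbers from their own runs, this is roughly how average and 1% low FPS come out of a frametime log. The frametimes below are made up, not HUB's data, and tools differ on exactly how they define 1% lows.

```python
def fps_stats(frametimes_ms):
    """Average FPS and '1% low' FPS (mean frame rate of the slowest 1% of frames)."""
    avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)
    slowest = sorted(frametimes_ms, reverse=True)[: max(1, len(frametimes_ms) // 100)]
    low_1pct_fps = 1000 * len(slowest) / sum(slowest)
    return avg_fps, low_1pct_fps

# Made-up frametime log: mostly 60 fps frames with a few 30 fps hitches
frametimes = [16.7] * 980 + [33.3] * 20
avg, low = fps_stats(frametimes)
print(f"avg: {avg:.1f} fps, 1% low: {low:.1f} fps")   # ~58.7 fps avg, ~30.0 fps 1% low
```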

True, I'm still getting driver updates for that GTX 670 even today (bought it in the summer of 2012), but I doubt much of the work in those drivers goes towards Kepler GPUs. I could be wrong though; last year one of the driver updates improved performance in Apex Legends, so there's that :p

Yeah, I've also seen that Kepler has still gotten Vulkan 1.2 and DX12 updates, though I wouldn't expect the Game Ready drivers to be more than bug fixes at this point. EDIT: I actually tried Vulkan in Ghost Recon Breakpoint just now because it's a free weekend, and the latest driver actually corrupted the graphics. Performance-wise, the earlier and latest drivers otherwise seemed on par. But it seems Kepler users should stick to DX11 anyway; in my own testing, DX11 ran up to 10 FPS faster.

Hopefully, Doom Eternal is just the big exception. Even though AMD's older GCN parts have aged better, nothing else has been this bad for Nvidia's older GPUs, AFAIK.
Red Dead Redemption 2, for example, still has the GTX 770 competing with the R9 280 instead of falling beneath the HD 7790 like in Doom.
https://www.sweclockers.com/test/28500-red-dead-redemption-2-sweclockers-utmanar-systemkraven
 