Digital Foundry Article Technical Discussion [2022]

The Hitman engine is too old to support ray tracing efficiently; time to evolve it.

What does this mean? DXR has been around for four years and UE4 has been updated multiple times. Maybe it has more to do with the cooperation between IO and Intel, which in hindsight didn't work...
 
So did I on...

Athlon 64 X2 5600+
2GB DDR2
8800 GT SLI

But the difference for me when playing Crysis on the above rig in 2007 was that the visuals on offer were worth the performance hit, and were far beyond any other game on the market.

I can't say the same for games with RT in 2022, the overall visual improvement in a lot of cases for me doesn't justify the huge drop in performance.

If turning on RT was the equivalent of going from 'High' to 'Very High' in Crysis it would be worth it, but alas, that difference just isn't there.
The difference from high to very high in Crysis wasn’t that big of a visual jump. Going from medium to high was the game changer.
 
Maybe it has more to do with the cooperation between IO and Intel, which in hindsight didn't work...
We won't really know that until we see how it performs on Arc hardware.
It may not perform well on RTX and RDNA 2 hardware, but that doesn't mean it won't on Intel.
 
The difference from high to very high in Crysis wasn’t that big of a visual jump. Going from medium to high was the game changer.

I think it depends on the map; the deeper into the game you go, the more of a difference there is imo.

Still crazy to think that in 2022 on the best hardware available we still can't lock it to a solid 60fps.
 
Not that crazy considering how little single-threaded performance has improved since Dennard scaling died. It's the same reason 10+ year old CPUs are still pretty capable today. I think in terms of IPC we're only maybe 2-3x faster than we were at the time of Crysis. I don't think cache and memory latency have improved all that much either.
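To put a rough number on that, here's a back-of-envelope figure (my own assumption of roughly 2.5x single-thread throughput gained over the 15 years from Crysis to 2022):

2.5^(1/15) ≈ 1.063, i.e. only about 6% compound single-thread improvement per year.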
 
I think it depends on the map; the deeper into the game you go, the more of a difference there is imo.

Still crazy to think that in 2022 on the best hardware available we still can't lock it to a solid 60fps.
Can you show us some examples? I've never seen a huge difference, but I also stopped checking the higher settings after determining early in the game that they didn't make a difference.
 
DF Weekly Article @ https://www.eurogamer.net/df-direct...n-2024-and-is-the-next-xbox-a-streaming-stick

DF Direct Weekly: Will new consoles arrive in 2024 and is the next Xbox a streaming stick?
Plus: Will Sony's plans for PS4's 'long tail' extend the cross-gen transition?

The weeks in the run-up to E3 - or what remains of it - are always something of a dry spell, which perhaps explains the more outlandish nature of some of this week's discussion topics. After a presentation from TV manufacturer TCL, rumours are gathering pace that Pro/enhanced consoles will come from Microsoft and Sony in 2024. Meanwhile, Microsoft is planning an xCloud streaming stick - something that does actually seem to be happening - and news emerges from Sony about its long-term strategy, covering PC, mobile and a 'longest ever tail' for PlayStation 4. All this and more is discussed in the new DF Direct Weekly, embedded below for your viewing pleasure.

...
 
DigitalFoundry Weekly said:
The topic of cross-gen also cropped up in Sony's recent investor disclosures, the most fascinating statistic being that 80 percent of PlayStation 4 online revenue now comes from the digital domain, with just 20 percent from physical discs.

So the PS4 market is mostly second-hand sales, or they're doing more digital sales.
 
Can you show us some examples? I've never seen a huge difference, but I also stopped checking the higher settings after determining early in the game that they didn't make a difference.

Not without installing it again and running through it, no. If I get some time over the weekend I'll see what I can do, as my new GPU should be here by then.
 
Can you show us some examples? I've never seen a huge difference, but I also stopped checking the higher settings after determining early in the game that they didn't make a difference.


https://imgsli.com/MTEwMjUw
https://imgsli.com/MTEwMjQ5
https://imgsli.com/MTEwMjQ4
https://imgsli.com/MTEwMjQ2
https://imgsli.com/MTEwMjQ3

Crysis 1 had Very High locked to Vista and DX10 by default. HDR, god rays and, I guess, its original ambient occlusion are enabled at Very High settings. High is very flat and has very uniform lighting, while Very High adds a ton of depth and granularity to everything.
 
https://imgsli.com/MTEwMjUw
https://imgsli.com/MTEwMjQ5
https://imgsli.com/MTEwMjQ4
https://imgsli.com/MTEwMjQ2
https://imgsli.com/MTEwMjQ3

Crysis 1 had Very High locked to Vista and DX10 by default. HDR, god rays and, I guess, its original ambient occlusion are enabled at Very High settings. High is very flat and has very uniform lighting, while Very High adds a ton of depth and granularity to everything.
Not at all a big difference IMO. The screen-space god rays could be enabled with a config file edit at a negligible performance hit under DX9. Those account for the largest difference in those screens.
 
It's a very big difference. Maybe it doesn't come off that well in static pics, but when moving, playing, it's huge. High has that crude green over all the foliage and the flat lighting, and that's always in your face. Very High changes that. It might seem kind of subtle looking at a picture, but since everywhere you look is just trees and jungle, that subtle shading and shadowing difference applies to every single element being rendered in a scene at one time, so the change is big.

Back then, after I got an 8800 GT somewhere in 2008, I replayed the game. I'd played it at launch on a 7600-something, so a very visually poor experience. I really loved how smooth High felt at the time while playing, but I just couldn't leave the quality of Very High alone. I remember switching between them and thinking, damn, I can't turn this off, it looks too good like this. Richer image.
 
So the PS4 market is mostly second-hand sales, or they're doing more digital sales.
Not that surprising. Free-to-play games can be quite massive. Fortnite alone makes enough profit to cross-fund the whole Epic Games Store.

Also, there were not that many new retail games, but many, many new digital-only games (smaller and bigger). Smaller studios especially are mostly digital-only, as they can't afford to sell retail versions of a game.
And then there are the ever-present sales that are quite cheap (not to forget that you can almost always get cheap PSN credit ... currently again €100 for €75).
 
I'd think the lack of parallax mapping is the biggest difference...?

Yep, it makes a huge difference in some of the maps.

[Attached GIF: Crysis_POM.gif]
 
Very High in Crysis gives you a lot - colour grading, screen-space crepuscular rays, ray-marched volumetrics when they show up on a level (Core), SSAO (this one is pretty big for a game from 2007), parallax occlusion maps, per-object motion blur... it really goes on and on. It was a pretty big deal back then IMO.

I think the general hate for Windows Vista may have clouded the judgement at that time.
 
https://imgsli.com/MTEwMjUw
https://imgsli.com/MTEwMjQ5
https://imgsli.com/MTEwMjQ4
https://imgsli.com/MTEwMjQ2
https://imgsli.com/MTEwMjQ3

Crysis 1 had Very High locked to Vista and DX10 by default. HDR, god rays and, I guess, its original ambient occlusion are enabled at Very High settings. High is very flat and has very uniform lighting, while Very High adds a ton of depth and granularity to everything.

Very High in Crysis gives you a lot - colour grading, screen-space crepuscular rays, ray-marched volumetrics when they show up on a level (Core), SSAO (this one is pretty big for a game from 2007), parallax occlusion maps, per-object motion blur... it really goes on and on. It was a pretty big deal back then IMO.

I think the general hate for Windows Vista may have clouded the judgement at that time.

As far as I remember, with a custom config you could have Very High settings on DX9 with Windows XP; the difference vs DX10 wasn't big and performance was better.
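For reference, the classic approach was dropping an autoexec.cfg into the Crysis folder to force the per-category spec CVars up to the Very High tier. This is a minimal sketch from memory - treat the exact CVar list and the comments as assumptions to verify against your game version:

; autoexec.cfg - sketch of the classic "Very High on DX9/XP" tweak
; Crysis exposes per-category sys_spec CVars: 1 = Low ... 4 = Very High
sys_spec_Shading = 4            ; lighting model, colour grading, SSAO
sys_spec_PostProcessing = 4     ; sunshafts/god rays, motion blur
sys_spec_VolumetricEffects = 4
sys_spec_Water = 4
sys_spec_Shadows = 4
sys_spec_ObjectDetail = 4       ; includes parallax occlusion mapping
; note: a few effects (e.g. per-object motion blur) reportedly stayed DX10-only regardless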
 
Very High in Crysis gives you a lot - colour grading, screen-space crepuscular rays, ray-marched volumetrics when they show up on a level (Core), SSAO (this one is pretty big for a game from 2007), parallax occlusion maps, per-object motion blur... it really goes on and on. It was a pretty big deal back then IMO.

I think the general hate for Windows Vista may have clouded the judgement at that time.
I think I'm just misremembering the presets. I was an early adopter of Vista because I wanted "tons" of RAM, and 64-bit XP was a mostly unsupported mess compared to 64-bit Vista. I played Crysis on my Vista rig back in the day, and I remember the POM, motion blur and crepuscular rays, but I also remember not playing at the absolute maximum settings. Perhaps I just tweaked down a few things to meet a performance target.

I also think that the differences now are bigger than they were back then because the resolution of the average display would have been much lower in 2007. It's easier to see the advantages of higher quality rendering when the resolution is higher.

Also, why isn't POM used more? Isn't it a fairly performant way to add a bunch more detail? I know it has limitations, but so does everything else when it comes to traditional real-time rendering.
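For anyone wondering why it's cheap: the core of POM is just a ray-march through a height field in tangent space, no extra geometry involved. Here's a minimal CPU-side sketch of that loop; in a real renderer this runs per pixel in a shader, and names like sampleHeight and the procedural height function are purely illustrative:

#include <cmath>
#include <cstdio>

struct Vec2 { float x, y; };

// Illustrative height-field lookup; a real implementation samples a texture.
static float sampleHeight(Vec2 uv) {
    // Simple procedural bumps so the sketch is self-contained and runnable.
    return 0.5f + 0.5f * std::sin(uv.x * 40.0f) * std::cos(uv.y * 40.0f);
}

// March along the view ray in tangent space, descending one height layer per
// step, until the ray dips below the surface; return the displaced UV.
static Vec2 parallaxOcclusionMap(Vec2 uv, Vec2 viewXY, float viewZ,
                                 float heightScale, int numSteps) {
    Vec2 delta = { viewXY.x / viewZ * heightScale / numSteps,
                   viewXY.y / viewZ * heightScale / numSteps };
    float layerStep = 1.0f / numSteps;

    float rayHeight = 1.0f;              // start at the top of the volume
    Vec2 cur = uv;
    for (int i = 0; i < numSteps && rayHeight > sampleHeight(cur); ++i) {
        cur.x -= delta.x;                // step the UV along the view ray
        cur.y -= delta.y;
        rayHeight -= layerStep;          // ...and descend one height layer
    }
    return cur;                          // first sample at/below the surface
}

int main() {
    Vec2 uv = {0.25f, 0.25f};
    Vec2 shifted = parallaxOcclusionMap(uv, {0.4f, 0.2f}, 0.9f, 0.05f, 32);
    std::printf("uv (%.3f, %.3f) -> (%.3f, %.3f)\n",
                uv.x, uv.y, shifted.x, shifted.y);
    return 0;
}

The usual limitations fall straight out of that loop: silhouettes stay flat because only the UV is shifted, self-shadowing needs a second march toward the light, and the step count trades shimmer against cost - which is probably part of why tessellation or real geometry often wins out today.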
 