Digital Foundry Article Technical Discussion [2021]

Yeah, you can. I remember watching a Pixar documentary where they explained why this was - and I'm paraphrasing - a motherfucker to do, and Pixar, again, have full control over the camera and can stage the lighting manually on every frame.

You sure can have light coming from invisible, nonsensical places, but if the aim of RT is a realistic lighting aesthetic then you're wandering into some lighting uncanny valley.
I am not sure what you are getting at. The aim of RT is not "just realism". A cartoon movie can be made with RT; something like Jet Set Radio Future can be made with RT. It is just a way of resolving surfaces, colours, or directions off-screen - it can be made as wild, cartoony, done-up, fake, controlled, whatever, as an artist wants. Hiding surfaces, substituting surfaces, whatever.
 
Yeah, you can. I remember watching a Pixar documentary where they explained why this was - and I'm paraphrasing - a motherfucker to do, and Pixar, again, have full control over the camera and can stage the lighting manually on every frame.

You sure can have light coming from invisible, nonsensical places, but if the aim of RT is a realistic lighting aesthetic then you're wandering into some lighting uncanny valley.

I think there's some crossing of arguments here, between RT as a model for light transport and the look of realistic lighting models in games; the two are fully separable.
 
I am not sure what you are getting at. The aim of RT is not "just realism". A cartoon movie can be made with RT; something like Jet Set Radio Future can be made with RT. It is just a way of resolving surfaces, colours, or directions off-screen - it can be made as wild, cartoony, done-up, fake, controlled, whatever, as an artist wants. Hiding surfaces, substituting surfaces, whatever.

I'm doing a bad job of explaining this. I'm going to assume you've seen some BTS documentaries on how movies are made, like the extras you get with non-bargain-basement DVD and Blu-ray boxsets. Movies often have a central aesthetic, but within that aesthetic particular scenes will demand particular lighting to emphasise mood or tone. There's no reason games can't do that too, but the question I was originally asking is whether the issues RT implementations introduce will still save time (iroboto's post).

I think there's some crossing of arguments here, between RT as a model for light transport and the look of realistic lighting models in games; the two are fully separable.
I'm not arguing what I think people think I am. I was responding to your original point about RT saving time, which I've linked above; then it went off the rails.
 
I'm doing a bad job of explaining this. I'm going to assume you've seen some BTS documentaries on how movies are made, like the extras you get with non-bargain-basement DVD and Blu-ray boxsets. Movies often have a central aesthetic, but within that aesthetic particular scenes will demand particular lighting to emphasise mood or tone. There's no reason games can't do that too, but the question I was originally asking is whether the issues RT implementations introduce will still save time (iroboto's post).


I'm not arguing what I think people think I am. I was responding to your original point about RT saving time, which I've linked above; then it went off the rails.
You don't need to bake, which saves you a significant amount of time whenever you shuffle things around. The time savings should be significant.
 
You don't need to bake, which saves you a significant amount of time whenever you shuffle things around. The time savings should be significant.
That's something you can stop doing. What do you need to add into the pipeline?
 
You’re going to have to explain this one; I’m not exactly sure what you are referring to.
What's the resource implication of adding BVH structures to a game world? Creation, testing, etc.
 
I'm doing a bad job of explaining this. I'm going to assume you've seen some BTS documentaries on how movies are made, like the extras you get with non-bargain-basement DVD and Blu-ray boxsets. Movies often have a central aesthetic, but within that aesthetic particular scenes will demand particular lighting to emphasise mood or tone. There's no reason games can't do that too, but the question I was originally asking is whether the issues RT implementations introduce will still save time (iroboto's post).
Oh, I get what you are saying now. Obviously this would be a case-by-case scenario, but I think overall a unified RT lighting solution would save time. Even if you have to spend time putting moody lighting into a scene when you are using RT, you already have to do that now for everything using traditional lighting methods.

Back when I was making Unreal levels (not professionally), I spent hours tweaking the light to get the desired effect: putting lights where they didn't physically exist, and adding negative lights (yes, this was a thing in Unreal Engine) to darken areas where light was leaking through walls and floors. I made a level set in a sporting goods store once, and the insides of these kayaks kept glowing like they were lit from within when I wanted them to look shadowed... What a pain. And once I fixed one by putting a negative light inside it, I had to do the same for all the rest.

If you aren't wasting time tweaking your normal lighting like that, then even if you have to spend time tweaking lighting for specific scenes to create just the right mood or atmosphere, you are probably still spending less time overall on lighting. And if you are moving from pre-baked lighting to a real-time RT solution, like @iroboto said, you will save time during the baking stage. Think of the amount of time I spent rebaking the lighting on some levels because I moved a barrel and the shadow didn't move with it, or because I moved some kayaks and the negative lights had to move with them. And any time you move a light, you have to rebake before playing, or the lighting differs from what's in the preview. Also, that level wasn't fun to play.

Anyway, yeah, I think it will save time.
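
To make the "negative light" trick concrete, here is a minimal sketch of the idea (hypothetical types and names, not Unreal's actual API): a negative light is just an ordinary attenuated point light whose intensity is subtracted from the accumulated lighting, with the result clamped at black.

    #include <algorithm>
    #include <cmath>

    struct Vec3 { float x, y, z; };

    struct PointLight {
        Vec3  position;
        float intensity;  // a negative value darkens instead of brightens
        float radius;     // falloff range: no effect beyond this distance
    };

    // Accumulate (possibly negative) light contributions at a surface point.
    // Real engines also factor in normals, shadowing, and colour; this only
    // shows why one leaked light can be cancelled by one negative light.
    float shade(const Vec3& p, const PointLight* lights, int count) {
        float total = 0.0f;
        for (int i = 0; i < count; ++i) {
            float dx = lights[i].position.x - p.x;
            float dy = lights[i].position.y - p.y;
            float dz = lights[i].position.z - p.z;
            float dist = std::sqrt(dx * dx + dy * dy + dz * dz);
            float falloff = std::max(0.0f, 1.0f - dist / lights[i].radius);
            total += lights[i].intensity * falloff;  // negative intensity subtracts
        }
        return std::max(0.0f, total);  // clamp so darkening never goes below black
    }

The catch, as described above, is that each leak needs its own hand-placed negative light, and they all have to be kept in sync with the geometry - exactly the per-object busywork a real-time GI solution removes.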
 
What's the resource implication of adding BVH structures to a game world? Creation, testing, etc.
I don't know. Typically you can assign something like tags or layers to objects for various purposes; I'm not sure if they would do something like that for the BVH.

Without knowing what 4A did, I couldn't tell you. I'm not sure if everything is included or only selected things, but even then I don't see it being a big job. If you move a light 1m and you want to see the proper effect, you need to rebake. Add in all the extra light sources placed to emulate proper bounce lighting and make up for deficits in the technology, and I think you spend a lot more time baking than actually doing work.

I don't think this is an issue tooling-wise. Lighting and environment artists need to handle this anyway, even with traditional lighting models.
 
I don't know. Typically you can assign something like tags or layers to objects for various purposes; I'm not sure if they would do something like that for the BVH.
This is my point. You don't know how much work this is. So how can you be so sure it will save time? :???:
 
This is my point. You don't know how much work this is. So how can you be so sure it will save time? :???:
Because you have to do it anyway, even with traditional lighting; you still need to do the same work. You're asking if there is a tooling problem, and I don't think there is. The editor refreshes every frame anyway, so adding an object to the BVH (or not) is trivial: it shows up the next frame, since lighting is calculated in real time. Everything is added to and removed from the BVH at run time; it only needs to be specified beforehand whether an object should be included in the BVH.
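
A minimal sketch of what I'd guess the authoring side looks like (hypothetical names; I don't know what 4A actually do): each object carries a flag set via a tag or layer, and the list of flagged objects is fed to the per-frame acceleration-structure build, so toggling the flag is visible on the very next frame with no bake step.

    #include <vector>

    // Hypothetical per-object authoring data: a tag/layer boils down to a
    // flag saying whether the object is registered in the ray tracing BVH.
    struct SceneObject {
        int  meshId;
        bool includeInBVH;
    };

    // Collect the instances to feed into this frame's BVH build/refit.
    // Because this runs every frame, inclusion changes show up immediately.
    std::vector<int> gatherBVHInstances(const std::vector<SceneObject>& scene) {
        std::vector<int> instances;
        for (const SceneObject& obj : scene)
            if (obj.includeInBVH)
                instances.push_back(obj.meshId);
        return instances;
    }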

Baking takes minutes for every single change, and you still have to specify which objects can and can't be lit by dynamic lights.

UE5 moved to this model with Lumen, citing huge time savings for artists - and UE5 is used for TV shows.
A first look at Unreal Engine 5 - Unreal Engine
Lumen is a fully dynamic global illumination solution that immediately reacts to scene and light changes. The system renders diffuse interreflection with infinite bounces and indirect specular reflections in huge, detailed environments, at scales ranging from kilometers to millimeters. Artists and designers can create more dynamic scenes using Lumen, for example, changing the sun angle for time of day, turning on a flashlight, or blowing a hole in the ceiling, and indirect lighting will adapt accordingly. Lumen erases the need to wait for lightmap bakes to finish and to author light map UVs—a huge time savings when an artist can move a light inside the Unreal Editor and lighting looks the same as when the game is run on console.

I don't see any difference between Lumen and what 4A Games did, except that one is compute based and the other is DXR based; so really we're only discussing fidelity at this point in time. I don't think Lumen can save cycles where 4A's solution cannot.

They can erase the entire old lighting-generation process and build towards this new one; I'm pretty sure the savings in time will be significant.

What Unreal 5 means for the future of game development | Business News | MCV/DEVELOP (mcvuk.com)
“At the end of the day, what makes great games is about iteration and iteration time. So it’s really, really important for us to make sure that we’ve got accessible tools that allow developers to concentrate on the important stuff, great gameplay, and that’s one of the reasons that Quixel is part of Epic.”

Nanite and Quixel aren’t the only things saving developer time, of course. The dynamic lighting made possible by Lumen allows developers to adjust the lighting in-real time, saving a huge amount of iteration time.

“This is another area that enables new levels of visual fidelity but also is pretty transformative on the workflow side. Artists don’t need to go through a long multi hour process to build lighting. They can just build the environment and see lighting update as they’re building it in the editor, and it all just works on the console.”

“One of the coolest parts of making this demo is that I’m in an office right next to where all our artists and a bunch of them have got the monitors towards me,” says Libreri. “Which is probably a bit weird for them, but it’s great to see them making the world, and literally picking up mountains and just moving them, dropping rocks and changing lighting direction. And it all looks real. It’s this very surreal experience, like, oh my god, have we beamed into the future? I don’t see them ever wanting to go back to the old way of working.”
 
Speaking of Lumen:
@Dictator, your article here covers it, and it and 4A's solution look incredibly similar, the only difference being the use of dedicated hardware vs compute.
Inside Unreal Engine 5: how Epic delivers its generational leap • Eurogamer.net
Another crucial technique in maintaining performance is through the use of temporal accumulation, where mapping the movement of light bounces occurs over time, from frame to frame to frame. For example, as the light moves around in the beginning of the demo video, if you watch it attentively, you can see that the bounced lighting itself moves in a partially staggered rate in comparison to the direct lighting. The developers mention 'infinite bounces' - reminiscent of surface caching - a way to store light on geometry over time with a sort of feedback loop. This allows for many bounces of diffuse lighting, but can induce a touch of latency when the lighting moves rapidly.
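
My loose reading of that feedback loop, sketched as code (made-up names, not Epic's implementation): each surface-cache texel blends the current frame's lighting sample into last frame's value, and because the new sample is gathered from lighting that is itself already cached, each frame effectively adds another bounce.

    #include <algorithm>

    // Exponential moving average over frames. With blend around 0.05, the
    // cached lighting converges over many frames (hence "infinite bounces")
    // but lags slightly behind fast-moving lights - the latency noted above.
    float accumulateBounce(float cachedRadiance, float newSample, float blend) {
        blend = std::clamp(blend, 0.0f, 1.0f);
        return cachedRadiance + blend * (newSample - cachedRadiance);
    }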
 
You’re asking if there is a tooling problem, and I don’t think there is.

I've not mentioned tools. I'm querying whether it's quicker. The Pixar documentary was included with Cars - which was the first time their renderer used raytracing, because of the 'shiny' cars. The documentary has also stuck with me because of the challenges the technology introduced that they didn't anticipate.
 
I've not mentioned tools. I'm querying whether it's quicker. The Pixar documentary was included with Cars - which was the first time their renderer used raytracing, because of the 'shiny' cars. The documentary has also stuck with me because of the challenges the technology introduced that they didn't anticipate.
Hmm... just thinking out loud, I do wonder if that's a movie-specific issue. For cutscenes or movies your point might be much stronger, given the visual-medium aspect of it: you really need to control everything because of the story you're trying to convey. With gameplay, though, you're sort of just lighting the level to be playable the way they want it to be.
 
I've not mentioned tools. I'm querying whether it's quicker. The Pixar documentary was included with Cars - which was the first time their renderer used raytracing, because of the 'shiny' cars. The documentary has also stuck with me because of the challenges the technology introduced that they didn't anticipate.
So I think I watched that years ago, and IIRC they ran into problems because some of their stuff isn't rendered at... runtime - not sure if that's the right word I'm looking for. But because it's a film, they fudge some stuff, and because some of it is added in post, or done with tricks, things like that would show up (or not) in RT reflections. We see stuff like this in games too, like the leaves in Metro only being reflected in screen space. I might go back and watch that again; I don't remember it being overly long.
 
Hmm... just thinking out loud, I do wonder if that's a movie-specific issue.
The thing about movies is you can contrive and hide the magic. In games, where the player has full agency over the character and camera, that's much more difficult. It's why we see graphics artefacts like missing/low-res textures, or pop-in. Real-time raytracing is relatively new in games, but I can't imagine for a second it's a wholly win-win technology, immune to design overhead to solve some of those side effects, and I think it's too early to say given how few games are really pushing it hard.
 
The thing about movies is you can contrive and hide the magic. In games, where the player has full agency over the character and camera, that's much more difficult. It's why we see graphics artefacts like missing/low-res textures, or pop-in. Real-time raytracing is relatively new in games, but I can't imagine for a second it's a wholly win-win technology, immune to design overhead to solve some of those side effects, and I think it's too early to say given how few games are really pushing it hard.

I agree, but that also means you design your lighting with that in mind. I'm pretty sure most game lighting today is designed so the game is functionally playable, and not so much for artistry the way a movie's lighting would be.
Case in point: the bottom picture looks terrible lighting-wise because no one understands where all this light is coming from. It just looks incorrectly lit - blue light is visually everywhere, yet all we see is yellow. I think video game players are much more attuned to spotting unrealistic lighting models in games than they are to catching those issues in movies. When people say a game looks bad, a poor lighting model is going to be a big reason for it. So lighting and environment artists are going to design levels and lighting to emulate what they think real life should look like, and I think a real-time system will do this better than doing it manually. Even if there are drawbacks to real-time lighting systems, they do a superior job of unifying the lighting in a scene compared to our current systems - which, I believe, is also where you are going to save a lot of labour.

I think this is where you see a lot of people go back and reference Driveclub vs Forza Horizon. It just comes down to how much more unified Driveclub's lighting is compared to Forza Horizon's, which is still trying to emulate the real-time GI model of Driveclub.

But at the same time, once you switch to a cinematic (linked below the pictures), people sort of just accept movie-style lighting. It's perhaps jarring that there can be such a difference between cutscene and gameplay lighting, but it happens a lot, in a lot of games.
[Image: halo_infinite_32_9.0.jpg]


 
[Image: I6ZC5gy.png]


This looks great. The left looks videogamey, the right CGI.
Nice comparison and comment. The super-bloom behind the door gives it away though; in that sense I prefer the look of the original, as the bloom in the image looks like an HDR image on a non-tonemapped SDR display.

But it looks more like CGI for the most part. That's how I imagined Crysis 2 would look with global illumination - then we got no GI on consoles, sigh :(
 
Looks great, but what irritates me is that the light from the particles only affects the ground, walls, etc. when they are really close. As bright as the light is, it should have a bigger impact; it would look even cooler if those were handled as actual light sources.
But it really doesn't look like something that has to be a PS5 exclusive. It might be the video compression, but it looks much more like a last-gen game with a few visual upgrades.

Also, ~1080p + TAA => ~1440p + checkerboarding => 4K is... interesting.
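
Some rough arithmetic on why that chain is "interesting" (my numbers, assuming standard 16:9 resolutions and checkerboarding shading half the target pixels per frame):

    #include <cstdio>

    int main() {
        const long shaded1080p  = 1920L * 1080;      // ~2.07M pixels shaded, TAA stage
        const long shaded1440cb = 2560L * 1440 / 2;  // ~1.84M pixels shaded per CB frame
        const long output4k     = 3840L * 2160;      // ~8.29M pixels presented
        std::printf("1080p: %ld  1440p CB: %ld  4K out: %ld\n",
                    shaded1080p, shaded1440cb, output4k);
        return 0;
    }

Both intermediate stages shade roughly 2M pixels while the output is ~8.3M, so the final 4K image is overwhelmingly reconstructed rather than natively rendered.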
 