Next gen lighting technologies - voxelised, traced, and everything else *spawn*

Yes, but we may see some solution that separates light-source calculations (shadowing, really) from GI bounces for efficiency reasons, e.g. low-res voxelised GI with some higher-fidelity traced lighting. Also maybe separate AO instead of properly sampling the sky light or GI room light.

It's similar to faking specular highlights with the concept of specularity instead of raytracing reflections at perfect fidelity with microscopic roughness - we just need a solution that works.
 
I'm going to stand by my area-light theory though. I reckon the first game with a good area-light solution is going to be immediately discernible as itself and not like the usual, interchangeable 'AAA' style we have now.

Have you ever checked out TearAway on PS4? They've implemented beautiful soft sun shadows with a pretty wide variable penumbra and reasonably crisp resolution at the contact point, and it makes that game look beautiful. You feel like you could bite it.
I don't know how they did it though.
My hunch is they used variance shadow maps, only because MM had used them before on LBP1, so I'm going by that precedent.
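For reference, a variance shadow map stores two depth moments and reconstructs a soft shadow factor from Chebyshev's inequality, which is what makes the wide, filterable penumbra possible. A minimal sketch of that reconstruction step in plain Python standing in for shader code (the function name and clamp value are my own; this is just how VSM works in general, not necessarily what MM shipped):

```python
def vsm_shadow_factor(m1, m2, receiver_depth, min_variance=1e-4):
    """Soft shadow factor from the two VSM moments.

    m1 = E[depth], m2 = E[depth^2], prefiltered over the shadow map.
    Returns Chebyshev's one-sided upper bound on the probability that
    the receiver is lit, which falls off smoothly in the penumbra.
    """
    if receiver_depth <= m1:
        return 1.0  # receiver is in front of the mean occluder depth: lit
    variance = max(m2 - m1 * m1, min_variance)  # clamp to fight aliasing
    d = receiver_depth - m1
    return variance / (variance + d * d)
```

The nice property is that the moments can be blurred like any ordinary texture, so the penumbra width comes almost for free from the filter kernel.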
 
e.g. Low-res voxelised GI with some higher-fidelity traced lighting. Also maybe separate AO instead of properly sampling the sky light or GI room light.
It will still look like games.
Voxel GI solutions are very inaccurate because they approximate the environment with a single-digit number of (leaky) cones. This gives you bounce light, even color bleeding, but your eyes will spot the cheat. Irradiance is low frequency, but we need good accuracy nevertheless.
AO does not help either - it's a completely artificial effect. It can look similar to GI, but it will never look realistic in general. (GI adds light; AO can only remove it.) We will use it only as long as we need it to add some detail that GI can't handle.
Area shadows can create very realistic images, but only in cases where the contribution of GI is so tiny we do not spot it.
I'm pretty sure what you miss is GI.
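For context, the "handful of leaky cones" cheat works roughly like this: each cone marches through a prefiltered voxel mip chain, sampling fatter and fatter voxels as it goes. A toy sketch (names, constants, and step sizes are my own; real implementations run in a shader against a 3D mip pyramid):

```python
import math

def trace_cone(sample, origin, direction, aperture, max_dist):
    """Front-to-back accumulation along one cone.

    `sample(pos, diameter)` stands in for fetching a prefiltered voxel
    mip whose footprint matches the cone diameter; it returns
    (radiance, alpha). The growing footprint is what makes a
    single-digit number of cones affordable -- and approximate.
    """
    color = 0.0
    alpha = 0.0
    dist = 0.05  # small start offset to avoid self-sampling
    while dist < max_dist and alpha < 0.99:
        diameter = max(0.05, 2.0 * math.tan(aperture * 0.5) * dist)
        pos = [o + d * dist for o, d in zip(origin, direction)]
        radiance, a = sample(pos, diameter)
        color += (1.0 - alpha) * a * radiance  # front-to-back compositing
        alpha += (1.0 - alpha) * a
        dist += diameter * 0.5  # step scales with cone width
    return color, alpha
```

Six or so of these, cosine-weighted over the hemisphere, give you the bounce light; the leaking comes from those fat prefiltered samples punching through thin walls.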

Here are two videos that show the difference in a non Minecraft setting:
(I still claim to get similar quality in 3 ms on first-gen GCN for those diffuse settings, but only for the first video. I have no test case for a scene as large as in the second one, and I still have infinite work to do.)
Notice those videos use only one bounce most of the time, which is still too dark. (They change this setting with console inputs for short stretches.)
But it looks more realistic than games, almost perfect - or not? (Not sure how subjective this is.)

But we still need more.
Having GI means we know about the environment of the shading point and have all the information we need. But then we still need to shade it, which is what PBS already handles very well in real time. It's also a never-ending research task if we think of complex materials (layered, coated, subsurface, etc.).
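As an aside, the part PBS handles well is largely a handful of analytic terms, e.g. the GGX normal distribution that most engines use for specular. A minimal sketch, purely illustrative:

```python
import math

def ggx_ndf(n_dot_h, roughness):
    """GGX / Trowbridge-Reitz normal distribution term D.

    n_dot_h: cosine between surface normal and half vector.
    roughness: perceptual roughness, squared to alpha as is common.
    Peaks sharply near n_dot_h = 1 for smooth surfaces and
    flattens out as roughness rises.
    """
    a = roughness * roughness
    a2 = a * a
    denom = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)
```

The "never-ending" part is everything this single lobe can't express: layering, coating, subsurface scattering, and so on.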
Finally, the geometry also matters a lot. Again CG tends to simplify, to be too perfect, to handle transitions badly (e.g. an abrupt transition from grass to a tree, or rocks intersecting each other due to copy-pasted instances).

CG tending to be too sharp is surely a very major point. Kane & Lynch 2 looked super realistic because it used camera-like filters to blur but also to sharpen. I also know a tiny indie horror game but forgot the name. (They make the screen VHS-like and it looks astonishingly real just because of that.)
Tone mapping is something only very few companies seem to do right. Uncharted looks like vivid, colorful eye candy; Tomb Raider looks grey and dead. Tone mapping could fix this with little manual work and zero runtime cost. I wonder why it is so underutilized in games, but it is probably not the key to realism IMO.
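The "little manual work, zero runtime cost" claim holds because a filmic tonemap is just a per-pixel curve, usually baked into a LUT anyway. As one concrete example, Krzysztof Narkowicz's well-known curve-fit approximation of the ACES filmic tonemap:

```python
def aces_tonemap(x):
    """Narkowicz's rational-polynomial fit of the ACES filmic curve.

    Maps scene-referred luminance [0, inf) to display-referred [0, 1],
    lifting midtones and rolling off highlights smoothly.
    """
    a, b, c, d, e = 2.51, 0.03, 2.43, 0.59, 0.14
    y = (x * (a * x + b)) / (x * (c * x + d) + e)
    return min(max(y, 0.0), 1.0)  # clamp to displayable range
```

Swapping such a curve (or the LUT built from it) is exactly the kind of change that costs artists an afternoon, not the renderer any milliseconds.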
 
We've had a whole thread on a developer being purely against the implementations of DXR.

One developer being against DXR implementations doesn't mean that DXR is at fault, though. You should always listen to multiple verified developers, not just one who is against RT/DXR on all fronts.

Make games, not movies!

A complaint by many these days!

Have you ever checked out TearAway on PS4?

I do have that game on my Vita, or had it. Is it a PS4 title also?
 
One developer being against DXR implementations doesn't mean that DXR is at fault, though. You should always listen to multiple verified developers, not just one who is against RT/DXR on all fronts.
On Twitter I would say the elite developers were disappointed that DXR doesn't expose some lower-level items.

It's entirely possible that they purposely started with a high-level API for RT first and may go back later for a low-level one after adoption, because I suspect that, like DX12, only a handful of developers are really capable of making very effective use of low-level APIs.

The concerns they have about the API are legitimate, but equally, just because it doesn't have feature X or does something like Y doesn't render it useless in comparison to a completely custom scenario.

While I suspect that a completely custom compute RT pipeline could well outperform DXR (without hardware acceleration), how transposable that is across a variety of engines is questionable, let alone the education for it. This is where less efficient performance but standardized calls have their advantage; an important consideration since DXR can be used for much more than just graphics.
 
On Twitter I would say the elite developers were disappointed that DXR doesn't expose some lower-level items.

It's entirely possible that they purposely started with a high-level API for RT first and may go back later for a low-level one after adoption, because I suspect that, like DX12, only a handful of developers are really capable of making very effective use of low-level APIs.

The concerns they have about the API are legitimate, but equally, just because it doesn't have feature X or does something like Y doesn't render it useless in comparison to a completely custom scenario.

While I suspect that a completely custom compute RT pipeline could well outperform DXR (without hardware acceleration), how transposable that is across a variety of engines is questionable, let alone the education for it. This is where less efficient performance but standardized calls have their advantage; an important consideration since DXR can be used for much more than just graphics.

Could be right or wrong; it depends on the black-boxed things we do not know. I still suspect NV either rejects flexibility in favor of pushing to market and being first, or black-boxes things to protect their inner workings from the competition.
If you look at this PDF about VK subgroup operations https://www.khronos.org/assets/uploads/developers/library/2018-vulkan-devday/06-subgroups.pdf there is a comment from the author: "No NDA Required! :)"
This is like saying: "Exceptionally, this time I am allowed to tell you guys how to use our hardware efficiently. Isn't that great?"
NV is known to say very little about how their hardware works in general. Likely this only gets worse with new functionality the competition does not have yet.
We don't know, so it remains personal opinion and speculation.

One thing I certainly disagree with is the assumption that (most) devs would be unable to handle low-level access and flexibility. This might have been an argument many years ago, but right now we have only a small number of engines serving everything, and all of them are made by experts.
DXR being usable for more than graphics is almost a marketing lie. A physics broadphase is impossible due to the inaccessible BVH; audio, yes, but likely you want to use a simplified approximate scene where hardware acceleration is not necessary (not sure).
Btw, are we right now at the tipping point where DX12 begins to outperform DX11 on average? It seems so to me...

But opinions aside, I think the real question here is: "Would it be possible at all to offer flexibility but still maintain the performance?"
I doubt it. If we want flexibility, we end up with a compute implementation. The restricted approach presented by RTX may well be the option that makes the most sense, and we might need custom alternatives just for the cases where it isn't efficient.

I really think the best option is to have some platforms with FF and others without it. That's the only way to find out what's right and what's necessary.

Edit: Unfortunately we have only one mixed platform now, which is bad. I'm not excited about games with optional RTX support.
 
Well, I don't think HDR is needed at all for a CGI look, but I agree that the game industry copying Hollywood color grading practices is truly awful.

That's because I can't define it, which is why I asked what makes them different such that games are immediately recognisable as such. If you're aware of changes artists could make, that'd point me to an answer. ;)

Applying a low-level blur to a game screenshot doesn't make it look more realistic though. I think by softness, you mean... separation? Like, everything's computed in passes - albedo, shadows, lighting - in perfect fidelity, and composited in a clinically clean fashion. Like the RT images of old, which weren't the slightest bit realistic because they were perfect mathematical creations.

I'm going to stand by my area-light theory though. I reckon the first game with a good area-light solution is going to be immediately discernible as itself and not like the usual, interchangeable 'AAA' style we have now.
Consider this: the CGI in Jurassic Park looks very realistic on DVD. On Blu-Ray not so much. The extra sharpness allows you to detect all the flaws easily, lowering immersion. It's better to allow the brain to fill in the blanks than to clearly show that something is fake.
 
Well, I don't think HDR is needed at all for a CGI look, but I agree that the game industry copying Hollywood color grading practices is truly awful.

....

The blog is basically arguing that you don't need HDR for good image composition, and that more games should draw on the colour science and tone mapping knowledge developed in the film industry.
 
The blog is basically arguing that you don't need HDR for good image composition, and that more games should draw on the colour science and tone mapping knowledge developed in the film industry.
Which has made every movie look like a high contrast combination of teal and orange. Hard pass.
 
Which has made every movie look like a high contrast combination of teal and orange. Hard pass.
You limit yourself to action movies if you think teal and orange are in every movie, and even then it's not quite true.
 
Which has made every movie look like a high contrast combination of teal and orange. Hard pass.

Yeah, that's kind of the current trend at the moment, but Hollywood has always had aesthetic fads that come and go. Still, colour grading has rarely been as good as it is in modern movies.
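For what it's worth, the teal-and-orange look itself is a trivially cheap grade: push shadows toward a cool tint and highlights toward a warm one, keyed by luminance. A toy sketch (the tint values and strength are arbitrary picks of mine, not from any particular film pipeline):

```python
def split_tone(rgb,
               shadow_tint=(0.0, 0.35, 0.4),    # teal-ish
               highlight_tint=(1.0, 0.6, 0.2),  # orange-ish
               strength=0.2):
    """Blend each pixel toward a luminance-keyed tint.

    Uses Rec.709 luma to interpolate between the shadow and highlight
    tints, then mixes that tint into the original colour.
    """
    luma = 0.2126 * rgb[0] + 0.7152 * rgb[1] + 0.0722 * rgb[2]
    tint = [s * (1.0 - luma) + h * luma
            for s, h in zip(shadow_tint, highlight_tint)]
    return [c * (1.0 - strength) + t * strength
            for c, t in zip(rgb, tint)]
```

Which is part of why the fad spread so fast: it's one tiny node in a grading suite, not a rendering feature.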
 
Well, I don't think HDR is needed at all for a CGI look but I agree that the game industry copying the Holywood color grading practices is truly awful.


Consider this: the CGI in Jurassic Park looks very realistic on DVD. On Blu-Ray not so much. The extra sharpness allows you to detect all the flaws easily, lowering immersion. It's better to allow the brain to fill in the blanks than to clearly show that something is fake.
Actually, the simple 2011 version was a good HD release...


...it's the 2013 disneycrapremasteredshit (C) remaster that ruined everything.
 

They said 60 fps on a single 1080 Ti. Dynamic lighting and physics. No raytracing.
What is the service that Quixel provides over Unreal Engine? From what I can see they are heavy on photorealistic textures. Anything else in particular?
 
What is the service that Quixel provides over Unreal Engine? From what I can see they are heavy on photorealistic textures. Anything else in particular?
Quixel is simply a provider of photogrammetry-based assets (for use in any engine, software, or whatever you want to do with those assets) and software to blend/mix those assets together. Essentially nothing to do with UE per se (well, they... for the first time got paid directly by Epic this year to pump out a fluff marketing piece...).
 

They said 60 fps on a single 1080 Ti. Dynamic lighting and physics. No raytracing.

Oh no. If Kojima's seen this, he is cracking the whip at his artists and engineers right now! If their office had a new cat, consider it dead too.
 