New Square Enix Luminous Engine tech demo - Path Tracing

The Luminous Engine now supports ray tracing and RTX; the developers created a fully path-traced demo running on an RTX 2080 Ti.


Designed by Luminous Productions, a subsidiary studio of Square Enix Holdings staffed by FINAL FANTASY XV veterans, Back Stage is rendered almost exclusively with path tracing, an advanced form of ray tracing also used in Quake II RTX and Minecraft, which enables real-time rendering of lifelike lighting, shadows and reflections.

“Back Stage is a showcase demo of our work to answer the question, "How can you use ray tracing in a next generation game?" GeForce RTX graphics cards have power beyond our imagination, and with NVIDIA's technology even real-time path tracing has become a reality. Together with Luminous Engine and RTX technology, we have taken one more step forward towards the kind of beautiful and realistic game that we strive to create.” - Takeshi Aramaki, Studio Head of Luminous Productions


https://www.nvidia.com/en-us/geforc...ns-geforce-rtx-2080-ti-ray-tracing-tech-demo/
 
Back Stage is a showcase demo of our work to answer the question, "How can we use ray tracing in our next generation of unrealistically optimistic demos that can't possibly translate into actual games in the near future?" - Takeshi Aramaki, Studio Head of Luminous Productions

I've, uh, corrected a typo in the above quote.
 
http://www.luminous-productions.com/news/03/

On September 4th, 2019, at CEDEC 2019, a conference for game developers in Japan, Luminous Productions unveiled "BackStage", their latest tech demo utilizing path tracing, the next-gen real-time game graphics technology.

The tech demo was launched as part of the studio's efforts to advance the next-gen gaming functionality of their proprietary game development engine, "Luminous Engine." The "Luminous Engine" is among the first to adopt path tracing, which NVIDIA's Morgan McGuire predicted in his SIGGRAPH 2019 presentation would be "fully adopted in game graphics by 2035." "BackStage" was created in collaboration with NVIDIA Corporation, taking full advantage of this real-time game graphics technology of the future. Path tracing has been implemented as a feature of the "Luminous Engine" and will be utilized in the next-gen games developed on the engine from here on out.

*Path tracing is one of the rendering techniques based on ray tracing, a next-gen game graphics technology. Path tracing unifies all lighting effects into a single ray tracing algorithm and traces the paths of light rays throughout a scene.
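
To make the footnote concrete, here is a minimal sketch of what "a single ray tracing algorithm" means in practice. This is not Luminous Engine code; it is a deliberately tiny, diffuse-spheres-only toy (the scene, names and sampling are all invented for illustration), but it shows how one path-tracing loop yields direct light, bounce light and shadows from the same mechanism.

```cpp
// Minimal path-tracing sketch: NOT the Luminous Engine's implementation, just an
// illustration of "one ray-tracing algorithm for all lighting". Diffuse spheres only.
#include <cmath>
#include <cstdio>
#include <random>

struct Vec {
    double x = 0, y = 0, z = 0;
    Vec operator+(const Vec& b) const { return {x + b.x, y + b.y, z + b.z}; }
    Vec operator-(const Vec& b) const { return {x - b.x, y - b.y, z - b.z}; }
    Vec operator*(double s) const { return {x * s, y * s, z * s}; }
    Vec mul(const Vec& b) const { return {x * b.x, y * b.y, z * b.z}; }
    double dot(const Vec& b) const { return x * b.x + y * b.y + z * b.z; }
    Vec norm() const { double l = std::sqrt(dot(*this)); return {x / l, y / l, z / l}; }
};

struct Sphere { Vec center; double radius; Vec albedo; Vec emission; };

// Toy scene: a huge "floor" sphere, a red diffuse ball, and an emissive "lamp".
static const Sphere gScene[] = {
    {{0, -1000, 0}, 999.0, {0.7, 0.7, 0.7}, {0, 0, 0}},
    {{0,     0, -3},  1.0, {0.8, 0.3, 0.3}, {0, 0, 0}},
    {{2,     3, -3},  1.0, {0, 0, 0},       {8, 8, 8}},
};

// Closest ray/sphere intersection along normalized direction d; returns index or -1.
int intersect(const Vec& o, const Vec& d, double& tHit) {
    int hit = -1;
    tHit = 1e30;
    for (int i = 0; i < 3; ++i) {
        Vec oc = o - gScene[i].center;
        double b = oc.dot(d);
        double c = oc.dot(oc) - gScene[i].radius * gScene[i].radius;
        double disc = b * b - c;
        if (disc < 0) continue;
        double t = -b - std::sqrt(disc);
        if (t > 1e-4 && t < tHit) { tHit = t; hit = i; }
    }
    return hit;
}

// Trace one light path: collect emission at each vertex, attenuate by albedo,
// and bounce in a cosine-weighted direction around the surface normal.
Vec radiance(Vec o, Vec d, std::mt19937& rng) {
    std::normal_distribution<double> gauss(0.0, 1.0);
    Vec throughput{1, 1, 1}, color{0, 0, 0};
    for (int bounce = 0; bounce < 4; ++bounce) {
        double t;
        int i = intersect(o, d, t);
        if (i < 0) break;                               // ray escaped the scene
        const Sphere& s = gScene[i];
        color = color + throughput.mul(s.emission);     // light found at this path vertex
        throughput = throughput.mul(s.albedo);
        Vec p = o + d * t;
        Vec n = (p - s.center).norm();
        Vec r = Vec{gauss(rng), gauss(rng), gauss(rng)}.norm();
        o = p;
        d = (n + r).norm();                             // cosine-weighted diffuse bounce
    }
    return color;
}

int main() {
    std::mt19937 rng(1);
    const int samples = 10000;
    Vec sum{0, 0, 0};
    // Average many paths through a single camera direction; more samples = less noise.
    for (int s = 0; s < samples; ++s)
        sum = sum + radiance({0, 1, 2}, Vec{0, -0.2, -1}.norm(), rng);
    Vec c = sum * (1.0 / samples);
    std::printf("estimated radiance: %.3f %.3f %.3f\n", c.x, c.y, c.z);
}
```

The point of interest is the loop in radiance(): there is no separate shadow, GI or reflection pass; whether a surface ends up lit or shadowed depends only on whether its bounced rays happen to reach the emissive sphere.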


 
Undoubtedly though, someone will look at work done purely with compute/T&L and say they can't see a difference between what Quantic Dream / Supermassive (Man of Medan) are doing and what this is. And that's not a slight against anyone; there is valid criticism in the need to keep framerates and resolution up while introducing features that sit in the ultra-level category of things (where laymen have a harder time seeing the difference).

I think all of that is okay; it will take some time for developers to really figure out what makes sense to leverage from ray tracing, and fully path tracing a scene isn't it. It's nice for movie making, sure, but that's not going to fly for games looking to hit 60+ fps or higher than 1080p resolution.
 
From the NVIDIA site:

“rendered almost exclusively with path tracing”

Does this mean the geometry is rendered via traditional rasterization with lighting and shading computed with RT?
 
The Luminous Productions studio head mentioned this in his presentation regarding the demo. But yeah, all of the direct lighting, GI, AO, shadows, and reflections are path-traced.

"In addition, the scene contains many translucent objects, including hair and eyelashes. When a ray hits a polygon that makes up such a semi-transparent object, it is necessary to read the texture and test whether it is opaque or semi-transparent every time.

Nonetheless, if everything is drawn with ray tracing, it can't be moved in real time, so there are a number of compromises. For example, in the case of skin expression, a subcutaneous scattering simulation that throws countless rays under the skin is not performed, and the same processing as the simple subsurface scattering technique “Screen Space Subsurface Scattering”, which is often used in recent game graphics, is used as a proxy."​
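
To illustrate the per-hit alpha test the article mentions, here is a hypothetical sketch in plain C++. The Texture and Hit types are invented for illustration and are not the engine's or DXR's actual API; in DXR this logic would live in an any-hit shader. The point is simply that every candidate hit on hair or eyelash geometry forces a texture fetch before the hit can be accepted or ignored.

```cpp
// Hypothetical per-hit alpha test for semi-transparent geometry (hair, eyelashes).
// Invented types; a sketch of the idea, not the Luminous Engine's implementation.
#include <cstdint>
#include <vector>

struct Texture {
    int width = 0, height = 0;
    std::vector<uint8_t> alpha;                 // one alpha byte per texel
    uint8_t sample(float u, float v) const {    // nearest-neighbour fetch, u/v in [0,1]
        int x = static_cast<int>(u * (width - 1) + 0.5f);
        int y = static_cast<int>(v * (height - 1) + 0.5f);
        return alpha[static_cast<size_t>(y) * width + x];
    }
};

struct Hit {
    float u = 0, v = 0;                         // texture coordinates at the hit point
    const Texture* tex = nullptr;               // null for fully opaque materials
};

// Returns true if the hit should be accepted (texel is opaque enough); false means
// the ray is treated as passing through this polygon and traversal continues.
bool acceptHit(const Hit& hit, uint8_t cutoff = 128) {
    if (!hit.tex) return true;                  // untextured/opaque geometry: accept immediately
    return hit.tex->sample(hit.u, hit.v) >= cutoff;
}

int main() {
    Texture strand{2, 1, {255, 0}};             // left texel opaque, right texel transparent
    Hit opaqueHit{0.0f, 0.0f, &strand};
    Hit transparentHit{1.0f, 0.0f, &strand};
    return acceptHit(opaqueHit) && !acceptHit(transparentHit) ? 0 : 1;
}
```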

They started developing the demo back in June. The first version they developed only ran at 5fps.

Also, regarding next-gen consoles:

"Since the Back Stage demo has been developed as a demo that runs on Windows, in order to implement real-time ray tracing, it is necessary to install "DirectX Raytracing" (hereinafter DXR), a real-time ray tracing framework for Windows 10.

However, Luminous Productions must also consider compatibility with stationary game machines other than Windows PCs. Moreover, the next-generation Playstation and the next-generation Xbox “ Project Scarlett” have been predicted to support ray tracing.

As with the current Xbox One series, the next-generation PlayStation is expected to adopt its own real-time ray tracing framework, apart from Project Scarlett, which is sure to be based on Windows x DirectX.

Therefore, Aramaki says that Luminous Engine has intentionally abstracted the functional blocks that perform ray-tracing processing without using DXR-only specifications. In other words, the Luminous Engine side defines the API and functions for ray tracing processing and controls the DXR from here. Now, for example, in order to support the next generation PlayStation, the same graphics can be drawn just by replacing the abstraction layer."
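
A minimal sketch of the kind of abstraction layer described above, assuming an interface-plus-backends design; the class and method names here are invented for illustration and are not the Luminous Engine's API.

```cpp
// Sketch of a platform-abstracted ray-tracing interface, as described by Aramaki.
// All names are hypothetical; the real engine's API is not public.
#include <memory>

// Engine-side interface: everything the renderer needs from "a ray tracer",
// expressed without any DXR-specific types.
class RayTracingBackend {
public:
    virtual ~RayTracingBackend() = default;
    virtual void buildAccelerationStructure() = 0;   // BLAS/TLAS or platform equivalent
    virtual void dispatchRays(int width, int height) = 0;
};

// Windows backend: would wrap DirectX Raytracing (DXR) calls.
class DxrBackend : public RayTracingBackend {
public:
    void buildAccelerationStructure() override { /* DXR acceleration-structure build */ }
    void dispatchRays(int, int) override        { /* DXR ray dispatch */ }
};

// Console backend: same interface, different implementation underneath.
class ConsoleBackend : public RayTracingBackend {
public:
    void buildAccelerationStructure() override { /* platform-specific API */ }
    void dispatchRays(int, int) override        { /* platform-specific API */ }
};

// The renderer only ever sees the abstract interface, so swapping platforms
// means swapping the backend object, not the rendering code.
void renderFrame(RayTracingBackend& rt) {
    rt.buildAccelerationStructure();
    rt.dispatchRays(1920, 1080);
}

int main() {
    std::unique_ptr<RayTracingBackend> backend = std::make_unique<DxrBackend>();
    renderFrame(*backend);
}
```

Swapping DxrBackend for ConsoleBackend would then be the only change needed on the rendering side, which is the portability property Aramaki describes.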
 
That's amazing indeed! It would help a ton in making a world filled with NPCs that look unique without the artists manually making them one by one.

That would be a nightmare for character creation though... I could already spend hours in BLACK DESERT ONLINE or Astral Chain character creation/modification... with this new Luminous tech? Yikes! I won't ever finish making a character hahaha

Hopefully the game that uses the next-gen Luminous won't get too many delays / development hell.
 
That vid is kind of creepy. The quality of the animation does not match the quality of the rendering. I know it's just a demo, but it's off-putting.
A very polite way of saying it.
To me, the two problems of lighting and realistic characters only intersect at SSS, which is not improved here over previous realtime methods (for good reasons).
I wish they had chosen a very different scene to show the lighting progress. Better lighting won't help much with characters; I'm pretty sure of that.
So... impressed, or even disgusted? How subjective is it? Not sure; for me it's both, but more of the latter.
For games I think we should continue (or even relearn) what it has always been like: design around limitations, avoid issues like the uncanny valley. Target games, not movies. There seems to be only one company that gets both right, and I guess that's less a matter of technology.
 