Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

This interview around the requirements is a bit of a mess:


It seems neither the interviewer nor the guy being interviewed really understands PC hardware.

The statement about their specs drawing a line between RT capable and non RT capable hardware was particularly amusing.

They also mention using FSR on console to achieve HD resolution but I'd take that with a huge pinch of salt.
 
The statement about their specs drawing a line between RT capable and non RT capable hardware was particularly amusing.
They are obviously going to support hardware RT, and they are still optimizing the game. I think their 5700 XT requirement will be updated accordingly close to launch; they said so themselves.

The 5700 XT would be the minimum spec we are targeting for software-based raytracing, and our team is working hard to continue optimizing the game until launch in July. We will be transparent with our community and update our recommended minimum requirements if anything changes.
 
Here's a longer gameplay trailer.

Is it just me, or do the magic effects not have any interactive lighting? I don't know if that's a performance or gameplay decision, but it seems like a bit of a shame.


I hate the visual direction of this game. Too much contrast in everything. It's like Michael Bay Transformers "wtf is going on" levels of visual clutter with extreme contrast. If they had dynamic lights on everything I'd probably have a seizure.
 
This interview around the requirements is a bit of a mess:


It seems neither the interviewer nor the guy being interviewed really understands PC hardware.

The statement about their specs drawing a line between RT capable and non RT capable hardware was particularly amusing.

They also mention using FSR on console to achieve HD resolution but I'd take that with a huge pinch of salt.
Isn't this game UE5? Why wouldn't they be using TSR?
 
I hate the visual direction of this game. Too much contrast in everything. It's like Michael Bay Transformers "wtf is going on" levels of visual clutter with extreme contrast. If they had dynamic lights on everything I'd probably have a seizure.
The self-shadowing on the right hand/weapon here is certainly an interesting choice - it's both temporally unstable and very low resolution, which seems a bit odd given that you'd figure their gameplay trailers would be captured on a 4090 to showcase the game in the best possible light. In the trailer to me it was distracting enough that I'd probably prefer it not to be there at all:


Hopefully it's just a bug / can be optimized away.
 
Just a small thought: since Fortnite used software RT to reach 60fps on consoles with lower-quality Lumen, why didn't they use hardware RT at software RT's level of quality?
Or is the cost of turning on hardware RT always higher even at the lowest possible quality, and does it only scale better at higher quality levels?
 
Just a small thought: since Fortnite used software RT to reach 60fps on consoles with lower-quality Lumen, why didn't they use hardware RT at software RT's level of quality?
Or is the cost of turning on hardware RT always higher even at the lowest possible quality, and does it only scale better at higher quality levels?
Lumen IS software RT.
 
Lumen IS software RT.
From UE5 docs:
Lumen provides two methods of ray tracing the scene: Software Ray Tracing and Hardware Ray Tracing.
  • Software Ray Tracing uses Mesh Distance Fields to operate on the widest range of hardware and platforms but is limited in the types of geometry, materials, and workflows it can effectively use.
  • Hardware Ray Tracing supports a larger range of geometry types for high quality by tracing against triangles and to evaluate lighting at the ray hit instead of the lower quality Surface Cache. It requires supported video cards and systems to operate.
Software Ray Tracing is the only performant option in scenes with many overlapping instances, while Hardware Ray Tracing is the only way to achieve high quality mirror reflections on surfaces.
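For intuition, the Software Ray Tracing path described above boils down to sphere tracing a signed distance field: step along the ray by the distance the field reports, which can never overshoot a surface. Here is a minimal toy sketch; a single analytic sphere stands in for a baked mesh distance field, so this is illustrative only, not Lumen's code:

```python
import math

def scene_sdf(p):
    """Signed distance to a unit sphere at the origin, standing in for a
    baked mesh distance field (real MDFs are per-mesh 3D textures)."""
    return math.sqrt(p[0] ** 2 + p[1] ** 2 + p[2] ** 2) - 1.0

def sphere_trace(origin, direction, max_steps=64, eps=1e-4, max_dist=100.0):
    """March along the ray by the reported distance each step.
    Stepping by the SDF value can never overshoot a surface, and needs
    no triangles or BVH -- just distance lookups, which is why this
    style of tracing runs on hardware without RT acceleration."""
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = scene_sdf(p)
        if d < eps:
            return t   # within eps of the surface: count it as a hit
        t += d
        if t > max_dist:
            break
    return None        # ray escaped or ran out of steps: miss

# A ray fired from z = -3 straight at the sphere hits near t = 2.
hit = sphere_trace((0.0, 0.0, -3.0), (0.0, 0.0, 1.0))
miss = sphere_trace((0.0, 0.0, -3.0), (0.0, 1.0, 0.0))
```

The trade-off in the docs quote falls out of this: the field is a blurry, discretized proxy of the real geometry, so it is fast and hardware-agnostic but can't give you sharp mirror reflections the way triangle-accurate hardware tracing can.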

What I'm wondering is whether we can do HWRT on a simplified scene, closer to the Mesh Distance Fields level of quality, but still leverage the hardware acceleration from HWRT.
 
There is a penalty with hardware RT: building the BVH, tracing rays, etc. You cannot use it to speed up software Lumen. UE5 has gone against the grain of current GPU architectures, so you end up with slow software Lumen and even slower hardware Lumen.
 
What I'm wondering is whether we can do HWRT on a simplified scene, closer to the Mesh Distance Fields level of quality, but still leverage the hardware acceleration from HWRT.

The mesh distance fields and surface cache used in software tracing are pretty low resolution but relatively fast to trace and shade. SDFs have multiple LODs (mips) which help. Theoretically you could also build a very low resolution triangle BVH representation of the scene and trace that instead. No idea if it would work as well as the SDF and it’ll probably be slower.
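The mip idea above can be sketched as a simple distance-based LOD pick. The thresholds, mip count, and voxel size below are invented for illustration and are not Lumen's actual selection logic:

```python
def pick_sdf_mip(ray_t, num_mips=3, lod_distance=50.0):
    """Pick a coarser distance-field mip the farther the ray has traveled:
    mip 0 out to lod_distance, mip 1 out to 2x that, and so on.
    Illustrative heuristic only -- not Lumen's actual selection logic."""
    mip = 0
    while mip + 1 < num_mips and ray_t > lod_distance * (2 ** mip):
        mip += 1
    return mip

# Effective voxel edge doubles per mip: coarser field, cheaper lookups,
# less accurate hits (0.25-unit voxels at mip 0 is an assumed figure).
base_voxel = 0.25
lods = {t: (pick_sdf_mip(t), base_voxel * 2 ** pick_sdf_mip(t))
        for t in (10.0, 60.0, 500.0)}
```

The same trade applies to the hypothetical low-resolution triangle BVH: coarser proxies cut memory and trace cost, but every level of coarsening moves hits further from the true surface.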

Software Lumen is nice because it works on older hardware without RT acceleration. However SDF tracing requires quite a few hacks and compromises. I suspect it will be abandoned on the next generation of consoles. Triangle tracing is a much simpler and more robust tracing solution with fewer compromises. The surface cache will probably be superseded too by faster, higher resolution and more space efficient solutions like neural radiance caching.
 
The #1 advantage of M&K that's often touted is the ability to get headshots; no gameplay mechanic that makes headshots substantially more difficult is going to survive playtesting. And the aspect of this that helps the realism isn't just the quick panning, it's that the pans don't stop on a dime, even though stopping on a dime is a key benefit of M&K.
I think they are pretty clearly aiming for a different type of game here though, right? Like if you're going for "realism" we're not gonna be doing 360 no scope headshots or whatever. I suspect this will not appeal to the same audience, but it will be very interesting to see where they go with the gameplay and how it feels in practice!

The rendering definitely looks pretty nice although there are still a few telltale things. Primarily I'd say this is a really good demo of the power of photogrammetry. As that gets easier and more ubiquitous and the pipeline between stuff like "point your phone at something for 5 seconds" into "your object is now in Fortnite or Roblox or whatever and you can instantly use it make design stuff with others" gets better, I'm very excited to see the explosion of content. Obviously this is primarily useful for making realistic games so there will always be artistry involved regardless, but it should open up development to a broader set of people, which is good for everyone!
 
I suspect it will be abandoned on the next generation of consoles.
I keep mentioning this but I don't think it has really registered with the tech press/common folks yet but... building BVHs and TLASs for large, dynamic games is currently still an unsolved problem. The limitations I foresee here are a lot less to do with tracing and shading rays and a whole lot more to do with maintaining these expensive data structures. Any problems with various forms of geometry that Nanite has, raytracing has even worse.
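To make the build-cost point concrete, here is a toy top-down BVH builder over axis-aligned boxes. The point isn't the tracing: a full rebuild sorts and touches every instance, roughly O(n log n), every time the scene changes, and that is the cost that explodes at hundreds of thousands of dynamic instances. This is an illustrative sketch, not any engine's actual builder:

```python
import random

def build_bvh(prims, leaf_size=4):
    """Toy top-down BVH over axis-aligned boxes ((min_xyz, max_xyz) pairs),
    median-splitting on the longest axis. A full rebuild touches and sorts
    every primitive no matter how few rays get traced afterwards."""
    lo = [min(b[0][a] for b in prims) for a in range(3)]
    hi = [max(b[1][a] for b in prims) for a in range(3)]
    if len(prims) <= leaf_size:
        return {"bounds": (lo, hi), "prims": prims}
    axis = max(range(3), key=lambda a: hi[a] - lo[a])
    ordered = sorted(prims, key=lambda b: b[0][axis] + b[1][axis])
    mid = len(ordered) // 2
    return {"bounds": (lo, hi),
            "left": build_bvh(ordered[:mid], leaf_size),
            "right": build_bvh(ordered[mid:], leaf_size)}

def leaf_prims(node):
    """Count primitives reachable from a node (sanity check)."""
    if "prims" in node:
        return len(node["prims"])
    return leaf_prims(node["left"]) + leaf_prims(node["right"])

# 10,000 random unit boxes: every one is visited per rebuild. Scale that
# to millions of moving instances, every frame, and the build -- not the
# ray tracing -- becomes the bottleneck.
random.seed(0)
boxes = [((x, y, z), (x + 1.0, y + 1.0, z + 1.0))
         for x, y, z in ((random.uniform(0, 100),
                          random.uniform(0, 100),
                          random.uniform(0, 100)) for _ in range(10000))]
root = build_bvh(boxes)
```

Real engines mitigate this with incremental refits and partial updates, but refitting degrades tree quality under large motion, which is why the problem reads as unsolved for large dynamic worlds.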

I've been playing Cyberpunk a bit and it is interesting because it has fairly complex lighting but very simple geometry, all things considered (both from a triangle count and instance count perspective). This post (https://chipsandcheese.com/2023/03/22/raytracing-on-amds-rdna-2-3-and-nvidias-turing-and-pascal/) pegs the instance count at ~50k instances which in UE5 terms is more like Fortnite than something like CitySample (assuming a full BVH, which obviously it cannot afford to do currently). I'm not sure if Cyberpunk is actually tracing primary rays even on "path tracing" mode... it's kind of unclear between some of the tech press around it vs. the SIGGRAPH presentation (which contradict each other in multiple places). But ultimately if the goal is to get to fully ray traced complex, open world games it's not entirely clear how we overcome the current BVH building limitations without major API and possibly hardware changes. And of course on the performance front there's still quite a ways to go.

Does anyone happen to know of a game that manages hundreds of thousands or millions of instances with RT currently?

That said, I definitely agree we're in a phase where all of this stuff is continuously evolving and raytracing is an increasingly important tool. I'm mostly just frustrated that the time where it starts to seem reasonable to rely on using it exclusively is still "vaguely too far off in the future" due to the issues around BVHs. It's easy enough to project ray intersection and shading performance, but at least for now it's still hard to see how a good amount of games could rely on having a full BVH present out to the horizon with no compromises at all.
 
I think they are pretty clearly aiming for a different type of game here though, right? Like if you're going for "realism" we're not gonna be doing 360 no scope headshots or whatever. I suspect this will not appeal to the same audience, but it will be very interesting to see where they go with the gameplay and how it feels in practice!
That's fine as an approach to deliver a unique visual experience, and it's possible they will still keep this type of camera movement in the game, sure. My point was more that attributing this look primarily to the superiority of mouse and keyboard movement doesn't necessarily track, given how much of the movement is seemingly outside of player input, and also that this type of inaccuracy goes completely against the ethos that M&K users often tout: the advantage of responsive, fine-grained control that only that input combo can provide.

I have sincere doubts that this group in particular will accept a control system that effectively neuters the primary effectiveness of their input method to such a degree for the sake of realism.
 
My point was more that attributing this look primarily to the superiority of mouse and keyboard movement doesn't necessarily track, given how much of the movement is seemingly outside of player input, and also that this type of inaccuracy goes completely against the ethos that M&K users often tout, which is the advantage of responsive, fine-grained control that only that input combo can provide.
Agreed, I wasn't aware they were attributing any of the look to M+KB controls in particular...

I have sincere doubts that this group in particular will accept a control system that effectively neuters the primary effectiveness of their input method to such a degree for the sake of realism.
Ehh, I think it'll be fine. Lots of folks play 3rd person action games on M+KB that were really designed for controllers, with heavy lock-on and brawler type controls. I tend to play these with controllers (on PC still), but it's "fine" on M+KB as well and I have lots of friends that are happy with it. Just depends on expectations for a given game. This one looks different enough that I suspect people will not immediately bucket it in with other FPS games/controls. That said, whether it will be fun in its own right remains to be seen.

Anyways sorry for the digression. Definitely an interesting visual concept.
 
My point was more that attributing this look primarily to the superiority of mouse and keyboard movement doesn't necessarily track

To be clear, that's not what I was saying. The look is achieved by the shaky cam, independent hand and head movement, and amazing animations. But IMO it's built on a foundation of mouse controls, because without that, the basic camera movement itself would be far too smooth, even, and slow to be convincing as a true body cam, regardless of how much you shake it.
 