Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

Tried Immortals - but no. Runs badly, looks bad, and it is always good to be CPU-limited under 80 FPS. Don't know what has happened in the last 10 years, but UE5 is the epitome of what is just wrong with PC gaming:
 

It's pretty easy to hit spots in Remnant 2 where I'm CPU-limited and the FPS is around that mark. Unfortunately, I think having a single main render thread (from what I can tell from the docs) is going to make CPU limitations fairly common.
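A rough back-of-the-envelope sketch of why that happens (illustrative numbers, not measurements from the game): with one main render thread, the frame time is bounded by the slowest serial stage, so roughly

    frame_time_ms >= max(game_thread_ms, render_thread_ms, gpu_ms)

If recording a heavy frame on that single render thread takes 12.5 ms, you are capped at 1000 / 12.5 = 80 FPS no matter how much GPU headroom is left.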
 
Consolidation is bad. No competition means mediocre results. Inefficiency means less performance. Cyberpunk with path tracing and DLSS Performance will perform 50%+ better than this game, because this is so CPU-limited.
 

I think a lot of studios are up against the wall as rendering and game development in general become more complex. Staying cutting edge becomes more and more difficult, and everyone is competing to hire the same experts. Tons of them are at Epic and Nvidia.
 
Why is this? It seems there are so many universities now teaching those things. I lack any academic background and don't know how much this helps, but hell - you can even study game design now - shouldn't there be enough people?
 

I don't really know the answer. It could be that a lot of the people graduating with master's or PhD degrees in graphics fields are going to Nvidia or film or other fields that are not gaming, as game studios are generally not believed to pay competitive wages. Undergrads that major in game development probably know some fundamentals but lack experience to really make breakthroughs on average.
 
Undergrads that major in game development probably know some fundamentals but lack experience to really make breakthroughs on average.
Yeah, probably. I also assume AAA studios have no time for years-long research, and can't allocate talent to speculative ideas which might or might not work out.
But this does not explain why they release games with serious performance issues. No breakthroughs are needed - just optimization.

Idk. Maybe game developers nowadays are just content creators, and they think optimization means clicking the right checkboxes and tweaking parameters.
That's what the angry gamer in me thinks at least. But usually angry gamers don't know anything about the game development they love to hate so much. :D
 
Why is this? It seems there are so many universities now teaching those things. I lack any academic background and don't know how much this helps, but hell - you can even study game design now - shouldn't there be enough people?
I think most courses are aimed at a lower level, using existing tools. Creating tools is very multidisciplinary and, I dare say, to be good at it you need to 'apprentice' for years among those who know what to do. There's also little scope to create an engine. Who's going to employ you if they are all using UE5? So you want to create your own engine for your own game, which ends up as interesting experiments with videos on YT but which don't progress to broad engines others can use. And if you have the level of ability to create bleeding-edge graphics or process the maths to derive algorithms, there'll be better-paid jobs out there than creating your own engine, not least of which is working for those already in the field with big money from ubiquitous products. If you are taking on the cost of a Uni course that'll provide the knowledge and experience to create engines, it's likely with a view to getting a job at one of these major employers.

There's a fair bit of grumbling at the moment about UE5 and performance. I think it's really all explained by economics. If you want a different landscape with competing engines driving efficiency and techniques forwards, you need to solve the economics of efficient AAA game-dev and engine maintenance. Oh, and the economics of graphics hardware and why it's so expensive, such that the same hardware progress can't be relied upon to power software advances any more.
 
Why is this? It seems there are so many universities now teaching those things. I lack any academic background and don't know how much this helps, but hell - you can even study game design now - shouldn't there be enough people?

I assume it is because education churns out people who think it is a job and not a calling, i.e. they do not care too much. They just do the minimum of things they have to; why learn the more difficult things when you can just whip up something in VB for a Fortune 500 company and earn more?
In the older days, you were interested in the field; today it seems like more people are into the field because it can be a well-paid field to be in.
 
In the older days, you were interested in the field; today it seems like more people are into the field because it can be a well-paid field to be in.
It was also easier to experiment and discover. There's only so much you can do in 64 KB of RAM and 16 colours, or 4 MB and a primitive GPU. To progress graphics now you need huge amounts of code mixing complex APIs.

It's just like invention. Back in the day people could mix chemicals in their own homes and discover stuff. Now chemicals are regulated and all discovered, and if you want to progress the art, you need to work in a well-funded lab that's pursuing a commercial return on its tech.
 
So you want to create your own engine for your own game
I see some problems with that mindset. Depending on your ambitions you can't do both as a single person, so you'd better pick just one.
If you work on your own engine, you end up replicating the state of the art. Importing FBX files, animation blending and IK, a PBR renderer, networking, sound, your own scripting language... It's way too much these days, so likely you'll fail to catch up with the state of the art, and the chances you come up with something revolutionary are tiny. It's just reinventing wheels.
Still, it's those people who then can maintain a large AAA engine. So we need them.
But their number is declining rapidly. That's obvious, and worrying in the long run.
I'd like to see more open source libraries which people can use and plug together. So they can focus on just one thing in detail, increasing chances to come up with new things.

I think it's really all explained by economics. If you want a different landscape with competing engines driving efficiency and techniques forwards, you need to solve the economics of efficient AAA game-dev and engine maintenance.
Yeah. Maybe I underestimate the technical side of this.
I always assumed it is content creation and its cost which hurts the AAA industry the most.
A solution to that would be to make smaller games and target niches. AAA makes huge games targeting some averaged mainstream player. But maybe this player does not exist, and so the result is often disappointment.
BG3 would be an example, looking at how happy RPG fans are about it. But it's also an example of games being just too big, so it does not show a path forwards either.

I assume it is because education churns out people that think it is a job and not a calling, ie they do not care to much. They just do a minimum of things they have to, why learn the more difficult things when you can just whip up something i VB for Fortune 500 and earn more?
Everybody currently says this about younger workers in general. It seems the Covid crisis and lockdowns have amplified this problem.

In the older days, you were interested in the field; today it seems like more people are into the field because it can be a well-paid field to be in.
If so, this is the result of our own failure. If it is true that the games industry has recently cared only about profit rather than innovation, it is no wonder we lack a younger generation still excited about the medium.
We would need another Doom moment.

Well, currently just about everything else looks bad too. Let's hope there are better times ahead...
 
Lumen is probably not the most efficient choice if your environment lighting is going to be completely static, which seems to be the case in this Immortals game.
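If the lighting really is fully static, a minimal sketch of what a baked-lighting fallback could look like in DefaultEngine.ini, assuming stock UE5 console variable names (treat this as illustrative rather than the game's actual config, and verify the cvar names/values against your engine version):

    [/Script/Engine.RendererSettings]
    r.AllowStaticLighting=True            ; allow lightmaps to be built and used
    r.DynamicGlobalIlluminationMethod=0   ; 0 = none (rely on baked GI), 1 = Lumen, 2 = SSGI
    r.ReflectionMethod=2                  ; screen-space reflections instead of Lumen reflections

The trade-off is lightmap bake time and lightmap memory in exchange for dropping the per-frame GI cost.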
 
Why is this? It seems there are so many universities now teaching those things. I lack any academic background and don't know how much this helps, but hell - you can even study game design now - shouldn't there be enough people?
Rendering is specialist, difficult work. Shipping a game at a high quality level means knowing tons of undocumented or barely documented things about each graphics API, having detailed working knowledge of the trade-offs between all of the GPU hardware features for all of the GPUs you support, and being a great systems programmer (already quite rare!) on top of that. This is rare and expensive.

It's never surprising when a first game by a new studio fails to achieve the perf and quality heights of an AAA game by an established studio.

I do agree this game doesn't really look amazing, and I don't think the perf is all that terrible really, but you're probably CPU-bound by gameplay code, not core engine features. An admirable and ambitious first attempt by a new small studio; I hope they get the resources they need to grow and make another game in the future.

I assume it is because education churns out people who think it is a job and not a calling, i.e. they do not care too much. They just do the minimum of things they have to; why learn the more difficult things when you can just whip up something in VB for a Fortune 500 company and earn more?
In the older days, you were interested in the field; today it seems like more people are into the field because it can be a well-paid field to be in.

Sorry if this reads like calling you out, but: I hate this attitude. I might be one of the younger workers you're complaining about (early 30s), but people treat it like a job because it is one -- it's difficult work* in difficult conditions in a constantly changing landscape. However, I also don't think anyone is truly in it for the money; at least on the engineering side, pay is well below the rest of tech in most studios.

*as far as white collar office jobs are concerned. There are of course many harder jobs out there.
 
Additionally, if a university wants to run a course that can train people specifically in engine development and advanced rendering techniques, who would they get to teach it? I think the level of staff you'd be able to get outside one of the absolute top establishments in the world would be broadly knowledgeable and kinda experienced-ish. ;) They'd impart enough knowledge for the enthusiast to go watch GDC lectures, cobble together a homebrew start on an engine, and get employment. Which is what the current Comp Sci education seems to rely on.

Even if Caltech or MIT secure John Carmack or Sebastian Aaltonen to speak once a week as part of their comp sci courses, how many people will be trained and what will their expectations be? And if they dared create a course specific to real-time rendering theory, how many people would it produce and what jobs would they want?

I think all the world's leading authorities were just in it from early enough that they could be self taught from bedroom coding and grow alongside the technology. Any new great minds will be those with a basic comp sci education and a butt-load of enthusiasm following GDC!
 
Sorry if this reads like calling you out, but: I hate this attitude. I might be one of the younger workers you're complaining about (early 30s), but people treat it like a job because it is one -- it's difficult work* in difficult conditions in a constantly changing landscape. However, I also don't think anyone is truly in it for the money; at least on the engineering side, pay is well below the rest of tech in most studios.

*as far as white collar office jobs are concerned. There are of course many harder jobs out there.

No offense taken; you are right, it is just a job. But as an employer I see that people who treat it as just an 8-to-16 job usually do not do great things the way the genuinely interested person does.
I am also a big fan of work/life balance, so I never demand anything more than 8-to-16 from an employee, but we do not make technically complex products like an AAA game.

It was not a dig at younger people in general, but today computer scientist/programmer is as regular a job as accountant, taxi driver or doctor; it's a paycheque for most people.
When I grew up, programmers were mostly a bunch of crazy dudes*, for whom money was not even in the equation :) I say that because I wanted to be like that, and that was the path I started on.

*I did not know of any female programmers at that time, but there were of course many, and they were top notch too.
 
Gave IoA a quick try last night. On the plus side, performance was pretty steady for me, generally hovering around 100 FPS with everything maxed (which is not the default, btw, even on a very high-end machine; some of their settings recommendations around CPU stuff I don't really understand, nor do they seem to affect performance much at all), DLSS Quality and Reflex on (but no frame interpolation). Certainly some places are CPU-bound, but on my machine I didn't really drop below 90, and everything felt responsive. Didn't really run into any shader compile stutter, so nice to see that. Had the odd frame skip on camera cuts, but it wasn't too common.

Graphics were a bit uneven, but there were a few legitimately impressive bits (and a lot better than Remnant 2 overall on that front). View distances and the level of clutter were notably higher than in most games, which was nice to see; my son even commented on it when he walked into the room. There's a bit of noise visible in a few scenes.

If I were to nitpick on the shadows front - keeping in mind these are just my first impressions from a couple of hours of messing around:

1) Indeed there are some objects that don't cast shadows... some likely for performance reasons, but it would be nice to have a setting regardless, since in many of these scenes there is GPU performance available. Some of them appear to just be oversights, as static Nanite objects can generally cast shadows with VSMs very cheaply.

2) I've seen this trend in a lot of recent games, but there's an over-use of screen-space shadows, which have unavoidable artifacts by their very nature (see attached). I presume this is primarily to cover cases on lower-end settings/GPUs where you would otherwise be missing shadows from more stuff, but it would be nice if these settings were backed off at the higher shadow quality settings where VSMs can cover the detail more accurately (see the sketch after this list).

3) There are a few places with some weird exposure swings. I assume this is just some slightly buggy volumes, but it's kind of distracting when you find the "edge" of one of them.
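On points (1) and (2), a minimal sketch of the knobs involved, assuming the screen-space shadows here are UE's contact shadows and using stock UE5 cvar names from memory (verify against your engine version; the contact shadow length itself is a per-light property rather than a cvar):

    r.Shadow.Virtual.Enable=1   ; keep Virtual Shadow Maps on so fine shadow detail comes from VSMs
    r.ContactShadows=0          ; back off the screen-space contact shadow pass at high shadow settings

The idea being that at the higher shadow quality levels the VSM path covers the detail, with the screen-space pass reserved for lower-end settings.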

We'll see how later environments are in the game. As with Remnant 2 there are some that are quite impressive looking and others that are kind of bland. I hope the trend towards more clutter and detail continues further as that - in addition to lighting - is a big part of closing the remaining gap between "game-y looking" and more realistic IMO.
 

Attachments: ioa2.png (4.9 MB), ioa3.png (5 MB)
I assume it is because education churns out people who think it is a job and not a calling, i.e. they do not care too much. They just do the minimum of things they have to; why learn the more difficult things when you can just whip up something in VB for a Fortune 500 company and earn more?
In the older days, you were interested in the field; today it seems like more people are into the field because it can be a well-paid field to be in.

There are still many people going through the system who do care and for whom programming is their life. So those people are still making it through the system.

However, demand for programmers far exceeds the supply of those people, so lower-tier colleges (anything below, say, MIT, Carnegie Mellon, Caltech, etc.) are constantly lowering the bar of entry for their CS programs in order to draw in enough students to fill the expansion of those programs and meet the demand. It really was shocking to me to meet some of the fellow CS students of one of my cousins' kids. But that's not surprising, as there was a study done a few years ago showing that programming is a field where you not only need to have a desire for programming, but your brain needs to be capable of understanding programming logic and complex algorithms. And no matter how much time is spent attempting to teach someone without an "affinity" for programming, they'll never progress beyond relatively basic programming ability. I.e., most people could learn to do advanced math with time; most can't learn to become a good programmer.

It's also harder now for a student to build a really broad technical base for CS, because of the extremely limited number of elective credits available once the mandatory non-core credits are accounted for (the basics such as English, foreign language, history, art and non-core sciences, as well as newer requirements for social and gender programs). When I went through university, about 1/3 to 1/2 of my credit load could be dedicated to electives of my choice. Nowadays that is down to maybe 1/10th (or less, depending on the college/university) of a typical 4-year degree credit load.

So, for example, it's extremely difficult to take a core CS focus and then have a broader understanding of alternative CS focuses or even math and science. I.e., with the non-core requirements greatly expanded at the expense of elective credits, it's almost impossible for a student in college/university to have as broad an understanding of the field and related fields as in the past, just due to the inability to fit it all within their 4-year degree. And going for a Master's degree doesn't help with that, since a Master's just narrows your focus even further.

On top of that, non-gaming companies are able to offer significantly more money to attract top talent coming out of university. Due to the boom in AI, companies like Google, Amazon and Microsoft are getting close to offering a million USD a year to experienced AI programmers, which means the bottom end of the stack (programmers coming out of university) is likely also getting a huge bump in what they are being offered.

That means that, for the people you are referencing, gaming is being relegated to the level of webdev in terms of financial compensation compared to what the big tech companies are offering for top talent.

I.e., it's all a combination of the really smart guys from before retiring and the remaining really smart guys increasingly being drawn to opportunities that pay more. So people who would have gone into engine (rendering) development are instead looking at what the big tech companies are offering and choosing to go in that direction.

There's not much that game developers can do to attract that talent, especially when they are limited by how much they can charge for games.

The only companies that might be able to afford to offer something vaguely comparable to what the big tech companies are offering are dedicated engine devs ... like Epic. That means that all the best engine developers will eventually migrate out of development houses and towards dedicated engine development houses (Epic, Unity, etc.).

Due to the economics involved and the limitations placed on game developers WRT budgeting, it's just not possible for most AAA game development houses to support a dedicated engine developer, much less several, the way they could even 10 years ago. It's far easier to have someone who can modify an established engine's rendering code and leverage resources that can be shared across multiple development houses.

When so much of AAA game development relies on graphics and graphics assets, do you pay one guy multiple hundreds of thousands of dollars, or do you hire, say, 10 or 20 asset design artists for the same amount of money?

Regards,
SB
 
PCGamesHardware did an excellent write-up benchmarking the game Desordre and its Lumen and RTXDI implementations.

With Hardware Lumen at max quality and native 4K, the 4090 is 2.2x faster than a 7900 XTX, and the 3090 Ti is 2x faster than the 6950 XT.
With RTXDI, the 4090 is 2.3x faster than a 7900 XTX, and the 3090 Ti is 90% faster than the 6950 XT.

In summary, the gap between vendors remains almost the same whether with max Hardware Lumen or with RTXDI.

 