Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

Don't know what has happened in the last 10 years, but UE5 is the epitome of what is just wrong with PC gaming.
Personal opinions aside, I decided to take a quick peek back 10 years to see if my memory was fuzzy. As it turns out Battlefield 4 launched pretty close to 10 years ago. Skimming some reviews from the time it turns out my memory is not yet entirely gone... it ran around 60fps (with dips quite a bit lower) on a GTX Titan on Ultra, and about 70-80 on High, both at 1080p. You can of course have whatever opinions you want on visual quality vs. performance for specific games (and certainly we'll increasingly pass points where graphics are good enough for some people), but I don't think the characterization that baseline game performance has gotten way worse really stands up to serious scrutiny.

The irony of the whole "rendering engineers back in the day were the true heroes" line is that it's mostly the same people now that it was then. That's certainly not a great situation in the long run, but the reality is when I was a kid everyone wanted to be game developers and sometime between then and now people primarily want to go work for big internet/tech companies. The shift has become very obvious as I have interviewed and hired over the years; there are fewer people who are trained and/or interested in the systems-level programming work that game development and rendering specifically require. That said, we're not yet at a dire point as many of us still have at least a decade or two of work left in us, but this will increasingly become a problem at some point, assuming AI doesn't just take over the remainder of the work and we brute force it even more via that route 😆
 
Personal opinions aside, I decided to take a quick peek back 10 years to see if my memory was fuzzy. As it turns out Battlefield 4 launched pretty close to 10 years ago. Skimming some reviews from the time it turns out my memory is not yet entirely gone... it ran around 60fps (with dips quite a bit lower) on a GTX Titan on Ultra, and about 70-80 on High, both at 1080p. You can of course have whatever opinions you want on visual quality vs. performance for specific games (and certainly we'll increasingly pass points where graphics are good enough for some people), but I don't think the characterization that baseline game performance has gotten way worse really stands up to serious scrutiny.

The irony of the whole "rendering engineers back in the day were the true heroes" line is that it's mostly the same people now that it was then. That's certainly not a great situation in the long run, but the reality is when I was a kid everyone wanted to be game developers and sometime between then and now people primarily want to go work for big internet/tech companies. The shift has become very obvious as I have interviewed and hired over the years; there are fewer people who are trained and/or interested in the systems-level programming work that game development and rendering specifically require. That said, we're not yet at a dire point as many of us still have at least a decade or two of work left in us, but this will increasingly become a problem at some point, assuming AI doesn't just take over the remainder of the work and we brute force it even more via that route 😆
BF games back then were offering an experience that no other games were, though.
 
Personal opinions aside, I decided to take a quick peek back 10 years to see if my memory was fuzzy. As it turns out Battlefield 4 launched pretty close to 10 years ago. Skimming some reviews from the time it turns out my memory is not yet entirely gone... it ran around 60fps (with dips quite a bit lower) on a GTX Titan on Ultra, and about 70-80 on High, both at 1080p. You can of course have whatever opinions you want on visual quality vs. performance for specific games (and certainly we'll increasingly pass points where graphics are good enough for some people), but I don't think the characterization that baseline game performance has gotten way worse really stands up to serious scrutiny.
I disagree with the premise that things have not gotten worse. Didn't BF4 run at 900p on the PS4 at 60fps with high settings? There were dips, but that's a game I played at launch. Furthermore, DICE was doing some crazy stuff with destruction, networked waves, etc. This stuff hadn't been seen before. The performance was also visibly justifiable. Of all the engines out there, UE5 is definitely the worst, IMO. It offers a lot of new features which are theoretically great but in practice offer marginal improvements in visuals.
The irony of the whole "rendering engineers back in the day were the true heroes" line is that it's mostly the same people now that it was then.
Is it? I'm not so sure. We saw what happened to Battlefield after DICE lost core devs. You cannot convince me that the same people who made Arkham Knight made the new entry. The regression in technical proficiency was so large that it was visible even to the eyes of the uneducated. These are some of the many examples of technical regression we're seeing today. Outside of Sony's ICE team, the guys at Nvidia, and a handful of European devs, no one appears to be doing anything that's technically impressive. Instead they rely on bloated engines like UE5 and deliver subpar results.
 
Outside of Sony's ICE team, the guys at Nvidia, and a handful of European devs, no one appears to be doing anything that's technically impressive.

I read what he said as: today it's the same (number of) engineers but a lot more projects and studios.
So there will be a bunch of titles that don't have them and just coast along, while some projects here and there do something extra.
Which is sort of what you are saying.
 
I read what he said as: today it's the same (number of) engineers but a lot more projects and studios.
So there will be a bunch of titles that don't have them and just coast along, while some projects here and there do something extra.
Which is sort of what you are saying.

It's more like: if you look at GDC and SIGGRAPH talks, and names on research papers, you'll see a lot of the same names you saw a long time ago. There are definitely new people, but I don't know if the numbers are growing in a way that can satisfy an industry where everyone wants their own custom engine with a cutting-edge renderer. You're seeing a lot of talent moving to Nvidia and Epic.
 
I disagree with the premise that things have not gotten worse. Didn't BF4 run at 900p on the PS4 at 60fps with high settings? There were dips, but that's a game I played at launch. Furthermore, DICE was doing some crazy stuff with destruction, networked waves, etc. This stuff hadn't been seen before. The performance was also visibly justifiable. Of all the engines out there, UE5 is definitely the worst, IMO. It offers a lot of new features which are theoretically great but in practice offer marginal improvements in visuals.

Is it? I'm not so sure. We saw what happened to Battlefield after DICE lost core devs. You cannot convince me that the same people who made Arkham Knight made the new entry. The regression in technical proficiency was so large that it was visible even to the eyes of the uneducated. These are some of the many examples of technical regression we're seeing today. Outside of Sony's ICE team, the guys at Nvidia, and a handful of European devs, no one appears to be doing anything that's technically impressive. Instead they rely on bloated engines like UE5 and deliver subpar results.

A lot of the core devs that left Battlefield/DICE went to Embark Studios. You might want to look up what game engine they're using for Arc Raiders and The Finals. I'm personally looking forward to whatever id Software cooks up next.
 
Why is this? It seems there are so many universities now teaching those things. I lack any academic background and don't know how much this helps, but hell - you can even study game design now - shouldn't there be enough people?

I'm not sure there ARE that many universities teaching high-level rendering tech these days.
You get your basic intro-to-graphics course with OpenGL/DX, you get your theory of computer graphics course,
but then it's high-level graphics in UE or Unity, or maybe writing directly to OGL/Vulkan or DX.

I doubt many universities are teaching an advanced, graphics-programming-from-scratch type of course.
The gap between the theory of computer graphics and implementing cutting-edge concepts in practice is a big one.

Plus, getting good performance out of modern GPUs is as much about data management as it is about fancy effects.
Being good at both skills is even harder.
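
As a toy illustration of that data-management point, here is a sketch of my own (not from any engine or course, in C++): the same particle update laid out as array-of-structs versus struct-of-arrays. The hot loop only touches positions and velocities, so the SoA layout streams far less cold data through the cache, and that kind of layout decision often buys more performance than another fancy effect.

#include <cstddef>
#include <vector>

// AoS: every particle drags its cold fields (color, lifetime) through the
// cache even though this loop only reads and writes position and velocity.
struct ParticleAoS {
    float px, py, pz;
    float vx, vy, vz;
    float r, g, b, a;   // cold in this loop
    float lifetime;     // cold in this loop
};

void update_aos(std::vector<ParticleAoS>& particles, float dt) {
    for (auto& p : particles) {
        p.px += p.vx * dt;
        p.py += p.vy * dt;
        p.pz += p.vz * dt;
    }
}

// SoA: the same update touches only the arrays it needs, which is friendlier
// to caches and SIMD on the CPU and mirrors how GPU-side buffers are often packed.
struct ParticlesSoA {
    std::vector<float> px, py, pz;
    std::vector<float> vx, vy, vz;
    std::vector<float> r, g, b, a;
    std::vector<float> lifetime;
};

void update_soa(ParticlesSoA& particles, float dt) {
    const std::size_t n = particles.px.size();
    for (std::size_t i = 0; i < n; ++i) {
        particles.px[i] += particles.vx[i] * dt;
        particles.py[i] += particles.vy[i] * dt;
        particles.pz[i] += particles.vz[i] * dt;
    }
}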
 
Personal opinions aside, I decided to take a quick peek back 10 years to see if my memory was fuzzy...I don't think the characterization that baseline game performance has gotten way worse really stands up to serious scrutiny.
I disagree with the premise that things have not gotten worse.
I think there's value in this discussion for understanding diminishing returns and the state of the industry, but it'd need to be handled by someone willing to invest in proper research. Back-and-forths with people 'remembering' are just going to be subjective opinion. An important part of that evaluation would also be: "if high-end GPUs had better framerates, why weren't they tapped for better visuals?"

I think the defining comparison would be games that look the same but perform better. If you can find games not using UE5 that look the same as or better than games that are, and sample enough titles to distinguish weaker devs from the engine itself, a case can be made. On the flip side, even one UE5 game that runs well and looks better than games on other engines would show UE5 itself isn't the problem.
 
You cannot convince me that the same people who made Arkham knight made the new entry.

It literally wasn't the same team, so anyone attempting that has a very strange hobby. Also Arkham Knight at launch was not exactly a flawless PC experience.

To pick on the AK thread a little more, that was 9-10 years of evolving tech and production experience built on UE3. It's such early days for Nanite/Lumen titles.
 
The irony of the whole "rendering engineers back in the day were the true heroes" line is that it's mostly the same people now that it was then. That's certainly not a great situation in the long run, but the reality is when I was a kid everyone wanted to be game developers and sometime between then and now people primarily want to go work for big internet/tech companies. The shift has become very obvious as I have interviewed and hired over the years; there are fewer people who are trained and/or interested in the systems-level programming work that game development and rendering specifically require. That said, we're not yet at a dire point as many of us still have at least a decade or two of work left in us, but this will increasingly become a problem at some point, assuming AI doesn't just take over the remainder of the work and we brute force it even more via that route 😆
Thanks. Your words manage to ease my worries. In times like the current information age, where we no longer have intellectual leaders but instead get lost in our own noise, it's always good to hear what experienced, 'senior'-level people think about it.

Regarding irony, I've just seen DLSS 3.5. And it made me think that progress on RT is currently fast-paced, since it's a new technology for realtime. From just reflections toward full PT in a few years.

Maybe this is a Doom moment for the current younger generation? Maybe they are as excited about this as I was about seeing Doom for the first time, and about following the fast-paced progress of 3D gfx after it? Maybe we older and grumpy guys just don't notice that games (and their progress) are still exciting for the young?

This question has bugged me for a long time. But I can't tell, since the people I know are old too.

However, Doom was not just better graphics. It was a new, immersive 'be in the game yourself' experience which earlier games did not have. When I saw it, I had already been away from playing and programming games on the C64 for years. But then I bought a PC and came back to it.
Now I don't think we can have such a moment again, at least not with better gfx. Gfx is exhausted, and we're almost there.
But I hope we can continue to impress people by improving what the gfx shows. Richer simulations, smart characters and their interactions, for example.

I doubt many universities are teaching an advanced, graphics-programming-from-scratch type of course.
The gap between the theory of computer graphics and implementing cutting-edge concepts in practice is a big one.
I don't know what they teach and learn. I only see research papers coming out of it. And even here in Austria, which lacks any relevant game studios, technical universities show good output in researching new methods.
I am completely self-taught, and I regret having chosen art school over programming. Not because of the programming, which is easy, but because of my lacking math background. Now I'm busy till the end of my life trying to catch up on this. Better education surely would have done wonders for me.

But universities are late when it comes to passion and motivation. That builds up in childhood for most, I guess.
And maybe that's much harder nowadays. When I was a kid, the C64 and its handbook were enough. I could figure out how to make simple games without further books or software.
Nowadays people have the internet and can look up everything. But to start developing, there are many hurdles. Which language? What do I need to compile and run? Should I use an engine instead? Which one? And will I ever learn how it really works at all, using just some premade engine?
That's a lot of fuss. If I had a little kid, I'd propose PICO-8 if there is interest in games programming, not GameMaker or Unity.
 
I think there's value in this discussion for understanding diminishing returns and the state of the industry, but it'd need to be handled by someone willing to invest in proper research. Back-and-forths with people 'remembering' are just going to be subjective opinion. An important part of that evaluation would also be: "if high-end GPUs had better framerates, why weren't they tapped for better visuals?"

I think the defining comparison would be games that look the same but perform better. If you can find games not using UE5 that look the same as or better than games that are, and sample enough titles to distinguish weaker devs from the engine itself, a case can be made. On the flip side, even one UE5 game that runs well and looks better than games on other engines would show UE5 itself isn't the problem.
It's too soon to know if it's an engine or developer weakness as we have a nearly nonexistent sample size. That said, I expect we will see a similar trend to UE4 where a performance cost is paid by the user to facilitate easier development.

PCGH has an interview up with the developer of Immortals.
 
Maybe this is a Doom moment for the current younger generation? Maybe they are as excited about this as I was about seeing Doom for the first time, and about following the fast-paced progress of 3D gfx after it? Maybe we older and grumpy guys just don't notice that games (and their progress) are still exciting for the young?

Objectively, I don't think there's been anything like a Doom or Wolfenstein moment in the past 30 years. Things moved much faster back then. Crysis probably comes closest. There have been great gameplay innovations since then, e.g. multiplayer shooters, open world, MMOs, but graphics have advanced at a steady, incremental pace.

Let's look at the games a kid might have played 10 years ago: Far Cry 3, Mass Effect 3, Battlefield 4, Metro: Last Light, Dishonored. Fast forward 10 years to today and those games are still very playable, and things haven't changed much at all. Games still basically look and play the same.

The promise of Lumen and PT is that rendering is coherent and natural looking but I expect that change will also be gradual. All of the rendering tech in the world doesn’t matter if things like pop-in, missing shadows, janky animations etc are still prevalent.
 
Maybe this is a Doom moment for the current younger generation? Maybe they are as excited about this as I was about seeing Doom for the first time, and about following the fast-paced progress of 3D gfx after it? Maybe we older and grumpy guys just don't notice that games (and their progress) are still exciting for the young?
It can't be. DOOM et al. were almost paradigm shifts. Nothing like them existed beforehand. Everything happening now already has established precursors, creating a gently evolutionary curve, not a transformative leap. Back then there were so many untapped ideas. Nowadays every idea has been used already. The only direction left is bigger and more realistic along the same lines, save a few untapped avenues like completely interactive environments, which would be this:

 
All of the rendering tech in the world doesn’t matter if things like pop-in, missing shadows, janky animations etc are still prevalent.
If I imagine fixing all this, and we have full photorealism at high res and fps, would it be exciting?
If it's used to make the same games - FC12, ME7, BF8, etc... just with better gfx, then I think it's not exciting enough. Not because the gfx progress along the way is gradual and subtle, but because it's still the same games.
So the real question is: can we still invent new mechanics and genres?
I do not accept the honest answer the subconscious voice in my head is yelling. : )
There must be a way...

In this sense there is again a memory of the past, maybe explaining why we are all still so focused on gfx.
Back in the '90s (or whichever personal golden age of gaming you pick), EVERY game was good. But some looked better than others, especially newer ones vs. older ones.
I never thought about game design, or which kind of game I wanted to make. It had to be some FPS, and it had to look great. Nothing else mattered or seemed to be a problem.
Nowadays, that's very different. FPS has become boring because it's too simple to represent something interesting. RPG is still just smoke and mirrors, pretending to offer choices while your path runs through a static graph of branching story. RTS is dead.
Back then I only thought about how to simulate reality. But we always need to put a game on top of that, to give it a purpose.
So what can we do in this game that we could not already do in earlier games?
It's such a simple question, but everybody is clueless.
I rather hope that new technology exposes new opportunities, that we realize them only as we see the new technology, and that we come up with new ideas by coincidence.
It's easier to come up with new tech than with new game designs. : )
 
If I imagine fixing all this, and we have full photorealism at high res and fps, would it be exciting?
If it's used to make the same games - FC12, ME7, BF8, etc... just with better gfx, then I think it's not exciting enough. Not because the gfx progress along the way is gradual and subtle, but because it's still the same games.
So the real question is: can we still invent new mechanics and genres?
I do not accept the honest answer the subconscious voice in my head is yelling. : )
There must be a way...

In this sense there is again a memory of the past, maybe explaining why we are all still so focused on gfx.
Back in the '90s (or whichever personal golden age of gaming you pick), EVERY game was good. But some looked better than others, especially newer ones vs. older ones.
I never thought about game design, or which kind of game I wanted to make. It had to be some FPS, and it had to look great. Nothing else mattered or seemed to be a problem.
Nowadays, that's very different. FPS has become boring because it's too simple to represent something interesting. RPG is still just smoke and mirrors, pretending to offer choices while your path runs through a static graph of branching story. RTS is dead.
Back then I only thought about how to simulate reality. But we always need to put a game on top of that, to give it a purpose.
So what can we do in this game that we could not already do in earlier games?
It's such a simple question, but everybody is clueless.
I rather hope that new technology exposes new opportunities, that we realize them only as we see the new technology, and that we come up with new ideas by coincidence.
It's easier to come up with new tech than with new game designs. : )
New mechanics aren't the only way to bring new experiences. A significant increase in the number of things that are physicalized in the game world is one of many paths to offering players new experiences. We can also go far in improving how the player interacts with the world. I do agree, though, that game design is worse now than ever before.
 
I read what he said as: today it's the same (number of) engineers but a lot more projects and studios.
So there will be a bunch of titles that don't have them and just coast along, while some projects here and there do something extra.
Which is sort of what you are saying.
Ahh my misunderstanding.
A lot of the core devs that left Battlefield/DICE went to Embark Studios. You might want to look up what game engine they're using for Arc Raiders and The Finals. I'm personally looking forward to whatever id Software cooks up next.
I'm quite familiar with Embark and played the beta of The Finals. Unfortunately, I think both games are going to perform poorly. Great tech, though.
It literally wasn't the same team, so anyone attempting that has a very strange hobby. Also Arkham Knight at launch was not exactly a flawless PC experience.

To pick on the AK thread a little more, that was 9-10 years of evolving tech and production experience built on UE3. It's such early days for Nanite/Lumen titles.
Yea, that makes sense. I guess that team is making Suicide Squad? If so, that game also looks awful.

AK had issues on PC but on consoles, it was fantastic. To this day, it still looks impressive. Furthermore, I believe UE3 was heavily modified by the team.
 
Yea, that makes sense. I guess that team is making Suicide Squad? If so, that game also looks awful.

Unfortunately it is Rocksteady making SS. I'm sure there's a story of what's gone wrong there. Making a GaaS clearly doesn't help. It's also rumoured that they lost years to a canned Superman game.

AK had issues on PC but on consoles, it was fantastic. To this day, it still looks impressive. Furthermore, I believe UE3 was heavily modified by the team.

A sequel or two down the line from a developer's initial UE5 title will likely see similar improvements. When hasn't game development worked like that, in general?
 