Game Engine Convergence And The Problem With UE5

BitByte

To start off, I must state that this isn't a UE5 bashing thread. However, I'm not a fan of all the ongoing convergence, and I think it's a mistake in the long run. Personally, I think it's a huge error for studios to abandon their engine technologies, from a financial perspective, a performance perspective, and a product differentiation perspective.


At the moment, it feels like a lot of studios are running to Unreal Engine because it saves them money on engine maintenance, upgrades, etc. It also saves them money on staffing because there's a bigger pool of potential hires. However, I feel the savings they get in the short term will be detrimental in the long term. Essentially, they're choosing to become beholden to Epic in the future. As these studios pivot towards Unreal, it consolidates engine development talent at Epic, leading to a duopoly (UE5/Unity), which will in turn lead to higher prices. The convergence is also bad for engine developers because it limits the variety of jobs and drastically reduces the opportunity for individual career growth. Again, if there are only two main places to work and ply your trade, your career growth will be very limited.

Even from a performance perspective, the convergence to UE5 is bad. UE5 is basically a general-purpose engine: it's used for games, movies, animations, etc. In my opinion, and I could be wrong, a general-purpose engine can never be as good as a game engine built for a specific type of game. As an example, I'll reference Frostbite and EA. The convergence of technology that happened at EA has been detrimental to a massive number of games, ranging from Mass Effect to FIFA, to Madden, and even Need for Speed. These games had custom engines built for their gameplay, but the decision to save money by converging had detrimental effects on their quality. Mass Effect Andromeda was riddled with issues, and all the other games mentioned have seen a drastic decline in quality. Now, it's possible to take Unreal Engine and customize it to suit your needs, like we saw with Returnal. However, I'd argue that the performance will still never be as good as a custom-built engine. Returnal's performance, in my opinion, is extremely poor for what it's rendering on screen. I believe it runs at around 1080p with TSR on the PS5 at around 40-60 fps, which is awful for what it's doing. It's also not the only UE game with completely awful performance on consoles: Gotham Knights is one example, and the recently delayed Suicide Squad is another.


Finally, I want to talk about product differentiation. During the PS360 era, when a lot of studios used UE3, we started to get a "samey" vibe from the games created with it. It was like the mass adoption of UE3 led to a convergence in art design and style. This shouldn't make sense, and yet for some reason it happened. I for one complained about it, and I saw many more complaints on several forums. I think in many ways, the engine itself helps differentiate a game.

Like I said at the start, my intention was not to bash UE5. I think Epic has done a fantastic job, but I feel this should push other studios to attempt to compete, not to converge. I hope we can see a slowdown in the adoption of UE5, as I personally don't feel it's best for the industry going forward. Anyway, feel free to chime in, as I'm sure I've made a ton of assumptions here.
 
Super high-end performance narrowly targeting specific gameplay will be worse (this does matter a lot, especially for AAA), but opaque static rendering performance will always be great, which is a big deal and I expect will be a meaningful change from previous generations. Ultimately, though, I think we remember custom engines that get the time and space they need and work great, and forget the ones that fall apart and immediately get dropped... Even just recently we can see a few examples of famous, pedigreed engines that weren't maintained as well as they needed to be, in Cyberpunk and Redfall.

I've never really believed in the "samey UE3" thing -- obviously you can see it in the games, but I think that was an artifact of the time, of the way development was organized, of the trends people were chasing, etc.

Also, isn't Returnal locked at 60 on PS5? I played a lot of it and don't remember it being bad at all. It is, as I've said on this forum before, a fairly simple-looking renderer with a limited feature set -- but the game looks great, and simpler renderers (and simpler tools, etc.) that are laser-targeted on the game's needs are why many "custom engine" games perform so well in the first place.
 
I agree with some of the points. There was similarity in many UE3 games, but most of the UE3 games released near the end of the 7th gen were quite different. Later, games on UE4 were less similar to each other, and I think games on UE5 will be less similar still. But I also prefer custom-made engines, like in Sony's games. Sony studios made a lot of custom engines for PS3 exclusives, mostly due to the PS3's unique architecture. But later they made custom engines for PS4 games, and now for PS5. The games look different, and each engine is suited to its game. Maybe many studios didn't and don't use Unreal because of their specific genres. There aren't many racing games, and as far as I know there is no top racing game on any version of Unreal. The same goes for sports games and open-world games. And Sony didn't and doesn't use any version of Unreal for their first-party games. I think that is because of performance.
I can say even more: I think the difficult architectures of the consoles from previous generations were one of the reasons those games were better in general.
 
I'll try and steer clear of the Unreal-specific takes - beyond using it as an example - for obvious reasons, but having experience with the gamut from hobby engines all the way up to big game engines, I do have some general thoughts.

Mass Effect Andromeda was riddled with issues, and all the other games mentioned have seen a drastic decline in quality.
I think ME:A is perhaps a counter-example to a lot of this, for a number of reasons. Yes, it had some technical issues at launch, but overall the gameplay itself was a pretty significant step up from the previous ones (particularly obvious in the multiplayer). Most of the real issues were design/story/etc.-related in my opinion.

But that's not even the main thing - I assume you are aware that the previous ME games were Unreal... not some bespoke in-house engine. Of course many developers modify the big engines to customize them for their needs, and that leads to what I think is a far more interesting discussion.

Now, it's possible to take Unreal Engine and customize it to suit your needs, like we saw with Returnal. However, I'd argue that the performance will still never be as good as a custom-built engine.
Out of curiosity, is there a good summary somewhere of the changes that they made? I would guess that folks like The Coalition and similar make far more changes to the engine than they did, so that might be a better example. That said, I'm not super familiar with what they changed and am curious.

This is the interesting technical discussion for this forum, though, so I want to focus on it specifically. In theory there is nothing stopping you from rewriting large portions of any of these game engines, as full source code is provided; in that sense you can see them anywhere on the gamut from "a bunch of example code and tools to get you started" to "a complete solution that you will never modify".

When considering what parts of an engine are typically modified and which are left alone, that's where I think the reasons for using a game engine in the first place become more stark. It used to actually be pretty common for games to almost entirely rewrite the rendering portion of Unreal back in the UE3 days, and some of that carried over into UE4. Time will tell with UE5 but the complexity of the task has grown significantly now if you want to take advantage of the new technologies.

One of the main reasons people tend to use established game engines (in house or otherwise) is that you want to get the content pipeline up and running as quickly as possible so that folks can work in parallel while game-specific tech is developed. In practice that does mean that certain areas of the engine that affect content production in a deep way (ex. the material editor and system in Unreal) typically don't get large overhauls or replacements as that would undermine one of the key benefits. That's not to say it never happens or could not happen if it was important for a given game, but it's certainly high effort.

Like I said at the start, my intention was not to bash UE5. I think Epic has done a fantastic job, but I feel this should push other studios to attempt to compete, not to converge.
I agree that more varied ideas and technology is a good thing, but I don't think that implies you have to roll your own engine from scratch to do that. At some level does anyone care that pretty much all games have settled on a small number of middleware for everything from logging, networking, audio, physics, compression/decompression, etc? If you're making a game that is primarily focused on one of the above (I want to see the data-logging game :p) you probably want to at least modify if not use your own solution, but then you may be fine using a standard solution for the other subsystems.

No game or game engine is written entirely in isolation - it all borrows code heavily from various places, and that's fine. Conversely, the general game engines aren't really just "one thing" - they often have several different packages, renderers, or configurations built in that get used and modified differently for different cases (ex. the deferred, forward, and mobile renderers in Unreal are almost entirely separate things, and Unity has at least 3 renderers last I checked as well). Studios can and do modify the core technology more to suit their needs, although the content-facing areas definitely have trade-offs there.

To put it another way, if folks can add stuff like ReSTIR or similar on top of Unreal, I don't know that people should feel fundamentally beholden to the technology within. I think we have and will continue to see significant innovation from people in the future, even if it is done on top of or via modifications to some of these big game engines. The fact that academic rendering research itself has increasingly been able to use Unreal and Unity to avoid all the boilerplate issues and get access to much more varied and better test content (which was a recurring complaint back when I was writing and reviewing academic papers more) is I believe another testament to this fact.

Fundamentally I do hope we'll continue to see a range of bespoke game engines because they are always interesting in their own right. But even if the big AAA games do converge on tech at some point, I don't think that means innovation or optimization is going to die or anything.
 
Super high-end performance narrowly targeting specific gameplay will be worse (this does matter a lot, especially for AAA), but opaque static rendering performance will always be great, which is a big deal and I expect will be a meaningful change from previous generations. Ultimately, though, I think we remember custom engines that get the time and space they need and work great, and forget the ones that fall apart and immediately get dropped... Even just recently we can see a few examples of famous, pedigreed engines that weren't maintained as well as they needed to be, in Cyberpunk and Redfall.
I think it does matter because certain studios make certain types of games. Cyberpunk wasn't a lack-of-engine-maintenance issue; it was a mismanagement issue on the part of the project managers and the senior staff. Ultimately, they lost a lot of talent due to mismanagement and poor working conditions, which can happen to any studio. In the case of Redfall, it wasn't an engine issue at all. The project was directionless for a long time, and it should have been canceled because the team wasn't equipped to deliver this type of game. They released an unfinished game, so I don't think we can blame the engine.
Also, isn't Returnal locked at 60 on PS5? I played a lot of it and don't remember it being bad at all. It is, as I've said on this forum before, a fairly simple-looking renderer with a limited feature set -- but the game looks great, and simpler renderers (and simpler tools, etc.) that are laser-targeted on the game's needs are why many "custom engine" games perform so well in the first place.
Returnal is definitely not a locked 60. I played it and have seen it drop several times. Also, I think they got away with a lot because the environments were fairly dark, and you can hide a lot of rendering deficiencies in the dark. On the whole, the game looks alright in my opinion. Certainly nothing impressive.
 
I'll try and steer clear of the Unreal-specific takes - beyond using it as an example - for obvious reasons, but having experience with the gamut from hobby engines all the way up to big game engines, I do have some general thoughts.


I think ME:A is perhaps a counter-example to a lot of this, for a number of reasons. Yes, it had some technical issues at launch, but overall the gameplay itself was a pretty significant step up from the previous ones (particularly obvious in the multiplayer). Most of the real issues were design/story/etc.-related in my opinion.

But that's not even the main thing - I assume you are aware that the previous ME games were Unreal... not some bespoke in-house engine. Of course many developers modify the big engines to customize them for their needs, and that leads to what I think is a far more interesting discussion.
See, I suspected that, but I wasn't fully sure. If that's the case though, BioWare must have really customized Unreal.
Out of curiosity, is there a good summary somewhere of the changes that they made? I would guess that folks like The Coalition and similar make far more changes to the engine than they did, so that might be a better example. That said, I'm not super familiar with what they changed and am curious.
They brought their VFX over to UE4 via a plugin. They talk about it in the links below:
Returnal Article On Epic's Website

Returnal GDC Presentation:
This is the interesting technical discussion for this forum, though, so I want to focus on it specifically. In theory there is nothing stopping you from rewriting large portions of any of these game engines, as full source code is provided; in that sense you can see them anywhere on the gamut from "a bunch of example code and tools to get you started" to "a complete solution that you will never modify".

When considering what parts of an engine are typically modified and which are left alone, that's where I think the reasons for using a game engine in the first place become more stark. It used to actually be pretty common for games to almost entirely rewrite the rendering portion of Unreal back in the UE3 days, and some of that carried over into UE4. Time will tell with UE5 but the complexity of the task has grown significantly now if you want to take advantage of the new technologies.
It has grown significantly, but again, I think a large part of the complexity is due to the dependency on a third-party engine provider. If it's your own technology, it's easier to understand its limitations/constraints and also to know how to improve it. Even if you have the source code to UE5, it's so large that it'll take a lot of time just to understand it before you can start making changes. Furthermore, the design goals of a third-party engine might not be consistent with the design goals of your game. Since studios usually stick to one genre of game, it doesn't really make sense, in my opinion, to depend on a third-party engine if you don't want to limit the potential of your game from a performance perspective and from a design perspective.
One of the main reasons people tend to use established game engines (in house or otherwise) is that you want to get the content pipeline up and running as quickly as possible so that folks can work in parallel while game-specific tech is developed. In practice that does mean that certain areas of the engine that affect content production in a deep way (ex. the material editor and system in Unreal) typically don't get large overhauls or replacements as that would undermine one of the key benefits. That's not to say it never happens or could not happen if it was important for a given game, but it's certainly high effort.


I agree that more varied ideas and technology is a good thing, but I don't think that implies you have to roll your own engine from scratch to do that. At some level does anyone care that pretty much all games have settled on a small number of middleware for everything from logging, networking, audio, physics, compression/decompression, etc? If you're making a game that is primarily focused on one of the above (I want to see the data-logging game :p) you probably want to at least modify if not use your own solution, but then you may be fine using a standard solution for the other subsystems.
I understand this; logging, networking, compression/decompression, and audio are not the same as physics. I for one have been complaining about the rapid adoption and convergence on Havok physics. Don't take this the wrong way, but in my opinion it's made developers very lazy and unable to think outside the box. During the PS4/XB1 generation, we didn't get much physics in games because it was deemed that the CPUs on the consoles were too weak. Now it's 2023, and we have Tears of the Kingdom flexing its physics on a CPU far weaker than those in the PS4/XB1 generation. How did that happen? A lot of the time, with these middleware tools, I feel like if something doesn't work right away, the idea is tossed. Very little attempt is made to find another solution, which in many instances might be more performant than the middleware tools used. I'm not saying that devs should go custom all the time, but part of the fun of game development is innovating in design and technology. If you're just relying on middleware tools most of the time, you're not innovating in technology at all.
No game or game engine is written entirely in isolation - it all borrows code heavily from various places, and that's fine. Conversely, the general game engines aren't really just "one thing" - they often have several different packages, renderers, or configurations built in that get used and modified differently for different cases (ex. the deferred, forward, and mobile renderers in Unreal are almost entirely separate things, and Unity has at least 3 renderers last I checked as well). Studios can and do modify the core technology more to suit their needs, although the content-facing areas definitely have trade-offs there.

To put it another way, if folks can add stuff like ReSTIR or similar on top of Unreal, I don't know that people should feel fundamentally beholden to the technology within. I think we have and will continue to see significant innovation from people in the future, even if it is done on top of or via modifications to some of these big game engines. The fact that academic rendering research itself has increasingly been able to use Unreal and Unity to avoid all the boilerplate issues and get access to much more varied and better test content (which was a recurring complaint back when I was writing and reviewing academic papers more) is I believe another testament to this fact.

Fundamentally I do hope we'll continue to see a range of bespoke game engines because they are always interesting in their own right. But even if the big AAA games do converge on tech at some point, I don't think that means innovation or optimization is going to die or anything.
I don't know, but I feel like convergence is the antithesis of innovation. When you converge, you're not innovating; you're following the path of least resistance. Even if a studio were to rewrite parts of Unreal, it's still built on the foundation that is Unreal. You're inherently limited by that foundation, and if you were to advocate for rewriting a majority of Unreal, then you might as well just make a new game engine. I'd go as far as to argue that by building on top of Unreal, you're just iterating and refining, not innovating. Again, my complaint is mainly centered around studios that choose to abandon their own in-house technology in favor of UE5.

With regard to academics getting better access to these technologies, I think it's great for learning purposes. I use UE5 myself to dabble around and create little games. However, I feel it's just an attempt by Epic and Unity to make their technologies the de facto industry standard by ensuring graduates are trained on their toolsets. That's just my opinion though.
 
During the PS4/XB1 generation, we didn't get much physics in games because it was deemed that the CPUs on the consoles were too weak. Now it's 2023, and we have Tears of the Kingdom flexing its physics on a CPU far weaker than those in the PS4/XB1 generation. How did that happen?

Which big-budget titles would have benefited from the ability to make vehicles and devices out of a handful of pieces? That's all there is to it. TotK is a design triumph in making this stuff a fun part of the game and meme-worthy.

You only have to hold up Dreams, Nuts 'n Bolts or Trials Evolution to say there's nothing new here technically.
 
I understand this; logging, networking, compression/decompression, and audio are not the same as physics. I for one have been complaining about the rapid adoption and convergence on Havok physics. Don't take this the wrong way, but in my opinion it's made developers very lazy and unable to think outside the box. During the PS4/XB1 generation, we didn't get much physics in games because it was deemed that the CPUs on the consoles were too weak.
..
I don't know, but I feel like convergence is the antithesis of innovation. When you converge, you're not innovating; you're following the path of least resistance. Even if a studio were to rewrite parts of Unreal, it's still built on the foundation that is Unreal. You're inherently limited by that foundation, and if you were to advocate for rewriting a majority of Unreal, then you might as well just make a new game engine.

These days, if you want to do a AA+ 3D game, you can't do it all (unless you have an established team/tech like Insomniac, Rockstar North, Nintendo, and similar). If you want to innovate, you have to choose where. Maybe that is in graphics, like sebbi's clay game, or maybe it is in physics or something else entirely. If you try to do it all, you must have the most awesome team ever, and even then it will be really difficult. If you want to be successful, you will have to choose your battles.
 
It's definitely a concern WRT performance. UE4, for example, was so heavy on the GPU, and I never understood the specifics as to why, given the rendering output.
 
I understand the sentiments regarding game engine convergence, but I think if you look back through modern gaming history, as in PS2 and up, you see plenty of games that were trend-setting, genre-defining titles and franchises built around off-the-shelf technology. Half-Life runs on a licensed game engine, as do GTA 3/Vice City/San Andreas, and so does Call of Duty. Hell, the console versions of Call of Duty run on the same engine as GTA.

I'm not sure there's historical precedent for a convergence like the one we are seeing stifling creativity, and convergence around a few game engines with a mostly known workload may lead to greater performance in the long run if hardware manufacturers target those workloads.
 
They brought their VFX over to UE4 via a plugin. They talk about it in the links below:
Returnal Article On Epic's Website
That's cool, thanks for the link. I think the interview there echoes some of the points from my previous post, but directly from the developers, ex. "Unreal Engine has been great because we have been able to continue to create our own technology and also leverage tools like Blueprints, cinematic systems, and more with Returnal."
https://www.unrealengine.com/en-US/...arrative-driven-procedural-horror-of-returnal
Even if you have the source code to UE5, it's so large that it'll take a lot of time just to understand it before you can start making changes.
In the same way that you only build the engine technology you need to make your game, you only need to support/replace those elements in a proprietary engine as desired as well. This is actually very common; developers will often add features or change the way things work to better fit their game needs but the changes can't necessarily come back into the general engine because they only work in the specific cases/platforms that the game studio cares about. It's not too hard to understand enough of an engine to make changes to specific parts effectively. Hell I barely know how to use Unreal editor and definitely would need to learn a lot to actually make a project or game with it, but I'm obviously able to write new rendering techniques for it.

I don't know, but I feel like convergence is the antithesis of innovation.
I don't think that's a reasonable thesis given history. The only way we can make these complicated games and keep making them even more complicated is by continuing to build on the work of others, increasingly across the global industry, regardless of studio. We've long since passed the days when anyone could keep any significant portion of the entire stack in their head, particularly if you include all the design and manufacturing that goes into the hardware that this all runs on as well.

The word "convergence" here is a bit ill-defined, but I'll certainly say that the whole point of academia and industry conferences is to share knowledge to allow others to start with your work, and then build further off of it. Increasingly academic conferences are requiring source code for this very reason. I see open source game engines as facilitating this goal in a lot of ways. There's no reasonable way a single grad student is going to implement all of ex. Nanite and then add something useful to it, but if they are given the system and source as a baseline they absolutely can innovate on top of it in useful ways. To push tech forward as quickly as possible we really want as much of it to be openly talked about and ideally open source as possible. Proprietary stuff is great in some ways, but it does slow innovation in others.

You're inherently limited by that foundation, and if you were to advocate for rewriting a majority of Unreal, then you might as well just make a new game engine.
I'm not sure from your previous comments if you've done much development work, but I don't think this is a reasonable characterization of how game engines - or programming in general - work. There are of course areas that are more integrated than others, but since this is DF I think it's definitely worth noting that the renderers are not really one of those areas in either Unreal or Unity. As I noted, both have several renderers, so it's obviously very possible to add additional ones, based on the existing ones or not, as desired.
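To make that concrete with a hedged sketch (this is not Unreal's or Unity's actual API; every name here is invented purely for illustration): when the renderers sit behind a narrow interface, a studio can register its own implementation next to the stock ones and the rest of the engine never needs to know the difference.

```cpp
// Hypothetical sketch of an engine-side "renderer seam". Names like
// IRenderer, FrameContext, StockDeferredRenderer are made up for this example.
#include <functional>
#include <iostream>
#include <memory>
#include <string>
#include <unordered_map>

struct FrameContext {            // minimal per-frame data the engine hands over
    double deltaSeconds = 0.0;
};

class IRenderer {                // the seam: everything behind this is swappable
public:
    virtual ~IRenderer() = default;
    virtual void RenderFrame(const FrameContext& ctx) = 0;
};

class StockDeferredRenderer : public IRenderer {
public:
    void RenderFrame(const FrameContext& ctx) override {
        std::cout << "stock deferred pass, dt=" << ctx.deltaSeconds << "\n";
    }
};

class StudioCustomRenderer : public IRenderer {   // a studio's replacement
public:
    void RenderFrame(const FrameContext& ctx) override {
        std::cout << "custom forward+ pass, dt=" << ctx.deltaSeconds << "\n";
    }
};

// Tiny factory registry: the rest of the engine only knows a string name
// chosen in project settings, never the concrete renderer type.
using RendererFactory = std::function<std::unique_ptr<IRenderer>()>;

std::unordered_map<std::string, RendererFactory>& Registry() {
    static std::unordered_map<std::string, RendererFactory> r;
    return r;
}

int main() {
    Registry()["deferred"] = [] { return std::make_unique<StockDeferredRenderer>(); };
    Registry()["studio"]   = [] { return std::make_unique<StudioCustomRenderer>(); };

    // Imagine "studio" coming from a config file; nothing else changes.
    auto renderer = Registry()["studio"]();
    renderer->RenderFrame(FrameContext{1.0 / 60.0});
}
```

The only point of the sketch is the seam: the rest of the codebase talks to the interface, so adding or swapping a renderer doesn't require understanding or rewriting everything behind it.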
 
I don't know, but I feel like convergence is the antithesis of innovation. When you converge, you're not innovating; you're following the path of least resistance. Even if a studio were to rewrite parts of Unreal, it's still built on the foundation that is Unreal. You're inherently limited by that foundation, and if you were to advocate for rewriting a majority of Unreal, then you might as well just make a new game engine. I'd go as far as to argue that by building on top of Unreal, you're just iterating and refining, not innovating. Again, my complaint is mainly centered around studios that choose to abandon their own in-house technology in favor of UE5.

You do realize that virtually no one is building a new engine for AAA games?

Bethesda is using an engine from 2011. DICE is using an engine from 2008. Keep in mind that while those are considered "new" engines, they didn't start them out completely from scratch. The Far Cry series is using an engine (Dunia) from 2004, which has as its starting point an engine from before 1999 (CryEngine was first shown in 1999, but development started on it before its first showing, obviously. :)). Naughty Dog has been using the same engine for well over a decade now. Yes, it got a large enough technology update to the base engine that they called it a "new" engine, but it still has its roots in their older engine, because starting over from a blank slate makes no sense. They just rewrote large chunks of the rendering code and other stuff. You know, like how the better UE devs approach modifying UE. :p

If a new or established AAA developer were to attempt to release a game using a brand-new, built-from-the-ground-up engine taking advantage of all modern rendering techniques, you're likely looking at about 8-10 years before the game is released if the developer is good. More if they aren't.

Almost every developer is using an engine that is decades old, with heavy modifications to adapt to new technology.

In other words, you aren't getting away from building on top of older engines in AAA development. Developing a brand new engine from the ground up is far too time consuming.

Only indie devs can really afford to grow their own engines organically, because they don't necessarily need to immediately implement the latest technology into their engine. They can start small and grow it modestly over multiple titles, or over a single passion project that takes over a decade. But even in the indie scene, most developers are choosing to go with an established engine because building your own engine is hard and extremely time consuming.

Regards,
SB
 
I imagine writing an engine from scratch for current-gen consoles would be a hugely expensive and time-consuming endeavour. Even Bethesda, with all their resources, can't ditch Gamebryo and write a new engine, so I imagine studios with fewer resources are even less likely to attempt it.
 
Mass Effect Andromeda was riddled with issues, and all the other games mentioned have seen a drastic decline in quality.
ME Andromeda's issues really had nothing to do with using Frostbite. In fact, I'd say that was a successful example of a game adopting a more 'generalized' engine from a technical perspective. I just think people forgot that BioWare was never a developer of highly polished games, and the main Mass Effect trilogy had loads of jank that people were simply more forgiving of at the time, because they didn't seem to expect games to 'do it all' back then.

This isn't to say that using Frostbite might not have been difficult for BioWare, but it wasn't even their first game using it. Dragon Age: Inquisition was also Frostbite, and also another example of the devs doing a pretty good job utilizing it in the end. Sure, it meant BioWare had to create a sort of bespoke animation system and all that, but that really gets to the next talking point that needs to be addressed when blaming engines for issues in games: would continuing to use Unreal Engine for Mass Effect have resulted in less challenge or fewer issues in creating a proper next-gen Mass Effect game? Would it have solved all their animation woes in having to create accurate facial animations for many dozens of characters saying thousands and thousands of lines (motion capture at that scale was simply not practical at the time)?

There's genuinely no guarantee at all that the results would have been better had they not used Frostbite, and I'd argue there's a good chance they could have been worse (especially with the common issues that UE4 games have...), given that Dragon Age: Inquisition and Mass Effect Andromeda, on the whole, were actually both fairly technically successful games, with their main issues coming more from complaints about dialogue, a lack of interesting new story and characters, and few new alien species to fight against.
 
In the same way that you only build the engine technology you need to make your game, you only need to support/replace those elements in a proprietary engine as desired as well. This is actually very common; developers will often add features or change the way things work to better fit their game needs but the changes can't necessarily come back into the general engine because they only work in the specific cases/platforms that the game studio cares about. It's not too hard to understand enough of an engine to make changes to specific parts effectively. Hell I barely know how to use Unreal editor and definitely would need to learn a lot to actually make a project or game with it, but I'm obviously able to write new rendering techniques for it.
Echoing this: building source or shipping final changes to an engine is certainly a very big undertaking, but I think non-devs vastly overestimate (for once) the challenge of working with a big codebase -- usually you want to change source if you know a certain feature doesn't work how you need, or you find an edge-case bug in your workflows, or you have specific functionality you plan to add to, say, the renderer. Even just being able to look at the source might make all of that possible -- the ability to press a button on the function that you know isn't doing what you want and immediately see, debug, edit, etc., the code is a day-to-day lifesaver. Just yesterday I ran into a Unity graphics call whose behavior I wasn't sure about, and even though my org has source access and I could get a license and get it all set up if necessary, the nature of Unity as a closed-source engine makes that a huge obstacle, vs. Unreal, where I would just look at it on GitHub.

ME Andromeda's issues really had nothing to do with using Frostbite. In fact, I'd say that was a successful example of a game adopting a more 'generalized' engine from a technical perspective. I just think people forgot that BioWare was never a developer of highly polished games, and the main Mass Effect trilogy had loads of jank that people were simply more forgiving of at the time, because they didn't seem to expect games to 'do it all' back then.
While I'm not going to strongly argue that Frostbite was at fault for ME:A (certainly the tech itself wasn't), the pains of challenging, frustrating, or off-schedule dev work don't necessarily show up in bugs or performance issues or low scope -- this stuff all ripples through the whole team. It could mean more developers working with less polished (or non-existent) tools because the "engine" work took longer than expected, it could mean last-minute animation integration, etc.
 
Apologies in advance for interrupting the really interesting discussion with a cheap shot, but you are really unlucky with your example :)

I for one have been complaining about the rapid adoption and convergence on Havok physics. Don't take this the wrong way, but in my opinion it's made developers very lazy and unable to think outside the box. During the PS4/XB1 generation, we didn't get much physics in games because it was deemed that the CPUs on the consoles were too weak. Now it's 2023, and we have Tears of the Kingdom flexing its physics on a CPU far weaker than those in the PS4/XB1 generation. How did that happen?
How? Well... they used Havok. No, really. (This being BotW, but according to people on Twitter datamining TotK, it's the same for the latter, just a newer version.)
 
Apologies in advance for interrupting the really interesting discussion with a cheap shot, but you are really unlucky with your example :)


How? Well... they used Havok. No, really. (This being BotW, but according to people on Twitter datamining TotK, it's the same for the latter, just a newer version.)

Huh, interesting, a slight tangent to your tangent. It looks like Nintendo paid a little more money for any middleware that they used in order to not have to display (give credit to) the middleware in their end credits. :D I can see how that might create some confusion for some people.

Regards,
SB
 
Fundamentally I do hope we'll continue to see a range of bespoke game engines because they are always interesting in their own right. But even if the big AAA games do converge on tech at some point, I don't think that means innovation or optimization is going to die or anything.
I don't think, either, that innovation or optimization would stagnate completely, but it would reduce the chances of any potential optimization/innovation being explored. If just one company serves the whole industry with engine tech, that's impossible to avoid.
I'm actually more worried about the increased convergence trend affecting game design, not so much about tech.
Though there are only two 'recent' examples I can list where a custom engine went hand in hand with game design innovation, and those are Minecraft and Penumbra (or the better-known follow-up games from Frictional, like Amnesia/Soma).
Let's say Minecraft would have been possible with UE or Unity.
But for Penumbra that's not the case. Key to its very immersive experience was a custom physics engine (Newton), which is more robust and precise than the usual state-of-the-art engines (e.g. Havok, Bullet, PhysX).
So if the devs had tried to use UE, that revolutionary physical interaction model would not exist, and we would still think a Gravity Gun that hides simulation jitter behind some 'force field effect' is all that can be done.

Well, sadly neither HL2 nor Penumbra had much influence on others (attempts to copy the features were mostly pretty limited or just failures), but still - there are some unique, interesting, and inspiring games out there.

Now I'm a Newton user myself, and it's really important to me. I work on self-balancing ragdolls and hope to replace character animation with simulation completely.
That's just a hobby for now, but if I were to start work on a game, that's the actual reason why I could not use some U engine.
I haven't tried it myself, but people say replacing the physics engine in such game engines is too much work / too hard to maintain. So I conclude it's easier to write a custom renderer than to tweak off-the-shelf engines to my needs.

That's pretty bad. Also, some people using custom engines report problems finding a publisher, because they don't use a trusted engine such as UE or Unity. Which means the industry, in practice, seemingly reduces its support for custom engine development to some degree.

My wish here is pretty simple: Make UE more modular, so systems can be swapped out and replaced very easily.
Then we'd get the best of both worlds: devs can focus on just the parts they want to innovate on, and Epic still gets the money and the good reputation.

But I'm just saying. I don't really believe that much in high-level ideals such as modular software design. Making it work is hard enough. :D
 
There's a lot of romanticism over engines here that misses the business aspect of it. The main goal of an engine is to support running all aspects of the game in unison, and in today's world that also includes being able to deploy to multiple platforms.

We are seeing convergence because the costs of maintaining and updating an engine to support as many configurations as possible are steep.

Most of the failures you see with games, however, aren't because a studio rolled a custom engine, but because they likely tried to make the game in under two years.

No engine is going to solve that problem. But the reality is, it’s a lot of work to innovate new game design experiences and have them still be valid 4-5 years later.
 