Capcom's "Framework" game engine

Debatable. I'm clearly talking from an agile development perspective here, and it's a common misconception, in my opinion, that agile practices are best implemented by experienced programmers. On the contrary, I think you need very experienced programmers to tame the complexity of a Big Monster Framework and bend it to your will, while less experienced programmers work better with clear practices that lead to very simple and maintainable code.

Ok, I see what you mean, especially after your last few posts. I think you're not far from the truth. I wasn't clear either. My mind was more on the junior developer part (and less on the reuse). It takes a lot of patience and determination to instill good practices in junior developers, especially under time pressure.

For my last project, we ended up having the young developers write the app from scratch (we showed them some samples) because the requirements were different enough. For other, older projects, we had frameworks that the team had built over time. I guess in the game industry, every game is probably different enough to warrant a rewrite.
 
Slightly Off Topic

But I wanna know how in the world they got that huge 2 level Lost Planet Demo complete with boss and all in 300MB!
 
But I wanna know how in the world they got that huge 2 level Lost Planet Demo complete with boss and all in 300MB!

I want to know how they packed the best game I've played all year into 30 minutes!!! For free, no less. And I played RE4 this year. For free.
 
Designing and building a piece of technology that can be used by two games costs 3-5x over just building it for your immediate requirements.
Yes, that's a concern, and I think the result is that only large developers would consider it. If your production house is expecting to make 15 games this generation, creating a framework that supports the development of all those games should be economical. Looking at EA's acquisition of Renderware, we see an intention to use a framework across multiple titles. Larger firms are also more likely to work through junior developers with higher turnover, from what I hear. For an indie like Ninja Theory or a first party like Lionhead, working through each game as it comes is the only sensible decision, adapting existing code where it's useful.

I think it's hard to comment on this particular framework without seeing it, though the results don't seem bad so far. If it was a long-term project, well spec'd and designed to serve the company's aims, it might well help streamline development and get costs down. It depends how complex the framework is for people to pick up and how much work needs to be done to get it working effectively on the next project. I'd have thought an in-house framework, if well written, is a better solution than a generic middleware framework.

In the PC indie world, the numerous middleware solutions have enabled many a game that 'from scratch' development would never have completed. That's whole games created for thruppence (though the quality often shows ;)). Theoretically that should extend to professional development, and I can see managers thinking as much, seeing a middleware solution as needing only a handful of devs to complete rather than lots of expensive A-class programmers.

At the moment I guess the Engine theory, that games are always cheaper if you don't start from scratch but build around an existing code-base, is more a theory. One the devs here don't subscribe to! I can see a concern though that if this works for Capcom, and EA reports savings for using Renderware or whatever, companies will start to think of Frameworks/Engines as the magic bullet to drop costs regardless of realities. Managers almost never listen to the workers who work with the tools and know what works and what doesn't.

How much are middleware and in-house engines being imposed on devs now, versus previous generations? Have any devs here worked with another company's in-house engine for their company's project? UE was a framework for UT, and has grown in popularity. SnowBlind Studios created their engine for themselves, and it was licensed to other companies. How much money have these games cost or saved by using third-party engines?
 
That said, sometimes promoting your "lazy implementation" to a higher level requires a complete rewrite - so I do like to consider what the chances are of ever wanting to reuse something before writing it.

Possibly, this is such a grey area. Having been one of the guys writing reusable and generic code whenever I could, I have switched to my new lazy approach and noticed a huge increase in my productivity (in terms of features delivered and used), so I tend to be draconian on this. But I can expect a less lazy approach could work well for other people. It's just a matter of being pragmatic after all.

To complete the topic, I'd like to add that I find these frameworks very useful during prototyping, when you want to quickly explore some solutions and some specific aspects of your game, and you don't care if you can't do everything you have in mind or if the quality of the code you are writing suffers. After all, it will be thrown away anyway.

Fran/Fable2
 
How much are middleware and in-house engines being imposed on devs now, versus previous generations? Have any devs here worked with another company's in-house engine for their company's project? UE was a framework for UT, and has grown in popularity. SnowBlind Studios created their engine for themselves, and it was licensed to other companies. How much money have these games cost or saved by using third-party engines?

This is one of my favorite topics....
How much engineering does middleware save?
The short answer is nothing; the somewhat longer answer is that if you can use it as-is and treat it as a black box, it can save you something.

Everyone I know who has used a piece of middleware has had to re-engineer massive parts of it to make it work for them. This re-engineering is at least as expensive as building the technology would have been, since it has to be done in place on a foreign codebase. If you're not trying to build a triple-A game and you can live with the restrictions, it can be a win.

What middleware really buys you is the ability to start content development very early in the process. I can have designers fiddling with levels months before I could if I had to build something from scratch.

There are exceptions to this; I think Havok, for example, has value, but only if you're actually using it. If you need constrained dynamics, that's a hard problem, so use Havok; if you just need a simple vehicle model or a character proxy, it's 800K of code that provides a very inefficient solution.
 
Everyone I know who has used a piece of middleware has had to re-engineer massive parts of it to make it work for them. This re-engineering is at least as expensive as building the technology would have been, since it has to be done in place on a foreign codebase. If you're not trying to build a triple-A game and you can live with the restrictions, it can be a win.
In your opinion then, why was the choice made to go with middleware for the games that do? Low goals? A desire for early content creation? Poor management decisions? This gen seems to be earmarked as 'the generation of the Middleware', so finding out why middleware is being chosen and what benefits it does or doesn't have is kinda important ;)
 
In your opinion then, why was the choice made to go with middleware for the games that do? Low goals? A desire for early content creation? Poor management decisions? This gen seems to be earmarked as 'the generation of the Middleware', so finding out why middleware is being chosen and what benefits it does or doesn't have is kinda important ;)

No-one has existing technology at the point of platform transitions, there is significant management "fear" about the complexity of the new platforms, and it is seen as a way to save money.

Frankly Epic were in the right place at the right time.

Having said that, having content people start earlier shouldn't be undervalued; it's a huge win from a production standpoint. Engineering isn't the only cost in building a game, and most companies don't have the luxury of ramping staff up and down on an as-needed basis. So you pay the staff overhead whether or not they are currently productive.
 
Having said that, having content people start earlier shouldn't be undervalued; it's a huge win from a production standpoint. Engineering isn't the only cost in building a game, and most companies don't have the luxury of ramping staff up and down on an as-needed basis. So you pay the staff overhead whether or not they are currently productive.
It's not only a question of manpower. Firstly, when you take middleware, most of your architectural decisions are already made. They may not be perfect, maybe even mediocre, but they work. When you build something from the ground up, you may simply find out after a couple of months that you made the wrong decisions and need to throw a lot of code out or even start from scratch. That's even more true as the development model of this-gen consoles (multicore, multithreaded) is fundamentally different from last gen.
Secondly, even if you have the manpower, sometimes development does not scale. No "bring me nine women if you want the baby in one month".
Thirdly, even if you have to re-engineer large parts of your middleware, it is usually easier to start with something working and then replace it with something that works better than to start with nothing at all. Plus, the chances are lower that your whole development process will stall.
However, I'm not saying that middleware is the holy grail. But if I had to choose between using middleware or building from scratch for the first project on a new platform, I think I would choose the middleware, try to finish the project on time and on budget, learn from the mistakes (my own and the middleware's), and then decide whether to keep the middleware for the next project or dump it and start from scratch.
 
At the moment I guess the Engine theory, that games are always cheaper if you don't start from scratch but build around an existing code-base, is more a theory. One the devs here don't subscribe to!
It really raises the question of how much you can really share and how much you really want to share across separate projects. There is certainly little to gain by not sharing tools if you're doing multiple projects at once. The things we often associate with a "game engine" on immediate inspection tend to be the things that also need to evolve rapidly, so if you're working on one game at a time, it really isn't a big deal to start over.

But traditionally, it's not the "engine" part of the engine that's a pain in the neck. It's the tools part of the package that's hard to get right. And as there is a drive for more content, the tools need to be that much more usable and that much more powerful at the same time. And it's also why people like to spend a million on Unreal.

The previous studio I worked at was an indie developer pretty much relying on one publisher, and they managed to keep costs down with a base framework that just continually evolved as they kept pounding on it... though they did so to a fault, in that it still contained 10-year code rot and 10-year-old bugs (and I mean innocuous-looking ones like "if (direction.y = 0.f)" type bugs), all the while hoping that it would be a feasible codebase to build on for next-gen. Even otherwise, I don't think it would have worked for anybody else, as most studios tend to pay their artists more than minimum wage.
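To illustrate for anyone who hasn't been bitten by that one (a made-up reconstruction on my part, not their actual code): the assignment compiles cleanly on most compilers unless you turn the relevant warnings on, which is how it can sit in a codebase for ten years.

Code:
// Hypothetical reconstruction of the '=' vs '==' bug described above.
struct Vec3 { float x, y, z; };

bool IsLevel(Vec3& direction)
{
    if (direction.y = 0.f)      // BUG: assigns 0.f to direction.y, so the condition is always false
        return true;            // never reached, and the vector has been silently clobbered
    return false;
}

bool IsLevelFixed(const Vec3& direction)
{
    return direction.y == 0.f;  // the comparison that was presumably intended
}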

In the case of where I currently am, it works out well not so much because we're big, but because we're owned by our publisher, who owns several other studios, and features tend to move around. Though it is rather absurdly expensive, part of the trick is to isolate what makes sense to turn into something totally generic -- a good deal of which has to do with making every last tool completely data-driven. That's the sort of thing that can live on through several iterations of the engine, irrespective of what we do to the renderer or the physics or animation, which is what most people see of the "engine."
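To make "data-driven" a bit more concrete (a purely hypothetical sketch, not our actual tool code): the editor doesn't hardcode what it edits, it builds its property grid and serialization from a schema, so the tool survives whatever we do to the renderer or physics underneath it.

Code:
// Purely hypothetical illustration of a data-driven editor schema.
#include <string>
#include <vector>

struct PropertyDesc {
    std::string name;     // e.g. "spawn_rate"
    std::string type;     // e.g. "float"
    std::string defval;   // e.g. "0.5"
};

struct ObjectSchema {
    std::string objectType;               // e.g. "ParticleEmitter"
    std::vector<PropertyDesc> properties; // loaded from data, not compiled in
};
// The editor walks the schema to build its UI and save format; replacing the
// engine systems underneath only changes the schema data, not this code.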

OT : Who the hell comes up with these names for game engines? Source, Build, Coded, and now Framework? I'm sure someone will come up with "Library" next... Maybe "API."
 
I have a somewhat different opinion on evolving a codebase written by the same set of developers rather than using middleware.

The problem with evolving middleware is the ramp up cost, you don't know the code and often can't fathom the intent. If it's something you wrote, you have some idea of why it works the way it does.

Building a codebase from scratch is HARD. Without a huge amount of experience you will fuck it up, and tools never get the investment they ought to. Evolving what you have is a much more practical idea.

One of the problems with Middleware is that there is an assumption by some people that they solve the hard problems or that they are particularly performant. Neither is usually true, middleware is designed to work in as wide a variety of environments as possible, it's a compromise top to bottom.

The Unreal 3 engine, for example, makes very little effort to solve the issues of parallelism; they have a very basic model in there. Unreal 3 in general is "old tech"; there isn't much in it that could be described as innovative, but I wouldn't expect there to be, and most game engines will evolve from where they are rather than leap to the next level.
 
One of the problems with Middleware is that there is an assumption by some people that they solve the hard problems or that they are particularly performant. Neither is usually true, middleware is designed to work in as wide a variety of environments as possible, it's a compromise top to bottom.
Of course. That's generally the problem anywhere when someone tries to create "one-size-fits-all" software -- the result is almost always "this-size-fits-nobody". Though I would at least hope that the misconceptions about middleware are primarily among the public who think UE3 is the second coming of Christ and using the same engine for multiple games automatically means that they'll look identical, rather than within the industry (wealthy egomaniacs with IQs of -100 notwithstanding). It's sickening to hear the same things over and over again, but I'd be more concerned if a producer believed that.

The Unreal 3 engine, for example, makes very little effort to solve the issues of parallelism; they have a very basic model in there. Unreal 3 in general is "old tech"; there isn't much in it that could be described as innovative, but I wouldn't expect there to be, and most game engines will evolve from where they are rather than leap to the next level.
Which is why Epic themselves probably aren't intending to move UE3 much further forward and are more concerned with the development of UE4. I'm curious, but being a studio who licenses their engine to everyone and their brother, I'm going to have to assume that the same song and dance will prevail.
 
"For better texture compression, they do original texture compression which appropriates an alpha channel for an extended information area and decompress it with programmable shaders."

Now this doesn't sound as great..using shader power to aid decompression? Anyway, amazing how clever programmers always are about doing stuff like this, on all platforms.

My rough guess is that they were referring to some kind of HDR texture encoding, most likely the RGBE format.
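If it really is RGBE (again, just my guess, the article doesn't say), the decode is cheap enough to do per-texel in a pixel shader. Roughly something like this, written out in C++ for clarity:

Code:
// Sketch of one common RGBE decode, assuming the alpha channel stores a shared
// exponent biased by 128 (my assumption, not confirmed by the slides).
#include <cmath>

struct RGBA8 { unsigned char r, g, b, a; };
struct RGBF  { float r, g, b; };

RGBF DecodeRGBE(RGBA8 texel)
{
    // scale = 2^(exponent - 128), applied to the normalized RGB mantissas
    float scale = std::ldexp(1.0f, int(texel.a) - 128) / 255.0f;
    return { texel.r * scale, texel.g * scale, texel.b * scale };
}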

nAo said:
Why would you resolve it? you need supersampled depth data, you don't want to resolve it.
BTW..how would you resolve a zbuffer (I mean..which filter would you apply)?
I'm confused. If you don't resolve it from EDRAM to main memory, how can it possibly be used as a shadow map later? As for the filter, I think D3DRESOLVE_FRAGMENTS0123 can resolve the surface without downsampling? Haven't tried it myself.

I'm interested in the "MSAA mini buffer" thing. It looks like they draw all particles into a quarter-sized framebuffer with 4xMSAA on, and then resolve it back without downsampling. That could save a certain amount of fillrate, considering the regular-sized framebuffer also has MSAA on. Not sure if this is the only optimization that boosted the performance that much. I mean, 24fps to 32fps is a huge boost.
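My back-of-the-envelope take on where the saving would come from, assuming a 1280x720 main target (my assumption, the article doesn't spell it out): per particle layer the quarter-sized buffer shades

$$\frac{640 \times 360}{1280 \times 720} = \frac{1}{4}$$

as many pixels, and with 4xMSAA it still carries one sample per full-resolution pixel, which is presumably what lets them resolve back without downsampling. Compared to drawing the particles into the full-sized 4xMSAA buffer, that's roughly a 4x cut in both shading and sample fill for that pass.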
 
(attached screenshot: 3dlp06.jpg)


Holy crap, it looks like they have Lost Planet running on a quad-core XP PC! Is this going to be a Vista game now too or something?
 
My Japanese is not very good, but it seems to me that that "AA at 5% cost" myth was.. well.. a myth :) (at least on this game)
 
My Japanese is not very good, but it seems to me that that "AA at 5% cost" myth was.. well.. a myth (at least on this game)
With this engine, the performance hit goes from 11.5% with 2xAA to 18% with 4xAA in the first three shots,
and from 7.2% with 2xAA to 13.7% with 4xAA in the second three shots.
Not as good as advertised by MS, but not so bad, I find.
Especially if it's automatic tiling, there's room for improvement for first-party dev teams who can spend two or three years on a game.
And are we sure that the hit comes only from tiling? (The difference could come from other reasons.)

Can someone translate how they are doing multithreading and how the job queue is managed?
It looks a lot like what Ninja Theory is doing with HS as far as jobs are concerned?

EDIT: I'm just reading the Google-translated article... it's better than my English :oops:
 
With this engine, the performance hit goes from 11.5% with 2xAA to 18% with 4xAA.
Not as good as advertised by MS, but not so bad, I find.
I agree, it's not bad at all, but it's still roughly 4x slower than 'advertised' (18% against the claimed ~5%).. at least in this game :)
I would not be surprised if other platforms out there pay a smaller cost, performance-wise..
 