You don't ship a 2-million-plus-line codebase without some sort of architectural understanding. We have more discussions about the semantics of Init and Shutdown than about how to optimize a particular piece of animation code, because the former causes more issues.
Game development can be somewhat ad hoc at the gameplay level, because design fluctuates so violently through a large portion of development, but even there my experience is that the "hackiness" is at the bottom, not at the top level.
The problem is as much about differing cultures across large companies as anything else. When you have several thousand engineers, and probably several hundred senior engineers, you're not going to get agreement even on the definition of a term like "resource management", never mind on implementation details.
There is no "right" way to write a game, though there are plenty of wrong ways. But take two senior people (with decades of experience apiece) with disparate visions, stick them in a room, and tell them to figure it out: about 50% of the time it turns into what I like to call "programmer survivor", and you get a bad compromise out the other end.
The next step in the technology unification process is usually acquiring something external and dictating its use top-down. This usually causes issues because the imported tech is less well featured than the old internal tech was. It leads to mass "fixing" of the technology across the teams, and usually the loss of the most talented technical people.
There are secondary costs to sharing. If I'm writing a "graphics engine" for one game, it's a few thousand lines of code; generalizing it for arbitrary usage is as much as 10x that, and a significantly more difficult architectural problem. Tools go the same way. The advantage of the specific solution is that it's simpler and I personally understand it, so when there is an issue I can quickly isolate it. In the shared-tech case this usually degenerates into a debugging nightmare and an argument over whose code the problem is in.
The other issue with shared tech is misuse of the external interface. Programmers make assumptions about what an external interface does, and when those assumptions prove wrong, the result can be extremely subtle, difficult-to-isolate bugs.
Plus there's the usual nightmare of living libraries, where someone subtly changes the semantics of an interface in a code drop.
Now, I'm not suggesting you build everything from scratch for a game. I'm suggesting you look at what's available and make informed decisions, and assume that for any technology you adopt, you're going to end up doing a large portion of the maintenance. Make technology decisions based on technical merit for your application.
Our current codebase has a lot of external code in it; most of it was selected for solid technical reasons, but some of it because of internal political pressure.