*Spin-off* Coding Education & Practice

Ok last OT I promise :p

What you ask during the interview is beside the point (even if you don't ask, you're going to see their proficiency on the technical questions anyway). I was talking about filtering the resumes before interviews. There's little point interviewing a Comp. Sci. engineer fresh out of school if his school didn't include the relevant basics and he doesn't mention acquiring them in other ways.


Generally we just need better tools to work with increasingly parallel architectures; there's only so much hardware can do by itself.
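To illustrate the kind of tooling I mean, here's a minimal sketch -- purely hypothetical, using the std::thread/std::atomic support newer C++ standards provide -- of splitting a sum across however many cores the machine reports:

[code]
// Hypothetical sketch (assumes std::thread/std::atomic from newer C++):
// split a sum across however many hardware threads the machine reports.
#include <atomic>
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    std::vector<int> data(1 << 20, 1);
    std::atomic<long> total{0};

    unsigned n = std::thread::hardware_concurrency();
    if (n == 0) n = 4; // runtime couldn't tell us; pick something sane

    std::vector<std::thread> workers;
    std::size_t chunk = data.size() / n;
    for (unsigned i = 0; i < n; ++i) {
        std::size_t begin = i * chunk;
        std::size_t end = (i == n - 1) ? data.size() : begin + chunk;
        workers.emplace_back([&data, &total, begin, end] {
            long local = 0;                           // accumulate privately...
            for (std::size_t j = begin; j < end; ++j)
                local += data[j];
            total += local;                           // ...one atomic add per thread
        });
    }
    for (auto& t : workers)
        t.join();
    std::printf("sum = %ld\n", total.load());
}
[/code]

The point isn't this particular pattern; it's that the language and runtime hide the platform-specific thread plumbing everyone used to hand-roll.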

It's interesting to see the different philosophies various teams have with regards to development staffing -- I talked to one group recently that only hires the most senior and experienced developers they can find (their most junior programmer has 8 years in the industry). That's certainly one approach to take, but it is also an expensive one. I've seen other groups successfully onboard junior programmers, and teams I've worked on have also benefited from internship programs. In my view, the key is strong team leadership and project management. If you have good leads, then they can bring up bright new grads and make them very productive in a short time. Likewise, for interns, it really helps if you can throw them on an appropriate project. For a summer internship, you might put a guy on a high-reward project -- if he succeeds, you've just tackled an important feature (most of our interns handled important tools that no one ever seems to have time for). If not, then you're really no worse off than before.

That said, I do like having a core of senior developers, especially in the early phases.
 
It's interesting to see the different philosophies various teams have with regards to development staffing -- I talked to one group recently that only hires the most senior and experienced developers they can find (their most junior programmer has 8 years in the industry). That's certainly one approach to take, but it is also an expensive one.
How successful is that company? Does all that experience ease development and facilitate production, creating a more efficient team? And how much more does it cost overall in dollar figures?
 
It's interesting to see the different philosophies various teams have with regards to development staffing -- I talked to one group recently that only hires the most senior and experienced developers they can find (their most junior programmer has 8 years in the industry). That's certainly one approach to take, but it is also an expensive one. [...]

That said, I do like having a core of senior developers, especially in the early phases.


An all-senior team has its issues as well, beyond expense. Egos tend to be a problem, and if everyone isn't on the same page it can be a total disaster.

But there is a limit to the number of interns or junior engineers your senior engineers can absorb. It's also a question of timing: it's much easier early in a project to make time to mentor junior engineers.

In the end, team building is team building; it doesn't much matter what the industry is. You need a mix of talents, personalities that mesh, and leadership that can lead.

In an ideal world, I'd take a hand-picked senior team over a team 3 times the size consisting of 50% junior engineers. You'd be surprised how much more productive the former can be. But you can't construct many of those hand-picked teams, and there is still a tendency at larger companies to think of project scheduling in terms of generic man-hours.
 
The main problem with projects seems to be coordination and communication. If you have a team of individualistic developers and you can get them all into the spots they like, they do tend to take care of that by themselves. I think that's the best and most productive development environment, and it requires the fewest people by far. But it requires management to let them do things as they see fit.

In that scenario, it doesn't really matter what they know and like, as long as you've got good diversity. The main thing is that they go out and discuss the interactions and common goals by themselves.

The main problem with the above approach is that, while it's very likely the result will work, it's very hard to determine up front what it will look like. And strong management works counter-productively here. So you might have them all go off in directions that don't offer the ROI required.


If, on the other hand, you want to be able to develop things according to a plan made in advance (which is VERY HARD in large software projects), you need to lay down the works and do it by the book. Which means: lots of people, with strict job descriptions and clear borders, who all build their piece as required.

The main problem with that approach is that it is often rather hard to deliver a working software product at all.


So, I think the trick is in seeing what people are available, and choosing the right path in between.
 
I have been doing web development for years now, and it may not relate directly to game development, but I do notice a couple of things.

New grads do know the stuff; they just don't know how or when to apply what they learn in school. For example, many of them can tell me about class interfaces and when and how to use them. But when it's time to work, they seem to have forgotten everything about it. They never apply it in their work, or even think of using it in their design.
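To make it concrete, this is the sort of thing I mean (a made-up C++ sketch, not from any real codebase): callers depend on the abstract type, so implementations can be swapped without touching them. They can recite this pattern; they just don't reach for it.

[code]
// Made-up sketch of programming to an interface: renderFrame depends on
// the abstract Renderer, so implementations swap in without touching it.
#include <cstdio>

class Renderer {
public:
    virtual ~Renderer() {}
    virtual void draw() = 0;          // the contract callers code against
};

class GlRenderer : public Renderer {  // one interchangeable implementation
public:
    void draw() override { std::puts("drawing with OpenGL"); }
};

void renderFrame(Renderer& r) {       // never needs to know the concrete type
    r.draw();
}

int main() {
    GlRenderer gl;
    renderFrame(gl);
}
[/code]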

Older/senior developers who have been working on legacy code tend to resist change. They're more fearful of doing things differently; they just want to continue on in their set ways. Finding a balance between these two groups is difficult.

Working in a larger team, DO NOT go down the democracy route... Do not let your developers do design by democracy; it just doesn't work. It's okay to have brainstorming sessions, but it's still the lead who needs to make the final call. Have a couple of good developers lay the infrastructure (identify the design patterns) for the other developers to follow.

I kinda agree with Joker454: it's harder to find a developer who can solve a problem than a developer who is tech savvy. But this is where my experience doesn't really apply, since the problems we face aren't technical but are usually less well defined... requiring a new breed of developer and domain expert combined into one.
 
I've only been out of college for three and a half years. My job is not nearly as technical as I want it to be, so I'm still doing the job hunt. I'll say this, being on the somewhat recent grad end of the discussion:

It's very difficult for schools to balance fundamentals with the number of different topics computer science encompasses. If I get a bachelor of computer science degree, what exactly does that mean? Universities get a lot of feedback from grads and industry to try to tailor their programs. You have people entering a workforce that demands experience and a laundry list of skills, so the schools try to give you as broad a knowledge base as possible. The problem with that is you know a little bit of everything; you're not excellent at anything in particular.

Companies do not seem to want to have to train or teach their employees. They want them to have enough knowledge to start working very quickly. It's not acceptable to most companies for a person to walk in and say they don't know 'x' but they'll learn really fast.

One exception to that here is cooperative education. You can hire a student to work for cheap, and the government gives most of that money back anyway. So you get to try before you buy, and steer them along the right path during their schooling. Hiring someone post-grad is far riskier, because they take up a different head count, cost more money, and are more of a pain in the ass to replace.

Anyway, that was long-winded, but I hope there is some useful argument in there about why universities take the broad-education route rather than focusing on the nitty-gritty. They like problem solving, broad concepts and a taste of everything.
 
As I've learnt over the last 20-odd years, it's never in a prospective employer's best interest to dole out training schemes to employees who may well just jump to greener pastures. Why should they pay for someone else to reap the gain?

As for the fundamentals of computer science: what happens when your stack meets the heap? I learnt about solid-core memory (I still have a large cube of copper wires that equates to about 4k) and onwards from there. Currently it all seems to be about quick turnover with very little understanding of the actual hardware you are using, never mind the software principles that drove the hardware to its current level.
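For anyone who never got to see it, here's a toy C++ sketch (entirely platform-dependent -- addresses, growth direction and guard pages all vary) that at least shows the two regions sitting in the same address space:

[code]
// Toy, platform-dependent illustration: a stack variable and a heap block
// live in the same address space. On a classic flat layout the stack grows
// down and the heap up until they meet -- or, today, until the OS refuses
// the allocation.
#include <cstdio>
#include <cstdlib>

int main() {
    int on_stack = 0;
    int* on_heap = static_cast<int*>(std::malloc(sizeof(int)));
    std::printf("stack variable at %p\n", static_cast<void*>(&on_stack));
    std::printf("heap block at     %p\n", static_cast<void*>(on_heap));
    std::free(on_heap);
}
[/code]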

I did my first degree in Artificial Intelligence, and that was all about Java, integrating agents, etc. Nothing in that degree taught me more than I already knew about computers at a base level; in fact it was almost a dumbing-down in principle. Mainly because you could look at a hardware spec, look at the performance you'd get from Java, and then long to be able to do the same thing in C, because you knew it would be ten times faster.
 
I don't necessarily mean having a regimented training program sending people on courses. I just mean that there should be junior positions around that are willing to let people learn on the job, under the guidance of the senior employees. It seems weird to write off a smart candidate with limited or no formal experience in a certain area, and I've seen it happen a lot. Of course, some companies are too small to have that luxury and need the entire staff working at 100% right away.
 
I don't think it's just idealism - there's a reason why the default CS course includes those subjects, just like there's a reason for the specific math subjects in the course when you study math, etc.

But the software industry has changed radically over the past 20 years. For 95% of software developer graduates today, software correctness (i.e. no bugs) is way more important than how the code performs. This trend is reflected in the tools, languages and platforms that are used nowadays: Java and .NET specifically run on virtual machines where any notion of hardware is removed (abstracted away). Then you have all the scripting languages: Perl, Ruby and PHP, all of which are dog slow, and all of which are gaining more and more traction in the webserver space. On the client side you see more JavaScript and ActionScript (Adobe's Flash).

C (and C++) is the new COBOL. Not because it is useless, but because so few people will be using these languages. How come nobody learns Fortran anymore? It's way better suited to a lot of game-related tasks than C, yet nobody, to my knowledge, uses it.

Game development is a software industry niche where knowledge about memory usage, cache locality, various hardware-defined latencies (instruction, memory, etc.) and similar stuff is important, because it influences the decisions you make. But for the vast majority of the software industry it doesn't matter.
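To show what I mean by cache locality influencing decisions, here's a textbook-style sketch (hypothetical; timings will vary by machine): the same sum, done in two traversal orders.

[code]
// Textbook cache-locality demo: identical work, two traversal orders.
// Row-major walks memory in layout order, one cache line after another;
// column-major strides across lines and keeps missing.
#include <chrono>
#include <cstdio>
#include <vector>

int main() {
    const int N = 4096;
    std::vector<int> grid(static_cast<std::size_t>(N) * N, 1);
    long sum = 0;

    auto t0 = std::chrono::steady_clock::now();
    for (int row = 0; row < N; ++row)       // cache-friendly
        for (int col = 0; col < N; ++col)
            sum += grid[static_cast<std::size_t>(row) * N + col];
    auto t1 = std::chrono::steady_clock::now();
    for (int col = 0; col < N; ++col)       // cache-hostile
        for (int row = 0; row < N; ++row)
            sum += grid[static_cast<std::size_t>(row) * N + col];
    auto t2 = std::chrono::steady_clock::now();

    using ms = std::chrono::milliseconds;
    std::printf("sum=%ld row-major=%lldms column-major=%lldms\n", sum,
        static_cast<long long>(std::chrono::duration_cast<ms>(t1 - t0).count()),
        static_cast<long long>(std::chrono::duration_cast<ms>(t2 - t1).count()));
}
[/code]

On typical hardware the second loop is several times slower, purely because of how the array sits in memory.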

The problem is compounded by the hire-develop-fire cycle most studios seem to employ on a per-project (game) basis. This makes it impossible to retain skilled developers in the long run, although things do look like they are maturing.

Cheers
 
I think that most of the problems with fresh grads come from lack of experience, not lack of technical knowledge. Two things that helped me write solid code were writing for embedded systems at uni and working with code inherited from other developers at work.

The former gave me an understanding of how the technical limitations of hardware enforce certain patterns in my code - it's much harder to see the correlation if you write for a 2GHz machine with 4GB of RAM.

The latter let me experience first hand what works and what doesn't (and when). If you work in a large team, or if you inherit a piece of code that was in development for years with multiple people contributing, you have to deal with both excellent and crappy code. You debug stuff developed by someone who's never heard the term "supportability" and start appreciating the simple things in clean code: from proper factorization to object refcounting.
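For instance, something as small as disciplined refcounting (a minimal, single-threaded sketch of my own; real code would use atomic counts and a smart-pointer wrapper):

[code]
// Minimal single-threaded refcounting sketch (real code would use atomic
// counts and a smart-pointer wrapper). The discipline is the point:
// every addRef is matched by a release, and release deletes at zero.
class RefCounted {
public:
    void addRef() { ++count_; }
    void release() {
        if (--count_ == 0)
            delete this;      // last owner gone; object cleans itself up
    }
protected:
    virtual ~RefCounted() {}  // force deletion through release()
private:
    int count_ = 1;           // the creator holds the first reference
};

class Texture : public RefCounted {
    // resource payload would live here
};

int main() {
    Texture* t = new Texture(); // refcount == 1
    t->addRef();                // a second owner appears: refcount == 2
    t->release();               // second owner done:      refcount == 1
    t->release();               // last owner done: object deletes itself
}
[/code]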

I'm by no means a senior developer, but I deal with new hires every day and see a lack of comp-arch knowledge (which I was lucky to acquire through embedded systems programming). This has two causes: one is that many people don't experiment in their free time (as mentioned by ERP); the other is that universities put more stress on higher-level programming these days: Java, C#, scripting languages. Even if you take comp-arch classes, you're going to acquire dry knowledge that's very unlikely to be utilized during programming classes. This is unfortunate, but I believe that as long as you're dealing with a smart person, he/she will be able to learn the stuff quickly and will do so willingly.

Passionate developers who lack experience are never a long-term problem. People who don't want to grow are.
 
I think a lot of schools are getting heavy into teaching embedded courses, so grads should be better versed in computer architectures.
 
Embedded-systems and real-time-systems books are what I've been recommending to many of the less experienced people.

I think that over the last 10 years games have become a uniquely challenging technical discipline with little in the way of best practices. You have truly massive legacy code bases on what amount to large embedded systems. It's become about walking a fine line between software engineering and technical excellence.

The smart people do learn and grow, and my last company had a very active internal training program, it just doesn't buy you a lot in the scope of a single project if you have too many junior engineers.

The problem isn't just new grads, either; I've seen good engineers with 3 or 4 years of experience give up when the game crashes and the debugger isn't gracious enough to show the stack frame.
 
How successful is that company? Does all that experience ease development and facilitate production, creating a more efficient team? And how much more does it cost overall in dollar figures?

The company is very successful, the team -- well, we'll see, but it is an impressive group of people with deep experience.
 
In an ideal world, I'd take a hand-picked senior team over a team 3 times the size consisting of 50% junior engineers. [...] There is still a tendency at larger companies to think of project scheduling in terms of generic man-hours.

That was how my last project went: initially we had a seasoned 5-man team. This time around we also have a strong core of experienced developers, with a few younger guys on staff as well.

These days, so much of the game industry is moving to Agile that a lot of DDs are thinking in terms of generic man-hours or days, but sprints are adjusted for velocity, so that should factor in overall team expertise and capacity.
 
New grads do know the stuff; they just don't know how or when to apply what they learn in school. [...]

It'd be nice if universities taught more of the development process. One of the most common difficulties I see in new grads is a lack of experience working on a large (or even small) development team. Many grads I've worked with have come out of school very fuzzy on source control: things like how to use branches, sandboxes, how to integrate changes, atomic checkins, etc. It might not be a bad idea to add a P4 class to the CS curriculum, or at least spend a semester or two on a large team project (12+ developers).

Of course, many things are only learned by actually doing them, but this stuff is generic enough that it should be something students are exposed to.
 
Universities can't cover everything in Comp Sci; it's too large in scope. However, I do believe they could offer more specific courses and work harder to get their students into jobs tailored to their education. On top of that, a school should offer plenty of co-ops and internships, which I honestly think is where much of the most valuable education comes from. From an employment perspective, I'd rather go to a school with a great career-finding department than a bad one with slightly more classes on offer.
 
The problem isn't just new grads, either; I've seen good engineers with 3 or 4 years of experience give up when the game crashes and the debugger isn't gracious enough to show the stack frame.

Debugging skills are a problem in themselves. You come to rely a lot on your debugger to tell you what went wrong and where it went wrong, so most people these days don't know what to do when the debugger fails. With race conditions specifically, this leaves many people completely helpless. But those skills are easily taught to someone who is bright enough. I would like it if these things were actually taught at universities, because a scary number of companies don't really have any experienced senior programmers, but that's another topic.

More to the point, I was wondering how many companies actually have "debugging strike teams" to take over hard cases and who know all the dirty little tricks.
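To show what I mean about the debugger failing you, here's a contrived C++ sketch of the classic race: single-stepping perturbs the timing enough that the bug often stops reproducing under the debugger.

[code]
// Contrived data race: two threads bump an unsynchronized counter.
// The lost updates depend on thread timing, so single-stepping in a
// debugger often perturbs the schedule enough to hide the bug.
#include <cstdio>
#include <thread>

int counter = 0;                // shared and unprotected -- the bug

void bump() {
    for (int i = 0; i < 1000000; ++i)
        ++counter;              // read-modify-write, not atomic
}

int main() {
    std::thread a(bump);
    std::thread b(bump);
    a.join();
    b.join();
    // Usually prints less than 2000000, and a different number each run.
    std::printf("expected 2000000, got %d\n", counter);
}
[/code]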
 
Debugging skills are a problem in themselves. You come to rely a lot on your debugger to tell you what went wrong and where it went wrong, so most people these days don't know what to do when the debugger fails. [...]

I mostly agree, and would add that people are afraid of digging around as well. I've determined that countless bugs were the result of problems in library implementations or other code I was depending on. In Java applications, JAD (a Java bytecode decompiler) has been invaluable. With C/C++ programs, I've also had situations where I had to look at the results of the different compile stages (most useful where you had a C/C++ preprocessor-based compiler), or even look at the resultant assembly code in a debugger, analyze a core dump, etc.

I still see the horror on people's faces when I make suggestions like "just dump the process and look at the core file in a debugger" -- if only to look at the call stack.

So... I do rely on debuggers, and that's one point I might disagree with you on: debuggers are definitely useful, and I don't think you can have an "over-reliance" on them.

dbx/gdb is your friend.
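One trick along those lines (a glibc-specific sketch; the handler name is my own invention): have the process print its own call stack when it dies, so there's a stack to read even when nobody saves the core file.

[code]
// glibc-specific sketch: print our own call stack on SIGSEGV, so there is
// a stack to read even when nobody collects the core file.
// backtrace_symbols_fd is used because, unlike backtrace_symbols, it
// doesn't call malloc inside the signal handler.
#include <csignal>
#include <execinfo.h>
#include <unistd.h>

extern "C" void onCrash(int) {
    void* frames[64];
    int depth = backtrace(frames, 64);              // capture return addresses
    backtrace_symbols_fd(frames, depth, STDERR_FILENO);
    _exit(1);                                       // don't return into the fault
}

int main() {
    std::signal(SIGSEGV, onCrash);
    volatile int* p = nullptr;
    *p = 42;    // deliberate crash to exercise the handler
}
[/code]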
 
Nah, ntsd is your friend. ;)

But yeah, I totally agree that debugging skills are something most new hires lack. In most cases, though, that boils down to comp arch. If you have an optimized binary and you can't find your this pointer, just go through the disassembled code and find your address. Obviously, if you have no clue about the architecture, you won't be able to do anything.

Also, the way many bugs manifest themselves makes it easier to guesstimate what is going on if you simply know how the code actually works. That means both architectural details and compiler details.
 
Nah, ntsd is your friend. ;) [...]

Ah, good old ntsd! Such an invaluable tool!
 