What is expected of a Sr. Graphics Engineer wanting to get into gaming?

All:

I've been in the film industry for almost 16 years now. It's treating me well, and because of seniority at my current studio, I'm pretty much allowed to push forward with new tech at my leisure.

However, I've always wondered what it would be like to work on games. At some point in my life, I want to make the jump (before I get too old), or at the very least build a real-time tool at our job, because I want to push for a rendering approximation that can rival film but in real time.. ;)

Can any of you guys give me some info on what you'd expect me to know if you interviewed me today? How much DX or OpenGL would you expect? Obviously C++ is required, but you have to know that in either industry. I wouldn't expect college-graduate questions about bitwise operators, templates, and data structures, though; those are things I assume most people don't deal with at such a low level on a day-to-day basis. Would a significant background in general graphics be enough, as opposed to a specific area like the GPU APIs? Our ways of working are going to be significantly different; would that turn you off from someone in the film industry?

Thoughts?
 
I think my biggest expectation of you would be to prove you can transition from what I suspect is offline rendering to real-time rendering on consumer-grade hardware, and that you can "play nice" with what will be very limited resources. You'll be sharing CPU time with every other programmer, and probably GPGPU time going forward. You'll also be expected to surrender CPU and/or GPU time as projects progress and new engineering decisions impact you. Is that something you face now? If not, you'll need to think about how you will answer questions like this.

Similarly, you're going to be grilled about experience with common middleware, engine tools, and the standard SDK environments for your target platform. Getting to grips with the SDK and tools can be the biggest hurdle, and nobody is going to rewrite them because you're not familiar with them.

I interview a lot of programmers wanting to get into very large-scale server environments, and occasionally we'll get somebody with gaming experience who may have worked on games I've heard of, played, and even liked, but writing a game is no preparation for writing highly parallel code where you have to go 300,000 cores wide and manage the data.

It'll likely be a very different discipline, and when I'm faced with a candidate who lacks practical experience on big iron, I will usually prepare some theoretical code problems and see how they propose to tackle them. I would expect you to face the same, given that you can't cite any games experience.
 
I think my biggest expectation of you would be to prove you can transition from what I suspect is offline rendering to real-time rendering on consumer-grade hardware, and that you can "play nice" with what will be very limited resources. You'll be sharing CPU time with every other programmer, and probably GPGPU time going forward. You'll also be expected to surrender CPU and/or GPU time as projects progress and new engineering decisions impact you. Is that something you face now? If not, you'll need to think about how you will answer questions like this.

No, we don't share CPU time. Our production process depends on bidding on the appropriate tools/shaders to get the job done; some shots may require special shaders or FX. We do watch render times, however. We bid out what we think it will take to finish a movie, and if we see render times much greater than expected, or frames not being rendered, we have to track down why. Most of the time it's because the artists aren't familiar enough with the renderer or with general rendering techniques to light or surface in the most efficient manner. This is the "support" side of our work. We do, however, get dev time at the beginning to read papers and implement tools.

Similarly, you're going to be grilled about experience with common middleware, engine tools, and the standard SDK environments for your target platform. Getting to grips with the SDK and tools can be the biggest hurdle, and nobody is going to rewrite them because you're not familiar with them.

This would require experience from already being in gaming. Middleware I could handle if it's something like Maya or Houdini. Engine tools (e.g. UE3, Unity) aren't tools we use; we'd most likely have our own. An SDK targeted at a console would also be outside our realm, since the general CPU is what we write for. :)

I interview a lot of programmers wanting to get into very large-scale server environments, and occasionally we'll get somebody with gaming experience who may have worked on games I've heard of, played, and even liked, but writing a game is no preparation for writing highly parallel code where you have to go 300,000 cores wide and manage the data.

Our multi-threading scope isn't as large. Our offline renderers do use MT, but when we write tools, we usually harness those threads in various ways. For example, if our renderer started up 8 cores to render the frame and has built its BVH database, then when I'm writing a procedural geometry (say, for opening Alembic files), I keep in mind that I either need to share data via mutex locking or synchronization, or have enough memory to make copies of the data structures and let them work in parallel. Things like that. Sending jobs to the farm is easy enough, as none of our farm machines have GPUs.. ;)

It'll likely be a very different discipline, and when I'm faced with a candidate who lacks practical experience on big iron, I will usually prepare some theoretical code problems and see how they propose to tackle them. I would expect you to face the same, given that you can't cite any games experience.

Gotcha.

From these comments, I can see that we come from two different worlds but are basically trying to achieve the same thing.

I would probably have a hard time pushing new visuals while being allocated only a few cycles to implement whatever I wanted to propose. I can see that being frustrating for me, but I'd probably go into R&D mode and try to get what I wanted. Yes, it would take time, but I think approaching things from a film background and trying to implement techniques without raw resources would be quite interesting. It may end up being fruitful.. or you may end up letting me go.. :p

One thing I have noticed is the enormous development time it takes to make a game. I think if I stuck around for one game release that took five years, I'd pretty much know my way around the landscape. Then things would get really interesting, because after that I could start to apply what I've learned in film to the game engine.
 
Experience. If you want to be a graphics engineer, write your own engine in your spare time. There's no way book-derived theory or non-real-time experience would be a valid basis for creating commercial, time-critical engines. You might have valuable experience-based insight into realistic rendering, but as an employer I'd want to see that field-tested, at least in your own experiments. I can't imagine a senior position going to anyone without real game experience either, so you'd perhaps have to start in a junior role.

Maybe dabble in an engine and share it. Perhaps if you produce something novel, you can find an audience and generate experience? Given the interest people have in photo modes, maybe create a simple renderer that imports Unreal scenes and renders them in your own engine, or some such. Even just a visualiser for models.
 
Experience. If you want to be a graphics engineer, write your own engine in your spare time.

Yeap. I just started that actually. ;)

There's no way book-derived theory or non-real-time experience would be a valid basis for creating commercial, time-critical engines.

I'm not sure I agree with that. I've got friends from film who definitely got jobs at some of the bigger studios like EA and Blizzard; granted, Blizzard has its own cinematics division that translates 1:1 for anyone coming from film. Add to that, a lot of the hires I see in the gaming industry came straight from college with a small project that landed them a job. Sure, they got a little experience in, but they're still not experienced in graphics overall. I remember getting my first job because I made a published Maya volume plugin. I thought I knew a lot until I started working for Blue Sky and realized how much I didn't know. The R&D manager laughed at me when I asked to see the code for their ray tracer, as if it were as small as my own tiny little sphere ray tracer.

You might have valuable experience-based insight into realistic rendering, but as an employer I'd want to see that field-tested, at least in your own experiments. I can't imagine a senior position going to anyone without real game experience either, so you'd perhaps have to start in a junior role.

This is probably why I'm thinking it'll be near impossible for me (unless I land at some place like Nvidia, Apple, or Intel doing R&D). I just got an opportunity to work at Gearbox, but we agreed that the pay decrease would be too great for me. If it's not economical for my family, I might as well do it on the side, in my own time, for fun.

Maybe dabble in an engine and share it. Perhaps if you produce something novel, you can find an audience and generate experience? Given the interest people have in photo modes, maybe create a simple renderer that imports Unreal scenes and renders them in your own engine, or some such. Even just a visualiser for models.

I'd more than likely dabble in something that would help my company rather than invest in UE (unless the company sees some benefit in that). And even if I were to create something novel, I doubt I'd be able to share it at that point. I do have some ideas for getting my feet wet in real time, though. I just need to find a cause to motivate me.

One last thing concerning the time-critical comment: even film has time-critical code. By the sheer nature of constantly writing code that runs inside the renderer's shading process (which obviously takes up the bulk of the render time), I have to be constantly aware of repeated function calls, redundant code, huge data structures with slower-than-linear searches, etc.. so we aren't THAT much in the dark.. :p
 
I bailed from the biz in late 2008, so take what I say with a cube of salt in case things have radically changed since then. But back then, honestly, it was a total crapshoot, because every employer had different criteria for what made someone a potential hire as a senior graphics coder.

There were some for whom you could have all the experience in the world, but if you didn't have an extensive math background and know exactly how various mathematical formulas worked, you'd never get the job. They were of the mindset: OK, the internet doesn't exist, you can't look anything up, now explain such-and-such formula and how it works. If you couldn't, you were toast. If you could, you were in.

There were some that couldn't care less about your background in math, and instead wanted to know, as their primary hiring criterion, what games you had shipped and what your graphics role was on them.

There were some that had a fetish for logic puzzles and would drill you on those for a good chunk of the day, I guess because it was the hipster thing to do.

There were some that wanted you to have a degree in such-and-such, otherwise they wouldn't consider you; but if you had that degree, they figured you were smart enough to learn everything on the job and would hire you even with little experience.

There were some that wanted you to have written at least one renderer, to prove your understanding of how the entire graphics chain works. Then again, there were some that preferred you had a graphics background but had never written a renderer, so that you could provide a fresh approach.

There were some that wanted you to have good graphics knowledge along with just being a good guy that people get along with. You'd be shocked at how far just being chill and fitting in will take you in an interview. Given the crazy hours the job entails, no one wants to hire a creeper or someone who won't match the company culture.

And on and on; it's so variable. I'd suggest just brushing up a bit on what you know and throwing your resume into the hat.
 
Thanks for the response Joker.

I'm really not looking to get into the biz right now, as I'd really be crippling my current employer by doing that (as well as my family). I've interviewed with some game companies way back when I was just starting out (about five years in), and they didn't go very well. ;) Some questions were too much for me at the time, and other issues were personality-related.

In any case, I plan on learning real-time at my own pace for now. I'm pretty comfortable with never leaving the film industry, since I've pretty much established tenure, and a lot of film studios are starting to create their own real-time lighting tools (which is something I'm going to look into). In another five years or so, I can see film companies using real-time engines as the norm. There were some really smart guys I worked with at Disney/Pixar; I can easily see gaming companies licensing film's own real-time engines one day. ;)
 
Thanks for the response Joker.

I'm really not looking to get into the biz right now, as I'd really be crippling my current employer by doing that (as well as my family). I've interviewed with some game companies way back when I was just starting out (about five years in), and they didn't go very well. ;) Some questions were too much for me at the time, and other issues were personality-related.

In any case, I plan on learning real-time at my own pace for now. I'm pretty comfortable with never leaving the film industry, since I've pretty much established tenure, and a lot of film studios are starting to create their own real-time lighting tools (which is something I'm going to look into). In another five years or so, I can see film companies using real-time engines as the norm. There were some really smart guys I worked with at Disney/Pixar; I can easily see gaming companies licensing film's own real-time engines one day. ;)
A little OT, but I'd be interested in reading threads about how the film industry is moving in that direction.
 