Some comments from John Carmack

alexsok

John Carmack posted a couple of interesting comments on Slashdot:

My comment specifically regards the "shelf life" of a rendering engine. I think that an upcoming game engine, either the next one or the one after that, will have a notably longer usable life for content creation than we have seen so far. Instead of having to learn new paradigms for content creation every couple years, designers will be able to continue working with common tools that evolve in a compatible way. Renderman is the obvious example -- lots of things have improved and evolved, but its fundamental definition is clearly the same as it was over a decade ago.

This is only loosely related to the realism of the graphics. I don't think a detailed world simulation that is indistinguishable from reality will be here in the next decade, except for tightly controlled environments. You will be able to have real-time flythroughs that can qualify as indistinguishable, but given the ability to "test reality" interactively, we have a lot farther to go with simulation than with rendering.

The X-Box GPU is more of a GF4 than a GF3, but a modern PC is generally much higher end than an X-Box.

However, you can usually count on getting twice the performance out of an absolutely fixed platform if you put a little work into it. There are lots of tradeoffs that need to balance between the different cards on a general purpose platform -- things that I don't do with vertex programs because it would make the older cards even slower, avoiding special casing that would be too difficult to test across all platforms (and driver revs), and double buffering of vertex data to abstract across VAR and vertex objects, for instance. We might cut the "core tick" of Doom from 60hz to 30hz on X-Box if we need the extra performance, because it has no chance of holding 60hz, but the PC version will eventually scale to that with the faster CPUs and graphics cards.
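
For readers unfamiliar with those extensions: the "double buffering of vertex data to abstract across VAR and vertex objects" that Carmack describes amounts to hiding NVIDIA's vertex_array_range and ATI's vertex-object extensions behind a small interface that flips between two buffers each frame, so the CPU fills one buffer while the GPU reads the other. Here is a minimal sketch in C with invented names; it is illustrative only, not id Software's code.

#include <stdlib.h>

typedef struct {
    void *buffers[2];   /* CPU fills one buffer while the GPU reads the other */
    int   current;      /* index of the buffer being written this frame */
    int   sizeBytes;
} vertexCache_t;

typedef struct {
    void *(*Alloc)(int sizeBytes);   /* card-specific allocator hook */
} vertexBackEnd_t;

/* Fallback allocator for cards without a fast-path extension; a VAR or
   vertex-object back end would supply its own Alloc instead. */
static void *SysMem_Alloc(int sizeBytes) {
    return malloc((size_t)sizeBytes);
}

static const vertexBackEnd_t sysMemBackEnd = { SysMem_Alloc };

void VC_Init(vertexCache_t *vc, const vertexBackEnd_t *be, int sizeBytes) {
    if (be == NULL) {
        be = &sysMemBackEnd;            /* default path */
    }
    vc->buffers[0] = be->Alloc(sizeBytes);
    vc->buffers[1] = be->Alloc(sizeBytes);
    vc->current   = 0;
    vc->sizeBytes = sizeBytes;
}

/* Called once per frame: flip buffers and return the pointer to fill. */
void *VC_BeginFrame(vertexCache_t *vc) {
    vc->current ^= 1;
    return vc->buffers[vc->current];
}

The rest of the engine only ever sees VC_BeginFrame, which is what lets the per-card special cases stay contained.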

The generic back end does not use vertex programs, or provide specular highlights, so the custom back ends provide both performance and quality improvements.

There are some borderline cases that may or may not get custom coding -- Radeon R100, Matrox Parhelia, and SiS Xabre are all currently using the default path, but could benefit from additional custom coding. I will only consider that when they have absolutely rock solid quality on the default path, and if it looks like they have enough performance headroom to bother with the specular passes.

The NV20 back end has more work in it than any other, with two different cases for the lighting interaction, but on the X-Box I would probably create additional special cases to optimize some of the other possible permutations.
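
To make the back-end talk concrete, here is a rough C sketch of the kind of dispatch being described: a safe generic interaction path (no vertex programs, no specular passes) plus card-specific paths chosen from the GL renderer string. All names are hypothetical; this is not id's code.

#include <string.h>

typedef void (*interactionFunc_t)(const void *surf, const void *light);

/* Generic path: diffuse only, no vertex programs, no specular. */
static void RB_GenericInteraction(const void *surf, const void *light) {
    (void)surf; (void)light;
}

/* NV20-class path: placeholder for the specialized lighting cases. */
static void RB_NV20Interaction(const void *surf, const void *light) {
    (void)surf; (void)light;
}

typedef struct {
    int               usesVertexPrograms;
    int               drawsSpecular;
    interactionFunc_t DrawInteraction;
} backEnd_t;

backEnd_t R_SelectBackEnd(const char *glRendererString) {
    backEnd_t be = { 0, 0, RB_GenericInteraction };   /* default path */

    /* Only recognized, well-tested cards leave the default path. */
    if (strstr(glRendererString, "GeForce3") || strstr(glRendererString, "GeForce4")) {
        be.usesVertexPrograms = 1;
        be.drawsSpecular      = 1;
        be.DrawInteraction    = RB_NV20Interaction;
    }
    return be;
}

Borderline cards like the ones Carmack lists would simply keep the default struct until their drivers proved solid enough to justify a custom path.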
 
Those are some great (and informative) comments. The "2x" performance number for a fixed platform seems a bit high to me, but Carmack never ceases to amaze me. :)
 
I'd just like to insert a little jab in here.

It seems to me that Tim Sweeney over at Epic showed JC!

Alright, now that that's over, it seems to me that TS really had the right idea with the scripting and modularity of his engine design for Unreal (which was designed to be upgraded, not rewritten). Where JC seems to be much better is in 3D graphics programming, and in overall foresight in computer technology.
 
A couple of things about that.

DOOM3 was entirely rewritten, but it used sort of a shell of the Quake3 engine to start development early on (i.e. not from scratch...). I believe rendering was done first, and the other parts, such as sound, physics, etc., were done later.

All of JC's previous engines were entirely rewritten from scratch, from what I have heard.

That isn't to say that Tim Sweeney necessarily influenced John Carmack in any way, shape, or form. It's just a natural extension that as things get more complex to program, it becomes better to just try your best to program it right the first time, with significant modularity, so that you don't need exponentially increasing time for the development of exponentially more complex software.

It's just that TS did it first in the game engine space... at least as far as 3D FPS engines go, anyway.
 
All of JC's previous engines were entirely rewritten from scratch, from what I have heard.

Some code has been carried over for quite a while, especially the hand-tuned assembly. Doom III was the first time that everything was written from scratch. Really, both of them seem to make good games, and I wish them both well...
 
I know for a fact Doom3 is not written from scratch. At the very least, the entire Q3 shell is present (console, GL wrapping, etc.) and little has changed in this area. The guts of the code are completely different, though.
 
It seems to me that Tim Sweeney over at Epic showed JC!

I don't agree with this. The Unreal engine in previous builds had piss-poor performance, and the modularity didn't seem to do squat for him. Either that, or he got lazy when it came to what was, for the most part, a complete redo of the graphics engine. Modularity is nice to have. I've seen UCC et al. It's all fine and dandy, but a lot of his stuff was neat yet either poorly implemented, stunted by poor design choices, or simply too taxing for the hardware of the day, i.e. worthless at the time. Make it work, then make it good. Carmack made it work; now he's making it good.

Maybe I've come down too hard on Tim, but I firmly believe in working before working well. As for Carmack, do you really think he doesn't have code reuse where it counts? That, and modularity implies that major paradigm shifts aren't usual occurrences, which they were in the earlier days.
 
Friend of mine made this (attached image: 3.gif).

Enjoy
US :)
 
Saem said:
I don't agree with this. The Unreal engine in previous builds has had piss poor performance.

You do realize that the base of the rendering part of the Unreal engine has not been touched since before the Unreal days, right? UT only had a few tweaks and a bunch of items for on-line play. Did you know that UT was originally supposed to be an add-on bot pack for the Unreal game but grew into its own game? After playing around with the latest version of the UT2003 engine, I can assure you that the old performance gripes (which are/were valid with some hardware but not with others) are gone. After talking to Tim, I have more respect for him. I don't think he drives the industry as much as JC does, but he is very talented nonetheless.
 
Saem said:
Make it work, then make it good. Carmack made it work; now he's making it good.

I think both Tim and John were working within the constraints of their own fundamental models.

Tim's original architecture for Unreal was very software-centric, and it didn't transition particularly well to hardware on anything but a very thin-layer API (which is why it was the only thing that was worth buying a Savage2000 for: the Metal port was blindingly fast, running as fast as a GF2 Ultra). He didn't get to fix this by Unreal Tournament, although it didn't really matter because what was there was good enough in that time frame. It was about 70% of where it could be (except on Metal, where every last cycle was taken advantage of), which is good enough.

In contrast, John went the other way, creating a custom renderer designed to make the best out of the hardware paradigm (multitexture + OpenGL) he was working within, because that's one of the things he finds interesting, and he wanted to push a shader-type model (note: that's nothing to do with pixel or vertex shaders) to drive the industry forwards. As to flexibility, remember that Quake3 runs interpreted C code, and the mods can be written in an extremely flexible manner.
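
For context, the "interpreted C code" refers to the Quake3 virtual machine: mod code is written in C, compiled to a compact bytecode, and then either interpreted or translated to native code, which keeps mods portable and sandboxed. The toy stack-machine loop below shows the flavor of the idea; the opcodes are invented for illustration and are not the real QVM instruction set.

#include <stdio.h>

enum { OP_PUSH, OP_ADD, OP_MUL, OP_PRINT, OP_HALT };

/* Execute a tiny bytecode program on a stack machine. */
static void VM_Run(const int *code) {
    int stack[64];
    int sp = 0;             /* stack pointer */
    int pc = 0;             /* program counter */

    for (;;) {
        switch (code[pc++]) {
        case OP_PUSH:  stack[sp++] = code[pc++];          break;
        case OP_ADD:   sp--; stack[sp - 1] += stack[sp];  break;
        case OP_MUL:   sp--; stack[sp - 1] *= stack[sp];  break;
        case OP_PRINT: printf("%d\n", stack[--sp]);       break;
        case OP_HALT:  return;
        }
    }
}

int main(void) {
    /* Computes (2 + 3) * 4 and prints 20. */
    const int program[] = { OP_PUSH, 2, OP_PUSH, 3, OP_ADD,
                            OP_PUSH, 4, OP_MUL, OP_PRINT, OP_HALT };
    VM_Run(program);
    return 0;
}

Because the game logic runs inside a loop like this rather than as native DLL code, the engine can keep it portable and contained while still letting mod authors write ordinary C.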
 
JB: what that man said, double! :)

jb said:
After talking to Tim, I have more respect for him. I don't think he drives the industry as much as JC does, but he is very talented nonetheless.

Both Tim and John are very nice guys and very focused.

The key difference is that John, virtually uniquely in the game industry, actually wants to take the next step and go forward. I think that's the combination of his focus being on technology plus the luxury of having far more money than he needs :)

That's only on the 3D/tech side; there's a lot of innovation in the game industry around now, just not much on the 3D side.
 
I don't think anybody should, for a second, think that Tim is a slouch when it comes to software development. I have, at times, come down on him for various reasons, most of which have more to do with things I've heard from the hardware guys...

But it's very clear that he has a ton of talent. I personally feel that John is at center stage, if you're talking about leadership, while Tim is much more of an "audience" kinda guy. I think Tim is probably more reserved in real life, and doesn't mind being able to go places and not get swamped by a bunch of Quake-heads :)
 