Carmack's comments on NV30 vs R300, DOOM developments

Nagorak said:
Reverend said:
I honestly think all John wants is have DOOM3 run the best all things considered. I doubt he'd ignore the fact that the Radeon 9700Pro was available commercially 6 months ago.

Why would it matter to him? He doesn't play any games and his "game" (and I use the term loosely) isn't out yet.
Coz he makes engines that he sells, which other developers then use to make games?
 
Chalnoth said:
Well, Humus, that may be your opinion, but which one did you learn first? Often people tend to prefer the one that they program with first. And please note that this has purely to do with the programming interface, not the underlying assembly code.

Compare these code snippets from my "Shadows that don't suck" demo:

Code:
GLuint shaderID = glGenFragmentShadersATI(1);
glBindFragmentShaderATI(shaderID);
glBeginFragmentShaderATI();
	GLfloat offset[] = { 0.02f, 0.02f, 0.02f, 0.02f };
	glSetFragmentShaderConstantATI(GL_CON_0_ATI, offset);

	glSampleMapATI(GL_REG_0_ATI, GL_TEXTURE0_ARB, GL_SWIZZLE_STR_ATI);
	glSampleMapATI(GL_REG_1_ATI, GL_TEXTURE1_ARB, GL_SWIZZLE_STQ_ATI);
	glSampleMapATI(GL_REG_2_ATI, GL_TEXTURE2_ARB, GL_SWIZZLE_STR_ATI);
	glSampleMapATI(GL_REG_3_ATI, GL_TEXTURE3_ARB, GL_SWIZZLE_STR_ATI);

	glColorFragmentOp2ATI(GL_DOT3_ATI, GL_REG_0_ATI, GL_NONE, GL_SATURATE_BIT_ATI, GL_REG_0_ATI, GL_NONE, GL_2X_BIT_ATI | GL_BIAS_BIT_ATI, GL_REG_1_ATI, GL_NONE, GL_2X_BIT_ATI | GL_BIAS_BIT_ATI);
	glColorFragmentOp2ATI(GL_SUB_ATI,  GL_REG_2_ATI, GL_NONE, GL_NONE, GL_REG_2_ATI, GL_NONE, GL_NONE, GL_REG_3_ATI, GL_NONE, GL_NONE);
	glColorFragmentOp2ATI(GL_MUL_ATI,  GL_REG_0_ATI, GL_NONE, GL_NONE, GL_REG_0_ATI, GL_NONE, GL_NONE, GL_PRIMARY_COLOR_ARB, GL_NONE, GL_NONE);
	glColorFragmentOp2ATI(GL_ADD_ATI,  GL_REG_2_ATI, GL_NONE, GL_NONE, GL_REG_2_ATI, GL_NONE, GL_NONE, GL_CON_0_ATI, GL_NONE, GL_NONE);
	glColorFragmentOp3ATI(GL_CND0_ATI, GL_REG_0_ATI, GL_NONE, GL_NONE, GL_REG_0_ATI, GL_NONE, GL_NONE, GL_ZERO,      GL_NONE, GL_NONE, GL_REG_2_ATI, GL_NONE, GL_NONE);

glEndFragmentShaderATI();


----------------------------------------------------


GLuint shaderID = glGenLists(1);
glNewList(shaderID, GL_COMPILE);
	
	glCombinerParameteriNV(GL_NUM_GENERAL_COMBINERS_NV, 4);

	glCombinerInputNV(GL_COMBINER0_NV,  GL_RGB, GL_VARIABLE_A_NV, GL_TEXTURE0_ARB, GL_EXPAND_NORMAL_NV, GL_RGB);
	glCombinerInputNV(GL_COMBINER0_NV,  GL_RGB, GL_VARIABLE_B_NV, GL_TEXTURE1_ARB, GL_EXPAND_NORMAL_NV, GL_RGB);
	glCombinerOutputNV(GL_COMBINER0_NV, GL_RGB, GL_TEXTURE0_ARB, GL_DISCARD_NV, GL_DISCARD_NV, GL_NONE, GL_NONE, GL_TRUE, GL_FALSE, GL_FALSE);

	glCombinerInputNV(GL_COMBINER1_NV,  GL_RGB, GL_VARIABLE_A_NV, GL_TEXTURE2_ARB, GL_SIGNED_IDENTITY_NV, GL_RGB);
	glCombinerInputNV(GL_COMBINER1_NV,  GL_RGB, GL_VARIABLE_B_NV, GL_ZERO,         GL_UNSIGNED_INVERT_NV, GL_RGB);
	glCombinerInputNV(GL_COMBINER1_NV,  GL_RGB, GL_VARIABLE_C_NV, GL_TEXTURE3_ARB, GL_SIGNED_NEGATE_NV,   GL_RGB);
	glCombinerInputNV(GL_COMBINER1_NV,  GL_RGB, GL_VARIABLE_D_NV, GL_ZERO,         GL_UNSIGNED_INVERT_NV, GL_RGB);
	glCombinerOutputNV(GL_COMBINER1_NV, GL_RGB, GL_DISCARD_NV, GL_DISCARD_NV, GL_TEXTURE2_ARB, GL_NONE, GL_NONE, GL_FALSE, GL_FALSE, GL_FALSE);

	GLfloat offset[] = { 0.52f, 0.52f, 0.52f, 0.52f };
	glCombinerParameterfvNV(GL_CONSTANT_COLOR0_NV, offset);
	glCombinerInputNV(GL_COMBINER2_NV,  GL_ALPHA, GL_VARIABLE_A_NV, GL_TEXTURE2_ARB,       GL_SIGNED_IDENTITY_NV, GL_BLUE);
	glCombinerInputNV(GL_COMBINER2_NV,  GL_ALPHA, GL_VARIABLE_B_NV, GL_ZERO,               GL_UNSIGNED_INVERT_NV, GL_BLUE);
	glCombinerInputNV(GL_COMBINER2_NV,  GL_ALPHA, GL_VARIABLE_C_NV, GL_CONSTANT_COLOR0_NV, GL_SIGNED_IDENTITY_NV, GL_BLUE);
	glCombinerInputNV(GL_COMBINER2_NV,  GL_ALPHA, GL_VARIABLE_D_NV, GL_ZERO,               GL_UNSIGNED_INVERT_NV, GL_BLUE);
	glCombinerOutputNV(GL_COMBINER2_NV, GL_ALPHA, GL_DISCARD_NV, GL_DISCARD_NV, GL_SPARE0_NV, GL_NONE, GL_NONE, GL_FALSE, GL_FALSE, GL_FALSE);

	glCombinerInputNV(GL_COMBINER3_NV,  GL_RGB, GL_VARIABLE_A_NV, GL_ZERO,             GL_SIGNED_IDENTITY_NV, GL_RGB);
	glCombinerInputNV(GL_COMBINER3_NV,  GL_RGB, GL_VARIABLE_B_NV, GL_ZERO,             GL_SIGNED_IDENTITY_NV, GL_RGB);
	glCombinerInputNV(GL_COMBINER3_NV,  GL_RGB, GL_VARIABLE_C_NV, GL_TEXTURE0_ARB,     GL_UNSIGNED_IDENTITY_NV, GL_RGB);
	glCombinerInputNV(GL_COMBINER3_NV,  GL_RGB, GL_VARIABLE_D_NV, GL_PRIMARY_COLOR_NV, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
	glCombinerOutputNV(GL_COMBINER3_NV, GL_RGB, GL_DISCARD_NV, GL_DISCARD_NV, GL_SPARE0_NV,   GL_NONE, GL_NONE, GL_FALSE, GL_FALSE, GL_TRUE);

	glFinalCombinerInputNV(GL_VARIABLE_A_NV, GL_ZERO,      GL_UNSIGNED_IDENTITY_NV, GL_RGB);
	glFinalCombinerInputNV(GL_VARIABLE_B_NV, GL_ZERO,      GL_UNSIGNED_IDENTITY_NV, GL_RGB);
	glFinalCombinerInputNV(GL_VARIABLE_C_NV, GL_ZERO,      GL_UNSIGNED_IDENTITY_NV, GL_RGB);
	glFinalCombinerInputNV(GL_VARIABLE_D_NV, GL_SPARE0_NV, GL_UNSIGNED_IDENTITY_NV, GL_RGB);

glEndList();

These two both do the same thing; now tell me which one looks more confusing.
 
Humus said:
Chalnoth said:
Well, Humus, that may be your opinion, but which one did you learn first? Often people tend to prefer the one that they program with first. And please note that this has purely to do with the programming interface, not the underlying assembly code.
Compare these code snippets from my Shadows that don't suck demo:
Um, but those are using the register combiners? I thought I was talking about the pixel shader/vertex program extensions?
 
antlers4 said:
Humus just gave an excellent example of why some people prefer DirectX.

Well, I could compare vs2.0/ps2.0 vs. GL_ARB_vertex_program/GL_ARB_fragment_program, where OpenGL is way simpler to deal with.
 
Chalnoth said:
Um, but those are using the register combiners? I thought I was talking about the pixel shader/vertex program extensions?

Well, if that's confusing then I suppose you haven't worked with pixel shaders in OpenGL.

GL_NV_register_combiners == nVidia's pixel shaders in OpenGL.

Or do you know any other pixel shading extensions that run on GF3? It's GL_NV_register_combiners + GL_NV_texture_shader that defines nVidia's pixel shader. The ALU stuff is in the combiners, while dependent texture reads and the like are in the texture_shader. And that's it. So yes, nVidia's pixel shading extensions for sub-GFFX cards are much more confusing.
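To illustrate the split: the texture-addressing half lives in GL_NV_texture_shader and is configured per texture unit through glTexEnvi. A minimal sketch of one dependent read (a made-up example, not taken from the demo above) might look like:

```
/* Stage 0 does an ordinary 2D fetch; stage 1 then does a dependent
   read using the green/blue channels of stage 0's result.  All the
   ALU math still has to be set up separately in the register
   combiners, as in the snippet above. */
glActiveTextureARB(GL_TEXTURE0_ARB);
glTexEnvi(GL_TEXTURE_SHADER_NV, GL_SHADER_OPERATION_NV, GL_TEXTURE_2D);

glActiveTextureARB(GL_TEXTURE1_ARB);
glTexEnvi(GL_TEXTURE_SHADER_NV, GL_SHADER_OPERATION_NV,
          GL_DEPENDENT_GB_TEXTURE_2D_NV);
glTexEnvi(GL_TEXTURE_SHADER_NV, GL_PREVIOUS_TEXTURE_INPUT_NV,
          GL_TEXTURE0_ARB);

glEnable(GL_TEXTURE_SHADER_NV);
```

So a single effect ends up spread across two separately configured state machines, which is part of why people find the NV path confusing.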
 
Nagorak said:
Why would it matter to him? He doesn't play any games and his "game" (and I use the term loosely) isn't out yet.

He'd want to develop and see his game running as well as possible, so he'd for sure use what's been the best commercially available card for the last six months, and also code with it in mind (though not exclusively).
 
Both of those extensions suck, because building up a program or expression procedurally sucks. A syntactic interface is much better: more concise and easier to read.
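For contrast, here's roughly the same kind of per-pixel math written against the text-based GL_ARB_fragment_program interface instead of procedural calls. This is an untested, simplified re-expression of the core of the ATI snippet above (expand to signed range, dot3, modulate), not code from the demo:

```
!!ARBfp1.0
TEMP n, l;
TEX n, fragment.texcoord[0], texture[0], 2D;   # normal map
TEX l, fragment.texcoord[1], texture[1], 2D;   # normalization cube/light map
MAD n, n, 2.0, -1.0;                           # expand [0,1] -> [-1,1]
MAD l, l, 2.0, -1.0;
DP3_SAT n, n, l;                               # saturated N . L
MUL result.color, n, fragment.color;           # modulate by primary color
END
```

The whole program is one string handed to the driver, which is exactly the "syntactic interface" point: the shader reads like code rather than like a pile of state-setting calls.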
 
We have at least some information on the performance of each card using the leaked demo!
[Image: doom3-2.png]
 