A whole load of random questions

1. nVidia Cg. It's a shader language that's considered an extension of an existing API, like Direct3D or OpenGL, correct? Are there any games that use it yet? I thought Doom 3 was supposed to, but id never implemented it. Correct me if I'm wrong on any of this.

2. Culling. Say you're standing in front of a wall with a model on the other side. Is that model being culled? Is culling only considered if an object or model previously in view is removed from view?

3. Are any games currently using any programmatically generated textures?

EDIT-
4. A low refresh rate on a CRT monitor is known to cause eyestrain for many users. Yet 60Hz on an LCD monitor is no problem. Why?

EDIT 2-
5. From my understanding, vertex lighting only affects the vertices of the polygons it acts upon. Is this true? If so, wouldn't Phong lighting be more impressive?
 
1. Cg is built on top of DX and OGL. Cg shaders can be compiled to either vertex/pixel shaders (for DX) or vertex/fragment programs (for OGL), and the underlying API then uses the compiled result for rendering. I think Far Cry used Cg during development, but I'm not sure if it uses Cg at runtime (note that there's a Cg compiler bundled with the retail game).

2. Generally, culling means avoiding processing unnecessary data in computer graphics, either because the object is occluded by another object (the example you gave), or because the object is outside the view frustum (i.e. the object is too far away to be seen, or it is right behind you). I think all games use culling to some extent, and they usually take different approaches depending on the circumstances. You can search for BSP, quadtree, octree, and portal rendering for detailed info on culling techniques (there's a small frustum-test sketch at the end of this post).

In a simple implementation, the visibility set (the stuff that survives culling) should be recalculated each time the camera or the scene moves relative to the other. I think you could take an incremental approach for acceleration, yet I doubt it's worth it, considering the complexity of the implementation and the sky-rocketing processing power of hardware.

3. I dunno. :LOL:
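
Since the frustum part of the culling answer is easy to show in code, here's a rough sketch in plain C (not taken from any actual engine) of the per-object sphere-vs-frustum test; extracting the six planes from the view-projection matrix, occlusion culling, and BSP/octree traversal are all left out, and every name below is made up purely for illustration.

Code:
#include <stdbool.h>

typedef struct { float x, y, z; } Vec3;
typedef struct { Vec3 n; float d; } Plane;    /* dot(n, p) + d = 0, n points into the frustum */
typedef struct { Vec3 center; float radius; } Sphere;

static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

/* Returns true if the bounding sphere lies entirely outside the frustum,
 * i.e. the object can be skipped (culled) for this frame. */
bool cull_sphere(const Plane frustum[6], Sphere s)
{
    for (int i = 0; i < 6; ++i) {
        float dist = dot(frustum[i].n, s.center) + frustum[i].d;
        if (dist < -s.radius)          /* completely behind this plane */
            return true;
    }
    return false;                      /* inside or intersecting: draw it */
}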
 
For three I think the answer could very well be "all the games which have water shaders" since those usually have some sort of fractal that creates the actual water pattern. Dunno about the rest though.
 
Doom 3 has a Cg render path, but it's pretty experimental and only accepts the interaction programs. Carmack said something about adding Cg support to the ARB2 path, which is very interesting... especially since the ARB2 path looks for the Cg equivalent of the interaction programs at start up. If it's in the ARB2 path, it should be properly implemented into the material system as well.
 
The Great Bundini said:
1. nVidia Cg. It's a shader language that's considered an extension of an existing API, like Direct3D or OpenGL, correct? Are there any games that use it yet? I thought Doom 3 was supposed to, but id never implemented it. Correct me if I'm wrong on any of this.
For the most part, Cg is a development tool that is roughly akin to HLSL, but also compiles to OpenGL shaders, not just Direct3D ones.

3. Are any games currently using any programmatically generated textures?
Sure. But I don't know if any currently use ones generated within the shader. I do know that the majority of the animated textures in the original Unreal were procedural.
 
The Great Bundini said:
1. nVidia Cg. It's a shader language that's considered an extension of an existing API, like Direct3D or OpenGL, correct?
Cg is a shading language close to DirectX HLSL. To use it, you either need the Cg toolkit to compile the shaders down to an assembly shader target, or you can pass Cg code to the GLSL compiler in the NVidia OpenGL driver.
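
To make that concrete, here is roughly what loading a Cg shader through the Cg runtime looks like from C, assuming the OpenGL profile layer is used; the file name and its "main" entry point are made-up for illustration, and error handling is cut down to a single check.

Code:
#include <stdio.h>
#include <Cg/cg.h>
#include <Cg/cgGL.h>

CGprogram load_fragment_shader(const char *path)
{
    CGcontext ctx = cgCreateContext();

    /* Ask the runtime for the best fragment profile the driver supports
     * (e.g. arbfp1 or fp30), then compile the Cg source down to it. */
    CGprofile profile = cgGLGetLatestProfile(CG_GL_FRAGMENT);
    CGprogram prog = cgCreateProgramFromFile(ctx, CG_SOURCE, path,
                                             profile, "main", NULL);
    if (!prog) {
        fprintf(stderr, "Cg error: %s\n", cgGetErrorString(cgGetError()));
        return NULL;
    }

    /* Hand the compiled result to the GL driver and make it current. */
    cgGLLoadProgram(prog);
    cgGLEnableProfile(profile);
    cgGLBindProgram(prog);
    return prog;
}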
 
OK, that answers a great deal.

One more (I'll add it to the first post as well)

4. A low refresh rate on a CRT monitor is known to cause eyestrain for many users. Yet 60Hz on an LCD monitor is no problem. Why?
 
CRT pixels use phosphors that glow when struck by an electron beam. The beam sweeps across the screen to draw each line. After being hit, the glow of the phosphors starts to dim until they are struck again. The refresh rate in Hz determines how many times each second a given phosphor will be struck. If it's too low, then parts of the screen start to dim in between refreshes, and you get flicker and eyestrain. For commonly used CRT phosphors, this becomes noticeable below 75 Hz or so (although some people's eyes are more or less sensitive).

LCD pixels use liquid crystals to mask light coming from a backlight. Polarizing the crystals can block the light completely, partially, or not at all, thus controlling the brightness. Once a pixel is set to a certain brightness level, it stays there until it's told to change. So LCDs never flicker unless there is some sort of motion on the screen (i.e. pixels changing brightness). Even then, only the moving areas will flicker, rather than the whole screen (as is the case with CRTs). This generally means that LCDs can get away with lower refresh rates.
 
The Great Bundini said:
4. A low refresh rate on a CRT monitor is known to cause eyestrain for many users. Yet 60Hz on an LCD monitor is no problem. Why?
LCDs don't flicker as much as CRTs. 60 Hz would be usable if you had a slow phosphor CRT, but you'd probably not appreciate the trails objects would leave when in motion.
 
OpenGL guy said:
LCDs don't flicker as much as CRTs. 60 Hz would be usable if you had a slow phosphor CRT, but you'd probably not appreciate the trails objects would leave when in motion.
Well, if the phosphors were that slow, a white region on the screen would get brighter the longer it stayed white. This is a much bigger problem, and would result in the screen getting burned out.
 
Chalnoth said:
OpenGL guy said:
LCDs don't flicker as much as CRTs. 60 Hz would be usable if you had a slow phosphor CRT, but you'd probably not appreciate the trails objects would leave when in motion.
Well, if the phosphors were that slow, a white region on the screen would get brighter the longer it stayed white. This is a much bigger problem, and would result in the screen getting burned out.
Are you speaking from experience or what? What is your experience with such monitors? I've used a slow phosphor monitor on the Amiga to alleviate the interlaced mode flicker. I never noticed the screen getting brighter the longer something white was on the screen, however, I did notice that moving objects left trails, making it pretty much unusable for games. (The early Amigas only supported 15.7 kHz hsync which many VGA monitors refused to even sync to.)

Given that a particular pixel is only being hit by the electron beam for a short period of time, and that most of the time the pixel is not being hit by the beam, don't you think there'd be a balance reached between the amount of energy absorbed by the phosphor when hit by the beam and the amount of energy released? I assume the designers of such monitors took these factors into account.

The trails don't last for very long so I assume the "warm up time" for a pixel is about the same as its cool off time.

I never did any testing on the longevity of such a monitor, but it sounds like you've done extensive research so please clue us in.
 
Yeah, you're right in that it won't necessarily burn it out. Sorry about that. The phosphor decay time is an exponential decay, and thus there would be an equilibrium reached. I'm not sure if it'd be symmetric with the "cool off time," but I suppose it'd be close.
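
The equilibrium argument is easy to see with a toy model: add a fixed pulse of energy each refresh and let the brightness decay exponentially in between. The numbers below (60 Hz, a 4 ms decay constant, a unit pulse) are made up purely for illustration, not measured phosphor data; the point is only that the brightness settles at a fixed level instead of growing without bound.

Code:
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double refresh_hz = 60.0;
    const double dt = 1.0 / refresh_hz;    /* time between beam hits (s)      */
    const double tau = 0.004;              /* assumed decay constant (s)      */
    const double pulse = 1.0;              /* energy added per beam hit       */
    double brightness = 0.0;

    for (int frame = 1; frame <= 20; ++frame) {
        /* One refresh: beam adds energy, then the phosphor decays until
         * the next hit. */
        brightness = (brightness + pulse) * exp(-dt / tau);
        printf("frame %2d: brightness just before next hit = %.4f\n",
               frame, brightness);
    }
    /* The series converges to pulse * k / (1 - k) with k = exp(-dt/tau),
     * i.e. an equilibrium, not a runaway. */
    return 0;
}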
 
Cg was kind of an attempt by Nvidia to force everyone to 'follow their lead' by using a language made & controlled by them.
Or alternatively, their temporary attempt to fill the gap until Dx9/OGL came out with their own shader language/compilers

It was mostly unsuccessful because:
*ATI refused to write a compiler (understandable)
*By the time proper shader-heavy titles came out, the Dx9 & OGL shader languages were there & supported by everyone
*Because of the poor shader performance/feature problems of the FX series, Nvidia was having to work very closely with developers to hand-craft workarounds for common performance/graphical problems, so there was no point in using an automatic compiler

Nvidia even wound up officially stating that developers should just use the Dx9/OGL shader languages.
 
arrrse said:
Cg was kind of an attempt by Nvidia to force everyone to 'follow their lead' by using a language made & controlled by them.
That's an extremely ignorant statement perpetuated by those who have no knowledge of programming whatsoever. There was never any reason for ATI to end up at a disadvantage due to Cg.
 
OK, yet another one.

5. From my understanding, vertex lighting only affects the vertices of the polygons it acts upon. Is this true? If so, wouldn't Phong lighting be more impressive?

Thanks. :p
 
The Great Bundini said:
1. nVidia Cg. It's a shader language that's considered an extension of an existing API, like Direct3D or OpenGL, correct? Are there any games that use it yet? I thought Doom 3 was supposed to, but id never implemented it. Correct me if I'm wrong on any of this.

2. Culling. Say you're standing in front of a wall with a model on the other side. Is that model being culled? Is culling only considered if an object or model previously in view is removed from view?

3. Are any games currently using any programmatically generated textures?

EDIT-
4. A low refresh rate on a CRT monitor is known to cause eyestrain for many users. Yet 60Hz on an LCD monitor is no problem. Why?

EDIT 2-
5. From my understanding, vertex lighting only affects the vertices of the polygons it acts upon. Is this true? If so, wouldn't Phong lighting be more impressive?

1: Cg is used to create shaders AFAIK, just like ATI's RenderMonkey. It's a high-level language for programming your pixel shaders. It compiles to run under D3D or OGL.

2: Culling is done as much as is practical. There are many techniques, some in the game software and now some on the graphics card. The situation described is certainly one where you would want the object to be culled, but visibility is considered per frame only: each frame's visibility is determined independently of the previous or next frame.

3: 3DMark ;P Dunno to be honest. It would be nice, as it's gonna cut down on texture bandwidth/footprint, and if you can create a marble-like texture procedurally then why not use that instead of a texture map.

4: A CRT monitor is not drawing to every part of the screen at once; it scans across horizontally in one sweep, then moves down vertically and sweeps horizontally again, and so on. The light is emitted when this swept beam of electrons hits the phosphorescent inside surface of the monitor screen. Once the beam moves away from a section of screen, that section stops giving off light, so you see flicker. Conversely, on an LCD the backlight is constant and each pixel simply holds its state, so the refresh rate merely affects how quickly the picture can change, rather than how quickly it flashes like a strobe.

5: When you render a triangle using vertex lighting, each pixel of the triangle has its light intensity worked out by interpolating between the light intensities at the three vertices of the triangle. Obviously with large triangles this looks poo. With a scene of entirely single-pixel triangles it would look perfect. Phong shading would of course look better in all situations; this is kind of achieved now with the per-pixel lighting available using pixel shaders and dot3 bump mapping (but not actually being used for bump mapping).
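
To illustrate the difference in point 5, here's a small C sketch of both approaches for a single point inside a triangle, given barycentric weights; a plain diffuse N.L term stands in for the full lighting model, and all the names are made up for illustration.

Code:
#include <math.h>

typedef struct { float x, y, z; } Vec3;

static float dotf(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3  norm3(Vec3 v) {
    float len = sqrtf(dotf(v, v));              /* assumes non-degenerate input */
    Vec3 r = { v.x/len, v.y/len, v.z/len };
    return r;
}
static float diffuse(Vec3 n, Vec3 light_dir) {
    float d = dotf(norm3(n), light_dir);
    return d > 0.0f ? d : 0.0f;
}

/* Vertex (Gouraud) lighting: light the three vertices, then interpolate
 * the resulting intensities across the triangle. */
float gouraud(Vec3 n0, Vec3 n1, Vec3 n2, Vec3 light_dir,
              float w0, float w1, float w2)
{
    return w0 * diffuse(n0, light_dir)
         + w1 * diffuse(n1, light_dir)
         + w2 * diffuse(n2, light_dir);
}

/* Per-pixel (Phong-style) lighting: interpolate the normal instead,
 * renormalize it, and evaluate the lighting at every pixel. */
float per_pixel(Vec3 n0, Vec3 n1, Vec3 n2, Vec3 light_dir,
                float w0, float w1, float w2)
{
    Vec3 n = { w0*n0.x + w1*n1.x + w2*n2.x,
               w0*n0.y + w1*n1.y + w2*n2.y,
               w0*n0.z + w1*n1.z + w2*n2.z };
    return diffuse(n, light_dir);
}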
 
Chalnoth said:
OpenGL guy said:
LCDs don't flicker as much as CRTs. 60 Hz would be usable if you had a slow phosphor CRT, but you'd probably not appreciate the trails objects would leave when in motion.
Well, if the phosphors were that slow, a white region on the screen would get brighter the longer it stayed white. This is a much bigger problem, and would result in the screen getting burned out.


That wouldn't happen unless the electron beam was too intense.
 
The Great Bundini said:
3. Are any games currently using any programmatically generated textures?

Try kkrieger: Chapter 1, which is a whole 3D shooter done in under 96K, and all its textures are procedurally generated:



http://www.theprodukkt.com/kkrieger.html
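
For anyone curious what "a marble-like texture with a procedural texture" means in practice, here's a tiny, self-contained C sketch (nothing to do with kkrieger's actual generator) that writes a crude marble-ish pattern to a greyscale PGM: a sine stripe perturbed by cheap interpolated hash noise. All the constants are arbitrary.

Code:
#include <math.h>
#include <stdio.h>

/* Cheap, repeatable pseudo-noise in [0, 1) from integer hashing. */
static float noise2d(int x, int y)
{
    unsigned int n = (unsigned int)x * 374761393u + (unsigned int)y * 668265263u;
    n = (n ^ (n >> 13)) * 1274126177u;
    return (float)(n & 0xffffffu) / 16777216.0f;
}

/* Bilinearly interpolated value noise, so the wobble is smooth. */
static float smooth_noise(float x, float y)
{
    int   xi = (int)floorf(x), yi = (int)floorf(y);
    float fx = x - xi,          fy = y - yi;
    float a = noise2d(xi, yi),     b = noise2d(xi + 1, yi);
    float c = noise2d(xi, yi + 1), d = noise2d(xi + 1, yi + 1);
    float top = a + fx * (b - a), bottom = c + fx * (d - c);
    return top + fy * (bottom - top);
}

int main(void)
{
    const int w = 256, h = 256;
    printf("P2\n%d %d\n255\n", w, h);            /* ASCII greyscale PGM header */
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            /* Marble-ish: vertical sine stripes, wobbled by noise. */
            float turbulence = smooth_noise(x * 0.05f, y * 0.05f) * 5.0f;
            float v = 0.5f + 0.5f * sinf(x * 0.08f + turbulence);
            printf("%d ", (int)(v * 255.0f));
        }
        printf("\n");
    }
    return 0;
}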
 