A whole load of random questions

That's pretty damn cool.

I'm impressed, 96KB!! :oops:

At the very least, a procedural texture could be blended with other textures in current games to produce things like wood grain when you get up close...
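Just to make the wood-grain idea concrete, here's a rough sketch of what such a generator can look like: concentric growth rings around a "trunk" centre, wobbled by a cheap noise function and dumped as a PPM. The noise hash, ring spacing, and colours are invented for illustration; real generators (kkrieger's included) are far fancier, and the same sort of function could just as easily run in a pixel shader and be blended over a base texture for up-close detail.

Code:
/* A rough sketch of a procedural wood-grain texture: concentric growth rings
 * around a "trunk" centre, wobbled by a cheap hash noise, written out as a
 * PPM image. The noise hash, ring spacing and colours are invented for
 * illustration -- real generators (kkrieger's included) are far fancier. */
#include <math.h>
#include <stdio.h>

/* Cheap hash-based noise in [0,1]; blocky, but enough to break up the rings. */
static double noise2(int x, int y)
{
    unsigned n = (unsigned)x * 374761393u + (unsigned)y * 668265263u;
    n = (n ^ (n >> 13)) * 1274126177u;
    return (double)(n & 0xffffffu) / (double)0xffffffu;
}

int main(void)
{
    const int W = 256, H = 256;
    printf("P3\n%d %d\n255\n", W, H);
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            double u = (x - W / 2) / 16.0;
            double v = (y - H / 2) / 16.0;
            /* Distance from the trunk centre, nudged by noise so the rings
             * aren't perfect circles. */
            double r = sqrt(u * u + v * v) + 0.7 * noise2(x / 8, y / 8);
            double ring = 0.5 + 0.5 * sin(r * 6.28318);
            printf("%d %d %d\n",                 /* light/dark wood tones */
                   (int)(120 + 100 * ring),
                   (int)(70 + 60 * ring),
                   (int)(30 + 20 * ring));
        }
    }
    return 0;
}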
 
Dave B(TotalVR) said:
1: Cg is used to create shaders AFAIK, just like ATI's RenderMonkey. It's a high-level language for programming your pixel shaders with; it compiles to run under D3D or OGL.
No, it's not just like ATI's RenderMonkey. RenderMonkey is more of an IDE, more similar to nVidia's FX Composer than to the Cg language.
 
That's an extremely ignorant statement perpetrated by those who have no knowledge of programming whatsoever. There was never any reason for ATI to end up at a disadvantage due to Cg.
Nothing to do with programming.
All to do with control & apparent leadership.
 
epicstruggle said:
What's the (next-gen) replacement for DVI?
Dual-link DVI? :) I actually just learned about this myself, what with all the ruckus the Apple 30" LCD is making, but it seems to get around DVI's 1600x1200@60Hz limitation quite nicely, as it uses basically the same physical connectors and allows for double the bandwidth.

The real question is, when will we see OSes and games that aren't fixed-pitch? (Specifically, when will Longhorn get here, as I think OS X is already vector-based.) It's still annoying to see games whose menus, HUD, and text don't remain the same size regardless of res. Heck, I'm flabbergasted that CS:S has kept the same HUDs, when free games like Wolf:ET have far superior (read: non-overlapping) text and score boxes.


OpenGL guy said:
The Great Bundini said:
4. A low refresh rate on a CRT monitor is known to cause eyestrain for many users. Yet 60Hz on an LCD monitor is no problem. Why?
LCDs don't flicker as much as CRTs. 60 Hz would be usable if you had a slow phosphor CRT, but you'd probably not appreciate the trails objects would leave when in motion.
I was under the impression that LCDs don't flicker at all--at least, not in the way CRT monitors do? Although I believe (may be wrong) that fluorescent lights themselves flicker at twice the mains frequency (100 or 120Hz, at least with older magnetic ballasts), which may give people similar problems (headaches/eyeball charley horses).

(Warning: The following is what I picked up from online articles and forums, not a technical education. I may be wrong, but it makes sense to me. I'm writing just because I find the differences between CRT and LCD tech interesting, and this question comes up a lot. Still, you'd probably be better served by reading an Anandtech/Ars/ET/THG/etc. article on the subject, as they probably relied on better info than their memory.)

To the OP, GV covered #4 quite well, but allow me to try to explain a bit further. The refresh rate is literally how often the CRT refreshes, or redraws, the picture. The more often it does so (the higher the Hz, or times per second the screen is redrawn), the less evident phosphor decay becomes (the refresh rate rises above your threshold for noticing the constantly varying intensity of pixels that are drawn, fade, and are redrawn, so they read as steady). LCDs don't deal with phosphor decay, though, so they don't exhibit flickering when displaying a single color. They have a constant (fluorescent white) backlight behind the (screendoor-like) liquid crystal array. The LCD's "pixels" (consisting of discrete subpixels, one each R, G, and B) switch between transparent and opaque to represent different colors and shades. (For instance, to show red, the red subpixel is transparent while the green and blue subpixels are opaque--so only red light gets through, in a stained-glass effect. The shade of red depends on how transparent the red subpixel is. To show purple, the red and blue subpixels are transparent, and green is opaque.) So the more relevant term for an LCD is pixel response, or how quickly a pixel can change its opacity to go from one color to another. Flicker is not a problem.
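To put the flicker half of that in concrete terms, here's a toy model (the decay constant is a made-up illustration value, and real phosphors and eyes are more complicated): each refresh kicks the phosphor back to full brightness, it decays until the next pass, and the higher the refresh rate the shallower the ripple you'd have to notice.

Code:
/* Toy model of the flicker explanation above: a CRT phosphor is hit once per
 * refresh and then decays, so what you see is a sawtooth whose depth depends
 * on the refresh rate. The decay constant is a made-up illustration value. */
#include <math.h>
#include <stdio.h>

static void ripple(double refresh_hz, double decay_per_ms)
{
    double period_ms = 1000.0 / refresh_hz;
    /* Brightness right after the beam passes is 1.0; just before the next
     * pass it has decayed for one full refresh period. */
    double trough = exp(-decay_per_ms * period_ms);
    printf("%3.0f Hz: brightness swings between 1.00 and %.2f each refresh\n",
           refresh_hz, trough);
}

int main(void)
{
    const double decay_per_ms = 0.10;  /* illustrative phosphor decay rate   */
    ripple(60.0, decay_per_ms);        /* deep ripple -> more visible flicker */
    ripple(85.0, decay_per_ms);        /* shallower ripple                    */
    ripple(100.0, decay_per_ms);
    /* An LCD pixel, by contrast, simply holds its transmittance until it is
     * told to change, so there is no per-refresh ripple -- hence no flicker. */
    return 0;
}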

Note that LCDs can apparently switch from black to white to black (opaque to transparent and back) quicker than they can from grey to grey (partially opaque to partially opaque). AFAIK, LCD response times quote the former measurement. A "16ms" LCD takes that long to switch from opaque to transparent, but may take longer to switch from grey to grey--and most images (at least in games and movies) involve varying greys rather than black-to-white transitions. Early 16ms panels were able to offer faster grey switching by reducing their available levels, from 24-bit to 18-bit colour (6 bits per subpixel). They represented the full 24-bit range via dithering between two defined opacities, which presents itself as a video-snow effect with certain shades. I think newer 16ms (and 12ms and 8ms) LCDs are truly 24-bit, though.
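For anyone curious how that dithering trick works, here's a rough sketch (the target level and the 4-frame cycle are made-up illustration values): the panel approximates an in-between 8-bit level by alternating between the two nearest levels it can actually show, so the time-average lands on the target.

Code:
/* Sketch of the dithering trick: a 6-bit panel can only show 64 levels per
 * channel, so an in-between 8-bit level is faked by alternating between the
 * two nearest 6-bit levels over successive frames (temporal dithering).
 * The target level and the 4-frame cycle are made-up illustration values. */
#include <stdio.h>

int main(void)
{
    int target8 = 130;                        /* desired 8-bit level (0..255) */
    int lo6 = target8 / 4;                    /* nearest 6-bit level below    */
    int hi6 = lo6 + 1;                        /* next 6-bit level up          */
    double frac = (target8 - lo6 * 4) / 4.0;  /* position between them        */

    double shown = 0.0;
    for (int frame = 0; frame < 4; ++frame) {
        /* Show the higher level on a fraction of the frames, the lower level
         * on the rest. */
        int level6 = (frame < (int)(frac * 4.0 + 0.5)) ? hi6 : lo6;
        shown += level6 * 4;                  /* back on the 8-bit scale      */
        printf("frame %d: panel drives 6-bit level %d\n", frame, level6);
    }
    printf("average over the cycle: %.1f (target was %d)\n",
           shown / 4.0, target8);
    return 0;
}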

LCDs can still cause eyestrain because they tend to be much brighter than CRTs, which may be a problem in dimly-lit rooms. (They can also have trouble representing pure black--the backlight seeps through even fully opaque pixels to offer more of a dark grey than a black--but this only causes headaches for video aficionados. ;))

(FYI, LCDs also don't repaint the screen line by line like CRTs do, which is why an LCD filmed for TV shows a solid image, whereas a filmed CRT shows a band rolling down the screen [the leading edge of the cathode ray's sweep]. You see the rolling band because the CRT's refresh rate is usually different from the camera's frame rate [72Hz or above, vs. 50/60Hz for PAL/NTSC]. The band rolls faster or slower depending on how far apart the two rates are.)
 
The real question is, when will we see OSes and games that aren't fixed-pitch? (Specifically, when will Longhorn get here, as I think OS X is already vector-based.)
OS X is currently about as vector-based as Windows is - Quartz 2D is a device-independent vector rendering system, which is about equal in capabilities to GDI+ (albeit somewhat nicer to use), but the Quartz Compositor is resolutely bitmap-based (there's some interesting discussion here, and in a few of his other entries).

Actually, Windows is currently better than OS X at resolution independence, since it at least allows you to change the notional screen dpi, whereas OS X is fixed at 72dpi, no matter what screen size/resolution you use (admittedly, not many programs cope well with non-standard dpi, since they simply aren't tested, and just draw things with fixed pixel sizes).
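A quick illustration of why the notional dpi matters (the dpi values below are just the usual notional Mac and Windows settings): a size specified in points only keeps its physical size if the OS actually scales it by the real dpi, which is exactly what breaks when programs hard-code pixel sizes.

Code:
/* Why the notional dpi matters: a size given in points (1/72 inch) only keeps
 * its physical size if the OS scales it by the real dpi. At a notional 72dpi,
 * points and pixels coincide, which is why so much code gets away with
 * hard-coding pixel sizes -- until the dpi changes. Values are illustrative. */
#include <stdio.h>

static double points_to_pixels(double pts, double dpi)
{
    return pts * dpi / 72.0;    /* 1 point = 1/72 inch */
}

int main(void)
{
    const double label_pt = 12.0;                 /* a "12 point" UI label   */
    const double dpis[] = { 72.0, 96.0, 120.0 };  /* notional Mac dpi,
                                                     Windows default,
                                                     Windows "large fonts"   */
    for (int i = 0; i < 3; ++i)
        printf("%.0f pt at %.0f dpi -> %.1f px\n",
               label_pt, dpis[i], points_to_pixels(label_pt, dpis[i]));
    return 0;
}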
 
Pete said:
epicstruggle said:
What's the (next-gen) replacement for DVI?
Dual-link DVI? :) I actually just learned about this myself, what with all the ruckus the Apple 30" LCD is making, but it seems to get around DVI's 1600x1200@60Hz limitation quite nicely, as it uses basically the same physical connectors and allows for double the bandwidth.
lol, I was reading about it today on HardForums, also finding info on Apple's 30" LCD. :p Only problem is that I don't think there are very many graphics cards that support dual-link.

epic
 
Diplo said:
The Great Bundini said:
3. Are any games currently using any programmatically generated textures?

Try kkrieger: Chapter 1, which is a whole 3D shooter done in under 96K, and all its textures are procedurally generated:

[screenshots]


http://www.theprodukkt.com/kkrieger.html

lol at a 96k game having far far far far superior lighting and shadowing to all of Half-Life 2 :LOL: :LOL:
 
Thanks for the correction and the link, arhra.

epic, that 30" deserves all the recognition it's getting, IMO. I saw it in-store, and it looked amazing with OS X during my five minutes with it, even a mere two feet away. Yep, not many dual-link video cards, but that should change when we move beyond 1280x1024 as a semi-standard for most computer monitors (read: LCDs). IOW, another decade or so? :)
 
hovz said:
lol at a 96k game having far far far far superior lighting and shadowing to all of Half-Life 2 :LOL: :LOL:

I'd have to say better than Doom3, since it was nothing but darkness...
 
Pete said:
epicstruggle said:
What's the (next-gen) replacement for DVI?
Dual-link DVI? :) I actually just learned about this myself, what with all the ruckus the Apple 30" LCD is making, but it seems to get around DVI's 1600x1200@60Hz limitation quite nicely, as it uses basically the same physical connectors and allows for double the bandwidth.

If you want an eye opener, try a search for IBM's "Big Bertha" LCD display. IIRC it has 4 inputs.
 
The Great Bundini said:
5. From my understanding, vertex lighting only affects the vertices of the polygons which it acts upon. Is this true? In all actuality, wouldn't Phong lighting be more impressive?

Just to let you know, you can evaluate Phong lighting per vertex, in which case you are just using Gouraud shading. Phong shading is actually where you interpolate the normals from the vertices and then do the Phong lighting at each pixel.
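To make the difference concrete, here's a small sketch (the light/view directions and the specular exponent are made-up illustration values): with per-vertex ("Gouraud") evaluation, a specular highlight sitting between two vertices is almost lost, while interpolating the normal and evaluating per pixel catches it.

Code:
/* Sketch of the difference described above: "Gouraud" evaluates the lighting
 * model at the vertices and interpolates the resulting colours; "Phong
 * shading" interpolates the normal and evaluates the model per pixel. Light
 * and view directions and the exponent are made-up illustration values. */
#include <math.h>
#include <stdio.h>

typedef struct { double x, y, z; } vec3;

static vec3 unit(vec3 v)
{
    double l = sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    vec3 r = { v.x / l, v.y / l, v.z / l };
    return r;
}

static double dot(vec3 a, vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

/* Blinn-Phong style specular term for a given surface normal, with the light
 * and the viewer both straight down the z axis (so the half-vector is +z). */
static double specular(vec3 n)
{
    vec3 h = unit((vec3){ 0.0, 0.0, 1.0 });
    double s = dot(unit(n), h);
    return s > 0.0 ? pow(s, 64.0) : 0.0;
}

int main(void)
{
    /* Two vertex normals tilted away from the light; the real highlight sits
     * between them, where the interpolated normal points straight at it. */
    vec3 n0 = { -0.5, 0.0, 1.0 };
    vec3 n1 = {  0.5, 0.0, 1.0 };

    /* Gouraud: light the vertices, interpolate the colours to the midpoint. */
    double gouraud = 0.5 * specular(n0) + 0.5 * specular(n1);

    /* Phong shading: interpolate the normal to the midpoint, then light it. */
    vec3 nmid = { 0.5 * (n0.x + n1.x), 0.5 * (n0.y + n1.y), 0.5 * (n0.z + n1.z) };
    double phong = specular(nmid);

    printf("specular at the midpoint -- Gouraud: %.3f, per-pixel Phong: %.3f\n",
           gouraud, phong);
    return 0;
}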
 
Or more correctly Bui-Tuong lighting...

Phong Bui-Tuong is his full name, with Bui-Tuong being his family name and Phong being his given name (first name). But on his seminal PhD thesis he wrote his name in the Vietnamese style of family name first, given name second, so, seeing "Bui-Tuong Phong", all the Western readers quickly named these revolutionary ideas Phong shading and Phong lighting.

Which, I'm assured, for Vietnamese readers is the equivalent of something like "John's shading".
 
Simon F said:
Pete said:
epicstruggle said:
What's the (next-gen) replacement for DVI?
Dual-link DVI? :) I actually just learned about this myself, what with all the ruckus the Apple 30" LCD is making, but it seems to get around DVI's 1600x1200@60Hz limitation quite nicely, as it uses basically the same physical connectors and allows for double the bandwidth.

If you want an eye opener, try a search for IBM's "Big Bertha" LCD display. IIRC it has 4 inputs.
:o


epic
 
Pete said:
...it seems to get around DVI's 1600x1200@60Hz limitation quite nicely...
Just wanted to point out that 1600x1200@60Hz is not the limit for single-link DVI. Staying within spec and with full horizontal and vertical blanking, you can get 1920x1200@60Hz. Reduce the H. & V. blanking (which LCD displays shouldn't need anyway) and you can go even higher.
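For anyone who wants to check the numbers, here's the back-of-the-envelope arithmetic: a single TMDS link tops out at a 165MHz pixel clock, and the pixel clock is just total (active + blanking) pixels per line x total lines x refresh rate. The blanking totals below are typical VESA-style figures, not the exact DVI minima being referred to above.

Code:
/* Back-of-the-envelope check: a single TMDS link is capped at a 165MHz pixel
 * clock, and the pixel clock is total pixels per line (active + blanking)
 * times total lines times the refresh rate. The blanking totals below are
 * typical VESA-style figures, not the exact DVI minimum blanking. */
#include <stdio.h>

static double pixel_clock_mhz(int h_total, int v_total, double hz)
{
    return h_total * v_total * hz / 1e6;
}

int main(void)
{
    const double single_link_mhz = 165.0;

    /* 1600x1200@60 with conventional blanking: 2160x1250 total. */
    printf("1600x1200@60: %6.1f MHz (single-link limit %.0f MHz)\n",
           pixel_clock_mhz(2160, 1250, 60.0), single_link_mhz);

    /* 1920x1200@60 with reduced blanking: roughly 2080x1235 total. */
    printf("1920x1200@60 (reduced blanking): %6.1f MHz\n",
           pixel_clock_mhz(2080, 1235, 60.0));

    /* 2560x1600@60 (the 30-inch panels): roughly 2720x1646 total. */
    printf("2560x1600@60: %6.1f MHz -> needs the second link\n",
           pixel_clock_mhz(2720, 1646, 60.0));
    return 0;
}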
 
Big Bertha in all her glory:

Talk about eye candy! The IBM T221, a 22.2-inch LCD monitor, offers such a stunning image that we feel we must warn you: There is no going back. This may be a slight problem because of the $18,999 price tag for the display...

Bertha's core technology is a jumble of the latest letters: It uses an active-matrix TFT LCD panel with dual-domain in-plane switching (IPS) technology. It has a resolution of 3,840-by-2,400, which allows for 12 times as much data as a 1,024-by-768 monitor. (That's QUXGA-W, for Quad-UXGA Wide; the wide part comes from the 16:10 aspect ratio, which is just slightly bigger than a typical movie screen's.) Another way to look at it: That's 9.2 million pixels. Two DVI outputs simultaneously channel the digital signals, and four bricks supply AC power.

http://www.pcmag.com/article2/0,1759,7688,00.asp
 
hovz said:
Diplo said:
The Great Bundini said:
3. Are any games currently using any programmatically generated textures?

Try kkrieger: Chapter 1, which is a whole 3D shooter done in under 96K, and all its textures are procedurally generated:

[screenshots]


http://www.theprodukkt.com/kkrieger.html

lol at a 96k game having far far far far superior lighting and shadowing to all of Half-Life 2 :LOL: :LOL:

That makes you think what they could do with a whole CPU dedicated to that (like XB2 apparently will have).
And it's a small team.
 
pc999 said:
That makes you think what they could do with a whole CPU dedicated to that (like XB2 apparently will have).
And it's a small team.
Dedicated to what? Lighting? Procedural content generation?
 
pc999 said:
hovz said:
Diplo said:
The Great Bundini said:
3. Are any games currently using any programmatically generated textures?

Try kkrieger: Chapter 1, which is a whole 3D shooter done in under 96K, and all its textures are procedurally generated:

[screenshots]


http://www.theprodukkt.com/kkrieger.html

lol at a 96k game having far far far far superior lighting and shadowing to all of Half-Life 2 :LOL: :LOL:

That makes you think what they could do with a whole CPU dedicated to that (like XB2 apparently will have).
And it's a small team.

My post was aimed more at discrediting the lame Source engine, which is nothing more than the Unreal 2 engine with a few DX9 shaders.
 