nVidia's Jen-Hsun Huang speaks out on X-Box 2 possibility.

BenSkywalker said:
No matter how you look at it, 1080i is only displaying 1920x540 pixels @60Hz. Given how TV standards advance in the US, that will likely be the highest-end solution for decades.

At the analog interface, 1080i actually specifies 1080 active interlaced lines (odd + even fields) within a 1125-line interlaced frame; there is no requirement on horizontal pixel width, as there is no such thing as a pixel in the analog interface. A digital display samples/digitizes each line with its own sampling rate.

The number of pixels per line is part of the digital spec, and it can be 1280, 1440 or 1920. On the analog side you just have to get the timing right, and the number of pixels you use to generate the lines is arbitrary.
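To make that concrete, here is a rough sketch (my own illustration, not from the thread, using the nominal SMPTE 274M 1080i/60 figures of 1125 total lines and 1920 active samples at 74.25 MHz) of how different horizontal pixel counts can all fill the same analog line time, just with different sample clocks:

```python
# Rough sketch: the analog side of 1080i only fixes the line timing, not how
# many samples were used to build each line.  Figures are the nominal
# SMPTE 274M 1080i/60 numbers (1920 active samples at 74.25 MHz).
ACTIVE_LINE_S = 1920 / 74.25e6          # active picture time per line (~25.9 us)

def sample_clock_for(width_px):
    """Sample clock (Hz) needed so width_px samples span the active line time."""
    return width_px / ACTIVE_LINE_S

for width in (1280, 1440, 1920):
    print(f"{width:>4} samples/line -> {sample_clock_for(width) / 1e6:6.2f} MHz sample clock")
# All three produce a valid 1080i signal; only the horizontal detail differs.
```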
 
BenSkywalker said:
Hey Crazyace-

All the information I can find indicates that 3DSM uses 64bit color (16bit per channel), along with Mental Ray, Lightworks and PRRenderman.

I can certainly believe that they specify their output at 16 bits per component, but I find it very hard to believe that they do internal calculations (their shader precision) at that level since pretty much nothing supports it natively.

When talking about precision, it is a natural assumption that you need an extremely long and complex calculation before visible differences show up on screen. While that seems intuitive (since screens only have about 8 bits of resolution per component), it is actually a fallacy.

In reality it is highly dependent on what data you are calculating and how it is used in the shader - you can easily run out of precision in just one step (texture coordinate calculations are a good example) and radically affect what you see on screen. Since a texture lookup indirectly fetches a colour whose value depends on the accuracy of the coordinate calculation used to obtain it, you can end up fetching white where a more accurate calculation would have displayed black.

Real life is certainly not as simple as "I'm only ever outputting 8 bits, so 16 bits in every calculation is enough"
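As a contrived illustration of the texture-coordinate point above (my own sketch, not from the thread; the texture, coordinates and the choice of float16 are all assumptions picked to make the effect obvious), one low-precision step is enough to fetch the wrong texel:

```python
# A contrived sketch of how one low-precision step in a texture coordinate
# calculation can flip the fetched colour entirely.
import numpy as np

# Hypothetical 1D texture, 1024 texels: left half black, right half white.
tex = np.zeros(1024, dtype=np.uint8)
tex[512:] = 255

def lookup(u):
    """Nearest-neighbour fetch from the 1D texture for coordinate u in [0, 1)."""
    return tex[int(u * len(tex)) % len(tex)]

# A coordinate built from a large offset plus a small per-pixel value, as might
# happen with scrolled/tiled textures.  Only the fractional part matters.
base, delta = 1000.0, 0.4996

u32 = np.float32(base + delta) % 1.0                            # ~0.4996 -> black texel
u16 = (np.float16(base) + np.float16(delta)) % 1.0              # rounds up to 0.5 -> white texel

print("fp32 fetch:", lookup(float(u32)))   # 0   (black)
print("fp16 fetch:", lookup(float(u16)))   # 255 (white)
```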
 
Look, basically if a console is rendering 1080i the graphics core's frame buffers will be set at 1920x1080 pixels.

OR at least the front buffer will. Many PS2 games do use half-height back buffers, but that requires a very stable frame rate to look right.

GCN and Xbox use full-height back buffers even in pure interlaced (480/1080i) mode for games which don't support 480/720p.

The raster load is still 1920 * 1080 pixels (or whatever the horizontal resolution ends up being).
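As a back-of-the-envelope illustration of why that matters (my own numbers; 32-bit colour and a 32-bit Z buffer are assumptions the post doesn't specify):

```python
# Rough memory cost of full-height vs half-height buffers at 1080i,
# assuming 8:8:8:8 colour and a 32-bit depth buffer.
WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 4

def mib(pixels, bpp=BYTES_PER_PIXEL):
    return pixels * bpp / (1024 * 1024)

front     = mib(WIDTH * HEIGHT)          # full-height front buffer
back_full = mib(WIDTH * HEIGHT)          # full-height back buffer
back_half = mib(WIDTH * (HEIGHT // 2))   # half-height (one field) back buffer
z_full    = mib(WIDTH * HEIGHT)          # assumed 32-bit depth buffer

print(f"front buffer:            {front:5.2f} MiB")
print(f"full-height back buffer: {back_full:5.2f} MiB  (+{z_full:.2f} MiB Z)")
print(f"half-height back buffer: {back_half:5.2f} MiB  (+{z_full / 2:.2f} MiB Z)")
# 1920*1080*4 bytes is ~7.9 MiB per surface - more than the PS2 GS's 4 MiB of
# embedded VRAM, which is part of why half-height field buffers were attractive.
```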
 
16 bit

Hi Ben,

The packages store the final images in 16-bit-per-channel format, but all rendering and lighting calculations are performed as floats. (16 bits gives a good range for transferring to film, or for doing 2D compositing and post effects.)
The only equivalent on the new cards is the new 10:10:10:2 DAC format, and the compression of the floating point buffer or output colour down to a gamma-corrected final integer value... There probably isn't as much need for the extra precision in that case, because the final rendered image is immediately displayed.
(If you want a high quality image for film work from one of the new cards, there is always the floating point pbuffer support.)
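A minimal sketch of the two output paths Crazyace describes - float to 16-bit linear for film/compositing work versus float to a gamma-corrected integer for immediate display (my own illustration; the gamma value and quantisation details are assumptions):

```python
# Float rendering, two different quantisation targets.
import numpy as np

def to_linear_16bit(hdr):
    """Quantise a linear float image to 16 bits per channel (0..65535)."""
    return np.round(np.clip(hdr, 0.0, 1.0) * 65535).astype(np.uint16)

def to_display_8bit(hdr, gamma=2.2):
    """Gamma-correct and quantise to 8 bits per channel for direct display."""
    return np.round(np.clip(hdr, 0.0, 1.0) ** (1.0 / gamma) * 255).astype(np.uint8)

hdr = np.array([0.001, 0.18, 0.5, 1.0], dtype=np.float32)  # example linear values
print(to_linear_16bit(hdr))   # keeps fine gradation for later compositing/post work
print(to_display_8bit(hdr))   # discards precision the display can't show anyway
```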
 
Tagrineth said:
Look, basically if a console is rendering 1080i the graphics core's frame buffers will be set at 1920x1080 pixels.

This is not a requirement for 1080i analog output; the horizontal resolution can be 640x1080 or whatever it wants to be (say 960x1080). The output just won't look as good as 1920x1080 rendering, since there is a lot less horizontal detail, but the end result at the analog output is still 1080i as long as the horizontal timing is right.

The image will not be stable with only a half height buffer for interleaved modes unless the hardware is capable of rendering interleaved lines (I believe PS2 GS is capable of doing that, correct me if I am wrong).
 
Tagrineth said:
OR at least the front buffer will. Many PS2 games do use half-height back buffers, but that requires a very stable frame rate to look right
You got back/front buffer reversed. ;)
As for GC, you're stuck with a half height buffer (front and back) if you use antialiasing.

The image will not be stable with only a half height buffer for interleaved modes unless the hardware is capable of rendering interleaved lines
All modern rasterizers can do this of course (same principle as AA methods).
 
Fafalada said:
The image will not be stable with only a half height buffer for interleaved modes unless the hardware is capable of rendering interleaved lines
All modern rasterizers can do this of course (same principle as AA methods).

I thought they might have left that out, since interlaced-only computer displays are negligible nowadays and they use TV encoders whenever they want to have a TV out.
 
I thought they might have left that out, since interlaced-only computer displays are negligible nowadays and they use TV encoders whenever they want to have a TV out.
Actually maybe I expressed myself wrong. I was referring to subpixel offsets (which are needed to render the two interlaced fields - offset 0.5 pixel relative to each other).

If you meant actually rendering scanlines interleaved (e.g. by leaving empty lines in between), I don't think it's so common, no.
GS does offer a mode for this, but it's basically just a register mask for odd/even lines, which means your effective fillrate is cut in half if you use it, so I don't think it's all that useful.
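A toy sketch of the subpixel-offset field rendering Fafalada mentions (my own illustration; real hardware would apply the offset in the viewport/projection setup rather than like this):

```python
# Each half-height field is rendered with a half-pixel vertical shift so its
# sample positions land on the scanlines that field actually occupies.
FULL_HEIGHT  = 480            # full interlaced frame height
FIELD_HEIGHT = FULL_HEIGHT // 2

def field_sample_rows(field):   # field 0 = even lines, field 1 = odd lines
    """Vertical sample positions (in full-frame coordinates) for one field."""
    # Row r of the half-height buffer maps to full-frame line 2*r + field;
    # sampling at its centre gives a 0.5-pixel offset between the two fields.
    return [2 * r + field + 0.5 for r in range(FIELD_HEIGHT)]

print(field_sample_rows(0)[:4])   # [0.5, 2.5, 4.5, 6.5]
print(field_sample_rows(1)[:4])   # [1.5, 3.5, 5.5, 7.5]
# Without the per-field offset, both fields would sample identical positions
# and the image would bob or shimmer on an interlaced display.
```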
 
Crazyace-

The packages store the final images in 16 bit per channel format, but all rendering and lighting calculations are performed as floats.

I knew that was the case for the radiosity in all the packages I've used; for the rest of the rendering they do state 64bit color numerous times, although they don't specify the internal calculations (I went and double checked).

Dave-

Seeing as you are arguing that 'FP16 is enough', then I assume you accept that FP24 will be sufficient as well.

When people try to claim that the NV30 has some sort of unfair advantage because it is running the desired level of precision instead of a higher one, I think it is relevant to point out that Carmack has explicitly stated his desires, and I haven't ever seen them include 96bit color. Also, all of the other developers I've seen comments from have repeated JC's 64bit color request, but none of them have asked for 96bit. When developers do want 96bit color and the NV30 is forced to run FP32, are the same people complaining about its edge now going to turn around and say it is unfair again? I know you weren't the one who said it, but that was what the discussion was revolving around.

Thowlly-

Looking over the HDTV specs there aren't any 540p, 1080p or 240p standards. Your mention of them, outside of speaking of 1080p in a hypothetical sense that we hope they will add, is the first time I have seen mention of them.

Maskrider-

1080i has a 16:9 aspect ratio according to the HDTV standards. Talking about what else is possible outside of the specifications is something different than discussing the actual specs.

Tag-

When you are talking about running 4x AA (which should be a given for next-gen hardware), the need for a double-height frame buffer isn't nearly the same.
 
BenSkywalker said:
Thowlly-

Looking over the HDTV specs there aren't any 540p, 1080p or 240p standards. Your mention of them, outside of speaking of 1080p in a hypothetical sense that we hope they will add, is the first time I have seen mention of them.

Whether they are hypothetical or not does not affect the discussion at all. 1080i has more than 540 lines of resolution.

And concerning 240p, I guess you must be new to gaming, since that is the resolution that almost all games used to run at before this generation. It might not be an official standard, but it is what practically every console (and every computer made to be used with a TV) used.
 
but 240p is a progressive scan signal......... ps1 (or anything before it) didn't output a progressive scan signal............. my head hurts... :LOL: :?
 
london-boy said:
but 240p is a progressive scan signal......... ps1 (or anything before it) didn't output a progressive scan signal............. my head hurts... :LOL: :?

Yes it did. If you have ever played an interlaced PS1 game, then it would be obvious to you that normal PS1 games did not run interlaced.

A progressive signal is simply one that is not interlaced. PS1 games that ran at 320x224 were not interlaced. (The resolution should have been 320x240, but they left a little border unused)

Just in case I have to convince you that PS1 games did indeed not run interlaced:

Take any new game that runs at 640x480 or similar on an ordinary TV. Move your head really close to the TV screen and look at it. The picture flickers a lot, but looks continuous. Then take a PS1 game that runs at 320x224 or similar and look really close. The image doesn't flicker (or at least flickers much less than the 640x480 one), and on most TVs you will be able to see faint black lines between the lines of the graphics. Those are there because the extra interlaced lines that would normally fill them in are not there.

The new game first draws 240 lines on the screen, leaving lots of faint empty lines; on the next update it draws 240 lines in the spaces left empty by the previous update, while leaving the lines drawn in the previous update untouched. This is an interlaced picture.

The old game also draws 240 lines on the screen, leaving lots of faint empty lines, but on the next update it does not fill in those remaining lines; instead it draws over the same lines it drew the last update, one more time. This is a progressive picture. This is what most PS1 games used.

Sorry for the crappy explanation, I'm not very good at expressing myself in English (or any other language, some might claim :)) I tried to explain it like I would explain it to an idiot, but I failed (that does not mean I consider you an idiot, I don’t), instead it came out like it was explained by an idiot... :oops:
 
Thowllly said:
Sorry for the crappy explanation, I'm not very good at expressing myself in English (or any other language, some might claim :)) I tried to explain it like I would explain it to an idiot, but I failed (that does not mean I consider you an idiot, I don’t), instead it came out like it was explained by an idiot... :oops:




...errr... mmmmmmmkay... ;)

but the TV would still display it interlaced... oh i see...it displays it interlaced but the odd frame is...... no hold on... i still don't get it...

what u're saying is that the odd frame lines are black.... so that means that NO GAME could possibly have run at 60fps.... i'm sure i'm getting this whole thing wrong... :LOL:
 
I’m saying that it displays only odd or only even frames, but still 60 of them every second.

A TV can display odd or even frames; both have the same number of lines, but one of them is displayed half a line further down the screen... So, how does the TV know when to push a frame half a line down? It's the device that's outputting the signal that tells it, and it does so by ending each frame either with a complete line (a complete NTSC line lasts 0.0000635s), or with a line that is only half as long as normal (roughly 0.000032s, I'm not sure about the exact number). If the last line is complete the TV will display the next frame at the normal position, and if the last line is half length, then it will display the next frame pushed down half a line. (Or it's the other way around, I don't remember.)

When a console is displaying an interlaced image, it alternates between ending the frames with a complete line and a half line, and so the TV ends up displaying every other frame pushed down half a line. This gives an interlaced image.

When a console is displaying a non-interlaced (progressive) image, it always ends each frame the same way (always with a complete line or always with a half line), and so the TV always displays the image in the same position, never shifted half a line in any direction. This gives a progressive image.
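A toy model of that half-line mechanism (my own illustration; note Thowllly himself isn't sure which ending means which, so the "half-line means push down" choice below is an assumption):

```python
# The signal tells the TV whether to push the NEXT field down half a line
# by how the current field's last line ends.
def field_positions(endings):
    """Given each field's closing-line type ('full' or 'half'), return whether
    each field lands on the upper ('even') or lower ('odd') line set."""
    positions, shifted = [], False
    for end in endings:
        positions.append('odd' if shifted else 'even')
        shifted = (end == 'half')   # assumed: a half-length line shifts the next field
    return positions

print(field_positions(['half', 'full'] * 3))   # alternating -> even/odd/even/odd... = interlaced
print(field_positions(['full'] * 6))           # constant    -> even/even/even...    = "240p"
```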
 
Basically a 480i console rasterises 480 lines and sends 480 lines, but the TV only displays 240 lines - the rest are blank. A 60th of a second later the 480i console sends a new 480-line image to the TV, and the previously blank lines are filled in with the new image, with the old lines fading out - resulting in a flickery 60fps.

"240p" instead rasterises 240 or less lines and sends the whole lot - so that the TV displays all 240 and has blank lines in between each pair of lines. BUT. Next 240-line frame, instead of filling in the blank lines for a flickery interlaced 60fps, replaces the already-filled line - for a far less flickery 60fps.

Basically it's what causes scanlines in old console games. :) The scanlines are simply the lines which are never updated, in favour of the "progressive" trick.
 
Man, that is interesting stuff! I always suspected the missing field approach in some older games, but never did I imagine the progressive mode via half-line signal.

I take it the worst looking games were the ones programmed to render 240 lines, send them for display, and then send the same 240 lines to be displayed in the next field as well. That way you get 240 lines scaled to a 480i display for the ultimate in aliasing goodness?

So does this mean that the previous generation of consoles did not do true 480i, at all, or did they figure out how to make it happen in the better games toward the end?
 
amiga used this trick.

by default non-interlaced mode was used so you got resolutions like 320*240..
you could use interlace mode and double vertical resolution but the flickering was very bad, especially when using the workbench (amiga's gui).

every amiga game i knew used non-interlaced mode.
 
randycat99 said:
Man, that is interesting stuff! I always suspected the missing field approach in some older games, but never did I imagine the progressive mode via half-line signal.

I take it the worst looking games where the ones programmed to render 240 lines and send it for display and then send the same 240 lines to display in the next field, as well. That way you get 240 lines scaled to 480i display for the ultimate in aliasing goodness?

If you mean first sending a frame as an even field and then the next update sending the same frame as an odd field, then yes, that does indeed look like crap. I've done it on my Amiga, but I've never seen a game doing it (not on purpose, at least).

So does this mean that the previous generation of consoles did not do true 480i, at all, or did they figure out how to make it happen in the better games toward the end?

No, I think all previous generation consoles could do true 480i. They didn't need to figure out anything to enable 480i; they would simply have to switch to that screen mode. But that would require double the memory and fillrate, unless they used field rendering. Many of the consoles in the 80's could not do interlaced, though; they were hardwired for 240p. For those it would be impossible to output 480i. Most likely they ended all frames with a full line, and were unable to end a frame with a half line.
 
BenSkywalker said:
Maskrider-

1080i has a 16:9 aspect ratio according to the HDTV standards. Talking about what else is possible outside of the specifications is something different than discussing the actual specs.

The spec is for digital HDTV, not for anything on the analog side. And the rendering mode can use a square or anamorphic pixel aspect ratio; it does not necessarily have to be square pixels (1:1).
 