some technical questions about consoles (Erp, Far...)

PiNkY

-60fps vs 30fps on NTSC (50 vs 25 for PAL)
If you sync at 60Hz, do you actually render 60 frames at full resolution, or are these frames rendered interlaced? (PS2 output looks that way, while Xbox/GCube seem to render at full res.) I know that all of them *can* render at 480p, but I am referring to standard output.

-Is vsync forced on consoles?

-sound encoding on Xbox/PS2/Cube
How much latency does the Dolby Digital encoding/decoding process (DTS on PS2 / DPL2 on Cube) carry on average, and how do you counteract it (if at all)?

-physics
Do you normally sample the world state on a fixed time basis or per frame? (I always thought that for consoles it was per frame (that's why most PAL games seem to run slower), but people argued otherwise in this forum.)

-Xbox specific
Why has antialiasing + AF not yet been employed by any Xbox game?
(2x Quincunx + 8-sample aniso should make a big difference in games and should be feasible at 30Hz rendering on the NV2A.)
 
I will try to answer some of your questions from the various interviews I've read and various questions I asked and got answers for. Feel free to correct me, as I could be wrong on some points.

PiNkY said:
-60fps vs 30fps on NTSC (50 vs 25 for PAL)
If you sync at 60Hz, do you actually render 60 frames at full resolution, or are these frames rendered interlaced? (PS2 output looks that way, while Xbox/GCube seem to render at full res.) I know that all of them *can* render at 480p, but I am referring to standard output.
I believe Xbox always renders to a 640x480 front buffer and therefore can output games progressively. It also renders to a 640x480 back buffer. With GC and PS2 it's a mixed bag. Some games render a 640x480 front buffer - mostly 30FPS games, and 60FPS games that can't keep that framerate up all the time (EA games, for example). Some of the games that render to a full front buffer output progressively - Tekken 4 and Burnout 2, for example. Many games on PS2 render to a 640x240 front buffer and display only the necessary lines every 1/60 second. These games look fine IF their back buffer is 640x480, so they can filter it down to 640x240, and IF their framerate is locked to 60FPS. These games cannot have progressive output. The games that have both front and back buffer set to 640x240 exhibit the problem common with older PS2 games - very visible aliasing.
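To put rough numbers on those buffer configurations, here is a small sketch of the VRAM cost of a full 640x480 buffer versus a 640x240 field buffer. The 4-bytes-per-pixel figure is an assumption for illustration; real games often used 16- or 24-bit buffers.

```python
# Rough VRAM cost of the framebuffer configurations described above.
# Assumes 4 bytes per pixel (32-bit color) purely for illustration.

def buffer_kb(width, height, bytes_per_pixel=4):
    """Size of a single buffer in kilobytes."""
    return width * height * bytes_per_pixel / 1024

full_frame = buffer_kb(640, 480)   # progressive-capable front buffer
field      = buffer_kb(640, 240)   # field-rendered front buffer

print(f"640x480 buffer: {full_frame:.0f} KB")  # 1200 KB
print(f"640x240 buffer: {field:.0f} KB")       # 600 KB
```

On a machine with only 4MB of embedded video memory, halving each buffer like this frees a substantial fraction of VRAM for textures, which is one reason the 640x240 configuration was popular on PS2.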

-Is vsync forced on consoles?
Console manufacturers have a set of rules that developers should follow. One of the rules that Sony has is that VSync must be enabled (Faf's words). Some games, though, don't do that (MGS2 is an example).


-physics
Do you normally sample the world state on a fixed time basis or per frame? (I always thought that for consoles it was per frame (that's why most PAL games seem to run slower), but people argued otherwise in this forum.)
I'm pretty sure some games (Jak & Daxter for example, from their interview) calculate physics on a time basis. I remember the developers talking precisely about that, when they explained that it was easy for them to port the game to PAL without changing its speed. I don't know if most games do it that way nowadays.
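A minimal sketch of the time-based approach (all names and the 100Hz step are illustrative, not from any actual engine): the simulation advances in fixed steps regardless of how many display frames occur, so the same code runs at the same speed on a 50Hz PAL and a 60Hz NTSC machine.

```python
# Fixed-timestep simulation loop (illustrative sketch, not real engine code).
# Physics advances in constant DT steps; rendering happens once per display
# frame, so game speed is identical at 50Hz (PAL) and 60Hz (NTSC).

DT = 1.0 / 100.0  # fixed physics step, independent of refresh rate

def simulate(frame_times):
    """frame_times: durations of successive display frames, in seconds.
    Returns total simulated time advanced."""
    accumulator = 0.0
    sim_time = 0.0
    for frame_dt in frame_times:
        accumulator += frame_dt
        while accumulator >= DT:     # catch up in fixed-size steps
            sim_time += DT           # a real engine would step the world here
            accumulator -= DT
    return sim_time

# One second of 60Hz frames and one second of 50Hz frames advance the
# simulation by (nearly) the same amount of in-game time:
ntsc = simulate([1/60] * 60)
pal  = simulate([1/50] * 50)
```

The per-frame alternative is simply `world.step()` once per vblank, which is why games using it run 17% slower when the vblank comes at 50Hz instead of 60Hz.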
 
I like how one developer put it -- the video mode the Xbox calls aliased (640 x 480) is the same one the PS2 calls anti-aliased. :)

Some old video games were locked to the vertical retrace, but I think that developers long ago learned to separate their game simulation logic from the frame rate.

Especially for 3D games, where you sometimes take more than one frame to render your graphics.
 
I like how one developer put it -- the video mode the Xbox calls aliased (640 x 480) is the same one the PS2 calls anti-aliased.

That's more a matter of PC logic versus TV logic. On a normal TV, 640x480 actually *is* antialiased 2x vertically, because you display 640x240 every 1/60 sec, and that's how it's usually referred to. On a PC monitor, or a progressive scan TV, it's not.

There are very few games that actually have both horizontal and vertical AA, as far as I can see. PS2 has one that I know of (BG:DA), and Xbox and GC have a few.
 
duffer said:
I like how one developer put it -- the video mode the Xbox calls aliased (640 x 480) is the same one the PS2 calls anti-aliased. :)

Being antialiased has nothing to do with the size of the front buffer. Read Marconelly's explanation twice.

Some old video games were locked to the vertical retrace, but I think that developers long ago learned to separate their game simulation logic from the frame rate.

Especially for 3D games, where you sometimes take more than one frame to render your graphics.

You should visit us in Europe and taste our infamous 50Hz system.
 
640x480 will only be "antialiased" if it gets downsampled. If the frame is downsampled to 640x240 before output (rather than dropping even/odd lines), it's only vertical antialiasing, and I hesitate to even call it antialiasing. I guess it could be downsampled to 320x240 before output, but that wouldn't look too hot either.
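The difference between dropping lines and downsampling can be shown with a toy example (a one-pixel-wide "frame" where each line is just a brightness value). Dropping every other line discards single-line detail entirely, producing the flicker and shimmer; averaging adjacent line pairs filters it instead.

```python
# Reducing a 480-line frame to 240 lines, two ways (toy one-column "frame").

# Worst case for interlacing: detail that alternates every single line
frame = [255 if y % 2 == 0 else 0 for y in range(480)]

# Dropping every other line keeps only the even lines - the alternating
# detail vanishes completely (and would flicker between fields on a TV)
dropped = frame[::2]

# Averaging adjacent line pairs - the detail is filtered, not lost
averaged = [(frame[2 * y] + frame[2 * y + 1]) // 2 for y in range(240)]
```

`dropped` ends up all 255 (the odd lines simply disappear), while `averaged` ends up a uniform mid-grey, which is exactly the "2x vertical AA" effect discussed above.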


What bothers me about the Xbox is that I have seen many Xbox games that don't push the limits of bandwidth, run at 640x480, and yet don't turn on even 2X Quincunx antialiasing. It would be nice if games allowed people to choose between 60fps (aliased) and 30fps (antialiased) on the Xbox.
 
I've got another question about consoles in general:
-How do you do PAL conversions correctly? PAL uses a higher vertical resolution than NTSC, so does that mean you have to render additional scanlines? And if that's the case: could this be a problem on consoles with a very limited amount of video RAM (PS2, GameCube), because the framebuffer needs more RAM?
 
640x480 will only be "antialiased" if it gets downsampled.

That, of course, is a given. Just dropping every even or odd line doesn't help.

I hesitate to even call it antialiasing

Well, it's sort of antialiasing (and only vertical)

Console game developers also have to take into account that the majority of people use simple composite connectors for their console, which produce a rather blurred picture, so antialiasing it *before* it gets to the TV often just makes things even more blurred for Joe Regular.
 
CeiserSöze said:
I've got another question about consoles in general:
-How do you do PAL conversions correctly? PAL uses a higher vertical resolution than NTSC, so does that mean you have to render additional scanlines? And if that's the case: could this be a problem on consoles with a very limited amount of video RAM (PS2, GameCube), because the framebuffer needs more RAM?

There is something called 60Hz PAL, which gives exactly the same speed/ratio as NTSC. Add RGB output, which is standard in Europe, and you cannot make a better conversion. It seems that the bigger the company is, the slower they learn it, but things tend to change.
 
wazoo said:
There is something called 60Hz PAL, which gives exactly the same speed/ratio as NTSC. Add RGB output, which is standard in Europe, and you cannot make a better conversion. It seems that the bigger the company is, the slower they learn it, but things tend to change.

Isn't 60Hz PAL just the standard PAL-resolution with 60Hz instead of 50Hz?
 
Most importantly, doing 2x vertical interpolation does a *very* good job of eliminating the nasty interlace flicker. I actually think more games should use non-interlaced output. I like the steady visible rasterlines, and hand-pixelled graphics look absolutely best non-interlaced. Marvel vs Capcom 2, for instance, would have looked absolutely marvellous in 320*240 (plus overscan), with the 3D backdrops rendered at 640*480 and downsampled to a crisp 4xAA non-interlaced output - with perfectly stable sprites, as they were originally intended to look, on top of that. Getting a bit OT now, I guess.. :)

I for one never cared for VGA output and other modernities, I like the gaming kind of picture the way it is. Using high definition displays makes me think "PC" rather than "arcade".
 
CeiserSöze said:
wazoo said:
There is something called 60Hz PAL, which gives exactly the same speed/ratio as NTSC. Add RGB output, which is standard in Europe, and you cannot make a better conversion. It seems that the bigger the company is, the slower they learn it, but things tend to change.

Isn't 60Hz PAL just the standard PAL-resolution with 60Hz instead of 50Hz?

50Hz PAL and 60Hz NTSC have roughly the same bandwidth requirement.

A 60Hz PAL with the same resolution as 50Hz PAL would need more bandwidth. I do not think this is the way they did it.

I think 60Hz PAL is the same as 60Hz NTSC, save for a better color scheme.
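The "roughly the same bandwidth" claim checks out if you look at the total scanline rate of the two standards (using the nominal line counts; NTSC actually runs at ~29.97 frames/sec, which makes the numbers even closer):

```python
# Why 50Hz PAL and 60Hz NTSC need about the same bandwidth: the total
# number of scanlines drawn per second is nearly identical.

pal_lines_per_sec  = 625 * 25   # 625-line frame, 25 interlaced frames/sec
ntsc_lines_per_sec = 525 * 30   # 525-line frame, 30 interlaced frames/sec

print(pal_lines_per_sec)   # 15625
print(ntsc_lines_per_sec)  # 15750
```

The two differ by under 1%, so a TV standard with PAL's 625 lines *and* a 60Hz refresh would indeed need ~20% more line bandwidth, which is why PAL60 uses the NTSC line count instead.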
 
wazoo said:
50Hz PAL and 60Hz NTSC have roughly the same bandwidth requirement.
A 60Hz PAL with the same resolution as 50Hz PAL would need more bandwidth. I do not think this is the way they did it.
I think 60Hz PAL is the same as 60Hz NTSC, save for a better color scheme.

Thanks, that clears up a lot :)
But if you want to do a 50Hz PAL conversion (OK, there's no need to do this now thanks to PAL60, but afaik not every TV supported PAL60 a few years ago) you have to render a bigger frame, which means additional VRAM requirements (but no additional fillrate/bandwidth requirements, because it's only 50Hz, not 60Hz), correct?
 
marconelly!:

> ... so antialiasing it *before* it gets to TV often just makes things even
> more blurred for Joe Regular.

How would you AA something once it's gone through the DAC? Anyway, a proper deflicker filter will benefit even those with composite connections. Chroma crawl and jaggies are a match made in hell.


CeiserSöze:

> Isn't 60Hz PAL just the standard PAL-resolution with 60Hz instead of 50Hz?

Nope. NTSC resolution and refresh rate, but the PAL color system.
 
How would you AA something once it's gone through the DAC? Anyway, a proper deflicker filter will benefit even those with composite connections. Chroma crawl and jaggies are a match made in hell.

Perhaps I worded it wrong. A TV with a composite connection already blurs the picture quite a lot. Having it antialiased in hardware + blurred by the TV doesn't always look good, as it ends up even more blurred. Well, at least that's my experience from experimenting with different games and composite / component connectors.

Deflicker is definitely important, though, I was thinking more about FSAA.
 
CeiserSöze said:
But if you want to do a 50Hz PAL conversion (OK, there's no need to do this now thanks to PAL60, but afaik not every TV supported PAL60 a few years ago)

TVs that don't support 60Hz display might exist, but I myself have never encountered one. Newer TVs seem to just support it out of the box, and older TVs used to have a Vertical Hold (or something like that) knob that you could turn to make them display 60Hz images without rolling the picture.

you have to render a bigger frame, which means additional VRAM requirements (but no additional fillrate/bandwidth requirements, because it's only 50Hz, not 60Hz), correct?

Yes, that is true, but I don't think it's a problem. The GC has enough VRAM for a framebuffer up to 640x546, or something in that area. It shouldn't be a problem on the Xbox; just compressing a few more textures should free enough memory. If you run out of VRAM on the PS2 you could lower the horizontal resolution from 640 to 512, unless it's already at 512. And if all else fails, you can just use black bars on the top and bottom of the image :)
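For a sense of scale, here is the extra memory a taller PAL buffer costs, and what the 512-wide fallback buys back. The 640x512 PAL height and 4 bytes/pixel are illustrative assumptions; actual PAL render heights and pixel formats varied by game.

```python
# Extra VRAM for a taller PAL framebuffer, and the effect of dropping the
# horizontal resolution to 512 (4 bytes/pixel assumed for illustration).

def kb(w, h, bpp=4):
    return w * h * bpp / 1024

ntsc       = kb(640, 480)   # 1200 KB
pal        = kb(640, 512)   # 1280 KB: ~80 KB extra per buffer
pal_narrow = kb(512, 512)   # 1024 KB: going 512 wide more than pays it back
```

So the PAL overhead per buffer is small (tens of KB), and on a memory-starved machine the horizontal-resolution trick recovers far more than the taller frame costs.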
 
Teasy said:
In 24bit yeah, and in 16bit it can even handle just over 800x600.

Are you sure about that? What I've heard is that the only 16bit mode the GC supports is when 3x AA is enabled, in which case the max resolution is about 640x273.
 
Are you sure about that? What I've heard is that the only 16bit mode the GC supports is when 3x AA is enabled, in which case the max resolution is about 640x273.

I don't know which modes are actually supported. I was just saying that in 16bit the GameCube has enough frame and Z-buffer RAM to handle over 800x600. Although I'd be surprised if the resolution/bit-depth options are as limited as you've heard; after all, why would they limit the modes to only allowing 16bit with FSAA? Why not just let the devs choose whatever res and bit depth they want, or at least let them select res and bit depth to a reasonable level?
 
What bothers me about the Xbox is that I have seen many Xbox games that don't push the limits of bandwidth, run at 640x480, and yet don't turn on even 2X Quincunx antialiasing. It would be nice if games allowed people to choose between 60fps (aliased) and 30fps (antialiased) on the Xbox.

The problem with antialiasing on Xbox is that, although few games are actually bandwidth limited, there are some significant costs associated with enabling it.
The first is the copy forward from the larger back buffer. This can be a significant portion of your frame time (especially at 60Hz, i.e. 16ms/frame): on the order of 1.5ms for 2-to-1 ordered grid and slightly over 2ms for a quincunx filter.
The second is the fact that enabling quincunx pretty much nullifies any Z compression you were getting. This cost is hard to quantify, and can range from negligible to significant depending on triangle size, overdraw and whether stencil was in use.

None of these are particularly steep technical hurdles, but you need to plan for the shorter (10-20% less) frame time and the additional memory overhead of the larger back/Z buffers when building your assets; it isn't something you can just turn on at the end.
This means the developer is faced with a decision between reduced model quality and a higher-quality TV image, and quite honestly it's rarely an easy call; a lot of it depends on the personal preferences of the person making the decision in the preproduction stage.
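The frame-time figures quoted above can be turned into budget percentages, which makes the 60fps-versus-30fps trade-off concrete (the 1.5ms/2ms copy costs are the ones stated in the post; everything else here is just arithmetic):

```python
# What the AA copy-forward costs as a fraction of the frame budget,
# using the ~1.5ms (ordered grid) and ~2ms (quincunx) figures quoted above.

def copy_cost_pct(copy_ms, fps):
    """Percentage of one frame's time budget consumed by the copy."""
    frame_ms = 1000.0 / fps
    return 100.0 * copy_ms / frame_ms

# At 60Hz (~16.7ms/frame) a 2ms quincunx copy alone eats 12% of the frame:
q60 = copy_cost_pct(2.0, 60)
# At 30Hz (~33.3ms/frame) the same copy is only 6% - one reason AA is much
# easier to justify in a 30Hz title:
q30 = copy_cost_pct(2.0, 30)
```

Add the Z-compression loss on top and the quoted 10-20% shorter effective frame time at 60Hz looks about right, which explains why so few 60fps Xbox games shipped with AA enabled.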
 