Did most Dreamcast games use supersampling?

Guden Oden said:
darkblu said:
what i've been saying is that for this technique to work you need a full-size (as in progressive) framebuffer in the first place.

Would you mind terribly much not using the word "progressive" in this context? You see, it has no relevance here. The height of the back buffer has no bearing on whether the game is displaying in a progressive or interlaced fashion. Thank you. :)

i mind. you're welcome.
 
Guden Oden said:
darkblu said:
dunno about the xbox but not all gc games use progressive. of course those that output in progressive could just as well be supersampled when interlaced.

Wtf are you talking about? :) What Simon is describing has nothing to do with progressive scan, so just get those ideas out of your head, alrighty?

All current consoles can do what he describes, and most of them ALWAYS do it. There's nothing magic about DC's video out, it's just good old-fashioned nostalgia and rose-colored graphics that makes people believe it has "much sharper video" than even modern PC hardware. That's nothing but silly-talk! :)

I can compare right now and the DC still looks sharper.
 
darkblu:

> 1) the output is very relevant. if you don't intend to produce
> progressive then you do not need a full-height framebuffer.

It is not required. However, most games still render full frames in order to provide an artifact-free image.

> are you saying that such titles "go against how those systems are
> designed"? if so, how?

Look at the size of the eFB on the GC. Look at the huge amount of memory on Xbox. Unlike the PS2, where you have to manage a meagre 4 MB of video RAM, there's really no excuse not to render full frames on GC and Xbox. You even have hardware support for flicker filters. And even on the PS2, where things aren't as streamlined, the vast majority of all titles have full-height backbuffers.
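
A minimal sketch of the mechanism being described here: a full-height backbuffer from which each 240-line field is taken, with an optional flicker filter that blends neighbouring lines on the way out. The buffer layout and the 1/4, 1/2, 1/4 weights are illustrative assumptions only, not any particular console's actual hardware path.

[code]
#include <stdint.h>

#define W 640
#define H 480   /* full-height backbuffer */

/* Produce one 240-line field from a full-height, single-channel buffer.
 * odd_field selects which set of source lines the field starts on.
 * flicker_filter = 0: pick every other line (plain interlacing).
 * flicker_filter = 1: blend each picked line with its neighbours
 *                     (1/4, 1/2, 1/4), trading vertical sharpness
 *                     for less interline flicker on a TV.            */
void make_field(const uint8_t *fb, uint8_t *field,
                int odd_field, int flicker_filter)
{
    for (int y = odd_field; y < H; y += 2) {
        for (int x = 0; x < W; x++) {
            int c = fb[y * W + x];
            if (flicker_filter) {
                int above = fb[(y > 0     ? y - 1 : y) * W + x];
                int below = fb[(y < H - 1 ? y + 1 : y) * W + x];
                c = (above + 2 * c + below) / 4;
            }
            field[(y / 2) * W + x] = (uint8_t)c;
        }
    }
}
[/code]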

> and what would it be downsampled from if the game originally produced
> field-height output?

Try reading a little more carefully next time. The backbuffers (i.e. the frame that's being rendered) in the overwhelming majority of all console software are "full height".
 
cybamerc said:
> 1) the output is very relevant. if you don't intend to produce
> progressive then you do not need a full-height framebuffer.

It is not required. However, most games still render full frames in order to provide an artifact-free image.

which "most games"? those which have a progressive output option to start with?

> are you saying that such titles "go against how those systems are
> designed"? if so, how?

Look at the size of the eFB on the GC. Look at the huge amount of memory on Xbox. Unlike the PS2, where you have to manage a meagre 4 MB of video RAM, there's really no excuse not to render full frames on GC and Xbox. You even have hardware support for flicker filters. And even on the PS2, where things aren't as streamlined, the vast majority of all titles have full-height backbuffers.

the idea that rendering actually takes time is fully foreign to you, isn't it? it never occurred to you that interlaced output may be used to save on rendering fillrate? welcome to the realm of real-time computer graphics.
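
to put rough numbers on that trade-off (a back-of-the-envelope sketch only, ignoring overdraw, blending and texture cost):

[code]
#include <stdio.h>

int main(void)
{
    /* pixels written per second for one pass over the buffer at 60 Hz;
       field-height rendering halves the raw fill requirement */
    long full  = 640L * 480 * 60;   /* full-height buffer  -> 18,432,000 */
    long field = 640L * 240 * 60;   /* field-height buffer ->  9,216,000 */
    printf("full-height : %ld pixels/s\n", full);
    printf("field-height: %ld pixels/s\n", field);
    return 0;
}
[/code]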


> and what would it be downsampled from if the game originally produced
> field-height output?

Try reading a little more carefully next time. The backbuffers (i.e. the frame that's being rendered) in the overwhelming majority of all console software are "full height".

jolly good. you still have not answered what happens to the "unfortunate minority" of console software that does not produce full-height framebuffers.
 
darkblu:

> which "most games"?

Go look at a shelf in a store that sells video games.

> it never occurred to you that interlaced output may be used to save on
> rendering fillrate?

No. Because any decent developer would rather cut back on fillrate-demanding effects than sacrifice IQ.
 
cybamerc said:
darkblu:

> which "most games"?

Go look at a shelf in a store that sells video games.

funny, why didn't you quote my whole line?

> it never occurred to you that interlaced output may be used to save on
> rendering fillrate?

No. Because any decent developer would rather cut back on fillrate-demanding effects than sacrifice IQ.

are capcom "decent" enough for you? because i have a capcom gc title (which has apparently passed ninty's IQ sieve) sitting right next to my pc monitor which does not support progressive. go figure.

anyways, cybamerc, my taking-forever re-compile has ended so i can drop this fruitful argument for good. have a good night.
 
darkblu:

> funny, why didn't you quote my whole line?

No point in it. You aren't capable of differentiating between a render mode and an output mode.

> are capcom "decent" enough for you? because i have a capcom gc title
> (which has apparently passed ninty's IQ sieve), sitting right next to my
> pc monitor, that does not support progressive. go figure.

A full height backbuffer doesn't equal progressive scan support.
 
which "most games"? those which have a progressive output option to start with?

Most PS2 games do have full-height backbuffers or the Blaze VGA adapter wouldn't work. It works in about 95% of all PS2 games, so roughly that many must have full-height backbuffers. They don't keep it 100% of the time though; some drop to half height during movies or menus, but they generally keep it during gameplay. Whether the game has a full-height front buffer is irrelevant, though. Not every Dreamcast game had a full-height front buffer, but every Dreamcast game had a full-height backbuffer. You could force the non-VGA games into displaying VGA with some kind of trick... I believe the trick was to plug in the VGA cable as the game was loading, and then even games without a full-height front buffer would display with one. The only software I ever encountered on the Dreamcast without a full-height backbuffer was a boot loader, and I'm not sure why it didn't go full height.

There are even one or two Xbox games that don't support progressive scan, and I'd imagine it's not for fillrate or memory reasons, as they're crappy-looking, quickly done PS2 ports and there's no reason not to use a full-height backbuffer there. They probably just forgot the one line of code it would take to enable pscan.

Supposedly pscan on GameCube barely takes any coding at all, but it adds a few weeks of testing onto the dev time to make sure the game looks right under pscan, which is why many devs just skip it. I don't think GameCube games use a half-height backbuffer, but some do seem to rely on an interlaced display for certain effects to show up properly. Then again, I have games that do support pscan where the effects don't look the same in interlaced and pscan, yet they still support it. Maybe the extra few weeks of testing has to be done by Nintendo's people, and thus requires an extra fee that many developers refuse to pay.
 
Fox5 said:
Guden Oden said:
All current consoles can do what he describes, and most of them ALWAYS do it. There's nothing magic about DC's video out, it's just good old-fashioned nostalgia and rose-colored graphics that makes people believe it has "much sharper video" than even modern PC hardware. That's nothing but silly-talk! :)

I can compare right now and the DC still looks sharper.
It could be how the two systems/games set up the interlace filter kernel weights. <shrug>....

...or perhaps the analog hardware is better <double shrug>
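
For anyone wondering what "kernel weights" means here: when a full-height frame is filtered down to a field, each output line is a weighted sum of nearby source lines, and the choice of weights is exactly the sharpness-versus-flicker trade-off. A hypothetical sketch (the particular weight sets below are just illustrations, not what the DC or GC actually program):

[code]
/* Filter one output pixel of a field line centred on source row y.
 * fb is a full-height, single-channel buffer of width*height pixels.
 * Sharper kernels keep more vertical detail but flicker more on an
 * interlaced display; softer kernels look calmer but blurrier.       */
static int filter_pixel(const unsigned char *fb, int width, int height,
                        int x, int y, const float *weights, int taps)
{
    float sum = 0.0f;
    int half = taps / 2;
    for (int i = 0; i < taps; i++) {
        int sy = y + i - half;                 /* clamp at the edges */
        if (sy < 0) sy = 0;
        if (sy >= height) sy = height - 1;
        sum += weights[i] * fb[sy * width + x];
    }
    return (int)(sum + 0.5f);
}

/* Example kernels (illustrative only):
 *   sharp  - no filtering: maximum detail, maximum interlace flicker
 *   gentle - mild blend of neighbouring lines
 *   heavy  - strong blend: very stable image, visibly softer          */
static const float sharp[1]  = { 1.00f };
static const float gentle[3] = { 0.25f, 0.50f, 0.25f };
static const float heavy[3]  = { 0.33f, 0.34f, 0.33f };
[/code]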
 
Well, I always thought the hardware setup in the DC allowed for clearer pictures, whereas the PS2, for example, has a very "simple and rudimentary" CRT converter: good enough to display the resolutions needed, but not very easy on the eye. Or that's what I heard.

But no game on DC ever used AA, and I'd take the "dirtier" look of other consoles, including the Xbox (GC looks very clean to me, kinda like DC), if it means I'm getting all the other things that DC couldn't even dream of rendering.

Guden, why do you always have to be so aggressive? Take a chill pill.
 
Squeak said:
Simon F said:
...or perhaps the analog hardware is better <double shrug>
Perhaps the DC outputs rounder pixels (less sharply defined), so you get a free blur effect?

Well, that sounds like the opposite of what I'm seeing. The DC image is much more jagged than the GameCube's or my PC's 640x480 image, but it is also much sharper.
 
I fail to see how two 640x480 images without AA differ from each other. I think what makes you think one is sharper than another is the choice of colours (DC games, and Sega games in general, are all very colourful and very high contrast), the brightness and contrast levels of the output... things like that.
It's nothing to do with one pixel being "sharper" than another one.
When two images are not filtered by AA, the pixels are the same: little square things changing colours very fast, 640 of them horizontally and 480 of them vertically, most of the time. They're always the same square bits unless, as I said, there's a filter applied.

There might be other factors to count in, but that's my take on this issue, which by the way is OLD.
 
london-boy said:
I fail to see how two 640x480 images without AA differ from each other. I think what makes you think one is sharper than another is the choice of colours (DC games, and Sega games in general, are all very colourful and very high contrast), the brightness and contrast levels of the output... things like that.
It's nothing to do with one pixel being "sharper" than another one.
When two images are not filtered by AA, the pixels are the same: little square things changing colours very fast, 640 of them horizontally and 480 of them vertically, most of the time. They're always the same square bits unless, as I said, there's a filter applied.

There might be other factors to count in, but that's my take on this issue, which by the way is OLD.

Well, I'm mainly comparing Soul Calibur 1 with Soul Calibur 2 (though it happens for PC to DC as well: House of the Dead 2 on DC versus House of the Dead 2 on PC, or the Phantasy Star games... and those are the exact same games, yet they look different; the Dreamcast is slightly sharper). I guess a port of a PSone game would have a simpler color scheme than a game made from the ground up for next-gen systems. The difference does depend on the size of the screen used, though: on larger screens Soul Calibur 1 looks much sharper, but on much smaller screens it looks only slightly sharper.
I believe the GameCube version of Soul Calibur 2 is supposed to lack mip-mapping or something, though.

BTW, why should all 640x480 images look the same? The N64, PSX, and PS2 definitely don't have the same output quality as other systems. Even NVIDIA and ATI cards on PC don't have the same output quality.
 
Fox5 said:
Well, I'm mainly comparing Soul Calibur 1 with Soul Calibur 2 (though it happens for PC to DC as well: House of the Dead 2 on DC versus House of the Dead 2 on PC, or the Phantasy Star games... and those are the exact same games, yet they look different; the Dreamcast is slightly sharper). I guess a port of a PSone game would have a simpler color scheme than a game made from the ground up for next-gen systems. The difference does depend on the size of the screen used, though: on larger screens Soul Calibur 1 looks much sharper, but on much smaller screens it looks only slightly sharper.
I believe the GameCube version of Soul Calibur 2 is supposed to lack mip-mapping or something, though.

BTW, why should all 640x480 images look the same? The N64, PSX, and PS2 definitely don't have the same output quality as other systems. Even NVIDIA and ATI cards on PC don't have the same output quality.

Technically they are the same. Like I said, one architecture might have ways of optimising the brightness and contrast levels to make its output look better than others'.
Also, the N64 and PSX never output anything at 640x480, with a few exceptions. And the N64 had a blur filter all over it, so it doesn't count.
 
london-boy said:
Fox5 said:
Well, I'm mainly comparing Soul Calibur 1 with Soul Calibur 2 (though it happens for PC to DC as well: House of the Dead 2 on DC versus House of the Dead 2 on PC, or the Phantasy Star games... and those are the exact same games, yet they look different; the Dreamcast is slightly sharper). I guess a port of a PSone game would have a simpler color scheme than a game made from the ground up for next-gen systems. The difference does depend on the size of the screen used, though: on larger screens Soul Calibur 1 looks much sharper, but on much smaller screens it looks only slightly sharper.
I believe the GameCube version of Soul Calibur 2 is supposed to lack mip-mapping or something, though.

BTW, why should all 640x480 images look the same? The N64, PSX, and PS2 definitely don't have the same output quality as other systems. Even NVIDIA and ATI cards on PC don't have the same output quality.

Technically they are the same. Like I said, one architecture might have ways of optimising the brightness and contrast levels to make its output look better than others'.
Also, the N64 and PSX never output anything at 640x480, with a few exceptions. And the N64 had a blur filter all over it, so it doesn't count.

Ideally they should be the same, but only if every console is doing exactly the same things. And that would only hold true if everyone were outputting a digital signal; all these images go through a conversion to analog. Actually, if the conversion to analog is where the differences lie, it could be because the Dreamcast has an external VGA box and is thus isolated from any interference to the image, whereas the DAC chip in my PC would be completely exposed. Or maybe the Dreamcast isn't outputting true VGA, but just a line-doubled image. Why did the Dreamcast need a VGA box? Shouldn't it just be able to take an RGB image straight from the DC, modify it a little, and then output it straight to the monitor like the GameCube VGA cable does? I think all the GameCube's VGA cable does to the image before it's output is add the sync frequencies.
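
On the line-doubling point, the distinction is simple to show. A line doubler just repeats each of a field's 240 lines to fill 480 output lines, so no new detail is created; a true progressive source has 480 distinct rendered lines. A rough sketch with assumed single-channel 640-wide buffers, purely for illustration:

[code]
#include <string.h>

#define W 640

/* Line doubling: stretch a 240-line field to 480 visible lines by
 * repeating each line.  No new detail appears; stairsteps get taller. */
void line_double(const unsigned char *field240, unsigned char *out480)
{
    for (int y = 0; y < 240; y++) {
        memcpy(out480 + (2 * y)     * W, field240 + y * W, W);
        memcpy(out480 + (2 * y + 1) * W, field240 + y * W, W);
    }
}

/* True progressive scan-out: every one of the 480 lines was actually
 * rendered, so each carries unique detail.                            */
void progressive_out(const unsigned char *frame480, unsigned char *out480)
{
    memcpy(out480, frame480, W * 480);
}
[/code]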
 
Fox5 said:
london-boy said:
Fox5 said:
Well, I'm mainly comparing Soul Calibur 1 with Soul Calibur 2 (though it happens for PC to DC as well: House of the Dead 2 on DC versus House of the Dead 2 on PC, or the Phantasy Star games... and those are the exact same games, yet they look different; the Dreamcast is slightly sharper). I guess a port of a PSone game would have a simpler color scheme than a game made from the ground up for next-gen systems. The difference does depend on the size of the screen used, though: on larger screens Soul Calibur 1 looks much sharper, but on much smaller screens it looks only slightly sharper.
I believe the GameCube version of Soul Calibur 2 is supposed to lack mip-mapping or something, though.

BTW, why should all 640x480 images look the same? The N64, PSX, and PS2 definitely don't have the same output quality as other systems. Even NVIDIA and ATI cards on PC don't have the same output quality.

Technically they are the same. Like I said, one architecture might have ways of optimising the brightness and contrast levels to make its output look better than others'.
Also, the N64 and PSX never output anything at 640x480, with a few exceptions. And the N64 had a blur filter all over it, so it doesn't count.

Ideally they should be the same, but only if every console is doing exactly the same things. And that would only hold true if everyone were outputting a digital signal; all these images go through a conversion to analog. Actually, if the conversion to analog is where the differences lie, it could be because the Dreamcast has an external VGA box and is thus isolated from any interference to the image, whereas the DAC chip in my PC would be completely exposed. Or maybe the Dreamcast isn't outputting true VGA, but just a line-doubled image. Why did the Dreamcast need a VGA box? Shouldn't it just be able to take an RGB image straight from the DC, modify it a little, and then output it straight to the monitor like the GameCube VGA cable does? I think all the GameCube's VGA cable does to the image before it's output is add the sync frequencies.


Obviously different hardware handles output differently. I think all the DC VGA box did was convert the already VGA-friendly output into a VGA-compatible signal. Not rocket science, I guess. And it's still analogue.
The PS2 outputs pro-scan directly from the console (the Blaze VGA adapter is merely a normal VGA cable - like the one given out with the Linux Kit - but modified to make it work on any monitor out there, not only the ones that support sync-on-green).
Not sure what you mean by sharper, but I don't think the DC output is any "sharper" than PC cards from the original Radeon to today's ones.
As I said, colour, brightness and contrast levels can trick the eye and are usually very, very cheap ways to improve IQ dramatically without affecting performance.
 
Fox5 said:
> Or maybe the Dreamcast isn't outputting true VGA, but just a line-doubled image.

No, with the exception of one or two very early games that only rendered a field, the system rendered a true 640x480 per frame/field that was then down-filtered to field resolution.

> Why did the Dreamcast need a VGA box? Shouldn't it just be able to take an RGB image straight from the DC, modify it a little, and then output it straight to the monitor like the GameCube VGA cable does?

I think you have probably answered your own question. If you do a bit of googling you can see what's in the VGA converter... i.e. not a lot.
 