PS1 / PS2 resolution and framerate

So there were blank lines in such cases? What was the point? I suppose to reduce flicker, at the cost of looking stripy.
 
About the PAL and NTSC difference in games.

If I remember correctly, PAL output allowed higher "resolution" but couldn't refresh as fast as NTSC at full screen. So it was either black borders and identical framerates, or slower but full screen, but never black borders and a slower framerate. Correct?

Also what was the theoretical resolution of CRTs? I don't remember ever seeing any reference to interlaced resolution in any specs. How did games' interlaced modes work on each CRT if each CRT had a different resolution?
 
PAL was fixed at 50fps, higher resolution. Letterboxed output was the same resolution as NTSC but it still ran slower. You could also have a wrong aspect with 448 lines being stretched to 480 or a 4:3 aspect being letterboxed into a slightly shorter vertical.

Also what was the theoretical resolution of CRTs? I don't remember ever seeing any reference to interlaced resolution in any specs. How did games' interlaced modes work on each CRT if each CRT had a different resolution?
CRTs were analogue, so the resolution was determined by how fast you could modulate and scan the electron beam. On a monochrome display such as an oscilloscope (the origin of the CRT display), you had effectively 'infinite' resolution in terms of illuminatable points and could draw perfectly smooth lines. The Sony FW900 accepted 2560x1600 @ 75Hz at its input, but it couldn't resolve that perfectly.

I expect modern tech would enable very high resolutions.
 
Halves the frame buffer size.
Actually, halves the memory required is probably a better description.
But you can get that with interlaced 480i. A 640x480 60 Hz display could show interleaved 640x240 fields. You can either update those 60 fps, or 30 fps and just double up the lines. Heck, the very first home computers and consoles had well below full TV display resolution but output a full screen image with chunky pixels, so innate upscaling was clearly possible and happening.
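
A rough back-of-the-envelope on that memory saving (the 16bpp colour depth here is just an illustrative assumption, not any particular game's setting):

```python
# Rough frame-buffer memory math (a sketch; the bit depth is an assumption).

def framebuffer_bytes(width, height, bits_per_pixel):
    """Bytes needed for one frame buffer at the given resolution and depth."""
    return width * height * bits_per_pixel // 8

full   = framebuffer_bytes(640, 480, 16)   # full-height buffer
halved = framebuffer_bytes(640, 240, 16)   # one 240-line field's worth

print(f"640x480 @16bpp: {full // 1024} KB")    # 600 KB
print(f"640x240 @16bpp: {halved // 1024} KB")  # 300 KB -> half the memory
```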

How common were 240p displays to be targeted and used this way?
 
But you can get that with interlaced 480i. A 640x480 60 Hz display could show interleaved 640x240 fields. You can either update those 60 fps, or 30 fps and just double up the lines. Heck, the very first home computers and consoles had well below full TV display resolution but output a full screen image with chunky pixels, so innate upscaling was clearly possible and happening.

How common were 240p displays to be targeted and used this way?

I don't know if I got you right here.
You mean a monitor displaying only 240 pixels of vertical resolution?

That is not what I meant.
At the time of Saturn and PS1, you usually used your TV set that did 480i max (or 576i PAL), but the image was displayed at what would later be called 240p/224p almost all the time.

It looked like this if you went close to it:

1740323261722.png

You definitely saw the black lines.

There were a few games in high-res mode, like Virtua Fighter 2, some N64 games with the RAM Pak, the examples I mentioned earlier, and a few more that displayed double the vertical resolution in interlaced mode, which looked great and pretty much like this:

1740323360256.png

(But of course you got that slight interlace flickering, as you had when watching a TV broadcast, which I cannot recreate on a screenshot here.)


I highly recommend trying Tobal 2 on the PS1 on a CRT TV from back then (not a CRT monitor)!

The image quality you get is phenomenal. :)
 
But you can get that with interlaced 480i. A 640x480 60 Hz display could show interleaved 640x240 fields. You can either update those 60 fps, or 30 fps and just double up the lines. Heck, the very first home computers and consoles had well below full TV display resolution but output a full screen image with chunky pixels, so innate upscaling was clearly possible and happening.

How common were 240p displays to be targeted and used this way?
Memory at the time, both working and storage, was a real cost bottleneck. So think of it like a form of compression. You draw the image as if it's 480 and then throw half the lines away. You then let the natural blurriness of a composite signal trick your mind into filling in the gaps like it does with impressionist paintings. And there was no scaling, it's all done with the timings on the analogue video signal.
If you went with a 480i image, you'd still need to store your graphics as if you were using a 480p display, otherwise everything would look half height.

The recent popularity of pixel art is a nostalgic look back at a time that never existed. You never could see sharp pixels.
Things were a little sharper in arcades where the connections were generally RGB, but the tubes used were the lowest quality possible and the driving electronics weren't the best. We all crave Sony PVM/BVM CRT monitors for that authentic arcade look, but arcades never looked that good.
 
Memory at the time, both working and storage, was a real cost bottleneck. So think of it like a form of compression. You draw the image as if it's 480 and then throw half the lines away. You then let the natural blurriness of a composite signal trick your mind into filling in the gaps like it does with impressionist paintings. And there was no scaling, it's all done with the timings on the analogue video signal.
If you went with a 480i image, you'd still need to store your graphics as if you were using a 480p display, otherwise everything would look half height.

Things were a little sharper in arcades where the connections were generally RGB, but the tubes used were the lowest quality possible and the driving electronics weren't the best. We all crave Sony PVM/BVM CRT monitors for that authentic arcade look, but arcades never looked that good.

Well, I remember the Sega arcade monitors back then had a reputation for excellent quality (compared to other monitors from that time, of course); I also read that in the magazines of the time.

But I can't tell from my own experience, as arcades were not a thing in my country back then. :(

Sega's monitors in those days (Model 1, 2 and 3) were only 496x384, though. But they were progressive scan.

It was not until Naomi they got to 480p.
 
I don't know if I got you right here.
You mean a monitor displaying only 240 pixels of vertical resolution?
My point of reference is a home TV with an RF input and 1970s/early 80s computer hardware. That was the most basic tech. Resolutions back then were of the order of 300x200 on displays with 576 visible lines. They did not have visible scanlines but chunky pixels. This continued to 320x256 and 640x512 Amiga graphics - low res mode did not draw only half the screen.
That is not what I meant.
At the time of Saturn and PS1, you usually used your TV set that did 480i max (or 576i PAL), but the image was displayed at what would later be called 240p/224p almost all the time.
I'm definitely lost. If a game wanted high res graphics, it'd surely interleave fields, and if it went with low res graphics, it'd do exactly the same as the 1980s machines and just double lines. 240p progressive makes no sense to me.
It looked like this if you went close to it:

You definitely saw the black lines.
The only time I've seen that was at a retro arcade in Norwich a couple of years ago, on a crusty game of Golden Axe. The missing lines were very obvious, so I'd surely have been well aware of them on other displays had they been present. That said, maybe it's an arcade thing, as I didn't frequent those?
I highly recommend trying Tobal 2 on the PS1 on a CRT TV from back then (not a CRT monitor)!
How?

Memory at the time, both working and storage, was a real cost bottleneck. So think of it like a form of compression. You draw the image as if it's 480 and then throw half the lines away. You then let the natural blurriness of a composite signal trick your mind into filling in the gaps like it does with impressionist paintings. And there was no scaling, it's all done with the timings on the analogue video signal.
As I say above, in the 70s/early 80s display buffers were all of 300x200, but they drew the full screen without visible scanlines. Same with low resolution images in the 90s.
 
PAL was fixed at 50fps, higher resolution. Letterboxed output was the same resolution as NTSC but it still ran slower. You could also have a wrong aspect with 448 lines being stretched to 480 or a 4:3 aspect being letterboxed into a slightly shorter vertical.
Why would they letterbox it if it didn't have any other improvement? Or was it an easy solution when converting one game from NTSC to PAL?

Also another curiosity: a PAL PS1, when modded to run NTSC games, required a chip and a converter. PAL games always ran in PAL modes and NTSC games always ran in NTSC modes on the PS1. Curiously, on the Saturn, a very simple internal rewiring allowed the console to be both NTSC and PAL. A PAL game would automatically run at NTSC speeds and resolution if someone installed an NTSC/PAL switch. Saturn games didn't have visible black borders AFAIR, but it probably upsampled the image to hide them. Because I remember switching the console to NTSC while running the PAL Panzer Dragoon Saga, and the screen grew and hid portions of the image. But it was at the same time faster.
 
Why would they letterbox it if it didn't have any other improvement? Or was it an easy solution when converting one game from NTSC to PAL?
Just an easy solution. A game designed for 640x480 could be displayed on a 512 line display by just letterboxing. Otherwise you either had to tweak the rendering to render full res, or stretch the image.
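
As a rough sketch of that letterboxing arithmetic (the display line counts below are illustrative, not exact specs):

```python
# Centre a 480-line image inside a taller raster by padding with blank lines
# (a sketch; the display line counts are illustrative assumptions).

def letterbox_offsets(image_lines, display_lines):
    """Blank lines needed above and below to centre the image vertically."""
    spare = display_lines - image_lines
    return spare // 2, spare - spare // 2

print(letterbox_offsets(480, 576))  # (48, 48) on a nominal 576-line PAL picture
print(letterbox_offsets(480, 512))  # (16, 16) on the 512 visible lines mentioned above
```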
 
My point of reference is a home TV with an RF input and 1970s/early 80s computer hardware.

Cannot tell you about that, this topic was about PS1/PS2 era. I was not around on this planet when 70s/80s hardware was new. ;)


Hook up a PS1 to a CRT TV, preferably using RGB for a sharper image (or S-Video if RGB is not available).

As I say above, in the 70s/early 80s display buffers were all of 300x200, but they drew the full screen without visible scanlines. Same with low resolution images in the 90s.

No, you are definitely not right here about the 90s (and I would assume also about 70s or 80s).

They did NOT fill the blank lines in the 240p modes of the PS1, Saturn, N64 and the like; you could clearly see the empty black lines in between.

As MrSpigott already explained:
"
Although not an official standard, 240p (or numbers thereabouts depending on region) did exist. It was where the gun was forced to write to field 1 again, after fly back, rather than alternating between field 1 and field 2.
Exactly the same number of fields were being written a second, but the resolution was halved."
 
Hook up a PS1 to a CRT TV, preferably using RGB for a sharper image (or S-Video if RGB is not available).
I haven't a PS1 nor a CRTV and I'm not going to be getting either. ;)
No, you are definitely not right here about the 90s (and I would assume also about 70s or 80s).

They did NOT fill the blank lines in the 240p modes of the PS1, Saturn, N64 and the like; you could clearly see the empty black lines in between.
If so, I don't understand why. Why not just interleave or double up? Doesn't this depend on how the display handles the signal?
As MrSpigott already explained:
"
Although not an official standard, 240p (or numbers thereabouts depending on region) did exist. It was where the gun was forced to write to field 1 again, after fly back, rather than alternating between field 1 and field 2.
Exactly the same number of fields were being written a second, but the resolution was halved."
Found a couple of photographs of a 320x240 Sinclair Spectrum on a CRTV:

1740330329334.png

1740330443365.png

The aperture grille is visible, so entire missing lines should certainly be!

That led me to various things, including a photo of ICO with obvious scanlines...

1740330903142.png

And then another without:

1740331178534.png

That's more what I'm used to seeing with offset phosphor columns. Perhaps the irregular phosphors hide the scanlines better, which is why I never saw them? But I still don't understand why not either interlace or double lines, when that was a proven solution years earlier.
 
So there were blank lines in such cases? What was the point? I suppose to reduce flicker, at the cost of looking stripy.

A TV from back then could not do it the way you describe.
Otherwise, it would also have been progressive-scan compatible, which only came with much later TVs (and monitors, of course).

Multisync monitors could do it if the graphics hardware and software supported it, but low resolutions arguably look better with the blank lines, especially in console games that were made for it, since that was the way a TV would display them. You have no new information to fill the blank lines with other than line doubling, which does not look good. With blank lines, the brain will complete the image.

Compare:
1740331477373.png

The left image looks way more natural, while the right one looks artificial and very blocky/pixellated.
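
To make the comparison concrete, here is a minimal sketch of the two mappings being compared, blank lines versus naive line doubling (the buffer is just a toy array, not any console's real output path, which worked on signal timing rather than in software):

```python
# Two ways to map a 240-line buffer onto 480 output lines (a toy sketch).

import numpy as np

src = np.random.randint(0, 256, size=(240, 320), dtype=np.uint8)  # stand-in 240-line image

# "240p on a 480-line raster": every other output line stays dark.
with_scanlines = np.zeros((480, 320), dtype=np.uint8)
with_scanlines[0::2] = src

# Naive line doubling: each source line is simply repeated.
line_doubled = np.repeat(src, 2, axis=0)

print(with_scanlines.shape, line_doubled.shape)  # both (480, 320)
```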
 
If so, I don't understand why. Why not just interleave or double up? Doesn't this depend on how the display handles the signal?

Because a TV from back then would not work this way.
Only computer monitors and later progressive-scan-compatible TVs could do that.

Also, it would not look good, see the comparison image in my last posting. ;)
 
Well, I remember the Sega arcade monitors back then had a reputation for excellent quality (compared to other monitors from that time, of course); I also read that in the magazines of the time.

But I can't tell from my own experience, as arcades were not a thing in my country back then. :(

Sega's monitors in those days (Model 1, 2 and 3) were only 496x384, though. But they were progressive scan.

It was not until Naomi they got to 480p.
They had no reason to be excellent quality, just good enough. The analogue output stage of an arcade PCB was far simpler, and much cheaper, than broadcast-spec equipment's equivalent, so why hook up an expensive monitor? They just had to be durable.

As I say above, in the 70s/early 80s display buffers were all of 300x200, but they drew the full screen without visible scanlines. Same with low resolution images in the 90s.
Hidden in the blur, and we all used much smaller screen sizes then, but they were definitely there.
 
They had no reason to be excellent quality, just good enough.

Hm? That does not make sense if you're talking about the mid-to-late 90s here.
I mean, a Model 3 arcade board, for example, was about $20,000 for the board alone.
Cost was much less of a factor in the arcades than back home. Most of the time, you also had expensive hardware like hydraulics, real cars to sit in and the like to provide an exclusive experience in the arcade which you just could not get at home. Why should they use the cheapest monitors and save a little money, which in relation to the rest of the investment was almost negligible?
It absolutely makes sense NOT to save a few bucks, but to use higher-quality displays to show off the graphics from the powerful hardware.
 
I'm definitely lost. If a game wanted high res graphics, it'd surely interleave fields, and if it went with low res graphics, it'd do exactly the same as the 1980s machines and just double lines.
NTSC SD CRTs scan pairs of 240-line fields at 29.97Hz, with lines scanned at 15734Hz. That's it.

The only two options are to send standard NTSC signals and have it be 480i, or to fudge the signals to trick the television into drawing both fields at the same offset, resulting in 240p60 progressive scan with "scanlines." There's no way to do 480p.

If the underlying imagery is true 240p60 and you allow the video signal to alternate even/odd fields, you'd eliminate the "scanlines", but in doing so introduce a slight vertical bobbing to the image.
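
A quick back-of-the-envelope check on those numbers (the constants below are the nominal NTSC broadcast figures, used as an illustration rather than any console's exact timings):

```python
# Standard NTSC timing, back-of-the-envelope (nominal broadcast figures).

LINE_RATE_HZ  = 15_734.26   # horizontal scan rate
FIELD_RATE_HZ = 59.94       # fields per second (two fields per 29.97 Hz frame)

lines_per_field = LINE_RATE_HZ / FIELD_RATE_HZ
print(f"lines per field: {lines_per_field:.1f}")   # ~262.5

# Interlace works because each field is 262.5 lines long, so alternate fields
# land half a line apart. The "240p" trick sends whole-line (262-line) fields,
# so every field lands on the same scan positions and the gaps between them
# show up as dark lines.
print("480i field length: 262.5 lines -> fields alternate offsets")
print("240p field length: 262 lines  -> every field overlaps exactly")
```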

Now a CRT TV from back then has a certain afterglow (even more than a CRT monitor from those days) and also the human eye/brain is a bit slow, so you will not see the scanlines.
Lack of interlaced scanline visibility has at least as much to do with our visual persistence as it does with television persistence.

A simple proof of this is to look at a TV in a way that makes your eyes track something vertically at roughly 60 on-screen lines per second. This amount of eye movement will keep the scanline grid "fixed" in your vision, allowing you to see the gaps, especially on bright imagery. I think the first time I noticed this was when looking at the brightly-lit ring in the skybox on Truth and Reconciliation in Halo CE.

It can be a little tricky to force yourself to eye-track something without having something to eye track. The easiest way is probably to load up a first-person shooter, and then make the camera tilt slowly while you look at a fixed point or object.
 
Also another curiosity: a PAL PS1, when modded to run NTSC games, required a chip and a converter. PAL games always ran in PAL modes and NTSC games always ran in NTSC modes on the PS1. Curiously, on the Saturn, a very simple internal rewiring allowed the console to be both NTSC and PAL. A PAL game would automatically run at NTSC speeds and resolution if someone installed an NTSC/PAL switch. Saturn games didn't have visible black borders AFAIR, but it probably upsampled the image to hide them. Because I remember switching the console to NTSC while running the PAL Panzer Dragoon Saga, and the screen grew and hid portions of the image. But it was at the same time faster.

A lot of (but not all) PAL Saturn games were reworked to render at a higher resolution to fill up more of the screen. This was relatively straightforward for the Saturn because it had two dedicated banks of 256KB each for the front and back buffers. NTSC resolutions didn't fill these, so there was room to extend the resolution, and fillrate was basically the same due to the lower frame rate.

Downside to this is that you normally had memory on the Saturn that wasn't being used, so not great from an efficiency POV. On the PlayStation the frame buffers were stored in 1MB of unified VRAM, so there probably wasn't room for larger frame buffers without reworking some of the other contents of VRAM, which wasn't a priority for anyone.

Some Saturn PAL conversions also had the speed corrected so that they ran as fast as the NTSC version, but for games that tied game speed to frame rate and that weren't suitably re-worked, that meant you'd drop every sixth frame, leading to uneven pacing of the game despite regular frame times.
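
A small sketch of both points, how much of a 256KB Saturn bank a typical frame takes and why a speed-corrected game drops every sixth frame (the resolutions and 16bpp depth are illustrative assumptions, not figures from any specific title):

```python
# Two quick checks (illustrative assumptions, not measurements from any title).

BANK_BYTES = 256 * 1024  # one dedicated Saturn frame-buffer bank

def usage(width, height, bpp=16):
    used = width * height * bpp // 8
    return used, used / BANK_BYTES

print("320x224 @16bpp:", usage(320, 224))  # ~140 KB -> bank just over half full
print("320x256 @16bpp:", usage(320, 256))  # ~160 KB -> a taller PAL image still fits

# Speed-corrected PAL: game logic ticks at 60 Hz, display refreshes at 50 Hz,
# so 10 of every 60 logic frames are never shown -> one dropped every 6th frame.
logic_hz, display_hz = 60, 50
print("dropped frames per second:", logic_hz - display_hz)               # 10
print("one drop every", logic_hz // (logic_hz - display_hz), "frames")   # 6
```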
 