So there were blank lines in such cases? What was the point? I suppose it was to reduce flicker at the cost of looking stripy.
"Also, what was the theoretical resolution of CRTs? I don't remember ever seeing any reference to interlaced resolution in any specs. How did games' interlaced modes work on each CRT if every CRT had a different resolution?"

CRTs were analogue, so the resolution was determined by how quickly you could modulate and scan the electron beam. On a monochrome display like an oscilloscope (the origin of the CRT display), you effectively had 'infinite' resolution in terms of the points you could illuminate, producing perfectly smooth lines. The Sony FW900 accepted 2560x1600 @ 75Hz at its input, but it couldn't resolve that perfectly.
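To make the point about scan frequencies concrete, here is a back-of-the-envelope sketch in Python. The NTSC timing figures are standard textbook numbers rather than anything from the post above.

```python
# Rough sketch: the vertical line count a CRT can draw is bounded by how many
# horizontal sweeps fit into one vertical sweep. Standard NTSC figures below.

h_scan_hz = 15_734      # horizontal (line) frequency
v_field_hz = 59.94      # vertical (field) frequency

lines_per_field = h_scan_hz / v_field_hz   # ~262.5 lines per field
lines_per_frame = 2 * lines_per_field      # ~525 lines per interlaced frame

print(f"{lines_per_field:.1f} lines per field, {lines_per_frame:.0f} per frame")

# Horizontal resolution has no such hard limit on an analogue CRT: it depends on
# how fast the beam intensity can be modulated (video bandwidth) during each sweep.
```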
"So there were blank lines in such cases? What was the point? I suppose it was to reduce flicker at the cost of looking stripy."

Halves the frame buffer size.
But you can get that with interlaced 480i. A 640x480 60 Hz display could show interleaved 640x240 fields. You could either update those at 60 fps, or at 30 fps and just double up the lines. Heck, the very first home computers and consoles had well below full TV display resolution but output a full-screen image with chunky pixels, so innate upscaling was clearly possible and happening.
Actually, halves the memory required is probably a better description.
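As a sketch of the memory saving being talked about, assuming a 16-bit colour depth purely for illustration (actual consoles varied):

```python
# Halving the line count halves the frame buffer: 640x240 vs 640x480 at 16bpp.

def framebuffer_bytes(width: int, height: int, bits_per_pixel: int = 16) -> int:
    return width * height * bits_per_pixel // 8

full = framebuffer_bytes(640, 480)   # 614,400 bytes (~600 KiB)
half = framebuffer_bytes(640, 240)   # 307,200 bytes (~300 KiB)
print(full, half, full / half)       # the 240-line buffer needs exactly half the memory
```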
How common were 240p displays to be targeted and used this way?
Memory at the time, both working and storage, was a real cost bottleneck. So think of it like a form of compression. You draw the image as if it's 480 and then throw half the lines away. You then let the natural blurriness of a composite signal trick your mind into filling in the gaps like it does with impressionist paintings. And there was no scaling, it's all done with the timings on the analogue video signal.
If you went with a 480i image, you'd still need to store your graphics as if you were using a 480p display, otherwise everything would look half-height.
Things were a little sharper in arcades where the connections were generally RGB, but the tubes used were the lowest quality possible and the driving electronics weren't the best. We all crave Sony PVM/BVM CRT monitors for that authentic arcade look, but arcades never looked that good.
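A minimal sketch of the "draw it as 480 and throw half the lines away" idea above, with line doubling on the display side. It is purely illustrative; no real hardware worked on Python lists, and the composite blur that hides the gaps obviously isn't modelled.

```python
# Render as if the target were 480 lines, keep every other line (the 240 that
# are actually sent), then naively double lines back up at display time.

full_frame = [f"line {i}" for i in range(480)]   # artwork laid out for 480 lines
field_240 = full_frame[::2]                      # half the lines are simply discarded
line_doubled = [row for row in field_240 for _ in (0, 1)]  # crude display-side doubling

print(len(full_frame), len(field_240), len(line_doubled))  # 480 240 480
```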
"I don't know if I got you right here. You mean a monitor displaying only 240 pixels of vertical resolution?"

My point of reference is a home TV with an RF input and 1970s/early-80s computer hardware. That was the most basic tech. Resolutions back then were of the order of 300x200 on displays with 576 visible lines. They did not have visible scanlines, just chunky pixels. This continued through 320x256 and 640x512 Amiga graphics - low-res mode did not draw only half the screen.
"That is not what I meant. At the time of the Saturn and PS1, you usually used your TV set, which did 480i max (or 576i in PAL), but the image was displayed at what would later be called 240p/224p almost all the time."

I'm definitely lost. If a game wanted high-res graphics, it'd surely interleave fields, and if it went with low-res graphics, it'd do exactly the same as the 1980s machines and just double lines. 240p progressive makes no sense to me.
"You definitely saw the black lines. It looked like this if you went close to it:"

The only time I've seen that was at a retro arcade in Norwich a couple of years ago, on a crusty game of Golden Axe. The missing lines were very obvious, so I'd surely have been well aware of them on other displays had they been present. That said, maybe it's an arcade thing, as I didn't frequent those?
"I highly recommend trying Tobal 2 on the PS1 on a CRT TV from back then (not a CRT monitor)!"

How?
As I say above, in the 70s/early 80s display buffers were all of 300x200, but they drew the full screen without visible scanlines. Same with low-resolution images in the 90s.
"PAL was fixed at 50fps, with a higher resolution. Letterboxed output was the same resolution as NTSC but it still ran slower. You could also end up with the wrong aspect ratio, with 448 lines being stretched to 480, or a 4:3 image being letterboxed into a slightly shorter vertical."

Why would they letterbox it if it didn't bring any other improvement? Or was it just an easy solution when converting a game from NTSC to PAL?
Just an easy solution. A game designed for 640x480 could be displayed on a 512-line display by simply letterboxing it. Otherwise you either had to tweak the rendering to output the full resolution, or stretch the image.
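The arithmetic behind that kind of letterboxing is trivial; here is a small sketch using the 480-in-512 example from the post (the helper name is just for illustration):

```python
# Centre a shorter image in a taller active area instead of re-rendering or stretching.

def letterbox_borders(display_lines: int, image_lines: int) -> tuple[int, int]:
    spare = display_lines - image_lines
    top = spare // 2
    return top, spare - top

print(letterbox_borders(512, 480))   # (16, 16): 16 blank lines top and bottom
```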
"Hook up a PS1 to a CRT TV, preferably using RGB for a sharper image (or S-Video if RGB is not available)."

I haven't got a PS1 or a CRT TV, and I'm not going to be getting either.
"No, you are definitely not right here about the 90s (and I would assume also about the 70s and 80s). They did NOT fill in the blank lines in the 240p modes of the PS1, Saturn, N64 and the like; you could clearly see the empty black lines in between."

If so, I don't understand why. Why not just interleave or double up? Doesn't this depend on how the display handles the signal?
As MrSpigott already explained:
"Although not an official standard, 240p (or numbers thereabouts, depending on region) did exist. It was where the gun was forced to write to field 1 again after flyback, rather than alternating between field 1 and field 2. Exactly the same number of fields were being written per second, but the resolution was halved."

Found a couple of photographs of a 320x240 Sinclair Spectrum on a CRT TV:
"Well, I remember the Sega arcade monitors back then had a reputation for excellent quality (compared to other monitors of the time, of course); I also read that in the magazines of the time. But I can't tell from my own experience, as arcades were not a thing in my country back then."

They had no reason to be excellent quality, just good enough. The analogue output stage of an arcade PCB was far simpler, and much cheaper, than its equivalent in broadcast-spec equipment, so why hook up an expensive monitor? They just had to be durable.
Sega's monitors in those days (Model 1, 2 and 3) were only 496x384, though. But they were progressive scan. It was not until the Naomi that they got to 480p.
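As a rough sketch of why a 496x384 progressive picture needs more than a standard TV: every visible line has to be scanned on every refresh. The refresh rate and blanking overhead below are assumptions for illustration, not figures from the thread.

```python
# Estimate the horizontal scan rate a progressive 384-line picture demands.

visible_lines = 384
refresh_hz = 60            # assumed refresh rate
blanking_overhead = 1.05   # assume ~5% of line time lost to vertical blanking

required_line_rate_hz = visible_lines * blanking_overhead * refresh_hz
print(f"{required_line_rate_hz / 1000:.1f} kHz")  # ~24 kHz, vs ~15.7 kHz for a normal TV
```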
"As I say above, in the 70s/early 80s display buffers were all of 300x200, but they drew the full screen without visible scanlines. Same with low-resolution images in the 90s."

Hidden in the blur, and we all used much smaller screen sizes then, but they were definitely there.
"I'm definitely lost. If a game wanted high-res graphics, it'd surely interleave fields, and if it went with low-res graphics, it'd do exactly the same as the 1980s machines and just double lines."

NTSC SD CRTs scan pairs of 240-line fields at 29.97Hz, with the lines scanned at 15734Hz. That's it.
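Putting numbers to that, and to the 240p trick MrSpigott described earlier (same scan rates, but the half-line offset that normally interleaves the two fields is dropped). This is an illustration of the timing arithmetic, not a signal generator:

```python
# NTSC line/field arithmetic, and why dropping the half line gives "240p".

h_scan_hz = 15_734        # line frequency
frame_hz = 29.97          # frame (pair-of-fields) frequency

lines_per_frame = h_scan_hz / frame_hz   # ~525
lines_per_field = lines_per_frame / 2    # ~262.5: the odd half line staggers the fields

print(f"{lines_per_frame:.0f} lines per frame, {lines_per_field:.2f} per field")

# Interlaced (480i): each field is 262.5 lines long, so successive fields land
# between each other's scanlines. "240p": send a whole number of lines per field
# (262) and every field lands on the same positions, leaving the gaps dark.
```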
"Now, a CRT TV from back then has a certain afterglow (even more than a CRT monitor from those days), and the human eye/brain is also a bit slow, so you will not see the scanlines."

Lack of interlaced scanline visibility has at least as much to do with our visual persistence as it does with the television's persistence.
Also, another curiosity: a PAL PS1, when modded to run NTSC games, required a chip and a converter. PAL games always run in PAL modes and NTSC games always run in NTSC modes on the PS1. Curiously, on the Saturn a very simple internal rewiring allowed the console to be both NTSC and PAL. A PAL game would automatically run at NTSC speed and resolution if someone installed an NTSC/PAL switch. Saturn games didn't have visible black borders AFAIR, but the console probably upsampled the image to hide them, because I remember switching the console to NTSC while running the PAL Panzer Dragoon Saga and the screen grew and hid portions of the image. But at the same time it was faster.
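A last bit of arithmetic on the 50/60 Hz switch behaviour described above (a back-of-the-envelope sketch; the active line counts are the usual PAL/NTSC figures, not measurements of Panzer Dragoon Saga):

```python
# A game that ties its logic to the display refresh speeds up by the refresh ratio,
# and a frame laid out for PAL's taller active area no longer fits in NTSC timing.

pal_hz, ntsc_hz = 50, 60
print(f"{ntsc_hz / pal_hz:.0%} of original speed")   # 120% -> 20% faster

pal_active_lines, ntsc_active_lines = 576, 480
print(pal_active_lines - ntsc_active_lines, "PAL lines that no longer fit")  # 96
```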