Scaler - complexity, structure & cost

MyEyeSpy

Newcomer
Scalers that scale the resolution, such as the ones found in high-end DVD players/TVs/consoles (PS3 & Xbox 360 Hana). How do they work? Are they fundamentally different from GPUs? Which brands are there today (ICs, not the DVD players using them etc.)? What does it cost to integrate one of these babies? How complex are they, i.e. how many million transistors can one expect to find in a 1080p-capable scaler, in very rough numbers? 10? 50? 100? WHAT is in there? CPU/GPU/CACHE? Can't a GPU do the same job as a scaler, if not better (i.e. why include them in top-line consoles)?

How much more complex would a scaler need to be (compared to those capable of 1080p today) if we are talking 4k resolution?

My post is in no way "zomg, why not include an R600 & 8800 GTX ULTRA 2 in that TV instead of a puny scaler?!". I am trying to understand them, but can find little to no references at all :cry:
 
ie why include them in top-line consoles
Two reasons: video fanboys, and the ability to conserve energy during video playback (and thus reduce fan noise).

Most of the time the scalers are just relatively simplistic linear filters (sometimes with a lot of taps ... but that matters a lot less for interpolation than some people think), which would take only a small percentage of the power of a modern GPU.
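
In code it's nothing magic; roughly something like this 1-D sketch (Python, with a Catmull-Rom kernel picked purely for illustration, not what any particular chip uses; real hardware does this separably with fixed-point math):

```python
# 1-D sketch of a multi-tap linear scaling filter (4-tap polyphase).
# The Catmull-Rom kernel is just an illustrative choice of coefficients;
# real scaler chips use their own kernels, tap counts and fixed-point math.
import numpy as np

def catmull_rom_weights(t):
    """Weights for the 4 taps around an output sample at fractional phase t."""
    return np.array([
        -0.5*t**3 + 1.0*t**2 - 0.5*t,
         1.5*t**3 - 2.5*t**2 + 1.0,
        -1.5*t**3 + 2.0*t**2 + 0.5*t,
         0.5*t**3 - 0.5*t**2,
    ])

def upscale_line(line, factor):
    """Upscale one scanline by 'factor'; a 2-D scaler applies this separably."""
    n_in, n_out = len(line), int(len(line) * factor)
    out = np.empty(n_out)
    for i in range(n_out):
        x = i / factor                     # position in the source line
        ix = int(np.floor(x))
        t = x - ix                         # fractional phase
        taps = [line[min(max(ix + k, 0), n_in - 1)] for k in (-1, 0, 1, 2)]
        out[i] = np.dot(catmull_rom_weights(t), taps)
    return out

print(upscale_line(np.array([10., 20., 80., 90., 30., 20.]), 1.5))
```

"Lots of taps" just means more neighbours per output pixel; the structure stays the same.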
 
I don't know if GPUs have scalers built in, but someone once told me Matrox had done a Lanczos scaler (not sure if I'm spelling that right). Though that person worked on the Rainbow Runner video product in addition to some graphics chips, so he might have been referring to the Rainbow Runner.
 
Scalers that scale the resolution, such as the ones found in high-end DVD players/TVs/consoles (PS3 & Xbox 360 Hana). How do they work? Are they fundamentally different from GPUs?
High-end scalers do a lot more than just stretch the image and apply some kind of filter to it: they're intended to display a low-resolution, low-frame-rate, sometimes interlaced image on a high-frame-rate, high-resolution screen.

When your original source is interlaced at 30Hz (basically, a standard SD NTSC signal) and you have a 100Hz 1080p display screen, you'll first need a deinterlacer. Since there is a time delta between the A field and the B field, you can't just merge 2 fields together to get a double resolution image.
Well, you can, but then you get horrible artifacts that are unacceptable for a high end TV.

High-end deinterlacers are smarter and, sometimes, WAY smarter than that: the smarter ones will do motion detection on the image. For static parts of the image, they'll merge the A field and B field, effectively doubling the resolution. For moving parts, they'll do so-called 'bobbing', whereby they just repeat the previous line: this is not as nice as merging, but you avoid the ugly artifacts. Because that part of the image is moving anyway, the lower quality will be less noticeable.
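
Roughly, the weave/bob decision looks like this (toy example; the per-pixel motion metric and threshold are made up for illustration, real products are a lot more careful):

```python
# Toy motion-adaptive deinterlacer: weave static areas, bob moving ones.
# field_a / field_b are the two fields of the current frame, prev_a is the
# A field of the previous frame. The motion metric and THRESHOLD are
# illustrative assumptions only.
import numpy as np

THRESHOLD = 8  # luma difference above which a pixel is treated as "moving"

def deinterlace(field_a, field_b, prev_a):
    h, w = field_a.shape
    frame = np.empty((2 * h, w), dtype=field_a.dtype)
    frame[0::2] = field_a                                   # A lines go in as-is
    moving = np.abs(field_a.astype(int) - prev_a.astype(int)) > THRESHOLD
    # Weave (use the real B line) where static, bob (repeat the A line) where moving.
    frame[1::2] = np.where(moving, field_a, field_b)
    return frame

a, b, pa = (np.random.randint(0, 256, (240, 720), dtype=np.uint8) for _ in range(3))
print(deinterlace(a, b, pa).shape)    # (480, 720) progressive frame
```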

After the de-interlace step, you can then use regular upscaling techniques to get to the final resolution.

The way smarter de-interlacers use motion compensation instead of just motion detection: not only will they detect whether there is motion, they will detect which pixel is moving where over the course of multiple frames and use this to interpolate both spatially and temporally: inventing completely new intermediate frames (because the scan rate changes) and inventing new pixels within an A or B field during de-interlacing.
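
Very roughly, the frame-rate-conversion side of that looks something like this (assuming the per-block vectors have already been estimated, and ignoring occlusions entirely):

```python
# Sketch of motion-compensated frame interpolation: build a frame halfway in
# time between frame0 and frame1 by fetching each block half a motion vector
# back in frame0 and half forward in frame1. Block size, the already-estimated
# vectors and the lack of occlusion handling are simplifying assumptions.
import numpy as np

BLOCK = 16

def midframe(frame0, frame1, vectors):
    """vectors[by, bx] = (dy, dx): motion of that block from frame0 to frame1."""
    h, w = frame0.shape
    mid = np.zeros((h, w))
    for by in range(h // BLOCK):
        for bx in range(w // BLOCK):
            dy, dx = vectors[by, bx]
            y, x = by * BLOCK, bx * BLOCK
            y0 = int(np.clip(round(y - dy / 2), 0, h - BLOCK))
            x0 = int(np.clip(round(x - dx / 2), 0, w - BLOCK))
            y1 = int(np.clip(round(y + dy / 2), 0, h - BLOCK))
            x1 = int(np.clip(round(x + dx / 2), 0, w - BLOCK))
            mid[y:y+BLOCK, x:x+BLOCK] = 0.5 * frame0[y0:y0+BLOCK, x0:x0+BLOCK] \
                                      + 0.5 * frame1[y1:y1+BLOCK, x1:x1+BLOCK]
    return mid
```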

I don't know to what extent GPU scalers are able to do this kind of stuff. But AVIVO and PureVideo are getting increasingly higher HQV scores, so they must get a lot of stuff right.

Which brands are there today(IC, not DVD's using them etc)?
Silicon Optix, Trident, Gennum, ATI. There must be a lot more. Some large CE companies may have their own in-house developed stuff (Sony being a likely candidate; they are famous for their NIH tendencies).

WHAT is in there? CPU/GPU/CACHE?
CPU: very likely, to manage the dataflow in the chip. An ARM, ARC, Tensilica, MIPS, or some other embedded processor.
Cache: I$ and D$ for the CPU.
GPU: Why?

Those components are standard stuff these days in pretty much any complex embedded chip and probably only a very small part of it. Without in-depth knowledge of the inner workings of the other blocks, area and complexity estimation is pretty much meaningless.

Can't a GPU do the same job as a scaler?
For motion estimation, you start with a 16x16 (or smaller) pixel block and then move it around a region of, say, 32x32 in steps of 1/4 pixel and try to find the location with the closest match. (It's very similar to what a video encoder is doing.) I don't think GPU shaders can be efficient with that kind of memory access pattern.
Simple upscaling is a small piece of dedicated hardware. Probably not worth putting on a shader.
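
As a sketch of what that search means (integer-pel and brute force only; real hardware also refines to half/quarter-pel and uses smarter search orders):

```python
# Brute-force block matching: for a 16x16 block of the current frame, scan a
# +/-8 pixel window in the reference frame and keep the offset with the lowest
# sum of absolute differences (SAD). Subpel refinement is left out.
import numpy as np

BLOCK, SEARCH = 16, 8

def best_vector(cur, ref, y0, x0):
    block = cur[y0:y0+BLOCK, x0:x0+BLOCK].astype(int)
    best, best_sad = (0, 0), np.inf
    for dy in range(-SEARCH, SEARCH + 1):
        for dx in range(-SEARCH, SEARCH + 1):
            y, x = y0 + dy, x0 + dx
            if y < 0 or x < 0 or y + BLOCK > ref.shape[0] or x + BLOCK > ref.shape[1]:
                continue
            sad = np.abs(block - ref[y:y+BLOCK, x:x+BLOCK].astype(int)).sum()
            if sad < best_sad:
                best_sad, best = sad, (dy, dx)
    return best
```

The memory access point is visible here: every candidate offset reads an overlapping but shifted 16x16 window, which is cheap for dedicated logic with line buffers but awkward for shaders.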

I am trying to understand them but find little to no references at all.
The Google is your friend. Any permutation of the words 'motion estimation compensation frame rate conversion scaling' will give you way too much to read. ;)
 
Shame I can't give rep, very helpful.

Those components are standard stuff these days in pretty much any complex embedded chip and probably only a very small part of it. Without in-depth knowledge of the inner workings of the other blocks, area and complexity estimation is pretty much meaningless.

The question regarding complexity was mainly in relation to chip price. What does it cost to integrate, and how complex are these chips compared to one another? For example, with audio chips you can have anything from a 70-cent part up to an X-Fi with 50-ish million transistors, and everything in between, above & below that. I was mainly referring to the top end now, so the 70-cent equivalent is not really that interesting, although I know the high end is "fairly" more complicated and costly.

The Google is your friend. Any permutation of the words 'motion estimation compensation frame rate conversion scaling' will give you way too much to read. ;)

Thank you, my search terms were nothing like that, which is probably why I got too much noise and too little food.
 
Anyone with some insight who can speculate, in VERY rough numbers, what it costs for the OEMs to include a scaler in their products? Not THE top of the line, but a more than decent one. References would be greatly appreciated :rolleyes:
 
Anyone with some insight who can speculate, in VERY rough numbers, what it costs for the OEMs to include a scaler in their products? Not THE top of the line, but a more than decent one. References would be greatly appreciated :rolleyes:

Why the rolling eyes? It could be interpreted as rude, and considering the nice reply you got from silent_guy you should not complain.

And about your question: these prices can be hard to estimate, since a manufacturer prices depending on the number of units and the past relationship and deals with the customer. In most cases they don't like those details to be known by consumers. If I had to guess, I would say something like 5 to 30 euros per scaler, depending on quality, but that could be really off, and most likely is.
 
It was meant ironically, as I seriously doubt real references can be given, for the same reasons you just mentioned; the closest would be if anyone knew the size or level of complexity of the chips, so one could start from that end and calculate the production costs.

I fail to see how it could be interpreted as rude, since I said I was grateful for the informative & constructive input; rudeness was in no way intended, quite the contrary.
 
Anyone with some insight who can speculate, in VERY rough numbers, what it costs for the OEMs to include a scaler in their products? Not THE top of the line, but a more than decent one. References would be greatly appreciated.

I'm afraid this is not the place where you're going to get any reasonable numbers. You may want to call some of the companies I've listed and ask for marketing material or, with a little persistence, a datasheet? ("I'm a poor student doing a thesis on...")

It's clear that you didn't want to insult anyone with the rolling eyes, so don't worry. But compress is right that it can give your message a fairly negative twist. Forums would be a friendlier place with a smiley repertoire limited to just ';)'.

;)

(If you manage to get the data you want, come back and share it here!)
 
It's just too small a part of modern video processors to warrant much attention in scientific papers and you will have a hard time prying area information out of companies.

H.264 is quite popular as far as publications go at the moment though, and its subpixel interpolation isn't that bad (6-tap in each dimension to form half-pel interpolants, bilinear after that). AFAIK subpixel interpolation for H.264 HDTV decoders takes on the order of 1 mm² to implement at 180 nm ... or in other words, it's not really relevant.
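
For what it's worth, that half-pel filter is just a (1, -5, 20, 20, -5, 1)/32 FIR; a 1-D sketch, with the edge handling simplified to clamping and the quarter-pel bilinear step left out:

```python
# H.264-style half-pel luma interpolation along one row: 6-tap FIR with
# weights (1, -5, 20, 20, -5, 1), rounded, shifted right by 5 and clipped
# to 8 bits. Edge handling and the quarter-pel step are simplified here.
import numpy as np

def half_pel(row):
    """Half-pixel samples between row[i] and row[i+1]."""
    r = np.pad(row.astype(int), (2, 3), mode='edge')
    out = np.empty(len(row), dtype=int)
    for i in range(len(row)):
        e, f, g, h, j, k = r[i:i+6]        # the 6 neighbours around the half-pel spot
        out[i] = np.clip((e - 5*f + 20*g + 20*h - 5*j + k + 16) >> 5, 0, 255)
    return out

print(half_pel(np.array([0, 0, 255, 255, 0, 0], dtype=np.uint8)))
```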
 
It's just too small a part of modern video processors to warrant much attention in scientific papers and you will have a hard time prying area information out of companies.

H.264 is quite popular as far as publications go at the moment though, and its subpixel interpolation isn't that bad (6-tap in each dimension to form half-pel interpolants, bilinear after that). AFAIK subpixel interpolation for H.264 HDTV decoders takes on the order of 1 mm² to implement at 180 nm ... or in other words, it's not really relevant.

Yes, that's for the plain vanilla resolution upscaling. For mid-end or high-end stuff, the picture is completely different. (The scalers are usually in the panels, BTW, not in the DVD players.)
 
Meh, the algorithms I have actually seen described from Philips (linear + LTI) and Sony (VQ-based) aren't all that complex either (well, not as far as multiplications go, anyway).
 
It can't be all that small

Microsoft's new Xbox 360 Elite with Hana (a shrink? new packaging?) has the scaler as a separate chip. Had it been all that "tiny", it would have made sense to include it on the NB to avoid the additional packaging. It might have proven economical not to because of yield/packaging cost, but since they have full control over all the chips, I just think they would have included it with one of the others if it were not that complex, maybe even the GPU to save pins (balls)?

I don't know a lot about the one in...

On a completely unrelated note, which company or chip is considered "the" best? Or can that even be answered?

A little more on the Xbox scaler; I am sure most of you have read it, but if not:
http://arstechnica.com/articles/headstart.ars/2

From the article, when the author asks how much it costs to put Ana into the 360: "This isn't a $1,000 scaler," Henson says, "but it's a good one."
 
I'm not sure if this info is any help, but hardware video scalers have been integrated in common graphics cards since 1996 (or maybe 1995). E.g. the ATi Mach64 VT from 1996 is the first chip from ATi which supports HW video scaling (max. input resolution is 320x240, or 352x240 with some driver versions). This VGA/2D/video accelerator consists of less than 1 million transistors (I have no idea about the exact number, but the newer and 3D-capable Rage I is under 1 million too, so hundreds of thousands might be a more accurate estimate).

The Rage II supported an input resolution of 720x480 (NTSC DVD). Not sure about the Rage PRO, but its scaler was improved and used a 4-tap (horizontal) and 2-tap (vertical) scaling filter.

The next big leap was the Rage 128 PRO, which supports even some HDTV resolutions (720x480 is still officially the highest input resolution, but the chip actually manages much higher res.: 14xx*1xxx works; I haven't tested any higher, but if you give me a link to some HD MPEG-2 videos, I could test it for you). The scaler uses a 4-tap/4-tap filter, and I think this was the first consumer graphics chip supporting 4-tap/4-tap filtering. The Rage 128 PRO consists of 8 million transistors (don't forget its dual-pipeline 3D core, AGP 4x interface, 128-bit memory controller, integrated TMDS with ratiometric expander, RAMDAC etc.).

I think that a scaler for HDTV resolutions can consist of less than 1 million transistors if you don't have any special requirements.
 
I don't really know why m$ included an analog scaling chip ... my guess is that engineering designed for 720p initially (which makes sense: why bother with scaling if the displays can do it?) and marketing later decided it should be able to push out 1080p/i for bullet-point reasons (probably a wise choice in retrospect).

PS. I wouldn't be surprised if that (H-)ANA chip has an ADC.
 
But if a scaler does not take up that much space, why don't nVidia & ATI (AMD) integrate one in every chip now? For example, say you have a 30" 2560x1600 display but can only get good frame rates at 1600x1200 or whatever; then let the scaler take it up to whatever your maximum resolution is. If it is really small, it would make even more sense (in one way) to include one in the mid-range chips.
 
Every graphics chip today contains a scaler, but it's unusable for 3D graphics because of the horrible image quality (jagged and blurred). Standard scalers don't add any detail; they only enlarge and filter the existing image (the difference between an up-scaled image and an image already rendered at a higher resolution is similar to digital vs. optical zoom in cameras).

example (screenshots attached):

image rendered at 160x120 (s1_160x120.png)
image rendered at 160x120, upscaled to 320x240 (s1_upscaled.png)
image rendered at 320x240 (s1_def.png)
image rendered at 320x240 using FSAA 16x (OGSS) (s1_ogss16.png)
image rendered at 320x240 using FSAA 16x (OGSS) and some gamma correction (s1_ogss16_g122.png)
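
To put a number on the same point the screenshots make, here is a tiny sketch comparing a native high-res "render" with an upscaled low-res one (bilinear filtering stands in for whatever the chip actually uses, and the stripe pattern is just a made-up test scene):

```python
# Render a fine stripe pattern natively at 320x320, render it at 160x160 and
# upscale by 2 with bilinear filtering, then compare. The "renderer" and the
# pattern are stand-ins; the point is only that the scaler cannot recreate
# detail that was never rendered.
import numpy as np

def render(n):
    y, x = np.mgrid[0:n, 0:n]
    return np.sin((x + y) * 16.0 * np.pi / n)      # frequency fixed in scene space

def bilinear_upscale(img, factor):
    n = img.shape[0]
    coords = (np.arange(n * factor) + 0.5) / factor - 0.5
    i0 = np.clip(np.floor(coords).astype(int), 0, n - 2)
    t = np.clip(coords - i0, 0.0, 1.0)
    rows = img[i0] * (1 - t)[:, None] + img[i0 + 1] * t[:, None]
    return rows[:, i0] * (1 - t)[None, :] + rows[:, i0 + 1] * t[None, :]

native   = render(320)
upscaled = bilinear_upscale(render(160), 2)
print("mean abs difference vs native render:", np.abs(native - upscaled).mean())
```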
 
I know where you are going with this, but it is my understanding that "real" scalers (except really cheap ones) do a lot more. silent_guy described it very nicely, so I'll just quote him:

When your original source is interlaced at 30Hz (basically, a standard SD NTSC signal) and you have a 100Hz 1080p display screen, you'll first need a deinterlacer. Since there is a time delta between the A field and the B field, you can't just merge 2 fields together to get a double resolution image.
Well, you can, but then you get horrible artifacts that are unacceptable for a high end TV.

High-end deinterlacers are smarter and, sometimes, WAY smarter than that: the smarter ones will do motion detection on the image. For static parts of the image, they'll merge the A field and B field, effectively doubling the resolution. For moving parts, they'll do so-called 'bobbing', whereby they just repeat the previous line: this is not as nice as merging, but you avoid the ugly artifacts. Because that part of the image is moving anyway, the lower quality will be less noticeable.

After the de-interlace step, you can then use regular upscaling techniques to get to the final resolution.

The way smarter de-interlacers use motion compensation instead of just motion detection: not only will they detect whether there is motion, they will detect which pixel is moving where over the course of multiple frames and use this to interpolate both spatially and temporally: inventing completely new intermediate frames (because the scan rate changes) and inventing new pixels within an A or B field during de-interlacing.

Would not a "real smart" scaler give quite nice result in games? If they only blow it up to fit, would it not always be better to look at media in its original form?
I know dvd on a 4k screen would look like ****, but with a real good scaler I thought it would look "quite" nice, or am wrong about what a good scaler really can do?
 
I know where you are going with this, but it is my understanding that "real" scalers (except really cheap ones) do a lot more. silent_guy described it very nicely, so I'll just quote him:



Would not a "real smart" scaler give quite nice result in games? If they only blow it up to fit, would it not always be better to look at media in its original form?
I know dvd on a 4k screen would look like ****, but with a real good scaler I thought it would look "quite" nice, or am wrong about what a good scaler really can do?

I'm guessing the results aren't all that good even with the best scalers (or at least the best reasonably priced ones), as otherwise all GPUs would just be rendering at 640x480 and then upscaling to get a 1600x1200-equivalent output. Would sure be nice if it worked that way, though ;)

Imagine what we could do if GPUs only had to render at 640x480 and a scaler boosted the image quality to a high-res equivalent!
 
Would not a "real smart" scaler give quite nice result in games? If they only blow it up to fit, would it not always be better to look at media in its original form?
I know dvd on a 4k screen would look like ****, but with a real good scaler I thought it would look "quite" nice, or am wrong about what a good scaler really can do?
The tricks I talked about are all very much video-oriented and more or less irrelevant for GPUs: game content doesn't require frame rate conversion and you don't need deinterlacing. So other than regular upscaling with filtering, there's not a lot you can do.
 