The Nature of Scaling (Signal Processing Edition)

Xalion

I did notice that users tend to use past GameTrailers comparisons as a reference for their bias toward the XBOX 360. The thing that is kind of funny to know, for the videophile, is that every comparison made so far actually gives an advantage to the PS3 version, due to the video quality of those comparisons.

I really wanted to leave this post alone. However, the claim that videophiles would think that these favor the PS3 just grates on me. While I would consider myself a computer enthusiast, I have spent far more time researching every last aspect of video that I can for my home theater.

GameTrailers specifically disables EVERY advanced video option for the PS3 when outputting a signal. They do this claiming they want the console on "even" ground. This includes things like true color, 24hz output, color balancing options, and even most of the advanced sound. Every videophile I know would tell you that crippling a player for an "even" comparison is ridiculous.

Keep in mind that at the same time they choose HDMI output for the 360 - even though this is a nonstandard feature of the console. The majority of XBox owners right now don't even have an HDMI port on their player. Even then, the point that most of us have been making is not that platform X is being picked on. It is the fact that their "testing" methods do not make sense for real comparisons.

Some specific points on your list:

The low bitrate & the quality of the encoding pass hide the jaggies & the texture quality, along with hiding ghosting (all titles).

This depends entirely on the encoding. A videophile would never point to this as a reason it is biased towards the XBox or the Playstation. They would point to it as a reason that neither stream represents a decent comparison. However, most encoding algorithms average over pixels in such a way that while the jaggies may shrink, you get edge effects in the locations that they are compressed. This results in either halos or increased jaggies in the picture.
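The edge effect described above can be made concrete with a toy sketch. This is my illustration, not anything GameTrailers does: real codecs quantize DCT blocks rather than averaging outright, but the smearing behavior at a hard edge is similar.

```python
import numpy as np

# A hard vertical edge: three black columns (0) against white (255).
frame = np.zeros((4, 8), dtype=float)
frame[:, 3:] = 255.0

# Crude stand-in for lossy coding: average each 2x2 block, then
# rebuild the frame by repeating the block values. The block that
# straddles the edge averages black and white together.
blocks = frame.reshape(2, 2, 4, 2).mean(axis=(1, 3))
recon = np.repeat(np.repeat(blocks, 2, axis=0), 2, axis=1)

print(recon[0])  # the hard 0-to-255 step has become a 127.5 grey band
```

The jaggy's hard step is gone, but in its place is a band of averaged grey along the edge - the halo.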

Ghosting would NOT be hidden by this type of compression. Ghosting is when one image gets duplicated in subsequent frames. If the object moves, then you see a "ghost" version of the object following it. If ghosting is truly present, then even compressed you should be able to see it clearly from frame to frame. Decimation might hide it, but then you would have to assume ghosting only happens on even or odd numbered frames.

The videos are not shown in 1080P, which again hides the jaggies.

The videos aren't recorded in 1080p either. Once again, the reference frame here would be to show rendered output resolution - not 1080p. Unless you allow consoles to do upscaling. That of course would then lead to subjective interpretations of the picture. The XBox's upscaler tends to make images "soft". Some people like that. Some people hate that. Sure, you get fewer jaggies, but you get less detail as well.

The videos are rendered at 29.97fps, while various versions of the games on the 360 run at a faster framerate (Armored Core 4, sports games).

This is the closest you came to an actual issue. However, higher frame rate does not always equal a better game. Any videophile or gamer can tell you it isn't what happens between 30 and 60 that really matters. It is what happens when the frame rate drops BELOW 30 that matters.

While I will give you that a video like this is a poor way to compare and contrast frame rate, I would also point out that a video at 60fps would ALSO be a poor way to compare and contrast framerate. Instead, most people who want to compare framerates show two things: a time-averaged framerate and a plot showing framerate vs. time.

What really irks me about you including this on your list is that the framerate on the video is high enough that you would see drops below 30fps from the PS3 or the XBox - meaning that it really doesn't benefit or hide anything for either console.

Really - while people are struggling to turn this into an "XBox 360" vs "PS3" issue it really has nothing to do with that. It is related to integrity in video game press. Your list of issues only serves to highlight that problem. These comparisons are NOT done right.

If you go to a computer review site like HardOCP or Anandtech, you can tell they go to great lengths to ensure their tests are proper and representative of what the public gets. If you go to an AV review site like Secrets of Home Theatre or AVS, you can tell the same. Even all of the good restaurant or movie critics I know of try to give even coverage from review to review - so that even if you disagree with their opinions, you can get a feeling for where you would stand based on what they say. Can't we expect the same out of video game sites?
 

I do believe you & I agree, but you're really off the point. PS3 games don't even take advantage of those. They're comparing graphics with the signal output via HDMI on both systems with full RGB "on". Some XBOX 360 users don't have HDMI, but the ones currently sold on the market have HDMI.

You will indeed have the true color on "both". :p


Why do you think people seek out super low response time LCD screens? To get rid of ghosting, obviously. The compression algorithm reduces the number of colors used in a group of frames. The ghost will be erased depending on the situation.

By the way, GameTrailers uses MPEG-4 video compression in a Windows Media container.


What do you think a scaler chip does? Soften the picture? :rolleyes: At a higher resolution, it will force the video to show more detail out of the textures if it's recorded at a higher bitrate with 2-pass encoding.



Gamers measure framerate & performance, and MORE FRAMES PER SECOND, held STABLE, is better. If you try to claim that 30 > 60fps, there is surely something wrong with that. Everyone prefers a better framerate on the exact same game.

Having a higher framerate will make frame drops more obvious on either console. If I point at the PS3, it is because there were more cases where the PS3 dropped frames in the current GameTrailers comparisons.

Contrast has nothing to do with framerate, only with the frame itself. Anyway, I have no idea what you are talking about...
 
I do believe you & I agree, but you're really off the point. PS3 games don't even take advantage of those. They're comparing graphics with the signal output via HDMI on both systems with full RGB "on". Some XBOX 360 users don't have HDMI, but the ones currently sold on the market have HDMI.

Yeah, and SOME PS3 users have televisions that support super white and full range color, and actually have the contrast and gamma set correctly on their television instead of the modified color palette used on the XBox. So we should use those settings, right?

The point is that if you are really testing equipment, you should go for best on both. Many of these cross console games by your own admission are using a different color palette. It isn't that there is more contrast or dynamic lighting - it is that for all intents and purposes they are set incorrectly on the XBox. If you use a properly calibrated PS3 you can replicate those colors.

You cannot claim you are being "fair" by making sure to default all of the PS3 settings while using a non-standard aspect of the XBox. It is definitely not a head to head comparison.

You will indeed have the true color on "both". :p

I assumed that since you proclaimed any videophile would see the XBox as being discriminated against, you would know that the PS3 videos are all run with full range color turned off and super white disabled.

This full range color is often referred to as true color. It is something the XBox does not support at all, so you cannot have it on both consoles. GameTrailers claimed that the full color setting on the PS3 was an "artificial enhancement" and refused to use it. The same holds true of super white. What this tends to do is to brighten the image - hence the "washed out" look you see on most PS3 titles on GameTrailers.

At this point, I also want you to review those two pictures you posted. You will notice that the contrast (meaning the difference between various color levels instead of the contrast setting on your television) is actually higher on the PS3 version. Look at the head of the statue. On the XBox version it is entirely black - leaving you unable to see it at all. On the PS3 version you can actually see the entire head. Hence contrast is greater in the PS3 version - not the XBox one as claimed.

The XBox one will probably look better to most people though - especially in a room with a lot of ambient light. The red level is set higher. You could achieve the same result on the PS3 by increasing the red level of your television set. You will have to lower contrast (or brightness - depends on which setting your television manufacturer uses for this) slightly to compensate and keep green and blue in balance. Most decent calibration DVDs come with the necessary patterns to do this.

Why do you think people seek out super low response time LCD screens? To get rid of ghosting, obviously. The compression algorithm reduces the number of colors used in a group of frames. The ghost will be erased depending on the situation.

You are totally off base here. Ghosting on an LCD screen is a function of the pixels not updating fast enough to keep up with the image. So the LCD screen doesn't quite fade before the next image is drawn. You can and do usually test ghosting with a 2 color black and white test pattern. You up the refresh rate of the pattern to a set rate and then flash it across the screen. If your LCD has a slow response time, you will see the very obvious ghost as a grey bar. The test pattern is still only 2 colors. Reducing the color space does absolutely nothing because ghosting is not a function of color space. It is a function of response time.

Even more to the point, this type of ghosting is machine independent. It is actually far more dependent on the FPS than it is the machine producing the picture. If the Xbox is refreshing at 60hz and the PS3 at 30hz, you would be far more likely to see this type of ghosting on the XBox. Unless you were recording the video with a camcorder, you also would not see this type of ghosting at all on a directly recorded video.

The only possible type of ghosting a console could produce directly would be if it was getting the frame information mixed up in rendering and re-rendering (or just re-outputting) part of the previous frame. I have never seen this in any game I have played on ANY console. This type of ghosting would be apparent in the video regardless of the compression used.
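The response-time mechanism can be modeled in a few lines. This is a deliberately simplified sketch of mine (real panels have per-transition response curves and overdrive), but it shows why ghosting depends on response time and not on color count:

```python
# Each refresh, an LCD pixel only moves part of the way toward its
# target value; whatever is left behind from the previous frame is
# the ghost. The console feeding the signal never enters into it.

def refresh(pixel, target, response):
    """One refresh step; response in (0, 1], where 1.0 = instant panel."""
    return pixel + response * (target - pixel)

# A white pixel (255) that should switch to black (0) this frame:
fast_panel = refresh(255.0, 0.0, 0.9)  # 25.5 left over - barely visible
slow_panel = refresh(255.0, 0.0, 0.3)  # 178.5 left over - obvious grey ghost
```

Note that the model only ever sees two colors, exactly like the two-color test pattern: reducing the color space changes nothing, while the response fraction changes everything.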


What do you think a scaler chip does? Soften the picture? :rolleyes: At a higher resolution, it will force the video to show more detail out of the textures if it's recorded at a higher bitrate with 2-pass encoding.

Let me explain in detail to you what an upscaler does. An upscaler takes a set of pixels - say the following:

....
.....
....

It attempts to increase the resolution - generally by adding rows and columns. Let us suppose you are trying to double them. You could just copy each row:

....
....
.....
.....
....
....

Now it is bigger - but the shape which was semi-circular is lost. Instead it looks far more oblong. So this is obviously a bad way to upscale. Instead, algorithms are used that average between pixels. Depending on the algorithm, it selects a different number of pixels from different areas around the pixel in question to determine the values of the pixels you are going to fill in. For example, the algorithm might do the following:

....
.....
......
......
.....
....

Now by averaging and changing some of the pixels it has made the shape much more circular. This would be a better approximation. Notice two things about upscaling:

1) YOU CANNOT EVER ADD DETAIL BY UPSCALING.

Repeat that to yourself over and over again. Upscaling adds pixels, not detail. All it can do is make the detail already present bigger. To show more detail you would have to entirely re-render the game at a higher resolution. The upscaler cannot sample textures. It cannot increase polygon count. It cannot redraw your character. ALL an upscaler does is sample pixels from the picture it is fed by the GPU and run them through an averaging algorithm to increase the number of rows and columns in the image.

2) This process of averaging tends to destroy hard lines.

If you have a thin 1 pt black line between a red and a blue background, when you upscale you generally get the red background, a row averaged between red and black, the black line, a row averaged between black and blue, and then the blue background. When this is done, you lose the "hard" transition between the red and the blue. So the picture gets "softer".

That is exactly what an upscaler does. Most upscalers make pictures softer. Once again, repeat to yourself over and over until you understand this:

An upscaler does not re-render the picture - ever.
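Both behaviors can be demonstrated in a few lines. This is my sketch, done in 1-D for brevity; hardware scalers work on 2-D frames with fancier kernels, but the principle is identical:

```python
import numpy as np

row = np.array([0.0, 0.0, 255.0, 255.0])  # a hard black-to-white edge

# Naive 2x upscale: copy every pixel. Bigger and blockier, same values.
nearest = np.repeat(row, 2)

# Interpolating 2x upscale: sample linearly between the neighbours.
x_new = np.linspace(0, len(row) - 1, 2 * len(row))
linear = np.interp(x_new, np.arange(len(row)), row)

# Every output value lies between existing input values: pixels were
# added, detail was not. The interpolated edge gains in-between grey
# values - the "softening".
```

After running this, `nearest` still contains only 0s and 255s, while `linear` contains intermediate greys across the step: the edge got softer, and no new information appeared in either case.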

Gamers measure framerate & performance, and MORE FRAMES PER SECOND, held STABLE, is better. If you try to claim that 30 > 60fps, there is surely something wrong with that. Everyone prefers a better framerate on the exact same game.

This is just plain patent nonsense. Did you know I hit 120 fps on Hellgate London once? It must be a GREAT game compared to the Xbox and PS3 right?

Like I said before, anyone who is serious about framerate wants to see two things. They want to see a time-averaged framerate and they want to see a graph of the framerate vs. time. I bet you any gamer you ask would prefer a rock solid 30 fps for 6 seconds to 60 fps for 3 seconds then 2 fps for 3 seconds, alternating. The second has an average of 31, which is higher than 30, so according to your logic it must be better, right?

Maximum framerate is worthless. It tells you absolutely nothing about the game. Average framerate without context is worthless. It might tell you which is better normally, but if the action bogs down every time you go to shoot someone it doesn't matter how good it is when you are looking at a wall. You require BOTH average and a time plot to show any decent comparison.
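The arithmetic behind that 30-vs-31 example is easy to check. A sketch of mine, using one fps sample per second:

```python
# Steady 30 fps for 6 seconds vs 60 fps for 3 s then 2 fps for 3 s.
steady = [30, 30, 30, 30, 30, 30]
spiky = [60, 60, 60, 2, 2, 2]

avg_steady = sum(steady) / len(steady)  # 30.0
avg_spiky = sum(spiky) / len(spiky)     # 31.0 - "wins" on average

min_steady = min(steady)  # 30 - always playable
min_spiky = min(spiky)    # 2  - a slideshow half the time
```

The single average crowns the wrong run; the fps-vs-time samples are what expose the dips.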

Contrast has nothing to do with framerate, only with the frame itself. Anyway, I have no idea what you are talking about...

Read the sentence again. Compare and contrast. What does that mean? Well, when you are asked to compare and contrast 2 books, generally people are saying you want to show not only what is similar, but what is unique about each.

Now, go back and read that sentence very very carefully. I think you'll get it this time.
 



1. Scalers are there to allow the picture to remain sharp at any given resolution. You're confusing scaling with anti-aliasing & DVD movie upscaling. It does not blur the frame nor add a frame processing effect; it just scales the frame to the given output. (I should have written that sentence in another paragraph.)

2. The other sentence asks what if they used 1080P in their encoding; again, this is about how far GameTrailers is from being biased toward the 360 in their encoding choices.

3. About the ghosting... Capcom games such as Dead Rising, Devil May Cry 4 (PS3), and Lost Planet (PS3). Those games produce ghosting through a processing effect used to get rid of jaggies, and it comes right from the console hardware output. It makes a super low response time HDTV meaningless.

A compression encoding algorithm can hide those errors, because the algorithm reduces the number of colors in sections where the colors are similar. With such a low bitrate & a poor encoding pass, it can hide the ghosting.

4. It's quite cool that you talk about framerate & all, but the current situation between the machines makes your reasoning quite questionable (reference: Armored Core 4 & Armored Core 4 Answer).

5. Whatever you reply, as long as you argue about framerate & the quality of the frames themselves in the past comparisons done at GameTrailers, so far the ports are better on the XBOX 360, and a higher quality video should show a much, much more obvious difference even with the best connection and the best calibration possible (warning, exaggeration: with the most expensive TV ever created on Earth, with top of the line contrast ratio & response time).
 
1. Scalers are there to allow the picture to remain sharp at any given resolution. You're confusing scaling with anti-aliasing & DVD movie upscaling. It does not blur the frame nor add a frame processing effect; it just scales the frame to the given output. (I should have written that sentence in another paragraph.)

You really don't know what you are talking about here. The scaler chip in the XBox is post-process - just like ANY scaler chip. You are confusing anti-aliasing, which is a render-time removal of jagged lines based on patch sampling, with upscaling. Upscaling is ALWAYS increasing the resolution of the picture by increasing the number of horizontal and vertical lines. There is no other definition.

No offense, but ANY time you make more lines out of fewer you blur edges. No matter how good your algorithm is. If you think otherwise, perhaps you can share this mystical beast of an algorithm you think can create something from nothing. This is really basic stuff.

2. The other sentence asks what if they used 1080P in their encoding; again, this is about how far GameTrailers is from being biased toward the 360 in their encoding choices.

This is nonsense though. There are two options:

A) Output the signal in its natively rendered resolution for both consoles. If it renders in 1080p, output it in 1080p. If it renders in 720p, output it in 720p. Doing this you are comparing console-produced graphics to console-produced graphics.

B) Upscale the output to some arbitrarily defined resolution because it has a higher number - i.e., have the XBox change the 720p image that it produces to 1080p via upscaling. The problem with this option is that it is not indicative of the console itself.

3. About the ghosting... Capcom games such as Dead Rising, Devil May Cry 4 (PS3), and Lost Planet (PS3). Those games produce ghosting through a processing effect used to get rid of jaggies, and it comes right from the console hardware output. It makes a super low response time HDTV meaningless.

I have never seen this type of ghosting. Regardless, it does nothing to invalidate my point. This type of ghosting will show up on a 30fps MPEG4. If it is in the signal, it is in the signal. Compression cannot go back and re-render the scenes to remove it. You claimed that this type of ghosting was hidden by the comparison videos, but it wouldn't be. Compression would not remove it. The whole LCD discussion was your red herring and had nothing to do with the original point.

The compression encoding algorithm can hide those errors, because the algorithm reduces the number of colors in sections where the colors are similar. With such a low bitrate & a poor encoding pass, it can hide the ghosting.

I think you are confused. First, the problems in DMC4 and Lost Planet are not ghosting. Most videophiles would call them "edge effects". When you try to artificially sharpen the edge of an image post processing you run the risk of averaging over colors between objects incorrectly and adding a "halo". This adds a blur to the edge of images. It is not ghosting at all.

Further, I have never seen anything in DMC4 on either console that resembles edge effects or ghosting. There is a motion blur engine on the PS3 that looks a bit odd in some areas (adding blur to both sides of a pillar for instance instead of blurring against the direction of motion) - but it all looks like processed effects.

Keep in mind that compression algorithms wouldn't hide these. They make them worse. Far far worse. When you try to average over the surrounding colors you don't get background like you are supposed to. Instead, you get averages filled with the blurred image. This can cause the image to become pixelated. It is a well known processing effect.
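A crude sketch of why averaging spreads an artefact rather than hiding it (this is toy block averaging, not a real codec): a single bright "halo" pixel sitting near an edge gets smeared into every pixel of its block.

```python
def block_average(img, bs=2):
    """Replace each bs x bs block with its average value - a caricature of lossy compression."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for by in range(0, h, bs):
        for bx in range(0, w, bs):
            block = [img[y][x] for y in range(by, by + bs)
                                for x in range(bx, bx + bs)]
            avg = sum(block) / len(block)
            for y in range(by, by + bs):
                for x in range(bx, bx + bs):
                    out[y][x] = avg
    return out

# 4x4 image: dark background (10) with one bright "halo" pixel (200).
img = [[10, 10, 10, 10],
       [10, 10, 200, 10],
       [10, 10, 10, 10],
       [10, 10, 10, 10]]
crushed = block_average(img)
# The halo's block averages to (10+200+10+10)/4 = 57.5, so the artefact
# now covers four pixels instead of one - worse, not hidden.
```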

4. It's quite cool you talk about framerate & all, but the current situation between each machine makes your reasoning quite questionable (Reference: Armored Core 4 & Armored Core 4 Answer).

No - my reasoning applies perfectly to those games. Once again - please read and understand this:

FPS by itself is meaningless. The ONLY thing we are interested in is where FPS drops low enough to affect performance.

Telling me that the Xbox can do a peak of 120487894351058170 fps on Pong 2008 HD is worthless. I don't care. No gamer or videophile does. I can give you programs that will refresh almost as fast as your cpu will run. Doesn't matter - it isn't useful information. What we care about are FPS spikes. There are games where these spikes can matter. You pointed one of them out. In Armored Core, at times when there are multiple explosions on the screen the PS3's fps will drop to single digits. This kills reaction times. The problem is that a video is not good for showing that. You MUST show graphs of fps vs. time to show the effect. That is what I have said from the beginning.
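Here is a minimal sketch of the kind of fps-vs-time analysis being asked for. The frame timestamps are made-up numbers, not real console data; the point is only that a drop below a playability floor is trivial to flag from a trace, while it is easy to miss in a compressed video.

```python
def fps_over_time(timestamps):
    """Instantaneous fps for each frame interval, from frame timestamps in seconds."""
    return [1.0 / (b - a) for a, b in zip(timestamps, timestamps[1:])]

def dips_below(fps, floor=20.0):
    """Indices where the framerate drops low enough to hurt reaction time."""
    return [i for i, f in enumerate(fps) if f < floor]

# Simulated trace: steady 60 fps, then an explosion-heavy stretch at ~8 fps.
stamps = [i / 60 for i in range(10)]
t = stamps[-1]
for _ in range(5):
    t += 1 / 8
    stamps.append(t)

fps = fps_over_time(stamps)
bad = dips_below(fps)      # frame indices where the slowdown occurred
```

Plot `fps` against time and the slowdown segment is unmistakable, which is exactly the comparison a side-by-side video cannot make rigorous.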

5. Whatever you reply, as long as you argue about framerate & the quality of the frames themselves, so far the ports are better on the XBOX 360 - which a higher quality video would show as a much, much more obvious difference, even with the best connection and the best calibration possible ----warning-->>>exaggeration>>>> with the most expensive TV on Earth ever created, with top of the line contrast ratio & response time.

This is your opinion at best though. I have played Armored Core 4 side by side on the XBox and the PS3. I like the PS3 version better. The same holds true with Oblivion. With Armored Core, it is probably the fact that the controllers fit my hands better allowing me better control. With Oblivion, I like the graphics better. On the other hand, I like the Madden version better on the XBox. It does not take performance hits when you pan the crowd during a play. You know what all of that is? My opinion.

Which leads us back to the site in question. The highest quality video available would STILL be a poor substitute for sitting down yourself and looking at them side by side. Many people can't afford that, so they rely on sites like GameTrailers who claims to be neutral.

The problem that people are pointing out is that their comparisons aren't neutral. It doesn't matter whether you think they are slanted against the XBox or the PS3. The very fact that you think they are slanted proves the point. Their whole methodology doesn't make a whole lot of sense. They have shown obvious bias in the past in regards to individual consoles. It is the fact that their site doesn't live up to its own claims of journalistic integrity that bothers me most about the situation - not that they messed up on some comparison video that probably shouldn't have been done in the first place.
 
You really don't know what you are talking about here. The scaler chip in the XBox is post-process - just like ANY scaler chip. You are confusing anti-aliasing, which is a render-time removal of jagged lines based on patch sampling, with upscaling. Upscaling is ALWAYS increasing the resolution of the picture by increasing the number of horizontal and vertical lines. There is no other definition.

No offense, but ANY time you make more lines out of fewer you blur edges, no matter how good your algorithm is. If you think otherwise, perhaps you can share this mystical beast of an algorithm you think can create something from nothing. This is really basic stuff.



This is nonsense though. There are two options:

A) Output the signal in its natively rendered resolution for both consoles. If it renders in 1080p, output it in 1080p. If it renders in 720p, output it in 720p. Doing this you are comparing console produced graphics to console produced graphics.

B) Upscale the output resolution to some arbitrarily defined resolution because it has a higher number. IE - have the XBox change the 720p image that it produces to 1080p via upscaling. The problem with this option is it is not indicative of the console itself.

It's either the HDTV or the console hardware that scales it. ;)
Why would it be more blurred?

If the video is encoded from the upscaled version, the video will retain more detail from the upscaled 1080P signal, so the viewer will probably notice more detail in the encoded video at 1080P.

I have never seen this type of ghosting. Regardless, it does nothing to invalidate my point. This type of ghosting will show up on a 30fps MPEG4. If it is in the signal, it is in the signal. Compression cannot go back and re-render the scenes to remove it. You claimed that this type of ghosting was hidden by the comparison videos, but it wouldn't be. Compression would not remove it. The whole LCD discussion was your red herring and had nothing to do with the original point.

I think you are confused. First, the problems in DMC4 and Lost Planet are not ghosting. Most videophiles would call them "edge effects". When you try to artificially sharpen the edge of an image post processing you run the risk of averaging over colors between objects incorrectly and adding a "halo". This adds a blur to the edge of images. It is not ghosting at all.

Further, I have never seen anything in DMC4 on either console that resembles edge effects or ghosting. There is a motion blur engine on the PS3 that looks a bit odd in some areas (adding blur to both sides of a pillar for instance instead of blurring against the direction of motion) - but it all looks like processed effects.

Keep in mind that compression algorithms wouldn't hide these. They make them worse. Far far worse. When you try to average over the surrounding colors you don't get background like you are supposed to. Instead, you get averages filled with the blurred image. This can cause the image to become pixelated. It is a well known processing effect.

If it's not ghosting, what is it then?
idcanceluz0.jpg

Maybe my eyes have trouble with the pic, but I'm pretty sure that effect looks similar to what is known as ghosting on HDTVs.

No - my reasoning applies perfectly to those games. Once again - please read and understand this:

FPS by itself is meaningless. The ONLY thing we are interested in is where FPS drops low enough to affect performance.

Telling me that the Xbox can do a peak of 120487894351058170 fps on Pong 2008 HD is worthless. I don't care. No gamer or videophile does. I can give you programs that will refresh almost as fast as your cpu will run. Doesn't matter - it isn't useful information. What we care about are FPS spikes. There are games where these spikes can matter. You pointed one of them out. In Armored Core, at times when there are multiple explosions on the screen the PS3's fps will drop to single digits. This kills reaction times. The problem is that a video is not good for showing that. You MUST show graphs of fps vs. time to show the effect. That is what I have said from the beginning.

Eurogamer claims Armored Core 4 spikes a lot on the PS3. Obviously, you have experienced more spikes than me, but that's not my point. What I'm pointing out is that if Gametrailers presented only 60fps video comparisons, the PS3 version of the game would probably look worse than what we currently know.

This is your opinion at best though. I have played Armored Core 4 side by side on the XBox and the PS3. I like the PS3 version better. The same holds true with Oblivion. With Armored Core, it is probably the fact that the controllers fit my hands better allowing me better control. With Oblivion, I like the graphics better. On the other hand, I like the Madden version better on the XBox. It does not take performance hits when you pan the crowd during a play. You know what all of that is? My opinion.

Which leads us back to the site in question. The highest quality video available would STILL be a poor substitute for sitting down yourself and looking at them side by side. Many people can't afford that, so they rely on sites like GameTrailers who claims to be neutral.

The problem that people are pointing out is that their comparisons aren't neutral. It doesn't matter whether you think they are slanted against the XBox or the PS3. The very fact that you think they are slanted proves the point. Their whole methodology doesn't make a whole lot of sense. They have shown obvious bias in the past in regards to individual consoles. It is the fact that their site doesn't live up to its own claims of journalistic integrity that bothers me most about the situation - not that they messed up on some comparison video that probably shouldn't have been done in the first place.

I've written in my post that they are passable as legit enough. Still, there's no point arguing about what Gametrailers is. They do not choose the viewer's preference for them, and we are still able to distinguish the differences between both systems to a certain extent & accuracy.
 
It's either the HDTV or the console hardware that scales it. ;)
Why would it be more blurred?

I've already explained it. Let me do so again. I will add bold and italics this time so you understand the relevant portion.

When you upscale an image you are adding in lines that are not there. It does not matter what device is doing the actual scaling, it has to "guess" at what goes there. This invariably means averaging pixels - which REMOVES sharp lines.

This is basic basic AV knowledge. The question is not whether or not it happens, it is how much it happens. Different algorithms produce different results but they all have to do some form of averaging to fill in the blanks. Seriously - if you think you know of an algorithm that can create accurate information that isn't there post it. You will make a bunch of forum programmers really really rich.

If the video is encoded from the upscaled version, the video will retain more detail from the upscaled 1080P signal, so the viewer will probably notice more detail in the encoded video at 1080P.

..........

At this point, I think I am going to leave this issue. Either you are trying to say something entirely different and just misusing the term, or a brush up on upscaling is really required. Once again, detail can ONLY be added when the picture is created. It can be removed at any point from then on in the processing - but it can only be added once. Upscaling does not add detail. It just stretches an existing picture to match the size of the screen.
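The "detail can only be added once" claim has a simple numeric demonstration (toy data, nearest-neighbour scaling, not any particular scaler): render fine detail, throw it away by downscaling, and no upscale can bring it back.

```python
def upscale_nn(row, factor):
    """Nearest-neighbour upscale: repeat each sample `factor` times."""
    return [v for v in row for _ in range(factor)]

def downscale_avg(row, factor):
    """Downscale by averaging each group of `factor` samples."""
    return [sum(row[i:i + factor]) / factor
            for i in range(0, len(row), factor)]

native_hi = [0, 255, 0, 255, 0, 255, 0, 255]   # fine alternating detail
lo = downscale_avg(native_hi, 2)               # render at half resolution
back_up = upscale_nn(lo, 2)                    # then upscale to full size
# lo is [127.5]*4: the alternation is gone, and no upscaling algorithm
# can restore it - the information simply is not in the signal anymore.
```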

If it's not ghosting, what is it then?

The first picture looks like a processed motion blur effect. The majority of the image is sharp and has not been repeated. It is combined with some rather pronounced aliasing in the texture being applied. However, it really doesn't look like ghosting at all.

The second picture has nothing at all that I can see out of the ordinary, other than some minor aliasing.

I did a quick search and found some images to help. This is an image of ghosting:

ghosting.jpg


Notice the arrows he has drawn on the picture. The human form is clearly visible as a "ghost" next to the original object. It really isn't blurry at all - rather it is the image being repeated over again in a location it is not supposed to be at. Here is another example:

rev-solarism17v2-ghost.jpg


Once again look at the image. It is not "blurry". Instead, there is a very clear outline of the player character right next to the character. Go back and compare that to your screenshots. The first one I could kinda see mistaking for ghosting, but it just doesn't fit the profile. Edge effects yes. A heavy blur filter that really wasn't done that well yes. I would even believe intentional resolution limiting of textures in the background. But it does not look like ghosting at all to me.

The second image has nothing even remotely resembling ghosting that I can see. I am not sure what you are indicating with it.

Eurogamer claims Armored Core 4 spikes a lot on the PS3. Obviously, you have experienced more spikes than me, but that's not my point. What I'm pointing out is that if Gametrailers presented only 60fps video comparisons, the PS3 version of the game would probably look worse than what we currently know.

The wonder of this conversation is that I don't need to read what Eurogamer says - I own the game and have played it through on both consoles. It is actually one of my favorite games. The thing is that I could show you footage from both copies at 60fps and you would see NO difference. I could show you footage from both games that make the PS3 look markedly superior. I could show you footage from both games that make the XBox markedly superior. It all depends on my agenda.

On the other hand, if I showed you a plot of the two consoles FPS side by side over the duration of a level, you would be able to pick out exactly where the slowdowns occurred. As I said initially, video comparisons regardless of speed are fairly useless for comparing fps.

I've written in my post that they are passable as legit enough. Still, there's no point arguing about what Gametrailers is. They do not choose the viewer's preference for them, and we are still able to distinguish the differences between both systems to a certain extent & accuracy.

That is the problem - they aren't really legit right now in my opinion. Their methodology is flawed from the get go. They choose the segments, they choose the settings. The segments they choose often leave questions as to why - on both consoles. Settings are also somewhat of a mystery. They have included incorrect information for both consoles in their reviews before. There has been no effort to reproduce intended game design on each console. Apparently they are even able to make very large mistakes in the editing process. I think they might actually want to be a decent site. They really need to tighten things up to be so in my opinion.
 
I've already explained it. Let me do so again. I will add bold and italics this time so you understand the relevant portion.

When you upscale an image you are adding in lines that are not there. It does not matter what device is doing the actual scaling, it has to "guess" at what goes there. This invariably means averaging pixels - which REMOVES sharp lines.

This is basic basic AV knowledge. The question is not whether or not it happens, it is how much it happens. Different algorithms produce different results but they all have to do some form of averaging to fill in the blanks. Seriously - if you think you know of an algorithm that can create accurate information that isn't there post it. You will make a bunch of forum programmers really really rich.

Signal upscaling does not add blur or anti-aliasing. Again, it will remain sharp. My previous posts are not edited at all - I am indeed aware that the screen will be aliased, with the lower original resolution steps visible, which is why I said it remains sharp, which is the goal of a scaler.

At this point, I think I am going to leave this issue. Either you are trying to say something entirely different and just misusing the term, or a brush up on upscaling is really required. Once again, detail can ONLY be added when the picture is created. It can be removed at any point from then on in the processing - but it can only be added once. Upscaling does not add detail. It just stretches an existing picture to match the size of the screen.

Obviously, if you encode a video with a bigger frame, you get more detail out of it, since the compression algorithm won't erase certain details if the frame size is bigger.

The first picture looks like a processed motion blur effect. The majority of the image is sharp and has not been repeated. It is combined with some rather pronounced aliasing in the texture being applied. However, it really doesn't look like ghosting at all.

The second picture has nothing at all that I can see out of the ordinary, other than some minor aliasing.

I did a quick search and found some images to help. This is an image of ghosting:

ghosting.jpg


Notice the arrows he has drawn on the picture. The human form is clearly visible as a "ghost" next to the original object. It really isn't blurry at all - rather it is the image being repeated over again in a location it is not supposed to be at. Here is another example:

rev-solarism17v2-ghost.jpg


Once again look at the image. It is not "blurry". Instead, there is a very clear outline of the player character right next to the character. Go back and compare that to your screenshots. The first one I could kinda see mistaking for ghosting, but it just doesn't fit the profile. Edge effects yes. A heavy blur filter that really wasn't done that well yes. I would even believe intentional resolution limiting of textures in the background. But it does not look like ghosting at all to me.

The second image has nothing even remotely resembling ghosting that I can see. I am not sure what you are indicating with it.

It's ghosting no matter how you try to make it out to be something else. That's the same effect you would get with a low response time LCD screen back in 2000.

Motion blur is a directional blur applied to the texture to make movement look more realistic. There are indeed multiple types of motion blur used in games, but what you see in the upper part of the Lost Planet shot is ghosting. The lower part of the same picture does have motion blur, which you can clearly see in the background as a directional blur on the textures of the 3D building model, with the windows & all.
idcanceluz0.jpg


The wonder of this conversation is that I don't need to read what Eurogamer says - I own the game and have played it through on both consoles. It is actually one of my favorite games. The thing is that I could show you footage from both copies at 60fps and you would see NO difference. I could show you footage from both games that make the PS3 look markedly superior. I could show you footage from both games that make the XBox markedly superior. It all depends on my agenda.

On the other hand, if I showed you a plot of the two consoles FPS side by side over the duration of a level, you would be able to pick out exactly where the slowdowns occurred. As I said initially, video comparisons regardless of speed are fairly useless for comparing fps.

Eurogamer uses this forum as a reference. If you feel they are biased, the whole forum is. :p


That is the problem - they aren't really legit right now in my opinion. Their methodology is flawed from the get go. They choose the segments, they choose the settings. The segments they choose often leave questions as to why - on both consoles. Settings are also somewhat of a mystery. They have included incorrect information for both consoles in their reviews before. There has been no effort to reproduce intended game design on each console. Apparently they are even able to make very large mistakes in the editing process. I think they might actually want to be a decent site. They really need to tighten things up to be so in my opinion.

I wonder where the laggy version of Armored Core 4 came from. All I know is that the game has been released only on the XBOX 360 & PS3.

Since I have an XBOX 360, I can claim that it is silky smooth, running at 60 frames per second, & Eurogamer got that result. I presumed maybe Gametrailers edited the PS3 version's video to show an unsteady framerate, but it would require insane editing skill to manage that kind of effect in Final Cut Pro and make the frame drops look realistic.

Eurogamer claims that the PS3 version has an unsteady framerate, so I presume the Gametrailers comparison captures are in fact legit.
 
Signal upscaling does not add blur or anti-aliasing. Again, it will remain sharp. My previous posts are not edited at all - I am indeed aware that the screen will be aliased, with the lower original resolution steps visible, which is why I said it remains sharp, which is the goal of a scaler.

Upscaling inserts interpolated values between samples. Interpolation is done with smooth curves, be it linear, quadratic or cubic, which means that it reduces the high frequencies in the image, which in turn means the image will be blurred to some extent. Blur is just another name for a low-pass filter.
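A quick numeric illustration of this low-pass point (toy data, linear interpolation only): 2x upsampling of the highest frequency the source can represent, an alternating signal. Every inserted sample lands at the midpoint of its neighbours, flattening the oscillation.

```python
def upsample2_linear(xs):
    """2x linear upsampling: keep each sample, insert the neighbour average after it."""
    out = []
    for a, b in zip(xs, xs[1:]):
        out.append(a)
        out.append((a + b) / 2)   # interpolated sample = average of neighbours
    out.append(xs[-1])
    return out

nyquist = [1, -1, 1, -1, 1, -1]   # fastest oscillation at the source rate
up = upsample2_linear(nyquist)
# up = [1, 0, -1, 0, 1, 0, -1, 0, 1, 0, -1]: the peaks survive, but every
# inserted sample is 0 - exactly the smoothing a low-pass filter applies.
```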
 
I think Mr. Deap is suggesting a straight pixel-copy upscale. If that's so, that isn't the goal of a scaler! A scaler has to compromise between sharpness of the individual source pixels and reduction of jaggies.
 
Upscaling inserts interpolated values between samples. Interpolation is done with smooth curves, be it linear, quadratic or cubic, which means that it reduces the high frequencies in the image, which in turn means the image will be blurred to some extent. Blur is just another name for a low-pass filter.

That's not true:
when an image of w x h is upscaled to 2w x 2h, the frequency response doesn't change with good interpolation.
Why would it? 2w x 2h is a higher sampling rate; it can hold more frequencies than w x h by Nyquist.

I think the confusion partially originates from the vague definition of blur.
Contrary to popular belief, proper upscaling does not remove information the way low-pass filtering does.
What people should say is that an upscaled image cannot hold as much information as a native image at that upscaled resolution.
That's not a result of the upscaling operation, but a result of the original lower resolution image not having the "capacity" to begin with.
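The Nyquist capacity point can be shown with a toy 1-D sampling experiment (made-up frequencies, stdlib only): a tone above half the sampling rate aliases to a lower one, while doubling the rate captures it correctly.

```python
import math

def sample(freq_hz, rate_hz, n):
    """Sample a sine of freq_hz at rate_hz for n samples."""
    return [math.sin(2 * math.pi * freq_hz * i / rate_hz) for i in range(n)]

# A 3 Hz tone sampled at 4 Hz (below Nyquist) vs at 8 Hz (above it).
slow = sample(3, 4, 8)
fast = sample(3, 8, 8)

# At a 4 Hz rate the 3 Hz tone is indistinguishable from a (sign-flipped)
# 1 Hz tone - the classic aliasing identity:
alias = [-s for s in sample(1, 4, 8)]
```

The lower-rate grid simply has no room for the 3 Hz component; the higher-rate grid does, which is the "capacity" being described.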
 
That's not true:
when an image of w x h is upscaled to 2w x 2h, the frequency response doesn't change with good interpolation.
Why would it? 2w x 2h is a higher sampling rate; it can hold more frequencies than w x h by Nyquist.

I can give you a simpler example of what he means. Take the following set of numbers:

4 8 4
8 16 8
4 8 4

Starting in the upper left corner as (1,1), this square is described by the equation 2^((cos(x*Pi)+cos(y*Pi))/2+3). The frequency for either rows or columns is 1. Now, I will assume you meant that the frequency across the rows remains the same, so in this case, 1 row is 1 cycle or you can claim the frequency is 3 pixels. In the expanded case we want the same to be true so all you need to do here is post a matrix that is 4x4 in which 1 row is 1 cycle that has the following form:

4 a 8 a 4
a b c b a
8 c 16 c 8
a b c b a
4 a 8 a 4

You can try a 6x6 matrix if you think that it will be easier to do. I think you will find a, b, and c fairly easily. Now scan the matrix from corner to corner and make sure that the frequency still matches. What I think you will find is that the distance between 4 and b and the distance between 16 and b cannot be equal while still maintaining the frequency across the abcba rows. Unfortunately, they are equal in the original.
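As a sanity check on the generating equation for that 3x3 grid (the expression in the post looks garbled, so the closed form below is my reconstruction, an assumption): it reproduces the values exactly with the upper-left corner at (1,1).

```python
import math

def grid_value(x, y):
    # Reconstructed generating function for the 4/8/16 grid (assumption).
    return 2 ** ((math.cos(x * math.pi) + math.cos(y * math.pi)) / 2 + 3)

# Evaluate on x, y in 1..3 and round away float fuzz.
grid = [[round(grid_value(x, y)) for x in range(1, 4)] for y in range(1, 4)]
# grid == [[4, 8, 4], [8, 16, 8], [4, 8, 4]]
```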

This is actually the real question behind scaling. As you increase the number of those "inbetween" pixels that cannot exactly be matched in all 3 directions (down, across, and slanted) it gets harder and harder to get an accurate representation. If you just do simplistic row and column averaging you tend to get "peaks" where there was a smooth transition. You can transform all 3 into frequency space using a fourier transform and take the average. That is better, but it changes the row and column frequency. Neither of those algorithms handle hard lines well.

I am not saying it can't be done - but I have never seen an algorithm that can handle both this type of pattern (that would be found commonly in shading a sphere for instance) and a straight hard edge well. Increase processing power and you can do mixes of them.

So, instead of trying to argue the process in detail, lets take a different approach. In honor of the original subject of this thread here is a picture called "The Grid":

The Grid

My request is simple. Use whatever algorithm you like to upscale the image from 1152x864 to 1900x1080. This is the equivalent of a 720p to 1080p upscale. This is a perfect opportunity to show off that algorithm we keep hearing about that can upscale an image without blurring sharp lines. Please note - I do not expect the upscaled image to look like the actual rendered 1900x1200 image. That would be silly. Like I have said all along, you cannot add detail when you upscale. I just want to see this upscaling that retains sharpness on all images that people keep saying exists.
 
I call for a thread fork since this has been really off topic.

I can give you a simpler example of what he means.
That doesn't really look like a simpler example of what he means. :)
Now, I will assume you meant that the frequency across the rows remains the same,
not really, I meant the frequency components in the frequency domain will not be lost as a result of some kind of low pass filtering.
so in this case, 1 row is 1 cycle or you can claim the frequency is 3 pixels.
That looks 2 pixels but I will ignore your example for now,
mainly because I'm not talking about odd upscaling (like 3p->5p, 640p->720p, or 720p->1080p) for some particular reason which I see no point discussing right now.
More importantly, the explanation given for the supposed blurring effect has no relation to the number of dimensions (even the "bi" prefix of the interpolation techniques has been omitted :) )
so let's figure out the 1D case for now.
My question to you is simple, do you think 1D upscaling has to "blur" the result, or more technically has a low pass filtering effect on the result?
My request is simple. Use whatever algorithm you like to upscale the image from 1152x864 to 1900x1080. This is the equivalent of a 720p to 1080p upscale. This is a perfect opportunity to show off that algorithm we keep hearing about that can upscale an image without blurring sharp lines.
Even with perverted notions of blurriness, sharpness, smoothness, 0th order polynomial interpolation (= no interpolation) will not "smooth out" the edges. I think Shifty was speculating that also. (Not that I think this is proper upscaling, but that answer covers all kinds of interpretations of "blur").
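The 0th-order (pixel-copy) case mentioned here is easy to verify with a toy row of pixels: nearest-neighbour upscaling only repeats samples, so a hard edge stays a hard edge - no in-between values appear, the step just gets wider.

```python
def upscale_nearest(row, factor=2):
    """0th-order upscale: each source pixel is simply repeated `factor` times."""
    return [v for v in row for _ in range(factor)]

edge = [0, 0, 255, 255]
scaled = upscale_nearest(edge)
# scaled == [0, 0, 0, 0, 255, 255, 255, 255]: still only two values,
# the transition is still a single step, just at double resolution.
```

This is the sense in which a pixel copy "remains sharp" - the trade-off is that the jagged stair-steps are preserved and enlarged along with the edge.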

Please note - I do not expect the upscaled image to look like the actual rendered 1900x1200 image. That would be silly. Like I have said all along, you cannot add detail when you upscale. I just want to see this upscaling that retains sharpness on all images that people keep saying exists.
Let's discuss the 1d case, then we can jump to 2d-proper and question "diagonal frequency" issue.
And for the record I don't really use terms like sharp, smooth etc.
 
not really, I meant the frequency components in the frequency domain will not be lost as a result of some kind of low pass filtering.

We are talking about upscaling though - not low pass filtering. Not only that, but I have a feeling I am talking about general fourier transforms used to map an image into the frequency domain by assigning each pixel a value in a grid and performing the transform while you are talking about a discrete fourier transform like you might get while running low pass filters on music (ie - a wavelength vs time transform).


That looks 2 pixels but I will ignore your example for now,
mainly because I'm not talking about odd upscaling (like 3p->5p, 640p->720p, or 720p->1080p) for some particular reason which I see no point discussing right now.

It is a 3x3 grid where each number represents a pixel. Hence it must be 3 pixels down, 3 pixels across and 9 total pixels. I am not sure where you get two.

Also, the feature I demonstrated holds regardless of the number of rows you expand it to. I said 4 to keep the math simple. You can do the 6x6 and see the exact same problem.

More importantly, the explanation given for the supposed blurring effect has no relation to the number of dimensions (even the "bi" prefix of the interpolation techniques has been omitted :) )
so let's figure out the 1D case for now.
My question to you is simple, do you think 1D upscaling has to "blur" the result, or more technically has a low pass filtering effect on the result?

Once again - not low pass filtering. Upscaling. We are trying to add rows. Let us take the 1d case for now. I'll use your own assumption about constant frequency in the frequency domain to show you the problem. Take a row of pixels where you can describe a blue color component by x^2 (to make the fourier transform easy for you). Here is the row with 4 values:

1 4 9 16

Now, you are claiming you can map that row to 8 values without changing the frequency w in the fourier transform. Let's just take the simplest possible interpolation, where we substitute x -> x/n, where n is the scale factor (here n is 2), to get discrete pixel values for pixels (1...8):

.25 1 2.25 4 6.25 9 12.25 16

A polynomial interpolation of this should return (x/2)^2 if done properly. The fourier transform of this is -1/2*sqrt(Pi/2)*D''[w]. Now, let's look at the frequency domain components to see if they are the same:

Code:
  w         f(w)                    f'(w)
  0    -sqrt(2*Pi)*D''[w]    -1/2*sqrt(Pi/2)*D''[w]
  1           0                       0
  2           0                       0

etc. The derivative of the delta function complicates things, but it isn't too bad. Remember that the derivative of the delta function follows the identity x^n*(d^n D(x)/dx^n) = (-1)^n*n!*D(x). So set the two sides equal, multiply by w^2, apply the identity, and you end up with
-2*sqrt(2*Pi)*D[w] = -sqrt(Pi/2)*D[w].

Solve for w and you will find that they are only equal for w != 0 - that is, everywhere that they are both zero.

You may take issue with the function I used to interpolate. That is fine. Choose your own, do the fourier transform of the resulting function for the row, and post the results. I am absolutely certain that no matter what formula you come up with I can give it a situation where the frequency domain components change.
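A quick numeric companion to this argument, using the toy rows from the post and a plain textbook DFT (my own check, not part of the original posts): the original 4-sample row and the 8-sample interpolated row live on different frequency grids, so their frequency-domain descriptions cannot coincide bin for bin.

```python
import cmath

def dft(xs):
    """Plain O(n^2) discrete Fourier transform."""
    n = len(xs)
    return [sum(x * cmath.exp(-2j * cmath.pi * k * i / n)
                for i, x in enumerate(xs)) for k in range(n)]

orig = [1, 4, 9, 16]
interp = [0.25, 1, 2.25, 4, 6.25, 9, 12.25, 16]

spec_a = [abs(c) for c in dft(orig)]      # 4 frequency bins
spec_b = [abs(c) for c in dft(interp)]    # 8 frequency bins
# Different bin counts, different DC terms (30 vs 51): the remap to the
# longer row necessarily changes the frequency-domain representation.
```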

Even with perverted notions of blurriness, sharpness, smoothness, 0th order polynomial interpolation (= no interpolation) will not "smooth out" the edges. I think Shifty was speculating that also. (Not that I think this is proper upscaling, but that answer covers all kinds of interpretations of "blur").

Now we can really get down to the details of filters vs upscaling. Let me show you what you are trying to say, and then let me show you why it doesn't work.

Let us start with a set of points similar to our previous set:
{1,4,9,16}

Do a simple polynomial interpolation. You get out x^2. I've listed the fourier transform for this above.

Now, as long as you stay in this coordinate space (x=0..3) you are fine. For example, the value between 0 and 1 would be .25. As a matter of fact, you will find out that the numbers in between are exactly what I posted above as the inbetween values for my simplest possible interpolation. However, there is no pixel .5!

You HAVE to map one coordinate space onto another to draw these. One way to do that mapping is to use the function I gave above where you let x -> x/n where n is the number of points you want to map. This gives you discrete integral values for each of the pixels. Someone performing a basic interpolation on the new points should then return f(x/n) if they do the interpolation right. For our example, that function would be (x/2)^2.

The Fourier transform for the function (x/2)^2 is done above. If someone comes along and performs a polynomial interpolation on the new pixels you have given them, then they will get different values for the frequency components after they interpolate and do a Fourier transform.

Because the Fourier transform is linear I can give a general case for a polynomial mapping. The Fourier transform of a polynomial of degree l in 1D remapped with x -> x/n is given by the following:

s2p*D[w] - i*(s2p/n)*D'[w] - (s2p/n^2)*D''[w] + i*(s2p/n^3)*D'''[w] - ... + (-i)^l*(s2p/n^l)*D^(l)[w]

Here s2p is the square root of 2*Pi and D[w] is the Dirac delta function as mentioned previously. D^(k)[w] represents the kth derivative of the delta function. So indeed, in the general case for polynomial filtering, remapping the coordinate system results in a different frequency space. So why is this result different from filtering?

Filtering does not remap the coordinate domain

A low pass filter will indeed never change the frequency domain values. Generally, a low pass filter performs a discrete Fourier transform and then removes the frequency components that you do not desire, then changes the resulting data back. Everything is done in the same frequency/position domain so the values remain constant. However, upscaling is NOT a low pass filter. Upscaling is the process of changing from an n-component space to an m-component space. Fourier transforms are the process of changing from position space to frequency space. Remapping the position space had better remap the frequency space.
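The same-domain vs. changed-domain point can be made concrete in a few lines of numpy (signal and cutoff chosen arbitrarily for the sketch):

```python
import numpy as np

row = np.sin(np.arange(16.0))        # an arbitrary 16-sample signal

# Low-pass filtering stays in the same 16-component space: transform,
# zero the unwanted bins, transform back.
spec = np.fft.fft(row)
spec[5:-4] = 0.0                     # crude, purely illustrative cutoff
lowpassed = np.fft.ifft(spec).real
print(lowpassed.shape)               # still (16,): same domain

# Upscaling maps the 16-component space onto a 32-component one.
upscaled = np.interp(np.arange(32) / 2.0, np.arange(16), row)
print(upscaled.shape)                # (32,): the domain itself changed
```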

Now, let's talk about what started this whole issue. I said a picture is made "softer" by upscaling. Let me say that I agree with you entirely that the definition of such a term is too subjective for use in this type of conversation. I tried to clarify that before, though. To demonstrate what I meant, I gave the illustration of averaging between two backgrounds. So let's start there with this 1D example. Once again, let us take every value as a representation of a blue color.

{1 1 1 1 4 2 2 2 2}

Now, how do you want to upscale? Polynomial interpolation, perhaps? Fair enough, here is an 8th order polynomial mapping for the above set of numbers:

323 - 845.199x + 868.038x^2 - 464.906x^3 + 144.032x^4 - 26.7208*x^5 + 2.925x^6 - 0.174107x^7 + 0.00434x^8

So, let's fill out the array using numbers generated from this interpolation:

{1, -4.062, 1 , 2.43, 1, 2.88321, 4, 3.48135, 2, 1.17422, 2, 3.31729, 2}

That doesn't look very good does it? It totally destroyed the integrity of the picture. Maybe I sampled too many pixels? Ok, let us just interpolate using nearest neighbors and a straight line.

{1, 1, 1, 1, 1, 1, 1, 2.5, 4, 3, 2, 2, 2, 2, 2, 2}

That is better, but look at what it did to the line in the middle. The line is no longer "hard" like it was in the first picture. Line doubling actually does a better job here:

{1, 1, 1, 1, 1, 1, 1, 1, 4, 4, 2, 2, 2, 2, 2, 2, 2, 2}

Now you still have a hard line. What happens if you have a circle though? I actually have an ASCII art representation in one of my posts above. It adds "steps". Now, we can ignore the terms smooth, soft, hard, etc. I agree with you that they aren't actually all that helpful. However, I have shown both that remapping the coordinate space remaps the frequency space and that interpolation alters the edges of lines in 1D. If you can accept that, it generalizes to 2D in a straightforward fashion. We can go through that as well if you like.
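To put a number on "hard," you can measure the largest jump between adjacent samples in each version of the row. A small numpy sketch of the 1D example above (the 17- and 18-sample lengths are just what the two methods naturally produce):

```python
import numpy as np

row = np.array([1, 1, 1, 1, 4, 2, 2, 2, 2], dtype=float)

# Piecewise-linear upscale to half-pixel positions.
linear = np.interp(np.arange(17) / 2.0, np.arange(9), row)

# Line doubling: every sample repeated.
doubled = np.repeat(row, 2)

# "Hardness" here = largest jump between adjacent samples, an informal
# stand-in for the edge sharpness discussed above.
print(np.max(np.abs(np.diff(linear))))   # 1.5 -- the 1->4 step got halved
print(np.max(np.abs(np.diff(doubled))))  # 3.0 -- the full step survives
```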
 
We are talking about upscaling though - not low pass filtering.
I was referring to this (for which you gave "better example"):
Blur is just another name for low-pass filter.
This is technically reasonable definition of blurring, and since we investigate blurring under upscaling the question is really whether there is something like a low pass filtering effect going on or not.
If you are not discussing this, why did you reply to me?
(But I believe you were arguing that without giving that effect a concrete name. It doesn't matter anyway; just let me know what smooth, blur, etc. mean in your previous posts.)
Not only that, but I have a feeling I am talking about general Fourier transforms used to map an image into the frequency domain by assigning each pixel a value in a grid and performing the transform, while you are talking about a discrete Fourier transform like you might get while running low pass filters on music (i.e., a wavelength vs. time transform).
I'm not really talking about transforms or techniques (not that there is a single Fourier transform definition). A transform is a tool; it has no meaning by itself.
I'm talking about the frequency domain (for which FT is a tool) you capture with sampling (rasterization). And a specific frequency domain at that.

I guess if you want to talk about transforms, yes it's similar to a normalized discrete transform for sound waves you would use for real life frequency operations or analysis.

The point is there is a fixed width on screen of your display analogous to time, whether you use 720p or 1440p. And yes normalization to real spatial frequency is the key.
Since it's about human eye perception, it's only logical to talk about real spatial frequencies.
That is, the domain I'm talking about is between -fs/2 and fs/2, where fs=width/720 or width/1440
Of course the actual value of width is irrelevant.

Now, I don't really like writing much, so I really wish we'd establish some common ground and understand what each other is claiming before throwing out examples.
It is a 3x3 grid where each number represents a pixel. Hence it must be 3 pixels down, 3 pixels across and 9 total pixels. I am not sure where you get two.
I'm totally lost here by what you mean by frequency. You talk about Fourier transform a lot yet you claim 3 samples mean a frequency of 3? Which "general Fourier transform" used on images give you a (non-zero) frequency of 3? Is it sampling frequency of some kind?
Once again - not low pass filtering. Upscaling.
:)
We are trying to add rows. Let us take the 1d case for now. I'll use your own assumption about constant frequency in the frequency domain to show you the problem.
Until we agree on what "constant frequency" means, or even what frequency means, there is really no point discussing this.

Let me try to show you what I mean and why your examples have no relevance (thanks for the effort though).

What is the highest single frequency you can have at a sampling rate of fs?
fs/2

Let's say I have fs samples across 1 unit of width (or anything else is fine).

And my low resolution image is sin(pi*n) where n is the pixel index.
The main (dominant) frequency component of that image is fs/2.

Let's upscale this to 2*fs pixels with simplest interpolation (0th order).
That is sin(pi * floor(n/2)) where n is from 1 to 2*fs
If it's not apparent immediately you can check the response and see that the dominant frequency of upscaled image is still fs/2, plus "noise".

Now lets do slightly better interpolation: linear (simplest non-trivial interpolation ;) )
The upscaled image is a zigzag pattern: sin(pi*n/2) for n from 1 to 2*fs
And guess what the frequency is? :)
Obviously same is true for both lower frequencies and for any higher order interpolation.
Now where is the "blurring" that I keep hearing?
The highest frequency in the original image is clearly "preserved".
And again if you'd missed, my jump on the topic was based on specific (and reasonable) definition of blurring.

A low pass filter will indeed never change the frequency domain values. Generally, a low pass filter performs a discrete Fourier transform and then removes the frequency components that you do not desire, then changes the resulting data back.

You can apply a low pass filter in the spatial domain or the frequency domain (or a hybrid, whatever), but the filter does not do any transformation. A filter is not a computer program, dedicated hardware, or a mathematical function.
Nor a transform(ation) is anything of the value you seem to be crediting it.
:|

This is really depressing. When I say things like "low pass filtering" effect, I'm not talking about any transformation (or even filtering), I'm talking about the signal losing its higher frequencies for some reason.

A signal is a signal, you know; wherever you choose to represent it, it will still have frequency components. Transforms are irrelevant.
 
I was referring to this (for which you gave "better example"):

Actually, reread the post you questioned:

Upscale inserts interpolated values between samples. Interpolation is made by smooth curves be it linear, quadratic or cubic, which means that it reduces the high frequencies in the image, which in turn means that image will be blurred to some extent. Blur is just another name for low-pass filter.

Now, I was giving you a better example as to WHY an upscaled image can have frequencies changed. When he said "Blur is just another name for a low-pass filter" I read it as him saying "Upscaling introduces x effect. Do not confuse x effect with the term Blur which is done using a low-pass filter". You obviously read it differently.

However, I was giving a better example as to how upscaling can introduce frequency shifts in a picture. I have always been referring to the averaging of sharp lines done by upscaling that reduces the difference between them.

I'm not really talking about transforms or techniques (Not that there is a constant Fourier Transform definition). A transform is a tool, has no meaning.
I'm talking about the frequency domain (for which FT is a tool) you capture with sampling (rasterization). And a specific frequency domain at that.

First, a Fourier transform is a mathematical construct. Not only is there a very definite and constant definition, but it is so well defined as to leave NO grey area whatsoever.

Second, a transform does have a definite meaning. A very clear, very precise meaning. Understanding a Fourier transform is VERY important here, because without it you won't understand the problems in your next example. Because of that, I am going to give you a short rundown on the meaning behind it.

In the early 1800s, scientists and mathematicians were using series to express complicated functions. While used by others before him, Fourier started using series expansions of sin and cos functions to express new series. Basically, he postulated that you could represent ANY function with just a sum of sin and cos functions. Now, if you take that sum of sin and cos functions and transform it into an integral, you get an expression for what each of the coefficients in the series would be. It was quickly realized you could take this one step further, though. Because sin and cos functions are periodic, you can represent them with individual frequencies instead of with the standard position variable. In other words, you can expand a function f(x) in a series of functions g(f) by taking the sum in a specific way. This IS the Fourier transform.

When describing this, we say that the Fourier transform translates a function in the position domain x to a function in the frequency domain f. This defines frequency domain! Let us look at an example. Take the function sin(x). The Fourier transform for this function is:

i*Sqrt[Pi/2] DiracDelta[-1+w]-i*Sqrt[Pi/2] DiracDelta[1+w]

Notice that this has two discrete values, at -1 and 1. These correspond to the negative and positive frequency components of the sine wave. So in the frequency domain, this function has 2 and only 2 values. So why is this useful? For a more general example let's take sin(x)+.2*sin(3x). That function has the Fourier transform:

0.250663 I DiracDelta[-3+w]+I Sqrt[\[Pi]/2] DiracDelta[-1+w]-I Sqrt[\[Pi]/2] DiracDelta[1+w]-0.250663 I DiracDelta[3+w]

Notice that there are now 4 distinct values. But what if the .2*sin(3*x) is just noise? In the Fourier space it is very easy to see all parts of the function that are related to each wave! So you can subtract the noise function without needing to worry about damaging your original signal. That might seem pointless with just these 2 sin functions. You could easily fit the data and just subtract the second sin function. What happens if you have 200 of them though? Well, you can sample the data as it comes in and perform a discrete Fourier transform. You can then remove all of the high frequency values and transform the data back. Guess what? You just made a low pass filter!
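Here is that two-sine example done numerically, as a minimal sketch (256 samples over one period; the cutoff bin is chosen by hand, and dropping the high-frequency bin is what strips the 0.2*sin(3x) term):

```python
import numpy as np

# Sample sin(x) + 0.2*sin(3x) over one period.
x = 2 * np.pi * np.arange(256) / 256
data = np.sin(x) + 0.2 * np.sin(3 * x)

# Forward DFT, drop everything above the 1-cycle bin, inverse DFT.
spectrum = np.fft.rfft(data)
spectrum[2:] = 0.0
recovered = np.fft.irfft(spectrum, n=256)

# recovered is sin(x) again, with the 3-cycle "noise" stripped out.
print(np.allclose(recovered, np.sin(x)))   # True
```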

So the meaning of the Fourier transform is a transformation that separates out every unique frequency in a function and displays it as a unique value. That meaning is very important. Keep it in mind for later.

I guess if you want to talk about transforms, yes it's similar to a normalized discrete transform for sound waves you would use for real life frequency operations or analysis.

Not similar. There IS only one Fourier transform. It is NOT vague at all. It has a definite meaning and a definite purpose. Sound waves are the first thing that comes to mind for most people because they are waves. It is very easy to model them as sums of sin and cos functions. So it is natural to use a Fourier transform to separate out frequencies. Also, don't try and throw the normalization red herring in here. The Fourier transform is only normalized to make it unitary. You can still use the non-unitary transforms to do anything we are talking about here. If you want more detail on when unitary is important and why, I am more than happy to give it to you.

I'm totally lost here by what you mean by frequency. You talk about Fourier transform a lot yet you claim 3 samples mean a frequency of 3? Which "general Fourier transform" used on images give you a (non-zero) frequency of 3? Is it sampling frequency of some kind?

Read above for an explanation of what a Fourier transform is. The Fourier transform is an integral transform. That means if you do not have the generating function, you cannot do it in closed form. To counter this, the discrete Fourier transform was developed. This is the SAME transform - it is just adapted to dealing with a set of data points. If you perform a discrete Fourier transform on the row of 3 data points that I gave, you get the following in frequency space:

{4.6188,-0.57735+i,-0.57735-i}

Notice there are 3 frequencies in the discrete Fourier transform. I hope that is general and non-zero enough for you.
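For what it's worth, the three-in, three-out behaviour is trivial to verify with numpy (the row values below are made up, since the original 3x3 grid isn't reproduced here, and numpy's default DFT normalization differs from the unitary one quoted above):

```python
import numpy as np

row = np.array([2.0, 3.0, 4.0])     # illustrative 3-sample row
bins = np.fft.fft(row)

# Three samples in, three complex frequency components out.
print(bins.shape)   # (3,)
```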

Until we agree on what "constant frequency" means, or even what frequency means, there is really no point discussing this.

These have very precise mathematical definitions. There is no gray area here...

What is the highest single frequency you can have at a sampling rate of fs?
fs/2

Let's say I have 1 unit width of fs many samples (or anything else is fine).

And my low resolution image is sin(pi*n) where n is the pixel index.
The main (dominant) frequency component of that image is fs/2.

Let's upscale this to 2*fs pixels with simplest interpolation (0th order).
That is sin(pi * floor(n/2)) where n is from 1 to 2*fs
If it's not apparent immediately you can check the response and see that the dominant frequency of upscaled image is still fs/2, plus "noise".

Now lets do slightly better interpolation: linear (simplest non-trivial interpolation ;) )
The upscaled image is a zigzag pattern: sin(pi*n/2) for n from 1 to 2*fs
And guess what the frequency is? :)
Obviously same is true for both lower frequencies and for any higher order interpolation.

Once again, I've already gone over this in my previous post. However, let us try again. Instead of the periodic function you have chosen, choose the function n^2 for your row values. Now, what is the frequency?

Once again, to get this you MUST perform a Fourier transform. The Fourier transform of n^2 is

-Sqrt[2 Pi] DiracDelta''[w]

This ONLY has a value for w=0. Now, do your simple linear interpolation. That results in the function (n/2)^2. Do the Fourier transform for this function. The transform is:

-((Sqrt[2 Pi] DiracDelta''[w])/n^2)

This also only has one value, for w=0. Are those two values the same? Of course not. One is smaller by a factor of n^2.

So, why does your example seem to work and mine doesn't? It is simple. Go back up and reread what the Fourier transform is. You chose a periodic function. We can see the effect if we take the Fourier transform of sin(n) and sin(n/2). Here they are for reference:

i*Sqrt[pi/2] DiracDelta[-1+w]-i*Sqrt[pi/2] DiracDelta[1+w]
i*Sqrt[pi/2] DiracDelta[-(1/2) + w] - i*Sqrt[pi/2] DiracDelta[1/2 + w]

Notice what happened. The location shifted but the value remained the same! This is exactly what you would expect from the sin function. It is periodic. You would not expect the component's value to change when you stretch the position axis. The problem is non-periodic functions.

Now, to some extent, you can represent any function as a series of sin and cos functions - so you can suppress the amount of shifting the frequencies do in the frequency domain. However, that is computationally expensive. Eventually there must be a cutoff. As soon as you reach that cutoff, you are into the n^2 type of function and not the sin(Pi*n).

Can you now see where there is a clear shift in frequency value?
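Numerically, the periodic case behaves exactly as described: stretch the position axis by 2 and the spectral peak moves to half the bin while keeping its magnitude. A numpy sketch (the grid size and bin numbers are arbitrary):

```python
import numpy as np

n = np.arange(128)
f      = np.sin(2 * np.pi * 8 * n / 128)   # stand-in for sin(x)
f_half = np.sin(2 * np.pi * 4 * n / 128)   # stand-in for sin(x/2)

spec      = np.abs(np.fft.rfft(f))
spec_half = np.abs(np.fft.rfft(f_half))

# The peak location halves; the peak value is unchanged.
print(int(np.argmax(spec)), int(np.argmax(spec_half)))   # 8 4
```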


You can apply a low pass filter on spatial domain or frequency domain
This is correct.
but the filter does not do any transformation.

This is heavily dependent on the filter. You will find that most filters like Blur in Photoshop do indeed shift the pixel domain into the frequency domain using discrete Fourier transforms and then subtract or average individual frequencies.

A filter is not a computer program, dedicated hardware or mathematical function.

This is just wrong. The filters we are talking about are always just a mathematical function applied to a series of numbers. Generally that is done in dedicated hardware or in a computer program.


Nor a transform(ation) is anything of the value you seem to be crediting it.

This is just you misunderstanding how these filters are done. Once again, look up the implementation of most filters. You will find they use some form of transform and then add or subtract individual elements in the transform. Fourier transforms are used very very frequently because most things in nature can be represented at least approximately with harmonic functions.

A signal is a signal you know, wherever you choose to represent it, it will still have frequency components. Transforms are irrelevant.

Once again, it WILL always have frequency components. To get those frequency components, you use a Fourier transform. I can give you other transforms if you want other components - but as long as we are talking about frequency Fourier transforms are not only relevant, but MUST be the basis for ANY discussion.
 
Actually, reread the post you questioned:
Now, I was giving you a better example as to WHY an upscaled image can have frequencies changed. When he said "Blur is just another name for a low-pass filter" I read it as him saying "Upscaling introduces x effect. Do not confuse x effect with the term Blur which is done using a low-pass filter". You obviously read it differently.
You obviously read it wrong, but never mind.
Regarding his comment that you gave example for:
Do you think upscaling "reduces the high frequencies in the image" (more than lower ones) or not?
Yes or no? simple question.
However, I was giving a better example as to how upscaling can introduce frequency shifts in a picture.
Please stop giving examples for the time being.
I have always been referring to the averaging of sharp lines done by upscaling that reduces the difference between them.
You have (say) twice more pixels (in one dimension), while the rate of change decreases per pixel, it doesn't "change" per distance. What is your point?
I'm not in the habit of reading off-topic, obvious, or inaccurate walls of text; please don't expect me to do any data mining for you.
First, a Fourier transform is a mathematical construct. Not only is there a very definite and constant definition,
Is it based on angular or oscillatory frequency? :)
but it is so well defined as to leave NO grey area whatsoever.
Indeed, within its particular definition. So?
Second, a transform does have a definite meaning. A very clear, very precise meaning. Understanding a Fourier transform is VERY important here, because without it you wont understand the problems in your next example. Because of that, I am going to give you a short run down on the meaning behind it.
I seriously have no idea how to respond to such remarks without insulting.
Silence is a virtue I guess.
But for the record, don't bother explaining again.
In around 1800s, scientists and mathematicians were using series to express complicated functions. While used by others before him, Fourier started using series expansions of sin and cos functions to express new series. Basically, he postulated that you could represent ANY function with just a sum of sin and cos functions.
He was obviously wrong.
Now, if you take that sum of sin and cos functions and transform it into an integral, you get an expression for what each of the coefficients in the series would be. It was quickly realized you could take this one step further, though. Because sin and cos functions are periodic, you can represent them with individual frequencies instead of with the standard position variable. In other words, you can expand a function f(x) in a series of functions g(f) by taking the sum in a specific way. This IS the Fourier transform.

When describing this, we say that the Fourier transform translates a function in the position
Who are you? (My curiosity is related to the topic though it may not seem so)
domain x to a function in the frequency domain f. This defines frequency domain! Let us look at an example. Take the function sin(x). The Fourier transform for this function is:

i*Sqrt[Pi/2] DiracDelta[-1+w]-i*Sqrt[Pi/2] DiracDelta[1+w]
This is one Fourier transform of sin(x) (I'm not even going into generalized Fourier transforms).
Notice that this has two discrete values, at -1 and 1. These correspond to the negative and positive frequency components of the sine wave. So in the frequency domain, this function has 2 and only 2 values. So why is this useful? For a more general example let's take sin(x)+.2*sin(3x). That function has the Fourier transform:

0.250663 I DiracDelta[-3+w]+I Sqrt[\[Pi]/2] DiracDelta[-1+w]-I Sqrt[\[Pi]/2] DiracDelta[1+w]-0.250663 I DiracDelta[3+w]
Those look like homework questions, I wonder why.
Notice that there are now 4 distinct values. But what if the .2*sin(3*x) is just noise?
In the Fourier space it is very easy to see all parts of the function that are related to each wave!
Shocking revelation
So you can subtract the noise function without needing to worry about damaging your original signal. That might seem pointless with just these 2 sin functions. You could easily fit the data and just subtract the second sin function. What happens if you have 200 of them though? Well, you can sample the data as it comes in and perform a discrete Fourier transform. You can then remove all of the high frequency values and transform the data back. Guess what? You just made a low pass filter!
Got it, low pass filtering=noise removal from 200 sin waves
So the meaning of the Fourier transform is a transformation that separates out every unique frequency in a function and displays it as a unique value. That meaning is very important. Keep it in mind for later.
Don't worry, I'm taking notes.
Not similar. There IS only one Fourier transform. It is NOT vague at all.
What is vague is your understanding of Fourier transform.
I can define infinitely many Fourier transforms using the same orthogonal kernel, but there happen to be two commonly used ones.
You are still missing my point, when I say transforms are unimportant, I mean different representations of a signal exists independent of the transformation you would use.
It has a definite meaning and a definite purpose. Sound waves are the first thing that comes to mind for most people because they are waves. It is very easy to model them as sums of sin and cos functions. So it is natural to use a Fourier transform to separate out frequencies.
Dear sir, is it okay to use Cosine transform instead?
Also, don't try and throw the normalized red herring in here.
:)
I was talking about sampling rate to scale to actual real life frequencies as opposed to keeping "useless" DFT frequencies.
The Fourier transform is only normalized to make it unitary. You can still use the non-unitary transforms to do anything we are talking about here.
Indeed
If you want more detail on when unitary is important and why, I am more than happy to give it to you.
Please do so, since you don't seem to mind writing.
Read above for an explanation of what a Fourier transform is. The Fourier transform is an integral transform. That means if you do not have the generating function, you cannot do it in closed form. To counter this, the discrete Fourier transform was developed. This is the SAME transform - it is just adapted to dealing with a set of data points. If you perform a discrete Fourier transform on the row of 3 data points that I gave, you get the following in frequency space:

{4.6188,-0.57735+i,-0.57735-i}
Notice there are 3 frequencies in the discrete Fourier transform.
Another shocking revelation (for the number of frequencies).
For the values, yeah, that's one textbook DFT result though I doubt anyone is actually using that in real life. Not that I care.
I hope that is general and non-zero enough for you.
It sure is not. Recall you said
you can claim the frequency is 3 pixels
Again I'm asking where do you get the frequency of 3?
The transform you posted reveals that the dominant frequency is 0 (not surprising since all samples are positive), but where is that frequency 3?
These have very precise mathematical definitions.
Only to one whose knowledge is limited to an intro chapter in some book.
Once again, I've already gone over this in my previous post. However, let us try again. Instead of the periodic function you have chosen, choose the function n^2 for your row values. Now, what is the frequency?
Once again, to get this you MUST perform a Fourier transform. The Fourier transform of n^2 is
-Sqrt[2 Pi] DiracDelta''[w]
What the hell? You think a polynomial function is of a single frequency and a constant at that?
This ONLY has a value for w=0.
In your dreams maybe.
Forgive me; once again I will choose to ignore your interpolation example until you explain how you match a polynomial function to a constant one.
And while you are at it, clarify why you have Dirac's delta in the FT of a discrete function.
Otherwise I'm gonna start speculating that you work on an infinite sample size, and thus the constant frequency destroys everything else when normalized. What is the point of having an unstable infinite signal?
Now, do your simple linear interpolation. That results in the function (n/2)^2.
That is not linear interpolation. That's a full sample polynomial interpolation of order 2, which no one in his right mind would use with any number of samples higher than 2.
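The distinction is easy to check: piecewise-linear interpolation of integer samples of x^2 does not reproduce (x/2)^2 at the new half-step positions. A small numpy sketch (sample positions made up for the illustration):

```python
import numpy as np

x = np.arange(5.0)
samples = x ** 2                          # 0, 1, 4, 9, 16

m = np.arange(9)                          # twice-dense index
linear = np.interp(m / 2.0, x, samples)   # piecewise-linear upscale
poly   = (m / 2.0) ** 2                   # the claimed polynomial remapping

# They agree at the original samples but differ at every half-step.
print(linear[1], poly[1])   # 0.5 0.25
```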
Do the Fourier transform for this function. The transform is:

-((Sqrt[2 Pi] DiracDelta''[w])/n^2)
Just curious, what is n doing there? or w?
This also only has one value, for w=0. Are those two values the same? Of course not. One is smaller by a factor of n^2.

So, why does your example seem to work and mine doesn't? It is simple.
:)
Go back up and reread what the Fourier transform is. You chose a periodic function. We can see the effect if we take the Fourier transform of sin(n) and sin(n/2). Here they are for reference:

i*Sqrt[pi/2] DiracDelta[-1+w]-i*Sqrt[pi/2] DiracDelta[1+w]
i*Sqrt[pi/2] DiracDelta[-(1/2) + w] - i*Sqrt[pi/2] DiracDelta[1/2 + w]

Notice what happened. The location shifted but the value remained the same! This is exactly what you would expect from the sin function. It is periodic. You would not expect the frequency to shift if you expanded it with the phase space. The problem is non-periodic functions.

Now, to some extent, you can represent any function as a series of sin and cos functions - so you can suppress the amount of shifting the frequencies do in the frequency domain.
0th order interpolation is a linear operation; if you prove it for any frequency you prove it for all signals. The linear interpolation is almost linear, but if it will make you happier you can define extended linear interpolation(TM) that operates on each frequency component separately.
However, that is computationally expensive. Eventually there must be a cutoff. As soon as you reach that cutoff, you are into the n^2 type of function and not the sin(Pi*n).
Again, are you talking about continuous FT on an infinite discrete function?
Otherwise, as long as you have finite samples you have finite number of non-zero frequency components whether your function is n^2 or not. There is no difference between a perfectly harmonic function and another one for a linear system for the argument I was making (loss of high frequencies).
Can you now see where there is a clear shift in frequency value?
My imagination sucks.
This is heavily dependent on the filter. You will find that most filters like Blur in Photoshop do indeed shift the pixel domain into the frequency domain using discrete Fourier transforms and then subtract or average individual frequencies.
You should have stopped talking about filtering after your previous definition of low pass filter.
Almost no sane person applies low pass filtering in the frequency domain because it's expensive.
Instead they use FIR filters, that is, simple convolution with an imperfect cutoff frequency. But you can increase the size of the matrix to sharpen the cutoff (duality, if your book has that); it would still be much cheaper.

I haven't even seen Photoshop source code but I can bet good money they are doing a 3x3 convolution for regular blur, and a parametrized Gaussian matrix for Gaussian blur.
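A blur of that kind really is just spatial-domain convolution, with no transform anywhere. A one-dimensional numpy sketch (the 3-tap box kernel and row values are illustrative):

```python
import numpy as np

kernel = np.array([1.0, 1.0, 1.0]) / 3.0   # 3-tap moving average
row = np.array([1, 1, 1, 1, 4, 2, 2, 2, 2], dtype=float)

# Direct convolution in the spatial domain; no Fourier transform involved.
blurred = np.convolve(row, kernel, mode='same')
print(blurred[3:6])   # the 1->4->2 spike is averaged into its neighbours
```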

If you still insist on murdering signal processing I'm sure there are open source image processing applications for which you can check the source.
This is just wrong. The filters we are talking about are always just a mathematical function applied to a series of numbers.
I meant it's not a mathematical function from an input language to an output language.
So it's not a mathematical function applied to the input numbers.
Strictly speaking, it's a mathematical function from your input signal domain to your output signal domain.
Generally that is done in dedicated hardware or in a computer program.
I don't care how you implement it (at least I didn't, until you came up with those claims); again, the filters exist beyond their implementation.
This is just you misunderstanding how these filters are done. Once again, look up the implementation of most filters.
:)
You will find they use some form of transform and then add or subtract individual elements in the transform.
Pure bullshit. I can tell you for sure most filter implementations in the world operate in the time domain. Think about all the devices: phones, headphones, radio, TV, etc.
FD is not even practical in most applications.
Not that there is any point in discussing that, but I'm having a hard time believing claims like that can come from a person with signal processing familiarity.
Fourier transforms are used very very frequently because most things in nature can be represented at least approximately with harmonic functions.

Once again, it WILL always have frequency components. To get those frequency components, you use a Fourier transform.
Once again, whether you use FT or not, it will have frequency components.
I can give you other transforms if you want other components - but as long as we are talking about frequency Fourier transforms are not only relevant, but MUST be the basis for ANY discussion.
:cry:
 
You obviously read it wrong, but never mind.

No, I did not. Unfortunately you did. However, let me answer the question you are having trouble with.

Do you think upscaling "reduces the high frequencies in the image" (more than lower ones) or not?
Yes or no? simple question.

Yes. If you have trouble accepting my word for it try the following references:
IEEE Transactions on Consumer Electronics, Vol. 51, Issue 1, Feb. 2005. "Masking Noise in Up-scaled Video on Large Displays." Hcesch et al.
Journal of Electronic Imaging, Oct.-Dec. 2005. "Image Resolution Upscaling in the Wavelet Domain Using Directional Cycle Spinning." Temizel et al.
"Grass Detection for Picture Quality Enhancement of TV Video." Springer Berlin/Heidelberg. ISBN 978-3-540-74606-5.
"High Frequency Component Compensation Based Super-Resolution Algorithm for Face Video Enhancement." 17th International Conference on Pattern Recognition.
Journal of Electronic Imaging, Oct. 2000, Vol. 9, Issue 4, pp. 534-547. "Nonlinear Resampling for Edge Preserving Moire Suppression."

I can give you another 40 references over the last 20 years to verify this claim as well. Or you can pick up an introductory level signal analysis book. Either will do.
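The claim is also easy to check numerically. Here is a rough sketch (linear interpolation as the scaler and white noise as the test signal are my choices for the demo, not taken from the papers above). Linear interpolation acts as a tent filter, so the top of the shared frequency band comes out attenuated relative to the low end:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(256)    # white noise: flat spectrum on average
n = np.arange(len(x))

# Upscale 2x with linear interpolation, a common simple scaler
up = np.interp(np.arange(2 * len(x)) / 2.0, n, x)

# Bins k = 0..128 of both spectra cover the same absolute frequencies;
# divide by 2 to compensate the interpolator's DC gain at the new rate
X = np.abs(np.fft.rfft(x))
U = np.abs(np.fft.rfft(up)) / 2.0

# Fraction of spectral energy the scaler preserved, per band
low = slice(1, 32)              # low-frequency band
high = slice(96, 129)           # band near the original Nyquist rate
ratio_low = np.sum(U[low] ** 2) / np.sum(X[low] ** 2)
ratio_high = np.sum(U[high] ** 2) / np.sum(X[high] ** 2)
# ratio_high comes out well below ratio_low: the high frequencies
# survive the upscale with much less of their original energy
```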



Is it based on angular or oscillatory frequency? :)

Is this an attempt at a joke? It is hard to detect humor over the internet - smiley or no. Angular frequency is 2*pi*f, where f is the oscillatory frequency. In other words, you can write the Fourier transform with respect to either. All it changes is a constant in the exponential part of the integral, which changes the normalization.
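For anyone following along, the two conventions differ only in where the 2*pi sits. Writing both out (standard textbook definitions):

```latex
% Oscillatory (ordinary) frequency, \nu in Hz:
\hat{f}(\nu) = \int_{-\infty}^{\infty} f(t)\, e^{-2\pi i \nu t}\, dt
% Angular frequency, \omega = 2\pi\nu:
\hat{f}(\omega) = \int_{-\infty}^{\infty} f(t)\, e^{-i \omega t}\, dt
% Substituting \omega = 2\pi\nu converts one form into the other;
% only the normalization constant of the inverse transform changes.
```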

He was obviously wrong

.....

You are joking, right? The Fourier expansion has been proven correct so many times that claiming otherwise is ridiculous. Plancherel's theorem works not only on R, but also on locally compact abelian groups, and has even been extended to non-commutative locally compact groups. This forms the very foundation of harmonic analysis. I have to assume you are joking. Otherwise there is really no point in continuing this conversation. If you are ready to reject over 200 years of math and science just because you say so, then obviously there is more at issue here than frequency degradation in upscaling.
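The discrete case of Plancherel's theorem (usually called Parseval's theorem) is even trivially checkable on a computer. A quick sketch, using NumPy's FFT, which is unnormalized, hence the 1/N factor:

```python
import numpy as np

rng = np.random.default_rng(42)
x = rng.standard_normal(1024) + 1j * rng.standard_normal(1024)
X = np.fft.fft(x)

# Discrete Plancherel/Parseval: total energy is identical in both
# domains, up to the 1/N factor from NumPy's unnormalized convention
energy_time = np.sum(np.abs(x) ** 2)
energy_freq = np.sum(np.abs(X) ** 2) / len(x)
```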

You know what? I had a long point-by-point response typed that pointed out how you were wrong in just about everything you typed, from your fundamental understanding of math to your ridiculous claims about interpolation. However, at this point I cannot post something like that without outright insulting you - which I am trying really hard to avoid. These last two statements are perfect examples of why that is difficult, though. It is just plain ignorant to claim that over 200 years of proofs, research, and practical application proving Fourier correct are "wrong" because you said so.

At this point, I have provided you with reference journal papers you can examine at your leisure, all of which confirm the statement that there is high frequency reduction in upscaled images and discuss various techniques used to mitigate it. If you read enough people saying it in peer reviewed journals, maybe you can bring yourself to accept it. Once you get to that point, we can go back and discuss all of the math that proves it.
 