Panel Resolution Upscale Spinoff

Arwin

FIFA is 720p with 2x AA on the X360, and 720p with no AA on the PS3

Just about what I thought, but I had the feeling that there were differences between the different play modes (intro one-on-one, replays, and overview gameplay). But maybe not, then.

The lighting seems different though, but it could just be the different stadiums. On the PS3 you have a stadium with some very nice sunlight and shadow; on the 360 it seems cloudy.

For what it's worth by the way, 576p is the PAL equivalent of 480p, isn't it?
 
No, PAL is 576i ;)

I was talking about the progressive variant, hence all the p's dude. ;)

Our HD TVs are also different (768p and 1125p or something like that). The difference with 720p is a bit strange: 1366x768 is, what, 15% more pixels? Most TVs upscale to this, which makes it very hard for me to comment on scaling and such on my TV, because I haven't found a 1:1 pixel mapping option on it just yet.
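As a quick sanity check on that "15%" figure (a minimal sketch; it actually works out to just under 14%):

```python
# Pixel counts for the two common "720p-class" resolutions.
hd720 = 1280 * 720   # broadcast 720p frame
wxga = 1366 * 768    # typical "720p" LCD/plasma panel

extra = wxga / hd720 - 1
print(f"720p: {hd720} pixels, WXGA: {wxga} pixels")
print(f"WXGA has {extra:.1%} more pixels")  # about 13.8%
```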
 
Only 720p HDTVs are different, they're actually 1366*768 pixels. I've just been told that it's because they have a certain amount of overscan built into them... 1080p panels are 1920*1080p, though.
 
Only 720p HDTVs are different, they're actually 1366*768 pixels. I've just been told that it's because they have a certain amount of overscan built into them... 1080p panels are 1920*1080p, though.

The reason is not a certain amount of overscan, but better conversion of PAL signals, apparently. PAL is 768 wide, vs 640 on NTSC. Double these and presto. For widescreen purposes, the height is PAL * 1.5, if I remember correctly. Similar things have happened for 1080p screens, but the differences are much smaller in comparison. And there are no official resolutions for this, but some TV builders have the extra pixels in place nonetheless to improve the quality of PAL resolutions.
 
The reason is not a certain amount of overscan, but better conversion of PAL signals, apparently. PAL is 768 wide, vs 640 on NTSC.
Neither PAL nor NTSC has a horizontal pixel resolution. They're analog signals with a fixed number of scanlines. PAL has 576 visible scanlines, so that's a 3:4 mapping on a 768-line display.
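That 3:4 mapping is easy to verify (a minimal arithmetic sketch):

```python
# PAL has 576 visible scanlines; a 768-line panel shows
# every 3 source lines as 4 display lines (a 3:4 mapping).
pal_visible = 576
panel_lines = 768
assert panel_lines * 3 == pal_visible * 4  # exact 3:4 ratio
print(panel_lines / pal_visible)           # 4/3 upscale factor
```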
 
The reason is not a certain amount of overscan, but better conversion of PAL signals, apparently. PAL is 768 wide, vs 640 on NTSC. Double these and presto. For widescreen purposes, the height is PAL * 1.5, if I remember correctly. Similar things have happened for 1080p screens, but the differences are much smaller in comparison. And there are no official resolutions for this, but some TV builders have the extra pixels in place nonetheless to improve the quality of PAL resolutions.

I think the 1366x768 display is for PC compatibility (1024x768 with 1:1 mapping), and 768 lines at a 16:9 ratio for TV content = 1366.
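The arithmetic behind this theory (a sketch; rounding the non-integer width up to an even 1366 is my reading of it, not something stated above):

```python
# 768 lines at a 16:9 aspect ratio gives a non-integer width,
# which panel makers round up to an even 1366.
height = 768
exact_width = height * 16 / 9  # 1365.33...
print(exact_width)

# XGA (1024x768) PC content then maps 1:1 vertically,
# with pillarbox bars on a 1366-wide panel:
bar = (1366 - 1024) // 2       # pixels of black bar per side
print(bar)
```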
 
I think the 1366x768 display is for PC compatibility (1024x768 with 1:1 mapping), and 768 lines at a 16:9 ratio for TV content = 1366.

You may think that, but I'm still right. ;) This is why there is a difference between HDtvs sold in the US that have 1280x720 panels, and HDtvs in the EU, which have 1366x768 panels.

Or has that changed and are they selling 1366x768 panels in the US now also?
 
Only 720p HDTVs are different, they're actually 1366*768 pixels. I've just been told that it's because they have a certain amount of overscan built into them... 1080p panels are 1920*1080p, though.

I thought there are some monitors that accept a 1080p signal while the panel itself is less than 1080 lines.
 
You may think that, but I'm still right. ;) This is why there is a difference between HDtvs sold in the US that have 1280x720 panels, and HDtvs in the EU, which have 1366x768 panels.

Or has that changed and are they selling 1366x768 panels in the US now also?

My panel is 1280x720 & I'm in the EU ;)
 
You may think that, but I'm still right. ;) This is why there is a difference between HDtvs sold in the US that have 1280x720 panels, and HDtvs in the EU, which have 1366x768 panels.

Or has that changed and are they selling 1366x768 panels in the US now also?

They used to sell 1366x768 plasmas, and there may still be discontinued models being sold.

Then they moved up to 1080p since DLP and LCD went there.

They also used to sell XGA plasmas as HDTVs.
 
There are many current-model plasmas and LCDs which are 1366x768, and 42" plasmas which are 1024x768. Furthermore, 1024x768 16:9 aspect ratio displays are HD; they surpass the minimum HD display standard of 720 lines in a widescreen aspect ratio. Those standards are separate from HD signal and content standards, simply because CRT displays have a variable horizontal resolution and hence displays can't be held to the same standards as signals or content.

Also, Arwin, the 1280x720 vs 1366x768 split was never a Euro/US thing; both are and have been available in both areas. Different manufacturers simply choose different display resolutions based on a wide assortment of reasons.

And Laa-yosh, 1366x768 is just referred to as 720p because that is the broadcast/content resolution they are closest to, but they are more correctly referred to as 768p. Direct-view fixed-pixel displays don't have any physical overscan; they often overscan content, but every pixel of the advertised resolution is used to display it.
 
It's a Sony :rolleyes:

(KDF-E50A11E)

Ok, I guess I haven't fully understood the complexity of the issue, apparently. I could have sworn (and have been told) several times that the reason is really compatibility with PAL resolutions (which yes, for all intents and purposes, when used digitally, do in fact have a horizontal resolution as well as a vertical resolution ... everyone who has had an Amiga or Atari ST knows this ;) ).

But the matter seems to be more complicated. Right now I'm starting to believe that the real reason is what is mentioned in this post:

http://hd1080i.blogspot.com/2006/12/1080i-on-1366x768-resolution-problems.html

namely the one he mentions where 1366x768 is the highest resolution that fits in VRAM.

But I really don't know. It must be a combination of this and the PAL thing, because that just seems too convenient.
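For what it's worth, the VRAM argument is easy to put numbers on (a sketch; whether any specific scaler chip actually had this budget is the linked post's claim, not something verified here):

```python
# 1366x768 sits just above one "binary megapixel" (2**20 pixels);
# the slightly narrower 1360x768, also a common panel width,
# fits just under it.
print(1366 * 768)          # 1049088
print(2**20)               # 1048576
print(1366 * 768 - 2**20)  # 512 pixels over
print(1360 * 768)          # 1044480, under 2**20
```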

It's all very stupid. I'm glad 1080p is becoming the standard in the end, but in the meantime there are going to be a LOT of 1366x768 TVs out there for quite a while yet, I think.

It's no problem once all source content becomes 1080p, because then everything is being scaled anyway, and 1366x768 is actually a little better (more pixels = better resolution). But for 720p content, which includes most games and is likely to for quite a while yet, a native 720p panel is probably better. Annoying and weird stuff.

Then again, I knew going in that 1080p would be 'it' eventually, based on the recording resolution of most digitally recorded movies and the HD formats (Blu-ray and HD DVD) alone. So rather than fork out 2000 for a somewhat weak 1080p TV this year, we opted for a decent 768p LCD now, and then hopefully, for half the price they are currently, an additional 1080p display next year (or the year after). We'll see.
 
Ok, I guess I haven't fully understood the complexity of the issue, apparently. I could have sworn (and have been told) several times that the reason is really compatibility with PAL resolutions...
I don't see that claim making any sense at all.

But for the 720p content, which includes most games and is likely to include most games for quite a while yet, 720p is probably better.
Nah, like the article you linked mentioned, decent upscaling looks sharper than displaying at a lower native resolution. Besides, most displays tend to overscan signals anyway to avoid junk that shows up on the edges of bad edits, so even if you do have a 1280x720 or 1920x1080 display, 720p and 1080p/i signals will respectively be upscaled a bit beyond those display resolutions anyway.

Also, unless you plan to sit rather close to a rather large display, moving to a 1080p display isn't going to accomplish anything. Most likely, you'd be better off finding a TV with better contrast and more accurate color reproduction, as even when sitting close enough to a large enough display to fully resolve 1080p, those other factors tend to have far more impact on overall image quality than the added sharpness gained by the higher display resolution.
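The overscan effect described above can be sketched numerically (the 5% figure is an assumption for illustration, not a measured value):

```python
# With ~5% overscan, only the central part of a 720p signal is
# shown, so it gets upscaled even on a native 1280x720 panel.
overscan = 0.05                        # assumed typical amount
src_w, src_h = 1280, 720
vis_w = round(src_w * (1 - overscan))  # 1216 source columns shown
vis_h = round(src_h * (1 - overscan))  # 684 source rows shown
scale = src_w / vis_w                  # effective upscale factor
print(vis_w, vis_h, round(scale, 3))
```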
 
Ok, I guess I haven't fully understood the complexity of the issue, apparently.
The lovely HD future has been designed to be as confusing as possible. I doubt anyone really understands it, not least because the rules are getting made up as they go along ;)

I could have sworn (and have been told) several times that the reason is really compatibility with PAL resolutions
I can't see any reason for that. AFAIK 768p panels have been 'commonplace' long before the EU really got HD. The technology came from PC displays, and 768 is a natural PC resolution where 720 isn't: 3x 256 lines as opposed to 3x 240. The 720p format comes from the TV space, as a multiple of the NTSC base unit (one interlaced field) of 240 lines. Thus you have a collision of two different technologies: the TV standards, led by the NTSC nations, chose 3x the 240-line standard, while the panel manufacturers produced panels from their existing PC lines. That's how I understand it anyway, but I'm not well read, and there's a different explanation wherever you look, too!
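The two lineages can be laid out side by side (just the arithmetic from the post above):

```python
# TV lineage: NTSC field = 240 visible lines -> 3x = 720p
# PC lineage: 256-line steps -> 3x = 768 lines
tv_lines = 3 * 240
pc_lines = 3 * 256
print(tv_lines, pc_lines, pc_lines - tv_lines)  # 720 768 48
```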

But the matter seems to be more complicated. Right now I'm starting to believe that the real reason is what is mentioned in this post:
Why not put in the 1 MB of VRAM but not use it all? Adding more pixels just means needing more processing power and technology to drive them. A native 720p set would need no processing of 720p signals and would produce the best picture of any HD set with 720p content, and would need less processing power to upscale SD content. Going to 768p means spending more money and doing more work. For me, the sensible reason for 768p is that it's all the panel manufacturers make and will provide, so the TV companies just have to work with it.
 