*downscaled* Resolving Resolution

Yep.

For instance, you could watch a CG movie on VHS on a decades-old TV and see better visuals than the most advanced games. Pixel quality is where it's at, not resolution, IMHO.

Better visuals, if you watch that (small) decades-old TV at a distance of several meters.
Once you get to the combination of image size and viewing distance where artifacting due to lack of resolution becomes apparent, it is a different story.

I submit that in order to make statements such as Megadrive 1988's above, you need to qualify your statement a bit. It is by no means true in my case, where my gaming predominantly takes place on a 27" screen at 50cm distance, or (more seldom) a 50" screen at roughly 2 meters distance. Resolution limitations are pretty damn glaring under those circumstances. I'll take more image information (higher resolution models/textures/rendering resolution) over prettier lighting any day of the week, and twice on Saturdays.

Usage patterns differ. But the trend towards higher resolution content and displays is going on throughout the tech industry, and it would be strange indeed if gaming consoles were exempt.
 
Up-scaled graphics are horrible...

Not true; it depends on the quality of the scaling and how big the difference in resolutions is.
720p to 1080p (2.25x the pixels) should yield pretty good results with decent scaling (like the 360 currently does with its dedicated scaler chip).

Plus, most people can only perceive more detail between 720p and 1080p content when watching on screens around 50" or larger (from normal seating distances).
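
Back-of-envelope, for anyone who wants to check that 2.25x figure (just arithmetic, nothing clever):

    # 1080p has 2.25x the pixels of 720p, but each axis is only stretched 1.5x
    pixels_720p  = 1280 * 720     # 921,600
    pixels_1080p = 1920 * 1080    # 2,073,600
    print(pixels_1080p / pixels_720p)   # 2.25
    print(1920 / 1280, 1080 / 720)      # 1.5 1.5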
 
And what's the frame rate of DVD again? :rolleyes:

The field rate is 50Hz in PAL-land and 60Hz in NTSC-land.

Movies are generally shot at 24 fps. NTSC DVDs usually pack the fields using a 3:2 pulldown cadence; PAL DVDs are sped up by about 4% to match the higher field rate.

You're completely missing the point though. 1080i50, 720p50 and 1080p25 all have roughly the same pixel bandwidth.
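
Rough pixel-rate arithmetic behind that, counting active pixels only and ignoring blanking:

    rate_1080i50 = 1920 * 540 * 50    # 51.84M px/s (each field is half height)
    rate_720p50  = 1280 * 720 * 50    # 46.08M px/s
    rate_1080p25 = 1920 * 1080 * 25   # 51.84M px/s
    print(rate_1080i50, rate_720p50, rate_1080p25)
    print(25 / 24 - 1)                # ~0.042, the ~4% PAL speedup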

Cheers
 
You're missing the point; the claim was that no one broadcasts in 1080p.

I proved otherwise. Point proven, case closed.
 
Not true; it depends on the quality of the scaling and how big the difference in resolutions is.
720p to 1080p (2.25x the pixels) should yield pretty good results with decent scaling (like the 360 currently does with its dedicated scaler chip).

Plus, most people can only perceive more detail between 720p and 1080p content when watching on screens around 50" or larger (from normal seating distances).

I can spot it a mile away on my little brother's 32" TV with his 360. I even tried to see if his TV would do the scaling better, and I could still tell.

I would assume that anyone with half-decent eyesight would be able to spot the difference; upscaled just wouldn't be as sharp or as clear as true 1080p.
 
You're missing the point; the claim was that no one broadcasts in 1080p.

I proved otherwise. Point proven, case closed.

Wonderful, splitting words.

And what is the difference for the viewer watching 25fps material in 1080p25 vs 1080i50?
Zero!

Cheers
 
Also:


[...]

If your HDTV is doing 4:2:2 chroma subsampling and you play a PS3/Xbox 360/PC game that renders at 1280x720, your HDTV will reproduce color at just 640x720.

If your HDTV is doing 4:2:2 chroma subsampling and you play a PS3/Xbox 360/PC game that renders at 1920x1080, your HDTV will reproduce color at just 960x1080.

:eek:

[...]


:eek:;)
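
Rough sketch of what the 4:2:2 case means for a rendered frame, assuming the set really does convert to YCbCr and halve the chroma horizontally (NumPy, no real color-space conversion, just the sample-count effect):

    import numpy as np

    # stand-in chroma planes for a 1280x720 frame (values are irrelevant here)
    cb = np.random.rand(720, 1280)
    cr = np.random.rand(720, 1280)

    # 4:2:2: full vertical chroma resolution, but each horizontal pair of
    # chroma samples gets averaged down to one
    cb_422 = cb.reshape(720, 640, 2).mean(axis=2)
    cr_422 = cr.reshape(720, 640, 2).mean(axis=2)

    print(cb_422.shape)   # (720, 640) -> color detail effectively 640x720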
 
I can spot it a mile away on my little brother's 32" TV with his 360. I even tried to see if his TV would do the scaling better, and I could still tell.

I would assume that anyone with half-decent eyesight would be able to spot the difference; upscaled just wouldn't be as sharp or as clear as true 1080p.
[chart: when the benefits of higher resolution become visible, by screen size and viewing distance]

http://carltonbale.com/1080p-does-matter/

What the chart shows is that, for a 50-inch screen ... the benefits of 1080p vs. 720p start to become apparent when closer than 9.8 feet and become fully apparent at 6.5 feet. In my opinion, 6.5 feet is closer than most people will sit to their 50″ plasma TV (even though the THX recommended viewing distance for a 50″ screen is 5.6 ft). So, most consumers will not be able to see the full benefit of their 1080p TV.
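
Those distances line up with the usual ~1 arcminute acuity rule of thumb; a rough back-of-envelope for a 50" 16:9 panel (assuming 20/20 vision, nothing more rigorous):

    import math

    diagonal_in = 50.0
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # ~43.6" wide
    pixel_pitch_in = width_in / 1920                  # ~0.0227" per 1080p pixel

    # distance at which one pixel subtends about 1 arcminute
    one_arcmin = math.radians(1 / 60)
    distance_ft = pixel_pitch_in / math.tan(one_arcmin) / 12
    print(round(distance_ft, 1))   # ~6.5 ft, right at the "fully apparent" point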

I also resized this Windows wallpaper down from 1080p to 720p and then upscaled it back to 1080p:
Original 1080p: [image]

Resized from 720p: [image]


Definitely not 'horrible' - and don't forget you're viewing this up close on your PC;
if you view the images on your HDTV from 10 feet away it'd be much harder to notice a difference (unless you have a huge 60-inch display).
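
If anyone wants to reproduce that comparison, the round trip is a couple of lines with Python/Pillow (the filenames here are just placeholders):

    from PIL import Image

    original = Image.open("wallpaper_1080p.png")          # 1920x1080 source
    down = original.resize((1280, 720), Image.LANCZOS)     # down to 720p
    back_up = down.resize((1920, 1080), Image.LANCZOS)     # back up to 1080p
    back_up.save("wallpaper_720p_upscaled.png")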
 
I doubt upscaling of framebuffers is done bicubically (or with Lanczos); you completely kill the texture-sampling hardware/caches with it. DC's thread-shared "buffering" may help, but you don't want powerful new hardware just to do decent post-filtering and nothing else.
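
For what it's worth, the Lanczos kernel itself is trivial to write down; the pain is the footprint: with a=3, every output pixel needs a 6x6 neighborhood of source texels, which is what hurts the texture caches. A toy sketch, not how any console scaler actually works:

    import math

    def lanczos(x, a=3):
        # windowed sinc; non-zero only for |x| < a, so a 2D resample
        # reads (2a)^2 = 36 source texels per output pixel when a = 3
        if x == 0:
            return 1.0
        if abs(x) >= a:
            return 0.0
        px = math.pi * x
        return a * math.sin(px) * math.sin(px / a) / (px * px)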
 
And clearly nowhere near the same image. PC with massive anti-aliasing vs 360 without.

Cheers

It makes no difference and does not affect the outcome of that blur-o-vision upscaled shot.

Unless you're saying the PC shot would be that blurry with no AA?
 
This quote started from Acert93 in another thread:
http://beyond3d.com/showpost.php?p=1662636&postcount=310

MS:
Scalable Patent
Subscription Based
Kinect2

Look at the MS patent:
http://www.eurogamer.net/articles/digitalfoundry-microsoft-scalable-platform-patent
Fig. 3B shows 3 GPUs; Fig. 3A shows 2 GPUs.

If MS is targeting a combined OS effort plus extensive R&D, maybe all the leaks are true and MS will have several different versions of the next Xbox. The lowest might be 2+ TF, and MS may also be targeting a GPU design 2-3 years ahead of today's.

Interesting... Model A: 720p/FXAA, Model B: same games @ 1080p/FXAA. :p
 
Close enough not to be a deal breaker, depending on what you use the power saved towards.

Mandatory 1920 x 1080 buffers would be a tremendous waste of potential when you consider all the different usage scenarios for a console, all the different goals developers might have, and the range of users involved. I rate 60Hz above 1080p for most games.

I played Final Fantasy 13 on the 360, with its "720p quality" graphics only displaying at 1024 x 600 (30Hz). It was by no means the biggest problem with the game. It was not even the biggest visual problem with the game.
 

They'll have to move to 1080p eventually... And the same could be said for current consoles; in fact, why even bother with HD in the first place, let's all move back to 480p.

And what's close enough?
 
I'm a bit sad to see so little difference between native & upscaled; I'm starting to think it might not really be worth the computation difference.
On the other hand it's nice: it means I should be happy with a 720p projector ;p
 