Well it would be scaled horizontally of course to 1920x1080 before being displayed.
Scaling in only one direction usually ends up pretty bad.
It might not be an exact factor, but when you have large enough pixels, a little interpolation on one side would go pretty unnoticed. Let's say you have a black pixel, a white pixel, and another black pixel in your game. A perfect upscale would be 3 pixels of black, 3 of white, and 3 of black. If your game resolution is lower than that, you might get 3 black pixels, one grey, 2 white, one grey, 3 black. With simple linear upscaling you'll instead get a gradient from black to white to black, looking all blurry.

So anything that is an exact factor of the native resolution of the TV should be expected to be output on the display as is and as it should be, regardless of whether the running resolution is lower than the max output of the display. Correct?
OK, then, to understand better whether this is the case, we should check the resolution the game was running at.
The game I ran last time was Panzer Dragoon 1. What is the resolution of that game? It certainly wasn't 640x480. So... was it really an exact factor of 1080p? And what about the vertical resolution? 480 is not an exact factor of a 1080-pixel-tall image.
Any Saturn game of varied resolutions I remember trying on my TV in the past produced the same crisp image.
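As a quick illustration of the black/white/black example quoted above, here is a minimal sketch (plain Python, purely illustrative; the 3x factor and the 0/255 values are just assumptions for the demo). Nearest-neighbour at an exact integer factor simply repeats each source pixel, while linear interpolation blends neighbours into the blurry gradient described:

```python
# Purely illustrative: upscale a black/white/black strip (0..255 greyscale)
# by an exact 3x factor with nearest-neighbour vs. linear interpolation.
src = [0, 255, 0]   # black, white, black
factor = 3          # exact integer scale factor

def nearest(row, f):
    # nearest-neighbour at an integer factor just repeats each source pixel
    return [p for p in row for _ in range(f)]

def linear(row, f):
    # blend the two nearest source pixels at each fractional sample position
    out = []
    for i in range(len(row) * f):
        x = i / f                        # sample position in source space
        x0 = min(int(x), len(row) - 1)
        x1 = min(x0 + 1, len(row) - 1)
        t = x - x0
        out.append(round(row[x0] * (1 - t) + row[x1] * t))
    return out

print(nearest(src, factor))  # [0, 0, 0, 255, 255, 255, 0, 0, 0]  -> crisp edges
print(linear(src, factor))   # [0, 85, 170, 255, 170, 85, 0, 0, 0] -> blurry gradient
```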
When going via an analogue component or composite input as Nesh is, I wouldn't expect a TV to use anything other than nearest neighbour, or rather to change the value according to scan timings. To do otherwise would need the frame to be digitised and upscaled.
Aren't there PSone settings for playing on the PS3? - http://manuals.playstation.net/document/en/ps3/current/game/2settings.html Try switching off smoothing.
It can, but it really shouldn't. It's typically the result of simple doubling (or tripling) of pixel values and is equivalent to nearest-neighbor filtering.
Proper upscaling hardware usually uses cubic interpolation or windowed sinc - or at the very least bilinear.
Remember: Pixels aren't squares, they're points!!
Cheers
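To put rough numbers on the filters named above, here is a small sketch using Pillow as a stand-in for dedicated scaler hardware (an assumption purely for illustration; Pillow's LANCZOS is a windowed sinc). It upscales a tiny black/white/black strip by an exact 3x factor with each filter:

```python
# Rough comparison of resampling filters on a 3x1 strip using Pillow
# (assumed installed via `pip install Pillow`); LANCZOS is a windowed sinc.
from PIL import Image

strip = Image.new("L", (3, 1))      # 3x1 greyscale image
strip.putdata([0, 255, 0])          # black, white, black

for name, flt in [("nearest", Image.NEAREST),
                  ("bilinear", Image.BILINEAR),
                  ("bicubic", Image.BICUBIC),
                  ("lanczos", Image.LANCZOS)]:
    up = strip.resize((9, 1), flt)  # exact 3x horizontal upscale
    print(f"{name:8s}", list(up.getdata()))
```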
Any Saturn game of varied resolutions I remember trying on my TV in the past produced the same crisp image.
Then you basically have the best LCD/plasma TV it is possible to make - one that pretty much no TV maker is actually capable of making.
There is no LCD/plasma TV currently available that I have viewed that can display SD content without blur related to upscaling.
That said, if you sit far enough away from a TV, that blur will diminish in visibility until it may no longer be easily seen. The farther apart the resolution of your source is from the native resolution of your TV, the farther away you'll have to sit.
Regards,
SB
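A rough back-of-the-envelope sketch of that distance point, assuming the common ~1 arc-minute acuity rule of thumb and an example ~40" 16:9 panel (both assumptions for illustration, not figures from the post):

```python
# Back-of-the-envelope: distance at which a single *source* pixel shrinks below
# ~1 arc-minute (a common acuity rule of thumb - an assumption, not a hard limit).
import math

def min_distance_m(screen_width_m, source_horizontal_res, arc_minutes=1.0):
    pixel_width = screen_width_m / source_horizontal_res   # one source pixel, in metres
    angle = math.radians(arc_minutes / 60.0)                # acuity threshold, in radians
    return pixel_width / (2 * math.tan(angle / 2))          # d = s / (2 * tan(theta / 2))

screen_width = 0.88   # a ~40" 16:9 panel is roughly 0.88 m wide (assumed example size)
for res in (640, 1280, 1920):   # SD-ish, 720p and 1080p horizontal resolutions
    print(f"{res:4d} px wide source -> sit at least {min_distance_m(screen_width, res):.1f} m away")
```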
I am not the Angry Video Game Nerd. When I want to play on an old console I will use a TV from our age.

The Saturn is so old that his TV could have been a CRT.
At normal viewing distance, my Sammy upscales SD beautifully. It's not HD quality so isn't pin sharp, but it's smooth and balanced like a CRT a lot of the time. One reason for picking it was the rave reviews about its SD content quality.
How large was your old TV and what was your viewing distance? CRTs were typically a lot smaller than our flat panels, so the games of old occupied a much smaller FOV. I'm sure if you tried your DC on a 40" CRT today, you'd be shocked how blurry it is.

Perhaps what I see is crisper quality in relative terms, then.
Curious to check again.
Well, it doesn't matter how big my old TV was, because currently I am referring to how my current HD TV displays the content, and that's the only TV I have right now.
Where are the middle-ground options, like 900p 45fps?

40fps would be the mid point, and it's available to everyone lucky enough to have a 120Hz monitor.
40fps would work on a 60Hz monitor too. It's just 1 duplicate frame every 3 frames instead of 1 duplicate every 2 like it is with 30fps. It fits in nicely with triple buffering too.
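A tiny sketch of that pacing claim (illustrative only): it just counts how many refreshes each frame would be held for on a fixed-refresh display.

```python
# Counts how many display refreshes each game frame would be held for.
# At 40fps on 60Hz the pattern is 1,2,1,2,... (one refresh in every three shows
# a repeated frame); at 30fps it's a steady 2,2,2,...; at 40fps on 120Hz a steady 3.
def refreshes_per_frame(fps, hz, frames=8):
    starts = [int(i * hz / fps) for i in range(frames + 1)]  # refresh on which frame i first appears
    return [b - a for a, b in zip(starts, starts[1:])]

print("40fps @  60Hz:", refreshes_per_frame(40, 60))   # [1, 2, 1, 2, 1, 2, 1, 2]
print("30fps @  60Hz:", refreshes_per_frame(30, 60))   # [2, 2, 2, 2, 2, 2, 2, 2]
print("40fps @ 120Hz:", refreshes_per_frame(40, 120))  # [3, 3, 3, 3, 3, 3, 3, 3]
```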
Though that might create a worse judder effect that could arguably be considered less fluid than 30fps on a perceptual level.

It does.
I don't get why it's so hard for you to understand that some people prefer 1080p @ 30fps over 720p @ 60fps. It's personal preference! You can make all the pictures you want, but it won't change this fact.
You really haven't given much of a reason why this is a bad idea.
Cutting image quality just to contribute to the "fairness" of a competition that isn't theirs isn't really an issue, because the rules would be the same for everyone - and because of that, the individual's perception would be bound to what everyone expects to be possible at that framerate.
Imagine for a moment how much better-looking games would be if developers targeted 15fps. Would we really want that?
By the same account - graphics are clearly better on PC, so why are we still playing on consoles? Because there's more to gaming than just looking at nice pictures. Because people prefer to actually sit in a living room, or simply want that "plug & play" experience that you can't get on the PC. Because of that, there is a compromise - we live with worse visuals (compared to PCs) because we prefer the different gaming environment. So why would it be bad if we just limited our expectations for graphics a little more in exchange for a better framerate (which benefits controls, responsiveness, and the motion of the game)?
I never argued personal preference. I'm taking a stance on what the greater good is and what would need to be done for it to work for everyone. Graphics will always get better - and better graphics will always be nice. But not at the expense of framerate!
Take most people. Sit them on a couch at typical/recommended TV viewing distances.
Show them content (movies, games, photos, etc.) on the TV without telling them what resolution something is running at.
The vast majority of them wouldn't be able to tell the difference or even know which was at 720p and which was at 1080p. That's why I do ALL of my PC gaming on my living room TV at 720p instead of 1080p. I'd be willing to bet most of them wouldn't be able to tell if a game was running at 540p versus 1080p either, if the game had good MSAA or RGSSAA. Pretty much the only thing that would give it away is the size of the jaggies from aliasing. Again, assuming typical/recommended TV viewing distances for a given screen size.
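As a rough illustration of why that is plausible (the ~60 pixels-per-degree acuity figure, the ~46" panel width and the 2.5 m couch distance are assumptions chosen for the example, not measurements):

```python
# Rough pixels-per-degree comparison at an assumed living-room setup:
# ~46" 16:9 panel (~1.0 m wide) viewed from 2.5 m. ~60 px/degree corresponds
# to the common 1 arc-minute acuity rule of thumb (an approximation).
import math

def pixels_per_degree(horizontal_res, screen_width_m, distance_m):
    # visual angle covered by the whole screen width, in degrees
    width_deg = math.degrees(2 * math.atan(screen_width_m / (2 * distance_m)))
    return horizontal_res / width_deg

for name, res in [("720p", 1280), ("1080p", 1920)]:
    ppd = pixels_per_degree(res, screen_width_m=1.0, distance_m=2.5)
    print(f"{name:6s} ~{ppd:.0f} px/degree")   # 720p is ~57, 1080p is ~85
```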