Ethics of resolutions displayed on back of box

If it looks exactly the same, why is it not?

"looks" is subjective. The reviewer may not be able to see the difference in sharpness/detail while I might on my 1080p set.

Frame rate is frame rate. If it stutters it will probably get nicked for it. Resolution, however, is not as easy to compare. The reviewer's eyes, TV res, TV size, settings, and scaler all come into play when comparing resolution. Not so with frame rate.
 
"looks" is subjective. The reviewer may not be able to see the difference in sharpness/detail while I might on my 1080p set.

Frame rate is frame rate. If it stutters it will probably get nicked for it. Resolution, however, is not as easy to compare. The reviewer's eyes, TV res, TV size, settings, and scaler all come into play when comparing resolution. Not so with frame rate.

I agree. I understand where you are coming from. I think we here @ B3D are ahead of the curve when it comes to this kind of knowledge. Would it be nice to know exactly what rez a game is rendered at internally? Yes. Does it honestly matter in the end? I don't think it does.

That is why my position is that XTS games should have no resolutions listed on the back of the box. And that PS3 games have no resolution listed after the scaler becomes fully functional. We didn't care before the advent of HD, why do we care now?

BTW, did anyone notice how the XBOX had games that could be played at higher resolutions? How was that done? i.e. was it upscaled, or were the games actually run at the rez listed?
 
We didn't care before the advent of HD, why do we care now?

That's the point! Before it was not an issue as everyone had the same resolution display.

People are buying HDTVs today to get INCREASED fidelity images. The same is true for the xb360/ps3. If HDTV didn't exist this would not be a topic. But it does, people are buying them for various purposes, and the ones buying 1080p sets are paying a premium for that ability.

Selling games that do not offer additional fidelity over others but are marked as such is misleading and should be corrected.
 
Here's a silly example:

You purchase 2 DVDs; on the back of the box they both say "Widescreen, the black bars on top and bottom are normal." You get DVD A home, put it in your DVD player, and watch it on your gorgeous new 720p LCD TV. You put DVD B into your DVD player next and BAM, it's in widescreen... but it's been matted from a 4:3 source. Your gorgeous widescreen TV can "zoom" and now the image fits perfectly on your TV, but you KNOW you're not getting the best picture you could because it's not natively 16:9 material. This does not happen much anymore, but it used to happen to me quite frequently; now, however, some morons think that putting matted 16:9 material on DVDs that have the feature film in true 16:9 format is OK!!

I don't want a game saying it's 1080p or 720p if it's being scaled up to that resolution! The least amount of conversion done to the image being shown is what I expect. My projector is 720p; I won't feed it 1080i over 720p, so why would I want 480p upscaled to 720p or 1080p? No matter how good the scaler is, the output "cannot" be the same; no 720p material is going to look as good as true 1080p material on their corresponding displays. Will a 720p owner care if 1080p is only upscaled from a 720p source... NO. But the person who owns a true HD set should not be lied to, and should be allowed to have their "better" TV scaler do it for them.
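That intuition can be shown directly: nearest-neighbour upscaling (the simplest kind of scaler, used here purely as an illustration) produces more pixels but no new information. A minimal sketch:

```python
# Nearest-neighbour upscale: every output pixel copies an input pixel,
# so the number of distinct values (the actual detail) never grows.
def upscale_nn(img, factor):
    return [[row[x // factor] for x in range(len(row) * factor)]
            for row in img for _ in range(factor)]

src = [[1, 2],
       [3, 4]]            # a tiny 2x2 "low-res" source
big = upscale_nn(src, 2)  # scaled up to 4x4

print(len(big), len(big[0]))                 # 4 4 -- more pixels...
print(len({v for row in big for v in row}))  # 4 -- ...but no new values
```

A real scaler uses smarter filtering than this, but the principle is the same: the detail has to exist in the source frame; scaling only redistributes it.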

It's ethically wrong, just as it would be for upscaling DVD players to suggest they output the same image quality as HD or BR movies do because they have the same resolution going to your set!

Dregun
 
It's ethically wrong, just as it would be for upscaling DVD players to suggest they output the same image quality as HD or BR movies do because they have the same resolution going to your set!

Dregun

Good example. HD-DVD vs upscaling DVD player.

But even this is not as clearly separated as realtime rendered images, as aliasing is much more of a problem. Films have infinite AA vs 4xAA max on xb360.


If there is no discernible difference between 480p (4xAA, upscaled) and 1080p, then what chance does HD-DVD/Blu-ray have to succeed? :???:
 
Good example. HD-DVD vs upscaling DVD player.

But even this is not as clearly separated as realtime rendered images, as aliasing is much more of a problem. Films have infinite AA vs 4xAA max on xb360.


If there is no discernible difference between 480p (4xAA, upscaled) and 1080p, then what chance does HD-DVD/Blu-ray have to succeed? :???:

The thing is, if there is a discernible difference in the quality of the native-1080p vs scaled-to-1080p image, then THAT is the issue (the quality difference). And this is no different than if one version has superior AA/AF, higher-quality textures, better framerate, higher-poly models, better lighting, etc. Why do you place so much importance on resolution when any and all of these factors, alone or in combination, can have as much or more of an effect on the quality of the image?

The resolution-based marketing being done by Sony ATM is no more valid than the GHz-based marketing that Intel were doing with their Netburst architecture.

Just like a lower-clocked processor can perform better than a higher-clocked one if more work is being done per clock, a lower-res image can definitely look better than a higher-res one, provided that more "work" is being done per frame.
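The analogy holds numerically. A quick sketch with made-up figures (the clock speeds and work-per-clock numbers below are purely illustrative, not real CPU specs):

```python
# Effective throughput = clock speed x work done per clock.
# The same logic applies to frames: pixel count x work per pixel.
def throughput(clock_ghz, work_per_clock):
    return clock_ghz * work_per_clock

high_clock = throughput(3.8, 1.0)  # fast clock, less work per cycle
low_clock  = throughput(2.4, 1.8)  # slower clock, more work per cycle

print(low_clock > high_clock)  # True: the "slower" chip gets more done
```

Swap "GHz" for "pixels per frame" and "work per clock" for "shader work per pixel" and you get the resolution argument in one line.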
 
I try to match my LCD native res (1920x1080) with PC games as much as possible. I might drop to 1280x720 with high AA/AF if I cannot get the fps I want. I prefer the clean and crisp look to 1080P over upscaled 720P, even if the AA/AF is good. Lower res upscaled just looks like butter is smeared on my screen. As a customer I'd like to know what I'm buying. I don't want some PR guy telling me a low res upscaled is just as good.

Sony's reporting the true res is not some marketing bull; now, if they have changed things like some have said with VF5 to emulate the 360, that is just deceiving.
 
I try to match my LCD native res (1920x1080) with PC games as much as possible. I might drop to 1280x720 with high AA/AF if I cannot get the fps I want. I prefer the clean and crisp look to 1080P over upscaled 720P, even if the AA/AF is good. Lower res upscaled just looks like butter is smeared on my screen. As a customer I'd like to know what I'm buying. I don't want some PR guy telling me a low res upscaled is just as good.

Sony's reporting the true res is not some marketing bull, now if they have changed things like some have said with VF5 to emulate the 360 that is just deceiving.

One of the advantages of having the scaler integrated into the console is that developers can get a definitive indication of whether scaled output is satisfactory, since they will have predictable results no matter what the display device. Ultimately, if the results are not satisfactory then it is on them, since they made a poor design choice, and reviewers both professional and amateur will notice and criticize the graphics accordingly. And this will happen whether the reviewer knows the graphics are being scaled or not.
 
I think things were made more confusing by how they list resolutions on games. People get native resolutions, but the logic behind listing supported (scaled) resolutions is flat-out strange.



Hopefully by the time next-generation consoles launch the madness will end, and developers can support whatever resolutions they want. To some extent this is just fallout from transitioning people from old standard-definition sets to HD sets.



1440p is the true next step after 720p. With the Xbox 720, Microsoft should let developers support 720p if they want, and if a user has a 1440p display, let them scale to that. It should look fantastic because 1440p is exactly 4 times the resolution of 720p; that's why they call it quad HD. You can fit four 720p screens of pixel data onto a 1440p display perfectly. This would be great for four-player split-screen gaming.
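The pixel math checks out; here's the quick sanity check (plain arithmetic, nothing console-specific):

```python
# Quad HD: 2560x1440 holds exactly four 1280x720 frames' worth of pixels.
p720  = 1280 * 720    # 921,600 pixels
p1440 = 2560 * 1440   # 3,686,400 pixels

print(p1440 // p720)  # 4
# Each axis exactly doubles, so a 2x2 grid of 720p viewports tiles a
# 1440p display with no scaling at all -- hence four-player split screen
# with each player getting a full unscaled 720p image.
```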

And if a developer wants to go higher, let them support 1080p or 1440p native. The GPU will probably be built with 1440p in mind anyway. Today Apple's high-end widescreen Cinema Display supports 1440p, and Dell's high-end monitor line supports 1440p in several models, including 24-inch and 30-inch displays.

Microsoft needs to display some leadership and in the near future just announce that 1440p will be supported in next-gen console platforms, to help give clear guidance on the standard. Just come out and say clearly that it means the Xbox LIVE dashboard will support the resolution, but that they will not force developers to render at 1440p. Since almost all games on Xbox 360 are native 720p, they will scale up excellently on 1440p sets for those interested in backward compatibility. The trend is clear that screen sizes are going to continue to expand rapidly over the next 5 years, while prices continue to plummet.
 
The thing is, if there is a discernible difference in the quality of the native-1080p vs scaled-to-1080p image, then THAT is the issue (the quality difference). And this is no different than if one version has superior AA/AF, higher-quality textures, better framerate, higher-poly models, better lighting, etc. Why do you place so much importance on resolution when any and all of these factors, alone or in combination, can have as much or more of an effect on the quality of the image?

The resolution-based marketing being done by Sony ATM is no more valid than the GHz-based marketing that Intel were doing with their Netburst architecture.

Just like a lower-clocked processor can perform better than a higher-clocked one if more work is being done per clock, a lower-res image can definitely look better than a higher-res one, provided that more "work" is being done per frame.


Can you tell the difference between an upscaled DVD and HD-DVD/Blu-ray displayed on a 1080p monitor?

If one cannot answer yes to this question then this thread is not for that person.

Assuming one answers yes to this question, the difference is obviously in sharpness and detail.

This same difference in sharpness and detail would be even more evident in games rendered at ~640x480 vs 1920x1080 because of the lack of anti-aliasing. Obviously no games for xb360 or ps3 are rendered natively at this res, but there are cases where the native resolution of the game is not listed and is not standard. In these cases the developer is doing the best they can to balance frame rate, graphical fidelity, and resolution. There have been suggestions that if developers were not focused on achieving "HD" visuals, they could provide significantly more image quality per pixel by limiting the resolution of these next-gen games to 640x480 instead of the target resolution of 1280x720.
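The trade-off is easy to quantify. A rough back-of-the-envelope sketch (it ignores vertex work, bandwidth, and everything else that doesn't scale linearly with pixel count):

```python
# With a fixed per-frame shading budget, fewer pixels means more
# shader work available for each one.
hd_pixels = 1280 * 720  # 921,600 pixels at 720p
sd_pixels = 640 * 480   # 307,200 pixels at 480p

# Relative per-pixel budget if the whole frame budget is reallocated:
ratio = hd_pixels / sd_pixels
print(ratio)  # 3.0 -- roughly triple the shader budget per pixel at 480p
```

That factor of three is the headroom the "pixel quality over pixel quantity" camp is pointing at.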

Extreme scenario:

Publisher AAA wants to bring a popular cutting-edge PC franchise to next-gen systems. They hire two developers to do so. Developer XBDEV will handle the xb360 version while PSDEV will handle the PS3 version. Both are among the best and equal in ability. The publisher has contracted out the content creation, so both developers will be given the exact same assets. Same models and textures.

XBDEV believes in the philosophy of pixel quality over pixel quantity.
PSDEV believes in the philosophy of pixel quantity over pixel quality.

XBDEV focuses heavily on shader quality and framerate.
PSDEV attempts to match the effort but with the 1920x1080 frame buffer they do the best they can.

Screenshots after the game is completed prove to be one-sided when looking at the games running side by side on the reviewers' monitors. Both have smooth edges and detail to spare. But XBDEV's port received better reviews based on its higher-quality shaders and smoother frame rate. PSDEV complained to the review sites that the comparison was unfair, based on the reviewers' inability to evaluate their port on a 1080p HD monitor.

The review sites' replies varied from "I do have a 1080p monitor and both consoles set to output 1080p" to "What's an HD monitor?".

How does the story end?
 
Can you tell the difference between an upscaled DVD and HD-DVD/Blu-ray displayed on a 1080p monitor?

If one cannot answer yes to this question then this thread is not for that person.

Assuming one answers yes to this question, the difference is obviously in sharpness and detail.

But what if the source material were originally SD? Then the upconverted DVD would likely look nearly identical to the HD disc. You do see the flaw with this analogy, right? There is no disadvantage to the higher resolution offered by HD disc formats, while in game graphics there could potentially be a tradeoff that affects the "source material".

This same difference in sharpness and detail would be even more evident in games rendered at ~640x480 vs 1920x1080 because of the lack of anti-aliasing. Obviously no games for xb360 or ps3 are rendered natively at this res, but there are cases where the native resolution of the game is not listed and is not standard. In these cases the developer is doing the best they can to balance frame rate, graphical fidelity, and resolution. There have been suggestions that if developers were not focused on achieving "HD" visuals, they could provide significantly more image quality per pixel by limiting the resolution of these next-gen games to 640x480 instead of the target resolution of 1280x720.

And I like that the 360's internal scaler allows them to make these types of design decisions with more flexibility than has previously been offered.

Extreme scenario:

Publisher AAA wants to bring a popular cutting-edge PC franchise to next-gen systems. They hire two developers to do so. Developer XBDEV will handle the xb360 version while PSDEV will handle the PS3 version. Both are among the best and equal in ability. The publisher has contracted out the content creation, so both developers will be given the exact same assets. Same models and textures.

XBDEV believes in the philosophy of pixel quality over pixel quantity.
PSDEV believes in the philosophy of pixel quantity over pixel quality.

XBDEV focuses heavily on shader quality and framerate.
PSDEV attempts to match the effort but with the 1920x1080 frame buffer they do the best they can.

Screenshots after the game is completed prove to be one-sided when looking at the games running side by side on the reviewers' monitors. Both have smooth edges and detail to spare. But XBDEV's port received better reviews based on its higher-quality shaders and smoother frame rate. PSDEV complained to the review sites that the comparison was unfair, based on the reviewers' inability to evaluate their port on a 1080p HD monitor.

The review sites' replies varied from "I do have a 1080p monitor and both consoles set to output 1080p" to "What's an HD monitor?".

How does the story end?

I would assume you have been around long enough to have your own list of sites whose reviews you give weight to, because they have consistently done comprehensive and competent reviews that have matched your own experience. Between a few of these and a canvassing of other sites' reviews, I would be absolutely shocked if all of them missed an obvious difference in image quality and none of them had run the game on a 1080p-capable monitor.

Ultimately the only way to be absolutely certain that a game is going to look good to you on your equipment before buying it is to rent the game first or download a demo.
 
But what if the source material were originally SD? Then the upconverted DVD would likely look nearly identical to the HD disc. You do see the flaw with this analogy, right? There is no disadvantage to the higher resolution offered by HD disc formats, while in game graphics there could potentially be a tradeoff that affects the "source material".

In this case, would you not be upset that you spent x dollars on an x-hd player and x-hd movie, only to find that there is no difference between it and the older DVD player & movie that you already owned?


And I like that the 360's internal scaler allows them to make these types of design decisions with more flexibility than has previously been offered.

I do too, but I also like knowing what I buy is not a fraudulent misrepresentation. See the case above.

I would assume you have been around long enough to have your own list of sites whose reviews you give weight to, because they have consistently done comprehensive and competent reviews that have matched your own experience. Between a few of these and a canvassing of other sites' reviews, I would be absolutely shocked if all of them missed an obvious difference in image quality and none of them had run the game on a 1080p-capable monitor.

Indeed. As I've said before, this is not so much a problem for me personally. I know where to go to find out such details about a game.

Note that I addressed the reviewers' range of reactions and included the possibility of a 1080p set hooked up through the 1080p output of both consoles. Who's to say this guy wouldn't see "1080p signal" in the corner of his display and take it from there as "well, it's a 1080p game alright; I see the checkbox on the back and my TV affirms it."

Ultimately the only way to be absolutely certain that a game is going to look good to you on your equipment before buying it is to rent the game first or download a demo.

Again, for me personally it isn't an issue, as I rent nearly all my games. But the issue still exists as to whether it is ethically correct not to state the rendered resolution of the game, and worse, to imply it renders significantly above its native res.
 
Here's a silly example:

You purchase 2 DVDs; on the back of the box they both say "Widescreen, the black bars on top and bottom are normal." You get DVD A home, put it in your DVD player, and watch it on your gorgeous new 720p LCD TV. You put DVD B into your DVD player next and BAM, it's in widescreen... but it's been matted from a 4:3 source. Your gorgeous widescreen TV can "zoom" and now the image fits perfectly on your TV, but you KNOW you're not getting the best picture you could because it's not natively 16:9 material. This does not happen much anymore, but it used to happen to me quite frequently; now, however, some morons think that putting matted 16:9 material on DVDs that have the feature film in true 16:9 format is OK!!

I don't want a game saying it's 1080p or 720p if it's being scaled up to that resolution! The least amount of conversion done to the image being shown is what I expect. My projector is 720p; I won't feed it 1080i over 720p, so why would I want 480p upscaled to 720p or 1080p? No matter how good the scaler is, the output "cannot" be the same; no 720p material is going to look as good as true 1080p material on their corresponding displays. Will a 720p owner care if 1080p is only upscaled from a 720p source... NO. But the person who owns a true HD set should not be lied to, and should be allowed to have their "better" TV scaler do it for them.

It's ethically wrong, just as it would be for upscaling DVD players to suggest they output the same image quality as HD or BR movies do because they have the same resolution going to your set!

Dregun

Ethically wrong, maybe, but it's not like TV makers are any more innocent when it comes to being misleading.

BTW, what if the game doesn't render in a native HD format? It has to be scaled by the 360's scaler at some point then. VGA could do it, but I don't know of any multiscan monitors that can automatically adjust the image to fit. It'd get mighty annoying to have to readjust the monitor for each different res.
 
But what if the source material were originally SD? Then the upconverted DVD would likely look nearly identical to the HD disc. You do see the flaw with this analogy, right? There is no disadvantage to the higher resolution offered by HD disc formats, while in game graphics there could potentially be a tradeoff that affects the "source material".

In this case, would you not be upset that you spent x dollars on an x-hd player and x-hd movie, only to find that there is no difference between it and the older DVD player & movie that you already owned?

As long as the movie looked as good as it possibly could on that platform, not really, no. I don't buy movies so I can count pixels and do side-by-side quality comparisons. ;)

And I like that the 360's internal scaler allows them to make these types of design decisions with more flexibility than has previously been offered.

I do to, but I also like knowing what I buy is not a fraudulant misrepresentation. see case above.

It is meant to represent functionality and nothing more. Nothing fraudulent at all about that.

I would assume you have been around long enough to have your own list of sites whose reviews you give weight to, because they have consistently done comprehensive and competent reviews that have matched your own experience. Between a few of these and a canvassing of other sites' reviews, I would be absolutely shocked if all of them missed an obvious difference in image quality and none of them had run the game on a 1080p-capable monitor.

Ultimately the only way to be absolutely certain that a game is going to look good to you on your equipment before buying it is to rent the game first or download a demo.

Indeed. As I've said before, this is not so much a problem for me personally. I know where to go to find out such details about a game.

Note that I addressed the reviewers' range of reactions and included the possibility of a 1080p set hooked up through the 1080p output of both consoles. Who's to say this guy wouldn't see "1080p signal" in the corner of his display and take it from there as "well, it's a 1080p game alright; I see the checkbox on the back and my TV affirms it."

Again, for me personally it isn't an issue, as I rent nearly all my games. But the issue still exists as to whether it is ethically correct not to state the rendered resolution of the game, and worse, to imply it renders significantly above its native res.

Why does it matter what res the game renders at? It either looks good or it doesn't. I don't need to know precisely how much salt has been added to my meal at a restaurant to know whether or not it's too salty. Why do I need to know what res a game is running at in order to determine whether it lacks detail, has a good or poor framerate, shows a lot of aliasing, etc.? The resolution is irrelevant save for how it affects the quality of the output.
 
The resolution is irrelevant save for how it affects the quality of the output.

Agreed and on that note we disagree. I feel it greatly affects the quality of the output and you don't. Let's agree to disagree. :smile:

To any that agree that resolution affects the quality of the output but do not think it is ethically incorrect to display 1080p on a game which does not accurately represent this resolution (in some cases not even 25% of it): please post why you feel this way.

To those that agree that resolution affects the quality of the output and also feel it is ethically incorrect to display resolutions outside the ability of the game: please post your comments as well.
 
To any that agree that resolution affects the quality of the output but do not think it is ethically incorrect to display 1080p on a game which does not accurately represent this resolution (in some cases not even 25% of it): please post why you feel this way.

I think I fall in this group.

As long as it's labeled as a "supported resolution" and not labeled as being rendered at that resolution.

There are a couple of reasons I feel that way:

1. I hate when a device has to change resolution and the screen blacks out for a few seconds, so if it will stay at a single resolution and scale where necessary, that's fine with me.

There may be others like myself, and if the supported resolution wasn't listed on the back, the average consumer may not know that the console will scale to their TV's native resolution, and might pass on the game otherwise.

2. Most TVs have crap scalers, and the game is going to get scaled somewhere. Again, with the resolution listed on the back, it lets the consumer know the game at least scales to that resolution; they might otherwise pass on a game if the scaler in their TV is crap and they only buy games that output at the TV's native resolution.
 
The resolution is irrelevant save for how it affects the quality of the output.

Agreed and on that note we disagree. I feel it greatly affects the quality of the output and you don't. Let's agree to disagree. :smile:

Thing is, I never said that. In fact, what I said was that any effect lowered resolution had on the image, whether blindingly obvious or subtle, would be visually noticeable as lowered detail, a lack of sharpness, or possibly increased aliasing. But fewer polys, inferior textures, or lower AA settings, which are the likely tradeoffs (along with framerate) that might have to be made to achieve higher res, could contribute to the same visual inferiorities. Why do you feel the need to put a higher priority on one factor over any of the others? If I look at a game and I see that it lacks detail, do I really need to know whether that is from a lack of resolution, poor texture quality, or maybe some combo of the two?

If we are going to agree to disagree, I just wanted to make it clear what we were disagreeing about.
 
Wow, the argument is still going? I am almost confused as to why we are still even arguing.

So there are really two camps:

The 1st camp wants games to be labeled with the native resolution, because anything else is misleading.

The 2nd camp wants games to be labeled with supported resolutions, because that's how it's done now.


Would it change your buying habits if you knew the native resolution of games? i.e. would you buy, if you haven't already, a PS3 to play Marvel Ultimate Alliance because on the PS3 it is a 1080p game? Also, are you content with the idea of a game being made at 1080p and, if your TV can't run that exact resolution, the game running at 480p instead with no in-betweens? Because that is what I think of when I see native resolutions.
 