Ethics of resolutions displayed on back of box

Diamond.G

It seems as though a few of us have derailed the VF5 thread here, so I opened this new thread to continue the discussion.



@Chef: Yes, I have an HDTV (since 2004) and I scale everything to 720p. Yes, SDTV looks like butt upscaled, but I expect it to.

Note to mods: Wasn't sure if this should go in the tech forum or not, so I placed it here. Move or delete if redundant.
 
My current position is that it's irrelevant. What is relevant is: does it look good? Does native rendering vs. scaling necessitate looking good vs. looking bad? Would game A rendered natively at 1080p without question look better than game A rendered natively at 720p and scaled to 1080p? Are there not other significant factors that will determine how good or bad the game looks, whether natively rendered or scaled?
 
Originally Posted by Gradthrawn
My point is simple. Does it support X resolution? Does it look good at X resolution? If you want to show off your new 1080p TV (I would want to do the same), I would want a game that looks good on it, regardless of whether it renders 1080p natively, uses a broken hardware scaler, or gets help from the Pixel Fairies (tm). I fail to see how being natively rendered automatically makes it better than being scaled when in both situations there's plenty of room for the developer to screw something up that makes it look bad.

It's called Tekken 5: Dark Resurrection! Where have you been?!

All joking aside, I understand your argument. But on a personal level, it seems so irrelevant to me. Am I going to take Call of Duty 3 back for my 360 because it doesn't actually render at 1280x720? That would be ridiculous. I would take it back if it looked bad, had horrible frame rate problems, etc., not for something superficial like native rendering resolution, which I would never be able to actually verify anyway. In no way does a game's native rendering resolution affect me. It looking like crap does, but as far as I can tell that's independent of what resolution(s) it natively renders at and/or scales to.



Different situations. Does that DVD actually output 5.1 to the receiver? Is that a supported codec on the DVD? If not, then it's not the same thing. A similar analogy would be listing 7.1 PCM lossless support when in reality it's just a remixed 5.1 soundtrack. It actually outputs 7.1 PCM, but the true source is only 5.1. The car analogy is completely different as well. A correct analogy would be gas labeled as 93 that's actually 90, but has the detonation resistance of 93 due to other factors.

I get what you're saying - "If it looks good, who cares?"

Problem is, how can you tell? Are you going to take the 640x480 game (labeled 1080p) home, hook it up to your TV, notice it looks like crap on your brand new 1080p native set, and then attempt to get a refund?

Good luck!

One can look at the screenshots on the back of the box to see what the game looks like, but res differences are hard to make out in postage-stamp-sized pics. One could also wait for a mag review to see if it is something purchase-worthy. They likely will not get correct res info from a mag review though.

If one knows where to go on the net *and has access to the net*, then it is a moot point. But in the meantime it is borderline false advertising, and really, where do we draw the line on what is considered a minimum acceptable resolution for the stated checkbox of 1080p on the back of the box?

Devs could render 640x480 w/4xAA all day long on 360 without tiling. Should this be acceptable while posting 1080p on the back of the box?

My suggestion is to fall back to the next standard resolution below the native rendered resolution if confusing the customer is a big concern (i.e. a 1280x800 native buffer would be labeled 720p).
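
For what it's worth, the fallback rule is simple enough to sketch in a few lines of Python (my own illustration - the mode table and names are made up, not any official labeling standard): a box only gets a resolution checkbox if the native buffer is at least that big in both dimensions.

[code]
# Sketch of the suggested fallback labeling rule (illustrative only).
# Standard progressive output modes, largest first.
STANDARD_MODES = [
    ("1080p", 1920, 1080),
    ("720p", 1280, 720),
    ("480p", 640, 480),
]

def box_label(native_w, native_h):
    """Return the highest standard label the native buffer fully covers."""
    for label, w, h in STANDARD_MODES:
        if native_w >= w and native_h >= h:
            return label
    return "below 480p"

print(box_label(1280, 800))  # -> 720p, per the 1280x800 example above
print(box_label(640, 480))   # -> 480p, regardless of what the scaler outputs
[/code]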
 
Problem is, how can you tell? Are you going to take the 640x480 game (labeled 1080p) home, hook it up to your TV, notice it looks like crap on your brand new 1080p native set, and then attempt to get a refund?

If your strategy for buying games is just walking into a store and reading the backs of boxes, then your chance of walking out with a crappy game is going to be pretty high.
 
If your strategy for buying games is just walking into a store and reading the backs of boxes, then your chance of walking out with a crappy game is going to be pretty high.

Not to mention, the chances of someone having a decent enough TV to see any difference, a next-gen console that outputs HD, and the incentive to care about this issue, while at the same time lacking access to the internet, are pretty low.

I do, however, agree that perhaps there should be a standard. They should keep the PS3 convention of listing "Output" support over the ambiguous/dumbed-down Xbox 360 labels, and also list the highest native resolution.
 
If your strategy for buying games is just walking into a store and reading the backs of boxes, then your chance of walking out with a crappy game is going to be pretty high.

Funny - the topic is "ethics of resolution", not "ethics of crap games".

Devs could render 640x480 w/4xAA all day long on 360 without tiling. Should this be acceptable while posting 1080p on the back of the box?

Using this example, games could claim "parity" with PS3 ports actually rendering at 1080p, while having the advantage of rendering less than 25% of the pixels of the PS3 title, which would probably enable higher (double?) framerates among other features.
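
A quick back-of-envelope on that pixel count (a rough sketch, just multiplying out the frame sizes - "less than 25%" is actually conservative; it's under 15%):

[code]
# Pixels per frame at each native resolution.
ps3_pixels = 1920 * 1080  # 2,073,600 pixels per frame
x360_pixels = 640 * 480   # 307,200 pixels per frame

print(x360_pixels / ps3_pixels)  # ~0.148, i.e. under 15% of the pixels
[/code]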

If the reviewers couldn't tell the difference in res, but obviously could tell the diff in framerate, and the reviews reflected this difference, would you have a problem with it?
 
I get what you're saying - "If it looks good, who cares?"

Problem is, how can you tell? Are you going to take the 640x480 game (labeled 1080p) home, hook it up to your TV, notice it looks like crap on your brand new 1080p native set, and then attempt to get a refund?

But what if I get a game natively rendered at 1080p and it looks like crap? For that matter, what if I get a game natively rendered at 480p that looks great? I guess what I'm searching for is: does native rendering at 1080p necessitate looking better than being scaled to 1080p? It seems to me there are tons of other factors involved that will determine the actual picture quality, far beyond native rendering resolution. I could be wrong, and it would be interesting to know why.

Devs could render 640x480 w/4xAA all day long on 360 without tiling. Should this be acceptable while posting 1080p on the back of the box?

Considering the quality of graphics in terms of effects, frame rate, characters on screen, etc. they could achieve if that was the design decision from day 1, I don't know if I could answer that question so easily. I would really like to see what Gears looks like if designed from day 1 with 640x480 as the target, with knowledge of how the scaler works. I'm not saying it would look better, or even good at all, but I would really like to see that along with other examples. It doesn't seem like there's a clear-cut answer here, and it's going to vary from game to game.

Let's look at Tekken 5: DR for the PS3. No matter what you do to it, it's still using art assets made when 640x480 on the PS2 was the target. I assume they've modified the engine so that it runs natively in 1080p, or they could possibly be using the Cell to scale it. Regardless, whether natively rendered or scaled, its roots are going to shine through. Now let's take VF5. We'll assume its target was 1280x720 for the arcade (please correct me if I'm wrong). Does it matter if Sega modifies the engine to run natively at 1080p or has it scaled to 1080p? Could the same results not be achieved either way if done properly? I don't ask that rhetorically, as there may be factors I'm not aware of.

EDIT

With ethics being the topic, I'm still off-topic even here. But I need more information to actually answer the question of whether or not it's acceptable. Whether it's acceptable or not to me depends on what the end result is.
 
I get what you're saying - "If it looks good, who cares?"

Problem is, how can you tell? Are you going to take the 640x480 game (labeled 1080p) home, hook it up to your TV, notice it looks like crap on your brand new 1080p native set, and then attempt to get a refund?

Good luck!

Would you expect to take a 1920x1080-rendered game home, hook it up to your TV, notice it looks like crap, and attempt to get a refund? In this instance, how would you be better able to tell if it looks good just because it is being rendered @ 1080p and is labeled as such?
 
Say Burnout 6 renders at 1080p on PS3 but runs at 30fps (and stutters at that), while on the 360 they render 640x480 (1080p sticker though!) with 4xAA and 60fps smooth as butter. The review reflects this difference in the score, and because of the unstable framerate on PS3 it gets a 70/100 while the 360 version gets a 90/100. Would this be acceptable?
 
Say Burnout 6 renders at 1080p on PS3 but runs at 30fps (and stutters at that), while on the 360 they render 640x480 (1080p sticker though!) with 4xAA and 60fps smooth as butter. The review reflects this difference in the score, and because of the unstable framerate on PS3 it gets a 70/100 while the 360 version gets a 90/100. Would this be acceptable?

How does it look? As has been pointed out before, if given a choice between today's game graphics @ 1920x1080 and movie-CGI-quality graphics @ 720x480, which looks better? People shouldn't get so hung up on rez, IMO. It is but one factor in a larger equation.
 
How does it look? As has been pointed out before, if given a choice between today's game graphics @ 1920x1080 and movie-CGI-quality graphics @ 720x480, which looks better? People shouldn't get so hung up on rez, IMO. It is but one factor in a larger equation.

Obviously assuming rendered resolution is the only difference.

640x480, 4xAA @ 60fps
1920x1080, no AA @ ~30fps or less
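
Putting very rough numbers on that trade-off (a crude sketch that counts AA samples as extra fill cost and ignores shader, bandwidth, and memory differences entirely - purely illustrative, not a real performance model):

[code]
# Crude fill-work comparison of the two options above (illustrative only).
def samples_per_second(width, height, aa_samples, fps):
    return width * height * aa_samples * fps

low_res = samples_per_second(640, 480, 4, 60)     # 640x480, 4xAA @ 60fps
high_res = samples_per_second(1920, 1080, 1, 30)  # 1920x1080, no AA @ 30fps

print(low_res)   # 73,728,000 samples/s
print(high_res)  # 62,208,000 samples/s
[/code]

By that crude measure the two options are in the same ballpark of raw fill work.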
 
Say Burnout 6 renders at 1080p on ps3 but runs at 30fps and (stutters at that) but on xb360 they render 640x480 (1080p sticker though!) with 4xaa and 60fps smooth as butter. The review reflects this difference in the score and because of the unstable framerate on ps3 it gets a 70/100 while xb360 version gets a 90/100. Would this be acceptable?

Hmmm.... how does the 360 version look at 1080p?

On a side note, that's a crappy hypothetical you brought up, and it seems to closely mirror the situation with CoD 3 and THP8 on the PS3 and 360.

EDIT

Obviously assuming rendered resolution is the only difference.

In that situation, sucks for the PS3. SCE should have been more aggressive with their hardware scaler. :p If I can get game A on platform A to look as good as game A on platform B, while using fewer resources and improving other areas, that's what I would do. Use every bit of trickery I could to get the desired results. No harm no foul, right? Or maybe not...
 
I get what you're saying - "If it looks good, who cares?"

Problem is, how can you tell? Are you going to take the 640x480 game (labeled 1080p) home, hook it up to your TV, notice it looks like crap on your brand new 1080p native set, and then attempt to get a refund?

Good luck!

One can look at the screenshots on the back of the box to see what the game looks like, but res differences are hard to make out in postage-stamp-sized pics. One could also wait for a mag review to see if it is something purchase-worthy. They likely will not get correct res info from a mag review though.

If one knows where to go on the net *and has access to the net*, then it is a moot point. But in the meantime it is borderline false advertising, and really, where do we draw the line on what is considered a minimum acceptable resolution for the stated checkbox of 1080p on the back of the box?

Devs could render 640x480 w/4xAA all day long on 360 without tiling. Should this be acceptable while posting 1080p on the back of the box?

My suggestion is to fall back to the next standard resolution below the native rendered resolution if confusing the customer is a big concern (i.e. a 1280x800 native buffer would be labeled 720p).

Okay. Let's drop the 360 in this example. It has a scaler and thus shouldn't count. I mean, older systems used all sorts of odd-sized framebuffers and yet still output a standard rez.

It seems like your issue is the fact that there are more than two possible resolutions that can be used (well, more than just 480i vs 480p). Add to that the fact that there is no real regulation on what is considered HD (see Wikipedia's def of HD Ready), and now you feel like you are getting gypped. I understand what you are saying. The best thing is to not list what resolutions are supported at all. Why? 'Cause honestly, only we care. My dad played his PS3 in 480i until I brought him my unused component set. He even beat Resistance before I got him the better cables. And he is a tech person. AFAIK most ppl just want to plug it in and have it work. Period. Would it be nice for every dev to tell us, "Hey, we used a 1024x768 buffer 'cause it was faster and the system can upscale it"? Sure it would, but people that don't understand (or care) are going to get confused.

Best Buy has this huge campaign on informing people about HD. It isn't as simple as plug the TV in, plug the movie player in, and enjoy. You have to make sure all the settings are correct. :cry:
 
If it's labeled 1080p, it should run fine on old 1080i sets, right?

And when I see 1080p I think of "full HD" as in 1920x1080 - are those the same (full HD and 1080p)?

When I took the RTL (colour TV certification) classes years ago, they always talked about how many horizontal lines there were; they never mentioned anything specific about pixels, as those don't exist in the old analog TV standards. If I remember correctly, you can change the colour of the beam more than 640 times per line, so on an old SD TV you can achieve a higher horizontal res than 640.

Anyway, this got me wondering whether 1080p just specifies the number of lines and nothing about the horizontal res...
 
I can't speak for other countries (but would assume it's similar): in NZ it would be illegal to say 1080p on the box yet have the game render at 720p. You'd have to get someone to complain first, and then the product would be removed, or more likely a sticker saying 720p would be placed over the 1080p.
 
No harm no foul, right? Or maybe not...

I personally would have a problem with it.

This places the responsibility for picture quality/sharpness on the reviewers and their sets. Who knows what the reviewer has for a set, or what res his 360 is set to output. Not to mention, like others have said, "how would he know what the res is?"

That example was an extreme one, so let's tone it down a bit.

Assume devs "unlock the power of PS3" and Burnout 7 manages to run 1080p @ 60fps on PS3. On the 360 they're still cruisin' along with the no-tile-scale-from-480(4xAA)-to-1080p method and manage the same 60fps.

The reviewer still has his 360 set to 720p and the TV is still a 720p LCD, so he "can't tell the difference" between the images.

Review scores are the same 90/100. Is this acceptable?
 
I personally would have a problem with it.

This places the responsibility for picture quality/sharpness on the reviewers and their sets. Who knows what the reviewer has for a set, or what res his 360 is set to output. Not to mention, like others have said, "how would he know what the res is?"

That is the case between TVs even with the same specs. I can get my TV calibrated, and even if you had the same TV it could still look different.

That example was an extreme one, so let's tone it down a bit.

Assume devs "unlock the power of PS3" and Burnout 7 manages to run 1080p @ 60fps on PS3. On the 360 they're still cruisin' along with the no-tile-scale-from-480(4xAA)-to-1080p method and manage the same 60fps.

The reviewer still has his 360 set to 720p and the TV is still a 720p LCD, so he "can't tell the difference" between the images.

Review scores are the same 90/100. Is this acceptable?
If everything but resolution is the same? Yes, it is fine by me.
 
I personally would have a problem with it.

This places the responsibility for picture quality/sharpness on the reviewers and their sets. Who knows what the reviewer has for a set, or what res his 360 is set to output. Not to mention, like others have said, "how would he know what the res is?"

That example was an extreme one, so let's tone it down a bit.

Assume devs "unlock the power of PS3" and Burnout 7 manages to run 1080p @ 60fps on PS3. On the 360 they're still cruisin' along with the no-tile-scale-from-480(4xAA)-to-1080p method and manage the same 60fps.

The reviewer still has his 360 set to 720p and the TV is still a 720p LCD, so he "can't tell the difference" between the images.

Review scores are the same 90/100. Is this acceptable?

I believe there were a couple of cases where IGN reviewed some games (can't remember which, in the 360's early days) in SD and said that there was no stuttering or frame drops.

So no, that would not be OK.
 
I personally would have a problem with it.

This places the responsibility for picture quality/sharpness on the reviewers and their sets. Who knows what the reviewer has for a set, or what res his 360 is set to output. Not to mention, like others have said, "how would he know what the res is?"

That example was an extreme one, so let's tone it down a bit.

Assume devs "unlock the power of PS3" and Burnout 7 manages to run 1080p @ 60fps on PS3. On the 360 they're still cruisin' along with the no-tile-scale-from-480(4xAA)-to-1080p method and manage the same 60fps.

The reviewer still has his 360 set to 720p and the TV is still a 720p LCD, so he "can't tell the difference" between the images.

Review scores are the same 90/100. Is this acceptable?

I didn't think your example was that extreme, honestly. Seems darn near plausible given the current state of multi-platform development (Publisher: "We have to get this game out on all platforms!!... At the same time!!1!"). Nonetheless...

Hmmm, several factors here. Let me break the situation down into chunks to make sure I understand you correctly.

PS3 Version:
Native Res: 1080p
Output Res: 720p?
Box Label: 480i/p, 720p, 1080p
FrameRate: 60 fps
Test TV: 720p LCD
Graphics Rating: x/10
Overall Score: 90/100

360 Version:
Native Res: 480p
Output Res: 720p?
Box Label: 480i/p, 720p, 1080p
FrameRate: 60 fps
Test TV: 720p LCD
Graphics Rating: x/10
Overall Score: 90/100

If the graphics look the same and it will properly output a 1080p signal, then I'd say all is well. In the same sense, I wouldn't care how Criterion got HDR and FSAA running at the same time on the PS3 version if they're claiming they have done so (FSAA and HDR sticker on the box :D). My focus is really on "what features can I expect from this game?", particularly for the PS3, as there are some things we can't take for granted on the PS3 that we can on the 360. Where I would have a problem is if the 1080p mode looked like crap while 480p or 720p does not. In that situation, if it were due to the image being scaled, I would have a serious problem with them using that 1080p label. But I assume it could also look like crap at 1080p yet not at 720p etc., even when natively rendered at those resolutions, if not done properly. In which case I would also have a problem with the 1080p label.
 
PS3 Version:
Native Res: 1080p 0xAA
Output Res: 1080i
Box Label: 480i/p, 720p, 1080p
FrameRate: 60 fps
Test TV: 720p LCD
Graphics Rating: x/10
Overall Score: 90/100

360 Version:
Native Res: 480p 4xAA
Output Res: 1080i
Box Label: 480i/p, 720p, 1080p
FrameRate: 60 fps
Test TV: 720p LCD
Graphics Rating: x/10
Overall Score: 90/100

Let's assume the reviewer has the consoles set to output 1080i, as in his mind "bigger number is better". All FX, geometry, textures, etc. are the same.

Is this review result fair?
 