What Makes Graphics?

Status
Not open for further replies.
When talking about graphics, the argument semitope is putting forward appears to be "how good do screenshots of the game look?" as part of the "best graphics of the year" showcase. I think the other side is more interested in how a game looks as an overall visual package - not just screenshot-wise, but is it smooth, well-animated, and does it keep you involved in the overall visual experience.

Graphics != screenshots. Pushing any other case will give us lots of crappy-performing games over time :(

Graphics = screenshots ! fixed :D

screenshots was just to make a point. I think you guys are mixing things up a bit. If you show a group of gamers still pics of Uncharted 2 and a gameplay vid of Dante's Inferno, they won't say "oh, this one is not moving, so DI has better graphics." They'd still pick UC2 based on the pics alone. Now take a 30fps game with better graphics and compare it to a 60fps game, with both in motion, and you will still end up with the same result.

I understand your argument though, I think :-/
 
Graphics = screenshots ! fixed :D

screenshots was just to make a point. I think you guys are mixing things up a bit. If you show a group of gamers still pics of Uncharted 2 and a gameplay vid of Dante's Inferno, they won't say "oh, this one is not moving, so DI has better graphics." They'd still pick UC2 based on the pics alone. Now take a 30fps game with better graphics and compare it to a 60fps game, with both in motion, and you will still end up with the same result.

They'd say a game they've only seen a still picture of has better graphics than a game they're watching in motion? Are these people idiots?

Your second point isn't as absolute as you make it seem.
 
My follow-up question to that would be as follows:
Why aren't consumers shown 60fps, if it is so important? ;)
Two answers.
1:-
I'm not saying there isn't an impact... The point here is not to quantify a threshold above which greater framerate is worthless, but to highlight that framerate is a part of the graphics equation.
2:- Legacy. Film was shot at 24 fps in the earlier days (after 16 fps and similar silent-movie rates), as I understand it as much as a cost-cutting measure as anything. If it had been possible and economical to shoot at 60 fps, factoring in the lighting, exposure, editing, processing and storage, both in creating the film and distributing it to movie theatres, then I'm sure they would have. But they didn't. The end result is a legacy format. 24 fps motion looks diabolical. Up! in 3D had moments of judder that weren't anything like smooth motion. There's an artistic issue regarding the style of film, and some suggest faster framerates look cheaper than the 24 fps equivalent. Maybe that's the case due to conditioning. You could argue that in this age of digital distribution, CGI doesn't need to limit itself to 24 fps. But you'd have increased costs of producing and rendering the same film at 60 fps, and I don't know what projector technology exists to show it anyway.

You only need see a panning shot in LOTR and the juddering mountains to appreciate the negative impact of a low framerate, and to want a faster refresh, which is what we want in graphics too - a refresh which is comfortable for the content being shown.
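To put some rough numbers on that panning-shot judder, here's a minimal sketch of the per-frame angular jump for a steady pan at different frame rates. The 30 deg/s pan speed is an illustrative assumption, not a measurement from any particular film.

```python
# Per-frame displacement of a steady horizontal pan.
# Bigger jumps between frames read as judder to the eye.

def pan_step_deg(pan_speed_deg_per_s: float, fps: float) -> float:
    """Angular jump between consecutive frames for a steady pan."""
    return pan_speed_deg_per_s / fps

# Assume a fairly brisk 30 deg/s landscape pan (illustrative value).
speed = 30.0
for fps in (24.0, 60.0):
    print(f"{fps:g} fps -> {pan_step_deg(speed, fps):.2f} deg/frame")
# 24 fps -> 1.25 deg/frame; 60 fps -> 0.50 deg/frame
```

At 24 fps the camera skips two and a half times as far between frames as at 60 fps, which is why slow pans over detailed backgrounds (like those LOTR mountains) are where low framerates look worst.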
 
They'd say a game they've only seen a still picture of has better graphics than a game they're watching in motion? Are these people idiots?
No, but they are making an ill-informed judgement given incomparable information. You'd need to compare like-for-like data. Either show both games in motion and see which they prefer (this would be a good case to prove that framerate is part of graphics. Get Polyphony to add a PhotoRace mode that renders every frame in photo-mode quality at 1 frame per second, and see how many people pick that as a better-looking game than GT5 in realtime mode!) or show both as stills and see which they prefer. The latter would isolate framerate from the evaluation, but that doesn't mean it's not important. You could compare PR shots from two games and ask which has better graphics. Responses would consider shading, texturing, lighting and art style, but not the IQ factors which the PR shots would eliminate. Then show the games in real gameplay mode, one at native 720p with 4xAA and the other upscaled from 600p without AA, and see how that affects the responses.
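For a sense of scale on that native-vs-upscaled comparison, a quick sketch of the raw pixel counts involved. The 1066x600 sub-HD framebuffer is an assumed example (actual sub-HD buffer widths vary by game).

```python
# Raw framebuffer pixel counts: native 720p vs an assumed
# 1066x600 sub-HD buffer that gets upscaled to 720p.

native = 1280 * 720   # 921,600 rendered pixels
sub_hd = 1066 * 600   # 639,600 rendered pixels

print(native, sub_hd)
print(f"native renders {native / sub_hd:.2f}x as many pixels")
# ~1.44x - roughly 44% more pixels actually rendered at native 720p
```

That ~44% gap is the kind of IQ difference that PR stills (which are usually rendered at full or super-sampled resolution) would hide entirely.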
 
They'd say a game they've only seen a still picture of has better graphics than a game they're watching in motion? Are these people idiots?

Your second point isn't as absolute as you make it seem.

No, they are normal people. They simply compare the two visually, whether or not both are in motion. This may be an outrageous comparison, but compare a pic of Halo 3 to a vid of Tetris. Would you really say Tetris looked better just because it was moving? In that case... this:

Are these people idiots?

would apply

The problem is our interpretations. I agree that if a scene is moving then it needs to move at a reasonable rate, but is it that framerate is just something that could prevent the appreciation of the scene, rather than something that actually contributes to it? I played Crysis back when it came out, at 20-30 fps, and I really don't think 60fps would have made it any better looking.
 
Two answers.
1:-

2:- Legacy. Film was shot at 24 fps in the earlier days (after 16 fps and similar silent-movie rates), as I understand it as much as a cost-cutting measure as anything. If it had been possible and economical to shoot at 60 fps, factoring in the lighting, exposure, editing, processing and storage, both in creating the film and distributing it to movie theatres, then I'm sure they would have. But they didn't. The end result is a legacy format. 24 fps motion looks diabolical. Up! in 3D had moments of judder that weren't anything like smooth motion. There's an artistic issue regarding the style of film, and some suggest faster framerates look cheaper than the 24 fps equivalent. Maybe that's the case due to conditioning. You could argue that in this age of digital distribution, CGI doesn't need to limit itself to 24 fps. But you'd have increased costs of producing and rendering the same film at 60 fps, and I don't know what projector technology exists to show it anyway.

You only need see a panning shot in LOTR and the juddering mountains to appreciate the negative impact of a low framerate, and to want a faster refresh, which is what we want in graphics too - a refresh which is comfortable for the content being shown.

Wouldn't that show that it isn't so important, then? If it was so important, it would be a necessary evil (which would put the cost out the window). :smile:

Like I said before, I have never seen the term framerate exist outside of graphics, but I have seen the term graphics exist outside of framerate. Have you seen framerate exist/have purpose outside of graphics? If not, the ultimate answer should be revealed, right?
 
Graphics = screenshots ! fixed :D

screenshots was just to make a point. I think you guys are mixing things up a bit. If you show a group of gamers still pics of Uncharted 2 and a gameplay vid of Dante's Inferno, they won't say "oh, this one is not moving, so DI has better graphics." They'd still pick UC2 based on the pics alone. Now take a 30fps game with better graphics and compare it to a 60fps game, with both in motion, and you will still end up with the same result.

I understand your argument though, I think :-/

It's a stupid comparison. The people looking at still pics extrapolate from the data they are given and assume a certain level of performance in motion. Still pics only show half the story, and it's down to assumptions to fill in the gaps.

A valid comparison would be to show Uncharted 2 at 10fps and then again at 60fps and ask people which looked better.
 
Wouldn't that show that it isn't so important, then? If it was so important, it would be a necessary evil (which would put the cost out the window). :smile:
The argument isn't that framerate is/isn't important (I'm starting to question what language I'm typing here, as I feel I'm repeating that point more often than should be necessary! ;)). The argument is that, like it or not, framerate is an aspect of graphics. Do we need antialiasing to play games? No. Ergo, would you say antialiasing isn't a part of graphics? No. It's just one aspect that we like more of. Similar to resolution. Similar to framerate.
Like I said before, I have never seen the term framerate exist outside of graphics...
Which surely proves framerate is intrinsic to graphics, because you can't have framerate without graphics?
 
The argument isn't that framerate is/isn't important (I'm starting to question what language I'm typing here, as I feel I'm repeating that point more often than should be necessary! ;)). The argument is that, like it or not, framerate is an aspect of graphics. Do we need antialiasing to play games? No. Ergo, would you say antialiasing isn't a part of graphics? No. It's just one aspect that we like more of. Similar to resolution. Similar to framerate.
Which surely proves framerate is intrinsic to graphics, because you can't have framerate without graphics?

Why that statement? We do need framerate to play games, so the comparison is off a bit there. Graphics and framerate will always go together because it's a balancing act between the two. Framerate is definitely important if you want to enjoy a game's graphics.

Let's say that when Crysis came out, no one could play it above 15fps. Would that have made it any less the most graphically impressive game, even now?

I'd love input from a few more people on this here; otherwise those who are actively contributing right now will just be going around in circles on the topic.
 
Riddle me this. COD7 comes out on PS3 and 360. It runs at 15fps on PS3 and 60fps on 360. Would anybody in their right mind think they should get the same score for graphics?

This is nuts...
 
Riddle me this. COD7 comes out on PS3 and 360. It runs at 15fps on PS3 and 60fps on 360. Would anybody in their right mind think they should get the same score for graphics?

This is nuts...

In that situation the PS3 version should get a lower gameplay score, by the logic in this thread.
 
Graphics are the sum of visual presentation. Frame rate, high or low, can easily add or detract from that presentation.
 
Indeed, so what would those people say if it was a point-and-click adventure game, where framerate is irrelevant to gameplay?

I have no idea. To me, frame rate is part of the graphics. If there's no frame rate, you have no visuals. The importance varies by title, but frame rate is definitely part of the "graphics" or the overall visual presentation.
 
Riddle me this. COD7 comes out on PS3 and 360. It runs at 15fps on PS3 and 60fps on 360. Would anybody in their right mind think they should get the same score for graphics?

This is nuts...

Good question; bring it up to 30 fps and ask that question again. Like I said, framerate is necessary to appreciate the graphics in motion (edited to reflect agreement with Lucid_Dreamer), the same as a responsive controller is necessary to appreciate gameplay. What is there is there; how it comes across to you doesn't change that. Show a reviewer a game where fps doesn't limit his appreciation of the graphics and he will come to the same conclusion no matter how high or low the fps is.

Cover one of your ears and slam a hammer on a wooden table: it's loud, yet you can't appreciate exactly how loud it is, and that doesn't change at all how loud it is. Same with framerate: take a great-looking game and drop the framerate and you might not be able to appreciate how good it looks, but that doesn't change how good it looks.

If you see beyond the limitations of the controller you will see how the game is supposed to look. Another way to look at it is from a PC perspective. Let's say I run a game on max settings on a 5-year-old PC and someone runs it at the same settings on a current high-end rig. Will his graphics be better than mine, or will he just be able to play the game better?
 
The argument isn't that framerate is/isn't important (I'm starting to quetsion what language I'm typing here as I feel I'm repeating that point more often than should be necessary! ;)). The argument is that, like it or not, framerate is an aspect of graphics. Do we need antialiasing to play games? No. Ergo would you say antialising isn't a part of graphics? No. It's just one aspect that we like more of. Similar to resolution. Similar to framerate.
The argument is that, like it or not, framerate CAN be an aspect of graphics, or it can NOT exist. Graphics can be in motion or NOT. When it's NOT in motion, framerate doesn't/can't exist.

The issue seems to be that some people are TRYING to elevate framerate OVER the graphics themselves. This is not correct. I've shown that framerates cannot exist without graphics, but graphics CAN exist without framerates. Therefore, the only logical conclusion is that framerates shouldn't be elevated over graphics.

Which surely proves framerate is intrinsic to graphics, because you can't have framerate without graphics?
Actually, it proves the opposite. It proves that framerate NEEDS graphics, but graphics do not NEED framerate. Remember, graphics can be still images (no framerate required). ;)

In that situation the PS3 version should get a lower gameplay score, by the logic in this thread.
I totally agree.
 
The question isn't how important framerate is, but whether it plays a part in graphics within games. Animation is also of course part of graphics; just because you can have graphics without animation doesn't make it irrelevant to a game's graphics. Physics also plays a part in the graphics of a game, yet you can have graphics without it.
 
Can I ask a question to those that think framerate isn't tied to graphics? Am I correct in assuming you started 3D gaming on consoles, whereas those that think framerate is directly tied to graphics started 3D gaming on PC?

PC gamers as a rule tend to prioritise by:
1) stable framerate
2) image quality
3) increased "screenshot"-type visuals
 
Can I ask a question to those that think framerate isn't tied to graphics? Am I correct in assuming you started 3D gaming on consoles, whereas those that think framerate is directly tied to graphics started 3D gaming on PC?

PC gamers as a rule tend to prioritise by:
1) stable framerate
2) image quality
3) increased "screenshot"-type visuals

Nope, I have never gamed on PC in a big way. All I know is I would never consider a game that runs at 10fps as having good graphics compared to the same game running at 30. Hell, a CG movie running at 10fps compared to the same at 24fps would have the same effect, and that has no gameplay to speak of.
 