What Makes Graphics?

60fps is nice, but other new stuff like lighting does far more for graphics, IMO.
When I played Dante's Inferno I never once noticed it was 60fps until I read later that it was.
 
LOL :) I actually only make models; others do the textures and shaders, animate them, light them, render it out, and put the passes together in comp...

As for fps: according to some friends of mine, James Cameron seriously considered shooting Avatar at 48fps in addition to stereoscopic 3D, but it would have required too much from theater owners, so they settled on the standard frame rate, for now. I would expect them to push this issue as soon as they can, though - the Avatar Day footage clearly showed a lot of ghosting and jerkiness. And an interactive visual experience like a video game needs a stable and relatively high frame rate even more. That said, 30 fps is usually enough - but 60 fps is certainly nicer.
 
The argument is that, like it or not, framerate CAN be an aspect of graphics, or it can NOT exist at all. Graphics can be in motion or NOT. When they're NOT in motion, framerate doesn't (and can't) exist.
You're right. I'll accept that. Graphics can exist without framerate. CGTalk is full of nice renders that are 'graphics'.

However, in the context of gaming, when determining the quality of a game's graphics, framerate is a factor. You can't have a game without a framerate, and the framerate determines an aspect of the quality of the visuals.
 
When talking about graphics, the argument semitope is putting forward appears to be "how good do screenshots of the game look?" as part of the "best graphics of the year" showcase. I think the other side is more interested in how a game looks as an overall visual package - not just screenshot-wise, but whether it is smooth, well animated, and keeps you involved in the overall visual experience.

Graphics != screenshots. Pushing any other case will give us lots of crappy-performing games over time :(

I fully agree with this... one should judge only the full package, because as a gamer you have to (!) experience the full package whether you want to or not!

It doesn't help you if a game looks good in screenshots when you have to play it with a stuttering framerate... this is like ignoring other features, like for instance 'teh jaggies'... would someone say that inFamous looks good if we judged the graphics without the aliasing, because we define that aliasing is not part of the graphics? This does not compute, in my opinion!

In short: as a gamer you have to judge the overall visual package, simply because there is no other option available when playing the game.

(I think this is very different for the PC crowd, who can tweak the settings to satisfy their personal needs: more bling, lower framerate; more anti-aliasing, fewer details; ...)!
 
You're right. I'll accept that. Graphics can exist without framerate. CGTalk is full of nice renders that are 'graphics'.

However, in the context of gaming, when determining the quality of a game's graphics, framerate is a factor. You can't have a game without a framerate, and the framerate determines an aspect of the quality of the visuals.
You are right as well. You can't have a game without framerate. Framerate HELPS the quality of graphics in a video game, but I don't believe it should be elevated ABOVE the graphics themselves (for the reasons I mentioned in previous posts).

Since the thread that spawned this one is about graphics alone (no framerate mentioned in the OP), framerate shouldn't have a value of 50% or greater. This should be especially true when the difference is from 30 to 60 fps. An increase from 8 to 30 fps would be a bit of a different story when it comes to gaming in general. As previously stated, in not so many words, going from 30 to 60fps would be more of a gameplay element than a graphical one. IMO, that point seems pretty obvious.
 
You are right as well. You can't have a game without framerate. Framerate HELPS the quality of graphics in a video game, but I don't believe it should be elevated ABOVE the graphics themselves (for the reasons I mentioned in previous posts).

I don't believe anyone suggested it should be. I think everyone is arguing about totally separate things :LOL:
 
As previously stated, in not so many words, going from 30 to 60fps would be more of a gameplay element than a graphical one. IMO, that point seems pretty obvious.

If you created a KZ2- or UC2-quality game with an average framerate of 30 fps and I created a KZ2- or UC2-quality title at an average of 60 fps, then I would have created the more graphically impressive game. While the IQ within each frame may be the same, my visuals are more impressive because my game is producing them at twice your frame rate. Producing a level of IQ at 30 fps is not as impressive as producing the same IQ at 60 fps, just like bench pressing 300 lbs once is not as impressive as bench pressing 300 lbs twice in a row.

Framerate is an inherent part of both gameplay and graphics; it is one of the variables where the two meet. Visuals are not developed in a vacuum but in line with a framerate target. The fact that you can sacrifice framerate to increase graphical quality, or sacrifice graphical quality to improve framerate, means that framerate deserves to be considered when judging graphics.

While I can see why a layperson wouldn't necessarily consider framerate when judging graphics, I can't see how anyone with some technical understanding would not.
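
To put rough numbers on the 30-vs-60 comparison, here is a minimal sketch of the budget arithmetic - nothing engine-specific, just the reciprocal of the target frame rate:

```cpp
#include <cstdio>

int main() {
    // Everything the renderer does per frame (geometry, shading,
    // post-processing) has to fit inside this time window.
    const double targets[] = {30.0, 60.0};
    for (double fps : targets) {
        std::printf("%.0f fps -> %.2f ms per frame\n", fps, 1000.0 / fps);
    }
    // Prints: 30 fps -> 33.33 ms, 60 fps -> 16.67 ms.
    // Same image quality at 60 fps means doing the same work in
    // half the time -- the "bench press it twice" part of the analogy.
    return 0;
}
```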
 
You are right as well. You can't have a game without framerate. Framerate HELPS the quality of graphics in a video game, but I don't believe it should be elevated ABOVE the graphics themselves (for the reasons I mentioned in previous posts).

A game with only one static frame will be graphically as useless as one with 60 blank frames a second.

Since the thread that spawned this one is about graphics alone (no framerate mentioned in the OP),...

Frame rate is an aspect of real-time, or "interactive", graphics rendering.

You can't talk about real-time or "interactive" graphics without taking into account the real-time or "interactive" bit.

... framerate shouldn't have a value of 50% or greater.

What would having a "value" of "50%" mean with regards to "best graphics" for console games? What is this value representing, and how is it standardised among gamers with different preferences? Define and give examples, please.

This should be especially true when the difference is from 30 to 60 fps. An increase from 8 to 30 fps would be a bit of a different story when it comes to gaming in general. As previously stated, in not so many words, going from 30 to 60fps would be more of a gameplay element than a graphical one. IMO, that point seems pretty obvious.

Q) How does a graphics programmer benchmark his code?
A) He doesn't! Performance isn't part of graphics! It's the game designer's problem!

*Rimshot*
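
Joking aside, the most basic form of that benchmarking really is just timing every frame. A minimal sketch, where render_frame() is a hypothetical stand-in for whatever the engine actually does each frame:

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

// Hypothetical stand-in for the real per-frame rendering work.
static void render_frame() {
    std::this_thread::sleep_for(std::chrono::milliseconds(16));
}

int main() {
    using clock = std::chrono::steady_clock;
    for (int i = 0; i < 5; ++i) {
        const auto t0 = clock::now();
        render_frame();
        const auto t1 = clock::now();
        const double ms =
            std::chrono::duration<double, std::milli>(t1 - t0).count();
        std::printf("frame %d: %.2f ms\n", i, ms);
    }
    return 0;
}
```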
 
And given that this is a mostly PC site based around GPUs, I find it rather confusing that some don't factor framerate into graphical quality.

We are on a site where GPUs are judged by standardizing visual settings and measuring how many frames each GPU pushes per second. If that doesn't show you how framerate plays an important role in what you see on screen, I don't know what will.
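
And the numbers those GPU reviews report are derived from exactly such per-frame timings. A rough sketch, assuming a hypothetical list of captured frame times:

```cpp
#include <algorithm>
#include <cstdio>
#include <numeric>
#include <vector>

int main() {
    // Hypothetical captured frame times, in milliseconds.
    std::vector<double> frame_ms = {16.7, 16.9, 33.4, 16.5, 17.0, 16.8};

    const double total   = std::accumulate(frame_ms.begin(), frame_ms.end(), 0.0);
    const double avg_fps = 1000.0 * frame_ms.size() / total;

    // Sorting exposes the slowest frames -- the stutter an average
    // alone hides (reviewers summarize these as "1% lows").
    std::sort(frame_ms.begin(), frame_ms.end());
    const double worst_ms = frame_ms.back();

    std::printf("average: %.1f fps, worst frame: %.1f ms (%.1f fps)\n",
                avg_fps, worst_ms, 1000.0 / worst_ms);
    return 0;
}
```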
 
... Producing a level of IQ at 30 fps is not as impressive as producing the same IQ at 60 fps, just like bench pressing 300 lbs once is not as impressive as bench pressing 300 lbs twice in a row.

300 lbs is still 300 lbs no matter how many times you lift it, just like the graphics are the same quality no matter how many times the scene is rendered per second. Yes, it's more impressive, the visual output is better, and yes, it's better as a whole, but it's not better graphics. A photograph has just as good graphics as a video. Maybe they lump all that under graphics in reviews, but it's still wrong. There is no frame rate setting under the graphics options in games.
 
I don't know much about technical stuff, but I still believe this is a valid point:

I see many comparing, for example, Alan Wake and Uncharted 2 graphics and saying "hey, U2 looks better, therefore it has better graphics"...

But they completely neglect that Wake is built on open-world technology and therefore renders environments many times bigger than those in Uncharted 2, which takes a toll on the visuals.

Just like how Far Cry 2 isn't considered one of the best-looking games on consoles compared to Gears of War 2 or Killzone 2 - but isn't Far Cry 2 very impressive graphically on consoles, considering it is free-roaming and open-world?



And in my eyes, framerate is a vital part of the graphical score... RAGE running open-world at 60 FPS is, for me, better than KZ2/GoW2/whatever running closed environments at 30 FPS.

And if Cryengine 6 came out on consoles, looking absolutely phenomenal but running at only 0.0001 FPS, of course it couldn't get a good graphical score.
 
I still don't understand how anyone can think you can evaluate the graphics of a game from a still image. The graphics can only be properly appreciated in motion. Frame rate, by its nature, is part of the graphics.

You couldn't evaluate the visuals of a movie, especially the cinematography, from a series of still images. The motion of the image is a huge part of the experience. Photographs and still images are one medium; moving pictures are another.
 
300 lbs is still 300 lbs no matter how many times you lift it, just like the graphics are the same quality no matter how many times the scene is rendered per second. Yes, it's more impressive, the visual output is better, and yes, it's better as a whole, but it's not better graphics.

You appear to be saying that the subject of "computer graphics" concerns itself only with how nice (subjective) a given image is, and not with issues like performance.

To say this is not true is something of an understatement.

A photograph has just as good graphics as a video.

If you're judging the effectiveness of those "graphics" by how well they represent changing visual data over time, then no, it does not.

And since the purpose of a video game's graphical output is to represent changing information over time, you would take that into account when judging which was "best".

Maybe they lump all that under graphics in reviews, but it's still wrong. There is no frame rate setting under the graphics options in games.

There's no frame rate setting under "gameplay" options either. Or audio. Or Load/Save.
 
300 lbs is still 300 lbs no matter how many times you lift it, just like the graphics are the same quality no matter how many times the scene is rendered per second. Yes, it's more impressive, the visual output is better, and yes, it's better as a whole, but it's not better graphics. A photograph has just as good graphics as a video. Maybe they lump all that under graphics in reviews, but it's still wrong. There is no frame rate setting under the graphics options in games.

No sir, you do not judge a 2D still image the same way you judge the 3D graphics of a game. Graphics, by the basic definition, are visual representations on some surface. In the case of a video game, the surface is a PC screen.

Suppose I went to your house, set it on fire, and brought Ansel Adams back to life to take a picture of that fiasco using a $25,000 professional camera. :LOL: A lot of people might judge the image to be stunning. What I can't do is take that glorious image, paste it into my game as a 2D animated GIF, and expect the same compliments. A 2D image only has to capture the look; a 3D game has to capture not only the look but also the motion of the fire. Thereby, graphics in a 3D video game are naturally dependent on frame rate. A video of fire running at 10 frames per minute is not going to look as smooth and realistic as one running at 60 fps, and no one is going to consider them graphically equivalent.

Furthermore, just about any graphics setting in a PC game will affect framerate, so while there is no framerate setting under the graphics options, when you choose those settings you are indirectly determining the average framerate of your game.
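
To make that concrete, here is a toy sketch with completely made-up per-setting costs (illustrative only, not measurements from any real engine):

```cpp
#include <cstdio>

int main() {
    // Made-up per-frame costs, in milliseconds, for individual settings.
    const double base_ms    = 10.0;  // geometry + basic shading
    const double shadows_ms = 6.0;   // high-resolution shadow maps
    const double aa_ms      = 4.0;   // anti-aliasing resolve
    const double ssao_ms    = 5.0;   // ambient occlusion pass

    double frame_ms = base_ms + shadows_ms + aa_ms + ssao_ms;
    std::printf("everything on: %.1f ms -> %.0f fps\n",
                frame_ms, 1000.0 / frame_ms);

    // Toggle one "graphics" option and the framerate moves, even
    // though no menu item is labelled "framerate".
    frame_ms -= ssao_ms;
    std::printf("SSAO off:      %.1f ms -> %.0f fps\n",
                frame_ms, 1000.0 / frame_ms);
    return 0;
}
```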
 
I don't know much about technical stuff, but I still believe this is a valid point:

I see many comparing, for example, Alan Wake and Uncharted 2 graphics and saying "hey, U2 looks better, therefore it has better graphics"...

But they completely neglect that Wake is built on open-world technology and therefore renders environments many times bigger than those in Uncharted 2, which takes a toll on the visuals.

Just like how Far Cry 2 isn't considered one of the best-looking games on consoles compared to Gears of War 2 or Killzone 2 - but isn't Far Cry 2 very impressive graphically on consoles, considering it is free-roaming and open-world?



And in my eyes, framerate is a vital part of the graphical score... RAGE running open-world at 60 FPS is, for me, better than KZ2/GoW2/whatever running closed environments at 30 FPS.

And if Cryengine 6 came out on consoles, looking absolutely phenomenal but running at only 0.0001 FPS, of course it couldn't get a good graphical score.

You are putting a lot of things together to reach a conclusion about just one thing. Graphics doesn't have to be about how impressive an achievement the game is overall, or how the gameplay feels with it. The open-world comment might apply, but that's not framerate-related.

Alan Wake is open world btw?
 
Alan Wake is open world btw?

Yeah... There were a lot of misunderstandings after a faulty translation, but Remedy came out and said that the open world is still there, just that it won't be the typical sandbox gameplay (because they want to focus on story).

What we know is that the night events will be more linear, and during the day it is more about exploring.
 
Let me see if I can explain this: if we have a still image, be it for an advertisement or for a pie chart, we call it a graphic, so I do not count frame rate in with that, any more than I do accurate physics (both are separate). BUT! I think some confuse graphics with presentation. Graphics is simply the quality of the frames being rendered; with motion, we get fluidity of animation, we get motion blur (a graphic updated over time), et cetera. Together, those make a complete experience, something *more* than graphics. So I think the thread title really should say "What Makes a Presentation". And I think the answer to that is something different to everyone.
 
Shaders, for example, more often than not need to be seen in motion to be properly appreciated, and for that you need a good framerate.
 
Let me see if I can explain this: if we have a still image, be it for an advertisement or for a pie chart, we call it a graphic, so I do not count frame rate in with that, any more than I do accurate physics (both are separate). BUT! I think some confuse graphics with presentation. Graphics is simply the quality of the frames being rendered; with motion, we get fluidity of animation, we get motion blur (a graphic updated over time), et cetera. Together, those make a complete experience, something *more* than graphics. So I think the thread title really should say "What Makes a Presentation". And I think the answer to that is something different to everyone.

I think you need to understand the context of this discussion. This started out as a "best graphics of 2009" discussion. I (and apparently others here too!) don't judge a game's graphics solely on screenshots. It's the visual package that makes up the graphical quality of a game - models, textures, smoothness, animation, shaders, the whole kit and caboodle.

For reference - would Myst, or any FMV-based game, be considered graphically superior to anything the PS3 or 360 have done to date? The characters are more lifelike, after all, when you look at their screenshots.
 