Casuals

Often I see people arguing that the "casual gamer" is unable to see differences in AA, framerates, or even the difference between CGI trailers and the actual game.

The implication is that casuals have worse vision than gamers!

To me this is utter BS.

My experience is that casuals are just as capable of noticing even minor differences in framerate, anti-aliasing, you name it. They just don't CARE about the differences as much as geeks like us do.

I don't really have any "gamer" friends; of the friends I have, the few who own consoles play FIFA occasionally with friends, and that's about it.

They are equally able to spot differences in framerate, AA, etc., even when comparing across different platforms. They might not know the technical names for what they see, but they can tell that one game has more jaggies, or an unstable framerate, and so on.

Nobody I know, regardless of whether they own a console or not, is unable to tell the difference between 2x AA and no AA. Nobody I know is unable to see the difference between the Motorstorm E3 CGI and the real Motorstorm. Nobody I know is unable to notice that FIFA 08 on PS3 runs at a low framerate while on the X360 it's smooth as butter. They all see it; some don't care about it, but they are NOT blind!!!

Therefore I suggest that we stop saying that casuals don't see the differences, and rather say that some casuals might not care about the differences.
Any thoughts?
 
I had a friend who couldn't tell composite and RGB signals apart on a 52" TV. I agree that casuals don't care about the difference so much, but IMO it mostly means they don't perceive the difference as being as big as we do, and thus their eyes are not as honed to spot these differences. Of course it's not really due to their eyes, but to a lack of knowledge of this particular area. You see what you know.
 
I think it's simply how people 'look' at things.

My wife would never notice any differences unless I pointed them out - and I'd probably have to do it several times! It's like defects on TVs and things like that... people either have that 'quality control' eye or they don't, IMHO. If you have to point something out, then they haven't actually noticed it.
 
Just last night I had a gaming buddy (low-intensity gaming, very casual in tastes if not what the industry would class as a Casual) completely miss the difference between Botty at 30 fps and 60 fps. This came as a huge shock to me. So some people are blind to IQ issues, but I have no idea how much variation there is across the populace. You're probably right that most people are/can be aware. It would be very interesting for the industry to have stats though.
 
Change blindness. "Hardcore" gamers only notice it because they're looking for it. Years of reading and gaming have conditioned their minds to seek out these changes; they notice them immediately because they know what to look for, or at least know to look for something different. A casual, or someone who isn't aware that there could be a difference, isn't going to look for it, so they're much less likely to see it. There are always exceptions to the rule, but for the most part change blindness is a very well known and documented phenomenon. Just another reason people need to be aware that your brain has just as much to do with your sight as your eyes do.
 
Therefore I suggest that we stop saying that casuals don't see the differences, and rather say that some casuals might not care about the differences.
Any thoughts?

People here put games under a microscope and come up with insanely small differences. I cannot see these differences during gameplay, and I seriously doubt any non-gamer can.

Can people see the difference between 30fps and 12fps? Sure. Can they tell a distant texture in COD4 is lower res in one version over another while playing? No.

If you read the Eurogamer console-war-for-hits articles you will see how silly they are getting these days. Joe-off-the-street cannot see these differences in action; hell, I bet most of us could not in a blind test.
 
Just last night I had a gaming buddy (low-intensity gaming, very casual in tastes if not what the industry would class as a Casual) completely miss the difference between Botty at 30 fps and 60 fps. This came as a huge shock to me. So some people are blind to IQ issues, but I have no idea how much variation there is across the populace. You're probably right that most people are/can be aware. It would be very interesting for the industry to have stats though.

Your friend must not have been paying attention or must be incredibly insensitive to framerate. I'd say the vast majority of people can see the difference between 60fps and 30fps.
 
Your friend must not have been paying attention or must be incredibly insensitive to framerate. I'd say the vast majority of people can see the difference between 60fps and 30fps.

Remember the 3dfx 30fps vs 60fps test?

Where one half of the screen was rendered at 60fps and the other at 30fps?

At least from what I recall, all the journalists managed to immediately spot a difference.
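
For what it's worth, that kind of split-screen comparison is easy to throw together yourself. Below is a rough sketch using pygame (my own reconstruction for illustration, not the original 3dfx demo): both halves are redrawn every frame of a 60 Hz loop, but the object on the right only moves on every second frame, so it animates at an effective 30 fps while the left object moves at the same average speed.

```python
import pygame

WIDTH, HEIGHT = 800, 400

def main():
    pygame.init()
    screen = pygame.display.set_mode((WIDTH, HEIGHT))
    pygame.display.set_caption("left: 60 fps motion, right: 30 fps motion")
    clock = pygame.time.Clock()

    x60 = x30 = 0.0
    frame = 0
    running = True
    while running:
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                running = False

        # The left ball moves every frame; the right ball only moves on every
        # second frame, so on a 60 Hz loop it animates at an effective 30 fps.
        # The step sizes are chosen so both balls have the same average speed.
        x60 = (x60 + 4) % (WIDTH // 2)
        if frame % 2 == 0:
            x30 = (x30 + 8) % (WIDTH // 2)

        screen.fill((20, 20, 20))
        pygame.draw.line(screen, (255, 255, 255), (WIDTH // 2, 0), (WIDTH // 2, HEIGHT))
        pygame.draw.circle(screen, (0, 200, 255), (int(x60), HEIGHT // 2), 15)
        pygame.draw.circle(screen, (255, 120, 0), (WIDTH // 2 + int(x30), HEIGHT // 2), 15)
        pygame.display.flip()

        clock.tick(60)    # cap the loop at roughly 60 frames per second
        frame += 1

    pygame.quit()

if __name__ == "__main__":
    main()
```

Even with a scene this trivial, the stutter on the right half tends to stand out once the two sit next to each other, which is exactly the point the reply below makes about strict A/B setups.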
 
Remember the 3dfx 30fps vs 60fps test?

Where one half of the screen was rendered at 60fps and the other at 30fps?

At least from what I recall, all the journalists managed to immediately spot a difference.
Strict A/B tests are stupid, though. It's a situation that never arises in use.

If you want to test whether some image/sound quality factor is noticeable, show a bunch of random material with the feature off on one day, a different bunch on another day with the feature on, and then on a third day show yet another random sequence of scenes and ask whether the feature is on or off. (Okay, they don't have to be a day apart, but you get my point.)
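
The scoring side of such a blind test is simple to put in code. Here is a minimal sketch under my own assumptions (the clip names, trial count, and the simulated viewer are all made up): present randomly chosen scenes with the feature randomly on or off, record the answers, and check how likely the score would be under pure guessing.

```python
import random
from math import comb

TRIALS = 20
CLIPS = ["racing", "shooter", "cutscene", "sports"]   # hypothetical material

def run_blind_test(viewer, trials=TRIALS):
    """Present random clips with the feature randomly on/off; count correct answers."""
    correct = 0
    for _ in range(trials):
        clip = random.choice(CLIPS)
        feature_on = random.random() < 0.5        # hidden ground truth
        # In a real test the viewer would watch the rendered clip; here the
        # simulated viewer just receives the presentation and answers on/off.
        answer = viewer(clip, feature_on)
        correct += (answer == feature_on)
    return correct

def p_value(correct, trials=TRIALS):
    """Chance of scoring at least this well by pure guessing (one-sided binomial)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

if __name__ == "__main__":
    # A viewer who genuinely can't tell the difference is just guessing.
    guesser = lambda clip, feature_on: random.random() < 0.5
    score = run_blind_test(guesser)
    print(f"{score}/{TRIALS} correct, p = {p_value(score):.3f} under pure guessing")
```

Someone who really perceives the feature should score well above chance over enough trials; a guesser hovers around 50% no matter how confident they feel.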

AA (2x vs. none), AF (4x vs. none), black levels (factor of two), framerate (20%) and resolution (impact varies) are things I can tell instantly without an A/B test. These things matter to me. The threshold can be higher for others, so I think it's plausible for people not to notice the difference between 30fps and 60fps without direct side-by-side comparison.

DD 5.1 compressed vs. uncompressed sound, TV color accuracy (to a certain point), and lots of other stuff make no difference. I bet the vast majority of audiophiles and videophiles will not be able to pass my test.
 
Sure, when it's right next to each other and so on. However, how many journalists thought GTA IV looked nicer ("smoother") on the upscaled PS3 version? I initially went "yikes" but got used to it. My colleague is just starting CoD5 on PS3, but I don't hear him complain about upscaling; he probably doesn't even know it's upscaled. He may notice a difference when he plays Resistance immediately afterwards, but then again he may not, and he may not have a clue either that CoD5 is supposed to run at 60fps while Resistance 1 runs at 30fps. With CoD5 dropping down to 45fps at times, he might notice that more than Resistance 1, which stays at 30fps without a single frame missed; then again, he might notice something, but it's not that likely. Consider Dirt versus Forza 2: Dirt runs at 30fps, but motion blur helps that game look very smooth. It goes on and on.

And that's apart from the fact that when, say, Eurogamer compares two games and finds one version (usually PS3) is on average 2fps slower, that's not going to be noticed by single-platform gamers or even by people who see both games fairly regularly. On the other hand, a 30fps game that spikes down to 20fps is going to be noticed by a fairly large percentage of players.

When I was a helpdesk guy, I could walk through our office on my way to help someone and spot, left and right, any CRT that had for some reason jumped back to 60Hz. When I changed these to something like 80Hz, most people didn't notice the difference, except that at the end of the day they didn't have a headache, so they were still grateful. ;) When I was in a THX cinema in Stockholm 12 years ago, I immediately noticed that one of the speakers in the surround setup (probably 7.1, using maybe 15 speakers in total) didn't work. I was the only one, though, but that was partly because I had owned a surround set since 1992. Initially, my wife barely noticed the difference between HD and regular TV, but after a few movies that changed drastically.

I think you need to be trained to see certain things, and there are also physical differences between individuals - combined, they cause huge differences between individuals' perceptions, as I think Shifty mentioned before. Even then there is a huge difference between games and game contexts when it comes to things like framerate. Taking a sharp turn in a racing game or watching bullets fly by really stands out at 60fps vs 30fps, but if you're watching something where things aren't moving as fast, or are mostly moving towards or away from you, then the difference is going to be much less noticeable. Motion blur can also very effectively compensate for the difference between 60 and 30fps, and can sometimes even be more effective at tricking your brain into creating the illusion of fluid movement.

Etc. etc. etc.
 
It is harder to spot visual improvement than deterioration. It doesn't matter whether it is spatial resolution, filtering quality or temporal resolution. It's hard to go back.

The point of diminishing returns depends a lot on the type of game.

My brother, who doesn't have a clue about any of the above, spent a lot of time playing Forza on the old Xbox. Then, with the 360, he played (well, still does) a lot of Forza 2. His 360 RRODed and he was forced to play on his old Xbox for two weeks.

Quite unhappy, he couldn't express what the problem was, but kept saying things like "... less responsive", "... harder to estimate speed", etc. The difference is of course the 30 vs 60 Hz refresh, which means a lot in a racing sim.

He never thought about the improvement in framerate, but instantly recognised the deterioration when he reverted.

Cheers
 
When I was really active in the DVD backup scene (and encoding my VHS, S-VHS, etc.), I played around with many different codecs and encoding settings.

Sometimes when my friends came over I would make some quick and dirty A/B/C comparisons, and mostly they would not see any difference where I would see macroblocks, encoding artifacts, etc. Their usual comment was "You're crazy!" :D

So many of those things can be seen, but the eye needs to be "trained"/shown. The question is whether the casual user wants to be bothered with it.

I totally agree with DrJay24 that some comparisons are quite detached from the average user, but then I doubt a normal user who is not interested in technical stuff would have any fun reading, e.g., the console tech forum anyway. He put it nicely: big differences will be seen by everyone, small ones will not. Now we only need to agree on what a big and a small difference is.

I curse the day I started to dissect movies and video games, because you detach too much instead of enjoying them. I would understand it if I worked in the video game industry like many of you, but I'm in a completely different field professionally, so it's more like a curse ;).

Cheers...
 
I have to say that in my experience, good content (game, movie, etc.) is absorbing to the point that the technical side of the presentation goes unnoticed. Even for me, if I'm really absorbed in a game then the technical aspects of the presentation fall out of my mind, and I'm a graphics programmer.

The only time this is not the case is when a sudden drop in framerate breaks the immersion.

In a good game, I have to deliberately tear my attention away from the game to look for graphical techniques. The differences we pick up in the cross-platform comparisons are illuminating and interesting to me as a graphics/console programmer, but in reality, in almost all cases, we are talking about back-to-back comparisons being analysed in great detail by technically minded people, and the absolute differences are minor in the extreme.

To take a case in point, I had to show my better half a clip of a TV show in SD and 720p back to back before she noticed a difference. She immediately agreed that the 720p version looked a lot better. However, when we watched the same TV show the following week, I asked her afterwards whether it was SD or HD and she had no idea. It's not only the back-to-back nature of the comparison, but also the fact that when I did the comparison we were not watching the program, just looking at the difference in resolution. When actually watching the program and being absorbed in the storyline, resolution was the furthest thing from her (and, I admit, my) mind.
 
Maybe casual gamers aren't really anal enough to be bothered with the subtleties of the shiny bits. They just want a simple bit of gaming pleasure for half an hour or so?
 
Often I see people arguing that the "casual gamer" is unable to see differences in AA, framerates, or even the difference between CGI trailers and the actual game.

The implication is that casuals have worse vision than gamers!

To me this is utter BS.

Like others mentioned, nobody said that people can't see the difference. They just don't see it unless you point it out, mostly because they are not trained to see it or do not care about it. Another example: I once saw close-ups of the Uncharted models and thought they were the best thing ever. A 3D artist friend of mine almost got angry because of the incorrect lighting at the two points where the mouth ends (I think). The same goes for audiophiles rejecting compression and wine tasters guessing whatever they guess about wines.

Also, your casually playing friends are not Joe Average. Your mom is (and my mom too).
 