Looking Back, This Gen Was Disappointing On The Graphics Side Of Things...

I'm more disappointed with shadows (a biggie for me), resolution and AA. But overall I'm satisfied, though I am thinking about getting into PC gaming.
 
Low IQ doesn't mean a lack of effort. It just means a difference of priorities. To achieve higher image quality, other features need to be reduced. Developers aren't making these calls, instead preferring a higher degree of per-frame eye-candy at a lower fidelity and frequency.

Agreed, and I can understand people wanting higher framerates, better physics or more AA at the expense of other things. My post was directed towards the original poster, who seems to be complaining about more or less everything, rather than simply advocating different priorities.
 
Infinity Ward did it right. No compromise. Go for 60, use simpler graphics, deliver excellent gameplay. WIN.

And the second game is actually looking pretty good imho.

Not that I mind 30 fps with extra eye-candy. However, most games end up being "almost 30... sometimes". I can't wait for Uncharted 2 given that, judging by the released footage, it's synced and has very little slowdown. When playing the first game I remember lots of situations where I was thinking "oh maaan if only this was perfectly smooth".
 
I'm personally pretty happy with this gen so far, but I want the new machines to come out within a reasonable time frame and with a solid upgrade in tech. If it goes beyond 2011 or 2012, I probably won't be as happy anymore.
 
Meh I can do quite well with a solid 30 as long as there aren't any horrendous drops.

So I'm quite happy with a solid 30 and great graphics, even for FPSes. That's the one thing that still keeps PCs ahead for me as a primary game platform: the ability to choose 60 FPS while sacrificing some quality, versus cranking things up and going with 30 FPS.

Would certainly be interesting if some console dev threw that into a game just as an experiment. Let the user choose between a 30 FPS cranked-graphics option and a 60 FPS lowered-graphics option, and see which one got used more.

Then again, some games already do this to an extent. How many people play Sacred 2 at 480p for better frame rates? So I'd be willing to guess the 30 FPS cranked graphics would be by far the more used option if some dev did that.

Regards,
SB
 
Personally my favorite framerate is 30 with some dips to 25-ish. It lets me know the hardware is being maximized (30 FPS target) and stressed (dips). And I don't really notice or mind the dips if they're not too bad.

This is pretty much the framerate of Gears, Killzone, Uncharted 2 etc.
 
There isn't a long way to go at all.

Generations last 6 years.

This generation began in 2005.

2011 is the end of this generation.
 
If you don't believe me then check this out -

IBM East Fishkill workers were called to an all-hands-on staff meeting Thursday evening where they were told more work is coming their way. Employees were told work will come to the Systems and Technology Group from Nintendo, Freescale and Sony. IBM employs more than 9,000 at East Fishkill, Poughkeepsie and Sterling Forest. Managers told employees they did not anticipate hiring additional workers. IBM officials could not be reached for comment.

http://gonintendo.com/viewstory.php?id=90612
 
Disappointment requires expectations. If your expectations were based on hype, that's a lesson to learn for the next round of game machines. That's what this thread title makes me think is the case.

I am actually rather well past being all about graphics. I play a lot of old games for various reasons. The newer games are either similar to the old stuff or there aren't modern games for the genres I like. So I guess I'm disappointed in the lack of originality and variety in modern gaming. My brother on the other hand is endlessly happy over his multiplayer shooters so obviously it's all from each person's perspective. Many of my friends are all about MMOs, too, while I am not (although this has little impact on consoles).

I will say though that the devs seem to push the hardware too far in a few cases. Oblivion comes to mind especially because it ran really poorly on 360. Mass Effect also was pretty rough with the data streaming. I'd like to see consoles somehow move away from ridiculously noisy optical drives.

There's more to the experience than just the sheen on the graphics.
 
I have the same problem as swaaye. The genres I really liked in the past seem to have mostly gone the way of the dodo. The only retail game on my must-have holiday shopping list is the new Ratchet & Clank. (I'll probably end up with one or two more, but most of the new stuff just doesn't interest me.)
Thank god for PSN and LiveArcade.

Graphics this generation, on the other hand, look fine to me. I also think we reached the point of diminishing returns quite a while ago. Things have started to mimic reality rather closely, but they are still a long way off from photo-realism, and a bunch of higher-resolution textures aren't going to fix that anyway.
 
If I were building a console, I'd have 60fps mandated.
I measured this empirically a couple of weeks ago with a few people, and we all concluded the same thing: 40fps is good enough, i.e. it should be the absolute minimum, and of course 60fps is better. One thing was certain: 30fps is absolutely not (*), and it's only used historically due to the TV refresh rate, something people will be stuck with for a long time no doubt.


(*) except for RTSes etc.
 

I still say that devs should actually give people a choice similar to PCs: a graphics option for 30 FPS with high graphics quality, or 60 FPS with lower graphics quality.

I would bet money that the majority of console users would opt for the 30 FPS high-quality option, and that the minority of "serious" players would opt for the 60 FPS option.

Just look at Far Cry, Oblivion, Crysis, etc., where rather than going with 60 FPS, most people dropped their FPS as low as possible in order to have the best visuals possible. I've seen plenty of people choose 15-20 FPS for better graphics even if their gameplay suffered.

Myself, I'm quite happy with a steady 30.

Regards,
SB
 
While I'm not bitter about anything, I am a bit disappointed by this gen's IQ. One issue is shadow quality: some games I honestly think would look better without any shadows at all. That's really how bad it looks sometimes. Texture quality and lack of filtering are another huge one.

With regard to frame rate, the only game I can think of where frame rate was a problem was Mass Effect for 360 (and maybe GTA4 on PS3). Other than that I find 30 fps to be perfectly smooth for my tastes. I'd even take 30 fps and greater IQ over 60 fps and slightly worse IQ.

However, it seems to be only going uphill from here, mostly on the PS3 side of things. With games like Killzone 2 and Uncharted 2, I really think we're starting to see a leap that will only continue on and improve. If more games come out looking like those (or better) this gen will actually have exceeded my expectations.
 
Actually, I think I might have a technical reason why 30fps is not good enough (TM), and why 40fps is good enough.

From testing, human reaction times are about 100 msec (e.g. anything quicker than this is a false start at the Olympics; just ask Linford Christie :) ). Thus even the most responsive 30fps game is going to be at or above this limit, so a normal, sober player will notice lag.

With 40fps the response is under this human reaction time.

Seems like a plausible reason worthy of further study; someone should write an article.
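
To put rough numbers on that reasoning, here's a minimal sketch. The assumption that total input-to-display latency is about three frame times (input sampling, rendering, display scan-out) is mine, purely for illustration; real pipelines vary.

# Rough illustration of the frame-rate vs. reaction-time argument above.
# ASSUMPTION: input-to-display latency is ~3 frame times; real pipelines vary.
REACTION_TIME_MS = 100   # the ~100 ms reaction-time limit cited above
PIPELINE_FRAMES = 3      # assumed frames of latency from input to photons

for fps in (30, 40, 60):
    frame_time_ms = 1000.0 / fps
    latency_ms = PIPELINE_FRAMES * frame_time_ms
    verdict = "at or above" if latency_ms >= REACTION_TIME_MS else "below"
    print(f"{fps} fps: {frame_time_ms:.1f} ms/frame, ~{latency_ms:.0f} ms lag ({verdict} 100 ms)")

# 30 fps: 33.3 ms/frame, ~100 ms lag (at or above 100 ms)
# 40 fps: 25.0 ms/frame, ~75 ms lag (below 100 ms)
# 60 fps: 16.7 ms/frame, ~50 ms lag (below 100 ms)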

As PhilB says, yes, shadows are a huge letdown, though he mentions Killzone 2; the final game in some respects has in fact exceeded the 'impossible to do' original trailer. The problem is that what's mind-blowing changes year after year: we get used to it and need a greater and greater fix to reach the same high (lingo for the benefit of the 30fps brigade :) ).
 
Most people's reaction times are well above 100 msec, more like 200-300 msec... at least as far as I know...
 
Games aren't about point reactions though. A lot of input is predictive, taking a feed of images that leads one to understand what will happen, allowing response by timing rather than reaction. E.g. in a racer, you can see the corner coming. Reactions aren't needed there, but a fast refresh makes everything clearer and improves the mental tracking and prediction of events.

A simple example: consider a test where a human subject has to press the fire button when a big dot is in a circle on screen. There are two different setups with different movement:
1) The dot appears randomly on screen.
2) The dot moves in a straight line towards the circle at a constant velocity.
If reaction time is measured as the time taken to hit the button after the dot enters the circle, clearly case 2 will have the more accurate results and case 1 the least accurate; importantly, case 1 will rely purely on reaction time, whereas case 2 will require very little reaction and depend on timing.

Personally I think the main reason for a higher refresh is aesthetic. It doesn't impact too much on most games. A higher refresh is just easier for the brain to work with and less jarring, especially during large camera movements like turns, so I guess there is a gameplay case. But for me, playing Booty for example, higher refresh doesn't affect the game at all, but it looks soooo much better!
 
Back to the OP comments,

I think it's silly to have expected every AAA game on £140-£400 consoles to look like Crysis on high settings.

At the beginning of this year I spent £800 on a brand spanking new PC expecting to play Crysis at a constant 60fps on high settings (even at 720p resolution, as I play on my Samsung HDTV through VGA).

My computer ended up being spanked by Crysis, and on high settings with my quad-core CPU and 8800-series GPU I still failed to achieve anywhere near a constant 30fps at 1024x768 resolution.

I spent £300 on my PS3 and had a much more enjoyable time, being much more impressed playing KZ2 and Uncharted.

For the money you spend on a console, you're getting phenomenal performance.

Also, apart from Crytek, id and a few higher-res console ports, I don't really see any future PC titles on the scale of big-budget console exclusives. PC nowadays gets up-res'd console ports, and many other games that, while being excellent games, aren't even comparable to Crysis (however many years old that game is).

More and more PC-only devs are going multiplat and leading on consoles first, so I really can't see any more PC exclusive mega-ZOMG-Teh-Graphix games like Crysis appearing on PC in future.

In a nutshell, I disagree with the OP. I find the best games on consoles pretty amazing. Yes, I'd prefer it if every game ran at 60fps, but last gen was barely any different, to be fair. Of course Sony and MS's original promises were unrealistic, but if you've been in gaming since the 8-bit days, you learn not to pay attention to PR hyperbole ;-)
 