Top developers slam PS3 "broken" allegations

Shifty Geezer said:
Who's trying to justify waiting an extra year to play Heavenly Sword and Resistance etc.? ;)

You'll have to wait for them, regardless of what other console you might buy in the interim :)
 
You'll always get better graphics at 30 FPS, so you'll always have it around.

IMO I prefer it that way. If you told me the next Halo (or previous Halos) could be 30 FPS with markedly better graphics or 60 FPS with lower graphics, I'd take the former every time. Hell, 30 FPS fits Halo's slow-paced single-player game perfectly.
 
Meh

In any case, I wish I had the eyes that everybody else did. As much as I wish I could say "Argh, 60fps >> 30fps! All good games are at 60fps!" I just can't. Sure, I can tell the difference. It's nice to have a buttery smooth game, and 60fps tends to have more games like that than 30fps. But I can also appreciate more detail and more AF much more easily. And honestly, I have a high tolerance for framerate. Playing at 25 or so in a PC game is... workable.

Tearing drives me insane, though. Other graphical glitches too.
 
From the videos I've seen, the game seems too fast to be 30fps :???:

Seems more like 60fps.

Of course I could be wrong
 
For an FPS game, or anything else "slower than 20 km/h", I find it impossible to reliably judge between 30fps and 60fps from low(ish) quality video.
But I'm pretty sure that when I get my next-gen console and I'm watching it on my 92" front-projection screen from 3 meters away, 30fps will be more disturbing in more games than it was this gen on a 50" screen.

Personally, I can't from memory guess whether the Resistance: Fall Of Man videos I've seen are 30fps or 60fps.
They seemed smooth enough, though, when viewed on a 17" screen at 1/4 screen size from 60 cm away. But that's not how I'll be viewing it if I buy it.
The videos did look much smoother and less eye-straining than, for example, the PGR3 videos I'd seen, but then again the motion is generally slower.
 
Mintmaster said:
People were expecting Cell to take over vertex shading duties. There are games out there that use 10 iterators as input to some pixel shaders. Position, eye vector, 1-4 light vectors, 1-4 halfway vectors, normal, texcoords for base texture, texcoords for 1-4 shadow maps... This is what I was talking about. 6 iterators is not excessive at all.

Minty, I don't know what shaders you've been looking at, but 6 iterators is borderline excessive for a fragment shader. Position, light vector(s) and several UV vectors are what you use most of the time, i.e. when doing Phong with normal mapping, diffuse and shadow maps. When you're doing just Phong, passing the interpolated Blinn halfway vectors makes passing an eye vector unnecessary, and you can pre-compute your diffuse just as well, so passing light vectors are unneeded too.
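To illustrate the point about the halfway vector: in Blinn-Phong, the specular term only needs N·H, and H can be computed per-vertex from the light and view directions and interpolated, so the fragment stage never needs a separate eye-vector iterator. A minimal sketch in Python rather than real shader code (the light and view directions here are made-up illustrative values):

```python
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def blinn_phong_specular(normal, halfway, shininess):
    # Specular term using only the interpolated halfway vector:
    # the eye vector is not needed per-fragment, because H was
    # already computed as normalize(L + V) upstream.
    n_dot_h = max(0.0, sum(n * h for n, h in zip(normal, halfway)))
    return n_dot_h ** shininess

# "Per-vertex" stage: compute H once from light and view directions...
L = normalize((0.0, 1.0, 1.0))   # direction to light (assumed)
V = normalize((0.0, 0.0, 1.0))   # direction to eye (assumed)
H = normalize(tuple(l + v for l, v in zip(L, V)))

# ...then the "fragment" stage consumes only N and H: one iterator
# for H replaces separate iterators for L and V.
N = (0.0, 0.0, 1.0)
spec = blinn_phong_specular(N, H, 32.0)
```

That is one fewer interpolant per light, which is exactly the kind of saving being argued over here.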
 
fireshot said:
I always found the "30fps != playable" argument nothing but ****** cannon fodder.

Perhaps, but if people can look at a game and claim it needs higher resolution, higher-res textures, or anti-aliasing, then it's just as valid to claim it needs to be 60fps.
 
Fox5 said:
Perhaps, but if people can look at a game and claim it needs higher resolution, higher-res textures, or anti-aliasing, then it's just as valid to claim it needs to be 60fps.

Increases to AA or texture resolution are things that can be perceived and appreciated (to varying degrees) by 100% of people playing the game. Bumping FPS from 30 to 60 isn't even noticeable for many people (the majority?), so it's far different IMO.
 
scooby_dooby said:
Increases to AA or texture resolution are things that can be perceived and appreciated (to varying degrees) by 100% of people playing the game. Bumping FPS from 30 to 60 isn't even noticeable for many people (the majority?), so it's far different IMO.

Is this substantiated? Because everyone I've spoken to has agreed that GT3 and GT4 look so much better than other racers out there because of the rock-solid 60fps.
 
scooby_dooby said:
Increases to AA or texture resolution are things that can be perceived and appreciated (to varying degrees) by 100% of people playing the game. Bumping FPS from 30 to 60 isn't even noticeable for many people (the majority?), so it's far different IMO.

I disagree. AA is definitely not something everyone can appreciate, and neither is texture resolution. There are people who can play RE4 on PS2 and GameCube (not side by side) and not notice the graphical difference. Even raising the resolution isn't evident to everyone (and really depends on the detail of the game); I'd bet a significant number of PC gamers are playing at 640x480 on LCD screens, and that is downright nasty. I have a few friends who play games that way, because they'd much rather have a good framerate than a non-blurry screen that they apparently don't notice the difference in anyway.
 
drpepper said:
Is this substantiated? Because everyone I've spoken to has agreed that GT3 and GT4 look so much better than other racers out there because of the rock-solid 60fps.

It's substantiated by the fact that many people simply cannot tell whether a game is 30 or 60fps until they are told. Everyone can notice a more detailed texture, or a line lacking jaggies, as it's right in front of their eyes.

For example, if you increase the resolution of a movie from SD to HD, that's something everyone can see. Some will notice it more than others, but it's part of the image and everyone can see it. If you raise the framerate of a movie from 24fps to 60fps, only a small fraction of people will notice or appreciate this. That's the difference I see here.
 
Fox5 said:
I disagree. AA is definitely not something everyone can appreciate, and neither is texture resolution.

They can perceive it, though; they may not appreciate it very much (which is what I meant by varying degrees), but they can see it. Not so for framerate. Many people simply don't perceive it.
 
scooby_dooby said:
It's substantiated by the fact that many people simply cannot tell whether a game is 30 or 60fps until they are told. Everyone can notice a more detailed texture, or a line lacking jaggies, as it's right in front of their eyes.

For example, if you increase the resolution of a movie from SD to HD, that's something everyone can see. Some will notice it more than others, but it's part of the image and everyone can see it. If you raise the framerate of a movie from 24fps to 60fps, only a small fraction of people will notice or appreciate this. That's the difference I see here.

Huh? Most people don't even know what aliasing is, let alone anti-aliasing. They look at a game and jaggies just go over their heads. But as soon as the framerate drops, or if the animation is running liquid smooth, people can spot it like an elephant in the room.

I still find the majority of people appreciate high framerates more than IQ.
 
drpepper said:
Huh? Most people don't even know what aliasing is, let alone anti-aliasing. They look at a game and jaggies just go over their heads. But as soon as the framerate drops, or if the animation is running liquid smooth, people can spot it like an elephant in the room.

I still find the majority of people appreciate high framerates more than IQ.
Of course they don't know what aliasing is, but they can SEE the difference between a smooth image and a jaggy image. Of course they can notice framerate drops; there is a minimum threshold there. What I'm talking about is the difference between 30 and 60.

Of course, it all comes down to preference on forums like these, and everyone's opinion is valid. I still feel that fundamental improvements in IQ are far more important than raising the framerate above 30fps, simply because they can be noticed by all rather than a select few.
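For what it's worth, the 30 vs 60 gap can be put in numbers: at 30fps each frame persists twice as long, so anything moving across the screen jumps twice as far between frames, which is why fast pans and big screens make the difference easier to see. A back-of-the-envelope sketch (the screen width and pan speed are made-up illustrative values):

```python
def per_frame_step(fps, pixels_per_second):
    """Pixels an object jumps between consecutive frames."""
    return pixels_per_second / fps

# Frame time budget in milliseconds.
frame_time_30 = 1000.0 / 30.0   # ~33.3 ms per frame
frame_time_60 = 1000.0 / 60.0   # ~16.7 ms per frame

# Hypothetical pan: an object crossing a 1280 px screen in one second.
step_30 = per_frame_step(30, 1280)   # ~42.7 px jump per frame
step_60 = per_frame_step(60, 1280)   # ~21.3 px jump per frame
```

In a slow-moving game those jumps are small either way, which fits the observation that 30fps is fine for some genres and jarring in others.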
 
scooby_dooby said:
It's substantiated by the fact that many people simply cannot tell whether a game is 30 or 60fps until they are told. Everyone can notice a more detailed texture, or a line lacking jaggies, as it's right in front of their eyes.

For example, if you increase the resolution of a movie from SD to HD, that's something everyone can see. Some will notice it more than others, but it's part of the image and everyone can see it. If you raise the framerate of a movie from 24fps to 60fps, only a small fraction of people will notice or appreciate this. That's the difference I see here.

Aren't there studies showing people are more sensitive to temporal changes (and thus motion sickness) than to chroma? Most of my casual gamer friends won't play an FPS that doesn't run at 60fps because the motion is so nauseating to them otherwise.
 
Keep on topic

Why is this thread turning into yet another "30 VS 60 FPS" thread?

All that while there are other (recent) threads on the subject on the forum; just do a search and take this discussion to the appropriate threads. ;)
 
Fox5 said:
Most of my casual gamer friends won't play an FPS that doesn't run at 60fps because the motion is so nauseating to them otherwise.

Well... you have some weird friends :p I've never even heard any of my casual gamer friends say "frames per second"

Vysez is right though, I digress.
 
scooby_dooby said:
Well... you have some weird friends :p I've never even heard any of my casual gamer friends say "frames per second"

Vysez is right though, I digress.


It's not a hard concept to explain to them when they point out the games they refuse to play and the jerky or slow motion in them.
 
Will lower frame rates cause motion sickness? There are some games that just give me a total headache and make me queasy, while others are fine. I've always wondered if it was attributable to the framerate, what the eye perceives just not reconciling with the motion on the screen. Or maybe just poor game design.
 