What's next-gen doing wrong.

Shifty Geezer

We've got quite a lot of next-gen screens and footage, some of it really impressive, and I think it's time we got critical and offered 'advice' to developers. The purpose of this thread is to identify key faults with the next-gen visuals shown so far that ought to be addressed, trying to keep within the realms of what's possible (no complaints about lack of realtime GI solutions please ;)).

I'll start off with:

Grounds shouldn't be flat. In most titles the ground is a textured quad, totally smooth. Parallax mapping adds a bit of depth but still looks terribly artificial at a distance. The lack of AF doesn't help.
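For reference, the parallax mapping mentioned above boils down to shifting the texture lookup along the view direction in proportion to a height-map sample. A minimal sketch in Python (the function name and the scale/bias values are illustrative, not from any particular engine):

```python
def parallax_offset(u, v, height, view_ts, scale=0.04, bias=-0.02):
    """Shift texture coordinates along the tangent-space view direction
    in proportion to the sampled height, faking relief on a flat quad.

    height is the height-map sample in [0, 1]; view_ts is the
    (normalized) tangent-space view vector."""
    vx, vy, vz = view_ts
    h = height * scale + bias          # remap height into a small range
    return (u + h * vx / vz, v + h * vy / vz)
```

The division by the view vector's z component is also why the trick falls apart at grazing angles, which is exactly how a flat ground plane is seen in the distance.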

Character composition. In a lot of games I'm seeing, the characters don't seem to be grounded onto that perfectly flat ground. The lighting/shadowing isn't convincing enough, and to me, games look a lot like separate scenery and character engines being combined.

Fire is weak. We've got water down pat. That looks great now. But fire, and other streamy particle effects, look so naff. They haven't seen any progress beyond masses of 2D sprites with animated textures.
 
Absolutely agree. Turning V-sync off is such a cheap way to get decent framerates, and it looks horrible. It shouldn't be allowed to be turned off in any instance whatsoever, as it ruins the whole experience more than a slight downgrade in detail would (if that's what's needed to get the framerate right).
 
V-Sync should be allowed to be turned off if players want it (although I agree it should be on by default). Some hard-core gamers out there need their high frame rate (most notably in FPS games), and on Quake 3 engines disabling VSync will allow you to reach the "magic" frame rates allowing you to jump faster (76, 125, 133, etc.). Granted this is a special case though :)
 

In the console realm we're talking about v-sync issues because the framerate is unstable and low, not because it might be so fast that it makes things better, like in the Q3 example.
Developers should be prepared to adjust their game around the general rule that V-sync is ON, not just resort to turning it off in a last-minute attempt to get decent framerates.
I mean, we're talking about a stable 30fps for god's sake! That's LOW by any standards compared to what we get on PC.
 


I can agree with that in general, but I'm not sure it should apply to first year games. Launch games and those that come soon after spend most of their development time without knowing all of the specific specs. They may know what the GPU will include as far as features, but not the final clock rate, etc... Because of this I would give them a bit of slack.

After the first few months to maybe the first year though, it's unacceptable to have it turned off, because by that point the developers have had plenty of time to optimize their game around the idea that it is on.
 
I really don't know if I can agree with the hating on vsync ...
God of War does it right IMO. As long as the game does 60 fps, there is no tearing but when the framerate drops (and it does sometimes), it doesn't immediately drop to 30 fps (as it would with true vsync). Yes, this means there's tearing. I prefer that over an extreme drop in framerate.

=========

Scrap the ridiculous HDR effects. I know programmers don't leave the house often and have little contact with the actual physical world exposed to non-artificial light. I can sympathize with that way of life but I absolutely hate the results.
 

I agree, we should be more lenient to devs making early titles. But after a while, v-sync OFF is unacceptable.
It's just a matter of choice in the end: some devs choose to turn it off instead of lowering the detail just that little bit to allow a stable framerate. I guess in most cases they just don't have enough testing time to see where the framerate falls and then fix things to avoid it, but still...

I don't know, I just think that turning V-sync off is cheap. ;)
 
Scrap the ridiculous HDR effects. I know programmers don't leave the house often and have little contact with the actual physical world exposed to non-artificial light. I can sympathize with that way of life but I absolutely hate the results.

:LOL: That was funny. I think the culprit is the overuse of HDR, and a certain infancy of the effect too. Remember how things looked when we first got Bump Mapping? Everything looked like a shiny orange skin for a few months!:LOL:

The flaw is not in the effect per se, it's the overuse of it that makes things look just as unrealistic as they do without the effect!
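To make "overuse" concrete: most of the blown-out look comes from the exposure fed into the tone mapper rather than from HDR itself. A minimal sketch of the classic Reinhard operator (a generic example, not any specific game's pipeline; the exposure parameter is the knob being abused):

```python
def reinhard(luminance, exposure=1.0):
    """Reinhard tone map: compresses an HDR luminance into [0, 1).
    Cranking exposure pushes everything toward white, which is where
    the 'everything glows' look comes from."""
    l = luminance * exposure
    return l / (1.0 + l)
```

At exposure 1, a scene luminance of 1 maps to a sensible mid-grey 0.5; at exposure 3 the same pixel is already at 0.75, well on its way to clipping.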
 
Character composition. In a lot of games I'm seeing, the characters don't seem to be grounded onto that perfectly flat ground. The lighting/shadowing isn't convincing enough and to me, games looks a lot like separate scenery and character engines being combined.
If this is about object collision detection (even for plain standing on the ground), doing it absolutely properly in real time would consume a vast amount of processing power: even for a simple ball or cube, the entire volume of the interacting object and the plane (a.k.a. the ground) must be dynamically divided into small "particles" with given properties, the potential and kinetic forces calculated, and all of this integrated back over the entire object volume, for every frame.
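For what it's worth, games don't have to go anywhere near that far to ground a character: a common cheap approximation is a penalty force on penetration depth against the ground plane. A toy sketch (the stiffness and damping constants are hypothetical, not anyone's shipped values):

```python
def ground_contact_force(y, vy, k=800.0, c=20.0):
    """Penalty force for a point at height y above a flat ground plane
    at y=0: a spring (k) pushes the point out of penetration, a damper
    (c) bleeds off velocity so it doesn't bounce forever.
    Returns 0 when the point is not penetrating."""
    depth = -y                   # positive when below the plane
    if depth <= 0.0:
        return 0.0
    return k * depth - c * vy    # spring minus damping along velocity
```

One such test per foot (plus a little IK to plant the foot) is a tiny fraction of the cost of the full volumetric treatment described above.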
 
The animation bothers me the most. A lot of next-gen games still have current-gen animation. Hopefully, NaturalMotion's next-gen real-time physics animation engine will make a huge difference. We need more smooth, lifelike animation with good blending and transitions.
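The blending and transitions asked for here are, at their simplest, a weighted mix of two poses over a short fade window. A toy sketch (per-joint linear blend on scalar channels; real engines slerp joint rotations, and all names here are illustrative):

```python
def crossfade_weight(t, fade_start, fade_len):
    """Blend weight of the incoming animation during a crossfade:
    0 before the fade starts, ramping linearly to 1 over fade_len."""
    if fade_len <= 0:
        return 1.0
    w = (t - fade_start) / fade_len
    return min(max(w, 0.0), 1.0)

def blend_pose(pose_a, pose_b, w):
    """Per-channel linear blend between two poses at weight w."""
    return [a * (1.0 - w) + b * w for a, b in zip(pose_a, pose_b)]
```

Even this trivial crossfade beats the hard pose-snapping that makes animation read as "current-gen".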
 
- Never disable V-Sync. Tearing was horrible in God of War.

- Better Ground Textures -_-

- Clipping, especially hair, which to this day still goes through characters' bodies.

- 3D and interactive grass. It was really nice in MGS3, a last-gen game, that you could go through the grass and it would react accordingly; then we see a couple of next-gen games (Untold Legends and Motor Storm come to mind) in which the grass looks like it's simply painted on the ground.
- Animation. The NaturalMotion technology being used in Indiana Jones is incredible; one of the games that definitely needs it is Metal Gear Solid 4.

- Tone down HDR. I don't know why the vast majority of next-gen games are abusing this feature.

- Longer Draw distance (Gundam Games come to mind).

- Better AA. I would prefer a game with somewhat lower-quality textures and better AA than the other way around. Next-gen games that are a hard-edge fest are not so nice looking.

- Stop the motion blur abuse. Next-gen racers have been overusing this tech. We know that in real life there is motion blur once you reach high speeds, but don't use it in games to the level where it gets annoying (Forza 2 and PGR3 come to mind).

 
The thing with HDR is that it's currently a relatively expensive effect, kind of like how bump mapping was when it started to go mainstream. Devs really have to justify the performance hit by showing off a difference, hence we get exaggerated HDR. That's my personal take on it anyway.

I would like to see mandatory AF, though I know and accept that's usually a good place to start optimizing if your game has performance problems.

Smooth animation transitions are another big one, as well as particle effects that don't clip through all geometry. Some games have already fixed this, though.

AA on alphatests! Shimmering leaves/fences make my eyes bleed! Supersampling/AA on selective shaders would also be nice.
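The shimmering alpha-test edges complained about here are what alpha-to-coverage addresses: instead of a hard keep/discard test, the alpha value is turned into a per-pixel MSAA coverage count that the resolve pass then smooths. A minimal sketch of the mapping (real hardware also dithers the coverage pattern from pixel to pixel):

```python
def alpha_to_coverage(alpha, samples=4):
    """Map an alpha value to the number of covered MSAA samples.
    A hard alpha test is all-or-nothing per pixel; this instead covers
    a fraction of the pixel's samples, so the MSAA resolve produces a
    soft edge on foliage and fences."""
    alpha = min(max(alpha, 0.0), 1.0)   # clamp to the valid range
    return round(alpha * samples)
```

With 4x MSAA, a leaf texel at 30% alpha covers one sample instead of flickering between fully on and fully off.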
 
I really don't know if I can agree with the hating on vsync ...
God of War does it right IMO. As long as the game does 60 fps, there is no tearing but when the framerate drops (and it does sometimes), it doesn't immediately drop to 30 fps (as it would with true vsync). Yes, this means there's tearing. I prefer that over an extreme drop in framerate.


It is a misconception that with v-sync, not being able to render at 60fps will drop your framerate to 30fps. I suppose you meant a 1/30 sec frame duration for that single frame, though.
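That correction is easy to make concrete: with double-buffered v-sync, a frame that misses a refresh is simply held until the next one, so its on-screen time rounds up to a multiple of the refresh interval; only that one frame is effectively a "30fps" frame. A small sketch, assuming a 60 Hz display:

```python
import math

def displayed_frame_ms(render_ms, refresh_hz=60.0):
    """On-screen duration of one frame under double-buffered v-sync:
    the frame flips only on a refresh boundary, so its display time
    rounds UP to a multiple of the refresh interval."""
    interval = 1000.0 / refresh_hz          # ~16.7 ms at 60 Hz
    return interval * math.ceil(render_ms / interval)
```

A 17 ms frame thus shows for ~33.3 ms, but a following 10 ms frame shows for ~16.7 ms again: the average rate sits between 30 and 60 fps rather than locking to 30.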
 
This may be premature, but I really wish 1080p was simply not a possibility this gen. I'm sure a handful of games could really use the extra resolution, but I have a feeling that all SCE titles are going to be 1080p sooner or later and that will have a ripple effect. To me, AA is much more important than higher res. But I may be totally wrong about this.
 

Well, we have to see real 1080p games on large HDTVs to judge that ;)

Shouldn't matter much whether you have 720p with 4x AA or 1080p without AA (at least for PS3).
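Quick arithmetic backs this up, at least on raw sample counts (a rough proxy for memory footprint and bandwidth, nothing more):

```python
def samples(width, height, msaa=1):
    """Total color samples a framebuffer stores and resolves."""
    return width * height * msaa

# 720p with 4x MSAA versus 1080p with no AA:
s_720_4x = samples(1280, 720, 4)    # 3,686,400 samples
s_1080   = samples(1920, 1080)      # 2,073,600 samples
```

720p with 4x MSAA actually stores and resolves more color samples than 1080p with no AA, so "AA instead of resolution" isn't obviously the cheaper option.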
 