Console technical problem questions and comments


OK, I've been playing my PS3 with the original cable at standard resolution. Most games and demos are very clean IQ-wise and look excellent, but there are exceptions...


Mild:

Uncharted. There is some shimmering, in the demo at least. Not too bothersome, but it shouldn't be there.

SEVERE:

The Ninja Gaiden demo, and Lair (full game), exhibit levels of shimmering the likes of which I've never seen before, not even on PS2. Practically all textures shimmer at the slightest movement.

First
What's the cause of this? Is it a lack of optimization for standard resolutions, as I presume? Are developers aware of the situation?

This seriously hurts public perception, seeing as a) there are still many standard-resolution displays out there, and b) not everyone knows about or gets the proper cables, or sets things up right. People will believe anything from the games simply having bad graphics to the ads being outright lies.


Second
I've not been paying attention, but can I presume the 360 hardware issues are a thing of the past with the latest hardware revisions?
 
The new 360s are much better than the original ones, but if you're using an SD display you'll probably come across the same problems, as the games are generally rendered at 720p and then downscaled to SD. Maybe it's time to jump to a new TV set.
 
I bought a PS3 on the 27th of December, and yep, the shimmering is evident in the games you mention even at 720p. Halo 3 also has shimmering.
 
A lot of games are actually rendered at SD resolution rather than rendered at HD and downscaled, AFAIK. Sadly it gets next to no coverage from the media, which all uses HD screens, so IQ on SDTVs is a small concern for developers. Something like Uncharted downsampled would have superb IQ on an SD set, and any HD game downsampled should see good shimmer reduction.
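To illustrate why downscaling helps: averaging a 2x2 block of HD pixels into one SD pixel is effectively 4x ordered-grid supersampling, which smooths out the single-pixel detail that flickers frame to frame. A toy sketch (my own illustration, not console code):

```python
# Toy sketch: box-filter an HD framebuffer down to SD. Averaging each
# 2x2 block of source pixels acts like 4x supersampling, flattening
# pixel-frequency detail that would otherwise shimmer.

def downsample_2x(image):
    """Box-filter a 2H x 2W grayscale image (list of lists) to H x W."""
    h, w = len(image) // 2, len(image[0]) // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            block = (image[2*y][2*x] + image[2*y][2*x+1] +
                     image[2*y+1][2*x] + image[2*y+1][2*x+1])
            out[y][x] = block / 4.0
    return out

# A 1-pixel checkerboard: the worst-case shimmer source at HD res.
hd = [[(x + y) % 2 for x in range(4)] for y in range(4)]
print(downsample_2x(hd))  # -> [[0.5, 0.5], [0.5, 0.5]]: flat, shimmer-free
```

Rendering natively at SD skips this filtering step entirely, which is why a straight 640x480 render can look worse than a downscaled 720p one.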

Sadly, I don't think it does hurt public perception, as no public voices talk about the SD experience. Many gamers have recently bought HD sets, and even though HD installations are a minority overall, chances are most gamers are gaming on HD sets (which includes monitors). At the end of the day, I don't think anyone involved in game creation really cares about the SD output. All advertising media will show the HD experience, and that's what'll sell people on a game, even if their experience on their own SD set falls short of that.
 
How can it not hurt public perception? Think of calling friends over, parties, not to mention the individuals who play it themselves. Even those using HDTVs, if they don't use the right cables and configuration, will be getting this output (upscaled by the TV).

> The new 360s are much better than the original ones, but if you're using an SD display you'll probably come across the same problems, as the games are generally rendered at 720p and then downscaled to SD. Maybe it's time to jump to a new TV set.

Gears of War had a bit of shimmering, but most everything else has been fine. (I haven't played Halo 3 yet.)

PS
I'll get a new TV set later, maybe for Final Fantasy.
 
As Shifty pointed out, some games render at 640x480 (or close) when outputting at SDTV resolution, which wouldn't help at all with shader aliasing.

Well, what you described in Lair and Uncharted can be linked to shader aliasing; in the case of Ninja Gaiden Sigma it might be a case of Team Ninja not using any mipmapping again. I'll have to check out the Ninja Gaiden demo to see, if I have the time later on (or if someone has already checked that out, feel free to chime in with your observations).
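For anyone wondering why missing mipmaps cause texture shimmer: the sampler normally picks a mip level from the texture's on-screen footprint, so distant surfaces read a pre-filtered, lower-resolution copy. With no mip chain, everything samples the full-res level 0 and aliases badly. A rough sketch of the GL-style level-of-detail selection (my simplification, not Team Ninja's code):

```python
import math

# Sketch of mip level selection: the LOD is the log2 of how many texels
# fall under one screen pixel, clamped to the available mip chain.
# With only one level (no mipmaps), distant surfaces are stuck
# sampling full-res texels, which shimmer under motion.

def mip_level(texels_per_pixel, num_levels):
    """Pick a mip level from the texel footprint of one screen pixel."""
    lod = math.log2(max(texels_per_pixel, 1.0))
    return min(int(lod), num_levels - 1)

print(mip_level(1.0, 10))  # up close, 1 texel per pixel -> 0 (base level)
print(mip_level(8.0, 10))  # far away, 8 texels per pixel -> 3 (pre-filtered)
print(mip_level(8.0, 1))   # no mip chain -> 0 again, hence the shimmer
```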

The best and easiest way to alleviate shader aliasing (not make it disappear, seeing as that's a much more complex task), or any other surface aliasing due to high-frequency content, on the consoles or even on current graphics hardware, is to resort to supersampling. That would be possible in this case, since those games can already render at a higher resolution.
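The point about supersampling helping *shader* aliasing specifically (which ordinary edge MSAA does not) can be shown with a toy example: a shader term that flips hard between values inside a pixel, like a tight specular highlight, aliases at one sample per pixel but settles down when several sub-pixel samples are averaged. A minimal sketch with a made-up "shader":

```python
import math

# Toy illustration: a deliberately high-frequency shader term, sampled
# once per pixel vs. supersampled. Averaging sub-pixel samples turns
# hard 0/1 flips into stable fractional coverage.

def shade(x):
    """Stand-in for a sharp highlight: on only in narrow bands of x."""
    return 1.0 if math.sin(40.0 * x) > 0.9 else 0.0

def render(width, samples):
    """Shade `samples` evenly spaced sub-positions per pixel, then average."""
    pixels = []
    for px in range(width):
        total = sum(shade((px + (s + 0.5) / samples) / width)
                    for s in range(samples))
        pixels.append(total / samples)
    return pixels

aliased = render(16, 1)  # every pixel is hard 0 or 1; flickers under motion
smooth = render(16, 8)   # fractional values; far steadier frame to frame
```

Rendering at 2x resolution and downfiltering, as suggested above, is exactly this with 4 samples per output pixel.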
 
How come consoles can't be told, or detect, what kind of TV you have and render at an appropriate resolution? Computer games do that! Consoles are made from computer parts, by people who are comfortable with computer code, aren't they?

Do they assume users are too dumb to set a display resolution??? (I might be wrong, though :$)
 
It's the same thing as having to set an optimal resolution in Windows.
The process occurs under both console and PC circumstances.
 
But don't computers render at different targets? Or do they just scale, like consoles?
 
Consoles are about fixed hardware specs and developing games for a consistent experience (performance). Letting the user set the rendering resolution would be terrible to that end. Imagine a user setting the rendering resolution to 1920x1080! Performance would be ugly.
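Some back-of-envelope numbers on why (my figures, not from the thread): pixel-shading and fill cost scale roughly with the number of pixels rendered, so letting a user pick 1080p would carry well over twice the per-frame pixel work of a 720p target.

```python
# Rough pixel counts for common targets; per-frame fill/shading work
# scales roughly in proportion, which is why a fixed rendering
# resolution keeps console performance predictable.
res = {"480p": 640 * 480, "720p": 1280 * 720, "1080p": 1920 * 1080}
print(res["1080p"] / res["720p"])  # -> 2.25: over twice the pixels of 720p
```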
 