Image Quality and Framebuffer Speculations for WIP/alpha/beta/E3 games *Read the first post*

What's up with the dithering in the gameplay screenshot? 6-bit per colour framebuffer? :p

:3
Perhaps they're trying to get better image quality for people viewing the images on TN panels. ;)
Some dithering is actually quite a good idea even with monitors capable of displaying a full 8-bit gradient.
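Rough sketch of what I mean, in Python/numpy (just an illustration with a standard 4x4 Bayer matrix, not a claim about what this engine actually does): quantize a smooth gradient down to 6 bits per colour, with and without ordered dithering, and compare the banding.

```python
import numpy as np

# 4x4 Bayer threshold matrix (standard ordered-dither pattern), scaled to 0..1.
BAYER4 = np.array([[ 0,  8,  2, 10],
                   [12,  4, 14,  6],
                   [ 3, 11,  1,  9],
                   [15,  7, 13,  5]]) / 16.0

def quantize_6bit(channel, dither=True):
    """Quantize an 8-bit channel (H x W floats, 0..255) to 6 bits per colour.
    The Bayer threshold breaks up the banding that plain rounding produces
    on smooth gradients, at the price of fine, regular noise."""
    h, w = channel.shape
    step = 255.0 / 63.0  # size of one 6-bit level
    if dither:
        threshold = np.tile(BAYER4, (h // 4 + 1, w // 4 + 1))[:h, :w]
        channel = channel + (threshold - 0.5) * step
    return np.clip(np.round(channel / step), 0, 63) * step  # back to 0..255

# A horizontal 0..255 gradient: the undithered version shows 64 hard bands,
# the dithered one trades them for noise while using the same 64 levels.
gradient = np.tile(np.linspace(0, 255, 256), (64, 1))
banded = quantize_6bit(gradient, dither=False)
smooth = quantize_6bit(gradient, dither=True)
print(len(np.unique(banded)), len(np.unique(smooth)))
```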
 

Looks sharp, but it's not running on the consoles yet, so an analysis wouldn't do any good for the time being. :cry:

The alpha mapping on the hair is a bit interesting; it's almost the same as Final Fantasy XIII's. I wonder what the system specifications are on the PC?

...I've got my fingers crossed. If it can look this good at 720p with FXAA, I'd be blown away. :smile2:

Also, I was told today that Snake's hair isn't supposed to be white; it's meant to be brown, like in this shot:
http://gematsu.com/gallery/metal-ge...Solid-Ground-Zeroes_2012_09-10-12_011.jpg.php
The hair graphics still need to be tweaked.
 
Not exactly sure why they switched to deferred when visual quality is arguably worse :???:

Maybe to financially justify a big-cost feature they could then reuse for next-gen consoles. Also, I think the lighting and shading look really good in comparison to their past games.
 
Those definitely look native 720p with PPAA. It may be running on a PC, but probably at console settings.

I don't really trust "console settings". This can have an effect on things like IQ and performance. IIRC Crysis 2 was previewed on a PC at console settings and those demos ran better than the final game.

Not exactly sure why they switched to deferred when visual quality is arguably worse :???:

They probably needed to move over to a deferred model for Dragon's Dogma, so why not keep it? It's not necessary to make games look good, but it can offer some options in the lighting department.
 
I don't really trust "console settings". This can have an effect on things like IQ and performance. IIRC Crysis 2 was previewed on a PC at console settings and those demos ran better than the final game.
...but those demos also looked worse. All I'm saying is that Kojima CAN use the exact same settings as the consoles. And by the looks of it, the screens are native 720p with PPAA and some heavy dithering. I don't know why they would release them if they didn't want to give people an idea of how the console versions should look.



They probably needed to move over to a deferred model for Dragon's Dogma, so why not keep it? It's not necessary to make games look good, but it can offer some options in the lighting department.
Well, it does give them the ability to have a night/day cycle in their game, but at the cost of resolution, hardware anti-aliasing and, in RE6's case, the general visuals of the game.
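For anyone wondering why deferred buys you that kind of flexibility, here's a toy numpy sketch of the general idea (buffer names and the falloff are made up for illustration, obviously nothing to do with how MT Framework actually works). The geometry is rasterized once into a G-buffer, and then any number of lights, sun included, are applied per pixel, so a moving time-of-day light or lots of dynamic lights get comparatively cheap:

```python
import numpy as np

H, W = 72, 128  # tiny "framebuffer" for illustration

# Geometry pass output (the G-buffer): attribute buffers written once,
# regardless of how many lights the scene has.
albedo = np.random.rand(H, W, 3)                    # surface colour
normal = np.zeros((H, W, 3)); normal[..., 2] = 1.0  # all facing the camera
xs, ys = np.meshgrid(np.linspace(-1, 1, W), np.linspace(-1, 1, H))
position = np.dstack([xs, ys, np.zeros((H, W))])    # view-space position

def shade(lights):
    """Lighting pass: iterate lights over the stored G-buffer.
    Cost scales with lights x pixels, not lights x scene geometry."""
    out = np.zeros((H, W, 3))
    for light_pos, colour in lights:
        to_light = light_pos - position
        dist = np.linalg.norm(to_light, axis=-1, keepdims=True)
        ndotl = np.clip((normal * to_light / dist).sum(axis=-1, keepdims=True), 0, 1)
        out += albedo * colour * ndotl / (1.0 + dist ** 2)  # simple falloff
    return np.clip(out, 0, 1)

# Day/night is then just a different light list over the same G-buffer.
day = shade([(np.array([0.0, -5.0, 5.0]), np.array([1.0, 0.95, 0.8]))])
night = shade([(np.array([x, 0.0, 0.3]), np.array([0.3, 0.4, 1.0]))
               for x in np.linspace(-1, 1, 8)])
print(day.shape, night.shape)
```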
 
Also, Kojima isn't known to fake stuff either (e.g. passing off CG trailers as in-game content). There was one incident with MGS4, where they used "PS3-spec'd PCs" before the final PS3 specs were official and Snake's polycount was much higher than it ended up being, but that's about it, and he sort of apologized for it in the end, too.

Though I still have my doubts. This looks "too good" for today's consoles, in my eyes.
 
...but those demos also looked worse. All I'm saying is that Kojima CAN use the exact same settings as the consoles. And by the looks of it, the screens are native 720p with PPAA and some heavy dithering. I don't know why they would release them if they didn't want to give people an idea of how the console versions should look.

We'll see. I hope they pull it off, but I'm always skeptical until I see gameplay running on a console. :smile:

Ruskie said:
Well, it does give them the ability to have a night/day cycle in their game, but at the cost of resolution, hardware anti-aliasing and, in RE6's case, the general visuals of the game.

Well, more and more games are moving away from hardware AA anyway due to cost, and the resolution is the same in RE6. If you're referring to Dragon's Dogma, we don't know if it's sub-HD because of the light pre-pass.

I also disagree about RE6 looking worse, but I guess we'll have to play the full game to come to any accurate conclusion.
 

At least according to MediaInfo, the pictures from the link quoted above are apparently saved/compressed with chroma subsampling (4:2:0).

The pictures here, for example:

http://www.eurogamer.net/articles/2012-09-10-metal-gear-solid-ground-zeroes-screenshots

are apparently saved/compressed without chroma subsampling (4:4:4).

Just saying ;).
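If anyone wants to check that themselves without MediaInfo, here's a quick sketch with Pillow (the `layer` attribute is semi-internal and could change between versions, and the filename is just an example):

```python
from PIL import Image

# JPEG per-component sampling factors -> common subsampling names.
SAMPLINGS = {
    (1, 1, 1, 1, 1, 1): "4:4:4",
    (2, 1, 1, 1, 1, 1): "4:2:2",
    (2, 2, 1, 1, 1, 1): "4:2:0",
}

def jpeg_sampling(path):
    im = Image.open(path)
    # Pillow's JPEG plugin stores (id, h_sampling, v_sampling, qtable)
    # per component on `layer`; three components = the usual YCbCr JPEG.
    if not hasattr(im, "layer") or len(im.layer) != 3:
        return "n/a (not a 3-component JPEG)"
    factors = tuple(f for comp in im.layer for f in comp[1:3])
    return SAMPLINGS.get(factors, "other: %r" % (factors,))

print(jpeg_sampling("Metal-Gear-Solid-Ground-Zeroes_011.jpg"))  # example path
```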
 
I'm wondering if the people complaining about lag ever played the damn thing, to be honest. I platinumed the game and it felt just fine, actually.

Some people are more bothered by controller latency, nothing wrong with that. Many people complained about games like GRAW or KZ2 and I thought both games controlled fine.
 
ps360 did a latency test, albeit in a couple of controlled areas with no enemies, and the PS3 version of RE5 was actually more responsive. Nevertheless, QTEs aside, I thought it was more responsive myself.
 

Have his tests ever been verified as accurate? I've seen his frame-rate counting videos, but the latency tests have always looked off for the various games I've seen.

From what I've played and read, these results don't make sense to me. We know the PS3 version is triple buffered, we know that adds latency, and you can feel it when playing both demos.
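Back-of-the-envelope numbers on why one extra queued frame is feelable, assuming the common render-ahead style of "triple buffering" where a finished frame can wait a full refresh in the queue (the 16 ms display figure is a made-up placeholder, not a measurement of RE5):

```python
# Input sampled at the start of a frame, rendered over one frame, then any
# queued frames wait a refresh each, plus some fixed display/TV lag.
def latency_ms(fps, queued_frames, display_ms=16.0):
    frame = 1000.0 / fps
    return frame * (1 + queued_frames) + display_ms

for name, queued in [("double buffered", 0), ("triple buffered", 1)]:
    print("%s @ 30fps: ~%.0f ms" % (name, latency_ms(30.0, queued)))
# double buffered @ 30fps: ~49 ms
# triple buffered @ 30fps: ~83 ms
```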
 
Some people are more bothered by controller latency, nothing wrong with that. Many people complained about games like GRAW or KZ2 and I thought both games controlled fine.

RE5 does feel a heck of a lot snappier than KZ2, though. I think there's simply more to it than just input lag. Killzone 2, for example, at least initially, also had an annoyingly large controller dead zone and weird aiming acceleration that made fine-tuning your aim a real chore, at least for me.
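To illustrate those two effects with completely hypothetical numbers (not KZ2's actual values): a big radial dead zone plus a steep response curve leaves fine aiming squeezed into a narrow band of stick travel.

```python
def process_stick(x, dead_zone=0.35, power=2.5):
    """Map raw stick deflection x in [-1, 1] to an aim rate in [-1, 1]."""
    sign = 1.0 if x >= 0 else -1.0
    mag = abs(x)
    if mag < dead_zone:  # everything inside the dead zone is thrown away
        return 0.0
    t = (mag - dead_zone) / (1.0 - dead_zone)  # rescale live range to 0..1
    return sign * t ** power                   # steep response curve

# Half deflection yields only ~3% of max turn rate; even 90% deflection
# gives ~66%. Fine aiming lives in a very narrow band of stick travel.
print(process_stick(0.5))  # ~0.03
print(process_stick(0.9))  # ~0.66
```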
 