Image Quality and Framebuffer Speculations for WIP/alpha/beta/E3 games *Read the first post*

Is anyone here able to figure out the native rendering resolution and AA in the first video there?
It's 1080p. Looks like FXAA or some post-AA. It reaaaally suffers from 1/2 x 1/2 res post-processing.
Yes, it's 1080p, running on a single GeForce GTX 680.
Well, at least according to the following presentation:

http://www.unrealengine.com/files/misc/The_Technology_Behind_the_Elemental_Demo_16x9_(2).pdf

it apparently is not full 1080p, as it states the following (see especially the last bullet of the quote):

unrealengine.com/files/misc/The_Technology_Behind_the_Elemental_Demo_16x9_(2).pdf said:
Elemental demo
  • GDC 2012 demo behind closed doors
  • Demonstrate and drive development of Unreal® Engine 4
  • NVIDIA® Kepler GK104 (GTX 680)
  • Direct3D® 11
  • No preprocessing
  • Real-time
    • 30 fps
    • FXAA
    • 1080p at 90%

;)
 
Do they mean 90% of the time, or with 90% of the features?
And now I'm wondering what would be in the last 10%, if it is 90% of the features.
 
Not sure, but maybe they are referring to what is described there, for example:

tweakguides.com/UT3_5.html said:
http://www.tweakguides.com/UT3_5.html

Screen Percentage: This slider controls the amount of screen space to render the game world in, and the lower the screen percentage, the smaller the game image will be on the screen. However, what actually happens is that the game takes this smaller image and by default upscales it to fill your screen (if UpscaleScreenPercentage=true in your UTEngine.ini file - see the Advanced Tweaking section). The result is that the in-game image becomes noticeably more blurry the lower the screen percentage setting is taken, and in return you gain more performance. In practice this is very similar to running the game at a specified proportion of your current resolution. For example a 50% screen percentage on a 1280x1024 resolution gives the visual equivalent of running at half the resolution. The difference is that your main menus and in-game HUD elements (graphics and text) will not be reduced in resolution, which helps legibility, particularly for people using LCD screens at native resolution, and the setting can also help those already running a low resolution, or those who want to only slightly decrease their resolution. To observe the difference for yourself, you can see an animated screenshot comparison by clicking this link: UT3_ScreenPercent.gif (553KB). If you want the best image quality don't reduce screen percentage below 100, unless you're truly struggling for FPS. If you want to decrease this setting below 50%, use the ScreenPercentage setting in the Advanced Tweaking section.

?

;)
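
If "1080p at 90%" means a screen-percentage-style scale like the one described in that quote, the arithmetic is simple: each axis is scaled by 0.9, the frame is rendered at that smaller size and upscaled to the output, while the HUD stays native. A minimal sketch of that arithmetic (purely illustrative, made-up names, not Unreal's actual code):

[CODE]
#include <cmath>
#include <cstdio>

struct Resolution { int width; int height; };

// Map an output resolution and a screen percentage to the internal render
// resolution that would be rendered and then upscaled to the output.
Resolution InternalResolution(Resolution output, float screenPercentage)
{
    float scale = screenPercentage / 100.0f;            // 90% -> 0.9 per axis
    return { static_cast<int>(std::lround(output.width  * scale)),
             static_cast<int>(std::lround(output.height * scale)) };
}

int main()
{
    Resolution in = InternalResolution({ 1920, 1080 }, 90.0f);
    std::printf("%d x %d\n", in.width, in.height);       // 1728 x 972
    return 0;
}
[/CODE]

So "1080p at 90%" would work out to roughly 1728x972 before the upscale, i.e. about 81% of the pixels of full 1080p.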
 
Hmmm, makes me wonder now: were any of the next-gen game engines really running at 1080p? I've been thinking about this ever since I heard a while back that even the best setups are still fumbling at 1080p in a few games.

Or maybe the next-gen consoles will only be able to run the previous generation of graphics at 1080p, and 2014's graphics at 720p. I say this because that's kind of the way things played out with this gen: sure, the consoles supported 1080p, but rendering at native 1080p wasn't possible for all games.

I mean, running PS3 and Xbox 360 games at 1080p 60 fps takes a pretty strong rig, so I wonder what it would take to run those new tech demos at real 1080p 60 fps?

Déjà vu, I had this conversation just yesterday. :smile2:
 
1080p will most likely be the aim, but how will we describe image quality once decoupled shading and dynamic resolutions become even more common?

We might have the same amount of shading at 2160p, 1080p and 720p, but the higher resolutions would have sharper edges and a higher-resolution GUI.
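
As a rough sketch of what a dynamic-resolution heuristic could look like (purely illustrative; the class and the numbers are made up, not any particular engine's implementation): nudge the render scale each frame based on how the last GPU frame time compares to the budget, while the GUI stays at native resolution.

[CODE]
#include <algorithm>

// Hypothetical dynamic-resolution controller: shrink the render scale when
// the GPU is over budget, grow it back when there is headroom.
class DynamicResolution
{
public:
    explicit DynamicResolution(float targetFrameMs) : m_targetMs(targetFrameMs) {}

    // Returns the render scale (1.0 = native) to use for the next frame.
    float Update(float lastGpuFrameMs)
    {
        float error = (m_targetMs - lastGpuFrameMs) / m_targetMs; // positive = headroom
        m_scale = std::clamp(m_scale + 0.1f * error, 0.5f, 1.0f);
        return m_scale;
    }

private:
    float m_targetMs;
    float m_scale = 1.0f;
};
[/CODE]

With a 33.3 ms (30 fps) budget, a 40 ms frame would pull the scale down by about 2%: the edges get a little softer, but the HUD stays crisp.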
 
This thread will probably have served its purpose by then. Its creation was to determine the rendering resolution to see how games were using the hardware. Over the years, the evolution of rendering techniques has made a single rendering resolution + MSAA pretty rare, so counting pixels isn't an accurate investigation into what the engine is doing.

I suppose there's still an IQ metric, which is the presence of jaggies. For that we could carry on counting the resolution of the geometry (which is what we do now), but with various AA methods, determining the MSAA samples strikes me as a waste of effort. If games next gen don't all have great IQ from AA techniques, then it's time to throw in the towel and call it a day as regards consoles. :p
 
A bit OT, but where are these pics taken from? Is there footage of the PS3 version somewhere? I'm curious to see how each version turns out since I still haven't decided which one to pick up.

There will be a demo for both platforms on September 18th.
 
Hm... it doesn't exhibit the usual FXAA qualities, plus there's still a lot of edge shimmering. Probably the same PPAA implementation as on the 360.
 
A bit OT, but where are these pics taken from? Is there footage of the PS3 version somewhere? I'm curious to see how each version turns out since I still haven't decided which one to pick up.

These are from the Dragon's Dogma pre-order early access; the demo is apparently the old E3 build, compared to what will be released on the 18th.
 
Also, some final retail copies exist in Poland, which were stolen from Capcom's shipping department. Not sure if there's any footage of those out there.
 