Unreal Engine 4

I think he means the memory footprint of a g-buffer "pixel". If it's too big, it eats too much bandwidth; that's his conjecture (not necessarily true, just saying).
 
Does anyone know how many bytes per pixel Unreal 4 requires? This will have a huge impact on the future of the Xbox One. A majority of games released in the next 6-10 years will use this engine, and if it requires more than 16 bytes per pixel, the One will most likely not reach 1080p for most games.

forty-two
 

[Attachment: GBufferUE4.JPG, a slide showing the UE4 GBuffer layout]

source : https://de45xmedrsdbp.cloudfront.ne...Behind_the_Elemental_Demo_16x9-1248544805.pdf

The initial GBuffer layout from two years ago used 28 bytes/pixel, but keep in mind that not everything needs to be in ESRAM at the same time. Also note that this is purely informative, since the layout has changed recently when they switched to a more physically based renderer: they've replaced the notions of DiffuseColor/SpecularColor/SpecularPower with things like BaseColor/Metallic/Roughness/Cavity.
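A quick back-of-the-envelope sketch of what that 28 bytes/pixel figure implies for the 32 MB ESRAM discussed in this thread. The resolutions are chosen for illustration, and as noted above, a real renderer doesn't have to keep every target in ESRAM at once:

```python
# Rough check: does a 28-byte/pixel GBuffer fit in 32 MiB of ESRAM?
# (Illustrative only; not every render target must live in ESRAM at once.)
GBUFFER_BYTES_PER_PIXEL = 28
ESRAM_BYTES = 32 * 1024 * 1024  # Xbox One ESRAM: 32 MiB

def gbuffer_footprint(width, height, bpp=GBUFFER_BYTES_PER_PIXEL):
    """Total GBuffer size in bytes at the given resolution."""
    return width * height * bpp

for w, h in [(1280, 720), (1920, 1080)]:
    size = gbuffer_footprint(w, h)
    fits = "fits" if size <= ESRAM_BYTES else "does not fit"
    print(f"{w}x{h}: {size / 2**20:.1f} MiB ({fits})")
# 720p comes in around 24.6 MiB (fits); 1080p around 55.4 MiB (does not fit)
```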
 
Does someone know how to get 3840x2160 working?

"setres 3840x2160" won't do it. I though everybody was into 4K but it seems many still think in crt...
 

Try this: edit GameUserSettings.ini under WindowsNoEditor/Elemental/Saved/Config/WindowsNoEditor and set:

bUseVSync=False
ResolutionSizeX=3840
ResolutionSizeY=2160
 
The g-buffer is exactly what I'm talking about. For example, Titanfall uses deferred shading and stores 28 bytes per pixel as input to the pixel shader. If you take its confirmed resolution on the Xbox One (1408x792) times 28 bytes, it comes in just under 32MB. That's not a coincidence. Zeross shows a great breakdown of the various buffers in a typical deferred renderer: 28 bytes, exactly like Titanfall. If homerdog is accurate and Unreal 4 needs 42 bytes per pixel, that's a huge problem for Microsoft.

http://www.eurogamer.net/articles/digitalfoundry-2014-titanfall-ships-at-792p
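Checking the arithmetic behind that claim. The resolution comes from the Eurogamer article above and the 28 bytes/pixel from the GBuffer breakdown earlier in the thread; the ESRAM size is the commonly cited 32 MiB:

```python
# Verify the Titanfall claim: 1408x792 at 28 bytes/pixel vs. 32 MiB of ESRAM.
width, height, bytes_per_pixel = 1408, 792, 28
gbuffer = width * height * bytes_per_pixel   # 31,223,808 bytes
esram = 32 * 1024 * 1024                     # 33,554,432 bytes

print(f"GBuffer: {gbuffer / 2**20:.1f} MiB")         # ~29.8 MiB
print(f"ESRAM utilisation: {gbuffer / esram:.0%}")   # ~93%
```

So the GBuffer at 792p uses roughly 93% of the ESRAM, which supports the "not a coincidence" reading: 1408x792 looks chosen to fill the 32 MiB budget.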
 
Epic shares the Unreal Engine 4 roadmap

It's amazing: first they give the source code away for $19, and now they keep us informed about what they're planning. Not long ago such information would have been considered sensitive and kept highly secret, and now it's open to anyone.

Epic has clearly chosen to reinvent itself, and it's really laudable: when you have a highly successful business, it's easy to become complacent, lose your focus, and hang on to your outdated business model until someone disrupts your market and, before you realize it, you become irrelevant (think major labels with MP3s and the digital distribution of music). They're trying really hard to avoid that pitfall.
 
To be honest, Epic is just trying to get top-to-bottom usage of their engine; I wouldn't say they have chosen to reinvent themselves. It helps the top end of their business if learners and indies get experience with their tools. The more people that know their stuff, the more people will want to keep using it, and the more chances someone will license it.
 