Is UE4 indicative of the sacrifices devs will have to make on consoles next gen?

Yes, Durango definitely has only 6 cores available to devs.

But regardless of the PS4 vs PC talk, I'm just happy that the PS4, at least, will eventually be able to run games that look like the Infiltrator demo. :smile:

It's definitely the next gen demo that has impressed me the most.
 
Yes, Durango definitely has only 6 cores available to devs.

But regardless of the PS4 vs PC talk, I'm just happy that the PS4, at least, will eventually be able to run games that look like the Infiltrator demo. :smile:

It's definitely the next gen demo that has impressed me the most.

I'm kinda disappointed I dumped $450 into a video card when I'm primarily a console gamer. Darn Crysis 3. :cry:
 
Because DX11 is much more efficient and built around multithreading. You can find examples here, or in Repi's BF3 presentation, of how they managed to cut the number of draw calls in DX10+ compared with DX9, and of how the improved multithreading works. There's more to DX11 than that too, like atomics and better use of the shader cores.
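To illustrate the multithreading point, here's a minimal C++ sketch of the DX11 deferred-context pattern those presentations describe: worker threads record command lists in parallel and a single immediate context replays them. The worker count and the empty recording body are placeholders of mine, not anything from a real engine, and error handling is omitted.

Code:
// Minimal sketch of D3D11 threaded command recording.
#include <d3d11.h>
#include <thread>
#include <vector>
#pragma comment(lib, "d3d11.lib")

int main() {
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* immediate = nullptr;
    D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                      nullptr, 0, D3D11_SDK_VERSION,
                      &device, nullptr, &immediate);

    const int kWorkers = 4;
    std::vector<ID3D11CommandList*> lists(kWorkers, nullptr);
    std::vector<std::thread> threads;

    for (int i = 0; i < kWorkers; ++i) {
        threads.emplace_back([&, i] {
            // Each worker records its slice of the frame into its own
            // deferred context, in parallel with the other workers.
            ID3D11DeviceContext* deferred = nullptr;
            device->CreateDeferredContext(0, &deferred);
            // ... set state and issue draw calls on `deferred` here ...
            deferred->FinishCommandList(FALSE, &lists[i]);
            deferred->Release();
        });
    }
    for (auto& t : threads) t.join();

    // Single submission point: replay every recorded list, in order,
    // on the immediate context.
    for (ID3D11CommandList* cl : lists) {
        immediate->ExecuteCommandList(cl, FALSE);
        cl->Release();
    }

    immediate->Release();
    device->Release();
    return 0;
}

Only the recording is parallelized here; submission stays serial and driver-side costs vary, which is broadly why draw-call overhead remains a talking point on PC.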
What about the following article, written around two years after DirectX 11 was released?
http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/2

Then, there are the tests done and graphed by forum members (DirectX 11 vs OpenGL).
http://beyond3d.com/showpost.php?p=1630487&postcount=24

Timothy Lottes said: "One simple example, drawcalls on PC have easily 10x to 100x the overhead of a console with a libGCM style API..."

Plus, on PS4 you would have architectural advantages, like unified memory, adding to the efficiency.

Also, wouldn't it be more accurate to say PCs lose half of their theoretical performance relative to consoles, rather than saying consoles double theirs? After all, isn't it mainly the software, plus some architectural differences, keeping GPUs from performing as they should in a PC environment?
 
DF: UE4 PC vs PS4

Last week's GDC reveal of Unreal Engine 4 running on PlayStation 4 hardware gave us a revised look at 2012's Elemental PC demo - at the time operating on Core i7 in combination with Nvidia's powerful GTX 680 - and after the dust settled we were curious to see how the two renditions compared directly. This should give us some idea of the ways in which Epic has reshaped its code to better suit the new console platform. Of course, it's early days and UE4 in itself is still in development, but the question remains - to what extent can PS4 match up to a top-end PC?

[...]

So just how has the first UE4 tech demo transitioned across from PC to PS4? The biggest casualty is the omission of real-time global illumination, which produced some really impressive lighting in the original presentation - specifically in the way that light sources "bounce" off different materials. Dynamic generation of shadows also appears to be an issue, but we can't really read too much into that as positioning of the sun appears to have changed significantly from PC to PS4, altering the way the scenes are lit.

While the lack of real-time GI is a bit of a blow, we get the impression that this really heavy tech proved too much for the new console hardware (and bearing in mind the power of the PS4, an array of lower/mid-range PC graphics cards too) because it's been replaced with an enhanced version of the "baked", pre-computed lighting system used in Unreal Engine 3: Lightmass. This works in combination with a form of real-time global illumination on objects - not exactly a new approach, as the same basic principles were in place on Halo 3.

Other effects have definitely been scaled back, but not to an extent that has much of an impact on the presentation - GPU particles are fewer in number, depth of field has been significantly retooled and isn't quite as impactful on PS4, while object-based motion blur appears to have been removed. The flowing lava effect had real depth and texture to it in the original PC version - on PS4, it's significantly flatter.

[...]

With all of this in mind, the fact that PS4 is within striking distance at all is a fairly substantial achievement. Only the omission of Sparse Voxel Octree Global Illumination tech (SVOGI) comes across as a disappointment - and from this, it's difficult to avoid the conclusion that at a base-line level, the next generation of console hardware isn't quite as powerful as Epic was hoping for this time last year.
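As an aside, for anyone wondering what the article's "baked Lightmass plus real-time GI on objects" combination amounts to in code terms, here is a tiny C++ sketch of the probe half: dynamic objects sample irradiance probes precomputed by the lighting build, instead of tracing bounce light at runtime. The L1 spherical-harmonic encoding and every name here are my own illustrative assumptions, not Epic's actual format.

Code:
// Sketch: a dynamic object reconstructing baked bounce lighting
// from a precomputed irradiance probe.
#include <cstdio>

struct Vec3 { float x, y, z; };
static Vec3 add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 mul(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

// One probe: an ambient (band-0) RGB term plus three directional
// (band-1) RGB terms, baked offline at a point in the level.
struct Probe { Vec3 sh[4]; };

// Irradiance reaching a surface with normal n, using the standard
// L1 convolution weights (0.886227 = pi * Y00, 1.023328 = 2pi/3 * Y1m).
static Vec3 evalProbe(const Probe& p, Vec3 n) {
    Vec3 e = mul(p.sh[0], 0.886227f);
    e = add(e, mul(p.sh[1], 1.023328f * n.y));
    e = add(e, mul(p.sh[2], 1.023328f * n.z));
    e = add(e, mul(p.sh[3], 1.023328f * n.x));
    return e;
}

int main() {
    // A probe lit mostly from above (positive z), queried for an
    // upward-facing surface: most of the baked bounce light arrives.
    Probe p{{{0.3f, 0.3f, 0.3f}, {0, 0, 0}, {0.2f, 0.15f, 0.1f}, {0, 0, 0}}};
    Vec3 e = evalProbe(p, {0.0f, 0.0f, 1.0f});
    std::printf("irradiance: %f %f %f\n", e.x, e.y, e.z);
    return 0;
}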
 
SVOGI was omitted from UE4 in general versus last year's version, so that can't really be held against the PS4 ;). Richard will update the article.
 
SVOGI was omitted from UE4 in general versus last year's version, so that can't really be held against the PS4 ;).

Of course not.

I wonder if there's still room for improvement in that SVOGI algo. White papers have it running anywhere from 20-70ish fps on GTX 480s, IIRC. Cutting console res closer to 720p would definitely help too.
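For reference, the core of SVOGI is Crassin-style voxel cone tracing: march a cone through a pre-filtered voxel version of the scene, reading coarser mips as the cone widens, and composite front-to-back. Here's a rough CPU sketch of that inner loop, with a dense mip chain and a constant-grey stub volume standing in for the real sparse octree (all names are mine, and the real thing runs on the GPU):

Code:
// Rough CPU sketch of the cone-tracing loop at the heart of SVOGI.
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };
static Vec3 add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 mul(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

struct Voxel { Vec3 rgb; float alpha; };

// Stand-in for the pre-filtered voxel mip chain: coarser mips hold
// pre-integrated radiance/occlusion over wider regions of the scene.
static Voxel sampleVoxels(Vec3 /*pos*/, float /*mip*/) {
    return {{0.2f, 0.2f, 0.2f}, 0.1f};   // uniform grey, mildly opaque
}

// March one cone. The sampling diameter grows with distance, so both
// the step size and the mip level increase as the cone widens; colour
// accumulates front-to-back until the cone is effectively occluded.
static Vec3 traceCone(Vec3 origin, Vec3 dir, float tanHalfAngle,
                      float voxelSize, float maxDist) {
    Vec3 radiance{0.0f, 0.0f, 0.0f};
    float occlusion = 0.0f;
    float dist = voxelSize;               // offset to avoid self-hits
    while (dist < maxDist && occlusion < 0.99f) {
        float diameter = std::max(voxelSize, 2.0f * tanHalfAngle * dist);
        float mip = std::log2(diameter / voxelSize);
        Voxel v = sampleVoxels(add(origin, mul(dir, dist)), mip);
        float w = (1.0f - occlusion) * v.alpha;   // front-to-back blend
        radiance = add(radiance, mul(v.rgb, w));
        occlusion += w;
        dist += 0.5f * diameter;          // step scales with cone width
    }
    return radiance;
}

int main() {
    Vec3 gi = traceCone({0, 0, 0}, {0, 1, 0}, 0.577f, 0.1f, 10.0f);
    std::printf("one-cone GI estimate: %f %f %f\n", gi.x, gi.y, gi.z);
    return 0;
}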

Update to DF article:

Update: To clarify, it's our understanding that there won't be a real-time GI/Lightmass divide between PC and console in final UE4 games - we're looking at the pre-computed solution across all platforms.
 
Somewhat unrelated, but here is a video I found of a guy testing out his real-time GI solution, and at one point (4:40) the real gameplay impact of real-time GI is self-evident. There is a second character with a flashlight moving about the environment. At one point he and all the light from his flashlight are occluded by big containers in front of them, yet you can tell where he is by the bounced light that comes out of the corridor he is in and lights the taller walls further back.

Here color bleeding is not just graphical flair; it adds to the game, similar to how global shadowing does this gen (detecting enemies or other things happening through their shadows). The best way to go about it is through robust, general solutions, so situations like that can emerge spontaneously even in places the designers never anticipated; any sort of baking hurts the game's ability to freely move big pieces of geometry, handle destruction and so on. Real-time GI would be a great next-gen feature, and it is still one worth pursuing.


4'37'': http://youtu.be/1XSTejoiHMY?t=4m37s
 
Well, as I said, I read somewhere that each moving object cost 5 ms of updating the sparse voxel octree data. At 5 ms apiece, just three moving objects would eat a whole 16.7 ms (60 fps) frame, so it's too expensive until Nvidia Einstein and its 40 TFLOPS...
 
Digital Foundry said:
Update: Brian Karis, senior graphics programmer at Epic Games, adds some more insight in the comments below, explaining some of the more obvious differences - particularly in terms of the very different lighting schemes. At the technical level, the two demos are closer than it seems:

"The biggest changes actually came from the merging of two separate cinematics, the original Elemental and the extended Elemental we showed at PS4's launch event. Each had different sun directions and required some compromises to join them. This resulted in some major lighting differences that aren't platform related but were due to it being a joined cinematic. Another effect, in the original you could see the mountains through the door where in the merged one we made the view through the door white since the mountains outside were no longer the same. Same deal with the mountain fly by. The old mountain range doesn't exist in the new one. These changes from the merge make direct comparisons somewhat inaccurate.

"Feature wise most everything is the same, AA resolution, meshes, textures (PS4 has tons of memory), DOF (I assure you both use the same Bokeh DOF, not sure why that one shot has different focal range), motion blur.

"Biggest differences are SVOGI has been replaced with a more efficient GI solution, a slight scale down in the number of particles for some FX, and tessellation is broken on ps4 in the current build which the lava used for displacement. We will fix the tessellation in the future."

http://www.eurogamer.net/articles/digitalfoundry-unreal-engine-4-ps4-vs-pc
 
For me it's a pity to see that, despite the DF articles, for many people these demos are considered the be-all and end-all of next-gen power.
 
While I don't doubt what the Epic programmer says, it still looks noticeably worse. I think that's mostly down to needing more time in the oven.

You'd think they'd have realized a comparison would easily be made so soon after the PC one was released, though. I'm not sure why they needed to release this 'updated' version so quickly either. From a PR perspective it's not so great.
 
While I don't doubt what the Epic programmer says, it still looks noticeably worse. I think that's mostly down to needing more time in the oven.

You'd think they'd have realized a comparison would easily be made so soon after the PC one was released, though. I'm not sure why they needed to release this 'updated' version so quickly either. From a PR perspective it's not so great.

The worst part is that, for a tech demo, it looks years behind the KZ SF gameplay demo. A strange way to sell your engine for PS4.
 
The worst part is that, for a tech demo, it looks years behind the KZ SF gameplay demo. A strange way to sell your engine for PS4.

To be fair, I think GG had tons more time with the dev kit than Epic did, so it's not surprising KZ SF looked much better given such an advantage. And what's scary is that neither of them had the final dev kit (8 GB GDDR5) until recently.
I hope we see Gears of War 4 on PS4 at E3, given the recent praise from Mark Rein.
 
While I don't doubt what the Epic programmer says, it still looks noticeably worse. I think that's mostly down to needing more time in the oven.

You'd think they'd have realized a comparison would easily be made so soon after the PC one was released, though. I'm not sure why they needed to release this 'updated' version so quickly either. From a PR perspective it's not so great.

Well, broken tessellation and all that ... nice to see the comments directly from Epic regardless.

Remember that their PR is partly directed at multi-platform developers. They're Epic's biggest customers, and GDC is the biggest developer conference of the year.

We'll see how far things have come two months from now. ;)
 
What about the following article, written around two years after DirectX 11 was released?
http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/2

Then, there are the tests done and graphed by forum members (DirectX 11 vs OpenGL).
http://beyond3d.com/showpost.php?p=1630487&postcount=24

Timothy Lottes said: "One simple example, drawcalls on PC have easily 10x to 100x the overhead of a console with a libGCM style API..."

All that evidence relates only to draw calls, not to the PC's overall efficiency disadvantage, so in isolation it's not really proof of anything other than the already well-known fact that PCs can be draw-call limited compared with consoles.

Having said that, the very article you link specifically refers to the improvements in DX11, further reinforcing the previous statements that Carmack's 2x advantage is no longer a valid reference point.
 
Why the hell are you people still stuck on the Elemental demo, which wasn't that impressive from the start anyway?

The new one, Infiltrator, is head and shoulders above practically anything else we've seen on the new consoles, including Battlefield 4 and KZ SF. Epic won't have any problems selling that engine.
 
The new one, Infiltrator, is head and shoulders above practically anything else we've seen on the new consoles, including Battlefield 4 and KZ SF. Epic won't have any problems selling that engine.
It looks good, but remember to separate "cinematic" from "gameplay". In the former (like Infiltrator) you can pretty much just pre-bake all of the lighting (into probes and lightmaps) and optimize exactly what the user is seeing. While it is definitely pretty and is running well, don't necessarily expect the lighting and animation in particular to translate perfectly to dynamic gameplay situations.

The whole "overhead" vs "to the metal" conversation is making me sigh... it's not as simple as people want to make it out to be. It's not a single number, so don't try to boil it down to one. As consoles add more multitasking/dynamic features, they'll add overhead as well. I'm not trying to defend the situation on PC (heaven knows it's the most sub-optimal for shared memory/SoC's in particular!) but there's a lot of ridiculous claims being made, and then people further taking stuff out of context. Frankly history pretty firmly supports the notion that while there's some advantage to coding to the metal on consoles, it doesn't exactly kick consoles into another tier of performance than their underlying hardware would imply. I seriously doubt that a console generation that is closer-than-ever to PC architecture is going to change that, but would love to be proven wrong.

I'll also note that draw call overhead is even less relevant this generation than in the past. DX10+ texture arrays, constant buffers, instancing, etc. all reduce the need for batch-breaking state changes.
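To make that last point concrete, here's a hedged D3D11 fragment of the instancing case: a single DrawIndexedInstanced call draws every copy of a mesh, with per-copy data read from a second vertex stream rather than the CPU issuing one draw call per object. Buffer and shader creation are assumed to happen elsewhere, and all the names are illustrative.

Code:
// Sketch: one instanced draw replacing N per-object draw calls.
#include <d3d11.h>

// Slot 1 advances once per instance rather than once per vertex; this
// layout is fed to CreateInputLayout at init (not shown).
static const D3D11_INPUT_ELEMENT_DESC kLayout[] = {
    {"POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0,
     D3D11_INPUT_PER_VERTEX_DATA, 0},
    {"WORLDPOS", 0, DXGI_FORMAT_R32G32B32_FLOAT, 1, 0,
     D3D11_INPUT_PER_INSTANCE_DATA, 1},
};

void drawRocks(ID3D11DeviceContext* ctx, ID3D11Buffer* mesh,
               ID3D11Buffer* instances, ID3D11Buffer* indices,
               UINT indexCount, UINT rockCount) {
    ID3D11Buffer* vbs[2] = {mesh, instances};
    UINT strides[2] = {12, 12};   // float3 position, float3 world offset
    UINT offsets[2] = {0, 0};
    ctx->IASetVertexBuffers(0, 2, vbs, strides, offsets);
    ctx->IASetIndexBuffer(indices, DXGI_FORMAT_R32_UINT, 0);
    // One call submits every rock; no batch-breaking state changes
    // between the individual instances.
    ctx->DrawIndexedInstanced(indexCount, rockCount, 0, 0, 0);
}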
 
All that evidence relates only to draw calls, not to the PC's overall efficiency disadvantage, so in isolation it's not really proof of anything other than the already well-known fact that PCs can be draw-call limited compared with consoles.

Having said that, the very article you link specifically refers to the improvements in DX11, further reinforcing the previous statements that Carmack's 2x advantage is no longer a valid reference point.

May I ask how much efficiency the PS4 can achieve compared with a PC of the same spec? 1.5 times the efficiency? Better? Worse? Thanks!
 
The whole "overhead" vs "to the metal" conversation is making me sigh... it's not as simple as people want to make it out to be. It's not a single number, so don't try to boil it down to one.

May I ask how much efficiency the PS4 can achieve compared with a PC of the same spec? 1.5 times the efficiency? Better? Worse? Thanks!

I'm sorry, but I just had to put those two together. In one post.
 