Is UE4 indicative of the sacrifices devs will have to make on consoles next gen?

May I ask how much efficiency the PS4 can achieve compared with a PC of the same spec? 1.5x the efficiency? Better/worse? Thanks!

It's like asking how rich you can be if you go to Harvard. There is the potential and promise, plus a lot of hard work, some luck, and $$$.
 
It looks good, but remember to separate "cinematic" from "gameplay". In the former (like Infiltrator) you can pretty much just pre-bake all of the lighting (into probes and lightmaps) and optimize exactly what the user is seeing. While it is definitely pretty and is running well, don't necessarily expect the lighting and animation in particular to translate perfectly to dynamic gameplay situations.

I don't see too much detriment to gameplay there. Most games do pretty damn well with fully pre-baked lighting, and in a deferred renderer it isn't that hard to add a few spots or omnis for muzzle flashes and flashlights and such anyway. Also, the - justly - praised MGS V demos have pre-baked lighting as well, and yet they still manage a full time-of-day system by blending between different light probes.
Dynamic GI is somewhat overrated, IMHO - games managed to do quite well without it for a long time, and it's not like gameplay has been getting more complex in recent years anyway.
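To be clear about how cheap that MGS V-style probe blending is: something along these lines, just lerping baked second-order SH coefficients by a time-of-day factor (the struct and layout here are made up for illustration, not anyone's actual data format):

```cpp
#include <array>
#include <cstddef>

// Hypothetical baked light probe: 9 spherical-harmonic coefficients per RGB channel.
struct LightProbe {
    std::array<float, 9 * 3> sh;
};

// Blend two baked probes (say "noon" and "dusk") by a time-of-day factor t in [0,1].
// No GI is recomputed at runtime; the lighting just interpolates between baked data.
LightProbe BlendProbes(const LightProbe& a, const LightProbe& b, float t)
{
    LightProbe out;
    for (std::size_t i = 0; i < out.sh.size(); ++i)
        out.sh[i] = a.sh[i] * (1.0f - t) + b.sh[i] * t;
    return out;
}
```

Per frame that's a handful of lerps per probe, which is why a full time-of-day system on top of baked lighting is such an attractive trade.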

UE4 looks pretty damn good, period. Also, I'd say it's the most impressive stuff as a whole among the next-gen tech demos; individual features like Activision's skin and eye shader are better looking, and BF4 has very nicely tuned lighting in terms of color and contrast, but as a whole package, Infiltrator is the best. I'd have a very, very hard time convincing anyone that we could do better in offline CG.
 
Original shot:
ue4mklkh.jpg


Editor shot without washed out video look and motion blur:
unreal_engine_4pexnw.jpg


Gamersyde has an 11,520 x 6,480 version of this new shot.
http://images.gamersyde.com/image_unreal_engine_4-21778-2539_0001.jpg
 
The whole pic is washed in a chromatic aberration effect. It looks great but I wish they would stop trying to emulate camera effects in games to make things 'more real' ;)
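For what it's worth, the effect itself is dirt cheap, which is probably why everyone slaps it on: you just sample the red and blue channels slightly offset from green, scaled by distance from the image centre. A toy CPU version (the Image struct is made up; in practice this is a few lines in a post-process shader):

```cpp
#include <algorithm>

// Toy chromatic-aberration pass: push red outward and pull blue inward relative
// to green, proportional to distance from the image centre, so the channels
// separate towards the edges like a badly corrected lens.
struct Image {
    int w, h;
    const float* rgb; // w*h*3, linear RGB
};

void ChromaticAberration(const Image& src, float* dstRgb, float strength /*~0.003*/)
{
    auto sample = [&](float u, float v, int c) {
        int x = std::clamp(static_cast<int>(u * (src.w - 1) + 0.5f), 0, src.w - 1);
        int y = std::clamp(static_cast<int>(v * (src.h - 1) + 0.5f), 0, src.h - 1);
        return src.rgb[(y * src.w + x) * 3 + c];
    };
    for (int y = 0; y < src.h; ++y)
        for (int x = 0; x < src.w; ++x) {
            float u = x / float(src.w - 1), v = y / float(src.h - 1);
            float du = u - 0.5f, dv = v - 0.5f;
            float* out = &dstRgb[(y * src.w + x) * 3];
            out[0] = sample(u + du * strength, v + dv * strength, 0); // red pushed out
            out[1] = sample(u, v, 1);                                 // green untouched
            out[2] = sample(u - du * strength, v - dv * strength, 2); // blue pulled in
        }
}
```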
 
Dynamic GI is somewhat overrated, IMHO - games managed to do quite well without it for a long time, and it's not like gameplay has been getting more complex in recent years anyway.
Sure, I didn't mean to imply that we need fully dynamic GI for something to look good. I was just using that as an example of the fact that a very "directed" experience will always look better than gameplay. Animation is probably the more obvious example here.

Anyways it looks good, no doubt. I just don't necessarily expect it to look quite that good in a game, but we'll see :)
 
The point is that the engine has the potential to look better than anything else we've seen so far and the Elemental demo is irrelevant in comparisons with KZSF or BF4 or whatever.

Obviously the actual poly counts and scene complexity and animation fidelity will be dictated by the nextgen console hardware performance envelopes. But Epic is - at this time - ahead of the rest in the showcase department. BF4 may have better daytime lighting but the assets don't look this good and the realistic warfare setting limits them to an extent, too.
 
All that evidence only relates to draw calls, not the overall efficiency disadvantage of the PC, so in isolation it's not really proof of anything other than the already well-known fact that PCs can be draw-call limited compared to consoles.

That said, the very article you link specifically refers to the improvements in DX11, further reinforcing the previous statements that Carmack's 2x advantage is no longer a valid reference point.
That's obvious, or consoles would have up to a 100x performance advantage over a PC with similar hardware :)

The PS4 should have more architectural advantages over the PC than consoles had over the PC in the DX9 days (especially with GPGPU). Tomb Raider, with TressFX, cut PC performance almost in half. Wouldn't that likely be because the GPU had to stop processing graphics to do compute tasks? Wouldn't Onion+ and Garlic, with cache bypass and the ability to lock particular data in cache, add to that overall measure? If you say no, then why not?

Can you show some evidence, like the differences in that member-provided chart (DirectX 11 vs OpenGL)?
 
All that evidence only relates to draw calls, not the overall efficiency disadvantage of the PC, so in isolation it's not really proof of anything other than the already well-known fact that PCs can be draw-call limited compared to consoles.

That said, the very article you link specifically refers to the improvements in DX11, further reinforcing the previous statements that Carmack's 2x advantage is no longer a valid reference point.

I think it's more than 2x, realistically.

720p aside, today's console games look in the same ballpark as today's PC games, despite having 1/10th to 1/20th the power.

Tomb Raider on my 360 doesn't look like a different game than Tomb Raider on my brother's 6970 PC.
 
Tomb Raider, with TressFX, cut PC performance almost in half. Wouldn't that likely be because the GPU had to stop processing graphics to do compute tasks?
No, it's because TressFX rendering is obnoxiously brute force. It'd run just as poorly (if not worse) on a console. The compute part of it is fairly cheap and is done all together at the start of the frame. One compute/render transition in a frame is hardly an issue (Frostbite, for instance, typically does several).
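To make the "all together at the start of the frame" point concrete, the pattern is roughly this (the CommandList/Scene types below are stand-ins I made up so the sketch compiles, not any real engine's API):

```cpp
#include <functional>
#include <vector>

// Minimal stand-in types; a real engine would record into D3D/GNM/Vulkan command lists.
struct ComputeJob   { std::function<void()> run; };
struct GraphicsPass { std::function<void()> run; };

struct CommandList {
    std::vector<std::function<void()>> work;
    void Dispatch(const ComputeJob& j)  { work.push_back(j.run); }
    void Barrier()                      { /* single compute -> graphics sync point */ }
    void Draw(const GraphicsPass& p)    { work.push_back(p.run); }
};

struct Scene {
    ComputeJob   hairSimulationCS, particleUpdateCS;
    GraphicsPass gbufferPass, lightingPass, transparentsPass;
};

// Batch every simulation-style compute job up front, then run the graphics passes,
// so the GPU only switches between compute and graphics work once per frame.
void BuildFrame(CommandList& cl, const Scene& scene)
{
    cl.Dispatch(scene.hairSimulationCS);   // e.g. a strand-physics sim
    cl.Dispatch(scene.particleUpdateCS);
    cl.Barrier();                          // the one compute/render transition

    cl.Draw(scene.gbufferPass);
    cl.Draw(scene.lightingPass);
    cl.Draw(scene.transparentsPass);       // rendering the hair strands lives here,
                                           // and that's the expensive part, not the sim
}
```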

Honestly people are mostly just making stuff up in the overhead discussion... please just stop.

720p aside, today's console games look in the same ballpark as today's PC games, despite having 1/10th to 1/20th the power.
Uhuh... :S Yeah let's stop this conversation already before it gets even more stupid.
 
I'd actually like to see DF compare a very early 360 game and a late 360 game, to remind people what a significant amount of time and energy spent familiarizing yourself with the hardware gets you. We all know it happens, but it's just a gut feeling; I don't think I've ever seen such differences properly analyzed. That way we don't have to keep bringing up console "programming to the metal" hocus pocus every time we see a demo, especially when E3 rolls around.
 
The point is that the engine has the potential to look better than anything else we've seen so far and the Elemental demo is irrelevant in comparisons with KZSF or BF4 or whatever.

Obviously the actual poly counts and scene complexity and animation fidelity will be dictated by the nextgen console hardware performance envelopes. But Epic is - at this time - ahead of the rest in the showcase department. BF4 may have better daytime lighting but the assets don't look this good and the realistic warfare setting limits them to an extent, too.

From looks alone, I didn't find it that impressive, especially after seeing the 1080p screenshots. The screenshot on the previous page, at a crazy resolution and without all the blur, sure looks good though.
 
The blur and other post processing helps a great deal in making the demo look less like realtime and more like offline rendered or even live action.
 
The blur and other post processing helps a great deal in making the demo look less like realtime and more like offline rendered or even live action.

To some, maybe, but it kills all the small details for me. We already know anyone can do blur; devs should focus on showing razor-sharp images with high detail IMO :p
 
^I disagree completely. Anything that makes game graphics look less fake is an enormous plus for me.
I actually use FXAA on the PC version of Skyrim deliberately because it kills all those razor sharp polygonal edges.
 
The point is that the engine has the potential to look better than anything else we've seen so far and the Elemental demo is irrelevant in comparisons with KZSF or BF4 or whatever.

Obviously the actual poly counts and scene complexity and animation fidelity will be dictated by the nextgen console hardware performance envelopes. But Epic is - at this time - ahead of the rest in the showcase department. BF4 may have better daytime lighting but the assets don't look this good and the realistic warfare setting limits them to an extent, too.

I would say Deep Down and Agni's Philosophy are at least in the same league if not higher. I'm pretty impressed at what those Japanese companies can do nextgen.
 
^I disagree completely. Anything that makes game graphics look less fake is an enormous plus for me.
I actually use FXAA on the PC version of Skyrim deliberately because it kills all those razor sharp polygonal edges.

Some is nice, but this has way too much.

Original shot:
ue4mklkh.jpg


Editor shot without washed out video look and motion blur:
unreal_engine_4pexnw.jpg


Gamersyde has an 11,520 x 6,480 version of this new shot.
http://images.gamersyde.com/image_unreal_engine_4-21778-2539_0001.jpg
 
The editor picture is obviously much better. You can achieve that to some degree from the video by sharpening it post-process in a media player. The original pic looks like a 480p image :???:

Game devs, give us the sharp version out of the box and/or let us toggle blur...
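(If anyone wonders what a media player's sharpen filter is actually doing: basically unsharp masking, i.e. blur the image and push each pixel away from its blurred value. A grayscale toy version, purely for illustration:)

```cpp
#include <algorithm>
#include <vector>

// Unsharp mask on a grayscale image in [0,1]: 3x3 box blur, then amplify the
// difference between the original pixel and its blurred neighbourhood.
std::vector<float> Sharpen(const std::vector<float>& img, int w, int h, float amount)
{
    std::vector<float> out(img.size());
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            float blur = 0.0f;
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx) {
                    int sx = std::clamp(x + dx, 0, w - 1);
                    int sy = std::clamp(y + dy, 0, h - 1);
                    blur += img[sy * w + sx];
                }
            blur /= 9.0f;
            float p = img[y * w + x];
            out[y * w + x] = std::clamp(p + amount * (p - blur), 0.0f, 1.0f);
        }
    return out;
}
```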
 
Obviously the actual poly counts and scene complexity and animation fidelity will be dictated by the nextgen console hardware performance envelopes. But Epic is - at this time - ahead of the rest in the showcase department. BF4 may have better daytime lighting but the assets don't look this good and the realistic warfare setting limits them to an extent, too.

I would say Deep Down and Agni's Philosophy are at least in the same league if not higher. I'm pretty impressed at what those Japanese companies can do nextgen.

The key thing here, common to both engines, and IMO the reason they're so impressive, is physically based rendering for lighting, shading, and assets. I think most if not all engines of late have incorporated parts of it to some degree, but Deep Down, Infiltrator, and Fox Engine are the first tastes of what can be achieved when artists and the engine pipeline come together to do it right.
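Very roughly, "doing it right" means the shading is built around an energy-conserving BRDF driven by plausible material parameters (albedo, roughness, F0) instead of per-asset hand-tweaked speculars. A minimal single-light sketch of the usual Lambert + GGX/Smith/Schlick combination, as my own simplification (no metalness handling, and the diffuse term isn't weighted against Fresnel), not any of those engines' actual code:

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };
static float dot(Vec3 a, Vec3 b)  { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  mul(Vec3 a, float s) { return { a.x * s, a.y * s, a.z * s }; }
static Vec3  add(Vec3 a, Vec3 b)  { return { a.x + b.x, a.y + b.y, a.z + b.z }; }
static Vec3  normalize(Vec3 v)    { return mul(v, 1.0f / std::sqrt(dot(v, v))); }

// Direct lighting from one light: Lambert diffuse + Cook-Torrance specular with a
// GGX distribution, Smith/Schlick-GGX geometry term and Schlick Fresnel.
Vec3 ShadePBR(Vec3 N, Vec3 V, Vec3 L, Vec3 albedo, float roughness, Vec3 F0, Vec3 lightColor)
{
    const float PI = 3.14159265f;
    Vec3 H = normalize(add(V, L));
    float NdotL = std::max(dot(N, L), 0.0f);
    float NdotV = std::max(dot(N, V), 1e-4f);
    float NdotH = std::max(dot(N, H), 0.0f);
    float VdotH = std::max(dot(V, H), 0.0f);

    // GGX normal distribution
    float a  = roughness * roughness;
    float a2 = a * a;
    float d  = NdotH * NdotH * (a2 - 1.0f) + 1.0f;
    float D  = a2 / (PI * d * d);

    // Smith geometry term with the Schlick-GGX approximation
    float k = (roughness + 1.0f) * (roughness + 1.0f) / 8.0f;
    float G = (NdotV / (NdotV * (1.0f - k) + k)) * (NdotL / (NdotL * (1.0f - k) + k));

    // Schlick Fresnel
    float f = std::pow(1.0f - VdotH, 5.0f);
    Vec3  F = add(F0, mul({ 1.0f - F0.x, 1.0f - F0.y, 1.0f - F0.z }, f));

    Vec3 specular = mul(F, D * G / (4.0f * NdotV * NdotL + 1e-4f));
    Vec3 diffuse  = mul(albedo, 1.0f / PI);   // energy-conserving Lambert

    Vec3 brdf = add(diffuse, specular);
    return { brdf.x * lightColor.x * NdotL,
             brdf.y * lightColor.y * NdotL,
             brdf.z * lightColor.z * NdotL };
}
```

The point being that once the material inputs are physical, the same asset reads correctly under completely different lighting, which is exactly what those demos are showing off.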
 
Gamersyde has an 11,520 x 6,480 version of this new shot.
http://images.gamersyde.com/image_unreal_engine_4-21778-2539_0001.jpg
Too much chromatic aberration (and a poor approximation of it too, if I'm honest); it looks like one of those old-school three-color projectors that's been miscalibrated.

... though it is a 74-megapixel render... not sure what the point of it is, but I like it! Pixel-peeping, I can see that the railing is a horrible polygonal model, or has some strange normals on it.
 