Is UE4 indicative of the sacrifices devs will have to make on consoles next gen?

If MikeR really is Mark Rein, thanks for telling us more about the UE4 demo! I can hardly believe that, even unoptimized, only about 30% of the hardware's capability was used.

And on PC dev kits too? Although I still wonder whether they had the 8 GB units by then. If phil is correct, they only got them last month, with no time to create something for the PS4 reveal on that kind of hardware...
 
MikeR: perhaps he's an Epic dev, but I doubt it's Mark Rein, who has posted here in the past as MarkRein.
 
Reducing the number of effects to maintain a frame rate is one tactic. Completely changing the lighting, geometry, shadows and multiple other shader effects so it resembles a DX9 UE3 demo? All those changes because otherwise it would be an embarrassing slide show? That's the kind of thing you do if you're sending it to a different generation of hardware.

I accept your analogy to low-end DX11 cards. However, the PS4 (7850/7870 level) certainly isn't comparable to a low-end DX11 card, not in compute ability, memory bandwidth or texturing ability. Is it as good as the GPU used for the PC demo? Nope. Neither is it a thousand miles away.
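Some rough numbers on that, going by reported rather than confirmed specs, so treat them as ballpark figures and nothing official: the original Elemental demo reportedly ran on a single GTX 680, which is around 3.1 TFLOPS with 192 GB/s of memory bandwidth, while the rumoured PS4 GPU works out to roughly 1.8 TFLOPS with 176 GB/s of shared GDDR5. That's about 60% of the compute and nearly the same bandwidth, so a sizeable gap, but nothing like a generational one.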

Has the PS4 been so badly designed over the last five years that a demo has to look like that simply to hit a decent frame rate? If that's the case, I think it's fair to say that Sony wasted their R&D budget.

I don't think that's the case. So I'll choose to believe...something else...

Hasn't this generation shown that developers will sacrifice resolution before frame rate or effects?
 
Honestly, you have no clue what you are talking about. This demo was created/running on approximation hardware - not final dev/PS4 silicon. Also, the demo was created within a constrained timeframe for showing. Just for sh**s and giggles, the demo only used 27-29% of the AH resources - unoptimized. Before you ask, there is no link, I am the link.

Wow... 27-29%? So what we are seeing in that demo is actually closer to how a current-gen console (with more RAM, I would guess) or perhaps the Wii U would render that scene, and even then it is not optimized, so there are some texture problems. Very interesting. If that is the case, the engine itself is more impressive than I thought.
 
Honestly, you have no clue what you are talking about. This demo was created/running on approximation hardware - not final dev/PS4 silicon. Also, the demo was created within a constrained timeframe for showing. Just for sh**s and giggles, the demo only used 27-29% of the AH resources - unoptimized. Before you ask, there is no link, I am the link.

I very much doubt they left an extremely important presentation showing their past three years of work unoptimised, "for sh**s and giggles".
 
He didn't say it was just because, he said he was mentioning it just because. There's a difference.

And the original demo wasn't optimized in any sense either; neither was Agni's Philosophy or the Samaritan demo. Epic and Square both confirmed that.

These tech guys tend not to optimize for tech demos
 
He didn't say it was just because, he said he was mentioning it just because. There's a difference.

And the original demo wasn't optimized in any sense either; neither was Agni's Philosophy or the Samaritan demo. Epic and Square both confirmed that.

These tech guys tend not to optimize for tech demos

I will need to go back and watch the PS4 reveal, but didn't they say at the time that the demo was done in three weeks and not optimized? The whole presentation seemed in some ways geared towards developers as much as gamers, and I am sure many would find that interesting.

With that said, I think it's entirely probable that what he said is true, as the graphics hardware in the PS4 is surely capable of better results with regard to shadows and textures than what we saw in the demo.
 
I very much doubt they left an extremely important presentation showing their past three years of work unoptimised, "for sh**s and giggles".

I'll agree it sounds strange considering the PC-like CPU/GPU architecture and the PS4 being much like a PC, or, as they said at the PS4 presentation, like a supercharged PC. So if true, one could speculate that it is also just as badly optimised for PC, and thus it can improve quite a bit on both platforms.
 
Hasn't this generation shown that developers will sacrifice resolution before frame rate or effects?

Indeed, which is exactly why it makes no sense for them to ditch all the IQ that made the UE4 demo on PC look good just so they could run it at 1080p/60fps (I'm not saying that's the res/fps they used), when resolution would be an easy give.

If they wanted the PS4 demo to look like the PC version and it didn't have enough "power", it would have made more sense to drop to 720p and keep everything else as close as possible. But they didn't. Nowhere near.
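To put rough numbers on why resolution is the easy give (my own back-of-the-envelope arithmetic, not anything Epic has said): 1920 x 1080 is about 2.07 million pixels versus roughly 0.92 million at 1280 x 720, so dropping to 720p cuts per-frame pixel work by around 2.25x, which is a far bigger single saving than stripping out individual effects.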

Only Epic devs could comment on why it looks so different in comparison ;)
 
The PS4 uses bog-standard PC parts and would not suffer from having the game running on dev kits.

A dev kit with a 2 GB AMD 7870 and an underclocked FX-8150 would easily emulate the power of the final machine...

If it used custom hardware like Sony are famous for, then you would have had something to stand on... but this generation the whole dev kit argument doesn't mean squat.

So your final silicon theory is just a complete load of bollocks... as is your equally bullshitty assumption of 27-29% hardware utilisation.

So unless you have links, don't post crap.

And welcome ;)

You post in far too condescending a manner. Your failure to understand or even acknowledge basic concepts has really stalled the discussion in this thread. You're entirely insulting to other posters when you could just disagree without being a complete prick. It's one thing if you acted this way in RPSC, but in the console forum, or any forum outside of RPSC, I really don't think it should be tolerated.

With that said... your proposed dev kit specs do not account for the internal links and the shared memory space between CPU and GPU. If something like that ends up being an advantage, then yay; if not, then boohoo. And if Epic truly didn't take the 8 GB of memory into account, then they could have used that for even more visual goodness.

You should know that when programmers get used to a certain piece of hardware, they will find ways to eke more and more power out of it, usually through optimization or by attacking a problem in a different way. That means the overall performance of a machine will go up over time as devs get more used to it, which means one day we will likely see visuals on PS4 as good as or better than the PC Samaritan demo at the same resolution. I see little reason to believe we won't get there, especially with the way console devs work, always finding new ways to do things. Epic will most likely end up doing it themselves and proving you wrong.

And I'm very grateful to have a few of those devs in the beyond3d community. Don't be a jerk to other posters.
 
Honestly, you have no clue what you are talking about. This demo was created/running on approximation hardware - not final dev/PS4 silicon. Also, the demo was created within a constrained timeframe for showing. Just for sh**s and giggles, the demo only used 27-29% of the AH resources - unoptimized. Before you ask, there is no link, I am the link.

Welcome Mike, and thanks for the insights. It seems like we may have an opportunity here to put this argument to rest once and for all, so if you don't mind, could you let us know what the real reason behind the missing effects is? You've already made it clear it's not total processing power, so perhaps one particular part of the hardware was bottlenecking the rest and you didn't have time to optimise around it? Or a lack of available memory on the AH? Or purely a time constraint on being able to port over every single effect?

Cheers
 
I'll agree it sounds strange considering the PC-like CPU/GPU architecture and the PS4 being much like a PC, or, as they said at the PS4 presentation, like a supercharged PC. So if true, one could speculate that it is also just as badly optimised for PC, and thus it can improve quite a bit on both platforms.

Good to see ya again Neb. You've returned at a pretty tumultuous time. Silly season is well and truly in full swing!
 
There should be no real reason why they could not do it, so yeah, it was probably just a time constraint, but if MikeR wants to respond I will retract my hypothesis.
 
If MikeR wants to respond, he might want to start by stating his occupation. Anybody can make a new user name and talk about being "the link".
 
MikeR: perhaps he's an Epic dev, but I doubt it's Mark Rein, who has posted here in the past as MarkRein.
If they're two accounts by the same person, I have to ban them both as we don't tolerate puppet accounts. :yep2:

Pardon my ignorance, but what is the 'AH' that MikeR says is only 27-29% used?
 
Approximation hardware, I'm guessing: like Ubisoft with Watch Dogs, they were running it on a "comparable" PC. But there is no comparable PC with that kind of HSA APU-like architecture combined with that kind of RAM configuration (that large an amount of RAM shared by both CPU and GPU at 176 GB/s). In PCs, the GDDR5 bandwidth only applies to the GPU (since it is, obviously, graphics RAM), and everything else sits on RAM with much lower bandwidth. So really, you'd have to wait for retail units to see what it can really do if they aren't using actual PS4 dev kits :/
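For what it's worth, that 176 GB/s figure falls straight out of the reported memory setup (assuming the rumoured 256-bit bus and 5.5 Gbps GDDR5, which hasn't been officially broken down): 256 bits / 8 = 32 bytes per transfer, and 32 bytes x 5.5 GT/s = 176 GB/s, available to CPU and GPU alike. A typical PC CPU on dual-channel DDR3-1600, by contrast, gets 2 x 8 bytes x 1.6 GT/s = 25.6 GB/s.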
 
Approximation hardware.
Ah, yeah. Spending a little more time reading carefully helps. :mrgreen:
I very much doubt they left an extremely important presentation showing their past three years of work unoptimised, "for sh**s and giggles".
And on reading it more carefully myself, I see that's not what MikeR said. He added the < 30% utilisation comment for s**** and giggles; as in, his stirring contribution to this thread is that the demo wasn't taxing the hardware at all. He did not say that they decided to throw together a demo and show it for s**** and giggles. The effort would have been a serious undertaking, maybe a last-minute request from Sony to showcase something UE4-related, and they cobbled the demo together. The result was a very unoptimised demo with low utilisation (if MikeR is to be believed).
 
Then why is there not a single PC or console game on the horizon that comes close to the amount of detail or the particle effects shown in Killzone SF? Not even Crysis 3 is able to match it. Not counting tech demos, of course.

What do you mean by amount of detail? Crysis 3 is easily at least a step up from what was shown in the Killzone demo regarding character polycounts and texture resolution... The scenery in Crysis 3 also sports tessellation and parallax occlusion mapping, which seemed absent from the Killzone demo.
 