Is UE4 indicative of the sacrifices devs will have to make on consoles next gen?

Thanks for welcoming me and my statement still stands. ;)

PS4 uses bog-standard PC parts and would not suffer from having the game running on dev kits.

A dev kit with a 2GB AMD 7870 and an underclocked FX-8150 would easily emulate the power of the final machine...

If it used custom hardware like Sony are famous for then you would have had something to stand on... but this generation the whole dev kit argument doesn't mean squat.

So your final silicon theory is just a complete load of bollocks... as is your equally bullshitty assumption of 27-29% hardware utilisation...

So unless you have links, don't post crap.

And welcome ;)
 
Really? Killzone was nothing special,
Then why is there not a single PC or console game on the horizon that comes close to the amount of detail or particles shown in Killzone SF? Not even Crysis 3 is able to match it. Not counting tech demos, of course.
 
A dev kit with a 2GB AMD 7870 and an underclocked FX-8150 would easily emulate the power of the final machine...

That's not necessarily true. These demos tend to be pure graphics assets - meshes and textures. It all goes into VRAM - and if you stuff more than 2GB in there, your performance will degrade - potentially significantly - depending on how bandwidth-intensive your demo is - and UE4 is likely bandwidth limited.
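To put very rough numbers on that spill-over cost (an illustrative back-of-envelope in Python, assuming a GTX 680-class card and PCIe 3.0 x16 - the exact figures matter less than the size of the gap):

[code]
# Rough back-of-envelope: the cost of spilling past a 2GB VRAM budget.
# Illustrative figures only - real behaviour depends on drivers and access patterns.
vram_bandwidth_gbs = 192.0   # roughly GTX 680-class GDDR5
pcie3_x16_gbs = 15.75        # theoretical PCIe 3.0 x16, one direction

slowdown = vram_bandwidth_gbs / pcie3_x16_gbs
print(f"Spilled data is read roughly {slowdown:.0f}x slower than data resident in VRAM")
[/code]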
 
I seriously, seriously doubt that this demo uses over 2GB of main RAM, especially as the demo might not even be 64-bit.
The GTX 680 is just a much, much faster GPU than what's inside PS4, and that's what the problem is, not lack of memory :rolleyes:

The demo is quite likely 64bit - they originally showed it running inside their editor - no way that's 32bit.
The GTX 680 sure has more ALU - but you can have all the ALU in the world and still have it just sit idle and wait for data to arrive from slow DDR3 across the PCI-E.
 
The demo is quite likely 64bit - they originally showed it running inside their editor - no way that's 32bit.
The GTX 680 sure has more ALU - but you can have all the ALU in the world and still have it just sit idle and wait for data to arrive from slow DDR3 across the PCI-E.

The f...! How I would love it if you could talk!
 
Honestly, you have no clue what you are talking about. This demo was created/running on approximation hardware - not final dev/PS4 silicon. Also, the demo was created within a constrained timeframe for the showing. Just for sh**s and giggles: the demo only used 27-29% of the AH resources - unoptimized. Before you ask, there is no link; I am the link.

Are you a dev/insider/media etc? Your scenario does seem plausible but we need more credence than that here.
 
This is the first time in console history that they've not released with the absolute best of the best so of course they'll be lacking compared to PC.
So this got me thinking a bit more... other than the CPU, is the PS4 really that much weaker compared to a high-end PC today, vs a high-end PC and PS3 back in late 2006/early 2007?

Before:
Intel Conroe/Kentsfield vs Cell
GeForce 8800 GTX / Radeon 2900 XT 512MB-768MB vs 7800/7900 256MB GDDR3
2-4GB DDR2 of system RAM vs 256MB XDR

Now:
Intel i7 vs 8 Jaguar cores (1.6GHz? Plus any additional chips to alleviate processing?)
GeForce 680 / Radeon 7970 2-3GB GDDR5 vs Radeon 7850/7870 8GB GDDR5 unified RAM
8-16GB DDR3 of system RAM vs 8GB GDDR5 unified RAM

I'm not the most technically inclined person here, and I'm not up to speed on upcoming chips/cards, so feel free to make any corrections.
 
So this got me thinking a bit more... other than the CPU, is the PS4 really that much weaker compared to a high-end PC today, vs a high-end PC and PS3 back in late 2006/early 2007?

Before:
Intel Conroe/Kentsfield vs Cell
GeForce 8800 GTX / Radeon 2900 XT 512MB-768MB vs 7800/7900 256MB GDDR3
2-4GB DDR2 of system RAM vs ~256MB XDR

Now:
Intel i7 vs 8 Jaguar cores (1.6GHz? Plus any additional chips to alleviate processing?)
GeForce 680 / Radeon 7970 2-3GB GDDR5 vs Radeon 7850/7870 8GB GDDR5 unified RAM
8-16GB DDR3 of system RAM vs 8GB GDDR5 unified RAM

I'm not the most technically inclined person here, and I'm not up to speed on upcoming chips/cards, so feel free to make any corrections.
In fact, this time they have top-notch ROPs, texture units, bandwidth and RAM, an 'enhanced GPU' (bigger caches somewhere, like with RSX, and direct communication with the CPU), and the fucking Mark Cerny orchestrating all of this. Remember that RSX had 8 ROPs instead of the 16 of the 7800 GTX. And don't rule out enhancements to vanilla Jaguar yet.
By the way, Barbarian's post is the best explanation of why devs want moar RAM.
 
I'm not, I just find it funny as hell that the specs are out, the GPU is roughly what we expected it to be, and now people act a little surprised and blame stupid things like memory when some shortfalls are highlighted, rather than just accepting the obvious.

This is the first time in console history that they've not released with the absolute best of the best so of course they'll be lacking compared to PC.

And high-end PCs are superior, that's just fact.
:rolleyes: This is just incorrect. PCs have always been more powerful and always will be.

That is a "fact." High end gaming PC in 2006 out class the ps3 or x360.

Love the rage.... :LOL:

Honestly, you have no clue what you are talking about. This demo was created/running on approximation hardware - not final dev/PS4 silicon. Also, the demo was created within a constrained timeframe for the showing. Just for sh**s and giggles: the demo only used 27-29% of the AH resources - unoptimized. Before you ask, there is no link; I am the link.
:oops:

I would love to see what Epic can do with this power! Epic's games on this hardware would be amazing!
 
Then why is there not a single PC or console game on the horizon that comes close to the amount of detail or particles shown in Killzone SF? Not even Crysis 3 is able to match it. Not counting tech demos, of course.

Have you read the DF article? They think otherwise ;)
 
PS4 uses bog-standard PC parts and would not suffer from having the game running on dev kits.
You're making an assumption about the hardware and ignoring the possibility of immature software/API.

http://en.wikipedia.org/wiki/Mark_Rein_(software_executive)

I rarely visit the console forums but boy do they do a great job of perpetuating the stereotype of console followers being less than bright and informed.
Why do you assume MikeR is MarkR?
 
I doubt it... They can more than likely add the textures, but the DOF, SSAO, GI and all the stuff?

They wanted 2Tflop minimum for the Good Samaritan and they got less than that.

I honestly can't believe that people on this forum are expecting a ~1.8Tflop GPU with 150GB/s of bandwidth for GPU operations (the other 20GB/s is for the CPU and such) to keep up with demos running on a GTX 680 that has ~3Tflop and 192GB/s of bandwidth.
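Just to put the two sets of figures side by side (a quick sketch using the rough numbers above - nothing measured):

[code]
# Rough ratios based on the approximate figures quoted above.
ps4_tflops, gtx680_tflops = 1.8, 3.0
ps4_gpu_bw_gbs, gtx680_bw_gbs = 150.0, 192.0

print(f"Compute:   {ps4_tflops / gtx680_tflops:.0%} of a GTX 680")      # ~60%
print(f"Bandwidth: {ps4_gpu_bw_gbs / gtx680_bw_gbs:.0%} of a GTX 680")  # ~78%
[/code]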

We all knew that they would not be shipping with high-end PC hardware and that they would be slower than PC in a lot of ways, and now that we know the specs and have seen some of the cutbacks some people are surprised - why?

I honestly can't believe that you have so little comprehension of how software and hardware interact. What exactly is this odd concept of "keeping up" that you speak of? Do you think when (within the same architectural generation) GPU 1 is slower than GPU 2 that it throws its arms up in horror and proclaims that it can't run the code to perform "DOF, SSAO, GI and all the stuff"?

It simply gives you the same output at a slower speed - it doesn't mean it can't do it at all! The GPU has no concept of what "effect" you are asking it to do, it is simply running code.

An Nvidia GTX 660 has 1.88 TFLOPS and 144GB/s of bandwidth. It runs exactly the same (games) software, with the same effects, with the same textures, at the same resolution as an Nvidia GTX 680. It just does it slower! So does an AMD 7850. So will the PS4.
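To put a purely illustrative number on "slower" (assuming ideal scaling with raw throughput, which real workloads never quite achieve):

[code]
# Purely illustrative: ideal scaling of frame time with raw throughput.
frame_ms_on_3tflop = 1000.0 / 60.0                        # 16.7 ms per frame at 60 fps
frame_ms_on_1_8tflop = frame_ms_on_3tflop * (3.0 / 1.8)   # same work on ~60% of the FLOPS
print(f"{frame_ms_on_1_8tflop:.1f} ms/frame -> ~{1000.0 / frame_ms_on_1_8tflop:.0f} fps, same image")
[/code]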

I, personally, have no idea why the demo that was shown did not look like UE4 on PC; and neither do you.

But if you're seriously going to tell me it doesn't look like the PC demo because it can't, because it doesn't have enough power - oh please!

BTW - yes I only have one post, and no I'm not an angry fan boy. I just can't bear to ignore the nonsense you're spouting any longer.

:rolleyes:
 
Why do you assume MikeR is MarkR?

Yeah, I had to do a double take on that one. Saw someone else mention the Mark Rein name and went from there.


I honestly can't believe that you have so little comprehension of how software and hardware interact. What exactly is this odd concept of "keeping up" that you speak of? Do you think when (within the same architectural generation) GPU 1 is slower than GPU 2 that it throws its arms up in horror and proclaims that it can't run the code to perform "DOF, SSAO, GI and all the stuff"?

It simply gives you the same output at a slower speed - it doesn't mean it can't do it at all! The GPU has no concept of what "effect" you are asking it to do, it is simply running code.

An Nvidia GTX 660 has 1.88 TFLOPS and 144GB/s of bandwidth. It runs exactly the same (games) software, with the same effects, with the same textures, at the same resolution as an Nvidia GTX 680. It just does it slower! So does an AMD 7850. So will the PS4.

I, personally, have no idea why the demo that was shown did not look like UE4 on PC; and neither do you.

But if you're seriously going to tell me it doesn't look like the PC demo because it can't, because it doesn't have enough power - oh please!

BTW - yes I only have one post, and no I'm not an angry fan boy. I just can't bear to ignore the nonsense you're spouting any longer.

:rolleyes:

I also don't get why there's so much jerking off about the GTX 680. GCN (Pitcairn, in the case of PS4) is arguably a better architecture, especially when it comes to compute. For someone guilty of PC hardware elitism, the OP certainly picked the wrong horse.
 
I think it's simply a case of a rushed port, or the engine has not been optimized for the PS4; the lighting was really flat, even compared to certain current-gen games, and the texture work was terrible, again even compared to current gen. DoF was missing - of course the PS4 can do DoF, etc. If you compare it to other PS4 games shown, it becomes really obvious that it was rushed. Simple as.

Though I don't really get this talk about a "port" or "rushed port". It's not like we're talking about big hardware tech differences like Cell vs x86 CPUs or custom-built GPUs. The PS4 is using PC hardware for both CPU and GPU, with minor differences.

The only thing I could think of is the API hampering performance a bit, but then this is hard to know about, as they could already have lots of optimisations getting near to the metal for UE4. The GPU and CPU tech in the PS4 has been available for a good while on PC. A dev kit with these parts would make a good base for optimising code and getting to the metal, as opposed to having hardware that merely tried to emulate the final hardware's specs and functions.

I am sure this gen it will be easier than ever to tap the hardware's resources. No Emotion Engine, Cell, Xenos, or in-order CPUs that require extra research to get to know, and no development of custom tools. This time around, PC platform development has prepared a lot of that work for the consoles, as I see it.
 
I honestly can't believe that you have so little comprehension of how software and hardware interact. What exactly is this odd concept of "keeping up" that you speak of? Do you think when (within the same architectural generation) GPU 1 is slower than GPU 2 that it throws its arms up in horror and proclaims that it can't run the code to perform "DOF, SSAO, GI and all the stuff"?

It simply gives you the same output at a slower speed - it doesn't mean it can't do it at all! The GPU has no concept of what "effect" you are asking it to do, it is simply running code.

An Nvidia GTX 660 has 1.88 TFLOPS and 144GB/s of bandwidth. It runs exactly the same (games) software, with the same effects, with the same textures, at the same resolution as an Nvidia GTX 680. It just does it slower! So does an AMD 7850. So will the PS4.


I, personally, have no idea why the demo that was shown did not look like UE4 on PC; and neither do you.

But if you're seriously going to tell me it doesn't look like the PC demo because it can't, because it doesn't have enough power - oh please!

BTW - yes I only have one post, and no I'm not an angry fan boy. I just can't bear to ignore the nonsense you're spouting any longer.

:rolleyes:

But assuming it runs it slower, that means they would have to sacrifice something to get a stable 30fps or reach 1080p - the same reason a low-end DX11 card can run Crysis 3 with the exact same effects as a high-end one, but nobody wants to play at jerky framerates. If they don't want to lower the resolution, they would need to start reducing the effects being displayed and/or their quality to reach stable framerates.
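For a rough sense of what the resolution lever alone buys (illustrative arithmetic, assuming cost scales with the number of pixels shaded):

[code]
# Illustrative only: dropping from 1080p to 900p shaves ~30% of the pixels shaded,
# one way to claw back frame time without touching the effects themselves.
pixels_1080p = 1920 * 1080
pixels_900p = 1600 * 900
print(f"900p is {pixels_900p / pixels_1080p:.0%} of the 1080p pixel count")
[/code]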
 
But assuming it runs it slower, that means they would have to sacrifice something to get a stable 30fps or reach 1080p - the same reason a low-end DX11 card can run Crysis 3 with the exact same effects as a high-end one, but nobody wants to play at jerky framerates. If they don't want to lower the resolution, they would need to start reducing the effects being displayed and/or their quality to reach stable framerates.

Reducing the amount of effects to maintain a frame rate is one tactic. Completely changing the lighting, geometry, shadows and multiple other shader effects so it resembles a DX9 UE3 demo? All those changes because otherwise it would resemble an embarrassing slide show? That's the kind of thing you do if you're sending it to a different generation of hardware.

I accept your analogy to low end DX11 cards. However, the PS4 (7850/7870 level) certainly isn't comparable to a low end DX11 card. Not in compute ability, memory bandwidth or texturing ability. Is it as good as the GPU used for the PC demo? Nope. Neither is it a thousand miles away.

Has the PS4 been so badly designed for the last 5 years that a demo has to look like that simply to get a decent framerate in a demo situation? I think it's fair to say that Sony wasted their R&D fund if that's the case.

I don't think that's the case. So I'll choose to believe...something else...
 
@ MarkR: Assuming you are the real Mark, thanks for posting - ignore "almight", as his long-running presupposition in every discussion is that any issue on a console compared to the PC version is due to the console's hardware inferiority compared to the PC (it could not possibly be dev kit, dev time, etc. issues). It is his version of Occam's Razor.

Many of us are interested to know if Unreal 4 will continue to offer SVOGI on the console platforms; does 8GB on the PS4 (and probably Durango, based on leaks) change Epic's outlook on that?

PS - fully dynamic rendering solutions to aid smaller and indie developers get two thumbs up. Now just ship your next Unreal Engine-based console game with a robust editor so we can change traits, gun traits (damage, splash damage, range, RoF, etc.), and finally modify the game objectives (think Halo's MP on steroids), and you will get many console gamers geeked out :p
 