Is UE4 indicative of the sacrifices devs will have to make on consoles next gen?

I don't think there's any way to answer this one way or another without knowing the state of UE4 and its toolset on PS4. If we are to believe VGleaks, which has been correct on everything so far, Sony only shipped the custom GPU in January.
 
Sorry for the OT, but could an admin put both SEBBI messages in a sticky post under "You should read this before..."? Thanks :D
 
It seems the PC demo runs at 2560x1600 on a GTX 680 (~3 TFLOPS) while the PS4 demo runs at 1920x1080, which is only about half the resolution of the PC version. How could it possibly be that the PS4 doesn't have enough graphics power? It's obviously just a matter of optimization.
 
Only the drawing work scales with resolution. Physics computation, by contrast, is completely resolution independent.
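
Taking the resolutions claimed above at face value for a moment, the pixel-count ratio works out to roughly 2x, so only the per-pixel portion of the frame (shading, fill, post-processing) would double in cost; simulation, animation and draw submission don't scale with it:

$$\frac{2560 \times 1600}{1920 \times 1080} = \frac{4\,096\,000}{2\,073\,600} \approx 1.98$$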
 
It seems the PC demo runs at 2560x1600 on a GTX 680 (~3 TFLOPS)

The Elemental Cinematic Part 1 demo shown on the GTX 680 is not rendering at 2560x1600.

And it's not even rendering at full 1920x1080.

See below (especially the bolded/underlined part at the end of the quote):


unrealengine.com/files/misc/The_Technology_Behind_the_Elemental_Demo_16x9_(2).pdf said:
http://www.unrealengine.com/files/misc/The_Technology_Behind_the_Elemental_Demo_16x9_(2).pdf

Elemental demo
  • GDC 2012 demo behind closed doors
  • Demonstrate and drive development of Unreal® Engine 4
  • NVIDIA® Kepler GK104 (GTX 680)
  • Direct3D® 11
  • No preprocessing
  • Real-time
    • 30 fps
    • FXAA
    • 1080p at 90%


;)
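
For what it's worth, if "1080p at 90%" refers to the usual per-axis screen-percentage scaling (an assumption on my part; it could also mean 90% of the pixel count), the internal render target works out to:

$$0.9 \times 1920 = 1728, \qquad 0.9 \times 1080 = 972, \qquad \frac{1728 \times 972}{1920 \times 1080} = 0.81$$

i.e. roughly 1728x972 upscaled to 1080p output, or about 81% of the full 1080p pixel count.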
 
And?

The original-spec PS4 had 4 GB, so that would be the same as a PC with a 2 GB GTX 680 and 2 GB of system RAM.

I seriously, seriously doubt that this demo uses over 2 GB of main RAM, especially as the demo might not even be a 64-bit build (a 32-bit process couldn't address much more than 2 GB anyway).
This.

It's likely running on a PS4 dev kit that doesn't even have the newly announced 8 GB (devs were pleasantly surprised by the news on 2/20). Not only that, it's likely the dev kit doesn't even have unified RAM yet.
 
If you have various texture layers, can't you just merge them once before you render them for the first time? Or does that not work precisely because you want to keep them in memory separately? And if you do keep them separate and merge them at render time, is that to conserve memory? In that case, would spending more memory by merging the layers up front reduce bandwidth use?

It's funny you are asking Sebbi that. That's exactly what the engine he developed for Trials does. He already has some very insightful posts here on the forum about how all of it works. You should try to look them up.
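
(Not any specific engine's code, just a generic illustration of the trade-off being asked about: merging the layers up front spends extra memory on the composite to save per-frame bandwidth, while blending at sample time saves that memory but reads every source layer each frame. A minimal CPU-side sketch, with all names hypothetical:)

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

struct Layer {
    std::vector<uint8_t> texels;  // one 8-bit channel per texel, for simplicity
    float opacity;                // blend weight when compositing this layer
};

// Option A: evaluate the whole layer stack every time a texel is sampled.
// No extra memory, but every layer is read for every sample, every frame.
static uint8_t SampleLayered(const std::vector<Layer>& layers, size_t i) {
    float result = 0.0f;
    for (const Layer& l : layers)
        result = result * (1.0f - l.opacity) + static_cast<float>(l.texels[i]) * l.opacity;
    return static_cast<uint8_t>(result + 0.5f);
}

// Option B: bake the stack into one composite texture up front.
// Costs memory for the composite, but only one read per sample afterwards.
static std::vector<uint8_t> BakeComposite(const std::vector<Layer>& layers,
                                          size_t texelCount) {
    std::vector<uint8_t> composite(texelCount);
    for (size_t i = 0; i < texelCount; ++i)
        composite[i] = SampleLayered(layers, i);
    return composite;
}

int main() {
    const size_t texelCount = 4;
    std::vector<Layer> layers = {
        { {10, 20, 30, 40},     1.0f  },  // base layer
        { {200, 200, 200, 200}, 0.25f },  // detail/decal layer blended on top
    };

    // Both paths produce the same texels; they differ in memory vs. bandwidth.
    std::vector<uint8_t> baked = BakeComposite(layers, texelCount);
    for (size_t i = 0; i < texelCount; ++i)
        std::printf("texel %zu: per-sample blend = %u, baked = %u\n", i,
                    static_cast<unsigned>(SampleLayered(layers, i)),
                    static_cast<unsigned>(baked[i]));
    return 0;
}
```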
 
Working on it! :)

Always good to see you chime in. I like my FPSes, but frankly I'm more interested in what Bioware is doing with your engine ;) I hope being under the same EA umbrella means they have better technology access than they did with UE3.

Judging by your last tweet, that's exactly what they're doing :)
 
The Elemental Cinematic Part 1 demo shown on the GTX 680 is not rendering at 2560x1600.

And it's not even rendering at full 1920x1080.

;)

I just realized that at the time of that demo, the GTX 680 had JUST been released and was in short supply. Unless they were just cranking things up mindlessly, I don't think it was optimized for the card at GDC (unless NVIDIA gave them some early samples). Then again, they've already said that, haven't they...
 
Not much of a demo, apart from the mesh modification on the hand model, which isn't shown in-game and so doesn't tell us anything about in-game performance (will the player model adapt, or will it be a baked model with the modifications?).
 
How can people blame the 4 GB of GDDR5 RAM when a GTX 680 only has 2 GB?

It's lack of power, pure and simple.

Lack of power.... brutal view. Have a Rainbow!!!


There is a definite reason for such dramatic changes (specifically the wall texture being completely different; power has nothing to do with such a decision when the replacement would be a larger texture file on the console than the PC texture it replaces).

I suspect a lot of the rework has to do with the NTSC color values of TVs not having the range of contrast an RGB monitor can support.

(I definitely see some things that were altered; that is proof of tapering down performance. But this hardware is completely new and still in the development pipeline. So... if you have any experience, don't be so harsh with your judgement. Have a Rainbow. ~ "It Be Al-right." VW. Take care, she just needs some polish and wax. NTSC displays will never have been happier!!!)

See:
NTSC 720x480 pixels ~ http://www.pcmag.com/encyclopedia_term/0,1237,t=NTSC&i=48147,00.asp
RGB
So what is the HDTV standard?
HDTV: http://www.pcmag.com/encyclopedia_term/0,1237,t=DTV&i=42071,00.asp
Strangely... HDTV's only 60 fps progressive standard is 1280x720???

Do any of the critics here know of a TV video standard of 1920x1080p at 60???
Currently the highest frame rate TV display technology supports is 1080i at 60?

If TVs only support 1080p @ 30... can forum members fairly slam consoles for TV limits?
http://www.pcmag.com/encyclopedia_term/0,1237,t=HDTV&i=44166,00.asp
 
Not much of a demo, apart from the mesh modification on the hand model, which isn't shown in-game and so doesn't tell us anything about in-game performance (will the player model adapt, or will it be a baked model with the modifications?).

I was hoping someone with engine development knowledge could glean something about the lighting method from it.
 
What I would like to see is a game on this engine that is made to look as realistic as possible, without being stylized. I realize the PS4 is not going to be able to produce truly photorealistic graphics, but I would like to see how close it can come, even if they had to make the game at 480p with mostly indoor scenes.
 
Lack of power.... brutal view. Have a Rainbow!!!


There is a definite reason for such dramatic changes (specifically the wall texture being completely different; power has nothing to do with such a decision when the replacement would be a larger texture file on the console than the PC texture it replaces).

I suspect a lot of the rework has to do with the NTSC color values of TVs not having the range of contrast an RGB monitor can support.

(I definitely see some things that were altered; that is proof of tapering down performance. But this hardware is completely new and still in the development pipeline. So... if you have any experience, don't be so harsh with your judgement. Have a Rainbow. ~ "It Be Al-right." VW. Take care, she just needs some polish and wax. NTSC displays will never have been happier!!!)

I think broadcast TV only goes up to 1080i. Not sure why, but I've seen this in many countries (US, Taiwan, Japan).
Possibly due to the age of component inputs.
 
I think broadcast TV only goes up to 1080i. Not sure why, but I've seen this in many countries (US, Taiwan, Japan).
Possibly due to the age of component inputs.

It all depends on the broadcast standard. ATSC isn't the same as DVB-T (Europe) or ISDB (Japan, Brazil).
 