Digital Foundry Article Technical Discussion Archive [2014]

Platform holders like Sony, and mega publishers like EA, have their own internal engine development teams. So for them, I think issues like the ones you rightly raise are much less of a problem: they can have their game dev teams working on the latest full-fat stable build of the engine and tools, whilst their core tech guys (ICE Team, ATG, DICE) work on rewriting the engine code and updating features and tools.

For smaller devs, it's of course much more difficult to manage.

New tech = new problems, always. Dedicated engine teams are even less likely to get it right because they're not always working within the constraints required by a game team.

The ICE team doesn't write engines anyway, and go ahead and ask anyone on the MoH team how well Frostbite worked out for them. New tech is always a risk. You need a lot of time to get it right.
 
http://www.eurogamer.net/articles/digitalfoundry-2014-sniper-elite-3-face-off

Ouch for the XB1. They had to sacrifice texture filtering, frame-rate and other graphical effects just to get it running at 1080p/60fps. Screen-tearing is also pretty bad. I would have been fine with 1600x900 if it meant an improvement to what I just mentioned. Although I guess some people will be fine with a sharp native 1080p game. Meh...

EDIT: Oh, looks like DF updated their article with info about the launch-day patch which included a v-sync option for the XB1.
 
The PS4 game holds up rather well in approaching a 60fps set-up while featuring almost identical graphical quality to the PC game running with ultra settings enabled.

Sounds like 399 euros well spent :p

edit:
Is the Xbox One version the only one running CryEngine? Holy shit @ those tears. At times it was as if the image was torn in multiple places...
 
Is the Xbox One version the only one running CryEngine? Holy shit @ those tears. At times it was as if the image was torn in multiple places...
What makes you think the XB1 is running on CryEngine? :???: The article...

DF said:
Instead, Rebellion looks to have focused on optimising the in-house technology in order to boost performance, targeting 60fps on both Sony and Microsoft's new consoles

Rebellion's in-house Asura engine proves that it has the potential to bring a 60fps experience to consoles, although the technology doesn't appear to be fully optimised for the task in hand given the large gap between PC, PS4 and Xbox One performance.
 
What makes you think the XB1 is running on CryEngine? :???: The article...

screen-tear..... tear.....tears... crying... CryEngine :cool:

But seriously; in the video you can see whole sequences of successive torn frames. Stepping through it frame-by-frame on my Mac reveals that there was only one tear per frame, but played back at normal speed it really looked as if they were using some kind of tiled rendering :cry:
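For anyone wanting to verify that kind of observation themselves, here is a minimal sketch of locating a tear line between successive lossless captures. It assumes the frames are available as HxWx3 numpy arrays; the function name and approach are purely illustrative, not taken from any Digital Foundry tool.

```python
# A sketch of locating a single tear line between consecutive lossless captures.
# Assumes prev/cur/nxt are HxWx3 uint8 numpy arrays of three successive frames;
# names and approach are illustrative, not from any Digital Foundry tool.
import numpy as np

def find_tear_row(prev, cur, nxt):
    """Return the row where `cur` flips between old- and new-frame content, or None."""
    # Per-row mean absolute difference against the previous and next captures.
    diff_prev = np.abs(cur.astype(np.int16) - prev.astype(np.int16)).mean(axis=(1, 2))
    diff_next = np.abs(cur.astype(np.int16) - nxt.astype(np.int16)).mean(axis=(1, 2))

    # True where a row still shows the previous frame's content.
    closer_to_prev = diff_prev < diff_next
    flips = np.flatnonzero(np.diff(closer_to_prev.astype(np.int8)))

    # Exactly one flip means a single tear line, matching the one-tear-per-frame observation.
    return int(flips[0]) + 1 if len(flips) == 1 else None
```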

I'm happy that they didn't push for parity on this one. I mean, the PS4 isn't 60fps all the time, but either way it is well north of 30.
Imagine if it was capped at 30; we would never know how much headroom the PS4 had.
 
I think Xbox owners would rather take an almost identical version at 900p than a forced 1080p with big issues. The SDK update is not going to cover the 900p -> 1080p jump fully at all.
 
Wow, I'm pretty sure I'd rather have 900p. I hope it isn't pressure from gamers that is making this developer force 1080p on the Xbone. Constant screen tearing, really blurry textures and a frame-rate that is too juddery to be uncapped.
 
I have a gut feeling that MS went around encouraging devs to target 1080p more often. Coinciding with the '10% GPU boost', it's good PR for them, even if it means that games might perform worse and 900p would make more sense.
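For a sense of scale, a quick back-of-the-envelope pixel count (assuming shading and fill cost scale roughly with resolution, which is a simplification):

```python
# Rough pixel-count comparison; assumes shading/fill cost scales roughly with resolution.
pixels_900p = 1600 * 900      # 1,440,000
pixels_1080p = 1920 * 1080    # 2,073,600

extra = pixels_1080p / pixels_900p - 1
print(f"1080p shades {extra:.0%} more pixels than 900p")  # ~44%, far more than a ~10% GPU uplift
```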
 
I think Xbox owners would rather take an almost identical version at 900p than a forced 1080p with big issues. The SDK update is not going to cover the 900p -> 1080p jump fully at all.

It'll be interesting to see which plays out better from a Neogaf whining perspective. I'm not automatically sure it's 900p.

I wonder if they got the Kinect SDK update? I think it was too late to do any good if they did. It dropped in June.

Destiny will be a fun DF comparison...and the beta drops very soon too.

Heck, I wouldn't be shocked if the Destiny beta was 900p on Xbox. 1080p came late after all, I think, if this is supposed to be truly a beta build.
 
Is this a texture loading issue?

[image: e0fp8US.jpg]
 
I have a gut feeling that MS went around encouraging devs to target 1080p more often. Coinciding with the '10% GPU boost', it's good PR for them, even if it means that games might perform worse and 900p would make more sense.

That update arrived in June. Seems a little late for any game released in early July.
 
With a 10GB day 1 patch on XB1, I'm sure they at least did something with the SDK update. For comparison, the PS4's day 1 patch is 450MB.
 
Is this a texture loading issue?

[image: e0fp8US.jpg]


From what I could tell, they only show these issues in the comparison videos during an in-game cutscene and not during gameplay. You can tell from the camera perspective on the character, his animation and what transpires.
Cinematics usually bridge the end of one level/area/environment and the beginning of a new one.
Because of the immense amount of texture data that needs to be streamed from the HDD nowadays, some developers use cinematics as an opportunity to hide load times.

There is nothing that seems particularly more demanding, bandwidth-wise, about rendering that particular cinematic than gameplay, where there ARE high-res textures on the main character.

HDD transfer speeds have not kept pace with how much data must be streamed. The data that needs to be streamed is probably at least 4x larger than last gen, if not more.

It's possible that there were background apps competing for HDD bandwidth on Eurogamer's PS4; that is not something the Xbox One has to worry about as much, since it has the 8GB of eMMC NAND.
And we know it's at the beginning of a new level/area: he has just got off the jeep and is looking around the new area.
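To put rough numbers on why cutscenes end up doubling as loading screens, here is an illustrative calculation; the HDD speed and per-area texture budget below are assumptions, not measurements of this game:

```python
# Illustrative only: assumed console HDD read speed and per-area texture budget,
# not measured figures for Sniper Elite 3.
hdd_read_mb_s = 80          # optimistic sustained read for a 5400rpm console HDD
area_textures_mb = 1500     # assumed high-res texture data for a new level/area

stream_time_s = area_textures_mb / hdd_read_mb_s
print(f"~{stream_time_s:.0f}s to stream the new area's textures in")   # ~19 seconds
```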


edit: This guy has the HQ textures seen in that in-game cutscene (PS4):
http://www.youtube.com/watch?v=cgUC7DVCOEQ

Also HQ textures in the cutscene (PS4):
http://youtu.be/IEI-szx56zo?t=40m58s

This guy has low-quality textures like Eurogamer in that cutscene (PS4):
http://www.youtube.com/watch?v=7badoEqkEHE



Now here is something interesting:
Here is that exact cutscene on a PC or Xbox One with LQ textures (it's not specified, but there are A, X, R, L, Y, etc. button prompts).
The in-game cutscene has LQ textures, but quality is high in actual gameplay:
http://www.youtube.com/watch?v=xtztLlVlAEE


edit:2
Okay, the explanation has been found. It's because users are loading their game at the beginning of the level, and the game doesn't load HQ textures into memory immediately.

People who play continuously from the end of the previous level have the HQ textures on the main character, just like on X1. It looks like this will happen regardless of whether it's PC, X1 or PS4.
 
Yeah, locked at 30fps. They should have just locked it to 30 in the first place on the X1, especially with that amount of tearing (it seems they knew it was bad enough to offer a v-sync mode).

No.

They should have run it at a resolution that allowed them to hit somewhere even remotely close to their performance target.

1080p with aniso-less, full-screen-blurred, sub-30fps "60fps", tearing-afflicted graphics is pure self-sabotage: mutilating the image just to prove a point.

Seriously, this is the worst generation ever. The 1080p cheerleaders have wrought destruction of console gaming through ignorance, hubris and wilful stupidity.
 
Seriously, this is the worst generation ever. The 1080p cheerleaders have wrought destruction of console gaming through ignorance, hubris and wilful stupidity.

Whose fault is that?

If Sony feels comfortable enough (first-party wise) that its hardware is capable of outputting native 1080p without any significant sacrifice to other parts of the graphics pipeline, why shouldn't its users demand it? PC gamers for the most part demand it…

If MS is feeling pressured to follow suit, or to make sacrifices to achieve native 1080p parity, whose fault is that? Blaming gamers for wanting certain features out of new-generation hardware isn't fair, IMHO. The console manufacturer that doesn't deliver, or overpromises on the console's capabilities, is the one at fault.
 
The Eurogamer Sniper Elite 3 texture problem (see my last post): http://beyond3d.com/showpost.php?p=1859341&postcount=6790

The guys with texture streaming issues on PS4/XO/PC are starting up the game from a fresh load, and the HQ textures haven't had time to load.

The guys with HQ textures are continuing on from the last level, so the HQ textures are already in memory. The developers prioritized shorter load times.

Texture streaming issue confirmed, and it's a potential problem on all platforms.
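As a rough mental model of how that plays out, here is a minimal sketch of mip-based texture streaming: only a small low-quality mip is made resident when a save is loaded, and higher mips stream in afterwards on a per-frame I/O budget. The class names, priorities and sizes are assumptions for illustration, not Rebellion's Asura code.

```python
# A minimal sketch of mip-based texture streaming that would produce this behaviour:
# a fresh save load shows low-res textures until the streamer catches up, while
# continuing from the previous level means the high mips are already resident.
# All names and numbers are illustrative, not Rebellion's actual Asura code.
import heapq
from dataclasses import dataclass, field

LOW_MIP, FULL_MIP = 3, 0   # mip 3 = cheap placeholder, mip 0 = full quality

@dataclass(order=True)
class StreamRequest:
    priority: float                          # smaller value = more urgent
    texture_id: str = field(compare=False)

class TextureStreamer:
    def __init__(self):
        self.resident = {}     # texture_id -> best mip level currently in memory
        self.pending = []      # min-heap of StreamRequest

    def on_fresh_load(self, texture_ids):
        """Loading a save: only low mips are read so the loading screen stays short."""
        for tid in texture_ids:
            self.resident[tid] = LOW_MIP
            heapq.heappush(self.pending, StreamRequest(priority=1.0, texture_id=tid))

    def tick(self, io_budget_mb=8.0, mb_per_mip=4.0):
        """Each frame, spend a limited HDD budget upgrading mips in priority order."""
        while self.pending and io_budget_mb >= mb_per_mip:
            req = heapq.heappop(self.pending)
            current = self.resident[req.texture_id]
            if current > FULL_MIP:
                self.resident[req.texture_id] = current - 1
                io_budget_mb -= mb_per_mip
                if current - 1 > FULL_MIP:          # still not full quality: requeue
                    heapq.heappush(self.pending, req)

streamer = TextureStreamer()
streamer.on_fresh_load(["hero_skin", "jeep_body"])
for _ in range(3):                   # a few frames of streaming later...
    streamer.tick()
print(streamer.resident)             # both textures have reached mip 0 (full quality)
```

On a continue, the previous level's pool is still resident, so the main character never drops back to the placeholder mips; that matches what the videos linked above show.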
 
Well, I do hope to see DF do the analysis when the Destiny beta arrives in the next few weeks. They must be running out of material when they start comparing mediocre titles. And this imagining of the gaming industry as being full of conspiracies and stuff is really getting old. Honestly, that's not how it works; that's not how any of this works.
 