Digital Foundry Article Technical Discussion Archive [2014]

Status
Not open for further replies.
What's with Alien: Isolation's performance? It's a game that looks like it should be running at 1080p/60fps with ease on current-gen consoles. That's not even based on the PC benchmark.
 
Alien: Isolation is up. Parity at a cost on the X1 version. It sometimes hits 30fps but mostly sits in the high 20s, and the judder and screen tearing are terrible. This parity crap is just going to hurt gameplay in the end.
I wouldn't call it a parity cost, since the PS4 dips to 17fps, even worse than the One.
 
My point is that neither version runs at a constant frame rate. I agree that one version has a higher average. It would be different if one version were able to sustain 30fps.
 
So the lower average fps is not a parity cost?

OK, like we didn't have games running in the high 20s at 720p last gen.
C'mon, this parity talk is getting out of hand. Are we suddenly exempt from unoptimized titles being released on consoles now that we have new machines? Games drop frames on every console; they always have and they always will. This is not something new.
 

I am sure there are other places where that discussion is old. Here it just started and it will continue until next generation arrives and then start again.

An average frame rate lower than the PS4's while the resolution stays the same could easily indicate that the cost of matching the PS4's resolution was paid in frames per second.

Parity at a cost.
 


That's true, but the idea of this thread is to take a look at these direct comparison videos, and they are useful when sizing up a console's capabilities. The track record shows the PS4 has fared better than the X1 with multiplat titles pretty much all the time. A single dip to 17fps is more than likely an anomaly that doesn't happen very often; the X1 build seems to move around in the mid-20s while the PS4 holds the 30fps target far more often. None of this should be surprising: the PS4's and X1's specs aren't a secret, and the PS4 holds the advantage. If we were comparing graphics cards, you wouldn't expect a 1.2 TFLOPS card to be neck and neck with a 1.8 TFLOPS card.
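For what it's worth, those commonly cited TFLOPS figures fall straight out of the GCN math (compute units x 64 SIMD lanes x 2 ops per clock via FMA). A quick sketch, using the widely reported CU counts and clocks rather than anything official:

```python
# Rough peak-FP32 throughput for a GCN GPU: CUs x 64 lanes x 2 ops (FMA) x clock.
# CU counts and clocks below are the commonly cited figures, not official specs.
def peak_tflops(compute_units, clock_mhz, lanes_per_cu=64, ops_per_lane=2):
    """Theoretical peak single-precision throughput in TFLOPS."""
    return compute_units * lanes_per_cu * ops_per_lane * clock_mhz * 1e6 / 1e12

ps4 = peak_tflops(18, 800)   # 18 CUs @ 800 MHz -> ~1.84 TFLOPS
xb1 = peak_tflops(12, 853)   # 12 CUs @ 853 MHz -> ~1.31 TFLOPS
print(f"PS4 ~{ps4:.2f} TFLOPS, XB1 ~{xb1:.2f} TFLOPS, ratio {ps4/xb1:.2f}x")
```

Peak numbers, of course, say nothing about memory bandwidth or how close real workloads get to them.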
 

Oh, I don't disagree with any of your statements. I'm not new here, though; I know the purpose of the thread. I just don't see what it has to do with any dev forcing parity. Everyone knows the power difference. I just don't see why it's considered dev-forced parity when a multiplat title targets 1080p on both consoles. 1080p could have been a main priority for those devs, just like frame rate is the priority for others. It just seems like any time a third-party game is 1080p on the X1, it's treated as a result of parity instead of a dev design choice. Sure, the X1 is better suited for 900p in demanding games or games that strive for 60fps, but acting like any title released at 1080p is the result of devs pushing for something the X1 is incapable of is a stretch. That is the point of my post.
 
http://www.eurogamer.net/articles/digitalfoundry-2014-alien-isolation-performance-analysis

Here, the same level of graphical quality extends to both PS4 and Xbox One versions of Alien: Isolation. Native resolution comes in at 1080p using what appears to be a post-process anti-aliasing solution, while the artwork and effects appear a close match, right down to the dithered shadow edges and how reflections and specular highlights are displayed. The main points of difference in our shots come down to the dynamic lighting which changes the way environments and characters are illuminated in real-time, along with slight variances in depth of field and motion blur at any given moment.

The PS4 version of Alien: Isolation mostly hits a solid 30fps throughout, with frame-rates only mildly impacted in more demanding scenes. On the other hand, performance takes a bigger hit on the Xbox One, where frequent frame-rate drops and tearing are commonplace.

Should they have lowered the resolution on the XB1 edition, or just reduced texture and shadowing/lighting quality? The performance drops look awful...
 
We have two versions of the game running at what seems like identical IQ, except one version struggles to maintain 30fps. So yeah, by all accounts that's parity at a cost of gameplay performance.

Quote:
judder is a regular issue and controller feedback is also affected by the uneven frame delivery

Not sure how else you could spin that.:rolleyes:
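The judder complaint has a simple mechanical explanation: with vsync on a 60Hz display, a frame can only be held for a whole number of 16.7ms refreshes, so render times that straddle the 33ms boundary flip between 2-refresh and 3-refresh holds. A toy model (the per-frame render times here are invented for illustration):

```python
import math

REFRESH_MS = 1000 / 60  # one refresh interval on a 60 Hz display (~16.7 ms)

def displayed_ms(render_ms):
    """With vsync, a frame is held for ceil(render / refresh) whole refreshes."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

# Hypothetical render costs hovering around the 33.3 ms (30fps) budget.
render_times = [30, 36, 31, 40, 29, 35]
for r in render_times:
    print(f"rendered in {r} ms -> displayed for {displayed_ms(r):.1f} ms")
```

Frames finishing just under budget show for 33.3ms while frames just over it show for 50ms, and that alternation between effective 30fps and 20fps delivery is exactly what reads as judder, even when the average frame rate looks close to 30.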
 

That alone would drive me crazy...


FYI: Just picked up Alien Isolation for the PC and DriveClub... :)
 
The weirdest thing is: GPUs with lower or equal performance to a 7770/7850 get better frame rates on PC. Maybe it's the CPU, but I don't know; all this supposed fixed-architecture gain is completely absent in A:I and SoM.

That's why Cerny was so keen on PS4 being compute heavy... I think we're going to see a major shift with these consoles (PS4 more so) with offloading certain AI routines that are well suited for GPU compute. I think if SSM, Naughty Dog, Sucker Punch, and a few other internal teams become leaders in that direction, maybe more 3rd party developers would start approaching compute... within limits of course.

Because if SSM or Naughty Dog proves GPU compute as a valid and solid foundation with certain AI routines, then you're going to hear a lot more gamers calling 3rd party developers lazy...
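The reason AI routines come up as compute candidates is that a lot of the work is embarrassingly parallel: the same small scoring function evaluated over thousands of independent (agent, option) pairs. A rough illustration, using NumPy broadcasting as a stand-in for a compute dispatch; every name and number here is invented:

```python
import numpy as np

rng = np.random.default_rng(0)
agents = rng.uniform(0, 100, size=(256, 2))    # 256 AI agents, 2D positions
waypoints = rng.uniform(0, 100, size=(64, 2))  # 64 candidate waypoints
threat = rng.uniform(0, 1, size=64)            # invented per-waypoint danger score

# Score every (agent, waypoint) pair in one batched operation -- on a GPU this
# would be one "thread" per pair in a single compute dispatch.
dist = np.linalg.norm(agents[:, None, :] - waypoints[None, :, :], axis=-1)
cost = dist + 50.0 * threat[None, :]           # trade travel distance against danger
best = cost.argmin(axis=1)                     # chosen waypoint per agent
print(best.shape)  # (256,)
```

No branching per agent, no shared mutable state: that shape of problem is what moves cleanly off the CPU, whereas heavily branchy decision logic does not.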
 
That would be viable too, because the Xbox One also has compute functionality. Both sides would benefit, unlike the CELL situation last gen.
 

Both sides benefited last gen also, but not equally. And this time that part will be the same. The actual code at least can be shared easily, but one platform may not have the CUs to spare for the purpose on top of the regular graphics rendering work. It will be a matter of wait and see.
 

Yeah, let's not forget the changes Sony made to benefit compute as well. They might not make that big a difference (if any at all, depending), but they are still there, and that, stacked with the extra ALUs, makes the PS4 a much more likely target for compute imo.
 
Apparently Ryse makes heavy use of compute shaders. Crytek have 'advised' us that we could see some big performance differences between PC GPU architectures given how well different architectures handle compute. I'm certainly looking forward to seeing those results.
 

So, basically, GCN > every GPU architecture ever?
 