Quote:
Alien Isolation is up. Parity at a cost on the X1 version. It sometimes hits 30fps but mostly sits in the high 20s, and that judder and screen tearing is terrible. This parity crap is just going to hurt gameplay in the end.

I wouldn't call it a parity cost, since the PS4 dips to 17fps, even worse than the One.
Quote:
I wouldn't call it a parity cost, since the PS4 dips to 17fps, even worse than the One.
So the lower average fps is not a parity cost?
Quote:
Ok, like we didn't have games running in the high 20s at 720p last gen.

If you were lucky.
Ok, like we didn't have games running in the high 20s at 720p last gen.
C'mon, this parity talk is getting out of hand. Are we suddenly exempt from unoptimized titles being released on console now that we have new machines? Games drop frames on every console; they always have and they always will. This is not something new.
Quote:
Ok, like we didn't have games running in the high 20s at 720p last gen. C'mon, this parity talk is getting out of hand. Are we suddenly exempt from unoptimized titles being released on console now that we have new machines? Games drop frames on every console; they always have and they always will. This is not something new.
That's true, but the idea of this thread is to take a look at these direct comparison videos, and they are useful when sizing up a console's capabilities. The track record shows the PS4 has fared better than the X1 with multi-plat titles pretty much all the time. A single dip to 17fps is more than likely an anomaly that doesn't happen very often; the X1 build seems to move around in the mid-20s, while the PS4 holds the 30fps target far more often. None of this should be surprising: the PS4's and X1's specs aren't a secret, and the PS4 holds the advantage. If we were comparing graphics cards, you wouldn't expect a 1.2 TFLOP card to be neck and neck with a 1.8 TFLOP card.
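To put rough numbers on that graphics-card analogy, here's a back-of-envelope sketch using the approximate TFLOP figures quoted above (poster's round numbers, not exact hardware specs):

```python
# Rough ratio between the two figures quoted above. The 1.2 / 1.8 TFLOP
# numbers are the poster's approximations, not precise console specs.
xb1_tflops = 1.2
ps4_tflops = 1.8

ratio = ps4_tflops / xb1_tflops
print(f"PS4 advantage: {ratio:.2f}x")  # → PS4 advantage: 1.50x

# At a 30fps target, the same gap in raw throughput would work out to
# roughly 30 / ratio if the workload were purely GPU-bound.
equivalent_fps = 30 / ratio
print(f"GPU-bound equivalent: {equivalent_fps:.0f}fps")  # → 20fps
```

That crude GPU-bound estimate lands in the same ballpark as the mid-20s the X1 build is reportedly hitting, though real games are never purely GPU-bound.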
Quote:
Here, the same level of graphical quality extends to both PS4 and Xbox One versions of Alien: Isolation. Native resolution comes in at 1080p using what appears to be a post-process anti-aliasing solution, while the artwork and effects appear a close match, right down to the dithered shadow edges and how reflections and specular highlights are displayed. The main points of difference in our shots come down to the dynamic lighting, which changes the way environments and characters are illuminated in real-time, along with slight variances in depth of field and motion blur at any given moment.

The PS4 version of Alien: Isolation mostly hits a solid 30fps throughout, with frame-rates only mildly impacted in more demanding scenes. On the other hand, performance takes a bigger hit on the Xbox One, where frequent frame-rate drops and tearing are commonplace.
judder is a regular issue and controller feedback is also affected by the uneven frame delivery.
We have two versions of the game running at what looks like identical IQ, except one version is struggling to maintain the 30fps target. So yeah, by all accounts that's parity at a cost of gameplay performance.
Quote:
judder is a regular issue and controller feedback is also affected by the uneven frame delivery.
Not sure how else you could spin that.
http://www.eurogamer.net/articles/digitalfoundry-2014-alien-isolation-performance-analysis
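The judder point DF raises is worth unpacking: two versions can post similar average frame-rates while one feels far worse, because judder comes from the variance in frame delivery, not the average. A minimal sketch with invented frame-time samples (purely illustrative, not measurements from the game):

```python
# Frame times in milliseconds. Both sequences average out to ~30fps, but
# the second delivers frames erratically -- that's what reads as judder.
even   = [33.3, 33.3, 33.3, 33.3, 33.3, 33.3]   # steady 30fps cadence
uneven = [16.7, 50.0, 33.3, 16.7, 50.0, 33.3]   # same average, erratic pacing

def average_fps(frame_times_ms):
    return 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

def pacing_variance(frame_times_ms):
    mean = sum(frame_times_ms) / len(frame_times_ms)
    return sum((t - mean) ** 2 for t in frame_times_ms) / len(frame_times_ms)

print(f"{average_fps(even):.0f}fps vs {average_fps(uneven):.0f}fps")  # → 30fps vs 30fps
print(f"variance: {pacing_variance(even):.1f} vs {pacing_variance(uneven):.1f}")
```

Same headline number, wildly different variance, which is why an fps counter alone doesn't capture what DF is describing about controller feel.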
Should they have lowered the resolution on the XB1 edition, or just reduced texture and shadowing/lighting quality? The performance drops look awful...
The weirdest thing is: GPUs with lower or similar performance to the 7770/7850 get better frame-rates on PC. Maybe it's the CPU, but I don't know; all this supposed fixed-architecture gain is completely failing to show up in A:I and SOM.
Quote:
That would be viable too, because the Xbox One also has compute functionality. Both sides would benefit, unlike the CELL situation last gen.

That's why Cerny was so keen on the PS4 being compute-heavy... I think we're going to see a major shift with these consoles (PS4 more so) toward offloading certain AI routines that are well suited to GPU compute. I think if SSM, Naughty Dog, Sucker Punch, and a few other internal teams become leaders in that direction, maybe more 3rd-party developers would start approaching compute... within limits, of course.
Because if SSM or Naughty Dog prove GPU compute is a valid and solid foundation for certain AI routines, then you're going to hear a lot more gamers calling 3rd-party developers lazy...
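For context on which AI routines suit GPU compute: the good candidates are the data-parallel ones, where the same small kernel runs independently for every agent with no cross-agent dependencies. A toy CPU-side sketch of that shape (all names here are hypothetical illustration; a real console implementation would dispatch this as a compute shader over thousands of agents):

```python
# Toy data-parallel AI step: a per-agent "seek" steering update. Because
# each agent's update reads shared data but writes only its own state,
# the whole loop maps naturally onto one GPU compute dispatch.

def seek_step(position, target, speed):
    """One agent's update: move toward the target at a fixed speed."""
    dx, dy = target[0] - position[0], target[1] - position[1]
    dist = (dx * dx + dy * dy) ** 0.5 or 1.0  # guard against zero distance
    return (position[0] + speed * dx / dist, position[1] + speed * dy / dist)

# The same kernel applied independently to every agent -- no ordering
# requirements between iterations, which is what makes it GPU-friendly.
agents = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
target = (5.0, 5.0)
agents = [seek_step(a, target, speed=1.0) for a in agents]
print(agents)
```

Routines with heavy branching or global sequential state (scripted behavior trees, for instance) fit this model far less well, which is one reason compute offload tends to start with flocking, avoidance, and pathfinding-style workloads.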
Quote:
That would be viable too, because the Xbox One also has compute functionality. Both sides would benefit, unlike the CELL situation last gen.
Both sides benefited last gen also, but not equally, and this time that part will be the same. The actual code, at least, can be shared easily, but one platform may not have the CUs to spare for the purpose on top of the regular graphics rendering work. It will be a matter of wait and see.
Apparently Ryse makes heavy use of compute shaders. Crytek have 'advised' us that we could see some big performance differences between PC GPU architectures, given how well different architectures handle compute. I'm certainly looking forward to seeing those results.