Digital Foundry Article Technical Discussion Archive [2015]

I wonder whether we'll still need to state the obvious every time in the next few years. XB1 version slightly lower res, slightly toned-down effects, feels pretty much the same but at marginally lower settings.
I mean, it's really nothing new and unlikely to change, so I'm not sure why DF even bother producing all these almost copy-and-paste analyses.

Ok, party pooper leaving now. Bye.
Because at the end of the day, people who wish to buy the best multiplatform version still want to be properly informed? I know the current trend is pretty obvious, but there are always outlier titles where devs decided to go for console parity, such as AC Unity, Syndicate, Fallout 4, Mad Max, Destiny and maybe more. So it's far from a fixed equation at this point, and I both welcome and appreciate DF doing the Lord's work for us end users looking to make a better purchase.
 
I'm a believer that it still could. I'm not sure what the upper limits of either console are, so I'd rather just wait and see. If by the time next gen swings around that gap never closed, it never closed. But both consoles released around October 2013, and Xbox has made huge strides in performance in a very short amount of time. You can't say the gap hasn't closed since launch. The question is how much more it closes, or whether it widens.

This is only true for the 1080p/720p scenario... how many multiplatform games are affected? One or two?
 
This is only true for the 1080p/720p scenario... how many multiplatform games are affected? One or two?
I think my opinion on whether Xbox will close the gap or not rests only on the hope that somehow they discover a better method of using this dual pool of RAM they have. That's about it. I'm not suggesting it's possible, but it would be really cool to know that one day developers are getting big numbers out of both the DDR3 and the eSRAM. That's about all that can be said about closing the gap: PS4 has more of everything, so the only possible way Xbox could theoretically do more work is to actually move more data. Whether that's possible is beyond me.

As for closing the gap, yes, there were the 1080p/720p scenarios early on, but more importantly we saw quite a few games start at 720p and work their way back up: Destiny, Diablo (900p->1080p), the CODs (720p->1080p** dynamic), etc. I think using PS4 as a reference (though a moving one; it's clear to me that PS4 has been improving as well), we get an idea of how well Xbox is able to keep up or not. In many ways I think it's safer to say Xbox has kept up, as opposed to closing the gap. The move to a lower-level API helped tremendously with that, and the move to DX12 may offer additional features that the previous API did not (there are a few; how relevant they are to performance is unknown, a couple of percentage points here and there, likely).
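
Just to put numbers on those resolution steps (back-of-envelope arithmetic only, assuming standard 16:9 frame sizes):

```cpp
#include <cstdio>

int main() {
    // Pixel counts for the common 16:9 render resolutions discussed above.
    const double px720  = 1280.0 * 720.0;   //   921,600 pixels
    const double px900  = 1600.0 * 900.0;   // 1,440,000 pixels
    const double px1080 = 1920.0 * 1080.0;  // 2,073,600 pixels

    // 720p -> 1080p means shading 2.25x the pixels per frame;
    // 900p -> 1080p is a much gentler 1.44x step.
    std::printf("1080p vs 720p: %.2fx the pixels\n", px1080 / px720);
    std::printf("1080p vs 900p: %.2fx the pixels\n", px1080 / px900);
    return 0;
}
```

So a game moving from 720p up to 1080p has found 2.25x the fill/shading throughput somewhere, while a 900p->1080p step is a much smaller 1.44x ask, which is why the former jumps are the interesting data points.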

That being said, long story short, and building off Ultragpu's comments: at the end of the day a console purchase comes down to a great many factors for a consumer. Some value graphical performance more than others; others value the games available, the controller, the OS, the other services, media features, or apps. Consumers are weighing all these considerations, and it helps that DF continues to provide the play-by-play on whether the graphical gap between the two devices is shrinking or widening. Though it does seem to follow a strong pattern today.
 
Even if XB1 can achieve more by shifting work to data access rather than calculating stuff, and utilise a higher overall BW, devs aren't going to bother to make XB1-specific versions of titles. And ultimately the premise is IMO flawed, because everything's moving towards computing solutions rather than fetching them. Tiled resources mean less requirement for storage and BW, and produce better results. Compute-based shaders. Compute-based rendering. In terms of pixels pushed, XB1 is extremely unlikely to reach parity save in games where business concerns cap to the lowest common denominator, just as it's always been on consoles. The only gen that was really interesting in this regard was last gen, where the two machines were diverse in their implementation of similar power, and ultimately XB360 won out due to the better GPU. GPU == graphics, basically. But this gen the power envelope is very one-sided, more so than PS2 versus XB, where PS2 at least had massive advantages in a few areas.
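
To put a rough number on the tiled resources point (a sketch with illustrative, made-up figures, not measured data):

```cpp
#include <cstdio>

int main() {
    // Hypothetical example: a large virtual texture accessed through tiled
    // resources (64 KiB tiles, as exposed by D3D11.2-era hardware).
    const double texW = 16384.0, texH = 16384.0;  // 16K x 16K virtual texture
    const double bytesPerPx = 1.0;                // e.g. BC7: 8 bits/pixel
    const double fullMiB = texW * texH * bytesPerPx / (1024.0 * 1024.0);

    // With tiled resources, only the tiles actually sampled need to be
    // resident; the 10% visibility figure is an assumption for illustration.
    const double residentFraction = 0.10;
    std::printf("Fully resident:       %.0f MiB\n", fullMiB);
    std::printf("Tiled, ~10%% resident: %.0f MiB\n", fullMiB * residentFraction);
    return 0;
}
```

The win scales with how sparse the actual sampling is, which is exactly why it's a storage/bandwidth saver rather than a compute saver.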

The true value of DF investigations is likely limited to exclusives, where they can discern (wildly guess? :p) what solutions devs use when not constrained by cross-platform requirements, and most importantly how those solutions change over the platform's life: seeing what XB1 and PS4 exclusive devs get up to five years after launch.
 
Even if XB1 can achieve more by shifting work to data access rather than calculating stuff, and utilise a higher overall BW, devs aren't going to bother to make XB1-specific versions of titles. And ultimately the premise is IMO flawed, because everything's moving towards computing solutions rather than fetching them. Tiled resources mean less requirement for storage and BW, and produce better results. Compute-based shaders. Compute-based rendering. In terms of pixels pushed, XB1 is extremely unlikely to reach parity save in games where business concerns cap to the lowest common denominator, just as it's always been on consoles. The only gen that was really interesting in this regard was last gen, where the two machines were diverse in their implementation of similar power, and ultimately XB360 won out due to the better GPU. GPU == graphics, basically. But this gen the power envelope is very one-sided, more so than PS2 versus XB, where PS2 at least had massive advantages in a few areas.
Agreed.

Though the actual assumption I'm relying on is that the CUs are not saturated enough, i.e. they are sitting idle because the data isn't there for them to process. Ideally this would change as async-shader-based games continue to evolve (at least I'm hoping).

I'm not sure how accurate that statement is; it's likely a half-truth, or just wishful thinking. But I'm biased towards the idea that if CU saturation is still low, with the bottleneck being available bandwidth (moving completed work out of the CUs to memory, or the opposite, moving fresh work to the CUs), then there would be room to grow in saturation as long as there is bandwidth, at least from the actual CU side of things. The idea is: what good are 100 CUs if you only have 32 GB/s of bandwidth? 12 CUs with 192 GB/s should be able to outperform that.

If that's true, both consoles have room to grow, but I'm pretty sure they'd approach the same solution differently; PS4's solution should be somewhat parallel to a PC method, and Xbox is on its own tangent.

Clearly both MS and Sony have very talented engineers, and it's clear that they've paired hardware with bandwidth; suggesting otherwise would be a mistake. But I'm curious to see whether a maxed-out, full 192 GB/s on eSRAM plus a full 60 GB/s (it's ugly, I know, because the 60 needs to feed the eSRAM, etc., I know) would result in enough efficiency to produce more work than a system with less bandwidth.
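
Putting the thought experiment into numbers, using the figures quoted in this thread (paper peaks, so treat the output as illustrative only):

```cpp
#include <cstdio>

// Peak bandwidth available per CU, as a crude proxy for how easily the CUs
// can be kept fed. All inputs are the paper figures quoted in this thread.
static double bwPerCU(double gbPerSec, int cus) { return gbPerSec / cus; }

int main() {
    // The thought experiment from above: many CUs, starved of bandwidth.
    std::printf("100 CUs @  32 GB/s: %5.2f GB/s per CU\n", bwPerCU(32.0, 100));
    std::printf(" 12 CUs @ 192 GB/s: %5.2f GB/s per CU\n", bwPerCU(192.0, 12));

    // The console case, assuming (a big "if", as noted above) that the
    // 192 GB/s eSRAM and 60 GB/s DDR3 figures could ever be fully combined.
    std::printf("XB1 (192+60)/12 CUs: %4.1f GB/s per CU\n", bwPerCU(252.0, 12));
    std::printf("PS4     176/18 CUs : %4.1f GB/s per CU\n", bwPerCU(176.0, 18));
    return 0;
}
```

Of course, per-CU bandwidth says nothing about total throughput (18 CUs still do more math per clock), and the eSRAM is only 32 MB, so the aggregate figure is optimistic; it's just one way of framing the saturation argument.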

Lastly, to address whether a company would invest the effort to discover this paradigm shift of working with embedded RAM on SoC APUs, my answer is: well, if embedded RAM is the future going forward, then it's not a terrible idea to get started on that type of R&D.
 
As for closing the gap, yes, there were the 1080p/720p scenarios early on, but more importantly we saw quite a few games start at 720p and work their way back up: Destiny, Diablo (900p->1080p), the CODs (720p->1080p** dynamic), etc. I think using PS4 as a reference (though a moving one; it's clear to me that PS4 has been improving as well), we get an idea of how well Xbox is able to keep up or not. In many ways I think it's safer to say Xbox has kept up, as opposed to closing the gap. The move to a lower-level API helped tremendously with that, and the move to DX12 may offer additional features that the previous API did not (there are a few; how relevant they are to performance is unknown, a couple of percentage points here and there, likely).

I don't know if Diablo is a good example... 1080p but with worse performance than the 900p version, and COD: AW almost never runs at 1080p.

In my opinion, there are very few examples to support your point and many counter-examples: Hardline, Battlefront, NFS 2015, etc.
 
I'm not sure how accurate that statement is; it's likely a half-truth, or just wishful thinking. But I'm biased towards the idea that if CU saturation is still low, with the bottleneck being available bandwidth (moving completed work out of the CUs to memory, or the opposite, moving fresh work to the CUs), then there would be room to grow in saturation as long as there is bandwidth, at least from the actual CU side of things. The idea is: what good are 100 CUs if you only have 32 GB/s of bandwidth? 12 CUs with 192 GB/s should be able to outperform that.
If your compute units can work in local storage (cache), it's not a problem. It depends whether you're reading lots of assets and performing simpler ops on them, or computing noise/fractals, that type of thing. I don't know what the cache situation is with these consoles.
 
Because at the end of the day people who wish to buy the best multiplatform still wants to be properly informed? I know the current trend is pretty obvious but there's always the outlier titles when devs decided to go console parity such as AC Unity, Syndicate, Fallout 4, Mad Max, Destiny and maybe more. So it's far from a fixed equation at this point and I both welcome and appreciate DF doing the lord's work for us end users for a better purchase.
My post was tongue-in-cheek, obviously, but surely it would be quicker to do analyses of those games that for whatever reason run better on XB1 than hundreds of articles about every game that runs better on PS4 - sans AF, of course :)

As I said, I'm just being pedantic, but still, it's all a bit repetitive.
 
If your compute units can work in local storage (cache), it's not a problem. It depends whether you're reading lots of assets and performing simpler ops on them, or computing noise/fractals, that type of thing. I don't know what the cache situation is with these consoles.
Yeah. Well, the only known fact is that cache optimization is far more critical than bandwidth numbers. I assume it's the goal of all graphics engineers to ensure that the cache has the right data at the right time. Memory is just so much slower, embedded or not.

That being said, indeed, the more the CUs are working in cache the better, hence a good counterpoint to CU saturation.

I'm not sure of the cache situation either. I'm thinking of something along the lines of being able to hide memory loads or cache loading by staggering the jobs; there are definitely methods to get around the latency of memory. It seems that a lot of the way rendering is done today is to load things in huge bulk in one direction and write everything back in one load. I'm just curious to see what advantages, if any, there are to changing that access pattern.
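
Roughly what I mean, as a generic double-buffering sketch (nothing console-specific; the fetch and process functions are placeholders, not a real API):

```cpp
#include <cstring>
#include <cstddef>
#include <vector>

// "Staggered jobs" as a double-buffered access pattern: while the ALUs work
// on the chunk already staged in fast local storage, the next chunk is being
// fetched from slow memory, so the fetch latency hides behind compute.
// fetch() here is a plain memcpy stand-in; on real hardware it would be an
// asynchronous DMA transfer or prefetch that genuinely overlaps compute.
constexpr std::size_t CHUNK = 4096;

void fetch(float* dst, const float* src, std::size_t n) {
    std::memcpy(dst, src, n * sizeof(float));  // stand-in for an async copy
}

void process(float* chunk, std::size_t n) {
    for (std::size_t i = 0; i < n; ++i)
        chunk[i] = chunk[i] * 2.0f + 1.0f;     // stand-in for shading work
}

void run(const float* src, float* bufA, float* bufB, std::size_t total) {
    float* cur = bufA;
    float* nxt = bufB;
    fetch(cur, src, CHUNK);                        // prime the pipeline
    for (std::size_t off = 0; off < total; off += CHUNK) {
        if (off + CHUNK < total)
            fetch(nxt, src + off + CHUNK, CHUNK);  // stage the next chunk early
        process(cur, CHUNK);                       // would overlap the fetch
        float* tmp = cur; cur = nxt; nxt = tmp;    // swap staging buffers
    }
}

int main() {
    const std::size_t N = 4 * CHUNK;
    std::vector<float> src(N, 1.0f), a(CHUNK), b(CHUNK);
    run(src.data(), a.data(), b.data(), N);
    return 0;
}
```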
 
The true value of DF investigations is likely limited to exclusives, where they can discern (wildly guess? :p) what solutions devs use when not constrained by cross-platform requirements, and most importantly how those solutions change over the platform's life: seeing what XB1 and PS4 exclusive devs get up to five years after launch.
I do not think there will ever be games like Dreams, The Tomorrow Children, and VR games on XBO. We have the Ubisoft slide where PS4 handled twice the number of "dancers".
 
Apart from GPGPU, I don't see what should be so specific about their respective exclusives? Both consoles have a completely straightforward architecture.
 
I do not think there will ever be games like Dreams, The Tomorrow Children, and VR games on XBO. We have the Ubisoft slide where PS4 handled twice the number of "dancers".

More than the hardware, the biggest difference is the freedom SCE WWS gives to its studios. The modifications Sony asked AMD for were for compute tasks (graphics or non-graphics).

Dreams is a 2017 game. From the 2011 PC prototype to the release of the game, that's six years of development and R&D.
 
James McLaren from Q-Games said Mark Cerny asked them to show what they could do with compute.

Media Molecule's Alex Evans doesn't like polygons.

They aren't AAA-budget studios, and Sony can give them the freedom to explore new real-time rendering fields in a game.
 
IMO, with the absence of normal maps the game looks really ugly on PS4; this is the second game on PS4 that suffers from missing or low-res normal maps.
 
StarX is talking about NFS; it confused me too, so here's the paragraph:

...There are more unusual variances in other areas though; in some scenes we see the PS4 lacking an extra normal map and texture layer used on Xbox One to add details to puddles of water in the ground on wet surfaces, while light sources sometimes appear brighter and feature a stronger bloom component. Given that artwork and effects generally appear identical in these areas elsewhere the differences here seems more like unintentional oversights as opposed to a calculated downgrade....
 
what game are you talking about?
NFS & Mad Max

Mad Max:
some lower resolution textures and normal maps are present on the characters on the PS4

http://www.eurogamer.net/articles/digitalfoundry-2015-mad-max-face-off

NFS:

[NFS comparison screenshots]
 
That's a strange thing. According to DF, this most likely looks like an accidental omission in selective areas rather than a conscious choice by the developers.
 