Digital Foundry Article Technical Discussion Archive [2014]

The performance lead in CPU-bound areas almost exactly matches the per-core performance advantage of X1 over PS4 in the latest CPU bench.

In GPU-bound areas PS4 has its usual 20%+ lead.

Likely that 30fps/900p is a good fit for X1, as a larger percentage of the frame buffer will fit in ESRAM, giving a lower overall load on main RAM and so less contention.
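
As a rough illustration (assuming a fairly typical deferred setup of four 32-bit colour targets plus a 32-bit depth buffer - Unity's actual G-buffer layout may well differ): at 1600x900 each target is about 5.5MB, so the whole set is roughly 27-28MB and squeezes into the 32MB of ESRAM; at 1920x1080 each target is about 7.9MB and the set is around 39-40MB, which doesn't fit, so something has to spill out to DDR3 and start competing with the CPU.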

The CPUs are being hammered hard by this game. This will greatly reduce available memory bandwidth on PS4, as per the Sony slides. Upping the resolution to 1080p on PS4 would only hurt the already awful frame rate further. Even a small further reduction would be catastrophic. No headroom.

No 7th-core unlock is needed to explain this.

Hard to disagree with the Metro dev - the up-clock of the X1 CPU was a smart move.
 
The impressions don't sound good for this game. They should have scaled the NPC number down to achieve a) acceptable framerate and b) the promised parity :)
 
I mean the hardware zlib decompression unit the PS4 possesses.

OK, then why did Cerny suggest using the GPU for decompression?

Physics simulation, collision detection, ray casting for audio, decompression and the like. And these operations are fine grained meaning that there will be many small world simulation tasks running on the GPU simultaneously alongside rendering of the game scenes. So the concept is that as game developers learn to use these techniques later on in the console life cycle, we will see richer and even more interactive worlds.

http://gamingbolt.com/mark-cerny-explains-the-ps4s-gpu-can-perform-asynchronous-complex-processes
 
I don't know. Maybe he's talking about a different kind of compression. If it's unsuited for the zlib decompression unit it's probably unfit for the Xbox One's DME decompression hardware, too.
 
I don't know. Maybe he's talking about a different kind of compression. If it's unsuited for the zlib decompression unit it's probably unfit for the Xbox One's DME decompression hardware, too.

But I think he was talking about texture decompression and XB1's DME has JPEG decompression as well.
 
What do you mean by saying equivalent co-processors? ACP?
Gamasutra said:
To further help the Blu-ray along, the system also has a unit to support zlib decompression -- so developers can confidently compress all of their game data and know the system will decode it on the fly. "As a minimum, our vision is that our games are zlib compressed on media," said Cerny.

http://www.gamasutra.com/view/feature/191007/inside_the_playstation_4_with_mark_.php?print=1
 
But I think he was talking about texture decompression and XB1's DME has JPEG decompression as well.

Why wouldn't you use a compressed texture format the GPU natively understands? If you need an intermediate format, why wouldn't you package them using zlib compression so you can leverage the hardware unit?
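
For what it's worth, the loading side of that scheme is trivial. Here's a minimal sketch with plain software zlib standing in for the PS4's hardware unit - the BC1-sized payload is just dummy data and the sizes are made up purely for illustration (build with -lz):

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <zlib.h>

int main(void)
{
    /* Stand-in for a texture already in a GPU-native block-compressed
       format (BC1 is 4 bits per texel, so a 512x512 texture is 128KB). */
    const uLong raw_size = 512 * 512 / 2;
    Bytef *raw = malloc(raw_size);
    memset(raw, 0xA5, raw_size);

    /* Build-time "package" step: deflate the GPU-ready blob with zlib. */
    uLongf packed_size = compressBound(raw_size);
    Bytef *packed = malloc(packed_size);
    if (compress(packed, &packed_size, raw, raw_size) != Z_OK)
        return 1;
    printf("packed %lu bytes into %lu\n", (unsigned long)raw_size,
           (unsigned long)packed_size);

    /* Load-time step: inflate straight back into the upload buffer.
       There is nothing left to transcode - the result is GPU-ready. */
    uLongf unpacked_size = raw_size;
    Bytef *unpacked = malloc(unpacked_size);
    if (uncompress(unpacked, &unpacked_size, packed, packed_size) != Z_OK)
        return 1;
    printf("round trip ok: %d\n", memcmp(raw, unpacked, raw_size) == 0);

    free(raw);
    free(packed);
    free(unpacked);
    return 0;
}
```

The uncompress() step is the part the dedicated unit would take off the CPU entirely, and the output is already in a format the GPU understands, so nothing else has to be transcoded at load time.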
 
The performance lead in CPU-bound areas almost exactly matches the per-core performance advantage of X1 over PS4 in the latest CPU bench.

In GPU-bound areas PS4 has its usual 20%+ lead.

Likely that 30fps/900p is a good fit for X1, as a larger percentage of the frame buffer will fit in ESRAM, giving a lower overall load on main RAM and so less contention.

The CPUs are being hammered hard by this game. This will greatly reduce available memory bandwidth on PS4, as per the Sony slides. Upping the resolution to 1080p on PS4 would only hurt the already awful frame rate further. Even a small further reduction would be catastrophic. No headroom.

No 7th-core unlock is needed to explain this.

Hard to disagree with the Metro dev - the up-clock of the X1 CPU was a smart move.

Yup, I'm thinking along exactly the same lines as you. As we discussed to death in another thread, the argument multiple people attempted to prove - that being CPU bound means the GPU can freely run at whatever resolution it wants because it has even more time to render - only applies when the GPU itself isn't being hampered.

In the case of PS4, if the CPU is being hammered to death then available GPU bandwidth is lost, and without the resources to finish a 1080p frame in the same frame time the CPU needs to finish its work, you're ultimately going to have to reduce the resolution to get those frames completed in time.

The CPU per-core performance advantage might still play a factor (I've personally never thought 150MHz could result in such a performance difference), but it doesn't explain the resolution drop; disproportionate bandwidth loss during contention does. Is the poor performance on PC API based? But they have a Mantle version, and it's only performing marginally better than the non-Mantle version, so as Laa-Yosh said we're probably looking at a rushed product.

You know what, in this scenario, I have to applaud Ubisoft for taking the hit here for the industry - they could have dialed back a lot of things to get it running like you'd expect, but despite the continual bad press they received, they stuck to their vision. If they had dialed their vision back to fewer people and less of everything, we'd just have another Assassin's Creed with better graphics. But they wanted to do more; they wanted to do things that would change the gameplay, not just the experience. I want games to do this, I want a game with heavy CPU load. Cinematic narrative games have their place, but part of being next-gen is offering next-gen experiences, not just next-gen graphics and narrative. We need different and varied gameplay instead of the same set pieces over and over again.
 
Ironically, the biggest complaint from reviews is that the things they've done affect mostly the visuals (both positive and negative) rather than the gameplay, which is too much the same as everything before.
 
...in this scenario, I have to applaud Ubisoft...

Wha?? I think it'll be an awkward, echoey clap.

Next time I'd prefer their expectations and the released videos to more accurately represent the game they're showing. I really fear for The Division, a game I've been really looking forward to this generation, and now I'm just waiting on the horror that's imminent.
 
Ironically, the biggest complaint from reviews is that the things they've done affect mostly the visuals (both positive and negative) rather than the gameplay, which is too much the same as everything before.
lol ouch. They'll get the formula right next time maybe.

Edit: Yes lol, I'll be that guy clapping by himself in the audience of silence.
 
XB1 has the same CPU BW blocking, no?
It does. It is likely losing tons as well on the DDR3 side of things.
The ESRAM could be slightly impacted by not being fed as well as it would be with no contention, but once the data is in ESRAM the GPU is entirely free to do its work there.
 
The impressions don't sound good for this game. They should have scaled the NPC number down to achieve a) acceptable framerate and b) the promised parity :)

Yep. If the crowd is the bottleneck, it was their job as engineers to fix it. Turning down the number of NPCs would be easy. Of course, it's likely a bogus answer - they just have a young, unoptimized engine and they ran out of time. Likely XB1 got some extra love with their bundle + marketing deal too. From all reports Rogue is the better game. They should have made it cross-gen and delayed Unity for 6-12 months. Ubi seems to be in disarray lately. I hope they don't screw up FC4.
 
You know what, in this scenario, I have to applaud Ubisoft for taking the hit here for the industry - they could have dialed back a lot of things to get it running like you'd expect, but despite the continual bad press they received, they stuck to their vision. If they had dialed their vision back to fewer people and less of everything, we'd just have another Assassin's Creed with better graphics. But they wanted to do more; they wanted to do things that would change the gameplay, not just the experience. I want games to do this, I want a game with heavy CPU load. Cinematic narrative games have their place, but part of being next-gen is offering next-gen experiences, not just next-gen graphics and narrative. We need different and varied gameplay instead of the same set pieces over and over again.

I'm not seeing a drastic change in the gameplay though, and the areas with big crowds seem to cause the framerate to hang out in the low 20s, so I will not applaud an ambitious effort that failed to execute. These consoles were really built around next-gen graphics with more of the same in terms of narrative. Developers have six cores to work with, but even with perfect utilization across all six cores we are still talking performance that would fall far below that of an i5 processor. So regardless of whether it's a time and budget thing, or the hardware simply can't handle their vision, a low framerate is undesirable. In the sections where there aren't many NPCs around, the game holds pretty tight to 30fps, but then is the game really offering anything new in those sections? Anywhere the game seems to be running just fine also seems to look and play exactly like all the previous AC games. If you can only apply your advancements at the expense of framerate, then it seems like a poor choice.
 
I'm not seeing a drastic change in the gameplay though, and the areas with big crowds seem to cause the framerate to hang out in the low 20s, so I will not applaud an ambitious effort that failed to execute. These consoles were really built around next-gen graphics with more of the same in terms of narrative. Developers have six cores to work with, but even with perfect utilization across all six cores we are still talking performance that would fall far below that of an i5 processor. So regardless of whether it's a time and budget thing, or the hardware simply can't handle their vision, a low framerate is undesirable. In the sections where there aren't many NPCs around, the game holds pretty tight to 30fps, but then is the game really offering anything new in those sections? Anywhere the game seems to be running just fine also seems to look and play exactly like all the previous AC games. If you can only apply your advancements at the expense of framerate, then it seems like a poor choice.

There are several ways to look at it, but I see your perspective. To me it's better to take a risk, try and fail, than it is to only do what you already know. Failing is how progress is made. We don't learn much from always attempting the same thing. They tried and failed; their next attempt should include a lot of learning, good and bad, from Unity. They may have set the bar too high within the limits of time and budget, but Unity is far from a complete failure.
 
If it's just the size of the crowds that tanks the framerate, they should have made the crowds smaller. Simple as that. I appreciate their attempt to make the world seem more realistic by having it full of more people, but at the expense of playability it's not a good choice. I haven't watched enough to see how interesting the NPC behaviour is, or how different they all look from one another.

My real issue with the game is the gameplay looks pretty much the same as the last six games.
 
In the case of PS4, if the CPU is being hammered to death then available GPU bandwidth is lost, and without the resources to finish a 1080p frame in the same frame time the CPU needs to finish its work, you're ultimately going to have to reduce the resolution to get those frames completed in time.

Well, let's not get carried away. Sony's slide from last August's GDC confirms a disproportionate amount of GPU bandwidth is lost as CPU bandwidth increases, but we're looking at a drop from 135GB/s to 100GB/s - I assume the metrics are accurate, otherwise why publish them.

[Slide: 2013-08-11 GDC - PS4 CPU and GPU Bandwidth Interaction]
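
To put rough numbers on it (eyeballing the slide, so treat these as approximate): a fall from ~135GB/s to ~100GB/s means the GPU loses ~35GB/s, which is far more bandwidth than the Jaguar cores could plausibly be consuming themselves - their own peak is generally put in the region of 20GB/s - and that gap is exactly what makes the loss disproportionate.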



The CPU per-core performance advantage might still play a factor (I've personally never thought 150MHz could result in such a performance difference),

150MHz doesn't sound like a lot, but it's 150MHz across the six cores available to games, or 900MHz in total - against the PS4's 1.6GHz cores that's the equivalent of a bit more than half an extra core, assuming the IPC of the two Jaguar CPUs is roughly equivalent.
 
If it's just the size of the crowds that tanks the framerate, they should have made the crowds smaller. Simple as that. I appreciate their attempt to make the world seem more realistic by having it full of more people, but at the expense of playability it's not a good choice. I haven't watched enough to see how interesting the NPC behaviour is, or how different they all look from one another.

I'm curious about the AI as well. Certainly big crowds of people - from what I've seen people stream on Live from PlayStation - are very impressive and if there's any repetition in models or behaviour, it doesn't readily stand out. But where there are really big crowds (hundreds of people), the majority are generally standing around jeering at something or somebody.

I'm curious how aware and reactive they are individually - if you rush into the middle and start firing a musket, will they all independently try to get away, i.e. stampede? If somebody could test this (and even better, capture it), I'd really appreciate it.

My real issue with the game is the gameplay looks pretty much the same as the last six games.
Yup. The old control issues still seem to be there, where it mis-predicts what you intend to do; most of the reviews mention the problem of trying to go through a door or window only to have the game think you're trying to climb over it.

Having come off the great controls in Shadow of Mordor, this may feel like a step backward.
 