Digital Foundry Article Technical Discussion Archive [2014]

Status
Not open for further replies.
Without knowing how the engine operates, it's going to be tough to figure out. I assume that if being CPU-bound is the issue, the small clock uptick on the X1 will play a small factor. With my limited knowledge of the subject, I think shared-memory contention on PS4 between its GPU and CPU could become a real problem if the CPU is taking too long to do its work and is constantly taking priority away from the GPU. If the charts were accurate, we would see a steep decline in available bandwidth, so that despite all the PS4's performance advantages, the ROPs wouldn't have enough bandwidth to operate fully and a decrease in resolution was necessary?
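To put rough numbers on that contention idea, here's a toy model. Every figure is a made-up illustration except the 176 GB/s peak, which is the PS4's quoted GDDR5 spec; the penalty factor reflects the idea (floated in this thread) that contention can cost the GPU more than the raw bytes the CPU moves.

```python
# Toy model of CPU/GPU memory contention on a shared-memory console.
# All inputs are illustrative assumptions, not measured figures.
def gpu_bandwidth_left(peak_gbs, cpu_demand_gbs, contention_penalty=1.0):
    """Bandwidth remaining for the GPU after CPU traffic and contention overhead."""
    return peak_gbs - cpu_demand_gbs * contention_penalty

# Hypothetical: 176 GB/s peak, CPU pulling 20 GB/s, 1.5x contention penalty
print(gpu_bandwidth_left(176.0, 20.0, 1.5))  # → 146.0
```

It's a one-liner, but it captures the shape of the argument: if the penalty factor is large, heavy CPU traffic could starve the ROPs well before the raw numbers suggest.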

Remember the last Ubisoft CPU benchmark, where the CPU test similarly showed a ~15% advantage for the XB1? There was no contention there, as only the CPUs were used. Also, previous benchmarks were either a draw or gave the PS4 the advantage; still no contention there.

Also, I remember a Ubisoft guy saying the XB1 got not one but two recent CPU improvements (the first assumed to be the 5% CPU boost from the June SDK); I can't find the source now, unfortunately.

Finally, 7 cores versus 6 cores is a ~16.7% advantage.
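That 16.7% figure is just the core-count ratio, assuming identical per-core performance and perfect scaling; the 7-vs-6 game-available core counts are the rumoured reservation numbers discussed in this thread, not confirmed specs:

```python
# Back-of-envelope: relative CPU throughput from core count alone,
# assuming identical per-core performance and perfect scaling.
def core_advantage(cores_a: int, cores_b: int) -> float:
    """Fractional throughput advantage of cores_a over cores_b."""
    return cores_a / cores_b - 1.0

# 7 game-available cores vs 6 (hypothetical reservation split)
print(f"{core_advantage(7, 6):.1%}")  # → 16.7%
```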
 
But wasn't there some GDC presentation about doing NPCs using compute ?

The presentation was regarding cloth simulation.

Sounds like a lousy job by the devs to me, or perhaps #PS4Parity (marketing dollars ?) at work.
Can we not go down this line of discussion? There are enough cesspools on the internet for wild speculation & "lazy devs".
 
The CPU really shouldn't be used for 10k AI and up, imho. It should have been streamed through the CUs. I bet the fact that this project started four years ago is part of the reason why it wasn't (though even then there were articles showing how it could/should be done). Also, the content creation tools they use must be quite extensive after so many games with so much content, so they probably weren't ready to rewrite those as well, especially during such a large and ambitious project. Still disappointing, but the game does look really nice. And there are some wind effects here and there, with planks flying around, that I wouldn't be surprised do use a bit of CU ;)
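For what it's worth, "streamed through the CUs" basically means laying agent state out as structure-of-arrays and running the same branch-free update across all agents at once. A toy sketch of that layout, with NumPy standing in for a compute shader (all names and numbers are mine, purely illustrative):

```python
import numpy as np

# Toy data-parallel crowd update in structure-of-arrays form, the layout
# a compute-shader implementation would want. NumPy stands in for the GPU;
# every agent is updated by the same branch-free arithmetic.
n = 10_000
rng = np.random.default_rng(0)
pos = np.zeros((n, 2), dtype=np.float32)           # agent positions
goal = rng.random((n, 2), dtype=np.float32)        # per-agent targets
speed = np.full(n, 1.4, dtype=np.float32)          # walk speed, m/s

def step(pos, goal, speed, dt=1 / 30):
    """Move every agent toward its goal; uniform work per agent, no branching."""
    delta = goal - pos
    dist = np.linalg.norm(delta, axis=1, keepdims=True)
    direction = delta / np.maximum(dist, 1e-6)
    return pos + direction * (speed[:, None] * dt)

pos = step(pos, goal, speed)
```

The catch, as posts below note, is that this only covers the embarrassingly parallel part (steering toward a goal); agent-vs-agent interaction is where it gets hard.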
 
Didn't they say they were CPU-limited? The PS4 has a lower-clocked CPU than the XB1, so the game should struggle a bit more if it truly is a CPU issue.

Yeah, it's right in the DF article here:

"Technically we're CPU-bound. The GPUs are really powerful, obviously the graphics look pretty good, but it's the CPU [that] has to process the AI, the number of NPCs we have on screen, all these systems running in parallel.

"We were quickly bottlenecked by that and it was a bit frustrating," he continued, "because we thought that this was going to be a ten-fold improvement over everything AI-wise, and we realised it was going to be pretty hard. It's not the number of polygons that affect the frame-rate. We could be running at 100fps if it was just graphics, but because of AI, we're still limited to 30 frames per second."

I partly feel bad for them, because they will likely get crucified on forums for this even though general AI code is difficult to shift to GPGPU, so they may very well be CPU-screwed; the odds of anyone believing them, though, are probably close to zero.
 
Disclaimer: I was never familiar with the engine tech, despite us being involved in the CG work, so this is pure speculation.

I believe this is a first iteration of the engine on the new hardware; despite not being a cross-generation title, the game probably didn't have enough time to properly fit the various systems to the new tech. The scope is huge, not really comparable to any next-gen releases so far, so the execution is probably not the best possible approach. The only comparable game is GTA V, but that's an enhanced port and doesn't try to do more than what was possible on the PS360 systems; they just scaled a few things up as much as possible.
We'll see how the (likely) future iterations fare; it's still possible that Ubi can improve the visuals and the stability. It probably also doesn't help that AC1 was so well done, so expectations were that its accomplishments could easily be repeated.
 
I partly feel bad for them, because they will likely get crucified on forums for this even though general AI code is difficult to shift to GPGPU, so they may very well be CPU-screwed; the odds of anyone believing them, though, are probably close to zero.
The 'crucifixion' is over their priorities. Why put in several hundred NPCs if it tanks the framerate so badly? Those crowds are very dense in places. Thin them out a little and hit a stable 30 fps.

Anyone can throw AI at a game until the hardware buckles. There's nothing special about this gen that means it can be AI'd to death where previous gens couldn't. Other games are more sensible about how they deal with it, scaling the AI to the hardware. It's no different from scaling the graphics to the hardware to hit a target framerate.
 
Disclaimer: I was never familiar with the engine tech, despite us being involved in the CG work, so this is pure speculation.

I believe this is a first iteration of the engine on the new hardware; despite not being a cross-generation title, the game probably didn't have enough time to properly fit the various systems to the new tech. The scope is huge, not really comparable to any next-gen releases so far, so the execution is probably not the best possible approach. The only comparable game is GTA V, but that's an enhanced port and doesn't try to do more than what was possible on the PS360 systems; they just scaled a few things up as much as possible.
We'll see how the (likely) future iterations fare; it's still possible that Ubi can improve the visuals and the stability. It probably also doesn't help that AC1 was so well done, so expectations were that its accomplishments could easily be repeated.

I agree. I hope next year is better on the performance side of things. Sub-30 is unacceptable, imo.
 
Isn't there any cloth simulation in this game? Or is it all canned, including Arno's cape?
I haven't actually seen much of the game outside of the screens, but I thought the cloth simulation was in regard to the dresses for the dancers. That they use a seemingly large number of characters is just a demonstration of compute capability; as for the game's population density... that is seemingly a separate issue that may sit wholly on the CPU. *shrug*

I'm somewhat curious about the visibility culling.
 
You can pre-choreograph a dance routine so that the characters all use the same motion and never collide, but in a crowd that also has to be interactive, you need to introduce pretty complex pathfinding and such. The AI agents have to deal with each other, the key NPCs, and the player; any intersecting or silly behaviour would instantly kill the illusion. It's a much more complex problem, and the interactions are probably the reason it can't simply be done with compute. Each agent has to deal with a relatively large number of other agents in its vicinity, which must be hard to parallelize.
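To be fair, the neighbour-query part at least has a standard answer: a uniform grid (spatial hash), so each agent only checks agents in nearby cells instead of all n² pairs. A minimal sketch, with all names and numbers mine; it assumes the query radius is no larger than the cell size, so a 3x3 cell neighbourhood suffices:

```python
from collections import defaultdict

# Minimal uniform-grid spatial hash: instead of checking every agent
# against every other (O(n^2)), bin agents by cell and only compare
# against the 3x3 neighbourhood of cells around the query point.
# Correct only while radius <= cell.
def build_grid(positions, cell):
    grid = defaultdict(list)
    for i, (x, y) in enumerate(positions):
        grid[(int(x // cell), int(y // cell))].append(i)
    return grid

def neighbours(grid, positions, i, cell, radius):
    """Indices of agents within `radius` of agent i (excluding i)."""
    x, y = positions[i]
    cx, cy = int(x // cell), int(y // cell)
    out = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for j in grid.get((cx + dx, cy + dy), []):
                if j != i:
                    px, py = positions[j]
                    if (px - x) ** 2 + (py - y) ** 2 <= radius ** 2:
                        out.append(j)
    return out

pts = [(0.0, 0.0), (0.5, 0.0), (3.0, 3.0)]
grid = build_grid(pts, cell=1.0)
print(neighbours(grid, pts, 0, cell=1.0, radius=1.0))  # → [1]
```

The query itself parallelizes fine (one thread per agent); the hard part the post describes is what each agent then *does* with its neighbours, since avoidance decisions feed back into each other.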
 
When the game OS is running, what is running besides the system OS? When running an app with the game OS suspended, there are 4 CPU cores available, not two. If the system OS can run comfortably on one CPU core, then why should this present a problem?

*If* the system OS can run comfortably on one core, sure. Do we have evidence of how the X1 performs on the 2 cores, let alone 1?

And once again, technical aspects aside, I am not a believer in silent patches this close to the most important 3 months of Xbox's year. There should have been some mention of it; Phil already said that X1 updates would resume in 2015. The next two updates, through December, will be minimal.
 
Even in Hitman: Blood Money they already showcased massive crowds running on ancient hardware. IO Interactive has some great talent.

It's tough to tell CPU load purely from crowd count; it depends on their level of interactivity and a whole host of other things. From what I recall, having finished Hitman, the large crowds in that game were actually quite primitive in their purpose.


The 'crucifixion' is over their priorities. Why put in several hundred NPCs if it tanks the framerate so badly? Those crowds are very dense in places. Thin them out a little and hit a stable 30 fps.

Anyone can throw AI at a game until the hardware buckles. There's nothing special about this gen that means it can be AI'd to death where previous gens couldn't. Other games are more sensible about how they deal with it, scaling the AI to the hardware. It's no different from scaling the graphics to the hardware to hit a target framerate.

Fair enough, although that's why I said "partly feel bad". Maybe they realised too late that they were going to miss their performance mark, and a "ship before Thanksgiving" mandate meant last-minute changes were impossible. So yeah, partly their own doing, but I still feel bad that no one will believe them that they are CPU-bound.
 
The PS4 advantage in cutscenes seems to support the official explanation.

Still... take almost any screenshot of the game, lean back a little and squint; it looks almost like a movie. The only exception is the wide city vistas, where asset and texture instancing becomes more obvious.
 
I'm not sure if that counts as an official source. But even if true, it doesn't imply that a whole other core was unlocked when the Kinect reservation was removed.

However: if the reservation had unlocked a core plus the GPU and memory allocations, that would be a significant release of resources, and we'd have heard more from developers if it were that large. I believe a developer described the removal of the Kinect reservation as a minor increase in performance, with at most inconsistent resources being released to them; the June SDK is actually what led to the major performance improvements on X1. That, to me, isn't a strong indication that a full extra core was released at that time.
 
It's tough to tell CPU load purely from crowd count; it depends on their level of interactivity and a whole host of other things. From what I recall, having finished Hitman, the large crowds in that game were actually quite primitive in their purpose.

Not really seeing what's so impressive in the crowds of Unity either, to be honest. I mean, there are more animation types, but that was going to happen regardless because of the memory increase. Other than that, they still stand there miming endlessly for you to wade through.

I haven't actually seen much of the game outside of the screens, but I thought the cloth simulation was in regard to the dresses for the dancers. That they use a seemingly large number of characters is just a demonstration of compute capability; as for the game's population density... that is seemingly a separate issue that may sit wholly on the CPU. *shrug*

I'm somewhat curious about the visibility culling.

I too thought it was for the dancers, but after watching a few vids it doesn't seem like it. Maybe in the next game, I guess.
 
However: if the reservation had unlocked a core plus the GPU and memory allocations, that would be a significant release of resources, and we'd have heard more from developers if it were that large. I believe a developer described the removal of the Kinect reservation as a minor increase in performance, with at most inconsistent resources being released to them; the June SDK is actually what led to the major performance improvements on X1. That, to me, isn't a strong indication that a full extra core was released at that time.

Or it could be that the theoretical improvement is under NDA and MS doesn't want the information spread to those who don't need to know.
 
Remember the last Ubisoft CPU benchmark, where the CPU test similarly showed a ~15% advantage for the XB1? There was no contention there, as only the CPUs were used. Also, previous benchmarks were either a draw or gave the PS4 the advantage; still no contention there.

Also, I remember a Ubisoft guy saying the XB1 got not one but two recent CPU improvements (the first assumed to be the 5% CPU boost from the June SDK); I can't find the source now, unfortunately.

Finally, 7 cores versus 6 cores is a ~16.7% advantage.

Right, but in a scenario where there is shared-memory contention (as games would have), we can't use a graph looking strictly at CPU benchmarks. Bandwidth contention is going to be an issue for both X1 and PS4; the question is how much, and ultimately what the behaviour will be if it becomes extremely lopsided. *We don't know yet.* In this scenario, having far more NPCs with AI, animations, sound and collisions needing to be processed by the CPU could be exactly such an extreme load.
 