How should devs handle ports between consoles? *spawn

This is not gonna end well. Forced parity is such bullshit.

We went through the entire last generation with worse-performing PS3 ports and zero X360 games were "toned down", and now that MS is feeling threatened, we magically have devs [with unlimited funds and manpower] who are not willing to take full advantage of the PS4.
 
In all seriousness, spare a thought for PC gamers, almost all of whom have CPUs capable of handling several times the AI workload.

Parity is a thing that we all have to live with at different times. For Sony fans, after more than a decade of being in the shadow of Xbox GPUs, this is probably a bit upsetting after finally tasting victory. But it's something everyone else has been living with for a long time. There were no doubt times when the PC gaming master race felt despair.

Ubi should have probably worded this better to protect themselves from the backlash.
 
Ubi says they could have run the graphics at 100 fps if they wanted, but because of the advanced AI they had to lower the res? Sounds like bullshit :???:

Edit: I just reread the article. The parity doesn't have any relation to the AI part. Parity is just for avoiding debate (what?) and the AI is keeping them from going beyond 30 fps.
 
Parity is just for avoiding debate (what?) and the AI is keeping them from going beyond 30 fps.

Volume perhaps. On previous AC games it was usual to see dozens and dozens of NPCs but in some of the Unity screens there are hundreds of NPCs.

I presume they're not individually simulating every single NPC; more likely there's some grouped crowd behaviour for the mass crowd scenes. But there's still a fair amount of positional AI required, i.e. making sure NPCs aren't trying to stand in the same place, push each other out of the way, and so on.
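For anyone curious, that sort of avoidance step can be sketched in a few lines. This is purely illustrative (my own toy names and numbers, nothing from Ubisoft's engine): each NPC is nudged away from any neighbour standing inside its personal-space radius.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Toy crowd-separation pass: every NPC gets nudged away from neighbours that
// stand closer than a personal-space radius. Illustrative only.
struct Npc { float x, y; };

void separate(std::vector<Npc>& crowd, float radius, float strength)
{
    for (std::size_t i = 0; i < crowd.size(); ++i) {
        float pushX = 0.0f, pushY = 0.0f;
        for (std::size_t j = 0; j < crowd.size(); ++j) {
            if (i == j) continue;
            float dx = crowd[i].x - crowd[j].x;
            float dy = crowd[i].y - crowd[j].y;
            float dist = std::sqrt(dx * dx + dy * dy);
            if (dist > 0.0f && dist < radius) {
                // Push harder the closer the neighbour is.
                pushX += (dx / dist) * (radius - dist);
                pushY += (dy / dist) * (radius - dist);
            }
        }
        crowd[i].x += pushX * strength;
        crowd[i].y += pushY * strength;
    }
}
```

Even this naive version is an n-squared loop over the crowd, which is part of why it gets expensive once you pack hundreds of NPCs into one street.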
 
Just a thought, but if Ubi are hitting performance problems on the CPU, then reducing resolution will reduce memory contention and get them some CPU performance back, particularly for algorithms doing lots of random accesses where the CPU stalls waiting on data from main memory.

In short, reducing resolution may help with performance of their CPU limited code.
 
Lowest common denominator. Makes sense from some perspective.

I do wonder how the industry will settle eventually. This gen has, quite frankly, been a God-awful mess. The feedback from the masses is confusing the hell out of companies who have no idea how to handle it. Eventually I guess we'll get to a point where publishers don't comment, and just release products and get sales, or not. Either that or they'll give up and chase some other business entirely where the audience isn't so damned demanding and argumentative. "Damned if you do, damned if you don't," has never been more applicable to this biz than now.
 
Volume perhaps. On previous AC games it was usual to see dozens and dozens of NPCs but in some of the Unity screens there are hundreds of NPCs.

I presume they're not individually simulating every single NPC; more likely there's some grouped crowd behaviour for the mass crowd scenes. But there's still a fair amount of positional AI required, i.e. making sure NPCs aren't trying to stand in the same place, push each other out of the way, and so on.

I watched a dev video from them explaining their system a while back. They increase the interactivity of the AI for people nearby while keeping a more generalized one for the larger group. It's not so much the number of people in the crowd being an impressive feat (since that kind of thing has been seen around since Hitman games) but rather the interactions and collisions, I guess.
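A crude sketch of what that kind of AI level of detail could look like — the tier names and distance thresholds here are invented for illustration, not taken from the video:

```cpp
#include <cmath>

// Hypothetical AI LOD tiers: full simulation for nearby NPCs, cheaper grouped
// behaviour further out, little more than an animated body at the far end.
// The thresholds are made-up numbers, not Ubisoft's.
enum class AiLod { Full, Group, Ambient };

AiLod pickAiLod(float npcX, float npcY, float playerX, float playerY)
{
    float dx = npcX - playerX;
    float dy = npcY - playerY;
    float dist = std::sqrt(dx * dx + dy * dy);

    if (dist < 15.0f) return AiLod::Full;   // individual pathing, reactions, collisions
    if (dist < 60.0f) return AiLod::Group;  // shared crowd flow, coarse collision
    return AiLod::Ambient;                  // just fills out the scene
}
```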
 
"Technically we're CPU-bound," he said. "The GPUs are really powerful, obviously the graphics look pretty good, but it's the CPU [that] has to process the AI, the number of NPCs we have on screen, all these systems running in parallel."
Um, doesn't the guy realize that's even more reason to increase the resolution? :)
OK, if you're CPU-bound it would be hard to increase framerate or do better AO/AA etc., but the CPU has bugger-all effect on resolution.
I'd put money on a patch to 1080p coming later on.
 
Unfortunately on PS4/Bone the performance of the CPU and GPU are interrelated.

Sony released a very interesting slide showing just how disproportionately CPU memory access hurts the GPU, and the Metro dev categorically stated that - at least on the Bone (and he may have been referring to PS4 too) - GPU access hurt CPU performance in a very real way.

Unified memory isn't *all* win after all! :eek:
 
In short, reducing resolution may help with performance of their CPU limited code.

This makes sense. We've all seen the Sony PS4 dev slide showing how disproportionately CPU bandwidth drops as GPU bandwidth usage increases. Of course we don't know whether by "CPU bound" they mean computationally or RAM bandwidth. It could be a bit of both.

I watched a dev video from them explaining their system a while back. They increase the interactivity of the AI for people nearby while keeping a more generalized one for the larger group. It's not so much the number of people in the crowd being an impressive feat (since that kind of thing has been seen around since Hitman games) but rather the interactions and collisions, I guess.

Yeah, the interactions increase not-quite-exponentially with volume as well. A few dozen people on screen in a decent-sized environment (like a street) don't require too much in the way of interaction/collision detection because there is plenty of space around them, but some of the screens of Unity show a mass of people. A hundred or more NPCs pressed closely against each other may be quite tricky. They need to be able to move, but you don't want them to look like people placed in a predetermined pattern.
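That near-quadratic blow-up is why dense crowds usually go through some sort of spatial partitioning, so each NPC only checks against neighbours in its own cell rather than the whole crowd. A minimal sketch, assuming a simple uniform grid (cell size and layout are my own guesses, not Ubi's implementation):

```cpp
#include <cmath>
#include <cstddef>
#include <cstdint>
#include <unordered_map>
#include <vector>

// Toy uniform grid: bucket NPCs by cell so avoidance/collision checks only
// look at NPCs in the same (or adjacent) cells instead of the whole crowd.
struct Npc { float x, y; };

std::int64_t cellKey(float x, float y, float cellSize)
{
    auto cx = static_cast<std::int64_t>(std::floor(x / cellSize));
    auto cy = static_cast<std::int64_t>(std::floor(y / cellSize));
    return (cx << 32) ^ (cy & 0xffffffff); // pack the two cell coordinates into one key
}

std::unordered_map<std::int64_t, std::vector<std::size_t>>
buildGrid(const std::vector<Npc>& crowd, float cellSize)
{
    std::unordered_map<std::int64_t, std::vector<std::size_t>> grid;
    for (std::size_t i = 0; i < crowd.size(); ++i)
        grid[cellKey(crowd[i].x, crowd[i].y, cellSize)].push_back(i);
    return grid; // query an NPC's cell (plus the 8 around it) for nearby NPCs
}
```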
 
I take it the compute units are no good at processing AI?
 
My computer is CPU-limited in just about every game I play because it's old, so my GPU (which is several years younger than the CPU) handles as much AA as I want without hitting my FPS at all.

So being CPU-limited means GPU 'extras' could easily be turned on.
 
My computer is CPU-limited in just about every game I play because it's old, so my GPU (which is several years younger than the CPU) handles as much AA as I want without hitting my FPS at all.

So being CPU-limited means GPU 'extras' could easily be turned on.

I'm not sure if that's entirely true. Correct me if I'm wrong.


You have 33ms per frame to hit 30fps: 33ms on your GPU and 33ms on your CPU, running in parallel. But if the CPU runs over 33ms, there is less time your GPU has to work with to meet that 33ms. Essentially, over time your GPU will catch up with the slowness of your CPU and just sit there doing nothing. *Having said that, this is an ideal situation for that dynamic resolution, right guys!! ;)*

Say your CPU time is 49ms per frame. In order to keep that 30fps, your GPU might need to complete its frames in 16ms just to make the game appear as though it's running at 30fps.

If you cranked your GPU settings up to take as long as your CPU time, so your GPU is running 49ms as well, you're only getting something like 15fps.
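For what it's worth, the back-of-the-envelope numbers depend on how much the CPU and GPU frames actually overlap. Two idealised extremes, using the frame times from the post above (purely illustrative):

```cpp
#include <algorithm>
#include <cstdio>

// Two idealised models of frame throughput:
//  - fully pipelined: CPU and GPU frames overlap, the slower one sets the pace
//  - fully serial:    the GPU only starts once the CPU frame is finished
double pipelinedFps(double cpuMs, double gpuMs) { return 1000.0 / std::max(cpuMs, gpuMs); }
double serialFps(double cpuMs, double gpuMs)    { return 1000.0 / (cpuMs + gpuMs); }

int main()
{
    std::printf("33/33ms: pipelined %.1f fps, serial %.1f fps\n",
                pipelinedFps(33, 33), serialFps(33, 33)); // ~30.3 vs ~15.2
    std::printf("49/16ms: pipelined %.1f fps, serial %.1f fps\n",
                pipelinedFps(49, 16), serialFps(49, 16)); // ~20.4 vs ~15.4
    std::printf("49/49ms: pipelined %.1f fps, serial %.1f fps\n",
                pipelinedFps(49, 49), serialFps(49, 49)); // ~20.4 vs ~10.2
    return 0;
}
```

Real engines land somewhere between those two extremes depending on where the sync points fall.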
 
Is Ubisoft PR run by monkeys? Legit LOL at that statement. Wouldn't surprise me if whoever said that doesn't last the rest of the day in their job...
 