How should devs handle ports between consoles? *spawn

Just get both versions to 1080p. PS4 should be fine as it is; just crank down settings on Xbone until it locks at 30fps. These days the difference between Low, Medium, High and Very High is not all that drastic when everything is rendered at the same res, but the performance you save can really pay off.


I think this is an ideal out for Ubisoft. Part of me suspects they were downplaying resolution so as not to get hopes up and get burned like they did with WD. So perhaps we will see an improvement right at release, at least for one system, but ideally both.

And if I were MS? Haha, I would send in the engineers! Great time to see if they can get a leg up for once, if Sony doesn't care to.
 
Just get both versions to 1080p. PS4 should be fine as it is; just crank down settings on Xbone until it locks at 30fps. These days the difference between Low, Medium, High and Very High is not all that drastic when everything is rendered at the same res, but the performance you save can really pay off.

This by a country mile...

If they simply targeted 1080p/30fps on PS4 and then just switched off AA and AO and dropped to a lower shadow quality on XB1, it should be a reasonable enough cut for the XB1 to keep a consistent 30fps, and you avoid the #resolutiongate discussion because both SKUs run at full HD res. So gamers are happy, and so is the marketing department.
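
To put some flesh on that, here's a minimal sketch of what such per-platform presets could look like. All the names and values are hypothetical, invented for illustration, not taken from any actual SDK or from Ubisoft's engine:

```cpp
// Hypothetical per-platform quality presets: same 1080p target on both
// consoles, with post-effects traded away on XB1 to buy back GPU time.
#include <cstdio>

struct QualityPreset {
    int  renderWidth, renderHeight;
    bool antiAliasing;       // e.g. a post-process AA pass
    bool ambientOcclusion;   // e.g. an SSAO pass
    int  shadowMapSize;      // per-cascade shadow resolution
};

// PS4: full settings at 1080p.
constexpr QualityPreset kPresetPS4 { 1920, 1080, true,  true,  2048 };
// XB1: same 1080p target, but AA/AO off and cheaper shadows,
// aiming to claw back the headroom needed for a locked 30fps.
constexpr QualityPreset kPresetXB1 { 1920, 1080, false, false, 1024 };

int main() {
    const QualityPreset& p = kPresetXB1; // would be selected per platform at build time
    std::printf("%dx%d  AA:%d  AO:%d  shadows:%d\n",
                p.renderWidth, p.renderHeight,
                p.antiAliasing, p.ambientOcclusion, p.shadowMapSize);
}
```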
 
"We decided to lock them at the same specs to avoid all the debates and stuff,"

Where have they been for the last year, that someone thought this would actually happen? :)

It's a new engine; maybe the kinks aren't worked out and they don't care to do that in the timeframe they have left. Besides, what company wants their co-marketing partner's console to have a bullet-point deficit if fixing it would push them off their scheduled release date? The weird part is that they had no plans to make a patch, but then again that might have been a 'political' statement, which would be a good idea this close to launch. Folks can get annoyed, but IF there is a performance issue that cannot easily be resolved, and similar specs are the result, I don't see why Ubi should bother dealing with it and blowing their release date and its attendant marketing schedule.
 
And if I were MS? Haha, I would send in the engineers! Great time to see if they can get a leg up for once, if Sony doesn't care to.

MS need an army of their engineers permanently on 3rd party resolution duty, or better still some trainers so that 3rd parties can achieve the same performance boosts the MS guys do.

When those wizards go to work they get the X1 hitting pretty much all the targets the PS4 reaches, seemingly overnight.
 
MS need an army of their engineers permanently on 3rd party resolution duty, or better still some trainers so that 3rd parties can achieve the same performance boosts the MS guys do.

When those wizards go to work they get the X1 hitting pretty much all the targets the PS4 reaches, seemingly overnight.

It's actually kind of sad that these developers need assistance from Microsoft to get the desired performance. Are the developers that bad at optimizing software?

It certainly seems like we, or perhaps developers, are overcomplicating the situation. If you want to hit 1080p/60fps with a game on PC, you go into the options menu and make some changes until you get what you want. It shouldn't be that much more complicated for developers on these consoles. If the game runs at 1080p on PS4 and holds a solid framerate, it shouldn't be too tough to change a few settings and get the same performance on X1. They are choosing to have subpar performance on X1. They could lower settings in areas to get the framerate where they want it. It's a developer's choice, and so far it seems they are choosing to maintain visual fidelity at the expense of a solid framerate.
 
It's actually kind of sad that these developers need assistance from Microsoft to get the desired performance. Are the developers that bad at optimizing software?

It certainly seems like we, or perhaps developers, are overcomplicating the situation. If you want to hit 1080p/60fps with a game on PC, you go into the options menu and make some changes until you get what you want. It shouldn't be that much more complicated for developers on these consoles. If the game runs at 1080p on PS4 and holds a solid framerate, it shouldn't be too tough to change a few settings and get the same performance on X1. They are choosing to have subpar performance on X1. They could lower settings in areas to get the framerate where they want it. It's a developer's choice, and so far it seems they are choosing to maintain visual fidelity at the expense of a solid framerate.

The fact that they've finally managed to get their games running where we all believed the Xbox One should have been performing is at least some indication of dedication to the platform. A glance at, say, the Frostbite 3 engine shows progress in a year from 720p (BF4) to 900p (PvZ) to a declared 1080p (BF Hardline, still to be proven). We are still waiting to see how DA3 turns out.
I mean, the hardware hasn't changed at all since launch, and the removal of Kinect frees up only a variable amount of performance, so we're really just looking at proper usage of the hardware and improvements in the tools providing huge gains on their own platform.

I imagine they only really need to do this once; every company should then know how to do it for the lifetime of the console and keep getting better at it. Once this 'send in the MS engineers' phase is over, I think it might occur again if they really do decide to push 'cloud-enhanced CPU-based games'; I can see that being a highly collaborative effort for a while.
 
It's actually kind of sad that these developers need assistance from Microsoft to get the desired performance.

It's not an uncommon situation. Sony had engineers assisting 3rd parties with PS3 engines due to CELL being such a bitch to code for. I expect all platform holders have sent out troubleshooting teams in the past.


Are the developers that bad at optimizing software?

As has been repeated ad infinitum on these forums, no.

The issue is always that developers have a target to hit within a time/budget constraint. Not all developers are equally versed in each platform's idiosyncrasies, not all budgets are adequate.

Optimisation is largely an iterative process. The MS troubleshooters have most likely been through those optimisation scenarios many more times than the developers they are sent in to assist, so can pinpoint issues and suggest changes quicker.

'Good' and 'bad' are largely irrelevant terms here with respect to developers. 'Experienced', 'familiar' and 'trained' are far more applicable.
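
As a trivial illustration of that iterative loop, here's the kind of per-subsystem timing a troubleshooter would start from: measure, attack the biggest number, re-measure. Toy code with invented workloads, no real engine or console API involved:

```cpp
// A toy scoped timer: print how long each labelled block took, so the
// biggest number can be attacked first, then everything re-measured.
#include <chrono>
#include <cstdio>

struct ScopedTimer {
    const char* label;
    std::chrono::steady_clock::time_point start = std::chrono::steady_clock::now();
    explicit ScopedTimer(const char* l) : label(l) {}
    ~ScopedTimer() {
        auto us = std::chrono::duration_cast<std::chrono::microseconds>(
            std::chrono::steady_clock::now() - start).count();
        std::printf("%-12s %lld us\n", label, static_cast<long long>(us));
    }
};

// Stand-in for real subsystem work.
void simulateWork(int n) { volatile long s = 0; for (int i = 0; i < n; ++i) s = s + i; }

int main() {
    { ScopedTimer t("animation"); simulateWork(200000); }
    { ScopedTimer t("shadows");   simulateWork(800000); } // the obvious hotspot
    { ScopedTimer t("postfx");    simulateWork(400000); }
}
```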
 
For the fact that they've managed to finally get their games starting to run where we all believed that Xbox One should have been performing at is at least some indication of the dedication to the platform. A simple glance at say Frostbyte 3 engine, we can see in a year progress from 720p (BF4) to 900p (PVZ) 900p+ (pvz) and 1080p (still needs to be proven but is declared; BF Hardline). We are still waiting to see how DA3 turns out.
I mean, the hardware hasn't changed at all since launch, removal of kinect is variable in performance, so we're really just looking at proper usage of the hardware and better improvement in tools providing huge improvements for their own platform.

I imagine that they only really need to do this once, and every company going forward should know how to do this going forward for the lifetime of the console and get better at it. I think once this 'send MS engineers' phase is over, it might occur again if they really do decide to push 'cloud enhanced CPU based games' I can see that as being a highly collaborative effort for a while.

+1 to all of that.

Last gen there was a distinct transition period with respect to PS3 performance. It started off atrocious, because engines were largely being carried over from previous generations, written with a single-threaded model or relying on RSX for all render tasks, and the SDK and toolset for the PS3 were not providing the optimisation data that was needed.

After a while the machine with the vastly inferior GPU generally achieved parity, through improved performance monitoring tools, improved APIs, and knowledge of the peculiarities of CELL development, like getting SPUs to assist with rendering tasks.

Now we see X1 games showing great improvement through better developer tools, improved APIs and better knowledge of how ESRAM can assist with bandwidth bottlenecks.
 
The PS3 and 360 were two very different beasts with different strengths and weaknesses; arguably very equal overall once fully tapped. That isn't true this gen. You can't make a 1.3TF console perform like a 1.8TF one unless you put a lot of work into one and little into the other (Destiny comes to mind), or you just cap the frame rate and let the excess go to waste (Diablo 3). If a 3rd party developer actually plays to the strengths and weaknesses with equal man-hours and doesn't play the parity card, then you will get measurable, expected differences.
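
For what it's worth, here is the back-of-envelope arithmetic behind that expectation, using the commonly quoted spec figures; this is the naive version of the argument, nothing more:

```cpp
// Compare the raw compute ratio to the pixel-count ratio of the usual
// resolution targets. Real performance depends on far more than FLOPS.
#include <cstdio>

int main() {
    const double tfPS4 = 1.84, tfXB1 = 1.31;       // commonly quoted TFLOPS figures
    const double px1080 = 1920.0 * 1080.0;         // 2,073,600 pixels
    const double px900  = 1600.0 * 900.0;          // 1,440,000 pixels
    std::printf("compute ratio: %.2f\n", tfPS4 / tfXB1);  // ~1.40
    std::printf("pixel ratio  : %.2f\n", px1080 / px900); // ~1.44
    // The two ratios land close together, which is why 1080p-vs-900p
    // differences get read (rightly or wrongly) as "the TF gap".
}
```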
 
'Good' and 'bad' are largely irrelevant terms here with respect to developers. 'Experienced', 'familiar' and 'trained' are far more applicable.

Quite correct.
However, not all people (devs) want to improve; does that classify them as bad, then?
 
Seriously, can we stop with the 1.3TF vs 1.8TF thing? It's starting to sound like a Bay movie sequel for a start, but more importantly it's an oversimplified view of the issue.
 
+1 to all of that.

Last gen there was a distinct transition period with respect to PS3 performance. It started off atrocious, because engines were largely being carried over from previous generations, written with a single-threaded model or relying on RSX for all render tasks, and the SDK and toolset for the PS3 were not providing the optimisation data that was needed.

After a while the machine with the vastly inferior GPU generally achieved parity, through improved performance monitoring tools, improved APIs, and knowledge of the peculiarities of CELL development, like getting SPUs to assist with rendering tasks.

Now we see X1 games showing great improvement through better developer tools, improved APIs and better knowledge of how ESRAM can assist with bandwidth bottlenecks.
You can't compare the previous generation with this one.

The PS3 had an "inferior" GPU but a CPU that was probably more capable of assisting the GPU. The 360 and the PS3 had many differences: type of RAM, unified memory vs. discrete memory, different GPUs, different CPUs...

Now this time the PS4 and XB1 are almost the same. The GPU and CPU architectures are the same: the PS4 has the same GPU with more compute units, and the CPU is the same with a tiny overclock on the XB1, which can barely cover for the GPU performance difference. Both have the same amount of RAM, but one has GDDR5 while the other has DDR3 plus some ESRAM, which can only partly help with the memory bandwidth situation.

Both the PS4 and the XB1 will see improved APIs.

To summarize, the only hardware "peculiarity" that the XB1 has that may help parity is the ESRAM, and that's only for memory and only in specific situations.
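
To put rough numbers on that, here's a sketch using the commonly quoted bandwidth figures and a hypothetical deferred-rendering G-buffer layout (the target counts and formats are invented for illustration):

```cpp
// Why ESRAM only partly helps: it's fast, but only 32 MB, and a fat
// 1080p G-buffer doesn't all fit, so something spills to slow DDR3.
#include <cstdio>

int main() {
    const double esramMB = 32.0;
    const int w = 1920, h = 1080;
    // Hypothetical deferred setup: 4 colour targets + depth, 4 bytes/pixel each.
    const double gbufferMB = 5.0 * w * h * 4.0 / (1024.0 * 1024.0);
    std::printf("1080p G-buffer: %.1f MB vs %.0f MB of ESRAM\n", gbufferMB, esramMB);
    // ~39.6 MB > 32 MB, so some targets spill to DDR3 at ~68 GB/s,
    // while the PS4's unified GDDR5 pool offers ~176 GB/s for everything.
}
```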
 
Quite correct.
However, not all people (devs) want to improve; does that classify them as bad, then?

I guess so. It should at least classify them as 'employment prospects limited' ;)

Of course there will be poor developers out there, but I'd expect the standard set by major 3rd party studios to be pretty high.
 
I think this is an ideal out for Ubisoft. Part of me suspects they were downplaying resolution so as not to get hopes up and get burned like they did with WD. So perhaps we will see an improvement right at release, at least for one system, but ideally both.

And if I were MS? Haha, I would send in the engineers! Great time to see if they can get a leg up for once, if Sony doesn't care to.

Almost anything MS engineers do to help performance will benefit the PS4 version; unless they focus exclusively on ESRAM utilization, Sony can coast on those coattails.
 
The way I see it, if a console dev (with fixed hardware) making an AAA game doesn't use all the available hardware, then they are seeking parity even at the cost of not fully utilizing the hardware.
Basically, if you make a game exclusively for PS4 (especially an AAA game), surely you try to use all the available hardware. Even if you're limited by the CPU, that doesn't mean you stop exploiting the GPU (or maybe the extra bandwidth, or something else). If I were making a game so CPU-limited that I couldn't fully exploit other parts of the system, I would certainly tweak my CPU usage so the approach becomes more balanced towards the other parts of the system.

In Unity's case, if the game is really CPU-limited, does that mean that on X1 it might not utilize the full 12 CUs? Or does it just happen that in this CPU-limited scenario they can only use 12 CUs? Isn't that convenient for X1? If the devs were only making Unity for PS4, would they still make a game limited to X1 specs, or would they somehow magically be able to use all the available hardware?

Anyway, that's just me thinking. It could also be that the X1 code is more optimized than the PS4 code because of MS's help. But as it is right now, the game isn't out yet. It should be fun to see how much parity they can achieve with Unity, and whether their claim of CPU-limited performance translates to a better gaming experience.
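
To illustrate what "CPU limited" would mean for parity, here's a toy model: on a pipelined renderer the frame time is set by whichever of CPU or GPU takes longer, so extra CUs buy nothing once the CPU is the long pole. All the millisecond figures below are invented:

```cpp
// Toy model of a CPU-bound game: frame time = max(CPU time, GPU time).
#include <algorithm>
#include <cstdio>

int main() {
    const double cpuMs = 36.0;  // imagined AI/crowd/draw-call submission cost
    const double gpuFastMs = 24.0, gpuSlowMs = 28.0; // stronger vs weaker GPU
    const double frameFast = std::max(cpuMs, gpuFastMs);
    const double frameSlow = std::max(cpuMs, gpuSlowMs);
    std::printf("fast GPU: %.1f ms (%.1f fps)\n", frameFast, 1000.0 / frameFast);
    std::printf("slow GPU: %.1f ms (%.1f fps)\n", frameSlow, 1000.0 / frameSlow);
    // Both land on the same 36 ms frame: the stronger GPU simply idles longer.
}
```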
 
The way I see it, if a console dev (with fixed hardware) making an AAA game doesn't use all the available hardware, then they are seeking parity even at the cost of not fully utilizing the hardware.
Basically, if you make a game exclusively for PS4 (especially an AAA game), surely you try to use all the available hardware. Even if you're limited by the CPU, that doesn't mean you stop exploiting the GPU (or maybe the extra bandwidth, or something else). If I were making a game so CPU-limited that I couldn't fully exploit other parts of the system, I would certainly tweak my CPU usage so the approach becomes more balanced towards the other parts of the system.

In Unity's case, if the game is really CPU-limited, does that mean that on X1 it might not utilize the full 12 CUs? Or does it just happen that in this CPU-limited scenario they can only use 12 CUs? Isn't that convenient for X1? If the devs were only making Unity for PS4, would they still make a game limited to X1 specs, or would they somehow magically be able to use all the available hardware?

Anyway, that's just me thinking. It could also be that the X1 code is more optimized than the PS4 code because of MS's help. But as it is right now, the game isn't out yet. It should be fun to see how much parity they can achieve with Unity, and whether their claim of CPU-limited performance translates to a better gaming experience.

It costs money and time. Making a multiplatform game is not an easy task, and making full use of the hardware is seldom if ever achievable. The reality is that your choices for the game and how it should perform will drive how your engine is designed. If that doesn't line up with your console, too bad; both will be compromised to a degree to make it work. At the end of the day the developers are trying to shoehorn their game into the console, and they'll make adjustments where needed.

Games are about making choices. Once player choice and freedom of movement are involved, and the more involved they are (i.e. not a rails-based game), it becomes harder and harder to make something that will maximize the console. The more you can do, the more diverse your engine has to be in handling different situations, which ultimately means you can't optimize everything, for the sake of allowing freedom of interaction. The more the game controls everything, the less of a game it becomes and the more of a cinematic rails experience it becomes. If you're okay with watching CG movie games, so be it, but the hardware needs to support all types of games, not just the ones that cost millions of dollars in assets.

Developers are building games, not tech demos. They do their best within the limitations of the hardware presented before them. You guys need to relax about squeezing out every last drop; it doesn't happen, and it'll likely never happen, because no single game needs to use every single feature available in the hardware. Most of the features are just different solutions to the same problem!
 
Besides, any console game will only "max out" the hardware in worst-case situations, in order to keep minimum frame rates above 30/60 fps. This means that 60-80% of the time a lot of the CPU and GPU power is left unused.

Like in the case of Unity, whenever you'd be roaming the deserted countryside or walking in an underground tunnel on your own, there'd be a lot of "wasted" TFs. But they have to design the engine and the various systems so that the game could still maintain a fluid framerate even when you're fighting a dozen soldiers on top of a building, in front of the entire city, with lots of citizens watching from the ground.
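
A sketch of that worst-case budgeting, with entirely made-up scene costs, just to show the shape of the trade-off:

```cpp
// Pick settings so the heaviest scenario stays under budget, and accept
// the idle headroom everywhere else. All millisecond costs are invented.
#include <cstdio>

int main() {
    const double budgetMs = 33.3; // 30 fps target
    struct Scene { const char* name; double gpuMs; };
    const Scene scenes[] = {
        { "empty tunnel",         14.0 },
        { "countryside roaming",  19.0 },
        { "rooftop crowd fight",  32.5 }, // the case that actually sets the settings
    };
    for (const Scene& s : scenes) {
        std::printf("%-22s %.1f ms, %3.0f%% of budget used\n",
                    s.name, s.gpuMs, 100.0 * s.gpuMs / budgetMs);
    }
    // You tune for the last row; the first two rows are the "wasted TFs".
}
```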
 
Besides, any console game will only "max out" the hardware in worst-case situations, in order to keep minimum frame rates above 30/60 fps. This means that 60-80% of the time a lot of the CPU and GPU power is left unused.

Like in the case of Unity, whenever you'd be roaming the deserted countryside or walking in an underground tunnel on your own, there'd be a lot of "wasted" TFs. But they have to design the engine and the various systems so that the game could still maintain a fluid framerate even when you're fighting a dozen soldiers on top of a building, in front of the entire city, with lots of citizens watching from the ground.

But do you really think that a dev working exclusively on PS4 would not target all the available hardware just because the game is CPU-limited? If they target the worst-case scenario for hardware like the PS4's and then port to X1, do you think we would see parity in all situations?
That is why we should wait and see whether it's 100% parity, or whether there are cases where the PS4 maintains a better FPS, or even tiny quality reductions that aren't noticeable without being pointed out (like how I never really noticed those different AO solutions when playing the game).
 