How should devs handle ports between consoles? *spawn

"Parity".

I would like to put forward a suggestion: with Kinect, powerful audio co-processors that could handle all kinds of stuff like binaural audio, and SDK-integrated cloud not being utilised by, like, anyone, Xbox One owners have lost more to "parity" than PS4 owners ever possibly could by not having the GPU (allegedly) pushed as hard as it could be.

Why aren't they out filling the internet with rage?

I'm starting to hate the word "p****y" - "the 'P' word" - because it's being used in such a narrow way and being applied to such a tiny, tiny subset of console and software characteristics.
 
I don't understand the idea that every game needs to be pushing these consoles to the maximum. This is a primary reason why development budgets have skyrocketed over the last eight years. Your average dev team back in the PS2 era was around 50-75 people, and sales of 500k were more than enough to turn a profit. I see no issue with developers using the X1 as the target spec. If it runs great on X1, it should be little issue for it to run on PS4.

This mentality of needing 20+ million dollars to make a game has led to publishers only funding proven IPs, and certainly sticking with genres that are proven to sell big numbers. Even though Destiny is a new IP, Activision did one heck of a job hyping the game before it ever hit store shelves. Do you honestly believe Destiny would have been such a hit if it hadn't had that prerelease hype? No, now that it's come out, the hype has been replaced by less-than-enthusiastic praise for the game. It went from being the next best thing to a very mediocre title. Not that it's bad, but the quality of the title did not justify the prerelease hype.

The mentality that every game has to be a technical marvel is going to result in an industry collapse at some point, or games' retail price will move up to $80+ to offset these costs. I suppose that's the real reason DLC is so popular: the development cost is insignificant compared to the core game, but the profits of DLC are huge.
 
"Parity".

I would like to put forward a suggestion: with Kinect, powerful audio co-processors that could handle all kinds of stuff like binaural audio, and SDK-integrated cloud not being utilised by, like, anyone, Xbox One owners have lost more to "parity" than PS4 owners ever possibly could by not having the GPU (allegedly) pushed as hard as it could be.

Why aren't they out filling the internet with rage?

I'm starting to hate the word "p****y" - "the 'P' word" - because it's being used in such a narrow way and being applied to such a tiny, tiny subset of console and software characteristics.

But that's partly MS's fault. You can't expect other developers to use "something" if your own internal teams aren't pushing it. MS's vision of cloud usage is pretty much as dead as Sony's vision of PS3-based Cell networks. As for the XB1's great audio capabilities - they would still only matter to the less than 1% of people with high-end audio systems. For the most part, gaming audio is still being played through typical TV speakers and headphones.
 
"Parity".

I would like to put forward a suggestion: with Kinect, powerful audio co-processors that could handle all kinds of stuff like binaural audio, and SDK-integrated cloud not being utilised by, like, anyone, Xbox One owners have lost more to "parity" than PS4 owners ever possibly could by not having the GPU (allegedly) pushed as hard as it could be.

Why aren't they out filling the internet with rage?

I'm starting to hate the word "p****y" - "the 'P' word" - because it's being used in such a narrow way and being applied to such a tiny, tiny subset of console and software characteristics.

The PS4 has an audio co-processor too, the power of the cloud does not belong to the Xbox One, and as far as gaming goes, the Xbox One's controller with no touch pad or sensors is holding back gameplay more than anything on the PS4 side.
 
But that's partly MS's fault. You can't expect other developers to use "something" if your own internal teams aren't pushing it.

Completely agree.

MS's vision of cloud usage is pretty much as dead as Sony's vision of PS3-based Cell networks.

Disagree here though. Cloud is coming on just fine. Drivatars and scalable hosting for servers and game-related services are adding a lot of value.

MS overhyped the cloud and misrepresented it to consumers. But if you look behind the spin and at the technology, you can see that it's part of a clear trend, and MS are leading here. 3rd parties are dragging their heels (understandably) and "parity" means games are stuck at the level of PS4 games - which means dedicated servers, maybe, if you're lucky.

I'm looking forward to Brotars in future persistent world shootbangers. I want to have my Halotar, with my actual, ugly head, running round maps and doing badly, and then picking some guy on the other team and just trying to run him over again and again and again, because I won't always be there to do it myself.

As for the XB1's great audio capabilities - they would still only matter to the less than 1% of people with high-end audio systems. For the most part, gaming audio is still being played through typical TV speakers and headphones.

Binaural works with ordinary stereo headphones. MS could have offered libraries to assist in adding this to games, and a translator to convert typical 5.1 output to virtual surround for headphone users in any game that didn't want to fully support binaural.

And just to top it off, they could have used Kinect to actually model each user's ears to make the effect more accurate. Link it to the user profile so it applied to all games and apps.

But this is getting OT, and is turning into me ranting about how much MS fucked up with their early platform plans.
 
The PS4 has an audio co-processor too, the power of the cloud does not belong to the Xbox One, and as far as gaming goes, the Xbox One's controller with no touch pad or sensors is holding back gameplay more than anything on the PS4 side.

Xbox audio hardware is better, the PS4 doesn't have SDK hooks to Sony's gaming-optimised cloud infrastructure (accessible at reduced cost), and while there are some nice additions to the PS4 pad, they hardly compare with Kinect.

If both platforms were fully exploited, the Xbox One would be doing a lot more things than it currently is, and they would have a lot of impact on the way games are played and experienced. MS fucked up, though.

When the entire subject of "parity" is reduced down to the number of pixels - which most of the complainants need someone else to count for them - things get very depressing.
 
Xbox audio hardware is better, the PS4 doesn't have SDK hooks to Sony's gaming-optimised cloud infrastructure (accessible at reduced cost), and while there are some nice additions to the PS4 pad, they hardly compare with Kinect.

If both platforms were fully exploited, the Xbox One would be doing a lot more things than it currently is, and they would have a lot of impact on the way games are played and experienced. MS fucked up, though.

When the entire subject of "parity" is reduced down to the number of pixels - which most of the complainants need someone else to count for them - things get very depressing.


Touch & IMU walk all over Kinect when it comes to controls - what are you talking about? And since Kinect is no longer standard, it shouldn't even be a part of this discussion. Unless you want to bring the PlayStation Camera & Move into this, and when it comes to control, Kinect falls way behind in functionality.
 
I think it's better to take a more objective view.

Kinect was standard at first. Games didn't use it in any meaningful way. "Parity". Now it's not standard, but it's still a more advanced device than anyone else is offering. But it's tied to the cheaper and less sophisticated PSEye in terms of what games will bother to actually do with it. "Parity".

So where have all the angry chimps hurling rage-turds into the internet "parity" shitstorm been?
 
I don't mind parity if the game is good on both.
My eyes are not bleeding when I go from a 1080p game to BF4 MP at 900p with a lot of aliasing.
I'm fine with that resolution, but I need a very stable 30fps, if not locked.
 
things get very depressing.

For you maybe.

It's not wrong that people demand/want more. A not-insignificant number of consumers demand higher IQ, and some couldn't care less. The important thing is that consumers have choice.

One thing that shouldn't happen (Alien: Isolation, for example) is sacrificing input consistency for the sake of IQ. Decisions like that are on the publishers and developers. When games have to play badly to look better, no one wins.
 
I think it's better to take a more objective view.

Kinect was standard at first. Games didn't use it in any meaningful way. "Parity". Now it's not standard, but it's still a more advanced device than anyone else is offering. But it's tied to the cheaper and less sophisticated PSEye in terms of what games will bother to actually do with it. "Parity".

So where have all the angry chimps hurling rage-turds into the internet "parity" shitstorm been?

Functionality is what matters, and Kinect isn't going to do much that can't be done with the PlayStation Camera / DS4 or Move - and in most cases they will perform better.
 
Last gen we were all fine with disparity

That worked both ways last gen. People are quick to say how last gen it was fine to have the PS3 versions not run as well, but 360 hardware was also frequently left idling to bring that gap closer, especially in the early years. For example, most early 360 games shipped with lots of RAM totally unused in order to maintain some semblance of parity with the PS3 version. I personally shipped a 360 game that left ~110MB of RAM totally unused on the 360. Frustrating, but there was no choice: you could only have so much disparity between versions before people would start to scream, more so last gen, where people were widely deceived by paper specs they didn't understand. Likewise, I was often vetoed in meetings for suggesting things to implement because they couldn't be implemented on the PS3 version.

In other words, last gen even 360 owners got screwed by parity. So it goes: gamers scream, gamers cry, gamers complain. They are quite literally an impossible-to-satisfy bunch, so you have to try to make the best of it and choose the path of least bitching and complaining.
 
That worked both ways last gen. People are quick to say how last gen it was fine to have the PS3 versions not run as well, but 360 hardware was also frequently left idling to bring that gap closer, especially in the early years. For example, most early 360 games shipped with lots of RAM totally unused in order to maintain some semblance of parity with the PS3 version. I personally shipped a 360 game that left ~110MB of RAM totally unused on the 360. Frustrating, but there was no choice: you could only have so much disparity between versions before people would start to scream, more so last gen, where people were widely deceived by paper specs they didn't understand. Likewise, I was often vetoed in meetings for suggesting things to implement because they couldn't be implemented on the PS3 version.

And people think this didn't happen last gen! :LOL:
 
I don't suppose you could just allocate more RAM for streaming purposes :?:

If the early games streamed assets... many of them didn't, because we didn't have any streaming system coded yet for the new machines. I know for one game we just allocated all that extra memory to the instant replay buffer on the 360 version, figuring that would be innocuous enough not to get gamers screaming. So the 360 version basically had a monstrously large instant replay buffer compared to the PS3 version.


And people think this didn't happen last gen! :LOL:

Yeah, the parity axe swings both ways; it doesn't discriminate. It can strike in subtle ways as well - I'm sure the non-HDD 360 SKU caused much grief by year 6, leaving the parity axe to strike both the PS3 and 360+HDD setups. Honestly, you aren't hugely concerned with it at the beginning of a gen because you are so busy just trying to ship the damn game, thanks to the joys of console development forcing you to rewrite everything from scratch when a new box comes out. It does get irritating, though, after that first game is out and you have more time to actually think about things you want to try and implement.
 
That worked both ways last gen. People are quick to say how last gen it was fine to have the PS3 versions not run as well, but 360 hardware was also frequently left idling to bring that gap closer, especially in the early years. For example, most early 360 games shipped with lots of RAM totally unused in order to maintain some semblance of parity with the PS3 version. I personally shipped a 360 game that left ~110MB of RAM totally unused on the 360. Frustrating, but there was no choice: you could only have so much disparity between versions before people would start to scream, more so last gen, where people were widely deceived by paper specs they didn't understand. Likewise, I was often vetoed in meetings for suggesting things to implement because they couldn't be implemented on the PS3 version.

In other words, last gen even 360 owners got screwed by parity. So it goes: gamers scream, gamers cry, gamers complain. They are quite literally an impossible-to-satisfy bunch, so you have to try to make the best of it and choose the path of least bitching and complaining.

Wow, makes me wonder how many Wii U ports of 360/PS3 games aren't using more than 512MB of memory. Pretty sad, but when Criterion said using the high-res textures was as easy as flipping a switch, it makes sense that a lot of memory was sitting there vacant. I suppose other developers just never bothered.
 
People here are talking about costs to justify parity. For a game like the new AC, is it so expensive to up the resolution or AA quality, or to use more advanced effects on the PS4 version that were already developed for PC?

I mean, the PC version will, at its higher settings, use more advanced effects. Why not use that already-developed, higher-quality work to exploit the extra power of the PS4?

I really don't understand.
 
I think function made a valid point.
Last gen, the PS3 had the edge on sound output, being able to output 7.1 channels.
This time, the Xbox has the advantage on sound.
 
People here are talking about costs to justify parity. For a game like the new AC, is it so expensive to up the resolution or AA quality, or to use more advanced effects on the PS4 version that were already developed for PC?

I mean, the PC version will, at its higher settings, use more advanced effects. Why not use that already-developed, higher-quality work to exploit the extra power of the PS4?

I really don't understand.


That's different from what I was saying. If they have already done all the work on those higher-quality assets but choose not to use them, even when it wouldn't take any extra work, then that is crap. But I don't see anything wrong with the X1 being the target spec, and that can still be a 1080p 60fps game. This allows the game to port easily to PS4, and both versions will run well. Targeting the PS4 for multiplats seems like it's going to create more work to get the X1 version up to snuff, or the X1 version ends up being subpar. I just don't think a developer should be under pressure to always push themselves to the limits. These are games, after all; there is nothing wrong with focusing on the gameplay and choosing a graphics setup that doesn't stress the developer out so much.
 