How should devs handle ports between consoles? *spawn

Parity shmarity. I really hope Assassin's Creed has some groundbreaking gameplay and AI, or else I'll be disappointed in Ubisoft yet again. If the CPU is their justification for 900p@30fps then they have set the bar higher for the game in my eyes. I will eagerly anticipate playing the game and seeing if they accomplish the lofty goal I've projected onto them.

At this point in time I wouldn't mind if the sales disparity between the PS4 and XB1 grows, making devs focus on the dominant machine to compete for that machine's game sales.
 
http://www.techradar.com/news/gamin...-industry-is-dropping-60-fps-standard-1268241

WTH is wrong with UBI? The foot goes further into the mouth... maybe something else.

This week, gamers accused Ubisoft of keeping the frame rate and resolution down to these numbers to avoid the PS4 having a graphical advantage, but Nicolas Guérin, World Level Design Director on Unity, told TechRadar that the decision was partly to give the game more of a cinematic gloss - though did admit that it was also tough to achieve.

"At Ubisoft for a long time we wanted to push 60 fps. I don't think it was a good idea because you don't gain that much from 60 fps and it doesn't look like the real thing. It's a bit like The Hobbit movie, it looked really weird.

Doesn't sound like they're CPU bound to me... :???:
 
You are definitely CPU bound for 60fps if you can't get your game code to run under 16ms.
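For reference, the frame budgets work out as follows (simple arithmetic, nothing platform specific):

```python
# Frame-time budgets: everything a frame needs (game code, rendering
# submission, etc.) has to fit inside this window, every single frame.

for fps in (30, 60):
    budget_ms = 1000 / fps
    print(f"{fps} fps -> {budget_ms:.2f} ms per frame")

# 30 fps -> 33.33 ms per frame
# 60 fps -> 16.67 ms per frame
```

So "under 16ms" really means fitting the whole CPU frame, not just the expensive bits, into roughly half the budget a 30fps game gets.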
 
It's important to have some context to the UBI comments.

They *could* have meant that AC running at 60 fps (with the appropriate graphic cutbacks to reach that 60fps target) looks less cinematic than the game running at 30fps with higher graphical options like more realistic shaders and lighting.

It can be difficult to get context across in press quotes.
 
That's the ever-present issue of managing PR in the modern (Poor) Information Age, and we're veering OT. Obviously we need to know what's really going on to discuss their choices, but please let's avoid discussing PR in this thread.
 
I do wonder why we even have a parity discussion this gen?

Last gen, the X360 multiplatform versions had, in 99.9% of cases, better framerates, higher resolutions and better effects... i.e. there was no parity last gen; PS3-only people lived with it.

Why this gen? Why is it important to devs to achieve parity when everyone already knows that the PS4 is much more capable than the X1?

Best option: just have the 900p/1080p situation... then all the PS4 warriors are happy, and X1 owners are happy as well, because they can reiterate that resolution doesn't matter/makes no difference anyway.

Problem solved...
 
For me, the problem is not about targeting 900p30; heck, I would be fine with 720p30. My problem is purely about parity. If (and this is a big if) the game comes out running exactly the same on X1 and PS4, that means one platform (if not both) is being held back. I don't think being CPU limited automatically makes the other resources useless. If that were the case, maybe we'd see a game so CPU limited that it can only utilize 6 CUs and doesn't even bother with ESRAM (because it's purely CPU limited, not bandwidth or GPU or anything else).

Edit: right now Ubi has only said that they are targeting 900p30. We don't even know if either version will have better effects or simply run better. They really should just keep quiet instead of trying to start a fire.
 
To me it's more important that the game is fun and entertains me than whether it maxes out the hardware. :)
 
I also prefer that the game is fun. But this isn't about the game itself; it's more about handling ports. Is it better to just target the lower-spec machine and leave it at that for the higher-spec one, or should they add something? Or maybe target the higher-spec one and turn off some of the stuff so it can run on the lower-spec one?
Maybe it's not a development problem but more of a business decision. If making a better version for one of the platforms won't generate extra revenue, then why pursue it? On PC you can just brute-force it, but on console the performance window is much tighter, so it takes extra time and obviously money to make sure everything runs properly after adding the extra stuff (or even removing stuff if it's a down-port).
 
Are you upset that Minecraft and Tetris are exactly the same?
 
Maybe it's not a development problem but more of a business decision. If making a better version for one of the platforms won't generate extra revenue, then why pursue it?

I would say almost all development issues are business decisions. ;)

The issue of being CPU bound is important, as both companies chose to spend their thermal budgets on GPU rather than CPU, both expecting that developers would leverage the GPU to supplement the CPU (the opposite of the PS3). Cerny said he expected GPGPU or compute to be something rolled into middleware, where companies would be able to extract value from doing the compute development work for developers. So at some point in the future supplementing the CPU will be a licensed feature rather than a developer time suck.

After a year the ESRAM seems to be coming along well as companies get used to working out the data access patterns needed to exploit its significant read/write abilities. When the consoles came out I assumed there would be a period when the PS4 held a substantial advantage until the XB1's architecture got worked out, and then, once compute became a thing, the PS4 would open up that advantage again, with the XB1 enjoying the benefits of compute but with fewer resources at hand. If you keep your resolution down, however, the ESRAM will have a bit more room for compute scratchpad work or the like, which may allow the XB1 to close the gap once again. All speculative of course, but there is a reason these systems are designed in similar ways to leverage the GPU (cache coherency and extra command buffers being major components, if I am not mistaken).
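Back-of-the-envelope numbers illustrate the resolution/ESRAM trade-off; the 32 MB ESRAM size is the only hard figure here, and the render-target layout (four full-screen 32-bit buffers) is just an assumed example, not any particular engine's setup:

```python
# Rough ESRAM budget: how much of the 32 MB is left over for compute
# scratchpad work at different resolutions. Four 32-bit full-screen
# buffers is an assumed layout for illustration only.

ESRAM_MB = 32

def esram_leftover(width, height, targets=4, bytes_per_pixel=4):
    """MB left in ESRAM after 'targets' full-screen buffers."""
    used = width * height * bytes_per_pixel * targets / (1024 * 1024)
    return ESRAM_MB - used

for w, h in [(1920, 1080), (1600, 900), (1280, 720)]:
    print(f"{w}x{h}: {esram_leftover(w, h):.1f} MB free")

# 1920x1080:  0.4 MB free
# 1600x900:  10.0 MB free
# 1280x720:  17.9 MB free
```

At 1080p a layout like that eats essentially the whole 32 MB, while at 900p or 720p there is meaningful room left over for scratchpad work.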

Added: While the XB1 has compute capabilities, I pretty much assumed it was at heart a DirectX processing machine, so that even if compute wasn't a big deal the XB1 would still run the kind of games that were written for the 360 and PCs, but very efficiently. Like Windows 8, the XB1 was very ambitious in trying to cover as many bases as possible with a single core resource, although it didn't have the legacy baggage of Windows to deal with.
 
I think the idea of GPU compute becoming the savior of these CPU lightweights is a bit of a dream. They are already pushing these GPUs pretty hard for graphics rendering operations, so I kind of doubt developers are going to sacrifice those cycles, seeing as how they would be sacrificing the graphics one way or another.

As for the X1 and developers getting the hang of the ESRAM, it's probably got some truth to it, seeing as how you could argue that the X1 has underperformed since launch. I still feel it's foolish to believe the gap can be completely closed; at some point you have to concede that the extra CUs and ROPs in the PS4 are going to bring extra performance that the X1 can't match.

Ubisoft's PR is getting pretty hilarious. We don't want 60fps? Are we supposed to believe that? We are playing the games, not watching a cinematic movie. If you want to make the cut scenes 30fps, go right ahead, but gameplay is better at 60fps. I honestly think they are just prepping gamers for the future. Developers are constantly trying to wow gamers with awesome visuals, and seeing as how 1080p 60fps is far more taxing than 900p 30fps, you're going to see a lot of developers opt for fewer pixels, better shaders, and fewer frames. Let's face it, doubling the framerate doubles the load across the board, so 60fps is very expensive.
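To put rough numbers on that (pure pixel-count arithmetic; it ignores geometry, CPU work and anything else that doesn't scale with resolution):

```python
# Rough pixel-throughput comparison between the two targets mentioned
# above: shaded pixels per second only.

def pixels_per_second(width, height, fps):
    return width * height * fps

target_1080p60 = pixels_per_second(1920, 1080, 60)
target_900p30  = pixels_per_second(1600, 900, 30)

print(f"1080p60: {target_1080p60 / 1e6:.0f} M pixels/s")
print(f"900p30:  {target_900p30 / 1e6:.0f} M pixels/s")
print(f"ratio:   {target_1080p60 / target_900p30:.2f}x")

# 1080p60: 124 M pixels/s, 900p30: 43 M pixels/s -> ~2.88x
```

So 1080p60 pushes close to 3x the pixels per second of 900p30, which is why "better shaders at fewer pixels and frames" looks so attractive to developers.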
 
I think the idea of GPU compute becoming the savior of these CPU lightweights is a bit of a dream. They are already pushing these GPUs pretty hard for graphics rendering operations, so I kind of doubt developers are going to sacrifice those cycles, seeing as how they would be sacrificing the graphics one way or another.
Asynchronous compute is about using unavoidable GPU down-time. No matter how complex the graphics, there are spells where the rendering has a lull and the GPU is twiddling its thumbs. Async compute fills those holes with work. There's definitely a good couple of hundred GFLOPS available there for a lot of titles, although that may be purposed towards compute-based graphics as much as fancy AI or physics.
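For a sense of scale (the idle fractions are assumptions for illustration; the peak figures are the commonly quoted ones for each console's GPU):

```python
# Rough scale of what async compute can reclaim. The 10-20% idle range
# is an assumption for illustration; the peaks are the commonly quoted
# 1.84 TFLOPS (PS4) and 1.31 TFLOPS (XB1).

PEAK_TFLOPS = {"PS4": 1.84, "XB1": 1.31}

for console, peak in PEAK_TFLOPS.items():
    for idle in (0.10, 0.20):
        reclaimed_gflops = peak * 1000 * idle
        print(f"{console}: {idle:.0%} idle -> ~{reclaimed_gflops:.0f} GFLOPS for compute")

# PS4: 10% idle -> ~184 GFLOPS, 20% -> ~368 GFLOPS
# XB1: 10% idle -> ~131 GFLOPS, 20% -> ~262 GFLOPS
```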
 
I think the idea of GPU compute becoming the savior of these CPU lightweights is a bit of a dream. They are already pushing these GPUs pretty hard for graphics rendering operations, so I kind of doubt developers are going to sacrifice those cycles, seeing as how they would be sacrificing the graphics one way or another.


But in a situation where the Xbox One & PS4 versions run at the same resolution & frame rate, the PS4's GPU should have a lot of free time for compute. About 40% more GPU processing power free.
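For reference, that ~40% figure falls straight out of the commonly quoted peak compute numbers:

```python
# Where the "about 40% more" figure comes from, using the commonly
# quoted peak compute numbers for each GPU.

ps4_tflops = 1.84   # 18 CUs @ 800 MHz
xb1_tflops = 1.31   # 12 CUs @ 853 MHz

print(f"PS4 advantage: {(ps4_tflops / xb1_tflops - 1):.0%}")  # ~40%
```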
 
They are already pushing these GPUs pretty hard for graphics rendering operations,

Using essentially the same ways of doing things as before. New techniques may produce better results, although there may indeed be a sub-1080p resolution penalty for lots of compute. Maybe the bet on compute is a loser, but the die has been cast, so to speak :devilish:

Even if developers have little use for it in supporting the CPU, maybe over time Sony and MS will find a way to give back a core, or a large "fraction"* of one, for use by developers by optimizing the system OS with the help of the GPU. An extra core would be pretty sweet.

* the cores are single-threaded, so I would assume a slice of core time would be made available to developers
 
Hmm. In light of recent events, I am now heavily invested in the parity camp.

No one is knowledgeable enough to know where each game hits its absolute optimum point; we should leave that to the developers. The masses are under the impression that the PS4 holds a 44% graphical horsepower advantage over the X1; in some cases it might, in some cases it won't. But they hold it as fact, like a witch hunt generally applied to all cases and scenarios.

Hypothetically, if there are scenarios (and eventually there will be) where the X1 is able to match or surpass the PS4 in graphics quality, PS4 owners will refuse to buy the game. Period. They vote with their wallets (see the recent DA:I and AC:U threads), because they expect (an expectation they set themselves, mind you) that their platform will always triumph over the X1.

So do developers start artificially downgrading the performance of the X1 version to win back PS4 sales and avoid bad press around their product? How does that work for X1 owners?

It's basically a forced slant towards the PS4 at this point in time. And the X1 crowd is no longer vocal about having the weaker visuals either. So it's going to be like this even if the X1 could compete. If it can't, it can't. But both platforms should improve together.
 