How should devs handle ports between consoles? *spawn

*ahem* Can we stick to the topic at hand, "How should developers handle ports", or not post at all?
 
I don't find a reality where the PS4 is easier to develop for that hard to believe. But I do find it hard to believe the XB1 would be considered "hard" to develop on. When I say "hard" I mean the normal "next-gen new hardware no one has ever seen before" hard.

I have a hard time believing that a bunch of fixed-function accelerators and eSRAM on top of GCN and x86 presents a more complex development environment than what was presented to developers last gen.

Probably the biggest thing facing developers is trying to deal with what is supposed to be a fixed hardware platform but which, during the pre-launch stage, is an ever-moving target for both the hardware and the development tools.

Games missing launch aren't a sign of difficult-to-develop-for hardware when it's a pretty normal phenomenon during that time period. The spring after launch ended up being a pretty good time for 360 gaming last gen, and it looks to be a good time for gaming on both the XB1 and the PS4 this gen.

Seems like a lot of us are so unsatisfied by the info given so far, and still so hungry for anything material, that the release of any little piece of trivial information ends up looking like a bunch of mangy starving dogs fighting over spoiled scraps.

Once the hardware and software tools stabilize I imagine porting will be as convenient as it has ever been. It's all GCN and x86, and I doubt a scratch pad on the GPU on one platform is going to throw cross-platform development into disarray.
 
Little things matter when you are developing large-scale software: if the linker on one platform takes 60 seconds longer to link your app, it severely impacts your ability to work on that platform. If the shader compiler is buggy, ditto; likewise if the debugger crashes, if the streaming file link is crippled, etc. etc. etc.

Hard doesn't necessarily need to refer to the hardware, I'm sure the ESRAM adds to the complexity, but it impacts such a small number of developers in a team that I wouldn't expect it to be a big hurdle. The bulk of the engineers on large teams have only a cursory knowledge of the hardware anyway.
 
It's all down to the current state of the tools and the API. If the API currently makes it tough to use the ESRAM (for whatever reason), then there are problems. And without it, the Xbone is essentially a 7770 with 68 GB/s shared with a CPU, and no AAA dev is targeting that level of spec for their games... hence the entire matter at hand, which devs have been telegraphing to us through rumors and articles for the last few months. So yeah, I'd say that makes things hard.
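Just to put rough numbers on that - all of these are assumptions for illustration, nothing measured - at 60 fps, 68 GB/s works out to barely over a gigabyte of traffic per frame, and a few full-screen passes over a fat deferred G-buffer already claim a noticeable slice of it before textures, geometry and the CPU get their share:

Code:
// Back-of-the-envelope bandwidth math. Every figure is an assumption
// for illustration, not a measured number.
#include <cstdio>

int main() {
    const double ddr3_gbs = 68.0;                             // shared CPU+GPU peak
    const double frame_budget_mb = ddr3_gbs / 60.0 * 1024.0;  // ~1160 MB per frame

    const double rt_mb = 1920.0 * 1080.0 * 4.0 / (1024.0 * 1024.0); // one RGBA8 target
    const double gbuffer_mb = rt_mb * 5.0;  // assume 4 targets + depth
    const int    passes = 4;                // assume fill + 3 full-screen read passes

    std::printf("budget/frame: %.0f MB, G-buffer traffic alone: %.0f MB\n",
                frame_budget_mb, gbuffer_mb * passes); // ~1160 MB vs ~158 MB
    return 0;
}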

For some perspective, this was a rumor floating around during E3:

Ravidrath said:
This is anecdotal from E3, but...

I've heard the architecture with the ESRAM is actually a major hurdle in development because you need to manually fill and flush it.

So unless MS's APIs have improved to the point that this is essentially automatic, the bandwidth and hardware speed are probably irrelevant.

For reference, the story going around E3 went something like this:

"ATVI was doing the CoD: Ghosts port to nextgen. It took three weeks for PS4 and came out at 90 FPS unoptimized, and four months on Xbone and came out at 15 FPS."

http://www.neogaf.com/forum/showpost.php?p=81321425&postcount=422
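For what it's worth, "manually fill and flush" would boil down to bookkeeping along these lines. This is a purely hypothetical sketch - every type and function name in it is invented for illustration, and it is not Microsoft's actual API - but it shows why the burden lands on the developer rather than the driver:

Code:
// Hypothetical sketch of manually managing a 32 MB scratch pad.
// All names invented for illustration; memcpy stands in for whatever
// DMA/move-engine mechanism the real hardware would use.
#include <cstdint>
#include <cstring>
#include <vector>

constexpr std::size_t kEsramSize = 32u * 1024u * 1024u;

struct Surface { std::vector<std::uint8_t> dram; }; // master copy in DDR3

std::vector<std::uint8_t> g_esram(kEsramSize);      // stand-in for on-die eSRAM

void fill_esram(std::size_t dst, const Surface& s) {
    std::memcpy(g_esram.data() + dst, s.dram.data(), s.dram.size());
}
void flush_esram(std::size_t src, Surface& s) {
    std::memcpy(s.dram.data(), g_esram.data() + src, s.dram.size());
}

void render_pass(Surface& rt, std::size_t& offset) {
    if (offset + rt.dram.size() > kEsramSize)
        offset = 0; // doesn't fit: the developer decides what to evict or split

    fill_esram(offset, rt);  // 1. "fill": stage the surface into fast memory
    // ... draw into the eSRAM copy at full bandwidth ...
    flush_esram(offset, rt); // 2. "flush": copy results back before reuse
    offset += rt.dram.size();
}

int main() {
    Surface color{std::vector<std::uint8_t>(1600 * 900 * 4)}; // a 900p RGBA8 target
    std::size_t cursor = 0;
    render_pass(color, cursor);
    return 0;
}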

The 360 probably had better tools comparatively at the time, and the main issue was fitting the framebuffer into 10 MB, which was largely handled by reducing framebuffer size, dropping AA, and/or lowering lighting precision to avoid tiling, while still achieving 720p or thereabouts.
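The 10 MB arithmetic is easy to reproduce. Assuming 32-bit color plus 32-bit depth (real setups varied), plain 720p fits, but any amount of MSAA blows the budget and forces tiling or one of the compromises above:

Code:
// EDRAM footprint at 720p: 8 bytes/pixel (RGBA8 color + 32-bit depth),
// multiplied by the MSAA sample count. Assumed layout, for illustration.
#include <cstdio>

int main() {
    const double edram_mb = 10.0;
    const double pixels = 1280.0 * 720.0;
    for (int msaa = 1; msaa <= 4; msaa *= 2) {
        double mb = pixels * msaa * 8.0 / (1024.0 * 1024.0);
        std::printf("720p %dxMSAA: %4.1f MB -> %s\n", msaa, mb,
                    mb <= edram_mb ? "fits" : "tiling needed");
    }
    return 0; // 1x: 7.0 MB fits; 2x: 14.1 MB and 4x: 28.1 MB need tiling
}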
 
Once the hardware and software tools stabilize I imagine porting will be as convenient as it has ever been. It's all GCN and x86, and I doubt a scratch pad on the GPU on one platform is going to throw cross-platform development into disarray.



Wouldn't it be a function of how locked down the platform is? If MS limits your access to the hardware and forces you to use their tools, but the tools are incomplete in some cases, then that would make developing games at launch harder. Could be a function of what many are hinting at - PS4 tools are much further along.

OTOH, if MS is limiting low-level access, that could mean we'll see new hardware sooner while maintaining backwards compatibility. In the long run it may turn out to be a trade-off that is worth it.
 
Wouldn't it be a function of how locked down the platform is? If MS limits your access to the hardware and forces you to use their tools, but the tools are incomplete in some cases, then that would make developing games at launch harder. Could be a function of what many are hinting at - PS4 tools are much further along.
Devs rarely go low level, especially for launch titles. That won't be an issue.

OTOH, if MS is limiting low-level access, that could mean we'll see new hardware sooner while maintaining backwards compatibility. In the long run it may turn out to be a trade-off that is worth it.
That's a discussion for another thread.
 
Devs rarely go low level, especially for launch titles. That won't be an issue.

That's a discussion for another thread.

My point is simply that if they are limited in the tools they can create due to platform restrictions, it could impact development. Purely speculation on my part.
 
I'm thinking that the CoD: Ghosts and Battlefield 4 devs might have made the better choice for the Xbox One ports, because the games would look even worse compared to the PS4 versions if they ran at the same resolution and the Xbox One port had to cut down on the graphics.
 
I'm thinking that the CoD: Ghosts and Battlefield 4 devs might have made the better choice for the Xbox One ports, because the games would look even worse compared to the PS4 versions if they ran at the same resolution and the Xbox One port had to cut down on the graphics.

My guess is that the devs presented the options to MS, and in the end MS decided to go with a lower res but good pixels - probably the best decision given the circumstances.
 
Or maybe higher resolutions were not possible because of other limitations - for example, not enough time to rework the G-buffer layout for the 32 MB ESRAM - so lowering scene detail or shading quality would not have solved anything.
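A rough calculation shows why a layout rework could be unavoidable. Assuming a conventional deferred G-buffer of four RGBA8 targets plus 32-bit depth - 20 bytes per pixel, though actual engine layouts vary - 900p squeezes into the 32 MB while 1080p does not:

Code:
// G-buffer footprint vs. the 32 MB eSRAM budget. The 20 bytes/pixel
// layout is an assumption for illustration; engines differ.
#include <cstdio>

int main() {
    const double bytes_per_pixel = 20.0; // 4x RGBA8 + 32-bit depth
    const double esram_mb = 32.0;
    const int res[][2] = {{1280, 720}, {1600, 900}, {1920, 1080}};
    for (const auto& r : res) {
        double mb = double(r[0]) * r[1] * bytes_per_pixel / (1024.0 * 1024.0);
        std::printf("%dx%d: %4.1f MB -> %s 32 MB\n", r[0], r[1], mb,
                    mb <= esram_mb ? "fits in" : "exceeds");
    }
    return 0; // 720p: 17.6 MB; 900p: 27.5 MB; 1080p: 39.6 MB
}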
 
Is PS4 for more pixels while Xbox One is meant for less with some enhancements? Both?

Edit: By "Both?" in the title, I mean that I wondered if both consoles were meant for the same graphics style or not. Wanted to write "meant for more pixels" in the title instead of just for most pixels but by doing so I went beyond the limits of the thread's title.
____________________________________________________________________________________________________

We all know the benefits of a superior resolution, but that's not the entire point of this post, so I will focus on cinema-like effects such as those used in Ryse.

This is a theory I developed after reading four articles - linked below - regarding graphics. I said this days ago, more or less, but the post was removed because it was off-topic in that particular thread, so I decided to create a thread on the matter.

I said then that, given that the Xbox One can't equal the PS4's resolution in every game, developers could try a different approach for each console, because both could shine in their own way.

I'm not gonna bother with complex terms. Anyway, my theory is that developers could choose to give PS4 games a resolution boost, while in the case of the Xbox One they could work at a lower resolution and add extra effects:

High-quality depth of field, dynamic range, motion blur, new AA technologies, and high levels of AF.

As pointed out by forumers here before, the question is: couldn't both consoles follow the same approach? In the same way as the Xbox One, the PS4 could drop its resolution.

The possibilities would be either both going with superior resolutions but fewer effects, or dropping the resolution a little and enhancing different areas of the image.

The thing about the second approach is: on the PS4, wouldn't the 32 ROPs be underutilised if you chose an inferior resolution, and wouldn't the CUs be as well? Excuse me if I am wrong.

Maybe on the PS4 the best approach would be trying for the best of both worlds: upping the res a little to utilise the ROPs and CUs, while using as many effects as possible at the same time thanks to the GDDR5 bandwidth.

When it comes to the Xbox One, I think the best approach would be trying to fit the framebuffer within the eSRAM in its entirety 100% of the time!

As long as you choose something along the lines of 720p, 800p, 900p - or a very carefully tuned 1080p - and apply a lot of effects, this approach could provide incredible graphics.

Even if you dropped the native resolution of a PS4 game to Xbox One levels, the Xbox One would have an advantage in its whopping 270+ GB/s of bandwidth - a theoretical maximum, if the framebuffer fits in the eSRAM - which would allow for some crazy effects. :smile2:
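For context, the 270+ GB/s headline appears to come from simply adding the DDR3 peak to Microsoft's revised eSRAM peak figure, which itself already assumes ideal simultaneous reads and writes. Peaks don't really add like that in practice, but the arithmetic behind the number is just:

Code:
// Where "270+ GB/s" likely comes from: summing two theoretical peaks.
#include <cstdio>

int main() {
    const double ddr3  = 68.0;  // GB/s, main memory (shared with the CPU)
    const double esram = 204.0; // GB/s, quoted eSRAM peak (ideal read+write)
    std::printf("combined theoretical peak: %.0f GB/s\n", ddr3 + esram); // 272
    return 0;
}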

My point is that on the PS4 developers could choose a higher resolution on a regular basis, while on the Xbox One they could keep a lower resolution that stays within the eSRAM limits, to take advantage of it at all times, plus some extra effects.

These are the aforementioned articles:

http://www.eurogamer.net/articles/2013-11-23-digital-foundry-vs-forza-motorsport-5

But as we've seen in the 900p presentation of a game like Ryse, there's more to producing an appealing end image than heightening the pixel count - though this inarguably goes a long way.
http://www.eurogamer.net/articles/digitalfoundry-next-gen-now-ryse-son-of-rome

Tech guru Timothy Lottes - then of Nvidia, now at Epic - presented an interesting theory about the difference in presentation between a Hollywood Blu-ray movie and a typical video game. His blog post - unfortunately - is now gone, but you can get the gist of the discussion in this Digital Foundry article, where Lottes concludes:

"The industry status quo is to push ultra-high display resolution, ultra-high texture resolution, and ultra sharpness. In my opinion, a more interesting next-generation metric is, can an engine on an ultra high-end PC rendering at 720p look as real as a DVD quality movie?"

The debate was interesting enough that even a Hollywood CG professional contributed:
"We do what is essentially MSAA. Then we do a lens distortion that makes the image incredibly soft (amongst other blooms/blurs/etc). Softness/noise/grain is part of film and something we often embrace. Jaggies we avoid like the plague and thus we anti-alias the crap out of our images," said Pixar's Chris Horne. "In the end it's still the same conclusion: games oversample vs. film. I've always thought that film res was more than enough res. I don't know how you will get gamers to embrace a film aesthetic, but it shouldn't be impossible."

Well, of all the games we've seen since then, Ryse is arguably the closest we get to a practical example of this theory - and it looks quite spectacular for much of its duration. Resolution doesn't drop all the way to 720p - Crytek chose 1600x900 - but the overall look is very cinematic, from film grain to motion blur to the immense levels of post-processing and pitch-perfect effects work. Ryse works at a sub-native resolution where others flounder partly because the anti-aliasing is quite sublime
Tech Focus: Game Graphics vs. Movies



http://www.gamesindustry.biz/articles/digitalfoundry-tech-focus-does-pixel-count-matter

An interesting discussion kicked off on the blog of NVIDIA's Timothy Lottes recently, where the creator of FXAA (an anti-aliasing technique that intends to give games a more filmic look) compared in-game rendering at 1080p with the style of visuals we see from Blu-ray movies.

"The industry status quo is to push ultra-high display resolution, ultra-high texture resolution, and ultra sharpness," Lottes concluded.

Do 1080p games super-sample compared to Blu-ray movies? Is the current focus on high contrast, high detail artwork the right approach for a more filmic next-gen?


"In my opinion, a more interesting next-generation metric is can an engine on an ultra high-end PC rendering at 720p look as real as a DVD quality movie? Note, high-end PC at 720p can have upwards of a few 1000s of texture fetches and upwards of 100,000 flops per pixel per frame at 720p at 30Hz."

Comparing screengrabs of a game (Skyrim running with a super-sampled anti-aliasing hack) with the Robert Downey Jr Iron Man movie, the NVIDIA man reckons that even at native 1080p with no MSAA, game rendering is still effectively super-sampling compared to the quality we see in theatrical presentations, and maybe game developers could pursue a more filmic look using fewer pixels in concert with other processing techniques.


Lottes noted that there is little or no single pixel-width detail in 1080p Blu-ray movies, as we can see in spades in ultra-precision PC presentation, suggesting that the same level of detail can be resolved in gaming without recourse to a 1080p framebuffer - or else utilising 1080p with a lot of filtering that gives the illusion of a lower resolution.
http://www.eurogamer.net/articles/digitalfoundry-the-complete-xbox-one-interview

We've chosen to let title developers make the trade-off of resolution vs. per-pixel quality in whatever way is most appropriate to their game content. A lower resolution generally means that there can be more quality per pixel. With a high-quality scaler and antialiasing and render resolutions such as 720p or '900p', some games look better with more GPU processing going to each pixel than to the number of pixels; others look better at 1080p with less GPU processing per pixel.
Game developers are naturally incented to make the highest-quality visuals possible and so will choose the most appropriate trade-off between quality of each pixel vs. number of pixels for their games
 
I don't think your post is going to get an awful lot of new answers. The PS4 isn't doing higher resolutions because its architecture is inherently better at them than the XboxOne's. I think it purely comes down to the PS4 being easier to work with, due to being closer to what is found in a PC: one big pool of memory with high bandwidth. In the end, in a multiplatform game, it's probably easier to focus on that and drop resolution on the XboxOne to hit framerate targets. And while we're on topic, I'm also not sure the CU advantage on the PS4 will yield much better visuals [in a multiplatform game], for the same reasons.

In an exclusive-game environment, sure - games will target each platform's strengths - but as with every generation before, these games usually tend to do things completely differently, making them hard to compare.
 
I don't think your post is going to get an awful lot of new answers. The PS4 isn't doing higher resolutions because its architecture is inherently better at them than the XboxOne's. I think it purely comes down to the PS4 being easier to work with, due to being closer to what is found in a PC: one big pool of memory with high bandwidth. In the end, in a multiplatform game, it's probably easier to focus on that and drop resolution on the XboxOne to hit framerate targets. And while we're on topic, I'm also not sure the CU advantage on the PS4 will yield much better visuals [in a multiplatform game], for the same reasons.

In an exclusive-game environment, sure - games will target each platform's strengths - but as with every generation before, these games usually tend to do things completely differently, making them hard to compare.

The bolded parts are to illustrate what I see as a contradiction in your logic.
I would say a big pool of memory, with even higher bandwidth is the definition of "inherently better hardware for higher resolutions".
 
What I meant to say is that there is a difference between *better* and *easier*. I don't think the issue is that one platform is better at higher resolutions than the other, but that one happens to be closer to the PC hardware benchmark while the other requires a bit more thinking outside the box to get the same results.

If the main development platform is a PC setup revolving around a GPU with high-bandwidth RAM then, of course, porting to a very similar platform [PS4] will yield more predictable results than going to a platform that requires a slightly different concept. These games have been rushed to some degree to get ready for launch. Of course, it might also be that the PS4 is more suitable for higher resolutions, but I don't think that's the limiting factor in *this* case with these games.

I see it as similar to the PS3/Xbox 360 situation, but reversed. There, the Xbox was the easier platform due to its large unified RAM, while the PS3 split its memory into two pools. Not a problem if you have devs with the time and money to work around these things, but in a multi-platform environment it usually leads to the more complicated version taking a back seat and offering the slightly more compromised experience, as was often the case with PS3 ports.
 
The bolded parts are to illustrate what I see as a contradiction in your logic.
I would say a big pool of memory, with even higher bandwidth is the definition of "inherently better hardware for higher resolutions".
The architectures are fundamentally the same. Any advantages the PS4 may have with regard to higher resolutions work just as well in providing better pixel quality at a lower resolution. It is not the case that XB1 == better-quality pixels, PS4 == more pixels, by design.
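To put numbers on that, using the commonly quoted peak TFLOPs figures and crudely assuming shading work scales linearly, the raw GPU gap and the 1080p-vs-900p pixel gap come out to almost the same ratio, so the same advantage can be spent on either axis:

Code:
// Pixels x per-pixel cost is bounded by GPU throughput at a fixed frame
// rate. TFLOPs are the commonly quoted peaks; real scaling is never this clean.
#include <cstdio>

int main() {
    const double ps4_tflops = 1.84, xb1_tflops = 1.31;
    const double px_1080p = 1920.0 * 1080.0;
    const double px_900p  = 1600.0 * 900.0;
    std::printf("raw GPU ratio:     %.2fx\n", ps4_tflops / xb1_tflops); // ~1.40x
    std::printf("1080p/900p pixels: %.2fx\n", px_1080p / px_900p);      // 1.44x
    // Nearly identical ratios: the same headroom could instead buy ~1.4x
    // more shading work per pixel at a matched resolution.
    return 0;
}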
 
What if MS demands at least equality?


Starting to look like that might be the plan.

http://www.handytips.info/2177/phil-spencer-is-making-sure-destiny-hits-1080p-on-xbox-one/


Bungie have revealed that head of Xbox Phil Spencer wants them to make sure that their Xbox One version of Destiny hits a 1080p resolution.

Whilst the PlayStation 4 beta and alpha of Destiny were easily hitting a 1080p resolution, when the Xbox One beta was released it was 900p. Microsoft want the same higher resolution for the game’s retail release on their platform.

Bungie told MCV “Phil Spencer is a great friend of ours, and has been putting great effort into making sure that the Xbox One edition of Destiny hits 1080p and is a great experience.”

This news is likely to further provoke discussion about whether Phil Spencer is pushing developers to implement 1080p resolution into the Xbox One edition of their games at the expense of performance. Earlier this week it was revealed that Microsoft on seeing a pre-release build of Diablo III with a 900p resolution said “This is unacceptable. You need to figure out a way to get a better resolution.” and pressured them to change it to 1080p which Digital Foundry’s analysis said impacted frame rate. In the aftermath of the furore Phil Spencer commented on Twitter “I just thought we could get to 1080p. If Blizzard thought it wasn’t right for Diablo they had the call on what shipped.”

Bungie also spoke about working with Sony on the PlayStation 4 before it was even released:

“Sony has been a fantastic partner with us. It’s been wonderful that we’ve been able to take part in the development of their hardware, it has gone beyond expectation in term of co-developing to ensure that our ambitions are possible on their platforms.”

However they haven’t forgotten the loyal Xbox fans who have previously invested financially and emotionally in their games:

“We’re not turning our back on our great relationship with Xbox.”

Source: MCV issue 801, Page 23. Friday 22nd August
Interview conducted with Bungie Lead Concept Artist Jesse Van Dijk and Director of Production Jonty Barnes.
 
I like how the quote from the Digital Foundry article has now become words that Phil Spencer actually said. Those are the words of John Hight of Blizzard, paraphrasing or simplifying what might have been a very long conversation.

This is the real quote from Digital Foundry, in context:

"We did find it challenging early on to get it to 1080p. That's why we made the decision to drop to 900. That's what we demoed and were showing around E3 time. And Microsoft was just like, 'This is unacceptable. You need to figure out a way to get a better resolution.' So we worked with them directly, they gave us a code update to let us get to full 1080p."


I don't like the idea of Microsoft pressuring developers to hit 1080p just to hit that number. It would be incredibly stupid if they didn't give the devs the final say, in any case. It would be an easy way to strain relationships.

This quote: "Bungie told MCV “Phil Spencer is a great friend of ours, and has been putting great effort into making sure that the Xbox One edition of Destiny hits 1080p and is a great experience.” That could easily be interpreted as Microsoft putting in effort to improve their tools, development kit and resources to help devs get better performance.
 
I like how the quote from the Digital Foundry article has now become words that Phil Spencer actually said. Those are the words of John Hight of Blizzard, paraphrasing or simplifying what might have been a very long conversation.

"Bungie told MCV “ not Bungie told Digital Foundry, so I don't see how its being misquoted.
 