How should devs handle ports between consoles? *spawn

As for the last reply: I'm not sure I fully understand what we are attempting to calculate. 40 FPS works out to 25 ms per frame, and the monitor refreshes roughly every 16 ms, so wouldn't that be about 7 ms of lag for the next frame, then it would line up perfectly the following frame, and then back to about 7 ms of lag for the frame after that? Something similar would occur at 12 FPS: a 5 ms lag, followed by (add 83 ms, then mod 16) another 6 ms lag?

I do not think this is the case. If you have a 17 ms render time you will only miss one display update (if the display updates at 60 Hz). If you have a 25 ms render time you will first miss one update, but then you have a 16.7 + 8.3 ms window to hit the next display update. So you hit 2/3 of the display updates, but every third refresh will show a duplicate of the previous frame.

I think this is correct....
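
To make the timing concrete, here is a minimal sketch (plain Python, added for illustration rather than taken from any post, and assuming the model described above: a 60 Hz display, with the next frame starting to render as soon as the previous one finishes). It steps through display updates and reports which ones receive a new frame for the 17 ms and 25 ms render times discussed here.

```python
# Minimal v-sync timing sketch. Assumptions: 60 Hz display, and the next frame
# starts rendering as soon as the previous one finishes (so the model is only
# meaningful for render times at or above the refresh interval).

REFRESH_MS = 1000.0 / 60.0  # ~16.7 ms between display updates


def simulate(render_ms, updates=9):
    """Return (update time, wait) for updates that show a new frame,
    and (update time, None) for updates that repeat the previous one."""
    frame_done = render_ms  # the first frame finishes after one render time
    result = []
    for i in range(1, updates + 1):
        t = i * REFRESH_MS                      # time of this display update
        if frame_done <= t:
            result.append((t, t - frame_done))  # new frame; it waited this long
            frame_done += render_ms             # start rendering the next frame
        else:
            result.append((t, None))            # nothing ready: repeat last frame
    return result


for render_ms in (17.0, 25.0):
    print(f"render time {render_ms} ms:")
    for t, wait in simulate(render_ms):
        if wait is None:
            print(f"  update at {t:6.1f} ms -> repeats previous frame")
        else:
            print(f"  update at {t:6.1f} ms -> new frame ({wait:4.1f} ms after it finished)")
```

For 25 ms the output shows the pattern described above: two out of every three display updates get a new frame, and every third one repeats the previous frame.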
 
Don't know if this has been posted already, but there was an email read on the Giant Bomb podcast, allegedly from a developer on the AC Unity team, that said half of the CPU time was spent on setting up the graphics and lighting (mostly lighting, I think) and half of the Blu-ray was baked Global Illumination assets. The dev said that after the Xbox One gave resources back to devs there was only a 1 or 2 FPS difference between the consoles.

Apply salt to the email where needed, and patience to my post, in the very possible case that I am mangling things in translation.

So a question comes to mind: if this pre-baked GI approach is a thing, might we not see a bit more parity, or similar porting issues, wherever this particular method is being used? If the GI is enabled for all of the NPCs etc., that might also make Unity a bit of an outlier, though.
 

This?
http://v.giantbomb.com/podcast/Giant_Bombcast_10_14_2014-10-14-2014-1451096026.mp3

Go to 2:24 to hear the part about the email
 
I do not think this is the case. If you have a 17 ms render time you will only miss one display update (if the display updates at 60 Hz). If you have a 25 ms render time you will first miss one update, but then you have a 16.7 + 8.3 ms window to hit the next display update. So you hit 2/3 of the display updates, but every third refresh will show a duplicate of the previous frame.

I think this is correct....

Yeah, I think this is correct. I think overall we are agreeing that the latency added by V-sync is the frame time mod 16.7 ms.
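
A quick numeric check of that formula (plain Python, added for illustration, using the frame times from this exchange): the extra wait is the frame time modulo the ~16.7 ms refresh interval.

```python
# Extra v-sync wait on a 60 Hz display: frame time modulo the refresh interval.
REFRESH_MS = 1000.0 / 60.0  # ~16.7 ms

for frame_time_ms in (17.0, 25.0):
    wait_ms = frame_time_ms % REFRESH_MS
    print(f"{frame_time_ms:4.1f} ms frame time -> {wait_ms:4.1f} ms wait for the next update")
```

This gives ~0.3 ms for a 17 ms frame and ~8.3 ms for a 25 ms frame, matching the worked example above; as that example shows, later frames can line up better, so this is the wait for a frame that has just missed an update.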
 
Interesting comments on the Giant Bomb podcast from an AC dev. I'll link to NeoGAF (where they don't seem to understand what they're saying) as I'm on my phone and broadband-less again.

http://m.neogaf.com/showthread.php?t=913010&page=1

It appears that there are CPU-dependent, resolution-related elements in the lighting system. And as both consoles have basically the same CPU...

(sebbbi speculated about this kinda thing. Always, always listen to the sebbbi!)

It also explains the producer's comments about keeping the specs the same. Want the same lighting and number of AIs? Then you're going to get the same resolution. Parity Truthers: denied.

Interesting to see that MS have rolled back CPU reservations twice now, perhaps explaining the changing shape of CPU benchmarks on the platform.
 

Even if that's the case, the PS4 has 500 GFLOPS of GPGPU compute free that could have handled the lighting.
 
Isn't this weird? We just discussed a Ubisoft paper where they offloaded tasks from the CPU to GPGPU (the dancers)... do we know if they use GPGPU in AC Unity? Maybe they can free up CPU resources in a future iteration by porting stuff to GPGPU.
 
Even if that's the case, the PS4 has 500 GFLOPS of GPGPU compute free that could have handled the lighting.

And in time I'm sure that more of these things will get moved over to compute shaders. That takes time though. Time, money, blood, sweat, tears etc. Remember that this game has to run on all manner of GPUs, and that on PC fast CPUs are far more common than fast GPUs that are great at compute.

For the moment though this is simply a very ambitious title that's running on some fairly weak console CPUs.
 
Even if that's the case, the PS4 has 500 GFLOPS of GPGPU compute free that could have handled the lighting.

Obviously, they're doing their lighting on the CPU because they're just too stupid to move it to compute, and not because it's probably not possible at this point.
 
Isn't this weird? We just discussed a Ubisoft paper where they offloaded tasks from the CPU to GPGPU (the dancers)... do we know if they use GPGPU in AC Unity? Maybe they can free up CPU resources in a future iteration by porting stuff to GPGPU.

My guess is they obviously don't, or they would not have repeated countless times in those disastrous PR talks that their game's framerate was CPU-bound because of the NPCs.

But like you said, they'll probably do it in the future, though only in a generic and unoptimized way. Because of the Nvidia deal, and the fact that all their devs develop first on Nvidia GPUs and port the game to consoles later, I dread they may never fully exploit the GPGPU hardware in those AMD GPUs and will only use generic functions that are also available on Nvidia cards.
 
Ubisoft optimised their cloth compute for GCN, and talked specifically about the PS4.

No need to jump from one conspiracy to another!
 
Oh boy here comes a bloodbath..

http://www.reddit.com/r/pcmasterrace/comments/2j7r9j/delivering_ubisoft_came_to_my_school_for_a/

The Game Architect said that they aim for 60 fps but due to "limitations", they have to settle for 30 fps in recent games. He then implied that console makers are pressuring them into doing the same thing on PC.
(...)
The Game Architect said that on consoles, and for this type of game, they have to choose between graphical fidelity and smoothness. He implied that MS is making them lock the framerate on PC too. Then, he smiled, said "But our eyes can't see past 24 fps anyway" and winked at me.
(...)
They said they have to use Microsoft's cloud instead of the APU to do the AI in certain games due to hardware limitations.


Don't shoot the messenger. And please be civil.
 
Neither piece of information is verified, so you can't really admit them as evidence yet. As for the locked-at-30-fps one, I recall reading a longer version of that where it began with "MS and Sony want us to cap our games to 30 on PC", and later on it became just MS. Which is inconsistent, at the very least.
 
Second-hand comments from a schoolboy, interpreting winks during a recruitment presentation, posted on a "PC Gaming Master Race" subreddit. With MS trying to cap PC games at 30 fps.

Damning.
 
Come on people, don't be so stupid as to believe indirect, third-hand interpretations. You're now approaching the level of MisterXMedia in conspiracy theories. You should be better than that.
 
Would you seriously expect MS to say otherwise if it is true?

I have no clue whether that's correct, but a current (Intel-based) PC should run anything the current consoles run quite a bit faster...

And ffs, do not believe for a single second that it can't be true. (Doesn't mean it is, though, but if you think game dev is a carebear world you're in for a rude awakening...)
 
MS has already made it a significant point that they want to continue focusing on the PC market, so why would they all of a sudden decide to gimp that market and freely hand it over to Steam?
 
Would you seriously expect MS to say otherwise if it is true?
Of course not, but that's the mantra of the conspiracy theorists. Silence is interpreted as proof. Denial is interpreted as a cover.

What's the sense in MS trying to pull this stunt? 1) What pressure do they bring to bear? Threaten to refuse Ubisoft games on XB1? 2) Why?! What's the outcome? To make the XB1 not look as poor in the graphics department next to PCs? They could have been trying that trick for the past decade, but they haven't. PCs are superior, everyone knows it, and capping a framerate isn't going to change that. It'll just pee off PC users (who MS wants to turn into Xbox consumers with Windows 10), and we all know the truth will out when the mods unlock the framerate.

There's so little sense to the notion that, without decent proof, it's logically untenable.
 
Oh boy here comes a bloodbath..

http://www.reddit.com/r/pcmasterrace/comments/2j7r9j/delivering_ubisoft_came_to_my_school_for_a/

Why is only Microsoft mentioned? I thought Sony was a part of this conspiracy as well?

Anyhow, I can see Sony/MS wanting favorable results on certain AAA titles, but nothing on the level of paying developers to cripple PC titles. That being said, we have seen lots of multiplatform titles capped at 30 fps on PC for no apparent reason. Usually PC gamers just figure out a way to unlock the frame caps.
 