Assassin's Creed Unity [PS4, XO]

Funnily enough, even that difference matters in my recent experience. On our last big project we delivered cinematics for a console game at 30fps, and in some scenes it looked and felt noticeably different to me from 24fps. Just some simple camera pans and slow character movement, but it still mattered somehow.

However, I'm not sure about our theater room's tech specifics, i.e. what the projector's actual refresh rate is for 24fps and 30fps material, whether it adjusts or applies some pulldown... It's a relatively big screen with cinema seats, nice for dailies ;)

In a game setting, unless the whole game is 24fps, you will get pulldown if you do 24fps cinematics, unless you change the output mode, which takes a couple of seconds on most TVs.
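
(For anyone curious what that pulldown actually does, here's a minimal sketch, in Python and purely illustrative, of the 3:2 cadence a 60Hz display applies to 24fps material; real pulldown on interlaced output works on fields, but the uneven frame holds are the same idea.)

```python
# Illustrative 3:2 pulldown: each 24fps source frame is held for 3, then 2,
# output refreshes so that 24 source frames fill exactly 60 display refreshes.
def pulldown_32(source_frames):
    output = []
    for i, frame in enumerate(source_frames):
        repeats = 3 if i % 2 == 0 else 2      # 3, 2, 3, 2, ...
        output.extend([frame] * repeats)
    return output

frames = [f"F{i}" for i in range(4)]          # 4 frames = 1/6 s at 24fps
print(pulldown_32(frames))                    # 10 refreshes = 1/6 s at 60Hz
# ['F0', 'F0', 'F0', 'F1', 'F1', 'F2', 'F2', 'F2', 'F3', 'F3']
# The uneven hold times are what makes 24fps pans judder on a 60Hz set.
```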
 
And just a day back his producer was saying that if it were only about graphics they could go for 100fps, but because of the AI they can't go beyond 30, as the game is CPU bound. Not for cinematic purposes, but because the CPUs of these consoles cannot handle the AI.
Listening to these guys is pointless. They are just justifying what they have.

Just quoting for sanity's sake. Their producer already dropped the ball by saying they can't go beyond 30 due to CPU constraints and not because it is cinematic. This new PR talk is to cover up what ensued after the producer's moment of truth. It is meant to drown out what's already public. Frankly, I don't understand why they have to cover it up anyway. He said flatly that the machines' CPUs can't take any more.

And frankly, anyone who thinks a 30 fps RnC game feels better than 60 fps needs a checkup.
 
And you believe this is going to fly with PC gamers - um, good luck with that.

1) Why should a group of people (that I am also a part of) have any problem with my *opinions* on frame rates?

2) As a PC gamer I often choose 30fps over 60fps in slower-paced games if I can ramp up the visual fidelity. When I need responsiveness (racers, shooters, fighters) I will make whatever visual sacrifices are necessary to hit 60fps.

3) I probably didn't make it clear that I was talking about framerate trade-offs with respect to fixed architecture (i.e. consoles). On PC we usually have the choice but that choice is not easily accommodated, however much we'd like it, on consoles.
 
Just quoting for sanity's sake. Their producer already dropped the ball by saying they can't go beyond 30 due to CPU constraints and not because it is cinematic. This new PR talk is to cover up what ensued after the producer's moment of truth. It is meant to drown out what's already public. Frankly, I don't understand why they have to cover it up anyway.

So we should believe one UBI employee's bullshit over another UBI employee's?
 
Like Sigfried said, that's exactly what they are saying.

Obviously 30fps can be preferable to 60fps if it comes with twice the power to spend on each pixel, but they are saying 30fps is better than 60fps all other things being equal.

If that is genuinely what the UBI guy was saying then yes I totally agree with you, and I can't imagine many people contesting that.

However, I don't think he, or Ready at Dawn, meant that. This quote from the article:

TechRadar said:
"30 was our goal, it feels more cinematic. 60 is really good for a shooter, action adventure not so much. It actually feels better for people when it's at that 30fps. It also lets us push the limits of everything to the maximum."

The sentence "It actually feels better for people when it's at that 30fps." is obviously bull, but he does specifically say that he is not talking about an 'all things being equal' situation. He is talking about reducing framerate to 'push the limits'. This surely refers to AI and visual fidelity which have already been explained as the reason for the 30fps cap.
 
1) Why should a group of people (that I am also a part of) have any problem with my *opinions* on frame rates?

2) As a PC gamer I often choose 30fps over 60fps in slower-paced games if I can ramp up the visual fidelity. When I need responsiveness (racers, shooters, fighters) I will make whatever visual sacrifices are necessary to hit 60fps.

3) I probably didn't make it clear that I was talking about framerate trade-offs with respect to fixed architecture (i.e. consoles). On PC we usually have the choice but that choice is not easily accommodated, however much we'd like it, on consoles.

No, what I was getting at is that your view and UBI's reasoning for 30fps would not fly with PC gamers like myself. It's one thing to say your engine is capped on purpose for artistic reasons (cinema appeal)... but don't BS PC gamers into believing that your engine is limited to 30fps for artistic reasons. Because you and I damn well know UBI will not release AC:U on the PC at a locked 30fps.

After re-watching all the latest AC:U footage, I can honestly say the NPCs seem to do the typical stuff, like in the previous AC games, and the graphics so far need a lot of work. Like the stretched, flat textures across the rooftops... and some of the worst anisotropic filtering I have seen in a while.
 
No, what I was getting at is that your view and UBI's reasoning for 30fps would not fly with PC gamers like myself.

Yes, that's exactly what you said before. We're obviously not communicating our standpoints clearly enough.

What I am saying is that the Ubisoft guy is talking purely about consoles, and he is talking about targeting 30fps to reach the AI and visual fidelity targets they want within the consoles' fixed architectures.


Because you and I damn well know UBI will not release AC:U on the PC at a locked 30fps.

Mr. UBI's comments obviously have no bearing on PC gamers, as we balance resolution and effects to hit our own target framerates.

So maybe it's me being as thick as a whale omelette, but I don't see how any of this would 'not fly' with PC gamers, as it really has no relevance to us. It only affects the console versions of AC:U.
 
Only reason I can think of is that things rarely if ever seem to fly with PC gamers. You guys are a cranky bunch ;)

But yeah, that guy was clearly talking about consoles.
 
I wonder why if Ubisoft is complaining about the lack of CPU power, they don't use the Azure servers to offload some of the work, just like Titanfall does. At least on Xbox One they have that option.
 
I wonder why if Ubisoft is complaining about the lack of CPU power, they don't use the Azure servers to offload some of the work, just like Titanfall does. At least on Xbox One they have that option.

Parity? :LOL:
 
Some of those screens look more like a village than Paris.

When is the game supposed to take place?

By the second half of the 19th century, Paris had wide Haussmannian boulevards and big buildings.

Notre Dame has a lot of open space in front of it. Doesn't look like there were ever any smaller buildings in front of it.

Then again, the previous AC games' Florence didn't look that convincing either.
 
Some of those screens look more like a village than Paris.

When is the game supposed to take place?

By the second half of the 19th century, Paris had wide Haussmannian boulevards and big buildings.

Notre Dame has a lot of open space in front of it. Doesn't look like there were ever any smaller buildings in front of it.

Then again, the previous AC games' Florence didn't look that convincing either.

I think it is set during the Revolution, or during/after the Reign of Terror.
I believe they said they used old maps of Paris anyway.

P.S.
Venezia, Firenze and San Gimignano in AC 2 are not faithful reconstructions.
I always thought that Ubi's goal was just to give an "impression" rather than actually recreate them anyway.
 
About the parity "scandal" surrounding this game:

I can accept that the CPU is the limiting factor on AC:Unity. The CPU and GPU need to coordinate on every frame, and if the CPU, due to the several hundred NPCs on screen, limits the frame rate of the GPU to 30 fps, then we can accept that AC:Unity cannot go higher than 30 fps. No problem here.

But what does this have to do with resolution?

As I see it, as long as conditions do not change with a resolution upgrade (like view distance, number of elements on screen, etc.), all of the calculations that the CPU needs to do remain the same regardless of the resolution.
All the internal calculation the CPU has to do inside a game depends on what goes on in the game, and I don't see that as being resolution dependent.

So, as I see it, 30 fps at 720p, or 30 fps at 900p, or 30 fps at 1080p, if the GPU is not the limiting factor, are still 30 fps.

Here are some words on resolution upgrades from Peter Thoman, a modder who created a mod for Dark Souls that increases the graphics quality:

"While increasing the resolution only increases GPU load, increasing the frame rate also increases the CPU load significantly
So in cases which are CPU-limited (or limited on the GPU by some very specific factors), you might be able to increase resolution while not affecting (or only slightly affecting) frame rates."
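
To make that argument concrete, here's a toy frame-budget model (all numbers invented for illustration; this is a sketch, not AC:Unity's actual behaviour): CPU time scales with simulation work such as NPC count, GPU time scales with pixels shaded, and the frame is paced by whichever side finishes last.

```python
# Toy model of the CPU-bound argument (all numbers invented for illustration).
def frame_time_ms(npc_count, width, height,
                  cpu_ms_per_npc=0.06, gpu_ms_per_mpixel=10.0):
    cpu_ms = npc_count * cpu_ms_per_npc                   # AI/sim cost: resolution-independent
    gpu_ms = (width * height / 1e6) * gpu_ms_per_mpixel   # shading cost: scales with pixels
    return max(cpu_ms, gpu_ms), cpu_ms, gpu_ms            # frame paced by the slower side

for w, h in [(1280, 720), (1600, 900), (1920, 1080)]:
    total, cpu, gpu = frame_time_ms(npc_count=500, width=w, height=h)
    print(f"{w}x{h}: cpu={cpu:.1f}ms gpu={gpu:.1f}ms frame={total:.1f}ms")
# With 500 NPCs the CPU side costs ~30 ms at every resolution, so as long as
# the GPU stays under that, 720p/900p/1080p all land at the same CPU-limited
# frame rate -- which is exactly the point being argued above.
```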

It seems to me that, if the GPU is not the limiting factor (and Ubisoft claimed that it was not), 1080p at 30 fps is possible on PS4.

OK, on Xbox One it may be a limitation of forward rendering, since 900p is what can be fitted into the 32 MB of ESRAM. But not on the PS4!

PS: If this is not the proper place to discuss this, please direct me there. Thanks!
 
As I see it, as long as conditions do not change with a resolution upgrade (like view distance, number of elements on screen, etc.), all of the calculations that the CPU needs to do remain the same regardless of the resolution.
All the internal calculation the CPU has to do inside a game depends on what goes on in the game, and I don't see that as being resolution dependent.
You are correct that the resolution increase mainly affects GPU performance. However, most texture streaming algorithms calculate the needed texture mip levels precisely (using screen space texel density). 1080p will stream 44% more texture data from the hard drive compared to 900p. If the texture data is stored in a compressed format not directly supported by the GPU (such as zlib, lzx, jpg-xr, etc.), that data needs to be uncompressed (or transcoded) during loading. This step takes some CPU time (as variable length data decoding is often processed on the CPU). Obviously 1080p also needs 44% more memory to store the texture streaming working set.
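
A minimal sketch of both points (illustrative only, not any particular engine's streamer; the texture and coverage sizes are made up): the 44% comes straight from the pixel-count ratio, and the mip choice follows from screen-space texel density.

```python
import math

# Pixel-count ratio behind the "44% more texture data" figure.
print(1920 * 1080 / (1600 * 900))     # 1.44 -> 44% more shaded pixels at 1080p

def required_mip(texture_width_texels, onscreen_width_px):
    """Coarsest mip that still keeps roughly >= 1 texel per screen pixel."""
    ratio = texture_width_texels / max(onscreen_width_px, 1)
    return max(0, math.floor(math.log2(ratio)))

# A surface that covers 1.2x more pixels across at 1080p can cross a mip
# boundary, forcing the streamer to load the next (4x larger) mip level.
print(required_mip(2048, 950))              # -> 1 at 900p-scale coverage
print(required_mip(2048, int(950 * 1.2)))   # -> 0 (finest mip) at 1080p-scale coverage
```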

If you are using virtual texturing, 1080p will generate approximately 44% more page requests compared to 900p. This means that the page management requires a little bit more CPU time. Depending on what operations need to be done when a page is generated, this might actually cost some real CPU cycles. For example, in our case we render lots of decal triangles to the virtual texture pages, and this means more CPU work to set up the draw calls. On modern GPUs (with compute shaders) you can offload most of this work to the GPU (if you prefer to do that).
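
A toy illustration of what such a page request looks like (made-up page and virtual texture sizes, not a real VT implementation): each shaded pixel reports which page of the virtual texture it needs, and the CPU side services the deduplicated set of requests.

```python
PAGE_TEXELS = 128          # assumed page size, purely for illustration

def page_request(u, v, mip, virtual_size_texels=65536):
    """Which (page_x, page_y, mip) a pixel sampling UV (u, v) at this mip needs."""
    mip_size = virtual_size_texels >> mip
    return (int(u * mip_size) // PAGE_TEXELS,
            int(v * mip_size) // PAGE_TEXELS,
            mip)

print(page_request(0.37, 0.82, mip=3))   # -> (23, 52, 3)
# 1080p shades ~44% more pixels than 900p, so the feedback carries ~44% more of
# these entries; after deduplication the unique-page set (and the CPU work to
# load, decode, or render decals into those pages) grows accordingly.
```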

Some engines also use a screen-size-based LOD system (swapping LODs by pixel count) to maintain the same geometry density per pixel at all resolutions. This means that the triangle count depends on the screen resolution. This is mostly an extra GPU cost, but if the last LOD is visible out to a further distance when the resolution is higher, it also means that the CPU must set up more draw calls and cull more objects.
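
As a sketch of that last mechanism (invented FOV, distance and pixel thresholds; not taken from any actual engine): picking the LOD from an object's projected size in pixels means the same object at the same distance lands on a finer LOD when the vertical resolution goes up.

```python
import math

def projected_height_px(object_height_m, distance_m, vfov_deg, screen_height_px):
    # Height of the object on screen under a simple pinhole projection.
    return object_height_m / (2 * distance_m * math.tan(math.radians(vfov_deg) / 2)) * screen_height_px

def pick_lod(px, thresholds=(200, 80, 30)):   # invented pixel thresholds, LOD 0 = finest
    for lod, limit in enumerate(thresholds):
        if px >= limit:
            return lod
    return len(thresholds)                    # coarsest LOD

for height in (900, 1080):
    px = projected_height_px(object_height_m=2.0, distance_m=21, vfov_deg=60, screen_height_px=height)
    print(f"{height}p: {px:.0f}px -> LOD {pick_lod(px)}")
# 900p:  74px -> LOD 2
# 1080p: 89px -> LOD 1
# At 1080p the finer LOD stays in use out to a longer distance, which is extra
# GPU work but also more CPU-side draw call setup and culling, as noted above.
```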

These real-world examples show that some engines incur a (small) extra CPU cost when the resolution is increased. I don't know anything about this particular case, however.
 
That's really interesting, thanks sebbbi! I'd always assumed resolution was totally disconnected from CPU performance. I'll bear that in mind when choosing graphics settings in the future.
 