Digital Foundry Article Technical Discussion Archive [2014]

Status
Not open for further replies.
Or could be that theoretical improvement is under NDA and MS doesn't want the information spread to those who don't need to know.

Possibly, but a universal 'unlock' for all games is something you'd expect to be announced publicly; good publicity is good publicity, and they really needed it back then. If it were done on a case-by-case basis, where MS engineers were required to go on site to 'unlock' system resources, then I could see this as a possibility.
 
Then again, perhaps it's the power of the cloud. :p
 
our resident pixel-counter (but only to a certain threshold) is oddly absent...


But really, I don't think it has much to do with the hardware/allocation. That aspect has always been read into way too much.
It could just be a few common libs that have had more elbow grease put into them on the X1 than on the PS4.
 


Seems like the first game that may use the X1's 9% faster CPU clock and lower-latency DDR3 to its benefit. I was wondering about this.

My guess is that the benefit didn't exist before (we had not seen examples of games running slightly better on X1), but Microsoft has been streamlining the SDKs to the point where it does.

I see no reason to jump to a conclusion that MS unlocked another core when we've never heard a single rumor to that effect.
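A back-of-envelope sketch of what that 9% clock advantage could buy in a purely CPU-bound frame. The 1.75 GHz / 1.6 GHz figures are the widely reported console clocks; the assumption that work scales linearly with clock speed is mine, and real workloads rarely do:

```python
# Hedged back-of-envelope: assumes a purely CPU-bound frame and that
# CPU work scales linearly with clock speed.
X1_CLOCK_GHZ = 1.75   # Xbox One CPU clock (as reported)
PS4_CLOCK_GHZ = 1.6   # PS4 CPU clock (as reported)

speedup = X1_CLOCK_GHZ / PS4_CLOCK_GHZ  # ~1.094, i.e. the ~9% in question

frame_ms_ps4 = 1000 / 30          # 33.3 ms budget at 30 fps, assumed fully CPU-bound
frame_ms_x1 = frame_ms_ps4 / speedup  # ~30.5 ms under the same workload

print(f"speedup: {speedup:.3f}")
print(f"frame time: {frame_ms_ps4:.1f} ms -> {frame_ms_x1:.1f} ms")
```

Under those assumptions the clock difference is worth a little under 3 ms per 30 fps frame, which is enough to turn occasional dips into held frames but not enough to change resolution targets.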
 
*If* the system OS can run comfortably on one, sure. Do we have evidence of how the X1's OS performs on two cores, let alone one?

And once again, technical aspects aside, I am not a believer in silent patches this close to the most important three months of Xbox's year. There should have been some mention of it; Phil already said that the X1 updates would resume in 2015, and the next two updates through December will be minimal.

Well, the Windows 8.1 minimum requirements make no mention of a multi-core CPU. Just a 1 GHz CPU with PAE, NX, and SSE2 support.

That being said, I have serious doubts that unlocking a core for the game OS would have made it past the internet without leaking.

I wonder about the little accelerators behind the DMEs. The whole point was to offload work from the main processors, and maybe the XB1 is seeing a benefit from its CPU handling less work.
 
Seems like the first game that may use the X1's 9% faster CPU clock and lower-latency DDR3 to its benefit. I was wondering about this.

My guess is that the benefit didn't exist before (we had not seen examples of games running slightly better on X1), but Microsoft has been streamlining the SDKs to the point where it does.

I see no reason to jump to a conclusion that MS unlocked another core when we've never heard a single rumor to that effect.
Or perhaps it was simply that MS worked closely with Ubisoft to maximize optimization on this one, and Sony didn't do anything. I think that is very plausible, considering MS has been doing this a lot lately, and because of their marketing agreement.

Who knows.
 
How did they fuck up like that? Especially with the more powerful hardware?

I was on the fence about this, but the game dipping as low as 20fps makes this an easy no-buy for me.

What a weird November. I was looking forward to Batman 3 for the open world before discovering it had none. AC:U was on my radar and has technical issues. I wasn't remotely interested in COD:AW, yet I thoroughly enjoyed it.

At least there's still GTA V next Tuesday, and hopefully Far Cry 4 will be good too! Maybe I'll treat myself to that Magma Red DualShock 4 with the money saved.
 
Well, the Windows 8.1 minimum requirements make no mention of a multi-core CPU. Just a 1 GHz CPU with PAE, NX, and SSE2 support.

That being said, I have serious doubts that unlocking a core for the game OS would have made it past the internet without leaking.

I wonder about the little accelerators behind the DMEs. The whole point was to offload work from the main processors, and maybe the XB1 is seeing a benefit from its CPU handling less work.

The DMEs were supposed to relieve the GPU of additional burdens, except for the decompression hardware, for which the PS4 contains equivalent co-processors.

It's certainly possible the PS4 version's performance is simply CPU limited, but that still does not explain why the resolutions are equivalent, as a CPU limitation implies wasted GPU time on the PS4 side.
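The "wasted GPU time" point can be sketched with a toy model: a frame can't finish before the slower of the two processors does, so when the CPU is the bottleneck the GPU idles for part of every frame. All per-frame costs below are made-up numbers for illustration:

```python
# Toy model of a CPU-bound frame (hypothetical numbers).
def frame_time_ms(cpu_ms: float, gpu_ms: float) -> float:
    # The frame completes only when both processors have finished their work.
    return max(cpu_ms, gpu_ms)

cpu_ms, gpu_ms = 40.0, 25.0        # assumed per-frame costs
total = frame_time_ms(cpu_ms, gpu_ms)
gpu_idle = total - gpu_ms          # time the GPU spends waiting on the CPU

print(total, gpu_idle)  # 40.0 15.0 -> ~25 fps with 15 ms of idle GPU per frame
```

In that situation the idle GPU time could, in principle, have been spent on a higher resolution, which is why matched resolutions sit oddly with a pure CPU-limit explanation.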
 
What is it with Ubisoft right now? Just about everything they're publishing seems to have issues and really awful marketing. I'll definitely give AC:U a miss.

If they think that they would be hitting 100+fps with no crowd, then doesn't it make perfect sense to lower the population until you're hitting 30/60fps? Surely the AI on the NPCs isn't that noticeably different from previous AC games. It seems like such a bizarre design choice to have a large crowd to the detriment of performance and resolution.
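The "lower the population until you hit budget" idea amounts to a simple feedback loop. A minimal sketch, with an entirely made-up per-NPC cost model and made-up numbers throughout:

```python
# Sketch of "drop population until it fits the budget". The linear
# per-NPC cost model and all constants are assumptions for illustration.
TARGET_MS = 33.3      # 30 fps frame budget
BASE_MS = 20.0        # frame cost with no crowd (assumed)
COST_PER_NPC = 0.005  # ms of CPU time per crowd NPC (assumed)

def simulate_frame_ms(npcs: int) -> float:
    # Hypothetical measured frame time for a given crowd size.
    return BASE_MS + COST_PER_NPC * npcs

npcs = 5000
while simulate_frame_ms(npcs) > TARGET_MS and npcs > 0:
    npcs -= 100  # shed crowd in steps until we're back under budget

print(npcs, simulate_frame_ms(npcs))
```

Real engines do this dynamically per scene rather than as a one-off tuning pass, but the principle is the same: crowd size is one of the cheapest knobs to turn when the CPU is the bottleneck.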
 
I've been reading really positive impressions about this game as well though, on every platform, and the footage I've seen is still really beautiful. I look forward to trying this game myself. But as bgroovy said, it seems likely something has been left on the table. The game had a four-year development cycle, though, and so many people working on it that it is probably a miracle the game looks this next-gen at all. Heck, it's a small miracle it even exists.
 
It seems like such a bizarre design choice to have a large crowd to the detriment of performance and resolution.

It might be implicit in the quote - this game was made by multiple teams, so maybe...
- one team built a "Paris" that can run at 100+fps.
- another team built a crowd with a variety of clothing and AI to put into "Paris" that can run at 100+fps.
- other teams worked on quests/gameplay/animations etc.

I'm guessing that someone will eventually leak what really happened with AC:U.
 
It might be implicit in the quote - this game was made by multiple teams, so maybe...
- one team built a "Paris" that can run at 100+fps.
- another team built a crowd with a variety of clothing and AI to put into "Paris" that can run at 100+fps.
- other teams worked on quests/gameplay/animations etc.

I'm guessing that someone will eventually leak what really happened with AC:U.

I'd like to think that there might be one guy that occasionally looks at the metrics and goes: "Okay, we need a bit less of this and a bit more of that. Oh, the crowds are affecting performance that much?? Then let's drop the population to a point where it doesn't affect gameplay."

I dunno, maybe I'm oversimplifying things.
 
So a PS3, with a Cell processor doing heavy graphics work to help out a weak GPU, was able to sustain proper AI performance, yet an 8-core CPU that in theory should be doing less graphics work than that Cell is having trouble keeping up with AI? Seems totally weird to me. Do we have any CPU usage data from the PC version to back up the claim that it's CPU bound?
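One hedged way to frame why eight cores don't automatically keep up with AI: if part of the NPC update serializes on shared state (pathfinding queues, crowd avoidance against common data), Amdahl's law caps the speedup regardless of core count. The 40% serial fraction below is purely illustrative, not a measured figure for Unity:

```python
# Amdahl's law: speedup from parallelizing only the non-serial fraction
# of a workload. serial_fraction is an assumed, illustrative value.
def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# If 40% of the AI update were serial, six game-visible cores would give
# only about a 2x speedup over one core.
print(amdahl_speedup(0.4, 6))
```

So raw core count alone says little; how much of the crowd simulation actually parallelizes is the number that would settle the PS3-vs-8-core comparison.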
 
The DMEs were supposed to relieve the GPU of additional burdens, except for the decompression hardware, for which the PS4 contains equivalent co-processors.

It's certainly possible the PS4 version's performance is simply CPU limited, but that still does not explain why the resolutions are equivalent, as a CPU limitation implies wasted GPU time on the PS4 side.

What do you mean by "equivalent co-processors"? The ACP?
 
Since both versions are running at basically the same settings, shouldn't the PS4 have extra GPU CUs left over to help out the CPU?
 
I'd like to think that there might be one guy that occasionally looks at the metrics and goes: "Okay, we need a bit less of this and a bit more of that. Oh, the crowds are affecting performance that much?? Then let's drop the population to a point where it doesn't affect gameplay."

Yes, but I suspect the problem is that by the time someone saw the metrics for the project running on the target hardware (rather than the modules), it was too late.
- the demos of 'final game' with vast crowds running on state-of-the-art PCs had been shown to the media/public.
- the lighting/architecture of Paris had also been shown off.
- the whole game had probably been playtested on those high-performance PCs with those crowds.

I don't think that Ubisoft had a "plan B" - even delaying the game wouldn't seem to provide a good solution to anything.
 
I'd like to think that there might be one guy that occasionally looks at the metrics and goes: "Okay, we need a bit less of this and a bit more of that. Oh, the crowds are affecting performance that much?? Then let's drop the population to a point where it doesn't affect gameplay."

I dunno, maybe I'm oversimplifying things.

Perhaps, following the reception of the Watch_Dogs visual 'downgrade', Ubisoft dearly wanted to avoid a repeat and overreacted.
Ubisoft had been heavily publicizing the visuals and crowd behaviour of Unity.
 