Digital Foundry Article Technical Discussion Archive [2014]

Yes, there's a lot of good stuff in that article. I recommend it for everyone interested in knowing AMD GCN better.
If it's not too much trouble, how would you estimate the power of the Jaguar CPU compared to the X360 CPU? More than twofold?

Do you have a specific case where running a certain type of code on Jaguar resulted in an xx% performance increase compared to running it on Xenon?
 
Face-Off: Lego The Hobbit

http://www.eurogamer.net/articles/di...obbit-face-off

-----------------------

One: 1920x1080

PS4: 1920x1280, downsampled to 1920x1080

Does a 200-line vertical resolution advantage really offer that much more sub-pixel accuracy for AA? I mean, it's only a ~6:5 ratio vertically over 1080p, which isn't that much more sub-pixel info. Might it just be a way of burning excess cycles to sustain a refresh rate closer to 30Hz? I ask because the analysis seems to raise the issue of excess framerate 'hitching', which was an issue with CoD Ghosts on PS4 at launch.
 
Fair question. Could it be for forced parity? 1920x200 isn't insignificant: we're looking at 384,000 pixels, or 18.5% of a standard 1080p image. Rendering at the regular resolution would only increase the frame rate by about 5.5fps, but if hitching is already happening, then surely the fps would climb even further past 30.

Seems bizarre. Wouldn't MSAA be a better use of the resource?
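
A quick back-of-the-envelope in Python, assuming the game is purely fill-rate bound, i.e. frame time scales linearly with pixel count (a simplification; real frames rarely scale that cleanly):

# Extra work for 1920x1280 vs 1080p, and the frame rate the same
# GPU budget would buy at plain 1080p (linear-scaling assumption).
render = 1920 * 1280              # PS4 internal target: 2,457,600 px
native = 1920 * 1080              # 1080p output: 2,073,600 px

extra = render - native
print(extra, extra / native)      # 384000 px, ~18.5% more

fps_at_native = 30.0 * render / native
print(fps_at_native)              # ~35.6fps, i.e. roughly +5.5fps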
 
I'm getting a dead link on that post.

But regardless, for a game like Lego it's reasonable to assume that maximizing each platform is not among the team's design goals. Resources could be put elsewhere once the game reaches its target frame rate and resolution.
 
I guess so. At least the devs know there's room for improvement once the game is only on the PS4/Xbox One.
 
Batman's coldest enemy!

The developers really seem to have a lot of problems with the PS4/XB1 version of their 3D engine in the Lego games, here Lego The Hobbit:

On XB1 the game freezes (literally for a whole half-second) every minute or so, completely breaking the immersion. I noticed at least 4 freezes in the 5-minute gameplay video. If you watch the video the freezes are impossible to miss; the FPS also dips to 20fps-ish because of the averaging.

On PS4 there is some occasional judder: in the gameplay video the game in fact never dips under 30fps, but occasionally (every minute?) goes over 30fps to 32fps-ish for half a second, creating the judder.

It reminds me of the similar judder caused by going over 60fps in COD Ghosts pre-patch and over 30fps in NFS.

Finally, I find the supersampling on PS4 completely useless because of the heavy DOF. But the PS4 should still have won, not because of the mild supersampling, but because those freezes on XB1 really must be annoying and IMO completely break the immersion in the beautiful world of Tolkien. Some people will really complain about those freezes; they should patch the game on XB1.
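
For what it's worth, a minimal sketch of why a half-second freeze reads as a 20fps-ish dip rather than a flatline: counters like DF's average frames over a window, so the stall gets smeared across it (numbers below are illustrative, not taken from DF's data):

# A 30fps game that stalls for 0.5s inside a 1s averaging window.
target_fps = 30
window = 1.0                       # seconds the counter averages over
stall = 0.5                        # length of the freeze

frames = (window - stall) * target_fps   # frames actually delivered
print(frames / window)             # 15.0; a stall straddling two windows
                                   # reads higher, hence the 20fps-ish dip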
 
Shadows are the worst on the PC (a quarter of the resolution of the console versions), the PC is also lacking motion blur, plus there are a few other things missing from the PC version;
the PC is easily the worst but has the best framerate.

GOOD - they sorta mention what PC they used: an Intel Core i5 (no model given) and a GTX 680.
 
The team must have been under a lot of pressure lately to hit deadlines, with both the Hobbit (missed by a few months, even) and LEGO Movie games (the latter released around the theatrical date). It also wasn't too long ago that they released LEGO Marvel (3 games on major platforms within the span of 6 months).

I do wonder what changed on the graphics side of the engine, considering the games generally appear the same. Bizarre introduction of bugs/missing effects.
 
Sounds like a completely botched port. Thank god these issues don't affect the only recent Lego game that's actually worth playing - Lego Marvel Superheroes. Playing through that at the moment; absolutely brilliant and a very nice-looking game (especially in 3D).

Even if the Hobbit were the best Lego game ever released, I still wouldn't buy it, on the same principle that I avoid the movies. It's been deliberately split up to leech as much money out of consumers as possible. Lego LoTR was a single game and covered all 3 movies. How can they justify splitting the Hobbit game into two when the book was a quarter the size of LoTR? Same goes for the 3 films. It should have been 2 at most, 1 if they condensed the material as much as they did with LoTR.
 
Might it just be a way of burning excess cycles to sustain a refresh rate closer to 30Hz? I ask because the analysis seems to raise the issue of excess framerate 'hitching', which was an issue with CoD Ghosts on PS4 at launch.
I wonder if that's the same thing. AFAIK the consoles are only outputting video at 60Hz anyway, so I would expect that Ghosts was experiencing wildly variable game-time deltas from one frame to the next, not variance in the time between buffer flips; the latter is what DF claims is happening here.

Why is frame timing seemingly so difficult? Unless there's wildly uneven engine loading on a coarse scale, you'd think they could get great results by just double-buffer vsync'ing this stuff.
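
For what it's worth, here's a toy model in Python of how double-buffered vsync pacing would work for a 30fps game on a 60Hz display: present on every second vblank and the flip interval stays a rock-steady 33.3ms, as long as each frame's work stays under budget. Everything here is a hypothetical stand-in, not any engine's real API:

# Toy model of a vsync'd 30fps loop on a 60Hz display. Presenting on
# every second vblank gives a steady 33.3ms flip interval regardless
# of small workload variations; a frame that runs long slips two whole
# vblanks, which is exactly the kind of hitch DF measures.
import random

VBLANK = 1.0 / 60.0                 # 16.67ms between display refreshes

def frame_work():
    # Pretend renderer: 25-32ms of CPU+GPU work, comfortably under 33.3ms.
    return random.uniform(0.025, 0.032)

now = 0.0
for frame in range(5):
    finished = now + frame_work()
    # Flip on the next even-numbered vblank (i.e. every ~33.3ms):
    vblank = int(finished / VBLANK) + 1
    if vblank % 2:
        vblank += 1                 # skip odd vblanks to hold 30fps pacing
    flip = vblank * VBLANK
    print(f"frame {frame}: flip at {flip * 1000:.1f}ms")
    now = flip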
 
Yay, very nice. Gonna go pick up your game today. Will probably suck.
 