Digital Foundry Article Technical Discussion Archive [2014]

I really don't understand the notion that betas/alphas/demos shouldn't be analyzed... they're a glimpse into the current state (performance, IQ, etc.) of the game. Sure, things can change for the better... yet also change for the worse. And we've seen plenty of cases of both.
 
I don't think analysis of resolutions and frame rates from show floor demos is going away; Digital Foundry did this last year at E3 2013 with separate features for PlayStation 4 and Xbox One. I wasn't surprised to see it again.

I find it quite interesting to look at some titles pre-release and post-release; the differences can be dramatic. It just goes to show what a difference a few months of optimisation can make to a code base :yep2:
 
The God of War 3 analysis was pretty darn cool to see happen (in retrospect).
 
I remember the DF 'Making of' and 'How SSM Mastered PS3' articles for GoW3 but don't remember if there was pre-release analysis - that would be good comparative reading!

I wish that more teams would (or were allowed to) engage in such interviews.
 
[Screenshot, caption 1/41:] According to Turn 10's Dan Greenawalt, everything you're seeing here is being generated in-game. It's an absolutely beautiful game and the highlight of the Xbox One reveal at E3.

Did Digital Foundry ever confront Dan Greenawalt on this?
 
I remember that thing about exceeding 60fps in COD. How did DF manage to determine this with access only to the 60Hz output of the console? I didn't understand their clarification.
 
The PS4 Ghosts exceeding 60fps was a theory from when they observed judder at 60fps, which made it look like a frame was being skipped in between two 1/60th frames.
They've since stated that it's not [running faster than 60fps].

Update: Having reviewed the patch captures, there's an odd issue with tearing which caused some confusion, but we're fairly convinced now that in the revised version of the game at least, any judder is down to frame-rate drops and not the game running faster than 60Hz.
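
For anyone curious how this kind of analysis can even be attempted from a capture, here's a very rough sketch of the basic idea; this is my own illustration, not DF's actual tooling, and the lossless-capture input, numpy representation and difference threshold are all assumptions. You count how many captured frames are genuinely new versus duplicates of the previous one: duplicates mean the game ran below the 60Hz output rate, while a capture full of unique frames can never directly show more than 60fps, which is why anything above 60 has to be inferred indirectly from judder or tearing patterns.

Code:
import numpy as np

def estimate_frame_rate(frames, capture_hz=60, diff_threshold=1.0):
    """Estimate unique (newly rendered) frames per second in a fixed-rate capture.

    frames: sequence of 8-bit numpy arrays, one per captured refresh.
    """
    if len(frames) < 2:
        return float(capture_hz)
    unique = 1  # the first captured frame is always "new"
    for prev, curr in zip(frames, frames[1:]):
        # Mean absolute pixel difference; values at or below the threshold are
        # treated as the same image shown twice (a repeated frame).
        diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16)).mean()
        if diff > diff_threshold:
            unique += 1
    return unique / (len(frames) / capture_hz)

# Example: in a 120-frame capture where every 4th frame is a duplicate,
# this reports roughly 45fps (3 new frames out of every 4 captured).

The real analysis also has to cope with screen tearing and capture/compression noise, which is exactly why the Ghosts case was so ambiguous.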
 
I became paranoid about this kind of thing. :LOL:

Is it in-engine?
in-engine real-time?
in-engine real-time on the console?
in-engine real-time on the console in-game?
in-engine real-time on the console in-game during actual gameplay?
 
Did Digital Foundry ever confront Dan Greenawalt on this?


Don't shoot me on this one, but I recall reading somewhere that Forza 5 did render that in-game, but once they fitted in all the physics code and the additional cars, massive downgrades had to occur to meet the 60fps target for release.

The difference was apparently visible at some Jimmy Fallon show appearance.
 

I think a lot of it changed because of Azure/Thunderhead involvement... or the lack of it once F5 was released.

http://www.totalxbox.com/65749/forz...d-rely-on-xbox-ones-cloud-during-development/

Forza Motorsport 5 owes its 1080p visuals and 60 frames a second performance to Microsoft's furnishing Turn 10 with easy access to scalable servers, creative director Dan Greenawalt has told OXM. The ability to tap into Xbox Live Compute has allowed the studio to lavish more time and development resources on the game's visuals and complex physics systems; without this advantage, it would have had trouble achieving Forza 5's final resolution.
 

Article said:
Time saved creating online features has been spent on visuals and physics

To be clear, this isn't the same as saying that the game won't run as advertised without an internet connection, only that Microsoft has shouldered the burden of providing and tuning the servers, which has allowed Turn 10 to focus on maintaining 1080p.
Cloud servers had little to do with it.
 
It would be great if DF had a recurring feature where they interview developers, like Turn 10, present them with proof and just confront them. Why, you might ask?
Well, for one, it would be great journalism. And I believe that if Turn 10, in this case, realises that they can no longer hype up a game with essentially false advertising, they will become more honest and straightforward.

With the next mainline Forza game, they would state stuff like:
"The following realtime demonstration is running on hardware at about 2 times the spec of Xbox One; this is to facilitate the global illumination, as well as render thousands of spectators in realtime and provide cinema-quality motion blur.
We chose this method of demonstration to show our ambition, and we believe the final game will have a comparable atmosphere. We might switch the spectators out for cardboard-like 2D crowds, the global illumination will probably only be present in the TV ads, and as for the motion blur, it would detract from the gameplay too much."

Instead of "all of this is running, in realtime, on Xbox One. The final game will look even better," and people going completely crazy over the amazing graphics.
 
Instead of "all of this is running, in realtime, on Xbox One. The final game will look even better." and people going completely crazy about this amazing graphics.


Eh. There's a lot of fixing that needs to be done with marketing in general then, across the board. Quite frankly, my Big Macs never look like what's photographed, so you know what I'm trying to allude to.

Long story short, timelines need to be met. Given their time frame they still did something great. The game is -out-, while many games that look great are still in trailer form. It's a lot of work to push a game out, so burning developers IMO will always be wrong.

Anyway, it's off topic, but this type of thing is more expected with launch titles than with games released post-2016.
 
All I'm saying is that if you're gonna bitch about bullshots, you need to bitch at ALL developers that use bullshots, which includes pretty much all of them.


Now, shall we start on pre-rendered videos being masqueraded as "tech demos" or "target quality", or rendering at half resolution and calling it 1080p?
(no, 'cause it's pointless)
 
The next wave of games will include new techniques that provide a smoother frame rate. For example, a compute shader based particle system (bin + gather, no overdraw) is generally around 3x faster than a traditional pixel shader / alpha blend based particle system, but in the worst case (a nearby explosion filling the screen with extreme overdraw) the compute shader based particle system is over 20x faster. This will greatly reduce particle-related hiccups. GPU-driven rendering pipelines cut the draw call cost to almost zero, eliminating most of the fluctuation caused by varying visible object counts.

However, techniques like these require a complete rewrite of the existing (CPU / pixel shader based) rendering engines, and are not at all compatible with last-gen consoles or DX10 PCs. It will take some time.
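
To make the bin + gather idea a bit more concrete, here is a tiny CPU-side sketch of the concept. It's purely illustrative: a real implementation is a compute shader working per screen tile, and the tile size, particle fields and blending below are my own made-up choices, not anyone's actual engine code. The point it shows is why overdraw stops being the bottleneck: each framebuffer pixel is read and written exactly once, no matter how many particles cover it, because all the blending happens locally before the single write.

Code:
import numpy as np

TILE = 16  # screen-space tile size in pixels (illustrative choice)

def bin_particles(particles, width, height):
    """Bin each particle's screen-space bounding box into the tiles it touches."""
    bins = {}  # (tile_x, tile_y) -> list of particle indices
    for i, p in enumerate(particles):
        x0 = max(int(p["x"] - p["r"]), 0) // TILE
        x1 = min(int(p["x"] + p["r"]), width - 1) // TILE
        y0 = max(int(p["y"] - p["r"]), 0) // TILE
        y1 = min(int(p["y"] + p["r"]), height - 1) // TILE
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                bins.setdefault((tx, ty), []).append(i)
    return bins

def gather_tile(tile, bins, particles, framebuffer):
    """Blend every particle covering this tile, touching each pixel only once."""
    tx, ty = tile
    # Sort back-to-front once per tile so ordinary alpha blending is correct.
    order = sorted(bins.get(tile, []), key=lambda i: -particles[i]["depth"])
    for py in range(ty * TILE, (ty + 1) * TILE):
        for px in range(tx * TILE, (tx + 1) * TILE):
            color = framebuffer[py, px].copy()   # single framebuffer read
            for i in order:
                p = particles[i]
                if (px - p["x"]) ** 2 + (py - p["y"]) ** 2 <= p["r"] ** 2:
                    a = p["alpha"]
                    color = (1.0 - a) * color + a * np.array(p["rgb"])
            framebuffer[py, px] = color          # single framebuffer write

# Usage sketch (dimensions assumed to be a multiple of TILE):
width, height = 64, 64
framebuffer = np.zeros((height, width, 3), dtype=np.float32)
particles = [
    {"x": 20.0, "y": 20.0, "r": 12.0, "depth": 5.0, "alpha": 0.5, "rgb": (1.0, 0.4, 0.1)},
    {"x": 26.0, "y": 24.0, "r": 10.0, "depth": 2.0, "alpha": 0.5, "rgb": (1.0, 0.9, 0.2)},
]
bins = bin_particles(particles, width, height)
for tile in bins:
    gather_tile(tile, bins, particles, framebuffer)

Contrast that with the traditional approach, where every particle is its own alpha-blended quad and a screen-filling explosion means reading and blending into the same pixels dozens of times over, which is exactly the worst case described above.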

A very interesting and valid point, sir; thank you very much for the insight and confirmation of points I suspected.

The question is how much this engine split will affect not only the PS4/X1 architecture changes but also PC. With a GPGPU-driven render pipeline, the target for PC games will be pushed even further into the upper echelons of hardware, and for Nvidia users the cards are not on the level of compute-capable AMD cards. Could we see splits in release like The Division, where the console versions come first (or no PC version is announced) and the engine is then changed to suit, and vice versa?
 