Digital Foundry Article Technical Discussion Archive [2014]

My general impression of Xbox One was that it was not fully ready at launch on the software side of things. The hardware seems to be complete and without issues, but the OS, the development kits, the APIs and other tools all seemed to be in a poor state, and things have been changing rapidly. It's almost like they just slapped Windows and DirectX as-is (generalizing) on the thing, and have only been able to optimize it post-launch. A lot of assumptions were made about Direct3D being low overhead on Xbox One, but that doesn't appear to be the case. I wonder if they thought Direct3D 12 would be ready earlier, or something like that.

I have said this same thing in the DX12 thread. I was mocked and told I was wrong.
Who knows?
 
I found this pretty interesting from a PC gamer's point of view:



That makes it sound as though more future games are going to be tailored to the idiosyncrasies of GCN, which in theory would give that architecture a big leg up over Nvidia's in the future. Strange that we still don't seem to be seeing that in the games released so far. Metro Redux comparisons between AMD and NV on the PC should be very interesting in light of that interview, though.

Also this:



Is that really still applicable in a world where DX12 is about to bring close-to-the-metal programming to the PC, and developers are clearly making low-level optimizations to game code that will directly benefit x86 AVX-enabled CPUs and GCN GPUs?

Consoles will still have an advantage because of the closed box.
You know, only having to target one spec.
 
Plus on PS4, and probably X1 too, they will still be closer to the metal than DX12/Mantle/any PC API, I assume. Yes, in theory X1 will be DX12, but I bet in some areas it will go lower level, just like the 4A interview reveals X1 already did with regards to DX11.
 
Yes this benchmark also shows a strange swing in Nvidia's favour in the new version:

http://www.pcgameshardware.de/Metro-2033-PC-143440/Specials/Technik-Test-1131855/

How completely bizarre for a game that was previously not optimized for GCN and now is! I'd love to hear the devs' take on that one.

Makes you wonder "how much better" Watch Dogs would have performed on AMD GCN PC hardware and the consoles. This is nothing new, though...

I agree with Shifty from an earlier post: should PC developers be using more AMD hardware when thinking about future console ports? Especially when porting code over to SDKs that are AMD-based... which could/should cut down optimization times and make use of features native to GCN, not Nvidia.
 
Is that really still applicable in a world where DX12 is about to bring close-to-the-metal programming to the PC, and developers are clearly making low-level optimizations to game code that will directly benefit x86 AVX-enabled CPUs and GCN GPUs?
GCN stuff like lane swizzles can be really efficient vs the alternatives that you're required to use on PC. So, yes, it's still applicable.
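
To make that concrete for PC folks, here's a CPU analogy in C++/AVX2 - not actual GCN code; on GCN the equivalent would be cross-lane instructions like ds_swizzle, which the DX11-era PC APIs don't expose. A "lane swizzle" reorders data across SIMD lanes without a round trip through memory:

```cpp
// CPU analogy for a GPU lane swizzle: reorder data across SIMD lanes while
// staying in registers, versus the spill-to-memory alternative. Illustrative
// only; compile with -mavx2 (or /arch:AVX2).
#include <immintrin.h>
#include <cstdio>

int main() {
    alignas(32) float in[8] = {0, 1, 2, 3, 4, 5, 6, 7};
    alignas(32) float out[8];

    __m256 v = _mm256_load_ps(in);

    // Register-to-register "swizzle": reverse the eight lanes in one instruction.
    __m256i idx = _mm256_setr_epi32(7, 6, 5, 4, 3, 2, 1, 0);
    __m256  rev = _mm256_permutevar8x32_ps(v, idx);   // AVX2 cross-lane permute
    _mm256_store_ps(out, rev);

    // The memory-bounce alternative: spill, reorder with scalar code, reload.
    alignas(32) float tmp[8], out2[8];
    _mm256_store_ps(tmp, v);
    for (int i = 0; i < 8; ++i) out2[i] = tmp[7 - i];

    for (int i = 0; i < 8; ++i) printf("%.0f/%.0f ", out[i], out2[i]);
    printf("\n");
    return 0;
}
```

Same result either way, but the second path costs extra memory traffic and latency, which is the point being made about the required PC alternatives.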

I think there is a bit of exaggeration here. At any rate, the limiting factor in current consoles is the lowly CPU, which means that "2X performance" will be nowhere near even upper mid-range PC hardware, and that's before we factor in DX12.

The lowly CPU will not be the bottleneck across the board. Not even close.
 
Digital Foundry: In our last interview you were excited by the possibilities of the next-gen consoles. Now you've shipped your first game(s) on both Xbox One and PlayStation 4. Are you still excited by the potential of these consoles?

Oles Shishkovstov: I think what we achieved with the new consoles was a really good job given the time we had with development kits in the studio - just four months hands-on experience with Xbox One and six months with the PlayStation 4 (I guess the problems we had getting kits to the Kiev office are well-known now).

But the fact is we haven't begun to fully utilise all the computing power we have. For example we have not utilised parallel compute contexts due to the lack of time and the 'alpha' state of support on those consoles at that time. That means that there is a lot of untapped performance that should translate into better visuals and gameplay as we get more familiar with the hardware.

Can't wait until they're fully up to speed with the hardware... their next project will be awesome.
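
For anyone unfamiliar with the "parallel compute contexts" mentioned in the quote: the idea is a second, compute-only submission queue whose work the GPU can overlap with graphics. A minimal sketch of the concept using D3D12 host code (purely illustrative - not what 4A had at the time, and the console APIs differ):

```cpp
// Sketch of a "parallel compute context": a dedicated compute queue alongside
// the usual graphics queue, so independent compute jobs can overlap with
// rendering. Error handling trimmed for brevity.
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")
using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    // The "direct" queue accepts graphics, compute and copy work.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> gfxQueue;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));

    // A second, compute-only queue: work submitted here can be scheduled by
    // the GPU in parallel with whatever the graphics queue is doing.
    D3D12_COMMAND_QUEUE_DESC asyncDesc = {};
    asyncDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    ComPtr<ID3D12CommandQueue> asyncQueue;
    device->CreateCommandQueue(&asyncDesc, IID_PPV_ARGS(&asyncQueue));

    // Where the two streams share results, an ID3D12Fence orders them
    // (Signal on one queue, Wait on the other).
    return 0;
}
```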
 
Great interview, it sheds some more light on why the XB1 is coming off worse in CPU benchmarks; looks like virtualisation overhead (as suspected) or DirectX is the culprit here - it'll be interesting to see if driver improvements and lower-level access can see the CPU realise its 150MHz clock advantage over the PS4's.

Also curious is how the new low-level access fits in with the VMs and the original overall design. Was there always going to be a low-level option with DX as just a stop-gap, or has MS abandoned something like forwards compatibility in order to free up more performance for XB1 games?
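
(For context on the clock figure above, assuming the commonly reported 1.75GHz for Xbox One's CPU versus 1.6GHz for PS4's: that 150MHz works out to roughly a 9 per cent advantage, since 1.75 / 1.6 ≈ 1.09.)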

From what I know, all communication between the game VM and hardware was to go through the Host OS (which controls all access to hardware and inter-OS communication).

So I don't think they're letting games (or more accurately, the game VM) just talk directly to the hardware now; they're probably just letting the game issue instructions via the Host OS to the GPU/CPU as usual, but using a lower-level API than DirectX.
 
http://www.gamegpu.ru/images/stories/Test_GPU/Action/Metro_Last_Light_Redux/test/mtero_r_1920.jpg

Didn't they say they were going to improve the performance?
 
Just saw the article.

The TLoU porting approach from PS3 to PS4 is probably very different from the Metro project. The scope seems different.

In Metro, the developers mentioned that they don't use parallel compute contexts, so it's mostly PC-style GPU rendering even where it's physics-based? Looks like a PC-centric development project, but optimized for GCN. After all, they have to deal with the full DX11 stack on the XB1 side. Must be a pain trying to optimize the overhead out, plus dealing with the ESRAM. The PS4 side didn't get much mention in the article in comparison. Presumably the default code path (and/or XB1's shared code path) works well enough?
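(To put the ESRAM juggling in rough numbers, not from the article: a single 1920x1080 RGBA8 render target is about 1920 x 1080 x 4 bytes ≈ 7.9MB, so a deferred setup with three or four such targets plus a 32-bit depth buffer already overflows the 32MB of ESRAM, and something has to spill out to the much slower DDR3.)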

OTOH, the TLoU port migrates PS3-specific code to PS4. They can throw away unnecessary SPU code, but they may need to retrofit a PS3-centric design onto PS4. Coming from PS3, those guys may need to port "special features" SPU compute tasks to the new PS4 compute architecture. That's actually the area of work I'm most interested in this gen, because it's supposed to be the "new thing".

It is interesting because SPUs have small and predictable latency, whereas GCN compute has longer and presumably less predictable latency, albeit far more computational power. Too bad they didn't go into detail about how they sidestep those issues for a ported game. It's easier to work through these characteristics for a new game. I suspect they worked around the problem rather than tackling it head on, but that's just my speculation.

The Metro developers also answered the TLoU framerate question in their own way:

GCN doesn't love interpolators? OK, ditch the per-vertex tangent space, switch to per-pixel one. That CPU task becomes too fast on an out-of-order CPU? Merge those tasks. Too slow task? Parallelise it. Maybe the GPU doesn't like high sqrt count in the loop? But it is good in integer math - so we'll use old integer tricks. And so on, and so on.

So they have free rein in designing and implementing a new game. TLoU is a done deal; ND may not have full flexibility to change things and reallocate run-time budgets. They need to complete this thing within the "porting" timeframe.

Either way, I look forward to more compute-centric games.
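
As an aside on the "old integer tricks" for sqrt in the quote above, here's a minimal C++ sketch of the classic bit-level reciprocal square root approximation (the well-known Quake III-style hack), just to show the kind of trick meant - obviously not 4A's actual shader code:

```cpp
// Classic integer-trick approximation of 1/sqrt(x): reinterpret the float's
// bits as an integer, adjust them with a magic constant, then refine with one
// Newton-Raphson step. Illustrative only.
#include <cstdint>
#include <cstring>
#include <cstdio>
#include <cmath>

float fast_rsqrt(float x) {
    float half = 0.5f * x;
    uint32_t i;
    std::memcpy(&i, &x, sizeof(i));      // bit-cast float -> int
    i = 0x5f3759df - (i >> 1);           // integer magic: rough 1/sqrt estimate
    float y;
    std::memcpy(&y, &i, sizeof(y));      // bit-cast back to float
    return y * (1.5f - half * y * y);    // one Newton-Raphson refinement
}

int main() {
    for (float x : {1.0f, 2.0f, 10.0f, 100.0f})
        printf("x=%6.1f  fast=%.5f  exact=%.5f\n",
               x, fast_rsqrt(x), 1.0f / std::sqrt(x));
    return 0;
}
```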
 
Great interview, it sheds some more light on why the XB1 is coming off worse in CPU benchmarks; looks like virtualisation overhead (as suspected) or DirectX is the culprit here - it'll be interesting to see if driver improvements and lower-level access can see the CPU realise its 150MHz clock advantage over the PS4's.

What makes you think the issue is the VM model rather than the fat API?
 
Xboned API getting its hips stuck in the main RAM's doorway when the GPU is trying to squeeze through at the same time.
 
GCN stuff like lane swizzles can be really efficient vs the alternatives that you're required to use on PC.

But why are you forced to use those alternatives on PC? Especially if using Mantle/DX12? Even if NV hardware doesn't support that functionality, why not have a code path specifically for GCN-based GPUs? It wouldn't be the first time games have implemented vendor-specific paths.
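
On the vendor-specific path idea: detecting the GPU vendor at startup is the trivial part; maintaining two optimized shader sets is the expensive part. A minimal C++ sketch using DXGI (the 0x1002/0x10DE PCI vendor IDs are AMD's and Nvidia's; the selectRenderPath naming is just for illustration):

```cpp
// Sketch: pick a vendor-specific render path by reading the adapter's PCI
// vendor ID through DXGI. Illustrative only; real engines also key off
// feature caps, driver versions, etc.
#include <dxgi1_1.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")
using Microsoft::WRL::ComPtr;

enum class RenderPath { Generic, GcnOptimized, NvOptimized };

RenderPath selectRenderPath() {
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return RenderPath::Generic;

    ComPtr<IDXGIAdapter1> adapter;
    if (factory->EnumAdapters1(0, &adapter) == DXGI_ERROR_NOT_FOUND)
        return RenderPath::Generic;

    DXGI_ADAPTER_DESC1 desc = {};
    adapter->GetDesc1(&desc);

    switch (desc.VendorId) {
        case 0x1002: return RenderPath::GcnOptimized; // AMD
        case 0x10DE: return RenderPath::NvOptimized;  // Nvidia
        default:     return RenderPath::Generic;
    }
}

int main() {
    RenderPath path = selectRenderPath();
    printf("Selected path: %d\n", static_cast<int>(path));
    return 0;
}
```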
 
First Light takes place predominately across the first of the two islands available in Second Son while also offering a slew of indoor areas within the DUP containment facility. Prior to testing, we had hoped that, by utilising more confined environments, we might actually see frame-rates improve, perhaps even closing in on that 60fps mark. That turned out to not be the case at all, however, with the average frame-rate during these sections remaining mostly under 40fps. While the visuals in these sections remain quite detailed and impressive in their own right, it does suggest that their engine is not entirely bottlenecked by handling large environments.

http://www.eurogamer.net/articles/digitalfoundry-2014-infamous-first-light-performance-analysis
 
Conversely, we were surprised to find that gameplay set within the city itself actually seemed to operate at a slightly higher average frame-rate. While exploring the city we found that the frame-rate stuck more closely to a 40fps average. There is a palpable sense while moving about that performance is indeed faster than the original game. If you take a look at the performance analysis comparing the two releases the difference becomes pretty clear; First Light has a good 5fps advantage, give or take, over Second Son for much of the duration. As with the first release, the frame-rate reaches its lowest points during combat sections, once again suggesting that the game's bottlenecks may have more to do with enemy encounters than anything else. Changes made to the city include the removal of DUP outposts, which aren't installed yet in this timeline, but it's hard to imagine those changes alone accounting for the difference.

Sounds like the engine has improved (slightly)... the locked-30fps crowd won't care, but for us in the unlocked camp it's welcome to hear that things have improved.
 