Digital Foundry Article Technical Discussion [2020]

How many games on completely different engines by completely different developers have to perform awfully before it's no longer an optimization problem? Seems like an odd way to look at it.

GCN products have the advantage that the current consoles are still based on GCN. If Kepler GPUs were in the consoles (say a GTX 760 or something), games would still be optimized for that architecture. Kepler is almost 8 years old; there's not much optimization going on for it anymore.
Even then, I have to say my 670 system has served very well through the generation and outmatched the base PS4 I have. We've never been able to keep hardware that long in previous generations.
 
How many games on completely different engines by completely different developers have to perform awfully before it's no longer an optimization problem? Seems like an odd way to look at it. You could be having a better experience on a GCN GPU you could have bought during the same time period for less than half the price.
I want to tiptoe around this one carefully because I imagine some people are sensitive to GPU wars. I personally don't have a stake in it, but with this generation of consoles being entirely GCN, should this have been more favourable for AMD cards in theory (at least over time)? Nvidia can hand-optimize each title with each driver pack, but can they really optimize DX12 better than the developers intend for it to run?
 
I want to tiptoe around this one carefully because I imagine some people are sensitive to GPU wars. I personally don't have a stake in it, but with this generation of consoles being entirely GCN, should this have been more favourable for AMD cards in theory (at least over time)? Nvidia can hand-optimize each title with each driver pack, but can they really optimize DX12 better than the developers intend for it to run?

Helps for sure, but I don't think it can be entirely attributed to that. The trend towards compute seems more of an industry thing and less of a console-focused thing. Perhaps AMD just made better guesses about where graphics would go many years ago when prototyping these architectures? When PC-specific developers develop their own modern engine, and especially if it uses a low-level API, it still tends to run better on GCN. I'm no developer, but given the current situation it's hard for me to come to a different conclusion.
 
DF Article updated to go along with the video, introduction included below, @ https://www.eurogamer.net/articles/...-xbox-series-x-a-revolution-in-console-design

Undoubtedly the biggest surprise of The Game Awards back in December 2019 was Microsoft's decision to reveal Xbox Series X: the name, the branding - and most crucially, the form factor. It was a console quite unlike anything we'd seen before, possibly the most original home console design since Nintendo's GameCube way back in 2001. During our recent visit to the Microsoft campus in Redmond WA, we had a chance to meet key members of the hardware team that created this remarkable-looking device - and in the process, we gained a much better understanding of why Xbox Series X required a top to bottom revamp of the traditional console form factor.

"When we started thinking about how we would design this, everything was theoretical," says Chris Kujawski, principal designer at Microsoft. "We didn't have stuff we could test, we didn't have measurements we could take, we knew it was going to be powerful and we knew it was going to require a totally different way of thinking about how to design a console."

The key issue facing the designers came down to power and target performance. The Xbox system architects decided from the get-go that the next generation console had to deliver an absolute minimum of twice the overall graphics performance of the Xbox One X, meaning 12 teraflops of GPU compute, sitting alongside the Zen 2 cores that would deliver a 4x improvement in CPU power. At the same time, the mandate was set that the machine also had to equal the acoustic performance of the Xbox One X - a tall order when system power would be increasing significantly.

The challenge crystallized once the outsize power requirements of the new hardware came into focus. Based on the prototype hardware we saw, Xbox Series X ships with a 315W power supply and, in keeping with all of Microsoft's console designs since Xbox One S, this would be delivered internally. With the sheer amount of electrical power pumping through the processor, the regulators pump up to 100W per square inch, delivering up to 190A. What made this all coalesce into the form factor we have today is the key decision to move to a split motherboard design: one board houses the high-power components like the Series X processor, the GDDR6 and the power regulators. The other is the Southbridge board, principally handling I/O. The boards sit on either side of a substantial chassis block - a sheer aluminium casting.


It's interesting that the customization goes as deep as the exhaust fan: the choice of bearing material and fan blades (needed if you want to optimize for low noise with decent airflow). I wonder whether the fan design was done in-house or in collaboration with a manufacturer that specializes in custom fan designs?

Also glad to see that they still design the controllers to have easily replaceable batteries (AA).

Jason Ronald, partner director of program management at Xbox - aka 'Xbox Beard Guy' from the E3 2019 Scarlett trailer - muses that the 50/50 split on opinion in the room almost perfectly reflects customer feedback: "What it comes down to is when actually talking to gamers, it's kind of polarising and there is a strong camp that really want AAs. So just giving flexibility is the way to please both [sets of] people... You can use a rechargeable battery pack and it works just like it does on the Elite, [but] it is a separate thing."

Now, MS please release the Elite Controller 2 v.2 with replaceable batteries.

Regards,
SB
 
The trend towards compute seems more of an industry thing and less of a console-focused thing.

I would have made a better choice with a GCN GPU like a 7950/7970 at the time; it aged better and was price competitive. That was 2012, though, and I didn't know how things would look close to 8/9 years later. I didn't even think it would last that long, lol. But it wasn't a bad product, or at least not that bad; I've been able to play most games on it, mostly at least matching the PS4 (7850 level), though sure, a 7970 would have been better.


For next gen, would AMD's GPUs benefit again from being in the consoles?
 
I would have made a better choice with a GCN GPU like a 7950/7970 at the time; it aged better and was price competitive. That was 2012, though, and I didn't know how things would look close to 8/9 years later. I didn't even think it would last that long, lol. But it wasn't a bad product, or at least not that bad; I've been able to play most games on it, mostly at least matching the PS4 (7850 level), though sure, a 7970 would have been better.


For next gen, would AMD's GPUs benefit again from being in the consoles?

Probably, since I'd imagine usage of low-level APIs will only increase. Hopefully not to the same extent as this generation, since Nvidia seems to be in a much better place hardware-wise.
 
Nano being ~10% behind 5500/580 is completely normal,
No, it's not. It should be at least 20% faster: the Fury X is 30% faster than the RX 580, so the Nano should be around 20% faster.

The Nvidia results are also completely normal and in line with recent trends, just to a worse extent.
No. How is it that the 780 Ti, which is at least 40% faster than the GTX 680, ends up equal to it like that? What sort of screwed-up code do you have to write to end up with that result?

There are clearly all kinds of architectural bottlenecks in Kepler preventing performance from scaling. Not surprising; at this stage it's pretty well known that Kepler is a poor architecture. Maxwell and Pascal performing uncharacteristically higher than Kepler has become the norm. Again, this is not limited to this specific title.
I have never seen a title that forces the 780 Ti to perform like a 680, or worse than a freaking 1050. Not even the first Doom (2016). Only Doom Eternal.
You can see similar performance drop-offs for Nvidia GPUs prior to Turing in Forza Horizon 4, Battlefield V, RDR 2, Call of Duty: Warzone, World War Z, The Division 2, Wolfenstein: Youngblood, etc.
Again, NEVER to the extent that a high-end GPU ends up performing like a mid-tier GPU from the same family.
GCN GPUs have much more performance potential than their Nvidia competitors.
More like specific code optimization for specific archs, not any "hidden" potential. Do you think the 780 Ti is sitting there producing the same fps as the GTX 680 while being pegged at 100% usage? Not even 60%, I would say.

Look, Kepler is bad at compute; this much is known. This makes it perform badly relative to GCN in modern games. However, AMD is not immune to the same thing: Fury cards experienced the same problem this generation as well, with the Fury X having its performance slashed to the RX 580/1060 level in dozens of titles from 2017 onwards. It just happens that each IHV will have a bad arch every once in a while.
 
No, it's not. It should be at least 20% faster: the Fury X is 30% faster than the RX 580, so the Nano should be around 20% faster.

No. How is it that the 780 Ti, which is at least 40% faster than the GTX 680, ends up equal to it like that? What sort of screwed-up code do you have to write to end up with that result?

I have never seen a title that forces the 780 Ti to perform like a 680, or worse than a freaking 1050. Not even the first Doom (2016). Only Doom Eternal.

Again, NEVER to the extent that a high-end GPU ends up performing like a mid-tier GPU from the same family.

More like specific code optimization for specific archs, not any "hidden" potential. Do you think the 780 Ti is sitting there producing the same fps as the GTX 680 while being pegged at 100% usage? Not even 60%, I would say.

Look, Kepler is bad at compute; this much is known. This makes it perform badly relative to GCN in modern games. However, AMD is not immune to the same thing: Fury cards experienced the same problem this generation as well, with the Fury X having its performance slashed to the RX 580/1060 level in dozens of titles from 2017 onwards. It just happens that each IHV will have a bad arch every once in a while.

No, it's not.
[Relative performance chart at 1920x1080]

That was from 2017, and it's only gotten better for the 580 since. In many games a 580 actually ties or beats a Fury X.

It must be some bottleneck specific to Kepler that doesn't scale with increased GPCs.

Even if you scale the 780 Ti by 40% over a 680, it's still in a poor position. I don't think it's code optimization for specific archs; we are talking about nearly 10 years of AMD GPUs here.

Fury X was a grossly imbalanced GPU which, as a result, was hard to fully utilize. You needed super high resolutions to saturate it, but then it became crippled by the lack of VRAM.
 
Perhaps AMD just made better guesses about where graphics would go many years ago when prototyping these architectures? When PC-specific developers develop their own modern engine, and especially if it uses a low-level API, it still tends to run better on GCN. I'm no developer, but given the current situation it's hard for me to come to a different conclusion.

It wasn't even a guess on AMD's part; nearly all of these events were calculated on their end. Their Mantle API was intended as a backup plan in case their lobbying efforts fell on deaf ears at the Microsoft graphics board or the Khronos Group. Back then, Mantle had two possible futures: either its concepts wouldn't be standardized across the industry and it would have kept serving AAA developers as the only explicit option, had D3D12/Vulkan never come to fruition, or new industry standards would be derived from its design and it would then be deprecated.

Lo and behold, it's not a coincidence that D3D12/Vulkan have explicit barriers, queues, manual memory management, and a binding model similar to that of their spiritual predecessor. AMD's lobbying effort could be described as relatively successful, since they quickly got the new industry standards to pivot closer to their hardware.

The main reason AMD wanted more explicit APIs was that they were 'converging' on their hardware designs, so they wanted to do the same for the programming model. That, combined with the new explicit APIs, meant their hardware could perform more consistently across different hardware releases. It's possible this runs contrary to the interests of other vendors who haven't 'converged' on their hardware design and who may still want to make significant changes to it, which is why this strategy won't give them consistent performance across different generations of hardware on the more explicit APIs.
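
For anyone who hasn't touched the explicit APIs, here's a minimal sketch of what "explicit barriers" means in practice, using Vulkan rather than Mantle itself (the function and variable names below are just placeholders I made up): the application, not the driver, has to declare that a render target written by one pass is about to be sampled by the next.

#include <vulkan/vulkan.h>

// A sketch of an explicit hazard: a color attachment written by one pass is about
// to be sampled by the next. In D3D11/OpenGL the driver infers this; in Vulkan the
// application records the barrier itself while building the command buffer.
void transitionForSampling(VkCommandBuffer cmdBuf, VkImage renderTarget)
{
    VkImageMemoryBarrier barrier = {};
    barrier.sType = VK_STRUCTURE_TYPE_IMAGE_MEMORY_BARRIER;
    barrier.srcAccessMask = VK_ACCESS_COLOR_ATTACHMENT_WRITE_BIT;
    barrier.dstAccessMask = VK_ACCESS_SHADER_READ_BIT;
    barrier.oldLayout = VK_IMAGE_LAYOUT_COLOR_ATTACHMENT_OPTIMAL;
    barrier.newLayout = VK_IMAGE_LAYOUT_SHADER_READ_ONLY_OPTIMAL;
    barrier.srcQueueFamilyIndex = VK_QUEUE_FAMILY_IGNORED;
    barrier.dstQueueFamilyIndex = VK_QUEUE_FAMILY_IGNORED;
    barrier.image = renderTarget;
    barrier.subresourceRange = { VK_IMAGE_ASPECT_COLOR_BIT, 0, 1, 0, 1 };

    vkCmdPipelineBarrier(cmdBuf,
        VK_PIPELINE_STAGE_COLOR_ATTACHMENT_OUTPUT_BIT,   // producer stage
        VK_PIPELINE_STAGE_FRAGMENT_SHADER_BIT,           // consumer stage
        0,            // no dependency flags
        0, nullptr,   // no global memory barriers
        0, nullptr,   // no buffer barriers
        1, &barrier); // one image barrier
}

Under D3D11 or OpenGL the driver tracks that hazard for you; under Vulkan/D3D12 getting these transitions right (and not issuing too many of them) is on the engine, which is part of why results now vary so much with how well a given engine maps to a given architecture.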
 
That was from 2017, and it's only gotten better for the 580 since. In many games a 580 actually ties or beats a Fury X.
Which does prove my point: aging badly can happen to any GPU, even GCN, and even to a monster of compute like Fury.
Even if you scale the 780 Ti by 40% over a 680, it's still in a poor position.
I know, but that gap should still be there nevertheless; it's completely unnatural and freakish for the 780 Ti to tie the 680 or be surpassed by the 1050.
It must be some bottleneck specific to Kepler that doesnt scale with increased GPCs.
What kind of bottleneck is that?
 
Which does prove my point: aging badly can happen to any GPU, even GCN, and even to a monster of compute like Fury.

I know, but that gap should still be there nevertheless; it's completely unnatural and freakish for the 780 Ti to tie the 680 or be surpassed by the 1050.

What kind of bottleneck is that?
Fury X is a specific GPU, not an entire architecture comprising many GPUs. It also didn't age nearly as badly as Kepler.

I would love to know what the specific bottlenecks are in Kepler. I would love to read dev deep dives discussing their experiences on all the GPU architectures from the last decade. The only recent information that was uncovered about Kepler was how much trouble it has with atomics.
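
Purely as an illustration of what "trouble with atomics" can mean: a lot of modern compute work (light binning, histograms, order-independent transparency) boils down to thousands of threads scattering atomic adds into a small set of shared counters. Below is a rough CPU-side analogue in C++ of that access pattern; on the GPU it would be an atomicAdd in a compute shader. It's just a sketch, not anything taken from id's renderer, and whether this is Kepler's actual bottleneck in Doom Eternal is speculation.

#include <atomic>
#include <cstdint>
#include <cstdio>
#include <thread>
#include <vector>

// CPU-side analogue of a GPU binning pass: many threads hammer a small set of
// shared counters with atomic adds. Hardware that serializes contended atomics
// spends most of its time waiting here rather than doing useful work.
int main()
{
    constexpr int kBins = 256;
    constexpr int kItems = 1'000'000;
    constexpr int kThreads = 8;

    std::vector<std::atomic<std::uint32_t>> bins(kBins);
    for (auto& b : bins) b.store(0, std::memory_order_relaxed);

    auto worker = [&](int begin, int end) {
        for (int i = begin; i < end; ++i) {
            // Pick a pseudo-random bin per item, then do the contended atomic add.
            int bin = static_cast<int>((static_cast<std::uint32_t>(i) * 2654435761u) % kBins);
            bins[bin].fetch_add(1, std::memory_order_relaxed);
        }
    };

    std::vector<std::thread> pool;
    for (int t = 0; t < kThreads; ++t)
        pool.emplace_back(worker, t * kItems / kThreads, (t + 1) * kItems / kThreads);
    for (auto& th : pool) th.join();

    std::uint32_t total = 0;
    for (auto& b : bins) total += b.load(std::memory_order_relaxed);
    std::printf("binned %u items across %d bins\n", total, kBins);
}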
 
I know, but that gap should still be there nevertheless; it's completely unnatural and freakish for the 780 Ti to tie the 680 or be surpassed by the 1050.

As far as the 1050 is concerned, it SHOULD be faster than those ancient GPUs. If you consider that id Software leverages modern compute probably more than any other developer, then it really shouldn't come as a surprise that even a low-end Pascal ends up doing better than GPUs that are multiple generations older.

Likewise, it's leveraging Vulkan. I imagine NV puts significantly less optimization effort into Vulkan than into DX11 or DX12 when it comes to per-game profiles for older hardware. I'd be surprised if NV puts any effort into optimizing Vulkan performance on a game-by-game basis for hardware that old.

Regards,
SB
 
People should really watch the video. The mechanical side of it is really interesting; it looks on par with Apple's case engineering to me. I didn't expect that kind of effort for a console at all.

Wasn't Panos Panay & his team involved with the design? If so, it's not much of a surprise to me.

Tommy McClain
 
Who is Panos Panay?
I think he's been promoted several times, so I was going to say team lead/director, but he's probably a VP. His teams are responsible for all Surface hardware, and in 2015 they took over Xbox hardware as well.
 
Who is Panos Panay?

He is the Chief Product Officer of Microsoft's Windows & Devices Group, where he oversees the Windows experience & devices like Surface, HoloLens & Xbox. He just got the new gig over the Windows client back in February & replaced Corporate Vice President Joe Belfiore, who was originally in charge of it. Go watch the latest Surface event showcasing the Surface Duo & Surface Neo. He's basically the father of Surface.

Tommy McClain
 
Although the Surface Pro 4 is far from fabulous. Repeated issues with the pen registering presses even when not touching the screen; I've never seen the like on any other tablet device. And recently I've had some uncomfortable flickering on the screen, which I see is a known defect with a replacement option as long as your SP4 is less than three years old. I have no other devices that have deteriorated after only four years; my Samsung Note 10.1 is still going strong to this day, coming up on 8 years old. The PSU/charger also died on me and needed a replacement, which, again, I've never had happen with other devices. And the expensive pen just died mid-use, so I bought a third-party replacement that doesn't need Bluetooth. Not to mention the issues with the interface at large.

I'm pretty much sworn off Surface products now, and I'm not convinced the team that made my SP4 can be trusted to make super-duper other products. That doesn't mean I expect problems with a Surface-team-designed Xbox, but I don't assume it'll be anything superior either.
 
Although the Surface Pro 4 is far from fabulous. Repeated issues with the pen registering presses even when not touching the screen; I've never seen the like on any other tablet device. And recently I've had some uncomfortable flickering on the screen, which I see is a known defect with a replacement option as long as your SP4 is less than three years old. I have no other devices that have deteriorated after only four years; my Samsung Note 10.1 is still going strong to this day, coming up on 8 years old. The PSU/charger also died on me and needed a replacement, which, again, I've never had happen with other devices. And the expensive pen just died mid-use, so I bought a third-party replacement that doesn't need Bluetooth. Not to mention the issues with the interface at large.

I'm pretty much sworn off Surface products now, and I'm not convinced the team that made my SP4 can be trusted to make super-duper other products. That doesn't mean I expect problems with a Surface-team-designed Xbox, but I don't assume it'll be anything superior either.

Interestingly, that mirrors my experience with a 9.7" iPad. I'm guessing I got something that was slightly defective. It's so far provided a worse user experience than my Surface Book 2, but it's still good enough to keep around for light use by relatives when they visit.

If I had to use it for any sort of work, however, I'd be immensely frustrated, so I understand you swearing off the SP4. Thankfully I haven't had any of the issues you're experiencing with the pen on my Surface Book 2. So far it's not only the best laptop I've ever owned, it's also the best pen and touch computing device I've ever owned.

The SB2 came out much later than the SP4, so I'm guessing they fixed whatever issues they were having related to the Pen by then.

Regards,
SB
 