Digital Foundry Microsoft Xbox Scorpio Reveal [2017: 04-06, 04-11, 04-15, 04-16]

CPs are fairly general-purpose processors with fixed-function helper hardware in just about every GPU out there (mobile included); the hard bit is more to do with pipelining, cache and memory.

A CP is in many ways similar to a Network Processing Unit, in that it really doesn't have long to do something and very little space to do it in. It's why they are usually threaded, so if one thread has to take a little bit longer, something else can still push work to the various hardware units further down the pipe. Keeping a modern GPU busy isn't easy...

The software cost of changing the CP a lot is a big problem as well. I know of at least one GPU that kept both its old and new CPs in its new chip, so the old one could be used whilst they developed and debugged the software for the new one. On release the old CPs are just not used, becoming dark silicon...
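For anyone unfamiliar with what a CP actually does, here's a deliberately toy sketch of the idea (not any vendor's firmware, just an illustration of "small processor draining a command ring and feeding the units downstream"):

```cpp
#include <cstdint>
#include <queue>

// Toy illustration only: the CP as a tiny in-order core that pulls packets
// from a ring buffer written by the driver and kicks the fixed-function
// units. Real CPs (e.g. the PM4 front end in GCN parts) are far more
// involved; this just shows why per-packet latency in the front end matters.
enum class PacketType : std::uint8_t { SetState, Draw, Dispatch, Fence };

struct Packet {
    PacketType type;
    std::uint32_t payload;
};

struct CommandProcessor {
    std::queue<Packet> ring;  // stand-in for the hardware ring buffer

    void step() {                  // one "cycle": decode and retire one packet
        if (ring.empty()) return;  // a starved front end means GPU bubbles
        Packet p = ring.front();
        ring.pop();
        switch (p.type) {
            case PacketType::SetState: /* program pipeline registers       */ break;
            case PacketType::Draw:     /* hand work to the geometry units  */ break;
            case PacketType::Dispatch: /* hand work to the compute units   */ break;
            case PacketType::Fence:    /* signal the CPU / other queues    */ break;
        }
    }
};

int main() {
    CommandProcessor cp;
    cp.ring.push({PacketType::SetState, 0});
    cp.ring.push({PacketType::Draw, 1});
    for (int i = 0; i < 2; ++i) cp.step();
    return 0;
}
```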
FINALLY THE ANSWER!!
Been waiting forever for this.

That's why XBO has 2 Command Processors!
1 that is customized for the microcode, and 1 that is not.
Only one is ever used!
 
4GB for the OS sounds like way too much bloat regardless of functionality. Never mind that Windows 10 runs on 2GB if needed. We now have smartphones/tablets running tens of apps in the background and pushing out notifications, and those do very well with only 2GB total. And 1GB dedicated to a 4K dashboard? Why?!


No word on FP16 throughput, much less Vega's "Rapid Packed Math" that allows 2*FP16 and 4*uint8. News of 2*FP16 on the PS4 Pro actually came up some weeks after the console was in the stores, so they may actually be saving this for later.
It'll be very weird if Scorpio doesn't have this, though. And it could bring the Pro uncomfortably close to Scorpio in ALU-intensive multiplatform titles.
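To put rough numbers on that (back-of-the-envelope only, using the reveal's 40 CU @ 1172 MHz figure and the Pro's 36 CU @ 911 MHz, and assuming the Pro's 2*FP16 rate applies):

```cpp
#include <cstdio>

// Peak ALU throughput for a GCN-style GPU:
// CUs * 64 lanes per CU * 2 ops per FMA * clock (GHz) = GFLOPS (FP32).
// With "Rapid Packed Math"-style 2*FP16, the FP16 rate is simply doubled.
double peak_gflops(int cus, double clock_ghz) {
    return cus * 64.0 * 2.0 * clock_ghz;
}

int main() {
    double scorpio_fp32 = peak_gflops(40, 1.172);  // ~6000 GFLOPS
    double pro_fp32     = peak_gflops(36, 0.911);  // ~4200 GFLOPS
    double pro_fp16     = 2.0 * pro_fp32;          // ~8400 GFLOPS if 2*FP16

    printf("Scorpio FP32: %.1f TF\n", scorpio_fp32 / 1000.0);
    printf("PS4 Pro FP32: %.1f TF, FP16: %.1f TF\n",
           pro_fp32 / 1000.0, pro_fp16 / 1000.0);
    // If Scorpio lacks 2*FP16, FP16-heavy shader code could narrow the gap
    // considerably in the Pro's favour.
    return 0;
}
```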

Also no HDMI 2.1, so no Variable Refresh Rate either...



Native 4k on this still doesn't make much sense to me. For multi-plats specifically.
The claim is really "Native 4K on games that were 1080p on Xbone, and some games that were 900p."
If you look at a pure specs comparison, Scorpio's GPU should indeed be able to do >4x more work than Xbone's.

Of course, this would mean sticking with the same draw distance, same shader LOD, same AF/AA levels, etc.
I'm not sure most developers will be very fond of this. Except maybe for people with 70"+ TVs (a very tiny population within the already smallish number of 4K users?), there may be very little gain in going full 4K instead of e.g. 1800p + upscale, while using the extra power to upgrade other stuff.
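Rough pixel-count math behind the 1800p suggestion (just a sketch):

```cpp
#include <cstdio>

int main() {
    // Pixels per frame at common render resolutions.
    const double p1080 = 1920.0 * 1080.0;  // ~2.07 Mpix
    const double p1800 = 3200.0 * 1800.0;  // ~5.76 Mpix
    const double p2160 = 3840.0 * 2160.0;  // ~8.29 Mpix (native 4K)

    printf("4K vs 1080p: %.2fx the pixels\n", p2160 / p1080);        // 4.00x
    printf("1800p is %.0f%% of native 4K\n", 100.0 * p1800 / p2160); // ~69%
    // So rendering at 1800p and upscaling frees roughly 30% of the
    // per-pixel budget for draw distance, LOD, AF/AA, etc.
    return 0;
}
```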


What's the consensus on them integrating DX12 at the hardware level? Is it really a noticeable jump or more of a talking point?
To be honest, from their description it sounds exactly like the Graphics Command Processor we've had since GCN2.
I may be missing something, because I have no idea why Leadbetter made sure to mention it.



So much customisation for the "dead" cat cores is interesting too. Not just the improved clocks, but "extensive customisation to reduce latency" (an area where X1 implementation was already ahead of PS4). They should have higher IPC as well as higher clocks and lower workload in DX12 games.
I wouldn't be so sure about higher IPC.
Latency reduction is a by-product of higher clocks, and the customization could simply amount to tweaking the architecture to allow a higher clock rate without breaking TDP limits.
I would be convinced there was a lot of customization if they had changed the amount of L2 cache, or turned it into a full 8-core module instead of 2*4-core modules, or introduced some L3 cache.
But given the info we have, it's just uncannily similar to what we already have in the PS4 Pro.
 
Soooo... People are buying PCs en masse now? Given it's not remotely close to their wet dreams but a PS4 => PS4 Pro-like upgrade...
:p
Ahh, well. I'm running a 1070 and a Core i5 and it's not easy to achieve 4K either. So at the price point Scorpio will sell at, as well as being couch entertainment, I think it holds well on its own. Price/performance value is going to be very good on Scorpio. Considering I paid $650 CAD for my 1070, I can't see Scorpio costing much more than that, at worst.

It's the cheapest platform for 4K and will likely have the largest library of 4K* games in the console space.
 
To be honest, from their description it sounds exactly like the Graphics Command Processor we've had since GCN2.
I may be missing something, because I have no idea why Leadbetter made sure to mention it.
Continuation of their work I think, just pushing even further.


ExecuteIndirect allows for GPU-generated draw calls, but it was somewhat limited; the XBO microcode helped make it more flexible, and it sounds like they've now got nearly the whole GPU doing the draw-call generation with ExecuteIndirect.
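For reference, this is roughly what the plain PC-side D3D12 path looks like; a minimal sketch only (error handling omitted, and the device, argument buffer and count buffer are assumed to already exist), not the customised Scorpio path being described above:

```cpp
#include <windows.h>
#include <d3d12.h>

// Each element the GPU writes into the argument buffer is one draw call.
// Here the command signature carries only a plain Draw; richer signatures
// can also change vertex buffers or root constants per draw.
ID3D12CommandSignature* CreateDrawSignature(ID3D12Device* device) {
    D3D12_INDIRECT_ARGUMENT_DESC arg = {};
    arg.Type = D3D12_INDIRECT_ARGUMENT_TYPE_DRAW;

    D3D12_COMMAND_SIGNATURE_DESC desc = {};
    desc.ByteStride       = sizeof(D3D12_DRAW_ARGUMENTS);
    desc.NumArgumentDescs = 1;
    desc.pArgumentDescs   = &arg;

    ID3D12CommandSignature* signature = nullptr;
    // Root signature may be null when the arguments don't touch root params.
    device->CreateCommandSignature(&desc, nullptr, IID_PPV_ARGS(&signature));
    return signature;
}

// argBuffer: filled by a compute shader (culling, LOD selection, ...).
// countBuffer: a single UINT, also GPU-written, with the number of draws.
void RecordIndirectDraws(ID3D12GraphicsCommandList* cmdList,
                         ID3D12CommandSignature* signature,
                         ID3D12Resource* argBuffer,
                         ID3D12Resource* countBuffer,
                         UINT maxDraws) {
    cmdList->ExecuteIndirect(signature, maxDraws,
                             argBuffer, 0,      // GPU-generated draw args
                             countBuffer, 0);   // GPU-generated draw count
}
```

The interesting bit is that both the per-draw arguments and the draw count live in GPU-written buffers, so a compute pass can cull and emit draws without a CPU round trip.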
 
Wait, so the new Scorpio isn't even using the new AMD Ryzen CPU? It's just an upgraded Jaguar CPU? Xbox Scorpio fail confirmed. LOL Kidding of course. Looks like a really nice piece of tech to me. In a way, I think the Scorpio and PS4 Pro deliver what consumers were expecting from the base models back in 2013. Not 4K, but the ability to deliver 1080p visuals at 60fps. I think a lot of people were surprised just how many games still run at sub-1080p and 30fps on the PS4 and X1. I'm sure the core Xbox fans will be chomping at the bit to get their hands on a Scorpio later this year.
 
1 that is customized for the microcode, and 1 that is not.
Only one is ever used!
WTF?! Do you know what microcode is? Every CP program is called microcode. You really should go read the ROCm kernel source code if you want to understand our current PC architectures and see what a CP really does...

And the GPU I was talking about that had two CPs wasn't in any XBox...
 
WTF?! Do you know what microcode is? Every CP program is called microcode. You really should go read the ROCm kernel source code if you want to understand our current PC architectures and see what a CP really does...
There's specific microcode that they implemented. I apologize, I was too excited; I'd been waiting years for an explanation of why 2 CPs would be on a GPU.
 
4GB for the OS sounds like way too much bloat regardless of functionality. Never mind that Windows 10 runs on 2GB if needed. We now have smartphones/tablets running tens of apps in the background and pushing out notifications, and those do very well with only 2GB total. And 1GB dedicated to a 4K dashboard? Why?!

It's not the OS that takes up the RAM, it's the apps.

IMO, it's one of the best features of the XBox One; the game I paused two nights ago is instantly available even though my kids watched cartoons on Viaplay, the wife watched Netflix and I watched NHL/Twitch TV in between.

Cheers
 
4GB for the OS sounds like way too much bloat regardless of functionality. Never mind that Windows 10 runs on 2GB if needed. We now have smartphones/tablets running tens of apps in the background and pushing out notifications, and those do very well with only 2GB total. And 1GB dedicated to a 4K dashboard? Why?!

Because it's early in the bring-up, life and potential software targets for the platform? We've seen many times that these specs get altered and optimized as the platform settles; better to make a big reservation to start and refine it down later than to have to alter the developer specs later.

Also no HDMI 2.1, so no Variable Refresh Rate either...
Too early. HDMI 2.1 features have been announced, but the actual specification is only just due; implementation (i.e. logic) and then subsequent chip development can only really begin now.

http://www.hdmi.org/manufacturer/hdmi_2_1/index.aspx

"The new specification will be available to all HDMI 2.0 Adopters and they will be notified when it is released early in Q2 2017"
"The HDMI 2.1 Compliance Test Specification (CTS) will be published in Q2-Q3 2017."
 
Because it's early in the bring-up, life and potential software targets for the platform? We've seen many times that these specs get altered and optimized as the platform settles; better to make a big reservation to start and refine it down later than to have to alter the developer specs later.


Too early. HDMI 2.1 features have been announced, but the actual specification is only just due; implementation (i.e. logic) and then subsequent chip development can only really begin now.

http://www.hdmi.org/manufacturer/hdmi_2_1/index.aspx

"The new specification will be available to all HDMI 2.0 Adopters and they will be notified when it is released early in Q2 2017"
"The HDMI 2.1 Compliance Test Specification (CTS) will be published in Q2-Q3 2017."

This is why I postponed buying a 4K TV until next year...
 
To anyone who is basing their hot takes on just the videos, read the articles. They will likely clear some things up for you.

@iroboto - credit where credit is due, you were clearly right about the potential of offloading the task of feeding the GPU from the CPU.

Also, I'm totally shocked they were able to go fast instead of wide to get to 6TF. I didn't consider 40CUs a possibility at all.

Seems like this design will deliver exactly what was promised.
 
4GB reserved by the OS... good god, it keeps getting worse and worse lol.

Say what you want about Nintendo hardware, but at least their OSes aren't maddeningly hoggish.

The rest of the console is pretty impressive.

Wrong. Their WiiU OS reserved 50% of all available RAM. ;)

I am disappointed they're using another GB for the OS, but I figure it'll be used for recording and broadcasting in 4K. Those buffers had to be increased too!
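Quick back-of-the-envelope on why 4K capture needs bigger buffers (a sketch assuming uncompressed RGBA8 frames ahead of the encoder; the actual capture pipeline isn't public):

```cpp
#include <cstdio>

int main() {
    const double bytes_per_pixel = 4.0;  // RGBA8, pre-encode
    const double mib = 1024.0 * 1024.0;

    double frame_1080p = 1920.0 * 1080.0 * bytes_per_pixel / mib;  // ~7.9 MiB
    double frame_2160p = 3840.0 * 2160.0 * bytes_per_pixel / mib;  // ~31.6 MiB

    printf("1080p frame: %.1f MiB, 2160p frame: %.1f MiB (4x)\n",
           frame_1080p, frame_2160p);
    // A few in-flight frames plus the encoder's working set scale the
    // capture reservation by the same 4x factor.
    return 0;
}
```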
 