So some thoughts after reading the DF article (
https://www.eurogamer.net/articles/digitalfoundry-2020-inside-xbox-series-x-full-specs ), dual-purposing the Series X SOC for xCloud provides some advantages they may not otherwise have had.
- First, even extremely bad chips can likely be salvaged for use. Instead of running 4 instances of an XBO game, a chip could instead run 3 or even 2 instances. Not ideal from a data center power standpoint, but it would let them put those chips to use temporarily even if power consumption per XBO instance is a concern.
- Second, this potentially gives them larger economies of scale, which may let them negotiate a better wafer price at the fab.
- Lastly, I wonder if MS has any plans for potentially using these SOCs for Azure?
Locking the speed of the SOC down without using boost clocks is the smart thing to do. I don't imagine many gamers would want to play a performance lottery over how well their console runs performance-hungry games. It's even worse if you don't generally have air conditioning and are at the mercy of local temperatures.
Personally, I still hate that PC GPUs all now use boost clocks making their performance non-deterministic.
Looking at the work done on Gears 5, I suspect there may be enhanced versions of all of Microsoft Studios' recent games on XBSX. They should have plenty of time to make any necessary changes and optimizations so that XBSX at the very least runs PC ultra settings at 4K, possibly even at 60 FPS for titles that were 30 FPS on XBO-X.
Will be interesting to see if any 3rd party studios also do something similar.
So, ML was a focus, and hence the CUs were modified to do not only 2x-rate 16-bit ops, but 4x-rate 8-bit ops and 8x-rate 4-bit ops as well (a rough sketch of why that matters follows this list). Depending on how and whether ML is leveraged, this may be a competitive advantage if Sony hasn't also worked with AMD on similar modifications.
- Important to note that this isn't the same as the dual 16-bit ops that PS4-P supported.
- It's here from the start of the generation on the base XBSX console.
- There's a clear plan for use in ML algorithms that can be leveraged across the entire generation of XB titles, unlike dual 16-bit ops only benefitting PS4-P.
- Of course, this still depends on Microsoft or 3rd party developers finding a use case for ML.
- Also, it's possible that it's standard RDNA2. I know AMD mentioned support for Tensor operations in RDNA2, but I can't recall if they mentioned that was via Tensor Cores (like Turing) or not?
- Andrew Goossen (Microsoft) mentions specifically adding support for this to the GPU, which suggests it's a Microsoft-requested change to the CUs.
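To make the packed-math idea concrete, here's a minimal CPU-side sketch of a 4-element INT8 dot-product-accumulate, the kind of operation these modified CUs would run at 4x the FP32 rate (with INT4 at 8x). This is purely my own illustration; the function and the emulation are hypothetical, not AMD's or Microsoft's actual instruction.

// Emulates a packed INT8 dot-product-accumulate: four 8-bit values live in one
// 32-bit word, so low-precision inference kernels get a throughput multiplier.
#include <cstdint>
#include <cstdio>

int32_t dot4_i8(uint32_t a_packed, uint32_t b_packed, int32_t acc)
{
    for (int i = 0; i < 4; ++i)
    {
        // Unpack the i-th signed 8-bit lane from each operand.
        int8_t a = static_cast<int8_t>((a_packed >> (8 * i)) & 0xFF);
        int8_t b = static_cast<int8_t>((b_packed >> (8 * i)) & 0xFF);
        acc += static_cast<int32_t>(a) * static_cast<int32_t>(b);
    }
    return acc;
}

int main()
{
    // Weights {1, -2, 3, 4} against activations {5, 6, -7, 8}, packed little-endian.
    uint32_t w = (uint32_t)(uint8_t)1 | ((uint32_t)(uint8_t)-2 << 8) |
                 ((uint32_t)(uint8_t)3 << 16) | ((uint32_t)(uint8_t)4 << 24);
    uint32_t x = (uint32_t)(uint8_t)5 | ((uint32_t)(uint8_t)6 << 8) |
                 ((uint32_t)(uint8_t)-7 << 16) | ((uint32_t)(uint8_t)8 << 24);
    printf("%d\n", dot4_i8(w, x, 0)); // 1*5 + (-2)*6 + 3*(-7) + 4*8 = 4
}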
So, developers will be able to create their own custom BVH structures, and the SOC will accelerate that in tandem with whatever the rest of the GPU is doing. IIRC, this differs from NV's Turing, where the hardware-accelerated BVH isn't customizable by developers?
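To illustrate what that developer-controlled split could look like, here's a rough, purely hypothetical CPU-side sketch: the developer owns the node layout and the traversal loop, while the intersection tests stand in for what the fixed-function hardware would do. None of these types or functions are the actual DXR/XSX API.

#include <vector>
#include <algorithm>
#include <cstdint>

struct Aabb { float mn[3], mx[3]; };
struct Ray  { float o[3], d[3]; float tMax = 1e30f; };

struct BvhNode
{
    Aabb    bounds;
    int32_t left = -1, right = -1;   // child node indices (-1 marks a leaf)
    int32_t firstTri = 0, triCount = 0;
};

// Stand-in for a hardware ray/box test (plain slab test; ignores the
// divide-by-zero edge case for brevity).
static bool intersectAabb(const Ray& r, const Aabb& b)
{
    float t0 = 0.0f, t1 = r.tMax;
    for (int i = 0; i < 3; ++i)
    {
        float inv   = 1.0f / r.d[i];
        float tNear = (b.mn[i] - r.o[i]) * inv;
        float tFar  = (b.mx[i] - r.o[i]) * inv;
        if (tNear > tFar) std::swap(tNear, tFar);
        t0 = std::max(t0, tNear);
        t1 = std::min(t1, tFar);
        if (t0 > t1) return false;
    }
    return true;
}

// Stand-in for a hardware ray/triangle test; a real version would fetch vertex
// data and run something like Moller-Trumbore.
static bool intersectTri(const Ray&, int /*triIndex*/, float& /*tHit*/)
{
    return false;
}

// Developer-controlled traversal over a developer-controlled node layout.
bool traverse(const std::vector<BvhNode>& nodes, const Ray& ray, float& tHit)
{
    if (nodes.empty()) return false;
    int stack[64];
    int sp = 0;
    stack[sp++] = 0;                         // start at the root
    bool hit = false;

    while (sp > 0)
    {
        const BvhNode& n = nodes[stack[--sp]];
        if (!intersectAabb(ray, n.bounds))
            continue;                        // prune this subtree
        if (n.left < 0)                      // leaf: test its triangles
        {
            for (int i = 0; i < n.triCount; ++i)
                hit |= intersectTri(ray, n.firstTri + i, tHit);
        }
        else                                 // interior node: visit both children
        {
            stack[sp++] = n.left;
            stack[sp++] = n.right;
        }
    }
    return hit;
}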
Hmmm, no rasterization (other than the skybox and moon) in the Minecraft DXR demo? Or am I misunderstanding what DF are saying? Impressive if so. Thinking back to what 4A Games said about their next game having a full RT rendering engine, I wonder if that means they will go full RT with no rasterization?
I find it very interesting that they started development of Project Scarlett back in 2016... and that XBO-X (released in 2017) was part of that development effort.
As textures have ballooned in size to match 4K displays, efficiency in memory utilisation has got progressively worse - something Microsoft was able to confirm by building in special monitoring hardware into Xbox One X's Scorpio Engine SoC. "From this, we found a game typically accessed at best only one-half to one-third of their allocated pages over long windows of time," says Goossen. "So if a game never had to load pages that are ultimately never actually used, that means a 2-3x multiplier on the effective amount of physical memory, and a 2-3x multiplier on our effective IO performance."
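Their numbers are easy to sanity check with a toy model of the monitoring Goossen describes: flag each 64 KB page of an allocation when it's actually touched, then compare allocated pages against touched pages. Sizes and names below are made up purely for illustration.

#include <cstdio>
#include <cstddef>
#include <vector>

constexpr size_t kPageSize = 64 * 1024;           // assumed GPU page granularity

struct PageUsage
{
    std::vector<bool> touched;                    // one flag per allocated page

    explicit PageUsage(size_t allocationBytes)
        : touched((allocationBytes + kPageSize - 1) / kPageSize, false) {}

    void recordAccess(size_t byteOffset)          // called for each sampled address
    {
        touched[byteOffset / kPageSize] = true;
    }

    double effectiveMultiplier() const            // allocated pages / touched pages
    {
        size_t used = 0;
        for (bool t : touched) used += t ? 1 : 0;
        return used ? double(touched.size()) / double(used) : 0.0;
    }
};

int main()
{
    PageUsage usage(256ull * 1024 * 1024);        // a 256 MB texture pool
    // Pretend the game only ever samples the first ~96 MB of it.
    for (size_t off = 0; off < 96ull * 1024 * 1024; off += kPageSize)
        usage.recordAccess(off);
    // Prints ~2.67x: only loading the touched pages is where the claimed
    // 2-3x multiplier on effective memory and IO comes from.
    printf("effective multiplier: %.2fx\n", usage.effectiveMultiplier());
}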
The external SSD package for XBSX also looks suitably cute, in a nostalgic console memory card kind of way.
DLI seems interesting. Rather than the standard method of the system OS constantly polling the controller to see if there is any user input, it's been reversed so that the controller immediately sends input to the system whenever the user activates an input. Additionally, each input is time-stamped so that the game (developer) can see exactly what the latency is for user input.
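Conceptually, it's something like the little sketch below: the controller side pushes a timestamped event the moment an input changes, and the game measures how stale that event is when it finally consumes it. All names here are invented; this is not Microsoft's actual DLI API.

#include <chrono>
#include <cstdio>
#include <queue>

using Clock = std::chrono::steady_clock;

struct InputEvent
{
    int               button;      // which control changed
    bool              pressed;     // its new state
    Clock::time_point timestamp;   // stamped when the input was generated
};

std::queue<InputEvent> pendingEvents;   // filled asynchronously as events arrive

// Controller side (conceptually): push immediately instead of waiting to be polled.
void onControllerInput(int button, bool pressed)
{
    pendingEvents.push({button, pressed, Clock::now()});
}

// Game side: when the frame processes input, the timestamp exposes the latency.
void processInput()
{
    while (!pendingEvents.empty())
    {
        const InputEvent e = pendingEvents.front();
        pendingEvents.pop();
        auto ageUs = std::chrono::duration_cast<std::chrono::microseconds>(
                         Clock::now() - e.timestamp).count();
        printf("button %d %s, input-to-consume latency: %lld us\n",
               e.button, e.pressed ? "down" : "up", static_cast<long long>(ageUs));
    }
}

int main()
{
    onControllerInput(0, true);   // simulate a button press arriving
    processInput();               // the frame consumes it and reports the latency
}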
OOOOOOOH. Full frame updates decoupled from display sync? So the benefits of faster response (Vsync disabled) without the drawback (screen tearing) even on non-VRR displays. This is something I'd like to see in action.
USB Type-A was chosen specifically to ensure easy compatibility with past accessories. Not a bad choice, as USB Type-A can support 10 Gbps, which is more than enough for most uses the console's USB ports will see. Unlike portable devices (laptops, tablets, etc.), you aren't going to be using the USB port as display out (which needs a lot of bandwidth for 4K/60 or higher) or to charge the device, which are the things USB-C would be needed for.
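(Rough math: uncompressed 4K/60 at 24 bits per pixel is 3840 x 2160 x 24 x 60 ≈ 11.9 Gbps, already beyond Type-A's 10 Gbps before you get to higher refresh rates or HDR bit depths.)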
And back to the whole Machine Learning question...
This was a show-stopping moment. It was indeed Fusion Frenzy - an original Xbox title - running with its usual 16x resolution multiplier via back-compat, but this time presented with highly convincing, perceptibly real HDR. The key point is that this is proposed as a system-level feature for Xbox Series X, which should apply to all compatible games that don't have their own bespoke HDR modes - and as Marais demonstrated, it extends across the entire Xbox library.
I'm impressed, but also somewhat skeptical. If it actually works convincingly as a system-level feature, I'd almost get one just to play something like Crimson Skies (in the BC list) in HDR.
And I'm not even interested in console gaming anymore. But this is something that you can't do on PC (at least currently). Mind blown if they can pull this off.
If anyone hasn't read that article yet, you should. It's a great read. Huge kudos to Richard Leadbetter for an excellent article. Now, when do we get one for the PS5? I'd be EXTREMELY interested to see something of this caliber with this much information on PS5.
Regards,
SB