Digital Foundry Microsoft Xbox Scorpio Reveal [2017: 04-06, 04-11, 04-15, 04-16]

In the PC world, texture paging systems can make do with under 600 MB of data per frame. Go read some of sebbi's posts to be educated on the refinements.

I was slightly off, it's not 600 MB, it's under 500 MB for 4K.

https://forum.beyond3d.com/posts/1974637/

Frostbite needs 472 MB for 4K (all render targets + all temporary resources) in DX12. Page 58:
http://www.frostbite.com/2017/03/framegraph-extensible-rendering-architecture-in-frostbite/

Assets (textures, meshes, etc) are of course loaded on top of this, but this kind of data can be easily paged in/out based on demand. I'd say 4 GB is enough for 4K (possibly even 2 GB), but it's too early to say how well Vega's memory paging system works. Let's talk more when Vega has launched. I have only worked with custom software paging solutions that are specially engineered from a single engine's point of view. Obviously a fully generic automatic solution isn't going to be as efficient.
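For a sense of where a number like Frostbite's 472 MB comes from, here is a back-of-envelope tally of a few typical 4K render targets. The target list and formats are my own illustrative assumptions, not Frostbite's actual layout:

```python
# Back-of-envelope: how a handful of 4K render targets adds up.
# Formats and target list are illustrative assumptions, not
# Frostbite's actual layout.
W, H = 3840, 2160

def target_mb(bytes_per_pixel):
    return W * H * bytes_per_pixel / (1024 ** 2)

targets = {
    "GBuffer RGBA8 x4":    4 * target_mb(4),
    "HDR color RGBA16F":   target_mb(8),
    "Depth/stencil D32S8": target_mb(5),
    "Velocity RG16F":      target_mb(4),
}
total = sum(targets.values())
for name, mb in targets.items():
    print(f"{name}: {mb:.1f} MB")
print(f"Total: {total:.1f} MB")   # ~261 MB before any extra temporaries
```

Just these few targets already come to roughly 260 MB, so a full pipeline with its temporaries landing under 500 MB is plausible.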

Games' data sets tend to change slowly (smooth animation). You only need to load new pages from DDR4 every frame. 95%+ of the data in GPU memory stays the same.
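A quick sketch of what that 95%+ residency means in practice. The resident-set size, reuse rate, and the assumption that the churn spreads over about a second of camera movement are all illustrative numbers of mine:

```python
# Sketch of the "95%+ stays resident" point: with a 2 GB resident texture
# set and 95% frame-to-frame reuse, the 5% churn is small, and spread over
# ~1 second of camera movement the per-frame upload is tiny. All numbers
# are illustrative assumptions.
resident_mb = 2048                        # texture pages kept in GPU memory
reuse = 0.95                              # fraction unchanged between frames
changed_mb = resident_mb * (1 - reuse)    # total churn during the transition
fps = 30
per_frame_mb = changed_mb / fps           # churn amortized over one second
print(f"churn: {changed_mb:.1f} MB, per frame: {per_frame_mb:.2f} MB")
```

A few MB per frame is a rounding error next to hundreds of GB/s of memory bandwidth, which is why paging is cheap when the working set changes smoothly.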
 
Just now catching up. Great posts so far.

The main negative for me is still no dedicated Kinect port, but that is the same as the S.

I thought it would be too, but I got the free Kinect adapter for my S & it works the same as on my original. Too bad the Kinect adapter is no longer free.

Does this bode well for a built-in mic in the controller or console, since there is no Kinect port but there is HDMI in?

Has anybody tried a mic in the controller?

I'd like to see that too, but could see it not being a great experience. I've seen that Nyko has SpeakerCom & also seen users buy those mini microphone stubs made for the iPod Touch. Don't know how good or bad they work. Anybody try those?

Tommy McClain
 
Brit,

But having to bring textures in and out on the fly has a bandwidth and processing cost, doesn't it? You might be able to use 4K textures with less than the required amount of memory to keep them available at all times, but then you'll have less bandwidth or processing power for other tasks. Once again, it is a trade off. If you are willing to make enough trade offs, almost anything is possible in the gaming world. But by having a greater quantity of memory available to developers, you could keep the textures loaded without having to make as many trade offs.
 
The way I see it is what would you rather have:

A) The same old Xbox One S hardware for another 3 to 4 years
B) The same old CPU and 5GB of Ram with 36 CUs for another 3 to 4 years
C) The Xbox One Scorpio system with all the improvements for CPU, Command Processors, 40 CUs, 326 GB/s of bandwidth and 8GB of RAM for another 3 to 4 years

There is no choice D.
 
In the PC world, texture paging systems can make do with under 600 MB of data per frame. Go read some of sebbi's posts to be educated on the refinements.

I was slightly off, it's not 600 MB, it's under 500 MB for 4K.

https://forum.beyond3d.com/posts/1974637/
Re-read that: it is just render targets, not textures and other assets that ideally are still in memory. Consoles lacking separate memory pools would be paging from disk. That should still work, but higher-detail textures may pop in a bit more noticeably with the delay of accessing a disk.

Requirements should still be lower, but consoles are more efficient with tighter specs.
 
Here is a good example.

In the game "The Division", NVIDIA claims that users with less than 6GB of VRAM may not receive the highest-quality textures at all times.

http://www.geforce.com/whats-new/guides/tom-clancys-the-division-graphics-and-performance-guide

Behind the scenes, Snowdrop aims to utilize 75% of your dedicated video memory (VRAM), leaving space for other GPU-accelerated, memory-consuming apps that may be running in the background. Within this 75% block of memory, which may be exceeded, memory is earmarked for any settings you've enabled that use VRAM, and the rest divvied up between geometry, other necessary game elements, and textures required right this second. Any remaining memory is filled with additional textures for nearby areas, reducing the chance of pop-in and visible streaming as players travel through the world.

As such, users of GPUs with less than 6GB of VRAM may not receive maximum-quality textures at all times. Whether you do or don't is based on the selected screen resolution, the detail of the current area, and the quality selected for the many game settings.

If you do somehow fill your GPU's available VRAM, perhaps with a high resolution and max settings on a 4GB GPU, The Division will do its best to scale back detail to rectify performance. In some cases though you'll have to step in and dial back the aforementioned settings, or the resolution.
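The 75% policy quoted above is easy to put into numbers. The 75% cap comes from the guide; the amount reserved for enabled settings is my own illustrative assumption:

```python
# What Snowdrop's "75% of VRAM" policy works out to for common card sizes.
# The 75% cap is from the guide above; the amount reserved for enabled
# settings is an illustrative assumption.
def texture_headroom_mb(vram_gb, settings_reserve_mb=1200):
    budget_mb = vram_gb * 1024 * 0.75     # the 75% soft cap
    return budget_mb - settings_reserve_mb

for gb in (4, 6, 8):
    print(f"{gb} GB card: ~{texture_headroom_mb(gb):.0f} MB left for textures")
```

The jump in leftover texture headroom between 4 GB and 6 GB cards is a plausible reason NVIDIA draws the "maximum-quality textures" line at 6 GB.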

Unlike in a PC, the Scorpio will only have 8GB of memory for developers to use for both the CPU and GPU. So I think it is possible that if someone tried to play The Division on Scorpio at 4K at Ultra (without reducing any settings) a memory wall might be hit due to the texture usage. Yes, there could be ways to mitigate this. A paging system could stream textures in and out. But this would cost ADDITIONAL resources that are in limited supply. I'm not sure if Scorpio has enough RAM to run all 4K games at ULTRA or the highest settings without making trade offs. Now, I realize that Scorpio was not designed to be a next generation system. So MS really doesn't care how many trade offs are required. They are just hoping to make more money by selling something people will buy. But I can't help but think only having 8GB of RAM divided between the CPU and GPU represents a weakness compared to a PC with maybe an additional 4-8GB of RAM.
 
Brit,

But having to bring textures in and out on the fly has a bandwidth and processing cost, doesn't it? You might be able to use 4K textures with less than the required amount of memory to keep them available at all times, but then you'll have less bandwidth or processing power for other tasks. Once again, it is a trade off. If you are willing to make enough trade offs, almost anything is possible in the gaming world. But by having a greater quantity of memory available to developers, you could keep the textures loaded without having to make as many trade offs.

We have 4GB cards doing 4K games. 8 gigs should be more than enough, with 4-5 going towards textures and the rest towards other things. MS could also keep tweaking and get another gig or so out of the system if they optimize the OS and apps.
 
Re-read that: it is just render targets, not textures and other assets that ideally are still in memory. Consoles lacking separate memory pools would be paging from disk. That should still work, but higher-detail textures may pop in a bit more noticeably with the delay of accessing a disk.

Requirements should still be lower, but consoles are more efficient with tighter specs.

Exactly. It can be done but with possibly significant trade offs.
 
We have 4GB cards doing 4K games. 8 gigs should be more than enough, with 4-5 going towards textures and the rest towards other things. MS could also keep tweaking and get another gig or so out of the system if they optimize the OS and apps.


They haven't seemed to have any interest in this though, to be fair. We all thought 5 GB was just temporary; AFAIK it's still the limit on both PS4 and Xbox One.

C) The Xbox One Scorpio system with all the improvements for CPU, Command Processors, 40 CUs, 326 GB/s of bandwidth and 8GB of RAM for another 3 to 4 years

To be fair as well, I'm going to operate under the assumption the CPU is basically just Jaguar. We can tell for one by the familiar clock speed. 2.3 GHz is EXACTLY what I predicted, and my reasoning was "take whatever Pro was at and add a little" (as yields and processes mature over time). This doesn't mean there might not have been minor tweaks, but I doubt much. MS is just playing this game of "custom x86 CPU" because they don't want to admit the Jaguar base, because it has a negative connotation. Like when Nintendo gave us no details on the Wii U GPU except that it's a "Custom AMD GPU". Literally anything is custom when you drop it in a console, due to interfaces. All perfectly normal.

I never expected Zen anyway. And I think overblowing it and saying the system is useless because of no Zen is foolish. It's one part of a whole.
 
I missed the flow of credible information in the media...
Was there a confirmation one way or the other about ID buffer and 2x FP16?
 
Exactly. It can be done but with possibly significant trade offs.

The trade-offs in taking advantage of more than 8 GB of memory are significantly more nuanced than just using higher-resolution textures. How are you going to store that texture data? If we double the size of textures we'll soon be looking at 100+ GB games. 8 GB will already represent likely a 50-100% increase in memory you could use for textures compared to 6 GB (remember memory needs to be used for things other than just textures). 12 GB of game memory (16 GB total) would likely triple or quadruple the memory available for textures. If we used all of that to maximize texture quality we could potentially be looking at games in the 150-200 GB range.
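The install-size scaling in that paragraph is easy to rough out. The baseline install size and the share of it that is texture data are illustrative assumptions of mine, not figures from any shipped game:

```python
# Rough install-size scaling: if texture memory use goes up N times and
# texture data dominates the install, the install grows almost as fast.
# Baseline size and texture share are illustrative assumptions.
base_install_gb = 50      # typical current-gen install
texture_share = 0.7       # fraction of that install that is texture data

def install_gb(texture_mem_multiplier):
    textures = base_install_gb * texture_share * texture_mem_multiplier
    other = base_install_gb * (1 - texture_share)
    return textures + other

print(f"2x texture memory: ~{install_gb(2):.0f} GB install")
print(f"4x texture memory: ~{install_gb(4):.0f} GB install")
```

Quadrupling texture memory use pushes a 50 GB install toward the 150-200 GB range the post mentions, before any offsetting compression gains.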

How are you going to load that data into memory in a speedy and efficient way, considering that affordable mass storage devices (large enough to store multiple 100+ GB games and cheap enough for a console) aren't going to be advancing significantly without a breakthrough in storage technologies? HDDs are currently at an impasse WRT increasing areal density. SSDs are vastly more expensive in cost/GB. Other storage technologies are even more expensive.

Without relevant breakthroughs in mass storage technologies, chasing more memory is a fool's errand, IMO.

Higher resolution, IMO, allows you to do more interesting things with regards to processing and displaying pixels than just a simplistic increase in texture resolution.

Regards,
SB
 
After sleeping on the available information, I wonder how they reach 360 mm². A base RX 480 is 232 mm²; add another GDDR5 channel, 4 CUs (at most 30 mm² more, but surely a lot less), and Jaguar should be around 12 mm² for 8 cores. How "custom" is this thing really? Did they add some L3 to the CPU and more cache to the GPU?
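Putting the question above into a tally makes the gap visible. Every figure here is a public-ballpark assumption of mine, not a confirmed Scorpio number:

```python
# Very rough die-area tally for the question above. Every figure is a
# public-ballpark assumption, not a confirmed Scorpio number.
area_mm2 = {
    "Polaris 10 base (36 CU, 256-bit)": 232,
    "4 extra CUs":                       12,   # a few mm2 per CU at 16 nm
    "wider GDDR5 interface":             20,
    "8 Jaguar cores + L2":               30,
    "media blocks / IO / glue":          40,
}
total = sum(area_mm2.values())
print(f"tally: ~{total} mm2 vs ~360 mm2 reported")
```

Even with generous estimates the tally lands well short of 360 mm², which is exactly why the post asks whether extra cache or other customization fills the remainder.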
 
And you are inferring a lot from the post that just isn't there.
"Inferior GPU Tech" != "Inferior GPU Performance"
You made up the "inferior performance" part all by yourself, and then proceeded to make bias accusations.
Good job!
I didn't make up anything. He was implying that the ability to do FP16 in the PS4 Pro alone made its GPU superior tech. This feature is implied to double flop output, according to Mark Cerny, when used. By using this example alone to claim the Scorpio has inferior GPU tech, it implies it has inferior performance to the Pro. That seems pretty simple to understand to me.

So I didn't "make up" anything. Also, I implied bias because it seems to me someone jumping to the conclusion that this feature isn't included in the Scorpio GPU, without any proof or even a detailed breakdown of what is included, is someone looking very hard to reinforce their own opinion. Now, a lot of people were nice enough to point me in the direction of devs talking about the advantages of FP16. Still, this feature is in no way being used to double the performance of the PS4 Pro's GPU.

So please, for the love of god, if people are going to make blanket statements about what features the Scorpio GPU has or doesn't have, back it up with a source. Or just wait until we get a full breakdown of the damn features included in the GPU.
 
To the guys who are worried about whether 8GB is enough, remember developers can apply DXT5 compression to textures, like modders do to games' uncompressed 4K Ultra textures, and give you a gorgeous 4K texture that looks 99.9% identical to the lossless texture and fits in that memory. They may finally have a motivation to do this somewhat simple task for Scorpio games (a task that PC game modders can do).
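The savings from block compression are fixed-ratio and easy to compute. DXT5/BC3 stores each 4x4 texel block in 16 bytes, i.e. 1 byte per texel, versus 4 bytes per texel for uncompressed RGBA8. The sizes below are plain arithmetic for a full mip chain, no engine specifics assumed:

```python
# DXT5/BC3 stores each 4x4 texel block in 16 bytes (1 byte/texel): a
# fixed 4:1 ratio versus uncompressed 32-bit RGBA. Plain mip-chain
# arithmetic, no engine specifics assumed.
def mip_chain_mb(top_size, bytes_per_texel):
    total_bytes = 0
    size = top_size
    while size >= 4:                      # block-compressed mips stop at 4x4
        total_bytes += size * size * bytes_per_texel
        size //= 2
    return total_bytes / (1024 ** 2)

raw = mip_chain_mb(4096, 4)   # uncompressed RGBA8
bc3 = mip_chain_mb(4096, 1)   # DXT5/BC3
print(f"4096x4096 texture: {raw:.1f} MB raw vs {bc3:.1f} MB DXT5")
```

A single 4096x4096 texture with mips drops from roughly 85 MB to roughly 21 MB, which is why block compression is the standard way to fit "4K textures" into a fixed memory budget.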
 
I missed the flow of credible information in the media...
Was there a confirmation one way or the other about ID buffer and 2x FP16?
Nothing was mentioned, but the ID buffer was specifically pointed out as a custom feature added to the PS4 Pro in Mark Cerny's interview with DF, so I think that specific form of primitive tracking is not likely to show up.
 
But if they want the Scorpio to run 4K games, is 8GB enough for 4K textures? My understanding in the PC gaming world is that it is not.

Why do you perceive this as a problem? Doesn't Horizon do just fine with only 5.5 GB available? I am under the assumption that checkerboard rendering doesn't alleviate the pressure textures place on the VRAM. If Horizon can readily make use of 4K textures on hardware with less CPU/GPU performance, bandwidth and RAM, then why would 4K textures be a problem with Scorpio's available 8 GB running at 326 GB/s?
 