General Next Generation Rumors and Discussions [Post GDC 2020]

I don't know. I've never understood RAM buses and chips. :oops: The one thing we can be sure of is that if it was better to go single pool, single bus, MS would have done just that. ;)

I had assumed that Microsoft had picked two specifications of GDDR6 to keep costs manageable, not because of some issue relating to the design of the bus or pathways to the RAM. I'll have to re-read the DF article to see if it's clearer.
 
The two GDDR6 speeds are for cost, but cost for the lower SKU (Lockhart)? Possibly. I could see that. It will be interesting if they drop Lockhart like I expect them to; that would allow them to go full speed on the XBSX.

Tommy McClain
 
I had assumed that Microsoft had picked two specifications of GDDR6 to keep costs manageable, not because of some issue relating to the design of the bus or pathways to the RAM. I'll have to re-read the DF article to see if it's clearer.

Same for Sony and the 14 Gbps chips; they did nearly all their testing with memory going faster. It's a compromise to reach the price they want.
 
I don't know. I've never understood RAM buses and chips. :oops: The one thing we can be sure of is that if it was better to go single pool, single bus, MS would have done just that. ;)


We can't be sure; there's likely a cost-savings tradeoff somewhere, obviously. However, I do have confidence MS/AMD engineers aren't stupid and made a good tradeoff. At a glance I'm guessing you shove all the GPU stuff in the 10 GB of fast GDDR anyway, and this kinda just works out: the GPU would typically have used about 10 GB or less regardless, so you can make the other 6 GB slower RAM for CPU/OS without much performance penalty.
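For what it's worth, the two bandwidth figures fall straight out of the chip layout DF reported: ten 14 Gbps GDDR6 chips on a 320-bit bus, six of them 2 GB and four of them 1 GB. A rough back-of-the-envelope sketch (Python, purely illustrative):

```python
# Rough sketch of where the Series X bandwidth figures come from,
# using the chip layout reported by Digital Foundry: ten GDDR6 chips
# at 14 Gbps on a 320-bit bus, six of them 2 GB and four of them 1 GB.

PIN_SPEED_GBPS = 14     # per-pin data rate, in Gbps
BITS_PER_CHIP = 32      # each GDDR6 chip has a 32-bit interface

def bandwidth_gb_s(num_chips):
    """Peak bandwidth when an access interleaves across num_chips chips."""
    return num_chips * BITS_PER_CHIP * PIN_SPEED_GBPS / 8

# The first 10 GB is striped across all ten chips (1 GB from each):
print(bandwidth_gb_s(10))   # 560.0 GB/s -> the "GPU optimal" pool

# The remaining 6 GB lives only on the six 2 GB chips, so any access
# there can only interleave across six of them:
print(bandwidth_gb_s(6))    # 336.0 GB/s -> the "standard" pool
```

So the split isn't an artificial cap so much as a consequence of mixing chip sizes: the upper 6 GB simply only exists on six of the ten chips.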

Then again, Xbox One ESRAM was a thing :LOL:
 
It was a smart thing too, given predictions of RAM prices. PS4 was lucky to get 8 GB, and when MS were making their choice of 8 GB of slow DDR with ESRAM for bandwidth, Sony were choosing 4 GB of unified GDDR. It certainly wasn't a choice between 8 GB GDDR or 8 GB DDR + ESRAM with MS deciding to pick the latter.
Yea, the context of the situation is often lost.
 
Then again, Xbox One ESRAM was a thing
Indeed, as was the inclusion of BC hardware and the introduction of BC into a device that launched without any word of BC support. And subsequently, the fallout of that decision had carry-over effects into the next generation.

Xbox One's main failing this generation was not the lack of power or its focus on the platform. They just didn't have the games that people wanted, and a competing console was able to play 3P titles better for cheaper and had an exclusive game lineup that trumped Xbox.

These two failings cost Xbox the generation; the power gap might have been overlooked if Halo, Gears, and the others could not be. Unfortunately they were overlooked, because people had moved on. And the single title that might have made any difference to maintaining Xbox's population was named Destiny, and you could find that on all systems.

Everything XBO introduced this generation in terms of platform continues to be a major customer value point into the next.
 
I'd pair that lack of software with mandatory Kinect and the increase in price there. But even that isn't a slight on MS's judgement - Kinect was super popular, so it made sense to incorporate it into the new platform as a USP. Who could've known that consumers were so fickle as to move on without letting MS know beforehand?
 
I get that; what I don't understand is why the CPU can only access the fast pool at 336 GB/s. It seems like an unnecessary limitation; surely the cost of allowing the CPU to access the fast pool at 560 GB/s would be minimal.

This has always been true of all AMD APUs. There is a coherent bus (routinely referred to as onion) shared by all the processors on the APU (20 GB/s for PS4 and 30 GB/s for Xbox One) and a non-coherent bus (usually referred to as garlic), exclusive to the GPU, that can utilize the max bandwidth of the system memory. On the Xbox One X, the GPU has access to 12 memory controllers that no other processors on the APU can use.

The garlic bus can read from local as well as uncacheable memory, but reading from uncacheable memory is done at a slower rate because uncacheable memory is not interleaved like local memory.

On old AMD APUs, CPU access to local memory is slow as shit, as in less than 6% of the max bandwidth available over garlic to the GPU, because those older APUs only supported one outstanding read from local memory to the CPU.
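To put a rough number on how badly one outstanding read hurts, here's a back-of-the-envelope Little's law sketch; the latency value is an assumed ballpark for illustration, not a measured figure:

```python
# Little's law: sustained bandwidth = requests in flight * request size / latency.
# The latency below is an assumed ballpark figure, not a measurement.

CACHE_LINE_BYTES = 64       # one read request fetches a 64-byte cache line
MEMORY_LATENCY_NS = 120     # assumed round-trip latency to memory

def read_bandwidth_gb_s(outstanding_reads):
    """Peak read bandwidth sustainable with N requests in flight."""
    bytes_per_sec = outstanding_reads * CACHE_LINE_BYTES / (MEMORY_LATENCY_NS * 1e-9)
    return bytes_per_sec / 1e9

print(read_bandwidth_gb_s(1))    # ~0.53 GB/s: one read in flight at a time
print(read_bandwidth_gb_s(32))   # ~17 GB/s: what overlapping misses buys you
```

Modern CPUs only hit their headline bandwidth by keeping dozens of misses in flight; cap that at one and throughput collapses no matter how wide the bus is.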
 
This has always been true of all AMD APUs. There is a coherent bus (routinely referred to as onion) shared by all the processors on the APU (20 GB/s for PS4 and 30 GB/s for Xbox One) and a non-coherent bus (usually referred to as garlic), exclusive to the GPU, that can utilize the max bandwidth of the system memory. On the Xbox One X, the GPU has access to 12 memory controllers that no other processors on the APU can use.

The garlic bus can read from local as well as uncacheable memory, but reading from uncacheable memory is done at a slower rate because uncacheable memory is not interleaved like local memory.

On old AMD APUs, CPU access to local memory is slow as shit, as in less than 6% of the max bandwidth available over garlic to the GPU, because those older APUs only supported one outstanding read from local memory to the CPU.

Is there a reference to Onion or Garlic for the latest APUs? The slides for Renoir I saw only show the Infinity Fabric, which would be consistent with the goal of making the interconnect modular.
The fabric should be able to carry transactions with the same coherent/non-coherent properties, just without needing bespoke pathways.
However, even with a modern architecture, is it certain there wouldn't be a significant penalty for uncacheable reads? That access type is often serializing and disables many speculative optimizations.
 
Who could've known that consumers were so fickle as to move on without letting MS know beforehand?
Not sure if your post was sarcasm, but I'll take it as not.
Ummmm, I (and no doubt others here) was very vocal about it back then on these forums. Due to its latency and inaccuracy it wasn't well suited to most games. Sure, it worked for party/fitness stuff etc., but other stuff, nah. Yes, Kinect was big at the start because of the novelty, but after a while the novelty wears off, and then it's a case of the emperor's new clothes being found out.
 
Not sure if your post was sarcasm, but I'll take it as not.
Ummmm, I (and no doubt others here) was very vocal about it back then on these forums. Due to its latency and inaccuracy it wasn't well suited to most games...
Which is where version 2 improved on all those things. But instead of consumers being even more impressed and immersed, loving the idea of a console where that beloved movement interface was way better and incorporated from the ground up, as standard, for use in every game (such as the COD player being able to duck while playing with the controller), they shrugged it off as old hat.

It's that sort of response no company can guard against. Consumers are fickle. You never know what they'll take to or when they'll drop it. MS can't be blamed for making dumb decisions when those decisions look sensible on paper (not TV TV TV but the system ideas) but the world around those ideas changes quickly and without forewarning such that those decisions turn out to be the wrong ones.
 
I'd pair that lack of software with mandatory Kinect and the increase in price there. But even that isn't a slight on MS's judgement - Kinect was super popular, so it made sense to incorporate it into the new platform as a USP. Who could've known that consumers were so fickle as to move on without letting MS know beforehand?
I am pretty sure they could, because Kinect sales and implementation on the original 360 dropped as fast as they peaked. Plus, they should have factored it into the price and design of the Xbox One. The 360's price and design were independent, and Kinect was optional, only for those that were interested in it.
 
I am pretty sure they could, because Kinect sales and implementation on the original 360 dropped as fast as they peaked.
They did?

[Chart: Xbox Kinect worldwide shipments]


Looks to me like sales kept up constant growth and went on to reach 35 million units.
 
They did?
Kinect was used in a bunch of healthcare and non-gaming applications. In my previous role, our emerging technology team did a lot of work using multiple Kinect sensors; we bought loads of them and then deployed software solutions built on the tech to people who also bought loads of them.

I guess it depends on whether you're looking at raw sales of the device vs. how much it was used for the purpose Microsoft designed it for.
 
That's actually a good point. Still, MS will have made their decision for the next console based on their data for the existing Kinect. They had the Live stats on play time and everything else. They knew whether these things were being actively used or sitting in drawers next to the Wii. :p
 
That's actually a good point. Still, MS will have made their decision for the next console based on their data for the existing Kinect. They had the Live stats on play time and everything else. They knew whether these things were being actively used or sitting in drawers next to the Wii. :p

Yeah, you have to assume that most connected consoles (and other devices) are feeding anonymised usage data about the console, game and peripherals back to the manufacturer. Nothing we used Kinect for would ever be sending data back to Microsoft.

Maybe this time Xbox will be powerful enough to allow for a cheap, simple IR/cam sensor.

Like the PS4 Camera, which was surprisingly good, including the voice recognition done by the PS4.
 