Xbox One Slim

I understand Microsoft went with DDR3 for price/density and a safer bet to reach 8GB of RAM, but damn has that decision come to bite them in the ass.
Now they're stuck with a memory type that will be technically very difficult to replace because of its low latency, and one that also gets in the way of making a smaller PCB for the console.

I wonder if MS could counter the latency issue of switching memory through a combination of low-latency binning (DDR4 timings on the desktop can be squeezed fairly aggressively from what I've seen), shorter traces, and a faster memory controller that took up some of the slack in presenting data to the processors. As long as the data arrives when it's needed under the original timings it should be okay, although achieving that may be very difficult.
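A quick sanity check on that idea: what matters to the SoC is absolute latency in nanoseconds, not CAS cycles, and a rough figure is CL divided by the I/O clock (half the transfer rate). The CL values below are illustrative bins, not anything MS has announced:

```python
# Back-of-envelope CAS latency comparison: DDR3 vs a hypothetical
# aggressively binned DDR4 replacement. CL alone is misleading
# because it counts cycles, and faster clocks mean shorter cycles.

def cas_latency_ns(cl: int, transfer_rate_mt_s: int) -> float:
    """CAS latency in nanoseconds: CL cycles at the I/O clock,
    which runs at half the transfer rate for DDR memories."""
    clock_mhz = transfer_rate_mt_s / 2
    return cl * 1000 / clock_mhz

# Xbox One's DDR3-2133 (CL14 is an assumption for illustration)
ddr3 = cas_latency_ns(cl=14, transfer_rate_mt_s=2133)   # ~13.1 ns

# A tightly binned DDR4-4266 part (again, illustrative timings)
ddr4 = cas_latency_ns(cl=19, transfer_rate_mt_s=4266)   # ~8.9 ns

print(f"DDR3-2133 CL14: {ddr3:.1f} ns, DDR4-4266 CL19: {ddr4:.1f} ns")
```

So a well-binned DDR4 part could in principle match or beat DDR3's absolute latency; the hard part is doing that at console prices and volumes.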

There's also the option of HBM2, where a single 8GB stack would fit on the interposer. The trouble then is you're paying for bandwidth that isn't needed - unless you could remove the eSRAM. Then you're back to potential latency problems, I guess.
 
That card uses GDDR3.
DDR3/4 offers a 16-bit width per chip, whereas GDDR3/5 chips have a 32-bit width.
So AFAIK if that card used DDR3, it would need twice the number of memory chips (at least 8?) to achieve 128-bit.
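The chip-count arithmetic here is just bus width divided by per-chip interface width. A sketch, using the 128-bit card above and the XBone's 256-bit DDR3 bus as examples:

```python
# Memory chips needed to populate a bus of a given width,
# given the per-chip interface width (16-bit for DDR3/4 devices,
# 32-bit for GDDR3/5 devices).

def chips_needed(bus_width_bits: int, chip_width_bits: int) -> int:
    assert bus_width_bits % chip_width_bits == 0
    return bus_width_bits // chip_width_bits

print(chips_needed(128, 32))  # GDDR on a 128-bit card -> 4 chips
print(chips_needed(128, 16))  # DDR3 on the same bus   -> 8 chips
print(chips_needed(256, 16))  # XBone's 256-bit DDR3   -> 16 chips
```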
Sorry, you are correct. But again, this isn't much of an issue - W2100 facilitates 8 devices in a low profile, half length form factor.
 
For most console gamers, that's what it already is, hence its sales relative to the PS4.

Unfortunately, it's also unnecessarily expensive to make because of stuff the market has already demonstrated that it doesn't want.

Exclusive games and being cost-competitive are the only weapons left for MS to use. That HDMI input that didn't sway the market when the machine was new? It's not going to sway the market now either, especially not with no one buying Kinect to control the no-longer-bundled device ... but it is going to continue to add cost.

Yeah, I agree.

I'm pretty sure that Microsoft have some very detailed metrics that show exactly how much people value/use HDMI-in and Snap functions. Exactly as they slowly removed some of the Kinect OS navigation, they may start to remove other features to free up memory or processing requirements (unless it's negligible).

It may have been the original "soul" of the box, but it has definitely been reconfigured over the years and even prior to release.

People only ever buy consoles for games; everything else (no matter how much it's used) is only ever secondary.

Based on the market for devices that provide some smart TV functionality, and the competition, I simply cannot comprehend why this'd be an option for Microsoft. All new TVs already provide these capabilities, so the (already small) market can only get smaller.
 
It's actually cheaper for Apple TV than the iPhone and iPad. Apple reduced the rate to 15% for the Apple TV from the customary 30% to drive acceptance. Apple controls the in-app purchase policy on its devices. You could distribute an app and use a workaround like having your interested parties sign up through your website, but if you wanted to allow easy sign-ups through the app, Apple would be owed a cut.

The standard rate in the T&Cs is 15%, but video content providers all seem to have undisclosed deals with Apple. This virtually guarantees no large video content provider is paying 15% of their sign-ups to Apple. I can't imagine many sign-ups happen on the Apple TV anyway - not yet at least. I'm not convinced the new platform will be any more popular than the original platform.

I'm pretty sure that Microsoft have some very detailed metrics that show exactly how much people value/use HDMI-in and Snap functions. Exactly as they slowly removed some of the Kinect OS navigation, they may start to remove other features to free up memory or processing requirements (unless it's negligible).

Yeah, I'm sure both Microsoft and Sony collect usage info about how their consoles are used. It's the most reliable way of determining which functions are most used.
 
Sorry, you are correct. But again, this isn't much of an issue - W2100 facilitates 8 devices in a low profile, half length form factor.

From the pictures it appears that there are only 4 RAM chips per side, with 8 achieved by using both sides of the board. Being such a tiny board, using more layers probably isn't cost-prohibitive, whereas it might be for the football-pitch-sized X1 board.

For whatever reason, MS chose not to place memory (in a non-clamshell configuration) on both sides of the board.
 
Do you have a source for this?
http://www.ps3devwiki.com/ps4/★_Debug_Settings
Slow HDD mode (retail mode presumably)
It was not present in the first versions of the OS. There's a story where devs were confused about why an SSD did not speed up load times on a devkit.

The PS3 also has this. The PS3 had several generations of HDD, from 20GB to 500GB, of course with different speeds.


MS won't do such a drastic change to DDR4. Also, 4266 DDR4 won't be very cost-effective until the end of the gen.
 
From the pictures it appears that there are only 4 RAM chips per side, with 8 achieved by using both sides of the board. Being such a tiny board, using more layers probably isn't cost-prohibitive, whereas it might be for the football-pitch-sized X1 board.
This is only a 6-layer board, and there can be a crossover point where a smaller PCB with more layers is cost-neutral or even beneficial. Size reductions generally come with die shrinks, which have ramifications for other components such as power circuitry and thermal requirements; all of these play into the overall system design and impose a different set of characteristics and considerations on the PCB.
 
MS won't do such a drastic change to DDR4. Also, 4266 DDR4 won't be very cost-effective until the end of the gen.

LPDDR4 seems to have a more aggressive roadmap than DDR4, and to be better suited to higher clocks. Intel expect LPDDR4 to be more widely used than LPDDR3 in 2017, and JEDEC were expecting 4266 to arrive in 2015, which Samsung seem to think they've started to deliver.

Adoption of LPDDR4 is going to be very rapid, and due to volume LPDDR4 may end up being cheaper than DDR4.
 
Adoption of LPDDR4 is going to be very rapid, and due to volume LPDDR4 may end up being cheaper than DDR4.
Don't hold your breath :nope: The price of LP chips isn't high because of a lack of economies of scale; it's because high-performance devices have few other choices. When you have a captive market, you charge a premium. Also, where Samsung are involved, don't look for bargains, look for price fixing. It's kind of their thing(tm).
 
But again, this isn't much of an issue - W2100 facilitates 8 devices in a low profile, half length form factor.
(...)
This is only a 6 layer board.

... on a card with a 90mm^2 chip that consumes 25W or so. Plus, it's a FirePro card where margins are large, so adding more layers to the PCB may not pose a problem for profitability.

I imagine that doing that with a 360mm^2 chip that consumes ~100W (or say 200mm^2 using FinFET on a 65W budget) wouldn't be as easy. Plus, that 128-bit solution needs 8 chips. A shrunk XBone would always need 16 chips.


Regardless, the XBone's PCB doesn't seem crowded at all. I'm sure they can make it a lot smaller and more power-efficient. Just not as small and efficient as Sony can, because the PS4 only needs half the memory chips, and its higher-latency GDDR5 may allow an easier transition to e.g. HBM2.

LPDDR4 seems to have a more aggressive roadmap than DDR4, and to be better suited to higher clocks. Intel expect LPDDR4 to be more widely used than LPDDR3 in 2017, and JEDEC were expecting 4266 to arrive in 2015, which Samsung seem to think they've started to deliver.

Adoption of LPDDR4 is going to be very rapid, and due to volume LPDDR4 may end up being cheaper than DDR4.

I for one hope Intel and AMD support LPDDR4 on all their non-socketed solutions. IMHO it was already a shame that Skylake Y and U only support LPDDR3 and DDR3L.
Imagine a tablet with a Core M using 4266MT/s LPDDR4. That's ~68GB/s on a 4-chip, 128-bit solution (like the one in the Surface Pro 3/4 and most Core M tablets). eDRAM would probably be put aside for GT3 solutions at that point.

Don't hold your breath :nope: The price of LP chips isn't high because of a lack of economies of scale; it's because high-performance devices have few other choices. When you have a captive market, you charge a premium. Also, where Samsung are involved, don't look for bargains, look for price fixing. It's kind of their thing(tm).
Micron and Hynix are producing LPDDR4 too.
 
... on a card with a 90mm^2 chip that consumes 25W or so. Plus, it's a FirePro card where margins are large, so adding more layers to the PCB may not pose a problem for profitability.

I imagine that doing that with a 360mm^2 chip that consumes ~100W (or say 200mm^2 using FinFET on a 65W budget) wouldn't be as easy. Plus, that 128-bit solution needs 8 chips. A shrunk XBone would always need 16 chips
Err... the point is that it's pretty clear the memory layout isn't much of an inhibitor to a more optimised layout (from a size perspective) with relatively low layer counts.
 
I'm pretty sure that Microsoft have some very detailed metrics that show exactly how much people value/use HDMI-in and Snap functions. Exactly as they slowly removed some of the Kinect OS navigation, they may start to remove other features to free up memory or processing requirements (unless it's negligible).

It may have been the original "soul" of the box, but it has definitely been reconfigured over the years and even prior to release.
We also have other reference points of designs that get trimmed. The PS2 lost its early FireWire port IIRC. It also lost its 'expansion bay' after the slim got internal networking. The PS3 had quite a few trim-downs ahead of release, and had BC hardware removed. So I think we should expect little-used features to get axed: if it's little used, it's not wanted, doesn't add to desirability, and is thus dead weight on the cost of the machine.
 
For whatever reason, MS chose not to place memory (in a non-clamshell configuration) on both sides of the board.
Easier for the manufacturing plant to just solder & place all the chips on one side?

Perhaps part of the 3GB reservation is for the XDK environment. The back of the mobo doesn't seem to be designed for placing extra chips.
 
It's a general rule, not the exception.

So if a company paid to put their product on there, why would they let them take the product off? Yet it seems to be a complaint with many smart TVs and many Blu-ray players out there.
 
Easier for the manufacturing plant to just solder & place all the chips on one side?

Perhaps part of the 3GB reservation is for the XDK environment. The back of the mobo doesn't seem to be designed for placing extra chips.

I suppose so, yeah. If you have room for a huge mobo, you might as well use it, I guess. And I suppose devkits are limited to 8GB and use part of the snap/OS/whatever 3GB for development.

Also, having memory on only one side saves having to engineer airflow through the underside of the case like with the 360. And whereas the 360 had lots of airflow, the X1 uses a far more efficient cooler and minimal airflow. It was funny to see emergency revisions of the 360 add heat-transfer pads to help the underbelly memory chips transfer heat to the metal RF shield, after the air drawn from under the board turned out not to be enough to keep the memory cool.
 
We also have other reference points of designs that get trimmed. The PS2 lost its early FireWire port IIRC. It also lost its 'expansion bay' after the slim got internal networking. The PS3 had quite a few trim-downs ahead of release, and had BC hardware removed. So I think we should expect little-used features to get axed: if it's little used, it's not wanted, doesn't add to desirability, and is thus dead weight on the cost of the machine.
With regards to TV use, one reference point is that, of the ~24 dashboard updates, at least half have included TV functionality improvements.
 