Xbox One Slim

As I understand it, TV use and the HDMI in is basically limited to US. Is this incorrect?

Yes, that is incorrect. HDMI-in, TV, and TV tuner use work everywhere. However, not all countries have full, definitive Guide data.

In the dashboard preview forums there are a substantial number of users from Europe, and they love that functionality. There are a handful from a Scandinavian country that is missing guide data, but they seem happy with the rest of the functionality regardless.
 
So if a company paid to put their product on there, why would they let them take it off? Yet it seems to be a common complaint with many smart TVs and Blu-ray players out there.

Netflix doesn't pay to have its app on the device; it pays when that device serves as a catalyst in creating a new subscriber.
 
Err... the point being, it's pretty clear the memory layout isn't much of an inhibitor to a more optimised board (from a size perspective) with relatively low layer counts.

And I agree with you, but the mandatory 16 DDR3 chips will always prevent the XBone's PCB from shrinking below a certain area and power consumption level, more so than the PS4's memory system does.



As I understand it, TV use and the HDMI in is basically limited to US. Is this incorrect?
HDMI-in works everywhere, at least.
 
And I agree with you, but the mandatory 16 DDR3 chips will always prevent the XBone's PCB from shrinking below a certain area and power consumption level, more so than the PS4's memory system does.
Well, I don't think we'll be seeing credit-card-sized PCBs any time soon; the smallest XBOX 360 was still quite large. If the optical drive is still a requirement, then that's going to put considerable constraints on any shrink.

As for power on the memory subsystem, GDDR5 shifts a higher power requirement onto the ASIC side due to the PHY speed requirements.

As I understand it, TV use and the HDMI in is basically limited to US. Is this incorrect?
There were initial issues with PAL regions being 50Hz and the XBOX UI defaulting to 60Hz, which caused judder on the overlay, but that was one of the fixes in the earlier dash updates.
 
Yeah, I agree.

I'm pretty sure that Microsoft have some very detailed metrics that show exactly how much people value/use HDMI-in and Snap functions. Just as they slowly removed some of the Kinect OS navigation, they may start to remove other features to free up memory or processing headroom (unless the cost is negligible).

It may have been the original "soul" of the box, but it has definitely been reconfigured over the years and even prior to release.

People only ever buy consoles for games; everything else (no matter how much it's used) is only ever secondary.

Based on the market for devices that provide smart TV functionality, and the competition, I simply cannot comprehend why this would be an option for Microsoft. All new TVs already provide these capabilities, so the (already small) market can only get smaller.

I thoroughly enjoy the feature because I'd rather depend mostly on a single interface. I'm in a circumstance where I can make use of a lot of the Xbox's features, and I find that preferable to navigating the separate UIs of a console, a cable box, and a smart TV, which are the devices currently in my setup. It's not like TV manufacturers or cable operators are known for their ingeniously designed, innovative UIs. My main remote now has twin analog sticks and can be used for gaming, because I readily move between gaming, watching TV, and streaming apps, and the Xbox controller is the more suitable device for my usage.

Most people are probably not me in terms of personal taste or circumstance, and most are primarily motivated by the gaming capabilities of a console, but on average these ancillary services are readily used. Perhaps it's not that people don't value these non-gaming features; it's just that most don't believe in sacrificing gaming performance or paying extra for them.

But people don't pay for a lot of third-party PC software and the capabilities that software offers, even though they believe it's an indispensable part of their PC experience.
 
Well, I don't think we'll be seeing credit-card-sized PCBs any time soon; the smallest XBOX 360 was still quite large. If the optical drive is still a requirement, then that's going to put considerable constraints on any shrink.

Sony have gone for some PCBs that are much smaller than the case they sit in - presumably allowing them to drop the optical drive lower and reduce the overall height of the system. MS otoh seem to always fill the base of the case and sit everything above the PCB. They seem to be very careful with cooling of memory now, after the overheating issues with the RAM under the PCB on the first 360s.

With little airflow needed by the X1's efficient cooler (silence being an important design goal for the X1), drawing enough air around the underside of the system may not have been seen as an option.
 
I'm not aware that DDR3 needs active cooling these days, and there have likely been memory process transitions since initial development that buy more margin. Thermal pads to a metal shield are likely a "belt and braces" solution.
 
DDR3 2133 won't be the hottest or coolest of memory, but if you had eight chips plus the heat passing through the PCB from the fairly toasty APU, I can imagine needing at least some airflow - no matter how small - to be sure the RAM stayed cool.

The revised 360s used thermal pads to the metal shield, while the PS4 from day one had thermal pads from the underbelly memory to the metal shield, plus airflow over the shield to take the heat away. I can't think of any modern console (including the likes of the Wii) that doesn't draw air across memory. On the PC, DIMMs with no heat spreader only go up to about 1600 MHz (though perhaps this is for marketing of higher clocks), and even then there will be either airflow within the case or room to radiate or convect heat away.

My DDR3 2400 has small heatspreaders, or I'd try licking it to see if it got warm.
 
Yeah those chips consume a fair amount of power.

According to Micron, an x8 chip consumes half a watt, and these x16 parts would consume much more, since a lot of the power is from the interface. Twice the width is twice the termination to burn. This is probably way beyond what convection cooling can deal with. New processes wouldn't change what a 256-bit-wide DDR3 interface consumes at 2133, 1.5V.
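
A rough back-of-envelope based on those figures (the x16 per-chip draw is my assumption, scaled up from Micron's x8 number, not a datasheet value):

```python
# Back-of-envelope DDR3 subsystem power, using the figures above.
# The x16 per-chip draw is an assumption scaled up from Micron's x8 number,
# not a datasheet value.
X8_CHIP_W = 0.5    # Micron's quoted draw for one x8 DDR3-2133 chip
X16_CHIP_W = 0.8   # assumed: double the width means more termination to burn

chips = 16         # 16 x16 chips make up the XB1's 256-bit bus
total_w = chips * X16_CHIP_W
print(f"memory subsystem ~= {total_w:.1f} W")  # 12.8 W at 1.5V
```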
 
1.35V SO-DIMMs are available at 2133MHz without heat spreaders, and they are designed to operate in tight confines with no airflow.
 
I didn't know they went that high! Seems like they've only arrived on the market since the X1 launched.

Following up on MrFox's comment, these will only be on x8 interfaces though? And would there be a price penalty for using lower-voltage parts, as I'm guessing they're just "special bins"?*

*1.35V seems to imply DDR3L - basically DDR3 - which is different from LPDDR3, iirc.
 
1.35V SO-DIMMs are available at 2133MHz without heat spreaders, and they are designed to operate in tight confines with no airflow.

Aren't those high-binned or cherry-picked parts selling for a premium though?
And I can't seem to find any DDR3L module at 2133 without a heatsink in the internets...
 
I haven't seen 2133 DDR3L from any memory manufacturer yet (even Micron's top binning is 1866), but that would be a great solution: it's only 400mW per x16 chip, which is a very big drop.
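
A quick sketch of what that drop would mean across all 16 chips (the 1.5V per-chip figure reuses my earlier assumption; only the 400mW number is the one quoted here):

```python
# Hypothetical saving from moving all 16 x16 chips to DDR3L@2133.
# 0.8 W/chip at 1.5V is the same assumption as above; 0.4 W is the
# per-chip figure quoted in this post.
chips = 16
ddr3_w, ddr3l_w = 0.8, 0.4
print(f"DDR3 1.5V:   {chips * ddr3_w:.1f} W")   # 12.8 W
print(f"DDR3L 1.35V: {chips * ddr3l_w:.1f} W")  # 6.4 W, roughly half
```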

But if they need to change the SoC to support DDR3L@2133, they might as well change it to use DDR4 to get massive volume @2133 instead of trying to source heavily binned DDR3L parts?
 
This is my point earlier - process transitions happen more frequently on RAM than in other areas because it's easy and cheap to build defect resilience; DDR3L almost certainly started life as a bin, but where is the process now? 1.5V on DDR3 is required for compatibility, but that doesn't speak to where current memories are actually binning out.

It may or may not be relevant (I don't know), but the DDR3 controller on PC Jaguars supports DDR3L.

As for the modules - here's a Newegg list. I have a G.Skill Ripjaws module, and the overlay on there is nothing more than a metallic sticker.
 
So that would apply to the SoC too, right? If they go down to 14nm, it would make it easier for its controller to support 2133 at 1.35V?
 
So, anybody care to make predictions?

TV SKU:
APU 14/16nm half the die size, half the power, half the price (~$60)
DDR3L ($30)
128GB flash storage ($30)
No optical drive!!
Internal PSU
Compact physical package

$100-$150 cheaper than current (Kinect free) XB1

Slim SKU:
APU as above
DDR3 ($25)
Optical drive ($20)
1TB HDD ($30-35)
Internal PSU
Smaller physical package than current XB1

$50-$100 cheaper than current (Kinect free) XB1
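
As a quick sanity check, here's the sum of those itemised guesses (every number is just the speculation above, with the HDD taken at the midpoint of the range):

```python
# Rough totals for the itemised part guesses above (pure speculation;
# the HDD is taken at the midpoint of the $30-35 guess).
tv_sku   = {"APU 14/16nm": 60, "DDR3L": 30, "128GB flash": 30}
slim_sku = {"APU 14/16nm": 60, "DDR3": 25, "optical drive": 20, "1TB HDD": 33}
for name, bom in (("TV SKU", tv_sku), ("Slim SKU", slim_sku)):
    print(f"{name}: ${sum(bom.values())} in listed parts")
```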

Cheers
 
Why are there two SKUs? Are they physically different just because they use different storage devices?

Or are you saying the TV SKU is supposed to be embedded into an actual TV?
 
I think he means something small like Apple TV.

Would have thought they'd go with the same motherboard for any multi-SKU plan though.

Aren't streaming devices kind of a flooded market at this point, or is the draw simply streaming services + Xbox games instead of iOS, Shield, or Amazon "gaming"?

Maybe they should just go more Xtreme with the chassis+mobo redesign, considering the PS4 is already small in comparison. Forget about chasing the tiny streaming console that could game.
 
The SKU with HDD and optical drive has to be bigger to accommodate the devices. It also needs more cooling, since more power is dumped as heat in the case.

I don't see the TV SKU as a streaming-only device. It is an entry-level device, but still a fully fledged XB1 with limited storage. Users will have to add external storage as they go along.

Cheers
 