News & Rumors: Xbox One (codename Durango)

All problems need to be distilled down to a dollar value when considering these design choices within the console business. What's the dollar price of 200 W versus 150 W?
 

It's probably significant if you're looking at margins on a consumer device. Say it was $10. Then that could be the difference between break-even and a $10 loss. At $10 per unit, a generation's worth of sales (call it 50 million consoles) works out to a $500 million loss. I would expect that when they design these things they have a price point in mind, and a target margin, so the design follows from there. $399 is probably a good number.
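A back-of-the-envelope sketch of that arithmetic; both figures below are assumptions for illustration, not real BOM numbers:

```python
# Hypothetical sketch: how a per-unit cost delta scales over a console
# generation. Both inputs are assumptions, not actual figures.
cost_delta_per_unit = 10.00      # assumed extra BOM cost of the hotter design, USD
units_sold = 50_000_000          # assumed lifetime unit sales for the generation

total_impact = cost_delta_per_unit * units_sold
print(f"Lifetime margin impact: ${total_impact:,.0f}")  # $500,000,000
```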
 
On the flip side though, there must be a clear value benefit of a 200 W console in performance and ability to attract customers. If $10 is the difference between 200 W and 150 W, and the performance is so much better, isn't it worth it in the long run? But if the cost is $50, maybe not. I've no idea how one tries to gauge probable appeal versus determinable costs and come up with a choice!
 

I'd expect 50 W costs significantly more than $10. I was just saying, they design for a price point, most likely. A launch price of even $450 doesn't seem likely at this point.
 
Cloud compute doesn't get rid of the wattage. It just moves it somewhere else, with cost to pass on to the consumer.
Economies of scale come into play: the infrastructure of a dense computing system, with its cooling and other efficiencies, as well as the design of the hardware, will result in lower per-head power consumption. On the cost front, any industrial customer is going to pay lower on-grid power rates than retail/consumer plans. You'll also find that many of the larger datacenters are planned in conjunction with their power needs to reduce costs, even to the point where some are going off-grid and developing their own efficient power generation for the datacenter.
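As a rough illustration of that rate gap, here is a sketch comparing one console's annual energy cost at an assumed retail tariff versus an assumed industrial tariff; all figures are placeholders, not actual rates:

```python
# Hypothetical comparison of annual energy cost for 150 W of compute,
# billed at assumed retail vs. industrial electricity rates.
draw_watts = 150                 # assumed load while gaming
hours_per_day = 4                # assumed daily usage
retail_rate = 0.13               # USD/kWh, assumed consumer tariff
industrial_rate = 0.07           # USD/kWh, assumed bulk tariff

kwh_per_year = draw_watts / 1000 * hours_per_day * 365
print(f"{kwh_per_year:.0f} kWh/yr")
print(f"retail:     ${kwh_per_year * retail_rate:.2f}/yr")
print(f"industrial: ${kwh_per_year * industrial_rate:.2f}/yr")
```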
 
On the flip side though, there must be a clear value benefit of a 200 W console in performance and ability to attract customers. If $10 is the difference between 200 W and 150 W, and the performance is so much better, isn't it worth it in the long run? But if the cost is $50, maybe not. I've no idea how one tries to gauge probable appeal versus determinable costs and come up with a choice!

When bean counters look to justify even 0.5 USD, it has to be compelling not only for performance but in profit potential. Granted, for consoles, performance does correlate to profit potential somewhat. But that's where you have the delicate balancing act of "good enough" (PS4, immediate profits and sales) versus too little (XBO, can't match the competition and lost sales) versus too much (PS3, too expensive and too complicated to make games for initially, which led to lost sales).

Regards,
SB
 
The only slightly interesting thing this Gen is HTF MS could have cocked things up so badly and so completely.

I hope someone writes a book.
 
Economies of scale come into play: the infrastructure of a dense computing system, with its cooling and other efficiencies, as well as the design of the hardware, will result in lower per-head power consumption. On the cost front, any industrial customer is going to pay lower on-grid power rates than retail/consumer plans. You'll also find that many of the larger datacenters are planned in conjunction with their power needs to reduce costs, even to the point where some are going off-grid and developing their own efficient power generation for the datacenter.

I think it would be interesting to see a full chain cost and power evaluation.
In terms of consumer cost, a local console box cannot overwhelm the typical dwelling's ability to deliver power and dissipate heat, thanks to latent capacity, the building's bulk, and the facilities that already exist for all other uses.
There is an incremental cost in terms of 100-200 watts of power and AC load, which at least in the US is going to fall within the peak capability of existing infrastructure without problem. AC is sized more by the size of the dwelling, so a console's contribution is dwarfed by exterior conditions, and the residential circuits and utilities need to supply bigger peak draws anyway. If there are cool seasons or time periods, the natural dissipative capacity of the building or natural air flow can reduce all but the electrical draw to zero additional cost.
Additionally, a residence does not normally need redundant AC units, very high air quality, and a UPS is generally optional.
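To put numbers on the latent-capacity point, a quick sketch against one standard US branch circuit; the 80% continuous-load derate is a common NEC rule of thumb, and the console wattage is assumed:

```python
# Sketch: a console's draw vs. one ordinary US 15 A residential circuit.
circuit_volts = 120
circuit_amps = 15
continuous_derate = 0.80         # common NEC rule of thumb for continuous loads

usable_watts = circuit_volts * circuit_amps * continuous_derate  # 1440 W
console_watts = 200              # assumed worst-case console draw
print(f"Console uses {console_watts / usable_watts:.0%} of one circuit")  # ~14%
```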

Data centers are so dense that they overwhelm air's natural capacity to transport heat without heroic measures, and general reliability concerns mean redundant hardware and tight ambient control.
A full data center requires its own transformer or generation capability, physical-plant overhead, real estate, and specialized construction, all separate from society's existing infrastructure and housing. I would be curious what the power consumption would be for X amount of bandwidth over a full network backbone, through exchanges, down the last mile, and to a router, versus several mm of PCB traces or an HDMI cable.

Per the following, a local device basically does not need half of what goes into the power budget needed by a data center, as it has a free ride on the existing capabilities needed for everything else.
http://www.emersonnetworkpower.com/documentation/en-us/brands/liebert/documents/white%20papers/data-center-energy-efficiency_151-47.pdf
The capital requirements are effectively zero until the average home runs out of a rectangular area of free floor or shelf space.
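One hedged way to quantify that overhead is PUE (Power Usage Effectiveness), the facility-power to IT-power ratio that white papers like the one linked above revolve around; the PUE values below are assumed, typical-range figures:

```python
# Sketch: wall power needed per 150 W of useful compute at assumed PUE levels.
it_load_watts = 150

for pue in (2.0, 1.5, 1.1):      # assumed: older facility, decent, state of the art
    print(f"PUE {pue}: {it_load_watts * pue:.0f} W at the wall")
# A console in a living room amortizes no cooling plant, UPS, or power
# distribution of its own, so its effective facility overhead is ~zero.
```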

I can see places where there are ways to get around it, such as centers located in cold climates or using naturally flowing water, but that incurs geographic or other practical costs.
Data centers start from a level that entails measures considered heroic by any consumer standard, so savings come from a baseline in a different realm.
 
Capital expenditure is a very different prospect to power expenditure.

Part of the savings from a data center comes from the fact that there is not a 1-to-1 relationship between the hardware devices and the end users; it's by definition one-to-many. Sizing the capacity for the most effective use across different time zones' load peaks is key to gaining efficiency.
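A toy sketch of that multiplexing argument, with made-up regional peaks and an assumed overlap factor:

```python
# Hypothetical: peak concurrent sessions across time zones never sum to the
# total of each region's own peak, so the shared pool can be sized smaller.
regional_peak_sessions = {"NA": 1_000_000, "EU": 900_000, "APAC": 800_000}
overlap_factor = 0.60            # assumed worst-moment overlap between regions

naive = sum(regional_peak_sessions.values())
shared = naive * overlap_factor
print(f"per-region sizing: {naive:,}   shared sizing: {shared:,.0f}")
```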

Yes, from a cooling perspective, the TDP density is going to require heroic measures for that site, but at the individual computing device level it is going to be more efficient overall, because it is built to do that exclusively, and in a known and designed environment. Google (and I'm sure many other large datacenter operators) already employs water cooling for its datacenters because it's more efficient to do that.

Economics / financial reporting is also going to be another factor. Companies running datacenters are going to be motivated to increase efficiencies, as this is a direct improvement on their bottom line.
 
I'm currently working on putting up to 210 kW into a roughly 750 square-foot room. Air conditioning on that scale is interesting. Happen to be buying Liebert.
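For scale, that works out to a very high power density (the comparison figure in the comment is an assumption from general industry rules of thumb):

```python
# Power density of the room described above.
watts, area_sqft = 210_000, 750
print(f"{watts / area_sqft:.0f} W/sq ft")  # ~280 W/sq ft, vs. tens of
                                           # W/sq ft in many conventional
                                           # raised-floor rooms
```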
 
Well, my TV that I bought last Christmas decided to blow some component, and I've lost all picture and sound. Tried the remote feature on the Xbox. Up until now I felt kind of indifferent towards the feature, even though I could see its uses, as I've never felt compelled to play a game with a controller while on the shitter.

But after trying it out on a Yoga 2, I find it's a competent alternative to lugging my XB1 up the stairs to another TV until I repair or replace that year-and-a-few-days-old TV. I spent the first few hours after the TV went down catching 4K fever, until a shot of cheapness cleared it up.
 
Capital expenditure is a very different prospect to power expenditure.
I was perhaps poorly trying to cover both, in terms of power cost at a macro scale and the costs passed onto the consumer.
It's a hefty price that is going to be paid by someone, and the cloud model has not fully accounted for the intransigence of telecoms and their desire to double-dip in information services. There are costs being treated as external by the data center, which the market is trying to reconcile either with peering agreements, punitive throttling, or various methods of billing/monetizing the consumer.

The power calculation for cloud center efficiency also needs to reckon with the cost of communication between the data center and the consumer. The energy expenditure of communication within the console and within the console+TV loop is pretty well accounted for, while the telecom trunks, repeaters, and whatnot would be complicated to model (given their shared nature) but would be non-zero.
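A heavily hedged sketch of that network-energy term; published estimates of per-GB transmission energy span more than an order of magnitude, so both inputs below are placeholders:

```python
# Hypothetical: energy cost of moving a game-video stream across the network,
# expressed as an equivalent continuous wattage. Both inputs are assumptions.
kwh_per_gb_network = 0.01        # assumed end-to-end transmission energy
gb_per_hour = 10                 # assumed stream volume (~22 Mbit/s)

watts_equivalent = kwh_per_gb_network * gb_per_hour * 1000
print(f"~{watts_equivalent:.0f} W equivalent while streaming")
# The HDMI cable or PCB traces moving the same pixels locally cost
# fractions of a watt.
```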

Part of the savings from a data center comes from the fact that there is not a 1-to-1 relationship between the hardware devices and the end users; it's by definition one-to-many. Sizing the capacity for the most effective use across different time zones' load peaks is key to gaining efficiency.
Is this a savings in terms of the expense of the devices being sold, the amount of hardware manufactured, or power?

For something that can be locally computed, the power cost of non-utilized hardware is made up of consoles that are not turned on or still in a box (dissipation 0), or in some form of standby. Beyond local functions like USB charging or Kinect, such consoles burn wattage in the single digits.
Additional power related to cooling goes to zero when any power dissipation can be handled passively.

The standby numbers for a top-of-rack switch that might be used in a cloud server rack can pretty much match those of all the idling consoles it has ports for, and facility systems are constrained in just how idle they can go. The need to keep utilization high for a data center, and the very high odds that at least some of the blades are active, leave comparatively coarse options for managing a significant fraction of the facility's power budget.
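An order-of-magnitude sketch of that standby floor; all wattages are assumed:

```python
# Hypothetical: an always-on top-of-rack switch vs. the idle consoles
# whose ports it serves. Figures are assumed orders of magnitude.
switch_idle_watts = 100          # assumed ToR switch draw, 24/7
ports = 48
console_standby_watts = 1        # assumed deep-standby draw per console

fleet_standby = ports * console_standby_watts
print(f"switch: {switch_idle_watts} W   {ports} idle consoles: {fleet_standby} W")
```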

The hardware savings do not reach the ideal, considering that the uptime and QoS requirements are so different, and that one side gets them innately while the other has to work harder to accomplish them. A local device's uptime requirement is pretty close to however long a user wants it on, versus 24/7 for thousands to millions of users in the face of various local and regional failure modes.

Yes, from a cooling perspective, the TDP density is going to require heroic measures for that site, but at the individual computing device level it is going to be more efficient overall, because it is built to do that exclusively, and in a known and designed environment. Google (and I'm sure many other large datacenter operators) already employs water cooling for its datacenters because it's more efficient to do that.
Water's superior thermal capacity is a good way to beat air's limitations. I am curious about the numbers, and how it measures up to local boxes whose draw is in the noise or goes to zero, which is a high bar to reach.

Economics / financial reporting is also going to be another factor. Companies running datacenters are going to be motivated to increase efficiencies, as this is a direct improvement on their bottom line.
Cloud centers are by default something that is in addition to the infrastructure and resources that exist, so at least right now they are always an additive cost. That is certainly justifiable as a powerful enabling facility for things that local resources cannot provide.
However, the other trend these centers benefit from is the consigning of functions that were once autonomous into subscription services. The tradeoffs and the net effect vary by use case, but from a macro perspective they are streamlining something that exists on top of existing costs, sort of like a monkey on one's back going on a diet.
 
Now for something different...

Per latest weekly email from Xbox evidently you can now use Xbox Fitness without Kinect...

[Image from the Xbox weekly email announcing Xbox Fitness without Kinect]


Disclaimer at bottom reads...

**Advanced features, such as Muscle Mapping and feedback on your form, are available only to those with an Xbox One Kinect sensor.

Guess the obsolescence of Kinect is almost complete.

Tommy McClain
 
Wait, what? You expect the PS5 to keep up the current trend and go for a very powerful machine? The PS4 is a weak, underpowered, piece of garbage. So is the XB1.

They're both garbage, the days of powerful machines apparently ended last generation and with Sony's success this generation, there's no reason they are going to go back to losing hundreds of dollars on each console sold.

The only question is whether or not "the cloud" really actually works for MS and they can leverage money they've already invested in cloud servers to noticeably improve the performance of the local machines. I guess we'll see when Crackdown comes out and if other truly "Next Gen" games like ReCore utilize it as well.

There's no way Sony is going to put state of the art hardware in the PS5. Nor MS, for that matter.
Sure, but you are a tech-savvy person. The general public perceives the PS4 as a way more powerful machine than the X1, so Sony will keep pushing for power, because it works. Shifty also mentioned that Sony are the ones taking console gaming to another level, and you can see it clearly despite X1's efforts, which in some ways are better than Sony's offering, like in prices, great sales, and how they treat their customers; a hurt MS is the best MS. But Sony are trying to get exclusives like Street Fighter and jRPGs, and built the most powerful machine for a reason. Sales are telling the whole story.

Xbox could return to being super popular again one day, or as they say here, "having girlfriends appearing left and right", literally, but not this gen. Power is guaranteed next generation at lower cost, but the thing is that Xbox could become an awesome hybrid. (A well-built machine is better than raw power, tbh, as long as it's capable enough.)

As for the cloud... I'd prefer to use other X1s as cloud machines, but I see it as a plus rather than a necessity. I have classmates launching DDoS attacks whenever they can (they are quite common), so there goes your cloud. Plus, I run some software in the cloud at times, and it is sloooow.
 
What's your favorite or interesting Xbox Year in Review stat?

Get it here...

http://www.xbox.com/en-US/year-in-review-2015

Here are mine...

5204 hours on Xbox Live. Top 1% of the World. :) My Xbox is always on & connected to Xbox Live. It usually gets turned on when I wake up & turned off when I go to bed. I'm averaging over 14 hours a day connected. :) On average my friends only spent 909 hours & the community averaged 368 hours.

Wavey should obliterate this, but I unlocked 165 achievements worth 1955 Gamerscore. That's a 19% increase in achievements from last year, & that's better than my friends' average (8%) & the community's (9%). It also put me in the top 5%. Neato.

My most played game was Batman: Arkham Knight & my most played Game with Gold game was Assassin's Creed IV: Black Flag.

Tommy McClain
 
4246 hours on Xbox Live (mostly TV viewing while invisible, I bet).

15125 Gamerscore increase over 925 achievements, an increase of 40%; all that places me in the top 1%.

Most played game was Borderlands: The Pre-Sequel.
 