Xbox One (Durango) Technical hardware investigation

Status
Not open for further replies.
Power supply ratings do not equal power consumption.
 
Power supply max output rating != typical power consumption.

Also, 17.9A*12V = 214.8W

This is unchanged from previous discussion when PSU rating was discovered over a month ago.

So they are putting in more margin than the Xbox 360 power supply design had? (A bigger gap between rating and typical power consumption?)

If so, why?
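As a sanity check, the rated-output figure quoted above is simple arithmetic (the 17.9 A / 12 V numbers come from this thread, not an official spec sheet):

```python
# Rated max output of the rumoured Xbox One PSU, per the figures in this
# thread (17.9 A on the 12 V rail) - not an official spec.
rail_current_a = 17.9
rail_voltage_v = 12.0

rated_output_w = rail_current_a * rail_voltage_v
print(f"Rated 12 V output: {rated_output_w:.1f} W")  # 214.8 W

# The rating is a ceiling, not the typical draw; actual consumption
# under load should sit well below it.
```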
 
Could just be economies of scale...
Could just be power efficiency...
Couldn't peripherals play a role in this, too? I mean, I've read there are rumours that keyboard and mouse will be supported. Perhaps VR glasses too.
 
Having seen Kinect torn down, I would guess that it is very power hungry (the LED array in particular), so I'm not sure the XB1 PSU is in any way directly comparable to the original 360 one.
 

I guess the Kinect is the biggest unknown, which limits guessing where the power is going. And with the wide USB power-delivery range (including the 12 V and 20 V options) it is hard to guess.

For efficiency and margin I would look at the five or so iterations of the 360 supply and assume that it isn't going to depart much from S or E versions.

Well, only a few more weeks to find out. I guess since it is off eBay we can't pool together and send it to Chipworks before the launch :D [That would be a first, right?]

(Rumor is that 150 were shipped... ...hmmm... what are the chances of getting one...)
 
These are the video options for the Xbox One:

Resolution: 1080p or 720p
HDMI or DVI
Color Depth: 24, 30 or 36 bits per pixel
TV RGB limited or PC RGB full

What does 36 Bits per pixel mean? Richer colours? :smile2:-my TV supports up to 69 billion colours, so maybe 36 bits is the right setting?-

Only two options for RGB, which is good; three on the Xbox 360 was just too much, and for me it came down to choosing between either Standard or Extended.

I think I am going to try TV RGB limited first, just in case.
 
What does 36 Bits per pixel mean? Richer colours?
Richer? It means the colours are going to be more precise, at least in terms of how the video stream defines them, compared with 30 bits per pixel.

36 bits per pixel is one of the standards defined in the HDMI specifications.
 
Yes, I mean... like a painting filled with rich hues. Deep colour, so to speak.

Okay, that explains it since it is a standard specification for HDMI, although it seems more related to video than videogames. Thanks for the insight, HTupolev -fun nick btw-.

Also thanks to Rudecurve for the link. The article says 36 bits works out to 68.71 billion colours, which is about right for my TV, for instance. But there are also 48-bit specifications! :oops:

I think only top-of-the-line TVs (Sony Bravia, Philips, Samsung, LG, etc.) support that for now. And by top-notch TVs we are talking about sets which can easily cost 5000€ or more.
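The colour counts follow directly from the bit depths: each mode splits its bits evenly across the three RGB channels, and the total palette is 2 to the power of the bits per pixel. A quick check of the modes the console lists, plus the 48-bit mode mentioned above:

```python
# Colour counts implied by the HDMI deep-colour bit depths.
# 24/30/36/48 bits per pixel = 8/10/12/16 bits per RGB channel.
for bpp in (24, 30, 36, 48):
    per_channel = bpp // 3
    colours = 2 ** bpp
    print(f"{bpp} bpp = {per_channel} bits/channel, "
          f"{colours:,} colours (~{colours / 1e9:.2f} billion)")
```

For 36 bpp this gives 68,719,476,736 colours, i.e. the ~68.7 billion figure quoted from the article.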
 
Today's TVs - especially LCDs using white LED backlighting - can't display these color formats properly anyway, so I wouldn't sweat it. Not to mention, they exceed the color resolution of the human eye... ;)
 
This might be the wrong section for this, but the Xbox One has tiled resources as an API, correct? I've read some people state that once developers start using it we should see some huge increases in fidelity. I've also read that the Xbox One's GPU has extra hardware to take advantage of tiled resources that PCs don't have. Does anyone know exactly what these extras are?

Also, I tried searching for this, but what is the difference between Microsoft's tiled resources and AMD's PRTs/OpenGL sparse textures?
 

You mean astrograd and his buddies said it wherever he slunk off to and you came running here to ask? How many posts are you going to start with a variation of 'some people said...'?

You obviously didn't search very hard because this was discussed in this forum a few months ago. It is not some magic bullet. PC and PS4 have it as well.
 
Also, I tried searching for this, but what is the difference between Microsoft's tiled resources and AMD's PRTs/OpenGL sparse textures?

Tiled resources and sparse textures are essentially the same functionality exposed in two different APIs. PRT is AMD's name for the hardware-specific functionality that allows their GPUs to support tiled resources/sparse textures.
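To illustrate why any of this matters: with tiled resources/sparse textures, only the tiles a game actually touches need physical memory behind them. A rough back-of-envelope sketch (assuming the common 64 KiB tile, which is 128x128 texels for a 32-bit format as in D3D11.2 tiled resources and GCN PRT; the texture size and residency fraction here are made up for illustration):

```python
# Back-of-envelope memory saving from partial residency.
TILE_BYTES = 64 * 1024   # 64 KiB hardware tile
TILE_DIM = 128           # 128x128 texels at 4 bytes/texel = 64 KiB

width = height = 16384   # a hypothetical 16K x 16K RGBA8 "megatexture"
full_bytes = width * height * 4

tiles_per_side = width // TILE_DIM
total_tiles = tiles_per_side * tiles_per_side

resident_fraction = 0.10  # suppose only 10% of tiles are ever sampled
resident_bytes = int(total_tiles * resident_fraction) * TILE_BYTES

print(f"Fully resident: {full_bytes / 2**20:.0f} MiB")   # 1024 MiB
print(f"Sparse (10%):   {resident_bytes / 2**20:.0f} MiB")
```

The tiering question discussed below is orthogonal to this arithmetic; it is about how much of the commitment/filtering behaviour the hardware exposes, not about the savings themselves.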
 

Sparse textures is the OpenGL one, correct?

You mean astrograd and his buddies said it wherever he slunk off to and you came running here to ask? How many posts are you going to start with a variation of 'some people said...'?

You obviously didn't search very hard because this was discussed in this forum a few months ago. It is not some magic bullet. PC and PS4 have it as well.

Not that it's on topic, but astro isn't the person saying it. They were talking about Tier 2 versus Tier 1, and that no PC parts at the moment support Tier 2. Though your search would work better if you did 'partial resident textures site:beyond3d.com'.

I've already run the search, but there seems to be much confusion over what is considered Tier 1 and what is Tier 2. Some of the Beyond3D posts state Tier 2 is simply PRT support, but then we had Dave stating that only Bonaire supports Tier 2 (so it can't just be about PRT, since the whole 7000 series has PRT from my understanding). And the XB1 GPU seems to be based on Bonaire at a glance, so what hardware do Bonaire and the Xbox have that makes them Tier 2 compliant while other GCN cards aren't? MS seems to be pretty cryptic about the entire thing. I figured that since Windows 8.1 is out, as are drivers, someone could shed some light, and perhaps we might learn more about the Xbox's GPU by extension.
 
As others here have stated for what feels like the hundredth time: MS's Tiled Resources are the direct equivalent of AMD's sparse texture OpenGL extension. Tier 1 is supported across DirectX 11 cards (which is how they were able to show it off at Build running on Nvidia hardware), while Tier 2 exposes GCN's PRT. There is nothing magical about Xbox One's implementation compared to PC land. There is no well of hidden features in the Xbox One's GPU. The scaler is not secret sauce, there is no raytracing chip, there is no dGPU, etc. Please stop being so quick to believe everything posted. Two-thirds of the posts by Astro and his ilk are simply complete nonsense that plays to your confirmation bias. I'm not going to insult you or anything like that, given the age posted in your profile, and I applaud you for wanting to get a true understanding of the hardware. Honestly, I like TXB and UnionVGF (obviously, since I'm a mod there), but they're not websites you go to for any real discussion of the technical aspects of either platform.

While your questions do pertain to the thread topic at hand, I'm not sure how the mods here are, but one of my biggest pet peeves is people always quoting other forums (I seriously can't stand when members of UVGF talk about GAF nonstop, ugh). If you feel the need to respond, send a PM.

Now to get away from things discussed ad nauseam. With the Xbox One only supporting 720p and 1080p, am I correct to assume that if I were to hook my Xbox up to my monitor via HDMI, with a 2560x1440 display, the Xbox would simply scale the 1080p image up to 2560x1440? Not sure if that's how it was handled last generation; I mainly gamed on the TV, but I've been thinking about gaming on my monitor lately. I'm so used to it at this point. The TV gets very little love despite the fair amount of work I put into getting the downstairs one (Pioneer Kuro!).
 
The Xbox One would output 1080p and it would be up to your monitor to scale that to the full panel resolution. If you have one of those cheap Korean monitors it may lack any scaling hardware, so make sure to check.
 
For 360, MS had to expose/support each particular scaled output res, so there are no guarantees. They should be able to support 1440p if they can already scale to 4K (3840x2160 at 30 Hz, à la HDMI 1.4). *shrug* Would think there's bandwidth for 1440p60, but I dunno.
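A quick active-pixel-rate comparison (blanking intervals ignored, so this is only a ballpark, not real HDMI link bandwidth) backs up the hunch that 1440p60 fits if 4K30 does:

```python
# Active pixel rates only; real HDMI timings add blanking overhead.
def pixel_rate(width, height, refresh_hz):
    return width * height * refresh_hz

rate_4k30 = pixel_rate(3840, 2160, 30)      # 248.8 Mpx/s
rate_1440p60 = pixel_rate(2560, 1440, 60)   # 221.2 Mpx/s

print(f"4K30:    {rate_4k30 / 1e6:.1f} Mpx/s")
print(f"1440p60: {rate_1440p60 / 1e6:.1f} Mpx/s")
```

So 1440p60 actually pushes fewer pixels per second than 4K30; whether the console exposes it is purely a software decision, as noted below.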
 
Sure, they can support more resolutions if they like, any supported by HDMI 1.4, that's simply a software question. But if at launch they only allow you to select between 1080p and 720p it will be up to the display to scale that to any different resolution they might require. The numerous resolutions supported by 360 were largely an artifact of offering the VGA cable.
 
Hey guys.

This is my first post here, so I hope you will welcome me :D Long-time reader, first-time poster :LOL:

I have a question about the X1. Since I can't ask any developer directly, you are the closest I can get.

Simply put, is the X1 capable of running games at 1080p with 60fps? Games like Halo, Gears, Uncharted, The Division, etc. I know there is a power difference between it and the PS4, but is it really that big? That's the technical question I have.
I appreciate your help :oops:
 