Xbox One (Durango) Technical hardware investigation

Status
Not open for further replies.
They say reduced deadzones.


cool, thanks

it sure sounds like there is resistance there from people who have tried it

http://majornelson.com/2013/06/06/more-details-about-xbox-one-controller/

Impulse Triggers – Xbox One’s Wireless Controller sports four vibration motors – a small one behind each trigger that adds precise haptic feedback to the finger tips, and a larger one in each grip for large-scale rumbles. This gives users a sense of in-game directionality and depth, creating rich, immersive experiences where gunshots, car crashes and explosions can come to life.

Higher Quality Headset Audio – The data transfer rate between the controller and console has been improved, allowing for higher fidelity audio in communication headsets. In-game chat over Xbox Live, according to the team, will be in many cases clearer than talking on a phone.
Revamped Thumbsticks – The thumbsticks are built for precision and comfort. They’re smaller and outlined with a knurled texture for better grip. Competitive gamers will be pleased to hear the sticks require 25 percent less force to move, allowing you to adjust your aim in a first-person shooter or execute a half-circle sweep in a fighting game faster and more accurately. The controller also uses advanced electronics that reduce thumbstick deadzone in the center.
Brand New D-Pad – The old Xbox 360 D-pad is replaced by a new design that pays homage to classic controllers and is architectured to deliver more precision and tactile feedback for gaming. The D-pad’s cross shape is honed to provide accurate cardinal direction input, sweeping movements and combinations – important factors for sports and fighting games, and other experiences.
Buttons, Buttons, Buttons – The A, B, X and Y buttons are lower to the controller with tighter spacing, making the transition between each one smoother. A new, three-step manufacturing process gives these buttons a more premium look, as if the letters on them are suspended in 3-D space. The size and placement of the Xbox button has also changed so the view and menu buttons are more accessible.
Seamless Connectivity – Each controller uses a combination of invisible reflective technology and LEDs to send a patterned infrared signal to your console and Kinect sensor. Not only does this make pairing the devices seamless, but it enables Kinect to associate the controller with whoever is holding it. This introduces innovative experiences, such as player switching, where a split screen display can swap positions on the TV if users change seats on the couch.
Low Power State – If you’re watching a movie or need to step away from the TV, the controller enters a low power state that conserves your battery. The moment you pick it up again, it will be ready for use without having to resynch with the console.
Refined for Comfort – The controller’s design is deliberately honed to the closest tenth of a millimeter to offer the most comfortable fit in users’ hands, and was tested extensively by a broader age group than ever before to ensure it is optimized for as many people as possible. According to the team’s research, this improves gameplay performance and allows comfortable gaming for longer periods of time.
Angled Triggers and Bumpers – The triggers and bumpers are carefully designed for performance and comfort. The specific angling allows for a natural fit for your fingers, and the triggers require a lighter pull, so squeezing it repeatedly is an easier and more precise action.
Internal Battery Cavity – The compartment that houses AA batteries is built into the interior of the controller, providing more room at the bottom for your fingers to grip. Another convenient improvement is that the controller is both wireless and wired – simply plug it into your console with a mini USB cable and the connection automatically switches to preserve battery life.
 
That is just it. The hands-on reviews talked as if there was some form of force feedback present while the official documents and presentations we have seen only confirm rumble motors. Thus, my confusion.
 
I initially thought it was about the sticks, but realised later that it was actually about the triggers ... Either I wasn't paying attention or the message was confusing (in this case, I wouldn't be surprised if I just wasn't paying attention).

They did some good refining, but I'm a little disappointed that we didn't get anything really new – they didn't even bring the controller up to speed with gyros. They're so cheap, and they could be really nice for some games.
 
Slightly off topic, but for those on the last page: MSAA with TrSSAA does a much better job on alpha textures and chain-link fences than any post-process AA ever will.
 
Given lack of support for gyros in PS3, I can understand seeing them as an irrelevant additional cost.

The gyros on the PS3 controller were pretty noisy; the gyros on the PS4 controller are a lot more accurate. It'll be interesting to see if they get used.
 
I don't know why they reduced the force required to move the sticks.

For people playing shooters competitively that's a bad thing, you want to have more tension in the sticks so you can use higher sensitivities while maintaining the ability to easily do slight stick movements.

This is the whole reason Razer's Onza and Sabertooth 360 controllers work so well (user adjustable tension for each stick).
 
Alright, enough with the rumors of rumors in the TECHNICAL thread.
 
I'm going to play devil's advocate and ask what concrete information do we have about Xbox One anyhow? Have MS even confirmed 1.2 tflops? The only information we have on that front afaik is 768 operations per clock, but we don't know what it's clocked at.
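For what it's worth, the arithmetic is simple once a clock is assumed; a quick sketch below (the clock values are candidates from the rumour mill, not confirmed figures):

```python
# Peak FP32 throughput = ALUs x 2 ops/clock (one fused multiply-add) x clock.
# 768 ALUs corresponds to 12 GCN compute units x 64 lanes each.
# The clock speeds below are assumed candidates, not confirmed specs.
ALUS = 768
FLOPS_PER_ALU_PER_CLOCK = 2  # an FMA counts as two floating-point ops

for clock_ghz in (0.8, 1.0):
    tflops = ALUS * FLOPS_PER_ALU_PER_CLOCK * clock_ghz / 1000
    print(f"{clock_ghz} GHz -> {tflops:.3f} TFLOPS")
```

So 768 ops/clock only yields the rumoured 1.2 TFLOPS figure if the clock is around 800 MHz.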

There is a hell of a big difference between revealing the 8GB in Feb, and not revealing a 30% downclock 2 or 3 months before game teams aiming at launch will have to submit.
I'd be stunned IF it is happening and devs weren't aware at this point.

Which means that IF the clock speed was lowered, it would have been a while ago and third party developers would have been well informed by now. I know that yield rumours for Xbox One go back as far as September.
 
Wasn't really my point.
It's more that if a decision like this was made this late, MS would want to communicate it to developers as soon as it was made. You're looking at developers mastering probably in August; delaying the communication can only hurt the software.

Having said that, I would be surprised if there were anything to this rumour, only because if developers had recently been broadly told this, I'd expect more leaks than a couple of insiders at GAF.
 
I'm currently waiting for my wife in the car with not much to do... Yesterday I was reading a couple of posts on the RWT forum about Crystalwell. At some point someone made an interesting observation when comparing eDRAM and SRAM, especially for large amounts of memory: while SRAM is overall faster (latency, clock speed), once you get to large amounts, the bigger SRAM cells require significantly longer wiring.
32MB is a lot, and we hear rumours about issues, whereas SRAM by itself is nothing special (you would think it is a well-mastered piece of silicon). Could the issue be related to wiring one way or another?
From burning more power than expected, to maybe timing issues between the different banks of SRAM, or what not?
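To get a feel for why 32MB of on-die SRAM means long wires, here is a rough area estimate (the cell area is an assumed ballpark for a 28 nm process, not a known figure for this chip):

```python
# Rough area estimate for a 32 MB 6T SRAM array, bit cells only.
# 0.1 um^2 per bit cell is an assumed ballpark for 28 nm SRAM;
# real arrays add sense amps, decoders and redundancy on top.
BITS = 32 * 1024 * 1024 * 8   # 32 MB expressed in bits
CELL_AREA_UM2 = 0.1           # assumed 6T cell area at 28 nm

array_mm2 = BITS * CELL_AREA_UM2 / 1e6  # um^2 -> mm^2
print(f"raw bit-cell area: ~{array_mm2:.1f} mm^2")
```

Tens of square millimetres of array means signal and clock wiring spanning many millimetres, which is where the latency and timing concerns come from.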
 
So Major Nelson refuted the downclock rumour? It didn't make any kind of sense to begin with, but that's still good to know.
 
We are not discussing the downclock as a rumour here! I've just moved the Major Nelson tweet discussion from this thread. It's an ambiguous answer with room for interpretation. Wait until real, unambiguous confirmation, or discuss it in the rumour thread.
 
You should change the thread name then.
 
We're discussing the rumour, but as a technical investigation, not a "he said this, well she said that, well she's a liar, no you're the liar, stupid fanboy always spreading FUD about XB1, blah blah" discussion. The discussion about the rumour that revolves around characters on GAF and whether to believe them or not clearly doesn't belong here.

In simple terms, in this thread the question is: "There's a rumour that the ESRAM is causing heat issues and has resulted in a significant downclock for XB1. Is this plausible given what we know about SRAM? Have we any technological points of reference for this supposed problem?"
 
I'll post this here, although it seems unlikely at this point...

On-chip thermal gradient analysis and temperature flattening for SoC design
"Important results obtained here are 1) the maximum temperature difference increases with higher memory area occupancy and 2) the difference is very floorplan sensitive."

Paper is here, but I certainly haven't read it:
http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=1466526

As I understand it, this problem depends on how the chip's components are used - it seems plausible that such a problem would only appear once the system is fully utilised (e.g. as move engine usage increases, the toolset improves, etc.).

But I think it's rather a long-shot :).
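Long shot or not, the paper's first result is really about local power density: a dense, busy logic block next to a large, mostly idle memory block gives exactly the floorplan-sensitive temperature delta it describes. A toy comparison (all numbers are illustrative assumptions, not measurements):

```python
# Toy power-density comparison (all figures are made-up illustrations).
# Temperature gradients track local power density (W/mm^2), so a hot
# compute block beside a large, low-activity SRAM block creates the
# kind of on-die gradient the cited paper describes.
blocks = {
    "GPU CUs": (25.0, 40.0),   # (watts, mm^2) - assumed
    "eSRAM":   (4.0, 80.0),    # large area, low switching power - assumed
}
for name, (watts, mm2) in blocks.items():
    print(f"{name}: {watts / mm2:.3f} W/mm^2")
```

With numbers like these the compute block runs at over ten times the power density of the memory block, so the gradient grows as more of the die is memory.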
 
Tech Argument Against Jaguar (Or Jaguar Only)

Over at Semi Accurate Forums someone posted what I think could be a pretty good technical catch from looking at the Xbox One mainboard (the wired photos).

So jeff_rigby pointed out the size of the GPU and CPU power planes, and specifically that the size of the CPU power plane copper plus the associated buck converter components look *way* too large for the power consumption of Jaguar.

So take a look at both this photo (zoomed out) and then also go look at the zoomed in photo which you can see over at the Semi Accurate Forums:

http://www.wired.com/gadgetlab/2013/05/xbox-one-development-photos/#slideid-138498

In short, the CPU power plane and buck converter look like they could power much larger cores, or other blocks, than the low-power Jaguar cores.



So based upon die size, transistor counts, power consumption (as suggested by the CPU power plane copper cross section and the buck converter components) and MS's trends in the original Xbox and 360, I doubt it is Jaguar. My bet and hope is that it is far removed from Jaguar, at the other end of the power spectrum.

Or I could have some fun and suggest that there are "additional powerful blocks" attached to the CPU power supply. Back to the dual APU rumors or the ray tracing block? (I am mostly just kidding but I would like to see revolution and serious power instead of evolution and jaguar cores.)

Maybe W8-lite runs on Jaguar and games run on a custom POWER8, or K10.5+ enhanced/customized Husky cores, or Steamroller-like customized cores.

So it might have jaguar but for the W8-lite and the constant on. Not for the games at all. The two get stitched together through the display planes which might involve the eSRAM.

Not sure which power plane the eSRAM is attached to. I would guess GPU.
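For scale, eight Jaguar cores are a fairly small load even for a modest VRM; a back-of-envelope current estimate (the TDP and core voltage are assumed typical values for Jaguar-class parts, not measured figures for this board):

```python
# Back-of-envelope: current the CPU power plane must carry.
# ~25 W for an 8-core Jaguar cluster and ~1.2 V core voltage are
# assumed ballpark figures, not confirmed specs for this chip.
tdp_w = 25.0   # assumed cluster power
vcore = 1.2    # assumed core voltage

amps = tdp_w / vcore
print(f"~{amps:.0f} A at {vcore} V")
```

Roughly 20 A is well within a single buck phase, so a generous copper plane by itself is weak evidence: VRMs are routinely over-provisioned for transients, derating and layout convenience.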
 
Aren't we running into transistor limitations here, though? We can guess there's more than just Jaguar in there all we want, but with 5 billion transistors - 1.6 billion for ESRAM - a 12 CU GCN core - 4-module Jaguar, aren't we only looking at a few hundred million unaccounted for, at best?
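A rough budget check (the per-block estimates below are ballpark guesses, not die-shot measurements):

```python
# Rough transistor budget in billions (all block estimates assumed).
total = 5.0      # widely reported figure for the XB1 SoC
esram_6t = 1.6   # 32 MB at 6 transistors per bit
gcn_12cu = 1.5   # assumed, scaled from similar GCN parts
jaguar = 0.45    # assumed for two 4-core Jaguar modules

leftover = total - esram_6t - gcn_12cu - jaguar
print(f"unaccounted: ~{leftover:.2f} billion transistors")
```

Whatever remains still has to cover memory controllers, move engines, audio/video blocks, display planes and I/O, so there is little room left for exotic extra CPU cores.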
 
I do not think so, due to my arguments for the use of 1T-SRAM. Others might not agree, but I consider 1T-SRAM more likely than 6T or 8T. 80MB of 1T is in use at much higher frequencies in POWER7+ (and later POWER8). 32MB of 1T is in the Wii U die already, as someone pointed out.


1T-SRAM is a pseudo-static random-access memory (PSRAM) technology introduced by MoSys, Inc., which offers a high-density alternative to traditional static random access memory (SRAM) in embedded memory applications. MoSys uses a single-transistor storage cell (bit cell) like dynamic random access memory (DRAM), but surrounds the bit cell with control circuitry that makes the memory functionally equivalent to SRAM (the controller hides all DRAM-specific operations such as precharging and refresh). 1T-SRAM (and PSRAM in general) has a standard single-cycle SRAM interface and appears to the surrounding logic just as an SRAM would.

Due to its one-transistor bit cell, 1T-SRAM is smaller than conventional (six-transistor, or “6T”) SRAM, and closer in size and density to embedded DRAM (eDRAM). At the same time, 1T-SRAM has performance comparable to SRAM at multi-megabit densities, uses less power than eDRAM and is manufactured in a standard CMOS logic process like conventional SRAM.

MoSys markets 1T-SRAM as physical IP for embedded (on-die) use in system-on-a-chip (SoC) applications. It is available on a variety of foundry processes, including Chartered, SMIC, TSMC, and UMC. Some engineers use the terms 1T-SRAM and "embedded DRAM" interchangeably, as some foundries provide MoSys's 1T-SRAM as "eDRAM". However, other foundries provide 1T-SRAM as a distinct offering.
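The density claim in the quote can be sanity-checked with simple bit-cell arithmetic (peripheral circuitry such as the refresh controller and sense amps is ignored):

```python
# Transistor count for 32 MB of embedded memory, bit cells only.
# Peripheral logic (refresh controller, sense amps, decoders) is ignored.
bits = 32 * 1024 * 1024 * 8

t_6t = bits * 6   # conventional SRAM: six transistors per bit
t_1t = bits * 1   # 1T-SRAM: one transistor per bit

print(f"6T: {t_6t / 1e9:.2f}B transistors")
print(f"1T: {t_1t / 1e9:.2f}B transistors")
```

The 6T figure lands almost exactly on the 1.6 billion quoted for the ESRAM earlier in the thread, while 1T would need well under a third of a billion, which is the whole basis of the "room left over in the budget" argument.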
 