Xbox One (Durango) Technical hardware investigation

Status
Not open for further replies.
IMO, as even 900MHz won't get them parity, they should go for the CPU instead.
The web can be stupid; at least a faster CPU than the competition would give people something to argue about for eons regarding the respective merits of the designs.
 
They can't overclock the CPU too much either.
Going to 2GHz would require a 66% increase in TDP.
Well, percentages are percentages of something.
If you look at the Kabini SKUs, the 1.6GHz part has a TDP of 15 Watts and the 2GHz part 25 Watts, indeed ~66%, though that is "only" 10 Watts. Power consumption doesn't scale linearly with clock speed either; if it did, a 25% CPU overclock would put the part at roughly 20 Watts, not 25.
The thing is, as all this seems to be about "PR wins", they don't have to go for a 25% CPU overclock to claim that their CPU is faster.
That is, if MSFT is really about to enter a "dick contest" with Sony, which I doubt, but it seems a lot of people would want MSFT to have at least one "technico-PRish" win against Sony :LOL:
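The Kabini numbers above can be sanity-checked with a first-order power model. This is only a sketch: dynamic power roughly follows frequency times voltage squared, and the ~15% voltage bump assumed below is a guess, not a measured figure for this silicon.

```python
# Rough TDP scaling sketch using the Kabini figures quoted above
# (1.6 GHz @ 15 W vs 2.0 GHz @ 25 W). Dynamic power ~ f * V^2 is a
# common first-order model; the voltage scaling factor is an assumption.

def scaled_power(base_watts, base_ghz, target_ghz, v_scale=1.0):
    """First-order dynamic power estimate: P ~ f * V^2."""
    return base_watts * (target_ghz / base_ghz) * v_scale ** 2

# Frequency alone (same voltage): only a 25% increase
print(scaled_power(15, 1.6, 2.0))                   # 18.75 W

# With an assumed ~15% voltage bump to hold 2.0 GHz stably:
print(round(scaled_power(15, 1.6, 2.0, 1.15), 1))   # ~24.8 W, close to the 25 W SKU
```

Which is why the jump from 15W to 25W is ~66% even though the clock only went up 25%: the extra voltage needed for the higher clock gets squared.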

I'm just saying that having a faster CPU would work in that context.
 
MS could always just go ahead as planned and release an upgraded Xbox One, or Xbox Two (lol), that runs all previous Xbox One games and is more powerful, in even just a few years.
If you did that then you would screw all the millions of people with older boxes, who would be unable to run new games on their existing hardware. You would also screw developers by essentially pushing the reset button on your installed user base.

That's probably where I would take the ball and run with it at MS.
Tempted to say, we should all count our blessings for you not being in charge then... ;)

Pretty sure nobody else would be interested in something like this.

Virtualizing might make BC more manageable for the successor box in ~5+ years' time though.
 
If you did that then you would screw all the millions of people with older boxes, who would be unable to run new games on their existing hardware. You would also screw developers by essentially pushing the reset button on your installed user base.
You would design it to run the same games, just at lower quality, like PC or Android. Every new game runs on the old console at lower quality. There's a thread on this subject.
 
You're still pushing the reset button because you're fragmenting your customer base. One platform will require extra work and attention (maybe a LOT of extra work, depending on how much more powerful it is) while having a much smaller number of units to sell to. You will also have to make sure the game runs and scales properly on the older hardware, so you get a double whammy of extra work: once to make sure the game runs/looks better, and once to make sure it runs/looks worse.

I can't see how devs would be happy getting upgraded consoles released in the middle of a cycle. It would be akin to the Sega Mega Drive add-ons back in the 90s. That didn't end well, as we all recall.
 
If the APIs remain the same, I don't see what the issue would be in releasing an Xbox Two in three or four years. That's the benefit of the API model.
 
IMO, as even 900MHz won't get them parity, they should go for the CPU instead.
The web can be stupid; at least a faster CPU than the competition would give people something to argue about for eons regarding the respective merits of the designs.
They've lower latency memory, a fancy audio processor and extra bandwidth for some things. Not much to shout about, but ~10% higher CPU/GPU clocks would create plenty of smoke.

Their big win if there is to be one, is Kinect in the box and the all-in-one bit.
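The "~10% higher clocks" idea above can be put into rough numbers. The shader counts and clocks below are the widely reported figures of the time (and were still officially unconfirmed, so treat them as assumptions); the point is that a modest clock bump barely moves the theoretical gap.

```python
# Back-of-envelope GPU throughput comparison using the widely reported
# (at the time, rumored) ALU counts and clocks. FLOPS = ALUs * 2 ops * clock.

def tflops(alus, clock_mhz):
    """Peak single-precision throughput in TFLOPS (2 ops/ALU/cycle for FMA)."""
    return alus * 2 * clock_mhz * 1e6 / 1e12

xb1_base = tflops(768, 800)    # ~1.23 TFLOPS at the rumored base clock
xb1_bump = tflops(768, 853)    # ~1.31 TFLOPS with the rumored clock bump
ps4      = tflops(1152, 800)   # ~1.84 TFLOPS

print(f"gap after bump: {ps4 / xb1_bump:.2f}x")  # still ~1.41x
```

So the bump is mostly smoke, as said above: the theoretical gap shrinks from ~1.5x to ~1.41x.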
 
If the APIs remain the same, I don't see what the issue would be in releasing an Xbox Two in three or four years. That's the benefit of the API model.
Okay, I assume you are talking about the new VM model(s) for the OS, in which case that's a great point. But it doesn't answer a crucial question regarding early adopters.

On a different note, an actual Xbox One developer is taking to Reddit to answer fans' questions; really interesting stuff (from SlickShoesRUCRazy's post here: http://forum.beyond3d.com/showpost.php?p=1765416&postcount=5308)

http://www.reddit.com/r/xboxone/comments/1i71s5/i_am_an_xbox_one_dev_ask_me_almost_anything/
 
Isn't it the same piece from Examiner? And where is the "confirmation"?
Not that I care much, tbh, but that's what the article says. The maths behind the specifications hasn't been shown, and as such we are going with whichever rumour "sounds better". Allegedly their source also says that the 12GB rumour isn't true.

Essentially, the source has confirmed that the increase in GPU clock speed is 100% true

To me, the only words I'd trust are what Albert Penello said; the point is why he wrote that (I find it pointless if there wasn't some truth to it), especially when he then replied with "I am a hardware guy, not a PR guy".

This is the only thing that keeps me thinking spec changes could be possible. How and when, I don't know...
 
They've lower latency memory,
To eSRAM (where the CPU doesn't have direct access), sure. Otherwise not so much (DDR3 and GDDR5 are in the same ballpark as all DRAM latency wise).
a fancy audio processor
Granted. It's somewhat probable SHAPE can do more than the PS4 audio processor. How relevant this will be we have to wait and see.
and extra bandwidth for some things.
What extra bandwidth? The eSRAM may be enough to compensate for the bandwidth disadvantage to the main RAM for some stuff (mainly ROP exports in first-generation games; later on we may see one or two examples of more unorthodox uses) and to help compensate for the XB1 having just half the ROPs by providing them adequate bandwidth (the 32 PS4 ROPs could be bandwidth-starved in some [contrived?] scenarios with HDR colour formats and blending, but it's hard to imagine they will end up slower than the 16 ROPs of Durango). But the single memory pool of the PS4 will be easier to develop for and won't cause the headaches of splitting assets in the most optimal way between two memory pools just to get parity with the PS4's bandwidth.
Not much to shout about, but ~10% higher CPU/GPU clocks would create plenty of smoke.
It will just help them to close the gap a bit more.
Their big win if there is to be one, is Kinect in the box and the all-in-one bit.
The all-in-one bit is probably quite US-centric. Personally, I hate set-top boxes. That means I prefer a CAM plugged directly into the TV (current models include receivers for everything from DVB-T over cable to satellite, so one really doesn't need a set-top box). And this also means I have no signal for the HDMI-in of the XB1. At least for me, the TV functionality of the XB1 is completely wasted.
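The two-pool vs one-pool bandwidth argument above can be made concrete. The bus widths and data rates below are the widely reported (at the time, still rumored) figures, so treat them as assumptions rather than confirmed specs:

```python
# Peak bandwidth of the memory pools discussed above, from the widely
# reported bus widths and data rates (unconfirmed at the time of writing).

def gbps(transfers_per_sec, bus_bits):
    """Peak bandwidth in GB/s for a given transfer rate and bus width."""
    return transfers_per_sec * (bus_bits / 8) / 1e9

ddr3  = gbps(2133e6, 256)    # XB1 main RAM (DDR3-2133, 256-bit): ~68.3 GB/s
esram = gbps(800e6, 1024)    # 32 MB eSRAM, 128 bytes/cycle @ 800 MHz: ~102.4 GB/s
gddr5 = gbps(5500e6, 256)    # PS4 unified pool (GDDR5, 256-bit): ~176 GB/s

# The XB1 only approaches the PS4's single pool when traffic is split well:
print(ddr3 + esram)  # ~170.7 GB/s combined peak, but only for ideally split workloads
```

The combined peak looks close on paper, but it is only reachable if the working set is partitioned across both pools in exactly the right proportions, which is the "headache" mentioned above.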
 
Isn't it the same piece from Examiner? And where is the "confirmation"?

From the piece:

Essentially, the source has confirmed that the increase in GPU clock speed is 100% true, with the increase happening in direct response to the PlayStation 4. The increase has been a reality since before the reveal, as they have "been reacting to Sony ever since the first leaks of both systems." The news of this bump in clock speed was announced to first parties only, as they "actively spread disinformation to 3rd parties just before reveal to prevent leaks."

I guess you could say it's news insofar as this Hilbert Hagedoorn fellow has a trusted source saying the above. Of course, the active spread of disinformation could have entrapped our trusted source as well. Would an actual speed increase be matched by the PS4, given that an increase in RAM would not be?
 
From the piece:

Essentially, the source has confirmed that the increase in GPU clock speed is 100% true, with the increase happening in direct response to the PlayStation 4. The increase has been a reality since before the reveal, as they have "been reacting to Sony ever since the first leaks of both systems." The news of this bump in clock speed was announced to first parties only, as they "actively spread disinformation to 3rd parties just before reveal to prevent leaks."

I guess you could say it's news insofar as this Hilbert Hagedoorn fellow has a trusted source saying the above. Of course, the active spread of disinformation could have entrapped our trusted source as well. Would an actual speed increase be matched by the PS4, given that an increase in RAM would not be?

That quote is pulled word-for-word from the old Examiner article.
 
I keep mentioning this big fact: They are trying to cool 100W (near) silently.

100W is nothing for a chip of that size.

They are not in the "overclock" or "burning hot" region at all. They are in the opposite region, by their choice.

If that's the case then I see, though my question has for some time been about the power supplies used for the PS4 and XB1: is the PS4's external, and is it going to be bigger than the XB1's? I don't know the expected size of the XB1's power supply, but we do know the actual PS4 console is going to be smaller than the XB1. I don't see how it's possible for Sony to include the power supply in the new design... unless there's something I'm overlooking.

Also, how difficult would it be to change the power supply for the XB1 if it was needed? I do see the costs going up a little to enable a GPU alteration, but I don't see it being too expensive to be possible.
 
Is the ps4's external

I believe the PS4's is internal. I don't see that as a problem. Whether internal or external is better is up for debate. I used to think internal was way better; now that I've matured... I don't think it matters too much. I'm currently on assignment in a region where my US 360 and PS3 require a power converter. For the 360, I just bought a power brick for the region; for the PS3, I brought a power converter. The power converter costs more, but works just as well.

Personally, for me, external has its advantages... One night my friends and I got drunk and decided to play some Xbox. They moved my 360 out of my bedroom to the living room, but they didn't move the power brick; they instead found my US power brick and used that. Needless to say, it was easier to replace that than to have to open the console up and replace an internal supply.

edit: just in case you're wondering why I wasn't the one moving my 360 instead of my friends: because I had more important things to do... finding more alcohol in the apartment.
 
I wouldn't consider the ability to accidentally use the wrong kind of power brick, thus destroying it, any kind of plus. AFAIK the PS3's (and presumably PS4's) internal power supply can actually recognize and deal with the regional differences in mains electricity. The only thing you'd have to change is the $3 cable to the wall.
 
If that's the case then I see, though my question has for some time been about the power supplies used for the PS4 and XB1: is the PS4's external, and is it going to be bigger than the XB1's? I don't know the expected size of the XB1's power supply, but we do know the actual PS4 console is going to be smaller than the XB1. I don't see how it's possible for Sony to include the power supply in the new design... unless there's something I'm overlooking.

Also, how difficult would it be to change the power supply for the XB1 if it was needed? I do see the costs going up a little to enable a GPU alteration, but I don't see it being too expensive to be possible.

The PS4's supply looks to be internal, like it was on the PS3. The PS3's internal power supplies are examples of how it is possible to go much larger than 100W internally. That isn't difficult. There are pros and cons to each approach. I like the "clean" aspect of internal, but I like the lower internal heat of external, plus the ease of replacement. However, I have not regularly replaced power supplies since the era of my C64, where it fried on a regular basis. It should not need replacing if designed correctly and built with quality components.

Also, it is not hard to make the Xbox One power supply good for 125W or 140W. The choke components and phases look like what is used on the high-end 7970, for example (about 2x the power). You could go higher-end still, but you don't need to. I don't know about the MOSFETs or the caps, but with the right part numbers it looks like the PCB has the required number of components for 125W or 140W without too much trouble.

I was expecting something in the range of 180W and a transistor budget in the range of Xenos + Xenon. The transistor budget is about right, but the power is about half. I think they simply made the choice of a low-power, cool/quiet design.
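One detail worth separating in the headroom discussion above: the DC budget the VRM components are sized for is not the same as the AC draw at the wall. A quick sketch, assuming a typical (not measured) ~85% supply efficiency:

```python
# Hedged sketch of the PSU headroom argument: AC draw at the wall for a
# given DC load. The 85% efficiency figure is an assumption, not a
# measured value for either console's supply.

def wall_power(dc_load_watts, psu_efficiency=0.85):
    """AC power drawn at the wall for a given DC load."""
    return dc_load_watts / psu_efficiency

for load in (100, 125, 140):
    print(f"{load} W DC -> {wall_power(load):.0f} W at the wall")
```

So even at the speculative 140W DC budget, the supply would only need to handle ~165W at the wall, well within what an internal PS3-style supply, or a modestly sized brick, can manage.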
 
I wouldn't consider the ability to accidentally use the wrong kind of power brick, thus destroying it any kind of plus. AFAIK the PS3 (and presumably PS4) internal power sources can actually recognize and deal with the regional differences in electricity supplied. The only thing you'd have to change is the $3 cable to the wall.

I agree with you that ideally a self-voltage-regulating power supply is the best solution, but that's a different point from internal vs. external. Imagine if the 360 had an internal power supply that didn't self-regulate.

It's good to know that my PS3 supports 220V; I just wish it were labelled as such... nevertheless, a nice surprise!
 