Xbox One (Durango) Technical hardware investigation

7 GB for games
3 or 4 GB for the OS
I don't know what the other gig is reserved for, if they do go with the 12 GB and the 900 MHz GPU and ESRAM.

Anyway, 12 CUs at 900 MHz is still only 1.38 TFLOPS and 115.2 GB/s of ESRAM bandwidth. So much fuss for so little gain... More CUs or bust!
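Back-of-the-envelope check of those figures (a sketch only; the 12-CU count and 900 MHz clock are rumoured numbers, and I'm assuming GCN's 64 lanes per CU and the commonly reported 128-byte-per-cycle eSRAM interface):

```python
# Napkin math for the rumoured 900 MHz Durango GPU (all inputs rumoured/assumed).
CUS = 12             # rumoured compute unit count
LANES_PER_CU = 64    # GCN: 64 ALU lanes per CU (assumed)
FLOPS_PER_LANE = 2   # one fused multiply-add = 2 FLOPs per cycle
CLOCK_HZ = 900e6     # rumoured upclock

tflops = CUS * LANES_PER_CU * FLOPS_PER_LANE * CLOCK_HZ / 1e12
print(f"{tflops:.2f} TFLOPS")        # -> 1.38

# eSRAM bandwidth, assuming a 128-byte-per-cycle interface at GPU clock
bandwidth_gbs = 128 * CLOCK_HZ / 1e9
print(f"{bandwidth_gbs:.1f} GB/s")   # -> 115.2 (vs 102.4 at the stock 800 MHz)
```

(For comparison, the 875 MHz figure floating around would give 112 GB/s by the same math.)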
 
Anyway, 12 CUs at 900 MHz is still only 1.38 TFLOPS and 115.2 GB/s of ESRAM bandwidth. So much fuss for so little gain... More CUs or bust!
This is a significant point IMO. The real-world gains from an upclock won't be noticeable, so if these changes are happening, they must be happening at zero cost. There's no point spending another n million bucks to gain zero advantage in product positioning. Unless MS has suddenly become charitable towards their fanbase and is doing this as a service to them...

The 12 GB I can understand as a move to be RAM-competitive, considering so much is taken by the OS (assuming PS4's OS is lighter and devs have more RAM to play with), although that change is going to come at quite a cost, and probably without much visual improvement. The upclock would have to be something they've found they can do with the existing design and no changes.
 
If they wait until then, no games will be taking advantage of either change...
Games will probably be going into QA in August.
I think any change at this point is going to be pointless.
Extra RAM might buy them something in the medium to long term, but I just don't see it happening.
Well, my post wasn't that serious either, more a joke about Xbox 1 and the Radeon (1) GHz Edition.

Though I'm not sure what is going on at MSFT at the moment; for me the U-turn they made wrt their DRM policies was really unexpected. We are not sure about the reasons behind it; it could be the noise on the web (though I would not bet on that being that relevant), other less obvious things, or a combination of those factors.
The point is I find it amazing when I look at the lengths MSFT went to with that design: according to them they ran lots and lots of simulations, attention to details like noise was extreme, and so on. In short, I expected such a big company as MSFT to have considered everything, to have a proper PR campaign, etc. It turned out that was not the case, and by some stretch (I won't get into the details here, it is OT).
Then E3 happened, and following E3 the unthinkable (for me) happened: they turned their back on their policies, in the process leaving people with questions about how the system will handle some functionality.

Clearly something happened. Was there tension within the management that finally got out of control, until somebody at the top (so Ballmer, most likely) finally acted? It is weird; it is a big business, a big company, and I can't understand why such tensions were not dealt with prior to the launch period. Though something "strange" happened after the Windows 8 launch too.
At this point I'm willing to believe that outside of their corporate division there are severe issues within MSFT management.

Anyway, to what extent MSFT has changed its views about what the Xbox One should be is unknown. That product is extremely important to them; it is the only segment in the personal realm where they did well, if not great. They clearly want to use it so devs have more incentive to develop Metro apps; Windows RT is not doing well, and Windows 8 sells, but imo it is a mess (/damaging for the reputation of Windows /my pov and my pov only).

Why do it now, in a hurry? That is what I don't get, but exactly the same applies to their U-turn. I don't believe it is people whining that made them move (it could have played a part, though as I said this is not the topic to discuss that), but something forced them to make a move.
It is not a tiny move. If I were an investor I would be asking "wtf are they doing?"; last-minute improvisation on a multi-billion-dollar business plan should rightfully be ill perceived.

So I'm a bit more open now wrt possible changes in hardware. Though it was a joke, I actually think that they have to do something significant to change the "tide" (again, whatever the known or unknown reasons that set that tide). A 1 GHz GPU and a 2 GHz CPU are easy to remember, as is the "gigahertz edition", hence my joke.
As for the timing, it is a lesser concern, really a joke following 3dillentante's sarcasm (I read it as such) about a launch-day firmware update, for maximum levity :LOL:
 
I disagree with Shifty's supposition that you wouldn't be able to see the clock bump.

It's all a matter of increments, and where do you draw the line? I thought a 50 MHz downclock would have been bad (though nothing like the original drastic downclock rumors, obviously), so how can I not think twice that amount on the positive side is relevant? Or put another way, going from 1.2 to 1.4 TF is almost like adding two CUs (not really, since that's rounded up, but anyway).

I mean, it just becomes a case of where's the magic line? Six more CUs on PS4 are relevant (let's say, since most seem to agree), but two more effective CUs (XB1 upclocked to 900) are not? What about 3? 4? 3.56?

I agree 1 MHz or 2 MHz isn't relevant, but I think even 20 MHz is, to some small degree!
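To put a number on the "effective CUs" framing, here's a quick sketch; it assumes the rumoured 800 -> 900 MHz bump and perfectly linear scaling with clock, which real workloads won't show:

```python
# How many extra 800 MHz CUs a 900 MHz upclock of 12 CUs is "worth",
# under the idealised assumption that throughput scales linearly with clock.
BASE_MHZ, BUMP_MHZ, CUS = 800, 900, 12

extra_cus = CUS * (BUMP_MHZ / BASE_MHZ - 1)
print(f"+{extra_cus:.1f} effective CUs")  # -> +1.5, i.e. "almost two" once rounded up

tf = lambda mhz: CUS * 64 * 2 * mhz * 1e6 / 1e12  # GCN FLOPS formula (assumed)
print(f"{tf(BASE_MHZ):.2f} TF -> {tf(BUMP_MHZ):.2f} TF")  # -> 1.23 TF -> 1.38 TF
```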

Also, I don't mean any offense, but 900 MHz came from mrphenix and he's, well, highly unproven to say the least. I'd take the more trustworthy rumor to be 875 from eastmen, and even that is quite flimsy as evidence at this point. I could of course well eat crow on this and mrphenix could prove to be right.

Looks like MS is rumored to be holding a press conference at Gamescom in late August. As always, maybe some answers then (but probably not).

http://www.computerandvideogames.co...microsoft-planning-gamescom-press-conference/
 
I disagree with Shifty's supposition that you wouldn't be able to see the clock bump.
It'll mean the difference between 30 and 33 frames a second. Or rather between 30 and 27 fps, with a smidgeon of screen tearing versus none. Or 1920x1080 versus 1728x1080. If placed side by side, you might notice the higher-spec'd XB1 looks a little better than the normal-clocked XB1, but in terms of consumer experience it's not enough to make a real, notable difference. Heck, there's even debate over how much difference 50% more CUs will actually make on screen and whether that'll be enough to sway consumers. It's all a matter of value. If that overclock comes at negligible extra cost, we want it. But if it results in, say, an increased failure rate, would you really prefer the 10% extra framerate/resolution over an increased chance of your console dying? Or if it's the difference between silent operation and noticeable noise? Different folks will value it differently, but from a business POV I'm not seeing the value in spending big bucks on an upclock. I'm not really seeing the value in spending small bucks on an upclock. ;)
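For reference, the arithmetic behind those figures, assuming a purely GPU-bound game that scales linearly with clock (real titles rarely behave this cleanly):

```python
# A ~10% GPU clock bump, expressed the three ways the post does
# (idealised linear scaling; real games are rarely purely GPU-bound).
SCALE = 0.10  # the post's ~10% working figure

print(f"30 fps -> {30 * (1 + SCALE):.0f} fps")  # upclocked: 30 -> 33 fps
print(f"30 fps -> {30 * (1 - SCALE):.0f} fps")  # or the stock console trailing: 30 -> 27 fps

width = int(1920 * (1 - SCALE))  # spend the deficit on resolution: 10% fewer columns
print(f"1920x1080 vs {width}x1080")  # -> 1920x1080 vs 1728x1080
```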
 
It'll mean the difference between 30 and 33 frames a second. [...] I'm not really seeing the value in spending small bucks on an upclock. ;)

It's still not clear to me why they would go through the hassle for a small increase in clock that would have a marginal benefit on performance. The memory seems like it could be more useful if the OS is taking too big a footprint, but again I have a hard time seeing how 4 GB isn't enough memory for games, much less 5 GB, and why they would even need to set aside more.

Aren't both consoles, based on the current specs, already at the sweet spot of the performance curve in terms of bandwidth, CPU and GPU? What would another 2 or 3 gigs do for these machines that the hard drive and virtual memory can't do already?
 
It'll mean the difference between 30 and 33 frames a second. [...] I'm not really seeing the value in spending small bucks on an upclock. ;)

No, but 800 MHz vs 875 MHz = Joe thinks MS has a better GPU...
 
Actually, the Amiga OS required just over 100k in normal resolution. Yes, my Amiga still works; it has a 25-year-old Quantum SCSI HDD and a 1084 NTSC monitor. :LOL: They achieved that with lots of cheating and 100% assembly; it had zero memory protection, it was just a big beautiful pile of hacks. If you needed a driver for some new peripheral, you just dragged and dropped a .device file into the devs directory, and it just worked. It was a single 15 kb file, not some .NET crap with a useless UI from a 200 MB download, trashing the Windows registry and rebooting twice, for absolutely no reason.

Still, they were cheating. No memory protection, we have to be honest about it. Combined with its unified memory, a crash would "display sounds" on your monitor and "play back graphics" through your sound system. No system ever crashed so spectacularly. Ever.

Back on topic: as soon as you need a web browser, any attempt at making a super lean-and-mean OS will fail. I fully appreciate the memory reserves they want to have; I just think much more than 1 GB is a bit crazy.

(off-topic)
AmigaOS only had a very few elements written in asm (the microkernel and a few little things). The rest was in C or BCPL (DOS), which was later moved to C with AmigaOS 2.x.

A single-address-space OS without any VM/MP is not cheating. It was a reasonable design decision for its market reality in the mid '80s, on a 68000 and with a lack of memory.

There are evolutionary implementations of advanced AmigaOS APIs for PowerPC (Apple or custom, and still 68k-compatible with OS apps) and x86 (hosted, not compatible).

All are still single-address-space OSes, but more stable than before as there's limited VM/MP. The PPC versions require about 32-48 MB to boot to the desktop at 1920x1200, and there are stable WebKit implementations.

But as you said, you need memory to run these broken, lazy garbage-collected browser instances, so MS's OS memory budget is related more to the browser and other memory-bloated middleware apps than to the core of the OS.
 
No, but 800 MHz vs 875 MHz = Joe thinks MS has a better GPU...
How many real-world sales will that net you? For those who look up stats but don't go far enough to learn about 18 CUs versus 12, and who then buy based on the highest clock speed regardless of content, services and price (and possibly what's visible on screen), sure, but I'm not feeling that's a large enough population to justify any expenditure.
 
How many real-world sales will that net you? For those who look up stats but don't go far enough to learn about 18 CUs versus 12, and who then buy based on the highest clock speed regardless of content, services and price (and possibly what's visible on screen), sure, but I'm not feeling that's a large enough population to justify any expenditure.

Well, the problem MS seems to have is justifying an extra £80/$100 for weaker hardware. If they started a campaign citing all the same stats but a GPU appearing 10% faster, then all of a sudden they'd appear to justify part of that price.

Pre-E3 (yes, I know there is the DRM factor to consider) pre-orders were neck and neck, with both launching "at the same price". Now that one console is cheaper it's winning, so MS may feel they need to justify the extra cost to balance that back out.

Let's face it, everything they seem to be doing is reactionary; this is quite a mess... they need any good news they can get (the bandwidth news, as an example).
 
Imagine they finally launch with those alpha or beta kits, with that monster Intel CPU and discrete graphics card! In this fantasy realm, would it even be legally possible to tell AMD we don't want that APU anymore?

I didn't say any such thing.

Try again. This is what I said:

1. I explained (to answer the question) what the first dev kit was reported to be (reported by others: 8-core Sandy Bridge-E, 7970, 2x4GB & 2x2GB DDR3 = 12GB).

2. I suggested (based upon the *1GB* 360 dev kits using the same PCB) that MS did the same thing this round, and that the dev kits look the same as the production boards but with the alpha readout LEDs (which would not be in a production kit), 16 x4G DDR3 (like production) on the front of the board (as shown in the Wired photos), and 16 x2G DDR3 on the back side of the board. [THIS WOULD BE THE NEXT DEV KIT >>> NO INTEL >>> NO 7970. Take a look at the photos at Wired.]

[Actually, it would be a bit strange if there were no memory-module land patterns on the back of the Wired board. Why make the board that big otherwise? That would be a waste.]



So in other words, maybe MS did what they did with the 360 dev kits: basically the same board and BOM (bill of materials) but with a couple of changes: more memory on the same board and some debug provisions.

It is not a wild suggestion. I'm basically saying they did what they did eight years ago.

So it would be no trouble at all to ship 12GB. It would be the same PCB used to ship 8GB and would look exactly like what was shown in the Wired photos; just mount the 16 x2G modules on the land patterns on the back of the board, which would be empty if they shipped 8GB.
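To make the module arithmetic explicit (a sketch; the 16-chip counts and the x2G/x4G densities, i.e. 2-gigabit and 4-gigabit DDR3 dies, are the reported figures):

```python
# Capacity math for the suggested Durango board layout (reported chip counts/densities).
GBIT_TO_GB = 1 / 8  # 8 gigabits per gigabyte

front = 16 * 4 * GBIT_TO_GB  # 16 x4G chips on the front, as in the Wired photos
back = 16 * 2 * GBIT_TO_GB   # 16 x2G chips on land patterns on the back

print(f"front only: {front:.0f} GB")           # -> 8 GB  (retail configuration)
print(f"front + back: {front + back:.0f} GB")  # -> 12 GB (dev-kit configuration)
```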



Absolutely nothing to do with shipping an Intel CPU or a 7970 graphics card. Try a reading comprehension course.



Here you can see how similar the 360 dev kit looks to a 360 (inside, PCB back and front):

http://ixbtlabs.com/news.html?04/62/50

Mounting the memory on both sides is really common.
 
I gotta go, so I will be brief. Firstly, eastmen and (((interference))), you are fun as hell with this speculation. I know how active you have been with these things, especially (((interference))), although I think it is highly unlikely we are going to see a clock bump.

It has been tough to keep up with this thread; it took me a while to read through the pages, and I only skimmed the posts on this very last page. I shall read them later.

The 12GB rumour is not new, though, and people said it seemed possible because the developer kits already have that amount of memory.

OTOH, Penello writes long posts but doesn't say much, because he can't. An upclock plus an ample increase in RAM would be a significant change and would add some extra performance, but I still think it's too much.
 
I didn't say any such thing. [...] Absolutely nothing to do with shipping an Intel CPU or a 7970 graphics card. Try a reading comprehension course.

It's not clear who needs the reading comprehension course. My post was a simple joke I came up with while reading your post.
 
No way the alpha kits had a 7970... a 7790 is basically what is in the Xbone.

Do you know that or are you making it up? Have you seen a dev kit? Did anyone tell you what is in the dev kit?

I guess you didn't see the photos of things being demoed at E3 on hardware like a GeForce Titan?
 