Whoops: XB360 at 2.8GHz

Jawed

Whoops: XB360 Dev/Debug at 2.8GHz

I've just heard that a load of final XB360 debug units (and/or devkits - not clear) have been discovered, in the wild, running at 2.8GHz instead of 3.2GHz. So they have to go back. Could be November before they're replaced :rolleyes:

Not all of them, but...

Jawed
 
I hope there aren't yield difficulties at 3.2 GHz or anything, because I would be a little surprised to learn that all these chips went out clocked 400 MHz lower by accident.
 
Geh.

I remember Itagaki commenting on whether or not Sony could deliver the speed promised in one of his interviews (the one where he talks a bit about the PS3, calling it a "baby, that needs to be brought up proper" or something to that effect). I always thought it strange. Perhaps he knew that there are some yield issues at 3.2GHz with the 360 and, subsequently, the PS3? Assuming the same chip, of course.

Maybe DD2 had something to do with this? Doubt it, but there must be some reason for DD2.
 
What does it mean - are there both 2.8GHz kits and 3.2GHz kits in the hands of devs as 'final' kits?

Final, or final beta? :???:
 
Final kits, as in final final kits, have been reaching devs for the last couple of weeks. Or supposedly final kits.

I can only assume what Jawed is referring to are kits that were supposed to be final.

Jawed - can you elaborate as to where this info is coming from?
 
I thought the title rather misleading. It sounded like the XB360 was being launched at lower clocks, rather than there being a problem with devkits.

I remember the noise here about the XB360 being lower clocked and thought this was confirmation, but if it's only related to devkits it's not a huge issue. It just slows development a bit for some people.
 
Mefisutoferesu said:
I remember Itagaki commenting on whether or not Sony could deliver the speed promised in one of his interviews (the one where he talks a bit about the PS3, calling it a "baby, that needs to be brought up proper" or something to that effect). I always thought it strange. Perhaps he knew that there are some yield issues at 3.2GHz with the 360 and, subsequently, the PS3? Assuming the same chip, of course.
Was it about how Sony could deliver? IIRC he didn't specify who should deliver, but in the same interview he talks about how it was unexpected for him that the Xbox 1 GPU was downgraded close to launch.
 
Mefisutoferesu said:
Geh.

I remember Itagaki commenting on whether or not Sony could deliver the speed promised in one of his interviews (the one where he talks a bit about the PS3, calling it a "baby, that needs to be brought up proper" or something to that effect). I always thought it strange. Perhaps he knew that there are some yield issues at 3.2GHz with the 360 and, subsequently, the PS3? Assuming the same chip, of course.

Maybe DD2 had something to do with this? Doubt it, but there must be some reason for DD2.


I think that whole Itagaki thing was just a reference to the PS3 architecture as a whole, and how everyone needs to be involved to make sure it 'grows' to its full potential. I don't think it actually had much to do with meeting stated specs or anything.

As for reasons why there is a DD2, I'd recommend RealWorldTech's latest article on the matter. It's really kind of a wild theory on their part, but hey, what the hell. I'm sure in the coming months we'll have more interviews with random STI engineers and such, and more light will be shed on exactly what led to the decision to make DD2.
 
To be honest, I'm not sure. I'm trying to search through the interviews, but it's proving a pain to find the right one. I believe you've got the right interview, though. If that's the case, maybe I read/remembered it wrong? Perhaps he was commenting on a possible relapse of the Xbox's problems? I'll keep looking for it.
 
I think they're all meant to be final gear, since a pile of it arrived recently.

Seems accidental - which makes me laugh because it's so tragic, really, as nobody realised until very recently (a day or two ago, I think).

The lack of replacements for gear that has to be sent back is what's going to drive peeps mad. This isn't just one batch/place, as far as I can tell...

I don't work there, so don't have much else to offer.

Jawed
 
But how could it be accidental? I mean, we're talking about a chip with, essentially, only one SKU - even though a SKU isn't quite the right way to talk about it...

It would just be surprising to me if, for whatever reason, a great number of these chips were accidentally set at a lower clock. I would more readily believe Microsoft sent out chips that had failed to meet the voltage criteria for 3.2 GHz operation in order to satisfy dev-kit demand.

I'm just theorizing, and I don't want this to run away here, but the notion of getting a chip that for some reason was clocked one step lower, in an environment where MS only has to deal with one speed, is just so strange to me.
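
To make the binning idea concrete, here's a rough sketch of the kind of pass/fail decision I'm imagining - purely illustrative, and the frequencies, voltage cap and numbers are all my own invention, not anything from MS/IBM:

    #include <stdio.h>
    #include <stddef.h>

    /* Toy speed-binning sketch: a die ships at the highest frequency it
     * passes functional test at within a core-voltage cap. Every number
     * here is invented for illustration only. */

    #define VCORE_MAX 1.30  /* assumed maximum allowed core voltage (V) */

    /* Stand-in for the production tester: pretend each die needs a bit
     * more voltage per 100MHz, scaled by a per-die 'quality' factor. */
    static double min_stable_voltage(int freq_mhz, double quality)
    {
        return 0.9 + (freq_mhz / 100.0) * 0.02 * quality;
    }

    /* Returns the frequency bin the die ships at, or 0 if it's scrap. */
    static int bin_part(double quality)
    {
        const int bins_mhz[] = { 3200, 2800 };  /* target bin, then fallback */
        for (size_t i = 0; i < sizeof bins_mhz / sizeof bins_mhz[0]; i++)
            if (min_stable_voltage(bins_mhz[i], quality) <= VCORE_MAX)
                return bins_mhz[i];
        return 0;
    }

    int main(void)
    {
        printf("good die bins at %d MHz\n", bin_part(0.60));  /* 3200 */
        printf("weak die bins at %d MHz\n", bin_part(0.68));  /* 2800 */
        return 0;
    }

If something like that is going on, then shipping the 2800 bin into dev kits would be a deliberate choice to meet demand, not an accident - which is exactly why the 'accidental' story seems strange to me.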
 
Hey, I don't know. Maybe beta Xenons found their way into final kits. What I hear is it's a big stink - the loss of equipment for a month or so is gonna be painful. Can't dev/test if you ain't got (enough of) the gear.

Jawed
 
Accidental, or are the CPUs aggressively changing their clockspeed due to some power-saving feature? :?:
 
My guess is that they must have "accidentally" used an older motherboard that they had. Doesn't the motherboard (or the clock generator and/or the chipset on it) control the clock, rather than the CPU itself?
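
Roughly how I picture that working - purely schematic, and the 100MHz reference and the multiplier values are my guesses, picked only because the arithmetic lands on 2.8 and 3.2GHz:

    #include <stdio.h>

    /* Schematic board-side clock generation: a fixed reference feeds a
     * PLL, and the core clock is reference * multiplier. The reference
     * frequency and multipliers are assumptions, not Xenon's real ones. */

    #define REF_CLOCK_MHZ 100  /* assumed reference clock from the board */

    static int core_clock_mhz(int pll_multiplier)
    {
        return REF_CLOCK_MHZ * pll_multiplier;
    }

    int main(void)
    {
        printf("mult 32 -> %d MHz\n", core_clock_mhz(32));  /* 3200: intended */
        printf("mult 28 -> %d MHz\n", core_clock_mhz(28));  /* 2800: the whoops */
        return 0;
    }

If it works anything like that, an older board revision carrying a stale multiplier setting would downclock every kit built on it, which would fit whole batches being affected.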

That sucks, though. I am sure they are built by hand at the moment...

Hong.
 
Source? Seriously, this sounds like a problem on the QC end of things, if real. But I'm having a really hard time believing that. I mean, they test each chip rolling off the production line to see if it's up to spec. How could a slow chip pass inspection? PEACE.
 
You know that feeling when you press the gas pedal but the car don't go - and you don't know why?

I honestly don't know what sets the clockspeed - I hadn't even thought about it. I don't even know how PCs set their clockspeeds these days (seriously! actually I'm quite embarrassed to admit this!). I just assumed there's a manufacturing step that sets the maximum speed, and that in the case of Xenon they'd all be the same and pass/fail in test.

Back when I was a kid you got yerself a 1MHz crystal and that was it - you could divide it down. If you wanted to go faster, you replaced it with a faster crystal and ... well everything broke since the entire system was tied to that crystal...
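
Just to put illustrative numbers on the divide-down idea (values invented, obviously):

    #include <stdio.h>

    /* The old divide-down scheme: one crystal, everything derived from
     * it by integer division. Illustrative values only. */

    #define CRYSTAL_HZ 1000000  /* the classic 1MHz crystal */

    int main(void)
    {
        printf("CPU at crystal/1:    %d Hz\n", CRYSTAL_HZ / 1);
        printf("timer at crystal/16: %d Hz\n", CRYSTAL_HZ / 16);
        /* Modern parts go the other way: a slow reference multiplied
         * up by a PLL, as in the sketch a few posts back. */
        return 0;
    }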

Tis puzzling. I might have the relevant email tomorrow...

Jawed
 