Whoops: XB360 at 2.8GHz

Is this an inopportune moment for a joke referring to an upcoming announcement to reveal that the lower speed will be the "core" version and the higher speed will be the "deluxe" version? :p
 
xbdestroya said:
***UPDATE***

Well, Fouad - I guess you were right. Indeed in Microsoft's NDA'd developer notes, they indicate the final tri-core design at 3 GHz rather than 3.2 GHz. Now I'm not sure what the significance of this is; honestly I still think 3.2 would be what to expect. But indeed you are vindicated in having preached what you were preaching.

Taken from the document: 'Xbox 360 Alpha vs. Final Hardware Performance'

Keep in mind the doc itself states that specs could change substantially prior to release, and that the doc is an early release. So again, I'm still going with the 3.2 GHz Microsoft stated, but indeed the doc we were both quoting *does* say 3 GHz.

Going back to a question I asked earlier in this discussion, does anyone know the FSB frequency for the XeCPU chipset/CPU? I'm wondering what the clock multiplier would even have to be to run at 3 GHz. For some reason I had been thinking the clock moved in 400 MHz increments before.
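To make the multiplier question concrete, here's a quick sketch. The base-clock candidates below are pure guesses for illustration; the XeCPU's actual reference clock hasn't been published:

```python
# Hypothetical numbers: the XeCPU's reference clock isn't public,
# so these base-clock candidates are assumptions for illustration.
def multiplier(target_hz, base_hz):
    """Return the integer multiplier needed to hit target_hz from base_hz,
    or None if only a fractional multiplier would work."""
    m, rem = divmod(target_hz, base_hz)
    return m if rem == 0 else None

targets = [2_800_000_000, 3_000_000_000, 3_200_000_000]
for base in [100_000_000, 200_000_000, 400_000_000]:
    mults = [multiplier(t, base) for t in targets]
    print(f"{base // 1_000_000} MHz base -> multipliers {mults}")
```

Notice that with a 400 MHz reference, 3.0 GHz would need a non-integer multiplier (7.5), while 2.8 and 3.2 GHz come out to whole numbers (7 and 8), which is why the increment question matters.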

I wonder who has not glanced over those docs... We are making a big fuss over nothing: expect 3.2 GHz chips in the retail machines (there are developers with 3.2 GHz XeCPU's, so there goes the theory ;)).
 
Panajev2001a said:
I wonder who has not glanced over those docs... We are making a big fuss over nothing: expect 3.2 GHz chips in the retail machines (there are developers with 3.2 GHz XeCPU's, so there goes the theory ;)).

Agreed.
 
Well, if everybody has such good insight into this file, then the file is effectively public domain and could be published. Can someone post the link so we can check it out?
Primary docs (like the Xenon white paper) explicitly talked about a 3 GHz CPU. I think that's old data, not related to the alpha, beta, or final dev kits. I think the 2.8 GHz is a production defect and MS will do a recall.
 
I think the whole debate over things like this often brings out the worst side in some people (READ: ******s).

Although Jawed's title was a little misleading (many thought the X360's XeCPU had been downgraded), I doubt it matters anyway.

IF the yields at 3.2 GHz don't meet the demand MS is expecting but the 3.0 GHz yields DO, then IMO the smartest move is to ship the final X360 at that speed, as they will still sell more and most people don't care anyway. Remember, they're there to make money, and that's always a company's first priority. And let's speculate the same thing for 2.8 GHz: I don't see it as bad business if the choice is between (and remember, I'm just speculating) shipping, say, 300K units worldwide at 3.2, 600K worldwide at 3.0, or 4 MILLION units worldwide at 2.8.
What would you do if you were in charge? I know what I would do, at least.

Lastly, IF there are any changes, is that really so strange? Just look at the current-gen consoles and how the numbers jumped both UP and DOWN at the last minute.
 
I can't believe some people still think this is anything other than a manufacturing mistake.

Devs already have 3.2GHz debug and dev kits. It's just that some of them got made wrong. A juicy bit of gossip that I couldn't let pass...

Jawed
 
From the email:

Final Development Kits started shipping to developers roughly 2 weeks ago. We've determined that a large percentage of these kits shipped with the CPU clock speed locked at 2.8GHz instead of the correct final CPU clock speed of 3.2GHz. Because the CPU is locked at 2.8GHz at the time of kit assembly there is no software fix to tweak the frequency to 3.2GHz and thus no way to increase the frequency of kits in the field. The manufacturing teams have verified that no retail units were affected, this is purely an issue with development kits.

With the exception of clock speed, the CPUs are exactly the equivalent of retail console CPUs and the development kits themselves have no other disadvantages in terms of functionality. Use of a final development kit with a 2.8GHz processor still enables you to install the August Final release and associated QFEs, which raises the memory clock speed to 700MHz
...

Sympathies to XB360 dev teams! Blimey.

Jawed
 
Jawed said:
Because the CPU is locked at 2.8GHz at the time of kit assembly there is no software fix to tweak the frequency to 3.2GHz
Probably one or a couple surface-mount resistors being soldered into the wrong place, setting an incorrect multiplier. That's my guess. :p It doesn't seem as if the chips themselves are different, and I wouldn't expect them to be either.
 
Guden Oden said:
Probably one or a couple surface-mount resistors being soldered into the wrong place, setting an incorrect multiplier. That's my guess. :p It doesn't seem as if the chips themselves are different, and I wouldn't expect them to be either.

Or someone could have grabbed a batch of pre-production sample CPUs by accident, which seems much more likely. After all, they would have done a run of lower-clocked CPUs to test the fabrication process before they started mass-producing the final CPUs. It's quite possible that a box of those pre-production samples was lying around in the same area where the first final-spec chips were produced, and they were mixed in by accident.
 
Jawed said:
I can't believe some people still think this is anything other than a manufacturing mistake.

Devs already have 3.2GHz debug and dev kits. It's just that some of them got made wrong. A juicy bit of gossip that I couldn't let pass...

Jawed

but u could have titled thread differently
 
Slightly off-topic. Well, quite off-topic but related to the X360 Development SDKs…

How much do they cost? I seem to remember someone said they were quite cheap for an SDK, but I can't find the quote.
Is the price the same for an alpha kit as for a final version of the SDK, or do you pay a fixed amount and receive newer versions as they become available?

Thank you.
 
Powderkeg said:
Or someone could have grabbed a batch of pre-production sample CPUs by accident, which seems much more likely. After all, they would have done a run of lower-clocked CPUs to test the fabrication process before they started mass-producing the final CPUs.
The clock speed is determined after the chip's been manufactured, during chip functionality verification. It would make no sense to fabricate lower-clocked chips to test with, because they'd be completely different from the real chips; essentially a different product... Besides, the manufacturing process used for these chips is NOT new. In fact it's quite mature by now.

It's quite possible that a box of those pre-production samples was lying around in the same area where the first final-spec chips were produced, and they were mixed in by accident.
I don't see how that would have been the case. These things aren't soldered together by hand. They're built in automated plants.

Like I said, I very VERY much doubt the lower-clocked and fully-clocked machines use CPUs that are different, they probably just have different multipliers set, either on the CPU substrate or on the motherboard.
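As a toy model of the strap-resistor idea: a few pull-up/pull-down resistors encode a multiplier code that the CPU latches at reset, so stuffing the wrong resistors at assembly permanently picks the wrong speed. Every value here is hypothetical; the real board's straps, codes, and reference clock are unknown:

```python
# Toy model of clock straps. All values are hypothetical illustrations,
# not actual Xbox 360 board details.
BASE_MHZ = 400  # assumed reference clock, not a confirmed figure

# Assumed mapping from a 2-bit strap code (set by resistor placement)
# to the CPU clock multiplier.
STRAP_TO_MULT = {0b00: 6, 0b01: 7, 0b10: 8, 0b11: 9}

def core_clock_mhz(strap_code):
    """Clock the CPU would latch at reset for a given strap code."""
    return BASE_MHZ * STRAP_TO_MULT[strap_code]

# Mis-stuffed strap resistors change the code at kit assembly:
# 0b10 (the intended 3.2 GHz) becomes 0b01 (2.8 GHz), with no
# software fix possible afterwards.
print(core_clock_mhz(0b10))  # 3200
print(core_clock_mhz(0b01))  # 2800
```

This would also fit the email's claim that the clock is "locked at the time of kit assembly": once the resistors are soldered, no software update can change the latched multiplier.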
 
DarkRage said:
How much do they cost?
SDKs typically cost LOTS (50-ish K US$, maybe more I suppose), but then you also have to pass qualifications to even be allowed to buy one. I highly doubt just any guy could order a devkit, even if he/she has the money to blow on one.
 
Guden Oden said:
The clockspeed is determined after the chip's been manufactured, during chip functionality verification. It would make no sense to fabricate lower clocked chips to test with, because they'd be completely different compared to the real chips; essentially a different product... Besides, the manufacturing process used for these chips is NOT new. In fact it's quite mature by now.


I don't see how that would have been the case. These things aren't soldered together by hand. They're built in automated plants.

Like I said, I very VERY much doubt the lower-clocked and fully-clocked machines use CPUs that are different, they probably just have different multipliers set, either on the CPU substrate or on the motherboard.

I agree with everything here. I think some people really have an issue grasping what determines chip speed. Every chip on a revision is 'the same,' so to speak. It's what's done to the chip after the fabbing process that separates it from its kin.

On the issue of the SDKs, I know the price tag for the PS3 SDKs early on was $100,000 each (whether it's changed since, I don't know), but it's safe to say that whatever Microsoft's SDK costs, it costs a fair deal less than that.
 
Well, actually, I can't find any confirmation of the $100,000 figure I swore was quoted before - so who knows, maybe it is $25,000?

Oh well I'm sure someone who knows better will float by this topic and answer our questions at some point. :)
 
Even if it is 25k, that's hardly pocket change. :) I sure hope Sony follows through on the talk about Linuxing PS3, think of the scene demos you could cook up on a machine like that... Staggering!

I think it's fairly probable they'll do a hobby development environment of some kind; there's a long tradition of homebrew development on the PSes after all, starting with the Yaroze PS, then continuing on with the (way over-priced) PS2 Linux kit.

Maybe a PS3 Linux system could be sort of a new starting-point for young talented programmers, like the basic prompts of the old 8-bit systems in the early 80s... :)
 
Have these dev kit prices been confirmed?

I was also under the impression that these prices are not something out of the ordinary.

In any case, I do consider 25,000 chump change when compared to a UE3.0 license.
 