Xbox 360 specs (unofficial, but believable)

Besides, hasn't STI already stated that Cell can scale to 4.6GHz?

Because Cell is going into servers too. It can scale to 4.6 GHz, but that could be server-only. It can also scale to 80x1000 for all we know. Do you honestly think you will see that in a console?

Now I'm being optimistic with a 3.5 GHz 1x8. It could well be 4 GHz, but I doubt it, just as I doubt the X360 CPU will be 4 GHz.

What would be the basis for two chips running at similar clock speeds, with similar die sizes, having such a disparity in efficiency?
There are a lot of factors: optimal cache sizes, penalties for mispredictions, latency. Keeping the Cell chip fed and running code optimally on 9 different cores will be harder than on a 3-core CPU. It will also be harder to keep it filled.
 
wco81 said:
A 40 GB drive being big enough to serve as storage for HDTV recordings?

Not so sure about that. Comcast offers HDTV PVRs with 160 GB drives, I think, and they only get about 10-20 hours. Of course MS could use VC-1, but you'd have to use 1/10th the bitrate to store a comparable amount of recordings.

ATSC tuners are not that cheap yet, not to mention QAM tuners or tuners which will record from satellite.

Plus VC-1 compression in software will chew up a lot of CPU. Probably enough to affect game performance?

Would you be able to record "24" in HDTV resolution (in this case 720p) and play Halo3 at the same time?

The pricing would be very attractive, a $360 box to record HDTV video as well as play games. Hell, I'd pay that much just for the HDTV recording capabilities alone.

Maybe the only way to do it is to stream HDTV video from an MCE. The WSJ article said the MCE extender functionality would be built-in. Would playing a stream from an MCE use the CPU? Or could it be that once captured/encoded by the MCE, decompressing the streams would require a fraction of the CPU needed to compress?

Wtf, you expect the ability to record video and play games at the same time? I for one certainly don't.
 
Brimstone said:
therealskywolf said:
jvd said:
The GPU clock speed, eDRAM amount, hard drive size, and bandwidth from main RAM are wrong.

In what way? Are they worse or better? I've heard the eDRAM amount would be bigger and the hard drive at least 40 GB. The GPU clock speed also sounds kinda low; aren't last year's GPUs already around that?

A GPU with eDRAM is different from one without. Getting a high clock speed on a GPU that includes eDRAM is difficult. Heat is also a major concern, because it can cause the eDRAM to fail.

That's true assuming the eDRAM is on the same chip. If the separate EDRAM module that's been talked about is real, then it's a different story.
 
MechanizedDeath said:
pakotlar said:
The 360 GFLOP figure that the Cell is touting is for single precision, best-case scenario. I wouldn't bet on it actually achieving that figure outside of very rare circumstances. Will the CPU be more powerful than the X360's? Yes. Will it be a lot more robust? Probably not. Also keep in mind that the PS3 GPU may rely on the Cell's SPUs to process vertex data, in which case the Xbox 360 may come much closer performance-wise to the PS3. I do not believe that the PS3 will be much more powerful than the Xbox 360. It will have a somewhat more robust CPU setup, and a more powerful (?) pixel rendering setup. However, it may add up to a very similar gaming experience compared to the Xbox 360. Remember what kind of technical advantage the PS2 touted compared to the Dreamcast? I personally bought into the hype and didn't buy a Dreamcast at launch. When the PS2 came out, I remember comparing the games and remarking how similar the graphics between the two were.

Doesn't this keep the comparison relative? If the Cell FLOPS figure is ideal, and the real-world figure is lower, then why wouldn't the XeCPU figure also be an ideal figure with the same downward scaling? It's been en vogue to accuse Sony of fluffing their numbers, but I don't see how it'll be different for MS. And if they're both fluffing, then the comparison stays the same: roughly a 3x advantage in CPU power. It could just be that STI did their homework, that's all. They spent enough; shouldn't we expect them to reap the rewards? PEACE.

Yep, good point. The MS figures are also for an ideal scenario. The efficiency of each chip (how close to max performance it can reach) remains to be seen. Also remember that some of the technology developed by Sony/Toshiba/IBM is going into the X360 CPU, if the rumors are correct (specc'ing it to be 3 PEs). From what I understand, the Cell is focusing on vector math with the multiple SPUs.


Questions to answer:
How efficient is the Cell at processing A.I. and physics code?
Are the X360 CPU's better at general purpose computing than Cell?
How much (if any) of Cell's processing time will be spent on vertex data?
 
one said:
Jaws said:
barnak said:
Sorry, I haven't been keeping up with the specs as much as I'd like. So MS went ahead and went for the 512 MB of RAM?

If this spec is true then yes, 512MB is good.

Memory
- 512 MB GDDR3 RAM
- 700 MHz DDR

Memory Bandwidth
- 22.4 GB/s memory interface bus bandwidth
- 256 GB/s memory bandwidth to EDRAM
- 21.6 GB/s frontside bus

700 MHz DDR (1400 MHz effective)

22.4 GB/s --> 128 bit bus

I was hoping it would be 256 bit ---> 44.8 GB/s :(
A 128-bit bus enables them to go from 8 x 512 Mbit chips to 4 x 1 Gbit chips when cheaper 1 Gbit chips become available a few years later. Then they can retool the motherboard more simply, and more cheaply.

BTW, 256 GB/s memory bandwidth to the EDRAM means it uses a 4096-bit bus running at 500 MHz, if the GPU is a single chip.

Or it could just mean it's using compression with a smaller bus width...
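
For anyone wanting to sanity-check the figures above, here's a minimal sketch of the bandwidth arithmetic. It assumes the rumored bus widths and clocks quoted in this thread (128-bit at 700 MHz DDR for main memory, 4096-bit at 500 MHz for the EDRAM), which are not confirmed specs:

Code:
# Peak bandwidth = (bus width in bytes) x (effective transfer rate).
# Bus widths and clocks are the rumored figures from this thread,
# not confirmed Xbox 360 specs.

def bandwidth_gb_s(bus_bits, effective_mhz):
    """Peak bandwidth in GB/s for a given bus width and effective clock."""
    return (bus_bits / 8) * (effective_mhz * 1e6) / 1e9

print(bandwidth_gb_s(128, 1400))   # main RAM: 128-bit, 700 MHz DDR -> ~22.4 GB/s
print(bandwidth_gb_s(256, 1400))   # the hoped-for 256-bit bus -> ~44.8 GB/s
print(bandwidth_gb_s(4096, 500))   # EDRAM: 4096-bit at 500 MHz -> ~256.0 GB/s

# Capacity side of the argument: 8 x 512 Mbit chips and 4 x 1 Gbit chips
# both give the same 512 MB, so a 128-bit bus lets them drop to fewer,
# cheaper chips later without changing total memory.
print(8 * 512 // 8, "MB vs", 4 * 1024 // 8, "MB")   # both 512 MB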
 
PC-Engine said:
That's true assuming the eDRAM is on the same chip. If the separate EDRAM module that's been talked about is real, then it's a different story.
If eDRAM is not on the same die as the GPU then it's not embedded anymore. Hence it would only be E(nhanced)DRAM.
 
Squeak said:
PC-Engine said:
That's true assuming the eDRAM is on the same chip. If the separate EDRAM module that's been talked about is real, then it's a different story.
If eDRAM is not on the same die as the GPU then it's not embedded anymore. Hence it would only be E(nhanced)DRAM.

Would putting quotes around eDRAM make you happy? Did it change the point? No.

If it's not embedded it could be called any type of DRAM, DDR, GDDR, XDR...
 
Hang on, where is jvd getting his information from about the eDRAM amount and GPU clock speed being wrong?
 
Squeak said:
I confused those two myself once, so just trying to get the semantics right. :)

I know what eDRAM is, and maybe I should've put quotes around eDRAM to make it clear that technically it wouldn't be "eDRAM". Regardless, if it's off-die then it could be called anything, not just EDRAM or Enhanced DRAM. :)
 
Re: Anti-Aliasing

Megadrive1988 said:
I was going to start a separate thread for this - but that doesn't seem right at this time - I was going to start a topic on anti-aliasing on the 360, so I'll just throw it out here in this perfectly capable thread :)

What can we expect for AA on the '360? Are developers limited to 4x at HDTV resolutions?

Is 4x AA for 720p and 1080i, or just 480p/480i?


If defaulting down to 480i/480p, can we get more AA to help make up for the lower res?

Can we expect AA in all '360 games?


Current consoles with decent AA *capability* (GCN, Xbox) don't seem to have many games that use it -- developers seem to want the extra cost of AA to go into prettier graphics and smoother framerates. It's fair to say that AA on this generation of consoles (DC, PS2, GCN, Xbox) has been almost a disaster.

Not just on PS2, but on all consoles. OK, perhaps 'disaster' is the wrong word. Failure? Disappointment? Definitely a disappointment for me, without a doubt.

This does deserve a thread by itself. I utterly agree with you that AA has been a massive disappointment this gen.

When the Dreamcast first came out, the one thing I really liked about it was the smoother AA and far fewer jaggies than the PlayStation. I had all these high hopes for the consoles to come.
The PS2 with its jaggies and shimmering textures was a shocking letdown, and the Xbox, for all its vaunted power, never seemed to have AA turned on in any games because of frame rate issues.

One of the best features of the X360 is that it recognizes the need to get AA done right, with a dedicated AA filter and native 720p resolution.
 
What do you mean by output filter? I don't know how it worked. I always thought the Xbox at least should have had a lot better AA than a console released in 1998. It didn't seem to.
 
Xbox 360's EDRAM will be embedded alright. The point about using EDRAM is that the blending/filtering hardware (Raster Output) which is normally the final stage of the graphics pipeline, is going to be on the same chip as the EDRAM. That alone means that anti-aliasing will be extremely fast on Xbox 360, and it's why the "effective 256GB/s" figure is thrown around.

Whether the ROP/EDRAM will be on die with the rest of R500 is still open to debate. Personally, I think there's a chance it will, because I think R500's graphics pipeline is smaller (fewer transistors) than a lot of people are expecting. I conclude this because of the unified shading architecture and the relatively low peak fill-rate of 4 GP/s (i.e. much lower than R420, for example, which at 500 MHz is 8 GP/s).

R500 is, as far as I can tell, designed to maintain a high fill-rate when AA/AF is turned on - whereas R420, for example, can't cut it when you turn on AA and AF:

http://www.beyond3d.com/reviews/ati/aiwx800xt/index.php?p=13

This shows (at 1280x960) how a theoretical 8 GP/s GPU ends up with fill-rates of:

- 130 MP/s
- 86 MP/s with 4xAA/8xAF

So AA/AF reduces performance to about 66% (with heavy shader code and lots of texturing).
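
As a quick sanity check on those numbers (a rough sketch; the 130 MP/s and 86 MP/s values are simply the measured figures quoted from the linked review):

Code:
# Measured fill-rates quoted from the Beyond3D AIW X800 XT review at 1280x960.
theoretical_gp_s   = 8.0     # R420-class peak: 16 pipes x 500 MHz
measured_no_aa     = 130e6   # pixels/s, heavy shader code and lots of texturing
measured_4xaa_8xaf = 86e6    # pixels/s with 4xAA/8xAF enabled

print(measured_4xaa_8xaf / measured_no_aa)        # ~0.66, i.e. "about 66%"
print(measured_no_aa / (theoretical_gp_s * 1e9))  # ~0.016 of the theoretical peak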

Jawed
 
jvd said:
I could see the XCPU hitting maybe 60 in a real-life situation, as I think it would be easier to keep it operating at its maximum, since its maximum is much lower.
Umm, what's with all your pulling stuff out of your bum, J? You're not a programmer, nor a hardware engineer; you really have no clue on the subject (neither do I, I might add, but you don't see me running around quoting speed figures, do you?).

Any current PPC/x86/MIPS/SPARC/etc. CPU scores rather far from its theoretical max on almost any FPU workload; there's no reason to believe it'll be any different for the X360's CPU.
 
Agisthos said:
What do you mean by output filter? I don't know how it worked. I always thought the Xbox at least should have had a lot better AA than a console released in 1998. It didn't seem to.

I'm not an expert on it, and some of what I say could be wrong, but as I understand it you can filter the output - Dreamcast, for instance, filtered the image by taking advantage of the fact that it was outputting to an interlaced TV but had a full-size frame buffer, allowing a "free" form of AA. Because many of the original PS2 titles didn't use full-frame buffers, they couldn't enable it (but later ones did).

Also, the sharpness of the output can be manipulated - PS2 was crystal clear, which really showed up a lot of the jaggies, whereas Dreamcast had a sort of "soft" look - Xbox seemed to be somewhere in between.

In any case, if you play Soul Calibur or whatever on a monitor, you'll see there is no AA going on. Some games had some SSAA going on - apparently Ready To Rumble 1 or 2 did - but the majority didn't.

I do remember at the time, though, thinking how great the DC looked and that it looked like it had some AA going on :) But yeah, to this day very few console games use AA - Quincunx on Xbox was too blurry, and other AA modes too slow. Because most TVs are kind of blurry, devs figured they were better off spending the fillrate on extra features rather than on AA.
 
If MS manages to keep their console priced much lower than the PS3 and continues to steal more developers from Sony, we should see a very successful product in years to come. That is, if they can meet consumer demand. History has shown that superior hardware doesn't guarantee a successful console, and I am certain MS is heading down that path.
Do not be put off by how much more powerful the PS3 would be, because the further the power goes, the pricing will follow in parallel. That would end up being a downside for their marketing.

A tri-core CPU @ 3 GHz does not come cheap. Look at the dual-core P4 EE 840, for example, or the PowerPC G5 2.7 GHz. I am wondering how they are going to price their consoles below the 500-dollar mark. A 9-core Cell? With a top-of-the-line GPU included? How much in losses are these companies going to bear? Next gen will not be cheap.
 
hugo said:
A tri-core CPU @ 3 GHz does not come cheap. Look at the dual-core P4 EE 840, for example, or the PowerPC G5 2.7 GHz. I am wondering how they are going to price their consoles below the 500-dollar mark. A 9-core Cell? With a top-of-the-line GPU included? How much in losses are these companies going to bear? Next gen will not be cheap.
The 9 cores of Cell aren't comparable to the dual Pentium cores. Overall, Cell occupies as much die space as these dual-core processors, and as I understand it that's the limiting factor. Plus, the dual-core Pentiums have a massive markup, as they're for a specialist market willing to blow money away.

The price of a dual-core Pentium/PPC is not comparable to the price of a 9-'core' Cell.
 