More Xenon rumors

Guden Oden said:
vliw,

You didn't actually respond to my post, you just reiterated the position you'd already made public in your previous post. :?

Of course main memory bandwidth is important. Where else is the CPU going to get instructions and data structures from? Where will the GPU fetch textures from? And so on. One can't just rely on eDRAM, that doesn't solve anything.

Sorry, I'm very busy right now; soon I'll respond to everything you need to know.
 
The 3.5GHz number will not happen, I would bet on it. The latest info is spread by someone who likes the attention of seeing his "work" flood the web. This is fake! 8)
 
Brimstone said:
More eDRAM would obviously be nice. TSMC has a MoSys license, so the eDRAM they put into the Xbox 2 GPU will be of a much higher density than what was on the GameCube Flipper chip. Unless Flipper used 1T-SRAM-Q, which I don't think it did.

Isn't this close enough? ;)

"Instead of going for the highest possible performance, which does not contribute to software development, our idea was to create a developer-friendly next generation TV game machine that maintained above-standard capabilities.

In order to accomplish this, we have painstakingly removed the "bottlenecks" which hinder an efficient system. We have introduced 1T-SRAM technology, which has a minimum of delays, into the main memory and the Graphics LSI Mixed Memory. Also, secondary cache memory with a large capacity was implemented in the MPU. With this combination we have succeeded at creating reliable functionality that can be used with actual games." - Nintendo of America
 
The 3.5GHz number will not happen, I would bet on it. The latest info is spread by someone who likes the attention of seeing his "work" flood the web. This is fake!

You never know... Prescott is up to 3.6GHz now, and all three current consoles actually launched with higher CPU clocks than originally specified... :p
 
Archie said:
You never know... Prescott is up to 3.6GHz now, and all three current consoles actually launched with higher CPU clocks than originally specified...
Exactly - hence Xenon will definitely not launch at 3.5GHz (as the DM(tm) logic extension states).

In fact, here are the real numbers you should expect:

Xenon 3.5GHz -> 4.27GHz
PS3 4GHz -> 4.8GHz
...
:oops:
 
archie4oz said:
The 3.5GHz number will not happen, I would bet on it. The latest info is spread by someone who likes the attention of seeing his "work" flood the web. This is fake!

You never know... Prescott is up to 3.6GHz now, and all three current consoles actually launched with higher CPU clocks than originally specified... :p

I believe Xbox, and possibly PS2, launched with lower CPU clocks than the highest ever batted around for those systems (800MHz for Xbox I think, and I'm not sure about PS2).
 
Fox5 said:
I believe Xbox, and possibly PS2, launched with lower CPU clocks than the highest ever batted around for those systems (800MHz for Xbox I think, and I'm not sure about PS2).

No. For Xbox the specs were always for a 600MHz processor, AFAIK. That was raised to 733MHz.

The GPU clock was dropped from 300MHz to 233MHz, though.
 
PS2 was raised from 250 -> 300MHz for the EE (and 125 -> 150MHz for the GS).

Interestingly, the raise of the CPU spec for all three consoles was nearly the same: ~20%.
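
FWIW, here's a quick sanity check of that ~20% figure, using the Xbox and PS2 numbers from this thread. The GameCube pair (Gekko going from the originally announced 405MHz to the shipping 485MHz) is my own recollection, so treat that one as an assumption:

    # Rough check of the ~20% CPU clock uplift across the three consoles.
    # Xbox and PS2 numbers are from the posts above; the GameCube pair
    # (405MHz announced -> 485MHz shipped) is my own recollection.
    specs = {
        "Xbox":     (600, 733),
        "PS2 EE":   (250, 300),
        "GC Gekko": (405, 485),
    }
    for name, (announced, shipped) in specs.items():
        uplift = (shipped / announced - 1) * 100
        print(f"{name}: {announced}MHz -> {shipped}MHz (+{uplift:.1f}%)")

That prints +22.2%, +20.0% and +19.8% respectively, so all three land within a couple of points of 20%.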
 
Well, the GPU spec in Xbox was dropped from 250 to 233. The 300MHz spec wasn't officially claimed by Nvidia, and was a leftover from GigaPixel losing the deal at the last minute. Once things settled down after E3, Nvidia stated it was going to be 250MHz.
 
Fafalada said:
Exactly - hence Xenon will definitely not launch at 3.5GHz (as the DM(tm) logic extension states).

Deadmeat once said nextbox would have a 10GHz Tejas CPU, so that shows how much of a clue he has about anything... Anyway, is DM posting anywhere on the web currently, or did someone finally shoot the troll and hang his head over the fireplace as a trophy or something? :LOL:
 
Qroach said:
Well, the GPU spec in Xbox was dropped from 250 to 233. The 300MHz spec wasn't officially claimed by Nvidia, and was a leftover from GigaPixel losing the deal at the last minute. Once things settled down after E3, Nvidia stated it was going to be 250MHz.

IMO that's incorrect. I believe the initial 300MHz was targeting 130nm; the subsequent climbdown to 250 came when they realised 130nm would be a no-go, so they retargeted for 150nm. The final climbdown to 233 came when they started getting the 150nm chips back and got a handle on initial yields.
 
DaveBaumann said:
Qroach said:
Well, the GPU spec in Xbox was dropped from 250 to 233. The 300MHz spec wasn't officially claimed by Nvidia, and was a leftover from GigaPixel losing the deal at the last minute. Once things settled down after E3, Nvidia stated it was going to be 250MHz.

IMO that's incorrect. I believe the initial 300MHz was targeting 130nm; the subsequent climbdown to 250 came when they realised 130nm would be a no-go, so they retargeted for 150nm. The final climbdown to 233 came when they started getting the 150nm chips back and got a handle on initial yields.

I'm going to go with Qroach on this one. The 300MHz was the original GigaPixel number; I only ever heard this number directly from Samus (although there might have been a PowerPoint with it on), and that was before the final configuration was announced. By the very first XFest, which was the first "real" info, the number had changed to "at least 250MHz". I do not remember any reference to the Nvidia part at 300.

I do remember an old PowerPoint circulating with GigaPixel specs on it though, so possibly that's what you're remembering.
 
DaveBaumann said:
Qroach said:
Well, the GPU spec in Xbox was dropped from 250 to 233. The 300MHz spec wasn't officially claimed by Nvidia, and was a leftover from GigaPixel losing the deal at the last minute. Once things settled down after E3, Nvidia stated it was going to be 250MHz.

IMO that's incorrect. I believe the initial 300MHz was targeting 130nm; the subsequent climbdown to 250 came when they realised 130nm would be a no-go, so they retargeted for 150nm. The final climbdown to 233 came when they started getting the 150nm chips back and got a handle on initial yields.

But by the time Xbox released, Nvidia already had the Ti500 series clocked at 250MHz (though my GeForce 3 Ti200 maxes out at 230MHz, so I guess that's what they were getting out of mass production).

BTW, does anyone remember what the GigaPixel specs were?
 
Fox5,

But by the time Xbox released, Nvidia already had the Ti500 series clocked at 250MHz (though my GeForce 3 Ti200 maxes out at 230MHz, so I guess that's what they were getting out of mass production).

You're right, Nvidia did hit 250MHz. They actually hit that with the Xbox devkits for launch titles, but MS decided to knock it down to 233 to increase the yields.
 
Fox5 said:
But by the time Xbox released, Nvidia already had the Ti500 series clocked at 250MHz (though my GeForce 3 Ti200 maxes out at 230MHz, so I guess that's what they were getting out of mass production).

And it was significantly less complex (one vertex shader to NV2A's two, half the FX shader ALUs, and it lacks double stencil/Z).
 
DaveBaumann said:
Fox5 said:
But by the time Xbox released, Nvidia already had the Ti500 series clocked at 250MHz (though my GeForce 3 Ti200 maxes out at 230MHz, so I guess that's what they were getting out of mass production).

And it was significantly less complex (one vertex shader to NV2A's two, half the FX shader ALUs, and it lacks double stencil/Z).

NV2A had all those? And they were all located on the chip?
Anyhow, aren't chips rather cheap to produce anyway? I mean, even an X800 chip probably doesn't cost $40; it's the RAM, I think, that drives the price up (that and wanting to make huge profits).
 
ERP said:
I'm going to go with Qroach on this one. The 300MHz was the original GigaPixel number; I only ever heard this number directly from Samus (although there might have been a PowerPoint with it on), and that was before the final configuration was announced.

I'd only heard of the 300MHz number in relation to NV2A myself. They seem to be talking purely about NV2A here as well.

[edit] Reading the Allard interview, comments such as "we learned a bit more about the production and the manufacturing" and "From a physics change it's what's possible and what's right" all seem to indicate a change in the targeted process as well.
 
Okay, as far as I remember, the 300MHz GPU for Xbox was both the reported clockspeed for the GigaPixel GPU (GP4) and the announced clockspeed for the Nvidia chip. The Nvidia chip was not called NV2A at the time of Xbox's official unveiling in March 2000; it was called the X-Chip and XGPU, and was reported and assumed to be the NV25. Only months later was it revealed to be something other than an NV25: the NV2A.

Now, I say the GigaPixel part was reportedly 300MHz because my memory keeps telling me that the 4.8 billion pixels per second fillrate announced for Xbox (presumably 300MHz x 4 pipes x 4 AA samples, since 300MHz x 4 pipes alone only gives 1.2 Gpixels/s) actually spanned both the period from Jan-Feb to mid March, when GigaPixel was the X-Box graphics provider, and the official announcement of Xbox with Nvidia as the graphics provider later that March. So I might be wrong about it, but *something* is sticking in memory about a 4.8 Gpixel / 300MHz GigaPixel GPU. I think it was a magazine that listed Xbox specs with GigaPixel.

Edit: the 300 figure I am remembering for the GigaPixel GPU could be either 300MHz or 300 million polygons per second, or both.
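
As a sanity check on that 4.8 Gpixel figure, here's the arithmetic, with the caveat that the 4-pipe / 4-AA-sample breakdown is my assumption; only the headline number was ever published:

    # The announced 4.8 Gpixels/s only works out if it counts antialiased
    # samples: 4 pipes at 300MHz alone gives a quarter of that.
    # The 4 pipes x 4 samples breakdown is an assumption, not a published spec.
    clock_hz = 300e6
    pipes = 4
    aa_samples = 4
    print(f"raw fill:         {clock_hz * pipes / 1e9:.1f} Gpixels/s")
    print(f"antialiased fill: {clock_hz * pipes * aa_samples / 1e9:.1f} Gpixels/s")

That gives 1.2 Gpixels/s raw and 4.8 Gpixels/s antialiased, which matches the announced spec.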
 
NV2A is said to have around, or in excess of, 3x the geometry/vertex performance of the GeForce 3 and GeForce 3 Ti200, and still around 2x that of the GeForce 3 Ti500, since all GF3s have one vertex shader.
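
A naive per-clock check of those ratios, assuming vertex throughput simply scales with shader count times core clock (the clocks are the commonly cited retail numbers, and this ignores any per-clock or fixed-function differences between the parts):

    # Naive vertex-throughput comparison: vertex shaders x core clock.
    # Clocks are the commonly cited retail numbers; any per-clock
    # differences between NV2A and the desktop GF3s are ignored.
    parts = {
        "NV2A":      (2, 233),
        "GF3":       (1, 200),
        "GF3 Ti200": (1, 175),
        "GF3 Ti500": (1, 240),
    }
    nv2a_rate = parts["NV2A"][0] * parts["NV2A"][1]
    for name, (shaders, clock) in parts.items():
        ratio = nv2a_rate / (shaders * clock)
        print(f"{name}: {shaders} shader(s) x {clock}MHz -> NV2A is {ratio:.2f}x")

On units x clock alone, NV2A comes out ~2.3x a stock GF3, ~2.7x a Ti200 and ~1.9x a Ti500, so the "around 2x Ti500" part checks out, while the 3x claim presumably factors in something beyond shader count and clock.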
 
aaaaa00 wrote

No. For xbox the specs were always for a 600 mhz processor AFAIK. That was raised to 733 mhz.

I think the 600MHz CPU was an AMD Duron, so that was probably responsible for the change. IMO the Duron would have been a better choice than the Celeron that's now in the Xbox.
 