Xbox 2 Speculation!

The R350 right now with the F-Buffer is showing some very impressive results.

If this is an indication of what can be done by GPUs now, I think it's safe to say that the next-generation GPU going into the Xbox will be VERY powerful. They definitely won't roll over and die to what the PS3 will bring to the table in terms of graphics power -- of course, this depends on how much of the CPU will be spent on graphics calcs in the PS3. I suspect that AA and filtering will be at a point where 4x MSAA, trilinear filtering and 8x AF will be a given, so the images will be sharp. Next, I suspect incredibly high polycounts; the R3xx and NV30 can push a lot of polygons and sustain it nicely. Well, the R3xx can. ;) They have plenty of texturing power to boot, and the R3xx has some really nice vertex and pixel shading abilities.

One thing that would be nice is obviously more power, but more importantly, the ability to create and destroy vertices (HOS and so on).

A big problem is the fact that the massive power of these GPUs is coming from fat boards (in the 10-layer region) and seriously wide buses. The question is how feasible this will be in a console.

If Xbox 2 is built from off-the-shelf parts, then this is how I see it.

- P4 at 3 GHz, with Hyper-Threading and a fast 800 MHz FSB.
Things to think about:
The question is whether this will be a Prescott and, if so, what it will bring in terms of SSE3. This will be important if they want a hope in hell of having enough power to compete in the physics department. AI stuff can be done with integer units, which is where the P4 has some serious oomph.

A modified Banias with emphasis on the Vector and FP side might be a good contender.

Note: I seriously doubt we'll see a Hammer; I think the onboard memory controller will be more trouble than it's worth. Not to mention, I don't think AMD has the fab capacity, and outsourcing would be hard since that would drive up costs, and the Hammer likely has a lot of custom work tuned to AMD's processes. Unless that IBM deal yields something spectacular.

Loons: I'm pretty sure an Itanium wouldn't come into the equation, even though it has ridiculous amounts of FP power.

GPU: Nvidia, ATI or 3DLabs, it doesn't really matter which. Nvidia seems to have really dropped the ball with the NV30; it's just ugly and riddled with legacy garbage -- reminds me of x86. 3DLabs could offer up some really nice, highly programmable and even fast solutions; they have slipstream, virtual texturing and various other things. ATI has chipset experience, and they have licenses from Intel for the P4 bus, IIRC. So they're a definite contender.

Sound: I'm not sure how much this matters. Creative could probably quite easily take a beefier Audigy chip and throw that out, rather than using the same one over and over under a new brand name. ;) Of course, Nvidia will have its offerings, though the bus licensing might be an issue. Then again, Intel might not mind giving Nvidia one just for the Xbox.

RAM: well, this depends on whether it's a unified architecture. The more I think about it, the worse it sounds. I think this generation will need ridiculous amounts of bandwidth, in which case I expect a memory hierarchy.
 
(b) Can you explain to me what, exactly, is wrong with that? Vector Processing is bad? This is news to me.
It's all a matter of context.
E.g. - in the Xbox, vector processing is God's gift to gaming :) Anywhere else... well...
 
jvd said:
Panajev2001a said:
jvd, the max FP ops/cycle for the A64, in 32-bit or 64-bit mode, is identical...

4 FP ops/cycle, that is... In 64-bit mode they have more XMM registers and more GPRs, but they abandoned the super-RISC-style FPU when the K8 design was downgraded from its original ambitious specs...

The A64 FPU is identical (well, more or less ;) ) to the one in the K7 chips, yet it has SSE and SSE2... more for compatibility with all the SSE-optimized code out there... With the dual ADD/SUB and MUL/DIV FP pipe design (plus the third unit, which is the LOAD/STORE pipe) and 3DNow!, we had already reached the 4 FP ops/cycle max...

Thanks for clearing that up. I was told it's 20% faster when running 64-bit programs over 32-bit programs, so I figured it would have a faster FLOPS rating. Still, though, what's the FLOPS rating for it at, say, 3 GHz?

No, the Hammer will just be able, on average, to come closer to the theoretical maximum compared to the Athlon. Some code will not be affected at all, while latency-bound code will improve quite a lot.

An aside - the FLOPS rating really is an utterly useless measure of CPU performance, since real-world code can achieve anything from 0 to 95% of the best-case performance.

A 3 GHz Hammer (not possible until the 90 nm node) would peak at 12 GFLOPS in single precision (using SSE) and 6 GFLOPS in double precision (using SSE2).
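For what it's worth, here is the back-of-the-envelope arithmetic behind those figures, as a minimal sketch. The ops/cycle values are just the peak rates quoted above (one packed SSE ADD plus one packed SSE MUL per cycle in single precision, half that throughput in double precision), not measured numbers.

```python
# Peak-FLOPS sanity check for the figures above.
# Assumption (from the thread): the K8 FPU can retire 4 single-precision
# FP ops per cycle via SSE, and 2 double-precision FP ops per cycle via SSE2.

def peak_gflops(clock_ghz: float, fp_ops_per_cycle: int) -> float:
    """Theoretical peak = clock rate (GHz) * FP operations per cycle."""
    return clock_ghz * fp_ops_per_cycle

print(peak_gflops(3.0, 4))  # single precision (SSE):  12.0 GFLOPS
print(peak_gflops(3.0, 2))  # double precision (SSE2):  6.0 GFLOPS
```

Real code lands well below these peaks, as noted above.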
 
You think the PS 3 will be more powerful than what I described?
I thought those were some really powerful specs I speculated on...WOW.

I wasn't commenting on your specs, but yeah. If Xbox 2 gets a head start against the PS3, Sony will just throw in another Cell or do what they gotta do to give it the edge against Xbox 2. I REALLY doubt Sony will let Xbox 2 come out before the PS3 and have the more powerful specs.

Plus, the PS3 will have the polygon advantage against Xbox 2 anyway. A 1 TFLOP vertex monster will be able to push A LOT of polygons. The GPU is going to be the thing that separates Xbox 2 from the PS3; we will just have to see.
 
Paul said:
You think the PS 3 will be more powerful than what I described?
I thought those were some really powerful specs I speculated on...WOW.

I wasn't commenting on your specs, but yeah. If Xbox 2 gets a head start against the PS3, Sony will just throw in another Cell or do what they gotta do to give it the edge against Xbox 2. I REALLY doubt Sony will let Xbox 2 come out before the PS3 and have the more powerful specs.

Plus, the PS3 will have the polygon advantage against Xbox 2 anyway. A 1 TFLOP vertex monster will be able to push A LOT of polygons. The GPU is going to be the thing that separates Xbox 2 from the PS3; we will just have to see.

How much will that cost?

I say MS can put in two GPUs and everyone cries it will cost too much; I'm sure adding another Cell would cost the same amount of money.
 
glappkaeft said:
jvd said:
Panajev2001a said:
jvd, the max FP ops/cycle for the A64, in 32-bit or 64-bit mode, is identical...

4 FP ops/cycle, that is... In 64-bit mode they have more XMM registers and more GPRs, but they abandoned the super-RISC-style FPU when the K8 design was downgraded from its original ambitious specs...

The A64 FPU is identical (well, more or less ;) ) to the one in the K7 chips, yet it has SSE and SSE2... more for compatibility with all the SSE-optimized code out there... With the dual ADD/SUB and MUL/DIV FP pipe design (plus the third unit, which is the LOAD/STORE pipe) and 3DNow!, we had already reached the 4 FP ops/cycle max...

Thanks for clearing that up. I was told it's 20% faster when running 64-bit programs over 32-bit programs, so I figured it would have a faster FLOPS rating. Still, though, what's the FLOPS rating for it at, say, 3 GHz?

No, the Hammer will just be able, on average, to come closer to the theoretical maximum compared to the Athlon. Some code will not be affected at all, while latency-bound code will improve quite a lot.

An aside - the FLOPS rating really is an utterly useless measure of CPU performance, since real-world code can achieve anything from 0 to 95% of the best-case performance.

A 3 GHz Hammer (not possible until the 90 nm node) would peak at 12 GFLOPS in single precision (using SSE) and 6 GFLOPS in double precision (using SSE2).

Thanks for the info on the Hammer.
 
I say MS can put in two GPUs and everyone cries it will cost too much; I'm sure adding another Cell would cost the same amount of money.

Too bad in my scenario the PS3 would be coming out after Xbox 2, huh? And the price would go down. Sony will take the hit to keep the PS3 at $300 for a US launch, just like they did with the PS2.
 
Paul said:
I say MS can put in two GPUs and everyone cries it will cost too much; I'm sure adding another Cell would cost the same amount of money.

Too bad in my scenario the PS3 would be coming out after Xbox 2, huh? And the price would go down. Sony will take the hit to keep the PS3 at $300 for a US launch, just like they did with the PS2.

First, you didn't say that. Second, how much do you think the Cell chip will go down? I bet you anything the only thing the Cell chip will be used in is the PS3 at that point; it's just not magically going to drop in price. MS can also take the hit too. We also have no clue how much of a hit a Cell chip would take without adding more RAM to it; it may actually end up slowing the chip down relative to how it was before. Unless they increase the RAM, which would make it more expensive, increasing die size and adding more chances for the RAM to not work and render the chip useless.
 
First, you didn't say that. Second, how much do you think the Cell chip will go down? I bet you anything the only thing the Cell chip will be used in is the PS3 at that point; it's just not magically going to drop in price. MS can also take the hit too. We also have no clue how much of a hit a Cell chip would take without adding more RAM to it; it may actually end up slowing the chip down relative to how it was before. Unless they increase the RAM, which would make it more expensive, increasing die size and adding more chances for the RAM to not work and render the chip useless.

First you need to read more carefully, I did say that above.

"If Xbox2 gets a head start against PS3, Sony will just throw in another Cell or do what they gotta do to give it the edge against Xbox2."

Head start meaning Xbox 2 coming out 6-8 months before the PS3. In that time, if Xbox 2 is more powerful than Sony expected and does indeed beat their current PS3 specs, then Sony will increase the power of the PS3 somehow, either by beefing up the rasterizer or maybe even throwing another Cell in.

In 6-8 months the price on a Cell would go down; even if it didn't, Sony would just have to take the hit associated with it, because they would get it back later.
 
jvd said:
How much will that cost?

I say MS can put in two GPUs and everyone cries it will cost too much; I'm sure adding another Cell would cost the same amount of money.

Guess it would come down to pin count. If Sony uses next-gen Rambus memory, they get truly hellishly high bandwidth figures from a relatively narrow bus width.

Twin GPUs, each on a 256-bit bus running GDDR3 (which is what is likely in this timeframe), mean a lot of chips. That takes up board space, and power usage would likely be horrendously enormous too. :)
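To put rough numbers on that narrow-but-fast versus wide-but-slower trade-off, here is a minimal sketch; the per-pin data rates below are purely illustrative assumptions on my part, not confirmed specs for either memory type.

```python
# Peak memory bandwidth = bus width (in bytes) * transfers per second per pin.
# The 6.4 GT/s and 1.6 GT/s figures are illustrative assumptions only.

def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gt_s: float) -> float:
    """Peak bandwidth in GB/s for a given bus width and per-pin data rate."""
    return (bus_width_bits / 8) * data_rate_gt_s

# Hypothetical narrow next-gen Rambus interface: 64 bits at 6.4 GT/s.
print(peak_bandwidth_gb_s(64, 6.4))   # ~51 GB/s from relatively few pins

# Hypothetical wide GDDR3 interface: 256 bits at 1.6 GT/s.
print(peak_bandwidth_gb_s(256, 1.6))  # ~51 GB/s, but far more pins and chips
```

Same ballpark bandwidth, very different pin counts and board complexity, which is the point about pin count above.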

Not that I know how much a Cell + RAM would draw by the way. Probably not peanuts. :D The original EE+GS combo ran plenty hot for its time (30+ W if I remember correctly).

It's pointless speculation really, since I don't ever expect MS to use two GPUs or Sony to chuck two Cells into their next console. Who would need two teraflops of floating-point performance in a consumer device anyway? Are you going to sit at home and crack 1024-bit PGP keys in your living room for fun or something? :D


*G*
 
Grall said:
jvd said:
How much will that cost?

I say MS can put in two GPUs and everyone cries it will cost too much; I'm sure adding another Cell would cost the same amount of money.

Guess it would come down to pin count. If Sony uses next-gen Rambus memory, they get truly hellishly high bandwidth figures from a relatively narrow bus width.

Twin GPUs, each on a 256-bit bus running GDDR3 (which is what is likely in this timeframe), mean a lot of chips. That takes up board space, and power usage would likely be horrendously enormous too. :)

Not that I know how much a Cell + RAM would draw by the way. Probably not peanuts. :D The original EE+GS combo ran plenty hot for its time (30+ W if I remember correctly).

It's pointless speculation really, since I don't ever expect MS to use two GPUs or Sony to chuck two Cells into their next console. Who would need two teraflops of floating-point performance in a consumer device anyway? Are you going to sit at home and crack 1024-bit PGP keys in your living room for fun or something? :D


*G*

Grall, I will invite you over and we will watch movies on my plasma and we will race: I will use my Xbox 2 and you can use your PS3, and we'll see who rips those 1024-bit PGP keys first. :)


Anyway. Yes, a dual GPU would cost a lot. How hot it would run and how much power it would draw, I dunno. If they go ATI or PowerVR it shouldn't be too bad; ATI and PowerVR make very cool chips compared to their rivals.

Also, with two GPUs they can back off the clock speed a little. It doesn't have to be as fast as they can push it and whatnot.

Cost would be easier to bring down. Since they are buying basically two of everything, they will bring it down on volume alone. Think of it this way: if at the end of the day the GS2 (for lack of a better name) costs the same as two ATI GPUs (just an example, don't take it for real), the two chips will drop in price faster, as the quantity is double that of the other chip in each console. Or am I wrong?
 
What did Sony promise? Are you speaking of the "Toy Story" graphics comment?

Not just Toy Story, but Hollywood-quality effects, 50x the DC, technologically supreme for 6-8 years, polygons are useless, a supernode for the entertainment network -- all words from Sony themselves. :oops:

b) Can you explain to me what, exactly, is wrong with that? Vector Processing is bad? This is news to me.

Going by what the expert Mr Randy Orton said,

"Our belief is Sony won't go with pixel shading, but will make every pixel a polygon and throw polygons at the problem. It's a different philosophy," Orton said.

But that's not optimal for graphics processing that requires a variety of specialized polygon and rendering engines, he said. And programmers will need a new environment that shields them from such massive parallel processing, he added.


Looking back at the PS2... Sony has a task at hand. :oops:
 
jvd;

How much will that cost?

I say MS can put in two GPUs and everyone cries it will cost too much; I'm sure adding another Cell would cost the same amount of money.

Of course they can - no one is saying they can't. The question is, how likely are they to?

First, you didn't say that. Second, how much do you think the Cell chip will go down? I bet you anything the only thing the Cell chip will be used in is the PS3 at that point; it's just not magically going to drop in price. MS can also take the hit too. We also have no clue how much of a hit a Cell chip would take without adding more RAM to it; it may actually end up slowing the chip down relative to how it was before. Unless they increase the RAM, which would make it more expensive, increasing die size and adding more chances for the RAM to not work and render the chip useless.

Considering all the mindshare Sony has going for them, plus them being the leader for two generations, a respectable first-party developer, and third-party support that is unlikely to change much if they don't make a big mistake -- I'd say Sony is willing to lose out equally on initial costs. In contrast, what has Microsoft got going for them? You think just because Microsoft has the bank account backing them up, they'll magically invest billions until they're close to bankruptcy?

BTW, Sony isn't that small either, and the money they do lack compared to MS, they make up for in mindshare.
 
It's all a matter of context.
E.g. - in the Xbox, vector processing is God's gift to gaming. Anywhere else... well...

I *think* it is about balance. I am not sure how raw vector processing will do. :oops:
 
Something I haven't seen mentioned that could possibly be a big exclusive is WMV9 decoding in Xbox 2. If you've seen the HD demos, you'll most likely agree that it has the potential to be the next codec used for HD-DVD (the HD transfer of T2 on the new DVD will only be ~3 gigs).
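As a rough sanity check on what "~3 gigs" implies, here is a minimal sketch of the average bitrate; the ~137-minute runtime is my assumption for the theatrical cut, and the 3 GB figure is just the number quoted above.

```python
# Average video bitrate implied by a ~3 GB file for a ~137-minute movie.
# Both inputs are rough: 3 GB is from the post, 137 minutes is an assumption.

size_gbytes = 3.0
runtime_minutes = 137

avg_mbit_s = (size_gbytes * 8 * 1000) / (runtime_minutes * 60)
print(round(avg_mbit_s, 1))  # roughly 3 Mbit/s on average for an HD picture
```

That is a remarkably low average bitrate for high-definition material, which is why the codec looks so attractive for HD-DVD.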
 
Dural said:
Something I haven't seen mentioned that could possibly be a big exclusive is WMV9 decoding in Xbox 2. If you've seen the HD demos, you'll most likely agree that it has the potential to be the next codec used for HD-DVD (the HD transfer of T2 on the new DVD will only be ~3 gigs).

Good point, but the codec will only be accepted as a new industry standard if widespread adoption is guaranteed. That rules out such exclusivity, especially considering the massive share Sony have of the standalone DVD player market.

MuFu.
 
I think HD-DVD technology will have a big impact on the next console race.

The format war will spread to this part of the hardware as well.

The PS3 and Xbox 2 are sure to use different, competing HD-DVD technologies.

This will be another element that draws distinction between the two machines.

I always believed that Blu-Ray was an essential part of the PS3.

I'm not so sure Microsoft will be prepared to include similar hardware in the Xbox 2. They cut back the DVD functionality on the Xbox 1. It is an area where the cost/benefit argument has very little value to gamers; they want a games console, not a DVD player.

Because consumer electronics are a big part of Sony's business, they have a vested interest in supporting their own formats. The success of Blu-Ray would be a big payday for Sony.

The two burning questions for me are:

1. Will Xbox 2 be backwards compatible?
2. Will Xbox 2 include an HD-DVD device?

I think no to both. Backwards compatibility would be a fudge -- using emulation? HD-DVD adds too much to the cost of the hardware for very little gain.
 