X-Box 2 Speculation!



Actually, I'm pretty sure the on-die memory controller reduces latency, which is why AMD went that route instead of having everything go through a northbridge. Not only that, but the Hammers have HyperTransport.

Three HyperTransport links at, what is it, 3.2 GB/s each?

One to the sound card, one to the system RAM, one to the video card, and then the video card using a wide (maybe 1024-bit) bus to the video RAM. Makes for a fast system. A Cell killer the Hammer is not, but a damn good CPU it is, and couple it with one or two of the fastest chips out at the time and I think Sony will have some stiff rivals.
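
Back-of-the-envelope check on that figure (I'm assuming a 16-bit link double-pumped at 800 MHz here, which I think is roughly what the first Hammer boards use; the exact config could differ):

```python
# Rough sanity check of the ~3.2 GB/s-per-HyperTransport-link figure.
# Assumed link config (not confirmed): 16 bits wide, 800 MHz clock, double data rate.
LINK_WIDTH_BITS = 16
LINK_CLOCK_HZ = 800e6
TRANSFERS_PER_CLOCK = 2  # DDR signalling

bytes_per_transfer = LINK_WIDTH_BITS / 8
gb_per_s_one_way = LINK_CLOCK_HZ * TRANSFERS_PER_CLOCK * bytes_per_transfer / 1e9

print(f"one link, one direction: {gb_per_s_one_way:.1f} GB/s")        # ~3.2 GB/s
print(f"three links, one direction: {3 * gb_per_s_one_way:.1f} GB/s")
```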
 
jvd said:


Actually, I'm pretty sure the on-die memory controller reduces latency, which is why AMD went that route instead of having everything go through a northbridge. Not only that, but the Hammers have HyperTransport.

Three HyperTransport links at, what is it, 3.2 GB/s each?

One to the sound card, one to the system RAM, one to the video card, and then the video card using a wide (maybe 1024-bit) bus to the video RAM. Makes for a fast system. A Cell killer the Hammer is not, but a damn good CPU it is, and couple it with one or two of the fastest chips out at the time and I think Sony will have some stiff rivals.



Well, yeah, take a system with 128 GeForce4s and it will crush PS3 :rolleyes: ... thing is, at what price?

See, the point is, they would have to go multi-chip to keep up with Sony's designs. That makes you think about:

1) who's the best hardware manufacturer
2) where you're going to put all those chips, unless you want an Xbox2 the size of your desktop
 
Grall said:
No, you fanbois gotta lay off that crack pipe, hehe. :LOL: M$ is going to use off-the-shelf tech again just like last time. I know, I know, you want your sugar daddy Billy Goat to whip out something even longer and phatter than Evil Sony's Cell, but face it, it's not going to happen.

Why do I gotta be labeled a fanboi just because I asked a question? [Good points though, Grall :p] I don't know if MS has that ability, I was just curious. People keep saying they have $billions to throw around.

I honestly don't know what CPU MS could use to topple PS3's Cell if it can indeed do 1 TFLOPS. I had assumed that since, in the PC/Xbox setup, all the heavy graphics load is done by the GPU, it wouldn't make any sense to use a 'killer CPU'; just put something in there that is functional (Banias) and marketable (high GHz). Sony's way with PS2/PS3 seems to put a lot of the graphics load on the CPU. I'm assuming that since no one seems to know what the GS/GPU for PS3 looks like yet. In MS's case, with its DirectX API, wouldn't MS put the load on the GPU rather than concentrate on a CPU that's a 'Cell killer'? Is it a given that MS has to use an x86-class CPU or stick to only AMD/Intel for a CPU? Are there any alternatives?

Quite simply, is there anyone who can make a CPU equal to Cell without going insane on power/cost? Would anyone even want to?

Intel is still the world's largest semiconductor supplier so I'm guessing they will get the task.

http://www.siliconstrategies.com/story/OEG20021211S0035
 
Given time anything is possible; unfortunately, even Gates's billions can't buy time ... it is too late for a lot of custom development, and the best they can do is get a slightly adapted GPU. GPUs in that timeframe will IMO not be useful for general-purpose processing, so while they might be able to get the graphics power needed to combat Sony ... they are going to be hard pressed to have the processor power to match the quality of their graphics with the kind of physics it needs (and to a lesser extent AI). Lots of pretty pictures, but not very interactive.

The only market for which really high-end embedded processors are being developed is consoles (DSPs are a joke); whether other markets would use them if available in useful form is debatable (IBM is gambling on yes). They don't exist, and no one else is developing them. It takes too much investment, the kind of investment only put into desktop processors and Cell. Given that they can't have Cell, they will have to take desktop processors and suffer the consequences (even if cost doesn't limit performance, power consumption will).

Unless they have had some deep undercover development going on together with Intel or AMD. Given that they are all public companies and the money streams would have to be accounted for, that seems unlikely though.
 
As far as system RAM goes, they could use Reduced Latency DRAM. RLDRAM is the result of a joint effort by Micron and Infineon. Since Microsoft used Micron last time around, I think it's fair to say Micron has a good chance at being involved in the X-Box 2.




RLDRAM Homepage

Micron's site info on RLDRAM

As far as the GPU goes, I have doubts whether either ATI or Nvidia will get the contract. Cost seems to be a huge concern for Microsoft, and a company like PowerVR or 3Dlabs would produce a GPU design at a lower cost. If Microsoft wants performance though (cutting-edge features and raw power), I don't see them having much choice but to go with Nvidia.


Intel comes across as a good choice for the CPU. I think in 2004 they'll have a new core to replace the current Pentium 4, and they are moving towards dual cores. So my guess would be an Intel dual-core x86 processor, with a really small chance of it being quad-core.


If Sony goes Blu-ray, I think Microsoft will cave in and put in a Blu-ray player. Technology perception is very important, and to give Sony a perceived edge with a more advanced DVD player wouldn't be wise.
 
Thanks MfA, I never looked at Cell from that angle. Cutting edge physics/AI. Reminds me of those DC ads, "It's thinking..." :p

or better yet the Saturn ads, "A Little Too Real"

Edit: I thought MS switched to Samsung for the 64 MB of DDR in Xbox 1; I refer you to Hard|OCP's XBOX SPREAD EAGLE :LOL:

http://www.hardocp.com/article.html?art=NTk=

http://www.hardocp.com/image.html?image=NTlfMV8yMF9sLmpwZw==


This Forbes article from last week also hints that MS may not go with Nvidia:

http://www.forbes.com/technology/2003/03/28/cx_ld_0328nvda.html


PowerVR or 3Dlabs, now that's interesting...
 
a4164,

I'm sorry, I didn't mean to label you a fanboi, I just meant it as a joke. Don't be angry with me please okay? :)

It's a bit amusing how many people really WANT M$ to come out with a Cell-killing custom CPU; I just wanted to poke a lil fun at that. I meant no disrespect.


JVD:

Hammer-class CPUs are unsuitable as the memory controller for a high-performance UMA-type system: they feature at most a 128-bit memory interface on the Opteron, and the Athlon 64 has only half of that. There's no fancy 4-way balanced crossbar like today's GPUs have, the memory controller is likely optimized for CPU-type access patterns, and while an on-chip memory controller reduces latency, that's for the *CPU*, not the GPU. Besides, a multi-GHz CPU can eat all the bandwidth given to it on its own given the right task, so the CPU and GPU would constantly fight and strangle each other.

UMA isn't a high-performance invention, it never was. The only reason it kinda works in the XB is that the CPU can only ever use a fraction of the available bandwidth (due to its slow FSB).

HyperTransport in its current implementation isn't fast enough to feed a GPU anyway, even if the Hammer's memory controller were somehow extended to provide enough bandwidth.
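
To put rough numbers on it (these are my assumed configs: dual-channel DDR400 for the Opteron, single-channel for the Athlon 64, a 16-bit/800 MHz HT link, and a Radeon 9700 Pro-class 256-bit memory bus; correct me if the exact figures are off):

```python
# Peak bandwidth = (bus width in bytes) * (transfers per second).
def peak_gb_s(width_bits, transfers_per_s):
    return (width_bits / 8) * transfers_per_s / 1e9

cases = {
    "Opteron, 128-bit DDR400":       peak_gb_s(128, 400e6),   # ~6.4 GB/s
    "Athlon 64, 64-bit DDR400":      peak_gb_s(64, 400e6),    # ~3.2 GB/s
    "HyperTransport link, one way":  peak_gb_s(16, 1600e6),   # ~3.2 GB/s
    "Radeon 9700 Pro, 256-bit DDR":  peak_gb_s(256, 620e6),   # ~19.8 GB/s
}
for name, gb in cases.items():
    print(f"{name}: {gb:.1f} GB/s")
# The CPU side tops out around 6.4 GB/s while a 2002 high-end GPU already wants
# ~20 GB/s to itself, which is why UMA around a Hammer doesn't add up.
```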


*G*
 
I believe ATI has experience, or is getting it, with GDDR-II, chipsets with integrated graphics, and 256-bit buses. Not to mention the R300 is one wicked chip. Mr. Orton hinted even before the R300 came out that the R400 would be quite a leap.

On the CPU side, I wonder if a modified Banias would do the trick?
 
a4164 said:
Thanks MfA, I never looked at Cell from that angle. Cutting edge physics/AI. Reminds me of those DC ads, "It's thinking..." :p

or better yet the Saturn ads, "A Little Too Real"

Edit: I thought MS switched to Samsung for the 64 MB of DDR in Xbox 1; I refer you to Hard|OCP's XBOX SPREAD EAGLE :LOL:

http://www.hardocp.com/article.html?art=NTk=

http://www.hardocp.com/image.html?image=NTlfMV8yMF9sLmpwZw==


This Forbes article from last week also hints that MS may not go with Nvidia:

http://www.forbes.com/technology/2003/03/28/cx_ld_0328nvda.html


PowerVR or 3Dlabs, now that's interesting...

3Dlabs calls its graphics chips Visual Processing Units now because of their programmability. A 3Dlabs Wildcat VPU and an E-MU audio/DSP chip would fit into Microsoft's plans. For Creative Labs, the money from an X-Box 2 contract would make a difference to their bottom line. Creative also sells speakers under the Cambridge SoundWorks name. I can see the debut of the X-Box 2 with all kinds of Creative Labs stuff surrounding it: Cambridge SoundWorks 6.1 speakers and a Nomad MP3 player that links with the XB 2 console. I think there's a potential synergy between Microsoft and Creative Labs.
 
DeathKnight said:
:rolleyes: Now why would the Xbox2 have less power than the stated 80 gigaflops that the Xbox has?

OMG! Xbox extracts 80 GFLOPS from its Pentium 3 hack? Bill Gates's billions really can make anything happen!!
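
For the record, the XCPU itself gets nowhere near that; assuming the P3 core peaks at about 4 single-precision FLOPs per clock through SSE (packed add plus packed mul), the math is roughly:

```python
# Rough peak for the Xbox CPU alone, assuming ~4 single-precision SSE FLOPs per clock.
XCPU_CLOCK_HZ = 733e6
ASSUMED_FLOPS_PER_CLOCK = 4

print(XCPU_CLOCK_HZ * ASSUMED_FLOPS_PER_CLOCK / 1e9, "GFLOPS")  # ~2.9 GFLOPS
# The "80 GFLOPS" headline number is overwhelmingly counting the GPU's T&L/shader ops.
```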
 
Can MSoft really get the 1.256 TFLOPS GPU they will need for Xbox2?

Since the CPU won't even come close, and PS3 will have 1.256 TFLOPS or more.
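
(For context, headline TFLOPS figures like that are just parallel units × FLOPs per unit per clock × clock speed. The configurations below are purely made up to show the scale of the gap, not specs for anything real:)

```python
# Peak FLOPS = units * FLOPs per unit per clock * clock. Illustrative numbers only.
def peak_tflops(units, flops_per_unit_per_clock, clock_ghz):
    return units * flops_per_unit_per_clock * clock_ghz / 1000.0

# A wide, GPU-style design vs. a narrow, CPU-style one (both hypothetical):
print(peak_tflops(units=64, flops_per_unit_per_clock=4, clock_ghz=5.0))  # 1.28 TFLOPS
print(peak_tflops(units=2,  flops_per_unit_per_clock=8, clock_ghz=4.0))  # 0.064 TFLOPS
# Hence "the CPU won't even come close": you need lots of parallel units, not clock speed.
```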
 
What you have to ask yourself is: is Sony really going to be able to get the Cell to reach the performance of a dedicated GPU? Will it be able to replicate the efficiency of a dedicated GPU when performing intense graphical operations along with all of the game-related operations?

I doubt M$ would need a 1.256 Tflop GPU to compete since the system designs are functionally different (M$: dedicated GPU for all graphical computations and an adequate CPU for game code... Sony: both CPU and rasterizer share work within the graphics pipeline). With M$'s approach you're limited by how fast the GPU is (ignoring any possible system bottlenecks). With Sony's approach you have possible limitations surrounding two components since one component is relying on the other to feed it the T&L'd triangles.

All this should hold true if Sony hasn't changed the design philosophy. Dunno if they have or not. Anyone care to shed some light?
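
To make the split I'm describing concrete, here's a toy sketch; the function names and work division are just placeholders, not real console APIs:

```python
# Toy sketch of the two work splits. Nothing here is a real console API;
# it only shows where the T&L load lands in each design.

def run_game_code(objects):
    # AI, physics, scene setup -- the CPU does this in both designs.
    return [f"draw({o})" for o in objects]

def gpu_transform_light_and_rasterize(call):
    # M$-style: the GPU owns transform & lighting *and* rasterization.
    return f"GPU did T&L + raster for {call}"

def cpu_vu_transform_and_light(call):
    # Sony-style: the CPU/vector units do transform & lighting...
    return f"T&L'd triangles for {call}"

def rasterizer_draw(triangles):
    # ...and feed a rasterizer that just draws what it's handed.
    return f"rasterized {triangles}"

def xbox_style_frame(objects):
    # Limited by GPU speed alone (ignoring other system bottlenecks).
    return [gpu_transform_light_and_rasterize(c) for c in run_game_code(objects)]

def ps2_style_frame(objects):
    # Two components in the critical path: if the CPU side stalls, the rasterizer starves.
    return [rasterizer_draw(cpu_vu_transform_and_light(c)) for c in run_game_code(objects)]

print(xbox_style_frame(["car", "track"]))
print(ps2_style_frame(["car", "track"]))
```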
 
DeathKnight said:
I doubt M$ would need a 1.256 Tflop GPU to compete since the system designs are functionally different (M$: dedicated GPU for all graphical computations and an adequate CPU for game code... Sony: both CPU and rasterizer share work within the graphics pipeline).

What you have to ask yourself is: is Sony really going to be able to get the Cell to reach the performance of a dedicated GPU? Will it be able to replicate the efficiency of a dedicated GPU when performing intense graphical operations along with all of the game-related operations?

No, they're not. My God, this irritates me to no end. They're not dissimilar, not to the degree you're speaking of.

The architecture behind 'Cell' isn't like that of a PentiumX or Athlon, et al. It's much more like the front-end vertex shaders of the NV3x, or the VUs of the EE.


So, when the PC IHVs move to a unified shading model in DX10 or sooner, will you stand up and tell us all how their performance is so lacking compared to the old-school DX7 T&L pipe? Give me a break.

Also, where the computation takes place (e.g. CPU, GPU, Jonathon Dancy's brain-in-a-vat) doesn't matter in the least when you're using custom components in a self-contained, 'closed-box' style architecture. Please, stop this... it's killing me.
 
So let me get this straight. The Cell's some miracle processor that essentially has GPU features and full-blown CPU functionality all rolled into one package? Sounds eerily reminiscent of the EE: a multipurpose processor. Dedicated hardware will always prevail (more being done clock for clock), and I doubt the Cell's going to change that.

It's just humorous how you strut around here like the Cell's the second coming :LOL:
 
You would burn an inordinate amount of silicon dealing with textures if you tried to do everything on a fully unified architecture (I seriously doubt DX10/unified-shading will have such an architecture).

Once more about the patent: it is questionable whether they had even done many simulations of the architecture in the patents, much less any real work on implementation. It predates the professed start of Cell architecture development together with IBM (not just the implementation), for god's sake ... IMO people who think the patent has any relevance at all are a little too eager to have some real information to chew on.
 
Something missing from the proposed specs that I believe would be easily implemented, and is technology that's here now: wireless.

Just my tuppence worth.
 
Grall said:
Where would they get this world-class semiconductor design team from? M$ has never done anything even remotely similar in the past. They'd need like a couple dozen well-experienced engineers.

You mean all those ex-DEC engineers now working at MS?

http://www.microsoft.com/presspass/features/2000/jul00/07-03engineers2.asp

We haven't heard anything in the rumor mill signalling they've gone down this path.

If it's rumor mills we're talking about, I beg to differ:

http://www.theinquirer.net/?article=796

And being old buffers, we at the INQUIRER also remember when Microsoft antagonised Intel by messing around with a graphics chip design - Talisman -- that never saw the light of day.

But did you know that Microsoft has an advanced semiconductor design team that works, for example, on chipsets and silicon for the Microsoft Mice and other projects?

http://www.theinquirer.net/?article=2899

The fact is that Microsoft not only has the staff to design CPUs and graphics chipsets, it also has something of an incentive to do so. And it won't be its first foray into this area either.

Back in 1996, Microsoft royally cheesed off Intel by having a team working on a graphics chip codenamed Talisman. That plan was finally laid to rest, but we also reported earlier this year that it had chip designers working as a team on several Microsoft projects related to gaming.

And there was an intriguing piece in yesterday's Observer, which conducted an interview with Robbie Bach - Microsoft's "chief Xbox officer". He told the paper that Microsoft was putting billions of dollars of investment into the Xbox, and the firm already has 2,000 people working on it and future projects.

If Microsoft were to team up with UMC, TSMC or other of the foundries, we can see no reason why, in the next year or two, it couldn't have its own microprocessor and graphics chip technology and then would be master of its own console destiny.
 