If Nvidia powers XBox2, will IBM fab XGPU2?

Since Nvidia has a new manufacturing buddy, IBM, does that mean IBM would likely produce/fab (not design) the XGPU2, assuming Nvidia wins the contract over ATI for XBox2 graphics?
 
megadrive0088 said:
Since Nvidia has a new manufacturing buddy, IBM, does that mean IBM would likely produce/fab (not design) the XGPU2, assuming Nvidia wins the contract over ATI for XBox2 graphics?

Doubt it, MS is already pissed over the price of XGPU from TSMC (which IIRC is ~$16? MCP is ~$20?). Actually I forget, someone look it up in 'Opening the Xbox' ;)

IBM costs big bucks.

The more I think about it, the more MS is stuck between a rock and a hard place, trying to come up with hardware to counter PS3 but keep the price down at the same time. Can't be much fun for the boys in Redmond :p
 
Anything can happen; Sony is building their own fabs for PS3, so the IBM fabs WOULD be open for XGPU2.

But who knows.
 
I think they'll be well-served by licensing IP and making their own chips instead of buying them from the likes of Nvidia.
 
Doubt it, MS is already pissed over the price of XGPU from TSMC (which IIRC is ~$16? MCP is ~$20?). Actually I forget, someone look it up in 'Opening the Xbox'

Actually, MS doesn't deal with TSMC, Nvidia does, and they deliver the chips directly to the manufacturing plants. MS was once pissed at the price of the chips from Nvidia, but all that has been worked out and both sides are now happy. Not only that, those prices you mentioned are extremely old.

Anyway, I think it's extremely likely that MS will use Nvidia again (remember, they are making more than just the GPU) and IBM will manufacture each new chip on their best process.
 
It would be very fun to see MS designing their own graphics processor, or at least have someone do it with MS's IP. MS has a lot of engineers & technology from the M2 and MX projects. Actually, not a lot, but all of it IIRC. MS bought CagEnt in 1998 and integrated them into the WebTV division.
CagEnt is basically a slightly changed 3DO Systems group who designed M2 and MX. The tech was good and they were supposedly working on M3.
Something like 'M4' or 'M5' would likely smash Playstation3, since M2 smashed PSX tech-wise. :)
 
megadrive0088 said:
It would be very fun to see MS designing their own graphics processor, or at least have someone do it with MS's IP. MS has a lot of engineers & technology from the M2 and MX projects. Actually, not a lot, but all of it IIRC. MS bought CagEnt in 1998 and integrated them into the WebTV division.
CagEnt is basically a slightly changed 3DO Systems group who designed M2 and MX. The tech was good and they were supposedly working on M3.
Something like 'M4' or 'M5' would likely smash Playstation3, since M2 smashed PSX tech-wise. :)

Well, WebTV wanted to use GigaPixel, so :p

I think it's a very smart and important thing for MS to stick with NVIDIA.. not only for their tech, but for the massive developer relations clout NVIDIA has, their excellent documentation, and of course, their familiarity with PC devs.

edit: I'm an NVIDIA horn-tooter, but I still think there's some objective logic in this!
 
Whether Nvidia + Intel/AMD can top PS3 spec-wise is the question now.

The CEO of Nvidia seems a little unsure.
 
Paul said:
Whether Nvidia + Intel/AMD can top PS3 spec-wise is the question now.

The CEO of Nvidia seems a little unsure.

It'll probably be the same old same old.. the PS3 CPU crushes whatever's in the Xbox2, but NVIDIA's uber-GPU will trounce the PS3 visualizer.. just different rasterizing theories, that's all.
 
I dunno.. GS was pretty advanced for its time, and it had a few flaws that held it back; I expect there to be no more flaws. They aren't going to mess up this time.
 
It'll probably be the same old same old.. the PS3 CPU crushes whatever's in the Xbox2, but NVIDIA's uber-GPU will trounce the PS3 visualizer.. just different rasterizing theories, that's all.

I'm interested in seeing how that is going to happen, i.e. how it will be surpassed (it obviously will, it's only a matter of time... and that is what's important ;) ). As has been said by many, the PS3 will apparently obtain its massive performance from multi-GHz, massively parallel units combined with massive bandwidth...

In order to beat/equal it, you'd need multi-GHz speed with massive transistor counts.... Will we be seeing multi-GHz GPUs? A new shift in the way graphics are done? Something else? My guess is whatever it is will cause a delay...

edited
 
MS could always hire nVidia to design a graphics core specifically for the Xbox2 using TBR. nVidia has the tech, and it could prove useful for them in the future regarding PC graphics.

I wonder what PowerVR has up its sleeve; I'm sure it has something nice and powerful that would be a blast for MS to use.

The problem I see with Xbox2 isn't so much the graphics chip, but much more the CPU that could be placed within it. They could very well go with IBM for some sort of design if Intel or AMD don't have anything to offer other than a modified x86 CPU at the time. While it would be great for ensured future PC ports and vice versa, would it really be able to hold up even slightly against the CELL proc going into the PS3? Would it be possible for IBM to use a Power4 CPU in a cellular fashion and produce it at 65nm? That would be quite the powerhouse, but the costs might be too high.
 
Sonic said:
MS could always hire nVidia to design a graphics core specifically for the Xbox2 using TBR. nVidia has the tech, and it could prove useful for them in the future regarding PC graphics.

I wonder what PowerVR has up its sleeve; I'm sure it has something nice and powerful that would be a blast for MS to use.

The problem I see with Xbox2 isn't so much the graphics chip, but much more the CPU that could be placed within it. They could very well go with IBM for some sort of design if Intel or AMD don't have anything to offer other than a modified x86 CPU at the time. While it would be great for ensured future PC ports and vice versa, would it really be able to hold up even slightly against the CELL proc going into the PS3? Would it be possible for IBM to use a Power4 CPU in a cellular fashion and produce it at 65nm? That would be quite the powerhouse, but the costs might be too high.

Since NVIDIA is a public company, would they be required to disclose any information regarding, say, a secret-in-production-custom-Xbox2-chip or something similar in their SEC filings?
 
As far as I know, I don't think so. They would probably just have to state that nVidia is under contract with MS for their next console.
 
What concerns me the most is, with a CPU like Cell, the animation in PS3 games should be absolutely amazing. Even IF PS3 in some way or another comes out as less graphically impressive than, say, XBOX2, and given that the XBOX2's power is all in its GPU, how on earth is it going to replicate the animation and physics a 1TFLOP CPU provides?

That is, IF Cell is a 1TFLOP CPU, IF the visualiser turns out to be less powerful than the XGPU2, etc....


Of course we don't know anything, but what if that is the case?

Animation, physics, collision detection, etc. can't be done on a GPU, can they?
 
london-boy said:
What concerns me the most is, with a CPU like Cell, the animation in PS3 games should be absolutely amazing. Even IF PS3 in some way or another comes out as less graphically impressive than, say, XBOX2, and given that the XBOX2's power is all in its GPU, how on earth is it going to replicate the animation and physics a 1TFLOP CPU provides?
Having worked on a parallel system in a 'previous life', I can say it's not difficult to build powerful hardware. Producing software that can actually use this power efficiently, OTOH, can be a bugger...
 
Simon F said:
Having worked on a parallel system in a 'previous life', I can say it's not difficult to build powerful hardware. Producing software that can actually use this power efficiently, OTOH, can be a bugger...


Saturn? :LOL:

What I mean is, let's just look at this...

If developers DO get to know how to properly program Cell (and they WILL, I mean, if devs got to grips with Saturn and PS2, then they will be OK with Cell given time), then the animation and physics provided by the chip will only be surpassed by a more powerful CPU... and since Xbox2 will be all about its GPU, how is it gonna compare?

Again, DISCLAIMER:
That is, IF Cell is such a monster, which it probably will be, and IF the Xbox2 is all about its GPU, which it probably will be.
 
london-boy said:
Animation, physics, collision detection, etc. can't be done on a GPU, can they?

Animation blending and some physics code can be done on a GPU. Collision detection is a bit of a non-starter (it's more of a database search, which isn't what GPUs are good at), but research is ongoing.

The main issue is getting the data out of the GPU. If you're just going to use it for graphics, that's OK, but if you want to use it in game code, it's a little harder. You need to transfer it into RAM that the CPU has access to. Xbox's UMA makes this easy, but getting the data into UMA from the GPU is a bit harder (render-to-vertex-array demonstrates it: you can't just 'dump' data to memory, you have to 'render' it into a texture, which has issues). The PS2, on the other hand, can just 'dump' its VUs' memory into main RAM via DMA.
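
To make that concrete, here's roughly what the round trip looks like in bare OpenGL. A minimal sketch only: the window/context setup is omitted, the simulation pass is stubbed out, and the function names are hypothetical, not from any real engine.

```cpp
#include <GL/gl.h>
#include <vector>

// Step 1: 'render' the simulation result. A fragment-shaded full-screen quad
// writes its results as colours; packing positions into 8-bit RGBA is lossy,
// which is one of the 'issues' mentioned above. (Stubbed for brevity.)
void runSimulationPass(int w, int h)
{
    glViewport(0, 0, w, h);
    // ... bind the (hypothetical) simulation shader and draw a quad ...
}

// Step 2: the GPU can't just 'dump' its memory into main RAM the way the
// PS2's VUs can; the CPU has to pull the framebuffer back across the bus
// before game code (collision, AI, etc.) can touch the results.
std::vector<unsigned char> readBackResults(int w, int h)
{
    std::vector<unsigned char> rgba(w * h * 4);
    glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, rgba.data());
    return rgba; // decode back into vertex data on the CPU side
}
```

That glReadPixels stall is exactly why the readback path is the bottleneck: the whole pipeline has to flush before the CPU sees a single byte.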
 
DeanoC said:
london-boy said:
Animation, physics, collision detection, etc. can't be done on a GPU, can they?

Animation blending and some physics code can be done on a GPU. Collision detection is a bit of a non-starter (it's more of a database search, which isn't what GPUs are good at), but research is ongoing.

The main issue is getting the data out of the GPU. If you're just going to use it for graphics, that's OK, but if you want to use it in game code, it's a little harder. You need to transfer it into RAM that the CPU has access to. Xbox's UMA makes this easy, but getting the data into UMA from the GPU is a bit harder (render-to-vertex-array demonstrates it: you can't just 'dump' data to memory, you have to 'render' it into a texture, which has issues). The PS2, on the other hand, can just 'dump' its VUs' memory into main RAM via DMA.


SO BASICALLY... by 2005 we should see the GPU doing more and more stuff... cool. The thing is, whether it's done on the CPU or the GPU, this thing is still gonna be in direct competition with a 1TFLOP chip. I mean, and I'm pulling this out of my head, let's take this example:
If we have 1000 characters on screen, all of them with realistic clothing and hair which actually bounces off whatever it comes into contact with (unlike today, where it just goes THROUGH the characters or whatever they touch)... let's say this is an 80% performance job for Cell, and since everything is done on the chip and then the results are sent to the visualiser to be rendered, it should be OK in terms of framerate...
How would the *probably* underpowered XCPU cope with the workload? Even IF some of the physics are done on the XGPU, it should still need 1TFLOP performance to cope...
 
london-boy said:
SO BASICALLY... by 2005 we should see the GPU doing more and more stuff... cool. The thing is, whether it's done on the CPU or the GPU, this thing is still gonna be in direct competition with a 1TFLOP chip. I mean, and I'm pulling this out of my head, let's take this example:
If we have 1000 characters on screen, all of them with realistic clothing and hair which actually bounces off whatever it comes into contact with (unlike today, where it just goes THROUGH the characters or whatever they touch)... let's say this is an 80% performance job for Cell, and since everything is done on the chip and then the results are sent to the visualiser to be rendered, it should be OK in terms of framerate...
How would the *probably* underpowered XCPU cope with the workload? Even IF some of the physics are done on the XGPU, it should still need 1TFLOP performance to cope...

I might be pulling this out of my arse, but with 1000 characters on screen at once, wouldn't your chances of seeing more than a handful of them close up, and actually noticing whether their hair/clothing bounces off everything realistically or not, be pretty slim? No seriously, that's what clever coding is there for: even with such a monster CPU, why waste power on something nobody will ever notice? Any sane game coder would implement a LOD system and only calculate the full physics for the hair and clothing close to the viewpoint, as long as they're not essential for gameplay (see the sketch below). On the other hand, if the graphics rasterizer that's being fed all this data couldn't handle per-pixel shading, lighting, proper filtering and antialiasing of those 1000 characters (that's theoretical too of course, I have little doubt that the GS2 will be at least as advanced as any graphics hardware out right now), wouldn't that be just as noticeable as less detailed physics, or even more so?
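
Just to illustrate, a physics LOD like that could be as simple as this sketch. The Character struct, its fields and the distance thresholds are all made up for the example:

```cpp
#include <cmath>
#include <vector>

struct Character {
    float x, y, z;        // world position
    bool  fullClothSim;   // run the expensive cloth/hair simulation?
    bool  coarseClothSim; // cheap approximation only
};

// Pick a simulation tier per character based on distance from the camera;
// characters beyond MID_DIST get no cloth sim at all, and nobody notices.
void assignPhysicsLOD(std::vector<Character>& cast,
                      float camX, float camY, float camZ)
{
    const float NEAR_DIST = 10.0f; // full simulation inside this radius
    const float MID_DIST  = 40.0f; // coarse approximation out to here
    for (Character& c : cast) {
        const float dx = c.x - camX, dy = c.y - camY, dz = c.z - camZ;
        const float dist = std::sqrt(dx * dx + dy * dy + dz * dz);
        c.fullClothSim   = dist < NEAR_DIST;
        c.coarseClothSim = !c.fullClothSim && dist < MID_DIST;
    }
}
```

With 1000 characters and maybe a dozen inside NEAR_DIST, you'd only pay full price for the handful the player can actually see up close.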

Theoretically we can think up any number of scenarios where the monster power of such a CPU cannot be matched by an Xbox2 or GC2 using stock components. We can also theoretically think up scenarios where a less fully featured or powerful GPU would run into its limits, but what good does it do us? Practically, the instances where you'd really get to see a difference inside an actual game would be pretty rare IMO. It's all a question of how balanced the next systems will end up being. Personally, I expect the visual and computing power of next generation's consoles will be so good overall that a few rendering or physics capabilities lacking here and there wouldn't really kill any of the systems.

Actually, after watching the amazing physics in the HL2 vids (which will supposedly run without a problem even on the lame current XCPU), I'm not that worried anymore about a lack of processing power in an off-the-shelf CPU a couple years down the road (which Nintendo and MS will likely use). All the more so if it again takes programmers ages to even tap the powers of such a computational beast, tame it, and then actually use its superior processing power for... well, anything at all really. And don't get me started on the load of nightmarish work artists will have to do to make all that programming work shine! If you listen to game designers complain about the cost and time content creation eats up today already, think about how they'll complain if their modelers have to actually build all models with fully functional clothing for soft body dynamics... <shudders> ;)
 