JP Morgan report on Xbox 2

Qroach said:
guden,

You think MS is going to basically GIVE you a high-performance computer?

That depends, do you think Sony is gonna give you a high performance computer?


short answer would be no, not a chance.

And I wouldn't take that report from JPMC too seriously either (trust me, they have very little idea what they are talking about).
 
This report is crap. The guy that wrote it is completely speculating.

R500 will probably be too large/expensive for MS, remember this is a high-end PC part.

Why would it be too large and expensive for a "new" console? They're licensing the chip technology to manufacture a GPU (and saving drastically on costs), not making an entire video card.
 
Qroach said:
Why would it be too large and expensive for a "new" console?

Well for starters, we had a guy publish what he purports to be pre-release NV40 figures in the 3D tech forum yesterday or earlier today (depending on your time zone). Amongst other things was a seven *billion* pixel per second fillrate.

If those numbers are correct, you can rest assured the R500 will top that figure with a significant margin, but do you see even a slight reason a console would need that much fillrate, for ANY purpose? It's not going to do 16x AA of (HD)TV resolutions, because it couldn't spare the RAM for it anyway.

If MS plans on sticking R500s straight into their next-gen boxes, I'll be immensely surprised, because it seems like a terrible waste of money to me. I mean, some months ago there was actually talk of MS using 10GHz (!) Tejas P4s in nextbox (Deadmeat favored that scenario as I seem to recall, though now his tune has changed - again! :LOL:); fan-people always seem to want the baddest chip to end up in their hardware of choice. It's been *rumored* R500 would be in nextbox ever since ATi was revealed as the provider of the GPU, now someone suddenly wants R600! Not R550, but a full generation ahead!

Jesus, when will this rampant speculation END? Why not just say it'll have sixteen G5 processors and quad R1000 GPUs and be done with it? :LOL:

Just seems they'd strike a much better balance with a custom chip. R500 tech perhaps, but fewer pipes than the PC equivalent since they don't need upwards of ten Gpix fill, no PCIe interface, UMA design with the GPU acting as northbridge, something like that. Of course, that likely means it'll just be ONE processor die in the box, because G5 uses a point-to-point bus with a horrible number of pins. Current GPUs have over 1000 already, so it would be difficult to cram in THREE processor interfaces like some have guessed, plus a 256-bit memory interface and whatnot else...

Ok, I'm done speculating for this post. I admit I don't know sh!t, but at least I try to use common sense when guessing.
 
Well for starters, we had a guy publish what he purports to be pre-release NV40 figures in the 3D tech forum yesterday or earlier today (depending on your time zone). Amongst other things was a seven *billion* pixel per second fillrate.

So you're talking about specs, not cost. Just because it has a huge fillrate doesn't mean it's going to be too expensive for a console.

If those numbers are correct, you can rest assured the R500 will top that figure with a significant margin, but do you see even a slight reason a console would need that much fillrate, for ANY purpose? It's not going to do 16x AA of (HD)TV resolutions, because it couldn't spare the RAM for it anyway.

Your reasoning is flawed. Just because YOU can't see a reason for it doesn't mean there isn't a use for it. I can safely say developers will find a way to use whatever resources they get. Even still, this doesn't mean it's going to be too expensive to use in a console.

I admit I don't know sh!t, but at least I try to use common sense when guessing.

If you don't know $hit then don't go around telling other people they aren't using common sense, because I fail to see anything in your argument that proves or comes close to proving it would be "too expensive".

If the GPU in Xbox that was the most powerful GPU at the time wasn't too expensive for Xbox, then how is something MS uses now going to be too expensive (specifically when they aren't paying additional money for someone else to fab the chip and tack additional cost/profit onto the price)? This new GPU is probably going to cost them less this time around. Is that common sense enough for you?
 
What???! Sure it was TOO expensive for Xbox, don't you see MS bleeding billions on the machine? It was TOO expensive, but MS didn't give a shit, that's the other part to it.

Get your facts straight. The GPU in Xbox wasn't what made MS lose tons of cash. It was a combination of expensive parts (GPU, CPU, MPU, hard drive). Considering how they are licensing technology instead of purchasing the parts this time around, they will be in far better control of costs and the fabbing while avoiding anyone tacking on extra costs. Licensing the technology and fabbing it yourself is cheaper than buying the part already fabbed.

Do you expect MS to lose billions on Xbox 2 this time? Not from what we're hearing now. They definitely *do* give a shit about costs now though.

Yes, which is why they followed a model others have used by licensing technology and fabbing it themselves. Which should be far cheaper than it was last time, even using the latest and greatest technology.

Higher specs = higher costs, let's not kid ourselves.

That's not 100% the case all the time. I bought a video card for my PC that cost me less than the last video card I bought a year ago, yet its specs were higher. Jeez, how did that happen?? :rolleyes:

Where are these multibillion dollar fabs you are talking about?

Use some common sense. Nvidia paid TSMC to fab the chips, then passed that cost along to MS with a tidy profit tacked on. This time around IBM is getting paid directly by MS to fab the chips and ATI receives a small royalty from the sale of each game on the system. Where do you think the risk/cost has been relocated?
 
Sony funded an application specific processor, Microsoft is funding a customized desktop processor.

So what? Which of the two do you think will be more costly to design and fab? As MS doesn't have to buy their chips from Intel this time around, I think they will be saving quite a bit compared to Xbox.
 
Qroach said:
So you're talking about specs, not cost. Just because it has a huge fillrate doesn't mean it's going to be too expensive for a console.

Umm, dude, ever heard of a concept called capitalism? In a nutshell it says if you're going to build a space shuttle, you outsource the job to the contractor that offers the lowest bid on your contract. You don't strap in the biggest engine you can get your hands on because it has lots of big numbers on the specs sheet. You buy the cheapest thing that'll get the job done!

Same thing here. If you don't have any earthly use for 7gpix fillrate (and that might be counting low), what use would that be, really, when you're only going to redraw the screen 60 times per second? Even at 4xAA at HDTV res you could redraw the screen about 50 times per frame; what's the point in sticking in such a graphics chip?
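
Quick napkin math to show where that kind of figure comes from (my assumptions, nothing official: 1280x720, 60Hz, 4x AA, every sample written once per pass):

# rough sketch: how many full-screen passes 7 Gpix/s buys you
pixels_per_frame = 1280 * 720                 # ~0.92 Mpix at 720p
samples_per_frame = pixels_per_frame * 4      # 4x AA
samples_per_second = samples_per_frame * 60   # 60Hz refresh
fillrate = 7e9                                # rumored pixels per second
print(fillrate / samples_per_second)          # ~32 full redraws per frame

Give or take the assumptions it lands in the same ballpark: dozens of complete screen redraws every single frame with AA, over a hundred without.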

It'd just cost you more money than it's worth, that's what!

Aren't you supposed to be a games dev or something? You tell me what the point is in being able to redraw the screen 50 times per frame.

Seems it's about as useful as strapping an 8-liter Dodge Viper V10 engine to a pickup truck and mass-producing the damn thing, I say. ...Oh wait a minute, someone actually DID that! ;)

Your reasoning is flawed. Just because YOU can't see a reason for it doesn't mean there isn't a use for it.

It's not flawed man, it's just you being fanboyishly obtuse on purpose. 50 screen redraws per frame per second at AA'd 60Hz HDTV res. Sensible? No. Not when you weigh in that you need something like 60GB/s of bandwidth just for 32bpp framebuffer ops to sustain that kind of fillrate. Not only do you have a ridiculously overpowered and cost-ineffective GPU, you have to stick ridiculously fast and cost-ineffective memories to it to actually be able to use the power you have at your disposal.
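
Same kind of napkin math for the bandwidth figure (assuming plain 32bpp colour with one read and one write per pixel, no Z or texture traffic; again, my numbers, not ATi's):

# framebuffer bandwidth needed to sustain a 7 Gpix/s fillrate
fillrate = 7e9            # pixels per second
bytes_per_pixel = 4 + 4   # 32bpp colour read + write, ignoring Z
print(fillrate * bytes_per_pixel / 1e9)   # = 56 GB/s before Z and textures

Pile Z reads/writes and texture fetches on top and you're comfortably past 60GB/s.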

Capitalism, man. When Sony removed the separate S-video and phono audio plugs from the original PS to save not even half a buck per unit, why would MS stick in a 7gpix/s GPU and squander A LOT more, when that power is completely wasted in such an application?

I can safely say developers will find a way to use whatever resources they get. Even still this doesn't mean its going to be too expensive to use in a console.

Uh huh. *nods head politely* Right.

Use it for what? 50 times per frame per second at AA'd 60Hz HDTV res screen redraw... Useful, for WHAT!!! purpose man?

If you don't know $hit then don't go around telling other people they aren't using common sense, because I fail to see anything in your argument that proves or comes close to proving it would be "too expensive".

Again, you're just being plain obtuse. I've provided plenty in the way of arguments if your mind just was open enough to accept them.

If the GPU in Xbox that was the most powerful GPU at the time wasn't too expensive for Xbox

Um, first of all you might argue whether that is true or not. Second, if you care to remember, MS actually sued Nvidia because they were forced to pay through the nose for those things, so whether it 'wasn't too expensive' or not seems just a little bit shaky, doesn't it? ;)

Now, MS could still go ahead and DO it and stick in R500s by the bucketload, I never said anything about that (in fact I expressly said the opposite, that I don't know what they plan to do), but that doesn't change the fact it'd be stupid, because that chip would just sit there burning power and clock cycles and twiddling its thumbs most of the time for no reason. You know it and I know it.

How many situations can you think up where you'd need even ten screen passes to do what you want with a DX9 part, much less fifty with a rumored DX10 part? Use your brain man! Devs are barely using shaders and such as it is and they've been around since spring 2001. High-profile titles like UT2004 are neither vertex shader-aware nor multithreaded nor anything. Are they going to re-think EVERYTHING and start using super-ultra-multipass techniques out the wazoo and burn 7+ gpix/s in just a few years? That's worth a few :LOL: smilies in my book.

Besides, consoles have never been about heaping on limitless power to solve a problem; it's been about doing things efficiently and still getting excellent results. I don't expect this to change now.
 
Qroach said:
That depends, do you think Sony is gonna give you a high performance computer?

What does Sony have to do with anything in this discussion? :rolleyes: If you're lacking things to do with your time other than making trollish posts, I can give you a few helpful pointers... ;)

And to answer your totally, completely irrelevant question, no, I don't expect them to give me anything! PS2 launched at around SEK 5500 over here, so I fully expect PS3 to cost an arm and a leg actually!
 
Umm, dude, ever heard of a concept called capitalism?

I'm not even gonna bother responding to all the stuff you wrote as it's pure nonsense! You still haven't written a single good reason (even while admitting to not know $hit) WHY a future video processor would be too expensive for Xbox 2. There isn't any valid reason you could come up with that would back up what you claim, specifically when MS has done this in the past and the GPU turned out to not be the most expensive part.

no, I don't expect them to give me anything! PS2 launched at around SEK 5500 over here, so I fully expect PS3 to cost an arm and a leg actually!

It's hardly an irrelevant question. It all depends on what YOUR expectations are. Your expectation of a high-performance computer is completely out of whack IMO. There's no reason why MS couldn't use technology from a graphics chip that won't launch until late 2005. There are more than enough reasons for them to use something like R420 or even R500.

1.) Licensing the technology instead of purchasing the final chips.
2.) MS can directly control the fabbing of the processors as needed.
3.) ATI is paid with a royalty from software sold and not for each chip fabbed. In other words, it costs them less up front.
4.) MS already used a graphics chip that was current technology and couldn't control the cost of the chips over time because Nvidia controlled the final price. They had to go to arbitration over this. Obviously this can be avoided this time around.

You keep arguing that because the specs of the chip are high, this means it's too expensive. Using YOUR logic, because the PS3 specs are high (or are going to be), it will be too costly to use Cell. Does that make sense? NO, of course it doesn't!

You seem to think that price is directly related to performance, yet you forget that cost is driven down over time when you produce those chips, and that makes them affordable in the long term. The cost may be high at the start, but that doesn't mean it's too high to use it at all!


If you're lacking things to do with your time other than making trollish posts, I can give you a few helpful pointers...

Speaking of troll posts, if you don't know $hit like you claimed, then shut up already. You claim to be using common sense, but that doesn't seem to be the case. Quit looking at this as if it were a graphics card that uses the latest and greatest RAM (which Xbox didn't use), dual heads, and a ton of other features not even needed in a console.

... and BTW, what if I told you that I already had confirmation that Xbox 2 is using an R500-like chip, and that your assumptions regarding costs are WAY off? I'd also throw in that you're going to look pretty foolish when the specs are announced, because after all, you did say you "didn't know $hit".
 
Ya know there's just no point in discussing anything with anyone when they say:

Ok, I'm done speculating for this post. I admit I don't know sh!t...

You can reply if you want, but I'm not going to bother after this point, because I know you're very incorrect about your assumptions related to the cost of the video processor, even before you said the above statement.
 
Qroach said:
I'm not even gonna bother responding to all the stuff you wrote as it's pure nonsense!

Uh, okay. Well, trying to discuss with you is like listening to that Mumbo Jumbo character in Banjo-Kazooie; lots of strange noises that vaguely resemble human speech, but in the end mean nothing.

You take stuff out of context, either deliberately or out of sheer incompetence, whilst at the same time refusing to answer direct questions; really, it's rather annoying.

So you're supposed to be a games dev, RIGHT OR WRONG?

Well if right, what use would you have for the ability to redraw your screen 50+ times per frame? Give me a good answer to that and I'll change my opinion to 'R500 is the IDEAL choice for nextbox' in a heartbeat. Until you actually can DO that though, I'll continue to state it's a DUMB IDEA using a chip with that much power.

You still haven't written a single good reason

Sure I have, you're just being too god damn obtuse to accept them. A chip like R500 would be too overpowered and hence too large physically to be cost-effective, this I've said all along and it's feckin true.

That MS with shrinks and stuff could make it more profitable as time goes by doesn't change the fact that with a smaller, leaner chip they'd either make more money out of it, or at least less of a loss, STRAIGHT AWAY, with even more to gain from shrinks.

R500 is a PC part that's designed to run at XGA resolutions with high AA; a console doesn't need that. A smaller chip would work out just fine whilst not losing any real-world performance.

(even while admitting to not know $hit)

...About FINAL HARDWARE SPECS, ya lyin b&%¤£! Don't take stuff out of context!

It's hardly an irrelevant question.

Sure it is. The topic has nothing to do with Sony.

Your expectation of a high-performance computer is completely out of whack IMO.

What expectation, you say? I haven't said anything about what I expect Sony to come up with. So much for your "IMO"...

Using YOUR logic, because the PS3 specs are high (or are going to be), it will be too costly to use Cell. Does that make sense? NO, of course it doesn't!

It doesn't make sense because I never bloody SAID PS3 specs would be high. It's pointless trying to discuss with you when you persist in misquoting to the point of making stuff up, and putting words in my mouth.

You obviously think I must be a PS3 nut a la Vince, but you're just plain WRONG. I have no particular expectations of how "cell" (you mean Broadband Engine, I guess :rolleyes:) or its accompanying GPU will perform; of course I hope they'll be kickass, but when it's years until I'll play on one, it doesn't really matter to me right now.

You got a lot of guts calling me a troll whilst making a useless post like yours!

You seem to think that price is directly related to performance, yet you forget that cost is driven down over time when you produce those chips

No, I never said that, and I don't believe price is directly related to performance, but one can definitely argue there exists a connection the other way around. Naturally MS will lose money if they manufacture overpowered GPUs with raw performance far above what their machine will ever need or be able to utilize. It won't ever need multi-gigapixel fillrates, nor will it have the RAM bandwidth to utilize such fillrates, so if it sits with maybe sixteen graphics pipes running at between half and one GHz it'll just waste money for them. It'll be a buttload of useless transistors, all bought and paid for but contributing nothing to final performance.

This is exactly what I've been saying all along when you're saying I have no arguments. THAT is just ridiculous; of course this matters. You think die space comes for free just because you licensed the IP that's etched onto it?
 
Guden Oden said:
Same thing here. If you don't have any earthly use for 7gpix fillrate (and that might be counting low), what use would that be, really, when you're only going to redraw the screen 60 times per second? Even at 4xAA at HDTV res you could redraw the screen about 50 times per frame; what's the point in sticking in such a graphics chip?

How many situations can you think up where you'd need even ten screen passes to do what you want with a DX9 part, much less fifty with a rumored DX10 part? Use your brain man! Devs are barely using shaders and such as it is and they've been around since spring 2001. High-profile titles like UT2004 are neither vertex shader-aware nor multithreaded nor anything.

Of course there will be a use for a 7 billion pixel fillrate. UT2004 only runs at a reported 90fps at 1280*960 (well within HDTV res) with 4xAA and 4xAF. What, you don't think Xbox 2 will use AA or AF? We're not talking about 16x AA here, use your head.
 
Guden Oden:

Going forward it's not so much a question of fillrate as such, but of the number of shader ops per second. I doubt Xbox 2 games will use trivial shaders that can retire one pixel per cycle per pipe. I should hope that XB2 games will run 30-100 instruction shaders on almost all pixels so we can have true per-pixel lighting, at least with realistic shadowing and specularity, so your assumptions about fillrate and display target size aren't quite valid.
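
A rough illustration with made-up numbers (16 pipes, 500MHz, one shader op per pipe per clock; none of this is actual XB2 spec):

# how long shaders eat into raw fillrate
pipes = 16
clock = 500e6                             # Hz, assumed
raw_fill = pipes * clock                  # 8 Gpix/s for trivial 1-cycle shaders
shader_length = 50                        # instructions per pixel, assumed
shaded_fill = raw_fill / shader_length    # 160 Mpix/s with real shaders
pixels_per_second = 1280 * 720 * 60       # 720p at 60Hz, no AA
print(shaded_fill / pixels_per_second)    # ~3 fully shaded layers per frame

So a chip that looks absurdly overpowered on paper fillrate ends up with only a handful of shaded layers per frame once every pixel runs a 50-instruction shader.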
 
You know what, guden. In 2005/2006 I don't know what I'd use all that power for. But in 2008/2009 I sure as hell know there will be a use for a seriously outdated R500. And it would be a hell of a lot better than having an R420 in my machine at the time.


MS is not building the Xbox 2 for 2005 or 2006. They are building it to last as best it can for 4-6 years. So yes, it may cost more to put in an R500 than an R420 or R450 or whatever. But a year or two later, at 90nm, 65nm, 45nm, it may make a lot more sense that they used it.

Say at the start the R500 on a 90nm process costs $50 a chip. As yields improve on that process, the cost of the chip will go down as fewer are lost and there are more usable ones per wafer. Then as the process shrinks in the years following, the chip will become smaller, there will be more per wafer, the yields will go up and the cooling requirements will go down, reducing cost so that at some point the chip will drop down to $5-10.
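
Back-of-the-envelope version of that (all the numbers are invented, it's just to show the shape of the curve):

# rough cost-per-good-chip model: wafer cost spread over usable dies
def cost_per_chip(wafer_cost, dies_per_wafer, yield_rate):
    return wafer_cost / (dies_per_wafer * yield_rate)

print(cost_per_chip(5000, 200, 0.5))   # early 90nm run:   $50 a chip
print(cost_per_chip(5000, 250, 0.8))   # mature 90nm:      $25 a chip
print(cost_per_chip(5000, 500, 0.8))   # after one shrink: $12.50 a chip

Same wafer price, and the cost still falls off a cliff just from yield and die size.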

Sony will be doing this with the Cell chip too.


At first both companies will lose a ton of money. Eventually they will come close to breaking even on the systems, but by then software royalties will be flooding in.

Now this is all within reason. If they get 75% yield on a 1GHz R500, I don't expect them to ship with a 2GHz R500 that has a 10% yield rate.



But I can easily see an R500 in the system if it launches in 2005. If it launches in 2006 then I can see an R550 or an R600.

Just like Sony: the later they launch, the faster the Cell chip inside will be.
 
PC-Engine said:
akira888 said:
But why would ATI waste money by designing a whole new graphics ASIC architecture just for Microsoft, when they have several in design or already finished? Which is not to say it won't be customized - but it will almost certainly be a member of a family of mostly PC chips - be that R5XX or R6XX.

MS has lots of money.

Certainly - and they have $58 billion in cash reserves precisely because they don't pointlessly waste it on things like paying ATI to design a "from scratch" graphics chip for XB2 when ATI already has many excellent chips coming down the pipe over the next 2-3 years.
 
akira888 said:
PC-Engine said:
akira888 said:
But why would ATI waste money by designing a whole new graphics ASIC architecture just for Microsoft, when they have several in design or already finished? Which is not to say it won't be customized - but it will almost certainly be a member of a family of mostly PC chips - be that R5XX or R6XX.

MS has lots of money.

Certainly - and they have $58 billion in cash reserves precisely because they don't pointlessly waste it on things like paying ATI to design a "from scratch" graphics chip for XB2 when ATI already has many excellent chips coming down the pipe over the next 2-3 years.
Well, they did spend millions making a web browser only to turn around and give it away for free so they could crush Netscape and control the internet, thus increasing their grip on the OS market.
 