Qroach said: guden,
You think MS is going to basically GIVE you a high-performance computer?
That depends, do you think Sony is gonna give you a high performance computer?
R500 will probably be too large/expensive for MS, remember this is a high-end PC part.
Qroach said:This report is crap. The guy that wrote it is completely speculating.
Qroach said:Why would it be too large and expensive for a "new" console?
Well for starters, we had a guy publish what he purports to be pre-release NV40 figures in the 3D tech forum yesterday or earlier today (depending on your time zone). Amongst other things was a seven *billion* pixel per second fillrate.
If those numbers are correct, you can rest assured the R500 will top that figure by a significant margin, but do you see even a slight reason a console would need that much fillrate, for ANY purpose? It's not going to do 16x AA at (HD)TV resolutions, because it couldn't spare the RAM for it anyway.
I admit I don't know sh!t, but at least I try to use common sense when guessing.
chapvince said:Some people are even expecting more from the PS3.
What???! Sure it was TOO expensive for Xbox; don't you see MS bleeding billions on the machine? It was TOO expensive, but MS didn't give a shit, and that's the other part of it.
Do you expect MS to lose billions on Xbox2 this time? Not from what we're hearing now. They definitely *do* give a shit about costs now though.
Higher specs = higher costs, let's not kid ourselves.
Where are these multibillion dollar fabs you are talking about?
Sony funded an application specific processor, Microsoft is funding a customized desktop processor.
Qroach said: so you're talking about specs, not cost. Just because it has a huge fillrate doesn't mean it's going to be too expensive for a console.
Your reasoning is flawed. Just because YOU can't see a reason for it doesn't mean there isn't a use for it.
I can safely say developers will find a way to use whatever resources they get. Even still, this doesn't mean it's going to be too expensive to use in a console.
If you don't know $hit then don't go around telling other people they aren't using common sense, because I fail to see anything in your argument that proves or comes close to proving it would be "too expensive".
If the GPU in Xbox, which was the most powerful GPU at the time, wasn't too expensive for Xbox
Qroach said:That depends, do you think Sony is gonna give you a high performance computer?
Umm, dude, ever heard of a concept called capitalism?
no I don't expect them to give me anything! PS2 launched at around SEK 5500 over here, so I fully expect PS3 to cost an arm and a leg actually!
If you're lacking things to do with your time other than making trollish posts, I can give you a few helpful pointers...
Ok, I'm done speculating for this post. I admit I don't know sh!t...
Qroach said: I'm not even gonna bother responding to all the stuff you wrote as it's pure nonsense!
You still haven't given a single good reason
(even while admitting you don't know $hit).
It's hardly an irrelevant question.
Your expectation of a high-performance computer is completely out of whack IMO.
Using YOUR logic, because the PS3 specs are high (or are going to be), it will be too costly to use Cell. Does that make sense? NO, of course it doesn't!
You seem to think that price is directly related to performance, yet you forget that cost is driven down over time as you produce those chips.
Guden Oden said: Qroach said: so you're talking about specs, not cost. Just because it has a huge fillrate doesn't mean it's going to be too expensive for a console.
Umm, dude, ever heard of a concept called capitalism? In a nutshell it says if you're going to build a space shuttle, you outsource the job to the contractor that offers the lowest bid on your contract. You don't strap in the biggest engine you can get your hands on because it has lots of big numbers on the specs sheet. You buy the cheapest thing that'll get the job done!
Same thing here. What earthly use is 7 Gpix of fillrate (and that might be counting low) when you're only going to redraw the screen 60 times per second? Even at 4xAA at HDTV res you could redraw the screen about 50 times per frame; what's the point in sticking in such a graphics chip?
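To see roughly where a "dozens of redraws per frame" figure comes from, here's a back-of-envelope sketch. The specific resolution and the assumption that 4xAA costs a full 4x in fill are mine, not from the thread; tweak them (1080 lines, cheaper multisampling) and the answer moves anywhere from the mid-teens to over a hundred, but it stays in the same absurd ballpark:

```python
# Back-of-envelope check of the fillrate claim above.
# Assumptions are mine: 1280x720 target, 4x multisampling charged
# as a full 4x fill cost, 60 Hz refresh.
FILLRATE = 7e9                  # rumored pixels per second
WIDTH, HEIGHT = 1280, 720       # one common "HDTV res"
AA_SAMPLES = 4                  # 4xAA
FPS = 60

samples_per_redraw = WIDTH * HEIGHT * AA_SAMPLES
pixels_per_frame = FILLRATE / FPS
redraws_per_frame = pixels_per_frame / samples_per_redraw
print(round(redraws_per_frame, 1))  # ~32 full-screen redraws per frame
```

Even under these deliberately pessimistic assumptions, that's an order of magnitude more overdraw budget than any game of the era actually used.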
It'd just cost you more money than it's worth, that's what!
Aren't you supposed to be a games dev or something? You tell me what the point is in being able to redraw the screen 50 times per frame.
Seems about as useful as strapping an 8-liter Dodge Viper V10 engine into a pickup truck and mass-producing the damn thing, I say. ...Oh wait a minute, someone actually DID that!
Your reasoning is flawed. Just because YOU can't see a reason for it doesn't mean there isn't a use for it.
It's not flawed man, it's just you being fanboyishly obtuse on purpose. 50 screen redraws per frame at AA'd 60Hz HDTV res. Sensible? No. Not when you factor in that you'd need something like 60GB/s of bandwidth just for 32bpp framebuffer ops to sustain that kind of fillrate. Not only do you have a ridiculously overpowered and cost-ineffective GPU, you have to stick ridiculously fast and cost-ineffective memories on it to actually be able to use the power you have at your disposal.
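That bandwidth figure is easy to sanity-check. Assuming (my simplification, not the thread's) one 32-bit color read plus one 32-bit write per pixel, and ignoring Z traffic and any compression tricks entirely:

```python
# Rough framebuffer bandwidth needed to sustain the rumored fillrate.
# Assumption (mine): each pixel does a 32-bit read plus a 32-bit write
# (e.g. for blending); Z-buffer traffic and compression are ignored.
FILLRATE = 7e9               # rumored pixels per second
BYTES_PER_PIXEL = 4 + 4      # 32-bit read + 32-bit write
bandwidth_gbps = FILLRATE * BYTES_PER_PIXEL / 1e9
print(bandwidth_gbps)  # 56.0 GB/s, in line with the ~60GB/s figure above
```

Add Z reads and writes on top and the number climbs well past 60GB/s, which is the point: the memory subsystem, not just the GPU, has to be priced for that fillrate.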
Capitalism, man. When Sony removed the separate S-video and phono audio plugs from the original PS to save not even half a buck per unit, why would MS stick in a 7gpix/s GPU and squander A LOT more, when that power would be completely wasted in such an application?
I can safely say developers will find a way to use whatever resources they get. Even still, this doesn't mean it's going to be too expensive to use in a console.
Uh huh. *nods head politely* Right.
Use it for what? 50 screen redraws per frame at AA'd 60Hz HDTV res... Useful for WHAT purpose, man?!
If you don't know $hit then don't go around telling other people they aren't using common sense, because I fail to see anything in your argument that proves or comes close to proving it would be "too expensive".
Again, you're just being plain obtuse. I've provided plenty in the way of arguments, if your mind were just open enough to accept them.
If the GPU in Xbox, which was the most powerful GPU at the time, wasn't too expensive for Xbox
Um, first of all you might argue whether that is even true. Second, if you care to remember, MS actually sued Nvidia because they were forced to pay through the nose for those things, so whether it 'wasn't too expensive' or not seems just a little bit shaky, doesn't it?
Now, MS could still go ahead and DO it and stick in R500s by the bucketload, I never said anything about that (in fact I expressly said the opposite, that I don't know what they plan to do), but that doesn't change the fact it'd be stupid, because that chip would just sit there burning power and clock cycles and twiddling its thumbs most of the time for no reason. You know it and I know it.
How many situations can you think up where you'd need even ten screen passes to do what you want with a DX9 part, much less fifty with a rumored DX10 part? Use your brain man! Devs are barely using shaders and such as it is, and they've been around since spring 2001. A high-profile title like UT2004 is neither vertex shader-aware nor multithreaded nor anything. Are they going to re-think EVERYTHING and start using super-ultra-multipass techniques out the wazoo and burn 7+ gpix/s in just a few years? That's worth a few smilies in my book.
Besides, consoles have never been about heaping on limitless power to solve a problem; they've been about doing things efficiently and still getting excellent results. I don't expect this to change now.
PC-Engine said: akira888 said: But why would ATI waste money by designing a whole new graphics ASIC architecture just for Microsoft, when they have several in design or already finished? Which is not to say it won't be customized - but it will almost certainly be a member of a family of mostly PC chips - be that R5XX or R6XX.
MS has lots of money.
well they did spend millions making a web browser only to turn around and give it away for free so they could crush Netscape and control the internet, thus increasing their grip on the OS market

akira888 said: PC-Engine said: akira888 said: But why would ATI waste money by designing a whole new graphics ASIC architecture just for Microsoft, when they have several in design or already finished? Which is not to say it won't be customized - but it will almost certainly be a member of a family of mostly PC chips - be that R5XX or R6XX.
MS has lots of money.
Certainly - and they have $58 billion in cash reserves precisely because they don't pointlessly waste it on things like paying ATI to design a "from scratch" graphics chip for XB2 when ATI already has many excellent chips coming down the pipe over the next 2-3 years.