MS wants XBox2 out before PS3?

megadrive0088 said:
They're both likely to have 1GB or more memory. With PS3, I don't see all of that fitting into a box the size of PS2 with all of those CELL chips inside as well.

Um, what are you talking about - seriously. In addition to the corrections that Archie made, you're so completely wrong about the whole "all of those Cell chips inside" comment, which not only reeks of the wrong idea but calls your knowledge into question, given that you'd even believe or consider such a multi-chip scenario.

In addition, the underlying principle behind said Cellular Computing is to move a large amount of the memory on-die - where it can be accessed quickly to keep each computing element 'fed'.
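
For what it's worth, here's a minimal sketch of that "keep each element fed from on-die memory" idea, in plain C. Every name and size below is made up for illustration, and the memcpy calls merely stand in for the bulk transfers a real design would use - this is not actual Cell code, just the general local-store streaming pattern:

```c
/* Sketch of the local-store streaming pattern: work only out of a small,
 * fast on-die buffer and touch main memory in bulk transfers.
 * All names and sizes are illustrative assumptions, not real Cell code. */
#include <stddef.h>
#include <string.h>

#define LOCAL_FLOATS 4096           /* assumed capacity of the on-die store */

static float local_store[LOCAL_FLOATS];

void process_stream(float *main_mem, size_t count)
{
    for (size_t i = 0; i < count; i += LOCAL_FLOATS) {
        size_t n = (count - i < LOCAL_FLOATS) ? (count - i) : LOCAL_FLOATS;

        /* stand-in for a DMA transfer into the local store */
        memcpy(local_store, &main_mem[i], n * sizeof(float));

        /* the compute element runs entirely out of on-die memory here */
        for (size_t j = 0; j < n; j++)
            local_store[j] *= 2.0f;

        /* stand-in for a DMA transfer back out */
        memcpy(&main_mem[i], local_store, n * sizeof(float));
    }
}
```

The point is simply that the inner loop never touches off-chip memory directly; a real design would also double-buffer, so the next chunk streams in while the current one is being processed.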

Listen up to Marco and Archie - they're both really smart guys.

We're getting to the point of diminishing returns, and the content (or how much of it we can cram in) of what we display is going to be of more critical importance than how we display it. Likewise audio may finally get its time in the limelight (as audio has become more important lately). And finally AI (which has become probably the most critical and busy topic as of late), and how it affects gameplay, will become the most critical factor...

Amen!
 
Is it just me, or does anyone else get the distinct impression that Sony [in its more personified form - maybe Aibo?!] raped Chap's mom? ;)
 
Sorry guys, but I am very impressed with MS's first foray into the console industry.

Bill Gates has already said that Xbox will be their second-highest priority, behind Windows, and that MS is ready to spend and outspend their rivals to take over the console world. :D

MS will change how things work in console gaming.
Again, if MS wants, MS gets.

:oops:
 
From reading that link, it seems that OUM is a non-volatile memory technology that might someday replace Flash.

What application would it have in a game console? I guess it could be used as a memory card medium. But it doesn't sound like it would be fast enough to be used as a RAM or EDRAM replacement.
 
Its speed does not prevent it from competing with DRAM (it will most likely be faster at the feature sizes used in 2005); reliability might, though (at present it can only stand ~1e12 reset cycles ... which is fine for replacing flash, but obviously a bit of a problem for embedded RAM).
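
To put that ~1e12 figure in perspective, here's a rough back-of-the-envelope sketch; the write rates below are purely illustrative assumptions, not measured numbers:

```c
/* Rough wear-out estimate for a cell rated at ~1e12 reset cycles
 * (the figure quoted above). The write rates are assumptions. */
#include <stdio.h>

int main(void)
{
    const double endurance = 1e12;            /* rated reset cycles per cell */

    /* Flash-style use: e.g. a save-game block rewritten 100 times a day. */
    const double flash_writes_per_day = 100.0;
    double flash_years = endurance / flash_writes_per_day / 365.0;

    /* RAM-style use: the same cell rewritten at 10 MHz, as a hot counter
     * or a frequently updated framebuffer location might be. */
    const double ram_writes_per_sec = 10e6;
    double ram_hours = endurance / ram_writes_per_sec / 3600.0;

    printf("Flash-style use: ~%.0e years before wear-out\n", flash_years);
    printf("RAM-style use:   ~%.1f hours before wear-out\n", ram_hours);
    return 0;
}
```

In other words, effectively unlimited as a flash replacement, but a frequently written location in embedded RAM could wear out in about a day without wear-levelling.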
 
chap said:
Sorry guys, but I am very impressed with MS's first foray into the console industry.

Bill Gates has already said that Xbox will be their second-highest priority, behind Windows, and that MS is ready to spend and outspend their rivals to take over the console world. :D

MS will change how things work in console gaming.
Again, if MS wants, MS gets.

:oops:

MS has enormous power on the PC because of its OS monopoly.

But the console market is different: while PC gamers have to buy an MS OS in order to play on their PC, there is nothing that forces consumers to buy the Xbox instead of its competitors.

And even having lots of money to spend may not be enough to beat Sony and Nintendo.
 
chap = chap.
I think deep down inside, most of you will have that lil feeling that MS is going to take over the console world. I sense insecurity in most Sony and Nintendo fans. :D
 
chap said:
chap = chap.
I think deep down inside, most of you will have that lil feeling that MS is going to take over the console world. I sense insecurity in most Sony and Nintendo fans. :D

You've got faith in Microsoft.. you are a believer!

Who cares about lil feelings here?
 
chap said:
chap = chap.
I think deep down inside, most of you will have that lil feeling that MS is going to take over the console world. I sense insecurity in most Sony and Nintendo fans. :D

Wouldn't that require
A) making some good games and
B) selling some consoles outside of the US?

Xbox is already dead in Japan; it's doing better in Europe, but not great.
They are only using the Xbox to try to get some dedicated fanboys such as yourself, but I think you're going to find it a hard letdown, because they are going to be putting a lot into the XBox 2, hoping it will be the one to bring in the money (instead of sucking up their funds like the Xbox). If this thread is true, that won't be too far off. I would not pay $50 for a VCR if I knew it would only last me 4 years.. I sure as heck wouldn't pay $200 for a console with that lifespan.
 
Vince-

Woah boy... NV6x in 3 years? I guess this comes down to a fundamental question of how nVidia's timetable will hold. I see a slower-paced environment, with the pace set by less frequent, but more radical, DX revisions.

I don't see the continued investment of producing entirely new cores or major revisions every year. CineFX is here for a while - just as the TNT core lasted all this time as the foundation of nVidia's products.

NV30 taped out in early September of '02; for the XB2 we are looking at a Q4 (likely mid-November) '05 timeframe. That's roughly 38 months between the two. As I stated, I don't think NV6x is likely, although I see it as an outside possibility. NV5X should be a given barring a complete implosion at nVidia. From '99 to '02 nV moved from NV10 to NV30; now in the next three years they will only introduce one new core? I see NV6X as far more likely than an NV4X core (although NV5X is where my money would be).

Yes, the TNT core was with us for ~four years. Compare the TNT to a GeForce4 and see exactly what that means though ;)

AFAIK, the CEO (whose name eludes me) has stated several times when pressed that no tape-out occurred until early September. The problem - while the fab's early problems add to it - probably stems from the influx of ideas injected into the NV30 project when the 3dfx acquisition took place. The design was probably too ambitious for the temporal and lithography constraints. This, together with the lithography production problems, caused the problem they're in now.

And you think this will hold up every future nV product?

Well, for nVidia's sake, let's hope your hardware predictions are better than your software ones.

Which of my software calls have been off so far? It looks like I was a bit too low on Mario (despite being quite a bit higher than anyone else), but there are outside factors that I did not see happening (revolving around WM).

On the topic of Grid, IBM had a conference outlining their plans for it. Before the conference I thought it was a half-witted idea; now I see that it is all clearly a joke. Have all computers connected to a BB setup so that their unused CPU cycles can be used by others, and make people pay for it.....? It's distributed computing with a financial model that can only benefit IBM and cost the overwhelming majority of users money for something that they already have too much of (they are going to charge for CPU time, which as of right now most users have 90%+ of free, as a utility).

They state that businesses who have some PCs not in use will be able to let other machines on their Grid (which you have to pay to be on) use the spare cycles. You know, so those people on their 2GHz P4/Athlon systems can grab that extra power for those spell checks :rolleyes:

The only practical applications they could come up with for this were places that needed massive computing power but couldn't afford a supercomputer (scientific modeling). Right now these places would be forced to start up a free distributed computing project, but IBM will let them do it and pay them.

Grid in terms of PS3s is an absolute non-issue. Perhaps IBM is planning on giving Sony a kickback for CPU time taken from consoles (like people are going to leave it on when they aren't using it), but it cannot benefit the PS3 in any meaningful way. You cannot predict how many spare cycles will be available at any given point in any particular area, which means there would be no way for developers to deal with latency. That is assuming that IBM could actually have this up and running by the time PS3 launched and assure BB access to all potential PS3 customers within the next four years.
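
To make the latency point concrete, here's a rough sketch; the round-trip times are assumptions about typical consumer broadband, not measurements:

```c
/* Frame budget vs. network round-trip: any work shipped off to a remote
 * machine comes back several frames too late to affect the frame that
 * asked for it. RTT figures below are illustrative assumptions. */
#include <stdio.h>

int main(void)
{
    const double frame_budget_ms = 1000.0 / 60.0;     /* ~16.7 ms at 60 fps */
    const double rtt_ms[] = { 50.0, 100.0, 200.0 };   /* assumed broadband RTTs */
    const int n = (int)(sizeof rtt_ms / sizeof rtt_ms[0]);

    for (int i = 0; i < n; i++)
        printf("RTT %5.0f ms -> result arrives ~%4.1f frames late\n",
               rtt_ms[i], rtt_ms[i] / frame_budget_ms);
    return 0;
}
```

And that's before even asking whether the remote cycles exist at that moment, which is the unpredictability problem above.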

Do you remember network computers? They are what replaced PCs (the reason why no one has PCs anymore :LOL: ). It was an idea hatched from one of the largest computer companies, and it certainly had more worth than Grid did. Just like IBM they have all sorts of supposedly smart people; the problem was that, like IBM, they seemed to base their scheme on 'how can we generate revenue' instead of 'what does the market want'.

Archie-

Speaking of partnerships, it's interesting that this so-called 'Xbox partnership' has two hardware partners who have had a tendency to avoid one-off designs to get where they are today, yet everybody here seems to expect them to just stop doing what got them there. Intel has pretty much been a mass producer of a limited lineup, and Nvidia has leveraged their existing IP through their entire product line, exploiting their venture into core-logic to win the Xbox contract.

Whereas Toshiba and IBM have made quite a business of leveraging their R&D facilities into doing one-off designs for their customers..

Is this why the PS2 is so much more powerful than the XBox? ;)

I don't consider Intel part of the XB2 equation; they had very little to do with the original XBox. nVidia will more than likely handle the design and hand it off to a group almost assuredly including former SGI engineers. AMD and Intel simply provide one of their mass-produced parts, which have been closing the performance gap with specialized designs for years now (particularly when looking at single-die offerings).

I am in complete agreement with your statements about diminishing returns on the visual end (artists are likely to be the key to the best visuals next gen); I am interested in hearing your take on how much computing power you think it will take to significantly improve the state of AI.
 
Whereas Toshiba and IBM have made quite a business of leveraging their R&D facilities into doing one-off designs for their customers..

I would like to hear your opinion about whether you think this approach will still be financially viable in the future. Despite all its shortcomings, x86 is undoubtedly the most "successful" ISA from a Darwinistic POV so far. Currently it is aggressively expanding into other markets: into embedded vs. MIPS/SuperH (see XBOX & various x86 MCUs, ~10% market share), into low- to medium-end workstations (almost a complete takeover by now; Sun & PowerPC have single-digit market penetration), and pushing aggressively into low- to medium-end server markets (up to $50,000), complementing this with their ARM/IA64 ISAs, while being unchallenged in its home market (mainstream personal computing). Intel by now even seems to be fairly ahead of IBM on process technology (look at their ramp @ .13 micron & their plans of having .09 mainstream (for Prescott) at the end of next year). Just to come full circle, to me the x86 camp of Intel & AMD seems stronger than ever, for all major competitors failed to introduce a serious challenger that could profit from similar economies of scale. (I still have hopes for IBM's 64-bit Power4 mainstream derivative, but I feel it will be too little, too late by 2004.) Toshiba's & IBM's microprocessor businesses are just dwarfed by Intel's & AMD's nowadays.
 
It is just ridiculous to expect any NV3X GPU (NV30, NV31, NV35, etc.) in XBox2, a console that's not coming out for at least 3 years.

Even if Nvidia massively slows down the pace of GPU development, it's inconceivable to me that XBox2 would use anything less than an NV40. The NV40 should have been a late 2003 product, but even if it slips to late 2004, that's still a year or so away from any XBox2 release. In that case, the base NV45 or NV50 GPU would probably be the likely choice for an XBox2 variant.

The most I think is possible is NV55, or some variant of it. Not that I would mind an NV60-powered XBox2, though :)

Whatever MS/Nvidia (or ATI) decide on, I just hope it has really fantastic FSAA capability (as I do with PS3) - with lots of samples of the best method of FSAA possible.
 
Sorry guys, but I am very impressed with MS's first foray into the console industry.

Well you're certainly welcome to be... I guess some of us are less impressed because Sony already demonstrated that a relative newcomer can succeed (and their success is arguably more impressive), leaving Microsoft's foray somewhat less impressive in comparison...

MS will change how things work in console gaming.

I'm curious, what do you expect them to change?

Currently it is aggressively expanding into other markets: into embedded vs. MIPS/SuperH (see XBOX & various x86 MCUs, ~10% market share), into low- to medium-end workstations (almost a complete takeover by now; Sun & PowerPC have single-digit market penetration), and pushing aggressively into low- to medium-end server markets (up to $50,000), complementing this with their ARM/IA64 ISAs, while being unchallenged in its home market (mainstream personal computing).

I don't know if I'd consider Xbox an aggressive expansion into the embedded space. I guess it's an Intel/x86 platform win, perhaps, but x86 has always had a small piece of the microcontroller market (where CISC code density matters more than performance); it's just that that's primarily been filled by non-Intel x86, which in turn has been small peanuts compared to Motorola's m68k/ColdFire and various other microcontroller vendors (Hitachi, Toshiba, NEC, Sony, Mitsubishi).

As for desktop MPUs (whether they be consumer, professional workstation, or mid- to high-end server), Intel has always been the dominant force there (maybe not in performance, but definitely in size and influence). Being IBM's choice for their PC, and subsequently all major cloned systems, pretty much meant that economy of scale (along with M$'s dominance on the software side) would permit them to dominate that segment in all but the most specialized sectors. That pretty much meant that all the RISC competitors never really had much of a chance (including AMD's and Intel's own RISC designs). The market for competitors really hasn't changed much in that area, with the exception of AMD actually providing competition for Intel on the high end to some degree. There are still non-Intel low-cost x86 solutions around (Via and Transmeta instead of Cyrix and AMD). High-end wise, even with the relative improvements of McKinley over Merced, it hasn't done all that well (hence its more affectionate nickname, 'Itanic'). As far as their ARM solutions go, it's probably been a total blessing that DEC's demise dropped their ARM7-based work (StrongARM) into Intel's lap. While StrongARM did fairly well as a high-end solution, I don't know if Xscale (ARM10) will do the same, while ARM moves on (ARM11) and other ARM vendors compete with high-end Xscale offerings (like Samsung), not to mention their use of Xscale in comms, where it's dominated by Motorola and IBM's PowerPC offerings. The real market for ARM is in the low end, where Intel doesn't really compete (I'd even speculate that Nintendo sells more ARM devices than Intel does).

Intel by now even seems to be fairly ahead of IBM on process technology (look at their ramp @ .13 micron & their plans of having .09 mainstream (for Prescott) at the end of next year).

I'm not so sure that necessarily makes Intel 'ahead' of IBM on process technology. IBM's CMOS8xx (.13µm) has been around longer than Intel's, and CMOS9xx (90nm) will likely be available next year as well, not to mention they're solidly on their way to 65nm with CMOS10xx. That also neglects their SiGe BiCMOS processes for mixed-signal ICs (although there are rumors of Intel making a big push with 90nm SiGe, that remains to be seen). Then there's Toshiba, who's already sampling .10µm devices (Sony makes the verification equipment for it) and has pretty solid plans for 90nm, 75nm and 55nm... In fact 90nm is a goal for a lot of people next year (foundries, embedded, memory, etc.)...

Just to come full circle, to me the x86 camp of Intel & AMD seems stronger than ever, for all major competitors failed to introduce a serious challenger that could profit from similar economies of scale. (I still have hopes for IBM's 64-bit Power4 mainstream derivative, but I feel it will be too little, too late by 2004.) Toshiba's & IBM's microprocessor businesses are just dwarfed by Intel's & AMD's nowadays.

Well, considering IBM made x86 the de facto standard for the PC, nobody really could challenge it unless you wanted to make x86 processors yourself, and the only one who's really given Intel a run for their money has been AMD. RISC vendors never really had a chance because there was no popular software platform for them to flourish on (other than Windows NT for MIPS, Alpha and PowerPC, of which only the Alpha version lasted any decent amount of time, and which are now all gone aside from the rumored existence of Win2k builds for Alpha maintained internally). Linux and BSD of course change that (and perhaps give IBM's PPC970 some path to popularity, although I doubt it, as both IBM's AIX and Linux solutions will likely be too expensive for general consumers, and Apple will probably be the best chance for a regular end-user to get their hands on it), but in the end they've come about too late to have any *real* influence.

As for IBM's and Toshiba's microprocessor businesses, I'd say they're fairly comparable in size, considering that while semiconductors represent only a portion of each company's business, Toshiba is twice the size of Intel and IBM three times... I'd attribute Intel's visibility to its core market being the more 'glamorous' desktop MPU and core-logic categories...

Is this why the PS2 is so much more powerful than the XBox?

Hehe, well considering I first saw GS hardware in late '98 and EE hardware in early '99, when Xbox wasn't even a paper spec, I'd say it does a fair job of competing...

I don't consider Intel part of the XB2 equation; they had very little to do with the original XBox. nVidia will more than likely handle the design and hand it off to a group almost assuredly including former SGI engineers.

Another interesting approach that nobody has mentioned is an all-Intel solution. While it goes against what I previously argued about Intel's business patterns, considering the amount of production capability they're spending on (while it's mainly to kill AMD, it'll be 'too much' once AMD is dead, so perhaps they plan on providing foundry services for others as well), they could possibly build a totally custom solution. It's not out of the realm of possibility, as Intel does have the 'know-how', large-scale reliable fab space, and core-logic design experience. Perhaps something like Banias, although with more emphasis on higher performance and embedded graphics than I/O...

I am in complete agreement with your statements about diminishing returns on the visual end (artists are likely to be the key to the best visuals next gen); I am interested in hearing your take on how much computing power you think it will take to significantly improve the state of AI.

Well, it's hard to determine, since AI has gone from "as long as it doesn't hurt frame-rate" to gaining enough significance to be a critical component of a game's engine (in some cases being *the* feature basis for a game). Basically, that means that even without processor improvement, AI is benefiting from gaining a larger share of existing processor cycles. Also there's the shift away from academic methods (genetic algorithms, neural nets) to more traditional methods that yield 'more' for fewer cycles. Of course, like any other aspect, AI can consume vicious amounts of cycles if you're willing to go there (just look at all of the computer vs. human chess competitions).
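
As a rough illustration of how quickly search-based AI eats cycles: with chess's usual textbook branching factor of ~35, a plain minimax search blows up exponentially with depth. The evaluation rate below is an assumed number for illustration, not a claim about any particular CPU:

```c
/* Node-count estimate for a plain minimax game-tree search.
 * Branching factor ~35 is the common textbook figure for chess;
 * the evaluation rate is an assumption for illustration. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double branching = 35.0;     /* average legal moves per position   */
    const double evals_per_sec = 1e6;  /* assumed positions evaluated/second */

    for (int depth = 2; depth <= 8; depth += 2) {
        double nodes = pow(branching, depth);
        printf("depth %d: ~%.1e positions, ~%.0f s per move\n",
               depth, nodes, nodes / evals_per_sec);
    }
    return 0;
}
```

Alpha-beta pruning and good move ordering cut that to roughly the square root of the node count, which is exactly the 'more for fewer cycles' trade-off traditional game AI lives on.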

I only mention AI (as there are other aspects that consume processing cycles) because it's one of the larger determinants of gameplay, and in spite of popular belief I don't think online gameplay is the be-all and end-all...

Oh well, enough rambling...
 
Another interesting approach that nobody has mentioned is an all-Intel solution. While it goes against what I previously argued about Intel's business patterns, considering the amount of production capability they're spending on (while it's mainly to kill AMD, it'll be 'too much' once AMD is dead, so perhaps they plan on providing foundry services for others as well), they could possibly build a totally custom solution. It's not out of the realm of possibility, as Intel does have the 'know-how', large-scale reliable fab space, and core-logic design experience. Perhaps something like Banias, although with more emphasis on higher performance and embedded graphics than I/O...

According to "Opening the Xbox", Intel had its own console in the planning stages during the Xbox development (unbeknownst to each other). Intel and MS basically approached each other, MS to secure Intel hardware, and Intel to secure MS software. As the story goes, once Intel suits found out about the Xbox's existence, their system basically died a cold death. The Intel box was part of their consumer division (the guys who made the microscopes and such), so I don't think they were nearly as ambitious as MS in this regard.

zurich
 
Is this why the PS2 is so much more powerful than the XBox?

PS2 and the Cube were developed without the knowledge that MS would get that seriously involved.... but now they know... Cube2 is to come later than PS3 ... and PS3 is being developed with the knowledge of MS being involved....

Rest assured that whatever is thrown out by Sony will eclipse, by several orders of magnitude, any graphics card that's thrown out in 2004, and it will be DESIGNED with the intention of surpassing any Xbox1-like upgrades such cards could get...

I dunno who actually expects the PS3 R&D staff to see future GPUs on the market, and future DirectX versions... and completely ignore such things...

Anyway, with Nintendo launching a console later, it will be the one to hold the graphics throne next gen....

you will have that lil feeling that MS is going to take over the console world. I sense insecurity in most Sony and Nintendo fans.

.... hmmm.... I think MS is gonna push too many buttons in the near future.... maybe even as soon as with their next so-called end-all Win rev... and like a powerful Caesar they'll be stabbed to death by the rest of the threatened industry....
 
Do you remember network computers? They are what replaced PCs (the reason why no one has PCs anymore). It was an idea hatched from one of the largest computer companies, and it certainly had more worth than Grid did.
Network computers are actually still around, believe it or not, but granted, the concept failed to gain any noticeable popularity (and probably never will in that form). However, the idea behind it was nonetheless a sign of things to come.
It doesn't matter if it'll be Grid, .Net or whatever else that pushes it forward into the mainstream... I'm very confident it's what will eventually replace the concept of personal computers as we know them today.

Mind you, I'm mainly *not* referring to any kind of distributed computing with this, although that would eventually play some role as well.
 
Personally I'd like to keep my data local; I wouldn't mind an encrypted backup somewhere else ... but as long as my processing is local there is no reason for my data and programs not to be, except for presenting new and better ways to bleed money from me. I don't think I like the future very much ;)
 