xbox360 not hitting planned goals?

I personally think this thing is being blown way out of proportion in a drama-esque way. When you're planning hardware to launch in 3+ years' time, there are risks involved and any such manufacturer will at some point miss what they were aiming for. If the target was set very high - which it is likely to be, since this is a very competitive market - I'd be surprised if all goals were easily met. I'm not sure how accurate the number behind this rumoured ~25% cut is, but I am willing to believe that goals have been missed - as I am sure that Sony missed theirs too. I also believe that missing goals isn't going to be that much of a problem, since this would be a known risk factor and one that's likely to occur.

I'm sure MS has backup plans, as Sony and Nintendo probably do as well. I mean, if Nvidia's work on PS3 is any indication, it might just mean that Sony missed their goals a few months back too.

In the end, all will be well and 99% of the consumers (us included) will never know what goals were missed and by how much - that is, unless you're a very deep insider and work in those divisions.
 
Personally I'm of the opinion that the expectations of the fanboys of all colours won't be met for another couple of generations of hardware after XBox360/PS3/Revolution, by which time, if 45nm is achievable, they'll be able to actually make a BE implementation of Cell that does a TFLOPS, for instance. Hardware aside, the bottom line is up to the creativity of the devs; there are graphics whores who see nothing but hi-tech rendering and think it makes a game, but if GAMEPLAY is crappy no amount of money thrown into the art department is going to make a game or the associated hardware interesting. It's a bit like fanboys of jet fighters: the capability of a modern combat aircraft is on par with a Sopwith Camel sitting on the ground; it takes somebody to truly fly it and fight with it well to get anything out of its promise.

I think people take technology too much for granted. Like jvd's lamented lost Intellivision, if a game is fun, the underlying technological datedness of it is irrelevant - it lives on its own merit regardless of current systems. Hell, one of my favourite games is an ASCII-based dungeon game from the mid-80s called Larn, and I play it often even today.
 
Teasy said:
Well GC's GPU performance did end up being 75% of what it was planned to be. This kind of thing can happen when you're not just picking already-created parts off a shelf.
They said it was a balancing issue. It could have run at 202.5MHz, but it wouldn't have done any good with the CPU running at only 405MHz, so the CPU clock went up and the GPU clock went down, and we were left with an extremely efficient and balanced design. Doesn't disprove your point, though, because a major shift in balance like that indicates some faults in their targeting or planning.
 
Not to mention that if they had kept Flipper at 200MHz they would have had to increase Gekko's speed to 600MHz to retain the balance ratio they were looking for. That would have increased the power and heat significantly - not good for a tiny box.
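(For the arithmetic: the final hardware shipped at roughly a 3:1 CPU:GPU clock ratio - Gekko at 485MHz to Flipper at 162MHz - so holding Flipper at ~200MHz would indeed have meant pushing Gekko to about 3 x 200 = 600MHz to keep the same balance.)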
 
Maybe this two-core CPU refers to a PPC 970MP, just like "version" predicted. And maybe his prediction of a separate PPU is true as well. C'mon version, tell us!!!!
 
Tacitblue said:
Personally I'm of the opinion that the expectations of the fanboys of all colours won't be met for another couple of generations of hardware after XBox360/PS3/Revolution, by which time, if 45nm is achievable, they'll be able to actually make a BE implementation of Cell that does a TFLOPS, for instance. Hardware aside, the bottom line is up to the creativity of the devs; there are graphics whores who see nothing but hi-tech rendering and think it makes a game, but if GAMEPLAY is crappy no amount of money thrown into the art department is going to make a game or the associated hardware interesting. It's a bit like fanboys of jet fighters: the capability of a modern combat aircraft is on par with a Sopwith Camel sitting on the ground; it takes somebody to truly fly it and fight with it well to get anything out of its promise.

I think people take technology too much for granted. Like jvd's lamented lost Intellivision, if a game is fun, the underlying technological datedness of it is irrelevant - it lives on its own merit regardless of current systems. Hell, one of my favourite games is an ASCII-based dungeon game from the mid-80s called Larn, and I play it often even today.

Except processing power multiplies itself in a short period of time, so fanboy expectations need only be off by a matter of months to a year or so to be totally off the mark. I don't think the problem is expectations so much as the rapid expansion of power in the industry. Some might be expecting 2006 performance from a 2005 product. Just my opinion. PEACE.
 
By the next-next generation of consoles (PS4, XBox3, etc.) 45nm should be commonplace and 32nm might even be here, so a chip with a billion transistors (BE) should be achievable, no sweat. Think about this: the Emotion Engine had 10.5 million transistors, Cell has 234 million - PS4 will have what, 5 billion? Coupled with another 6-7 years of advancements in GPU designs, at that point you might really be talking photorealism.
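(Rough arithmetic on those numbers: 234 million / 10.5 million is about a 22x jump from EE to Cell, and another jump of the same size would be 234 million x 22, or roughly 5 billion - so the guess is basically just "one more EE-to-Cell-sized leap".)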
 
Pozer said:
Nirey said:
This does sound true in some sense, since MS is not a hardware company like us at Sony here. Couple that with MS rushing too fast to release, and maybe Xbox 360 really is just an "Xbox version 1.5" :D :LOL: ;)

You mean an appliance company. ;) jk. Does make sense. Tri-core 3GHz did sound a little exotic, to say the least. IBM has a long history of promising Apple certain clock speeds and then delivering something else. I still think MS should have partnered with AMD this round.

I still find it hard to believe all these forum 'leaks' lately. There seems to be too much fanboyism behind them. Everybody and their brother works for Sony or Nintendo now, or is developing a next-gen game and knows all.

Time to toss in that PPU now. :D

PPU to the rescue! to save Xbox360 from sub-teraflop hell!

:LOL:
 
I can't believe we are debating something that has no source and no proof. This same guy has spammed this on many Xbox boards already too. I fully expect the CPUs to be very powerful in Xbox 360.

You basically have a guy who is not providing any credible sources and is most likely making it all up. "Hey guys, I got this inside info on PS3's CPU, I heard it will only be 45% of the power it was planned to be, this is what my l337 inside source says!"


Does that sound believable?

If you look at IBM and what they are doing with CPUs, this really is not the least bit believable at all. He's basically talking about nothing and making up buzzwords to make his story sound believable. It all amounts to nothing. All 3 console makers are using IBM, there is a reason for this.... None of them would be using them if they were not confident in their abilities.

It might be true, it might not, but after seeing all of the pre-E3 stuff the press has seen so far, if the CPUs are not as powerful as expected you can't tell the least bit in the games.... So I really do not believe this to be true at all, especially since the developers still do not have final hardware.



Except writing stable, secure, efficient, reliable operating systems, which is their primary area of 'expertise'.

Give me a break, we are sooooo past the age when it was "cool" to bash MS and their software. I run XP 24/7 for months on end without having to do a reboot. You don't get more stable than that. Yeah, their system might not be the most secure out there, but that's life. They do a great job of quickly fixing problems that come up, and are getting better at catching them before they are found. An OS with the features of Windows is a massive batch of code; it's not easy finding every hole in it. If other systems like Mac OS were used by 90% of the world's PCs, there would be tons of flaws found in them too.
 
BOOMEXPLODE said:
By the next-next generation of consoles (PS4, XBox3, etc.) 45nm should be commonplace and 32nm might even be here, so a chip with a billion transistors (BE) should be achievable, no sweat. Think about this: the Emotion Engine had 10.5 million transistors, Cell has 234 million - PS4 will have what, 5 billion? Coupled with another 6-7 years of advancements in GPU designs, at that point you might really be talking photorealism.


I agree with most of what you said. Xbox3--N6--PS4 will be here in the early part of the next decade. 45nm will be commonplace, and late-model PS3s will have been using it. The next-next generation consoles (Xbox3, PS4, N6) should debut on 32nm or smaller and have several billion transistors in both the CPU and the GPU.

They (Xbox3, PS4, N6) should achieve the kind of graphics that we wanted for this coming generation (high-end CGI from games and movies). We'll have a few gigs of RAM and several non-BS teraflops of performance (at least).

Then the following generation, late next decade (Xbox4, N7, PS5, etc.), we might see near-photorealism, at least to the point that they should rival the best CGI we have ever seen to this day. We should have full raytracing and global illumination, with enough complexity and quality for everything that developers and gamers would want to see. I don't know if we'll need much more of a graphical improvement beyond this next-next-next generation, which would start late next decade with 4th-gen Xboxes, 7th-generation Nintendos and 5th-gen Playstations.
 
Well, I shouldn't have said photorealism; that's too vague. A photorealistic grain of sand could be rendered on today's machines - it's all relative to the complexity of the scene. I'm certain of one thing: in our lifetimes the scenarios in today's games (like, say, a football game) will be simulated completely (graphics, physics, AI, etc.) to near-perfection. It's really amazing the pace at which video game technology has progressed: when I was a kid, crude monochrome graphics were cutting-edge spectacles that amazed everyone, and by the time I'm an old man no one will bat an eyelash at photorealism and accurate reality simulation.

I'm satisfied with what the next-gen consoles are shaping up to be. The jump from PS2>PS3 should be even greater than the jump from PS1>PS2, and that in itself is pretty amazing.
 
Qroach said:
This guy claims MS & IBM removed "path forwarding" from the CPU and that the yields are not good (causing the clock speed to be below 3Ghz).

What the heck is path forwarding?

"Path forwarding" seems to be a term used in networking; I don't have a clue what it really is, and since it isn't important here I won't do any further research in that area.

Forwarding is, however, a term used in processors:
the result from an ALU operation or memory read is fed directly back to the ALU input before being written to the register file. This reduces the number of stalls and is one of the first optimizations done to a pipelined in-order processor.

For a better explanation, follow the link and read the section on ALU forwarding about halfway down the page.
https://www.cs.tcd.ie/Jeremy.Jones/vivio/dlx/dlxtutorial.htm
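To make that concrete, here's a minimal sketch of the kind of dependency forwarding deals with (the function and variable names are just illustrative, and the pipeline behaviour described in the comments is the textbook five-stage in-order case):

    /* Minimal sketch of the data hazard that result forwarding removes.
       The C is only here to show two back-to-back, dependent ALU ops;
       the stall/forward behaviour in the comments is the classic
       five-stage in-order pipeline case. */
    int forwarding_example(int r2, int r3, int r5)
    {
        int r1 = r2 + r3;  /* first ALU op: r1 is ready at the end of its EX stage */
        int r4 = r1 - r5;  /* next op needs r1 in its own EX stage, one cycle later */

        /* Without forwarding: the second op stalls until r1 has been written
           back to the register file (a couple of bubble cycles).
           With forwarding: the r1 value sitting in the EX/MEM pipeline latch
           is routed straight back to an ALU input, so there is no stall. */
        return r4;
    }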

But I'd bet that's not it.
 
EPe9686518 said:
I can't believe we are debating something that has no source and no proof.
That's often the case on discussion forums. Someone chucks out an idea and people think about it. People have discussed 'will a PPU feature in XB360' without any proof it will or won't. If we had proof about all this stuff, it wouldn't be a discussion forum but a collection of news posts!

All 3 console makers are using IBM, there is a reason for this.... None of them would be using them if they were not confident in their abilities.
This has already been explained. No matter HOW good your company is at what it does, targets get missed. The same happens in software. Look at all the games with X features touted during development, and then when they're released a whole load of those features have been dropped for one reason or another. The same happens with hardware.

Also, there have been NO official XB specs, so no one can really say whether performance is lower than hoped for - but I would say that EVERY console has lower-than-hoped-for performance, because every console manufacturer is really hoping for something amazing!

It might be true, it might not, but after seeing all of the pre-E3 stuff the press has seen so far, if the CPUs are not as powerful as expected you can't tell the least bit in the games....
Which is a different discussion. Rather than 'could this be true?' you're talking here about 'if true, does it matter?' Whether it matters or not has no bearing on how valid it is. The fact that MS don't NEED 100% power, and that 70% is okay, doesn't make the idea that they're (allegedly) only getting 70% power untrue.

Except writing stable, secure, efficient, reliable operating systems, which is their primary area of 'expertise'.
Give me a break, we are sooooo past the age when it was "cool" to bash MS and their software.
I wasn't bashing their software so much as pointing out a flaw in someone's reasoning. They said MS has expertise, so they wouldn't end up with problems with their products. I'm saying all that expertise hasn't helped them in the past. Which comes back to the key point we've mentioned in this thread several times: just because you're an expert at something doesn't mean problems don't arise. It is not impossible for MS+IBM to have missed targets for the Xenon CPU, and looking at track records to judge probability, these companies have missed targets in the past (look at all the features apparently being dropped from Longhorn = missed targets).

Is this rumour true? I would say there's no reason to believe it. It comes from an unnamed source without any evidence on an open forum with a track record of lies and attention seekers.
Is this rumour plausible? I would say yes, it could happen. Dropped features and missed targets happen all the time in all industries.
 
I'll be v happy once E3 begins and all these rumours can be put to bed, and everyone can go back to being fanboys and discussing the merits of processor A vs processor B with bugger-all knowledge of either processor, or the fact that texture A in game X is better than texture B in game Y.

Only a few more days, thankfully.
 
No such luck, croc hunter! At e3 I doubt we'll hear MS saying anything about how they hit/missed their targets. They'll show a system, and then the discussions 'is this what MS were hoping for?' will begin in earnest, along with 'PS3 blows da 360 away' and 'Sony hype shows PS3's crippled' and 'PS3 demos all offline movies' and all that jazz :oops:

There is no escape ('cept to avoid internet forums and do something more productive instead of wading through all this trash!).

Hopefully though, post-E3, we'll also see some intelligent hardware discussion on implementation etc., which is what this forum had when I joined.
 

I just think it's hilarious that you people took random Gaming Age forumer #154's word for it. I mean, it was like hook, line and sinker. It's freaking Gaming Age, for crying out loud. The only less reliable internet forum is GameFAQs.
 

ecliptic said:
I just think it's hilarious that you people took random Gaming Age forumer #154's word for it. I mean, it was like hook, line and sinker.

Entertaining the scenario != "taking his word for it"

It's a plausible scenario, doesn't mean it's real.

And "random Gaming Age forumers" have turned out to be more credible afterall than sceptics initially proclaimed (see the KoTOR thread) ;) (not that fujimax is necessarily credible just because others have emerged, provably, as "insiders", but the whitewashing of anything coming from GAF is silly).
 
Shifty Geezer said:
No such luck, croc hunter! At e3 I doubt we'll hear MS saying anything about how they hit/missed their targets.
I've not read this entire thread, but isn't it about the difference between the actual Xbox 360 spec and the leaked Xenon block diagram?
 