Future CPU performance

You can have all the potential performance in the world, but if it's never put to use, or never used properly for one reason or another, then who cares? In the end, potential means nothing if it goes unused.
But much of it will be.

I think, given the amount of time those PS3 developers had to work with dev kits that were far better than what X360 devs had for launch games, and given that they actually had some help from Sony in the form of better documentation and tools (vs. the PS2 launch), they sure are comparable.
Great, they're starting at comparable then.
If Xenon and Cell are working at the same point in their learning curves, I'm still right in the end. Unless Cell devs let the ratio of unutilized to utilized execution get more than 50% worse than Xenon's, Cell will offer more.

That's a big number. If Xenon in the end is utilized to 80% of its abilities, Cell's top titles must somehow fail to utilize 70%.

Considering how optimistic 80% is, it doesn't get better.
At 60%, Cell would have to fail to hit 40% utilization.

Cell is a pain, but I seriously doubt it's that horrendous.

Well, duh; given all the Cell hype, the cost, and how much later it came out, it damn well better be. But will it be superior enough to show a significant (i.e. equal to or greater than Xbox vs. PS2 graphics quality in games) difference in graphics/AI/whatever vs. X360 games without forcing devs to either have massive budgets and/or significantly longer development time and/or cut content out of their games? That's my question, and personally I'm very skeptical as to whether it is that powerful; and if it isn't, then given the amount of money, time, and effort they put into Cell, I'd call it a failure.
There's that money/non-technical aspect I have explicitly stated I don't care about. My argument is not about the eventual success of the platform, just a comparison of what the two CPUs have to offer, and that someday software will exist that will show the differences.

I don't care if it's just three big-budget titles, and it doesn't matter for the point I'm making if the developers lose their shirts in the process. They probably won't, since software development is just one part of the cost of producing games.

Oh come on, if embarrassingly parallel tasks are that difficult to get working on an architecture like Cell, then they've got some real problems, problems which have no easy answers despite decades of work by people who had far more human resources and FAR larger budgets than a lot of these game developers have to work with.
It's not that they are hard to get working. Cell's not exploding when they try to implement a cloth simulation.

It's that they are hard to get working close to peak performance numbers. There's not one architecture ever that didn't have this problem.

Both Cell and Xenon will have more issues with this than say, an Athlon (and it is very inefficient).
Cell is likely to have more issues than Xenon, but it offers so much more that it would have to be close to twice as bad in order to not hit parity.

I think you misunderstood, or I'm misunderstanding you; I was talking about differences in graphics between games on the two platforms, not market share, etc.
In your words, the success of Cell is determined by how widespread high-quality titles are for the platform.
I said that I don't care if Cell is a success, and that such things are equally or more determined by factors that have nothing to do with the technical aspects of the CPU.
The few AAA titles that will come out showcasing Cell's processing superiority to Xenon are really all I need to prove my point, and I'm very sure competent developers with smaller budgets will do very well, given enough time (a time frame I say is less than 5 years).
I've already made clear that such games would be a smaller subset of the market at large.

It's not a big bold statement on my part to say that Cell exceeds Xenon on most technical aspects, and that with enough time software development will utilize it well.

Oh, I'm sure there'll be more than one too, perhaps 5 or 6 AAA titles that really try to do everything possible to get as much as they can out of the PS3 throughout its lifetime. But if that is all there is, due to the difficulties in wringing performance out of Cell, then all that potential is going to spend most of its time being wasted.
We'll see. I'm not particularly convinced that there won't be enough accumulated knowledge and improved tools by then to get a pretty decent showing.

Oh come on now, are they incompetent if they can't make practical use of much of Cell's performance in that time frame?

I have faith in the ability of developers that they will improve utilization in five years compared to what is done now. The problems associated with Cell are not insurmountable, and it's probably better that devs get bloodied by asymmetric programming models now, since it's likely they'll be getting more of the same in the next console generation.
 
Why? You say that, when 360 developers had final hardware 8-10 months longer than PS3 developers and have significantly better SDKs?

Do you think PS3 developers have more skills or what do you base your assumptions on?
Those beta PS3 SDKs were fairly close feature-wise to the final hardware though (IIRC the biggest problem with them was the lack of the FlexIO bus, yes?), weren't they? And they had 'em for quite a while too, didn't they?
 
Those beta PS3 SDKs were fairly close feature-wise to the final hardware though (IIRC the biggest problem with them was the lack of the FlexIO bus, yes?), weren't they? And they had 'em for quite a while too, didn't they?
This feels a bit like nit-picking.

There were a great number of different alpha/beta configurations of the PS3 dev kits, with different graphics solutions, lack of FlexIO, Blu-ray, etc. I am sure that plethora of dev kits did not do much to simplify the lives of the developers. What I do remember is that the original beta kits were not delivered in the numbers originally planned for in the late fall of 2005.

Whatever; 360 developers have had more time with final, beta, and alpha dev kits, plus superior SDKs, than PS3 developers have had at this point in time.

Do you have some other arguments for why Cell should be exploited to the same level as Xenon in the current titles?
 
But much of it will be.
This sounds like a vague prediction open to interpretation.

Great, they're starting at comparable then.
If Xenon and Cell are working at the same point in their learning curves, I'm still right in the end. Unless Cell devs let the ratio of unutilized to utilized execution get more than 50% worse than Xenon's, Cell will offer more.

That's a big number. If Xenon in the end is utilized to 80% of its abilities, Cell's top titles must somehow fail to utilize 70%.

Considering how optimistic 80% is, it doesn't get better.
At 60%, Cell would have to fail to hit 40% utilization.

Cell is a pain, but I seriously doubt it's that horrendous.
Yeah, you sound awfully optimistic about things in general; pretty unreasonable IMO, given the direction towards commoditization that the game/console industry seems headed.

There's that money/non-technical aspect I have explicitly stated I don't care about.
Well then, that'd make your argument academic, wouldn't it? Do you know any developers that can ignore the money side of things (not just game sales, but the economics of time spent working on the title, what you can do that's practical, etc.)? If they can't get the performance without going broke doing it, then they won't bother, will they?

It's not that they are hard to get working. Cell's not exploding when they try to implement a cloth simulation.

It's that they are hard to get working close to peak performance numbers. There's not one architecture ever that didn't have this problem.
Well, so say I've got this CPU that is relatively straightforward, or even easy, to get stuff working on, but getting your given code to run fast enough requires so much effort that no one bothers to try, despite the massive potential performance available...

Both Cell and Xenon will have more issues with this than say, an Athlon (and it is very inefficient). Cell is likely to have more issues than Xenon, but it offers so much more that it would have to be close to twice as bad in order to not hit parity.
So potential performance trumps everything for a given workload, across all architectures?

In your words, the success of Cell is determined by how widespread high-quality titles are for the platform.
Well, no, that's not what I meant (sorry if I wasn't clear about that). High quality is pretty subjective, but things like better graphics/AI/physics/etc. tend to stand out in a much more obvious and clear-cut fashion, due to the hardware resources at hand.

It's not a big bold statement on my part to say that Cell exceeds Xenon on most technical aspects, and that with enough time software development will utilize it well.
It sure is a vague statement though, open to interpretation and to meaningless "proof" (i.e. GFLOPS) via corner cases.

I have faith in the ability of developers that they will improve utilization in five years compared to what is done now. The problems associated with Cell are not insurmountable, and it's probably better that devs get bloodied by asymmetric programming models now, since it's likely they'll be getting more of the same in the next console generation.
Faith based programming models are the wave of the future eh?
 
Maybe this is too far OT and I should start a new thread, but let's do an academic exercise to illustrate my point. Let's say you have two systems, A and B.

A has a very popular and well-understood architecture which is extremely developer friendly. Let's say, for the sake of argument, it has unified memory, a very PC-like GPU, a standard SIMD instruction set, a standard shader language, all the fixed-pipeline methods you expect, plenty of L1/L2 cache, is OOO, has a single core and a single hardware thread, and excels at single-threaded execution. It is based on a well-understood ISA, and so already has great compiler support. In general, devs already understand implicitly how to write code for the box and get great performance, but even if they don't, it still does very well. In general, it's very difficult to write poorly performing code for the box. (A sounds a lot like Xbox1, but that's not my intention; it's all hypothetical.)

B is a completely foreign architecture. It has NUMA, in-order execution, and uses a mesh of tiny cores to do grid computing with no cache, a small set of registers, maybe some local storage, and no branch prediction. There is no traditional CPU and no dedicated GPU, meaning no rasterizer, no fixed-function pipelines, no shaders, nothing. The ISA is completely new, and compilers must be written from scratch. Developers have never even conceived of writing a game on such a system; it's not even clear how to go about starting to write an engine on it.

The kicker is that B's theoretical peak performance is miles and miles above what A offers.

Now if you're a game developer, which would you rather develop on?
 
Maybe this is too far OT and I should start a new thread, but let's do an academic exercise to illustrate my point. Let's say you have two systems, A and B.

A has a very popular and well-understood architecture which is extremely developer friendly. Let's say, for the sake of argument, it has unified memory, a very PC-like GPU, a standard SIMD instruction set, a standard shader language, all the fixed-pipeline methods you expect, plenty of L1/L2 cache, is OOO, has a single core and a single hardware thread, and excels at single-threaded execution. It is based on a well-understood ISA, and so already has great compiler support. In general, devs already understand implicitly how to write code for the box and get great performance, but even if they don't, it still does very well. In general, it's very difficult to write poorly performing code for the box. (A sounds a lot like Xbox1, but that's not my intention; it's all hypothetical.)

B is a completely foreign architecture. It has NUMA, in-order execution, and uses a mesh of tiny cores to do grid computing with no cache, a small set of registers, maybe some local storage, and no branch prediction. There is no traditional CPU and no dedicated GPU, meaning no rasterizer, no fixed-function pipelines, no shaders, nothing. The ISA is completely new, and compilers must be written from scratch. Developers have never even conceived of writing a game on such a system; it's not even clear how to go about starting to write an engine on it.

The kicker is that B's theoretical peak performance is miles and miles above what A offers.

Now if you're a game developer, which would you rather develop on?

Depends on how much of a challenge you as an individual like to undertake.
As a developer, either way you're still going to get paid, so it's not too big of a problem, even considering the second option.

It's funny how your theoretical A & B sound a lot like Xbox and PS2; look what happened there in terms of developer support. :oops:
 
Depends on how much of a challenge you as an individual like to undertake.
As a developer, either way you're still going to get paid, so it's not too big of a problem, even considering the second option.

It's funny how your theoretical A & B sound a lot like Xbox and PS2; look what happened there in terms of developer support. :oops:
But outside forces (economics...) were the deciding factor there. Xbox was unproven and they had to make money, so what were they to do? I doubt there was any developer anywhere that didn't prefer working on games for the Xbox vs. the PS2.
 
No, that pretty much sums 'em up. Got any other rebuttals, other than "feels like nitpicking"?
Nope, I am pretty happy with my last argument: PS3 developers will never catch up with the time the 360 developers have spent with alpha, beta, or final dev kits.

I think Cell has more mileage left than Xenon, for two simple reasons: Xenon has less computational power than Cell, and the initial threshold for tapping into Xenon's power is lower.

I think 3dilettante summed up the situation pretty well.
 
This sounds like a vague prediction open to interpretation.
If some software right now on average forces Cell to execute 30% of its peak FLOPs, a similar project in the future will use 30%+.

If a game now has 10 zombies, a similar game later will have 15 of the same zombies.

Yeah, you sound awfully optimistic about things in general; pretty unreasonable IMO, given the direction towards commoditization that the game/console industry seems headed.
Right, Xenon is somehow immune to these trends, and middleware is incapable of working to a high degree of effectiveness on Cell.

Well then, that'd make your argument academic, wouldn't it? Do you know any developers that can ignore the money side of things (not just game sales, but the economics of time spent working on the title, what you can do that's practical, etc.)? If they can't get the performance without going broke doing it, then they won't bother, will they?
There are enough large developers with big franchises that can do this. They have enough brand recognition to make it profitable.
It's not like a game costs 1000x as much to make for Cell as it does for Xenon. Development of software is only one slice of the pie.

I have said this repeatedly: in the next two software generations for the platforms, the AAA big-money platform exclusives will be technically superior on the PS3.

That is the only thing I am arguing. You are trying to interpret this as my saying the PS3 or Cell will succeed, or that one platform is somehow better. I am saying that those games will have more features, or more of each feature, than the equivalent on the other platform.

Well, so say I've got this CPU that is relatively straightforward, or even easy, to get stuff working on, but getting your given code to run fast enough requires so much effort that no one bothers to try, despite the massive potential performance available...
Your assumption is that the effort is excessive. Incremental gains carry between projects; people don't forget how to program on Cell when they finish one project and go on to the next.

Neither Xenon nor Cell is easy to program for. Cell is harder, but nobody who's had a look at it has said it's to the point that people run screaming into the night.

The next question is whether it is relevant to my argument, or whether anyone outside of the programming group is going to care how hard it is to develop.
The publishers don't care. The game buyers don't care.

Let's say there are two games.
Game A took X amount of dollars and effort to utilize its CPU.
Game B took X*2 dollars to reach the same level of utilization on a CPU that is significantly stronger.

Because of this, Game B has better physics, better graphics, and better AI.
Why should a gamer care that it took longer to develop for B, or that the developer had twice as many brutal long nights?
What matters is that he can buy B, and make fun of the guy who bought Game A, who obviously has a much smaller e-penis.

So potential performance trumps everything for a given workload, across all architectures?
It does when it is better utilized. Can you give me a reason why none of it will be utilized in the next 5 years?

It sure is a vague statement though, open to interpretation and to meaningless "proof" (i.e. GFLOPS) via corner cases.
Fine, in 3 years:
If there is a ninja fighting game exclusive on both platforms, one will have more flowing cloth, more ninjas, and better physics.

Guess which platform it will be on.


Faith based programming models are the wave of the future eh?

Let's find a list of architectures anywhere that failed to have improved utilization with the passage of time.
The first abacus had better utilization with the passage of time.

Then, imagine a situation where somehow having more utilizable resources cannot in any way yield an improvement.

My assumption is that historical trends will continue into the future.
You're arguing for the developmental equivalent of the End of Days.
 
If some software right now on average forces Cell to execute 30% of its peak FLOPs, a similar project in the future will use 30%+.

If a game now has 10 zombies, a similar game later will have 15 of the same zombies.
I'm gonna hold you to that estimate.

Right, Xenon is somehow immune to these trends, and middleware is incapable of working to a high degree of effectiveness on Cell.
Of course it isn't, and of course better or even great middleware could be made available for Cell, but middleware ain't a silver bullet that solves every problem, or even most of them; and Xenon is easier and more straightforward to work with than Cell, yes?

There are enough large developers with big franchises that can do this. They have enough brand recognition to make it profitable.
We'll see how well they do, then; I have no faith in your predictions.
It's not like a game costs 1000x as much to make for Cell as it does for Xenon. Development of software is only one slice of the pie.
It's one of the larger, if not the largest, pieces for most titles though, right? It certainly has a big impact on things; so much so that the difficulties can't be ignored, correct?
You are trying to interpret this as my saying the PS3 or Cell will succeed, or that one platform is somehow better.
But by saying that Cell is technically superior isn't that exactly what you are doing?
Your assumption is that the effort is excessive. Incremental gains carry between projects; people don't forget how to program on Cell when they finish one project and go on to the next.
There are some issues, though, that for the foreseeable future cannot be gotten around by experience alone, aren't there? (In a recent interview Carmack mentions how inherently fragile multithreaded code is, for instance.) Certainly there's no easy approach to working on something like Cell, right?
Neither Xenon nor Cell is easy to program for. Cell is harder, but nobody who's had a look at it has said it's to the point that people run screaming into the night.
No, they haven't done that, but there has been the occasional comment to the effect that they were worried about getting things up and running on Cell in a timely and effective fashion.
The next question is whether it is relevant to my argument, or whether anyone outside of the programming group is going to care how hard it is to develop.
The publishers don't care. The game buyers don't care.
No, they don't care about any of that stuff, but they do care about end results (is the game good? how does it look? how much gameplay is there? etc.), and the development difficulties can (will?) have a significant impact on those end results, won't they?
It does when it is better utilized. Can you give me a reason why none of it will be utilized in the next 5 years?
None?! I'd have no idea why they couldn't get ANY performance out of Cell over the next year, much less 5, but I also have no idea why I should believe that higher potential automagically translates into higher performance.
Fine, in 3 years:
If there is a ninja fighting game exclusive on both platforms, one will have more flowing cloth, more ninjas, and better physics.
Quite a bit more, or just a little? If we can't see the difference, who cares, right?
Guess which platform it will be on.
I'm stumped, which is it?
Let's find a list of architectures anywhere that failed to have improved utilization with the passage of time.
The first abacus had better utilization with the passage of time.

Then, imagine a situation where somehow having more utilizable resources cannot in any way yield an improvement.

My assumption is that historical trends will continue into the future.
You're arguing for the developmental equivalent of the End of Days.
No, I don't believe I've said anything like that at all. My argument is/was:

"But will it be superior enough to show a significant (ie. equal to or greater than Xbox vs. PS2 graphics quality in games) difference in graphics/AI/whatever vs. X360 games without forcing devs to either have massive budgets and/or significantly longer development time and/or cut content out of their games?"
 
Of course it isn't, and of course better or even great middleware could be made available for Cell, but middleware ain't a silver bullet that solves every problem, or even most of them; and Xenon is easier and more straightforward to work with than Cell, yes?

I think it'll be easier to get something up and running on 360, but getting the best out of it will prove just as difficult as Cell, if not more so.

On Cell you divide the tasks among the SPEs; they load data into the LS and work on it there. The tuning is making sure serial parts are not harming performance, making sure everything is vector friendly, and making best use of the LS by streaming/buffering.
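
The streaming/buffering part of that tuning is classically done with double buffering: start fetching chunk N+1 into one buffer while the kernel works on chunk N in the other, so transfer latency hides behind compute. A minimal sketch of the pattern (plain Python standing in for SPE code; `fetch` and `process` are hypothetical stand-ins for an asynchronous DMA get and a vector kernel, not any real Cell SDK calls):

```python
# Generic double-buffering sketch: overlap the "DMA" of the next chunk
# with processing of the current one. All names here are illustrative.

def fetch(source, index, chunk):
    """Stand-in for an async DMA get into local store."""
    return source[index * chunk:(index + 1) * chunk]

def process(buf):
    """Stand-in SPE kernel: here, just double each element."""
    return [2 * x for x in buf]

def stream(source, chunk=4):
    out = []
    n_chunks = (len(source) + chunk - 1) // chunk
    current = fetch(source, 0, chunk)      # prime buffer A
    for i in range(n_chunks):
        nxt = fetch(source, i + 1, chunk)  # kick off "DMA" into buffer B
        out.extend(process(current))       # work on buffer A meanwhile
        current = nxt                      # swap buffers
    return out

print(stream(list(range(10))))  # each element doubled, in order
```

On real hardware the win comes from the fetch genuinely overlapping the processing; this sketch only shows the buffer-swap structure.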

The X360 may be easier to get running due to being similar(ish) to a conventional processor, but there are additional issues that will crop up when trying to get the best out of it. There is only one L2, shared between six threads; developers will have to ensure threads don't go kicking out each other's data.

The X360 has numerous features to get around this, though: I/O buffers which don't write to cache, cache locking, and the ability to stream data directly to/from the L1.

Xenon has its own share of complexities, and that's on top of sharing RAM bandwidth with the GPU and maybe even worrying about tiles.

It's pretty complex stuff, Cell does not have a monopoly on complexity!
 
Of course it isn't, and of course better or even great middleware could be made available for Cell, but middleware ain't a silver bullet that solves every problem, or even most of them; and Xenon is easier and more straightforward to work with than Cell, yes?
That doesn't make Xenon easy to work with, though.
If peak performance didn't matter, why even make a next gen console?

Its one of the larger if not largest pieces though normally for most titles right? Certainly has a big impact on things, so much so that the difficulties can't be ignored correct?
On average, it's something around ~25% or less of the total budget; asset creation is another biggie.
There was a pie chart or something a while back. It's hard to say, because it's likely highly variable between projects and game types.

But by saying that Cell is technically superior isn't that exactly what you are doing?
DEC's Alpha was a technically superior processor compared to most of the architectures that outlasted it. It was beating many of them for years after it was discontinued.
Being technically superior does not trump the market.

There are some issues, though, that for the foreseeable future cannot be gotten around by experience alone, aren't there? (In a recent interview Carmack mentions how inherently fragile multithreaded code is, for instance.) Certainly there's no easy approach to working on something like Cell, right?
Xenon is multithreaded. It doesn't escape Carmack's reservations, and he's going to get over his problems with multithreading or he's going to retire, because we're stuck with it.

No, they don't care about any of that stuff, but they do care about end results (is the game good? how does it look? how much gameplay is there? etc.), and the development difficulties can (will?) have a significant impact on those end results, won't they?
And the fact the dev team took an extra six months to program for Cell affects this how?

None?! I'd have no idea why they couldn't get ANY performance out of Cell over the next year, much less 5, but I also have no idea why I should believe that higher potential automagically translates into higher performance.

Since you agree that they'll utilize it more than they do now:

Let's just make up some handwavy numbers, and assume, for the sake of argument, that Cell has 1.5x Xenon's peak number of operations per second, because I don't want to do too much math to restate my point for the third time.

Year:                          1      2      3      4      5
Xenon, % of peak utilized:    60     70     75     77     82
Cell,  % of peak utilized:    50     58     60     65     68

Cell's realized performance (Xenon's peak = 1.0):
                             .75    .87    .90   .975   1.02

I'm betting I've hobbled Cell quite a bit, since it's likely that overall performance in year one should be closer to 1 already.
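
Restated as code, with the 1.5x peak ratio kept as the same for-the-sake-of-argument assumption, the realized-performance row falls out of the two utilization rows (up to rounding):

```python
# Recompute the handwavy table: Cell's realized throughput, in units of
# Xenon's peak, is (assumed peak ratio) x (Cell's utilization fraction).
PEAK_RATIO = 1.5  # assumed Cell peak / Xenon peak, for the sake of argument

xenon_util = [0.60, 0.70, 0.75, 0.77, 0.82]  # fraction of peak, years 1-5
cell_util  = [0.50, 0.58, 0.60, 0.65, 0.68]

cell_realized = [round(PEAK_RATIO * u, 3) for u in cell_util]
print(cell_realized)  # [0.75, 0.87, 0.9, 0.975, 1.02]
```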

Quite a bit more, or just a little? If we can't see the difference, who cares, right?
It depends on whether you lose the ability to count ninjas; they are pretty sneaky that way.

"But will it be superior enough to show a significant (ie. equal to or greater than Xbox vs. PS2 graphics quality in games) difference in graphics/AI/whatever vs. X360 games without forcing devs to either have massive budgets and/or significantly longer development time and/or cut content out of their games?"

In other words, you've purposefully limited your argument to exclude the portion of games I was discussing, and inflated the scope of the argument to apply what I've said to things I've specifically said my points do not apply to.

AAA big money exclusives are the ones where this will be apparent.
 
One of the problems with these arguments is that they rely strongly on FLOPS. There are cases where a problem lends itself to easy resolution through parallelization and/or a SIMD architecture. And isolating a problem that dominates, e.g., 80% of your execution time by parallelizing it frees up resources for other core tasks that remain serial.
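
That 80% figure is Amdahl's Law territory: the overall speedup from accelerating a fraction p of the runtime by a factor s is 1/((1-p)+p/s), so even unlimited parallel hardware can't beat 5x while the other 20% stays serial. A quick sketch (the six-way figure below is only an illustrative assumption, not a claim about actual SPE counts):

```python
# Amdahl's Law: overall speedup when a fraction p of execution time
# is accelerated by a factor s (the remaining 1-p stays serial).
def amdahl_speedup(p, s):
    return 1.0 / ((1.0 - p) + p / s)

# Offloading the dominant 80% to, say, six parallel units (illustrative):
print(round(amdahl_speedup(0.80, 6), 2))    # 3.0
# Even with infinitely fast parallel hardware, the ceiling is 1/(1-p):
print(round(amdahl_speedup(0.80, 1e9), 2))  # 5.0
```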

And while the SPEs offer a lot of power, they are not a panacea for all performance ills. If that were the case, the PS3's Cell would have two more SPEs instead of the PPE, and we would be using things like GPUs for CPUs. Things like memory coherence and branching are significant wins in many situations. There are going to be places where a problem maps well to the SPE architecture and can be parallelized, and in those situations Cell will absolutely destroy Xenon.

But from the devs I have talked to, there is a big gulf between that and reality, the reality being that you have to be selective about what you put on the SPEs, because if you aren't, your performance is going to suck horribly. Pretty big disparity between "suck horribly" and "absolutely destroy Xenon". Both are true; it just depends on what your game design lends itself to.

PS3-exclusive games that have game designs driven by the technology (like how id Software designs games around their technology roadmap) will absolutely demonstrate the power of Cell. But the reality is that a processor is part of a system, and that the market is much more multiplatform. We saw the EE, a much more difficult chip, get significantly exploited because its massive install base (random fact: Sony exceeded 24M PS2 units after 18 months on the market, 12M of those in the US; Sony exceeded the 5-year total Xbox and GCN install bases, respectively, in less than 2 years, so basically, before the Xbox and GCN launched, Sony had already exceeded their total individual hardware sales) created an environment that not only supported de facto exclusives, it also strongly encouraged multiplatform titles to use the PS2/EE as the lead SKU and made it financially worthwhile to invest in the architecture. Cell isn't nearly as bad as the EE, but the PS3 isn't nearly in the position the PS2 was, either. This will be a factor in how much the PS3 is pushed.

Another factor is that there is already a lot of talk about using Cell to do other tasks to compensate for other areas of the design, especially for multiplatform titles. There was already a long thread here about vertex work and the disparity between the two consoles. Another suggestion is using SPE resources to do framebuffer scaling. It is nice to have a flexible and powerful CPU that can be utilized for many various tasks, and that is part of the platform's design, but it does emphasize that these are design decisions made at the platform level, and they should be regarded as such.

And ultimately the industry is dictated by economics, as well as by simple things like lead SKU (see PS2 and Xbox, where the latter rarely got to strut its stuff for this very reason), and by realities like dev teams usually having 2 years to get a title out the door on a tight budget with a limited number of senior programmers who can work efficiently in a parallel environment, be it Cell or Xenon. It really isn't as simple as a FLOP or core count to determine who has the best design. It comes down to the people they have (experience, skill), the resources available (time, money), and how well they are able to map their game design decisions to the hardware. If it takes too long, or is too expensive, too difficult, etc., it won't happen.

And I think that is the crux of the PlayStation 4 / Xbox 3 question. I don't think it will come down to raw performance at all. I have heard enough developers say the opposite.

Besides the reality of making baby steps in regards to solving the problems of parallelizing game design, next gen will see even bigger budgets and even more people touching the game code. Expectations for graphics, sound, animation, AI, physics, and gameplay will only be higher. The successful console next-next generation will be the one with a workflow that brings the greatest utilization of resources to developers. Some of these solutions will need to come in the unsexy areas of the hardware as well. Just reading Intel's information on Tera-scale shows how much thought was put into I/O and memory management. It is one thing to make an 80-core processor with a high peak theoretical FLOP count; it is another to have high real-world utilization, especially over a large spectrum of code (obviously something Tera-scale isn't oriented toward). Parallel computing on the scale we will see next gen is going to require significant strides in memory controllers and on-chip communication. These things don't show up on peak-FLOP PR sheets. Seeing quotes like this from John Carmack ("In an ideal world PlayStation 3 will be more powerful, but for the vast majority of the cases, you'll be able to effectively exploit more power from the 360" or
"I do sweat about the fragility of what we do with the large-scale software stuff with multiple programmers developing on things, and adding multi-core development makes it much scarier and much worse in that regard.") and hearing the same from other developers, I don't see how things will change next generation: the emphasis will be on getting the most end product in a finite period of time. The solutions many developers are looking for at this point are production related: platforms that allow them to get the most out of the hardware within fixed constraints.

Next-next gen (PS4 and Xbox 3) it seems MS and Sony are going in different directions to a degree. Sony is using Cell as their platform, which has some instant benefits. We should see some killer software day 1, and see existing techniques that are not victims of Amdahl's Law implemented quickly with impressive results. Some of the complaints and frustrations felt now will be resolved by the hardware, others by experience and known solutions and areas to avoid. Having a stable platform should also allow Sony to create software tools and solutions that maximize development on their platform. MS doesn't have anything close to a hardware platform; even on the graphics side it is DX. I don't foresee them going with IBM, at least not a solution like Xenon (which will not scale well at all). Whatever direction they go, I think it will be with something well known and approachable, and they will probably augment the design for any perceived deficiencies (especially in PR). Just as Xenon is 3 PPC cores with an investment in heftier VMX units (and this wasn't even MS's first choice; they wanted OOOe CPUs), I think we will see them jump on either the AMD or Intel bandwagon. So we could see an AMD design that uses something like Fusion for the marketing flops, or even a GPU-heavy design (with tools developed to offload tasks like physics primarily to the GPU), or an Intel design with something like a Terascale "co-processor". Whatever MS does, it will be with the mindset of bang-for-buck from the software side. Going with something like Cell, where you have "hands on" Local Stores and more narrowly focused processors, is not in their playbook, I would guess. While Sony is breaking the 34-68 processor mark, I have a hard time seeing MS breaking 20 (minus a specialized, very narrowly focused chip).
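To put a number on the Amdahl's Law point above, here is a minimal Python sketch of the formula (the function name and the example core counts and parallel fractions are my own, chosen only for illustration):

```python
# Amdahl's Law: speedup of a workload where only a fraction p of the
# work can be parallelized across n cores. The serial fraction (1 - p)
# caps the speedup no matter how many cores you throw at it.
def amdahl_speedup(p: float, n: int) -> float:
    """Speedup = 1 / ((1 - p) + p / n)."""
    return 1.0 / ((1.0 - p) + p / n)

# A half-serial workload barely benefits from 8 cores, while a mostly
# parallel one does much better:
print(round(amdahl_speedup(0.50, 8), 2))  # -> 1.78
print(round(amdahl_speedup(0.95, 8), 2))  # -> 5.93
```

This is why techniques that are already embarrassingly parallel (particle systems, skinning, post-processing) should map onto Cell quickly, while code dominated by a serial fraction won't.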

Which is better will totally depend on the developer and project, and I think this is equally true this generation. Each will excel in its own areas, with some overlap. Depending on market forces and install bases, the platform that gives the best, most consistent result will be the one with the best overall market approach and support, not hardware (although the hardware will play a not-so-insignificant role in determining which platforms developers support).
 
AAA big money exclusives are the ones where this will be apparent.

This is true, but I think as platforms go, exclusives from both camps will show where their hardware designs were the best choices. Unless of course you believe that in every game design case Cell would be a better processor for an exclusive game (longer dev time, larger budget, etc).
 

I think there will be a trend where games that have the money and dev time to massage the systems will tend to be technically more feature-laden, or have more of a given feature, if they are on Cell. There will be exceptions and checkboxes one platform can tout over the other, but the general trend will be one where the more robust platform becomes clear.

The game type does matter, of course. An RPG in the vein of Disgaea with similar art direction or any game that by design is not hardware-limited has no reason to differentiate based on a console's power, but these won't be the AAA big-money titles.
 