PlayStation Platform Standardization

Lazy8s

Veteran
Though the last topic got locked without ever fully exploring the issue, I think the possibility of Sony choosing standardization makes for interesting consideration.

As a market changes, a platform needs to grow more appealing to at least some of its demographics in order to not lose share to competitors. If the platform has spread across a wide range of demographics, introducing that change through an incompatible architecture reinvention while trying to retain enough of those demographics may not be feasible. Such is the case with the PC industry: computers are used for so many different purposes, from entertainment to medical research to business operation, by so many different audiences, from enthusiasts to scientists to business people, that trying to get them all to switch to something completely new just isn't practical. So, the PC market stays with its existing platform and demographics and introduces change instead as upgrades built onto the standard already in place. And while this explains the standardization of the PC industry, Sony's situation in the console market will probably only remain different for so long.

The PlayStation platform has never spread significantly outside of the entertainment sector. Thus, the logistics of targeting and moving just that single demographic, especially one as accepting of advancement as enthusiasts, over to a new, incompatible architecture are more manageable.

Sony's confident they can recapture their audience each generational cycle because they actually have introduced the incentive of standardization at the level where it really matters for the console industry - the consumer. The PlayStation brand name, the ability to use old games and peripherals, and the familiarity of franchises have carried over between their platforms thanks to backwards compatibility and smart marketing strategies. Not having standardization at the developer or retail level doesn't matter as much because those communities naturally follow where the consumer money goes.

With it being logistically feasible to move enough of their demographics over to a new architecture, and with the confidence that they could succeed in doing so, deciding whether to reinvent their platform each time comes down to how badly Sony wants to fix the failures of the previous one and introduce radically new ideas. While they could still choose platform standardization to maintain a familiar development environment for the dev community, Sony, following the precedent set successfully before in the console industry, has created a new architecture every generation.

So... what if Sony's ultimate aspirations for CELL, or any future applicable platform of theirs, pan out as the architecture for a networked, pervasive computing era? Their platform would have to be spread across many of the demographics the PC now captures, like business, research, science, networking, finance, and commerce... Sony would then be facing the same logistical nightmare as the PC industry if they tried to introduce a new architecture after that. Would Sony then have to standardize their platform and evolve it through upgrades, essentially just swapping CELL in place of x86 and having it become legacy baggage for future generations? Or could they still find a way to introduce a different, revolutionary, and superior architecture every five years or so? Or is CELL so forward-looking, upgradeable and adaptive that it would be more than enough to work with until the next major market revolution occurred?
 
Also, as I didn't get a chance to continue before the other thread got locked over arguing, I wanted to respond to some of the misconceptions that kept dragging off-topic:

Fafalada:
Hybrid UMA is the term Sony used to describe the "failed" design mentality of PS2.
The failure isn't in the memory architecture; it's in the games that don't natively support non-interlaced output and/or lack certain texture effects and properties that games on the older and less expensive DC have. I meant it in relative terms, for the PS2 design certainly has its good qualities too.

london-boy:
OK, so me saying "looks miles better" is somehow less of a fact
Literally speaking, "looks miles better" isn't a fact at all.

The technically accurate definition of a fact is a statement which can be proven either true or false. What you gave was an opinion, a statement which is of personal taste and is, by nature, neither right nor wrong.

I commented on your phrasing for its inherent contradiction - starting off with "in fact", and then proceeding to give an opinion.

What I was trying to explain was how there is no absolute standard for comparing graphics. If, as it is for me, ugly rendering such as comparatively warpy textures, washed-out colors, flaky stability, and flicker kills the beauty of something more than slightly less rounded edges and less geometric detail does, then IQ plays a large part in which graphics are better. That's what it comes down to; it's illogical to state that one set of graphical elements is more important than another in absolute terms when there's no absolute way to compare them.
than u saying "After seeing their PS2 outperformed in texturing and image quality by older and less expensive tech like Dreamcast". and that is because all u say is FACT and all i say is not...?
I backed my statement up with the facts -

DC can natively output the highest quality signal, VGA, and its library of games outputs in the necessary proscan as a standard to support it... not just in 5% of the titles. Hardware is most definitely a factor, as PS2 devs have often field-rendered their graphics - even in top-tier games like Jak and Daxter - making them incompatible with proscan (which requires full-frame rendering, the standard on DC). PS2 devs have also often used a lower 448-line vertical resolution compared to the standard 480. In good games, too, like MGS2, GT3 and many others.

Why make such changes? Because they wanted to go out of their way to make their graphics worse? No, it was to save memory resources. DC devs didn't have to do that. And the DC also has naturally good flicker filtering and outputs beautifully through standard cables.
we're discussing "hardware capabilities" here aren't we? the hardware is "CAPABLE" of running games in progressive scan.
Capable does not always mean practical with every game. In the case of field-rendered games like J&D, the hardware is not capable of running it in progressive scan because the front buffer is half-height. I'm sure Naughty Dog would've run J&D in a full sized front buffer if they could've, but it apparently was more practical to save the memory resources with that concession. And this was the case with J&D, still one of the more impressive PS2 games around.
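
For a rough sense of the memory that concession actually saves, here's a back-of-the-envelope sketch (the resolution and color depth are illustrative assumptions on my part, not figures from Naughty Dog):

```python
# Illustrative sketch: approximate size of a full-height front buffer versus a
# half-height (field-rendered) one. The 640x448, 32-bit setup is an assumed
# example for the arithmetic, not a confirmed J&D configuration.

def buffer_kb(width, height, bytes_per_pixel):
    """Raw size of a single display buffer in kilobytes."""
    return width * height * bytes_per_pixel / 1024

WIDTH, HEIGHT, BPP = 640, 448, 4

full_frame = buffer_kb(WIDTH, HEIGHT, BPP)       # full-height buffer, proscan-capable
half_field = buffer_kb(WIDTH, HEIGHT // 2, BPP)  # half-height field buffer

print(f"Full-height front buffer: {full_frame:.0f} KB")
print(f"Half-height field buffer: {half_field:.0f} KB")
print(f"Saved by field rendering: {full_frame - half_field:.0f} KB")
```

Roughly half a megabyte freed per buffer is a meaningful chunk of PS2's 4 MB of embedded video memory, which is presumably why the trade was considered worth it.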

It's like Le Mans on DC - the hardware is capable of anisotropic texture filtering, and it was used on the necessary textures in that game. That doesn't mean it's practical to think every DC game could've also used anisotropic filtering.
in fact they (all newer consoles) can run better looking games than most DC games.
You did it again here - "run better looking games" is a matter of opinion on what you like, yet you prefaced the opinion with the phrase "in fact". Literal contradiction and illogical.

As for your point, I of course agree with it in many ways.
the fact that the "option" of VGA is not on those consoles does NOT make the hardware LESS CAPABLE. it's a software issue.
Field-rendered games can't be output non-interlaced for proscan or VGA. Trying to enable proscan for them is not just a software issue... it traces back in part to the hardware and to why they were being field-rendered in the first place.
again, if "some games" can run in progressive scan, then "all games" can potentially run in progressive scan. the hardware is there, the option is not. the hardware is capable of doing it, the software doesnt support it. is it so hard to grasp?
That ignores the individuality of game engines and the flexibility of the hardware. It's like saying Shenmue could've potentially run at 60 fps with self-shadowing because Sonic Adventure 2 did on the same DC hardware. If Shenmue's graphics were scaled to Sonic Adventure 2's graphics, then yes - it could've... and then it would be a Sonic Adventure 2 graphical clone and not the game it was.

Within the scope of what the hardware offers you, you have to make compromises to get certain results. You can't have everything with every type of graphics engine.

But sure - all games could potentially run in proscan on PS2, until you start making choices which make it impractical (concessions that even the best of devs have made).

Phil:
You did say "After seeing their PS2 outperformed in texturing and image quality by older and less expensive tech like Dreamcast" - which isn't quite correct. If anything, perhaps on average, the games are being outperformed in IQ by the older hardware - not the hardware itself. You don't measure the potential of a hardware by the average or the weakest performer but by the best, as we wouldn't want to blame the console manufacturer for sloppy/lazy programming, would we?
You can't gauge a hardware's potential for something based on just its best examples. As I did with my analysis, you have to take into account all the games, and here's why:

Graphic design is a balancing act. Taking an advantage somewhere means giving a little something up somewhere else in compromise. No one balance is any more right or wrong than another. For instance, both Primal and Baldur's Gate: Dark Alliance are top-of-the-line examples of PS2 programming, each achieving a different look. Primal achieves proscan, while the amazing supersampled FSAA used in Dark Alliance requires a half-height front buffer on PS2, so the game consequently cannot be run in proscan or on displays which require proscan, like native VGA.

So, even among the PS2's best, like Baldur's Gate, the potential for proscan can't be consistent, as it had to be traded away to use the game's FSAA. Because all games are the results of compromises, a console must be judged by its full spectrum of games and not just selective examples.

And if you're judging by theoretical potential, both Dreamcast and PS2 are capable of outputting HDTV resolutions, supersampled FSAA, and all that... whether or not it's practical. Technically, though, DC has more potential since there's space for a more robust frame buffer with its 8 MB of display memory.
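
To put a rough number on that headroom, here's an illustrative calculation (the 640x480, double-buffered, 32-bit setup with a 16-bit Z-buffer is an assumed example, not any particular game's configuration, and the two memory pools aren't used identically in practice - PS2 can stream textures from main RAM, for instance - so this only shows raw buffer headroom):

```python
# Illustrative sketch: how much display memory a basic double-buffered
# 640x480 setup would occupy, compared against each console's video memory.
# Buffer layout and bit depths are assumptions made for the arithmetic.

WIDTH, HEIGHT = 640, 480
color_buffers = 2 * WIDTH * HEIGHT * 4   # front + back buffer, 32-bit color
depth_buffer = WIDTH * HEIGHT * 2        # 16-bit Z-buffer

used_mb = (color_buffers + depth_buffer) / (1024 * 1024)

print(f"Buffers total:                {used_mb:.2f} MB")
print(f"Headroom in DC's 8 MB VRAM:   {8 - used_mb:.2f} MB")
print(f"Headroom in PS2's 4 MB eDRAM: {4 - used_mb:.2f} MB")
```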
 
CTRL-F revealed 'DC' used 10 times (1 was a quote) in a PlayStation thread :(

edit: guess I should clarify that it's not really in good taste to bring up the DC so much (which is always a sensitive issue) in a topic revolving around Sony and the PS brand.
 
zurich:
CTRL-F revealed 'DC' used 10 times (1 was a quote) in a PlayStation thread
It wasn't mentioned a single time in my topic post about PlayStation Platform Standardization, as other consoles had nothing to do with it.

I prefaced my second post with the disclaimer that it was going off-topic because it addressed those off-topic responses others gave me for stating the facts in the last Standardization topic.
 
I'm sure Naughty Dog would've run J&D in a full sized front buffer if they could've, but it apparently was more practical to save the memory resources with that concession.
Well, you'll be happy to know that the sequel definitely runs in full frame buffer and looks even better in every aspect than the first one.

I think your line of reasoning is ultimately very different from what most people use. Most people ignore poor-looking games and concentrate on those that look the best. Using the average as a comparison would be valid if those best-looking games weren't often the best-playing too. Potential or not, if you compare the best-looking games in any genre on PS2 with the best-looking games in that genre on DC, you will find that there's pretty much nothing left on DC that compares favourably.
 
Sony would then be facing the same logistical nightmare as the PC industry if they tried to introduce a new architecture after that. Would Sony then have to standardize their platform and evolve it through upgrades, essentially just swapping CELL in place of x86 and having it become legacy baggage for future generations?

What kind of baggage do you think it will put on future generations?


Or could they still find a way to introduce a different, revolutionary, and superior architecture every five years or so? Or is CELL so forward-looking, upgradeable and adaptive that it would be more than enough to work with until the next major market revolution occurred?

Well, with no product available, we can't really tell. But from what they've promised, it's the second option - that is, if they deliver on the promise.
 
Lazy, first and foremost it must be said that your post is, quite honestly, a breath of fresh air in this forum, as it's not only built around a solid core topic, but its points are well articulated and genuinely intriguing. So, well done.

My personal response would be focused on two distinct areas, these being:

  • The necessitated inter-generational legacy baggage
  • Cell's long-term impact on SCE/Sony Group and how it affects the conditions set in point 1.
Thus, the underlying question becomes one that addresses the impact Cell will have on companies like Sony Group and Toshiba *1 and on their forward progress along the current status quo of rolling out new architectures. Fundamentally: will Cell become another x86?

And it's a question that I'm not comfortable answering at this point with what I know - too many unknowns, too many uncertainties. My first thought is to look for historical parallels that could become a general guide for speculation, but that's impossible in this case. Sony basically has no precedent, and has never faced this situation before.

As Kunitake Ando said:
Sony is buying nearly $8.4 billion worth of ICs annually.

"Less than 20 percent of them are internally produced. We purchased even core devices that differentiate products. If such devices are fabricated internally and the percentage of internally procured devices goes up twice, Sony's semiconductor strategy will change greatly," said Ando.

So, any speculation is just that - blind stabbing in the dark.


My personal thoughts (if anyone cares... didn't think so :)) are that Cell will probably become something along the lines of an x86 for the living room. And as such, it will scale forward in time - with marginal improvements as seen in x86 - as outlined by Suzuoki's Cell patent. But this acceptance isn't as profound as one would imagine when it comes to the power computing seen in PlayStation-caliber devices.

What I mean by this is that by the next iteration of PlayStation, even though the current architecture appears very flexible and open-ended in its FP usage and its low-latency/high-bandwidth local storage - which looks to be where the future of high-end graphics is heading (as foretold by the likes of SGI and others) - the basic architecture will probably be very outdated in 2010*2.

So, our digital Cell will age, mature and ultimately die just as a biological one does. But what will be passed on isn't a direct lineage, but rather its essence: the capability for broadband connectivity and the sharing of resources over broadband. Perhaps there will be an intermediate step where a lithography advance allows for a PS3 IC in PS4, just as we've seen in the jumps from PS -> PS2 and PS2 -> PS3. Or perhaps we'll see a future IC with ideological roots in Cell. Or perhaps it'll be totally new and alien to us. But this is really irrelevant in the greater picture and the greater ambition... one which is much bigger than Cell.

PlayStation's desire and need for bleeding-edge computing won't die with Cell, it'll just add another [manufacturing] facet to it.

*1
Takeshi Nakagawa said:
We expect to apply Cell to a wide range of applications related to broadband networks, including digital consumer electronics and mobile terminals…the Cell development project is proceeding as planned.

*2
Computing in 2010 and Beyond said:

And thanks to Jose, who makes all thought so much clearer.
 
I do not think that Cell as we know it, as an APU ISA, will die in 2010... that would be a waste of money.

It would be akin to Intel developing IPF and changing to a totally different instruction set 5 years after Itanium is introduced to the market.

As long as Cell 2.0 in 2010 can understand the Apulets Cell 1.0 works with, that is all they need.

Going to 45 nm and then 32 nm there is space to increase the number of PEs per chip, the amount of e-DRAM, the clock-speed...

There is space, even using Cell 1.0, to evolve the technology and make it MUCH faster... still, I understand that we did have to change micro-architecture going from the Pentium Pro family to the Pentium 4 family, so I am not opposed to that sort of change, but we need to make Cell a long-living platform if we want it to be as widespread as we plan and if we want Cell to have a chance of bringing pervasive computing into the living room (and have other companies license the architecture and so on).
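
To put rough numbers behind that scaling argument, here's a purely illustrative sketch - ideal area scaling, the 90 nm starting node and the PE count are my assumptions, not actual Cell specifications:

```python
# Illustrative sketch of ideal lithography scaling: transistor density grows
# roughly with the square of the feature-size ratio, so a fixed die area could
# hold correspondingly more PEs / e-DRAM. Starting node and PE count are
# assumed for illustration only.

base_node_nm = 90   # assumed starting process node
base_pe_count = 8   # assumed number of PEs on the first-generation chip

for node_nm in (65, 45, 32):
    density_gain = (base_node_nm / node_nm) ** 2
    print(f"{node_nm} nm: ~{density_gain:.1f}x density -> "
          f"room for ~{int(base_pe_count * density_gain)} PEs in the same die area")
```

In practice clock speed, power, and yield complicate this, but it shows why process shrinks alone leave a lot of room to grow a Cell 1.0-compatible design.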
 
I guess I am not completely grasping this thread. It was interesting to read; perhaps it is a bit beyond me.

The OP was talking about a PlayStation standard - not just consoles, but a PlayStation standard for everything in computing. Instead of Intel/MS/Nvidia/ATI pieces we would have Sony/Toshiba/IBM pieces.


And that with PS4 we would have CELL 2.0, much like going from a Pentium 2 to a Pentium 4.
 
...

To Lazy8s

So... what if Sony's ultimate aspirations for CELL, or any future applicable platform of theirs, pan out as the architecture for a networked, pervasive computing era?
The flaw with the above statement is that SCEI and Sony are separate entities with different ideas (Sony builds a handheld, Kutaragi kills it; SCEI releases a DVD-playing PSX2 at a price point below Sony's own DVD players), and that Sony doesn't really care for Kutaragi's visions, other than the cash flow SCEI brings in.

Would Sony then have to standardize their platform and evolve it through upgrades, essentially just swapping CELL in place of x86 and having it become legacy baggage for future generations?
I will almost guarantee you that Kutaragi will have different ideas about the future of entertainment in a couple of years, abandon CELL, and come up with something "better". This is Kutaragi Ken's nature.

To Vince

Will Cell become another x86?
Not a chance, unless Kutaragi is willing to open up CELL free of charge. The only computing standard that has built a credible alternative to x86/Windows is Linux. And Linux got there by being free and open-sourced. CELL will not get there unless SCEI is willing to GPL CELL.

My personal thoughts (if anyone cares... didn't think so) are that Cell will probably become something along the lines of an x86 for the living room.
Snowball's chance in hell since Matsushita doesn't care, Toshiba doesn't care, Samsung doesn't care, I don't think even Sony headquarters cares. No one will care unless CELL becomes free and open-sourced.

Anyhow, the real question is whether the original PSX1 architecture could have been extended to provide improved performance and functionality without breaking code compatibility. The answer is yes, but the Japanese don't seem to care about backward compatibility anyway, largely because their platform policy is driven by hardware guys and executive types and not software guys. This was acceptable during the 16-bit to 32-bit transition because there was a fundamental shift in programming paradigm, but the current generation of consoles are just faster versions of the 32-bit consoles and required no programming paradigm shift, so developers would have been better off extending PSX1 instead of reinventing the wheel. But I doubt Kutaragi sees the value of extending an existing architecture, since he is an electrical engineer who is clueless about the software development process anyway.

If you are expecting Kutaragi Ken to shape the future of computing, forget it. He is not the man. He has neither the education nor the "correct" vision to pull it off.
 
If you are expecting Kutaragi Ken to shape the future of computing, forget it. He is not the man. He has neither the education nor the "correct" vision to pull it off.

Typical Deadmeat.. :rolleyes:

And about SCE and Sony Corp. being such separate entities and Sony not caring about what SCE wants and needs, I think you are in for a rough wake-up, Deadmeat... SCE is getting more and more influential in Sony and the whole semiconductor R&D centers of all Sony corp. sub-divisions are being centralized in a single one and the back-bone of this consolidated structure seems to be, yeah you guessed it, SCE...

Look at the funding Ken Kutaragi manages to get from Sony Corp.; he is the only top manager at Sony that international investors still have a lot of faith in, he is seen as a very promising leader for Sony and perhaps the best candidate for the CEO position at Sony Corp.

He has quite some influence in Sony Corp.'s operations, and it is a good thing... look at what several anti-digital, businessman-type leaders made of Sony Electronics and Sony Music.

They made something all-right, they made Samsung the giant it is today.
 
Heh, for a people who don't care about backwards compatibility, they sure went out of their way to implement near-perfect PS1 compatibility in PS2. Isn't that the first time it's been done in the console world, too?
 
PS2 being compatible with the PSX is the first time a console has been backward compatible in, say, a decade or so, but it's certainly not the first time ever.

NEC SuperGrafx (PC-Engine 2) was compatible with PC-Engine games
Genesis was compatible with Master System games
Atari 7800 was compatible with Atari 2600 games


there are other examples too.


The Super Famicom *was* going to be compatible with the Famicom, but that feature was removed, although there were meant to be adapters so that Famicom games could be played on SFC.
 
...

SCE is getting more and more influential in Sony and the whole semiconductor R&D centers of all Sony corp. sub-divisions are being centralized in a single one and the back-bone of this consolidated structure seems to be, yeah you guessed it, SCE...
Kutaragi's influence is directly proportional to the profit SCEI brings in. Should PSP and PSX3 falter, Kutaragi is a goner....

Heh, for a people who don't care about backwards compatibility, they sure went out of their way to implement near-perfect PS1 compatibility in PS2.
I meant the backward compatibility from the developer's point of view and not consumer's point of view. Developers would have loved to recycle their tools and engines from PSX1 era.
 
Re: ...

Thanks for raiding a good discussion, ass:

DeadheadGA said:
Snowball's chance in hell since Matsushita doesn't care, Toshiba doesn't care, Samsung doesn't care.

Read my post again jackass:

  • "We expect to apply Cell to a wide range of applications related to broadband networks, including digital consumer electronics and mobile terminals…the Cell development project is proceeding as planned." - Takeshi Nakagawa, a senior vice president with Toshiba

DeadmeatGA said:
I don't think even Sony headquarters cares.

See here:
  • TOKYO — Sony Corp. and its Sony Computer Entertainment Inc. (SCEI) unit on Monday (April 21) announced a $1.7 billion (¥200 billion) investment plan for 65-nm process technology on 300-mm wafers.
  • Sony's investment will be beneficial not only to SCEI but to the whole Sony group, said Kunitake Ando, president and Sony Group COO. Ando said Sony is buying nearly $8.4 billion worth of ICs annually.
  • "This announcement by SCEI and Sony is a confirmation of the progress we've made with the Cell design itself, of our advances in semiconductor technology to help it reach its full potential and of Cell's far-reaching implications for a wide variety of applications." - Dr. John Kelly, senior vice president and group executive for the IBM Technology Group

You know what, I don't even feel like replying to the rest of this biased horseshit. Can we ban this guy already for his obsessive-compulsive behavior relating to Kutaragi? It borders on the fringe; get help.
 
Re: ...

DeadmeatGA said:
I meant the backward compatibility from the developer's point of view and not consumer's point of view. Developers would have loved to recycle their tools and engines from PSX1 era.

Who the hell cares? Sony is the de facto console, developers will come regardless. What they did is good business for a variety of technological, advancement, and business reasons.

Their job isn't to take away from the developer's job. This is very simple: PS2 is ~40M ahead of the other consoles combined. They have approaching 70% of the marketplace and are projected to reach 100 million by 2005. Either conform or go develop for the Cube (which will be dead in a year) or the XBox.

Seriously, your constant ranting concerning developers is sickening me. Game developing/programming is their job. They get paid for a reason; earn your damn salary.

You don't see the rest of us complaining that our patients come in too fucked up and that they shouldn't have sheared off their appendages in a car accident. Or that the child-molester a lawyer is defending is too guilty or too crazy. It's your job, this is what you were given, fucking do it.
 
...

To Vince

Read my post again jackass:
Doesn't this constitute a personal attack?

Can we ban this guy
You just made a case for your own ban. Nice knowing you, Vince; watch your language wherever you go. Throwing flames only burns your own hand.

Who the hell cares? Sony is the de facto console, developers will come regardless.
It was only a decade ago that Nintendo was the de facto console with a 90% market share. It took them only 10 years to be driven out of the console market which they created.

Either conform or go develop for the Cube (which will be dead in a year) or the XBox.
Nintendo used to say the same. Do not piss off developers, as they do have means of fighting back.

Seriously, your constant ranting concerning developers is sickening me.
Why would developers be sick of hearing my "legitimate" concerns???
 
,,

You may have noticed that my criticism is solely directed to PSX2 only. I do not criticize PSX1, Xbox, and GC because these are fairly well-balanced machines with no obvious design flaws or weaknesses.
 
Re: ...

DeadmeatGA said:
It was only a decade ago that Nintendo was the de facto console with a 90% market share. It took them only 10 years to be driven out of the console market which they created.

And Sony has nothing in common with Nintendo other than both being dominant at a point in their lifetimes. Sony advances, they adapt, they strive for more - Nintendo doesn't. They're the quintessential praxis of your preaching for short evolutionary steps that favor the developer, preferring to keep the established status quo and just evolve it. Easy on developers - but nobody wants it.

It's companies such as Sony (or Microsoft to an extent), who strive for revolution or rapid advancement by living on the edge and pushing the envelope with their designs, that succeed. Rough innovation and advancement is preferable to the calm normalcy and leisurely evolution that you want.

Why would developers be sick of hearing my "legitimate" concerns???

HA!
 
Re: ,,

DeadmeatGA said:
You may have noticed that my criticism is solely directed to PSX2 only. I do not criticize PSX1, Xbox, and GC because these are fairly well-balanced machines with no obvious design flaws or weaknesses.

And this shows just how pointless your criticism is. PS2 is from 2000; a comparable platform to your 'balanced design' of an XBox would have been a Pentium II 450 with a GeForce2 [NV15] and 32MB of PC133/DDR266.

If you can't see that PlayStation 2's design is superior to this, then there's no use arguing with you. Although, IMHO, the very fact that in 2003 we can sit here debating how PS2's visuals [e.g. SH3, KZ, MGS3] compare to the GeForce4+ [NV25+] powered XBox tells you something.
 