The Future of the PlayStation

I think the idea is more like VHS or DVD. A standard that everyone develops for. Like PC hardware even, and that was a direction MS were talking about. Future Xboxes wouldn't necessarily be a fixed hardware platform, but a hardware standard. If anyone's going to go that route, it'd be MS, who want to tie the two sides of the industry together. Well, three sides really: PC, Media PC and games console.

In that case my bet is that MS will join up with AMD.
 
I think the idea is more like VHS or DVD. A standard that everyone develops for. Like PC hardware even, and that was a direction MS were talking about.
I wish I could find it, I can't recall who it was (J. Allard?) but it was an old interview in some mag when the first Xbox was coming out.

Basically, he lamented the state of the console industry with "exclusives" and different platforms - the ideal future would be just like you mentioned - DVD or VHS. You simply pick up a game and put it into your "console". He admitted this could be many generations off, but it definitely seemed like that's the direction MS would like (and myself as well).
 
How many people are going to buy a more expensive PS3+ for a few niche titles? Surely the whole point of a PS3+ is to enable a better experience in Madden, R:FoM3, and the mainstream titles hardcore gamers buy hardware for? Not to mention the fact that to use more resources on the whole requires more expenditure. PSN titles are mostly smaller titles that don't use PS3 completely. It's hardly a deployment service for the most taxing software with the most assets. For that, with huge amounts of data, you have discs that are more viable. 25GB downloads aren't going to be popular and mainstream for a while yet!
Oh I thought you meant games that can't run on current PS3. Big boys can just port a new PC game with better frame rate and more effects (PS3.5 version: high, PS3 version: low) and that works.
 
Well, I wouldn't like it. I'm quite pissed off with Windows being the standard. I'd rather that file formats and some basic readers and such were standard, and we had more popular platforms that do different things. This would allow for more radical innovation, and push technology forward at a much more efficient rate. Right now, the Web has allowed it to some extent, and it's taking us there a little, but I'd definitely have preferred it on a lower level as well.
 
I wish I could find it, I can't recall who it was (J. Allard?) but it was an old interview in some mag when the first Xbox was coming out.
With XB360 there was talk of a PC standard. MS were offering XNA as a single development platform (and prior to that, DirectX), though no-one took them up on the offer either time. There was even an interview about a single software standard, kinda like 3DO. Whoever-it-was in the interview was talking about a possible performance rating scale, where a game would be badged 'Requires Gaming Performance 5 or greater' and you'd buy hardware ranked at different numerical scores, with a console coming in at a suitable entry-level price and performance.

I've never been able to find this interview again though!
 
The way the PS3 is struggling right now, why would Sony risk changing the paradigm, whether it's unproven computer architecture or business models?

Segmenting the installed base is counterintuitive. If anything, they should be looking to push costs down, not make an Elite SKU.

As for server-based models, there's been talk about thin clients for a decade now. Hasn't happened yet.

Plus, what stores are going to sell that hardware for you if they aren't allowed to sell software, which is where the money is? That is partly why Sony is distributing a Blu-ray SKU of Warhawk in addition to the PSN download.

To not antagonize the retail channel.

What if Warhawk became a big hit and it was a PSN download only? There would be backlash.

That is why there will be Blu-Ray SKUs of GT5 Prologue, Little Big Planet and the rest.
 
The way the PS3 is struggling right now, why would Sony risk changing the paradigm, whether it's unproven computer architecture or business models?

Segmenting the installed base is counterintuitive. If anything, they should be looking to push costs down, not make an Elite SKU.

As for server-based models, there's been talk about thin clients for a decade now. Hasn't happened yet.

Plus, what stores are going to sell that hardware for you if they aren't allowed to sell software, which is where the money is? That is partly why Sony is distributing a Blu-ray SKU of Warhawk in addition to the PSN download.

To not antagonize the retail channel.

What if Warhawk became a big hit and it was a PSN download only? There would be backlash.

That is why there will be Blu-Ray SKUs of GT5 Prologue, Little Big Planet and the rest.

Also, there's still a lot of people without broadband to download games. But I totally get and agree with what you're saying.
 
Well, I wouldn't like it. I'm quite pissed off with Windows being the standard. I'd rather that file formats and some basic readers and such were standard, and we had more popular platforms that do different things. This would allow for more radical innovation, and push technology forward at a much more efficient rate. Right now, the Web has allowed it to some extent, and it's taking us there a little, but I'd definitely have preferred it on a lower level as well.

Push which technology though?

At the end of the day a standard hardware spec doesn't have to encapsulate additional services/connections/peripherals (online gaming) which can be used to enrich games or just provide additional functionality.. This would be the core means for the hardware vendors to differentiate their own product..

Plus in terms of technological advancement, the standard wouldn't ever have to iterate so quickly like it does currently.. But could maintain true 10 year life-cycles (akin to the transitions from SDTV to HDTV or DVD to HD Optical Disk for examples of longer market life-cycles..)
The console would still provide sufficient power for developers to tap and consistently squeeze more out of to give the user deeper and richer experiences (plus don't forget that there are advances in areas like AI, animation and cinematography which require time to develop good implementations more than they ever require more and more graphics processing power..)
Plus it would benefit middleware providers, who could develop and maintain extremely sophisticated, feature-rich and mature tools and solutions for developers to use, which would rarely ever have to be thrown out and rebuilt from the ground up the way things are currently...

If you're only really into consoles purely out of a desire for consistently cutting-edge graphics, however, then you're probably better off buying a high-end PC.. & once we reach the point where the largest distinguishing factors [of quality] lie in art direction and how much AA a title has (we're practically there already to be honest..) then you'll know full well that bringing more and more processing power to the table becomes irrelevant with regards to creating a marketable and appealing product..
 
If you're only really into consoles purely out of a desire for consistently cutting-edge graphics, however, then you're probably better off buying a high-end PC..
But this is entirely the problem; it's a wholly different platform. It's not as simple as saying "I really value graphics, so I'm going to spend $1200 on a PC instead of a $400 console", because some of the best looking games (with respect to art direction) may not be available on the PC regardless of how much you spend.

The utopian ideal is to simply pay for performance, and be able to run any game that's within your budget. That's not a possibility now, if you want to have access to all games you're getting 3 consoles + a PC.
 
The way the PS3 is struggling right now, why would Sony risk changing the paradigm, whether it's unproven computer architecture or business models?

Segmenting the installed base is counterintuitive. If anything, they should be looking to push costs down, not make an Elite SKU.

As for server-based models, there's been talk about thin clients for a decade now. Hasn't happened yet.

Plus, what stores are going to sell that hardware for you if they aren't allowed to sell software, which is where the money is? That is partly why Sony is distributing a Blu-ray SKU of Warhawk in addition to the PSN download.

To not antagonize the retail channel.

What if Warhawk became a big hit and it was a PSN download only? There would be backlash.

That is why there will be Blu-Ray SKUs of GT5 Prologue, Little Big Planet and the rest.
They are not mutually exclusive.

(local) PS3 -> various PS3 configurations
(network) PSN -> PS4

This 10-year evolution is a very soft approach because of software compatibility. Anyway, you can see even today a segregation exists: you can't play Warhawk or enter Home if you don't have network access. Making the network service richer is the way to prolong the life of PS3, and retailers can sell PS3 software longer. As the network becomes richer, the price of the basic PS3 goes down. When advertisement revenue from Home kicks in, there will be more incentive to sell PS3 at a cheaper price for a greater install base. This makes today's PS3 a thin client after 5 or more years, relatively speaking.

Some of the local clients will get a spec update; games will run better, web browsing will be smoother, Linux operation will be faster. One day, PS4 servers will become the mainstream and you'll have a $50 PSP in your hand, or 3D glasses. But that's a far-off future, when Blu-ray dies. If Blu-ray (or other local media) is immortal, there's another future: PS4 will be a PS3 with more SPEs and maybe PS Eye. But isn't it just the same conclusion as the one above?

But this is entirely the problem; it's a wholly different platform. It's not as simple as saying "I really value graphics, so I'm going to spend $1200 on a PC instead of a $400 console", because some of the best looking games (with respect to art direction) may not be available on the PC regardless of how much you spend.

The utopian ideal is to simply pay for performance, and be able to run any game that's within your budget. That's not a possibility now, if you want to have access to all games you're getting 3 consoles + a PC.
Agreed, there must have been people who wanted to run Shadow of the Colossus at a better frame rate and were willing to pay money. Even though there was a demand, PS2 couldn't answer it. In this generation, PS3 has a proper OS as an abstraction layer. It's no different from the new PSP-2000, whose loading is faster. Some people may appreciate better web browsing on it, but I believe the main driver of the new PSP is game-related interest. (The form factor is also game-related, as you hold it to play!)
 
Push which technology though?

At the end of the day a standard hardware spec doesn't have to encapsulate additional services/connections/peripherals (online gaming) which can be used to enrich games or just provide additional functionality.. This would be the core means for the hardware vendors to differentiate their own product..

Which is my point exactly. Have a hardware standard for a long time, and allow software to get the best out of it, rather than spending energy adjusting to new hardware all the time, which forces software to higher levels of abstraction that compromise performance. Just think about it. Say there are 50,000,000 consoles out there of a particular type, and you owned all of them. If you want better performance, are you then going to spend $50 per console to improve the performance of a certain game, amounting to $2,500,000,000 spent? Or are you going to hire some kick-ass programmers for less than a thousandth of that cost and improve your software? In my opinion, the fact that the $50 may be shared expenses across the 50,000,000 console owners doesn't matter - it's still darn inefficient. I'd rather pay $1 to the programmer than $50 on the hardware.
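The arithmetic above can be sketched in a few lines. All of the figures (50,000,000 consoles, a $50 hardware improvement, a software budget of a thousandth of that) are the post's own hypotheticals, not real numbers:

```python
# Back-of-the-envelope comparison: a hardware bump applied across an
# installed base vs. investing a fraction of that in better software.
# Every figure here is a hypothetical from the discussion above.

def hardware_upgrade_cost(installed_base: int, cost_per_console: float) -> float:
    """Total cost of improving the hardware of every console out there."""
    return installed_base * cost_per_console

installed_base = 50_000_000
per_console = 50.0  # hypothetical $50 hardware improvement

hw_total = hardware_upgrade_cost(installed_base, per_console)
print(f"Hardware route: ${hw_total:,.0f}")   # Hardware route: $2,500,000,000

# The alternative floated in the post: less than a thousandth of that
# spent on programmers who squeeze the same gain out of the software.
sw_budget = hw_total / 1000
print(f"Software route: ${sw_budget:,.0f}")  # Software route: $2,500,000
```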

If you look at PC gaming though, this is a lot like what happens. You pay more for hardware that you get less performance out of. And sure, you may be able to get more cutting edge as a result, but not a whole darn lot. In contrast, witness the PS2's software development and improvements, resulting in stuff like Resident Evil 4, God of War 2, GT4, Final Fantasy XII and so on, on 6 year old hardware, etc. Sure it doesn't all look as hot as the latest PC, but I think you get my point.

Plus it would benefit middleware providers, who could develop and maintain extremely sophisticated, feature-rich and mature tools and solutions for developers to use, which would rarely ever have to be thrown out and rebuilt from the ground up the way things are currently...

But the same thing goes here as above. They too are getting more out of the hardware as time progresses.

If you're only really into consoles purely out of a desire for consistently cutting-edge graphics, however, then you're probably better off buying a high-end PC..

Again, that quite simply depends on how much money you wish to spend. Say that I buy a console once every 5-6 years. Then I pay about $100 a year on hardware. Good luck with that over the same period of time with a PC. So for the buck, I'm getting a lot more cutting edge graphics. ;) It's all where you want to take it.
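The cost-per-year point above reduces to a simple division. The $500 console price, 5-year lifespan, and 2-year PC refresh cycle below are illustrative assumptions, not quoted figures:

```python
# Rough "dollars per year of gaming hardware" comparison from the post:
# a console bought once every 5-6 years vs. a high-end PC refreshed
# more often. Prices and lifespans are assumptions for illustration.

def cost_per_year(price: float, lifespan_years: float) -> float:
    """Amortised hardware spend per year of use."""
    return price / lifespan_years

console = cost_per_year(500, 5)       # roughly the "$100 a year" in the post
gaming_pc = cost_per_year(1200, 2)    # assumed high-end PC refresh cycle
print(console, gaming_pc)  # 100.0 600.0
```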

& once we reach the point where the largest distinguishing factors [of quality] lie in art direction and how much AA a title has (we're practically there already to be honest..) then you'll know full well that bringing more and more processing power to the table becomes irrelevant with regards to creating a marketable and appealing product..

We're not there by a long shot. And technological advances will go in more than just the 2D realm. Eventually we'll get 3D, and before we have power enough to create believable worlds in full 3D, I think we'll be 10 years further. And even then, I'm convinced there will be new advances in technology. Maybe neurological control, maybe even bi-directional. There will be expensive stuff, and cheap stuff, and it will be all over the place. We'll get more and more different game machines, and we'll get different software platforms. Maybe eventually games will end up like different file formats, but some of those formats will always be limited to certain cutting-edge or specialised platforms, even if it is perhaps only a timed exclusive. ;)
 
Sure it's better for publishers.. but what about developers?

At the moment publishers recognise the costs associated with middleware licenses (see UE3.0), which may currently be a viable solution for a single team and single project looking to release cross-platform, but this solution is still far too expensive for publishers to buy and distribute to all the developers they contract.. Then there's developing their own middleware, which works out even more expensive..

Basically, as a result of these high development costs, you have fat publishers too scared to invest in projects other than those which either have already been heavily invested in (by the developer, if they have the cash..) to establish a proof of concept (with much of the technology in place) so its financial and market viability can be assessed, or are established franchises or licensed IP..

Most publishers won't even touch you if you're a small-scale development house with a big vision and no history (i.e. hit titles under your belt..), and with the costs of staying competitive being as high as they are nowadays you're either forced to do outsourcing work (if you can find it) or target a smaller niche (e.g. handheld..) where you can get your products out the door cheaply and quickly (albeit with just as small returns..)

I'm not saying a unified hardware specification standard would be particularly simple or straightforward to introduce in the near future, but in the end something needs to be done to help the current situation change for the better and encourage an industry that is far more lucrative across the board & far less volatile than the one we have today..

If content is the heart and soul of the industry then surely content developers need our lives made easier, not made more and more difficult by continually increasing technical complexity (iterating every few years) and reduced/segregated financial viability.. :cry:

Yeah, sure the developers wouldn't mind having one single development environment for the low-level development. But I do believe that diversity is a good sign of a healthy marketplace. The fact is that all the current-gen consoles contain IBM CPUs; I don't think you should ask for more than that (such as completely binary-compatible CPUs or GPUs).

I don't take it for granted that it will be the case in the next generation of consoles. If Apple was able to move away from IBM, I am sure MS, Nintendo and Sony can move their console business to another CPU if they wanted to, though Sony most certainly has too much invested in the Cell venture with Toshiba and IBM. MS and Nintendo certainly don't want to be too dependent on IBM; they must have other serious options to be able to negotiate good IP licensing deals when they are developing their next generation of consoles. Sony is in the same situation with regard to the GPU and Nvidia.

Concerning the small game developers with big plans, their situation is no different from that of creators in other creative businesses: you either have to start small and prove yourself, or be good at sweet-talking to get the financial backing. Actually I got the impression Epic offer pretty flexible license deals for their engine if multi-platform is crucial for your plans; they are in a pretty competitive environment and do not by any means have a monopoly on middleware engines. I believe all of the big players have their own engines in development to a varying degree; even if they may not use them in the end, it may be good for their negotiations with middleware developers.
 
Do you think that Cells with 8 working SPEs will eventually find their way into PlayStation 3? I mean the yield will probably go way up when the 65 nm transition is completed, and I don't know what kind of device would need 8-SPE Cells in such high volumes; IBM will probably be using their Cells with enhanced DP for their HPCs.

Such an upgraded Playstation 3 could maybe be offered already by Christmas 2008 perhaps with some minor upgrades such as dual HDMI and some more RAM as I earlier suggested?

that sounds too much like the oft-rumored
PlayStation Type C / PSX Type C upgrade and standalone console, which was supposedly a PS1 with more video RAM, a 4x CD-ROM (the standalone version) and possibly a faster CPU to bring it up to the specs of Namco's System 12 board that ran Tekken 3. It never happened.

an upgraded PS3 by 2008-2009 is not going to happen either, IMO. Sony will no doubt go with a very significantly more powerful console, even if it's similar to the PS3 architecture: a CPU with largely increased performance (2-4 upgraded PPEs, 32 or more upgraded SPEs, improved memory access, better XDR RAM and more of it), as well as a much more powerful Nvidia GPU based on G100 (NV60) or beyond, hopefully with eDRAM. A faster Blu-ray drive with support for 200 GB discs, etc. Even this could possibly end up being a smaller leap over PS3 than PS3 was over PS2, but it's more than a modest "PS3.1" upgrade.
 
Saturn never had a "multi-core CPU". It had lots of CPUs. Kinda different.

EDIT: Oh err.. that post was centuries old... sorry, it's been a while...

Saturn had a crap-load of processors, 8 or 9 of them:

*SH-2
*SH-2
*SH-1 (CD-ROM controller)
*SCU (Saturn Control Unit) with on-chip DSP for geometry processing and a DMA controller
*VDP1 - graphics
*VDP2 - graphics
*Motorola 68EC000 - sound controller
*Yamaha FH1 DSP - sound processor
*Hitachi MCU / (SMPC) (System Manager & Peripheral Control)
 
As well as a much more powerful Nvidia GPU based on G100 (NV60) or beyond, hopefully with EDRAM. A faster Blu-ray drive with support for 200 GB discs, etc.

I think the current 50GB Blu-rays should still be OK. I'm guessing that the first person at Sony who suggests a $599 list price will get fired on the spot. $399 seems like the smarter target for PS4. Going with that, they could probably skip the BD-200's. If they really had to, they could just ship some two-disc games. That would still give a game 100GB of space (!!!).

I'd definitely expect a unified-architecture video card this time around. I wonder if they will really stick with Nvidia... if they aren't bound by contract to do so, it seems like they would want to shop around on that one. I agree, hopefully they go with an eDRAM approach; that screaming-fast memory has some killer advantages. Although hopefully they will be smart enough to include enough in there to not have to tile (yes 360, I'm scowling at you!).

I'm not expecting them to go crazy esoteric on PS4. PS3 gave them some hard-learned lessons, most notably that price is king. So simply put, I'm thinking PS4 will have the same capacity Blu-ray drives (faster loading though), perhaps two or three of the next flavor of Cell, a unified-architecture video card with an eDRAM-esque design, and a built-in HDD of around 160GB-250GB.

I think that would be enough for them in ~2011. They don't need to be the most powerful box on the block to win out; they just need to get to market at <= $399, and have enough tools/support to make devs' lives easier. The PlayStation brand will likely still be strong in 2011, they just need to capitalize on it. They should also consider getting to market before the Xbox 720, or at most launching at the same time.

Saturn had a crap-load of processors, 8 or 9 of them:

Our PS3 lead shipped some Saturn games. He seems to talk about that console with both extreme pain and pride. Apparently coding for Saturn made coding for PS2 look easy ;)
 
For someone who isn't a coder, what would this mean for performance?

Unified shader architecture is much more flexible than the old traditional GPU architectures: you can use whatever ratio of pixel/vertex shaders you want, unlike on traditional GPUs.

Because each pipe can do anything, be it pixel or vertex, you have no shaders sitting idle when they are not in use, like with traditional GPUs. Basically, what it means is that in real-life cases you can get a very similar level of performance at a significantly lower cost.
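A toy model makes the idle-shader point concrete. The unit counts and workload figures below are made up purely for illustration; real GPUs schedule far more cleverly than this:

```python
# Toy model of fixed vs. unified shader pools. With dedicated pixel and
# vertex units, whichever pool a frame doesn't need sits idle; a unified
# pool can be split in whatever ratio the frame demands.
# All unit counts and workloads are invented for illustration.

def fixed_throughput(pixel_units, vertex_units, pixel_work, vertex_work):
    # Each dedicated pool can only service its own kind of work.
    return min(pixel_units, pixel_work) + min(vertex_units, vertex_work)

def unified_throughput(total_units, pixel_work, vertex_work):
    # A unified pool assigns any unit to whatever work exists.
    return min(total_units, pixel_work + vertex_work)

# A pixel-heavy frame: 30 units of pixel work, only 2 of vertex work.
print(fixed_throughput(24, 8, 30, 2))   # 26 -> 6 vertex units sit idle
print(unified_throughput(32, 30, 2))    # 32 -> every unit is busy
```

The same total of 32 units does more work per cycle in the unified case, which is the "similar performance at lower cost" argument: a smaller unified pool can match a larger fixed-split design.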

As far as eDRAM goes, given that it's big enough, it greatly reduces the need for high memory bandwidth (which costs a lot of $$$), and it can also give you high levels of AA for a very small performance loss. (X360's Xenos has eDRAM, but unfortunately it's too small to fit interesting resolutions (and AA levels) without tiling, so it requires a lot of work.)
 
Is there a possibility that eDRAM could become cheap enough in quantity to replace most of the SDRAM in a console?

Same question, different wording:

Since eDRAM is embedded on the chip itself, and you say bandwidth is expensive, Ostepop, could we reach a point where a useful amount of eDRAM would cost the same or less?

Beyond that question, and regarding previous questions of cost, I believe that since most of the groundwork has been done with PS3, future PlayStation consoles will undoubtedly be cheaper.
 
Is there a possibility that eDRAM could become cheap enough in quantity to replace most of the SDRAM in a console?

No.

The upcoming consoles will have, if history holds firm, 2GB-4GB of total memory. Seeing as 1GB GPUs are already on the market, it doesn't seem unlikely that in 3-4 years this will happen.

Assuming, best case scenario, that when the next consoles ship that the chips are on the 32nm process. That is an 8-fold density improvement. So with approximately the same die space as Xenos' daughter die, we could see a 80MB eDRAM chip for roughly the same price as the 360's eDRAM.

Depending on silicon budgets they may be able to go up to 128MB (enough footprint for a framebuffer to hold a 1080p 4xMSAA FP16 buffer without tiling).
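The framebuffer sizing above can be sanity-checked with simple arithmetic. This sketch assumes an FP16 RGBA colour sample (8 bytes) plus a 32-bit depth/stencil sample (4 bytes) per MSAA sample; real hardware layouts may pack things differently:

```python
# Rough framebuffer size calculator for the eDRAM discussion above.
# Assumes 8 bytes per FP16 RGBA colour sample and 4 bytes per
# depth/stencil sample; actual hardware formats may differ.

def framebuffer_mb(width, height, msaa, color_bytes, depth_bytes):
    """Size in MiB of a multisampled colour + depth framebuffer."""
    samples = width * height * msaa
    return samples * (color_bytes + depth_bytes) / (1024 ** 2)

# 1080p, 4xMSAA, FP16 colour: comfortably under the 128MB figure.
print(round(framebuffer_mb(1920, 1080, 4, 8, 4), 1))   # 94.9

# Xenos-style case: 720p with 2xAA and 32-bit colour already overflows
# a 10MB scratchpad, which is why the 360 has to tile.
print(round(framebuffer_mb(1280, 720, 2, 4, 4), 1))    # 14.1
```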

Of course you have your render targets, all the new spiffy buffers (G-Buffers, A-Buffers, etc) that devs are using... and still need a place to store all your textures and geometry.

Remember, the eDRAM on the 360 also has limitations. It is a scratchpad that is tied to the ROPs. You cannot texture from it or other tasks. Adding the ability to texture or other tasks increases the complexity and size (cost) of the chip.

I do wonder, in general, if the eDRAM on future consoles will have a large enough footprint that some more complexity could be added to it, namely as a local memory store for the GPU, or possibly as an asset shared with the CPU. While not huge, 128MB of extremely fast memory may open up some neat techniques. But this is pie in the sky, and frequently "smarter & elegant" approaches work better than brute-force solutions. Your design needs some flexibility, but the design choices need tangible benefits.

This is one reason I am not sure how long we will have to wait for classic RT solutions -- they are extremely simple to set up and allow a lot of artistic control (a simple renderer that allows more artistic access and fewer pre-defined assets and layers could be a big win), but the visual return from RT is a lot lower than from most raster solutions. We may see some hybrids, but my point is that any design choices in the consoles will most likely be the result of proven approaches, or ones that have been tested and with a refined design become usable.

On that note I think MS and Sony both did pretty good jobs with their current consoles. There isn't a lot of waste.
 