End of Cell for IBM

He referred to the ICE Team earlier in the thread, and it doesn't negate at all the point being made. Which for my part was that for the cost of one of the early effort 'exclusive IPs,' putting money towards easing developer access across the broader industry would have been money better spent.

Sony also acquired the devtool vendor SN Systems, and added a list of partners to its dev suite (for free access). They made PhyreEngine too.

The job of easing developer access across the broader industry may belong to IBM. Sony only takes care of the gaming industry, and Toshiba the CE space. Chubachi was tasked with examining the use of Cell in other Sony businesses, but came back empty-handed. For external efforts, IBM needs to charge a premium for its involvement (but the PS3 undercuts the Cell blades), and Sony needs to recoup its own investment (so the PS3 Linux effort was stunted). Toshiba made SpursEngine but ultimately spent more $$ on HD-DVD market development than on Cell market development.

In general, R&D happens "slower" in real life. Perhaps Sony could have dedicated more resources to support 3rd party game development. Then again, the Sony teams needed to learn Cell programming from scratch too. And the fastest and safest vehicle for doing that was first party titles.
 
I don't disagree that 1st party development is the best 'laboratory' in terms of coming up with best practices and novel tools for an architecture, but whether it should have been IBM, Toshiba, or whoever doing tools development as part of their own commitment to the architecture, the fact is that Sony had and has always had the most exposure - and Sony being a little ambivalent towards tools provision at the outset is no new thing. I said it like I meant it; sacrificing LAIR for a mass tools effort pre-launch would in my mind have been the better move between the two.

I think Sony's tool effort as it stands today is quite fine, and I'm personally a fan of the forks and variations.
 
I said it like I meant it; sacrificing LAIR for a mass tools effort pre-launch would in my mind have been the better move between the two.

That's in hindsight though. Many people thought Factor 5 could deliver a superb game and technology platform given their track record. A lot of things could have been learnt or reused from that project if things had gone well. But it didn't pan out.

What mass tools effort did Cell need (that was not in place, or not in place early enough)?

I thought the problem was the (small-scale, or outright lack of) market development and the PS3 Linux conundrum. Those issues can't be solved by technical tools. Within the gaming space, I think Sony could have supported third party developers better (more aggressively).
 
That's actually an interesting option. Cell is meaty enough for future game processing requirements. It's just being used to bolster RSX at the moment. Couple it with a good GPU and you'll actually have a capable platform that's BC. Sure, it won't be cutting edge performance, but as a viable economy option next-gen, it looks good to me. Developers will already have the tools and code and the experience, unlike trying to program new architectures. The GPU will work the same as all the others. It'd be a relatively straightforward system. The other alternative would be a straight multicore x86, which would be simpler to program but with lesser performance, and probably difficult hardware cost management (read Intel charging a huge markup!).

Much as I'd like a Cell2 64 SPU monster, a smaller improved Cell makes considerable sense in the cost-conscious future.

I'd bet it would match PC game graphics if done well. Would Cell bottleneck a high-end GPU like the 5870 (the bottlenecking that happens on PC is that the GPU is waiting on the CPU to complete a task, and vice versa, right?), or would the system being a closed one allow devs to optimize and take full advantage of each?

Thinking about the example, the GPU should definitely get the most attention (or at least graphics capability should). The question is how far they want to take physics and such in the next gen, and what they want to use to process it.

What would be needed to fully utilize PhysX, for example? Is the Cell naturally better suited for this than a general-purpose CPU?
 
Add in depth of field, HDR, ambient occlusion, etc., and it wasn't too shabby for something we did back in 2006!

Not shabby indeed, but that year and during most of 2007, developers were struggling to get decent performance out of the PS3. IIRC, EA's Madden 08 ran at 30Hz and made quite a ruckus.

You won't get an argument out of anyone here: the Cell/RSX combination was not made with developers in mind, unlike the 360's Xenon/Xenos.
 
What mass tools effort did Cell need (that was not in place, or not in place early enough)?

I'd say what they needed is all of what they now have. 'Early enough' is where we might have something to debate. :)
 
'Early enough' is where we might have something to debate. :)

Debate would be an understatement :) In the early days the PS3 had no graphics profiling tool, poor CPU tools, crashing compilers, SDKs that hogged massive amounts of memory, no SPURS, etc., and little support from Sony, who basically ignored us. We were literally working blind. Contrast that with Microsoft, which had good tools including PIX available before the console even shipped, and who let us fly over to their Redmond headquarters where their engineers spent two days with us going over everything: PIX grabs, hardware tidbits, the works. It was awesome.
 
Crossbar, this line of thinking doesn't make much sense. If someone says that console games need to look better every year, that is not the same as saying that the architecture needs to launch with a built-in, purposefully steep learning curve. Games can still show improvements, and developers can still extract more performance, when things don't punch them in the face, y'know?
Back in 2006 PC games had not started to take advantage of multi-core CPUs, so I guess the 9-core Cell was surely a punch in the face to many developers. I remember Carmack whining about the pain of multi-core development, and how junior programmers could screw up a program in ways they couldn't before. But do you know what, sooner or later they had to bite the bullet and adapt; in the case of the PS3 it was sooner rather than later. The same goes for the heterogeneous characteristics of Cell. The Cell was far ahead of the curve; that it did not have a fully developed tool chain with feature-rich libraries goes without saying, but it still had compilers and graphics libraries that were miles ahead of what the PS2 had at launch. If MS had not offered a more feature-rich toolset at the same time, I doubt we would be having this discussion, but kudos to MS for the good standard of their tools at launch.
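To make the "heterogeneous" part concrete for anyone who never touched the machine, here is a rough sketch of the pattern nearly every SPE job boils down to. It is illustrative only: the dma_get/dma_put/dma_wait helpers are made-up stand-ins for the SDK's real DMA calls (here they are just memcpy so the snippet builds and runs on a normal PC). On a conventional multi-core you simply loop over shared memory; on Cell each SPE only sees its own 256 KB local store, so every batch of data has to be staged in and out explicitly, and that staging is the learning curve in a nutshell.

#include <cstdint>
#include <cstring>

// Stand-ins for the real MFC DMA intrinsics (names and signatures are made up for
// illustration). In actual SPE code these would be the SDK's DMA get/put/wait calls;
// here they are plain memcpy so the sketch compiles and runs on an ordinary machine.
static void dma_get(void* local_store, uint64_t effective_addr, uint32_t bytes)
{ std::memcpy(local_store, reinterpret_cast<void*>(effective_addr), bytes); }
static void dma_put(void* local_store, uint64_t effective_addr, uint32_t bytes)
{ std::memcpy(reinterpret_cast<void*>(effective_addr), local_store, bytes); }
static void dma_wait() {}   // real code waits on a DMA tag group here

enum { CHUNK = 4096 };      // elements per batch, sized so a batch fits comfortably in local store

alignas(128) static float ls_buf[CHUNK];   // stands in for a buffer in the SPE's 256 KB local store

// Scale an array that lives in main memory, one local-store-sized chunk at a time.
void scale_array(uint64_t src_ea, uint64_t dst_ea, uint32_t count, float k)
{
    for (uint32_t i = 0; i < count; i += CHUNK) {
        uint32_t n = (count - i < CHUNK) ? (count - i) : CHUNK;

        dma_get(ls_buf, src_ea + uint64_t(i) * sizeof(float), n * sizeof(float));   // main memory -> local store
        dma_wait();                      // naive version; real jobs double-buffer to hide the transfer latency

        for (uint32_t j = 0; j < n; ++j)
            ls_buf[j] *= k;              // the actual work happens entirely out of local store

        dma_put(ls_buf, dst_ea + uint64_t(i) * sizeof(float), n * sizeof(float));   // local store -> main memory
        dma_wait();
    }
}

A real SPE job would also double-buffer, kicking off the next dma_get while crunching the current chunk, which is exactly the kind of tuning a PC-minded team in 2006 had never needed to think about.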

Case in point - do you think that Sony will *purposefully* build an equivalent learning curve into PS4? No, probably not... I would hope that anyone would recognize that. Because it would be ridiculous; you don't purposefully build flaws into your system if you can help it, lest your grip on sanity be in question. On the contrary, it is clear they want PS4 to be 'developer friendly,' insomuch as possible. Believe me, come the PS4 launch, you'll be reading quotes from Kaz extolling the greater approachability of the system.
No, I don't think the PS4 will have such a steep learning curve, because by then massively multi-core as well as heterogeneous designs will be a commodity. But I do think it will be a design that takes years to master fully; it will not rely on huge caches and bloated cores, but instead on densely packed ALUs that will require careful tuning to get peak performance out of.

I'm also of the mind that PS3 will never have a BOM lower than PS2's - either on an absolute or a relative basis. Firstly, the HDD inclusion alone prevents it, but even assuming they go to a remote storage model in PS3v3.0, unless someone does a redesign of the RSX and/or Cell, I don't see them ever sharing the same die space... which as we know was a major factor in the PS2's path to cost reduction.
The hard drive can and will be replaced with flash in some entry unit some time in the future, that is my bet; how much flash, we leave to another discussion. Of course merging Cell and RSX would require a redesign, but if they are both moved to a compatible process, replacing the FlexIO bus lines should be quite easy. Such a move would of course be weighed against the cost of keeping them separate. If Toshiba keeps using Cell, economies of scale may help keep the cost of the Cell CPU in the peanut range, and at 28 nm the power draw should not be much of a deal any more. Also remember that the merged EE+GS could never be moved to a really cheap bulk CMOS process due to the eDRAM.
 
Back in 2006 PC games had not started to take advantage of multi-core CPUs, so I guess the 9-core Cell was surely a punch in the face to many developers. I remember Carmack whining about the pain of multi-core development, and how junior programmers could screw up a program in ways they couldn't before. But do you know what, sooner or later they had to bite the bullet and adapt; in the case of the PS3 it was sooner rather than later. The same goes for the heterogeneous characteristics of Cell. The Cell was far ahead of the curve; that it did not have a fully developed tool chain with feature-rich libraries goes without saying, but it still had compilers and graphics libraries that were miles ahead of what the PS2 had at launch. If MS had not offered a more feature-rich toolset at the same time, I doubt we would be having this discussion, but kudos to MS for the good standard of their tools at launch.

All of this I generally agree with and have stated myself; Cell was ahead of the curve. I am a fan and proponent of this fact. BUT, the difficulty associated with being that far ahead is not itself any sort of benefit - which is Hirai's assertion, and one I simply do not believe is reflected in even his own view of things. Cell had the architecting it had for performance - the tools have come far enough along that Sony can stick to that line; there is no need in my mind to tack on other 'benefits' that were *of course* never considered as positives during the development of the architecture. Aka, punching developers in the face. :)

No, I don't think the PS4 will have such a steep learning curve, because by then massively multi-core as well as heterogeneous designs will be a commodity. But I do think it will be a design that takes years to master fully; it will not rely on huge caches and bloated cores, but instead on densely packed ALUs that will require careful tuning to get peak performance out of.

And so certainly you must agree that PS4 will not be a worse console for being easier to program for, right? As would Kaz, I'm pretty sure. Any system/architecture/etc can have improvements in software as the gen goes on - it's always just a matter of time, cleverness, familiarity, and money. Hell the PC for all of its generic ability and 'known quantity' factor would suddenly turn into quite the long-lived platform itself if it were frozen in time within a single configuration.

Put in a simpler way, I don't think Sony's gaming business would really have minded if it went without the 'graphically lesser' label it endured during the first year and a half of launch. Which is to say selling more consoles and more games is *hopefully* more important to Sony than showing improved graphics as years go by. And of course, I know it to be - which is why Kaz's comments were needlessly PR in my book.

The hard drive can and will be replaced with flash in some entry unit some time in the future, that is my bet; how much flash, we leave to another discussion. Of course merging Cell and RSX would require a redesign, but if they are both moved to a compatible process, replacing the FlexIO bus lines should be quite easy. Such a move would of course be weighed against the cost of keeping them separate. If Toshiba keeps using Cell, economies of scale may help keep the cost of the Cell CPU in the peanut range, and at 28 nm the power draw should not be much of a deal any more. Also remember that the merged EE+GS could never be moved to a really cheap bulk CMOS process due to the eDRAM.

But a flash HDD is still more expensive than no HDD from a BOM standpoint, and whatever the relative expense of the OTSS 90nm CMOS process vs some others, it was certainly cheap enough, eDRAM or no. For RSX and Cell to share a die it would require more than the reworking of FlexIO, it would require also a unification of memory controller... unless they were to keep separate buses to still separate memory pools.. in which case I wonder, what is even the point? To top it all off of course there then comes the issue of NVidia and STI playing nice when it comes to the cross-company technology/licensing effort that would be required in order to create a hybrid Cell/RSX. For me... I'm never expecting it to happen.

As for Toshiba and Cell volume production, the only thing that's peanuts in my book is the number of SPE-derived chips in their product line right now. As far as TVs go, a 2-year-late TV set at the stratospheric top of the range is a non-factor. If they migrate the technology down and throughout, then we can see. And SpursEngine, well... who knows. But unless there is process consolidation at 32nm between Spurs and Cell 'standard,' I don't think that any CE-derived economies of scale would factor in anyway.

Bottom line for me is PS2 will always be cheaper (relatively speaking; on an absolute basis obviously it would always be). There are simply fewer parts, and those parts share greater commonalities in their origins.
 
Yes, Kaz could probably have expressed himself differently and it was pretty needless, I concur.

But a flash HDD is still more expensive than no HDD from a BOM standpoint, and whatever the relative expense of the OTSS 90nm CMOS process vs some others, it was certainly cheap enough, eDRAM or no. For RSX and Cell to share a die it would require more than the reworking of FlexIO, it would require also a unification of memory controller... unless they were to keep separate buses to still separate memory pools.. in which case I wonder, what is even the point? To top it all off of course there then comes the issue of NVidia and STI playing nice when it comes to the cross-company technology/licensing effort that would be required in order to create a hybrid Cell/RSX. For me... I'm never expecting it to happen.
The price of a flash HDD depends on its size and on when in time we are talking. I also don't make any assumptions about whether they will share a memory controller or not. The number of extra address pins needed to address a second memory pool is quite small, and the data pins you would require anyway. The extra die space required for the address logic should be pretty small as well. Of course a shared memory controller would be preferable if it is easy to share and distribute the data bandwidth correctly, but I hardly think it is a show stopper; the advantages of removing some FlexIO lines and reducing the chip count are pretty big anyway.

As for Toshiba and Cell volume production, the only thing that's peanuts in my book is the number of SPE-derived chips in their product line right now. As far as TVs go, a 2-year-late TV set at the stratospheric top of the range is a non-factor. If they migrate the technology down and throughout, then we can see. And SpursEngine, well... who knows. But unless there is process consolidation at 32nm between Spurs and Cell 'standard,' I don't think that any CE-derived economies of scale would factor in anyway.
Yes, Toshiba's future support of Cell is a big unknown; I don't count on it.

Bottom line for me is PS2 will always be cheaper (relatively speaking; on an absolute basis obviously it would always be). There are simply fewer parts, and those parts share greater commonalities in their origins.
All we can be sure of at this point in time is that the innards of a PS3 five or six years from now will look VERY different from what they look like today.
 
I'd say what they needed is all of what they now have. 'Early enough' is where we might have something to debate. :)

Ah, you won't hear me complain about "early enough".

However, I don't know if it's possible to judge whether first party games are more worthwhile than a common toolset (e.g., one may argue, in hindsight, why not use the Lair money for another successful exclusive?). They are both extremely important. Sony should not spare any resources to look into broader industry applications for Cell. In retrospect, I'd rather they had used the PS3 Linux resources to beef up the toolset team (initially).

Given the unconventional architecture, everyone needs "proven" application level framework(s) for reference, which may take 2-3 years to mature. It's just a shame that Lair bombed and we didn't even get say... a good LoD framework/lesson out of it.

Hopefully some of the developers could carry the experience and work on other Cell projects.
 
However, I don't know if it's possible to judge whether first party games are more worthwhile than a common toolset (e.g., one may argue, in hindsight, why not use the Lair money for another successful exclusive?). They are both extremely important. Sony should not spare any resources to look into broader industry applications for Cell. In retrospect, I'd rather they had used the PS3 Linux resources to beef up the toolset team (initially).

Seems pretty clear to me and easy to judge (even without hindsight) the two routes:

1) Fund a flashy product in hopes it sells new hardware that's just released.
2) Build the tools needed to complement new hardware, enabling others to release flashy products.

It's obvious the bean counters & sales/marketing types wanted option 1, because they lack the patience and logic to realize that if you do not have the proper tools (in a best-case scenario, developed in parallel with the hardware) you will spend more time and effort trying to get a product developed than if you had the proper tools. Of course, you'll have some product out quicker that can generate sales, but it would have taken more resources, and its quality could even have been negatively affected by the lack of quality tools.

I see this all the time, in projects big and small. The pressure to start/finish a project when the underlying requirements haven't been met has universally (in my experience) resulted in more wasted resources and, probably, a lesser-quality end result.

Edited to add:
Now, if I were to speculate, with hind-sight, I might have gone this route if I were Sony:

Knowing that Lair would have bombed -- mostly because of game design & control issues, I believe (I haven't played it) -- I would have removed all six-axis functionality from the controller and focused on getting shock/vibration back by settling the lawsuit, or whatever it was that originally caused them to replace the vibration motors with those silly sensors used for six-axis. It's a neat toy to play with initially, but not very practical. Of course, this has nothing to do with the Cell, and I'm really just speculating with hindsight in relation to the question of funding Lair vs. development tools -- but that was my intended purpose with this addition.

I think it's pretty obvious to anyone (especially RTS players) that if you spend more building up your resources, you'll get stronger results -- unfortunately, your competition will also be strengthening at the same time. How you balance that is the trick. Again, if hindsight could be combined with time travel, one might go back and tell Sony to invest more in the development tools. Heck, they didn't even need hindsight to know this. They only had to look honestly at the PS2 vs. XBox development environment, which was similar. Sure, the PS2 ended up with great tools, and developers who knew the architecture, but there was still a big difference when the XBox came on the scene.
 
Seems pretty clear to me and easy to judge (even without hindsight) the two routes:

1) Fund a flashy product in hopes it sells new hardware that's just released.
2) Build the tools needed to complement new hardware, enabling others to release flashy products.

It's obvious the bean counters & sales/marketing types wanted option 1, because they lack the patience and logic to realize that if you do not have the proper tools (in a best-case scenario, developed in parallel with the hardware) you will spend more time and effort trying to get a product developed than if you had the proper tools. Of course, you'll have some product out quicker that can generate sales, but it would have taken more resources, and its quality could even have been negatively affected by the lack of quality tools.

I see this all the time, in projects big and small. The pressure to start/finish a project when the underlying requirements haven't been met has universally (in my experience) resulted in more wasted resources and, probably, a lesser-quality end result.

There are 2 kinds of tools here: The general utilities and the application level frameworks/modules/patterns. Both are important in a brand new architecture.

Both routes will take time to realize and both help developers. Not sure why you think developing a game is quicker than developing a tool. Without actual hands-on work to solve real problems, the design patterns and application-level frameworks may not be realized, or may not be useful, on an unconventional architecture like Cell. E.g., the SPU vertex culling system is one of the most used application-level "modules", and it was created by the same team that helped build these exclusives.
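To give a feel for what that kind of module actually does (greatly simplified, nothing like the real library's code - the struct and function names below are invented for illustration): walk the index list and throw away triangles RSX would never have drawn anyway - back-facing, degenerate, or entirely off-screen - so the GPU only receives a compacted list.

#include <cstdint>
#include <cstddef>

struct Vec4 { float x, y, z, w; };   // clip-space position after the vertex transform

// True if the whole triangle can be rejected before it ever reaches the GPU.
// Simplified: assumes a 0..w depth range and counter-clockwise front faces.
static bool cull_triangle(const Vec4& a, const Vec4& b, const Vec4& c)
{
    // Trivial frustum rejection: all three vertices outside the same plane.
    if (a.x >  a.w && b.x >  b.w && c.x >  c.w) return true;
    if (a.x < -a.w && b.x < -b.w && c.x < -c.w) return true;
    if (a.y >  a.w && b.y >  b.w && c.y >  c.w) return true;
    if (a.y < -a.w && b.y < -b.w && c.y < -c.w) return true;
    if (a.z >  a.w && b.z >  b.w && c.z >  c.w) return true;
    if (a.z <  0.f && b.z <  0.f && c.z <  0.f) return true;

    // Back-face / zero-area rejection via the signed area in screen space
    // (only valid when all three w's are positive, i.e. in front of the camera).
    if (a.w > 0.f && b.w > 0.f && c.w > 0.f) {
        float ax = a.x / a.w, ay = a.y / a.w;
        float bx = b.x / b.w, by = b.y / b.w;
        float cx = c.x / c.w, cy = c.y / c.w;
        float area = (bx - ax) * (cy - ay) - (cx - ax) * (by - ay);
        if (area <= 0.f) return true;
    }
    return false;
}

// Compact an index buffer, keeping only the triangles that survive culling.
// Returns the number of indices written to out_idx.
size_t cull_index_list(const Vec4* verts, const uint16_t* in_idx, size_t tri_count,
                       uint16_t* out_idx)
{
    size_t out = 0;
    for (size_t t = 0; t < tri_count; ++t) {
        uint16_t i0 = in_idx[3 * t], i1 = in_idx[3 * t + 1], i2 = in_idx[3 * t + 2];
        if (!cull_triangle(verts[i0], verts[i1], verts[i2])) {
            out_idx[out++] = i0;
            out_idx[out++] = i1;
            out_idx[out++] = i2;
        }
    }
    return out;
}

On the PS3 that loop would itself be split into local-store-sized batches and spread across the SPUs, but the payoff is the same: RSX only spends its time on triangles that can actually end up on screen.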

Not to mention Sony has extra resources elsewhere (e.g., Life with Playstation, PS3 Linux, even PS Home -- if Sony sees it only as a vertical app), which may be tapped to do the general tools and developer support.

Edited to add:
Now, if I were to speculate, with hind-sight, I might have gone this route if I were Sony:

Knowing that Lair would have bombed -- mostly because of game design & control issues, I believe (I haven't played it) -- I would have removed all six-axis functionality from the controller and focused on getting shock/vibration back by settling the lawsuit, or whatever it was that originally caused them to replace the vibration motors with those silly sensors used for six-axis. It's a neat toy to play with initially, but not very practical. Of course, this has nothing to do with the Cell, and I'm really just speculating with hindsight in relation to the question of funding Lair vs. development tools -- but that was my intended purpose with this addition.

I think it's pretty obvious to anyone (especially RTS players) that if you spend more building up your resources, you'll get stronger results -- unfortunately, your competition will also be strengthening at the same time. How you balance that is the trick. Again, if hindsight could be combined with time travel, one might go back and tell Sony to invest more in the development tools. Heck, they didn't even need hindsight to know this. They only had to look honestly at the PS2 vs. XBox development environment, which was similar. Sure, the PS2 ended up with great tools, and developers who knew the architecture, but there was still a big difference when the XBox came on the scene.

Lair is a good example of why SIXAXIS/Natal/Gem control schemes need more careful thought and implementation. But I personally think games like Flower and Folklore put SIXAXIS to great use.
 
I thought Lair was a 3rd-party game?

It is, but Sony funneled a lot of money into it. The fact that it was 3rd (2nd) party, though, is all the more reason, to my mind, why it should have been the sacrificial lamb. To say nothing of its mediocrity; mediocrity that should have been apparent early on.

Both routes will take time to realize and both help developers. Not sure why you think developing a game is quicker than developing a tool. Without actual hands-on work to solve real problems, the design patterns and application-level frameworks may not be realized, or may not be useful, on an unconventional architecture like Cell. E.g., the SPU vertex culling system is one of the most used application-level "modules", and it was created by the same team that helped build these exclusives.

Not to mention Sony has extra resources elsewhere (e.g., Life with Playstation, PS3 Linux, even PS Home -- if Sony sees it only as a vertical app), which may be tapped to do the general tools and developer support.

The point is not that Sony does not reap development-side benefits from the results of the various internal teams, be they ICE, the Home crew, or whoever; the point is that it is a mismanagement of resources and reflective of too cavalier an attitude in the early days to have not spent some extra cash on tools. Early tools would have been cruder and less-for-your-money than future tool efforts, but it would have been money better spent than the early days IP grabs. At least in my humble opinion. When Sony was funding Factor 5's studio buildout, they could instead have been funding an internal team dedicated towards providing 3rd parties with 'cleaner' tools and greater abstraction. Cell was a known quantity for a long time within STI - the tools for games development, at least on a basic level, could have been there at launch - if Sony had decided to make it a priority.

My view is that whatever money they saved on the effort, they more than lost due to the negative PR storm it eventually helped to fuel.
 
the point is that it is a mismanagement of resources and reflective of too cavalier an attitude in the early days to have not spent some extra cash on tools.

The early allocation indeed sounds too generous. I think the problem is with the disbursement. They should have kept a tighter rein on when to give out the actual money (beyond just allocating funds). I am not familiar with how Phil Harrison ran this. There may have been wastage, but I don't know if Sony clamped down on unreasonable spending later on.

Early tools would have been cruder and less-for-your-money than future tool efforts, but it would have been money better spent than the early days IP grabs. At least in my humble opinion. When Sony was funding Factor 5's studio buildout, they could instead have been funding an internal team dedicated towards providing 3rd parties with 'cleaner' tools and greater abstraction. Cell was a known quantity for a long time within STI - the tools for games development, at least on a basic level, could have been there at launch - if Sony had decided to make it a priority.

My view is that whatever money they saved on the effort, they more than lost due to the negative PR storm it eventually helped to fuel.

Some would say they could also have used the funds to anchor/retain an old PS2 exclusive for example. It would also help in positive buzz generation.

I don't know how much they spent on SN Systems. But all I am saying is there are other pockets where the resources could come from during that time. They don't necessarily need to come only from the studios, Sony has a bunch of half-developed concepts and work-in-progress projects. They all could be expanded/canned as required (including the tools division, and first parties). The issue of studio mismanagement may be separate from the issue of tool quality. For what it's worth, some of the tools were also created by people in those studios.

There was (is?) probably unhappiness among the tools and support teams. I remember Mark DeLoura (Manager of Developer Relations; heard he's good) left in 2006. I do agree that Sony should always, even now, pump resources into better developer support, not just the tools themselves. But I don't necessarily think Lair or first parties can be blamed. They are in a title business, and such investment is inherently risky. And they have to be careful with how their money is spent. At the same time, there may be a lot of other wastage and overlapping cost that could be targeted.
 
Some would say they could also have used the funds to anchor/retain an old PS2 exclusive for example. It would also help in positive buzz generation.

Indeed I think many would say this. And that, too, should probably have been a game plan. But the reasons for not going that route at least had basis in a thought pattern I could understand, even if it was ostensibly the wrong choice.

They don't necessarily need to come only from the studios, Sony has a bunch of half-developed concepts and work-in-progress projects. They all could be expanded/canned as required (including the tools division, and first parties). The issue of studio mismanagement may be separate from the issue of tool quality. For what it's worth, some of the tools were also created by people in those studios.

I'm not talking about studio mismanagement though - not my concern here. And LAIR is just a prop in my morality play. No need to say that tools have stemmed from internal studio efforts, I think I've acknowledged that like five times now. :) And I'm a huge fan of the grass-roots tools development, and indeed the sandbox-style creativity process that SCE as a whole promotes. Two thumbs up. But the problem is that when it comes to 3rd party support, you can't rely on this exclusively. PSSG would be more the type of thing I'm talking about - but on a broader scale. It's all well and good in my book that Sony has cobbled together some great developer tools and aids from across the spectrum of its top-tier internal efforts, it just doesn't make sense to me that a unified effort with that specific purpose never seemed to warrant a financial commitment in its own right in the beginning. Everything is fine and dandy at the end of 2009 so far as I'm concerned, so it's a matter of a previous (though serious) mistake rather than an ongoing lapse.

But I don't necessarily think Lair or first parties can be blamed. They are in a title business, and such investment is inherently risky.

Of course. No one is blaming Factor 5 or any dev house. Their job is to make games. In fact it's the opposite - Sony should have hired more developers (or in my example different developers): toolset developers!
 
PSSG would be more the type of thing I'm talking about - but on a broader scale. It's all well and good in my book that Sony has cobbled together some great developer tools and aids from across the spectrum of its top-tier internal efforts, it just doesn't make sense to me that a unified effort with that specific purpose never seemed to warrant a financial commitment in its own right in the beginning. Everything is fine and dandy at the end of 2009 so far as I'm concerned, so it's a matter of a previous (though serious) mistake rather than an ongoing lapse.

Yes, at the PSSG level. ^_^ In theory, the pluggability should allow these subsystems to be reusable. However in practice, I do wonder how many studios out there use it. So far, we hear PhyreEngine and SPURS stuff were used in quite a few studios.
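By "pluggable" I mean something roughly like the sketch below - to be clear, this is not PSSG's actual API (I haven't seen it); the class names are invented purely to illustrate the idea. Each subsystem hides behind a small interface, and the host engine only talks to that interface, so in principle you can lift one subsystem out and drop it into a different codebase.

#include <memory>
#include <utility>
#include <vector>

// Hypothetical illustration of a "pluggable subsystem" layout; the names bear
// no relation to PSSG/PhyreEngine's real classes.
class Subsystem {
public:
    virtual ~Subsystem() = default;
    virtual void update(float dt) = 0;   // called once per frame by whatever engine hosts it
};

// One self-contained subsystem. It only depends on the Subsystem interface above,
// so another engine could reuse it without pulling in the rest of the framework.
class CullingSubsystem : public Subsystem {
public:
    void update(float /*dt*/) override {
        // ...kick off SPU culling jobs here...
    }
};

// A minimal host: it owns subsystems through the interface and never sees their internals.
class Engine {
public:
    void add(std::unique_ptr<Subsystem> s) { subsystems_.push_back(std::move(s)); }
    void tick(float dt) {
        for (auto& s : subsystems_) s->update(dt);
    }
private:
    std::vector<std::unique_ptr<Subsystem>> subsystems_;
};

Whether PhyreEngine/PSSG actually factors out that cleanly in practice is, of course, exactly what I'm wondering about.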

I think the difficulty here is that the "weird" architecture may demand too many upfront assumptions about the run-time and workflow. Hence, I always thought the Sony folks would need to eat their own dog food to come up with a right-sized solution (short of going to Epic and engineering all their work into UE3.* and beyond). Would be great to hear some reports on these tech outputs.

Of course. No one is blaming Factor 5 or any dev house. Their job is to make games. In fact it's the opposite - Sony should have hired more developers (or in my example different developers): toolset developers!

Beef up developer support ? No argument from me. :)
 
So far, we hear PhyreEngine and SPURS stuff were used in quite a few studios.

PSSG and Phyre are the same thing though of course. Not saying it for you, but for those that might not be as familiar with the terms/tools.

But right - like that. I personally feel that Sony stepped up to the plate in multiple ways after the launch - and to be fair both of these were in the works prior as well - I just think they would have served their own interests a bit more by taking it more seriously before the blowback. Especially with all the hype MS was tossing around about XNA - approachability seemed a fairly logical dev-set checkbox to try and put some resources towards. :)
 
:LOL: I had totally forgotten that PhyreEngine is PSSG. I only vaguely remember your interview with Jason Doig. I thought PhyreEngine was a ground-up, cross-platform game engine vs PSSG's pluggable subsystems.

So you can dismantle PhyreEngine and use only small pieces of it easily?
 