YDL v5.0 confirmed for PS3; arriving mid November

No, of course not. It means you can't develop games that can be run directly from the XMB. In other words, you can't develop any signed content that targets the PS3 outside of the Linux environment.

So to run homebrew you will have to have Linux installed...
 
So to run homebrew you will have to have Linux installed...

Well, of course. You sound surprised? Stuff developed under Linux for Linux will require Linux to run. Allowing arbitrary user code to execute on the GameOS/XMB would be directly opening up the system to all those nasty things the sceptics so often liked to cry out about whenever Linux on PS3 was ever mentioned (e.g. viruses, piracy, online hacks/cheating etc.).
 
Well, of course. You sound surprised? Stuff developed under Linux for Linux will require Linux to run. Allowing arbitrary user code to execute on the GameOS/XMB would be directly opening up the system to all those nasty things the sceptics so often liked to cry out about whenever Linux on PS3 was ever mentioned (e.g. viruses, piracy, online hacks/cheating etc.).


How much power would one have available to develop games? Would you be able to use Cell and RSX freely? Compared to the 360, where MS has provided libraries and tools for it, how easy is it to develop for each platform?...
 
How much power would one have available to develop games? Would you be able to use Cell and RSX freely?

How much hardware access one has is unknown currently. On PS2 you had pretty much unfettered access, sans the DVD drive.

Compared to the 360, where MS has provided libraries and tools for it, how easy is it to develop for each platform?...

The Cell SDK is included for sure. One can hope nVidia will make available its tools freely as it does on PCs. Compared to the development environment MS has provided, well, for starters there are contrasts in the language choice (managed versus unmanaged). The PS3 Linux experience could be very similar to what a professional developer is exposed to in that you'll be using C++ (the same standard compiler in fact, gcc). That's on the Cell side - again the GPU access and how it can be programmed isn't clear yet.
 
How much power would one have available to develop games? Would you be able to use Cell and RSX freely? Compared to the 360, where MS has provided libraries and tools for it, how easy is it to develop for each platform?...
That's an unknown, but the most likely expectation is access to Cell with SPEs (perhaps a limited number, but SPE access will be included as Sony want to grow Cell experts and the full Cell SDK is thus included) and RSX through PSGL or straight OpenGL. A key part of this move is to generate a development 'ecology' (as the buzzword is) that develops new algorithms and techniques that strengthen the software base for Cell code.
 
How much hardware access one has is unknown currently. On PS2 you had pretty much unfettered access, sans the DVD drive.

And what kind of access! I just ran some demos off the PS2linux site the other day... some very amazing stuff there! :oops:

I really can't wait to get my hands on that Linux for PS3. This is like a wet dream come true!
 
And what kind of access! I just ran some demos off the PS2linux site the other day... some very amazing stuff there! :oops:
I really hope the demo scene comes alive. It was fantastic what they achieved in the 16-bit era, and a number of ubergeeks went on to found successful games companies. PS3 has so much potential for an incredible range of demos, every sort of tech imaginable, and I really hope it gets the attention it merits. I remember the Amiga used to be shown in shops running demos, and nothing got friends as interested quite as well as some of the more impressive demos. As an advertising gimmick it's great! As well as being good for development (pioneering techniques) and entertainment (though if they want Sony's approval for downloaded content on their service, they might want to rein in the foul language!).

I can imagine already some of the demos that we could see. How about a real-time geological creation demo with water and fluid dynamics creating dynamic geometry? Or a real-time GI demo that has a Cornell box and throws in a ball, then another, then a dozen more, then just keeps piling them on, in that classical demo style 'you're impressed with 10 spheres, and we're giving you 1000!'

It has to be said, a closed box of this level of hardware is sooooo cool :runaway:
 
In my book, that is called good coding practice! You need good foundations, and if the ones you do have aren't sufficient, it's like building a wooden shack instead of a brick house because you happen to have lame foundations. ;) Better just rebuild the foundations so that you can get that brick house out there.

Unless of course you're certain you'll only ever need a wooden shack, and/or you know beforehand that you can afford to rebuild the foundations *and* the whole house later on. ;)

That's only valid to a certain point. In my opinion the absolutely most important lesson in software development is that changes in a project become exponentially more expensive as time goes on. Starting a project over from scratch is probably the biggest change you can make.

There are times when it's necessary to do a rewrite. If you got the fundamentals wrong, you are better off starting over earlier rather than later. Still, that's not a decision to be made lightly. Too many programmers want to throw away the entire existing code base without really examining if it can be fixed.

I don't know enough about Raster's code to say if he really needed so many rewrites. Either way, Enlightenment has been in development for over 10 years and it's arguably less usable now than it was years ago. There's a point at which you need to buckle down, clean things up as best you can, and get a release out. Maybe E17 will finally be that release. He better hurry up, because the new software abstraction layers being written for 3D acceleration hardware are threatening to make much of his work obsolete or limited to niche devices.

Nite_Hawk
 
You'd need a way of remote controlling these PS3s though; it's not fun running around pulling out the video, keyboard and mouse cords and plugging them into different machines. That problem is probably solved easily enough: there are remote desktop type programs out there, after all, and they should be able to run on PS3, at least after a recompile.

The way you would do things on a Beowulf cluster is to boot the Linux kernel and initrd off a network server using an Etherboot network bootrom (http://rom-o-matic.net/) or PXE boot and tftp. You would then run Linux off the NFS network file system. This gives you diskless compute nodes which boot off a single server image, which can easily be updated centrally and distributed by simply remote booting the nodes.

I am not sure how Sony would be running their clusters since PXE booting may not be supported. As far as remote control is concerned, Linux is the undisputed champion. You can log into and run sessions on other networked computers using remote X, since it is fully network enabled. You have other remote desktop protocols like NX and FreeNX (http://www.linuxjournal.com/article/8477). You can also run individual remote graphical applications in a window on the local machine from the command line using OpenSSH with X forwarding enabled. On a supercomputer compute node though, you would not want to run a GUI because it consumes resources. You would therefore run the compute nodes headless (i.e. without a graphics card) and control them remotely using the command line with OpenSSH.
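To make the remote-control point concrete, here is a minimal sketch of driving a rack of headless nodes over OpenSSH from a controller machine. The hostnames and helper functions are purely hypothetical illustrations, not any real cluster tool; the only real pieces are the standard `ssh` invocation and its `-X` flag for X11 forwarding.

```python
import subprocess

# Hypothetical node hostnames for illustration only.
NODES = ["ps3-node1", "ps3-node2", "ps3-node3"]

def build_ssh_command(node, remote_cmd, forward_x=False):
    """Build an OpenSSH invocation for a headless node.

    With forward_x=True, ssh's -X flag forwards X11, so a remote
    graphical application draws in a window on the local display.
    """
    cmd = ["ssh"]
    if forward_x:
        cmd.append("-X")
    cmd += [node, remote_cmd]
    return cmd

def run_on_all(remote_cmd):
    # Fan the same command out to every node and collect the outputs.
    results = {}
    for node in NODES:
        proc = subprocess.run(build_ssh_command(node, remote_cmd),
                              capture_output=True, text=True)
        results[node] = proc.stdout.strip()
    return results

# Example usage (needs reachable nodes with sshd running):
# run_on_all("uptime")
# subprocess.run(build_ssh_command("ps3-node1", "xterm", forward_x=True))
```

The same pattern scales to however many consoles are on the LAN; only the `NODES` list changes, which is exactly why nobody needs to re-plug monitor cables.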

PS3 - despite being large for a console - is actually much smaller than most practically useful PCs. While you can't stack PS3s, they don't take up much space standing upright next to one another... :D

That 130,000 CPU supercomputer would be Blue Gene/L, right? Those CPUs are actually rather cool-running as they were designed for low power usage. There's probably not a single supercomputer CPU around anymore that draws upwards of 300W, at least not being actively developed. There could be installed legacy hardware here and there of course; to mention one example, some of Cray's designs were incredibly power-thirsty. The T90, for example, uses four tons (!) of Fluorinert to cool the CPUs... I actually believe it used phase-change nozzles to hose down the chips, and then sucked away the coolant vapor to a heat exchanger to condense it back into fluid form. That's pretty badass radical stuff compared to what we use in our PCs. Anyway, that system was hot (literally) ten years ago, not so much today. Even the Earth Simulator custom CPUs 'only' draw like 150W AFAIR, and that's for the whole blade PCB and not just the CPU itself.

Yes, supercomputers consume a lot less power per node than a PC. I think you have not understood the point I was trying to make, though. People are saying that a cluster of PS3s is cheaper than Mercury blade servers. I am trying to make the point that clusters of PS3s or PCs are far more expensive for production use because of the space and power consumption; that is why I quoted the PC power consumption. Nobody in their right mind would use 130,000 PCs instead of Blue Gene because, apart from anything else, the power consumption is just daft. What people actually use for supercomputers are high-density racks of headless blade servers, or racks of high-end 128-way NUMA servers; they may be more expensive, but running costs are much lower.

I don't think they'd use PS3s per se if that was the case, but rather more or less custom hardware without things like BR drives or GPUs.

If that was the case, they surely would be using Mercury blades right away (unless it is just a PR exercise). It may be possible to run some part of a multi-player networked game on a PS3 server cluster. Maybe they are doing something game orientated that uses the RSX for something. The application might be part of a networked multi-player game in which AI or collision detection is done on the game server - in which case there may be a reason for using PS3 boards (without the BD drive in a high density case) in place of Mercury blades. On the other hand it might be a multi-player game where end users simply plug in a number of PS3 into a gigabit LAN. The PS3 would seem to be a bit heavy for a "LAN party" though.
 
From the Ars Technica article:
Terra Soft's YDL 5.0 distribution, which will ship with GCC 4 and the Cell SDK in addition to a broad assortment of common open source desktop
All the materials released by Terra Soft so far indicated GCC 3.2.2 (I distinctly remember because I was somewhat disappointed by this). But now when I went back to the press release to check after reading the above (could have just been another case of the "new" Ars not getting all the facts right), I found out that they stealth-edited it and it now says 4.1.1!

Either they changed it just now - that would be very strange, as changing GCC major versions is quite a big step for a distro - or the original information was just wrong already at the time of publishing. Anyway, good news.


Nite_Hawk said:
He better hurry up, because the new software abstraction layers being written for 3D acceleration hardware are threatening to make much of his work obsolete or limited to niche devices.
While I agree with most of your post, it should be mentioned that Evas has had GL acceleration for a long time already, much longer than the mainstream environments.

[edit]
Back to the article:
"an aggressive, rapid co-development project conducted by Carsten Haitzler and the Enlightenment development team."
Ha! Obviously someone like Rasterman would love the opportunity to get access to an early PS3 unit to port his favourite project to! If it's him I would even suspect some low-level optimizations already. I believe a fixed platform truly is a better outlet for something like E17 than the PC landscape.
 
While I agree with most of your post, it should be mentioned that Evas has had GL acceleration for a long time already, much longer than the mainstream environments.

Well, it's not really OpenGL support in and of itself, but rather what the abstraction libraries do. From what I understand Evas doesn't really deal with vector formats per se; it's a pixel pushing technology. You get a really fast canvas to work with which you can try to implement widgets/vectors/etc on top of. Cairo+Glitz on the other hand isn't as fast (especially without OpenGL acceleration), but everything is vector based internally, so accelerating (and rendering to) SVG, PostScript, PDF, etc. becomes much easier.

The only really big advantage I see Evas having is that it is very fast at software rendering. These days though, reasonably fast OpenGL acceleration is already available on chipsets from Nvidia, AMD and Intel. Almost every system sold these days has some form of OpenGL acceleration. Evas makes a lot of sense if you have restricted hardware (like in a PDA), but most of its benefits already seem unnecessary in the PC space.

Nite_Hawk
 
From the Ars Technica article:
All the materials released by Terrasoft so far indicated GCC 3.2.2 (I distinctly remember because I was somewhat disappointed by this). But now when I went back to the press release to check after reading the above (could have just been another case of the "new" Ars not getting all the facts right) I found out that they stealth-edited it and it now says 4.1.1!
Yeah that was my not-too-hot point about it too, glad the official PR was corrected for C++-friendly g++ 4.

BTW this was a :LOL: PR for me.
Ars Technica said:
In light of the PS3's poor reception from the press and the gaming community, one immediately wonders why Terra Soft would work with Sony rather than Microsoft or Nintendo. Staats says that the PS3 "looks and feels like a real computer," and explains that "it is important to understand that the Xbox and Nintendo are not designed to be personal computers, the PS3 is." What about the XBox 360? Staats "contacted the Microsoft 'Linux Lab' a few months ago, just to see what they might say," and received no response.
 
would this work?

I had an idea, hope it doesn't sound too stupid, maybe someone can tell me if this is possible:

Couldn't the swapping algorithms in the kernel be patched to use RSX and its memory as a kind of cache? I mean, the 256MB GDDR3 isn't really needed for OpenOffice and stuff like that, so let's say we reserve 192MB (or whatever) of that and keep the rest for the framebuffer and PS3-OS needs. Now before you move a page to the hard drive, you could have RSX put it into that reserved region, eventually further moving it to the hard drive when it isn't really needed a few milliseconds later on. You would have to instruct RSX to do the moving, because Cell can't access GDDR3 fast enough. This way, you could have a small portion of the swap file in GDDR, which is obviously a lot faster. In (my) theory, this should improve overall performance, at least when you need just a little more memory than you really have (the 256 MB XDR).
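As a toy illustration of the idea (not real kernel code; the class, page sizes and names are invented for the sketch), the two-tier behaviour would look something like this: evicted pages land in the reserved GDDR region first, and only spill to disk once that region fills up, so recently swapped pages can be pulled back without touching the drive.

```python
from collections import OrderedDict

class TieredSwap:
    """Toy model of a two-tier swap: a small fast tier (the reserved
    GDDR3 region) in front of a slow tier (the hard drive)."""

    def __init__(self, gddr_pages):
        self.gddr_pages = gddr_pages   # capacity of the fast tier, in pages
        self.gddr = OrderedDict()      # page -> data, oldest first
        self.disk = {}                 # overflow tier

    def swap_out(self, page, data):
        # Stage the page in GDDR; displace the oldest page to disk if full.
        if len(self.gddr) >= self.gddr_pages:
            old_page, old_data = self.gddr.popitem(last=False)
            self.disk[old_page] = old_data
        self.gddr[page] = data

    def swap_in(self, page):
        # Fast path: page still cached in GDDR. Slow path: fetch from disk.
        if page in self.gddr:
            return self.gddr.pop(page)
        return self.disk.pop(page)

swap = TieredSwap(gddr_pages=2)
swap.swap_out(1, "a")
swap.swap_out(2, "b")
swap.swap_out(3, "c")   # fast tier full: page 1 spills to disk
```

The win in the real proposal is the same as in the toy: the fast path (swap-in from GDDR) avoids a disk read entirely; only pages that have aged out of the reserved region pay the full hard-drive cost.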

Any comments?
 
Well, it's an interesting idea and it should certainly be possible, but I really doubt that it will be implemented soon. As you say one would have to get RSX to actually fetch the data and write it back if needed (as we all know how fast Cell access to the GDDR is). Still, it seems like one of the few ways to make non-PS3 specific non-3D apps benefit from the other 256 MB of memory, so it should certainly be investigated.

Too bad that it's not really something that can be implemented well without low-level RSX access, otherwise we could try it ourselves if no one else does ;)

[edit] This was addressed to Planet; I wrote it before seeing one's reply.
 
Cell could do the writing itself, I think. It has 4GB/s writes to GDDR3 which may be OK? But yes, you'd need enough control over RSX to get it to write back to XDR when necessary, at least.
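A quick back-of-envelope check on those numbers: only the 4GB/s Cell-to-GDDR write rate comes from the post above; the ~50 MB/s hard drive throughput is an illustrative assumption for a 2.5" laptop-class drive.

```python
# Time to move the whole 192MB reserved region at each tier's speed.
# 4 GB/s is the Cell -> GDDR3 write rate quoted above;
# 50 MB/s is an assumed figure for the PS3's 2.5" hard drive.
region_mb = 192
cell_write_gbs = 4            # GB/s
hdd_mbs = 50                  # MB/s (assumption)

t_gddr = region_mb / (cell_write_gbs * 1024)   # seconds
t_disk = region_mb / hdd_mbs                   # seconds

print(f"GDDR: {t_gddr*1000:.0f} ms, disk: {t_disk:.1f} s")
# → GDDR: 47 ms, disk: 3.8 s
```

So even at Cell's unspectacular GDDR write speed, the fast tier is roughly two orders of magnitude quicker than going to disk, which is what makes the scheme interesting at all.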

Definitely an interesting thought. I'm guessing implementation would be challenging.
 
No one picked up on this in the Ars Article?

"Our work with Sony has granted us a unique opportunity to have in our possession beta PlayStation units (the same that the game developers have used) in order to work closely with the system to ensure a high quality end-user experience, from bootloader to halt, from installation to playing CDs and configuring the desktop."

Access to optical drive confirmed? Would be nice not to have to use an external drive.
 
It's been touched on, it does seem to suggest optical drive access. If there is, I guess the next question is if you've access to protected content (e.g. can you play back DVD movies etc.).
 
A little bump, for a little new information - the question was posed before about how much of Cell you'd have access to as a developer under YDL, and Kai Staats of Terra Soft provides an answer in one of his list postings:

At boot, YDL shows 2 PPUs (one real, one a hyperthread) and 6 SPEs ... but to
take advantage of them is another story.

http://lists.terrasoftsolutions.com/pipermail/yellowdog-general/2006-October/020749.html

So that's that :D Interesting that it is 6 SPEs, and not, for example, 7. An indication that the GameOS is still working away in the background? Regardless, it seems you'll have as much access as games do, which is a good thing.

Also, seems like we'll have little new info (unless more stuff like this just slips out) until mid November. That's, of course, when the PS3 is launching, but there's also a conference happening then that they'll be at. It seems they can't talk about stuff like how it'll boot etc. until then.
 
Wasn't there talk once about an SPE also being reserved for security? If so, I would think that's what that one SPE does, next to taking care of OS tasks during gameplay...
 