Official PS3 thread part 2

You don't need a render farm to render those pictures. I tried the tool that produces these procedural images some months ago; a fast PC is enough to render them in a few hours or less.

You can download a demo of the tool here: http://www.pandromeda.com/

I'm sure something like that will be possible as a tech demo in realtime, maybe with less draw distance; I just don't expect to see something like that in a real game. But maybe I'm wrong. We will see.

Fredi
 
McFly said:
You don't need a render farm to render those pictures. I tried the tool that produces these procedural images some months ago; a fast PC is enough to render them in a few hours or less.

You can download a demo of the tool here: http://www.pandromeda.com/

I'm sure something like that will be possible as a tech demo in realtime, maybe with less draw distance; I just don't expect to see something like that in a real game. But maybe I'm wrong. We will see.

Fredi



I'm very pessimistic too... still, one day there will be games that look like that, and that is worth waiting for, I think.... totally drooooooool...
 
Let's talk about the distributed computing aspect of PS3. I found this on ...another forum:

AlgebraicRing wrote:
1) The cost of communicating the solution must be less than the cost of computing the solution.

Corollary: The cost of scheduling and distributing the problem must be less than the time to compute the answer AND receive the results back.

There's no point in asking someone else what 1+1 is when it takes less time to figure it out than to ask the question. Are there any kinds of problems that are easy to describe, require lots of computation, and cost little to send the results around?


This is a very good point and I thank you for bringing it up, because all I explained in the Cell article was that there were metrics governing how and where Cell packets were sent.

To explain in more detail, I will provide the data metric formulas that govern what data gets sent where. The actual data and code of the packet aren't that important for understanding this point; the metric formula is what matters.

First let me point out that Cell packets would not be transmitted to other Cells unless the transmitting Cell has already reached maximum resource capacity, or the Cell packet requires more computational resources than the machine has.

Next, let's present some scenarios as to why Cell packet transmission might occur:

In a Client / Application Server (say, for example, a SOCOM server interface) Cell relationship, the bulk of the static code (basic environments, permanent objects, etc.) would be processed on the server and the end results transmitted to the Client system's GPU for processing, then output to VRAM for display.

OK, so say we have a population of networked Cells (WAN scale). We have a Cell packet generated by a transmitting Cell that is greater than (>) 1 sec in processing time for said transmitting Cell. To help understand this further, and to easily input it into an equation, let's assign it a Resource Unit.

This "Resource Unit" metric will map directly to the number of PE's (processing elements) (In a “preferred” embodiment (as the Patent outlined it), a PE comprises eight APUs) divided by the number of seconds for said Cell packet (remember that Cell packets contain data and instructions for processing). So this transmitting Cell (a PS3) contains 6 PEs. So 6/1sec. = (6 RU). Now this information is appended to the Cell Packet before it is processed by the Routing logic and becomes part of Cell packet.

Example Cell packet (BTW, we are assuming an Ethernet network and transmission via TCP/IP; I'm not going to replicate the TCP/IP or Ethernet encapsulation for now, to save time):

------- header -------    ---- RU ----    ---- payload ---------    -- CRC ---
[destination / source]    [000110 (6)]    [data & instructions]    [checksum]
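And as a rough sketch only of how such a packet might be framed in code; the field widths, the CRC-32 checksum, and the function name are my own assumptions, not anything taken from the patent:

import struct
import zlib

def build_cell_packet(dest: bytes, src: bytes, ru: int, payload: bytes) -> bytes:
    # Header: 6-byte destination and source IDs plus a 16-bit RU field
    # (sizes are guesses), followed by the data & instructions payload.
    header = struct.pack("!6s6sH", dest, src, ru)
    body = header + payload
    # Append a CRC-32 over header + payload as the checksum field.
    return body + struct.pack("!I", zlib.crc32(body))

pkt = build_cell_packet(b"destPE", b"srcPE0", 6, b"data & instructions")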

Once this packet hits the routing logic, it will make a decision on where to forward the packet based on these metrics:

- Delay (hop count) in ms (milliseconds)
- Member Cells' PE load (RU)
- Number of PEs (or Cell class, e.g. 8 APUs = an A-class node)

So, for the sake of example, say we have member Cells in California, New York, Nevada, Colorado, and Florida, while the transmitting Cell resides in California.

Now let’s assign an RU & Delay Metric to each member Cell:

California: RU load = .3 Delay = 73ms PEs = 6
New York: RU load = .8 Delay = 179ms PEs = 16
Nevada: RU load = 1 Delay = 89ms PEs = 6
Colorado: RU load = 2 Delay = 97ms PEs = 6
Florida: RU load = 10 Delay = 164ms PEs = 1

So in order to find the best metric, we have to resolve the RU load floating point (decimal) into time. To resolve it we simply solve for x, x being the milliseconds of processing/transmission time of said softcell.

California: RU load = 20,000 ms + delay = 73 ms -> metric = 20,073
New York: RU load = 20,000 ms + delay = 179 ms -> metric = 20,179
Nevada: RU load = 6,000 ms + delay = 89 ms -> metric = 6,089
Colorado: RU load = 3,000 ms + delay = 97 ms -> metric = 3,097
Florida: RU load = 100 ms + delay = 164 ms -> metric = 264

So this shows us that while the server in New York has the most processing power (currently available), right now its CPU load and delay create a prohibitive metric. Conversely, the Florida metric is much better, only producing a cost of 264 ms. However, this can be deceiving, as the task will take 6,000 ms longer on the Florida system due to its single-PE configuration. So the actual best metric would in turn be the Colorado system, which offers 3,097 ms for current processes plus another 1,000 ms to process this request from the transmitting member Cell, resulting in roughly 4,097 ms total to execute the process and receive the results.
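To make the selection concrete, here is a small sketch that reproduces the arithmetic above. The queue-time and delay figures come straight from the table; the execution time (packet RU divided by the node's PE count, in seconds) follows the 1,000 ms / 6,000 ms figures in the example, and the structure of the code is mine:

# Candidate member Cells: resolved queue time (ms), network delay (ms), PEs.
candidates = {
    "California": {"queue_ms": 20000, "delay_ms": 73,  "pes": 6},
    "New York":   {"queue_ms": 20000, "delay_ms": 179, "pes": 16},
    "Nevada":     {"queue_ms": 6000,  "delay_ms": 89,  "pes": 6},
    "Colorado":   {"queue_ms": 3000,  "delay_ms": 97,  "pes": 6},
    "Florida":    {"queue_ms": 100,   "delay_ms": 164, "pes": 1},
}

PACKET_RU = 6  # the transmitting PS3's packet: 6 PEs' worth of work for 1 second

def total_cost_ms(node: dict) -> float:
    # Cost = queued work + network delay + time to actually execute the packet.
    execute_ms = PACKET_RU / node["pes"] * 1000  # e.g. 6 RU on 1 PE = 6,000 ms
    return node["queue_ms"] + node["delay_ms"] + execute_ms

best = min(candidates, key=lambda name: total_cost_ms(candidates[name]))
print(best, total_cost_ms(candidates[best]))  # -> Colorado 4097.0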

Said process would otherwise have been held in queue until the transmitting Cell completed its full cycle of processing; for the transmitting Cell to off-load the process onto the Cell network at all, that queue time would have to be longer than the 4,097 ms.

I certainly hope I've shed some light on this question. I will explain how QoS can play a role in certain softcells later on; for now, let me get to the other questions...

AlgebraicRing wrote:
2) What happens when an ISP hiccups and a block of machines becomes unreachable, so the content of their computation becomes unreachable?

This would then cause a check response to be sent to the unreachable Cell, which in turn would cause re-transmission of the Cell packet. So yes, the data would be lost, although this should be rare.
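A tiny sketch of what that check/retransmit behaviour could look like; the "CHECK"/"ALIVE" messages, the UDP-style framing, and the timeout are all my assumptions, not something the patent spells out:

import socket

def check_and_retransmit(sock, packet, primary, fallbacks, timeout_s=2.0):
    # Probe the suspect member Cell first, then fall back to other Cells.
    sock.settimeout(timeout_s)
    for addr in [primary] + fallbacks:
        sock.sendto(b"CHECK", addr)
        try:
            reply, _ = sock.recvfrom(64)
            if reply == b"ALIVE":
                sock.sendto(packet, addr)  # re-transmit the Cell packet here
                return addr
        except socket.timeout:
            continue  # unreachable: its in-flight results are lost
    return None  # no member Cell reachable; process the packet locally instead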

AlgebraicRing wrote:
3) What about latency between machines? For a real-time game, Machine X is going to have to get data from Machine Y at least 30 times a second.

I explained this above.

AlgebraicRing wrote:
4) Am I in control of what code is being run on my machine? If I'm playing Game Y, will my machine be calculating results for Game Z? Can I donate my downtime to a particular game?

Most likely not in the initial phases. Probably leaving your system on and connected will make you a part of the collective.
 
Does the PS3-GRID network only apply to hosting online games? (i.e., if your machine is in standby mode, it helps host a game of SOCOM 3?)
 
zurich said:
Does the PS3-GRID network only apply to hosting online games? (i.e., if your machine is in standby mode, it helps host a game of SOCOM 3?)

As I said in another thread, while it doesn't appear that locality is necessary for the Apulets; I question how it will be used.

While I'm not ruling it out, I'm more confident in the ability to easily send data over the network seamlessly than in sharing processing with non-local sources. But then again, this is a new architecture and a new programming environment that could be very different from what we're accustomed to thinking about.

But, perhaps others with more knowledge could enlighten us...

EDIT: For example, if you thought the Xbox's custom playlist feature, where you need to rip the music yourself, was cool, then you'll like this. Perhaps you could subscribe to a Sony Music-provided library of music and have the ATRAC(3+) streamed to you during the game, where it is stored in a sandbox and processed by a few APUs arbitrarily grouped/pipelined under the guidance of a PU.
 
OK folks, by now everyone should have heard about the IGN article about developers and publishers starting to plan out PS3 games.

If not, here you are:

http://ps2.ign.com/articles/427/427173p1.html

I find this quote most interesting:

For instance, in yesterday's (7/2/03) Game Daily newsletter, wanted ads have been posted for jobs to work on PlayStation 3.

Amazing is all I'm going to say. I am also happy that developers are planning ahead; hopefully Sony treats them well and gives them some good middleware, tools, and manuals to work with. It will greatly pay off.

A good deal of time ago, IGN also posted something in their rumour section about EA supposedly designing a Medal of Honor game for PS3 which featured completely destructible environments. I cannot find the link though.
 
Paul said:
A good deal of time ago, IGN also posted something in their rumour section about EA supposedly designing a Medal of Honor game for PS3 which featured completely destructible environments. I cannot find the link though.



Is that all???? I mean, yeah, if it has completely destructible scenery which destructs realistically, cool.... but I was expecting things like, I don't know..... "thousands of on-screen characters..." you know...
 
Deepak said:
A scene like the Normandy invasion in Saving Private Ryan would be great.



Yeah, but something that doesn't look like a game.... I mean the intro in Medal of Honor Frontline was amazing and all realtime, but it still looked like a game, you know what I mean.... not asking for FF:TSW, just, you know....
 
PS3 memory rumour mill turns again

Further information about Sony's next-gen console leaks out of memory suppliers?

In the ongoing quest to guess what's going to go inside the PlayStation 3, the most useful sources of information so far (aside from Sony's own rumblings about the Cell microprocessor) have been the companies contracted to make memory chips for the console.

Once again this week the memory manufacturers appear to have done a convincing leaky sieve impression, with strong rumours suggesting that they have revealed the amount of RAM which the PS3 will boast, and giving an idea of what volume of consoles Sony hopes to manufacture in the first year.

Online sources are today reporting that the console will incorporate four XDR-DRAM chips, for a total of 256MB of main RAM - an eight-fold increase over the 32MB found in the PlayStation 2. The memory bus speed is also significantly faster than the PS2's.

The three memory suppliers working on XDR-DRAM chips are Elpida, Toshiba and Samsung, all of which are expected to supply chips for the PS3 - although only Elpida has been announced as an official supplier so far.

As reported earlier, the three manufacturers will begin bulk production of the RAM in early 2005, and expect to produce some 20 million XDR-DRAM chips within that year - meaning that Sony could potentially build 5 million units of the PS3 by the end of 2005, enough for a reasonably sized launch (over a million units) in all three major territories.

The yield for 2006 is expected to be in the region of 30 million chips - enough to build 7.5 million PS3s. This is a surprisingly low figure, however - given the speed with which Sony shipped PlayStation 2s in the first year of the console's lifespan, surely it would be hoping for more than 12.5 million PS3s on the market by the end of 2006, assuming a late 2005 launch?

Could it be that Sony's ambitious technical specifications for the PS3, featuring leading-edge RAM and CPU technologies, may restrict supplies of the console - or does the giant manufacturer have an ace up its sleeve?
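A quick back-of-envelope check of the article's chip-yield numbers, assuming four XDR chips per console as reported:

CHIPS_PER_CONSOLE = 4  # four XDR-DRAM chips per PS3, per the report

def consoles_from_yield(chips: int) -> float:
    return chips / CHIPS_PER_CONSOLE

print(consoles_from_yield(20_000_000))  # 2005 yield -> 5.0 million consoles
print(consoles_from_yield(30_000_000))  # 2006 yield -> 7.5 million consoles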
 
SCE recruiting emulation experts

No official explanation for new recruitment drive, but it's not hard to guess


Sony Computer Entertainment has begun recruiting specialists in the field of emulation, including experts in compiler development and CPU micro-architecture, to work on a forthcoming project at the company.

No official explanation has been offered by Sony for the recruitment drive, which was reported today by ZDNet Japan - however, it's not hard to work out why the company might want such experts on its staff.

Some speculation has suggested that they might be working on emulating older systems on the PS3, which should technically be powerful enough to emulate the PS2; however, since recent initiatives have managed to put most of the core elements of the PS2 onto a single chip which would easily be integrated onto the PS3 board, this seems vaguely unlikely.

What's altogether more likely is that the team will be working on providing emulation for PSone titles on the forthcoming PlayStation Portable, which is expected to be at least as powerful as a PS2 but will have quite different architecture - thus requiring software emulation in order to play PSone games.

An onboard emulator would allow Sony to repackage PSone titles on UMD discs and sell them as PSP titles without any changes to the game code - and indeed, the emulator could even act like some PSone emulators on the PC and automatically upgrade the graphics of the game with higher resolutions, anti-aliasing and sharper textures.
 
Only 256MB of main RAM, or is that external RAM? That would be a mistake if the PS3 only has 256MB of total system memory; I am looking at 512MB-1GB of RAM.
 
qwerty2000 said:
Only 256MB of main RAM, or is that external RAM? That would be a mistake if the PS3 only has 256MB of total system memory; I am looking at 512MB-1GB of RAM.

I believe it's 256MB of main RAM, then 32MB of RAM for the Cell chip. Maybe 32-64MB of on-board RAM for the GPU? They don't need more than that.
 
I doubt 256MB of main RAM is enough; today's PCs come with 512MB of RAM. It is cheap, and by 2004 1GB of RAM is going to be cheap as well.
 
qwerty, console memory is different from PC memory. A console is a fixed platform with no huge OS or bloated software programs running in the background, not to mention programmers specifically program for what they have. Meaning you get a LOT more use out of console memory than that of a PC.

Just look what the Xbox is doing with 64MB of main memory, and this isn't even all for graphics, as it is split up.

256MB of external memory, 32MB of eDRAM for Cell, then 64MB of eDRAM just for graphics will be plenty. Not to mention the PS3 is obviously going to have texture compression, which will help out even more.

Not to mention... I'm sure this RAM isn't going to be the cheapest thing to produce; maybe the cost won't be insane, but it will still be up there. Sony could stick a GB of SDRAM in the PS3, sure, but that RAM sucks; XDR RAM rules. Bottom line.
 
Paul:

Windows doesn't necessarily bloat all that much, since unused OS code gets swapped out to disk. 256MB WILL be very little by 2005 standards, mark my words.

XDR won't necessarily be all that expensive, as Sony would essentially be fabbing it themselves. Nvidia paid through the nose for DDR2 for their miserable NV30 failure because they had to go to the commodities market, which follows the laws of supply and demand. Low supply = high price, and in this case they could only get the RAM from one supplier, i.e. Samsung. Hence, high price... It's not the same thing when you make the chips yourself, or else all proprietary ASIC chips (such as EE, GS, Flipper, Gekko, XGPU, etc.) would be prohibitively expensive since they cater to a very specific market. ;)


*G*
 
Sony will not make the chips... Elpida, Toshiba and Samsung will.

Of course here we have three suppliers and one of them is a close partner...
 
Well, it was just an example that PCs have a shitload of stuff going on in the background that can slow game performance, while consoles don't.

Yeah, XDR won't be KILLER expensive, however it still won't be incredibly cheap.

I do agree with you though, I do see the PS3 coming with that 512MB of XDR; however, if they do that, maybe they will have to cut back on the eDRAM. I think we are all underestimating what the PS3 will be able to do..

Ask people two years prior to the PS2 launch how many polygons the system would be able to push max, and they would say 10-20M; well, everyone was in for a shock.. I have no question it will happen this time as well.
 
qwerty2000 said:
I doubt 256MB of main RAM is enough; today's PCs come with 512MB of RAM. It is cheap, and by 2004 1GB of RAM is going to be cheap as well.

not XDR-DRAM!!

While you can never say no to more RAM, I wonder if this puts an unnecessary limit on the geometry/texture ratio.

Any guesses, ppl?
 
All I can say is that Sony, IBM, Toshiba, and Elpida aren't dumb people, and I'm sure they aren't going to make the same mistake twice as they did with the PS2 and have an incredibly powerful system with limited resources. At least... I hope ;)
 