Sony's Hidden Online Strategy

Inane_Dork said:
That still leaves the question of how gamers will find each other. If the solution is made per game, the experience will vary greatly and there will be no communication without a game in the drive. There still has to be a central service, and it has to be big & beefy to handle the millions that will sign on without going down.

I wouldn't think it would need to be any more beefy than the servers for MSN Messenger, AIM, or ICQ, really. It might need to handle a few other things (just to keep track of state; the server shouldn't need to hold the hand of every user), but most of that stuff would be P2P (like the actual chatting/communication).

A friends list needs very little, and adding more functionality on top of it (like seeing what game a friend is playing and being able to start a multiplayer game with them) requires little as well. Having a separate solution per game doesn't really remove the ability to have a friends list (similar to how some of the PC gaming friends-list services work). It's quite possible to offer an XBLive-like service at a (relatively) minimal cost: the nature of most of the things in question doesn't demand much from the servers, depending on how you set them up (unlike, say, an MMO). I'd imagine most of the bandwidth a client uses happens at the initial login and only sparsely after that (telling the server it's online, registering an IP, and flagging when you start a game/movie/etc.); otherwise almost everything could be P2P, or at least require little interaction with a central server.
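To put a rough shape on how little the central piece would need to do, here's a minimal sketch in Python (entirely hypothetical names and fields, nothing Sony has described): the server only records who's online, where to reach them, and what they're playing, and everything else goes P2P.

```python
# Hypothetical sketch of a minimal presence service: the server only tracks
# who is online, how to reach them, and what they're playing. Chat and game
# traffic would then flow peer-to-peer rather than through this box.
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class Presence:
    ip: str
    port: int
    current_game: Optional[str] = None  # whatever title is in the drive, if any

class PresenceServer:
    def __init__(self) -> None:
        self.online: Dict[str, Presence] = {}  # user id -> presence record

    def login(self, user: str, ip: str, port: int) -> None:
        # One small write at sign-on; after this the server is mostly idle.
        self.online[user] = Presence(ip, port)

    def logout(self, user: str) -> None:
        self.online.pop(user, None)

    def set_game(self, user: str, game: Optional[str]) -> None:
        if user in self.online:
            self.online[user].current_game = game

    def friends_online(self, friends: set) -> Dict[str, Presence]:
        # Returns just enough (IP/port/current game) for a client to connect P2P.
        return {f: self.online[f] for f in friends if f in self.online}

# Example: two users sign on, one looks up a friend to start a session directly.
server = PresenceServer()
server.login("alice", "203.0.113.5", 5000)
server.login("bob", "198.51.100.7", 5000)
server.set_game("bob", "SomeRacingGame")
print(server.friends_online({"bob", "carol"}))
```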

Kind of interesting stuff when you dig into it...
 
ROG27 said:
Just because other systems can share cycles doesn't mean they could share them as productively, since those tasks would have to be interleaved with the game code being processed concurrently. The Cell's set-up on the PS3 allows for up to 8 TRULY independent threads (compared w/ the XCPU's 3). Having that much independent processing makes the grid computing model more feasible on a non-dedicated, P2P network.
So the PS3 OS is constantly going to be monitoring the system, seeing what RAM is available for non-game-related tasks (and we know 512 MB is a total abundance; games only really need 64 MB, right, leaving plenty spare for non-gaming tasks...), seeing what threads aren't in use (even though how the devs use threads is up to them, and it might not be a packet-based, applet model), downloading and uploading content (which isn't going to interfere with online gaming at all, because we know most people have such a huge excess of upload BW available that after game content there's buckets to spare...), and make a wonderful network computer transparent in operation, huh?

Somehow I'm not convinced ;)
 
rabidrabbit said:
Should we revise some old news:

Now, if we replace the "IBM eServer xSeries systems" with "STI "Cell" based IBM BladeCenter Servers", would that still be somewhat relevant news?

Edit: Would such "Grids" of Cell supercomputers be overkill just for PS3 games and downloadable content though?
On the other hand, would the bandwidth taken by gamers be insignificant enough that it would be possible to "lease" the needed bandwidth to gamers from those supercomputer Grids? Maybe in exchange for money, or even for the idle processing time of their PS3s?
It would be cool if your PS3, when you're not playing, would run a seti@home-type application, but instead of trying to find extraterrestrial life it would search for Bin Laden... sti@home!!! (search for terrorists and intruders@home).
It would be sweet irony: first the terrorists buy those PS2s to guide missiles, now a PS3 would be used to find them!!!


Those servers are also used for many other things ;) not just for gaming
 
Shifty Geezer said:
So the PS3 OS is constantly going to be monitoring the system, seeing what RAM is available for non-game-related tasks (and we know 512 MB is a total abundance; games only really need 64 MB, right, leaving plenty spare for non-gaming tasks...), seeing what threads aren't in use (even though how the devs use threads is up to them, and it might not be a packet-based, applet model), downloading and uploading content (which isn't going to interfere with online gaming at all, because we know most people have such a huge excess of upload BW available that after game content there's buckets to spare...), and make a wonderful network computer transparent in operation, huh?

Somehow I'm not convinced ;)

I agree with you that you are not going to see a drastic change in the way things work... but I do believe, yes, there will be swapping of packets for processing and redistribution at some level. In theory 100% CPU utilization at all times would be ideal, but in practice even a 10% increase in efficiency could have a noticeable impact on how tolerable the experience is for the impatient casual player who hasn't learned to deal with the limitations of online play the way the avid/hardcore gamer has. DL/UL rates are the biggest limiting factor, but anything that makes the online world more robust is a plus, in addition to proving through application the viability of the grid computing model for later iterations of the technology.
 
rabidrabbit said:
Should we revise some old news:

Now, if we replace the "IBM eServer xSeries systems" with "STI "Cell" based IBM BladeCenter Servers", would that still be somewhat relevant news?

Edit: Would such "Grids" of Cell supercomputers be overkill just for PS3 games and downloadable content though?
On the other hand, would the bandwidth taken by gamers be insignificant enough that it would be possible to "lease" the needed bandwidth to gamers from those supercomputer Grids? Maybe in exchange for money, or even for the idle processing time of their PS3s?
It would be cool if your PS3, when you're not playing, would run a seti@home-type application, but instead of trying to find extraterrestrial life it would search for Bin Laden... sti@home!!! (search for terrorists and intruders@home).
It would be sweet irony: first the terrorists buy those PS2s to guide missiles, now a PS3 would be used to find them!!!
lol nice one rabbit.
 
Titanio said:
It's not really relevant to the topic, but of course there'd be servers to handle that. The games themselves would be hosted by player machines, that's all I'm saying; somewhere there has to be a Sony or publisher server tracking things and players, etc. Did something I said contradict that?
What I'm saying is that there has to be a central service and it has to be very solid. Some people seem to think that will cost Sony very little, but my point is that it will likely cost them as much (or more) as Live has cost MS. More people with basically the same demand per person.

I guess that's where "secure" comes in.
:LOL: Sure, just throw a word in there and pirates are stopped cold.

I'm not talking about distributing games across the network necessarily, but media (music,video, photos etc.), and communication (voice, im etc.), though these are as worthy of protection as anything else.
Once a P2P network is up, it's used for what the people want. If they want to pirate games, they do so. If they want to seed hacked material, they can. That's the dangerous part, whether Sony thinks it's secure or not.



Bobbler said:
So, basically, MS spent millions on technology and upkeep that they didn't need to, and they knew they didn't need to from their experience with Messenger and Zone. That's pretty fishy, IMO. MS isn't the most thrifty company on the planet, but that's an excess worthy of no less than the US government.
 
ROG27 said:
Simply, massive parallelization (having 8 processing units available which can work on independent tasks simultaneously), a smart DMA, and a data model for distributed computing that works.

Your PC can already donate idle processing cycles for cancer research, today. If you have a closed, secure network, consisting of uniform hardware...things can be coordinated much better. I don't see why the grid computing model wouldn't work with the unique nature of CELL.

Examples of practical use right now:

http://folding.stanford.edu/

http://toolbar.google.com/dc/

http://www.cs.queensu.ca/~chen/CyclesDonation/index.html

This type of sharing only works when the time to compute the result >> time to send the data. Cell does nothing to solve this problem, because it's a fundamental problem with the internet. The internet is designed for high bandwidth with no real regard for latency, a result of its history as a way to transfer data in academia.

It's great if you want to model a supernova and don't care about getting the result back today; it's useless if you need the result in <16 ms.
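To put rough numbers on that condition (all figures here are illustrative guesses, not measurements), a quick sketch of when shipping work to a remote box actually pays off:

```python
# Rough sketch of the "compute time >> transfer time" condition for farming
# work out over the internet. All numbers are illustrative assumptions.
def offload_worthwhile(data_bytes, upstream_bps, downstream_bps,
                       rtt_s, remote_compute_s, local_compute_s):
    # Time to ship the inputs out, plus shipping the result back, plus the
    # round trip, plus the remote compute time, versus just doing it locally.
    transfer_s = (data_bytes * 8) / upstream_bps + (data_bytes * 8) / downstream_bps
    remote_total_s = transfer_s + rtt_s + remote_compute_s
    return remote_total_s < local_compute_s

# Protein-folding style job: hours of compute per megabyte of data -> worth it.
print(offload_worthwhile(1_000_000, 256_000, 1_500_000,
                         0.1, remote_compute_s=3600.0, local_compute_s=7200.0))  # True

# Per-frame game work: the result is needed in < 16 ms, but the round trip
# alone is ~100 ms -> hopeless, no matter how fast the remote Cell is.
print(offload_worthwhile(10_000, 256_000, 1_500_000,
                         0.1, remote_compute_s=0.001, local_compute_s=0.016))    # False
```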
 
Inane_Dork said:
So, basically, MS spent millions on technology and upkeep that they didn't need to, and they knew they didn't need to from their experience with Messenger and Zone. That's pretty fishy, IMO. MS isn't the most thrifty company on the planet, but that's an excess worthy of no less than the US government.

Who says they spent all that much? ;)

I haven't heard a number for how much Live alone cost them. It likely isn't as high as people assume (although depending on how they do the demo/trailer DLs they could be sucking up quite a bit of bandwidth there). Sure they spent millions, but there are a lot of numbers that fit into the range of "millions." I'd be surprised if they are really losing money on Live alone. Who knows, though... I'm very much of the opinion that the cost and difficulty of setting up Live was grossly overstated by MS (to make it seem more worthwhile and a huge accomplishment, and to put doubt into consumers' minds about the competition -- PR/Marketing). I dunno, I just haven't seen anything to make me think otherwise.

I'd love to know how much per month XBLive costs to run. Anyone happen to have that figure? ;)
 
ERP said:
This type of sharing only works when the time to compute the result >> time to send the data. Cell does nothing to solve this problem, because it's a fundamental problem with the internet. The internet is designed for high bandwidth with no real regard for latency, a result of its history as a way to transfer data in academia.

It's great if you want to model a supernova and don't care about getting the result back today; it's useless if you need the result in <16 ms.

How about MMORPGs, where incredibly detailed, in-depth batch statistics could live in a virtually serverless environment and, at the end of the day, be fed back into the game model to add variety to environmental events later on?
 
ROG27 said:
How about MMORPGs, where incredibly detailed, in-depth batch statistics could live in a virtually serverless environment and, at the end of the day, be fed back into the game model to add variety to environmental events later on?


See my earlier post...
Doesn't work unless everyone has a huge amount of upstream bandwidth, or you have some central server that has the bandwidth.

It's a nice idea, but most internet connections are designed to receive data, not to push it.

If there are N players that can observe anything I simulate, I have to send the data to all N of them, so upstream bandwidth requirements scale linearly with the number of players.
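A quick back-of-envelope sketch of that scaling, with assumed figures for a mid-2000s asymmetric home connection (the numbers are mine, purely for illustration):

```python
# Upstream cost of a player-hosted simulation: whatever I simulate has to be
# sent to every observer, so the requirement grows linearly with N.
# All figures are illustrative assumptions, not measurements.
UPSTREAM_BPS = 256_000          # a typical mid-2000s ADSL upload cap
STATE_BYTES_PER_TICK = 200      # assumed size of one entity-state update
TICKS_PER_SECOND = 10           # assumed network update rate

def upstream_needed(observers: int) -> float:
    """Bits per second I must push if `observers` players can see my simulation."""
    return observers * STATE_BYTES_PER_TICK * 8 * TICKS_PER_SECOND

for n in (2, 8, 16, 64):
    need = upstream_needed(n)
    print(f"{n:3d} observers -> {need / 1000:.0f} kbps "
          f"({'fits' if need <= UPSTREAM_BPS else 'exceeds'} a 256 kbps uplink)")
```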
 
Inane_Dork said:
What I'm saying is that there has to be a central service and it has to be very solid. Some people seem to think that will cost Sony very little, but my point is that it will likely cost them as much (or more) as Live has cost MS. More people with basically the same demand per person.

OK, I wasn't really talking about this at all. I was simply looking at the game setup: game management wouldn't be client-server between Sony and the player as such, but between player and player (as on Live for most games).

Inane_Dork said:
:LOL: Sure, just throw a word in there and pirates are stopped cold.

Well, uh, I guess there's an attempt on Sony's part to make it "secure", whatever that means.


Inane_Dork said:
Once a P2P network is up, it's used for what the people want. If they want to pirate games, they do so. If they want to seed hacked material, they can. That's the dangerous part, whether Sony thinks it's secure or not.

I'm not sure where you get the idea that there'd be an open P2P network a la the networks you see on PCs. P2P is simply the topology Sony would use for, and make available to, certain services, not an open free-for-all where I can trade whatever I want with whomever I want. Who said there'd be a file-sharing service? P2P doesn't imply BitTorrent any more than client-server implies Amazon, if you get my drift; it's just a network topology, and it tells us nothing about the applications running on top of it.
 
Titanio said:
I'm not sure where you get the idea that there'd be an open P2P network a la the networks you see on PCs. P2P is simply the topology Sony would use for, and make available to, certain services, not an open free-for-all where I can trade whatever I want with whomever I want.
You keep referring to the ideal, but that's not where the system stays. No company intends to promote illegal activity, but they certainly can do things that allow it. I think this would come very close to allowing a great deal of stuff.

I mean, just look at all the stuff that people have cooked up for MCE and the 360 since launch. They're streaming basically every format of video and audio and I think there's even a web browser. That too is dangerous, though it's been pretty harmless so far.

Who said there'd be a file-sharing service?
I don't understand what else you could possibly mean by having media distributed across a P2P network.
 
Inane_Dork said:
You keep referring to the ideal, but that's not where the system stays. No company intends to promote illegal activity, but they certainly can do things that allow it. I think this would come very close to allowing a great deal of stuff.

I mean, just look at all the stuff that people have cooked up for MCE and the 360 since launch. They're streaming basically every format of video and audio and I think there's even a web browser. That too is dangerous, though it's been pretty harmless so far.

I don't understand what else you could possibly mean by having media distributed across a P2P network.
The MCE and 360 thing is still a client-server model. I would think that P2P is just a fancy way of saying everyone is both a server and a client, kind of like the workgroup model for Windows. The real trick seems to be what to serve and when. Static data is easy to chop up and serve out; games don't tend to be static, so a P2P model is harder to sync.
 
ERP said:
See my earlier post...
Doesn't work unless everyone has a huge amount of upstream bandwidth, or you have some central server that has the bandwidth.

It's a nice idea, but most internet connections are designed to receive data, not to push it.

If there are N players that can observe anything I simulate, I have to send the data to all N of them, so upstream bandwidth requirements scale linearly with the number of players.

To solve the bandwidth problem...

Some people can argue that by building a spanning tree of the active nodes, you can distribute the data to all N nodes without communicating directly with all (N - 1) nodes. Some of the existing P2P networks today use similar techniques. But they are network applications (i.e., communications and file exchange) and are not compute intensive.
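As a toy illustration of the spanning-tree point above (hypothetical code, just to show why the source only talks to a handful of children instead of all N - 1 peers):

```python
# Toy sketch of distributing one piece of data over a spanning tree instead of
# having the source contact all N-1 peers directly. With fan-out k, the source
# sends only k copies and the data reaches everyone in O(log_k N) hops.
def build_tree(nodes, fanout=2):
    """Return {parent: [children]} for a simple k-ary spanning tree over `nodes`."""
    tree = {n: [] for n in nodes}
    for i, n in enumerate(nodes[1:], start=1):
        tree[nodes[(i - 1) // fanout]].append(n)
    return tree

def depth(tree, root):
    """Worst-case number of hops from the root to any leaf."""
    kids = tree[root]
    return 0 if not kids else 1 + max(depth(tree, c) for c in kids)

peers = [f"ps3-{i}" for i in range(15)]
tree = build_tree(peers, fanout=2)
print("source uploads to", len(tree[peers[0]]), "peers instead of", len(peers) - 1)
print("data reaches everyone within", depth(tree, peers[0]), "hops")
```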

To share computing power across the Internet, the primary problem is still the original issue highlighted by ERP: "This type of sharing only works when the time to compute the result >> time to send the data". Successful examples include SETI and various key-breaking exercises, where the acceptable completion time is effectively "infinity".

For most other compute-intensive scenarios, I would argue that it may be easier and cheaper to implement a system based on a simpler computing model to _approximate_ the end result.

It is not impossible to build the system you envision; it's just that the end result usually does not justify the R&D effort involved.
 
patsu said:
Some people can argue that by building a spanning tree of the active nodes, you can distribute the data to all N nodes without communicating directly with all (N - 1) nodes. Some of the existing P2P networks today use similar techniques. But they are network applications (i.e., communications and file exchange) and are not compute intensive.


Sure, in the extreme case of a ring, theoretically you only need to send the data to one person... but your latency is increased by up to a factor of N-1.

You also need a view of any data that must be passed on from you to the rest of the ring: since someone in the ring is always interested in some of the data, you must pass on not only the data you have live (are serving) but copies of all of the rest of the data, so in large collections it's likely you don't even save bandwidth.

To be clear, the N in my example is the number of observers, which in any real MMORPG would be less than the number of players, but still large in extreme cases.

There are probably topologies that will give better average results, but it's going to be very application dependent.
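To illustrate that latency trade-off with made-up numbers (the 50 ms per peer hop below is an assumption, not a measurement):

```python
# Illustrative trade-off: a ring minimizes each node's upstream cost (it sends
# a single copy onward), but the furthest observer waits N-1 hops for the data.
HOP_LATENCY_S = 0.05  # assumed one-way peer-to-peer latency per hop

def ring_worst_case_latency(n_observers: int) -> float:
    # Data is forwarded around the ring, so the last node waits N-1 hops.
    return (n_observers - 1) * HOP_LATENCY_S

def direct_send_latency() -> float:
    # The source sends every observer its own copy: one hop each,
    # but at N-1 times the upstream bandwidth cost.
    return HOP_LATENCY_S

for n in (4, 16, 64):
    print(f"N={n:3d}: ring worst case {ring_worst_case_latency(n) * 1000:.0f} ms, "
          f"direct send {direct_send_latency() * 1000:.0f} ms")
```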
 
Inane_Dork said:
That still leaves the question of how gamers will find each other. If the solution is made per game, the experience will vary greatly and there will be no communication without a game in the drive. There still has to be a central service, and it has to be big & beefy to handle the millions that will sign on without going down.

Bobbler said:
Who says they spent all that much? ;)
I haven't heard a number for how much Live alone cost them. It likely isn't as high as people assume (although depending on how they do the demo/trailer DLs they could be sucking up quite a bit of bandwidth there). Sure they spent millions, but there are a lot of numbers that fit into the range of "millions." I'd be surprised if they are really losing money on Live alone. Who knows, though... I'm very much of the opinion that the cost and difficulty of setting up Live was grossly overstated by MS (to make it seem more worthwhile and a huge accomplishment, and to put doubt into consumers' minds about the competition -- PR/Marketing). I dunno, I just haven't seen anything to make me think otherwise.

I'd love to know how much per month XBLive costs to run. Anyone happen to have that figure? ;)

I don't have that number but...
For user management, you generally need 2 kinds of user registries.

(A) The online user registry (for sign-on and perhaps any other routing functions)
Typically implemented as an in-memory database (see TimesTen: http://www.oracle.com/timesten/index.html), indexed storage, or an RDBMS with lotsa memory. You can also partition the data to spread the load.

Without much optimization, we were able to handle 70,000 - 100,000 concurrent users with a Sun mid-range server in the late '90s.

(B) The registered user registry (for registration and other user search and profiling functions)
This is more problematic if the registration volume is high and you allow flexible queries. We were able to handle over 3 million registered user records using one mid-range server just for registration and a replicated one for queries (lotsa memory for caching). At that time, I know people like eBay deployed a Sun E10K for their registered user database. It also depends on how you design your application to limit wild queries.

These days, we have much more powerful hardware and also more mature software, so it should be easier. I'd imagine bandwidth is still a killer; that's why P2P _is_ attractive.
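For the "partition the data to spread the load" part, here's a toy sketch of hash-partitioning user records across a few registry servers (my own illustration, not how any of the systems above were actually built):

```python
# Toy sketch of partitioning a user registry by hashing the user id, so no
# single server holds every record or answers every query. Purely illustrative.
import hashlib
from typing import Optional

class PartitionedRegistry:
    def __init__(self, n_partitions: int) -> None:
        # Each dict stands in for one registry server (or in-memory database).
        self.partitions = [dict() for _ in range(n_partitions)]

    def _index(self, user_id: str) -> int:
        digest = hashlib.md5(user_id.encode()).hexdigest()
        return int(digest, 16) % len(self.partitions)

    def put(self, user_id: str, record: dict) -> None:
        self.partitions[self._index(user_id)][user_id] = record

    def get(self, user_id: str) -> Optional[dict]:
        return self.partitions[self._index(user_id)].get(user_id)

registry = PartitionedRegistry(n_partitions=4)
for i in range(100_000):
    registry.put(f"user{i}", {"signup": "2006-01-01"})
print([len(p) for p in registry.partitions])   # records spread roughly evenly
print(registry.get("user42"))
```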
 
Inane_Dork said:
I don't understand what else you could possibly mean by having media distributed across a P2P network.

One where Sony only allows sharing of certain types of media. I think I also said twixt my PSP and PS3, not necessarily my PS3 and a stranger's PS3.

It would not be an open one wherein I can share any filetype I like, with whomever I like.

Games taking advantage of the "secure P2P" offered by PS3's OS (or whatever) could use it to let me share pre-defined game content with other strangers from within a game, perhaps, but that's different.

That's just my speculation, though, based on what Sony have talked about. The mere mention of "secure P2P" on a slide tells us nothing about how it'll be applied.
 
ERP said:
Sure, in the extreme case of a ring, theoretically you only need to send the data to one person... but your latency is increased by up to a factor of N-1.

You also need a view of any data that must be passed on from you to the rest of the ring: since someone in the ring is always interested in some of the data, you must pass on not only the data you have live (are serving) but copies of all of the rest of the data, so in large collections it's likely you don't even save bandwidth.

To be clear, the N in my example is the number of observers, which in any real MMORPG would be less than the number of players, but still large in extreme cases.

There are probably topologies that will give better average results, but it's going to be very application dependent.

I didn't see your post. Yes, it's a difficult problem, generally classified as "distributed snapshot" (getting a "global" picture in a fully distributed network). There are quite a few proposed algorithms with different performance characteristics and problem sizes. Some allow only one node to get the global picture, while others allow all the participating nodes to obtain the global picture cooperatively. And yes, it's application specific.

We have only touched the tip of the iceberg. The other problem related to distributed snapshot is actually "time" synchronization between the peer nodes (so you know the global picture you obtain is consistent).

Too many theoretical issues and too many real-life constraints (especially in a home environment as opposed to a well-managed cluster). What do we gain? We must choose the right problem and scope to be effective.
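For the "time" synchronization piece, the textbook building block is a logical clock; a minimal Lamport-clock sketch (generic distributed-systems material, nothing PS3-specific) looks like this:

```python
# Minimal Lamport logical clock: peers can order their events consistently
# without synchronized wall clocks, which is one ingredient of a coherent
# "global picture" in a distributed snapshot. Purely illustrative.
class LamportClock:
    def __init__(self) -> None:
        self.time = 0

    def local_event(self) -> int:
        self.time += 1
        return self.time

    def send(self) -> int:
        # A message carries the sender's clock value.
        return self.local_event()

    def receive(self, msg_time: int) -> int:
        # On receipt, jump ahead of whatever the sender had seen.
        self.time = max(self.time, msg_time) + 1
        return self.time

a, b = LamportClock(), LamportClock()
a.local_event()               # a: 1
stamp = a.send()              # a: 2, message carries 2
b.local_event()               # b: 1
print(b.receive(stamp))       # b: max(1, 2) + 1 = 3 -> receive ordered after send
```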
 
The way I see it, most people have 256 Kbps upload, right? That's 32 KB a second of upload. At 60 fps that affords you only about 0.5 KB per frame of upload data. For a client node uploading just the gamer's own data, that's workable. But if that client node is also passing on data from dozens of other clients, it'll choke.

Until the BW limits are broken, no amount of clever techniques can decrease latency, distribute processing for realtime responses, etc., because there's barely enough BW available for a single upstream or downstream for a complex game, let alone an excess you can use to serve other nodes.
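Worked through with the same assumptions (256 kbps up, 60 updates a second), the budget looks like this, and it shrinks fast once a node also relays other players' data:

```python
# Per-frame upstream budget on a 256 kbps uplink, and how quickly it is eaten
# once a node also has to relay other players' updates.
# Figures are the thread's assumptions, not measurements.
UPSTREAM_BPS = 256_000
UPDATES_PER_SECOND = 60

budget_bytes = UPSTREAM_BPS / 8 / UPDATES_PER_SECOND
print(f"own budget: {budget_bytes:.0f} bytes per update")   # ~533 bytes

# If the node must also forward same-sized updates for `relayed` other clients,
# the share left for its own data shrinks fast.
for relayed in (0, 1, 4, 12):
    share = budget_bytes / (relayed + 1)
    print(f"relaying for {relayed:2d} peers -> {share:.0f} bytes/frame for own data")
```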
 