Sony's Hidden Online Strategy

Shifty Geezer said:
Another concern is if a 'local' server gets switched off in the middle of a transfer, and an alternative will need to be found and the datastream picked up. That could add some very unpredictable lag I'd have thought. You won't notice in file distribution, but in online gaming you probably would. And with millions of users creating millions of nodes, these'll be getting disconnected all the time.
It's covered by another patent "DYNAMIC PLAYER MANAGEMENT" by Masayuki Chatani and Glen Van Datta.

Sis said:
Interesting that everyone takes the P2P communication to mean some P2P topology. I assumed two things about this slide from E3: A) It was put together by a PR person with minimal understanding of online gaming and B) P2P communication was about the communication part, such as chatting. You throw in the term "secure" to give it a nice buzz word feeling.

Seriously, this slide was one of the most ambiguous slides at E3.
That section of the conference was presented by Masayuki Chatani, the same person whose name is on the patent above. I think that tells us something.
 
Bobbler said:
Wouldn't that 'local' server issue be a problem in XbLive as well? It seems like it's one of those issues you can't really get around when you allow games to be hosted by players. It doesn't seem like a big problem, really, as players generally won't randomly turn off their console if they want to play. Also, I would think the net code would be set up so that when the "server" is killed the remaining players can 'judge' who will be the next server (some PC games do this; not sure if any XBLive games do).

I wouldn't think it would be a problem, at least not one that the online gaming space hasn't already seen for ages. Unless, maybe I'm not understanding your concern?
Right, Halo 2 handled this by freezing everyone and selecting a new host.
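The 'judge who will be the next server' step can be sketched deterministically: if every surviving peer runs the same rule over the same peer list, they all pick the same new host with no extra negotiation. A minimal sketch (the peer list format and latency metric are hypothetical, not any actual console netcode):

```python
def elect_new_host(peers, dead_host):
    """Deterministically pick a replacement host after the old one drops.

    Every surviving peer runs the same function on the same peer list,
    so they all agree on the winner without exchanging extra messages.
    `peers` is a list of (peer_id, latency_ms) tuples known to everyone.
    """
    survivors = [(pid, lat) for pid, lat in peers if pid != dead_host]
    if not survivors:
        raise RuntimeError("no peers left to host the game")
    # Prefer the lowest-latency peer; break ties on peer id so the
    # result is identical on every machine.
    survivors.sort(key=lambda p: (p[1], p[0]))
    return survivors[0][0]
```

The "freeze everyone" part of the Halo 2 approach is just pausing simulation while this election and a state handoff happen.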
 
Sis said:
Interesting that everyone takes the P2P communication to mean some P2P topology. I assumed two things about this slide from E3: A) It was put together by a PR person with minimal understanding of online gaming and B) P2P communication was about the communication part, such as chatting. You throw in the term "secure" to give it a nice buzz word feeling.

Seriously, this slide was one of the most ambiguous slides at E3. Therefore, I predict this thread will spiral into 13 pages of two points of view. Side 1: The PS3 network will leverage the CELL networking capabilities, allowing for limitless possibilities. And it will be free. Side 2: Sony has no online strategy.

This would be my guess too. I'd expect some of the stuff in this mysterious online plan to be done with P2P, but I don't imagine all of it would (it just doesn't make sense for all of it). File downloads could be P2P or BitTorrent-style (they could set it up so that when you download anything, your PS3 'shares' the file as long as the console is on and you still have it -- that would likely take quite a bit of stress off the download servers). I imagine the games will be a server/client system, though, like XB Live (where the players are often the servers, or P2P in games with smaller player counts -- ~4 or fewer, possibly). Chat and things like that would likely be P2P.
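The BitTorrent-style redistribution idea -- your console re-seeding a file while it's on -- can be sketched as a toy tracker: consoles that finish a download register as seeds, and new downloaders get a mix of peer seeds plus a central fallback. Everything here (class, host name, API) is hypothetical, not any actual Sony service:

```python
class DownloadTracker:
    """Toy tracker for BitTorrent-style redistribution of store content.

    Consoles that have finished a download and are still online register
    as seeds; new downloaders are handed up to `max_peers` seeds plus the
    central server as a fallback, spreading load off the main pipes.
    """
    CENTRAL_SERVER = "cdn.example.net"  # hypothetical fallback host

    def __init__(self, max_peers=4):
        self.max_peers = max_peers
        self.seeds = {}  # file_id -> set of online consoles holding it

    def register_seed(self, file_id, console):
        self.seeds.setdefault(file_id, set()).add(console)

    def unregister(self, console):
        # Called when a console goes offline mid-session.
        for holders in self.seeds.values():
            holders.discard(console)

    def sources_for(self, file_id):
        peers = sorted(self.seeds.get(file_id, set()))[:self.max_peers]
        return peers + [self.CENTRAL_SERVER]
```

Keeping the central server in the source list means launch day (few consoles, few seeds) degrades to a plain download rather than failing.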

I dunno. It's sort of hard to guess at the moment because we don't even know what features the 'plan' consists of, so it's a bunch of silliness to guess about the implementation of possibly non-existent features (even though I just did, I suppose).
 
one said:
It's covered by another patent "DYNAMIC PLAYER MANAGEMENT" by Masayuki Chatani and Glen Van Datta.
And from what little I read of it, it appears that Halo 2 has something very similar already in place.
That section of the conference was presented by Masayuki Chatani, the same person whose name is on the patent above. I think that tells us something.
My post was about my impression of the slide. It is an opinion based on the wording used and the ambiguity. You can, and apparently have, filled in the gaps of the ambiguity based on finding a patent authored by the presenter. But it doesn't change my impression ;)
 
Bobbler said:
I dunno. It's sort of hard to guess at the moment because we don't even know what features the 'plan' consists of, so it's a bunch of silliness to guess about the implementation of possibly non-existent features (even though I just did, I suppose).
This is a better wording for my point. It's obvious to me that some form of P2P infrastructure will be in place. It's obvious that something involving cyberspace, and the navigating thereof, will be in place. And obviously leveraging the Playstation brand in the title of the online community makes sense.

So what's this slide really tell us?
 
Sis said:
And from what little I read of it, it appears that Halo 2 has something very similar already in place.
My post was about my impression of the slide. It is an opinion based on the wording used and the ambiguity. You can, and apparently have, filled in the gaps of the ambiguity based on finding a patent authored by the presenter. But it doesn't change my impression ;)
Hehe, at least Chatani knows what he's talking about. :smile:
As for "secure", I think it's aimed at content providers. Check out this IBM paper (PDF) about Cell security, "Cell Broadband Engine Support for Privacy, Security, and Digital Rights Management Applications", to fill in another gap or two...
 
Last edited by a moderator:
rabidrabbit said:
What does this mean for the early adopters? If the PS3 is going to be in short supply, there will be fewer P's 2 P, and thus the network experience will be slow(er) and (more) unreliable in the beginning, until there are millions of PS3s connected.

On the other hand, I think hundred(s) of thousands of PS3s online could already make a good P2P network, couldn't it?

Good thing is, it can only get better :)

Well, if they want to, Sony can seed the initial P2P network with its own nodes, or partner with someone who has an established P2P network, even one with PC clients :) . There are very powerful and interesting dynamics if Sony is willing to go that far (but I doubt it, given their track record).

Like others, I think P2P should only be part of a larger initiative.

So far P2P has been proven for media exchange, sharing and communication applications. I wonder whether someone will repackage the PS2 HDD and turn the PS2 into a DLNA media server for this. A lot of working PS2s will be collecting dust soon, assuming PS3 has good backward compatibility. I don't mind repurposing one for a new application at all.

I also think it is a good idea for a small group of friends to play together using their own network resources; cheating in that case may be irrelevant. This particular feature will probably require the game to be written specially for it, though.

And thanks to one for consolidating all the Sony network patents in 1 post. I will look through them when I have the time.
 
Danalys said:
Well, you could split the game world up into packets. With enough players, each packet is backed up, with multiple players sharing the same packet. As someone enters an area, they download the appropriate packet. The difficulty is in updating packets and propagating the new data, especially when new data is coming from many sources. But this is the same as with any online game. You could think of it as dynamically shifting between different hosts, with hosts hosting overlapping game areas.

The difficulty in distributed servers is that you still limit the maximum load on a single server by that server's upstream bandwidth.

Most MMORPGs run distributed servers at the backend, but those have very fast access to the other servers in the cluster, and fast upstream connections to the clients. Even then, high load in specific areas can make games laggy.

Using a distributed server approach in a conventional 4- (or 16- or 32-player, or whatever) online game is actually not a bad idea: you split the responsibility of the server across the machines playing the game. It's more complicated than traditional client-server to implement, though. And without tons of upstream bandwidth, you're still going to be limited to the number of players that a single node in the network can support. I can only see non-local changes as fast as the other peers can send them to me. :(
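Danalys' region-splitting idea, combined with the upstream-bandwidth cap, can be sketched as a placement problem: give each world region a few replica hosts, but never hand a host more regions than its upstream can carry. A rough illustration (all names and the capacity model are hypothetical):

```python
def assign_regions(regions, hosts, replicas=2):
    """Map world regions to player-hosts with redundancy.

    Each region gets `replicas` hosts so the area survives one host
    dropping out, but a host is never given more regions than its
    upstream bandwidth allows. `hosts` maps host_id -> capacity
    (number of regions that host's upstream can serve).
    Returns region -> list of host_ids, or raises if capacity runs out.
    """
    load = {h: 0 for h in hosts}
    assignment = {}
    for region in regions:
        # Pick the least-loaded hosts that still have headroom;
        # tie-break on id so the result is deterministic.
        candidates = sorted(
            (h for h in hosts if load[h] < hosts[h]),
            key=lambda h: (load[h], h),
        )
        if len(candidates) < replicas:
            raise RuntimeError("not enough upstream capacity for " + region)
        chosen = candidates[:replicas]
        for h in chosen:
            load[h] += 1
        assignment[region] = chosen
    return assignment
```

The capacity check is exactly ERP's point: no matter how cleverly you spread regions, total serving capacity is bounded by the sum of the players' upstream links.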
 
Yes, I'm indeed referring to very small groups of players, because I can hardly gather enough friends at the right time to play these days. The traditional board game player count (2-6) would be fine. More is a bonus. In my mind, we get to keep our own scores instead of appearing in the bottom two of the public chart (:D).

For certain games, concurrent users may not be relevant either. My friends and I can just play independently and update the shared game score.

This is kind of related to Ninzel's online gaming post I guess.
 
I think Sony might have had a different P2P model than we are used to:

If, theoretically, you do not turn your PS3 off, but keep it idle and thus still connected to the network even when not playing it...your idle processor cycles (given the nature of CELL) will be working on processing information for someone else's game in some other remote location, given the nature of distributed/grid computing. If I understand correctly, this is what KK was talking about when he spoke of application packets, 100% efficient utilization, and millions of CELLs working in unison -- all theoretical practices in real grid computing.
 
ROG27 said:
I think Sony might have had a different P2P model than we are used to:

If, theoretically, you do not turn your PS3 off, but keep it idle and thus still connected to the network even when not playing it...your idle processor cycles (given the nature of CELL) will be working on processing information for someone else's game in some other remote location, given the nature of distributed/grid computing. If I understand correctly, this is what KK was talking about when he spoke of application packets, 100% efficient utilization, and millions of CELLs working in unison -- all theoretical practices in real grid computing.
Please, enlighten us...what is it about "the nature of CELL" that allows this to happen? I'm ever so interested.
 
I think some of you are reading way way too much into this.

The system is likely to be quite like Live, for games, in so far as players themselves will host games. That, apparently, is subtly different from the precise definition of P2P - that's still client-server, technically, but the servers and clients are all player machines - but Sony's references to P2P are likely simply for things like realtime communication and media sharing between devices.
 
Sethamin said:
Please, enlighten us...what is it about "the nature of CELL" that allows this to happen? I'm ever so interested.

Simply, massive parallelization (having 8 available processing units which can work on independent tasks simultaneously), smart DMA, and a data model for distributed computing that works.

Your PC can already donate idle processing cycles for cancer research today. If you have a closed, secure network consisting of uniform hardware, things can be coordinated much better. I don't see why the grid computing model wouldn't work with the unique nature of CELL.

Examples of practical use right now:

http://folding.stanford.edu/

http://toolbar.google.com/dc/

http://www.cs.queensu.ca/~chen/CyclesDonation/index.html
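The donate-idle-cycles model behind the links above boils down to a simple loop: fetch a work unit from a coordinator, crunch it locally, and send the result back. A toy sketch with stand-in callables (no real Folding@home or Google coordinator API is implied):

```python
def donate_cycles(fetch_unit, compute, submit, max_units=10):
    """Volunteer-computing donation loop, in the style of the links above.

    Repeatedly fetch a work unit, crunch it, and send the result back;
    stop when the coordinator runs out of work or `max_units` is hit.
    The three callables stand in for real network calls (hypothetical API).
    Returns the number of units completed.
    """
    done = 0
    while done < max_units:
        unit = fetch_unit()
        if unit is None:  # coordinator is out of work
            break
        submit(compute(unit))
        done += 1
    return done
```

The key property making this work for batch science jobs is that each unit is independent, so losing a node just means re-issuing its unit -- which is also why the model transfers poorly to latency-sensitive game state.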
 
Titanio said:
I think some of you are reading way way too much into this.

The system is likely to be quite like Live, for games, in so far as players themselves will host games. That, apparently, is subtly different from the precise definition of P2P - that's still client-server, technically, but the servers and clients are all player machines - but Sony's references to P2P are likely simply for things like realtime communication and media sharing between devices.

I too believe that the most likely use of P2P is for communication and media sharing (which is why I pair it up with DLNA discussion).

However, given Sony's player management patent (among other things), I took the opportunity to talk about a different online gaming dynamic. I should have kept my online gaming discussion in Ninzel's thread to keep the discussion focused.
 
ROG27 said:
Simply, massive parallelization (having 8 available processing units which can work on independent tasks simultaneously), smart DMA, and a data model for distributed computing that works.

Your PC can already donate idle processing cycles for cancer research today. If you have a closed, secure network consisting of uniform hardware, things can be coordinated much better. I don't see why the grid computing model wouldn't work with the unique nature of CELL.

Examples of practical use right now:

http://folding.stanford.edu/

http://toolbar.google.com/dc/

http://www.cs.queensu.ca/~chen/CyclesDonation/index.html

Like many have indicated before, not many applications can gain from such a distributed computing model. To my knowledge, it has only been applied to batch-oriented, large-grain problems to date.

In its general form, one needs to solve the following problems:
* Partition and send the right subset of data to each local node
* Gather statistics/heartbeats from all the nodes in the vast network for load-balancing purposes
* Handle redundancy when part of the network suddenly becomes unavailable
* Compute, route and consolidate result sets in near real time for the users

There are too many issues and too many uncertainties in the home user environment to ensure quality of service. You can Google "Parallel Virtual Machine (PVM)" for one of the earliest attempts.
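The heartbeat and redundancy points can be sketched together: nodes that miss a heartbeat window are declared dead and their outstanding work units go back on a queue for reassignment. A toy monitor (all names hypothetical; time is passed in explicitly to keep it deterministic):

```python
class HeartbeatMonitor:
    """Toy heartbeat tracker with work-unit requeueing.

    Nodes ping periodically; a node that misses the timeout window is
    declared dead and its outstanding work units go back on `pending`
    for reassignment to healthier nodes.
    """
    def __init__(self, timeout_s=30):
        self.timeout_s = timeout_s
        self.last_seen = {}   # node -> time of last heartbeat
        self.assigned = {}    # node -> set of work-unit ids
        self.pending = []     # work units awaiting (re)assignment

    def heartbeat(self, node, now):
        self.last_seen[node] = now
        self.assigned.setdefault(node, set())

    def assign(self, node, unit):
        self.assigned[node].add(unit)

    def reap(self, now):
        """Drop nodes past the timeout; requeue their work. Returns the dead."""
        dead = [n for n, t in self.last_seen.items()
                if now - t > self.timeout_s]
        for n in dead:
            self.pending.extend(sorted(self.assigned.pop(n, set())))
            del self.last_seen[n]
        return dead
```

In a living-room network the timeout tuning is the hard part: consoles vanish constantly (power off, ISP hiccups), so the requeue path is the common case rather than the exception.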
 
ROG27 said:
Simply, massive parallelization (having 8 available processing units which can work on independent tasks simultaneously), smart DMA, and a data model for distributed computing that works.

Your PC can already donate idle processing cycles for cancer research today. If you have a closed, secure network consisting of uniform hardware, things can be coordinated much better. I don't see why the grid computing model wouldn't work with the unique nature of CELL.

Examples of practical use right now:

http://folding.stanford.edu/

http://toolbar.google.com/dc/

http://www.cs.queensu.ca/~chen/CyclesDonation/index.html
Your idea, though, doesn't require anything special from CELL other than "leaving the game console on" (I'm sure I saw a Sony patent on that one).
 
Titanio said:
The system is likely to be quite like Live, for games, in so far as players themselves will host games.
That still leaves the question of how gamers will find each other. If the solution is made per game, the experience will vary greatly and there will be no communication without a game in the drive. There still has to be a central service, and it has to be big and beefy to handle the millions that will sign on without going down.
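That central service can be sketched as a thin matchmaking registry: player hosts register their games, clients look them up and then connect to the host directly, so the heavy game traffic never touches the central servers. A toy sketch (class, game title and addresses are all hypothetical):

```python
class LobbyService:
    """Minimal central matchmaking registry for player-hosted games.

    Hosts register an open game; clients ask for joinable games and
    then connect straight to the host. Only tiny lookup traffic hits
    the central service, which is what keeps it cheap to scale.
    """
    def __init__(self):
        self.games = {}  # title -> list of (host_addr, open_slots)

    def register(self, title, host_addr, open_slots):
        self.games.setdefault(title, []).append((host_addr, open_slots))

    def find(self, title, needed_slots=1):
        """Return host addresses with at least `needed_slots` free."""
        return [addr for addr, slots in self.games.get(title, [])
                if slots >= needed_slots]
```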

but Sony's references to P2P are likely simply for things like realtime communication and media sharing between devices.
Isn't that pretty dangerous? I mean, that would allow PS3 gamers to transport entire games across the network. And there's also the problem of PC's posing as PS3's to either distribute illegal content or malicious content. If anything like that happened, Sony would be blamed and they would have no real recourse. I'm not saying it will happen, of course, but rather that this is dangerous.
 
Inane_Dork said:
That still leaves the question of how gamers will find each other. If the solution is made per game, the experience will vary greatly and there will be no communication without a game in the drive. There still has to be a central service, and it has to be big and beefy to handle the millions that will sign on without going down.

It's not really relevant to the topic, but of course there'd be servers to handle that. The games themselves would be hosted by player machines; that's all I'm saying. Somewhere there has to be a Sony or publisher server tracking games and players, etc. Did something I said contradict that?

Inane_Dork said:
Isn't that pretty dangerous? I mean, that would allow PS3 gamers to transport entire games across the network. And there's also the problem of PC's posing as PS3's to either distribute illegal content or malicious content. If anything like that happened, Sony would be blamed and they would have no real recourse. I'm not saying it will happen, of course, but rather that this is dangerous.

I guess that's where "secure" comes in. I'm not necessarily talking about distributing games across the network, but media (music, video, photos, etc.) and communication (voice, IM, etc.), though these are as worthy of protection as anything else.
 
Sis said:
Your idea, though, doesn't require anything special from CELL other than "leaving the game console on" (I'm sure I saw a Sony patent on that one).

But this wouldn't just be going on when CELL is completely idle (i.e. PS3 in sleep mode). The nature of CELL (massive parallelization) could allow for 100% CPU utilization at all times, even when the PS3 is in use and on the network (idle SPEs would never really be idle, because they would be processing application packets cycling in and out over the secure network -- a form of realtime, remote, dynamic load balancing utilizing many threads simultaneously to perform unrelated or indirectly related tasks), making the overall experience more robust for every user.

Just because other systems can share cycles doesn't mean they could share them as productively, since the shared work would have to be interleaved with the game code being processed concurrently. CELL's setup in the PS3 allows for up to 8 TRULY independent threads (compared with XCpu's 3). So much independent processing makes the grid computing model more feasible on a non-dedicated P2P network.
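The idle-SPE speculation above can be modelled as a slot scheduler: the local game reserves however many processing units it needs each frame, remote packets fill whatever is left, and remote work is evicted first the moment the game asks for more. Purely illustrative -- no shipped PS3 system worked this way, and all names are made up:

```python
class SpeScheduler:
    """Toy model of 'idle SPEs crunch remote packets during gameplay'.

    The local game reserves some of the `slots` processing units each
    frame; leftovers are filled from a queue of remote work packets,
    which are evicted first whenever the game's reservation grows.
    """
    def __init__(self, slots=8):
        self.slots = slots
        self.game_slots = 0
        self.remote_running = []

    def reserve_for_game(self, n):
        """The local game claims `n` units; remote work yields first."""
        if n > self.slots:
            raise ValueError("game wants more units than exist")
        self.game_slots = n
        # Evict remote packets until the game's reservation fits.
        while self.game_slots + len(self.remote_running) > self.slots:
            self.remote_running.pop()  # evicted packet must be requeued upstream

    def fill_idle(self, packet_queue):
        """Pull remote packets into any units the game isn't using."""
        while packet_queue and self.game_slots + len(self.remote_running) < self.slots:
            self.remote_running.append(packet_queue.pop(0))
        return list(self.remote_running)
```

Even in this toy, the eviction line hints at the real cost: every evicted packet is wasted work plus wasted bandwidth, which is the objection the distributed-computing posts above keep raising.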
 