Thin Clients <-> Central Computing Device, or Distributed Computing?

mrcorbo

I'm curious what the general consensus is on which of these two models is more efficient for current and speculative future home-user applications such as gaming, media consumption, content creation, etc.

I've always favored the client<->server model as superior when it comes to accomplishing tasks that would be typical for most home users, but I still see posters pining for a Ken Kutaragi-esque vision of connected computing devices sharing processing power.

I'm looking for input on the general Pros and Cons of each approach and some use cases where one of the models is superior to the other.
 
My 2c on this is neither.
Decent computation at the point of consumption is too cheap for pure thin-client scenarios to make sense, and hosting certain things centrally is too compelling to pass up.

So I think you end up with something in the middle, which I guess ends up close to the WWW model.

I think in the extreme long term distributed computing makes a lot of sense, but the infrastructure just isn't here to do it in the short term.
 
So no specific examples of how a distributed computing model makes sense in the home.

I can look at products like the Kindle Fire and OnLive, which are practical examples of consumer uses for offloading processing from the device the user directly interacts with onto a more powerful device. Yet the only times I see distributed computing being used are in larger projects run by large organizations, where the inherent inefficiencies caused by the overhead of shuffling data around are masked by sheer scale.

I don't see the fundamental inefficiencies of distributed computing being addressed by technological development either, since whatever advances benefit that model would benefit the client-server model even more.
 
Distributed computing examples at home?

At the WAN level, if you use Skype, you're using its P2P network to make calls. Other examples are P2P video sharing and streaming services, and P2P network gaming.

At the LAN or home network level, DLNA is divided into many roles (controller, player, server, renderer, printer). These roles are distributed across different devices to serve music, video, and images. Ad hoc network gaming for portable game consoles is also done locally; however, you can use an "extender" (like the PS3's Adhoc Party software) to connect remote ad hoc groups together.
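For the curious, the glue underneath this is plain UPnP. Here is a minimal sketch (in Python) of the discovery step: send an SSDP M-SEARCH on the LAN and list whatever DLNA/UPnP devices answer. It's not a full DLNA stack, just the multicast search and the response parsing.

import socket

# SSDP multicast address used by UPnP/DLNA devices on the local network.
SSDP_ADDR = ("239.255.255.250", 1900)
MSEARCH = "\r\n".join([
    "M-SEARCH * HTTP/1.1",
    "HOST: 239.255.255.250:1900",
    'MAN: "ssdp:discover"',
    "MX: 2",                 # devices reply within 0..MX seconds
    "ST: ssdp:all",          # search target: ask every device to respond
    "", "",
]).encode("ascii")

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(3.0)
sock.sendto(MSEARCH, SSDP_ADDR)

try:
    while True:
        data, addr = sock.recvfrom(65507)
        # Each reply is an HTTP-style header block; ST names the device or
        # service type, LOCATION points at its XML description document.
        headers = {}
        for line in data.decode("ascii", "replace").splitlines()[1:]:
            if ":" in line:
                key, value = line.split(":", 1)
                headers[key.strip().upper()] = value.strip()
        print(addr[0], headers.get("ST", "?"), headers.get("LOCATION", ""))
except socket.timeout:
    pass
finally:
    sock.close()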

At the system level, the most recent example is probably the Sony Vaio Z (a dock with an external GPU and RAM connected via Light Peak). There is another startup making an external GPU for Mac boxes via Thunderbolt, but it's not a full product yet; it needs Apple to write drivers for it.

Very often, the network services you use are delivered from a cluster of equipment and software at the vendor's sites. When you download XBL/PSN games and movies via Akamai or other CDN providers, you're using their distributed edge servers to pull files. When you use HTTP Live Streaming to watch Netflix movies (e.g., on iOS and Macs), you're pulling files from the local HDD as well as from a network of distributed servers to ensure smooth delivery.

EDIT:
The Wii U is another good candidate, since the main unit interacts with the pad and delivers scenes to it at the same time. We'll need to see how it's done. There were also loose hints of the Wii U interoperating with the DS/3DS.
 
I think I need to clarify, since the term "distributed computing" could actually be applied quite broadly. I would first say, though, that of the examples above I see DLNA and Netflix as examples of Client<->Server models. And the various Thunderbolt docks are examples of peripherals; they have no independent functionality when not attached to a host device. When I say distributed computing in the home, I am referring to leveraging the compute resources of the independent networked devices within a household to collectively accomplish computing tasks.
 
Distributed computing just means you have multiple autonomous systems working together to achieve a common goal. The communication protocol can be client-server RPC, HTTP, a message queue, or something else.
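To make that concrete in the home setting you describe, here is a toy sketch (Python) of the coordinator side: split a job into chunks and push them to other machines on the LAN over HTTP. The worker addresses and the /compute endpoint are made up for the example; each worker is assumed to run a small HTTP service that accepts a JSON chunk and returns a JSON result.

import json
import urllib.request
from concurrent.futures import ThreadPoolExecutor

WORKERS = [                                # hypothetical devices on the home LAN
    "http://192.168.1.20:8080/compute",    # desktop
    "http://192.168.1.21:8080/compute",    # media PC
    "http://192.168.1.22:8080/compute",    # spare laptop
]

def send_chunk(url, chunk):
    """POST one chunk of work to a worker and return its parsed reply."""
    req = urllib.request.Request(
        url,
        data=json.dumps({"values": chunk}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.loads(resp.read())

def distribute(job, workers=WORKERS):
    """Split `job` (a list) round-robin across the workers and collect results."""
    chunks = [job[i::len(workers)] for i in range(len(workers))]
    with ThreadPoolExecutor(max_workers=len(workers)) as pool:
        return list(pool.map(send_chunk, workers, chunks))

if __name__ == "__main__":
    # e.g. ask three household machines to each process a third of the work
    print(distribute(list(range(30))))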

In DLNA's case, depending on what roles are involved in a network, you can have multiple units coordinating the playback between a controller (e.g., PSP), renderer (PS3) and server (file server).
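As a sketch of what the controller role actually does, assuming the usual UPnP AV services: it pushes the server's URL for a piece of content to the renderer's AVTransport service and then tells it to play. The renderer's control URL and the media URL below are invented; in practice the control URL comes from the device description found during discovery.

import urllib.request

SERVICE = "urn:schemas-upnp-org:service:AVTransport:1"

def soap_call(control_url, action, arguments):
    """Issue one UPnP SOAP action against a service control URL."""
    args = "".join(f"<{k}>{v}</{k}>" for k, v in arguments.items())
    body = (
        '<?xml version="1.0"?>'
        '<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/" '
        's:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">'
        f'<s:Body><u:{action} xmlns:u="{SERVICE}">{args}</u:{action}></s:Body>'
        "</s:Envelope>"
    ).encode("utf-8")
    req = urllib.request.Request(control_url, data=body, headers={
        "Content-Type": 'text/xml; charset="utf-8"',
        "SOAPACTION": f'"{SERVICE}#{action}"',
    })
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read()

# Hypothetical addresses: a renderer (e.g. a PS3) and a track on the media server.
RENDERER_CONTROL_URL = "http://192.168.1.30:52323/AVTransport/control"
MEDIA_URL = "http://192.168.1.10:8200/MediaItems/42.mp3"

# Hand the renderer the media URL (some renderers also want DIDL-Lite metadata
# in CurrentURIMetaData), then start playback.
soap_call(RENDERER_CONTROL_URL, "SetAVTransportURI",
          {"InstanceID": 0, "CurrentURI": MEDIA_URL, "CurrentURIMetaData": ""})
soap_call(RENDERER_CONTROL_URL, "Play", {"InstanceID": 0, "Speed": 1})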

In HTTP Live Streaming (e.g., for Netflix), the client pulls a playlist of (incomplete) video segments distributed across multiple servers. The simplest setup is just a client and a server, but an elaborate setup can involve CDN servers, redirects, and even relocation of segments. For example, there is an Apple patent that advises the client to relocate the first segment to the local HDD; on a second playthrough, this allows immediate playback of the media while the client fetches the subsequent segments.
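A rough sketch of that client behaviour (Python, with an invented playlist URL and cache path): read the media playlist, then fetch segments in order, taking the first one from local disk if an earlier playthrough left it there and the rest from whatever servers the playlist points at.

import os
import urllib.request
from urllib.parse import urljoin

PLAYLIST_URL = "http://cdn.example.com/movie/prog_index.m3u8"  # hypothetical
CACHE_DIR = os.path.expanduser("~/.hls_cache")                 # hypothetical

def segment_uris(playlist_url):
    """Return the absolute segment URLs listed in a media playlist."""
    with urllib.request.urlopen(playlist_url) as resp:
        lines = resp.read().decode("utf-8").splitlines()
    # Non-tag, non-blank lines in an .m3u8 are segment URIs (often relative,
    # and possibly spread across several hosts).
    return [urljoin(playlist_url, line)
            for line in lines if line and not line.startswith("#")]

def fetch_segments(playlist_url):
    for i, uri in enumerate(segment_uris(playlist_url)):
        cached = os.path.join(CACHE_DIR, os.path.basename(uri))
        if i == 0 and os.path.exists(cached):
            # First segment was relocated to local disk on a previous
            # playthrough: playback can start immediately from here.
            with open(cached, "rb") as f:
                yield f.read()
        else:
            with urllib.request.urlopen(uri) as resp:
                yield resp.read()

for segment in fetch_segments(PLAYLIST_URL):
    pass  # hand each segment to the decoder/player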

I'm not referring to simple Thunderbolt peripherals (e.g., a display) in my example. In the Vaio Z's implementation, the system gains another GPU when connected to the dock, and the two work together to run Windows applications over the optical bus.
 