Project Looking Glass screenshots

Ludicrous as it may sound, there are actually interfaces that "read thoughts." They essentially use an EEG-type machine to measure the electrical impulses in your brain. The computer is programmed to interpret certain electrical patterns (most likely more impulses in one region of the brain or another) as a direction, and thus a cursor is moved via thought.

The user must learn how to use the interface, of course, and last I saw it was pretty hard to have good control over the cursor. (Imagine if a movement in the corner of your eye or a random diversion in thought interrupted your input? How much of a pain would that be?) It's just so much easier to let the brain use the outputs it's designed with (voice or motion) rather than trying to teach the brain to output in a totally new way.
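The pattern-to-direction mapping described above can be sketched in a toy form. This is not a real BCI pipeline; the channel names, thresholds, and band-power values are all made up for illustration, and the "rest" band stands in for the ambiguous-signal problem the post mentions:

```python
# Toy sketch (not a real BCI): map simulated EEG band power in two
# scalp regions to a cursor direction. A dead zone around zero keeps
# the cursor still when the signal is too ambiguous to trust.

def classify_direction(left_power: float, right_power: float,
                       threshold: float = 1.0) -> str:
    """Pick a direction from relative activity in two regions."""
    diff = right_power - left_power
    if abs(diff) < threshold:
        return "rest"  # ambiguous signal: don't move the cursor
    return "right" if diff > 0 else "left"

def move_cursor(x: int, direction: str, step: int = 5) -> int:
    """Nudge a 1-D cursor position one step in the chosen direction."""
    if direction == "right":
        return x + step
    if direction == "left":
        return x - step
    return x

# Simulated readings: stronger activity on the right-hemisphere channel.
x = 100
direction = classify_direction(left_power=2.0, right_power=4.5)
x = move_cursor(x, direction)
print(direction, x)  # right 105
```

The dead zone is the interesting design choice here: it trades responsiveness for stability, which is exactly the tuning problem that makes these interfaces hard to control.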
 
Chalnoth said:
Ludicrous as it may sound, there are actually interfaces that "read thoughts." They essentially use an EEG-type machine to measure the electrical impulses in your brain. The computer is programmed to interpret certain electrical patterns (most likely more impulses in one region of the brain or another) as a direction, and thus a cursor is moved via thought.

The user must learn how to use the interface, of course, and last I saw it was pretty hard to have good control over the cursor. (Imagine if a movement in the corner of your eye or a random diversion in thought interrupted your input? How much of a pain would that be?) It's just so much easier to let the brain use the outputs it's designed with (voice or motion) rather than trying to teach the brain to output in a totally new way.

Of course, people don't have instant mastery of the mouse either; I've seen some people really struggle with it.

Like most things, if a mind interface were going to happen it'd be gradual. Some people would grow up with it, it would seem perfectly natural, and one day you'd only have people left who'd never used a mouse and thought the whole idea a bit odd :). Presumably such an interface wouldn't be limited to PCs, so people would grow up into this method.
 
Well, as that article states, I think the best use of such a system would be as an aid for people who do not have the use of various limbs. I think the move to a direct computer interface would be simply too abstract for application in the near-term.
 
All this brings back memories of the Cyberman 3D HID and the Nintendo Power Glove.

In any case, if you think about it, the way we work with concrete objects on our desks is a pretty good system. I like the idea of a lightweight device like a piece of paper and a pen or pencil to manipulate it; heck, I can go to the park, not just the desk. The point is that the desktop metaphor is pretty good, as are most spatial metaphors. They're very natural for people.

Personally, I'd like to see the 2D desktop improve, with superior handling of workspaces: varied user interfaces and metaphors suited to whatever you're doing. Rather than having to reorganize your desk manually each time to optimize for a certain task, or settling for one generally optimal layout, it should simply reconfigure at the touch of a button.
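The "touch of a button" idea amounts to named layout presets. Here's a minimal sketch of that; the preset names, window names, and screen-fraction scheme are hypothetical, not taken from any real window manager:

```python
# Sketch of per-task workspace presets: each preset names the windows
# to show and the fraction of the screen each should occupy.

PRESETS = {
    "writing": [("editor", 0.7), ("notes", 0.3)],
    "coding":  [("editor", 0.5), ("terminal", 0.3), ("docs", 0.2)],
}

def apply_preset(name: str) -> list[str]:
    """Return the window arrangement for a preset, widest window first."""
    layout = sorted(PRESETS[name], key=lambda w: w[1], reverse=True)
    return [f"{win}: {frac:.0%} of screen" for win, frac in layout]

for line in apply_preset("coding"):
    print(line)
```

Switching tasks then becomes a single `apply_preset("writing")` call instead of dragging windows around by hand, which is exactly the reconfigure-on-demand behavior described above.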
 