The non-standard game interfaces discussion thread (move, voice, vitality, etc.)

Well, actually, in South Africa, they proposed a law (I don't know if it has passed yet) under which a weatherman could get 10 years in jail and a $500,000 fine for predicting the weather incorrectly...

Well it's pretty easy in South Africa to be fair...

"Tomorrow is going to be sunny all day, with the possibility for light showers in the afternoon"... :cool:

;)
 
Heh, on the highveld, during summer, the probability of an afternoon thunderstorm at around 4:15pm approaches 1. Luckily it also lasts only 15 minutes or so.

We also get golf-ball-sized hail on occasion. My parents' car still has the dents. And it snowed once (in my lifetime) in September. :)
 
Would this qualify as "non-standard" for the mobile realm? :LOL:

Anyway, joking aside, after playing a bit more on a Tegra 2 device I believe that we are nowhere near exploiting motion sensitivity properly, especially in what I would call standard games. It got me thinking about the old discussions about the Sixaxis and how I would most likely have a different PoV now.
 
Microsoft releases Robotics Developer Studio 4, bring your own Kinect
http://www.engadget.com/2012/03/10/microsoft-releases-robotics-developer-studio-4-bring-your-own-k/

It's been available in beta for a few months, but Microsoft has now made the final version of its Robotics Developer Studio 4 toolkit available for download. As before, it remains completely free, and it's also now compatible with the release version of the Kinect for Windows SDK so you can build your own beverage-carrying robot like the one Microsoft shows off in the video after the break.


Now quick, someone go make a Vita PS Suite module with Robotics Developer Studio. I see my backyard sentry gun platform becoming a reality soon.
 
Intel drops $21 million for ten percent stake in eye-tracking firm Tobii
http://www.engadget.com/2012/03/16/intel-drops-21-million-for-ten-percent-stake-in-eye-tracking-fi/

Tobii has managed to impress quite a few folks with its eye-tracking technology -- most recently in the form of the "Eye Asteroids" arcade game -- and it looks like Intel has been paying particularly close attention to the company. As Computer Sweden reports, Intel (or Intel Capital, specifically) has now shelled out roughly $21 million to buy a ten percent stake in the Swedish company, which hopes to soon see its eye-tracking system used in everything from desktops and laptops to phones and even vehicles.

Use it on visors!
 
Between Kinect and the new iPad, I want voice integration done better and better; it's really good, and I think that's the future.
 
Sony Mobile files patent for a “sensor-equipped” display
http://www.xperiablog.net/2012/03/2...tent-for-a-sensor-equipped-display/#more-6162

Sony Mobile has filed a patent for a sensor-equipped display. This basically includes a light-transmissive display made out of an opaque material behind which a sensor would sit.

The idea is that you could place a sensor such as a front-facing camera, proximity sensor, illuminance sensor or fingerprint sensor behind the display. As the display is light-transmissive it would be able to detect light passing through it. This would mean that Sony could use more real estate for the display and less for worrying about room for the sensors that normally sit at the top of the phone.


Apple's haptic touch feedback concept uses actuators, senses force on iPhone, iPad
http://www.appleinsider.com/article...es_actuators_senses_force_on_iphone_ipad.html

The concept described in Apple's new patent application is quite different, relying on actuators to physically provide haptic feedback on a touchscreen, rather than giving sensations through an electric field. But it demonstrates Apple's continued pursuit of providing users with some sort of physical feedback when using a touchscreen device.

"The user can typically only feel the rigid surface of the touch screen, making it difficult to find icons, hyperlinks, textboxes, or other user-selectable elements that are being displayed," Apple's filing reads. "A touch-based user interface may help a user navigate content displayed on the display screen by incorporating haptic feedback."

Rather than simply vibrating the device when a button is tapped, as some touchscreen devices do, Apple's solution could utilize piezoelectric actuators for "localized haptic feedback." This would allow the user to feel a virtual button on their fingertips.
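To make "localized haptic feedback" a bit more concrete, here is a minimal Python sketch of one obvious approach: driving whichever actuator in a grid sits nearest the touch point. The patent filing doesn't describe an API, so the class, function names, and grid layout below are all invented for illustration.

```python
# Hypothetical sketch: localized haptic feedback by firing the piezo
# actuator closest to the touch point (all names are invented).
from dataclasses import dataclass
import math

@dataclass
class Actuator:
    x: float  # actuator position on the screen, in millimetres
    y: float

def nearest_actuator(actuators, touch_x, touch_y):
    """Return the index of the actuator closest to the touch point."""
    return min(
        range(len(actuators)),
        key=lambda i: math.hypot(actuators[i].x - touch_x,
                                 actuators[i].y - touch_y),
    )

# A 2x2 grid of actuators behind a 100 x 150 mm screen.
grid = [Actuator(25, 37.5), Actuator(75, 37.5),
        Actuator(25, 112.5), Actuator(75, 112.5)]

# A tap near the top-left corner selects the top-left actuator.
print(nearest_actuator(grid, 10, 20))  # → 0
```

In practice you'd presumably blend several actuators rather than pick one, but nearest-neighbour selection already gives the "feel a virtual button under your fingertip" effect the filing describes.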
 
I know it's wrong and I know I should ignore it, but I just cannot get over the fact that Move peripherals have a sodding glowing ping-pong ball glued to them. It just adds a sense of the ridiculous to it all.
 
Until you play a game with it and it suddenly just seems right. :)
 
Yep, I realized that myself when I was feeling like a badass for owning the CPU in "The Fight". I was a ferocious beast, until I looked at the things I had in my hands :-0 with a pink and a baby-blue glowing orb attached to either one :cool:
 
I thought a red orb in Killzone 3 actually added to the atmosphere quite well ;)

Just started Sports Champions 2 by the way, just tried boxing for three rounds, seems pretty good.
 
If you had a VR headset on, you wouldn't be able to see the light ball. They could also attach the camera to the headset with a fish-eye lens for tracking, so the position of the device would always be correct as long as the camera could see it. You could even have some kind of Kinect or Leap to track your whole arms and fingers. But if you had some kind of haptic feedback apparatus, it might just be better to get positional information from that.
 
Pretty interesting repurposing of Move peripherals with very little added, allowing control of a dual Nav+Move setup:


Basically, the left analog is for movement, the right analog is for the camera, and the spatial tracking of either hand is for controlling... hands/arms.
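That mapping can be sketched in a few lines. This is just my rough reading of the scheme described above, not how DualPlay actually works; the function and field names are mine.

```python
# Rough sketch of the dual Nav+Move mapping: left stick moves, right
# stick turns the camera, and each Move's tracked 3D position drives
# the matching in-game hand. Structure and names are assumptions.

def map_inputs(left_stick, right_stick, left_move_pos, right_move_pos):
    """Turn raw controller readings into a per-frame command dict."""
    return {
        "move": left_stick,            # (x, y), each in [-1, 1]
        "camera": right_stick,         # (x, y), each in [-1, 1]
        "left_hand": left_move_pos,    # tracked (x, y, z) of left Move
        "right_hand": right_move_pos,  # tracked (x, y, z) of right Move
    }

cmd = map_inputs((0.0, 1.0), (0.3, 0.0), (0.1, 1.2, 0.5), (-0.1, 1.1, 0.6))
print(cmd["move"])  # → (0.0, 1.0)
```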
 
I'd never even thought of using two nav cons as a split controller, and that goes even further.

Edit: Found an interview at iwaggle http://www.iwaggle3d.com/2012/12/dualplay-officially-unveiled-its.html

All sounds good, but I would use the analogue nature of the triggers to control both holding and shooting. You'd squeeze the triggers to hold on to the guns but press them all the way to shoot. This would give you the sensation of having a gun primed to shoot.
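The hold-vs-shoot idea boils down to two thresholds on one analogue trigger reading. A minimal sketch, with the threshold values chosen arbitrarily:

```python
# One analogue trigger value in [0, 1] drives both gripping and firing.
GRIP_THRESHOLD = 0.2   # squeeze past this to keep hold of the gun
FIRE_THRESHOLD = 0.9   # press almost all the way in to shoot

def trigger_state(value: float) -> str:
    """Map a raw trigger reading to 'released', 'holding' or 'firing'."""
    if value >= FIRE_THRESHOLD:
        return "firing"
    if value >= GRIP_THRESHOLD:
        return "holding"
    return "released"

for v in (0.0, 0.5, 1.0):
    print(v, trigger_state(v))  # released, holding, firing
```

A real implementation would want some hysteresis around each threshold so the grip doesn't flicker when the reading sits right on the boundary, but the wide gap between the two states is what gives the "primed to shoot" feel.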
 