Tue Jun 14, 2011
As far as I know, there isn’t much interaction between smartphones and computers today. They are mostly used as two independent devices. I can access the files on my phone from my PC and sync contacts and music, but not much more. When I’m in front of my computer I usually have my phone plugged into it. The phone can detect acceleration and has a hi-res touch screen. Why not use the phone to augment the computer?
It would be great if my laptop could read all the phone’s sensors in real time. Then the phone could be used as a game controller: a mouse in one hand, my phone in the other. I could tilt it, touch the screen and even see information about the game on the phone’s display. Maybe it would even be possible to use the phone for aiming, using the phone’s camera to track objects on my computer’s screen (registration points).
But as I don’t play much, I’m more interested in productivity than in games. I’d like to use the phone’s display as an external display for my laptop. For instance, GIMP’s toolbar or a color selector could be shown on my phone, leaving more free pixels on my laptop’s display. While working with my Wacom tablet in my right hand, I’d like to tilt the phone with my left to control color, opacity or brush width. If I’m making music, I’d like to tilt the phone to control the filter frequency or the amount of an effect on a given audio track. If the latency were low enough it would be a much better interface than a mouse and a keyboard, something that could complement other MIDI controllers.
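The mapping itself would be simple. As a rough sketch (the function name and the ±90° range are my own assumptions, and I’m pretending the accelerometer readings have already reached the laptop somehow), tilting the phone gives you a roll angle from gravity, which can be scaled to a MIDI controller value:

```python
import math

def tilt_to_cc(ax: float, az: float) -> int:
    """Map a phone's tilt to a MIDI CC value (0-127).

    ax, az are accelerometer readings (m/s^2) along the phone's
    x and z axes; atan2 recovers the roll angle from gravity.
    """
    angle = math.atan2(ax, az)  # -pi..pi, 0 = phone lying flat
    # Use a comfortable +/-90 degree wrist range, clamped at the ends.
    norm = (angle / (math.pi / 2) + 1.0) / 2.0
    norm = min(max(norm, 0.0), 1.0)
    return round(norm * 127)

print(tilt_to_cc(0.0, 9.81))   # flat on the table -> 64 (mid-scale)
print(tilt_to_cc(9.81, 0.0))   # tilted 90 degrees one way -> 127
print(tilt_to_cc(-9.81, 0.0))  # 90 degrees the other way -> 0
```

The hard part isn’t this arithmetic, of course; it’s getting the sensor stream from the phone to the laptop with low enough latency, which is exactly where the missing APIs come in.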
I think there are some experiments in this direction, but if Google (or someone else) added the required APIs and libraries to Android, it would become much easier to experiment in this field. Hopefully some developers and artists will find this concept interesting enough to work on it.