Senseye

As you think about what the new year will bring in 2012, try to imagine a whole new way to interact with your mobile devices. It was only 2007 when Apple brought us the touchscreen smartphone, and now a program called Senseye will enable people to control their devices with their eyes. It's called eye-movement-based navigation, and the video below shows it in action. A tiny camera is aimed at your eyes and can tell where you are looking on the screen; it tracks your eye movements and scrolls or moves the cursor as needed. The inventors plan to get the software onto some Android phones in 2013, and onto some tablets possibly even earlier.
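To make the idea concrete, here is a minimal sketch of what gaze-driven scrolling might look like in code. Senseye's actual software isn't public, so everything here is an assumption: the get_gaze_point() function stands in for whatever the camera-based tracker reports, scroll() stands in for the platform's scroll call, and the dwell-time logic (scroll only when the gaze lingers near a screen edge) is just one plausible way to avoid scrolling every time your eyes wander.

```python
# A hypothetical sketch of gaze-driven scrolling. get_gaze_point() and
# scroll() are stand-ins for a real eye tracker and platform API; the
# names, thresholds, and dwell logic are all illustrative assumptions.

import random
import time

SCREEN_HEIGHT = 800   # assumed screen height in pixels
EDGE_ZONE = 100       # gaze within this many pixels of the top/bottom edge scrolls
DWELL_SECONDS = 0.5   # how long the gaze must linger there before scrolling


def get_gaze_point():
    """Stand-in for the tracker: return the (x, y) the camera says you're looking at."""
    return random.randint(0, 480), random.randint(0, SCREEN_HEIGHT)


def scroll(direction):
    """Stand-in for the platform scroll call."""
    print(f"scrolling {direction}")


def run(poll_interval=0.05):
    dwell_start = None    # when the gaze first entered an edge zone
    current_zone = None   # "up", "down", or None

    while True:
        _, y = get_gaze_point()
        zone = "up" if y < EDGE_ZONE else "down" if y > SCREEN_HEIGHT - EDGE_ZONE else None

        if zone != current_zone:
            # Gaze moved into a different zone: restart the dwell timer.
            current_zone = zone
            dwell_start = time.monotonic() if zone else None
        elif zone and time.monotonic() - dwell_start >= DWELL_SECONDS:
            # Gaze has lingered near an edge long enough: scroll and reset.
            scroll(zone)
            dwell_start = time.monotonic()

        time.sleep(poll_interval)


if __name__ == "__main__":
    run()
```

The dwell timer is the interesting design choice: without it, an eye-controlled interface triggers actions on every glance (the so-called "Midas touch" problem), so requiring the gaze to hold still for a fraction of a second is a common way to separate intent from noise.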

The technology is a great example of how tools made to help the disabled sometimes end up helping many more people. One example is the PlayStation 3 Avenger controller that has been a big part of the Ocean Marketing/Penny Arcade meme currently taking over your inbox. The controller was built to help a disabled kid play games, but people liked it so much that it is now in much wider use. Another example is exoskeleton suits for people who have limited use of their legs. A suit that can read nerve impulses from your brain and move the corresponding limb is a nascent technology, to be sure, but the possibilities are exciting.

As for eye-tracking technology, it is already used for assistive communication: computers that detect eye movements can let people who cannot speak or type compose messages. Astrophysicist Stephen Hawking doesn't use an eye-tracking computer, but he does use a computer to help him speak by clicking a button with his good hand. The Wall Street Journal has reported on a program called QuickGlance that has helped a student with cerebral palsy study at the Rochester Institute of Technology. Let us know in the comments if you welcome our new robot overlords or if you can't wait to use Senseye on your Android device.