
Controlling your computer using gestures


by Rom Feria

Controlling devices using gestures is no longer the realm of science fiction. It may have been popularized by Tom Cruise in Minority Report, but that technology has already been put in your living room by Microsoft via its Kinect controller. Some smart TVs, like the ones from Samsung, also come with a camera that tracks gestures, letting you control them. However, there has been no generic peripheral that you can simply add to your computer to get the same functionality.

Yes, Microsoft promised a Kinect-like technology for your desktop, but another company beat them to the punch. Leap Motion, a San Francisco-based company, has released the Leap Motion controller, a small sensor, shaped like the original Apple iPod Shuffle, that connects to your computer via USB. (The UP Department of Computer Science conducted similar research, under the supervision of Prof. Prospero Naval, Jr., that used an ordinary camera to detect and recognize gestures.)

The Leap Motion controller is a small device that sits between the computer or display and the user, much like a keyboard, and tracks any gestures made by your hands. While the current Kinect (before the Xbox One, probably) can only track whole hand and arm movements, the Leap Motion can track each individual finger. You hold your hand over the area covered by the controller and the sensors track any movement that you make. It is pretty amazing how accurately it can track each of your fingers.
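To give a feel for what per-finger tracking enables, here is a toy Python sketch, not the actual Leap Motion SDK, of how an application might classify a swipe from the stream of fingertip positions a sensor reports each frame. The function name and thresholds are illustrative assumptions, not anything from Leap Motion's software.

```python
# Toy illustration (not the Leap Motion SDK): classify a swipe gesture
# from a sequence of fingertip (x, y) positions sampled once per frame.
# Function name and threshold values are assumptions for illustration.

def detect_swipe(positions, min_distance=100.0, max_drift=30.0):
    """Return 'left', 'right' or None for a list of (x, y) fingertip samples."""
    if len(positions) < 2:
        return None
    (x0, y0), (x1, y1) = positions[0], positions[-1]
    dx, dy = x1 - x0, y1 - y0
    # A swipe travels far horizontally while staying roughly level vertically.
    if abs(dx) >= min_distance and abs(dy) <= max_drift:
        return "right" if dx > 0 else "left"
    return None

# Example: a fingertip moving steadily to the right across five frames.
samples = [(0, 0), (30, 2), (60, 5), (90, 3), (120, 1)]
print(detect_swipe(samples))  # right
```

A real controller would feed positions in continuously and run a check like this over a sliding window of recent frames.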

Installing the device is as simple as going to the Leap Motion website to download the software, plugging in the controller and then going through the step-by-step installation. Once installation is done, it brings you to the Airspace app, which is akin to the iTunes App Store, where you can download both free and paid applications that are optimized for the controller.

I installed it on my iMac and downloaded a popular game, Cut the Rope, just to see how the gesture controls perform. In this game, you need just a finger to slice the ropes. It takes a few seconds to get used to the tracking, but once you have, it feels a bit more natural than using a trackpad or a mouse. In fact, it is very similar to how you play it on an iOS device, just without touching the screen.

There are dozens of applications available, and I am sure that developers are coming up with more applications that support this new UI. I have downloaded a couple of other applications, mostly educational, and the gesture interface makes using the apps more fun. For instance, the chemistry application allows you to use your hands to zoom in and out of the molecules, rotate them, and so on. While the app is a demo, it would be perfect if you could use it to combine molecules and see what happens. Another app allows you to dissect a skull: you can rotate it, zoom in and out, point at a portion (and the app identifies it), cut it out, put it back and more. There is a paid app that I expect to be more functional.
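The zoom gesture in apps like the chemistry demo can be sketched in a few lines. The following toy Python example, my own guess at the mechanism rather than any app's actual code, maps the changing distance between two fingertips to a clamped zoom factor.

```python
import math

# Toy sketch of a pinch-to-zoom mapping (illustrative assumption, not
# any app's real code): the ratio of the current to the initial distance
# between two fingertips becomes the model's scale factor.

def finger_distance(p1, p2):
    """Euclidean distance between two fingertip (x, y, z) positions."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p1, p2)))

def zoom_factor(initial_pair, current_pair, min_factor=0.25, max_factor=4.0):
    """Scale factor for the on-screen model, clamped to a sane range."""
    d0 = finger_distance(*initial_pair)
    d1 = finger_distance(*current_pair)
    if d0 == 0:
        return 1.0  # fingers started together; no meaningful ratio
    return max(min_factor, min(max_factor, d1 / d0))

# Fingers start 40 mm apart and spread to 80 mm: the molecule doubles in size.
start = ((0, 0, 0), (40, 0, 0))
now = ((-20, 0, 0), (60, 0, 0))
print(zoom_factor(start, now))  # 2.0
```

Clamping the factor keeps a jittery or over-eager gesture from blowing the model up or shrinking it to nothing.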

I share the sentiments of my educator friend, Joel Suplido (@jsuplido), that the entire experience is tiring for the arms. That being said, gesture control is one piece of the future of user interfaces. Pair it with augmented reality (Google Glass, perhaps) and natural language processing and voice control (Google Now, Siri), and that is the future we are rapidly seeing unfold. Give it two years, and the keyboard, mouse and trackpad will start to become less essential.