Touchscreens Go Touchless with Elliptic Labs Android SDK

Elliptic Labs has launched what it says is the first SDK to enable touchless gesturing using ultrasound, created for integration into Android handsets.

Elliptic's technology uses sound waves emitted from the device to detect users' hand movements, registering those gestures on screen to control the phone's functions. The company says this approach uses less power than the camera-based or light-based gesture control solutions already on the market.
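Elliptic Labs has not published its API here, but SDKs of this kind typically surface detected gestures to the app through listener callbacks. The sketch below is a purely illustrative piece of plain Java showing that pattern; every class, enum, and method name is an assumption for illustration, not the actual Elliptic Labs API, and the echo events are injected by hand rather than read from a microphone.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical gesture events an ultrasound SDK might report.
enum Gesture { SWIPE_LEFT, SWIPE_RIGHT, HOVER }

// Callback interface an app would implement to react to gestures.
interface GestureListener {
    void onGesture(Gesture gesture);
}

// Stand-in for the SDK's detector. A real implementation would emit
// ultrasound from the speaker and classify echoes picked up by the
// microphone; here we simulate detection events for illustration.
class UltrasoundGestureDetector {
    private final List<GestureListener> listeners = new ArrayList<>();

    void addListener(GestureListener listener) {
        listeners.add(listener);
    }

    // Simulates the SDK recognizing a gesture and notifying the app.
    void simulateDetection(Gesture gesture) {
        for (GestureListener listener : listeners) {
            listener.onGesture(gesture);
        }
    }
}

public class Demo {
    public static void main(String[] args) {
        UltrasoundGestureDetector detector = new UltrasoundGestureDetector();
        detector.addListener(g -> System.out.println("Detected: " + g));
        detector.simulateDetection(Gesture.SWIPE_LEFT);
    }
}
```

An app would register a listener once and then, for example, map a swipe to scrolling or page-turning without the user touching the screen.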

Ultrasound gesture control was demoed by Qualcomm back in 2011 but doesn't appear to have progressed beyond the R&D phase, although the company also acquired assets from an ultrasound specialist last year.

Leap Motion is another mover in this space: it already has both a controller that plugs into a computer via USB and an SDK on the market, but its system uses infrared rather than ultrasound to detect motion.