The new Samsung Galaxy S4 has an interesting new type of gesture called "Air Gesture" that uses an infrared sensor to process the user's hand movements in front of the screen, adding a "pre-touch"/"proximity" event in which the fingers are detected before they touch the screen.
I'd like to use this "Floating Touch" event in one of my apps. I searched both Google and the Samsung Developers site, but I couldn't find any API or information about it.
Is the API available, or is it too early? Does anyone have any links/info?
http://www.samsung.com/global/microsite/galaxys4/lifetask.html#page=airview
There is a good YouTube video from a Google I/O session called "The Sensitive Side of Android" that explains how you could do something similar. Basically, you use the light sensor and register a sudden drop in its readings. This is quite simple, but keep in mind that changing light sources while the app is running can affect the results.
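The drop-detection idea can be sketched as plain logic, separate from the Android sensor plumbing. `LightDropDetector` and its `dropRatio` parameter are names I made up for illustration, not part of any API; on a device you would feed it lux values from a `SensorEventListener` registered for `Sensor.TYPE_LIGHT`:

```java
// Sketch of the "register a drop" idea from the I/O talk.
// LightDropDetector is a hypothetical helper, not an Android API:
// feed it lux readings (e.g. event.values[0] from onSensorChanged
// for a Sensor.TYPE_LIGHT sensor) and it reports when the light
// level drops sharply, suggesting a hand hovering over the sensor.
public class LightDropDetector {
    private final float dropRatio;   // e.g. 0.5f = a 50% drop counts
    private float lastLux = -1f;     // -1 means no reading yet

    public LightDropDetector(float dropRatio) {
        this.dropRatio = dropRatio;
    }

    /** Returns true if this reading is a sharp drop from the previous one. */
    public boolean onLuxReading(float lux) {
        boolean drop = lastLux > 0 && lux < lastLux * (1 - dropRatio);
        lastLux = lux;
        return drop;
    }
}
```

The caveat above applies directly here: any shadow or light-source change produces a drop, so this only approximates a hover event.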
That said, I think Samsung uses hand detection on the front camera to pull off all those gestures, and that would be quite complex to reproduce. I also haven't seen any API that makes hand detection simple.