
For that matter, even with a touch screen, a gesture is first extracted/identified by touch algorithms (either in the touch controller or on the SoC), and only afterwards is an action taken. So there will definitely be some inherent delay. These algorithms involve classification methods similar to the ones mentioned in this demo.
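For illustration, here is a minimal classify-then-act sketch using web Pointer Events. The element id, the 50px threshold, and the classify() logic are all made up for the example (real touch-controller firmware is far more involved), but the structure shows why the delay is inherent: nothing visible happens until the whole gesture has been observed.

    type Sample = { x: number; y: number; t: number };
    type Gesture = "swipe-left" | "swipe-right" | "tap" | "unknown";

    const pad = document.getElementById("pad")!; // hypothetical gesture surface
    const buffer: Sample[] = [];

    function classify(samples: Sample[]): Gesture {
      if (samples.length < 2) return "tap";
      const dx = samples[samples.length - 1].x - samples[0].x;
      if (dx > 50) return "swipe-right"; // made-up threshold
      if (dx < -50) return "swipe-left";
      return "unknown";
    }

    pad.addEventListener("pointermove", (e) => {
      // Samples are only buffered here; the screen does not change yet.
      buffer.push({ x: e.clientX, y: e.clientY, t: e.timeStamp });
    });

    pad.addEventListener("pointerup", () => {
      // The decision is made only at lift-off, so the visible action
      // necessarily lags the input by the whole gesture duration.
      const gesture = classify(buffer);
      console.log("recognized:", gesture);
      buffer.length = 0;
    });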



But at least on a touch screen I can perform a partial gesture and see a partial result: the image zooms as my fingers pinch the screen, and the map pans as my fingers move. It's a very different feeling from the approach demonstrated in the video.
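A minimal sketch of that continuous alternative, again with Pointer Events (the "map" element id is hypothetical): the transform is updated on every move event, so a half-finished pinch produces a half-applied zoom, with no classification step in between.

    const el = document.getElementById("map")!; // hypothetical zoomable element
    const pointers = new Map<number, { x: number; y: number }>();
    let scale = 1;
    let startDist = 0;

    // Distance between the two active touch points.
    function dist(): number {
      const [a, b] = [...pointers.values()];
      return Math.hypot(a.x - b.x, a.y - b.y);
    }

    el.addEventListener("pointerdown", (e) => {
      pointers.set(e.pointerId, { x: e.clientX, y: e.clientY });
      if (pointers.size === 2) startDist = dist();
    });

    el.addEventListener("pointermove", (e) => {
      if (!pointers.has(e.pointerId)) return;
      pointers.set(e.pointerId, { x: e.clientX, y: e.clientY });
      if (pointers.size === 2) {
        // Feedback tracks the fingers on every event.
        el.style.transform = `scale(${scale * (dist() / startDist)})`;
      }
    });

    el.addEventListener("pointerup", (e) => {
      if (pointers.size === 2) scale *= dist() / startDist; // commit the zoom
      pointers.delete(e.pointerId);
    });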



