Hacker News

William Gibson's novels The Peripheral (2014) and Agency (2020) have characters who control implanted smartphones (like a HUD on your eyes and an audio channel built into your ears) by using the tip of the tongue on the roof of the mouth. It's possible he was inspired by the video or some other source, but yeah, it's clear the idea has been around for a while.



Speaker for the Dead (1986) has something similar, so the idea is older than that at least.


IIRC the book(s) mainly referenced "subvocal" controls, based on detecting almost-talking movements in the user's face, throat, and voicebox.

I'd argue those are qualitatively different: more like a modern-day system of saying "AssistantBot: Send E-Mail", as opposed to a geometric mapping one could use for, say, painting a picture.



