
How groundbreaking is this? On that note, what is the state of the art for brain-computer interfaces, invasive or non-invasive, with which the user can actually input data into a computer?

As far as I understand the method described in the article, it could eventually serve as an alternative to eye tracking for computer input: instead of determining which letter the user is looking at with cameras pointed at their face and computer vision, you would decode the user's visual cortex directly. One can immediately think of applications even outside the assistive-technology market, e.g., mobile input.
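
To make the idea concrete, here is a minimal sketch (not from the article; GazeSource, LETTER_GRID, DWELL_SECONDS, and all other names are my own assumptions) of a dwell-based letter board that only needs a 2D estimate of where the user is looking. The estimator behind the GazeSource interface could be a camera-based eye tracker today, or in principle a decoder reading the visual cortex, without changing the input loop:

  from dataclasses import dataclass
  from typing import Protocol, Optional
  import time


  @dataclass
  class GazePoint:
      x: float  # normalized screen coordinate, 0..1
      y: float  # normalized screen coordinate, 0..1


  class GazeSource(Protocol):
      """Anything that estimates where the user is looking
      (camera-based eye tracker, cortical decoder, ...)."""
      def read(self) -> Optional[GazePoint]: ...


  LETTER_GRID = [
      "ABCDEF",
      "GHIJKL",
      "MNOPQR",
      "STUVWX",
      "YZ_.,?",
  ]

  DWELL_SECONDS = 0.8  # gaze must rest this long on a cell to "type" it


  def cell_at(point: GazePoint) -> str:
      """Map a normalized gaze point to the letter cell it falls on."""
      rows, cols = len(LETTER_GRID), len(LETTER_GRID[0])
      row = min(int(point.y * rows), rows - 1)
      col = min(int(point.x * cols), cols - 1)
      return LETTER_GRID[row][col]


  def type_by_gaze(source: GazeSource, max_chars: int = 20) -> str:
      """Dwell-based selection: a letter is entered once the gaze
      stays on the same cell for DWELL_SECONDS."""
      typed = ""
      current, since = None, time.monotonic()
      while len(typed) < max_chars:
          point = source.read()
          if point is None:
              current, since = None, time.monotonic()
          else:
              letter = cell_at(point)
              if letter != current:
                  current, since = letter, time.monotonic()
              elif time.monotonic() - since >= DWELL_SECONDS:
                  typed += letter
                  current, since = None, time.monotonic()
          time.sleep(0.02)  # poll at roughly 50 Hz
      return typed

The point of the sketch is just that the hard part (the gaze or cortical decoder) sits entirely behind one interface; the downstream input method stays the same.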



