
Co-founder of Augmental here. I really appreciate your kind comments. It's been a four-year journey to get here, and we're thankful for the amazing people who've helped us along the way. Feel free to shoot any questions you have my way!



I don’t have any physical disabilities, but I am still interested in something like this, as it may help me smoke less and boost my productivity.

I am curious what this would be like paired with an eye tracker: you look at a particular area, and as soon as you move your tongue slightly, the cursor appears right where you are looking. That way the cursor never feels in the way, or like something you are dragging, and it takes even less tongue movement. I’d give up a regular mouse for that in a heartbeat.

But yeah, congrats on seeing your idea through - impressive stuff :)


Thanks for your message and interest in the MouthPad^! In the future, we plan to integrate various health-tracking capabilities, including gas sensing. This could help monitor smoking habits and might assist in efforts to smoke less.

Your use case sounds entirely feasible and promising. We're also looking at the possibility of coupling eye tracking for cursor movement with the MouthPad^ trackpad, functioning as a gestural trackpad. This setup would allow for left clicks, right clicks, swipes, click-and-drag actions, and other gestures. It's an exciting prospect that could significantly improve the eye-tracking experience by overcoming the typically slow response of dwell-based clicking.

Stay tuned for these developments, and thanks for recognizing our work – it’s greatly appreciated!


This is very cool and looks like it will help a lot of people.

Have you given any thought to the upcoming Vision Pro? I’m guessing it may depend on how it’s received by the public, if it’s worth putting the time into, but with its eye tracking and simple finger tap input, I’m thinking replacing the finger tap with a tongue tap would make for a pretty great experience.


Thanks for your enthusiasm and insight! We're in talks with Apple about integrating the MouthPad^ with the Vision Pro. The goal is to pair our tech with their eye-tracking system, offering a hands-free alternative for clicks and gestures currently done by hand.

Also, as more AR headsets become common, hands-free solutions will be crucial. Traditional hand tracking for clicking isn't always reliable, as hands might not be in the camera's view. The MouthPad^ could provide a consistent way to interact with these technologies, making computing more inclusive.


I cannot wait to see what collaborations people come up with to integrate your device with a head-tracking mouse, e.g. the TrackerPro by AbleNet.

I haven't looked into your product at all, but I am trying to help an eighty-year-old genius succumbing to Parkinsonism.


That's a great idea! We're currently beta testing head tracking on the MouthPad^, which is aimed at improving upon traditional head tracking methods. This feature lets users recenter the tracker by simply placing their tongue on the trackpad, making navigation more intuitive and less fatiguing by reducing neck movements.

We're excited about the potential of the MouthPad^ to help people with various motor abilities. Its broad range of functionalities could be particularly helpful for individuals with conditions like Parkinson's.


If I wanted to support this in a user interface, what would the inputs look like?


The MouthPad^ connects via standard Bluetooth pairing and is compatible with most common operating systems, functioning like a Bluetooth mouse under standard Bluetooth HID protocols. In the future, we will introduce an SDK to enable direct mappings for more tailored applications.
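Since the device shows up as a standard HID mouse, an application needs no device-specific code at all - it just consumes ordinary pointer input. As a rough illustration of what that raw input looks like at the protocol level, here is a minimal Python sketch decoding a generic HID boot-protocol mouse report (a 3-byte packet: button bitmap, then signed X and Y displacements). This is a generic HID example, not MouthPad^-specific code, and the exact report descriptor of any given device may differ.

```python
import struct

def parse_boot_mouse_report(report: bytes) -> dict:
    """Decode a 3-byte HID boot-protocol mouse report.

    Byte 0: button bitmap (bit 0 = left, bit 1 = right, bit 2 = middle)
    Byte 1: signed X displacement
    Byte 2: signed Y displacement
    """
    buttons, dx, dy = struct.unpack("<Bbb", report[:3])
    return {
        "left": bool(buttons & 0x01),
        "right": bool(buttons & 0x02),
        "middle": bool(buttons & 0x04),
        "dx": dx,
        "dy": dy,
    }

# Example: left button held, cursor moved 1 unit left and 5 units down.
event = parse_boot_mouse_report(bytes([0x01, 0xFF, 0x05]))
print(event)  # {'left': True, 'right': False, 'middle': False, 'dx': -1, 'dy': 5}
```

In practice the operating system's HID stack does this decoding for you and delivers normal mouse events, which is why existing user interfaces work with the device unmodified.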



