Hacking my arm prosthesis to plug into a synth: thought-controlled music [video] (youtube.com)
310 points by tomduncalf on Feb 16, 2020 | 46 comments



Honestly one of the coolest things I’ve seen on HN. It’s a great display of a ton of things that have just recently (10-20 years) become accessible to the masses. 3D printing for the enclosure, the internet for the schematics, and advanced prosthetics to translate the muscle signal into voltage combined with tried and true technologies like relatively simple circuit design and modular synths. Thanks for sharing!


The thing that's really struck me (which is potentially saying a lot - or nothing - because I'm still young) is how cheap PCB fabrication (and assembly!) at the hobbyist level has gotten recently, e.g. an 8-layer ENIG board with double silkscreen and a few gold fingers for $40 to $50 shipped is pretty damn cheap.

Think of all the projects I'll never get round to making!


I don't think making PCBs was ever expensive, except perhaps making a professional-looking one.

In the old days you had a piece of plastic with a copper layer. You put wax on it, then used a needle to remove the wax in the right places (you basically trace the inverse of the circuit).

Place the board in acid, and after the exposed copper dissolves, clean it up and you have a printed circuit board.

There are many techniques, and modern ones involve printers, but you could always make one at low cost. It just might not look that nice.


Doing a complex 8-layer board would have been traditionally expensive... or nonexistent, depending on how long ago we're talking. DIY-accessible boards are usually single- or dual-layer through-hole stuff.


My bad, after reading your comment I noticed the 8 layers. Yeah, I meant only simple PCBs.


"advanced prosthetics to translate the muscle signal into voltage". This is not new. Myoelectric controlled arm protheses were developed and in production already in the 1960ies. https://eduard.horvath-vienna.at/ot2003.140-141.pdf


This is incredible. I think the next step from here - technology-wise, I mean - is figuring out how devices can tap into this without needing to lose a limb. Implants that can read the signals from a limb to control something else could be the bridge between what we have now and brain-computer interfaces.

You know those devices that severely disabled people like the late Stephen Hawking used to interact with their computers? They require only a very small amount of muscle movement to control a switch etc. What if an implant could be put directly into a muscle or a group of muscles (or a nerve ending somewhere more suitable) so that a tiny movement could trigger some action? It would essentially give humans a set of re-programmable multi-function buttons. Connect it to your phone over Bluetooth, and you can set up an action to be triggered by it. How would you filter out regular/involuntary input vs. intentional input? Not sure! But this avenue of development is exciting.
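
A minimal sketch of one possible filter, in Python: require the signal to cross a threshold and stay there for a short dwell time before firing, so brief involuntary twitches get ignored. The envelope stream, the numbers, and the whole frontend are hypothetical here, purely for illustration.

    # Hypothetical: distinguish deliberate contractions from twitches
    # using a threshold + dwell-time (debounce) filter.
    THRESHOLD = 0.6   # envelope level considered "deliberate" (assumed)
    MIN_DWELL = 20    # samples the level must be held (~100 ms at 200 Hz)

    def detect_intentional(samples):
        """Yield True once per held contraction; ignore brief spikes."""
        held, fired = 0, False
        for s in samples:
            if s >= THRESHOLD:
                held += 1
                if held >= MIN_DWELL and not fired:
                    fired = True
                    yield True  # trigger the programmable action here
            else:
                held, fired = 0, False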



Another cool device, which is not an arm and is for input rather than control, is one that lets people see through their tongues.[1]

It's currently targeted at the blind, but I'd imagine sighted people could use it just as well if it picked up, say, infrared, ultraviolet, x-rays, radio waves, ionizing radiation or virtually anything else normally undetectable to human senses.

The potential of human sensory enhancement is truly staggering.

[1] - https://www.newyorker.com/magazine/2017/05/15/seeing-with-yo...


If it were cheap and easy enough, I'd seriously try it as a sighted person, to read while resting my eyes, which routinely get strained by the end of the day. Possibly decoupled from the camera, getting its signal from an adapter that is fed the text directly.

https://www.wicab.com/brainport-vision-pro

"Ready to buy or try a BrainPort Vision Pro? Fill out the patient survey here to determine if you are a good candidate. A Certified Trainer will contact you to discuss the next steps!"

Oh, never mind... I gather from the quoted text that it would probably be prohibitively expensive, and that if I tried to find out more I'd probably divert professionals from helping actual blind people in need of this tech.


It costs $10,000.

I wonder how hard it'd be to engineer something similar for a lot less money, though.

It just looks like a tiny camera mounted in regular sunglasses and connected to a computer, which then translates the pixels from the camera into electric impulses on a grid of electrodes on a "lollipop" paddle that goes into the mouth.

The camera, the computer, and the software translating the signals seem like the easy parts. The only tricky part would be the electrodes: they would have to be spaced pretty closely and create enough of a signal to be felt while not interfering with each other, and at the same time be safe to put into the mouth, not be affected by saliva, not taste bad, resist being accidentally bitten, and probably meet some other engineering challenges I can't think of right now off the top of my head.

It's definitely an interesting challenge.
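
As a rough illustration of the "easy part" (pixels to electrode intensities), here's a minimal Python sketch that block-averages a grayscale frame down to a 20x20 grid of stimulation levels. The grid size and the 0-1 intensity range are assumptions for the example, not BrainPort specifics.

    import numpy as np

    GRID = 20  # electrodes per side (assumed)

    def frame_to_electrodes(frame: np.ndarray) -> np.ndarray:
        """Average pixel blocks down to one intensity (0-1) per electrode."""
        h, w = frame.shape
        bh, bw = h // GRID, w // GRID
        blocks = frame[:bh * GRID, :bw * GRID].reshape(GRID, bh, GRID, bw)
        return blocks.mean(axis=(1, 3)) / 255.0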


It's great that the representatives from the company sent him the relevant parts of the schematic to help him connect his arm to the interface board. Products aren't often built with accessibility in mind, and DIY projects like this inspire better solutions for these groups.


Not only that, but it must be incredibly rewarding to work on something that actually helps people. In addition, they got to work on something that's really cutting edge or even futuristic in a way that most technology work isn't.

It's a huge win all around, and the world would be a much better place if more people in tech did stuff like this instead of figuring out more effective ways of getting people to click ads or spying on them.


The thing that caught me the most about this video has nothing to do with the hack (which is awesome). What impressed me was how natural the prosthesis is, and how naturally he moves with it. Specifically, he grasps his non-prosthetic wrist with his prosthetic hand, casually, while he's talking. It shows how he's able to use the prosthesis without thinking about it. That and the fact that the prosthesis has the same dimensions as a natural limb, and doesn't require any bulky power attachment. I'm guessing this is standard now, or at least not experimental. I didn't realize prosthetics had advanced this far.


Another interesting video along these lines: [1], of a girl with two prosthetic arms.

[1] - https://www.youtube.com/watch?v=4eMQuhxNDJM


"the fact that the prosthesis has the same dimensions as a natural limb, and doesn't require any bulky power attachment."

This has been standard in arm prostheses since at least the 1960s.


For powered ones, vs purely manual? I guess it doesn't need a lot of power.



This is incredible, indeed. And it must be well-understood technology by now. There is, for example, this man who builds prosthetics with Lego: https://www.reuters.com/article/us-spain-lego-prosthetic-arm....


In case anyone's wondering, that cool synthesizer he's using is a Eurorack[1] modular synthesizer[2].

If you are at all interested in this, I'd strongly encourage you to check out Muffwiggler[3], a huge forum for modular synthesizer enthusiasts.

Also useful is ModularGrid[4], a gigantic database of thousands of amazing modules (which also lets you design your own modular setup using these modules).

VCV Rack[5] is also great for playing around with software versions of some of these modules, which can be pretty expensive (the guy in the video has likely invested thousands and thousands of dollars into his Eurorack setup, while VCV Rack is free). On the other hand, you can also build your own Eurorack modules (see the DIY forum on Muffwiggler for that), which can be a lot cheaper.

Something else I was thinking about was that if the main problem the guy in the video was trying to solve was that the knobs and switches on his modules were too difficult to manipulate with his prosthesis, then he could have tried the HackMe Vectr[6][7], which would have let him simply wave his whole hand around above the module to send out three voltages from his hand movements in three-dimensional space. Mind control is, of course, way cooler, and doesn't require movement at all.

Finally, I was also thinking about how his modular interface might have been designed with opto-isolators, to minimize the risk of voltage inadvertently being sent back from the modular into his arm, should he plug his cable into an output rather than an input. Someone with more of an electronics background than me can probably comment on whether this would be a good idea.

[1] - https://en.wikipedia.org/wiki/Eurorack

[2] - https://en.wikipedia.org/wiki/Modular_synthesizer

[3] - https://muffwiggler.com/forum/

[4] - https://www.modulargrid.net/

[5] - https://vcvrack.com/

[6] - https://www.youtube.com/watch?v=HSoVBz-1eQ0

[7] - http://hackmeopen.com/Vectr/


Yeah, some degree of isolation is probably a good idea. The issue wouldn't be an accident due to plugging into an output, but bad grounding or a short to mains. It's unlikely, as it would be a failure condition for the modular synth too, but better safe than sorry. I'd personally like 2-5 kV of galvanic isolation between me (and my $$$ prosthesis!) and the wall.

Plugging into an output would merely make some op-amps unhappy. Maybe a little magic smoke, but nothing lethal...

(Not a real EE and I'm definitely not confident enough to design interfaces between fleshy bits and mains-powered hardware.)


See also the JACK audio and LV2 plugin form of CV - https://llllllll.co/t/jack-lv2-cv-ports/29428


Very cool, I didn’t know this existed! Is it Linux only?


Both JACK and LV2 are multiplatform, but there are Windows build problems for JACK at the moment, and I'm unaware of the current state of things regarding macOS. However, Carla is multiplatform and can do it all internally.

There are a limited number of binary LV2 bundles available for Windows and macOS. That number has been slowly increasing, though only a fraction of LV2 plugins use CV ports so far; it is early days yet.

https://kx.studio/Applications:Carla

https://github.com/falkTX/Carla

https://github.com/lv2/lv2/wiki
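
For anyone curious what a CV port amounts to in code: in JACK it's just an audio-rate port carrying a slowly varying control signal. A minimal sketch using the Python jack-client package (the client/port names and the constant value are illustrative):

    import jack

    client = jack.Client('cv_demo')
    cv_out = client.outports.register('cv_out')
    cv_value = 0.5  # "voltage" in JACK's -1..1 float range

    @client.set_process_callback
    def process(frames):
        cv_out.get_array()[:] = cv_value  # hold the value for this block

    with client:  # activate, run until Enter is pressed
        input('Press Enter to quit...')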


I think there is a big advantage in having some engineering/hacking-minded people be the users of medical devices like this arm prosthesis. I work in medical devices and I always think how good it would be to have test patients who have a deep understanding of our devices. I am sure progress could be accelerated a lot if you had patients who can give better feedback. I know it's not FDA approved, but if I had the device myself I certainly would play with the programming. Obviously the level of experimentation depends on the device. If a malfunction can do serious harm you probably don't want to experiment much. But there are a lot of devices that won't kill you if something goes wrong.


Real user feedback? Do you live in la-la land or something? We're going to need an outside agency, a giant game of telephone, and a bunch of contrived questionnaires to spend that kind of money.


This is really fascinating to me. I've often wondered what exactly makes an artist/musician. Is it the person who can create the music in their head, or the person who can actually write it down and/or play it on an instrument? Up until now, it's always been the latter. Not everybody can read/write music or play an actual instrument, but most can create it in their head. Like, I'm sure most of you can create a simple tune and hum it, but don't know how to read/write music or play an instrument. This kind of thing would allow you to. I wonder how many hidden artists throughout time have created brilliant masterpieces, but just could never express them.


I'd encourage you to read Brian Eno's "Composers as Gardeners"[1], which talks about a way of looking at composition that is pretty different from how people usually think about it.

Many non-musicians are under the impression that composers think up music in their head and then write it down. While some do work that way to some extent (the myth of Mozart working like that in the movie Amadeus is probably the most well-known and most extreme portrayal of composition being completely like that), for most composers (including Mozart) it is much more of an iterative, experimental process.

In recent decades, and especially with the advent of electronic music and computers, a lot more randomness and elements not fully or directly under the composer's control have entered in (though, again, this goes back at least to the age of Mozart, with his dice game[2]).

There's a whole field of algorithmic composition[3] where music generation becomes semi-automated or even completely automated, and the composer's role is one of writing or choosing the algorithms, and of choosing various other aspects of the music (such as which instruments or sounds get used), but without pre-determining the final work in totality or in advance.
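
As a toy illustration of the dice-game idea: the composer writes the raw material (here, a few canned one-bar phrases) and an algorithm assembles the piece. The phrases and form below are invented for the example.

    import random

    phrases = {
        'A': ['C4 E4 G4 E4', 'C4 G4 E4 C4', 'E4 G4 C5 G4'],
        'B': ['F4 A4 C5 A4', 'G4 B4 D5 B4'],
    }

    def compose(form='AABA', seed=None):
        """Roll the dice: pick one phrase per section of the form."""
        rng = random.Random(seed)
        return ' | '.join(rng.choice(phrases[s]) for s in form)

    print(compose(seed=42))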

Brian Eno talks about this much more eloquently and deeply than I can, so I again encourage you to read his talk.

[1] - https://www.edge.org/conversation/composers-as-gardeners

[2] - https://en.wikipedia.org/wiki/Musikalisches_W%C3%BCrfelspiel

[3] - https://en.wikipedia.org/wiki/Algorithmic_composition


I've always wondered why we have to learn a specific musical instrument, and I say this after learning the flute (a simple bamboo one I bought from India).

Even without music theory or knowledge of a specific instrument, we are able to hum or vocally replicate a piece of music - why is it not possible to transform that directly into some instrument's sound?

Is it not possible to hum > grab the pattern > generate instrument music with that?

I mean, I can articulate something best with my voice, which has been with me since I learned to produce sound on my own, so why don't we exploit it with the help of some software?


I had a similar thought until I actually started learning an instrument and my SO, a flutist who graduated from an Italian conservatory, made me start singing with the right intonation (singing the same notes you are playing on the instrument) and rhythm (division and subdivision alongside a metronome). Long story short: singing is hard and requires real effort. There is a good chance that what you think you can articulate is far from acceptable quality in both pitch and timing.


While that's true enough for many people, adding a feedback step and some practice would improve matters quickly.

There is also a good chance someone can learn to encode quite a bit into vocalizations.


"some practice would improve matters quickly"

Anyone who has seriously learned an instrument, or learned to sing, knows that it takes years of very hard practice to improve.


I have done that myself. Both vocal and instrumental.

Yes, but vocalizations may well improve more quickly.

Frankly, I have toyed with this kind of thing. Software could do a lot, in particular, allow the user to adapt their expressions in various ways.


You can to some extent. There are a number of tools that attempt to convert from audio to note information (MIDI).

As someone who's done some experiments with those, there are some major caveats:

- Most people can't sing a melody as well as they think they can. Trying to figure out what they were trying to do is where most of the effort goes. Even I'm mediocre at it, and I'm at a professional level on one instrument and competent on a few others.

- You have to deal with overtones. Almost all naturally produced notes are a stack of waveforms, and figuring out which one carries the intended signal isn't always trivial (and can be even more complicated in polyphonic harmony).
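
For a sense of the core arithmetic involved, here's a crude Python sketch: estimate the fundamental with naive autocorrelation, then convert frequency to a MIDI note number (A440 = note 69, 12 semitones per octave). Real converters are far more robust; exactly as noted above, overtones can easily fool this approach.

    import numpy as np

    def estimate_f0(x: np.ndarray, sr: int, fmin=80.0, fmax=1000.0) -> float:
        """Crude fundamental-frequency estimate via autocorrelation peak."""
        x = x - x.mean()
        ac = np.correlate(x, x, mode='full')[len(x) - 1:]
        lo, hi = int(sr / fmax), int(sr / fmin)
        return sr / (lo + np.argmax(ac[lo:hi]))

    def freq_to_midi(f: float) -> int:
        return round(69 + 12 * np.log2(f / 440.0))

    sr = 44100
    t = np.arange(sr) / sr
    print(freq_to_midi(estimate_f0(np.sin(2 * np.pi * 440 * t), sr)))  # 69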


It is really cool, indeed. Bravo for the creativity and the effort. Facebook recently acquired a company called CTRL-labs that develops a neural interface; their vision is making computer interfaces more natural. Bertolt says in the video: “The thing is for me, that is such a natural thing to do, I do not really have to think about it. I just do it, it is zero effort.” (6:23) I love that; I hope we can have such computer interfaces in the near future.


Great concept and interesting demo; however, the title is pretty misleading. This isn't what I would call "thought controlled" any more than the old Thalmic Myo or any EMG device [0] is thought controlled.

This is reading muscle signals with electrodes and translating those muscle activations into a variable voltage signal. One could argue that because you're not reading mechanical action it could be considered "thought control", but I'll leave that up to the individual to gauge for themselves.

[0] https://techcrunch.com/2018/12/10/ctrl-labs-first-dev-kit-is...
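
As a rough sketch of that muscle-to-voltage chain (the parameter values are guesses, not from the video): rectify the raw electrode signal, low-pass it into a smooth envelope, and scale the envelope to a control-voltage range.

    import numpy as np
    from scipy.signal import butter, filtfilt

    def emg_to_cv(raw: np.ndarray, sr: int = 1000, cutoff: float = 5.0):
        """Rectify + low-pass EMG, then scale to a 0-5 V style CV range."""
        b, a = butter(2, cutoff / (sr / 2))  # 2nd-order low-pass
        envelope = filtfilt(b, a, np.abs(raw))
        peak = envelope.max()
        return 5.0 * envelope / peak if peak > 0 else envelope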


Most cheaper EEG headbands are misadvertised and actually measure skull muscle activation instead of actual brain signals, so there's that...


If anyone owns a NeuroSky (a brain-wave reader that fits on your head) and wants to try out something similar, you can try a simple demo that I wrote: https://github.com/yayitswei/hairtime


Did this remind anybody of Exapunks?

It's a game with a defined assembly-like language; you hack into things to make them do other things. The catch is that some of the things you hack into are actually your own nerves, so that you can still function properly.

Pretty cool game by Zachtronics


Indeed, very cool. I kept thinking that with an Arduino he probably could have built a first prototype without first building a custom circuit. Just nitpicking.


Initially I was wondering why his prosthesis has an ARM processor...


Very cool, skip to the end to hear it in action.


Don't skip to the end. It's not about the final product.


That's awesome. I figure it's only a matter of time before I'll be hard pressed not to make the difficult decision of giving up my arms for the eventual advantage of a prosthetic, and the seamless way he is able to control the synth brings that horizon much closer.

Combine this with a little ML and you could probably have a general-purpose interface, trainable in a few seconds or minutes, to interact with any suitable output based on your personal signals (see the sketch below)! Imagine piloting a vehicle without moving a muscle, or playing a video game, or interacting with someone remotely...

Hell, now that I think of it, there's no reason you have to cut off your arms to do so. Is anyone aware of any open source prosthetic tech/code that one could conceivably hook up to functioning appendages?
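
The sketch: record labeled windows of your signals, extract a few cheap time-domain features, and fit a small classifier mapping them to actions. Everything below (features, shapes, labels) is illustrative, not a real prosthetics API.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def features(window: np.ndarray) -> np.ndarray:
        """Mean absolute value, RMS, and zero-crossing count."""
        return np.array([
            np.mean(np.abs(window)),
            np.sqrt(np.mean(window ** 2)),
            np.sum(np.diff(np.sign(window)) != 0),
        ])

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 256))                 # stand-in signal windows
    y = rng.choice(['grip', 'point', 'rest'], 200)  # stand-in action labels

    clf = RandomForestClassifier().fit([features(w) for w in X], y)
    print(clf.predict([features(X[0])]))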


brb cutting off my arm


I know this is meant as humor, but realistically, as I understand it, you can just tap into the muscular signals with electrodes - that doesn't require any such dramatic step.



