
I can see why some people might be concerned about the camera, but I'm far more concerned by the microphone. There's far more sensitive and actionable information that can be gathered from me that way! I'm glad that macOS started putting a light in the menu bar when the microphone is in use, but I'd prefer an unhackable hardware indicator instead.



I believe it is possible to turn a speaker into a microphone. Found a paper which claims to do just that[0]. So, there is no safety anywhere?

  SPEAKE(a)R: Turn Speakers to Microphones for Fun and Profit
  It is possible to manipulate the headphones (or earphones) connected to a computer, silently turning them into a pair of eavesdropping microphones - with software alone. The same is also true for some types of loudspeakers. This paper focuses on this threat in a cyber-security context. We present SPEAKE(a)R, a software that can covertly turn the headphones connected to a PC into a microphone. We present technical background and explain why most of PCs and laptops are susceptible to this type of attack. We examine an attack scenario in which malware can use a computer as an eavesdropping device, even when a microphone is not present, muted, taped, or turned off. We measure the signal quality and the effective distance, and survey the defensive countermeasures. 
[0] https://arxiv.org/abs/1611.07350


This only works on audio chipsets that allow pin retasking. Which, coincidentally, includes every Realtek chipset, and those are in practically every PC...

(you also need to plug the speaker in directly, which mostly limits it to headphones and laptop speakers)


Even where it works, speakers are much worse microphones than dedicated microphones, so the amount of data that can be gathered is low. Why bother when you probably have a microphone on the same PC that can capture far more sound?


This isn't about audio fidelity; it's just about picking up audible spoken words, which is definitely possible even with the worst microphone.


I think there was a long period when a typical PC would frequently have only cheap stereo speakers, which are small enough to work far better as pickups than a bare microphone lead. But I'm not sure this works that well once the audio path is HDMI or newer, even if some monitor speakers might otherwise be ideal.


Despite this being a 2016 paper, it's worth noting that this is true in general and has been common(ish) knowledge among electrical engineers for decades. High schoolers and undergrads in electrical engineering classes often discover it independently.

What's notable about this paper is only that they demonstrate it as a practical attack, rather than just a neat fun fact of audio engineering.

As a fun fact, an LED can also be used as a photometer. (You can verify this with just a multimeter, an LED, and a light source.) But I doubt there's any practical attack using a monitor as a photosensor.


> and has been common(ish) knowledge among electrical engineers for decades.

Not only is it common knowledge, it's how drive-thru kiosks work!

Source: I used to test microphone/speakers for a kiosk OEM.


Is it really a single unit that acts as both the speaker and mic? Can it do both simultaneously? Is that why it sounds so trash?


Yes! Using LEDs as photometers is something you don't really see around much anymore, but it is really cool. Even an LED matrix can be used as a self-illuminating proximity sensor with the right setup.

https://www.youtube.com/watch?v=GaAtpAuNN_o


I recall in the early or mid 2000s using some cheap earbuds plugged into the microphone port of my family computer as a pair of microphones, since I had neither a real microphone nor the money for one. Then I used Audacity to turn the terrible recording into a passable sound effect for the video games I was making.

Not knowing much about how sound cards work, I imagine it would be feasible to flash some of them with custom firmware to use the speaker port for input without the user knowing.


This is common at nightclubs (or was) - a DJ can use their headphones as a microphone, speaking into one channel and listening to another

Example https://m.youtube.com/watch?v=1NNP6AFkpjs

:-)


You will still see DJs do this in NYC! Old school flavor. You can also see Skepta rapping into a pair in the music video for That's Not Me: https://www.youtube.com/watch?v=_xQKWnvtg6c

I've seen some theatrical DJs bring a cheap pair, snap them in half, and then use them like a "lollipop." Crowd eats it up. Even older school: using a telephone handset: https://imgur.com/a/1fUghXY


Yup it's wild to me how much anxiety there is about cameras while no mind is given to microphones. Conversations are much more privileged than potentially seeing me in my underwear.

That said, the most sensitive information is what we already willingly transmit: search queries, interactions, etc. We feed these systems with so much data that they arguably learn things about us that we're not even consciously aware of.

Covering your camera with tape seems like a totally backwards assessment of privacy risk.


I’m just going to assume you’re a man, and don’t generally worry about things like revenge porn. Because that is a bigger concern to me than to you, it seems. Sure, I don’t want my sound to be recorded either, but that’s why I put a cover on the webcam AND turn off the physical switch on my (external) microphone. They are both easy things to do.


> Conversations are much more privileged than potentially seeing me in my underwear.

Depends on how you look in underwear.


> Yup it's wild to me how much anxiety there is about cameras while no mind is given to microphones. Conversations are much more privileged than potentially seeing me in my underwear.

It depends on the person; I don't think you could gain much from me. I don't say credit card numbers out loud, I don't talk about hypothetical crimes out loud, and I don't say my wallet seed phrases out loud. I also don't type in my passwords. Yes, you could probably find out what restaurant I'm ordering delivery from, but other than that I suppose my conversations are really boring.


The cost of feeding your entire year's speech to an LLM will be about $0.50 per person. I'm sure that, summarized and searchable, your conversations will be very, very valuable.
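
Rough back-of-the-envelope, with made-up but plausible numbers for daily word count and token pricing (none of these figures come from anywhere authoritative):

  # All constants are assumptions, just to sanity-check the order of magnitude.
  words_per_day = 16_000        # often-cited ballpark for daily spoken words
  tokens_per_word = 1.3         # rough English tokenization ratio
  price_per_m_tokens = 0.10     # USD per million input tokens, budget-tier model

  tokens_per_year = words_per_day * tokens_per_word * 365
  cost = tokens_per_year / 1_000_000 * price_per_m_tokens
  print(f"~{tokens_per_year / 1e6:.1f}M tokens/year, about ${cost:.2f}/person")

That lands within a factor of two of the $0.50 figure, depending on which model you price it against.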


The microphone also can't be covered with a $1 plastic camera cover off Amazon. It's so easy to solve the camera issue if you care about it, but there's really nothing you can do about the mic.


You can do it even cheaper with some painter's tape!

For the mic, perhaps you could disable it by plugging an unconnected TRRS plug into the audio jack. I'm not sure how low-level the switching of the microphone source is when you do this, so maybe it's not a good method.


I went the "batshit insane" route and my microphone hole is plugged in with some clay.

It did most likely physically damage it forever, but at least I now know it's OFF for good.


I tried that with some Sugru on an old phone (a Samsung S10e) and it does a really good job of blocking the microphone.

If you have a case on your phone it's a lot less destructive too, since you can just stuff the Sugru into the microphone hole in the case. The case I was using was soft rubber, so it was easy enough to pop out the corner of the case to be able to use the microphone for a call.

That wasn't my daily phone at the time though, so I'm not sure how well it would work in reality. I could see myself forgetting to pop out the case when I get a call, and the other person just hanging up before I realised what was going on.

It also doesn't work on every phone. I tried the same thing on a Pixel 5 but blocking the mic hole did nothing; that phone uses an under-screen speaker, so maybe there is something similar going on with the mic.


Why not shut it off in the BIOS?


If it can be software-controlled, that doesn't really protect against the route documented for cameras in the original post.


As if there's an option to do so...


FWIW, modern MacBooks also hardware-disable the mic when the lid is closed.

https://support.apple.com/en-ca/guide/security/secbbd20b00b/...


How is that true? I use my MacBook mic occasionally with the lid closed, and an external monitor.


Plus-one-ing this - I think the external monitor may be what keeps the mic active. This drives me up the wall when Google Meet decides to just default to the closed MacBook next to me instead of my already-connected AirPods when joining work meetings.


The closed MacBook next to you has infinitely better sound quality than the AirPods mic, which will sound like you are underwater.


Are you sure it’s the MacBook (T2 or Arm) mic? I imagine you’d sound super muffled if you were trying to use it while closed anyway, so I can’t imagine it’s very usable to yourself?


I just tested this with Voice Memos and can confirm it works, at least in that scenario. The recording didn't stop; the mic was just disconnected, then reconnected when the lid was opened. Using Amphetamine with the script installed, on an M1.


Just to point it out, but there’s a native terminal command `caffeinate` that does the same as Amphetamine.

I use the -disu flags


Mic locks are a thing, or any chopped 3.5mm three-conductor plug should do the trick:

https://mic-lock.com/products/copy-of-mic-lock-3-5mm-metalli...

This still doesn't stop a program from switching the input from the external jack back to the internal mics, though, AFAIK.


I'm not sure an attacker could get much additional sensitive information from me with access to the microphone or the camera if they already have full access to my PC (files, screen captures, keylogger). Most things they would be interested in are already there.


Hardware switch in line with the microphone. Can’t be turned on behind my back.


Wireless noise-cancelling headphones. Oh no, the microphone is back, via Bluetooth.


If you're half-serious about this sort of opsec, you already have Bluetooth disabled. Ideally your hardware wouldn't have support for it at all. Same for Wi-Fi.


My Librem 14 has a microphone+camera kill switch.

Also, on Qubes OS, everything runs in VMs and you explicitly choose which one has access to the microphone and camera (none by default). The admin VM has no network.


Soldering iron to the rescue. Locate the microphone and unsolder it.

I haven't seen any microphone integrated into the processor.

Yet


M2 and newer MacBooks have an IMU on-board, which is just a funny way of spelling microphone. Admittedly a very low quality one; I'm not sure if you could pick up understandable speech at the 1.6kHz sample rate Bosch's IMUs seem to support.
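
For anyone curious what that would even look like, here's a rough post-processing sketch. The sample rate and filter settings are assumptions, and read_accel_samples() standing in for whatever would expose the raw IMU stream is entirely hypothetical:

  # Sketch only: turn one axis of raw accelerometer samples into a WAV you can listen to.
  import numpy as np
  from scipy.io import wavfile
  from scipy.signal import butter, sosfilt

  FS = 1600  # Hz, assumed IMU sample rate

  def accel_to_wav(samples, out_path="imu_audio.wav"):
      # samples: 1-D numpy array, one accelerometer axis captured at FS Hz
      # (obtaining it, e.g. via a hypothetical read_accel_samples(), is the hard part).
      x = samples - samples.mean()
      # High-pass to drop gravity and slow motion; anything above ~800 Hz (Nyquist) is gone anyway.
      sos = butter(4, 80, btype="highpass", fs=FS, output="sos")
      audio = sosfilt(sos, x)
      audio = audio / (np.abs(audio).max() + 1e-12)  # normalize to [-1, 1]
      wavfile.write(out_path, FS, audio.astype(np.float32))

Even in the best case you only get the band below 800 Hz, which is why intelligible speech at that rate is an open question.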


> M2 and newer MacBooks have an IMU on-board, which is just a funny way of spelling microphone. Admittedly a very low quality one; I'm not sure if you could pick up understandable speech at the 1.6kHz sample rate Bosch's IMUs seem to support.

Are there examples of using IMUs to get audio data you could point to? A quick search didn't reveal anything.


There's this paper, which made the news at the time I think: https://crypto.stanford.edu/gyrophone/files/gyromic.pdf

And there's this post, which includes an audio clip: https://goughlui.com/2019/02/02/weekend-project-mma8451q-acc...


> M2 and newer MacBooks have an IMU on-board

Why?


Going into full paranoid mode, I wonder if some other sensors / components can be used as a makeshift microphone. For instance, a sufficiently accurate accelerometer can pick up vibrations, right? Maybe even the laser in a CD drive? Anything else?



Impossible with normal cameras.


"We also explore how to leverage the rolling shutter in regular consumer cameras to recover audio from standard frame-rate videos, and use the spatial resolution of our method to visualize how sound-related vibrations vary over an object’s surface, which we can use to recover the vibration modes of an object."


A condenser microphone is just a capacitor. Your computer is full of them.

They produce a very low-level signal and generally need a pre-amp just to get it out of the microphone. However, conceptually at least, they are there, so maybe someone can get it to work.


Well, unlike a camera, it doesn’t need to be visible to work. Seriously though, there's no technological and almost no economic barrier preventing a mic from being embedded in every wireless communication chip.


Sure, but that requires the manufacturer to intend to spy, in contrast to someone compromising the device after the fact.


How will microphone access be monetized?

For video, it is extortion. For microphone, it's much harder.


Record, produce transcript, look for keywords, alert the puppeteer when something interesting is picked up - trade secrets, pre-shared keys, defense sector intelligence, etc.


And even record keystroke sound to extract passwords.


Only works if there's labeled data of your prior keystrokes to use as training data. Unless there's some uniform per-key manufacturing signature in a widely available keyboard like the MacBook Air's.
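
Right - the attack is basically supervised classification over short audio clips, which is exactly why the labels matter. A minimal sketch, assuming you somehow already had recordings labeled with the key that was pressed (file names and data are invented):

  # Minimal keystroke-audio classifier sketch; labeled_clips is invented example data.
  import numpy as np
  import librosa
  from sklearn.ensemble import RandomForestClassifier

  def clip_features(path):
      y, sr = librosa.load(path, sr=None)
      mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)
      return mfcc.mean(axis=1)

  # Hypothetical labeled recordings of single keystrokes: (wav file, key pressed).
  labeled_clips = [("keystroke_a_01.wav", "a"), ("keystroke_s_01.wav", "s")]

  X = np.array([clip_features(path) for path, _ in labeled_clips])
  labels = [key for _, key in labeled_clips]
  clf = RandomForestClassifier(n_estimators=100).fit(X, labels)

  # Without those labels (or keys that sound identical across every unit of a
  # popular keyboard), there is nothing for the model to learn from.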


macOS is a proprietary binary blob, remotely controlled by Apple. So the light in the menu bar is not a reliable indicator of anything. There is no privacy on macOS, nor any other proprietary system. You can never be 100% sure what the system is doing right now; it could be doing anything it is capable of. Apple puts a lot of money into "teaching people" otherwise, but that is marketing, not truth.


> There is no privacy on macOS, nor any other proprietary system.

Nor is there on any free system for which you didn't make every hardware component yourself, as well as audit the executable of the compiler with which you compiled every executable. (You did self-compile everything, hopefully?)


> Nor is there on any free system for which you didn't make every hardware component yourself, as well as audit the executable of the compiler with which you compiled every executable.

If the components follow standards and have multiple independent implementations, you can be reasonably confident it's not backdoored in ways that would require cooperation across the stack. At least you raise the cost bar a lot. Whereas for a vertically integrated system, made by a company headquartered in a jurisdiction with a national security law that lets the government force companies to secretly compromise themselves, the cost of compromise is so low that it would be crazy to think it hasn't been done.


> You did self-compile everything, hopefully?

Including the compiler, of course.


That's where things get circular, which is why I wrote "audit the compiler". But then, how much can you really trust your hex editor... :)


The root of all trust is eventually some human, or group of humans. See "Reflections on Trusting Trust." At least so far, Apple has convinced me that they are both willing and competent enough to maintain that trust.


Myself, I stopped trusting Apple. There are now too many dark patterns in their software, especially once one stops using their services. And DRM was re-instantiated when iTunes started streaming as Apple Music. On top of that, their lies, such as those about the Butterfly keyboards being fixed, cost me a fortune. They fuck up the keyboard design, and then they buy the computer back for 40% of its original price, due to a microscopic scratch nobody else could see. And that happened to me twice. They put a lot of money into advertising themselves as ethical, but that is only marketing. These, of course, are my personal opinions.


> DRM was re-instantiated when iTunes started streaming as Apple Music

Purchased music is DRM free. Streaming music was never DRM free, since you arguably do not "own" music that you have not purchased. Though I'm sure record labels would love if they could get DRM back on purchased music again.


I get it, free software take, nothing new.

But this is a pretty extremist take. Just because a company doesn't publish source code, and you can't deterministically have 100% certainty, doesn't mean you can't make any assertions about the software.

To refuse to make any claims about software without source is as principled as it is lazy.

Imagine an engineer brought to a worksite without blueprints: can they do no work at all? OK, good for you, but there are engineers who can.


Yes, I think all devices packed with sensors that live in our homes should be transparent about what they do, that is, their code should be available for everyone to see. And yes, it is an extremist take, given where we ended up today.


It’s even dumber than that, because the people who do assurance work don’t rely solely on source even when it’s available.

Reversing the software is already table stakes for assurance work, so suggesting that source is a requirement just doesn’t match reality.


> There is no privacy on macOS, nor any other proprietary system.

Which is to say, every system in actual widespread use. All such CPUs, GPUs, storage devices, displays, etc. run closed microcode and firmware. It'd be funny if it wasn't so profoundly sad.

And even if they didn't, the silicon design is, again, closed. And even if it weren't closed, it's some fab somewhere that manufactures it into a product for you. What are you gonna do, buy an electron microscope, etch/blast it layer by layer, and inspect it all the way through? You'll have nothing left by the end. The synchrotron option isn't exactly compelling either.


Yes, ultimately, I want everything to be open. This is not a bag of rice. These are devices packed with sensors, in our homes. As for inspection, I do not have a problem trusting others. I just do not trust big corporations with remotely controlled binary blobs, no matter how much money they put into the safety and security ads. This is a personal opinion, of course.


> As for inspection, I do not have a problem trusting others. I just do not trust big corporations with remotely controlled binary blobs

I'll just highlight this excerpt of your own words for you, and invite you to evaluate whether your position is even internally consistent.


(Not OP) I don't think that is inconsistent.

Trusting someone to do the right thing at the time you purchase is different from trusting them not to tamper with things remotely in the future. Companies can change management, humans can change their minds. The time factor is important.


Hardware can be and is implemented such that it changes behavior over time too, or has undisclosed remote capabilities. There are also fun features where various fuses blow internally if you do specific things the vendor doesn't fancy.

There sure is a difference in threat model, but I don't think the person I was replying to appreciates that, which is kind of what triggered my reply.


Why do you think my stance is internally inconsistent?

For example, I completely trust Emacs maintainers, as I have yet to see any malice or dark patterns coming from them. The same applies to other free and open source software I use on a daily basis. These projects respect my privacy, have nothing to hide, and I have no problem trusting them.

On the other hand, I see more and more dark patterns coming from Apple, say when signed out of their cloud services. They pour millions into their privacy ads, but I do not trust them to act ethically, especially when money is on the table.

Does this not make sense?


Thinking about it, I might have misunderstood what you wrote a bit. What I read was that you trust people, but then you also don't. That's not really a fair reading of what you wrote.

That being said, I have seen "patterns" with open source software as well, so I'm hesitant to agree on trusting it. But that's a different problem.

I also know how little hardware, microcode and firmware can be trusted, so that doesn't help either.


Thank you for the clarification. I certainly could have worded my comment better. I agree with you that we should never trust open-source software blindly. That said, we can at least audit it, along with every new patch, which is impossible with binary blobs. That is why, I personally think, open source should be preferred, for free and non-free software alike.


> I just do not trust big corporations with remotely controlled binary blobs

Only outstanding individuals such as Jia Tan.


Once malware is installed, the proprietary blobs from my hardware vendor are the least of my concerns. Hence my request for a hardware indicator.


You can watch network traffic for data leaving the device. Trust but verify.
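
Even a crude home-scale watcher gets you something. A bare-bones sketch with scapy (the local IP and allowlist are made-up placeholders, and packet capture needs root):

  # Crude outbound-traffic watcher; LOCAL_IP and KNOWN_GOOD are made-up placeholders.
  from scapy.all import IP, sniff

  LOCAL_IP = "192.168.1.50"                # this machine (hypothetical address)
  KNOWN_GOOD = {"192.168.1.1", "1.1.1.1"}  # destinations you expect to talk to

  def flag_unexpected(pkt):
      if IP in pkt and pkt[IP].src == LOCAL_IP and pkt[IP].dst not in KNOWN_GOOD:
          print(f"outbound to {pkt[IP].dst}: {len(pkt)} bytes")

  sniff(filter="ip", prn=flag_unexpected, store=False)  # requires root privileges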


For something as compressible as voice, I do not know how you would feel confident that data was not slipping through. Edge transcription models (e.g. Whisper) keep getting better, so it would be possible for malware to send a single bit when a user says a trigger word.
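
For a sense of scale, here's roughly what the on-device half could look like with the open-source whisper package; the trigger list and clip name are made up. The transcript never has to leave the machine, only the final bit does:

  # Sketch of local trigger-word spotting; TRIGGERS and the wav name are made up.
  import whisper

  TRIGGERS = {"password", "seed phrase", "acquisition"}

  model = whisper.load_model("tiny")  # small enough to run locally
  text = model.transcribe("captured_clip.wav")["text"].lower()

  heard_something = any(word in text for word in TRIGGERS)
  print(int(heard_something))  # the single bit that would actually be exfiltrated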


Good luck auditing even just a single day of moderately active web browsing.


It's easier than reading all of the code in Ubuntu.


But still entirely impossible. So does it matter?


Network traffic monitoring is routinely done at enterprises. It's usually part-automated using the typical approaches (rules and AI), and part-manual (via a dedicated SOC team).

There are actual compromises caught this way too; it's not (entirely) just for show. A high-profile example would be Kaspersky catching a sophisticated data exfiltration campaign at their own headquarters: https://www.youtube.com/watch?v=1f6YyH62jFE

So it is definitely possible, just maybe not how you imagine it being done.


I do believe that it sometimes works, but it's effectively like missile defense: Immensely more expensive for the defender than for the attacker.

If the attacker has little to lose (e.g. because they're anonymous, doing this en masse against many unsuspecting users, etc.), they're almost certain to eventually succeed.


All cyberdefenses I'm aware of are asymmetric in nature like that, unfortunately.



