Stochastic Resonance (wikipedia.org)
125 points by caseymarquis on March 10, 2018 | 26 comments



I've found a lot of these explanations to be much harder to understand in words, and painfully obvious when explained visually.

This diagram makes it a lot simpler: https://www.researchgate.net/profile/Sverker_Sikstroem/publi...


The visual explanation makes it look like overlaying any sufficiently high-intensity single-frequency signal would achieve the same detectable threshold-crossing events. Is that true? Why does the article focus mainly on white noise?


Yeah, I don't think the picture is correct. It's just showing how a signal can be detected even in a high-noise environment by observing threshold-crossing events.

The article talks about the white noise resonating with the signal but not with the noise. I'm trying to picture white noise as a summation of all frequencies, but I've never thought about the phase of the components. If two signals of the same frequency are in phase, they resonate; if they are out of phase, they cancel each other out. What does the phase of the components look like in white noise?
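(A quick numerical check of my own, for what it's worth — a sketch with numpy — suggests the component phases are simply uniformly random and independent across frequencies:)

    import numpy as np

    # Sketch: inspect the phases of white noise's frequency components.
    rng = np.random.default_rng(0)
    noise = rng.normal(size=65536)        # Gaussian white noise
    spectrum = np.fft.rfft(noise)
    phases = np.angle(spectrum[1:-1])     # drop the DC and Nyquist bins

    # Uniform on [-pi, pi): mean ~0, std ~pi/sqrt(3) ~ 1.81
    print(phases.mean(), phases.std())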


The picture seems to match what the article says, with respect to a detection system having a step-like threshold. It is not a matter of the signal being detected despite a high-noise environment; in this case it would not be detected at all without the noise: the signal (blue) line is always below the threshold (the dotted line). With noise added, the detector picks up the spikes above the threshold (those circled in red), and the time sequence of those spikes has the same periodicity and phase as the original signal (plus some noise).

The article points out that with too much noise, this does not work.
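A minimal numerical sketch of that picture (all parameters made up: a hard threshold at 1.0 and a sine with peak 0.8):

    import numpy as np

    # Sketch: a subthreshold sine becomes detectable as threshold crossings
    # once noise is added; too much noise buries the periodicity again.
    rng = np.random.default_rng(1)
    fs, f_sig = 1000.0, 5.0                  # sample rate, signal frequency (Hz)
    t = np.arange(0, 10, 1 / fs)
    signal = 0.8 * np.sin(2 * np.pi * f_sig * t)   # peak 0.8 < threshold 1.0
    threshold = 1.0

    for sigma in (0.0, 0.3, 3.0):            # no / moderate / excessive noise
        x = signal + rng.normal(scale=sigma, size=t.size)
        events = x > threshold               # detector output: spikes above threshold
        spectrum = np.abs(np.fft.rfft(events - events.mean()))
        freqs = np.fft.rfftfreq(t.size, 1 / fs)
        k = np.argmax(spectrum[1:]) + 1      # dominant non-DC component
        print(f"sigma={sigma}: {events.sum()} events, peak at {freqs[k]:.1f} Hz")

With moderate noise the dominant component of the event train sits at the signal's 5 Hz; with no noise there are no events at all, and with excessive noise the peak wanders off.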


In my mind's ear you're turning up the volume until you hear the beat.

After that, filter out unimportant frequencies and return the volume to normal.

The important data isn't the volume necessarily, but the specific frequency and tempo.


Maybe it is because this example has a simple step threshold. If the response were more smoothly non-linear, perhaps a single high frequency would produce relatively strong intermodulation [1] artifacts at specific frequencies, while white noise generates a more diffuse intermodulation floor (rough sketch below).

[1] https://en.wikipedia.org/wiki/Intermodulation
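A rough sanity check of this idea (a sketch with made-up numbers, not anything from the article): push a weak signal plus either a strong single tone or equal-power white noise through a hard threshold and compare the output spectra.

    import numpy as np

    # Sketch: hard-threshold nonlinearity driven by a weak 5 Hz signal plus
    # either a strong single tone or white noise of comparable RMS power.
    rng = np.random.default_rng(2)
    fs = 1000.0
    t = np.arange(0, 20, 1 / fs)
    weak = 0.8 * np.sin(2 * np.pi * 5.0 * t)

    tone = 0.5 * np.sin(2 * np.pi * 173.0 * t)             # arbitrary carrier
    noise = rng.normal(scale=0.5 / np.sqrt(2), size=t.size)  # same RMS as tone

    for name, helper in (("tone", tone), ("noise", noise)):
        out = (weak + helper > 1.0).astype(float)
        spec = np.abs(np.fft.rfft(out - out.mean()))
        freqs = np.fft.rfftfreq(t.size, 1 / fs)
        top = np.sort(freqs[np.argsort(spec)[-5:]])        # strongest 5 components
        print(name, "strongest lines near:", top, "Hz")

The single tone yields discrete lines (the carrier and its sidebands spaced at the signal frequency), while the noise case shows the signal line over a diffuse background.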


Your supposition seems correct to me, though I may well have misunderstood something. As for the focus on white noise, the history section of the Scholarpedia article [0] suggests it's simply historical contingency.

[0] http://www.scholarpedia.org/article/Stochastic_resonance#His...


Possibly because white noise has a flat spectrum, where all frequencies have equal intensity, so it's easier to extract the difference created by the signal.


Where is the "resonance" here? What would a non-"resonating" version of adding noise to a sinusoid look like?


I don't think there is any resonance. It's just a terrible name for it.


The system I've seen it used in was oscillatory due to regulation: a balance-sensing aid for human feet.

In this article they describe it as a phenomenon of bistable systems. The noise enables the oscillation.
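For reference, the canonical bistable example is an overdamped particle in a double well with a weak periodic drive; a minimal Euler-Maruyama sketch (parameters are illustrative only, not from the article):

    import numpy as np

    # Sketch: overdamped double-well dynamics dx/dt = x - x**3 + A*cos(w*t) + noise.
    # A = 0.2 is below the ~0.385 needed to remove the barrier, so the drive
    # alone never flips the state; noise of the right strength synchronizes
    # hops between the wells with the drive.
    rng = np.random.default_rng(3)
    dt, steps = 1e-2, 100_000
    A, w = 0.2, 2 * np.pi * 0.01            # weak, slow drive (period 100 s)
    sigma = 0.5                              # noise strength (try 0.05 and 2.0 too)

    x = np.empty(steps)
    x[0] = -1.0                              # start in the left well
    for i in range(1, steps):
        drift = x[i-1] - x[i-1]**3 + A * np.cos(w * i * dt)
        x[i] = x[i-1] + drift * dt + sigma * np.sqrt(dt) * rng.normal()

    hops = np.count_nonzero(np.diff(np.sign(x)))
    print("well-to-well transitions:", hops)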


Yeah, reading it I was just thinking "It is dithering. Call it dithering already. DITHERING!"


The article actually states that dithering is a different but related technique.


Exactly. I find it alarming that so many people make bizarre comments as if they understand nothing about basic mathematics. It feels like I'm on Reddit again, and I don't want to be.


"In physics, resonance is a phenomenon in which a vibrating system or external force drives another system to oscillate with greater amplitude at specific frequencies"

Which bit of this doesn't apply?


The "drives with greater amplitude". Resonance requires an amplification of amplitudes not just adding two waves. Otherwise, adding any signal between 90 and -90 degrees out of phase would constitute 'resonance'.


Ceramic resonators are cheap, narrowband filters used in inexpensive FM radios (etc.). If you put one in the feedback path of an op-amp, you'll meet the criteria described by the physics definition, but putting that same component on the input of the op-amp may result in an undetectable signal.


This is essentially signal normalization. Sensor systems operate on a narrow band of inputs using non-linear components (e.g. diodes). Adding noise is like raising the DC voltage on a signal so that the signal shifts into your operating parameters. For example, for a transistor amplifier to work, the base-emitter junction needs roughly a 0.7 V forward drop to turn on and amplify a 10 mV source signal.

What makes white noise interesting is that it has a zero DC component (on average), randomly adding energy as often as it subtracts it, while still passing through analog components (like a capacitor, whose impedance is Z(s) = 1/(sC) in the s-domain). In my view, this is still a form of amplification.

Adding Gaussian noise around your source frequency is certainly source amplification, because it's equivalent to attenuating the surrounding "noisy" frequencies for the benefit of your sensor equipment. This would increase the signal-to-noise ratio, but you have to know what base frequency you are trying to measure.

I hope to be corrected by the signal folks, but it seems misleading to me to say that "the frequencies in the white noise corresponding to the original signal's frequencies will resonate with each other" when the resonance does not occur in the signal but in the resonant body of the sensor; this is more about sensor-system calibration than about boosting an unknown source signal.

It seems the practical implication is that you can have simpler sensors (that work in a narrower band) if you can cheaply control the input noise based on the sensor reading.
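To make the normalization analogy concrete, here's a toy comparison (made-up numbers): a DC bias and zero-mean noise both push a subthreshold sine over the detector threshold, but only the noise leaves the average level untouched.

    import numpy as np

    # Sketch of the normalization analogy: a DC bias and zero-mean noise both
    # lift a subthreshold sine into the detector's operating range, but the
    # noise does it with no average shift in level.
    rng = np.random.default_rng(4)
    t = np.arange(0, 2, 1e-3)
    sig = 0.8 * np.sin(2 * np.pi * 5.0 * t)           # peak 0.8, threshold 1.0

    biased = sig + 0.3                                 # DC shift: mean moves to 0.3
    noisy = sig + rng.normal(scale=0.3, size=t.size)   # noise: mean stays ~0

    for name, x in (("dc bias", biased), ("noise", noisy)):
        print(f"{name}: mean={x.mean():+.3f}, "
              f"fraction above threshold={(x > 1.0).mean():.3f}")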


This is a richly intriguing phenomenon, with interesting implications!

For those interested in reading more, this article (on stochastic resonance's potential importance in biological sensing) is edifying: https://www.physik.uni-augsburg.de/theo1/hanggi/Papers/282.p... (Hänggi, Peter. "Stochastic resonance in biology: how noise can enhance detection of weak signals and help improve biological information processing." ChemPhysChem 3.3 (2002): 285-290.)

(Perhaps this is an example of how biological systems can value accurate sensation so highly that they invent ingenious sensing schemes to achieve high performance.)


I've heard the term "dithering" used for this in electronics, and for a sort of related (reverse?) idea in displaying images on low-color-depth displays, I think. For example, if you have 8-bit ADC data but want finer resolution, you could imagine averaging every pair of samples: if you add the samples (and divide by two), you get an extra bit of information.

But that only works if your signal (including the measurement circuit) is not rock steady, which would make both samples identical, so you gain no information. Adding noise makes the samples differ, which can increase the effective resolution: if you were just on the brink of a discretization bucket, a kick of noise will likely push you into the next bucket.
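A toy version of that (a sketch; the quantize step stands in for an idealized ADC, and the half-LSB dither amplitude is just an illustrative choice):

    import numpy as np

    # Sketch: oversample-and-average an idealized coarse ADC. Without dither a
    # steady input always lands in the same bucket; with dither, averaging many
    # samples recovers sub-LSB resolution.
    rng = np.random.default_rng(5)
    lsb = 1.0                                 # quantization step ("bucket" size)
    true_value = 3.37 * lsb                   # sits between code 3 and code 4
    n = 10_000                                # samples to average

    def quantize(x):
        return np.round(x / lsb) * lsb

    steady = quantize(np.full(n, true_value)).mean()
    dithered = quantize(true_value + rng.uniform(-lsb / 2, lsb / 2, n)).mean()
    print(f"true {true_value:.3f}  steady {steady:.3f}  dithered {dithered:.4f}")

The steady average stays pinned at 3.0 no matter how many samples you take; the dithered average converges on 3.37.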


Dithering isn't the reverse idea, but maybe more of an inverse idea. Instead of adding noise to produce more signal, you add noise to reduce artifacts (unwanted signal in your signal).

Dithering exploits the human senses, so maybe in a way it is the same thing, just for the meatware in our heads.


That's how I meant it. Your eye, optics, and probably nerves too, make averaging happen. Noise gets averaged.

Not sure what distinction you are making between inverse and reverse, but I'm curious.


If you find this interesting, another cool (and also badly named - "resonance" would actually make some sense for that one!) measurement technique for even tinier signals is lock-in amplification.

The idea isn't very complicated, but the Wikipedia explanation is god-awful. I'll try: you modulate the cause of the signal you want to measure with a sine function of frequency f. Then you take the resulting signal and multiply it by a clean sine function of the same frequency f. You integrate (sum up) the result over a long time. Incredible sensitivity (and very slow reaction to changes) results.

If you know what a convolution is, the explanation can be shorter.
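Here's that recipe as a sketch (numpy, made-up numbers; just the reference multiply-and-average, without the low-pass filtering details a real lock-in amplifier has):

    import numpy as np

    # Sketch: lock-in detection. A tiny signal modulated at f_ref is buried in
    # noise ~100x stronger in amplitude; multiplying by the clean reference and
    # averaging over a long time pulls its amplitude back out.
    rng = np.random.default_rng(6)
    fs, f_ref = 10_000.0, 137.0              # sample rate, modulation frequency
    t = np.arange(0, 100, 1 / fs)            # long integration time

    amplitude = 1e-2
    measured = (amplitude * np.sin(2 * np.pi * f_ref * t)
                + rng.normal(scale=1.0, size=t.size))

    # Multiply by the clean reference and average; the factor of 2 undoes the
    # factor of 1/2 from averaging sin**2.
    estimate = 2 * np.mean(measured * np.sin(2 * np.pi * f_ref * t))
    print(f"true {amplitude:.2e}, recovered {estimate:.2e}")

The longer you integrate, the more of the off-frequency noise averages away — that's the bandwidth-for-sensitivity trade.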


Adding noise just raises the total signal intensity, allowing it to exceed a threshold for detection. It's not really amplification, and there's no real resonance. You cannot have a signal resonate with random noise: mixing an 8-bit signal with 8 bits of noise gives you less than a 2^-40 chance of "resonating" for more than 10 cycles. The name is a flawed way of describing a signal-processing technique.


Nice, though you could probably get better performance by using an analogue integrator and sigma-delta encoding. Probably even better if you down-convert (multiply by the expected frequency) first. You can swap bandwidth for resolution.
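For the curious, a first-order sigma-delta loop is only a few lines (a sketch with illustrative numbers):

    import numpy as np

    # Sketch: first-order sigma-delta modulation of a slow sine into a 1-bit
    # stream, then a crude block-average decimation to recover amplitude
    # resolution from the oversampled bits.
    t = np.arange(0, 1, 1 / 100_000)          # heavy oversampling
    x = 0.5 * np.sin(2 * np.pi * 5 * t)        # input kept within [-1, 1]

    bits = np.empty_like(x)
    integ = 0.0
    for i, sample in enumerate(x):
        out = 1.0 if integ >= 0 else -1.0      # 1-bit quantizer on the integrator
        bits[i] = out
        integ += sample - out                   # integrate the quantization error

    # Decimate: average blocks of 1000 one-bit samples.
    recovered = bits.reshape(-1, 1000).mean(axis=1)
    print("max reconstruction error:",
          np.abs(recovered - x.reshape(-1, 1000).mean(axis=1)).max())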


Does anyone know if there have been any attempts to see whether this effect can be used to detect anomalous log events that get drowned out by all the noisy logs?



