Hacker News
Hilbert Transform (electroagenda.com)
130 points by topsycatt on July 10, 2023 | 27 comments



Another useful application of the Hilbert transform is through the Kramers-Kronig relations. Basically, for a causal signal (e.g. an impulse that turns on), the real and imaginary parts of the Fourier transform are related by a Hilbert transform. This is very useful when trying to extract a time-domain impulse response from imperfect frequency-domain measurements, for example from a network analyzer. Rather than attempting to inverse Fourier transform the frequency-domain measurements directly, which will tend to produce an unphysical mess due to measurement errors, it's often better to take just the real or imaginary part and synthesize the other one. Even better would be to find the nearest causal waveform to what was measured, but that's a bit more difficult...
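
To make that concrete, here is a minimal numerical sketch (my own toy example, not tied to any particular instrument): enforcing causality in the time domain is equivalent to synthesizing Im{H} from Re{H} via the Kramers-Kronig / Hilbert relation.

    import numpy as np

    # Toy causal impulse response and its spectrum.
    fs, n = 1000.0, 4096
    t = np.arange(n) / fs
    h = np.exp(-50 * t) * np.sin(2 * np.pi * 40 * t)
    H = np.fft.fft(h)

    # Pretend only the real part of H was measured reliably.
    h_even = np.fft.ifft(H.real).real     # inverse transform of Re{H} = even part of h
    w = np.zeros(n)
    w[0] = 1.0                            # fold the even part back into a causal h
    w[1:n // 2] = 2.0
    w[n // 2] = 1.0
    H_rec = np.fft.fft(h_even * w)

    print(np.max(np.abs(H_rec.imag - H.imag)))   # ~0: Im{H} recovered from Re{H} alone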

Another application is finding the minimum phase response given an amplitude response.
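
And a sketch of that second application (again my own toy example), using the real cepstrum, which is the discrete counterpart of phase = -Hilbert{log |H|}:

    import numpy as np

    # Target magnitude response taken from a deliberately non-minimum-phase filter.
    n = 1024
    b = np.zeros(n)
    b[:3] = [1.0, -2.5, 1.0]              # zeros at z = 0.5 and z = 2 (outside the unit circle)
    mag = np.abs(np.fft.fft(b))

    cep = np.fft.ifft(np.log(mag)).real   # real cepstrum of the log-magnitude
    w = np.zeros(n)
    w[0] = 1.0                            # fold the cepstrum to make it causal
    w[1:n // 2] = 2.0
    w[n // 2] = 1.0
    H_min = np.exp(np.fft.fft(cep * w))   # minimum-phase response with the same magnitude

    print(np.max(np.abs(np.abs(H_min) - mag)))   # ~0: magnitudes match, phase is now minimum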


I find it incredibly funny that this appears a couple of days after I finished my master's thesis. I realized I made a bit of a blunder a week before handing it in: I thought the Hilbert transform gets rid of the aliasing that occurs when a real signal's bandwidth reaches below the frequency origin and into negative frequencies, which it doesn't. The bandwidth "folding" is still there, but the negative frequency components are gone (which I did know). Since all it did was remove the negative frequencies, which are redundant anyway because of the symmetry of real signals about the frequency origin, there wasn't any point in doing the Hilbert transform. Thankfully I checked all the preprocessing steps I carried out in the thesis, and weeded out this unnecessary step.

In my defense, please bear in mind that I have not had a thorough education in signal processing; physics degrees don't usually include courses like this. I know Hilbert spaces much better than I do the Hilbert transform.


I am an analytical chemistry PhD student about to turn in my dissertation, and I also have had no thorough education in signal processing; I had to teach myself what I know. It's really a shame, because I know a ton about the phenomena that produce the signals we analyze, but very little about the signals themselves. Kind of a deficiency in my education, I suppose.


During my materials science PhD, I encountered this transform as a way to process and smooth out electrochemical impedance spectra, via the Z-HIT algorithm[0]. As far as I know, only one or two EIS instrument manufacturers include it in their software, unfortunately not the one I was using.

[0] https://en.wikipedia.org/wiki/Z-HIT


I always thought of the Hilbert Transform and Z-HIT as a complex, Laplace-like substitute for Fourier and DFT analysis. Having never used them, or SSB in particular, is that a fair analysis or am I missing something important?


The signal model for Laplace is x(t)*exp(alpha t), whereas the signal model for the Hilbert transform is x(t) only. Another way to say it is that Laplace explicitly models the signal envelope as an exponential while decomposing the remaining signal into frequency components. In Fourier space, the exponential is "rolled into" the frequency components.
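
To make the distinction concrete (my own restatement):

    X(s) = \int_{0}^{\infty} x(t)\, e^{-st}\, dt
         = \int_{0}^{\infty} \left[ x(t)\, e^{-\sigma t} \right] e^{-j\omega t}\, dt,
    \qquad s = \sigma + j\omega

i.e. the Laplace transform is the Fourier transform of x(t) with an explicit exponential envelope split off, while the Hilbert transform operates on x(t) directly.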


Singular integrals are a lovely topic, especially from a Fourier analytic perspective. Taking the Fourier transform, F(H(f))(x) = -i * sgn(x) * F(f)(x), which implies H^2 = - I. H has the Fourier multiplier -i * sgn(x). The Riesz transforms R_j are a higher dimensional generalization of the Hilbert transform with Fourier multipliers -i * x_j/|x_j|, which leads to the nice property sum R_j^2 = -I.
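
A quick numerical sanity check of H^2 = -I (my own sketch; scipy.signal.hilbert returns the analytic signal x + i*H(x), so H(x) is its imaginary part):

    import numpy as np
    from scipy.signal import hilbert

    t = np.linspace(0, 1, 4096, endpoint=False)
    x = np.sin(2 * np.pi * 7 * t) + 0.3 * np.cos(2 * np.pi * 31 * t)   # zero-mean test signal

    Hx = np.imag(hilbert(x))        # discrete Hilbert transform of x
    HHx = np.imag(hilbert(Hx))      # apply it a second time
    print(np.max(np.abs(HHx + x)))  # ~0, i.e. H(H(x)) = -x for signals with no DC component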


I found a small error I can't edit into this comment: the Fourier multiplier of R_j is -i * x_j/|x|.


Years ago I wrote a genetic program to maximize stock market gains, and as primitives I used the indicators and signals from TA-Lib. The Hilbert Transform Phasor was the clear winner, though the returns were too meagre for me to continue with the project.
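
A hedged sketch of that indicator, assuming the standard TA-Lib Python wrapper (the price series here is made up):

    import numpy as np
    import talib

    close = 100.0 + np.cumsum(np.random.randn(500))   # hypothetical price series
    inphase, quadrature = talib.HT_PHASOR(close)      # Hilbert Transform - Phasor Components
    # The two outputs are the in-phase and quadrature parts of the detrended price,
    # NaN-padded during the indicator's warm-up period.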


One important thing about the Hilbert transform is that it is noncausal, so in practice, for real-time applications, it can only be approximated, and the better the approximation, the larger the delay, because newer parts of the signal are needed to compute older results. For a lot of applications this doesn't matter, but it does make it harder to use.
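
For example, a causal FIR approximation can be designed with the Remez exchange algorithm (my own sketch, parameters chosen arbitrarily); an N-tap design costs (N - 1)/2 samples of delay:

    import numpy as np
    from scipy.signal import remez, lfilter, hilbert

    numtaps = 63                                     # odd length -> antisymmetric type III FIR
    taps = remez(numtaps, [0.05, 0.45], [1.0], type='hilbert', fs=1.0)
    group_delay = (numtaps - 1) // 2                 # 31 samples of latency

    n = np.arange(2048)
    x = np.cos(2 * np.pi * 0.125 * n)                # in-band test tone
    y_fir = lfilter(taps, [1.0], x)                  # causal, delayed approximation of H{x}
    y_ref = np.imag(hilbert(x))                      # noncausal FFT-based reference
    # Away from the edges, y_fir[k] ~= y_ref[k - group_delay].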


This is the rigorous basis for a lot of electrical engineering shortcuts like phasors and active/reactive/real/apparent power analysis.

Makes a hell of a lot more sense when it isn't just pulled out of the ether.
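
Concretely (my own toy example, single-frequency case only): the analytic signal turns real v(t) and i(t) into rotating phasors, and the complex power P + jQ falls straight out.

    import numpy as np
    from scipy.signal import hilbert

    fs = 10000.0
    t = np.arange(4000) / fs                             # exactly 20 cycles of 50 Hz
    v = 325.0 * np.cos(2 * np.pi * 50 * t)               # ~230 Vrms mains voltage
    i = 10.0 * np.cos(2 * np.pi * 50 * t - np.pi / 6)    # current lagging by 30 degrees

    V, I = hilbert(v), hilbert(i)                        # analytic signals (rotating phasors)
    S = 0.5 * np.mean(V * np.conj(I))                    # complex power
    print(S.real, S.imag)                                # active ~1407 W, reactive ~812 var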


Transfer functions. PTSD flashbacks of Laplace transforms ensue.

Another interesting bit is that you can do the majority of signal processing digitally, either with DSP hardware built around the ADC or in software if you have enough processing power. There are quantized equivalents of LP/HP/BP/BS filters and other transfer functions that can operate on a stream of values post-ADC. The advantage of filtering digitally is that FFT-based filters can have no phase shift and essentially infinite rolloff, like an ideal filter: properties unattainable in passive analog electronic filters.
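
A toy sketch of that last point (my own example): masking FFT bins gives a brick-wall, zero-phase low-pass, at the cost of ringing near sharp transitions in the signal.

    import numpy as np

    fs = 1000.0
    t = np.arange(2048) / fs
    x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 200 * t)

    X = np.fft.rfft(x)
    f = np.fft.rfftfreq(x.size, 1 / fs)
    X[f > 50.0] = 0.0                     # brick-wall cutoff at 50 Hz
    y = np.fft.irfft(X, n=x.size)         # real-valued mask -> no phase shift, "infinite" rolloff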


Great write up! That explanation seems so much simpler than what was in my Signals and Systems book.


Are ML companies hiring peeps who do DSP and ML? I'm looking for a new gig and do both!


I just so happen to be working on that at my company, but sadly we do not seem to be hiring more for this position, and I don't think you'd like how low salaries are in southern Europe.


[flagged]


These low-effort GPT comments should be removed as spam.


Good chatgpt


I am unsure if this is a bunch of ChatGPT-generated, pseudo-scientific-sounding stuff, or actual useful and real information.

I guess either way it's written too tersely for me to understand.


The text attempts to summarize the Hilbert Transform for electronics engineers working on telecom applications, getting straight to the point.

It was written by me, and I am human.

I think I achieved my purpose. But obviously the text will be useless for many people and very useful for others. Anyway, please take into account the context and the topic of the blog.


Long ago, single-sideband transmitters used either a filtering technique or a phasing method. The phasing exciters were essentially quadrature phase splitters -- and yep, the Hilbert transform was the mathematics behind this. (Another method, by Weaver, uses two Hilbert transforms.)
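
A minimal sketch of the phasing method (a toy example, not from any actual transmitter design): the baseband signal and its Hilbert transform drive two mixers in quadrature, and the sum or difference selects one sideband.

    import numpy as np
    from scipy.signal import hilbert

    fs = 32768.0                                   # chosen so the tones land exactly on FFT bins
    t = np.arange(4096) / fs
    msg = np.cos(2 * np.pi * 1024 * t)             # baseband audio tone
    msg_hat = np.imag(hilbert(msg))                # its Hilbert transform (90-degree shifted copy)

    fc = 8192.0                                    # carrier
    usb = msg * np.cos(2 * np.pi * fc * t) - msg_hat * np.sin(2 * np.pi * fc * t)
    lsb = msg * np.cos(2 * np.pi * fc * t) + msg_hat * np.sin(2 * np.pi * fc * t)
    # The spectrum of usb sits only at fc + 1024 Hz, that of lsb only at fc - 1024 Hz.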

Really good stuff here -- and many thanks! -Cliff, K7TA


Yes. I intend to write one more article on single sideband, and another one on the analytic signal, including the details you mention there.

But for this one I just wanted to focus on the Hilbert Transform itself and a little bit on the applications. Otherwise it would have been a bit "hard" for many readers.

Thanks for your feedback!


Over the years, I've googled the Hilbert Transform on and off, but never even came close to understanding where it could be useful. Your explanation is the first one that makes sense, though I'll need to reread it a couple of times to really get it. And hopefully, this will help me understand the more rigorous material too.

So thanks! It was very useful for me! And ignore the naysayers.


Thank you so much for your feedback!


You will come across Hilbert transforms in a signals and systems or DSP course, which you would take in an EE/ME undergrad. A lot of the work in this space was done by mathematicians, which I think contributes to the abstraction level at which it's taught. It can be useful and is definitely real.


And imaginary ;)


It's actually a very simple explanation if you have a solid background in trig and complex numbers.


Agreed: this article doesn't add any real insight over the references it cites; it just regurgitates the math.




