The 12-lead ECG is the standard for cardiac diagnosis in a clinic or an emergency department. It uses ten electrodes placed on the skin. Each lead (whether directly measured, or derived as a function of the directly measured leads) is then plotted in a standardized way: the X-axis represents time, the Y-axis represents voltage. When the heart's electrical activity points toward the lead, the deflection is positive; it can of course be neutral, or point away from the lead (negative).
In contrast, when someone needs to be monitored while at home, a several-lead device is cumbersome. So, people have devised single-lead devices that can remain attached for days-to-weeks (e.g., Zio patch), or which can be activated when desired (e.g., Apple Watch, AliveCor).
A single lead ECG can be more difficult to read, in part because you simply get much less information. (Though, generally, what you're looking for [arrhythmia] is a subset of what you look for in a 12-lead ECG.)
Obviously, there are automated algorithms that attempt to read these already. However, deep learning approaches are appealing here, and at a glance this looks like it performed well and is using a reasonable set of comparisons. (Will have to read the methods section to know for sure; heading to bed so won't get a chance to look for awhile.)
The "classical" I,II,III leads are balanced. aVR,aVL,aVF are derived from I,II,III (through voltage dividers). V1-6 are singled ended with a common "ground" lead.
I always wondered how precise the computerized ECG interpretations from 24-48h Holter results are. Considering you have to take notes and report the date and time whenever you feel "something," it seems like it needs a manual check.
They’re pretty shit. Probably half of the essentially normal EKGs I see come with the automated interpretation of “Abnormal EKG,” and no meaningful information. You absolutely 100% cannot rely on them, and all doctors are trained - from the get-go - not to.
Here at Cardiologs we have developed FDA-cleared software to analyze Holter recordings.
Our algorithm uses deep neural networks to detect arrhythmias. The physician reviews and manually updates our analysis before generating their final report.
This leads to an efficient analysis in minimal time. Physicians can even analyze multi-week recordings painlessly, without spending hours on them.
Years ago I had to wear a Holter device for 24h, and wondered the same when I was asked to note the time of any events.
I've also had a 12-lead ECG done three times that I recall. Each time it was done by a nurse rather than a doctor, and each time they barely even glanced at the output before announcing everything was OK. Is ECG output really that easy to read and unambiguous?
> Is ECG output really that easy to read and unambiguous?
Not even close. But nurses generally don’t know how to read EKGs except for the most obvious findings, so when they say “it’s all fine” either they mean they don’t see something very conspicuous (ST elevations, widened QRS, absent p waves), or they’re just giving you the default “stay calm, and wait for the doc to read it” comment.
The only non-docs I’ve ever seen impress me with their EKG reads have been some experienced EMTs, really experienced critical care nurses / mid levels, and some experienced cardiology mid-levels. In short, people that do it every single day and have been doing it for a long while. And even they don’t do it with an instant glance.
EKGs are more complicated than they look. Each lead gives you a different slice through the heart; some of the leads lie in the xz plane and some in the xy plane, and findings are modified by height, weight, position, metabolic profile, etc. Basic EKG reading takes an hour to learn; being good at it takes forever.
The catch about the above study is that they are doing the easiest possible thing: categorizing arrhythmias. That’s the part you can learn in an hour. Doing it “as well as a cardiologist” just isn’t impressive. Give me a couple of weeks with a bright high schooler and they’ll be doing rhythm classification as well as most doctors.
It’s “precisely which part of the heart is malfunctioning, what part of the vasculature or conduction pathway or whatever does that implicate, and based on the patient’s medical history what underlying diagnosis does that imply? And what is the next best step in medical management?” that cardiologists interpret EKGs for.
> Figure 2. Deep Neural Network architecture. Our deep neural network consisted of 33 convolutional layers followed by a linear output layer into a softmax. The network accepts raw ECG data as input (sampled at 200 Hz, or 200 samples per second), and outputs a prediction of one out of 12 possible rhythm classes every 256 input samples. The first and last layer are special-cased due to the pre-activation residual blocks.
^^^ 33 convolutional layers --- wow that's pretty deep for 1-D input.
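For anyone wondering what that looks like in practice, here's a rough PyTorch sketch of a pre-activation residual stack over a 1-D signal, with one prediction per 256 input samples. The layer count, channel width, and kernel size below are illustrative guesses, not the authors' exact architecture:

    import torch
    import torch.nn as nn

    class PreActResBlock1d(nn.Module):
        """BN -> ReLU -> Conv -> BN -> ReLU -> Conv, plus an identity skip."""
        def __init__(self, channels, kernel_size=15):
            super().__init__()
            pad = kernel_size // 2  # 'same' padding for odd kernels
            self.body = nn.Sequential(
                nn.BatchNorm1d(channels), nn.ReLU(),
                nn.Conv1d(channels, channels, kernel_size, padding=pad),
                nn.BatchNorm1d(channels), nn.ReLU(),
                nn.Conv1d(channels, channels, kernel_size, padding=pad),
            )

        def forward(self, x):  # x: (batch, channels, samples)
            return x + self.body(x)

    class RhythmNet(nn.Module):
        """Residual conv stack ending in per-window class logits."""
        def __init__(self, n_blocks=16, channels=32, n_classes=12):
            super().__init__()
            self.stem = nn.Conv1d(1, channels, 15, padding=7)
            self.blocks = nn.Sequential(
                *[PreActResBlock1d(channels) for _ in range(n_blocks)])
            self.pool = nn.AvgPool1d(256)  # one prediction per 256 samples
            self.head = nn.Conv1d(channels, n_classes, 1)

        def forward(self, x):  # x: (batch, 1, samples) of raw 200 Hz ECG
            h = self.blocks(self.stem(x))
            return self.head(self.pool(h))  # (batch, 12, samples // 256)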
It's a deep and large model. Maybe unnecessarily so.
In image processing, a single pixel or region almost always belongs to one object. A car can be behind a tree, or a tree can be behind a car, but they are not transparent and mixed together.
In raw 1-D radio, audio, ECG, or EEG signals there can be multiple additively overlapping signals across different frequency ranges. Signal patterns are transparent to other signals. A ConvNet on the raw signal must learn to discriminate overlapping pattern combinations.
I find it interesting they don't cite any framework. They presumably wrote all the DNN code from scratch, which can certainly be done, but seems like a lot of effort unless you plan to commercialize, which clearly they are.
What is "single lead" device? Is that a single electrical contact patch? How can that detect anything? I've used a single patch with 4 electrical contacts for a 30 day Holter test, plus the standard 24 hour Holter monitor with a bunch of patches and wires.
Some arrhythmias are trivial with a single lead because they manifest in the rate at which the heart beats. And from there, you can imagine there is a spectrum of arrhythmias that become more morphological and likely harder to detect.
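As a toy illustration of the rate-based case: from a single lead you can already flag a grossly irregular rhythm using nothing but the R-peak times. (The detector settings and any threshold on the output are arbitrary placeholders here.)

    import numpy as np
    from scipy.signal import find_peaks

    def rr_irregularity(ecg, fs=200.0):
        """Coefficient of variation of R-R intervals from a single-lead trace."""
        peaks, _ = find_peaks(ecg, distance=int(0.3 * fs),
                              height=np.percentile(ecg, 95))
        rr = np.diff(peaks) / fs     # R-R intervals in seconds
        return rr.std() / rr.mean()  # large values suggest an irregular rhythm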
There are various options, but for long periods single-lead devices are often used; you can't really walk around for 30 days with a 12-lead Holter strapped to you... it would make bathing rather difficult.
It's possible to implant small loop recorders under the skin on the chest. These can record for months. Those are mostly useful in cases where symptoms occur infrequently.
I think what I had was called a Body Guardian. It consisted of a bunch of sticky strips (photo here: https://imgur.com/I9cXw1V) that had 4 skin contact pads on the sticky side and 4 snap connectors on the other side. I'd stick this to the center of my chest just above nipple line. There were a pair of small pods that snapped to the 4 contacts. Each was about half the size of a deck of cards. They would record continuously for about 16 hours. I'd alternate between them every 12 hours or so, recharging the one I wasn't using in the provided recharger. Also included was a customized Samsung cell phone with its own recharger and an app that communicated wirelessly with the pods. The cell would download results from the pod and upload them to the company that provided the equipment. I carried that phone with me most of the time. When I connected a pod to the recharger, it would signal the cell phone, which would download all results from the pod and upload them to the company. My cardiologist got the results from them.
The pods had a button that I could press when I felt like I was having "an event". The cell phone app would popup a selection list for possible events like skipped beat, extra beat, weak pulse, etc. I can't remember the entire list. I wasn't actually required to log these when I felt them -- they were just there to let the monitor S/W know I had noticed something. I'd remove the pod to shower but not to sleep or workout. The sticky strips would stay attached for at least a day unless I sweated a lot, like in a sauna. Strips were not reusable. They provide enough strips for the 30 day trial.
My problem is PVCs (premature ventricular contractions ... skipped beats). Everyone gets these every day. I just get them a lot more frequently. Like 3 beats, skip, 4 beats, skip, 60 beats, skip, 10 beats, skip.
I think the monitoring is pretty continuous. Second hand info, so take this with a grain of salt ..... A friend told me that he knew of someone else on the same or a similar system who has a much worse condition. At one point his monitor cell phone, his personal cell phone, his wife's cell phone and the house phone all rang simultaneously. When they answered, someone on the other end told them to get him to the emergency room immediately. He was having an event and not even aware of it. Turns out it was a good thing they did that.
For those who think this will revolutionize medicine, it won't. ECG analysis is a tiny part of diagnostics, and can only give a very limited amount of information for most people, and for the small number of people who absolutely need ECG for diagnosis, a clinician is still required and the ECG is a part of the information needed for determining treatment.
Further, the bar is set fairly low: as good as a panel of cardiologists at a very specific task, one that can also be done by non-cardiologists depending on level of training and desire for mastery.
Lastly, this technology will never be a panacea where you walk into a Walmart and the ECG analysis spits out a report of your overall "health".
I would have thought that you could get pretty damn close with just some simple stats in frequency space normalised to the base frequency, i.e. Fourier transform divided by the fundamental frequency.
Not likely. Heart rhythm isn't so much about the rate at which the whole thing beats; it's a little more complicated than that. It's about the synchronization between different parts of the heart, about the way the signal propagates from the sinus node, and about how it relates to the body's demand.
Many things can go wrong. Many things do go wrong even in healthy hearts, for instance most people get ectopic beats every now and again, often without their knowledge. A simplistic FT approach would highlight stuff like that and no one would benefit from it.
That's... not quite what I meant; perhaps I worded it a bit weirdly. I meant frequency-invariant, not amplitude-normalised. As in divide the frequencies by the fundamental frequency (i.e. subtract their logs on the 'scope), not divide the amplitudes by the amplitude at the fundamental.
Any deviation of the FT profile from that of normal sinus rhythm, irrespective of its base frequency, would show up easily. There might be a few problems dealing with cardiac load transitions, and I suppose if you are trying to differentiate the exact class of arrhythmia (as in the paper) then perhaps not, but for differentiating normal (sinus), noise, and abnormal it should work fine.
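A rough numpy sketch of the idea, for concreteness; the reference "normal" profile and any deviation threshold would still have to come from data, and the fundamental-frequency estimate here is deliberately crude:

    import numpy as np

    def normalized_spectrum(ecg, fs=200.0, n_harmonics=40, pts_per_harmonic=10):
        """Magnitude spectrum resampled onto multiples of the fundamental."""
        spectrum = np.abs(np.fft.rfft(ecg * np.hanning(len(ecg))))
        freqs = np.fft.rfftfreq(len(ecg), d=1.0 / fs)
        # Crude fundamental estimate: strongest peak in 0.5-3 Hz (30-180 bpm).
        band = (freqs > 0.5) & (freqs < 3.0)
        f0 = freqs[band][np.argmax(spectrum[band])]
        # Key step: rescale the frequency axis into units of f0.
        grid = np.linspace(0, n_harmonics, n_harmonics * pts_per_harmonic)
        profile = np.interp(grid * f0, freqs, spectrum)
        # Overall scale normalization so recordings are comparable
        # (separate from the frequency rescaling above).
        return profile / profile.max()

    def deviation_from_normal(ecg, reference_profile, fs=200.0):
        """L2 distance between this recording's profile and a reference."""
        return np.linalg.norm(normalized_spectrum(ecg, fs) - reference_profile)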
For those of you concerned about heart disease, whether you have risk factors or a family history, consider getting a calcium score. It's the gold standard for quantifying coronary artery calcification, which is a strong predictor of heart disease. It's usually not covered by insurance, but it's very inexpensive (about $200).
It involves a CT scan of the chest, so it's not a blood test and there is some X-ray exposure involved. But it's totally worth it.
The study contains no comparison with competing algorithms, so I don't see what supports the claim that
> If confirmed in clinical settings, this approach could reduce the rate of misdiagnosed computerized ECG interpretations and improve the efficiency of expert human ECG interpretation by accurately triaging or prioritizing the most urgent conditions.