Great article. Digital is not always better; at the time, the technology simply wasn't there to outperform the analogue tech.
I even wonder whether a digital camera system nowadays could outperform these 2 GB images. I mean, how do you transfer that amount of data over such a long range without loss?! Is that even possible?
Maybe this analogue picture compression is something which is still usable and valuable in long distance space transmission?
“Lossless” analogue transmission isn’t lossless. It’s just less lossy than the lossy forms of analogue transmission. As a very simplified example, it’s really easy to modulate an analogue value in the frequency domain and maintain accuracy and dynamic range, and it’s not terribly sensitive to environmental factors. That’s why FM stereo usually still sounds pretty amazing, while AM sounds like crap.
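A toy numpy sketch of that point (illustrative values, not a real receiver): encode a value as amplitude and as frequency, run both through a channel with gain drift, and see which readout survives.

```python
import numpy as np

fs = 10_000                      # sample rate, Hz
t = np.arange(0, 1.0, 1 / fs)   # one second of signal
value = 440.0                    # the analogue value we want to convey

am_sig = value * np.sin(2 * np.pi * 1250 * t)  # value encoded as amplitude
fm_sig = np.sin(2 * np.pi * value * t)         # value encoded as frequency

# A lossy channel: 20% gain drift (fading, component tolerance, ...)
am_rx, fm_rx = 0.8 * am_sig, 0.8 * fm_sig

# Amplitude readout inherits the gain error directly: ~352 instead of 440
am_estimate = np.max(np.abs(am_rx))

# Frequency readout (zero crossings per second) ignores gain entirely
crossings = np.count_nonzero(np.diff(np.signbit(fm_rx)))
fm_estimate = crossings / 2                    # still ~440
```

The frequency estimate is untouched by the gain change, which is the whole trick.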
Now we have digital protocols which are still sent on top of analogue signals (everything is analogue down at the bottom, even your CPU). We lose a tiny bit of dynamic range through compression in some circumstances, but gain error correction, speed, and the ability to recover signals from below the noise floor, which means less power needed, or more distance for the same power.
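To illustrate the error-correction part of that trade, here's a toy forward-error-correction sketch in numpy. It's a naive 3× repetition code, far cruder than real codes like LDPC, and the 10% flip rate is just an assumed channel.

```python
import numpy as np

rng = np.random.default_rng(1)
p = 0.10                                      # assumed symbol flip probability

def encode(bits, r=3):
    return np.repeat(bits, r)                 # send every bit three times

def decode(symbols, r=3):
    # Majority vote over each group of r received symbols
    return (symbols.reshape(-1, r).sum(axis=1) > r // 2).astype(int)

bits = rng.integers(0, 2, 10_000)

# Uncoded: every channel flip is a bit error (BER about p)
uncoded_rx = bits ^ (rng.random(bits.size) < p).astype(bits.dtype)
ber_uncoded = np.mean(uncoded_rx != bits)

# Coded: a bit is lost only if 2 of its 3 copies flip (BER about 3p^2)
tx = encode(bits)
rx = tx ^ (rng.random(tx.size) < p).astype(tx.dtype)
ber_coded = np.mean(decode(rx) != bits)       # roughly 0.028 vs 0.10
```

You pay 3× the bandwidth and get a much lower error rate back; real codes get far better trades than this.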
So no, digital is definitely the way.
As an amateur radio operator: some of us at least tend to play with very low powers. You can have a two-way conversation over 3000 km+ with no more than a watt, but only if you use digital modes. That's one reason Morse/CW is still popular; it's a digital encoding.
> Recent advances in digital signal processing have allowed EME contacts, admittedly with low data rate, to take place with powers in the order of 100 Watts and a single Yagi antenna.
> I mean how do you transfer that amount of data at this long range without loss?
Millions of people are watching the football via satellite over an 80 Mbit/s DVB-S2 link from geostationary orbit with consumer hardware. The system uses forward error correction to cover loss.
The main limiting factor of digital cameras is producing really big sensors, but if you want to photograph a stationary object like the moon, it can be done readily by stitching lots of small images together.
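A minimal numpy sketch of the stitching idea: a made-up 64×64 "scene" stands in for the subject, and a hypothetical `capture_tile` function plays the small sensor. Because the subject is stationary, the tiles align exactly.

```python
import numpy as np

scene = np.arange(64 * 64).reshape(64, 64)   # stand-in for the stationary subject

def capture_tile(r, c, size=16):
    """One small 'exposure': what a small sensor sees at offset (r, c)."""
    return scene[r:r + size, c:c + size]

# Reassemble the full image from a grid of small captures
size = 16
mosaic = np.block([[capture_tile(r, c, size) for c in range(0, 64, size)]
                   for r in range(0, 64, size)])
# mosaic reproduces the full scene from 16 small exposures
```

Real stitching has to register overlapping tiles rather than assume perfect alignment, but the principle is the same.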
People are currently getting excited about the picture quality of 4K and beyond. Old cinema film is, as I understand it, roughly equivalent to about 12K digital cinema. The way they work is quite different, though, so it's not directly comparable.
Similarly with photographic film: Zeiss continue to make a medium-format mono film called Gigabitfilm which requires special developing but, when scanned, can in theory give gigabit images. Unfortunately, not only is it mono, it's also only ISO 40, so exposures are difficult.
If you were shooting a still subject, you could use filters that exclude other wavelengths, capture the RGB channels separately on mono film, and then merge them in post. I've been tempted to convert my digital sensor to mono-only by removing the color filter array to see if it made any difference.
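Merging the three mono captures is then just stacking channels. A numpy sketch with placeholder flat frames standing in for the three filtered exposures:

```python
import numpy as np

h, w = 480, 640                             # arbitrary frame size
# Placeholder data for three mono exposures shot through R, G and B filters
red   = np.full((h, w), 200, dtype=np.uint8)
green = np.full((h, w), 120, dtype=np.uint8)
blue  = np.full((h, w),  40, dtype=np.uint8)

rgb = np.dstack([red, green, blue])         # (h, w, 3) colour image
```

In practice you'd also need to register the three frames against each other, which is why this only really works for a still subject.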
Also, I don't quite understand why ISO 40 would make anything difficult. I regularly shoot at ISO 50 on digital.
> I've been tempted to convert my digital sensor to mono-only
Have you come across the Leica M-Monochrom? When I first heard of it I thought it was odd, but understanding how the sensor works, I can see what an advantage it could be for B+W photography.
Yes; one of the 3-letter agencies (I forget its name) has a satellite that can take gigapixel photos 20km (IIRC!) wide. Satellite sends 1EB (yes) per 24h of timelapse. Presumably said agency stores a few days/weeks of footage.
Put the input and the output of a Schmitt trigger on an oscilloscope some time and tune across an HF band for weak digital (RTTY) signals. Then try adding crap to the input.
They'll pull perfect (digital, lossless) copy so far down in the (analog) noise that it can barely be heard. I'm sure there's better stuff these days (been a while).
Some casual reading about modern mobile phone signalling can make one's head spin. Best I could tell, every phone talks on top of the others, and still the base is able to pick individual data streams out of the resulting soup.
The reason the base station and your phone need to know their distance is so the phone can transmit at the right time: its signal then arrives at the base during the phone's time slot, and vice versa. That way the base can listen to other phones outside of that slot, and the phone can stop listening outside its slot, saving energy.
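The arithmetic behind that is just round-trip light delay. A sketch, where the 3.69 µs figure is GSM's bit period, used here purely as an illustrative unit:

```python
C = 299_792_458            # speed of light, m/s
BIT_PERIOD = 3.69e-6       # GSM bit period, s (illustrative unit)

def timing_advance(distance_m):
    """How early the phone must transmit so its burst lands in its slot."""
    round_trip_s = 2 * distance_m / C
    return round_trip_s, round_trip_s / BIT_PERIOD

rt, bits_early = timing_advance(10_000)   # phone 10 km from the base station
# ~66.7 microseconds round trip, i.e. about 18 bit periods of advance
```

That's why the network keeps updating the timing advance as you move: a few kilometres of change shifts your burst by several bit periods.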
I don’t get how that’s your takeaway from the article. The entire point was about the pains they went to in order to convert it from analog to digital because digital is absolutely the way to go here.