There are a bazillion other narrower-than-wifi schemes that work in the same band. All the pre-wifi stuff, Symbol Spectrum24 and Xircom Netwave and Proxim HomeRF, for instance. Ricochet poletops used a 1Mbps rate just like Bluetooth, and hopped it over the same chunk of spectrum. Heck, Bluetooth itself is a great example. Won't see that on a wifi stumbler, but it's certainly compliant.
It may not show up in a scan of 802.11(abgn/ac) compliant devices, but something like this can be located with equipment as cheap as the spectrum analysis features built into an old 802.11n rocket m2 radio ($60-70) and antenna ($30).
Or alternatively a HackRF clone ($100 from Alibaba etc), ADALM-PLUTO etc if you want an SDR that can cover a few GHz instead - these are great fun, even a $10 RTL-SDR is amazing (tops out at 1.5GHz though).
In a low-noise environment, sure; this technique surely drives the SNR through the floor. Plus, being an OFDM signal, you're probably going to lose all reception while the device is in motion.
>In a low-noise environment, sure; this technique surely blows the SNR through the floor
Could you elaborate about why this would affect SNR? Do you just mean because the band is narrower, or is there a new source of noise? (Or something else.)
Regarding compliance - there are more factors in play than just the center frequency and transmission power. You also need to keep the width of the transmission (the occupied bandwidth) under control. Harmonics in adjacent bands must be suppressed. The transmitter duty cycle cannot be too high. The channel must be free before you transmit, etc etc.
Semi-educated guess:
802.11 is a spread-spectrum signal, where you take your information and use it to modulate a much faster "noise" signal (which is then used to modulate the carrier wave).
My guess is that by underclocking the PLL, you are slowing down the pseudo-noise signal, so it will have a narrower bandwidth. The 2.4 GHz carrier is generated separately, so it is unaffected by lowering this clock. The overall effect would be that your RF signal is still centered on the same carrier frequency, but the signal looks narrower.
Yeah, the ESP8266 has two separate PLLs (RFPLL and BBPLL)[0]. The RFPLL generates the carrier at 2.4GHz. They specifically underclocked the BBPLL[1][2], which is the clock the CPU uses and ultimately what modulates the RF carrier.
Almost. As I understand it, most 802.11 hardware works by creating a baseband I/Q signal with the center frequency at 0 Hz using a pair of DACs and then upconverting it to the final transmission frequency by multiplying it with a separately-generated carrier frequency. (The idea of negative frequencies may seem a bit strange, but because there are separate in-phase and quadrature components the math works out.) If you decrease the DAC clock but keep the carrier the same, this is what happens.
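A toy numpy sketch of that idea (not the ESP8266's actual signal chain - the rates and pulse shape here are invented for illustration): the occupied bandwidth of the baseband I/Q waveform scales with the DAC sample clock, while the carrier, being generated separately, stays put.

```python
import numpy as np

rng = np.random.default_rng(0)
# Random QPSK-ish symbols, held as rectangular pulses (8 samples each).
symbols = rng.choice([-1.0, 1.0], 256) + 1j * rng.choice([-1.0, 1.0], 256)
baseband = np.repeat(symbols, 8)

def occupied_bw(dac_rate_hz):
    """Span of FFT bins within 20 dB of the peak, in Hz."""
    spectrum = np.abs(np.fft.fft(baseband, 4096))
    freqs = np.fft.fftfreq(4096, d=1.0 / dac_rate_hz)
    strong = freqs[spectrum > spectrum.max() / 10]
    return strong.max() - strong.min()

full = occupied_bw(20e6)  # nominal DAC clock
half = occupied_bw(10e6)  # underclocked 2x: same waveform, stretched in time
print(full / half)        # 2.0 - half the clock, half the bandwidth
```

The same samples come out of the DACs either way; only the rate they come out at changes, so the spectrum shrinks around the (separately-generated) carrier instead of moving.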
It's not spread-spectrum, but you are otherwise correct - the main point is that the carrier is generated separately. The modulation signal is slowed down, the carrier is not.
Thanks for the correction! Been a while since I looked at wireless stuff, but on checking it was 802.11b that used DSSS, but newer 802.11 standards use different modulation techniques.
The occupied bandwidth of a signal depends on its symbol rate (aka baud rate). A slower symbol rate results in a narrower-bandwidth signal. The article mentions that this slowed down the baud rate of the serial port. So it sounds like the clock change ends up slowing down everything that depends on it, and the narrow RF is a side effect of that slower rate. Whatever generates the carrier frequency must be separate from the main system clock, since the center frequency isn't affected.
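As a back-of-the-envelope sketch (the clock and divisor values here are made up for illustration, not real ESP8266 registers): every rate derived from one master clock scales together when that clock is slowed, which is why the UART baud rate and the RF bandwidth drop by the same factor.

```python
# Illustrative only: clock and divisor values are invented, not the
# ESP8266's actual registers. The point is that everything derived
# from one master clock scales together when that clock is slowed.

def uart_baud(clk_hz, divisor):
    # A UART typically divides a reference clock down to its baud rate.
    return clk_hz / divisor

def first_null_bw(symbol_rate_hz):
    # First-null bandwidth of a simple linearly modulated signal is
    # proportional to the symbol rate (about 2x for double-sideband).
    return 2 * symbol_rate_hz

nominal = 80e6          # hypothetical master clock
slowed = nominal / 2    # underclocked by 2x

print(uart_baud(nominal, 694))   # ~115274 baud
print(uart_baud(slowed, 694))    # ~57637 baud: the serial port slows too
print(first_null_bw(11e6) / first_null_bw(5.5e6))   # 2.0
```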
I think I understand this like I'm five...or thereabouts...let me know if this sounds correct.
Let's say the clock ticks at 100 Mhz.
So every second, the clock ticks 100,000,000 times.
Let's say the wifi broadcasts at 200 Mhz.
So every 200,000,000 (100,000,000 x 2) clock ticks, it sends something out into the airwaves - regardless of how much time actually passed in those 200,000,000 clock ticks.
Now let's slow the system clock down by half. It's now ticking 50,000,000 times per second - but still, every 200,000,000 ticks, it's going to send that signal out into the air.
Obviously those actual numbers are grossly misrepresented (it's not sending something every 200,000,000 clock ticks; it's more likely performing some function per available clock tick and sending it out at some predetermined number of ticks - this is just for visualization purposes). And I could be way off, because maybe it's not the actual signal that is being slowed down, but the data inside of that signal - if the signal itself were slowed, it just wouldn't be 2.4 GHz anymore. Either way, that sounds like a pretty basic "why" to me. Yes, this is explained below: the system uses two different phase-locked loops, so it is not the signal itself, but the data in that signal, that is slowed.
So, if you visualise a wave, by reducing the frequency, the wave stays the same except it gets stretched. So if you send the same signal each tick but the time between ticks is increased, the signal is stretched.
I guess the data signal is stretched and this gets modulated into a 2.4 GHz carrier signal? I don’t know how WiFi works ;)
Reminds me of frequency-shift-keying modulation. (Used by teletypes since the 1930s or so.) Demodulators typically used filters to detect the two frequencies used (with an -expected- difference). Unusual frequency-shifts would only allow filtering one channel, at most (not even one, if the shift was very small).
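A toy sketch of that filter-bank idea (the tone pair and rates here are roughly Bell-202-like picks for illustration, not anything from the article): demodulate by correlating each symbol against the expected mark/space tones; a transmitter using an unusual shift barely excites the matched filters.

```python
# Toy FSK modem: two tones, demodulated by correlating against the
# *expected* mark/space frequencies - the analog filter bank idea in
# digital form. Frequencies are illustrative, roughly Bell-202-like.
import numpy as np

FS = 8000                 # sample rate, Hz
BAUD = 100                # symbols per second
MARK, SPACE = 1200, 2200  # expected tone pair
SPB = FS // BAUD          # samples per bit

def fsk_mod(bits, mark=MARK, space=SPACE):
    t = np.arange(SPB) / FS
    return np.concatenate(
        [np.sin(2 * np.pi * (mark if b else space) * t) for b in bits])

def tone_energy(chunk, freq):
    # Correlate against a complex tone - one "filter" of the bank.
    t = np.arange(len(chunk)) / FS
    return abs(np.sum(chunk * np.exp(-2j * np.pi * freq * t)))

def fsk_demod(sig):
    return [1 if tone_energy(sig[i:i + SPB], MARK) >
                 tone_energy(sig[i:i + SPB], SPACE) else 0
            for i in range(0, len(sig), SPB)]

bits = [1, 0, 1, 1, 0, 0, 1, 0]
print(fsk_demod(fsk_mod(bits)) == bits)  # True: tones match the filters

# An "unusual frequency shift": the tones no longer line up, so the
# expected-frequency filter sees almost nothing.
shifted = fsk_mod(bits, mark=1500, space=2500)
print(tone_energy(shifted[:SPB], MARK))  # near zero, vs ~40 when matched
```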
There are comments about this on that page. Apparently the band is unlicensed, and this does follow the specs in all the ways the FCC apparently cares about. Just because devices won't be able to read the data packets doesn't mean they violate FCC rules.
But what I wrote above is based on random people’s comments. I don’t know what the rules are myself.
There's not much more regulation there besides a power limit and "stay within the band". As others have mentioned, there are tons of other devices that emit within that band, including microwave ovens.
If setting the BBPLL clock were part of the WiFi standard, this could be a privacy feature, since only two units set to the same BBPLL clock rate could understand each other. Privacy could be traded off against data rate.
https://www.youtube.com/user/CNLohr