I'm pretty sure their point is that certain frequencies are getting a lot more power than is naturally possible. Not that the photons are special in some way.
I'm not so sure. Even in these threads we see specious distinctions between "natural" and "man-made" EMF.
But even then, it's impossible to discuss without talking about relative strengths. Wi-Fi transmits at about 100mW at full strength. For math purposes, let's assume it's a point source broadcasting in all directions. (That's not that much of a wild assumption, either.) The surface area of a sphere with a radius of 1m is about 12.5 m^2. On average, then, the Wi-Fi RF strength at 1m away is about 0.008W/m^2.
The sun above us delivers about 1360W/m^2 of electromagnetic radiation (almost all of it visible light and infrared), or roughly 170,000 times the intensity you get standing a meter from a Wi-Fi router. If the router is across the room, 4m away, the ratio is closer to 3,000,000:1.
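For anyone who wants to poke at the arithmetic, here's a rough Python sketch of that back-of-envelope calculation. It just uses the same assumptions as above (100mW isotropic point source, 1360W/m^2 solar constant), nothing more:

    import math

    P_WIFI = 0.1             # assumed full-strength Wi-Fi transmit power, W
    SOLAR_CONSTANT = 1360.0  # total solar irradiance at Earth, W/m^2

    def wifi_flux(distance_m):
        # Isotropic point source: spread the power over a sphere (inverse-square law).
        return P_WIFI / (4 * math.pi * distance_m ** 2)

    for d in (1.0, 4.0):
        f = wifi_flux(d)
        print(f"{d:.0f} m: {f * 1000:.2f} mW/m^2, sun-to-Wi-Fi ratio ~{SOLAR_CONSTANT / f:,.0f}:1")

That prints about 8 mW/m^2 at 1 m and 0.5 mW/m^2 at 4 m, which is where the ~170,000:1 and ~3,000,000:1 ratios come from.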
Even if our bodies responded to "man-made" radiation differently from the "good, natural" kind, there's so little of it, relatively speaking, that it can't make much of a difference. I mean, ever look at a 100W lightbulb? If Wi-Fi were at 400THz instead of 2.4GHz, so you could see it, it would be one thousandth as bright. There's just not enough power there to do anything meaningful to us.
> There's just not enough power there to do anything meaningful to us.
Unless specific frequency bands cause problems because something very specific is triggered.
Sure, Wi-Fi may only be hitting you with a milliwatt or so per square meter. But between 2.4GHz and 2.5GHz the sun only hits you with... if I did the math right, and just accounting for blackbody emission, something like a tenth of a picowatt per square meter.
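Rough sketch of that estimate, treating the sun as a 5778K photospheric blackbody and using the Rayleigh-Jeans approximation (fine here, since hv << kT at 2.4GHz); the solar radius and Earth-sun distance are just standard values, not anything from upthread:

    import math

    K_B = 1.380649e-23   # Boltzmann constant, J/K
    C = 2.998e8          # speed of light, m/s
    T_SUN = 5778.0       # effective photospheric temperature, K
    R_SUN = 6.957e8      # solar radius, m
    AU = 1.496e11        # Earth-sun distance, m

    def solar_flux_in_band(nu_lo, nu_hi):
        # Rayleigh-Jeans spectral radiance B_nu ~ 2 nu^2 k T / c^2; integrating over
        # the band gives 2 k T (nu_hi^3 - nu_lo^3) / (3 c^2), in W/m^2/sr.
        band_radiance = 2 * K_B * T_SUN * (nu_hi ** 3 - nu_lo ** 3) / (3 * C ** 2)
        # Multiply by the solid angle the solar disk subtends from Earth to get flux.
        omega_sun = math.pi * (R_SUN / AU) ** 2
        return band_radiance * omega_sun

    flux = solar_flux_in_band(2.4e9, 2.5e9)
    print(f"{flux:.2e} W/m^2 (about {flux * 1e12:.2f} pW/m^2)")

That comes out around 7e-14 W/m^2. The real quiet-sun radio flux in that band is somewhat higher than the pure photospheric blackbody number, since the microwave emission comes from hotter layers of the solar atmosphere, but it's still only picowatt-scale.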
We're probably fine, but that can't be proven with a simple physics calculation that ignores the spectrum.