As far as I understand it, near-field effects stop being relevant a couple of wavelengths away from the transmitter. For 2.4 GHz (λ ≈ 12.5 cm) that would give you what, a quarter of a meter?
Can you link me to reading material explaining how an antenna can have no sidelobes?
Near-field is only a few wavelengths away when the antenna itself is about a half wavelength in dimension. Otherwise you calculate the size of the near field as a function of the physical dimension of the antenna and the wavelength: r₂ = 2d²/λ.
E.g., the radiating near field of a 2.4 GHz antenna about 8 meters long would extend about 1 km.
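If you want to sanity-check that with your own numbers, here's a quick Python sketch of that formula (function and constant names are mine, just for illustration):

    # Fraunhofer distance r2 = 2*d^2/lambda: outer boundary of the
    # radiating near field. d is the largest dimension of the antenna.
    C = 299_792_458.0  # speed of light, m/s

    def radiating_near_field_extent_m(freq_hz, antenna_dim_m):
        wavelength = C / freq_hz
        return 2 * antenna_dim_m ** 2 / wavelength

    print(radiating_near_field_extent_m(2.4e9, 8.0))     # ~1025 m, i.e. about 1 km
    print(radiating_near_field_extent_m(2.4e9, 0.0625))  # half-wave antenna: ~6 cm

Note the quadratic dependence on antenna size: it's the large aperture, not the frequency alone, that pushes the boundary out to a kilometer.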
Right, but even a phased array will give you sidelobes; it's only a question of reducing them, right? Your claim that there are no sidelobes at all seems a bit dubious to me.
Or is there some metamaterial "magic" going on even at the transmitter that I'm not accounting for?
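To put a number on that: here's a quick numpy sketch (my own illustration, not any particular product) of the array factor of an ideal uniform linear array with half-wavelength spacing. Even this textbook case has its first sidelobe at roughly -13 dB:

    # Array factor of an N-element uniform linear array, half-wavelength
    # spacing, uniform weights: AF(u) = sin(N*psi/2) / (N*sin(psi/2)).
    import numpy as np

    N = 16
    u = np.linspace(-1.0, 1.0, 200_001)   # u = cos(theta)
    psi = np.pi * u                       # per-element phase step for d = lambda/2
    num = np.sin(N * psi / 2)
    den = N * np.sin(psi / 2)
    af = np.ones_like(u)                  # AF -> 1 at the main-lobe peak (u = 0)
    mask = np.abs(den) > 1e-12
    af[mask] = num[mask] / den[mask]
    af_db = 20 * np.log10(np.abs(af) + 1e-12)

    # Peak level outside the main lobe (first nulls at |u| = 2/N):
    sll = af_db[np.abs(u) > 2.0 / N].max()
    print(f"peak sidelobe level: {sll:.1f} dB")   # about -13 dB

You can push that down with amplitude tapering (Taylor, Chebyshev, etc.), but only by trading away main-lobe width and gain; you never get it to literally zero.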
Right, and anybody who knows enough to ask you about sidelobes likely knows that too.
That's why, if you reply "there are no sidelobes", you're only harming your own credibility.
A good reply is something like "there are sidelobes, but they peak at -{believable number} dB and are contained within {small area}. We believe this is more than sufficient to address sidelobe concerns because of {standards}".
"I'm not disclosing basic performance metrics because you might reverse engineer my secret sauce from the basic performance metrics"? Really? Seems unlikely.
So, again: can you link me to reading material explaining how an antenna can have no sidelobes?