I'm the founder of a company working to solve this exact problem. The revisit rate of Planet's 200+ Dove satellites is quite good (multiple passes per day), but those satellites are low-res compared to Planet's RapidEye satellites, of which there are fewer. There are a slew of others (Maxar is the next biggest name that comes to mind), but the thesis is that low Earth orbit is getting crowded, satellites are incredibly expensive even with off-the-shelf parts and falling launch costs, and hardware capabilities are locked in at launch.
We're taking the approach of using "free energy" in the form of 100,000+ daily commercial/freight/general aviation aircraft to crowdsource aerial imagery using mobile phones to start. Passengers who opt-in are rewarded with free in-flight wifi (where equipped), and we use the device to do orthorectification and photogrammetry at the edge before transmitting it back down via satellite internet. I'm glossing over much of the actual process, but this frees up a ton of computing that would otherwise have to be done on the ground. In the event the flight is not internet connected, we cache previous images based on flight path and upload the difference after comparing old vs. new on the device once signal is restored. End result is a massive boost in both temporal and spatial resolution at a dramatically lower cost. Think Google Maps, updated every few minutes.
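Roughly, the offline cache-and-diff step works like the sketch below. The threshold values, tile naming, and queue format are simplified placeholders for illustration, not our actual pipeline:

    # Compare a new orthorectified tile against the cached prior capture and
    # only queue a full re-upload when enough pixels have changed.
    # Threshold values and the queue format are illustrative placeholders.
    import numpy as np

    PIXEL_DELTA = 25         # per-pixel intensity difference that counts as "changed"
    CHANGED_FRACTION = 0.15  # fraction of changed pixels that triggers an upload

    def diff_and_queue(new_tile: np.ndarray, cached_tile: np.ndarray, tile_id: str, upload_queue: list):
        """new_tile / cached_tile: grayscale uint8 arrays of the same shape."""
        delta = np.abs(new_tile.astype(np.int16) - cached_tile.astype(np.int16))
        changed = (delta > PIXEL_DELTA).mean()
        if changed >= CHANGED_FRACTION:
            upload_queue.append({"tile": tile_id, "image": new_tile})   # worth re-transmitting
        else:
            upload_queue.append({"tile": tile_id, "unchanged": True})   # just record "no change"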
We're on IG @notasatellite if you're interested in looking at some samples.
1. Cell phones from 30,000 ft are going to produce incredibly low-resolution images, especially when taken through the window of an airliner. They're also all going to be oblique.
2. If you use real camera rigs, you're going to have to pay a fortune to outfit enough planes. Given that you don't control where the asset goes, this seems really inefficient.
3. Does the entire US actually get covered by all those flights? While ATC tries to give direct routing a lot more than they used to, it still seems like you're going to end up with areas where planes hardly ever fly. I'd be really curious to see, for a given swath, what revisit rate you could get with what confidence from historical ADS-B data.
Please don't take this negatively. I previously cofounded an aerial imagery company and have designed aerial camera systems for a large aerospace company. I came up with an idea like yours, but wrote it off for the reasons I mentioned. It's really cool to see someone pursuing it. Feel free to reach out if you'd like. My email is cornell at cgw3 dot org.
I think it does for a few very specific reasons. I'll answer your questions in order:
1. Resolution is a function of altitude, atmospheric conditions, and camera capabilities (the raw ground-sample-distance math is sketched just after this list). At 30,000 ft with zoom, we can get results around 10cm/px on an average smartphone (iPhone SE 2). That's still pretty sharp, and we can further enhance the image using satellite base maps, upscaling, and other inference techniques. Obliques can be corrected and used to assemble a "full image" once the opposite oblique is captured, and they remain useful on their own out to a certain point on the horizon--right now about 15 miles at cruising altitude. There's still a vast amount of information we can obtain at higher altitudes, including crop yield data, snow pack, reservoir/lake water levels, forest density, etc.
2. The physical device we're prototyping is about the size of a headphone case--I actually used a Bose QC25 case to cast the model! There are a few potential avenues to deploy physical sensing hardware on flights, including revenue sharing with airlines, using passengers to deploy, and other partnerships in the general aviation space. The DoD in particular has expressed interest in a purpose-built device for aerial sensing, but for now mobile-device crowdsourcing in the commercial market is the focus.
3. There are huge spots in and around US airspace where planes cannot (or generally do not) fly. Satellites will remain the key players when optimizing for coverage, but I believe high-frequency revisit is best obtained with aerial imagery. I always like to tell people that's why we're "Not A Satellite" instead of "Anti-Satellite". There's more than enough room for both, and we see a huge opportunity to increase revisit rates and provide a complementary offering to satellite imagery.
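Back on point 1: for anyone who wants the raw math, the standard nadir ground-sample-distance relation we start from (before any base-map fusion or upscaling) looks like this. The focal length and pixel pitch below are generic placeholders, not a specific phone's specs:

    # GSD (ground sample distance) at nadir: GSD = altitude * pixel_pitch / focal_length.
    # This is the raw optical figure only; enhancement is applied afterwards.
    def gsd_cm_per_px(altitude_m: float, focal_length_mm: float, pixel_pitch_um: float) -> float:
        pixel_pitch_m = pixel_pitch_um * 1e-6
        focal_length_m = focal_length_mm * 1e-3
        return altitude_m * pixel_pitch_m / focal_length_m * 100  # meters -> centimeters

    # Example with placeholder optics; the result depends entirely on the zoom hardware used.
    print(round(gsd_cm_per_px(altitude_m=9144, focal_length_mm=20.0, pixel_pitch_um=1.0), 1))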
We've looked at revisit frequency against ADS-B data and about 80% of the US sees at least 2 flights within a mile each day, exponentially more so around cities and developed areas. Many of our customers are interested in monitoring sites within 15 miles of a major international airport, so we're able to obtain high-resolution images using mobile devices because aircraft are typically below ~8,000ft within that radius. LAX for example can see as many as 500 takeoffs and landings each day, and there are hundreds of industrial sites (ports, fulfillment warehouses, other infrastructure) within the approach path.
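If you want to sanity-check that kind of number yourself, the grid-based revisit count against an ADS-B dump looks roughly like this. Field names and the cell size are assumptions about a generic dataset, not any particular feed:

    # Bucket ADS-B position reports into ~1-mile grid cells and count distinct
    # aircraft per cell per day. Field names (icao24, lat, lon, day) are assumed.
    from collections import defaultdict

    CELL_DEG = 0.015  # roughly one mile of latitude; ignores longitude shrink for simplicity

    def fraction_with_two_flights(reports):
        """reports: iterable of dicts with 'icao24', 'lat', 'lon', 'day' keys."""
        flights = defaultdict(set)
        for r in reports:
            cell = (round(r["lat"] / CELL_DEG), round(r["lon"] / CELL_DEG), r["day"])
            flights[cell].add(r["icao24"])
        if not flights:
            return 0.0
        # note: the denominator is cells that saw any traffic, not total land area
        return sum(1 for s in flights.values() if len(s) >= 2) / len(flights)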
I genuinely appreciate your questions. We're still early-stage, and the discussions I've had on HN alone have vastly improved our pitch and business plan and exposed major blind spots. Thanks again for the kind words, and I'll be sure to drop you a line!
Correct. It sounds a bit far-out (and it is), but we've proven feasibility with a test group of around 300 passengers across a multitude of different phones/altitudes/conditions/routes. GPS works in airplane mode, so we pre-cache target coordinates based on the filed flight plan and alert the passenger to hold the device to the window when overflying the target. The gyroscope is used to provide feedback on phone orientation so we can get as close to nadir as possible, but we're still able to correct for obliques out to around 20 miles on the horizon using some general inference. Resolution is a function of altitude and device, but upscaling techniques enhance the images even further after they're transmitted.
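The alerting step is conceptually simple; a minimal sketch is below. The radius, function names, and data shapes are illustrative, not the production app:

    # GPS still works in airplane mode, so we can compare the current fix against
    # pre-cached target coordinates and prompt the passenger when one is in range.
    # The 20 km radius and the data shapes here are placeholders.
    import math

    ALERT_RADIUS_KM = 20.0

    def haversine_km(lat1, lon1, lat2, lon2):
        r = 6371.0  # mean Earth radius in km
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlmb = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def targets_in_range(fix, cached_targets):
        """fix: (lat, lon) from the GPS; cached_targets: list of (lat, lon) from the flight plan."""
        return [t for t in cached_targets
                if haversine_km(fix[0], fix[1], t[0], t[1]) <= ALERT_RADIUS_KM]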
We've also developed our own sensing device prototype we call "The Box" that's equipped with a much more powerful array of sensors, but the mobile device-sensing approach makes the most sense for a scalable MVP.
UX on the flight-contributor side is currently a low-touch app that accepts the user's flight number and prompts for their seat selection (assuming a commercial flight). We load an image correction profile based on whether the seat is fore or aft of the wing in order to correct for the blur caused by engine exhaust, after which we cache the image task coordinates we anticipate they'll be flying over. We provide a code for free in-flight wifi (actually two--one for each side of the plane) that can be redeemed after completing a simple calibration task while taxiing or shortly after takeoff. The calibration looks at window opacity and occlusions, and considers atmospheric conditions for each leg of the flight.
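To give a feel for the seat-to-profile mapping, here's a sketch. The row boundaries, aircraft types, and profile names are made up for illustration; real values vary by aircraft:

    # Map a seat row to a correction profile: seats aft of the wing get the
    # exhaust-blur profile, over-wing seats get an occlusion profile.
    # Row boundaries and profile names are hypothetical.
    WING_ROWS = {"A320": (10, 23), "B737": (10, 22)}  # (first wing row, last wing row)

    def correction_profile(aircraft_type: str, row: int) -> str:
        fore, aft = WING_ROWS.get(aircraft_type, (10, 22))
        if row > aft:
            return "aft-of-wing-exhaust-blur"
        if row < fore:
            return "fore-of-wing-default"
        return "over-wing-occluded"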
In order to maximize battery life and prevent user fatigue, we only alert the user to begin recording when they're approaching a task coordinate, by comparing the cached coordinates against the current GPS fix, which is readily available even in airplane mode. The "workload" for test users so far averages about 5 alerts per flight, or 5-10 minutes of "recording", typically shortly after takeoff and prior to landing. We're using Lobe to train an ML model with the image feeds, so we're working on implementing a Captcha-style post-flight survey where a few sample images taken in flight are presented to the user for first-pass labeling. If a user completes this survey, they're rewarded with additional airline miles as a thank-you.
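The survey itself stays lightweight; roughly, sampling a few captures for first-pass labeling could look like this. The label set, sample size, and names are placeholder assumptions, not the actual product:

    # Pick a few captures from the flight and present them captcha-style for a
    # quick first-pass label. Label set and sample size are placeholders.
    import random

    LABELS = ["clear ground", "cloud cover", "window glare", "wing/obstruction"]

    def build_survey(capture_ids, sample_size=3, seed=None):
        ids = list(capture_ids)
        rng = random.Random(seed)
        picks = rng.sample(ids, k=min(sample_size, len(ids)))
        return [{"capture": cid, "options": LABELS} for cid in picks]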
There are plenty of other opportunities to gamify the recording process, and we've been taking cues from apps like Waze and OpenStreetMap to inform some of these potential reward features. Possibilities include revenue sharing, free flights after reaching milestones, travel gear, etc. I remember flying Spirit years ago, and they had a fun way to reward the unlucky souls in the middle seat by putting a sticker on one of the tray tables. The passenger sitting in the "lucky" middle seat got a free round-trip ticket, which I've always thought was really cool, so perks like that are top of mind as well.
Tl;dr - we're intent on keeping the UX transparent, engaging, and unobtrusive for the recorder--point, shoot, disembark--while rewarding them in kind for their time and effort.