This article is not informative. It leans on an MIT Center for Energy and Environmental Policy Research study with no publicly described methodology and a hidden "full" report. It's a waste of time for everyone, imo.
This type of "science" journalism is everything that is wrong with the intersection of media and research. Funding aside, the researchers are isolating a non-transparent, non-reproducible data finding, out of context, and feeding it to the press. On the other hand, the press is reporting this finding, and weighting believability upon credential, instead of reasoning. The head researcher of the article even lists his key credentials as follows: "..His work has been covered in numerous popular press articles..". Good, informative, transparent research? who cares, credentials are accumulating.
The output is statistical nonsense. It's a complete waste of time for anyone looking for real information, because the findings are not contextualized in any meaningful way. The article does not come close to offering apples-to-apples comparisons with other surveys that would give the reader a working understanding of the economic dynamic it describes.
I'm confident the findings are meaningless because I used to own several businesses where I hired hundreds of professional drivers. It would be easy to manufacture a result like this $3.37 "median" finding. In any given year, many drivers start and quit with no idea of what the job entails or what the economics are, even when you tell them explicitly up front. I've seen first-time drivers who want to use vehicles that get 12 mpg. Some drivers just shouldn't be drivers; they can't navigate, avoid traffic, or follow basic instructions. Many aren't ready to treat the work as a business with a profit and a cost center; quantifying costs isn't trivial, and many people don't think that way at all. And some new drivers who would make good drivers simply take time to learn: time to work into a rotation and earn more lucrative rides, and time to understand that some hours of the day are busier and pay more.
Context matters, a lot.
Also, I frequently ask Lyft / Uber drivers what they make, in many different cities. It's all about the same as what I used to see in my businesses. It's not magic: the drivers who stick with it understand the system and make anywhere from nominally above minimum wage to maybe 3x minimum wage for the most capable and most opportunistic drivers.
The basic economics of professional contract driving have been about the same for decades. What might have changed, if anything, is that Lyft / Uber have normalized driving as a profession and have strong brands that attract more people, which could produce higher churn rates than the industry has ever seen. That trend would actually be really interesting for a university to research and for quality science journalism to explain.
I feel journalists who are serious about presenting research findings will provide: 1) a complete and transparent description of the methodology behind the findings, 2) impartial reviews from credible research peers, and 3) a concrete and thorough description of the context of the finding. That seems like a minimum journalistic standard. It shouldn't be good enough to point to a dude associated with MIT and say "this dude says this thing, so <press narrative>". If journalists rely on credentials, without reasoning, to assert truth, it's just manipulation hiding behind assumed pedigree.
This type of "science" journalism is everything that is wrong with the intersection of media and research. Funding aside, the researchers are isolating a non-transparent, non-reproducible data finding, out of context, and feeding it to the press. On the other hand, the press is reporting this finding, and weighting believability upon credential, instead of reasoning. The head researcher of the article even lists his key credentials as follows: "..His work has been covered in numerous popular press articles..". Good, informative, transparent research? who cares, credentials are accumulating.
The output is statistical nonsense. Its a complete waste of time for anyone looking for real information, because the findings are not contextualized in any meaningful way. The article does not come close to offering apples-apples comparisons of other surveys that would give the reader a working understanding of the economic dynamic it talks about.
I'm so confident that the findings are meaningless because I used to own several businesses where I hired 100s of professional drivers. It would be easy to manufacture results like this $3.37 "median" finding. Many drivers in any given year start and quit with no idea of what the job entails or the economics, yes, even when you tell them explicitly up front. I've seen first-time drivers that want to use vehicles that get 12 mpg. Some drivers just shouldn't be drivers; they can't navigate, avoid traffic, or follow basic instructions. Many drivers aren't ready to treat the situation as a business with a profit and cost center - it's not trivial to quantify costs, and many people don't think that way at all. Also, some new drivers that might make good drivers, take some time to learn. They can take a while to work into a rotation and earn more lucrative rides. Also, many drivers take time to understand the principle that some times of day are busier and they will make more money during that time.
Context matters, a lot.
Also, I frequently ask Lyft / Uber drivers what they make. In many different cities. Its all about the same as I used to see in my businesses. It's not magic, the drivers who stick with it understand the system and make anywhere from nominally above minimum wage to maybe 3X minimum wage for the most capable and most opportunistic drivers.
The basic economics of professional contract drivers has been about the same for dozens of years. I'd speculate what might have changed, if anything, is that Lyft / Uber have normalized driving as a profession, and have strong brands that attract more people. Maybe that results in higher churn rates than the industry has ever seen. That would actually be really interesting if a university researched that trend and explained it through quality scientific journalism.
I feel journalists who are serious about presenting research findings will provide: 1) a complete and transparent understanding of the methodology of the findings, 2) impartial reviews from believable research peers, 3) a concrete and thorough description of the context of the finding. This seems like a minimum journalistic standard. It shouldn't be good enough to point to a dude associated with MIT and say "this dude says this thing so <press narrative>". If journalists rely upon credential, without reasoning, to assert truth, its just manipulation hiding behind assumed pedigree.