> The site will explore three broad investigative categories: how profiling software discriminates against the poor and other vulnerable groups; internet health and infections like bots, scams and misinformation; and the awesome power of the tech companies.
Shouldn't they prove "if" software discriminates before figuring out "how"? It seems like they already have a conclusion in mind which rarely produces quality results.
I'm not sure why you've been downvoted for this. Although I'm inclined to agree with their starting position, it's hard to deny the point: quality investigative journalism shouldn't set out to prove a preexisting hypothesis.
If you read the article, they already showed that Facebook was selling illegal housing ads that were racially biased, and that a commonly used parole software was also racially biased. It stands to reason there are plenty of other issues out there, inadvertent or otherwise.
The book "Weapons of Math Destruction" by Cathy O'Neil has plenty of other examples of how profiling algorithms often end up disadvantaging the disadvantaged.
True, not all software is. However, there are some cases that are clearly bias laundering, like sentencing algorithms that factor in garden size (in the UK sense). That's pretty clearly a spurious correlation, and even if it were causal it would suggest requiring parolees to live in areas with larger gardens as a rehabilitative measure, not giving a slap on the wrist to the rural and wealthy.
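To make the "bias laundering" point concrete, here's a minimal toy simulation (entirely hypothetical data, not from any real sentencing system): garden size is generated as a noisy proxy for wealth, and reoffending is driven by poverty alone, so garden size has no causal effect. A score built on garden size still "predicts" reoffending, because it's really measuring wealth.

```python
import random

random.seed(0)

n = 10_000
# Hypothetical latent wealth for each person.
wealth = [random.gauss(0, 1) for _ in range(n)]
# Garden size is just wealth plus noise -- a proxy, not a cause.
garden_size = [w + random.gauss(0, 0.5) for w in wealth]
# In this toy model, reoffending depends only on poverty (negative wealth).
reoffend = [1 if (-w + random.gauss(0, 1)) > 1 else 0 for w in wealth]

def corr(xs, ys):
    """Pearson correlation, computed by hand to stay stdlib-only."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Garden size anti-correlates with reoffending purely through wealth,
# so an algorithm scoring on it is laundering a wealth signal.
print(round(corr(garden_size, reoffend), 2))
```

The printed correlation is clearly negative even though garden size was given no causal role, which is exactly why "the feature predicts outcomes" is not a defense against the laundering charge.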