It's amazing how companies can get so big predicting the future. The process is a glorified dart throw, especially in social systems. The variables are always changing, and the effect of reacting to a prediction (like increasing police presence) changes the expected outcome, so predictions become elusive even if initially correct.
Also, when variables like homicide counts are relatively low numbers, huge percentage swings happen naturally.
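To make the base-rate point concrete, here's a toy arithmetic sketch (all numbers hypothetical): the same absolute change reads very differently depending on the size of the base.

```python
# Toy illustration: identical absolute changes, wildly different percentages.
def pct_change(old, new):
    return 100.0 * (new - old) / old

# small city: 8 -> 12 homicides reads as a +50% "spike"
print(pct_change(8, 12))    # 50.0

# large city: 800 -> 804 is the same kind of noise, but reads as +0.5%
print(pct_change(800, 804))  # 0.5
```

Nothing about the underlying risk need have changed in either case; small denominators just amplify noise into headline-sized swings.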
Macro indicators like weather and per capita income have long been known to be correlated with crime, but trying to predict and proactively reduce it is much harder.
The promise of AI and predictive analytics is huge, but it misses the mark in non-closed systems.
I doubt the actual effectiveness or usefulness of the system mattered much to the police department. Most likely this was used for one of two purposes: a) as mentioned above, parallel construction and b) an excuse to arrest people they wanted to arrest anyway, with the system as cover.
I don't see the parallel construction here. Parallel construction would have to involve use of material that should require a warrant, but one was never obtained. Just generating valuable leads that result in investigation that leads to a warrant isn't parallel construction.
All of those things are true x10 in the stock market, but people still seem to be able to consistently make money. I think you're overstating the difficulty of the problem a bit. There's a lot you can do with simple models to predict crime - they're not perfect, but as long as their users understand that, that's fine.
Also, certain crimes like drug use are common to the point where even models highlighting "links" or "suspects" purely at random might yield increased prosecutions if law enforcement believed in the model enough to investigate more thoroughly than they would if simply following a "hunch", carrying out random checks or conducting a "sweep of the area". And if a model is largely uncorrelated with actual rather than revealed propensity to commit crime, and rather more closely linked to demographics, then its effect on who police prioritise their resources chasing could be a big problem. Certainly a far more likely problem than a model failing to surface any evidence of any criminal activity.
For other reasons, it would also be problematic for defendants to cite false negatives from these limited social-network profiling tools, arguing that the system's failure to identify them as a gang member supports their defence, as the article suggested one New Orleans defendant hoped to do.
I'm not saying it's trivial, or anything close to it. I'm saying that people do it. There are a number of quant firms that are and have been consistently profitable (two sigma, renaissance, etc..).
Two Sigma hasn't been all that consistent. First, its main run of success only began around 2011, after the financial collapse, when the recovery was well under way. Second, it had a pretty bad 2017. It might not be the best example here. I'm not familiar enough with Renaissance to comment on them.
> The owners’ earnings are also driven by Renaissance’s fabled Medallion Fund, which is closed to outsiders. It has earned an estimated annualized return of 35 percent since 1982.
The fact that a coin that comes up heads 75% of the time came up tails one time does not mean it's not a biased coin.
Thanks for the Renaissance link; interesting firm. I still don't think Two Sigma was a good example for a counterclaim, though, since their primary success has come post-financial collapse. Renaissance, on the other hand, has decades of consistent performance, though it isn't really representative of the field. But I take your point nonetheless.
Not all stock markets have the "consistent" growth the US has. I put the word in quotes because that exact notion, the past predicting the future, is what's inherently wrong with many of these companies.
It's super easy to create a model that could beat the market when run against historical data. It's a whole other thing to beat the market in real time.
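A minimal sketch of why backtests flatter: generate many "strategies" with zero real edge, pick the best historical performer, and its track record looks like skill, while its forward performance is just fresh noise. (All numbers here are hypothetical.)

```python
# Backtest-overfitting sketch: the best of many zero-edge strategies
# looks like a market-beating model on historical data.
import random

random.seed(1)

def run(days):
    # cumulative return of a strategy with no edge (~1% daily noise)
    return sum(random.gauss(0.0, 0.01) for _ in range(days))

# "backtest" 1000 random strategies over one year of history each
backtests = [run(250) for _ in range(1000)]
best_idx = max(range(1000), key=lambda i: backtests[i])

# the winner trading the NEXT year: no skill carries over, just noise
forward = run(250)
```

The best backtest typically shows what looks like a strong annual edge despite every strategy being pure noise, which is exactly the trap of selecting models on historical data.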
Evidence of this is that it's possible to leverage stock positions by multiple factors. If anyone could accurately predict the future and profit from it, it would be easy to turn $5k into $5M with enough leverage and compounding profits. The problem is that no one can do it consistently at that level of accuracy.
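The compounding arithmetic behind that claim is short. A hypothetical sketch, assuming a trader could reliably capture a 10% gain per leveraged trade (say, a 1% move at 10x leverage):

```python
# Toy compounding arithmetic (numbers hypothetical): how few winning
# leveraged trades would it take to turn $5k into $5M?
capital, target = 5_000.0, 5_000_000.0
gain = 0.10  # e.g. a 1% predicted move captured with 10x leverage

trades = 0
while capital < target:
    capital *= 1 + gain
    trades += 1
print(trades)  # 73 trades to multiply capital 1000x
```

Seventy-odd trades is nothing over a career, which is the point: if that accuracy were sustainable, someone would be doing it, and the absence of such runs is evidence against it.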
Of course there are. They do it by figuring out how to get material non-public information. That's been the scam for as long as there has been a stock market, the rest is noise.
If enough people flip enough coins, there will be outliers. It’s not proof of the ability to manipulate the falling coin, it’s just how probabilities work. Take the same group and keep them flipping coins, and that becomes clear, but in this case the rewards of luck are persistent, in the form of investment.
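The coin-flipping argument is easy to simulate. A sketch (hypothetical setup): give 10,000 flippers ten fair coin calls each and look at the best record.

```python
# Sketch: with enough coin-flippers, near-perfect records appear by chance alone.
import random

random.seed(7)

flippers = 10_000
# each flipper "calls" 10 fair coin flips; tally their correct calls
records = [sum(random.random() < 0.5 for _ in range(10)) for _ in range(flippers)]
best = max(records)
# a perfect 10-for-10 record has probability 1/1024 per flipper, so with
# 10,000 flippers roughly ten of them look infallible purely by luck
```

None of the flippers has any skill, yet the top records are indistinguishable from what a genuinely skilled caller would produce over a short window.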
Many quant firms beat the market with a consistency that puts their p-values well below the threshold that would be required to make this claim, even given the multiple comparisons problem you bring up.
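That p-value claim can be made concrete. A hedged sketch: suppose a fund beat the market in 30 of 32 years (a Medallion-like record; the exact counts here are hypothetical), and apply a Bonferroni correction for, say, 1000 funds having been examined.

```python
# Under a "fair coin" null (no skill, 50% chance of beating the market each
# year), how surprising is 30 wins in 32 years after multiple-comparisons
# correction?
from math import comb

def binom_tail(n, k, p=0.5):
    # P(X >= k) for X ~ Binomial(n, p)
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

p_single = binom_tail(32, 30)        # one fund's record under the null
p_bonf = min(1.0, p_single * 1000)   # Bonferroni over 1000 funds tried
# p_bonf remains far below 0.05, so "luck among many funds" is rejected
```

This is the sense in which a long enough winning streak survives the multiple comparisons problem: the per-fund p-value is so small that even multiplying by the number of funds leaves it well under any conventional threshold.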
The question you're begging is whether the computer oracle is accurate or transparent at all, let alone more so than traditional police work. The point isn't that the whole thing is a bad idea, it's that the comparison to wall street should not be encouraging. Frankly from the article, it sounds like the system they had just wasn't that useful, so this is moot.
From the outside, it's reminiscent of the LA school system's ipad fiasco. A flashy silicon valley purchase that hits the top three requirements for a government program:
1) Appear to be useful by spending a lot of money
2) Without addressing actual structural problems
3) While funneling large amounts of public money to well connected contractors.
My point is that there is nothing wrong with this technology in principle. I'm not saying anything about this technology in particular. The comparison to wall street should be extremely encouraging - a small number of extremely smart firms consistently beat the market. If a small number of firms can produce policing tools that make it more efficient, that'd be great.
It's unacceptable in principle. If a wall street model is right more than it is wrong then it wins. If police are wrong half the time we have a big problem on our hands.
Nobody makes all the right calls on wall street. Some just win more than they lose.
I don't see how that's relevant. As has been pointed out to you many times "it's as good as wall street" isn't an acceptable standard for criminal investigation.
Sigh. Convictions are the only place where you need a 100% success rate. How do you think a criminal investigation gets conducted? Do you think police only investigate a lead when they're 100% sure it will lead directly to a conviction? If a piece of software generated 5 extra leads, and 1 of those extra leads led to actionable evidence, that's useful. You're right: 'good as wall street' isn't the standard; a standard much less stringent than wall street's is perfectly acceptable for software that acts as an additive, informational aid to policing.
There are a number of quant funds that consistently make money. Most of them are not trying to take anyone's money, because they're not open to the public, because they don't need more AUM.