A similar thing happened to a friend of mine about 20 years ago, but without algorithms, in New Zealand. He had been warning for more than a year that his former girlfriend was dangerous, and he wanted custody.
Then she suffocated their 6 year old daughter with a pillow and burnt the house down.
Who designed the algorithm, and who implemented it? Where is the accountability for those two actors? Ultimately they are the ones at fault here, not the “algorithm”.
The actual headline is a flawed narrative. It should read “police software written by X and deployed under Y’s purview caused death”. The story should be about that, not the falsehood that some nebulous “algorithm” was responsible
Isn’t this how risk works? You assess 1000 people to be at medium risk, and 1 dies, because medium risk means 1/1000. Kind of like lottery tickets all being a low chance of winning the jackpot, yet someone usually does…
I don’t mean to play down a tragedy. But this is just stats meeting reality, and human (mis)understanding of stats.
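The base-rate point above can be sketched numerically. The numbers here are illustrative assumptions, not from the article: suppose "medium risk" means roughly a 1-in-1000 chance of the worst outcome per person.

```python
import random

random.seed(42)

RISK = 1 / 1000   # assumed per-person probability ("medium risk")
N = 1000          # people assessed as medium risk

# Expected number of worst-case outcomes across the whole group:
expected = N * RISK
print(expected)  # 1.0

# Simulate many such groups of 1000: most see 0 or 1 bad outcome,
# but across enough groups, someone's "low chance" almost always hits.
trials = 1000
deaths_per_group = [
    sum(random.random() < RISK for _ in range(N))
    for _ in range(trials)
]
print(max(deaths_per_group) >= 1)  # True
```

So a "medium risk" label being followed by a death isn't evidence the assessment was wrong; it's what a correctly calibrated 1/1000 rating predicts will happen to someone in the group.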
I think it's indirectly relevant in a disturbing way: if you had a perfect "algorithm"[1] to predict (say) recidivism (let's not quibble over how it works or exactly what risk factors it assesses; let's just posit it is very accurate), then that algorithm would be racist, because its predictions would be correlated with the race of the prisoner.
So we can either have fair, colour blind algorithms or accurate algorithms, but we can't have both.
[1] the term "algorithm" is used in the article but it sounds more like a risk assessment framework.
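The tension described above can be made concrete with a toy calculation. All numbers here are invented for illustration; assume two hypothetical groups whose base rates for the predicted outcome differ (the correlation the comment posits).

```python
# Hypothetical base rates and group sizes (invented for illustration).
base_rate = {"group_a": 0.30, "group_b": 0.10}
n = {"group_a": 1000, "group_b": 1000}

# A perfectly accurate predictor flags exactly the people who will
# reoffend, so its flag rate in each group equals that group's base rate:
flag_rate = dict(base_rate)
print(flag_rate)  # {'group_a': 0.3, 'group_b': 0.1}

# A "colour blind" rule that flags both groups at the same 20% rate
# must trade away accuracy: it misses true cases in one group and
# wrongly flags people in the other.
target = 0.20
missed_in_a = round(n["group_a"] * (base_rate["group_a"] - target))
false_in_b = round(n["group_b"] * (target - base_rate["group_b"]))
print(missed_in_a, false_in_b)  # 100 100
```

Under these assumed base rates, equalizing flag rates costs 100 missed true positives in one group and 100 false positives in the other, which is the "fair or accurate, but not both" tradeoff in miniature.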
Bullshit use of passive voice in the article. This woman was not "killed" by a passive, abstract force. A man exercised his human agency to burn down her house with her child in it.