Researchers from the Human Rights Data Analysis Group (previously) reimplemented the algorithm behind Predpol, the predictive policing system that police departments around America have spent a fortune on in order to figure out where to send their patrols. They fed it Oakland’s 2010 arrest data, then asked it to predict where the crime would be in 2011.
Predpol’s algorithm munged the arrest data, then confidently asserted that the Oakland PD should concentrate the bulk of its resources in a neighborhood that is poor and black. However, data from the census and the National Survey on Drug Use and Health show that crime actually occurred all across Oakland, meaning that if the Oakland PD had followed Predpol’s advice in 2011, they would have just gone and rounded up and jailed a bunch of black people (remember, 97% of the people indicted in the USA plead guilty, an impossibly high number that guarantees that innocent people are pleading guilty to escape the extreme sentences they’d face if convicted at trial).
The reason Predpol’s model predicts that nearly all the crime will occur in these neighborhoods is that police concentrate their policing there, and you can only find crime in the places where you look for it. The algorithm distills the bias in its input data. Unsurprisingly, these are neighborhoods predominantly populated by low-income people of color. Predpol and tools like it are sold as data-driven ways to overcome this kind of police bias, but really, they’re just ways of giving bias a veneer of objectivity.
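To see how thoroughly arrest data can mirror patrol allocation rather than crime, here’s a minimal sketch in Python. It is not HRDAG’s reimplementation or Predpol’s actual model; the two neighborhoods, the offending rate, and the 80/20 patrol split are all invented for illustration:

```python
import random

random.seed(0)

# Toy numbers, all invented: the true offending rate is identical in
# both neighborhoods, but patrols are concentrated in neighborhood A.
true_crime_rate = {"A": 0.05, "B": 0.05}   # same everywhere
patrol_share    = {"A": 0.80, "B": 0.20}   # where police actually look

# The chance of an offense being *observed* scales with patrol presence,
# so arrest counts end up mirroring the patrol allocation, not the crime.
arrests = {
    hood: sum(random.random() < true_crime_rate[hood] * patrol_share[hood]
              for _ in range(10_000))
    for hood in ("A", "B")
}
print(arrests)  # roughly {'A': 400, 'B': 100}: a 4x gap with zero crime gap
```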
Oakland Mayor Libby Schaaf has repeatedly sought an appropriation of $150,000 to buy Predpol for the city.
Other cities are dumping Predpol. In Burbank, where I live, the police got rid of Predpol after it lowered officer morale to the point where 75% of Burbank cops had “low or extremely low” morale.
Predpol is a classic weapon of math destruction, in that it builds a model without regard to the bias in its input data (every scientist and statistician knows that sampling bias is a deadly pitfall in any kind of statistical analysis). It then predicts the future from that biased data and directs those in authority to act on its predictions in a way that is guaranteed to make the predictions look correct (regardless of whether they are, in fact, correct). Then it re-ingests the data generated by the behavior its biased predictions dictated, and recommends behavior that produces even more biased outcomes.
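The loop is easy to caricature in code. The sketch below is a toy model under invented assumptions (two neighborhoods with an identical true offending rate, 10,000 observations a year, a winner-take-most patrol rule), not Predpol’s algorithm: each year the patrols chase last year’s arrests, and the new arrests dutifully “confirm” the prediction:

```python
import random

random.seed(1)

TRUE_RATE = 0.05                      # identical true offending rate everywhere
patrol_share = {"A": 0.5, "B": 0.5}   # year 0: unbiased patrols

def observed_arrests(patrols):
    # Arrests are a function of where police look, not of any real
    # difference in crime between the two neighborhoods.
    return {hood: sum(random.random() < TRUE_RATE * share
                      for _ in range(10_000))
            for hood, share in patrols.items()}

for year in range(1, 6):
    arrests = observed_arrests(patrol_share)
    # The "prediction": last year's arrest leader is this year's hotspot,
    # so it gets the bulk of the patrols. This step closes the loop.
    hotspot = max(arrests, key=arrests.get)
    patrol_share = {h: 0.8 if h == hotspot else 0.2 for h in patrol_share}
    print(f"year {year}: arrests={arrests} -> next patrols={patrol_share}")
```

Run it and an arbitrary first-year fluctuation hardens into a permanent “hotspot”: the data never contradicts the model, because the model controls where the data gets collected.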
To top it off, a bad prediction by the algorithm causes black people to be overpoliced and white people to be underpoliced, exacerbates the problem of coerced guilty pleas, and feeds a pipeline into the equally racially biased automated sentencing systems that send black people to prison for longer than white people.
i am feeling far from myself. maybe it’s the hour of day maybe it’s kevin devine on repeat maybe it’s the miles between my family and me maybe it’s the lack of doing that thing that made me feel known – maybe it’s all of it.
i do miss performing though. i miss being in front of people and telling them they mattered. to make people sit with the silence after being told something that isn’t often heard. i don’t know. maybe it never did anything or meant anything to anyone except myself. maybe i just enjoyed doing those things because i felt like i was making a difference, like i was helping people to become more comfortable in their skin by telling them they weren’t alone.
i am far from that version of myself. not in nature, but in action. i need to get back to that.