African American
There’s software used across the country to predict future criminals. And it’s biased against blacks

Machine Bias
https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
ON A SPRING AFTERNOON IN 2014, Brisha Borden was running late to pick up her god-sister from school when she spotted an unlocked kid's blue Huffy bicycle and a silver Razor scooter. Borden and a friend grabbed the bike and scooter and tried to ride them down the street in the Fort Lauderdale suburb of Coral Springs.
Just as the 18-year-old girls were realizing they were too big for the tiny conveyances, which belonged to a 6-year-old boy, a woman came running after them saying, "That's my kid's stuff." Borden and her friend immediately dropped the bike and scooter and walked away. But it was too late: a neighbor who witnessed the heist had already called the police. Borden and her friend were arrested and charged with burglary and petty theft for the items, which were valued at a total of $80.
Compare their crime with a similar one: The previous summer, 41-year-old Vernon Prater was picked up for shoplifting $86.35 worth of tools from a nearby Home Depot store.
Prater was the more seasoned criminal. He had already been convicted of armed robbery and attempted armed robbery, for which he served five years in prison, in addition to another armed robbery charge. Borden had a record, too, but it was for misdemeanors committed when she was a juvenile.
Yet something odd happened when Borden and Prater were booked into jail: A computer program spat out a score predicting the likelihood of each committing a future crime. Borden, who is black, was rated a high risk. Prater, who is white, was rated a low risk.
Two years later, we know the computer algorithm got it exactly backward. Borden has not been charged with any new crimes. Prater is serving an eight-year prison term for subsequently breaking into a warehouse and stealing thousands of dollars' worth of electronics.
Scores like this, known as risk assessments, are increasingly common in courtrooms across the nation. They are used to inform decisions about who can be set free at every stage of the criminal justice system, from assigning bond amounts, as is the case in Fort Lauderdale, to even more fundamental decisions about defendants' freedom. In Arizona, Colorado, Delaware, Kentucky, Louisiana, Oklahoma, Virginia, Washington and Wisconsin, the results of such assessments are given to judges during criminal sentencing.
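The COMPAS tool the article investigates is proprietary, so its actual model is not public. Purely as an illustration of how a risk-scoring tool of this general kind might map a defendant's record to a 1-10 score, here is a hypothetical logistic-style sketch; every feature name and weight below is invented for the example:

```python
import math

def risk_score(features, weights, bias=0.0):
    """Map defendant features to a 1-10 risk decile via a logistic function.

    Illustrative only: this is NOT the COMPAS model, whose internals are
    proprietary. Features and weights here are invented.
    """
    z = bias + sum(weights[k] * v for k, v in features.items())
    p = 1.0 / (1.0 + math.exp(-z))        # probability-like value in (0, 1)
    return min(10, max(1, round(p * 10)))  # bucket into deciles: 1 low .. 10 high

# Invented weights; note that "age" enters negatively, since younger
# defendants tend to score higher in tools like this.
weights = {"prior_felonies": 0.6, "prior_misdemeanors": 0.2, "age": -0.05}

borden_like = {"prior_felonies": 0, "prior_misdemeanors": 2, "age": 18}
prater_like = {"prior_felonies": 3, "prior_misdemeanors": 0, "age": 41}

print(risk_score(borden_like, weights))
print(risk_score(prater_like, weights))
```

The sketch makes the article's point concrete: the score depends entirely on which features the designers chose and how they weighted them, and a biased choice of inputs or training data produces biased scores.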
snip
Original Post by AntiBank, May 2016
SusanCalvin (6,592 posts)
1. I was just watching a local story on this.
It's supposed to keep people from languishing in jail for lack of bail, but it's not much good if it's too biased to predict accurately.
AntiBank (1,339 posts)
2. Federal lawsuits and class actions are needed to beat this down
It's crazy how much negative impact it has.
Number23 (24,544 posts)
3. Machines are only as good as their makers
Thanks for posting.
MrScorpio (73,630 posts)
4. Programming endemic American anti-black bias into the works (nt)