
Fri Jan 25, 2019, 01:09 PM

Amazon facial-identification software used by police falls short on tests for accuracy and bias, new research finds

Source: Washington Post

Amazon facial-identification software used by police falls short on tests for accuracy and bias, new research finds

By Drew Harwell
January 25 at 11:01 AM

Facial-recognition software developed by Amazon and marketed to local and federal law enforcement as a powerful crime-fighting tool struggles to pass basic tests of accuracy, such as correctly identifying a person's gender, new research released Thursday says.

Researchers with M.I.T. Media Lab also said Amazon's Rekognition system performed more accurately when assessing lighter-skinned faces, raising concerns about how biased results could tarnish the artificial-intelligence technology's use by police and in public venues, including airports and schools.

Amazon's system performed flawlessly in predicting the gender of lighter-skinned men, the researchers said, but misidentified the gender of darker-skinned women in roughly 30 percent of their tests. Rival facial-recognition systems from Microsoft and other companies performed better but were also error-prone, they said.

The problem, AI researchers and engineers say, is that the vast sets of images the systems have been trained on skew heavily toward white men. The research shows, however, that some systems have rapidly grown more accurate over the past year after greater scrutiny and corporate investment into improving the results.
....

Drew Harwell is a national technology reporter for The Washington Post specializing in artificial intelligence. He previously covered national business and the Trump companies. Follow https://twitter.com/drewharwell

Read more: https://www.washingtonpost.com/technology/2019/01/25/amazon-facial-identification-software-used-by-police-falls-short-tests-accuracy-bias-new-research-finds/



David Fahrenthold Retweeted

https://twitter.com/Fahrenthold

Amazon's facial-recognition software, marketed 2 law enforcement as a powerful crime-fighting tool, struggles 2 pass basic tests o/accuracy, raising concerns abt how biased results could tarnish AI's exploding use by police & surveillance. By @drewharwell


Replies to this discussion thread
2 replies — Author, Time
Amazon facial-identification software used by police falls short on tests for accuracy and bias (Original post) — mahatmakanejeeves, Jan 25
#1 — McCamy Taylor, Jan 25
#2 — durablend, Jan 25

Response to mahatmakanejeeves (Original post)

Fri Jan 25, 2019, 01:12 PM

1. It would be so easy to fool a computer since it is programmed to check a few signs

without actually having the complicated human right brain that allows us to recognize faces. For instance, such software reads me as "juvenile male" from the nose up. And I ain't a juvenile male! Savvy makeup artists will be applying the touches that they know will fool the computer.



Response to McCamy Taylor (Reply #1)

Fri Jan 25, 2019, 01:18 PM

2. Keep in mind

It was Mechanical Turk that "trained" the AI, clearly doing a shitty job of it.
