
Tue Oct 22, 2019, 11:58 AM

A face-scanning algorithm increasingly decides whether you deserve the job

HireVue claims it uses artificial intelligence to decide who’s best for a job. Outside experts call it ‘profoundly disturbing.’

By Drew Harwell
Oct. 22, 2019 at 12:03 p.m. EDT

An artificial intelligence hiring system has become a powerful gatekeeper for some of America’s most prominent employers, reshaping how companies assess their workforce — and how prospective employees prove their worth.

Designed by the recruiting-technology firm HireVue, the system uses candidates’ computer or cellphone cameras to analyze their facial movements, word choice and speaking voice before ranking them against other applicants based on an automatically generated “employability” score.
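HireVue does not publish its model, but the ranking step the article describes can be sketched in outline: collapse per-candidate feature scores into a single "employability" number, then rank applicants against one another. Everything below is a hypothetical illustration — the feature names, weights, and scoring function are invented for this sketch and are not HireVue's actual method.

```python
# Hypothetical sketch of score-and-rank hiring, NOT HireVue's actual model.
# Feature names and weights are invented for illustration only.

def employability_score(features, weights):
    """Collapse a candidate's per-feature scores into one number."""
    return sum(weights[name] * value for name, value in features.items())

def rank_candidates(candidates, weights):
    """Rank applicants against one another by descending score."""
    scored = [(name, employability_score(f, weights))
              for name, f in candidates.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Invented example inputs:
weights = {"facial_movement": 0.3, "word_choice": 0.4, "voice_tone": 0.3}
candidates = {
    "A": {"facial_movement": 0.9, "word_choice": 0.6, "voice_tone": 0.7},
    "B": {"facial_movement": 0.5, "word_choice": 0.9, "voice_tone": 0.8},
}
ranking = rank_candidates(candidates, weights)
```

The critics quoted later in the article are objecting to exactly this shape of pipeline: nothing in it validates that the weighted features actually measure job fitness, so the score is only as meaningful as the inputs feeding it.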

HireVue’s “AI-driven assessments” have become so pervasive in some industries, including hospitality and finance, that universities make special efforts to train students on how to look and speak for best results. More than 100 employers now use the system, including Hilton, Unilever and Goldman Sachs, and more than a million job seekers have been analyzed.

Drew Harwell is a technology reporter for The Washington Post covering artificial intelligence and the algorithms changing our lives. He joined The Post in 2014 and has covered national business and the Trump companies. Follow https://twitter.com/drewharwell

4 replies, 967 views

Replies to this discussion thread:
Original post: mahatmakanejeeves, Oct 22
Reply #1: Mosby, Oct 22
Reply #2: Wellstone ruled, Oct 22
Reply #3: Newest Reality, Oct 22
Reply #4: dalton99a, Oct 22

Response to mahatmakanejeeves (Original post)

Tue Oct 22, 2019, 12:17 PM

1. And the algorithm is proprietary so the BS is well hidden.

Someone needs to sue a company that uses this crap.


Response to Mosby (Reply #1)

Tue Oct 22, 2019, 12:26 PM

2. The Scab has been picked.

The Genie is out of the bottle, never to be returned. I recall a client company of mine working on facial-recognition hardware, software, and the algorithms associated with this project. Yes, the algorithms can be adjusted to do anything one wants when it comes to the selection process.


Response to mahatmakanejeeves (Original post)

Tue Oct 22, 2019, 12:55 PM

3. To Quote Pink Floyd...

"Welcome my Son, to the machine!"

Lie detector, meet HireVue. There is a movement like this with welfare systems around the world now, too.

How do you spell dystopia? Like China's social credit system, they may weed out the expendables with a digital caste system and play who-gets-to-be-an-untouchable. India had something like that.

So, obviously there are going to be a lot of humans who don't have a right to be here anymore, since AI looks like a way to thrash populations and separate the wheat from the chaff (although who is which is questionable). Other than the most elite, the ones still viable to that system do not have my envy, considering just how restrictive and controlled their lives might be.

What will be done with those of us who are outside the constraints of the algorithms and social credits? Let's see, work camps, prisons, euthanasia, maybe efficient recycling as Soylent Green? With less emphasis on human rights, here comes the Oligarchical Technocracy. It looks like an efficient smoke screen for weeding out the undesirables and the timing is just right since it looks like democracy has a short lifespan now.

I give it 20 to 30 years at this pace.


Response to mahatmakanejeeves (Original post)

Tue Oct 22, 2019, 10:12 PM

4. "It's pseudoscience. It's a license to discriminate."

But some AI researchers argue the system is digital snake oil — an unfounded blend of superficial measurements and arbitrary number-crunching that is not rooted in scientific fact. Analyzing a human being like this, they argue, could end up penalizing nonnative speakers, visibly nervous interviewees or anyone else who doesn’t fit the model for look and speech.

The system, they argue, will assume a critical role in helping decide a person’s career. But they doubt it even knows what it’s looking for: Just what does the perfect employee look and sound like, anyway?

“It’s a profoundly disturbing development that we have proprietary technology that claims to differentiate between a productive worker and a worker who isn’t fit, based on their facial movements, their tone of voice, their mannerisms,” said Meredith Whittaker, a co-founder of the AI Now Institute, a research center in New York.

“It’s pseudoscience. It’s a license to discriminate,” she added. “And the people whose lives and opportunities are literally being shaped by these systems don’t have any chance to weigh in.”

The inscrutable algorithms have forced job seekers to confront a new kind of interview anxiety. Nicolette Vartuli, a University of Connecticut senior studying math and economics with a 3.5 GPA, said she researched HireVue and did her best to dazzle the job-interview machine. She answered confidently and in the time allotted. She used positive keywords. She smiled, often and wide.

But when she didn’t get the investment banking job, she couldn’t see how the computer had rated her or ask how she could improve, and she agonized over what she had missed. Had she not looked friendly enough? Did she talk too loudly? What did the AI hiring system believe she had gotten wrong?

“I feel like that’s maybe one of the reasons I didn’t get it: I spoke a little too naturally,” Vartuli said. “Maybe I didn’t use enough big, fancy words. I used ‘conglomerate’ one time.”

HireVue said its system dissects the tiniest details of candidates’ responses — their facial expressions, their eye contact and perceived “enthusiasm” — and compiles reports companies can use in deciding whom to hire or disregard.

Job candidates aren’t told their score or what little things they got wrong, and they can’t ask the machine what they could do better. Human hiring managers can use other factors, beyond the HireVue score, to decide which candidates pass the first-round test.
