How to Beat Facial Recognition Interviews


By EmployDiversity

Like it or not, Artificial Intelligence (AI) technology has become human resource recruitment’s best friend. In 2018, nearly 70 percent of hiring managers and recruiters surveyed by LinkedIn said AI was saving them time. 

Companies like HireVue sell job interview video platforms that use AI to assess candidates. The company claims its package can predict a candidate’s likelihood of succeeding in a position.

HR’s adoption of the technology has major implications for interviewing and hiring diversity professionals. 

This is not necessarily a positive technological advance for diverse professionals. The United States Equal Employment Opportunity Commission (EEOC), the federal agency that enforces laws against employment discrimination, has been investigating at least two discrimination cases involving job-decision algorithms, according to Bloomberg Law.

Until regulators are fully assured that AI does not reproduce the biases and blind spots of its makers, diverse professionals need to be aware of how the tool is applied as they search for jobs.

The AI is not Alright

AI has been intruding into our lives and work at a rapid pace. Unfortunately, the headshots that provide the grist for AI’s interpretive mills are overwhelmingly skewed toward pictures of Americans of European descent. AI uses these photographs to map out the emotions and unexpressed thoughts of individuals. Even under the best of conditions, reading a person’s mood and intentions from a face is an inexact art. The success rate plummets to near zero when the subjects are Black people, Latinos, and women. In other words, AI is already biased by the ignorance of the technology and its creators.

Blind to Diversity

Joi Ito, director of the Massachusetts Institute of Technology (MIT) Media Lab, pointed out during a 2017 Davos discussion on AI that computer scientists and engineers come from strikingly similar backgrounds. For one, they are nearly all white males, he said.

This sort of blinkered view of the world has led to “oversights” such as Google’s AI labeling online photos of Black people as gorillas. In a report for National Public Radio (USA), Laura Sydell relates the shock one man experienced when he uploaded photos of friends into Google Photos. He said, “It labeled them as something else. It labeled her as a different species, a creature.” The title of a New York Times article, “Facial Recognition Is Accurate, if You’re a White Guy,” sums up what MIT research student Joy Buolamwini discovered for herself.

The Secret is Out

Buolamwini discovered that in the core libraries used for face recognition, dark faces simply don’t show up. MIT’s Ito concludes, “... these libraries are used in many of the products you have.”

The AI algorithms that rank search results on Google, for instance, are also swayed, though arguably by social conditioning:

A couple of years ago, a study at Harvard found that when someone searched on Google for a name commonly associated with a person of African-American descent, an ad for a company that finds criminal records was more likely to turn up. The algorithm may have initially done this for both black and white people, but over time, the biases of the people who did the search probably got factored in, says Christian Sandvig, a professor at the University of Michigan School of Information.

Sandvig also says that algorithms are more likely to show lower-paying job listings to women than to men. It’s possible, he suggests, that women pass on the opportunities the search results display, and the behavior becomes self-reinforcing: the same search performed by others then also surfaces lower-paying jobs.

Tips for Beating the Machines

In January 2020, the Illinois Artificial Intelligence Video Interview Act took effect, according to Vox. The legislation requires employers that use AI-based video analysis to notify applicants, explain how the technology works, and obtain their consent. So … if you suspect AI will be a major component of a video interview, and you don’t like the idea to begin with, ask the interviewing company whether facial recognition technology will be a factor in hiring you. If the organization is not forthright in acknowledging its use of the technology, you’ll have a sense that the company’s integrity may not accord with your standards. Opt out of the interview if you really are not comfortable with the process.

Careers consultant Park Seong-jung recommends, “Don’t force a smile with your lips. Smile with your eyes.” First, though, stand in front of a mirror and practice the technique, which may not come naturally to many people.

With AI measuring every twitch and tic of a job candidate’s face, try to avoid anything upsetting in the hours before the interview. Residual emotion may bleed into the answers you give the AI.

And if you have had an upsetting experience before the interview, try to practice some self-care:

  • Meditate to uplifting messaging

  • Exercise

  • Go for a jog (if you’re a jogger)

  • Hang out or vent with a close friend

  • Eat (or make yourself) an amazing meal

The most important consideration before, during, and after an interview with AI is this: be your genuine self. After all, life is short.