{"id":81415,"date":"2019-09-06T11:38:35","date_gmt":"2019-09-06T16:38:35","guid":{"rendered":"https:\/\/news.medill.northwestern.edu\/chicago\/?p=81415"},"modified":"2019-09-06T11:38:35","modified_gmt":"2019-09-06T16:38:35","slug":"ai-for-candidate-screening-eliminating-or-reinforcing-bias","status":"publish","type":"post","link":"https:\/\/news.medill.northwestern.edu\/chicago\/ai-for-candidate-screening-eliminating-or-reinforcing-bias\/","title":{"rendered":"AI for candidate screening: eliminating or reinforcing bias"},"content":{"rendered":"<p><strong>Yixuan Xie<\/strong><br \/>\n<em>Medill Reports<\/em><\/p>\n<p class=\"dropcap\">While job applicants hope they are evaluated based upon their capabilities and skills when applying for a job, hiring decisions can be full of biases, ranging from dismissing a candidate simply for a name to focusing recruiting efforts on elite schools.<\/p>\n<p>With multiple studies revealing discrimination in recruitment, artificial intelligence is being embraced as a way to level the playing field. AI removes human interaction from some parts of resume and video screening, helping to address conscious and unconscious hiring bias. But despite promising a more consistent and fairer way to evaluate applicants, it has the potential to be problematic.<\/p>\n\n<!-- iframe plugin v.6.0 wordpress.org\/plugins\/iframe\/ -->\n<iframe src=\"https:\/\/uploads.knightlab.com\/storymapjs\/df58d37309dfaa923b589d6457accd5e\/ai-powered-candidate-screening-tools\/index.html\" frameborder=\"0\" width=\"100%\" height=\"800\" scrolling=\"yes\" class=\"iframe-class\"><\/iframe>\n\n<p><strong>Resume Screening <\/strong><\/p>\n<p><a href=\"https:\/\/digitalcommons.ilr.cornell.edu\/cgi\/viewcontent.cgi?article=1108&amp;context=student\">A 2016 study by Cornell University<\/a> showed that resumes reveal candidates\u2019 personally identifiable information and may introduce bias into the screening process, especially at the initial stages. 
It found that candidates with Caucasian-sounding names had 50 percent higher callback rates for interviews than candidates with African American-sounding names. <a href=\"https:\/\/www.payscale.com\/data\/gender-pay-gap\">Research by PayScale<\/a>, a salary trend analysis website, revealed this year that women face barriers to being hired at tech companies, where they make up just 29 percent of the workforce.<\/p>\n<p><!--more--><\/p>\n<p>Inspired by anonymized approaches to reviewing resumes, which the Cornell study suggests lead to more equal opportunities for every candidate, Opus AI Inc., a New York-based startup, developed a blind screening system that uses AI technology to enrich a candidate\u2019s profile with additional collected data.<\/p>\n<p>After employers prioritize five of their qualification requirements in\u00a0categories like job skills, soft skills, culture and personality, Opus AI goes beyond candidates\u2019 resumes, which are not sufficient to decide whether or not to bring someone in for an interview, said Loren Davie, CEO at Opus AI. Reaching out to candidates through email, Opus AI asks interview-style questions based on the qualifications selected by employers, such as \u201cHow many years\u2019 experience do you have in this particular skill?\u201d or \u201cTell me about a time when you had a conflict with a co-worker and how did you resolve it,\u201d Davie said.<\/p>\n<p>With candidates\u2019 input and their uploaded resumes, Davie said Opus AI builds profiles and removes names from them by default. It also offers options to remove gender, race, age, email address, phone number and websites before screening. 
It then generates charts comparing candidates\u2019 qualifications against what employers are asking for before a human makes the final decision, Davie said.<\/p>\n<p>\u201cIf you can just look at candidates based on their capabilities, essentially their fit for a particular position, the assessment is effectively skill-based and can bypass this question of bias,\u201d Davie said.<\/p>\n<p><strong>Video Screening <\/strong><\/p>\n<p>While Opus AI only automates the profile-building part of the screening process, two other video screening platforms are using AI to decide whether to move a candidate to future rounds.<\/p>\n<p>HireVue Inc., based in South Jordan, Utah, has built AI <span style=\"font-weight: 400\">assessment <\/span>models that analyze <span style=\"font-weight: 400\">answers to interview questions<\/span> recorded by applicants, looking at word choice, frequency of word usage, eye movements, facial cues and other traits to help companies identify who is most likely to be successful at a job, said Kevin Parker, HireVue CEO. When recruiters get a stack of resumes or cover letters, they have to pick the best-fit candidates based on some criteria, and the criteria can be as unfair as only interviewing applicants from Ivy League schools, he said. Companies need a way to fairly, consistently and thoughtfully select the best people from all applicants, he said.<\/p>\n<p>HireVue personalizes interview questions for each job with a team of <span style=\"font-weight: 400\">data scientists, engineers and <\/span>13 industrial-organizational psychologists, Parker said. The company works with customers to decide what characteristics to look for and identifies several questions that can get at those characteristics during an interview.<\/p>\n<p>\u201cIn a particular case, the candidate uses the word \u2018I\u2019 more than they use the word \u2018we\u2019 when <span style=\"font-weight: 400\">answering<\/span> questions. 
But if you are working in a team-oriented environment, maybe that\u2019s indicative of somebody who is more individually oriented in terms of their approach to work,\u201d Parker said.<\/p>\n<p>Knockri Inc., an AI video assessment startup in Toronto, Canada, uses a similar approach to measure employers\u2019 most desired soft skills, such as empathy, persuasion, growth mindset and collaboration, said Jahanzaib Ansari, co-founder and CEO at Knockri.<\/p>\n<p>\u201cNobody is watching the video and it\u2019s scored by AI,\u201d Ansari said. \u201cOur AI is trained to not see your name, skin color, race or gender. That being said, we completely de-identified any of those characteristics that could lead to the bias. So, it only looks at facial muscle movements and the content of the response and that\u2019s it.\u201d<\/p>\n<p>Ansari said he has personally experienced discrimination when applying for jobs because his name sounds foreign to some people. Despite having strong skills and experience, he wouldn\u2019t hear back from employers. To avoid those situations, Knockri created a blind video screening tool that is used even before recruiters review resumes, Ansari said.<\/p>\n<p>Besides the questions that test emotion-related capabilities, HireVue has also added game-based challenges such as puzzle games and memory games for cognitive assessments, Parker said. Questions and games tend to be mixed, and a candidate may play one or two games within a 20- to 25-minute interview. 
The assessment is not only used to make recommendations to customers on which candidates to follow up with, but also to give applicants a report showing their strengths\u00a0<span style=\"font-weight: 400\">and areas for potential development<\/span>, Parker said.<\/p>\n<p>According to a 2018 survey conducted by Knockri of more than 5,000 individuals, companies using its AI-powered screening tool have seen 23 percent more diverse candidates shortlisted, Ansari said. There is also a 62 percent reduction in cost and a 68 percent drop in time when filling positions, he said.<\/p>\n<p>Measuring the interview-to-offer ratio, Parker said that for every 10 candidates who get a final interview, the number hired has increased from one to 6.5. Matching what candidates are really good at with what a job requires also helps new hires stay longer at a company, he said.<\/p>\n<p>\u201cWe see turnover actually reduced, whether it&#8217;s moving from 100 percent to 50 percent or on average candidates have stayed a year longer in the job that they otherwise wouldn\u2019t have,\u201d Parker said.<\/p>\n<p>re:work, a Chicago-based <span style=\"font-weight: 400\">non-profit\u00a0<\/span>organization that creates educational and career opportunities for underserved communities, is using HireVue to identify people with strong communication, negotiation and persuasion skills among more than 500 applicants each year to enroll in an eight-week training program that runs every month, said Shelton Banks, CEO at re:work. After the first four weeks, re:work uses HireVue\u2019s AI assessment tool again to measure a person\u2019s growth. 
Based on the assessments, re:work suggests certain roles and companies for them, Banks said.<\/p>\n<p>Banks said he has seen an increase in candidate quality, with fewer people dropping out of the program and more candidates getting a job.<\/p>\n<p><strong>Concerns about prejudices in algorithms<\/strong><\/p>\n<p>While <a href=\"https:\/\/www.mercer.com\/content\/dam\/mercer\/attachments\/private\/gl-2019-global-talent-trends-study.pdf\">Mercer\u2019s Global Talent Trends 2019 report<\/a>, which gathers the views of more than 7,300 business executives, human resource leaders and employees around the world, shows that three in five companies plan to increase automation this year, there are concerns about deep-rooted human prejudices being built into algorithms.<\/p>\n<p>For example, <a href=\"https:\/\/www.reuters.com\/article\/us-amazon-com-jobs-automation-insight\/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G\">Reuters reported<\/a> in October 2018 that Amazon\u2019s AI-powered recruiting system was trained on resumes submitted to the company over a 10-year period, most of which came from men. Trained on historical data from an environment where hiring was already biased, the system learned a correlation between high-performing candidates and gender and therefore favored men over women. It downgraded resumes that included the word \u201cwomen\u2019s\u201d and assigned lower scores to graduates of women-only colleges, according to Reuters. 
As a result, the company scrapped the project in 2017.<\/p>\n<p>Ben Eubanks, a human resource analyst and author of \u201cArtificial Intelligence for HR: Use AI to Support and Develop a Successful Workforce,\u201d said he is concerned that algorithms may have a hard time detecting the facial features of people who are not Caucasian men, or may downgrade candidates for looking away from the camera when, in some cultures, looking directly at someone is impolite.<\/p>\n<p>\u201cI think that fear is common when it comes to technology,\u201d Banks said. \u201cWe still actually take a second look at those videos to see if the AI is really true. In some cases, we still picked up the person despite what HireVue is saying, but overall it is extremely accurate in assessing if a person can do well.\u201d<\/p>\n<p>Ninety-eight percent of re:work\u2019s candidates are minorities, and Banks said he has seen people getting high scores on HireVue regardless of their backgrounds.<\/p>\n<p>HireVue trains its models on a broad, diverse data set, and the company always tests the data before deploying an algorithm to make sure that different groups of people are not being adversely impacted, Parker said.<\/p>\n<p>\u201cFor example, what we found over time in the interviewing process is that men tend to talk faster than women,\u201d Parker said. 
\u201cSo, when we evaluate speed of vocalization, we have so much data in our algorithms that we can completely avoid those factors as consistently and as fairly as we possibly can.\u201d<\/p>\n<p><strong>Problems with AI<\/strong><\/p>\n<p>Taking potentially discriminatory variables out of algorithms leads to two problems, said Mireille Hildebrandt, a research professor on \u201cInterfacing law and technology\u201d at the Vrije Universiteit Brussel and general co-chair of the ACM FAT* 2020 conference, the renowned computer science conference on fairness, accountability and transparency in machine learning.<\/p>\n<p>\u201cPeople of a certain age behave differently from people of another age. If you take the age out of the data, that means the system will simply use those other data that correlate with this prohibited data in order to be effective,\u201d Hildebrandt said. \u201cIn the end, it will still give you the same outcome.\u201d<\/p>\n<p>Hildebrandt said people in the field of computer science are working hard to solve this problem, but the lack of transparency in testing is also an issue.<\/p>\n<p>\u201cIf the application is behind a trade secret or intellectual property rights, the company can say it has done everything. But if we cannot test it, why ever should we believe this, as it may be a short-term competitive advantage to refrain from investing in what is now called \u2018fair, accountable and transparent\u2019 AI,\u201d Hildebrandt said.<\/p>\n<p>Hildebrandt stressed the need for more transparency around proprietary software and said companies should be able to show regulators or machine learning experts their testing processes and prove whether or not the tool is discriminatory.<\/p>\n<div class=\"featurecaption\">Photo at top: Artificial intelligence has reshaped how candidates are evaluated. 
(Gerd Altmann\/Pixabay)<\/div>\n","protected":false},"excerpt":{"rendered":"<p>Yixuan Xie Medill Reports While job applicants hope they are evaluated based upon their capabilities and skills when applying for a job, hiring decisions can be full of biases, ranging from dismissing a candidate simply for a name to focusing recruiting efforts on elite schools. With multiple studies revealing discrimination in recruitment, artificial intelligence is [&hellip;]<\/p>\n","protected":false},"author":542,"featured_media":81423,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[27,4779,4630],"tags":[],"class_list":["post-81415","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-business","category-summer-2019","category-technology"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v26.6 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>AI for candidate screening: eliminating or reinforcing bias - Medill Reports Chicago<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/news.medill.northwestern.edu\/chicago\/ai-for-candidate-screening-eliminating-or-reinforcing-bias\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"AI for candidate screening: eliminating or reinforcing bias - Medill Reports Chicago\" \/>\n<meta property=\"og:description\" content=\"Yixuan Xie Medill Reports While job applicants hope they are evaluated based upon their capabilities and skills when applying for a job, hiring decisions can be full of biases, ranging from dismissing a 
candidate simply for a name to focusing recruiting efforts on elite schools. With multiple studies revealing discrimination in recruitment, artificial intelligence is [&hellip;]\" \/>\n<meta property=\"og:url\" content=\"https:\/\/news.medill.northwestern.edu\/chicago\/ai-for-candidate-screening-eliminating-or-reinforcing-bias\/\" \/>\n<meta property=\"og:site_name\" content=\"Medill Reports Chicago\" \/>\n<meta property=\"article:published_time\" content=\"2019-09-06T16:38:35+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/s3.amazonaws.com\/medill.wordpress.offload\/WP%20Media%20Folder%20-%20medill-reports-chicago\/wp-content\/uploads\/sites\/3\/2019\/08\/Recruitment-1.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1100\" \/>\n\t<meta property=\"og:image:height\" content=\"479\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"yixuanxie2019\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"yixuanxie2019\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"9 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"WebPage\",\"@id\":\"https:\/\/news.medill.northwestern.edu\/chicago\/ai-for-candidate-screening-eliminating-or-reinforcing-bias\/\",\"url\":\"https:\/\/news.medill.northwestern.edu\/chicago\/ai-for-candidate-screening-eliminating-or-reinforcing-bias\/\",\"name\":\"AI for candidate screening: eliminating or reinforcing bias - Medill Reports Chicago\",\"isPartOf\":{\"@id\":\"https:\/\/news.medill.northwestern.edu\/chicago\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/news.medill.northwestern.edu\/chicago\/ai-for-candidate-screening-eliminating-or-reinforcing-bias\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/news.medill.northwestern.edu\/chicago\/ai-for-candidate-screening-eliminating-or-reinforcing-bias\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/s3.amazonaws.com\/medill.wordpress.offload\/WP%20Media%20Folder%20-%20medill-reports-chicago\/wp-content\/uploads\/sites\/3\/2019\/08\/Recruitment-1.jpg\",\"datePublished\":\"2019-09-06T16:38:35+00:00\",\"author\":{\"@id\":\"https:\/\/news.medill.northwestern.edu\/chicago\/#\/schema\/person\/92c87c9047f286bebc41d680e493e38e\"},\"breadcrumb\":{\"@id\":\"https:\/\/news.medill.northwestern.edu\/chicago\/ai-for-candidate-screening-eliminating-or-reinforcing-bias\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/news.medill.northwestern.edu\/chicago\/ai-for-candidate-screening-eliminating-or-reinforcing-bias\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/news.medill.northwestern.edu\/chicago\/ai-for-candidate-screening-eliminating-or-reinforcing-bias\/#primaryimage\",\"url\":\"https:\/\/s3.amazonaws.com\/medill.wordpress.offload\/WP%20Media%20Folder%20-%20medill-reports-chicago\/wp-content\/uploads\/sites\/3\/2019\
/08\/Recruitment-1.jpg\",\"contentUrl\":\"https:\/\/s3.amazonaws.com\/medill.wordpress.offload\/WP%20Media%20Folder%20-%20medill-reports-chicago\/wp-content\/uploads\/sites\/3\/2019\/08\/Recruitment-1.jpg\",\"width\":1100,\"height\":479},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/news.medill.northwestern.edu\/chicago\/ai-for-candidate-screening-eliminating-or-reinforcing-bias\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/news.medill.northwestern.edu\/chicago\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"AI for candidate screening: eliminating or reinforcing bias\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/news.medill.northwestern.edu\/chicago\/#website\",\"url\":\"https:\/\/news.medill.northwestern.edu\/chicago\/\",\"name\":\"Medill Reports Chicago\",\"description\":\"\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/news.medill.northwestern.edu\/chicago\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Person\",\"@id\":\"https:\/\/news.medill.northwestern.edu\/chicago\/#\/schema\/person\/92c87c9047f286bebc41d680e493e38e\",\"name\":\"yixuanxie2019\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/news.medill.northwestern.edu\/chicago\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/48ea688fc95ad84210dffd68390cdcbb422e441f855f729129977eabb6273789?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/48ea688fc95ad84210dffd68390cdcbb422e441f855f729129977eabb6273789?s=96&d=mm&r=g\",\"caption\":\"yixuanxie2019\"},\"url\":\"https:\/\/news.medill.northwestern.edu\/chicago\/author\/yixuanxie2019\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->","yoast_head_json":{"title":"AI for candidate screening: eliminating or reinforcing bias - Medill Reports Chicago","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/news.medill.northwestern.edu\/chicago\/ai-for-candidate-screening-eliminating-or-reinforcing-bias\/","og_locale":"en_US","og_type":"article","og_title":"AI for candidate screening: eliminating or reinforcing bias - Medill Reports Chicago","og_description":"Yixuan Xie Medill Reports While job applicants hope they are evaluated based upon their capabilities and skills when applying for a job, hiring decisions can be full of biases, ranging from dismissing a candidate simply for a name to focusing recruiting efforts on elite schools. With multiple studies revealing discrimination in recruitment, artificial intelligence is [&hellip;]","og_url":"https:\/\/news.medill.northwestern.edu\/chicago\/ai-for-candidate-screening-eliminating-or-reinforcing-bias\/","og_site_name":"Medill Reports Chicago","article_published_time":"2019-09-06T16:38:35+00:00","og_image":[{"width":1100,"height":479,"url":"https:\/\/s3.amazonaws.com\/medill.wordpress.offload\/WP%20Media%20Folder%20-%20medill-reports-chicago\/wp-content\/uploads\/sites\/3\/2019\/08\/Recruitment-1.jpg","type":"image\/jpeg"}],"author":"yixuanxie2019","twitter_card":"summary_large_image","twitter_misc":{"Written by":"yixuanxie2019","Est. 
reading time":"9 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"WebPage","@id":"https:\/\/news.medill.northwestern.edu\/chicago\/ai-for-candidate-screening-eliminating-or-reinforcing-bias\/","url":"https:\/\/news.medill.northwestern.edu\/chicago\/ai-for-candidate-screening-eliminating-or-reinforcing-bias\/","name":"AI for candidate screening: eliminating or reinforcing bias - Medill Reports Chicago","isPartOf":{"@id":"https:\/\/news.medill.northwestern.edu\/chicago\/#website"},"primaryImageOfPage":{"@id":"https:\/\/news.medill.northwestern.edu\/chicago\/ai-for-candidate-screening-eliminating-or-reinforcing-bias\/#primaryimage"},"image":{"@id":"https:\/\/news.medill.northwestern.edu\/chicago\/ai-for-candidate-screening-eliminating-or-reinforcing-bias\/#primaryimage"},"thumbnailUrl":"https:\/\/s3.amazonaws.com\/medill.wordpress.offload\/WP%20Media%20Folder%20-%20medill-reports-chicago\/wp-content\/uploads\/sites\/3\/2019\/08\/Recruitment-1.jpg","datePublished":"2019-09-06T16:38:35+00:00","author":{"@id":"https:\/\/news.medill.northwestern.edu\/chicago\/#\/schema\/person\/92c87c9047f286bebc41d680e493e38e"},"breadcrumb":{"@id":"https:\/\/news.medill.northwestern.edu\/chicago\/ai-for-candidate-screening-eliminating-or-reinforcing-bias\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/news.medill.northwestern.edu\/chicago\/ai-for-candidate-screening-eliminating-or-reinforcing-bias\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/news.medill.northwestern.edu\/chicago\/ai-for-candidate-screening-eliminating-or-reinforcing-bias\/#primaryimage","url":"https:\/\/s3.amazonaws.com\/medill.wordpress.offload\/WP%20Media%20Folder%20-%20medill-reports-chicago\/wp-content\/uploads\/sites\/3\/2019\/08\/Recruitment-1.jpg","contentUrl":"https:\/\/s3.amazonaws.com\/medill.wordpress.offload\/WP%20Media%20Folder%20-%20medill-reports-chicago\/wp-content\/uploads\/sites\/3\/2019\/08\/Recruitment-1.j
pg","width":1100,"height":479},{"@type":"BreadcrumbList","@id":"https:\/\/news.medill.northwestern.edu\/chicago\/ai-for-candidate-screening-eliminating-or-reinforcing-bias\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/news.medill.northwestern.edu\/chicago\/"},{"@type":"ListItem","position":2,"name":"AI for candidate screening: eliminating or reinforcing bias"}]},{"@type":"WebSite","@id":"https:\/\/news.medill.northwestern.edu\/chicago\/#website","url":"https:\/\/news.medill.northwestern.edu\/chicago\/","name":"Medill Reports Chicago","description":"","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/news.medill.northwestern.edu\/chicago\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Person","@id":"https:\/\/news.medill.northwestern.edu\/chicago\/#\/schema\/person\/92c87c9047f286bebc41d680e493e38e","name":"yixuanxie2019","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/news.medill.northwestern.edu\/chicago\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/48ea688fc95ad84210dffd68390cdcbb422e441f855f729129977eabb6273789?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/48ea688fc95ad84210dffd68390cdcbb422e441f855f729129977eabb6273789?s=96&d=mm&r=g","caption":"yixuanxie2019"},"url":"https:\/\/news.medill.northwestern.edu\/chicago\/author\/yixuanxie2019\/"}]}},"_links":{"self":[{"href":"https:\/\/news.medill.northwestern.edu\/chicago\/wp-json\/wp\/v2\/posts\/81415","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/news.medill.northwestern.edu\/chicago\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/news.medill.northwestern.edu\/chicago\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/news.medill.northwestern.edu\/chicago\/wp-json\/wp\/v2
\/users\/542"}],"replies":[{"embeddable":true,"href":"https:\/\/news.medill.northwestern.edu\/chicago\/wp-json\/wp\/v2\/comments?post=81415"}],"version-history":[{"count":0,"href":"https:\/\/news.medill.northwestern.edu\/chicago\/wp-json\/wp\/v2\/posts\/81415\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/news.medill.northwestern.edu\/chicago\/wp-json\/wp\/v2\/media\/81423"}],"wp:attachment":[{"href":"https:\/\/news.medill.northwestern.edu\/chicago\/wp-json\/wp\/v2\/media?parent=81415"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/news.medill.northwestern.edu\/chicago\/wp-json\/wp\/v2\/categories?post=81415"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/news.medill.northwestern.edu\/chicago\/wp-json\/wp\/v2\/tags?post=81415"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}