Need a job? A loan? The software making the decision may have a racial or gender bias

Usually, the American Civil Liberties Union tackles civil rights issues caused directly by other humans. Now, the organization has partnered with a group of researchers to form the AI Now Initiative out of concern over machines that are showing signs of hidden bias. For example, a ProPublica investigation found disparities in a computer algorithm used to predict whether a person would commit a crime in the future, known as a “risk assessment.” The findings? The mathematical formula was more likely to mislabel black defendants as future criminals while rating white defendants as lower risk.

Another example involves algorithmic bias in job hiring tools used by human resources departments, which may end up excluding potential employees based on gender, race, age, disability, or military service, all of which are protected classes under employment law, the Harvard Business Review reported. These kinds of algorithmic biases could be dangerous for low-income communities and minorities, especially since algorithms can be used to decide who receives a loan or gets a job interview, according to the MIT Technology Review. And neither tech companies nor the government has addressed the problem, the publication reported. Kate Crawford, a…
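To make the ProPublica finding concrete: one common way to quantify that kind of disparity is to compare false positive rates across groups, that is, how often people who did not go on to reoffend were nonetheless labeled high risk. The sketch below is a minimal illustration of that measurement using entirely synthetic records and hypothetical group labels; it is not ProPublica's actual analysis and not the real risk-assessment tool.

```python
# Minimal sketch of a group-wise false positive rate comparison.
# All records below are synthetic; "group_a"/"group_b" are hypothetical labels.

def false_positive_rate(predicted_high_risk, reoffended):
    """Share of non-reoffenders who were labeled high risk."""
    non_reoffender_preds = [
        p for p, y in zip(predicted_high_risk, reoffended) if y == 0
    ]
    if not non_reoffender_preds:
        return 0.0
    return sum(non_reoffender_preds) / len(non_reoffender_preds)

# Synthetic records: (group, predicted_high_risk, actually_reoffended)
records = [
    ("group_a", 1, 0), ("group_a", 1, 0), ("group_a", 0, 0), ("group_a", 1, 1),
    ("group_b", 0, 0), ("group_b", 1, 0), ("group_b", 0, 0), ("group_b", 1, 1),
]

for group in ("group_a", "group_b"):
    preds = [p for g, p, y in records if g == group]
    labels = [y for g, p, y in records if g == group]
    print(group, "false positive rate:", round(false_positive_rate(preds, labels), 2))
```

Run on this toy data, group_a sees a false positive rate of 0.67 versus 0.33 for group_b, the shape of disparity ProPublica reported, even though the tool never looks at the group label directly.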


