Snapchat

Since January 2015, Snapchatters have been using filters to transform themselves into puppies, flower princesses, and babies. To apply a filter to a face, Snapchat first detects the face with the Viola–Jones algorithm, then uses image processing to overlay the features onto it. However, not every face is recognized by Snapchat’s algorithm. According to Joy Buolamwini’s research with the MIT Media Lab, facial recognition technology performs significantly worse on darker-skinned females: compared with an error rate of 0.8% for lighter-skinned males, error rates for darker-skinned females were as much as 34.4% higher.

Facebook

The Facebook algorithm controls the ordering and presentation of posts and ads so that users see what is most relevant to them. In 2016, a report from ProPublica showed that Facebook’s “ethnic affinities” demographic could be used to exclude racial groups from an ad’s reach, a violation of federal law. The algorithm unknowingly helped spread discrimination. For example, Facebook’s algorithm may decide to aim an ad for high-end real estate at an affluent zip code, which demographically skews white, thus violating the Fair Housing Act.


Google

Google created an artificial intelligence algorithm in 2016 meant to monitor and prevent hate speech on social media platforms and websites. However, researchers at the University of Washington discovered that the tool was profiling tweets posted by African-Americans as hate speech. Tweets written in African-American Vernacular English (AAVE), including terms like the “n-word” that are culturally acceptable among African-Americans, were often flagged as offensive and therefore labeled as “hate speech.” As a result, the algorithm became inherently biased against African-Americans.

Healthcare System

Published Oct. 25, 2019, in the journal Science, a study found that a widely used software algorithm that determines who gets access to high-risk health care management programs routinely admits healthier white people into the programs ahead of black people who are less healthy. The algorithm encodes racial bias by using health care costs as a proxy for patient “risk,” i.e., who is most likely to benefit from care management programs. Because of the structural inequalities in our health care system, black people at a given level of health end up generating lower costs than white people, so fewer black patients, despite being sicker, are admitted to the programs.
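The mechanism the study describes can be sketched with a toy example. The patient data below is entirely hypothetical; it only illustrates how ranking by cost (the biased proxy) rather than by actual health need under-selects a group that generates lower costs at the same level of sickness.

```python
# Hypothetical patients: (name, group, sickness score 0-10, annual cost in $).
# The black patients here are sicker but generate lower costs, mirroring
# the access barriers the Science study identifies.
patients = [
    ("A", "white", 6, 9000),
    ("B", "white", 5, 8000),
    ("C", "black", 8, 7000),
    ("D", "black", 7, 6000),
]

def admit_by_cost(patients, slots):
    """Rank by predicted cost (the biased proxy) and admit the top `slots`."""
    return sorted(patients, key=lambda p: p[3], reverse=True)[:slots]

def admit_by_sickness(patients, slots):
    """Rank by actual health need instead."""
    return sorted(patients, key=lambda p: p[2], reverse=True)[:slots]

print([p[0] for p in admit_by_cost(patients, 2)])      # ['A', 'B']
print([p[0] for p in admit_by_sickness(patients, 2)])  # ['C', 'D']
```

With cost as the proxy, both program slots go to the healthier white patients; ranking by sickness directly admits the two patients who actually need care most, which is essentially the fix the study's authors proposed.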
