A Stanford University study has revealed racial disparities in the speech recognition systems of Amazon, Apple, Google, IBM and Microsoft. The study, published in the Proceedings of the National Academy of Sciences (PNAS), found that the speech recognition systems of these five leading technology companies made far fewer errors with white users than with black users.
The study showed that, on average, the five companies' systems misidentified about 19 percent of words spoken by white users; among black users, the error rate jumped to 35 percent. In addition, about 2 percent of audio clips from white speakers were deemed unreadable by these systems, compared with as much as 20 percent for black speakers.
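Error rates like those above are conventionally measured as word error rate (WER): the word-level edit distance between a reference transcript and the system's output, divided by the length of the reference. The following is a minimal sketch of that calculation, not the study's own code; the function name and example sentences are illustrative.

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER: word-level Levenshtein distance divided by reference length."""
    ref = reference.split()
    hyp = hypothesis.split()
    # d[i][j] = edit distance between the first i ref words and first j hyp words
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # deleting all i reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # inserting all j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,         # deletion
                d[i][j - 1] + 1,         # insertion
                d[i - 1][j - 1] + cost,  # substitution or match
            )
    return d[len(ref)][len(hyp)] / len(ref)

# One substituted word out of four gives a WER of 0.25 (25 percent).
print(word_error_rate("please call me back", "please call you back"))
```

A WER of 0.35 thus means that roughly one in three words in the reference transcript was substituted, dropped, or spuriously inserted by the recognizer.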
The study, which used an unusually comprehensive approach to measuring bias in speech recognition systems, is a wake-up call for the development of artificial intelligence (AI), a technology that is rapidly being integrated into people's daily lives.
Previous studies have shown that facial recognition systems, as they are adopted by police departments and other government agencies, are far less accurate at identifying women and people of color. Other tests have found that chatbots, translation services, and other systems designed to process and mimic written and spoken language can exhibit sexist and racist biases.
In response, Ravi Shroff, a professor of statistics at New York University, said: "I don't understand why these companies didn't do more due diligence before these technologies were released. I don't understand why these problems keep arising."
The study suggests that the leading speech recognition systems may be flawed because the companies are not training them on sufficiently diverse data: the systems learn mainly from white speakers, while black speakers are underrepresented.
"These five companies may be the largest speech recognition technology companies, but they're all making the same mistakes," said John Rickford, a Stanford University researcher involved in the study. "We assumed these companies represented all races well, but that wasn't the case."
Brendan O'Connor, a professor at the University of Massachusetts Amherst, says these companies face both a data-collection problem and a lack of incentive, because they are caught in a chicken-and-egg dilemma.
If their services are used primarily by white users, the companies will struggle to collect data that could serve black users; and as long as that data is lacking, the services will continue to be used primarily by white users. Either way, Noah Smith, a professor at the University of Washington, considers this a worrying problem.