Why AI Needs to Be Able to Understand All the World's Languages

Apple’s Siri, Google Assistant, and Amazon’s Alexa collectively support zero African languages. For speakers of those languages, speech recognition technology could help bridge the gap between illiteracy and access to valuable information and services, from agricultural advice to medical care. Languages spoken by smaller populations are often casualties of commercial prioritization. Instead of exploiting data sets from unrelated, high-resource languages, we leveraged a kind of speech data that is abundantly available even for low-resource languages: radio broadcasting archives. A second corpus, the West African Virtual Assistant Speech Recognition Corpus, consists of 10,000 labeled audio clips in four languages.
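
To make the idea of a labeled speech corpus concrete, here is a minimal sketch of how clips like these might be organized and loaded for speech recognition work. The file layout (a clips/ folder plus a metadata.csv mapping file names to transcripts and language codes) is an assumption for illustration, not the actual structure of the corpus described above.

```python
# Minimal sketch: loading a hypothetical labeled speech corpus.
# Assumed layout (not the real corpus format):
#   wava_corpus/clips/*.wav
#   wava_corpus/metadata.csv  with columns: file_name, transcript, language
import csv
from dataclasses import dataclass
from pathlib import Path

@dataclass
class Utterance:
    audio_path: Path   # path to one short audio clip
    transcript: str    # human-provided label for the clip
    language: str      # e.g. a language code for one of the four languages

def load_corpus(root: str) -> list[Utterance]:
    """Read metadata.csv under `root` and pair each row with its audio file."""
    root_dir = Path(root)
    utterances = []
    with open(root_dir / "metadata.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            clip = root_dir / "clips" / row["file_name"]
            if clip.exists():  # skip rows whose audio file is missing
                utterances.append(
                    Utterance(clip, row["transcript"], row["language"])
                )
    return utterances

if __name__ == "__main__":
    corpus = load_corpus("wava_corpus")  # hypothetical local directory
    print(f"Loaded {len(corpus)} labeled clips")
```

A list of (audio, transcript, language) triples like this is the usual starting point for training or fine-tuning a speech recognition model on a low-resource language.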
