Recently an algorithm embedded in LinkedIn suggested an ideal job for me: pilot for Qatar Airways. While I have been a student pilot and have soloed in a Cessna, the most advanced plane I've flown is a Cirrus. That doesn't mean I'm qualified to fly an A380 or 777 on international routes. What went wrong? The Wall Street Journal, in a recent story on AI, pointed out several issues. Product developers often use data sets that are not diverse. One popular set is reportedly more than 74% male and 83% white. Both Google and Amazon had issues, since corrected, that were adverse to African-Americans and women, respectively.
The MIT Media Lab found a wide discrepancy in accuracy between dark-skinned women and light-skinned men in three versions of facial recognition software often used by law enforcement. In another area, a better understanding of the people, aka the data set, led Ford Motor Co. to reliably extend credit to younger buyers who traditionally may not have been approved - real help in broadening opportunities for buyers who haven't yet built a strong credit history.
Big data is here, and it is real. Affectiva, an AI company in Boston, has a database of 4 billion facial images from 87 countries. Over 100 corporate customers reportedly use its facial-expression AI technology to study everything from consumers' reactions to ads to drivers showing signs of drowsiness and distraction.
Better diversity in data sets, as well as in who is doing the programming, improves the results. As Affectiva's co-founder, Dr. el Kaliouby, said in the WSJ article, "If you are a 30-year-old white guy programming this algorithm you might not think of..."
It's a new world of data. We at AHMdigital are here to help you figure out the right message for the right people. Reach out to start a conversation. If we don't get back to you right away, I might be landing that superjumbo in Doha, but I'll be back soon.