ABSTRACT

In 2009, researchers from MIT launched a start-up company called Affectiva to commercialize automated facial expression analysis (AFEA)—their effort to program computers to parse out meaningful expressions displayed on human faces captured in video. The time was ripe, they decided, to migrate their work from the lab to the marketplace and see what profitable uses might take hold. At the time, the only commercial application of automated facial expression analysis was Sony’s Smile Shutter™ app, a feature installed on its Cyber-shot cameras that was designed to automatically snap a photo when the person being photographed smiles. (“Switch on Smile Shutter and let your Cyber-shot take the photo for you!” proclaims an online promotion [Sony n.d.].) The founders of Affectiva had a different idea for their version of the technology: “to measure the emotional connection people have with advertising, brands, and media.” The plan was to build a profitable company by marketing AFEA as a market research tool, while at the same time gathering video data from the online world for further research and development of AFEA. Since Affectiva’s launch, scientists from the Machine Perception Lab at the University of California, San Diego, have formed a similar venture called Emotient. On their website, Emotient claims to be “the leading authority on facial expression recognition and analysis technologies.” The company markets its FACET™ software development kit (SDK) and FACET™ Vision products to “Fortune 500 companies, market research firms, and a growing number of vertical markets” (Emotient 2014: “About Emotient”).