In this brave, not-so-new world of automation, self-service devices are learning to analyze customer preferences and behavioral information and then serve up appropriate offers on the fly. But when it comes to understanding and adapting to emotional reactions, the good old human touch wins. For instance, your local bank rep can discern your confusion when trying to cross-sell an additional service and suggest something different. But just try to get the bank's ATM to react to the bored expression on your face as it cheerily flashes yet another bundled checking account offer across the screen.
So we were intrigued by "E-motional" technology, an emerging class of e-business technology being developed by researchers at the University of Southern California's Integrated Media Systems Center. The project, funded by NCR's Teradata division, is all about understanding customer needs by creating smarter self-service systems that adapt to emotional signals. "Today, you get a one-size-fits-all bank experience at a kiosk or an ATM," says Dave Schrader, Teradata's e-marketing director and technology strategist for the project. "We're working toward having machines do exactly what a good salesperson does: recognize and adapt to a customer's interest in what she's being offered."
E-motion in action
How can a machine learn to understand human emotion? First, researchers create templates of facial expressions corresponding to emotional states such as joy, surprise, anger, fear and sadness. Second, "smart" cameras driven by sophisticated modeling techniques map each customer's facial features against that base of known expressions, then match them to an emotional state.
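The template-matching step described above can be sketched very roughly in code. This is an illustrative toy, not the USC researchers' actual method: the emotion labels come from the article, but the feature vectors, the `classify_emotion` function and the nearest-template (Euclidean distance) matching rule are all assumptions made for the sake of example.

```python
import math

# Toy emotion "templates": each emotion maps to a facial-feature vector.
# In a real system these would come from trained facial-expression models;
# the dimensions and numbers here are purely illustrative.
TEMPLATES = {
    "joy":      [0.9, 0.1, 0.8],   # e.g. mouth curvature, brow tension, eye openness
    "surprise": [0.3, 0.2, 1.0],
    "anger":    [0.1, 0.9, 0.4],
    "fear":     [0.2, 0.8, 0.9],
    "sadness":  [0.1, 0.3, 0.2],
}

def classify_emotion(features):
    """Match an observed feature vector to the nearest known template."""
    return min(
        TEMPLATES,
        key=lambda label: math.dist(features, TEMPLATES[label]),
    )
```

A customer whose measured features sit closest to the "joy" template would be classified as joyful; more sophisticated systems would use probabilistic models rather than simple nearest-neighbor matching.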
Based on that data, the device can then make offers in real time. Say you forget your glasses and are forced to squint at the ATM monitor, and suddenly the font size increases; or you smile at an aromatherapy ad at a shopping-center kiosk and -- zip -- out comes a coupon. In a neat twist on learning relationships, the device grows savvier about your emotional behavior with each interaction, and as a result it can deduce your feelings more and more accurately.
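The learning loop sketched in that paragraph -- each reaction teaches the device which offers this customer responds to -- could look something like the following. Everything here (the `OfferEngine` class, the reaction scores, the offer categories) is a hypothetical illustration of the idea, not a description of any actual Teradata or USC implementation.

```python
# Hypothetical scoring of detected emotions: positive reactions reinforce
# an offer category, negative ones suppress it. Values are illustrative.
REACTION_SCORE = {
    "joy": 1.0,
    "surprise": 0.5,
    "sadness": -0.5,
    "anger": -1.0,
    "fear": -1.0,
}

class OfferEngine:
    """Tracks one customer's emotional reactions per offer category and
    prefers the category that has drawn the most positive reactions."""

    def __init__(self, categories):
        self.scores = {c: 0.0 for c in categories}

    def record_reaction(self, category, emotion):
        # Update this customer's running score for the offer just shown.
        self.scores[category] += REACTION_SCORE.get(emotion, 0.0)

    def next_offer(self):
        # Serve the category with the best accumulated score.
        return max(self.scores, key=self.scores.get)
```

So a kiosk that sees you frown at a bundled-checking pitch but smile at an aromatherapy ad would, on this toy model, lead with aromatherapy next time.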
Schrader estimates that it will probably be several years before we actually see e-motional technology on the market. But in a tech landscape of e-commerce-enabled kiosks and biometric identification systems (not to mention the fact that most banks already use cameras at ATMs), it's not far-fetched to envision such technology becoming "pervasive," says Schrader.
The early adopters
There's definitely a growing interest in leveraging emotional data to improve the overall customer experience. IBM, for instance, is working on a project dubbed "BlueEyes," which, among other things, could enable an end user to switch TV channels with a smile or a nod. Then there's the "Pod" car, a collaborative effort from Sony and Toyota that uses physiological indicators to adapt to a driver's mood state and expressed preferences.
While e-motional technology won't exactly transform your local ATM or kiosk into Dr. Frasier Crane, it's a positive sign that in the not-too-distant future, some of the gaps in emotional intelligence between human and machine will narrow considerably.
To read more articles like this one, visit Peppers and Rogers Group's Web site at www.1to1.com.
All materials copyright 2002 Peppers and Rogers Group - 1:1 Marketing.