How Intelligent Automation Will Become Emotional Automation
(5 min read)
As technical innovation marches forward, machines are becoming more lifelike and conversational every day. “Digital humans” now use a combination of AI, machine learning, natural language processing, and facial recognition to produce in-depth and wide-ranging interactions with people. Soon, these digital humans will play a central role in helping companies intelligently automate their businesses while maintaining personal and emotional connections to customers and employees.
In today’s digitized world of online shopping and app-based services, we can get a lot done with very little human interaction. But if you think about the best customer service you’ve ever received, it probably wasn’t performed entirely by automated computer systems. It took a real person to listen to and understand your needs, go above and beyond to provide an amazing experience, and do it all with a pleasant smile. That’s the human touch machines just can’t deliver – or can they?
Increasingly, it seems, they can. As machines become more lifelike and conversational with each technical advance, companies will soon be able to intelligently automate their businesses not just to get the job done, but to do it with an increasingly personal and emotional connection to humans.
The Evolution of Digital Humans
Just 40 years ago, the only way to interact with machines was by typing in code. Then came command prompts, graphical interfaces, the mouse, point-and-click, and drag-and-drop. But those methods had one limiting factor in common: communication occurred in the computer’s language, on its terms.
Recently, however, the tables have finally begun to turn. Thanks to advances in artificial intelligence, deep learning, and natural language processing, we’re increasingly able to ask questions and give commands in our own language, and computers respond accordingly. We frequently interact with virtual agents over the phone, chatbots on websites, and our own virtual assistants at home. For the most part, though, we’re still (sometimes painfully) aware that we’re speaking to a machine with an extremely limited skill set. These systems aren’t able to pick up the subtle nuances of human interaction or make decisions based on the emotions they detect.
But that’s changing, too. The next generation of service-oriented bots will be closer to digital humans than anything we’ve ever seen. While we’re unlikely to see humanoid robots walking among us anytime soon, the machines we interact with will have names, faces, and most importantly, the ability to read and respond to our emotions, just as humans do.
Digital Becomes Emotional
Imagine this scenario. You’re having problems with your cell phone, so you go to your service provider’s website to search for a solution. Instead of sifting through FAQs and user forums, you’re greeted by the smiling face of Molly, an on-screen avatar whose appearance and speech are nearly indistinguishable from a real person’s. Molly asks how she can help you, and whether she can access your webcam and microphone so you can chat “face-to-face.” With that step complete, she asks you some questions about your phone and the problem you’re having. From your responses, Molly can sense you’re frustrated about the broken phone; she reassures you she’ll do everything she can to help. Based on your description, she walks you through some troubleshooting procedures that might fix the issue. The second solution works, and Molly looks just as happy as you that she could come to the rescue. You leave the chat relieved to have your phone back online. More than that, you’re delighted by Molly’s fast, friendly, and convenient service.
While most people have yet to experience this type of effortless interaction with a machine, it won’t be long before it’s commonplace – not just on websites, but in retail settings, our workplaces, and even in our cars.
The applications are limitless, and companies including NTT DATA Business Solutions already have digital human platforms on offer. Our it.human platform combines elements of AI and machine learning, facial recognition (through a camera), and natural language processing (through a microphone). By training the program on thousands of examples of human faces expressing various emotions, the system has learned to recognize when someone is happy, surprised, angry, or disgusted. The same goes for voice: the system senses your mood from verbal cues and tone of voice. With every interaction, it gets smarter and more attuned to the nuances of human communication.
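To make that training process concrete, here is a minimal sketch of how a facial-emotion classifier of this general kind can be built. It is not the it.human platform’s actual model; the 48x48 grayscale face crops, the four-label emotion set, and the tiny network are illustrative assumptions.

```python
# Minimal sketch of training a facial-emotion classifier.
# The dataset, label set, and architecture are illustrative assumptions,
# not the it.human platform's actual model.
import torch
import torch.nn as nn

EMOTIONS = ["happy", "surprised", "angry", "disgusted"]  # assumed label set

class EmotionNet(nn.Module):
    def __init__(self, num_classes=len(EMOTIONS)):
        super().__init__()
        # Two small convolutional blocks over 48x48 grayscale face crops
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 12 * 12, num_classes)

    def forward(self, x):  # x: (batch, 1, 48, 48) face crops
        h = self.features(x)
        return self.classifier(h.flatten(1))

model = EmotionNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Placeholder batch standing in for the "thousands of examples of human
# faces expressing various emotions" a real system would be trained on.
faces = torch.randn(8, 1, 48, 48)
labels = torch.randint(0, len(EMOTIONS), (8,))

logits = model(faces)
loss = loss_fn(logits, labels)
loss.backward()
optimizer.step()

print("predicted:", [EMOTIONS[i] for i in logits.argmax(dim=1).tolist()])
```

In practice, a model like this would be trained on a large labeled dataset of face images and run alongside a separate pipeline that analyzes verbal cues and tone of voice.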
Automation on Human Terms
Customer service may be the first widespread use case for digital humans, but it’s not the only one. In fact, as these solutions go mainstream, we expect organizations to employ digital workers to bring a new level of intelligence and efficiency to their operations.
Already, companies use automation in many ways to streamline processes, enable lightning-fast data entry and transactional work around the clock, and remove the risk of human error from repetitive tasks. Now imagine if creating and managing automation were as easy as having a conversation with a digital human, which in turn translates our intentions into tasks and activities in connected systems. That, too, will become reality by integrating digital humans with RPA bots, a strong backend system, machine learning, and other intelligent automation capabilities.
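As a simplified illustration of what “translating intentions into tasks” could look like, the sketch below routes a recognized conversational intent to an RPA bot. The intent names, bot names, and payload fields are assumptions made for the example, not any vendor’s actual API.

```python
# Sketch of routing a recognized conversational intent to an RPA task.
# Intent names, bot names, and payloads are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class BotTask:
    bot_name: str   # which RPA bot should run the work
    payload: dict   # structured parameters extracted from the conversation

# Hypothetical mapping from intents a digital human might recognize
# to automations in connected backend systems.
INTENT_TO_BOT = {
    "create_purchase_order": "po_entry_bot",
    "reset_password":        "identity_bot",
    "check_order_status":    "order_lookup_bot",
}

def dispatch(intent: str, entities: dict) -> BotTask:
    """Translate a spoken intention into a task for a connected system."""
    bot = INTENT_TO_BOT.get(intent)
    if bot is None:
        raise ValueError(f"No automation registered for intent '{intent}'")
    return BotTask(bot_name=bot, payload=entities)

# Example: the digital human hears "order 500 units of part 4711"
task = dispatch("create_purchase_order", {"material": "4711", "quantity": 500})
print(task)
```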
This type of natural interaction between people and powerful machines opens up a world of opportunity for organizations to put technology to work in new and exciting ways.