I, Chatbot: 3 Laws of Robotics in Customer Service

Science fiction author Isaac Asimov’s story collection, “I, Robot,” lays out his “Three Laws of Robotics.” In his futuristic stories, robots obey these laws to ensure the wellbeing of society. I started thinking about those guidelines and how we could apply them to chatbot customer service. Bear in mind, as a trainer, I come from the “people” side of contact center operations, so my comments are non-technical in nature!


In Asimov’s stories, the First Law of Robotics has two parts. The first part is, “A robot may not injure a human being.” For customer service chatbots, that means giving out correct information. For example, stating the correct cooking time for a company’s food product, so the customer does not undercook their food and become ill. To make that possible, chatbots need to draw responses from an accurate information database. Vet information before it goes into the database, and have a process for keeping it up to date.


The second part of this “law” is, “or, through inaction, allow a human being to come to harm.” For customer service, that could mean giving additional information to avoid a potential problem. For instance, a chatbot might answer a customer’s question about the sale price of an item, but also mention when the sale ends and whether their local store has the item in stock. Programming proactive responses into the chatbot database can reduce unnecessary issues. Your contact center team often sees the impact of missing or incorrect information, so they can help improve chatbot responses. In addition, is there a big performance gap between veteran and new hire agents? If so, there may be a gap between the knowledge base and what veterans actually do on the job. Be sure to capture those insights and update your chatbot.
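For the technically inclined, a proactive answer like the one above can be sketched in a few lines. This is only an illustration, not a real chatbot framework; the product record, its field names, and the sample data are all hypothetical.

```python
from datetime import date

# Hypothetical product catalog; field names and values are illustrative only.
PRODUCTS = {
    "garden hose": {
        "sale_price": 19.99,
        "sale_ends": date(2024, 6, 30),
        "in_stock_locally": True,
    },
}

def answer_price_question(item: str) -> str:
    """Answer a sale-price question, then proactively add the details a
    customer is likely to need next: the sale end date and local stock."""
    product = PRODUCTS.get(item)
    if product is None:
        return f"Sorry, I couldn't find '{item}'."
    reply = f"The {item} is on sale for ${product['sale_price']:.2f}."
    # Proactive additions that head off follow-up questions and problems.
    reply += f" The sale ends {product['sale_ends']:%B %d}."
    if product["in_stock_locally"]:
        reply += " Your local store has it in stock."
    else:
        reply += " Your local store is currently out of stock."
    return reply
```

The design point is simply that the answer to the question asked (“What’s the sale price?”) is bundled with the answers to the questions the customer would have asked next.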


The Second Law in Asimov’s stories is, “A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.” For customer service, that could mean having a chatbot detect unusual customer orders. For instance, a customer normally orders one box a month online, but this time they type “100.” The chatbot could recognize the out-of-pattern order size and ask, “Do you want to order one hundred boxes this month?” That flags the issue and lets the customer correct it before confirming the order. It might also save having 99 boxes returned later!
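An out-of-pattern check like this one can be sketched as a small guard function. Again, this is a minimal illustration, not production logic: the idea of comparing against a typical quantity from order history is from the example above, while the function name and the multiplier threshold are assumptions of mine.

```python
from typing import Optional

def confirm_if_unusual(item: str, quantity: int, typical_quantity: int,
                       threshold: float = 5.0) -> Optional[str]:
    """Return a confirmation prompt when an order is far outside the
    customer's usual pattern; return None when no confirmation is needed.

    `typical_quantity` would come from the customer's order history.
    `threshold` is an illustrative multiplier, not a recommended value.
    """
    if typical_quantity > 0 and quantity >= typical_quantity * threshold:
        return (f"You usually order {typical_quantity} of '{item}'. "
                f"Do you really want to order {quantity} this time?")
    return None
```

So an order of 100 boxes from a customer who usually orders one would trigger a confirmation prompt, while their usual order of one would pass through silently.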


The Third Law is, “A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.” For chatbots, that could equate to cybersecurity: how do you deliver good customer service while still preventing hacking and social engineering attacks? It could also mean doing enough to justify the investment in the technology. Does your chatbot improve efficiency and customer satisfaction? Does it justify its existence?


Chatbots can be wonderful customer service tools. However, there should be guidelines, or “laws,” to ensure they create a great customer experience.