Arinti and InspireX made national press with a Proof of Concept we’ve been working on for the city of Aalter (BE), together with TomorrowLab. The city is looking at different ways to improve and modernize communication with citizens (chatbot technology, Smart City furniture, …). Part of this program was a one-week trial at the city hall with our robot Pepper, a humanoid robot developed by SoftBank.
The Proof of Concept was focused on 3 domains:
- Is it possible to hook the robot up to a chatbot, so that the robot can answer a variety of frequently asked questions?
- Is it possible for the robot to navigate in the city hall, and guide visitors to the right window or civil servant?
- How will the inhabitants of Aalter react to and interact with the robot?
What we’ve concluded is the following:
- Hooking up the robot to a chatbot is most definitely possible, but has its challenges. Latency in answering questions is one of them. The question has to be sent to the cloud, speech has to be converted to text, the chatbot has to understand the question, the answer has to be converted back to speech, and finally the robot has to receive it. All of this takes a few seconds (depending on the internet connection speed), which in technical terms is quite impressive, but people expect the robot to perform like a human being, and that means instant answers. Local dialects were another hurdle to overcome. The robot simply didn’t understand a few people with heavy dialects, and that’s again something people don’t expect. They assume the robot will perform as well as or better than a human being, which today is not (yet) the case.
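To make the latency point concrete: because the stages run one after the other, their delays simply add up. Here is a minimal Python sketch of that round trip; the stage names and timings are illustrative guesses, not measurements from the trial:

```python
# Hypothetical per-stage delays (in seconds) for one question/answer
# round trip between the robot and the cloud. Real values depend on
# the internet connection and the cloud services used.
STAGES = {
    "upload_audio": 0.4,       # send the recorded question to the cloud
    "speech_to_text": 0.8,     # transcribe the audio
    "chatbot_inference": 0.5,  # the chatbot interprets the question
    "text_to_speech": 0.6,     # synthesize the spoken answer
    "download_audio": 0.3,     # the robot receives the answer
}

def answer_latency(stages):
    """Total wait before the robot starts speaking: the stages are
    sequential, so their latencies sum."""
    return sum(stages.values())

print(f"total latency: {answer_latency(STAGES):.1f} s")  # a few seconds
```

Even with each stage individually fast, the sequential pipeline easily adds up to the multi-second pause visitors noticed.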
- The Pepper robot has sensors to avoid objects, but avoiding objects and intelligently navigating towards a certain point are absolutely not the same thing. We tried different ways of navigating (I’ll spare you the technical details), but none of them really worked the way we wanted. Of course you could hard-code the robot (‘move 10 meters forward’), but that’s not really ‘intelligent navigation’. On top of that, the robot’s sensors are very sensitive: if you come too close to Pepper, the navigation script simply stops running and the robot ‘freezes’. We must have tried this 100 times during the Proof of Concept, and roughly 40 times it worked and 60 times it didn’t. So for now, we’re still working on a better way to get the robot to guide visitors in the city hall. We’re currently experimenting with the ROS framework, and this seems to be a promising solution.
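The freeze behaviour we kept running into can be sketched as a simple waypoint loop: move towards the next waypoint unless the proximity sensor trips, in which case the attempt is abandoned and retried. Everything here (`route`, `proximity_sensor`, the retry count) is a hypothetical stand-in for illustration, not the real Pepper (NAOqi) or ROS API:

```python
def guide_to_window(route, proximity_sensor, max_retries=3):
    """Walk a list of waypoints towards a service window.

    `route` is a list of waypoint names and `proximity_sensor` is a
    callable returning True when someone is too close -- both are
    hypothetical stand-ins. When the sensor trips, the current step is
    retried; after `max_retries` failed attempts the guidance run is
    abandoned, much like the robot 'freezing' mid-route.
    """
    for waypoint in route:
        for _attempt in range(max_retries):
            if proximity_sensor():
                continue  # too close: freeze this step and retry
            # on the real robot, the actual move command would go here
            break
        else:
            return False  # never got a clear path to this waypoint
    return True  # reached the window


# A clear hall succeeds; a visitor constantly in the way does not.
print(guide_to_window(["hallway", "desk_3"], lambda: False))  # True
print(guide_to_window(["hallway", "desk_3"], lambda: True))   # False
```

The point of the sketch is that a single over-sensitive stop condition turns every crowded moment into a failed run, which matches the roughly 40-in-100 success rate we saw.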
- Overall reactions have been great. Visitors of the city hall could interact with the robot thanks to a short quiz we created. Pepper’s stay at the city hall was announced in the local media, and the robot drew quite a crowd. During its stay we clearly saw that people are attracted to the robot and want to engage with it. On the other hand, some people clearly expected the robot to do more. People tend to think the robot can answer all kinds of questions and want to have a conversation with it, like you would with a human being.
This blog post is inspired by our cooperation with Arinti.