Our first Pepper robot ‘Leolani’ publication:
P. Vossen, S. Báez, L. Bajčetić, and B. Kraaijeveld, “Leolani: a reference machine with a theory of mind for social communication”, invited keynote, in Proceedings of TSD 2018, Brno, Czech Republic, September 2018. Download paper.
Object Recognition: Selene Báez showing the Pepper robot Leolani a plush rabbit (see our GitHub repository for Pepper applications).
Piek Vossen, keynote speaker at the 21st International Conference on Text, Speech and Dialogue (TSD 2018), Brno, Czech Republic, September 11–14, 2018
Abstract. Our state of mind is based on experiences and what other people tell us. This may result in conflicting information, uncertainty, and alternative facts. We present a robot that models relativity of knowledge and perception within social interaction, following principles of the theory of mind. We utilized vision and speech capabilities on a Pepper robot to build an interaction model that stores the interpretations of perceptions and conversations in combination with provenance on its sources. The robot learns directly from what people tell it, possibly in relation to its perception. We demonstrate how the robot’s communication is driven by hunger to acquire more knowledge from and on people and objects, to resolve uncertainties and conflicts, and to share awareness of the perceived environment. Likewise, the robot can make reference to the world and its knowledge about the world and the encounters with people that yielded this knowledge.
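As a rough illustration of the bookkeeping the abstract describes, the sketch below shows how a statement could be stored as a subject–predicate–object claim together with provenance about who said it (or which perception produced it), so that conflicting claims from different sources can be detected and kept rather than overwritten. This is not the actual Leolani code; the names Claim, Provenance, and Brain are made up for illustration (the real system stores its knowledge in an RDF triple store).

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass(frozen=True)
class Provenance:
    source: str        # e.g. "Selene" (a speaker) or "object-recognition"
    modality: str      # "speech" or "vision"
    timestamp: datetime
    confidence: float  # how certain the robot is about this claim


@dataclass(frozen=True)
class Claim:
    subject: str
    predicate: str
    obj: str


class Brain:
    """Stores claims together with the sources they came from."""

    def __init__(self):
        self._claims: dict[Claim, list[Provenance]] = {}

    def add(self, claim: Claim, provenance: Provenance) -> list[Claim]:
        """Store a claim and return any stored claims that conflict with it."""
        conflicts = [
            other for other in self._claims
            if other.subject == claim.subject
            and other.predicate == claim.predicate
            and other.obj != claim.obj
        ]
        self._claims.setdefault(claim, []).append(provenance)
        return conflicts


if __name__ == "__main__":
    brain = Brain()
    now = datetime.now()
    brain.add(Claim("Bram", "likes", "science fiction movies"),
              Provenance("Bram", "speech", now, 0.9))
    conflicts = brain.add(Claim("Bram", "likes", "romantic comedies"),
                          Provenance("Selene", "speech", now, 0.7))
    # Conflicting information is not discarded: the robot can ask a
    # follow-up question or report both perspectives with their sources.
    print(conflicts)
```

Keeping every claim paired with its provenance is what lets the robot reason about who told it what, weigh conflicting statements, and refer back to the encounters that yielded its knowledge.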
Keywords: robot, theory of mind, social learning, communication
Visit our Pepper robot Leolani repository on GitHub.
#1 Leolani introduces herself
#2 Leolani and Wolfram|Alpha
#3 Get to know
#4 Object Recognition: Plain
#5 Object Recognition: Corrected
#6 Object Recognition: Learning
#7 Make statements: one person “I live in Amsterdam”
#8 Make statements: two persons “Selene & Lenka own a book”
#9 Make statements: two persons “Bram likes science fiction movies”
Object Recognition. Frames from the robot’s camera: Leolani recognizing Person (81%, 86%, 89%), Bram (100%), Chair (82%), Bottle (81%), and Laptop (94% and 98%).
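The percentages in these frames are detector confidence scores. A minimal, hypothetical sketch of how such scores might be used before the robot mentions an object in conversation (the Detection structure and the 0.8 threshold are illustrative, not taken from the repository):

```python
from dataclasses import dataclass


@dataclass
class Detection:
    label: str
    confidence: float  # 0.0–1.0, as reported by the object detector


def objects_to_mention(detections: list[Detection], threshold: float = 0.8) -> list[str]:
    """Keep only the labels the detector is reasonably sure about."""
    return [d.label for d in detections if d.confidence >= threshold]


frame = [Detection("Person", 0.86), Detection("Chair", 0.82),
         Detection("Bottle", 0.81), Detection("Laptop", 0.94)]
print(objects_to_mention(frame))  # ['Person', 'Chair', 'Bottle', 'Laptop']
```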