Recent advances in robotics, music information retrieval, artificial intelligence, and related fields are enabling humanoid robots to roughly emulate the physical and perceptual capabilities of musicians playing musical instruments. In particular, a humanoid robot that plays a wind instrument requires many complex systems to work together, integrating musical representation, technique, expression, detailed control, and sensitive multimodal interaction within the context of a piece, as well as interaction between performers. More recently, the development of human-friendly robots has driven research toward autonomous or semi-autonomous robots that are natural and intuitive for the average consumer to interact with, communicate with, and work with as partners, and that can learn new capabilities. This talk presents research on the development of wind-instrument-playing humanoid robots. Qualitative experimental evaluations are described that probe what kind of impression musicians form of the robot's performance, and the possible implications of those impressions are discussed.
Speaker: After his Ph.D. in Robotics at the Scuola Superiore Sant'Anna in Pisa, Italy, he moved to Japan as a postdoc, and since 2011 he has worked as Associate Professor at Karlstad University in Sweden while also holding a position at Waseda University in Tokyo. His research focuses on interaction between humans and robots and on the cognitive-theoretical and ethical aspects of this interaction.