The capability of humanoid robots to generate facial expressions is crucial for enhancing interactivity and emotional resonance in human-robot interaction. However, humanoid robots vary in mechanics, manufacturing, and appearance. The lack of consistent processing techniques and the complexity of generating facial expressions pose significant challenges in the field. To acquire solutions with high confidence, it is necessary to enable robots to explore the solution space automatically based on performance feedback. To this end, we designed a physical robot with a human-like appearance and developed a general framework for automatic expression generation using the MAP-Elites algorithm. The main advantage of our framework is that it not only generates facial expressions automatically but can also be customized according to user preferences. The experimental results demonstrate that our framework can efficiently generate realistic facial expressions without hard coding or prior knowledge of the robot kinematics. Moreover, it can guide the solution-generation process in accordance with user preferences, which is desirable in many real-world applications.
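To make the search idea in the abstract concrete, below is a minimal sketch of a generic MAP-Elites loop. It is not the authors' implementation: the number of actuators, the fitness function, and the behavior descriptor are hypothetical placeholders standing in for the robot-specific evaluation the paper describes.

```python
# Minimal, illustrative MAP-Elites loop for searching facial-expression commands.
# NUM_JOINTS, evaluate_expression, and behavior_descriptor are hypothetical
# placeholders, not the implementation from the paper.
import random

NUM_JOINTS = 10          # assumed number of facial actuators
GRID_SHAPE = (10, 10)    # resolution of the behavior-descriptor grid
ITERATIONS = 5000

def random_solution():
    """A candidate: one normalized command per facial actuator."""
    return [random.random() for _ in range(NUM_JOINTS)]

def mutate(solution, sigma=0.1):
    """Gaussian perturbation, clipped to the valid command range [0, 1]."""
    return [min(1.0, max(0.0, x + random.gauss(0.0, sigma))) for x in solution]

def evaluate_expression(solution):
    """Placeholder fitness: how closely the rendered face matches the target
    expression (in practice, a perceptual similarity or preference score)."""
    return -sum((x - 0.5) ** 2 for x in solution)  # toy stand-in

def behavior_descriptor(solution):
    """Map a solution to a grid cell, e.g. mouth openness vs. brow raise."""
    i = min(GRID_SHAPE[0] - 1, int(solution[0] * GRID_SHAPE[0]))
    j = min(GRID_SHAPE[1] - 1, int(solution[1] * GRID_SHAPE[1]))
    return (i, j)

archive = {}  # cell -> (fitness, solution)

for _ in range(ITERATIONS):
    if archive and random.random() < 0.9:
        # Select a random elite and perturb it.
        parent = random.choice(list(archive.values()))[1]
        candidate = mutate(parent)
    else:
        candidate = random_solution()
    fitness = evaluate_expression(candidate)
    cell = behavior_descriptor(candidate)
    # Keep only the best solution found so far in each cell of the grid.
    if cell not in archive or fitness > archive[cell][0]:
        archive[cell] = (fitness, candidate)

print(f"filled {len(archive)} of {GRID_SHAPE[0] * GRID_SHAPE[1]} cells")
```

The archive of elites is what allows the process to return a diverse set of high-performing expressions; biasing selection or fitness toward certain cells is one way such a loop could, in principle, reflect user preferences.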
@inproceedings{TCCCHWicra23,
address = {London, UK},
author = {Bing Tang and Rongyun Cao and Rongya Chen and Xiaoping Chen and Bei Hua and Feng Wu},
booktitle = {2023 IEEE International Conference on Robotics and Automation (ICRA)},
doi = {10.1109/ICRA48891.2023.10160409},
month = {May},
pages = {7606-7613},
title = {Automatic Generation of Robot Facial Expressions with Preferences},
year = {2023}
}