Authors: Sarıgül, Büşra; Saltık, İmge; Hokelek, Batuhan; Ürgen, Burcu A.
Date available: 2021-01-29
Date of issue: 2020
ISBN: 9781450370578
2003
Handle: http://hdl.handle.net/11693/54962
Conference name: HRI '20: ACM/IEEE International Conference on Human-Robot Interaction
Date of conference: 23–26 March 2020
Abstract: Robots are increasingly becoming part of our lives, and how we perceive and predict their behavior is an important issue in HRI. To address this issue, we adapted a well-established prediction paradigm from cognitive science for HRI. Participants listened to a greeting phrase that sounded either human-like or robotic, and indicated with a key press, as fast as possible, whether the voice belonged to a human or a robot. Each voice was preceded by an image of a human or a robot (either a human-like robot or a mechanical robot) to cue the participant about the upcoming voice. The image was either congruent or incongruent with the sound stimulus. Our findings show that people reacted faster to robotic sounds in congruent trials than in incongruent trials, suggesting a role for predictive processes in robot perception. In sum, our study provides insights into how robots should be designed, and suggests that designing robots that do not violate our expectations may lead to more efficient interaction between humans and robots.
Language: English
Keywords: Humanoid robots; Robot design; Audio-visual mismatch; Prediction; Robotic voice; Human perception; Cognitive sciences
Title: Does the appearance of an agent affect how we perceive his/her voice? Audio-visual predictive processes in human-robot interaction
Type: Conference paper
DOI: 10.1145/3371382.3378302