Authors: Sarıgül, B.; Urgen, Burcu A.
Date accessioned: 2024-03-06
Date available: 2024-03-06
Date issued: 2023-04-05
ISSN: 1875-4791
eISSN: 1875-4805
Handle: https://hdl.handle.net/11693/114367
DOI: 10.1007/s12369-023-00990-6
Title: Audio–visual predictive processing in the perception of humans and robots
Type: Article
Language: en
License: CC BY
Keywords: Prediction; Expectation violation; Human–robot interaction; Audio–visual mismatch

Abstract: Recent work in cognitive science suggests that our expectations affect visual perception. With the rise of artificial agents in human life in the last few decades, one important question is whether our expectations about non-human agents such as humanoid robots affect how we perceive them. In the present study, we addressed this question in an audio–visual context. Participants reported whether a voice embedded in noise belonged to a human or a robot. Prior to this judgment, they were presented with a human or a robot image that served as a cue and allowed them to form an expectation about the category of the voice that would follow. This cue was either congruent or incongruent with the category of the voice. Our results show that participants were faster and more accurate when the auditory target was preceded by a congruent cue than by an incongruent cue. This was true regardless of the human-likeness of the robot. Overall, these results suggest that our expectations affect how we perceive non-human agents and shed light on future work in robot design.