
DC Field	Value	Language
dc.contributor.author	Giritlioğlu, Dersu	en_US
dc.contributor.author	Mandira, Burak	en_US
dc.contributor.author	Yılmaz, Selim Fırat	en_US
dc.contributor.author	Ertenli, C. U.	en_US
dc.contributor.author	Akgür, Berhan Faruk	en_US
dc.contributor.author	Kınıklıoğlu, Merve	en_US
dc.contributor.author	Kurt, Aslı Gül	en_US
dc.contributor.author	Mutlu, E.	en_US
dc.contributor.author	Dibeklioğlu, Hamdi	en_US
dc.date.accessioned	2021-03-08T12:21:52Z
dc.date.available	2021-03-08T12:21:52Z
dc.date.issued	2020
dc.identifier.issn	1783-7677
dc.identifier.uri	http://hdl.handle.net/11693/75892
dc.description.abstract	Personality analysis is an important area of research in several fields, including psychology, psychiatry, and neuroscience. With the recent dramatic improvements in machine learning, it has also become a popular research area in computer science. While the current computational methods are able to interpret behavioral cues (e.g., facial expressions, gesture, and voice) to estimate the level of (apparent) personality traits, accessible assessment tools are still substandard for practical use, not to mention the need for fast and accurate methods for such analyses. In this study, we present multimodal deep architectures to estimate the Big Five personality traits from (temporal) audio-visual cues and transcribed speech. Furthermore, for a detailed analysis of personality traits, we have collected a new audio-visual dataset, namely: Self-presentation and Induced Behavior Archive for Personality Analysis (SIAP). In contrast to the available datasets, SIAP introduces recordings of induced behavior in addition to self-presentation (speech) videos. With thorough experiments on SIAP and ChaLearn LAP First Impressions datasets, we systematically assess the reliability of different behavioral modalities and their combined use. Furthermore, we investigate the characteristics and discriminative power of induced behavior for personality analysis, showing that the induced behavior indeed includes signs of personality traits.	en_US
dc.language.iso	English	en_US
dc.source.title	Journal on Multimodal User Interfaces	en_US
dc.relation.isversionof	https://dx.doi.org/10.1007/s12193-020-00347-7	en_US
dc.subject	Big five	en_US
dc.subject	Estimation of personality traits	en_US
dc.subject	Deep learning	en_US
dc.subject	Multimodal fusion	en_US
dc.subject	Self-presentation	en_US
dc.subject	Induced behavior	en_US
dc.title	Multimodal analysis of personality traits on videos of self-presentation and induced behavior	en_US
dc.type	Article	en_US
dc.department	Department of Computer Engineering	en_US
dc.department	Department of Electrical and Electronics Engineering	en_US
dc.department	Interdisciplinary Program in Neuroscience (NEUROSCIENCE)	en_US
dc.department	National Magnetic Resonance Research Center (UMRAM)	en_US
dc.identifier.doi	10.1007/s12193-020-00347-7	en_US
dc.publisher	Springer	en_US
dc.contributor.bilkentauthor	Giritlioğlu, Dersu
dc.contributor.bilkentauthor	Mandira, Burak
dc.contributor.bilkentauthor	Yılmaz, Selim Fırat
dc.contributor.bilkentauthor	Akgür, Berhan Faruk
dc.contributor.bilkentauthor	Kınıklıoğlu, Merve
dc.contributor.bilkentauthor	Kurt, Aslı Gül
dc.contributor.bilkentauthor	Dibeklioğlu, Hamdi	en_US
dc.identifier.eissn	1783-8738	en_US
buir.contributor.orcid	Giritlioğlu, Dersu|0000-0003-1503-0213	en_US
buir.contributor.orcid	Mandira, Burak|0000-0001-9605-8642	en_US
buir.contributor.orcid	Yılmaz, Selim Fırat|0000-0002-0486-7731	en_US
buir.contributor.orcid	Akgür, Berhan Faruk|0000-0002-9390-1160	en_US
buir.contributor.orcid	Kınıklıoğlu, Merve|0000-0002-6668-9168	en_US
buir.contributor.orcid	Kurt, Aslı Gül|0000-0001-5222-8291	en_US
buir.contributor.orcid	Dibeklioğlu, Hamdi|0000-0003-0851-7808	en_US

