
dc.contributor.advisor: Dibeklioğlu, Hamdi
dc.contributor.author: Giritlioğlu, Dersu
dc.date.accessioned: 2021-10-07T06:13:58Z
dc.date.available: 2021-10-07T06:13:58Z
dc.date.copyright: 2021-09
dc.date.issued: 2021-09
dc.date.submitted: 2021-10-06
dc.identifier.uri: http://hdl.handle.net/11693/76587
dc.description: Cataloged from PDF version of article.
dc.description: Thesis (Master's): İhsan Doğramacı Bilkent University, Department of Computer Engineering, 2021.
dc.description: Includes bibliographical references (leaves 55-62).
dc.description.abstract: Interpretation of nonverbal behavior is vital for a reliable analysis of social interactions. To this end, we automatically analyze facial expressions of romantic couples during their dyadic interactions, for the first time in the literature. We use a recently collected romantic relationship dataset, including videos of 167 couples while talking about a conflicting case and a positive experience they share. To distinguish between interactions during positive experiences and conflicting discussions, we model facial expressions employing a deep multiple instance learning (MIL) framework, adapted from the anomaly detection literature. Spatio-temporal representations of facial behavior are obtained from short video segments through a 3D residual network and used as the instances in MIL bag formations. The goal is to detect conflicting sessions by revealing distinctive facial cues that are displayed in short periods. To this end, instance representations of positive experience and conflict sessions are further optimized, so as to be more separable, using deep metric learning. In addition, for a more reliable analysis of dyadic interaction, facial expressions of both subjects in the interaction are analyzed jointly. Our experiments show that the proposed approach reaches an accuracy of 71%. In addition to providing comparisons to several baseline models, we have also conducted a human evaluation study for the same task, employing 6 participants. The proposed approach performs 5% more accurately than humans and outperforms all baseline models. As the experimental results suggest, reliable modeling of facial behavior can greatly contribute to the analysis of dyadic interactions, yielding better performance than that of humans.
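The MIL formulation the abstract describes (bags of segment-level instance representations, with the session-level decision driven by the most indicative segment, as in anomaly-detection MIL) can be sketched roughly as follows. This is a minimal illustration only: the linear instance scorer, the embedding dimensionality, and the margin value are assumptions for the sketch, not the thesis's actual 3D-ResNet-based architecture.

```python
import numpy as np

def bag_score(instances, w, b):
    """Score a bag of instance embeddings.

    instances: (n, d) array, one embedding per short video segment.
    Each instance gets a linear score; the bag is scored by its
    most indicative (maximum-scoring) instance, MIL-style.
    """
    return float((instances @ w + b).max())

def mil_ranking_loss(conflict_bag, positive_bag, w, b, margin=1.0):
    """Hinge-style MIL ranking loss (anomaly-detection style).

    Encourages the top instance of a conflict-session bag to outscore
    the top instance of a positive-experience bag by at least `margin`,
    pushing the two session types apart.
    """
    return max(0.0, margin
               - bag_score(conflict_bag, w, b)
               + bag_score(positive_bag, w, b))
```

Taking the maximum over instance scores reflects the abstract's goal of detecting conflict from distinctive cues displayed only in short periods: a single strongly indicative segment suffices to label the whole session.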
dc.description.statementofresponsibility: by Dersu Giritlioğlu
dc.format.extent: ix, 62 leaves : illustrations (some color), charts ; 30 cm.
dc.language.iso: English
dc.rights: info:eu-repo/semantics/openAccess
dc.subject: Dyadic interaction
dc.subject: Behavior analysis
dc.subject: Facial expression
dc.subject: Multiple instance learning
dc.subject: Metric learning
dc.subject: Deep learning
dc.title: Facial analysis of dyadic interactions using multiple instance learning
dc.title.alternative: İkili etkileşimlerde çoklu örnekle öğrenme kullanılarak yüz incelemesi
dc.type: Thesis
dc.department: Department of Computer Engineering
dc.publisher: Bilkent University
dc.description.degree: M.S.
dc.identifier.itemid: B134904
dc.embargo.release: 2022-04-05

