Facial analysis of dyadic interactions using multiple instance learning

buir.advisor: Dibeklioğlu, Hamdi
dc.contributor.author: Giritlioğlu, Dersu
dc.date.accessioned: 2021-10-07T06:13:58Z
dc.date.available: 2021-10-07T06:13:58Z
dc.date.copyright: 2021-09
dc.date.issued: 2021-09
dc.date.submitted: 2021-10-06
dc.description: Cataloged from PDF version of article.
dc.description: Thesis (Master's): İhsan Doğramacı Bilkent University, Department of Computer Engineering, 2021.
dc.description: Includes bibliographical references (leaves 55-62).
dc.description.abstract: Interpretation of nonverbal behavior is vital for a reliable analysis of social interactions. To this end, we automatically analyze the facial expressions of romantic couples during their dyadic interactions, for the first time in the literature. We use a recently collected romantic relationship dataset that includes videos of 167 couples while they discuss a conflict and a positive experience they share. To distinguish interactions during positive-experience discussions from those during conflict discussions, we model facial expressions with a deep multiple instance learning (MIL) framework adapted from the anomaly detection literature. A spatio-temporal representation of facial behavior is obtained from short video segments through a 3D residual network and used as the instances in MIL bag formations. The goal is to detect conflict sessions by revealing distinctive facial cues that are displayed within short periods. To this end, instance representations of positive-experience and conflict sessions are further optimized with deep metric learning so as to be more separable. In addition, for a more reliable analysis of dyadic interaction, the facial expressions of both subjects in the interaction are analyzed jointly. Our experiments show that the proposed approach reaches an accuracy of 71%. In addition to comparisons with several baseline models, we conducted a human evaluation study for the same task with 6 participants. The proposed approach is 5% more accurate than the human raters and outperforms all baseline models. As the experimental results suggest, reliable modeling of facial behavior can greatly contribute to the analysis of dyadic interactions, yielding better performance than that of humans.
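
Note: the abstract describes the approach only at a high level. As a rough illustration of the kind of deep MIL objective adapted from the anomaly detection literature, the sketch below scores per-segment 3D-ResNet embeddings within each session bag and applies a hinge ranking loss between a conflict bag and a positive-experience bag. The scoring-head architecture, feature dimension, margin, and bag sizes are illustrative assumptions, not the thesis's exact formulation.

import torch
import torch.nn as nn

class InstanceScorer(nn.Module):
    """Maps a 3D-ResNet segment embedding to a conflict score in [0, 1] (assumed head)."""
    def __init__(self, feat_dim: int = 512):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(feat_dim, 128), nn.ReLU(),
            nn.Linear(128, 1), nn.Sigmoid(),
        )

    def forward(self, bag: torch.Tensor) -> torch.Tensor:
        # bag: (num_segments, feat_dim) -> (num_segments,) instance scores
        return self.head(bag).squeeze(-1)

def mil_ranking_loss(conflict_scores, positive_scores, margin=1.0):
    """Hinge ranking loss: the highest-scoring segment of a conflict session
    should outrank every segment of a positive-experience session."""
    return torch.relu(margin - conflict_scores.max() + positive_scores.max())

# Illustrative usage with random embeddings standing in for 3D-ResNet segment features.
scorer = InstanceScorer()
conflict_bag = torch.randn(30, 512)   # instances from one conflict session
positive_bag = torch.randn(30, 512)   # instances from one positive-experience session
loss = mil_ranking_loss(scorer(conflict_bag), scorer(positive_bag))

In this kind of formulation, the max over instance scores is what lets the model flag a session from only a few distinctive short segments, which matches the stated goal of detecting conflict sessions from facial cues displayed within short periods.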
dc.description.statementofresponsibility: by Dersu Giritlioğlu
dc.embargo.release: 2022-04-05
dc.format.extent: ix, 62 leaves : illustrations (some color), charts ; 30 cm.
dc.identifier.itemid: B134904
dc.identifier.uri: http://hdl.handle.net/11693/76587
dc.language.iso: English
dc.rights: info:eu-repo/semantics/openAccess
dc.subject: Dyadic interaction
dc.subject: Behavior analysis
dc.subject: Facial expression
dc.subject: Multiple instance learning
dc.subject: Metric learning
dc.subject: Deep learning
dc.title: Facial analysis of dyadic interactions using multiple instance learning
dc.title.alternative: İkili etkileşimlerde çoklu örnekle öğrenme kullanılarak yüz incelemesi
dc.type: Thesis
thesis.degree.discipline: Computer Engineering
thesis.degree.grantor: Bilkent University
thesis.degree.level: Master's
thesis.degree.name: MS (Master of Science)

Files

Original bundle

Name: 10425069.pdf
Size: 6.46 MB
Format: Adobe Portable Document Format
Description: Full printable version

License bundle

Name: license.txt
Size: 1.69 KB
Format: Item-specific license agreed upon to submission