Affect and personality aware analysis of speech content for automatic estimation of depression severity

buir.advisor: Dibeklioğlu, Hamdi
dc.contributor.author: Gönç, Kaan
dc.date.accessioned: 2023-09-22T07:04:32Z
dc.date.available: 2023-09-22T07:04:32Z
dc.date.copyright: 2023-09
dc.date.issued: 2023-09
dc.date.submitted: 2023-09-20
dc.description: Cataloged from PDF version of article.
dc.description: Thesis (Master's): İhsan Doğramacı Bilkent University, Department of Computer Engineering, 2023.
dc.description: Includes bibliographical references (leaves 62-73).
dc.description.abstract: The detection of depression has gained significant scientific attention for its potential in early diagnosis and intervention. In light of this, we propose a novel approach that relies exclusively on textual features for depression severity estimation. The proposed method integrates affect (emotion and sentiment) and personality features as distinct yet interconnected modalities within a transformer-based architecture. Our key contribution is a masked multimodal joint cross-attention fusion that combines the information gleaned from these text modalities. This fusion enables the model not only to discern subtle contextual cues within textual data but also to capture intricate interdependencies between the modalities. A comprehensive experimental evaluation assesses both the individual components of the proposed architecture and alternative components that are not part of it. The evaluation also covers unimodal settings in which the impact of each modality is examined individually. The findings from these experiments substantiate the self-contained efficacy of our architecture. Furthermore, we explore the significance of individual sentences within speech content, offering insights into the contribution of specific textual cues, and we perform a segmented evaluation of the proposed method across different ranges of depression severity. Finally, we compare our method with existing state-of-the-art studies that use different combinations of auditory, visual, and textual features. The results demonstrate that our method achieves promising performance in depression severity estimation, outperforming the other methods.
dc.description.provenance: Made available in DSpace on 2023-09-22T07:04:32Z (GMT). No. of bitstreams: 1. B162535.pdf: 1250884 bytes, checksum: 6b85a6707ec498031640a0b6bc8742d9 (MD5). Previous issue date: 2023-09.
dc.description.statementofresponsibility: by Kaan Gönç
dc.embargo.release: 2024-03-20
dc.format.extent: xii, 73 leaves : charts ; 30 cm.
dc.identifier.itemid: B162535
dc.identifier.uri: https://hdl.handle.net/11693/113889
dc.language.iso: English
dc.rights: info:eu-repo/semantics/openAccess
dc.subject: Depression severity estimation
dc.subject: Deep learning
dc.subject: Natural language processing
dc.subject: Multimodal fusion
dc.title: Affect and personality aware analysis of speech content for automatic estimation of depression severity
dc.title.alternative: Depresyon şiddetinin otomatik tahmini için konuşma içeriğinin duygulanıma ve kişiliğe bağlı analizi
dc.type: Thesis
thesis.degree.discipline: Computer Engineering
thesis.degree.grantor: Bilkent University
thesis.degree.level: Master's
thesis.degree.name: MS (Master of Science)
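
The abstract above describes a masked multimodal joint cross-attention fusion of text-derived content, affect, and personality modalities. As a rough illustration only, the following minimal PyTorch sketch shows one way such a fusion could be wired: the module name MaskedCrossAttentionFusion, the embedding dimension, the residual combination, the mean pooling, and the final regression head are assumptions made for this sketch, not the implementation described in the thesis.

# Hypothetical sketch of a masked multimodal joint cross-attention fusion over three
# text-derived modalities (speech content, affect, personality). Illustration only:
# names, dimensions, masking, and pooling are assumptions, not the thesis code.
import torch
import torch.nn as nn


class MaskedCrossAttentionFusion(nn.Module):
    def __init__(self, dim: int = 256, num_heads: int = 4):
        super().__init__()
        # One cross-attention block per auxiliary modality: content embeddings act as
        # queries; affect and personality embeddings act as keys/values.
        self.affect_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.personality_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)
        self.regressor = nn.Linear(dim, 1)  # scalar depression-severity score

    def forward(self, content, affect, personality, pad_mask=None):
        # pad_mask: (batch, seq) boolean tensor, True at padded sentence positions,
        # so attention ignores padding when transcripts differ in length.
        a, _ = self.affect_attn(content, affect, affect, key_padding_mask=pad_mask)
        p, _ = self.personality_attn(content, personality, personality,
                                     key_padding_mask=pad_mask)
        # Joint fusion: residual combination of both attended views of the content.
        fused = self.norm(content + a + p)
        pooled = fused.mean(dim=1)  # average over sentence positions
        return self.regressor(pooled).squeeze(-1)


# Minimal usage with random sentence-level embeddings (batch of 2, 8 sentences, dim 256).
model = MaskedCrossAttentionFusion()
content = torch.randn(2, 8, 256)
affect = torch.randn(2, 8, 256)
personality = torch.randn(2, 8, 256)
pad_mask = torch.zeros(2, 8, dtype=torch.bool)
severity = model(content, affect, personality, pad_mask)  # shape: (2,)

Using the content embeddings as queries against each auxiliary modality keeps the speech content central while letting affect and personality cues modulate it, which is one plausible reading of the fusion described in the abstract.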

Files

Original bundle

Name: B162535.pdf
Size: 1.19 MB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 1.71 KB
Format: Item-specific license agreed upon to submission