A transformer-based prior legal case retrieval method
buir.contributor.author | Öztürk, Ceyhun Emre | |
buir.contributor.author | Özçelik, Şemsi Barış | |
buir.contributor.author | Koç, Aykut | |
buir.contributor.orcid | Öztürk, Ceyhun Emre|0000-0001-9744-6778 | |
buir.contributor.orcid | Özçelik, Şemsi Barış|0000-0002-3666-8366 | |
buir.contributor.orcid | Koç, Aykut|0000-0002-6348-2663 | |
dc.citation.epage | 4 | en_US |
dc.citation.spage | 1 | |
dc.contributor.author | Öztürk, Ceyhun Emre | |
dc.contributor.author | Özçelik, Şemsi Barış | |
dc.contributor.author | Koç, Aykut | |
dc.coverage.spatial | İstanbul, Türkiye | |
dc.date.accessioned | 2024-03-22T12:55:52Z | |
dc.date.available | 2024-03-22T12:55:52Z | |
dc.date.issued | 2023-08-28 | |
dc.department | Department of Electrical and Electronics Engineering | |
dc.department | Department of Law | |
dc.department | National Magnetic Resonance Research Center (UMRAM) | |
dc.description | Date of Conference: 05-08 July 2023 | |
dc.description | Conference Name: 31st IEEE Conference on Signal Processing and Communications Applications, SIU 2023 | |
dc.description.abstract | In this work, BERTurk-Legal, a transformer-based language model, is introduced for retrieving prior legal cases. BERTurk-Legal is pre-trained on a dataset from the Turkish legal domain that contains no labels related to the prior court case retrieval task. The model is trained in a self-supervised manner using masked language modeling. Used for zero-shot classification, BERTurk-Legal achieves state-of-the-art results on a dataset of legal cases from the Court of Cassation of Turkey. The experimental results demonstrate the necessity of developing language models specific to the Turkish legal domain. | |
dc.description.abstract | In this work, a transformer-based model named BERTurk-Legal is proposed for use in the prior legal case retrieval task. BERTurk-Legal was pre-trained on a dataset from the Turkish legal domain. This dataset contains no labels related to prior cases. BERTurk-Legal was trained in a self-supervised manner using masked language modeling. Without being trained on the classification task, BERTurk-Legal achieved the best results in the literature on a dataset consisting of Court of Cassation (Yargıtay) cases. The experimental results demonstrate the necessity of developing language models specific to the Turkish legal domain. | |
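The retrieval setting the abstract describes (a pre-trained encoder applied zero-shot, with no task labels) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names are hypothetical and the toy vectors stand in for BERTurk-Legal document embeddings, with candidate prior cases ranked by cosine similarity to the query case.

```python
import math

def cosine(u, v):
    # Cosine similarity between two equal-length vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def retrieve_prior_cases(query_emb, case_embs, top_k=2):
    # Rank candidate prior cases by similarity to the query case.
    # In the paper's setting the embeddings would come from the
    # pre-trained encoder; here they are toy stand-ins.
    ranked = sorted(case_embs.items(),
                    key=lambda kv: cosine(query_emb, kv[1]),
                    reverse=True)
    return [case_id for case_id, _ in ranked[:top_k]]

# Toy document embeddings (stand-ins for encoder outputs).
cases = {
    "case_A": [0.9, 0.1, 0.0],
    "case_B": [0.1, 0.9, 0.0],
    "case_C": [0.8, 0.2, 0.1],
}
print(retrieve_prior_cases([1.0, 0.0, 0.0], cases, top_k=2))
# → ['case_A', 'case_C']
```

Because ranking needs only embedding similarity, no retrieval labels are required at any stage, which matches the self-supervised setup the abstract describes.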
dc.description.provenance | Made available in DSpace on 2024-03-22T12:55:52Z (GMT). No. of bitstreams: 1 A_transformer-based_prior_legal_case_retrieval_method.pdf: 561295 bytes, checksum: 641473ca3ac0af074748d3a3d29fcb24 (MD5) Previous issue date: 2023-08 | en |
dc.identifier.doi | 10.1109/SIU59756.2023.10223938 | |
dc.identifier.eisbn | 9798350343557 | |
dc.identifier.isbn | 9798350343564 | |
dc.identifier.issn | 2165-0608 | |
dc.identifier.uri | https://hdl.handle.net/11693/115089 | |
dc.language.iso | Turkish | |
dc.publisher | IEEE - Institute of Electrical and Electronics Engineers | |
dc.relation.isversionof | https://dx.doi.org/10.1109/SIU59756.2023.10223938 | |
dc.source.title | 2023 31st Signal Processing and Communications Applications Conference (SIU 2023) | |
dc.subject | Natural language processing | |
dc.subject | Legal tech | |
dc.subject | Deep learning | |
dc.subject | Prior legal case retrieval | |
dc.subject | Legal NLP | |
dc.subject | Turkish NLP | |
dc.subject | Doğal dil işleme | |
dc.subject | Hukuk teknolojileri | |
dc.subject | Derin öğrenme | |
dc.subject | Emsal karar bulma | |
dc.subject | Hukukta NLP | |
dc.subject | Türkçe NLP | |
dc.title | A transformer-based prior legal case retrieval method | |
dc.title.alternative | Dönüştürücü tabanlı emsal karar bulma yöntemi | |
dc.type | Conference Paper |
Files
Original bundle
- Name: A_transformer-based_prior_legal_case_retrieval_method.pdf
- Size: 548.14 KB
- Format: Adobe Portable Document Format
License bundle
- Name: license.txt
- Size: 2.01 KB
- Description: Item-specific license agreed upon to submission