Media type:
E-Article
Title:
Applying BERT Embeddings to Predict Legal Textual Entailment
Contributor:
Wehnert, Sabine;
Dureja, Shipra;
Kutty, Libin;
Sudhi, Viju;
De Luca, Ernesto William
Imprint:
Springer Science and Business Media LLC, 2022
Published in: The Review of Socionetwork Strategies
Language:
English
DOI:
10.1007/s12626-022-00101-3
ISSN:
2523-3173;
1867-3236
Description:
Abstract: Textual entailment classification is one of the hardest tasks for the Natural Language Processing community. In particular, working on entailment with legal statutes comes with increased difficulty, for example in terms of differing abstraction levels, terminology, and the domain knowledge required to solve the task. In the course of the COLIEE competition, we develop three approaches to classifying entailment. The first combines Sentence-BERT embeddings with a graph neural network; the second uses the domain-specific model LEGAL-BERT, further trained on the competition's retrieval task and fine-tuned for entailment classification; the third embeds syntactic parse trees with the KERMIT encoder and uses them together with a BERT model. In this work, we discuss the potential of the latter technique and why, of all our submissions, the LEGAL-BERT runs may have outperformed the graph-based approach.