Abstract
This paper details our contribution to the BioASQ lab at CLEF 2025 [1], specifically the GutBrainIE shared task (Task 6) [2]. The task focuses on Named Entity Recognition (NER) and Relation Extraction (RE) from biomedical abstracts concerning the gut-brain axis, Parkinson's disease, and mental health. We developed a system leveraging large language models (LLMs), employing GLiNER for NER and ATLOP for RE, fine-tuned on several backbones including GLiNER Large Bio and RoBERTa-large. Our approach follows a three-stage pipeline: data processing, model fine-tuning, and prediction generation. We participated in four subtasks: NER (6.1), Binary Tag-Based RE (6.2.1), Ternary Tag-Based RE (6.2.2), and Ternary Mention-Based RE (6.2.3). Our results indicate strong performance in tag-based RE: our RoBERTa-large model achieved a micro-F1 score of 0.6122 in Binary Tag-Based RE (ranked 5th of 11) and 0.5911 in Ternary Tag-Based RE (ranked 6th of 12), outperforming the baseline in both cases. However, our system struggled with NER (micro-F1 0.4816, ranked 15th) and particularly with Ternary Mention-Based RE (micro-F1 0.1799, ranked 11th), highlighting challenges in fine-grained entity detection and mention-level relation identification. We conclude that while large transformers are effective for extracting biomedical relationships, future work must address domain adaptation for NER and explore joint modeling approaches to improve mention-level performance.
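To illustrate the NER stage described above, the sketch below shows how a GLiNER checkpoint can be prompted with a set of entity labels using the open-source `gliner` package; the checkpoint name, label set, and threshold are illustrative assumptions, not the exact configuration reported in the paper.

```python
# Minimal sketch of prompt-based NER with GLiNER (pip install gliner).
# The checkpoint name, entity labels, and threshold are assumptions for
# illustration only, not the configuration used by the authors.
from gliner import GLiNER

model = GLiNER.from_pretrained("urchade/gliner_large_bio-v0.1")  # assumed biomedical checkpoint

text = (
    "Alterations in the gut microbiome have been associated with "
    "Parkinson's disease and depressive symptoms."
)
labels = ["microbiome", "disease", "chemical", "anatomical location"]  # assumed label set

entities = model.predict_entities(text, labels, threshold=0.5)
for ent in entities:
    print(ent["text"], "=>", ent["label"])
```

The same interface supports fine-tuning on task-specific annotations, which is the setting the abstract refers to; the zero-shot call above is only meant to show how labels are supplied at inference time.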
| Field | Value |
|---|---|
| Original language | English |
| Pages (from-to) | 359-372 |
| Number of pages | 14 |
| Journal | CEUR Workshop Proceedings |
| Volume | 4038 |
| State | Published - 2025 |
| Externally published | Yes |
| Event | 26th Working Notes of the Conference and Labs of the Evaluation Forum, CLEF 2025, Madrid, Spain, 9 Sep 2025 – 12 Sep 2025 |
Bibliographical note
Publisher Copyright: © 2025 Copyright for this paper by its authors.
Keywords
- ATLOP
- Biomedical Information Extraction
- Deep Learning
- GLiNER
- Named Entity Recognition
- Relation Extraction
- Transformers