Document Type

Article

Publication Title

arXiv

Abstract

In this paper we report on our submission to the Multidocument Summarisation for Literature Review (MSLR) shared task. Specifically, we adapt PRIMERA (Xiao et al., 2022) to the biomedical domain by placing global attention on important biomedical entities in several ways. We analyse the outputs of the 23 resulting models, and report patterns in the results related to the presence of additional global attention, number of training steps, and the input configuration. © 2022, CC BY-SA.
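The abstract describes placing global attention on important biomedical entities in a PRIMERA-style (LED-based) model. The snippet below is a minimal sketch of that general idea, not the authors' code: it assumes the Hugging Face transformers library, the public "allenai/PRIMERA" checkpoint, and a purely illustrative hard-coded entity list (in the paper, entities would come from a biomedical NER step).

```python
# Hedged sketch: extend PRIMERA's usual global attention (first token) with
# additional global attention on tokens belonging to biomedical entities.
# Entity selection here is hypothetical and for illustration only.
import torch
from transformers import AutoTokenizer, LEDForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("allenai/PRIMERA")
model = LEDForConditionalGeneration.from_pretrained("allenai/PRIMERA")

# Two toy input documents joined with PRIMERA's document separator token.
docs = [
    "Aspirin reduced headache severity in the treatment arm.",
    "Placebo showed no significant effect on headache outcomes.",
]
text = " <doc-sep> ".join(docs)
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=4096)

# Default setup: global attention on the first token.
global_attention_mask = torch.zeros_like(inputs["input_ids"])
global_attention_mask[:, 0] = 1

# Illustrative entity list; a real pipeline would use a biomedical NER model.
entities = ["aspirin", "placebo", "headache"]
entity_token_ids = {
    tid
    for ent in entities
    for tid in tokenizer(" " + ent, add_special_tokens=False)["input_ids"]
}

# Place additional global attention on every token that matches an entity.
for pos, tid in enumerate(inputs["input_ids"][0].tolist()):
    if tid in entity_token_ids:
        global_attention_mask[0, pos] = 1

summary_ids = model.generate(
    inputs["input_ids"],
    attention_mask=inputs["attention_mask"],
    global_attention_mask=global_attention_mask,
    max_length=128,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```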

DOI

10.48550/arXiv.2209.08698

Publication Date

9-19-2022

Keywords

Biomedical domain, Literature reviews, Multi-document summarization

Comments

Preprint: arXiv

Archived with thanks to arXiv

Preprint License: CC BY-SA 4.0

Uploaded 12 October 2022
