UniMRE: a unified framework for zero-shot medical relation extraction with large language models

Ojemann WK, Xie K, Liu K, Chang E, Roth D, Litt B, Ellis CA. Zero-shot extraction of seizure outcomes from clinical notes using generative pretrained transformers. medRxiv preprint. 2024.

Fan C, Wei W, Qu X, Lu Z, Xie W, Cheng Y, Chen D. Enhancing low-resource relation representations through multi-view decoupling, in: Proceedings of the AAAI Conference on Artificial Intelligence. 2024;38:17968–76.

Shang Y-M, Huang H, Sun X, Wei W, Mao X-L. A pattern-aware self-attention network for distant supervised relation extraction. Inf Sci. 2022;584:269–79.

Chia YK, Bing L, Poria S, Si L. RelationPrompt: Leveraging prompts to generate synthetic data for zero-shot relation triplet extraction, in: Findings of the Association for Computational Linguistics: ACL 2022, 2022;45–57.

Yang A, Yang B, Hui B, Zheng B, Yu B, Zhou C, Li C, Li C, Liu D, Huang F, et al. Qwen2 technical report, arXiv preprint arXiv:2407.10671, 2024.

Dubey A, Jauhri A, Pandey A, Kadian A, Al-Dahle A, Letman A, Mathur A, Schelten A, Yang A, Fan A, et al. The Llama 3 herd of models, arXiv preprint arXiv:2407.21783, 2024.

Tang W, Xu B, Zhao Y, Mao Z, Liu Y, Liao Y, Xie H. UniRel: Unified representation and interaction for joint relational triple extraction, in: Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, 2022;7087–7099.

Wang Q, Zhou K, Qiao Q, Li Y, Li Q. Improving unsupervised relation extraction by augmenting diverse sentence pairs, in: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, 2023;12136–12147.

Duan J, Liao X, An Y, Wang J. KeyEE: enhancing low-resource generative event extraction with auxiliary keyword sub-prompt. Big Data Min Anal. 2024;7(2):547–60.

Li G, Xu Z, Shang Z, Liu J, Ji K, Guo Y. Empirical analysis of dialogue relation extraction with large language models, in: Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence, 2024;6359–6367.

Li G, Wang P, Liu J, Guo Y, Ji K, Shang Z, Xu Z. Meta in-context learning makes large language models better zero and few-shot relation extractors, in: Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence, 2024;6350–6358.

Liu Y, Peng X, Du T, Yin J, Liu W, Zhang X. ERA-CoT: Improving chain-of-thought through entity relationship analysis, in: Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2024;8780–8794.

Labrak Y, Rouvier M, Dufour R. A zero-shot and few-shot study of instruction-finetuned large language models applied to clinical and biomedical tasks, in: Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), 2024;2049–2066.

Li G, Ke W, Wang P, Xu Z, Ji K, Liu J, Shang Z, Luo Q. Unlocking instructive in-context learning with tabular prompting for relational triple extraction, in: Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), 2024;17131–17143.

Zhang K, Gutiérrez BJ, Su Y. Aligning instruction tasks unlocks large language models as zero-shot relation extractors, in: Findings of the Association for Computational Linguistics: ACL 2023, 2023;794–812.

Mo Y, Liu J, Yang J, Wang Q, Zhang S, Wang J, Li Z. C-ICL: contrastive in-context learning for information extraction, arXiv preprint arXiv:2402.11254, 2024.

Wadhwa S, Amir S, Wallace BC. Revisiting relation extraction in the era of large language models, in: Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2023;15566–89.

Roy A, Pan S. Incorporating medical knowledge in BERT for clinical relation extraction, in: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, 2021;5357–5366.

Tang X, Su Q, Wang J, Deng Z. CHisIEC: An information extraction corpus for ancient Chinese history, in: Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), 2024;3192–3202.

Mintz M, Bills S, Snow R, Jurafsky D. Distant supervision for relation extraction without labeled data, in: Proceedings of the Joint Conference of the 47th Annual Meeting of the ACL and the 4th International Joint Conference on Natural Language Processing of the AFNLP, 2009;1003–1011.

Ma R, Gui T, Li L, Zhang Q, Huang X-J, Zhou Y. SENT: Sentence-level distant relation extraction via negative training, in: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), 2021;6201–6213.

Takanobu R, Zhang T, Liu J, Huang M. A hierarchical framework for relation extraction with reinforcement learning, in: Proceedings of the AAAI Conference on Artificial Intelligence. 2019;33:7072–9.

Han X, Gao T, Yao Y, Ye D, Liu Z, Sun M. OpenNRE: An open and extensible toolkit for neural relation extraction, in: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP): System Demonstrations, 2019;169–174.

Zhou W, Chen M. An improved baseline for sentence-level relation extraction, in: Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), 2022;161–168.

Wei X, Cui X, Cheng N, Wang X, Zhang X, Huang S, Xie P, Xu J, Chen Y, Zhang M, et al. ChatIE: Zero-shot information extraction via chatting with ChatGPT. 2023.

Li XL, Holtzman A, Fried D, Liang P, Eisner J, Hashimoto T, Zettlemoyer L, Lewis M. Contrastive decoding: Open-ended text generation as optimization, in: Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2023;12286–12312.

Lewis P, Perez E, Piktus A, Petroni F, Karpukhin V, Goyal N, Küttler H, Lewis M, Yih W-T, Rocktäschel T, et al. Retrieval-augmented generation for knowledge-intensive NLP tasks. Adv Neural Inf Process Syst. 2020;33:9459–74.

Ovadia O, Brief M, Mishaeli M, Elisha O. Fine-tuning or retrieval? Comparing knowledge injection in LLMs, in: Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, 2024;237–250.

Maekawa S, Iso H, Gurajada S, Bhutani N. Retrieval helps or hurts? A deeper dive into the efficacy of retrieval augmentation to language models, in: Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), 2024;5506–5521.

Li M, Kilicoglu H, Xu H, Zhang R. BiomedRAG: a retrieval augmented large language model for biomedicine. J Biomed Inform. 2025;162:104769.

Devlin J, Chang M-W, Lee K, Toutanova K. BERT: Pre-training of deep bidirectional transformers for language understanding, in: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), 2019;4171–4186.

Lewis M, Liu Y, Goyal N, Ghazvininejad M, Mohamed A, Levy O, Stoyanov V, Zettlemoyer L. BART: Denoising sequence-to-sequence pre-training for natural language generation, translation, and comprehension, in: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, Association for Computational Linguistics, 2020;7871.

Tan KL, Lee CP, Anbananthen KSM, Lim KM. RoBERTa-LSTM: a hybrid model for sentiment analysis with transformer and recurrent neural network. IEEE Access. 2022;10:21517–25.

Zhang L, Liu M, Wang L, Zhang Y, Xu X, Pan Z, Feng Y, Zhao J, Zhang L, Yao G, et al. Constructing a large language model to generate impressions from findings in radiology reports. Radiology. 2024;312(3): e240885.

Wan Z, Cheng F, Mao Z, Liu Q, Song H, Li J, Kurohashi S. GPT-RE: In-context learning for relation extraction using large language models, in: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, 2023;3534–3547.

Kim B, Iso H, Bhutani N, Hruschka E, Nakashole N, Mitchell T. Zero-shot triplet extraction by template infilling, in: Proceedings of the 13th International Joint Conference on Natural Language Processing and the 3rd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics (Volume 1: Long Papers), 2023;272–284.

Zhang K, Zhang C, Zhang W, Zan H. Corpus construction of critical illness entities and relationships, in: Workshop on Chinese Lexical Semantics, Springer, 2023;61–75.

Guan T, Zan H, Zhou X, Xu H, Zhang K. CMeIE: construction and evaluation of Chinese medical information extraction dataset, in: Natural Language Processing and Chinese Computing: 9th CCF International Conference, NLPCC 2020, Zhengzhou, China, October 14–18, 2020, Proceedings, Part I 9, Springer, 2020;270–282.
