BIBM Medical CTG
BioGPT: Generative Pre-trained Transformer for Biomedical Text Generation and Mining
- 2022 Oct
- Pre-trains a GPT-2-style model from scratch on collected PubMed abstracts, then fine-tunes it for each downstream task (a minimal generation sketch follows the task list):
- end-to-end relation extraction
- question answering (QA)
- document classification
- text generation
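A minimal sketch of the text-generation use case, assuming the Hugging Face `transformers` port of BioGPT and the `microsoft/biogpt` checkpoint id (neither is named in these notes); the fine-tuned tasks above would add task heads on this same backbone:

```python
# Minimal sketch: biomedical text generation with pretrained BioGPT.
# Assumes the Hugging Face `transformers` port and the "microsoft/biogpt"
# model id (an assumption, not stated in the original notes).
import torch
from transformers import BioGptForCausalLM, BioGptTokenizer

tokenizer = BioGptTokenizer.from_pretrained("microsoft/biogpt")
model = BioGptForCausalLM.from_pretrained("microsoft/biogpt")
model.eval()

inputs = tokenizer("COVID-19 is", return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=40,   # length of the generated continuation
        num_beams=5,         # beam search for a more fluent continuation
        early_stopping=True,
    )
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```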
- Example of output controlled toward the biomedical domain (figure)
BioBART: Pretraining and Evaluation of A Biomedical Generative Language Model
- We introduce the generative language model BioBART, which adapts BART to the biomedical domain. We collate various biomedical language generation tasks, including dialogue, summarization, entity linking, and named entity recognition. BioBART pretrained on PubMed abstracts shows enhanced performance compared to BART and sets strong baselines on several tasks. Furthermore, we conduct ablation studies on the pretraining tasks for BioBART and find that sentence permutation has negative effects on downstream tasks.
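A minimal loading sketch for BioBART as a BART-style seq2seq model; the checkpoint id `GanjinZero/biobart-base` is an assumption (the authors' released weights), and the pretrained-only model still needs task fine-tuning (dialogue, summarization, ...), so the output here is illustrative only:

```python
# Minimal sketch: loading BioBART as a BART-style seq2seq model.
# The checkpoint id "GanjinZero/biobart-base" is an assumption; without
# task fine-tuning the generated "summary" is only illustrative.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("GanjinZero/biobart-base")
model = AutoModelForSeq2SeqLM.from_pretrained("GanjinZero/biobart-base")

abstract = "Proton pump inhibitors are widely used to treat acid-related disorders ..."
inputs = tokenizer(abstract, return_tensors="pt", truncation=True, max_length=512)
summary_ids = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```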
Generation and evaluation of artificial mental health records for Natural Language Processing