This is a checkpoint for the BioBERT Base Cased model, converted for compatibility with NeMo from https://github.com/dmis-lab/biobert#download. The model has the same network architecture as the original BERT, but instead of Wikipedia and BookCorpus it is pretrained on PubMed, a large biomedical text corpus. This yields better performance on biomedical downstream tasks such as question answering (QA), named entity recognition (NER), and relation extraction (RE). The model was trained for 1M steps. For more information, please refer to the original paper: https://academic.oup.com/bioinformatics/article/36/4/1234/5566506.
The model achieves a strict accuracy (SAcc) / mean reciprocal rank (MRR) / lenient accuracy (LAcc) of 39/59.86/47.03 on the BioASQ-7b-factoid QA dataset, and a macro precision/recall/F1 of 78.22/80.2/79.15 on the ChemProt RE dataset.
Please download the latest version of this checkpoint to ensure compatibility with the latest NeMo release.
For more details on BERT and pretraining, please refer to https://ngc.nvidia.com/catalog/models/nvidia:bertbasecasedfornemo. For more details on BioBERT and its training setup, please refer to https://academic.oup.com/bioinformatics/article/36/4/1234/5566506.
Source code and the developer guide are available at https://github.com/NVIDIA/NeMo. Documentation is available at https://docs.nvidia.com/deeplearning/nemo/neural-modules-release-notes/index.html.
This model checkpoint can be used either for further pretraining of BioBERT on your own dataset or for finetuning on downstream tasks. The corresponding tasks and scripts can be found at https://github.com/NVIDIA/NeMo.
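For orientation only, the sketch below illustrates the general pattern of loading converted BioBERT Base Cased encoder weights and attaching a task head, using plain PyTorch and Hugging Face Transformers rather than the NeMo workflow described in the notebooks below. The checkpoint path and the state-dict key layout are assumptions and may need remapping depending on how the checkpoint was exported; refer to the linked notebooks for the supported NeMo procedure.

```python
# Minimal sketch (not the NeMo workflow): load converted BioBERT encoder weights
# into a standard BERT Base Cased encoder and add a simple task head.
# The checkpoint path and state-dict key names are placeholders/assumptions.
import torch
import torch.nn as nn
from transformers import BertConfig, BertModel

config = BertConfig(            # BERT Base: 12 layers, 768 hidden, 12 heads
    hidden_size=768,
    num_hidden_layers=12,
    num_attention_heads=12,
    intermediate_size=3072,
    vocab_size=28996,           # BERT Base Cased vocabulary size
)
encoder = BertModel(config)

# Placeholder path; the exported key names may differ and need remapping.
state_dict = torch.load("biobert_base_cased.pt", map_location="cpu")
encoder.load_state_dict(state_dict, strict=False)  # strict=False tolerates head/key mismatches

class TokenClassifier(nn.Module):
    """Encoder plus a linear head, e.g. for NER-style token classification."""
    def __init__(self, encoder, num_labels):
        super().__init__()
        self.encoder = encoder
        self.head = nn.Linear(encoder.config.hidden_size, num_labels)

    def forward(self, input_ids, attention_mask=None):
        hidden = self.encoder(input_ids, attention_mask=attention_mask).last_hidden_state
        return self.head(hidden)

model = TokenClassifier(encoder, num_labels=3)  # e.g. B/I/O tags for NER
```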
The following notebooks show how to finetune BioBERT on different downstream tasks; a generic finetuning loop is sketched after the links.
Question answering (QA): visit https://github.com/NVIDIA/NeMo/blob/master/examples/nlp/biobert_notebooks/biobert_qa.ipynb
Relation extraction (RE): visit https://github.com/NVIDIA/NeMo/blob/master/examples/nlp/biobert_notebooks/biobert_re.ipynb
Named entity recognition (NER): visit https://github.com/NVIDIA/NeMo/blob/master/examples/nlp/biobert_notebooks/biobert_ner.ipynb
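As a rough illustration of what the notebooks do, the sketch below runs a generic sequence-classification finetuning loop of the kind used for relation extraction, again with Hugging Face Transformers rather than NeMo. The base weights, toy sentence, label, number of relation classes, and hyperparameters are placeholders; in practice the BioBERT encoder weights and the preprocessed ChemProt data would be used instead.

```python
# Generic finetuning-loop sketch (independent of the NeMo notebooks above).
# Model, data, labels, and hyperparameters are placeholders for illustration only.
import torch
from torch.optim import AdamW
from transformers import BertForSequenceClassification, BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("bert-base-cased")  # same vocabulary as BioBERT Base Cased
model = BertForSequenceClassification.from_pretrained("bert-base-cased", num_labels=5)
# In practice, the BioBERT encoder weights would replace bert-base-cased here,
# and num_labels would match the ChemProt relation classes being evaluated.

texts = ["Aspirin inhibits COX-2 activity."]  # toy sentence; real ChemProt preprocessing marks the entity spans
labels = torch.tensor([1])                    # toy relation label

optimizer = AdamW(model.parameters(), lr=3e-5)
model.train()
for epoch in range(3):
    batch = tokenizer(texts, return_tensors="pt", padding=True, truncation=True)
    outputs = model(**batch, labels=labels)
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```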