
BioBERT TF checkpoint (Base, Uncased, RE, ChemProt, AMP)

Description: BioBERT Base Uncased fine-tuned checkpoint for relation extraction on the ChemProt dataset.
Publisher: NVIDIA Deep Learning Examples
Latest Version: 19.08.1_amp
Modified: April 4, 2023
Size: 1.23 GB

Model Overview

BERT for biomedical text-mining.

Model Architecture

In the original BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding paper, pre-training is done on Wikipedia and BooksCorpus, with state-of-the-art results demonstrated on the SQuAD (Stanford Question Answering Dataset) benchmark.

Meanwhile, many works, including BioBERT, SciBERT, NCBI-BERT, ClinicalBERT (MIT), ClinicalBERT (NYU, Princeton), and others presented at the BioNLP'19 workshop, show that additional pre-training of BERT on a large biomedical text corpus such as PubMed results in better performance on biomedical text-mining tasks.

The associated repository provides scripts and a recipe for adapting the NVIDIA BERT codebase to achieve state-of-the-art results on biomedical text-mining benchmark tasks, including relation extraction on the ChemProt dataset, which is the task this checkpoint was fine-tuned for.
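
As a quick sanity check after downloading, the checkpoint variables can be listed to confirm the BERT-Base encoder shapes (12 layers, hidden size 768). The following is a minimal sketch assuming TensorFlow 2.x; the checkpoint path is a placeholder for wherever the NGC archive was extracted.

```python
# Minimal sketch: inspect the downloaded TF checkpoint to confirm it contains
# a BERT-Base encoder. The checkpoint path below is a placeholder, not the
# actual layout of the NGC download.
import tensorflow as tf

ckpt_path = "./biobert_re_chemprot/model.ckpt"  # hypothetical local path

# List variables stored in the checkpoint together with their shapes,
# printing only the embedding tables, the first encoder layer, and the
# task-specific output variables to keep the output short.
for name, shape in tf.train.list_variables(ckpt_path):
    if "embeddings" in name or "layer_0/" in name or name.startswith("output"):
        print(f"{name}: {shape}")
```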

Training

This model was trained using the scripts available on NGC and in the NVIDIA Deep Learning Examples GitHub repository.
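
For orientation, the sketch below illustrates how relation extraction on ChemProt is typically cast as sentence-level classification: a dropout plus dense softmax head on top of BERT's pooled [CLS] output. It is not the NVIDIA training script itself; the class count, hidden size, and hyperparameters shown are assumptions.

```python
# Minimal sketch of the fine-tuning setup, not the exact NVIDIA training
# script: a single classification head placed on top of BERT's pooled [CLS]
# output. With AMP, the real script additionally runs the encoder and head
# under mixed precision (float16 compute with float32 master weights).
import tensorflow as tf

NUM_CLASSES = 6      # assumed: ChemProt relation types plus a "no relation" class
HIDDEN_SIZE = 768    # BERT-Base pooled output size

# Stand-in for the pooled [CLS] embedding produced by the BERT encoder;
# in the real script this tensor comes from the encoder initialized with
# the checkpoint distributed on this page.
pooled_output = tf.keras.Input(shape=(HIDDEN_SIZE,), name="pooled_cls")

x = tf.keras.layers.Dropout(0.1)(pooled_output)
logits = tf.keras.layers.Dense(NUM_CLASSES, name="re_classifier")(x)

model = tf.keras.Model(inputs=pooled_output, outputs=logits)
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=2e-5),  # assumed LR
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
model.summary()
```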

Dataset

The following datasets were used to train this model:

  • PubMed - a database containing more than 33 million citations and abstracts of biomedical literature, used for additional pre-training.
  • ChemProt - a database of disease chemical biology, used for fine-tuning the relation extraction task (see the sketch after this list).
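
To illustrate the data path, the sketch below serializes one ChemProt-style relation-extraction example into a tf.train.Example record. The entity-anonymized sentence format, label id, and token ids are illustrative assumptions rather than the exact preprocessing performed by the training scripts.

```python
# Minimal sketch of turning one ChemProt-style relation-extraction example
# into a tf.train.Example record. The sentence format and ids below are
# illustrative assumptions, not the repository's exact preprocessing.
import tensorflow as tf

sentence = "The interaction of @CHEMICAL$ with @GENE$ was inhibited in vitro."
label_id = 1  # assumed integer id for this example's relation class

# Placeholder token ids; the real pipeline produces these with the BERT
# WordPiece tokenizer and the uncased vocabulary shipped with the checkpoint.
input_ids = [101] + list(range(2000, 2014)) + [102]

example = tf.train.Example(features=tf.train.Features(feature={
    "input_ids": tf.train.Feature(int64_list=tf.train.Int64List(value=input_ids)),
    "label_ids": tf.train.Feature(int64_list=tf.train.Int64List(value=[label_id])),
}))
print(example.SerializeToString()[:32])
```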

Performance

Performance numbers for this model are available on NGC.

License

This model was trained using open-source software available in the Deep Learning Examples repository. For terms of use, please refer to the license of the scripts and of the datasets from which the model was derived.