NGC Catalog
BioBERT TF checkpoint (Base, Uncased, RE, ChemProt, AMP)

Description
BioBERT Base Uncased checkpoint fine-tuned for relation extraction on the ChemProt dataset.
Publisher
NVIDIA Deep Learning Examples
Latest Version
19.08.1_amp
Modified
April 4, 2023
Size
1.23 GB

Model Overview

BERT for biomedical text-mining.

Model Architecture

In the original BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding paper, pre-training is done on Wikipedia and BooksCorpus, with state-of-the-art results demonstrated on the SQuAD (Stanford Question Answering Dataset) benchmark.

Meanwhile, many works, including BioBERT, SciBERT, NCBI-BERT, ClinicalBERT (MIT), ClinicalBERT (NYU, Princeton), and others at the BioNLP'19 workshop, show that additional pre-training of BERT on large biomedical text corpora such as PubMed improves performance on biomedical text-mining tasks.

This repository provides scripts and a recipe to adapt the NVIDIA BERT code base to achieve state-of-the-art results on the following biomedical text-mining benchmark tasks:

  • BC5CDR-disease: a named-entity-recognition task to recognize diseases mentioned in a collection of 1,500 PubMed titles and abstracts (Li et al., 2016)

  • BC5CDR-chemical: a named-entity-recognition task to recognize chemicals mentioned in a collection of 1,500 PubMed titles and abstracts (Li et al., 2016)

  • ChemProt: a relation-extraction task to determine chemical-protein interactions in a collection of 1,820 PubMed abstracts (Krallinger et al., 2017)
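
For the ChemProt task, relation extraction is commonly cast as sentence classification: the candidate chemical and protein mentions are replaced with placeholder tokens before the sentence is fed to BERT. Below is a minimal sketch of that preprocessing step; the marker strings, offsets, and function name are illustrative assumptions, not the exact ones used by the repository's scripts.

```python
# Illustrative sketch (not the repository's exact preprocessing): anonymize
# the candidate entity pair with placeholder tokens so the classifier can
# focus on the relation between the two marked mentions.

def mark_entities(sentence: str, chem_span: tuple, prot_span: tuple) -> str:
    """Replace the chemical and protein character spans (start, end) with
    placeholder tokens. Later spans are rewritten first so that earlier
    character offsets remain valid."""
    spans = sorted([(chem_span, "@CHEMICAL$"), (prot_span, "@GENE$")],
                   key=lambda item: item[0][0], reverse=True)
    for (start, end), marker in spans:
        sentence = sentence[:start] + marker + sentence[end:]
    return sentence

sent = "Selegiline inhibits monoamine oxidase B."
print(mark_entities(sent, (0, 10), (20, 39)))
# -> @CHEMICAL$ inhibits @GENE$.
```

The marked sentence is then tokenized and classified into one of the ChemProt interaction classes.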

Training

This model was trained using the scripts available on NGC and in the GitHub repository.
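
After downloading, the checkpoint's variables can be inspected with standard TensorFlow utilities. The sketch below builds a toy checkpoint so it runs end to end; the toy variable and path are assumptions for illustration, and in practice you would pass the downloaded model's checkpoint prefix to tf.train.load_checkpoint instead.

```python
import tensorflow as tf

# A toy checkpoint stands in for the downloaded model so this sketch is
# runnable end to end; substitute the BioBERT checkpoint prefix in practice.
ckpt = tf.train.Checkpoint(w=tf.Variable(tf.zeros([2, 3])))
path = ckpt.save("/tmp/toy_ckpt")

# tf.train.load_checkpoint reads variable names and shapes without
# building a graph, which is handy for sanity-checking a download.
reader = tf.train.load_checkpoint(path)
for name, shape in sorted(reader.get_variable_to_shape_map().items()):
    print(name, shape)
```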

Dataset

The following datasets were used to train this model:

  • PubMed - a database containing more than 33 million citations and abstracts of biomedical literature.
  • ChemProt - a database of disease chemical biology.

Performance

Performance numbers for this model are available on NGC.

References

  • Original paper
  • NVIDIA model implementation in NGC
  • NVIDIA model implementation on GitHub

License

This model was trained using open-source software available in the Deep Learning Examples repository. For terms of use, please refer to the licenses of the scripts and of the datasets from which the model was derived.