NGC | Catalog

DeepPavlov


Description

DeepPavlov is an open-source conversational AI library built on TensorFlow and Keras. It is designed for the development of production-ready chatbots and complex conversational systems, and for research in NLP, particularly dialog systems.

Publisher

DeepPavlov

Latest Tag

0.11.0

Modified

October 18, 2021

Compressed Size

5.42 GB

Multinode Support

No

Multi-Arch Support

No

What is DeepPavlov?

DeepPavlov is an open-source conversational AI library built on TensorFlow and Keras.

DeepPavlov is designed for

  • development of production-ready chatbots and complex conversational systems,
  • research in NLP, particularly dialog systems.

Please leave us your feedback on how we can improve the DeepPavlov framework.

This repository contains pre-built DeepPavlov images. The images allow you to run DeepPavlov models and communicate with them via a REST-like HTTP API (see the riseapi DeepPavlov docs for more details). Images from this repository are built to run on GPU and require the NVIDIA Container Toolkit to be installed. The Dockerfile can be found here.

Running DeepPavlov

Run the following to start a DeepPavlov model:

nvidia-docker run -e CONFIG=deeppavlov_config \
    -p host_port:5000 \
    -v dp_components_volume:/root/.deeppavlov \
    nvcr.io/partners/deeppavlov:latest

Where:

  1. deeppavlov_config - the config file name (without extension) of the model you want to run. A list of DeepPavlov models with descriptions is available in the DP features docs, or you can browse the DP GitHub here.

  2. host_port - the host port on which you want to serve the DeepPavlov model.

  3. dp_components_volume - a directory on the host where DeepPavlov's downloaded components are mounted. Most DeepPavlov models use downloadable components (pretrained model pickles, embeddings, ...) that are fetched from DeepPavlov servers. To avoid downloading components (some of them are quite heavy) each time you run the Docker image for a specific DeepPavlov config, you can mount this volume. If you do, DeepPavlov stores the components downloaded during the first launch of any config in this volume, so subsequent launches won't re-download them. We recommend using one dp_components_volume for all models, because some of them share components; DeepPavlov automatically manages the downloaded components for all configs in this volume.

After the model initializes, open http://127.0.0.1:host_port in your browser to get the Swagger UI with the model API and endpoint reference.

Example:

  1. Run a Docker container with the NER Ontonotes model on your host with GPU acceleration:
nvidia-docker run -e CONFIG=ner_ontonotes \
    -p 5555:5000 \
    -v ~/my_dp_components:/root/.deeppavlov \
    nvcr.io/partners/deeppavlov:latest
  2. Open http://127.0.0.1:5555 in your browser to get the Swagger UI with the model API info.

  3. Downloadable components are located in ~/my_dp_components (the contents of this directory are managed by DeepPavlov).
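Once the container is up, you can also query the model programmatically instead of through Swagger. The sketch below is a minimal Python example, assuming the riseapi endpoint is POST /model and that the config's input argument is named x (as for ner_ontonotes); check the Swagger page for your model's actual argument names, since they vary per config.

```python
import json
from urllib.request import Request, urlopen

def build_payload(texts):
    # riseapi expects a JSON object mapping the config's input argument
    # name to a batch of inputs; "x" is assumed here (verify via Swagger).
    return json.dumps({"x": texts})

def query_model(texts, host="127.0.0.1", port=5555):
    # POST the batch to the (assumed) /model endpoint and decode the reply.
    req = Request(
        f"http://{host}:{port}/model",
        data=build_payload(texts).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urlopen(req) as resp:
        return json.loads(resp.read())

# Requires the container from step 1 to be running, e.g.:
# query_model(["Bob Ross lived in New York"])
```

The helper names above are illustrative; only the URL shape and JSON payload come from the riseapi convention.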

License

By pulling and using this container, you accept the terms and conditions of the Apache 2.0 license.