Phi-3-mini-128k Instruct Int4 RTX

Publisher: Microsoft
Latest Version: 1.0
Modified: August 9, 2024
Size: 2.4 GB

Description:

The Phi-3-Mini-128K-Instruct is a 3.8 billion-parameter, lightweight, state-of-the-art open model trained using the Phi-3 datasets. This dataset includes both synthetic data and filtered publicly available website data, with an emphasis on high-quality and reasoning-dense properties. The model belongs to the Phi-3 family; the Mini version comes in two variants, 4K and 128K, which denote the context length (in tokens) each can support. The model underwent a rigorous enhancement process, incorporating both supervised fine-tuning and direct preference optimization, to ensure precise instruction adherence and robust safety measures. Phi-3 Mini has 3.8B parameters and is a dense decoder-only Transformer model.
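
As an instruction-tuned model, Phi-3-Mini expects prompts in the Phi-3 chat format with <|system|>, <|user|>, <|assistant|>, and <|end|> markers. The Python sketch below illustrates that format for orientation only; the exact special tokens should be confirmed against the Phi-3-mini-128k-instruct model card linked under Reference(s), and the example prompt text is hypothetical.

```python
# Minimal sketch of the Phi-3 instruct chat format (markers assumed from the
# upstream Phi-3-mini-128k-instruct model card; verify before relying on them).
from typing import Optional


def build_phi3_prompt(user_message: str, system_message: Optional[str] = None) -> str:
    """Wrap a user message in the Phi-3 chat template for single-turn generation."""
    parts = []
    if system_message:
        parts.append(f"<|system|>\n{system_message}<|end|>")
    parts.append(f"<|user|>\n{user_message}<|end|>")
    parts.append("<|assistant|>")  # the model's reply is generated after this marker
    return "\n".join(parts)


if __name__ == "__main__":
    # Hypothetical example prompt.
    print(build_phi3_prompt("Summarize the benefits of INT4 quantization in two sentences."))
```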

The model is licensed under the MIT license.

Terms of use:

By accessing this model, you are agreeing to the Phi-3 Terms and Conditions of the MIT License.

Reference(s):

  • Phi-3-mini-128k-instruct - Model Card
  • Phi-3 - Microsoft Blog

Model Architecture:

Architecture Type: Transformer

Input:

Input Format: Text

Input Parameters: None

Output:

Output Format: Text

Output Parameters: None

Software Integration:

Supported Hardware Platform(s): NVIDIA Ada-generation GPUs (e.g., RTX 4090)

Supported Operating System(s): Windows

Inference:

  • TRT-LLM Inference Engine
  • Windows Setup with TRT-LLM
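
As a rough orientation, the sketch below shows how the prebuilt INT4 engine might be driven from TensorRT-LLM's Python API on Windows. It is a minimal sketch under stated assumptions: the LLM/SamplingParams high-level entry points, the engine path, the tokenizer handling, and the exact parameter names vary across TensorRT-LLM releases, so the Windows Setup with TRT-LLM guide listed above remains the authoritative reference.

```python
# Minimal sketch, assuming the tensorrt_llm Python package is installed per the
# Windows setup guide and that its high-level LLM API can load the prebuilt
# INT4 engine shipped with this model. Paths and parameter names are assumptions.
from tensorrt_llm import LLM, SamplingParams

ENGINE_DIR = r"C:\models\phi-3-mini-128k-instruct-int4"  # hypothetical install location

# Exact constructor arguments (engine vs. checkpoint directory, tokenizer handling)
# depend on the TensorRT-LLM release; consult the setup guide for your version.
llm = LLM(model=ENGINE_DIR)

# Prompt uses the Phi-3 chat format sketched earlier in this card.
prompt = "<|user|>\nWhat does INT4 quantization change about this model?<|end|>\n<|assistant|>"
sampling = SamplingParams(max_tokens=128, temperature=0.2)  # names may differ by release

for output in llm.generate([prompt], sampling):
    print(output.outputs[0].text)
```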

Test Hardware:

RTX 4090