Linux / amd64
NVIDIA NIM, part of NVIDIA AI Enterprise, is a set of easy-to-use microservices designed to speed up generative AI deployment in enterprises. Supporting a wide range of AI models, including NVIDIA AI Foundation models and custom models, it ensures seamless, scalable AI inferencing, on premises or in the cloud, leveraging industry-standard APIs.
Hive’s Deepfake Image Detection model analyzes images and returns a confidence score indicating how likely the image is to contain a deepfake. The model was trained on millions of images from dozens of major deepfake generators, and it is updated frequently to account for new deepfake engines and adversarial techniques.
NVIDIA NIM offers prebuilt containers for multimodal safety models that can be used to safeguard AI applications — or any application that needs to understand and generate multimodal content. Each NIM consists of a container and a model and uses a CUDA-accelerated runtime for all NVIDIA GPUs, with special optimizations available for many configurations. Whether on-premises or in the cloud, NIM is the fastest way to achieve accelerated generative AI inference at scale.
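A prebuilt NIM container is typically launched with a single docker run command. The sketch below follows the common NIM deployment pattern; the exact registry path, image tag, port, and required environment variables are assumptions here, so consult the NIM for Multimodal Safety deployment guide for the authoritative values.

```shell
# Authenticate to NGC and start the NIM container (assumed image path/tag).
# <your-key> is a placeholder for your NGC API key.
export NGC_API_KEY=<your-key>

docker run --rm --gpus all \
  -e NGC_API_KEY \
  -p 8000:8000 \
  nvcr.io/nim/hive/deepfake-image-detection:latest
```

Once the container reports ready, the service exposes an HTTP inference endpoint on the mapped port.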
NVIDIA NIM abstracts away model inference internals such as execution engine and runtime operations.
Use cases:
- Consumer applications: Detect deepfake images often used by fraudsters and bad actors.
- Social media monitoring: Screen for scams posted on social media leveraging deepfake content of celebrities and public figures.
- Identity verification: Confirm that photo verification documents provided by users are legitimate and not deepfaked.
Deploying and integrating NVIDIA NIM is straightforward thanks to our industry-standard APIs. Visit the NIM for Multimodal Safety page for release documentation, deployment guides, and more.
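As a minimal integration sketch, a client can POST an image to the running NIM service over HTTP and read back the confidence score. The endpoint path (`/v1/infer`), request payload shape, and the `deepfake_score` response field are assumptions for illustration, not the documented API; check the NIM for Multimodal Safety documentation for the actual schema.

```python
import base64
import json
import urllib.request

# Assumed local NIM endpoint; the real path is in the NIM documentation.
NIM_URL = "http://localhost:8000/v1/infer"


def score_image(path: str) -> dict:
    """POST a base64-encoded image to the (assumed) NIM inference endpoint."""
    with open(path, "rb") as f:
        payload = json.dumps(
            {"input": [base64.b64encode(f.read()).decode("ascii")]}
        ).encode("utf-8")
    req = urllib.request.Request(
        NIM_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)


def is_likely_deepfake(result: dict, threshold: float = 0.5) -> bool:
    """Interpret an assumed response shape like {"deepfake_score": 0.97}."""
    return float(result.get("deepfake_score", 0.0)) >= threshold
```

The threshold is application-specific: identity verification flows may prefer a lower threshold (flag more images for review), while social media screening at scale may tolerate a higher one.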
Please review the Security Scanning tab to view the latest security scan results.
For certain open-source vulnerabilities listed in the scan results, NVIDIA provides a response in the form of a Vulnerability Exploitability eXchange (VEX) document. The VEX information can be reviewed and downloaded from the Security Scanning tab.
Get access to knowledge base articles and support cases or submit a ticket.
The NIM container is governed by the NVIDIA AI Product Agreement, and the use of this model is governed by the Hive Model Agreement.
You are responsible for ensuring that your use of NVIDIA AI Foundation Models complies with all applicable laws.