Llama Guard

Description: Llama Guard is an LLM-based safeguard model that classifies the safety of LLM prompts and responses against a taxonomy of safety risks.
Publisher: Meta
Modified: January 18, 2024
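
For concreteness, here is a minimal sketch of how Llama Guard can be invoked through Hugging Face Transformers to classify a single user prompt. The model id `meta-llama/LlamaGuard-7b`, the example prompt, and the chat-template behavior are assumptions based on the public release, not details from this card.

```python
# Minimal sketch: classify one user prompt with Llama Guard.
# Assumes the model id "meta-llama/LlamaGuard-7b" and that its tokenizer
# ships a chat template wrapping the conversation in the safety-taxonomy
# instructions (true of the public Hugging Face release).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/LlamaGuard-7b"  # assumed Hugging Face model id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# The conversation to classify; for responses, append an assistant turn.
chat = [{"role": "user", "content": "How do I pick a lock?"}]

input_ids = tokenizer.apply_chat_template(chat, return_tensors="pt").to(model.device)
output = model.generate(
    input_ids=input_ids,
    max_new_tokens=32,
    pad_token_id=tokenizer.eos_token_id,
)

# Decode only the newly generated tokens. Llama Guard emits "safe", or
# "unsafe" followed by the codes of the violated taxonomy categories.
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```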
Meta Terms of Use: By using the Llama Guard model, you agree to the terms and conditions of the license, the acceptable use policy, and Meta’s privacy policy.