Meta Llama API Pricing (Updated 2025)

This page tracks Meta Llama API pricing for all 16 models. Prices are shown per 1,000 tokens, with worked examples so you can estimate spend quickly.

Current Prices (per 1,000 tokens)

| Model | Input ($/1K tokens) | Output ($/1K tokens) | Context Length |
|---|---|---|---|
| llama-3-70b-instruct | $0.000300 | $0.000400 | 8,192 |
| llama-3-8b-instruct | $0.000030 | $0.000060 | 8,192 |
| llama-3.1-405b | $0.002000 | $0.002000 | 32,768 |
| llama-3.1-405b-instruct | $0.000800 | $0.000800 | 32,768 |
| llama-3.1-70b-instruct | $0.000100 | $0.000280 | 131,072 |
| llama-3.1-8b-instruct | $0.000015 | $0.000020 | 131,072 |
| llama-3.2-11b-vision-instruct | $0.000049 | $0.000049 | 131,072 |
| llama-3.2-1b-instruct | $0.000005 | $0.000010 | 131,072 |
| llama-3.2-3b-instruct | $0.000003 | $0.000006 | 20,000 |
| llama-3.2-90b-vision-instruct | $0.001200 | $0.001200 | 131,072 |
| llama-3.3-70b-instruct | $0.000038 | $0.000120 | 131,072 |
| llama-4-maverick | $0.000150 | $0.000600 | 1,048,576 |
| llama-4-scout | $0.000080 | $0.000300 | 1,048,576 |
| llama-guard-2-8b | $0.000200 | $0.000200 | 8,192 |
| llama-guard-3-8b | $0.000020 | $0.000060 | 131,072 |
| llama-guard-4-12b | $0.000180 | $0.000180 | 163,840 |

* Some models use tiered pricing based on prompt length. Displayed prices are for prompts ≤ 200k tokens.
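For quick lookups in code, the table above can be transcribed into a small dictionary. A partial Python sketch (eight of the sixteen models; the name `LLAMA_PRICES` is illustrative, not an official API):

```python
# USD per 1,000 tokens, as (input_price, output_price),
# transcribed from the pricing table above.
LLAMA_PRICES = {
    "llama-3.2-3b-instruct":  (0.000003, 0.000006),
    "llama-3.2-1b-instruct":  (0.000005, 0.000010),
    "llama-3.1-8b-instruct":  (0.000015, 0.000020),
    "llama-guard-3-8b":       (0.000020, 0.000060),
    "llama-3.3-70b-instruct": (0.000038, 0.000120),
    "llama-4-scout":          (0.000080, 0.000300),
    "llama-4-maverick":       (0.000150, 0.000600),
    "llama-3.1-405b":         (0.002000, 0.002000),
}

# Find the cheapest model by input price.
cheapest = min(LLAMA_PRICES, key=lambda m: LLAMA_PRICES[m][0])
print(cheapest)  # llama-3.2-3b-instruct
```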

What does "cost per token" mean?

Tokens are the basic units that AI models use to process text. As a rule of thumb, 1,000 tokens ≈ 750 words. The cost per token determines how much you pay for each unit of text the model processes.

Formula: Total Cost = (tokens / 1,000) × price per 1K tokens
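The formula is straightforward to apply in code. A minimal Python sketch (the function name is illustrative), using the llama-3.1-8b-instruct prices from the table above:

```python
def llama_cost(input_tokens, output_tokens, input_price, output_price):
    """Total cost in USD; prices are quoted per 1,000 tokens."""
    return (input_tokens / 1000) * input_price + (output_tokens / 1000) * output_price

# llama-3.1-8b-instruct: $0.000015 input, $0.000020 output per 1K tokens.
# An 800-token prompt with a 267-token response:
cost = llama_cost(800, 267, 0.000015, 0.000020)
print(f"${cost:.8f}")  # $0.00001734
```

Input and output tokens are priced separately, which is why the formula has two terms.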

Examples

Example usage: a 500-word prompt + 300-word response ≈ 1,067 tokens total.

  • 1,000 tokens ≈ 750 words of text
  • Short email: ~200-400 tokens
  • Blog post: ~1,000-3,000 tokens
  • Research paper: ~10,000+ tokens
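The word counts above can be turned into rough token estimates with the 1,000-tokens-per-750-words heuristic. A quick Python sketch (actual tokenization varies by model and text, so treat this as an approximation):

```python
def estimate_tokens(words):
    """Rough token estimate: ~1,000 tokens per 750 words."""
    return round(words * 1000 / 750)

# 500-word prompt + 300-word response:
print(estimate_tokens(500 + 300))  # 1067
```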

Compare with Other Providers

See how Meta Llama compares with other AI providers including OpenAI, Anthropic, Google, and more.


Frequently Asked Questions

Which Llama model is the cheapest?
Currently, llama-3.2-3b-instruct is the most affordable at $0.000003 per 1,000 input tokens.

How do I estimate my costs?
Count your input tokens (prompt) and expected output tokens (response), then apply the formula: (input_tokens / 1,000 × input price) + (output_tokens / 1,000 × output price). Our pricing calculator can help automate this.

How much does the Meta Llama API cost?
Meta Llama pricing varies by model. Input prices range from $0.000003 to $0.002000 per 1,000 tokens. See the table above for specific model pricing.