Price Per Token
Meta-llama

Code Llama 70B Instruct API Pricing 2026

Compare pricing, benchmarks, and providers for Code Llama 70B Instruct. Find the best value for your use case.

Last updated: April 23, 2026 at 08:37 AM

Overview

Code Llama 70B Instruct pricing starts at $0.900 per million input tokens and $0.900 per million output tokens. The model supports a context window of up to 4K tokens. API access is available through Meta-llama.
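The per-token rates above translate directly into per-request costs. A minimal sketch of that arithmetic in Python; the rates are taken from this page, while the function name and the cached-input handling are our own illustration:

```python
# Published rates for Code Llama 70B Instruct, in USD per 1M tokens.
INPUT_PER_M = 0.900
OUTPUT_PER_M = 0.900
CACHED_INPUT_PER_M = 0.450  # cached input billed at half the input rate

def estimate_cost(input_tokens: int, output_tokens: int, cached_tokens: int = 0) -> float:
    """Estimate the USD cost of one request from its token counts."""
    billed_input = input_tokens - cached_tokens  # cached tokens use the lower rate
    return (billed_input * INPUT_PER_M
            + output_tokens * OUTPUT_PER_M
            + cached_tokens * CACHED_INPUT_PER_M) / 1_000_000

# Example: 3,000 input tokens and 1,000 output tokens, no cache hits.
print(f"${estimate_cost(3_000, 1_000):.6f}")  # $0.003600
```

With input and output priced identically, only the total token count matters here; the cached-input discount is where the split becomes relevant.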

Context Window: 4K tokens

Pricing (per 1M tokens):
Input: $0.900
Output: $0.900
Cached: $0.450

Speed (median latency):
Output: not available
TTFT: not available

Capabilities: Text only

Pricing Comparison

No similar models are currently listed for price comparison, so the table below shows Code Llama 70B Instruct's pricing on its own (1 model).

Current Pricing (per 1M tokens)

Provider: Meta-llama
Model: Code Llama 70B Instruct
Input: $0.900 per 1M tokens
Output: $0.900 per 1M tokens
Coding / MMLU / GPQA benchmarks: not reported
Context: 4,096 tokens

* Some models use tiered pricing based on prompt length. Displayed prices are for prompts ≤ 200k tokens.

Compare Providers

Code Llama 70B Instruct is available from multiple providers with different pricing and availability.

No multi-provider data is available for this model; it may only be offered by a single provider.


Cost vs. Quality

Compare Code Llama 70B Instruct's benchmark performance against all models.


Try Code Llama 70B Instruct

Use our Calculator

Estimate your costs based on expected token usage.

Open Cost Calculator

Try in Playground

Test Code Llama 70B Instruct directly in your browser.

Open Playground

Frequently Asked Questions

How much does Code Llama 70B Instruct cost?

Code Llama 70B Instruct costs $0.000900 per 1,000 input tokens and $0.000900 per 1,000 output tokens.

All Meta-llama Models

See pricing for all Meta-llama models.

View all Meta-llama models →