# Anthropic Claude

## Required Anthropic Claude Attributes

The following span attributes are required for calculating costs on traces originating from Anthropic Claude API calls.

### Always Required Fields

These fields are always required for Beakpoint to calculate Claude costs:

| Attribute Name | Example Value | Allowed Values |
| --- | --- | --- |
| `gen_ai.system` | `anthropic` | `anthropic` (must be this exact value) |
| `gen_ai.request.model` | `claude-sonnet-4` | Any valid Anthropic model name |
| `gen_ai.usage.input_tokens` | `512` | Non-negative integer |
| `gen_ai.usage.output_tokens` | `128` | Non-negative integer |
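As a minimal sketch, the required-field rules above can be expressed as a plain-Python check. The attribute names and constraints come from the table; the `validate_claude_span` helper itself is illustrative and is not Beakpoint's actual validation code:

```python
REQUIRED_ATTRIBUTES = (
    "gen_ai.system",
    "gen_ai.request.model",
    "gen_ai.usage.input_tokens",
    "gen_ai.usage.output_tokens",
)

def validate_claude_span(attributes: dict) -> list[str]:
    """Return a list of problems; an empty list means the span is billable."""
    problems = [f"missing {key}" for key in REQUIRED_ATTRIBUTES
                if key not in attributes]
    if not problems:
        # gen_ai.system must be this exact value for Claude cost calculation
        if attributes["gen_ai.system"] != "anthropic":
            problems.append("gen_ai.system must be exactly 'anthropic'")
        # token counts must be non-negative integers
        for key in ("gen_ai.usage.input_tokens", "gen_ai.usage.output_tokens"):
            value = attributes[key]
            if not isinstance(value, int) or value < 0:
                problems.append(f"{key} must be a non-negative integer")
    return problems
```

A span that carries all four attributes with valid values yields an empty problem list.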

### Optional Enrichment Attributes

These fields are optional but improve cost accuracy when provided:

| Attribute Name | Example Value | Description |
| --- | --- | --- |
| `gen_ai.response.model` | `claude-sonnet-4-20250514` | The exact model version returned in the response. When present, this takes precedence over `gen_ai.request.model` for pricing lookups. |
| `gen_ai.usage.input_tokens.cache_creation` | `256` | Tokens written into the prompt cache. Cache creation tokens are billed at a premium rate above standard input pricing. |
| `gen_ai.usage.input_tokens.cache_read` | `128` | Tokens read from the prompt cache. Cache read tokens are billed at a significantly reduced rate. |
> **Note:** Claude's prompt caching uses two distinct token counters, `cache_creation` and `cache_read`, rather than the single cached-token counter used by OpenAI. Provide both when available to ensure accurate cost attribution.
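Because each counter is billed at its own rate, input cost is a sum of three terms. The sketch below assumes the cache counters are reported separately from uncached input tokens; the function name and signature are hypothetical, not part of Beakpoint's API:

```python
def input_cost_usd(uncached: int, cache_creation: int, cache_read: int,
                   input_rate: float, creation_rate: float,
                   read_rate: float) -> float:
    """Bill each token counter at its own per-1M-token USD rate."""
    return (uncached * input_rate
            + cache_creation * creation_rate
            + cache_read * read_rate) / 1_000_000
```

For example, with claude-sonnet-4 rates ($3.00 input, $3.75 cache creation, $0.30 cache read per million tokens), 512 uncached + 256 cache-creation + 128 cache-read tokens cost about $0.00253.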

## Supported Models

Beakpoint calculates costs for the following Anthropic Claude models. Prices are per 1 million tokens (USD).

| Model | Input ($/M) | Cache Creation ($/M) | Cache Read ($/M) | Output ($/M) |
| --- | --- | --- | --- | --- |
| `claude-sonnet-4` | $3.00 | $3.75 | $0.30 | $15.00 |
| `claude-opus-4` | $15.00 | $18.75 | $1.50 | $75.00 |
| `claude-haiku-3-5` | $0.80 | $1.00 | $0.08 | $4.00 |
> **Note:** Prices reflect Anthropic list pricing and may change. Beakpoint keeps these rates up to date, but check the Anthropic pricing page for the latest figures.

## Python Example

The quickest way to emit the required attributes is with the `opentelemetry-instrumentation-anthropic` package, which automatically attaches GenAI semantic conventions to every Anthropic API call.

```shell
pip install opentelemetry-instrumentation-anthropic
```

```python
import anthropic
from opentelemetry.instrumentation.anthropic import AnthropicInstrumentor

# Instrument before creating the client
AnthropicInstrumentor().instrument()

client = anthropic.Anthropic()

message = client.messages.create(
    model="claude-haiku-3-5",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello!"}],
)
```

The instrumentation automatically sets `gen_ai.system`, `gen_ai.request.model`, `gen_ai.usage.input_tokens`, `gen_ai.usage.output_tokens`, and the optional cache token counters whenever they are available in the API response.

For full setup instructions, including how to configure the OpenTelemetry exporter for Beakpoint, see the Track LLM Costs guide.