mirascope.core.anthropic.call_response_chunk

This module contains the AnthropicCallResponseChunk class.

Usage Documentation: Streams

AnthropicCallResponseChunk

Bases: BaseCallResponseChunk[MessageStreamEvent, FinishReason]

A convenience wrapper around Anthropic's streamed MessageStreamEvent chunks.

When calling the Anthropic API using a function decorated with anthropic_call and stream set to True, the stream will contain AnthropicCallResponseChunk instances with properties that allow for more convenient access to commonly used attributes.

Example:

from mirascope.core import prompt_template
from mirascope.core.anthropic import anthropic_call


@anthropic_call("claude-3-5-sonnet-20240620", stream=True)
@prompt_template("Recommend a {genre} book")
def recommend_book(genre: str):
    ...


stream = recommend_book("fantasy")  # returns an `AnthropicStream`
for chunk, _ in stream:
    print(chunk.content, end="", flush=True)

content property

content: str

Returns the string content of the 0th message.

finish_reasons property

finish_reasons: list[FinishReason] | None

Returns the finish reasons of the response.
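
As a minimal sketch reusing recommend_book from the example above (and assuming that only the chunks wrapping Anthropic's final stream events populate this property, so it is None on most chunks), you might check it on each iteration:

stream = recommend_book("fantasy")
for chunk, _ in stream:
    print(chunk.content, end="", flush=True)
    if chunk.finish_reasons:  # usually None until the closing chunks
        print(f"\nfinish_reasons: {chunk.finish_reasons}")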

model property

model: str | None

Returns the name of the response model.

id property

id: str | None

Returns the id of the response.

usage property

usage: Usage | MessageDeltaUsage | None

Returns the usage of the message.

input_tokens property

input_tokens: int | None

Returns the number of input tokens.

output_tokens property

output_tokens: int | None

Returns the number of output tokens.
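
As a minimal sketch reusing recommend_book from the example above (and assuming token counts only appear on the chunks whose underlying events report usage, so they are None elsewhere), the per-stream totals can be captured like this:

stream = recommend_book("fantasy")
input_tokens, output_tokens = 0, 0
for chunk, _ in stream:
    # Most chunks report None; keep the latest non-None counts.
    if chunk.input_tokens is not None:
        input_tokens = chunk.input_tokens
    if chunk.output_tokens is not None:
        output_tokens = chunk.output_tokens
print(f"\ninput tokens: {input_tokens}, output tokens: {output_tokens}")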