wandb¶
Integrations with Weights & Biases tooling (wandb, weave).
WandbCallMixin¶

Bases: _WandbBaseCall, Generic[BaseCallResponseT]

A mixin for integrating a call with Weights & Biases. Use this class's built-in call_with_trace method to log traces to W&B along with your calls to the LLM. These calls will include additional metadata such as the prompt template, template variables, and more.
Example:

```python
import os

from mirascope.openai import OpenAICall, OpenAICallResponse
from mirascope.wandb import WandbCallMixin
import wandb

wandb.login(key="YOUR_WANDB_API_KEY")
wandb.init(project="wandb_logged_chain")

os.environ["OPENAI_API_KEY"] = "YOUR_OPENAI_API_KEY"


class BookRecommender(OpenAICall, WandbCallMixin[OpenAICallResponse]):
    prompt_template = """
    SYSTEM:
    You are the world's greatest librarian.

    USER:
    Please recommend a {genre} book.
    """

    genre: str


recommender = BookRecommender(span_type="llm", genre="fantasy")
response, span = recommender.call_with_trace()
#         ^ this is a `Span` returned from the trace (or trace error).
```
Source code in mirascope/wandb/wandb.py
call_with_trace(parent=None, **kwargs)¶

Creates an LLM response and logs it via a W&B Trace.
Parameters:

Name | Type | Description | Default
---|---|---|---
`parent` | `Optional[Trace]` | The parent trace to connect to. | `None`
Returns:

Type | Description
---|---
`tuple[Optional[BaseCallResponseT], Trace]` | A tuple containing the completion and its trace (which has been connected to the parent).
Source code in mirascope/wandb/wandb.py
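The `parent` parameter lets you nest one traced call under another so related spans appear as a single chain in the W&B UI. A minimal sketch of this pattern, assuming a second, hypothetical `SummarizeBook` call class defined the same way as `BookRecommender` above, and that the returned `Trace` is logged with W&B's standard `Trace.log` method:

```python
# Hypothetical chaining sketch: pass the span from one traced call as the
# `parent` of the next, so both appear under one trace tree in W&B.
recommendation, rec_span = recommender.call_with_trace()

# `SummarizeBook` is an assumed second call class, not part of mirascope.
summarizer = SummarizeBook(span_type="llm", book=recommendation.content)
summary, summary_span = summarizer.call_with_trace(parent=rec_span)

# Log the root span once the chain is complete.
rec_span.log(name="book_chain")
```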
WandbExtractorMixin¶

Bases: _WandbBaseExtractor, Generic[T]

An extractor mixin for integrating with Weights & Biases. Use this class's built-in extract_with_trace method to log traces to W&B along with your calls to the LLM. These calls will include additional metadata such as the prompt template, template variables, and more.
Example:

```python
import os
from typing import Type

from mirascope.openai import OpenAIExtractor
from mirascope.wandb import WandbExtractorMixin
from pydantic import BaseModel
import wandb

wandb.login(key="YOUR_WANDB_API_KEY")
wandb.init(project="wandb_logged_chain")

os.environ["OPENAI_API_KEY"] = "YOUR_OPENAI_API_KEY"


class Book(BaseModel):
    title: str
    author: str


class BookRecommender(OpenAIExtractor[Book], WandbExtractorMixin[Book]):
    extract_schema: Type[Book] = Book
    prompt_template = """
    SYSTEM:
    You are the world's greatest librarian.

    USER:
    Please recommend a {genre} book.
    """

    genre: str


recommender = BookRecommender(span_type="tool", genre="fantasy")
book, span = recommender.extract_with_trace()
#     ^ this is a `Span` returned from the trace (or trace error).
```
Source code in mirascope/wandb/wandb.py
extract_with_trace(parent=None, retries=0, **kwargs)¶

Extracts extract_schema from the LLM call response and traces it.

The extract_schema is converted into a tool, complete with a description of the tool, all of the fields, and their types. This allows us to take advantage of tool/function calling functionality to extract information from a response according to the context provided by the BaseModel schema.
Parameters:

Name | Type | Description | Default
---|---|---|---
`parent` | `Optional[Trace]` | The parent trace to connect to. | `None`
`retries` | `int` | The maximum number of times to retry the query on validation error. | `0`
`**kwargs` | `Any` | Additional keyword arguments to pass to the call. These will override any existing arguments in | `{}`
Returns:

Type | Description
---|---
`tuple[Optional[T], Trace]` | The
Source code in mirascope/wandb/wandb.py
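Because extraction validates the LLM's tool-call output against the pydantic schema, transient validation failures can be retried via the `retries` parameter. A hedged sketch, reusing the `BookRecommender` extractor from the example above (the return type is `Optional[T]`, so the result is checked before use):

```python
# Retry extraction up to 3 times on pydantic validation errors; the traced
# span is still returned even if extraction ultimately fails.
book, span = recommender.extract_with_trace(retries=3)
if book is not None:
    print(f"{book.title} by {book.author}")
```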
with_weave(cls)¶

Wraps base classes to automatically use weave.

Supported base classes: BaseCall, BaseExtractor, BaseVectorStore, BaseChunker, BaseEmbedder
Example:

```python
import weave

from mirascope.openai import OpenAICall
from mirascope.wandb import with_weave

weave.init("my-project")


@with_weave
class BookRecommender(OpenAICall):
    prompt_template = "Please recommend some {genre} books"

    genre: str


recommender = BookRecommender(genre="fantasy")
response = recommender.call()  # this will automatically get logged with weave
print(response.content)
```