mirascope.core.base.prompt¶

The `BasePrompt` class for better prompt engineering.

BasePrompt¶

Bases: `BaseModel`

The base class for engineering prompts.

This class is implemented as the base for all prompting needs. It is intended to work across various providers by providing a common prompt interface.
Example:

```python
from mirascope.core import BasePrompt, metadata, prompt_template

@prompt_template("Recommend a {genre} book")
@metadata({"tags": {"version:0001", "books"}})
class BookRecommendationPrompt(BasePrompt):
    genre: str

prompt = BookRecommendationPrompt(genre="fantasy")

print(prompt)
# > Recommend a fantasy book

print(prompt.message_params())
# > [BaseMessageParam(role="user", content="Recommend a fantasy book")]

print(prompt.dump()["metadata"])
# > {"tags": {"version:0001", "books"}}
```
message_params¶

```python
message_params() -> list[BaseMessageParam]
```

Returns the list of parsed message parameters.
Source code in mirascope/core/base/prompt.py
dynamic_config¶

```python
dynamic_config() -> BaseDynamicConfig
```
dump¶

Dumps the contents of the prompt into a dictionary.
run¶

```python
run(
    call_decorator: (
        Callable[
            [Callable[..., BaseDynamicConfig]],
            Callable[..., _BaseCallResponseT],
        ]
        | Callable[
            [Callable[..., BaseDynamicConfig]],
            Callable[..., _BaseStreamT],
        ]
        | Callable[
            [Callable[..., BaseDynamicConfig]],
            Callable[..., _ResponseModelT],
        ]
        | Callable[
            [Callable[..., BaseDynamicConfig]],
            Callable[..., Iterable[_ResponseModelT]],
        ]
    ),
    *additional_decorators: Callable[[_T], _T]
) -> (
    _BaseCallResponseT
    | _BaseStreamT
    | _ResponseModelT
    | Iterable[_ResponseModelT]
)
```
Returns the response of calling the API of the provided decorator.

Example:

```python
from mirascope.core import BasePrompt, openai, prompt_template

@prompt_template("Recommend a {genre} book")
class BookRecommendationPrompt(BasePrompt):
    genre: str

prompt = BookRecommendationPrompt(genre="fantasy")
response = prompt.run(openai.call("gpt-4o-mini"))
print(response.content)
```

Source code in mirascope/core/base/prompt.py
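The `*additional_decorators` parameter lets you layer further decorators on top of the call decorator. A sketch with a hypothetical `with_timing` decorator (not part of mirascope, introduced here only for illustration):

```python
import time
from functools import wraps

def with_timing(fn):
    """Hypothetical decorator that reports how long a call took."""
    @wraps(fn)
    def inner(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        print(f"{fn.__name__} took {time.perf_counter() - start:.2f}s")
        return result
    return inner

# Passed after the call decorator, it would wrap the generated call:
# response = prompt.run(openai.call("gpt-4o-mini"), with_timing)
```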
run_async¶

```python
run_async(
    call_decorator: (
        Callable[
            [Callable[..., Awaitable[BaseDynamicConfig]]],
            Callable[..., Awaitable[_BaseCallResponseT]],
        ]
        | Callable[
            [Callable[..., Awaitable[BaseDynamicConfig]]],
            Callable[..., Awaitable[_BaseStreamT]],
        ]
        | Callable[
            [Callable[..., Awaitable[BaseDynamicConfig]]],
            Callable[..., Awaitable[_ResponseModelT]],
        ]
        | Callable[
            [Callable[..., Awaitable[BaseDynamicConfig]]],
            Callable[..., Awaitable[AsyncIterable[_ResponseModelT]]],
        ]
    ),
    *additional_decorators: Callable[[_T], _T]
) -> (
    Awaitable[_BaseCallResponseT]
    | Awaitable[_BaseStreamT]
    | Awaitable[_ResponseModelT]
    | Awaitable[AsyncIterable[_ResponseModelT]]
)
```
Returns the response of calling the API of the provided decorator.

Example:

```python
import asyncio

from mirascope.core import BasePrompt, openai, prompt_template

@prompt_template("Recommend a {genre} book")
class BookRecommendationPrompt(BasePrompt):
    genre: str

async def run():
    prompt = BookRecommendationPrompt(genre="fantasy")
    response = await prompt.run_async(openai.call("gpt-4o-mini"))
    print(response.content)

asyncio.run(run())
```

Source code in mirascope/core/base/prompt.py
prompt_template¶

```python
prompt_template(template: str) -> PromptDecorator
```

A decorator for setting the `prompt_template` of a `BasePrompt` or call.
Example:

```python
from mirascope.core import openai, prompt_template

@openai.call("gpt-4o-mini")
@prompt_template("Recommend a {genre} book")
def recommend_book(genre: str):
    ...

response = recommend_book("fantasy")
print(response.prompt_template)
print(response.fn_args)
```
Returns:

Name | Type | Description
---|---|---
`decorator` | `Callable` | The decorator function that updates the prompt template of the decorated prompt or call.
Source code in mirascope/core/base/prompt.py
metadata¶

```python
metadata(metadata: Metadata) -> MetadataDecorator
```

A decorator for adding metadata to a `BasePrompt` or call.

Adding this decorator to a `BasePrompt` or call updates the `metadata` annotation to the given value. This is useful for adding metadata to a `BasePrompt` or call that can be used for logging or filtering.
Example:

```python
from mirascope.core import metadata, openai, prompt_template

@openai.call("gpt-4o-mini")
@prompt_template("Recommend a {genre} book")
@metadata({"tags": {"version:0001", "books"}})
def recommend_book(genre: str):
    ...

response = recommend_book("fantasy")
print(response.metadata)
Returns:

Name | Type | Description
---|---|---
`decorator` | `Callable` | The decorator function that updates the metadata of the decorated prompt or call.