Generating content¶
Now that you have your prompt, you can combine it with a model call to generate content. Mirascope provides high-level wrappers around common providers so you can focus on prompt engineering instead of learning each provider's interface. These wrappers are not required to use our prompts; they simply provide convenience if you wish to use them.
Note
This doc uses OpenAI. See supported LLM providers for how to generate content with other model providers like Anthropic, Mistral, Cohere, and more.
OpenAICall¶
`OpenAICall` extends `BasePrompt` and `BaseCall` to support interacting with the OpenAI API.
Call¶
You can initialize an `OpenAICall` instance and call the `call` method to generate an `OpenAICallResponse`:
```python
import os

from mirascope import OpenAICall, OpenAICallParams

os.environ["OPENAI_API_KEY"] = "YOUR_OPENAI_API_KEY"


class RecipeRecommender(OpenAICall):
    prompt_template = "Recommend recipes that use {ingredient} as an ingredient"

    ingredient: str

    call_params = OpenAICallParams(model="gpt-3.5-turbo-0125")


response = RecipeRecommender(ingredient="apples").call()
print(response.content)  # prints the string content of the call
```
The `call_params` of the OpenAI client is tied to the call (and thereby the prompt). Refer to Engineering better prompts [Add link] for more information.
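The template fields above are filled from the instance's attributes, so each instance renders its own prompt. A minimal stdlib-only sketch of that templating idea (the `SimplePrompt` class here is hypothetical, not Mirascope's actual implementation):

```python
# Minimal sketch of prompt-template interpolation, assuming each field maps
# one-to-one onto a `{placeholder}` in the template (hypothetical class).
class SimplePrompt:
    prompt_template = "Recommend recipes that use {ingredient} as an ingredient"

    def __init__(self, **fields):
        self.fields = fields

    def render(self) -> str:
        # str.format fills each {placeholder} with the matching field value
        return self.prompt_template.format(**self.fields)


prompt = SimplePrompt(ingredient="apples")
print(prompt.render())
# Recommend recipes that use apples as an ingredient
```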
Async¶
If you want concurrency, you can use the `call_async` method instead.
```python
import asyncio
import os

from mirascope import OpenAICall, OpenAICallParams

os.environ["OPENAI_API_KEY"] = "YOUR_API_KEY"


class RecipeRecommender(OpenAICall):
    prompt_template = "Recommend recipes that use {ingredient} as an ingredient"

    ingredient: str

    call_params = OpenAICallParams(model="gpt-3.5-turbo-0125")


async def recommend_recipes():
    """Asynchronously calls the model using `OpenAICall` to generate a recipe."""
    return await RecipeRecommender(ingredient="apples").call_async()


print(asyncio.run(recommend_recipes()))
```
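The benefit of the async variant is that multiple calls can run concurrently rather than one after another. A stdlib-only sketch of the pattern with `asyncio.gather` (the `fake_call` coroutine is a stand-in for `call_async`, which would await the OpenAI API):

```python
import asyncio


async def fake_call(ingredient: str) -> str:
    # Stand-in for `call_async`; a real call would await the OpenAI API
    await asyncio.sleep(0.01)
    return f"recipes with {ingredient}"


async def main() -> list[str]:
    # gather schedules all coroutines concurrently and returns results in order
    return await asyncio.gather(
        fake_call("apples"), fake_call("pears"), fake_call("plums")
    )


print(asyncio.run(main()))
# ['recipes with apples', 'recipes with pears', 'recipes with plums']
```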
CallResponse¶
The `call` method returns an `OpenAICallResponse` class instance, which is a simple wrapper around the `ChatCompletion` class in `openai` that extends `BaseCallResponse`. In fact, you can access everything from the original response as desired. The primary purpose of the class is to provide convenience.
```python
from mirascope.openai.types import OpenAICallResponse

response = OpenAICallResponse(...)

response.content  # original.choices[0].message.content
response.tool_calls  # original.choices[0].message.tool_calls
response.message  # original.choices[0].message
response.choice  # original.choices[0]
response.choices  # original.choices
response.response  # ChatCompletion(...)
```
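These convenience properties simply walk the nested structure of the raw completion while keeping it fully accessible. A stdlib sketch of that wrapper pattern (the dataclasses below only mimic the real `openai` types; they are not the actual classes):

```python
from dataclasses import dataclass


# Hypothetical stand-ins for the nested structure of a ChatCompletion
@dataclass
class Message:
    content: str


@dataclass
class Choice:
    message: Message


@dataclass
class Completion:
    choices: list[Choice]


@dataclass
class ResponseWrapper:
    # Wraps the raw completion; the original stays reachable via `.response`
    response: Completion

    @property
    def choice(self) -> Choice:
        return self.response.choices[0]

    @property
    def message(self) -> Message:
        return self.choice.message

    @property
    def content(self) -> str:
        # Shortcut for response.choices[0].message.content
        return self.message.content


raw = Completion(choices=[Choice(message=Message(content="Try apple pie!"))])
wrapped = ResponseWrapper(response=raw)
print(wrapped.content)  # Try apple pie!
```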