mistral.types
Types for working with Mistral prompts.
MistralCallParams
Bases: BaseCallParams[MistralTool]
The parameters to use when calling the Mistral API.
Source code in mirascope/mistral/types.py
MistralCallResponse
Bases: BaseCallResponse[ChatCompletionResponse, MistralTool]
Convenience wrapper for Mistral's chat model completions.
When using Mirascope's convenience wrappers to interact with Mistral models via MistralCall, calls made with MistralCall.call() will return a MistralCallResponse, whose properties allow for simpler syntax and a convenient developer experience.
Example:
from mirascope.mistral import MistralCall


class BookRecommender(MistralCall):
    prompt_template = "Please recommend a {genre} book"

    genre: str


response = BookRecommender(genre="fantasy").call()
print(response.content)
#> The Name of the Wind
print(response.message)
#> ChatMessage(content='The Name of the Wind', role='assistant',
#  function_call=None, tool_calls=None)
print(response.choices)
#> [Choice(finish_reason='stop', index=0, logprobs=None,
#  message=ChatMessage(content='The Name of the Wind', role='assistant',
#  function_call=None, tool_calls=None))]
Source code in mirascope/mistral/types.py
choice: ChatCompletionResponseChoice
property
Returns the 0th choice.
choices: list[ChatCompletionResponseChoice]
property
Returns the array of chat completion choices.
content: str
property
The content of the chat completion for the 0th choice.
input_tokens: int
property
Returns the number of input tokens.
message: ChatMessage
property
Returns the message of the chat completion for the 0th choice.
output_tokens: Optional[int]
property
Returns the number of output tokens.
tool: Optional[MistralTool]
property
Returns the 0th tool for the 0th choice message.
Raises:

| Type | Description |
|---|---|
| ValidationError | If the tool call doesn't match the tool's schema. |
tool_calls: Optional[list[ToolCall]]
property
Returns the tool calls for the 0th choice message.
tools: Optional[list[MistralTool]]
property
Returns the tools for the 0th choice message.
Raises:

| Type | Description |
|---|---|
| ValidationError | If a tool call doesn't match the tool's schema. |
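Mirascope tools are built on pydantic, so the ValidationError above is pydantic's. A minimal sketch of the failure mode, using a plain pydantic model as a stand-in for a MistralTool's argument schema (the model and fields here are illustrative assumptions, not part of Mirascope):

```python
from pydantic import BaseModel, ValidationError


# A plain pydantic model standing in for a tool's argument schema.
class BookArgs(BaseModel):
    title: str
    author: str


# Well-formed tool-call arguments validate successfully.
ok = BookArgs.model_validate(
    {"title": "The Name of the Wind", "author": "Patrick Rothfuss"}
)

# Malformed arguments (here, a missing required field) raise ValidationError,
# the same error `tool` and `tools` document.
try:
    BookArgs.model_validate({"title": "The Name of the Wind"})
except ValidationError:
    print("tool call did not match the tool's schema")
```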
usage: UsageInfo
property
Returns the usage of the chat completion.
dump()
Dumps the response to a dictionary.
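Because dump() returns a plain dictionary, the response is easy to log or serialize. A hedged sketch (the dictionary literal stands in for an actual dump() result; the keys shown are illustrative assumptions, not the real schema):

```python
import json

# Stand-in for response.dump(); real keys depend on Mirascope's BaseCallResponse.
response_dict = {
    "template": "Please recommend a {genre} book",
    "output": "The Name of the Wind",
}

# A plain dict serializes directly to JSON for logging or storage.
serialized = json.dumps(response_dict)
print(serialized)
```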
MistralCallResponseChunk
Bases: BaseCallResponseChunk[ChatCompletionStreamResponse, MistralTool]
Convenience wrapper around chat completion streaming chunks.
When using Mirascope's convenience wrappers to interact with Mistral models via MistralCall.stream(), responses are streamed as MistralCallResponseChunk instances, whose properties allow for simpler syntax and a convenient developer experience.
Example:
from mirascope.mistral import MistralCall


class Math(MistralCall):
    prompt_template = "What is 1 + 2?"


content = ""
for chunk in Math().stream():
    content += chunk.content
    print(content)
#> 1
#  1 +
#  1 + 2
#  1 + 2 equals
#  1 + 2 equals
#  1 + 2 equals 3
#  1 + 2 equals 3.
Source code in mirascope/mistral/types.py
choice: ChatCompletionResponseStreamChoice
property
Returns the 0th choice.
choices: list[ChatCompletionResponseStreamChoice]
property
Returns the array of chat completion choices.
content: str
property
Returns the content of the delta.
delta: DeltaMessage
property
Returns the delta of the 0th choice.
tool_calls: Optional[list[ToolCall]]
property
Returns the partial tool calls for the 0th choice message.
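Because tool calls arrive incrementally while streaming, their JSON argument text is typically concatenated across chunks before being parsed. A minimal sketch with string fragments standing in for per-chunk partial tool-call data (this is not Mistral's actual ToolCall type; real code would read the fragments from each chunk's tool_calls):

```python
import json

# Stand-in fragments for the argument text carried by successive chunks.
fragments = ['{"genre"', ': "fan', 'tasy"}']

# Concatenate the partial argument strings, then parse once the stream ends.
arguments = "".join(fragments)
args = json.loads(arguments)
print(args)
#> {'genre': 'fantasy'}
```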