I am using litellm to use Ollama. When I use an Ollama model with function calling through litellm, Ollama gives me the tool call data back as JSON in the message content.
This is my code:
from litellm import completion

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location"],
            },
        },
    }
]
response = completion(
    model="ollama/llama3.2",
    messages=[{"content": "Hello, how are you?", "role": "user"}],
    api_base="http://localhost:11434",
    stream=True,
    tools=tools,
)

for chunk in response:
    delta = chunk.choices[0].delta
    print(delta.content, end="")
And it gives me this result:
{"name": "get_current_weather", "arguments": {"unit": "celsius", "location": "San Francisco, CA"}}None
What is the problem? Is this a bug in litellm or in the Ollama model, or is it a problem in my code?