Learn how to use function calling capabilities with Large Language Models
Tool calling (or function calling) allows your LLM to interact with external tools and APIs. This enables the model to request specific information or perform actions that are outside its training data or capabilities.
```typescript
import { OllamaClient } from 'tekimax-sdk';

const client = new OllamaClient();

async function basicToolCalling() {
  // Define a simple calculator tool
  const calculatorTool = {
    type: "function",
    name: "calculate",
    description: "Perform a mathematical calculation",
    parameters: {
      type: "object",
      required: ["expression"],
      properties: {
        expression: {
          type: "string",
          description: "The mathematical expression to evaluate, e.g. '2 + 2'"
        }
      }
    }
  };

  // Call the model with the calculator tool
  const response = await client.tools.callWithTools({
    model: 'llama3', // Make sure your model supports function calling
    prompt: 'What is the square root of 144 plus 16?',
    tools: [calculatorTool]
  });

  console.log('Initial response:', response.message.content);

  // Check if the model wants to use the calculator
  if (response.message.tool_calls && response.message.tool_calls.length > 0) {
    const toolCall = response.message.tool_calls[0];
    console.log(`Model wants to call: ${toolCall.name}`);
    console.log('With input:', toolCall.input);

    // Execute the calculator tool
    let result;
    if (toolCall.name === 'calculate') {
      try {
        // SECURITY WARNING: In a real app, use a safer evaluation method!
        // This is just for demonstration purposes.
        const expression = toolCall.input.expression;
        result = eval(expression); // DO NOT use eval() in production!
      } catch (error) {
        result = `Error: ${error.message}`;
      }
    }

    // Return the result to the model
    const finalResponse = await client.tools.executeToolCalls(
      {
        model: 'llama3',
        prompt: 'What is the square root of 144 plus 16?',
        tools: [calculatorTool]
      },
      [toolCall], // The original tool calls
      [{
        tool_call_id: toolCall.id,
        role: "tool",
        name: toolCall.name,
        content: JSON.stringify(result)
      }]
    );

    console.log('Final response:', finalResponse.message.content);
  }
}

basicToolCalling().catch(console.error);
```
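The example above warns against `eval()`, which will happily execute arbitrary code the model sends back. One safer pattern for a calculator tool is to whitelist the characters the expression may contain before evaluating it. The sketch below is one possible replacement (the `evaluateExpression` helper is illustrative, not part of the SDK) and only covers basic arithmetic; expressions using names like `Math.sqrt` are rejected by design:

```typescript
// Hypothetical safer replacement for the eval() call in the calculator tool.
// Only digits, whitespace, decimal points, parentheses and the operators
// + - * / % are allowed, so payloads like "process.exit()" are rejected
// before anything is evaluated.
function evaluateExpression(expression: string): number {
  if (!/^[\d\s.+\-*/()%]+$/.test(expression)) {
    throw new Error(`Unsafe expression: ${expression}`);
  }
  // With the character set restricted to arithmetic, Function() cannot
  // reference variables, call functions, or touch globals.
  const value = Function(`"use strict"; return (${expression});`)() as number;
  if (typeof value !== "number" || Number.isNaN(value)) {
    throw new Error(`Expression did not produce a number: ${expression}`);
  }
  return value;
}
```

For anything beyond simple arithmetic, a dedicated expression-parsing library is a better fit than widening the whitelist.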
The SDK also provides CLI functionality for tool calling:
```shell
# Define tools in a JSON file (tools.json)
# Then use the CLI to call with tools
ollama-sdk tools -m llama3 -p "What's the weather in Paris?" --tools-file ./tools.json
```
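The `--tools-file` flag expects tool definitions in the same schema used in the code example above. A hypothetical `tools.json` for the weather prompt might look like the following (the `get_weather` name and its parameters are illustrative; you would still need to execute the tool and return its result yourself):

```json
[
  {
    "type": "function",
    "name": "get_weather",
    "description": "Get the current weather for a city",
    "parameters": {
      "type": "object",
      "required": ["city"],
      "properties": {
        "city": {
          "type": "string",
          "description": "The city to get the weather for, e.g. 'Paris'"
        }
      }
    }
  }
]
```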
Tool calling is a powerful feature that extends what LLMs can do. With Ollama SDK, you can easily implement this functionality in your applications, allowing your AI to access external data, perform calculations, or take actions in the real world.

For more examples, check out the tools example in the examples section.