OpenAI Function Calling — Complete Tutorial

Sanjeev Sharma
7 min read

Introduction

Function calling enables AI models to return structured function calls instead of just text. This allows models to interact with external APIs, databases, and tools within your application logic. Function calling is the foundation for building AI agents, chatbots that perform actions, and intelligent automation. This guide covers everything from basic setup to production-ready implementations.

Understanding Function Calling

Rather than only generating text, a model with function calling can choose to call functions you define. The model decides whether and when to call them and with which arguments; your code executes the function and returns the result to the model.

This enables workflows like:

  • A customer service bot that looks up order information
  • A code assistant that runs code and returns output
  • A data analysis tool that queries databases

The key difference from a plain chat completion: with function calling, the model can request tool use with structured, machine-readable arguments, rather than just describing in prose what it would do.

Basic Function Calling Example

Here's a simple example:

from openai import OpenAI
import json

client = OpenAI()

# Define functions the model can call
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get weather for a location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "City name"
                    },
                    "unit": {
                        "type": "string",
                        "enum": ["celsius", "fahrenheit"],
                        "description": "Temperature unit"
                    }
                },
                "required": ["location"]
            }
        }
    }
]

# Simulate weather function
def get_weather(location, unit="fahrenheit"):
    weather_data = {
        "New York": "72°F",
        "London": "55°F",
        "Tokyo": "68°F"
    }
    temp = weather_data.get(location, "Unknown")
    if unit == "celsius" and temp != "Unknown":
        # Strip the "°F" suffix before converting (requires Python 3.9+)
        celsius = (int(temp.removesuffix("°F")) - 32) * 5/9
        return f"{celsius:.1f}°C"
    return temp

# Call ChatGPT with function calling enabled
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "user", "content": "What's the weather in New York?"}
    ],
    tools=tools,
    tool_choice="auto"  # Let model decide if/when to call
)

# Check if model wants to call a function
if response.choices[0].message.tool_calls:
    tool_call = response.choices[0].message.tool_calls[0]

    # Execute the function
    if tool_call.function.name == "get_weather":
        args = json.loads(tool_call.function.arguments)
        result = get_weather(**args)
    else:
        result = f"Unknown function: {tool_call.function.name}"

    # Send result back to model
    messages = [
        {"role": "user", "content": "What's the weather in New York?"},
        response.choices[0].message,
        {
            "role": "tool",
            "tool_call_id": tool_call.id,
            "content": result
        }
    ]

    # Get final response
    final_response = client.chat.completions.create(
        model="gpt-4o",
        messages=messages,
        tools=tools
    )

    print(final_response.choices[0].message.content)

Multi-Turn Function Calling

Real applications often need multiple function calls in a loop:

def run_agent(user_query, tools, function_implementations, max_turns=10):
    """
    Run an agent that can call multiple functions.
    Caps the loop at max_turns to guard against runaway
    tool-call cycles; returns None if the cap is reached.
    """
    client = OpenAI()
    messages = [{"role": "user", "content": user_query}]

    for _ in range(max_turns):
        # Call model
        response = client.chat.completions.create(
            model="gpt-4o",
            messages=messages,
            tools=tools,
            tool_choice="auto"
        )

        # Add assistant response to messages
        messages.append(response.choices[0].message)

        # Check if we're done
        if not response.choices[0].message.tool_calls:
            # No more function calls, return final response
            return response.choices[0].message.content

        # Process each tool call
        for tool_call in response.choices[0].message.tool_calls:
            function_name = tool_call.function.name
            function_args = json.loads(tool_call.function.arguments)

            # Execute the function
            if function_name in function_implementations:
                result = function_implementations[function_name](**function_args)
            else:
                result = f"Function {function_name} not found"

            # Add result to messages
            messages.append({
                "role": "tool",
                "tool_call_id": tool_call.id,
                "content": str(result)
            })

# Example usage
tools = [
    {
        "type": "function",
        "function": {
            "name": "search_database",
            "description": "Search product database",
            "parameters": {
                "type": "object",
                "properties": {
                    "query": {
                        "type": "string",
                        "description": "Search query"
                    }
                },
                "required": ["query"]
            }
        }
    },
    {
        "type": "function",
        "function": {
            "name": "check_inventory",
            "description": "Check product inventory",
            "parameters": {
                "type": "object",
                "properties": {
                    "product_id": {
                        "type": "string",
                        "description": "Product ID"
                    }
                },
                "required": ["product_id"]
            }
        }
    }
]

def search_database(query):
    results = {"laptop": "prod_123", "mouse": "prod_456"}
    return results.get(query, "Not found")

def check_inventory(product_id):
    inventory = {"prod_123": 15, "prod_456": 42}
    return inventory.get(product_id, 0)

functions = {
    "search_database": search_database,
    "check_inventory": check_inventory
}

response = run_agent(
    "Is there a laptop in stock?",
    tools,
    functions
)
print(response)

Forcing Function Calls

Sometimes you want to force the model to call a specific function. The tool_choice parameter accepts "auto" (the default when tools are present), "required" (force at least one tool call), "none" (disable tool calls), or a specific function:

# Force model to call a specific function
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Get current temperature"}],
    tools=tools,
    tool_choice={
        "type": "function",
        "function": {"name": "get_weather"}
    }
)

# Force ANY function call (don't allow text-only responses)
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Get current temperature"}],
    tools=tools,
    tool_choice="required"
)

Parallel Function Calling

When the model returns multiple tool calls, you can execute them in parallel:

import concurrent.futures

def run_with_parallel_execution(user_query, tools, functions):
    """Execute multiple function calls in parallel"""
    client = OpenAI()
    messages = [{"role": "user", "content": user_query}]

    response = client.chat.completions.create(
        model="gpt-4o",
        messages=messages,
        tools=tools
    )

    if not response.choices[0].message.tool_calls:
        return response.choices[0].message.content

    # Execute all tool calls in parallel
    tool_results = []

    with concurrent.futures.ThreadPoolExecutor(max_workers=5) as executor:
        futures = {}

        for tool_call in response.choices[0].message.tool_calls:
            function_name = tool_call.function.name
            function_args = json.loads(tool_call.function.arguments)

            if function_name in functions:
                future = executor.submit(
                    functions[function_name],
                    **function_args
                )
                futures[tool_call.id] = future

        # Collect results
        for tool_call_id, future in futures.items():
            try:
                result = future.result()
                tool_results.append({
                    "tool_call_id": tool_call_id,
                    "content": str(result)
                })
            except Exception as e:
                tool_results.append({
                    "tool_call_id": tool_call_id,
                    "content": f"Error: {str(e)}"
                })

    # Add results to messages
    messages.append(response.choices[0].message)

    for result in tool_results:
        messages.append({
            "role": "tool",
            "tool_call_id": result["tool_call_id"],
            "content": result["content"]
        })

    # Get final response
    final = client.chat.completions.create(
        model="gpt-4o",
        messages=messages,
        tools=tools
    )

    return final.choices[0].message.content

Error Handling

Production function calling requires robust error handling:

def safe_function_call(function_name, args, functions, timeout=5):
    """Safely call a function with error handling"""
    try:
        if function_name not in functions:
            return f"Error: Function {function_name} not found"

        # Set a timeout to prevent hanging
        # Note: signal.SIGALRM is Unix-only and works only in the main thread
        import signal

        def timeout_handler(signum, frame):
            raise TimeoutError(f"Function call timed out after {timeout}s")

        signal.signal(signal.SIGALRM, timeout_handler)
        signal.alarm(timeout)

        result = functions[function_name](**args)

        signal.alarm(0)  # Disable alarm
        return result

    except TimeoutError:
        return f"Error: Function call timed out after {timeout}s"
    except TypeError as e:
        return f"Error: Invalid arguments - {str(e)}"
    except Exception as e:
        return f"Error: {str(e)}"
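
The signal-based timeout above works only on Unix and only in the main thread. A portable alternative runs the function in a worker thread instead. This is a minimal sketch, with one trade-off to note: a timed-out function cannot be killed and keeps running in the background until it finishes on its own.

```python
import concurrent.futures

def call_with_timeout(func, args, timeout=5):
    """Run func(**args) in a worker thread; give up after `timeout` seconds.

    Works on any platform and from any thread, unlike signal.SIGALRM.
    The worker thread is not killed on timeout; it finishes in the background.
    """
    executor = concurrent.futures.ThreadPoolExecutor(max_workers=1)
    try:
        future = executor.submit(func, **args)
        return future.result(timeout=timeout)
    except concurrent.futures.TimeoutError:
        return f"Error: Function call timed out after {timeout}s"
    except Exception as e:
        return f"Error: {e}"
    finally:
        # Don't block waiting for a stuck worker
        executor.shutdown(wait=False)
```

Because the worker can outlive the timeout, keep tool functions idempotent or side-effect-aware when using this pattern.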

Complex Parameter Definitions

For sophisticated functions, define parameters precisely:

tools = [
    {
        "type": "function",
        "function": {
            "name": "analyze_data",
            "description": "Analyze data with various options",
            "parameters": {
                "type": "object",
                "properties": {
                    "data": {
                        "type": "array",
                        "items": {"type": "number"},
                        "description": "Array of numbers to analyze"
                    },
                    "metrics": {
                        "type": "array",
                        "items": {
                            "type": "string",
                            "enum": ["mean", "median", "std_dev", "min", "max"]
                        },
                        "description": "Which metrics to calculate"
                    },
                    "outlier_detection": {
                        "type": "boolean",
                        "description": "Enable outlier detection"
                    },
                    "threshold": {
                        "type": "number",
                        "description": "Outlier threshold (1-3 standard deviations)",
                        "minimum": 1,
                        "maximum": 3
                    }
                },
                "required": ["data", "metrics"]
            }
        }
    }
]
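
A matching implementation might look like the sketch below, using the standard library's statistics module. The outlier rule here, flagging values more than threshold standard deviations from the mean, is one plausible reading of the schema, not a prescribed algorithm.

```python
import statistics

def analyze_data(data, metrics, outlier_detection=False, threshold=2):
    """Compute the requested metrics; optionally flag outliers."""
    calculators = {
        "mean": statistics.mean,
        "median": statistics.median,
        "std_dev": statistics.stdev,
        "min": min,
        "max": max,
    }
    result = {m: calculators[m](data) for m in metrics if m in calculators}

    if outlier_detection:
        mean = statistics.mean(data)
        std = statistics.stdev(data)
        # Flag values more than `threshold` standard deviations from the mean
        result["outliers"] = [x for x in data if abs(x - mean) > threshold * std]

    return result
```

Returning a dict works well here: the model receives labeled values it can reference by name in its final answer.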

Practical Use Cases

Order Management Bot: Look up orders, check status, initiate returns using function calls to your database.

Code Execution: Let the model write code, execute it in a sandbox, and return results.

Analytics Dashboard: Query different data sources, aggregate results, and present insights.

Booking System: Check availability, make reservations, process payments through function calls.
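
As a concrete illustration, the order-management bot might expose a tool definition like this. The function name, fields, and ID format are hypothetical, not a real API.

```python
# Hypothetical tool definition for an order-management bot
order_tools = [
    {
        "type": "function",
        "function": {
            "name": "get_order_status",
            "description": "Look up the status of a customer order by order ID",
            "parameters": {
                "type": "object",
                "properties": {
                    "order_id": {
                        "type": "string",
                        "description": "The order ID, e.g. ORD-1042"
                    },
                    "include_items": {
                        "type": "boolean",
                        "description": "Whether to include line items in the result"
                    }
                },
                "required": ["order_id"]
            }
        }
    }
]
```

Pair this with an implementation that queries your order database and returns a short, structured summary string.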

Best Practices

  1. Name functions clearly: Use descriptive names like get_customer_orders, not func1
  2. Document parameters: Description fields help the model use functions correctly
  3. Validate input: Check function arguments before execution
  4. Handle errors gracefully: Return meaningful error messages the model can act on
  5. Set timeouts: Prevent hanging on long-running functions
  6. Log calls: Track which functions are called for debugging and optimization
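
Practice 3 can start as a simple check of parsed arguments against your own parameter schema before dispatching. This is a minimal sketch; for production, a library such as jsonschema or Pydantic is more robust.

```python
def validate_args(args, schema):
    """Check parsed arguments against a JSON-schema-style parameter spec.

    Returns a list of error strings; an empty list means the args are valid.
    """
    type_map = {"string": str, "number": (int, float), "integer": int,
                "boolean": bool, "array": list, "object": dict}
    errors = []

    # Required keys must be present
    for key in schema.get("required", []):
        if key not in args:
            errors.append(f"Missing required argument: {key}")

    # Each supplied key must be declared, correctly typed, and in any enum
    for key, value in args.items():
        spec = schema.get("properties", {}).get(key)
        if spec is None:
            errors.append(f"Unexpected argument: {key}")
            continue
        expected = type_map.get(spec.get("type"))
        if expected and not isinstance(value, expected):
            errors.append(f"Argument {key} should be {spec['type']}")
        if "enum" in spec and value not in spec["enum"]:
            errors.append(f"Argument {key} must be one of {spec['enum']}")

    return errors
```

Returning the error list to the model (instead of raising) lets it correct its own arguments and retry the call.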

Conclusion

Function calling is the bridge between language models and real-world applications. It enables building AI systems that don't just talk about actions but actually perform them. Master function calling and you can build sophisticated agents that solve real business problems.

FAQ

Q: When should I use function calling vs Assistants API? A: Use function calling for custom applications where you want fine control. Use Assistants API for simpler chatbots where managed state is acceptable.

Q: Can I call functions recursively? A: Yes, but be careful about infinite loops. Implement recursion depth limits and timeouts.

Q: How many functions can I define? A: The API currently allows up to 128 tools per request, but the model can get confused about which to use long before that. Keep to 10-20 focused functions.

Written by

Sanjeev Sharma

Full Stack Engineer · E-mopro