Introduction to OpenAI ChatGPT Function Calling With Examples

OpenAI recently introduced an exciting new feature named “Function Calling”. As the name suggests, Function Calling is a way for ChatGPT to call program functions. A function can wrap any kind of code: an HTTP API call, a database query, or plain old local logic.
Function Calling Example
Prompt: "What is the weather like today in Chicago?"
Function Definition: computeWeather(String locationName)
Normally, GPT can't report the weather in Chicago because it can’t access real-time information. By passing the function definition “computeWeather” to ChatGPT (along with a bit more information), GPT is able to understand that it can fulfill the original prompt via the computeWeather function. GPT is also able to understand that the computeWeather function should be passed a parameter named locationName with the value “Chicago”.
GPT isn’t able to actually invoke the computeWeather function. It’s up to your client code to invoke the function and report the results back to GPT; GPT can then answer the original prompt.
Here’s a diagram showing the order of how all of this works at a high level:

In Step 1, the client sends a prompt (“What is the weather like today in Chicago?”) as well as a list of function definitions (only one function in this example). GPT is aware that it cannot compute today's weather on its own, so it examines whether any of the functions the client supplied can. GPT determines that the computeWeather function can fulfill the prompt and that the function should be supplied with a “locationName” argument. GPT looks at the available information and works out that the “locationName” argument should have the value “Chicago”.
In Step 2, the client is responsible for actually invoking the computeWeather function. The client should pass the GPT-computed arguments {locationName: “Chicago”} to the function. The function result (today's weather in Chicago) should be stored and passed to GPT in Step 3.
In Step 3, the client passes the previous conversation (prompt + functions) as well as the function result (Chicago's weather) to GPT. GPT examines the entire conversation, determines it has enough information to fulfill the original prompt, and responds with a summary of the current weather in Chicago.
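Here's a minimal JavaScript sketch of that three-step flow. It assumes Node 18+ (for the built-in fetch) and uses the same /v1/chat/completions request shape shown in the curl examples later in this article; computeWeather and the functions array are assumed to match the definitions in the Try It Out section below.

// Minimal sketch of the three-step flow (assumes Node 18+ for built-in fetch).
const OPENAI_URL = 'https://api.openai.com/v1/chat/completions';

async function callGPT(messages, functions) {
  const res = await fetch(OPENAI_URL, {
    method: 'POST',
    headers: {
      'Authorization': 'Bearer ' + process.env.OPENAI_API_KEY,
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({ model: 'gpt-3.5-turbo-0613', messages, functions, temperature: 0 })
  });
  return (await res.json()).choices[0].message;
}

async function askWeather(prompt, functions) {
  const messages = [{ role: 'user', content: prompt }];

  // Step 1: send the prompt + function definitions.
  const message = await callGPT(messages, functions);
  if (!message.function_call) {
    return message.content; // GPT answered without needing a function
  }

  // Step 2: invoke the requested function with the GPT-computed arguments.
  const args = JSON.parse(message.function_call.arguments);
  const weather = computeWeather(args.locationName); // placeholder implementation (see Try It Step 2)

  // Step 3: send the function result back so GPT can answer the original prompt.
  messages.push(message);
  messages.push({ role: 'function', name: 'computeWeather', content: weather });
  return (await callGPT(messages, functions)).content;
}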
This is powerful. GPT is able to:
- Determine if it can fulfill the prompt via one of the provided functions.
- Determine what arguments should be supplied to invoke the function. GPT uses the previously supplied prompts, the outputs of previously invoked functions, and its underlying language model knowledge to work out what the argument values should be.
In this example, GPT can fulfill the prompt by invoking a single function. In more complex situations, GPT is able to take a large list of functions and determine which functions should be invoked, in the proper dependency order. Follow-up SaaSGlue blog articles will review more complex scenarios with multiple functions and dependencies.
Try It Out
To use GPT Function Calling, you must use the ChatGPT API. (Function Calling is not available in the popular ChatGPT web console found at https://chat.openai.com.)
To use the ChatGPT API, you must create an OpenAI API account at https://openai.com. Once you’ve created an API account, you’ll be able to generate an API key at https://platform.openai.com/account/api-keys. The API key will start with “sk-”. Copy that API key - you will need it to invoke the GPT API and test Function Calling.
In the example curl requests below, replace "Enter Your OpenAI API Key Here" with your OpenAI API key.
Try It Step 1.
Send prompt + functions.
Receive function_name + arguments.
The first request to the GPT API will contain the original prompt + the list of functions the client can invoke to fulfill the request.
curl --location 'https://api.openai.com/v1/chat/completions' \
  --header 'Authorization: Bearer Enter Your OpenAI API Key Here' \
  --header 'Content-Type: application/json' \
  --data '{
    "model": "gpt-3.5-turbo-0613",
    "messages": [{"role": "user", "content": "What is the weather today in Chicago?"}],
    "functions": [{
      "name": "computeWeather",
      "description": "Get the current weather in a given location",
      "parameters": {
        "type": "object",
        "properties": {
          "locationName": {
            "type": "string",
            "description": "The name of the location."
          }
        },
        "required": ["locationName"]
      }
    }],
    "temperature": 0
  }'
The GPT API response will look something like this (some details omitted):
"choices": [
{
"index": "0",
"message": {
"role": "assistant",
"content": null,
"function_call": {
"name": "computeWeather",
"arguments": "{"locationName": "Chicago"}"
}
},
"finish_reason": "function_call"
}
]
The message field contains a property named function_call, which contains two properties: name and arguments. Your client code is responsible for paying attention to the function_call.name property. The API request in Step 1 promised the existence of a function named computeWeather, and GPT is now requesting that the function be invoked.
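Note that function_call.arguments is a JSON-encoded string, not an object, so your client code needs to parse it before using the values. A quick sketch, given the parsed API response from Step 1 (gptResponse):

// function_call.arguments is a JSON string, not an object - parse it first.
const functionCall = gptResponse.choices[0].message.function_call;
const args = JSON.parse(functionCall.arguments); // e.g. { locationName: "Chicago" }
console.log(functionCall.name, args.locationName); // "computeWeather" "Chicago"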
Try It Step 2.
Invoke the Function(s)
Your client code is responsible for invoking the function(s) that GPT requested, using the arguments GPT supplied.
Example of client code invoking custom functions (some details omitted):
function handleGPTResponse(gptResponse) {
  const functionCall = gptResponse.choices[0].message.function_call;
  if (functionCall && functionCall.name === 'computeWeather') {
    // GPT returns the arguments as a JSON string - parse it before use.
    const args = JSON.parse(functionCall.arguments);
    const theWeather = computeWeather(args.locationName);
    // send theWeather back to GPT (see Step 3)
  }
}

function computeWeather(locationName) {
  if (locationName === 'Chicago') {
    return 'Cloudy with a chance of meatballs';
  }
  else {
    return 'Sunny';
  }
}
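With more than one function definition, a simple lookup table keyed by function name keeps the dispatch logic from growing into a chain of if statements. This is just one possible pattern; the availableFunctions map below is an assumption of this sketch, not part of the OpenAI API:

// Hypothetical lookup table mapping GPT function names to local implementations.
const availableFunctions = {
  computeWeather: (args) => computeWeather(args.locationName)
};

function invokeRequestedFunction(functionCall) {
  const fn = availableFunctions[functionCall.name];
  if (!fn) {
    throw new Error('Unknown function requested: ' + functionCall.name);
  }
  return fn(JSON.parse(functionCall.arguments));
}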
Try It Step 3.
Send function results to GPT.
Receive GPT answer.
The second request to GPT will contain all of the previous messages but also adds the function result. The function result is sent to GPT as a new message in the messages array with the following properties:
- "role" == "function"
- "name" == "computeWeather"
- "content" == "Cloudy with a chance of meatballs."
curl --location 'https://api.openai.com/v1/chat/completions' \
  --header 'Authorization: Bearer Enter Your OpenAI API Key Here' \
  --header 'Content-Type: application/json' \
  --data '{
    "model": "gpt-3.5-turbo-0613",
    "messages": [
      {"role": "user", "content": "What is the weather today in Chicago?"},
      {"role": "assistant", "content": null,
        "function_call": {
          "name": "computeWeather",
          "arguments": "{\n  \"locationName\": \"Chicago\"\n}"
        }
      },
      {"role": "function", "name": "computeWeather", "content": "Cloudy with a chance of meatballs."}
    ],
    "functions": [{
      "name": "computeWeather",
      "description": "Get the current weather in a given location",
      "parameters": {
        "type": "object",
        "properties": {
          "locationName": {
            "type": "string",
            "description": "The name of the location."
          }
        },
        "required": ["locationName"]
      }
    }],
    "temperature": 0
  }'
The GPT API response will look something like this (some details omitted):
{
  "index": 0,
  "message": {
    "role": "assistant",
    "content": "The weather today in Chicago is cloudy with a chance of meatballs."
  },
  "finish_reason": "stop"
}
The message field has a role of assistant, which means GPT is answering the original prompt. The answer is in the content field: "The weather today in Chicago is cloudy with a chance of meatballs." This is the weather returned by the computeWeather function.
Recap
OpenAI ChatGPT Function Calling is a powerful mechanism to leverage AI to dynamically invoke code.
This example is trivial, but there are much more complicated use cases. In one example, GPT is able to ingest a small database schema, be prompted with natural language requests, and respond with fairly complicated, valid SQL queries.
GPT is also able to accept many function definitions and determine which function (if any) can fulfill the original prompt. GPT can even request multiple function calls in the correct order and use the output of previously invoked functions as the input to subsequent function calls.
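One way to handle that is a loop that keeps satisfying GPT's function_call requests and appending the results to the conversation until GPT returns a normal assistant answer. The sketch below reuses the hypothetical callGPT and availableFunctions helpers from the earlier sketches:

// Keep satisfying function_call requests until GPT produces a final answer.
async function runConversation(messages, functions) {
  while (true) {
    const message = await callGPT(messages, functions);
    if (!message.function_call) {
      return message.content; // GPT answered the original prompt
    }
    const args = JSON.parse(message.function_call.arguments);
    const result = availableFunctions[message.function_call.name](args);
    messages.push(message);
    messages.push({
      role: 'function',
      name: message.function_call.name,
      content: String(result)
    });
  }
}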
Hallucination / Dangers
In the previous curl request examples, a temperature of 0 is used. Temperature controls how much randomness GPT applies when generating a response.
Sometimes it's useful to ask for randomness and creativity in GPT responses, but with Function Calling that randomness is usually undesirable.
It's important to verify GPT Function Calling behavior by having a human spot-check the results.