Function Calling

Overview

Function Calling is the process by which a user describes available functions and a target task, and the large model decides whether and how to call one of those functions.

It's important to note that large models cannot execute functions themselves. Based on the user input and the function definitions, the model tells you: whether a call is needed, which function to call, and with what parameters. The client then executes the function itself and feeds the result back to the model for the next round of processing.
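The loop above can be sketched in a few lines. This is a conceptual illustration only: `model_call`, `registry`, and `fake_model` are our own stand-ins, not part of the MoArk SDK or any real client library.

```python
# Minimal sketch of the Function Calling loop; `model_call` and `registry`
# are illustrative stand-ins, not part of any real SDK.
import json

def run_tool_loop(model_call, messages, registry):
    reply = model_call(messages)                      # 1. model decides whether/what to call
    for call in reply.get("tool_calls", []):
        fn = registry[call["name"]]                   # 2. client looks up and executes the function
        result = fn(**json.loads(call["arguments"]))
        messages.append({"role": "tool", "name": call["name"], "content": result})
    return model_call(messages)                       # 3. feed the result back for the next round

# Fake model for demonstration: first requests a call, then answers
def fake_model(messages):
    if messages and messages[-1]["role"] == "tool":
        return {"content": f"Answer based on: {messages[-1]['content']}"}
    return {"tool_calls": [{"name": "add", "arguments": '{"a": 1, "b": 2}'}]}

final = run_tool_loop(fake_model, [{"role": "user", "content": "1+2?"}],
                      {"add": lambda a, b: str(a + b)})
print(final["content"])  # Answer based on: 3
```

In a real client, `model_call` would be a `client.chat.completions.create` request and `registry` a dict mapping tool names to your local functions, as Example 1 below demonstrates concretely.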

Certain frameworks like LangGraph and LlamaIndex can simplify this process. MoArk offers out-of-the-box large model function calling capabilities, which will be explained in detail below.

tip

Function Calling and tool call are closely related concepts; tool call is the newer form and has superseded Function Calling in the API.
The tool list needs to be passed in via the tools parameter.

Example 1: Enable AI to Get Today's Weather and Directly Parse Function Call Results

This example demonstrates the most straightforward method, designed to help understand the workflow and principles of Function Calling. For a simpler implementation using LangChain, refer to Example 2.

Step 1: Assemble the tools Parameter

First, assemble the tools parameter. The following code describes a function named get_current_weather to the large model, which accepts parameters city and date and is capable of retrieving weather conditions for a location using the city name and date:

python
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the weather conditions of a location via city name and date, including temperature, weather status, etc.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {
                        "type": "string",
                        "description": "The name of the city for which the user wants to query the weather",
                    },
                    "date": {
                        "type": "string",
                        "description": "The date to query, such as 'today', 'tomorrow', or a specific date like '2023-10-01'",
                    }
                },
                "required": ["city", "date"],
            },
        }
    }
]

tools is a list that can define multiple functions. Parameter explanations:

  • type: The type of the parameters object, usually object, indicating that the parameters form a JSON object containing multiple properties.
  • properties: The core part, listing the definition of each parameter in JSON Schema format.
    • Each key (here city and date) is a parameter name of the function.
    • type: The data type of the parameter, such as string, number, integer, etc.
    • description: Describes the purpose of the parameter to help the model understand how to populate its value.
  • required: Specifies which parameters are mandatory. If a parameter is in the required list, the model must populate it when generating a call.
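The model is not guaranteed to follow the schema perfectly, so a defensive client can re-check required itself before executing anything. A minimal sketch (the check_missing helper is our own illustration, not part of the MoArk SDK):

```python
import json

def check_missing(function_def: dict, arguments_json: str) -> list:
    """Return the required parameters that the model failed to populate."""
    args = json.loads(arguments_json)
    required = function_def["parameters"].get("required", [])
    return [name for name in required if name not in args]

# Abbreviated version of the get_current_weather definition above
weather_fn = {
    "name": "get_current_weather",
    "parameters": {"type": "object", "required": ["city", "date"]},
}
print(check_missing(weather_fn, '{"city": "Beijing", "date": "today"}'))  # []
print(check_missing(weather_fn, '{"city": "Beijing"}'))                   # ['date']
```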

Step 2: Call the Large Model

Pass the assembled tools parameter into the client, parse the response, and call the defined get_current_weather function based on the request.

python
import json

completion_tool = client.chat.completions.create(
    model=model_name,
    stream=False,
    tools=tools,
    temperature=0.1,
    top_p=0.95,
    tool_choice="auto",
    messages=[
        {"role": "system", "content": "You are an intelligent assistant. You will call tools when you need to query the weather."},
        {"role": "user", "content": "What's the weather like in Beijing today?"}
    ]
)

print(completion_tool, end="\n\n")


def get_current_weather(city: str, date: str) -> str:
    print(f"Executing get_current_weather with parameters: city: {city}, date: {date}")
    # Simulate the return result of a weather query API
    return f"""Weather information for {city} on {date}:
Condition: Sunny
Temperature: 25-32°C
Wind: Northeast wind, force 3-4
Humidity: 45%
UV index: Moderate, recommended to apply sunscreen when going out"""


function_res = completion_tool.choices[0].message.tool_calls[0].function
arguments_res = function_res.arguments

print("Function to be called:", function_res.name)

json_arguments = json.loads(arguments_res)

print(f"tool call parameters: City: {json_arguments['city']}, Date: {json_arguments['date']}")

# Execute the function (dispatch by name instead of eval, which is unsafe on model output)
available_functions = {"get_current_weather": get_current_weather}
available_functions[function_res.name](**json_arguments)

The function has been successfully called! You can format the function response as follows:

python
{
    'role': 'tool',
    'name': 'get_current_weather',
    'content': 'Weather information for Beijing today: Condition: Sunny...',
    'tool_call_id': 'xxxx'
}

Example 2: Using LangChain to Let AI Summarize and Report Today's News

Step 1: Install Required Libraries

Libraries like LangChain provide more convenient tools and syntax. First, install the necessary libraries:

bash
pip install \
langchain==0.3.3 \
langgraph==0.2.38 \
langchain_core \
langchain_community==0.3.2 \
langchain_openai \
-i https://mirrors.cloud.tencent.com/pypi/simple

You can also use the JavaScript version of LangChain.

The LangChain @tool decorator automatically converts a function into a standardized tools parameter. Docstrings ("""xxx""") and Annotated annotations are converted into the description fields of the tools parameter. LangGraph's create_react_agent creates an agent that automatically handles function calling, tool execution, and tool-message feedback, greatly simplifying the workflow.

Step 2: Retrieve News Information

The following code enables AI to fetch news, execute Python code, and write the news into a news.txt file:

python
from langchain_openai import ChatOpenAI
from langchain.tools import tool
from langchain_community.document_loaders import WebBaseLoader
from typing import Annotated
from langgraph.prebuilt import create_react_agent
from langchain_core.messages import HumanMessage, SystemMessage, AIMessage, AIMessageChunk, ToolMessage

# For personal learning and demonstration only:
@tool
def get_news(query: str):
    """Retrieve the latest news list and today's news. This tool provides popular news summaries, links, and covers for news with images."""
    try:
        news_load = WebBaseLoader(
            # Ignore the "2019/07" in the URL – it actually returns the latest news
            'https://news.cctv.com/2019/07/gaiban/cmsdatainterface/page/news_1.jsonp?cb=news').load()
        news = news_load[0].page_content
        print('Length of get_news result', len(news))
        # Truncate the text to prevent excessive length and speed up AI processing
        return news[:4000]
    except Exception as e:
        print("get_news fail", e)

# This function has security risks and is for demonstration only – do not use it in production:
@tool
def python_code_exec(code: Annotated[str, 'Safe Python code/expressions. Assign results to the variable `result`. Strings should be multi-line.']) -> str:
    """Execute Python code and perform mathematical calculations based on input expressions. Solve problems using Python programming. You can freely write safe, complete, and executable Python code, then retrieve the execution result."""
    local_vars = {}
    try:
        exec(code, {}, local_vars)
        return f"Result: {str(local_vars)}"
    except Exception as e:
        return f"Execution failed: {str(e)}"

tools_list = [get_news, python_code_exec]

model_name = "Qwen2.5-72B-Instruct"
base_url = "https://moark.ai/v1"

# Get your access token from https://moark.ai/dashboard/settings/tokens
GITEE_AI_API_KEY = ""

llm = ChatOpenAI(model=model_name, api_key=GITEE_AI_API_KEY, base_url=base_url,
                 streaming=True, temperature=0.1, presence_penalty=1.05, top_p=0.9,
                 extra_body={
                     "tool_choice": "auto",
                 })
system_message = SystemMessage(content="You are an intelligent assistant. Briefly report before calling tools.")

# Use LangGraph to create an agent, which automatically handles calling, tool execution, and tool message feedback
agent_executor = create_react_agent(llm, tools=tools_list, debug=False)

ai_res_msg = ''
first = True
config = {"configurable": {"thread_id": "xxx"}, "recursion_limit": 10}

for ai_msg, metadata in agent_executor.stream(
    {"messages": [system_message, HumanMessage(content="Get the titles of the top 3 news stories today, then write Python code to save them to ./news.txt with utf-8 encoding")]},
    config, stream_mode="messages"
):
    if ai_msg.content and isinstance(ai_msg, AIMessage):
        # Real-time output
        print(ai_msg.content, end="")
        ai_res_msg += ai_msg.content

    if isinstance(ai_msg, AIMessageChunk):
        if first:
            gathered = ai_msg
            first = False
        else:
            gathered = gathered + ai_msg
        if ai_msg.tool_call_chunks:
            print("Called function:", gathered.tool_calls)
    if isinstance(ai_msg, ToolMessage):
        print("Tool call result:", ai_msg.content)

# Summary output
print(ai_res_msg)

You will see the model's real-time calling process and the latest news results; the AI will then write code to save the news titles to the news.txt file.