Function calling reference

Function calling improves the LLM's ability to provide relevant, contextual answers.

You can use the Function Calling API to provide custom functions to a generative AI model. The model doesn't invoke these functions directly; instead, it generates structured data output that specifies the function name and suggested arguments.

You can use this output to call external APIs or information systems, such as databases, customer relationship management systems, and document repositories. The resulting API output can then be used by the LLM to improve the quality of its responses.

For conceptual documentation about function calling, see Function calling.

Supported models:

Model                  Version
Gemini 2.0 Flash-Lite  gemini-2.0-flash-lite-001
Gemini 2.0 Flash       gemini-2.0-flash-001

Limitations:

  • The maximum number of function declarations that you can provide with a request is 128.

Example syntax

Syntax to send a function calling API request.

curl

curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \

https://${LOCATION}-aiplatform.googleapis.com/v1/projects/${PROJECT_ID}/locations/${LOCATION}/publishers/google/models/${MODEL_ID}:generateContent \
-d '{
  "contents": [{
    ...
  }],
  "tools": [{
    "function_declarations": [
      {
        ...
      }
    ]
  }]
}'

Python

gemini_model = GenerativeModel(
    MODEL_ID,
    generation_config=generation_config,
    tools=[
        Tool(
            function_declarations=[
                FunctionDeclaration(
                    ...
                )
            ]
        )
    ],
)

ํŒŒ๋ผ๋ฏธํ„ฐ ๋ชฉ๋ก

๊ตฌํ˜„ ์„ธ๋ถ€์ •๋ณด๋Š” ์˜ˆ์‹œ๋ฅผ ์ฐธ๊ณ ํ•˜์„ธ์š”.

FunctionDeclaration

OpenAPI 3.0 ์‚ฌ์–‘์— ๋”ฐ๋ผ ๋ชจ๋ธ์—์„œ JSON ์ž…๋ ฅ์„ ์ƒ์„ฑํ•  ์ˆ˜ ์žˆ๋Š” ํ•จ์ˆ˜๋ฅผ ์ •์˜ํ•ฉ๋‹ˆ๋‹ค.

ํŒŒ๋ผ๋ฏธํ„ฐ

name

string

ํ˜ธ์ถœํ•˜๋ ค๋Š” ํ•จ์ˆ˜์˜ ์ด๋ฆ„์ž…๋‹ˆ๋‹ค. ๋ฌธ์ž ๋˜๋Š” ๋ฐ‘์ค„๋กœ ์‹œ์ž‘ํ•ด์•ผ ํ•˜๋ฉฐ a~z, A~Z, 0~9์ด๊ฑฐ๋‚˜ ๋ฐ‘์ค„, ์  ๋˜๋Š” ๋Œ€์‹œ๋ฅผ ํฌํ•จํ•  ์ˆ˜ ์žˆ๊ณ  ์ตœ๋Œ€ 64์ž ๊ธธ์ด์ž…๋‹ˆ๋‹ค.

description

(์„ ํƒ์‚ฌํ•ญ) string

ํ•จ์ˆ˜ ์„ค๋ช… ๋ฐ ์šฉ๋„์ž…๋‹ˆ๋‹ค. ๋ชจ๋ธ์€ ์ด๋ฅผ ์‚ฌ์šฉํ•˜์—ฌ ํ•จ์ˆ˜๋ฅผ ํ˜ธ์ถœํ• ์ง€ ์—ฌ๋ถ€์™€ ๋ฐฉ๋ฒ•์„ ๊ฒฐ์ •ํ•ฉ๋‹ˆ๋‹ค. ์ตœ์ƒ์˜ ๊ฒฐ๊ณผ๋ฅผ ์–ป์œผ๋ ค๋ฉด ์„ค๋ช…์„ ํฌํ•จํ•˜๋Š” ๊ฒƒ์ด ์ข‹์Šต๋‹ˆ๋‹ค.

parameters

(์„ ํƒ์‚ฌํ•ญ) Schema

ํ•จ์ˆ˜ ํŒŒ๋ผ๋ฏธํ„ฐ๋ฅผ OpenAPI JSON ์Šคํ‚ค๋งˆ ๊ฐ์ฒด ํ˜•์‹์ธ OpenAPI 3.0 ์‚ฌ์–‘์œผ๋กœ ์„ค๋ช…ํ•ฉ๋‹ˆ๋‹ค.

response

(์„ ํƒ์‚ฌํ•ญ) Schema

ํ•จ์ˆ˜์˜ ์ถœ๋ ฅ์„ OpenAPI JSON ์Šคํ‚ค๋งˆ ๊ฐ์ฒด ํ˜•์‹์ธ OpenAPI 3.0 ์‚ฌ์–‘์œผ๋กœ ์„ค๋ช…ํ•ฉ๋‹ˆ๋‹ค.

์ž์„ธํ•œ ๋‚ด์šฉ์€ ํ•จ์ˆ˜ ํ˜ธ์ถœ์„ ์ฐธ์กฐํ•˜์„ธ์š”.

Schema

Defines the format of the input and output data used in function calling, based on the OpenAPI 3.0 Schema specification.

ํŒŒ๋ผ๋ฏธํ„ฐ
์œ ํ˜•

string

Enum. ๋ฐ์ดํ„ฐ ์œ ํ˜•์ž…๋‹ˆ๋‹ค. ๋‹ค์Œ ์ค‘ ํ•˜๋‚˜์—ฌ์•ผ ํ•ฉ๋‹ˆ๋‹ค.

  • STRING
  • INTEGER
  • BOOLEAN
  • NUMBER
  • ARRAY
  • OBJECT

description

Optional: string

A description of the data.

enum

Optional: string[]

The possible values of an element of primitive type in enum format.

items

Optional: Schema[]

The schema of the elements of Type.ARRAY.

properties

Optional: Schema

The schema of the properties of Type.OBJECT.

required

Optional: string[]

The required properties of Type.OBJECT.

nullable

Optional: bool

Indicates whether the value can be null.

FunctionCallingConfig

FunctionCallingConfig controls the model's behavior and determines which type of function to call.

Parameters

mode

Optional: enum/string[]

  • AUTO: The default model behavior. The model can respond in the form of a function call or a natural language response. The model decides which form to use based on the context.
  • NONE: The model doesn't respond in the form of function calls.
  • ANY: The model is constrained to always predict a function call. If allowed_function_names is not provided, the model picks from all of the available function declarations. If allowed_function_names is provided, the model picks from the set of allowed functions.

allowed_function_names

Optional: string[]

Names of the functions to call. Set only when mode is ANY. The function names must match [FunctionDeclaration.name]. With mode set to ANY, the model predicts a function call from the set of function names provided.
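
For example, with the Gen AI SDK for Python a forced function call could be configured roughly as follows; the tool and function name are illustrative, and complete requests are shown in the examples later on this page.

from google.genai.types import (
    FunctionCallingConfig,
    FunctionDeclaration,
    GenerateContentConfig,
    Tool,
    ToolConfig,
)

# Hypothetical tool; the declaration mirrors the inventory example used later on this page.
inventory_tool = Tool(
    function_declarations=[
        FunctionDeclaration(
            name="get_product_sku",
            description="Get the available inventory for a product",
            parameters={
                "type": "OBJECT",
                "properties": {
                    "product_name": {"type": "STRING", "description": "Product name"}
                },
            },
        )
    ]
)

# mode ANY plus allowed_function_names constrains the model to predict a call to get_product_sku.
config = GenerateContentConfig(
    tools=[inventory_tool],
    tool_config=ToolConfig(
        function_calling_config=FunctionCallingConfig(
            mode="ANY",
            allowed_function_names=["get_product_sku"],
        )
    ),
)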

functionCall

A predicted functionCall returned from the model that contains a string representing the functionDeclaration.name and a structured JSON object containing the parameters and their values.

Parameters

name

string

The name of the function to call.

args

Struct

The function parameters and values in JSON object format.

See Function calling for parameter details.

functionResponse

The resulting output from a FunctionCall that contains a string representing the FunctionDeclaration.name. It also contains a structured JSON object with the output from the function, which is used as context for the model. It should contain the result of a FunctionCall made based on the model's prediction.

Parameters

name

string

The name of the function to call.

response

Struct

The function response in JSON object format.

Examples

Send a function declaration

The following is a basic example of sending a query and a function declaration to the model.

REST

์š”์ฒญ ๋ฐ์ดํ„ฐ๋ฅผ ์‚ฌ์šฉํ•˜๊ธฐ ์ „์— ๋‹ค์Œ์„ ๋ฐ”๊ฟ‰๋‹ˆ๋‹ค.

  • PROJECT_ID: ํ”„๋กœ์ ํŠธ ID์ž…๋‹ˆ๋‹ค.
  • LOCATION: ์š”์ฒญ์„ ์ฒ˜๋ฆฌํ•˜๋Š” ๋ฆฌ์ „
  • MODEL_ID: ์ฒ˜๋ฆฌ ์ค‘์ธ ๋ชจ๋ธ์˜ ID
  • ROLE: ๋ฉ”์‹œ์ง€๋ฅผ ๋งŒ๋“œ๋Š” ํ•ญ๋ชฉ์˜ ID
  • TEXT: ๋ชจ๋ธ์— ์ „์†กํ•  ํ”„๋กฌํ”„ํŠธ
  • NAME: ํ˜ธ์ถœํ•˜๋ ค๋Š” ํ•จ์ˆ˜์˜ ์ด๋ฆ„
  • DESCRIPTION: ํ•จ์ˆ˜ ์„ค๋ช… ๋ฐ ์šฉ๋„
  • ๋‹ค๋ฅธ ํ•„๋“œ์— ๋Œ€ํ•ด์„œ๋Š” ํŒŒ๋ผ๋ฏธํ„ฐ ๋ชฉ๋ก ํ…Œ์ด๋ธ”์„ ์ฐธ๊ณ ํ•˜์„ธ์š”.

HTTP method and URL:

POST https://LOCATION-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/publishers/google/models/MODEL_ID:generateContent

JSON request body:

{
  "contents": [{
    "role": "ROLE",
    "parts": [{
      "text": "TEXT"
    }]
  }],
  "tools": [{
    "function_declarations": [
      {
        "name": "NAME",
        "description": "DESCRIPTION",
        "parameters": {
          "type": "TYPE",
          "properties": {
            "location": {
              "type": "TYPE",
              "description": "DESCRIPTION"
            }
          },
          "required": [
            "location"
          ]
        }
      }
    ]
  }]
}

์š”์ฒญ์„ ๋ณด๋‚ด๋ ค๋ฉด ๋‹ค์Œ ์˜ต์…˜ ์ค‘ ํ•˜๋‚˜๋ฅผ ์„ ํƒํ•ฉ๋‹ˆ๋‹ค.

curl

์š”์ฒญ ๋ณธ๋ฌธ์„ request.json ํŒŒ์ผ์— ์ €์žฅํ•˜๊ณ  ๋‹ค์Œ ๋ช…๋ น์–ด๋ฅผ ์‹คํ–‰ํ•ฉ๋‹ˆ๋‹ค.

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://LOCATION-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/publishers/google/models/MODEL_ID:generateContent"

PowerShell

์š”์ฒญ ๋ณธ๋ฌธ์„ request.json ํŒŒ์ผ์— ์ €์žฅํ•˜๊ณ  ๋‹ค์Œ ๋ช…๋ น์–ด๋ฅผ ์‹คํ–‰ํ•ฉ๋‹ˆ๋‹ค.

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://LOCATION-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/publishers/google/models/MODEL_ID:generateContent" | Select-Object -Expand Content

Example curl command

PROJECT_ID=myproject
LOCATION=us-central1
MODEL_ID=gemini-2.0-flash-001

curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  https://${LOCATION}-aiplatform.googleapis.com/v1/projects/${PROJECT_ID}/locations/${LOCATION}/publishers/google/models/${MODEL_ID}:generateContent \
  -d '{
    "contents": [{
      "role": "user",
      "parts": [{
        "text": "What is the weather in Boston?"
      }]
    }],
    "tools": [{
      "functionDeclarations": [
        {
          "name": "get_current_weather",
          "description": "Get the current weather in a given location",
          "parameters": {
            "type": "object",
            "properties": {
              "location": {
                "type": "string",
                "description": "The city and state, e.g. San Francisco, CA or a zip code e.g. 95616"
              }
            },
            "required": [
              "location"
            ]
          }
        }
      ]
    }]
  }'

Gen AI SDK for Python

from google import genai
from google.genai.types import GenerateContentConfig, HttpOptions

def get_current_weather(location: str) -> str:
    """Example method. Returns the current weather.

    Args:
        location: The city and state, e.g. San Francisco, CA
    """
    weather_map: dict[str, str] = {
        "Boston, MA": "snowing",
        "San Francisco, CA": "foggy",
        "Seattle, WA": "raining",
        "Austin, TX": "hot",
        "Chicago, IL": "windy",
    }
    return weather_map.get(location, "unknown")

client = genai.Client(http_options=HttpOptions(api_version="v1"))
model_id = "gemini-2.0-flash-001"

response = client.models.generate_content(
    model=model_id,
    contents="What is the weather like in Boston?",
    config=GenerateContentConfig(
        tools=[get_current_weather],
        temperature=0,
    ),
)

print(response.text)
# Example response:
# The weather in Boston is sunny.

Node.js

const {
  VertexAI,
  FunctionDeclarationSchemaType,
} = require('@google-cloud/vertexai');

const functionDeclarations = [
  {
    function_declarations: [
      {
        name: 'get_current_weather',
        description: 'get weather in a given location',
        parameters: {
          type: FunctionDeclarationSchemaType.OBJECT,
          properties: {
            location: {type: FunctionDeclarationSchemaType.STRING},
            unit: {
              type: FunctionDeclarationSchemaType.STRING,
              enum: ['celsius', 'fahrenheit'],
            },
          },
          required: ['location'],
        },
      },
    ],
  },
];

/**
 * TODO(developer): Update these variables before running the sample.
 */
async function functionCallingBasic(
  projectId = 'PROJECT_ID',
  location = 'us-central1',
  model = 'gemini-1.5-flash-001'
) {
  // Initialize Vertex with your Cloud project and location
  const vertexAI = new VertexAI({project: projectId, location: location});

  // Instantiate the model
  const generativeModel = vertexAI.preview.getGenerativeModel({
    model: model,
  });

  const request = {
    contents: [
      {role: 'user', parts: [{text: 'What is the weather in Boston?'}]},
    ],
    tools: functionDeclarations,
  };
  const result = await generativeModel.generateContent(request);
  console.log(JSON.stringify(result.response.candidates[0].content));
}

Java

import com.google.cloud.vertexai.VertexAI;
import com.google.cloud.vertexai.api.Content;
import com.google.cloud.vertexai.api.FunctionDeclaration;
import com.google.cloud.vertexai.api.GenerateContentResponse;
import com.google.cloud.vertexai.api.Schema;
import com.google.cloud.vertexai.api.Tool;
import com.google.cloud.vertexai.api.Type;
import com.google.cloud.vertexai.generativeai.ChatSession;
import com.google.cloud.vertexai.generativeai.ContentMaker;
import com.google.cloud.vertexai.generativeai.GenerativeModel;
import com.google.cloud.vertexai.generativeai.PartMaker;
import com.google.cloud.vertexai.generativeai.ResponseHandler;
import java.io.IOException;
import java.util.Arrays;
import java.util.Collections;

public class FunctionCalling {
  public static void main(String[] args) throws IOException {
    // TODO(developer): Replace these variables before running the sample.
    String projectId = "your-google-cloud-project-id";
    String location = "us-central1";
    String modelName = "gemini-1.5-flash-001";

    String promptText = "What's the weather like in Paris?";

    whatsTheWeatherLike(projectId, location, modelName, promptText);
  }

  // A request involving the interaction with an external tool
  public static String whatsTheWeatherLike(String projectId, String location,
                                           String modelName, String promptText)
      throws IOException {
    // Initialize client that will be used to send requests.
    // This client only needs to be created once, and can be reused for multiple requests.
    try (VertexAI vertexAI = new VertexAI(projectId, location)) {

      FunctionDeclaration functionDeclaration = FunctionDeclaration.newBuilder()
          .setName("getCurrentWeather")
          .setDescription("Get the current weather in a given location")
          .setParameters(
              Schema.newBuilder()
                  .setType(Type.OBJECT)
                  .putProperties("location", Schema.newBuilder()
                      .setType(Type.STRING)
                      .setDescription("location")
                      .build()
                  )
                  .addRequired("location")
                  .build()
          )
          .build();

      System.out.println("Function declaration:");
      System.out.println(functionDeclaration);

      // Add the function to a "tool"
      Tool tool = Tool.newBuilder()
          .addFunctionDeclarations(functionDeclaration)
          .build();

      // Start a chat session from a model, with the use of the declared function.
      GenerativeModel model = new GenerativeModel(modelName, vertexAI)
          .withTools(Arrays.asList(tool));
      ChatSession chat = model.startChat();

      System.out.println(String.format("Ask the question: %s", promptText));
      GenerateContentResponse response = chat.sendMessage(promptText);

      // The model will most likely return a function call to the declared
      // function `getCurrentWeather` with "Paris" as the value for the
      // argument `location`.
      System.out.println("\nPrint response: ");
      System.out.println(ResponseHandler.getContent(response));

      // Provide an answer to the model so that it knows what the result
      // of a "function call" is.
      Content content =
          ContentMaker.fromMultiModalData(
              PartMaker.fromFunctionResponse(
                  "getCurrentWeather",
                  Collections.singletonMap("currentWeather", "sunny")));
      System.out.println("Provide the function response: ");
      System.out.println(content);
      response = chat.sendMessage(content);

      // See what the model replies now
      System.out.println("Print response: ");
      String finalAnswer = ResponseHandler.getText(response);
      System.out.println(finalAnswer);

      return finalAnswer;
    }
  }
}

Go

import (
	"context"
	"encoding/json"
	"errors"
	"fmt"
	"io"

	"cloud.google.com/go/vertexai/genai"
)

// functionCalling demonstrates how to submit a prompt and a function declaration to the model,
// allowing it to suggest a call to the function to fetch external data. Returning this data
// to the model enables it to generate a text response that incorporates the data.
func functionCalling(w io.Writer, projectID, location, modelName string) error {
	// location = "us-central1"
	// modelName = "gemini-1.5-flash-002"
	ctx := context.Background()
	client, err := genai.NewClient(ctx, projectID, location)
	if err != nil {
		return fmt.Errorf("failed to create GenAI client: %w", err)
	}
	defer client.Close()

	model := client.GenerativeModel(modelName)
	// Set temperature to 0.0 for maximum determinism in function calling.
	model.SetTemperature(0.0)

	funcName := "getCurrentWeather"
	funcDecl := &genai.FunctionDeclaration{
		Name:        funcName,
		Description: "Get the current weather in a given location",
		Parameters: &genai.Schema{
			Type: genai.TypeObject,
			Properties: map[string]*genai.Schema{
				"location": {
					Type:        genai.TypeString,
					Description: "location",
				},
			},
			Required: []string{"location"},
		},
	}
	// Add the weather function to our model toolbox.
	model.Tools = []*genai.Tool{
		{
			FunctionDeclarations: []*genai.FunctionDeclaration{funcDecl},
		},
	}

	prompt := genai.Text("What's the weather like in Boston?")
	resp, err := model.GenerateContent(ctx, prompt)

	if err != nil {
		return fmt.Errorf("failed to generate content: %w", err)
	}
	if len(resp.Candidates) == 0 {
		return errors.New("got empty response from model")
	} else if len(resp.Candidates[0].FunctionCalls()) == 0 {
		return errors.New("got no function call suggestions from model")
	}

	// In a production environment, consider adding validations for function names and arguments.
	for _, fnCall := range resp.Candidates[0].FunctionCalls() {
		fmt.Fprintf(w, "The model suggests to call the function %q with args: %v\n", fnCall.Name, fnCall.Args)
		// Example response:
		// The model suggests to call the function "getCurrentWeather" with args: map[location:Boston]
	}
	// Use synthetic data to simulate a response from the external API.
	// In a real application, this would come from an actual weather API.
	mockAPIResp, err := json.Marshal(map[string]string{
		"location":         "Boston",
		"temperature":      "38",
		"temperature_unit": "F",
		"description":      "Cold and cloudy",
		"humidity":         "65",
		"wind":             `{"speed": "10", "direction": "NW"}`,
	})
	if err != nil {
		return fmt.Errorf("failed to marshal function response to JSON: %w", err)
	}

	funcResp := &genai.FunctionResponse{
		Name: funcName,
		Response: map[string]any{
			"content": mockAPIResp,
		},
	}

	// Return the API response to the model allowing it to complete its response.
	resp, err = model.GenerateContent(ctx, prompt, funcResp)
	if err != nil {
		return fmt.Errorf("failed to generate content: %w", err)
	}
	if len(resp.Candidates) == 0 || len(resp.Candidates[0].Content.Parts) == 0 {
		return errors.New("got empty response from model")
	}

	fmt.Fprintln(w, resp.Candidates[0].Content.Parts[0])
	// Example response:
	// The weather in Boston is cold and cloudy, with a humidity of 65% and a temperature of 38°F. ...

	return nil
}

REST (OpenAI)

You can call the Function Calling API by using the OpenAI library. For more information, see Call Vertex AI models by using the OpenAI library.

Before using any of the request data, make the following replacements:

  • PROJECT_ID: Your project ID.
  • LOCATION: The region to process the request.
  • MODEL_ID: The ID of the model being processed.

HTTP method and URL:

POST https://LOCATION-aiplatform.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/endpoints/openapi/chat/completions

JSON request body:

{
  "model": "google/MODEL_ID",
  "messages": [
    {
      "role": "user",
      "content": "What is the weather in Boston?"
    }
  ],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
          "type": "OBJECT",
          "properties": {
            "location": {
              "type": "string",
              "description": "The city and state, e.g. San Francisco, CA or a zip code e.g. 95616"
            }
           },
          "required": ["location"]
        }
      }
    }
  ]
}

์š”์ฒญ์„ ๋ณด๋‚ด๋ ค๋ฉด ๋‹ค์Œ ์˜ต์…˜ ์ค‘ ํ•˜๋‚˜๋ฅผ ์„ ํƒํ•ฉ๋‹ˆ๋‹ค.

curl

์š”์ฒญ ๋ณธ๋ฌธ์„ request.json ํŒŒ์ผ์— ์ €์žฅํ•˜๊ณ  ๋‹ค์Œ ๋ช…๋ น์–ด๋ฅผ ์‹คํ–‰ํ•ฉ๋‹ˆ๋‹ค.

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://LOCATION-aiplatform.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/endpoints/openapi/chat/completions"

PowerShell

์š”์ฒญ ๋ณธ๋ฌธ์„ request.json ํŒŒ์ผ์— ์ €์žฅํ•˜๊ณ  ๋‹ค์Œ ๋ช…๋ น์–ด๋ฅผ ์‹คํ–‰ํ•ฉ๋‹ˆ๋‹ค.

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://LOCATION-aiplatform.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/endpoints/openapi/chat/completions" | Select-Object -Expand Content

Python (OpenAI)

You can call the Function Calling API by using the OpenAI library. For more information, see Call Vertex AI models by using the OpenAI library.

import vertexai
import openai

from google.auth import default, transport

# TODO(developer): Update & uncomment below line
# PROJECT_ID = "your-project-id"
location = "us-central1"

vertexai.init(project=PROJECT_ID, location=location)

# Programmatically get an access token
credentials, _ = default(scopes=["https://www.googleapis.com/auth/cloud-platform"])
auth_request = transport.requests.Request()
credentials.refresh(auth_request)

# OpenAI Client
client = openai.OpenAI(
    base_url=f"https://{location}-aiplatform.googleapis.com/v1beta1/projects/{PROJECT_ID}/locations/{location}/endpoints/openapi",
    api_key=credentials.token,
)

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA or a zip code e.g. 95616",
                    },
                },
                "required": ["location"],
            },
        },
    }
]

messages = []
messages.append(
    {
        "role": "system",
        "content": "Don't make assumptions about what values to plug into functions. Ask for clarification if a user request is ambiguous.",
    }
)
messages.append({"role": "user", "content": "What is the weather in Boston?"})

response = client.chat.completions.create(
    model="google/gemini-1.5-flash-001",
    messages=messages,
    tools=tools,
)

print("Function:", response.choices[0].message.tool_calls[0].function.name)
print("Arguments:", response.choices[0].message.tool_calls[0].function.arguments)
# Example response:
# Function: get_current_weather
# Arguments: {"location":"Boston"}

Send a function declaration with FunctionCallingConfig

The following example demonstrates how to pass a FunctionCallingConfig to the model.

The functionCallingConfig ensures that the model output is always a specific function call. To configure it:

  • Set the function calling mode to ANY.
  • Specify the function names that you want to use in allowed_function_names. If allowed_function_names is empty, any of the provided functions can be returned.

REST

PROJECT_ID=myproject
LOCATION=us-central1
MODEL_ID=gemini-2.0-flash-001

curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  https://${LOCATION}-aiplatform.googleapis.com/v1beta1/projects/${PROJECT_ID}/locations/${LOCATION}/publishers/google/models/${MODEL_ID}:generateContent \
  -d '{
    "contents": [{
      "role": "user",
      "parts": [{
        "text": "Do you have the White Pixel 8 Pro 128GB in stock in the US?"
      }]
    }],
    "tools": [{
      "functionDeclarations": [
        {
          "name": "get_product_sku",
          "description": "Get the available inventory for a Google products, e.g: Pixel phones, Pixel Watches, Google Home etc",
          "parameters": {
            "type": "object",
            "properties": {
              "product_name": {"type": "string", "description": "Product name"}
            }
          }
        },
        {
          "name": "get_store_location",
          "description": "Get the location of the closest store",
          "parameters": {
            "type": "object",
            "properties": {
              "location": {"type": "string", "description": "Location"}
            },
          }
        }
      ]
    }],
    "toolConfig": {
        "functionCallingConfig": {
            "mode":"ANY",
            "allowedFunctionNames": ["get_product_sku"]
      }
    },
    "generationConfig": {
      "temperature": 0.95,
      "topP": 1.0,
      "maxOutputTokens": 8192
    }
  }'

Gen AI SDK for Python

from google import genai
from google.genai.types import (
    FunctionDeclaration,
    GenerateContentConfig,
    HttpOptions,
    Tool,
)

client = genai.Client(http_options=HttpOptions(api_version="v1"))
model_id = "gemini-2.0-flash-001"

get_album_sales = FunctionDeclaration(
    name="get_album_sales",
    description="Gets the number of albums sold",
    # Function parameters are specified in JSON schema format
    parameters={
        "type": "OBJECT",
        "properties": {
            "albums": {
                "type": "ARRAY",
                "description": "List of albums",
                "items": {
                    "description": "Album and its sales",
                    "type": "OBJECT",
                    "properties": {
                        "album_name": {
                            "type": "STRING",
                            "description": "Name of the music album",
                        },
                        "copies_sold": {
                            "type": "INTEGER",
                            "description": "Number of copies sold",
                        },
                    },
                },
            },
        },
    },
)

sales_tool = Tool(
    function_declarations=[get_album_sales],
)

response = client.models.generate_content(
    model=model_id,
    contents='At Stellar Sounds, a music label, 2024 was a rollercoaster. "Echoes of the Night," a debut synth-pop album, '
    'surprisingly sold 350,000 copies, while veteran rock band "Crimson Tide\'s" latest, "Reckless Hearts," '
    'lagged at 120,000. Their up-and-coming indie artist, "Luna Bloom\'s" EP, "Whispers of Dawn," '
    'secured 75,000 sales. The biggest disappointment was the highly-anticipated rap album "Street Symphony" '
    "only reaching 100,000 units. Overall, Stellar Sounds moved over 645,000 units this year, revealing unexpected "
    "trends in music consumption.",
    config=GenerateContentConfig(
        tools=[sales_tool],
        temperature=0,
    ),
)

print(response.function_calls[0])
# Example response:
# [FunctionCall(
#     id=None,
#     name="get_album_sales",
#     args={
#         "albums": [
#             {"album_name": "Echoes of the Night", "copies_sold": 350000},
#             {"copies_sold": 120000, "album_name": "Reckless Hearts"},
#             {"copies_sold": 75000, "album_name": "Whispers of Dawn"},
#             {"copies_sold": 100000, "album_name": "Street Symphony"},
#         ]
#     },
# )]

Node.js

const {
  VertexAI,
  FunctionDeclarationSchemaType,
} = require('@google-cloud/vertexai');

const functionDeclarations = [
  {
    function_declarations: [
      {
        name: 'get_product_sku',
        description:
          'Get the available inventory for a Google products, e.g: Pixel phones, Pixel Watches, Google Home etc',
        parameters: {
          type: FunctionDeclarationSchemaType.OBJECT,
          properties: {
            productName: {type: FunctionDeclarationSchemaType.STRING},
          },
        },
      },
      {
        name: 'get_store_location',
        description: 'Get the location of the closest store',
        parameters: {
          type: FunctionDeclarationSchemaType.OBJECT,
          properties: {
            location: {type: FunctionDeclarationSchemaType.STRING},
          },
        },
      },
    ],
  },
];

const toolConfig = {
  function_calling_config: {
    mode: 'ANY',
    allowed_function_names: ['get_product_sku'],
  },
};

const generationConfig = {
  temperature: 0.95,
  topP: 1.0,
  maxOutputTokens: 8192,
};

/**
 * TODO(developer): Update these variables before running the sample.
 */
async function functionCallingAdvanced(
  projectId = 'PROJECT_ID',
  location = 'us-central1',
  model = 'gemini-1.5-flash-001'
) {
  // Initialize Vertex with your Cloud project and location
  const vertexAI = new VertexAI({project: projectId, location: location});

  // Instantiate the model
  const generativeModel = vertexAI.preview.getGenerativeModel({
    model: model,
  });

  const request = {
    contents: [
      {
        role: 'user',
        parts: [
          {text: 'Do you have the White Pixel 8 Pro 128GB in stock in the US?'},
        ],
      },
    ],
    tools: functionDeclarations,
    tool_config: toolConfig,
    generation_config: generationConfig,
  };
  const result = await generativeModel.generateContent(request);
  console.log(JSON.stringify(result.response.candidates[0].content));
}

Go

import (
	"context"
	"encoding/json"
	"errors"
	"fmt"
	"io"

	"cloud.google.com/go/vertexai/genai"
)

// functionCallsChat opens a chat session and sends 4 messages to the model:
// - convert a first text question into a structured function call request
// - convert the first structured function call response into natural language
// - convert a second text question into a structured function call request
// - convert the second structured function call response into natural language
func functionCallsChat(w io.Writer, projectID, location, modelName string) error {
	// location := "us-central1"
	// modelName := "gemini-1.5-flash-001"
	ctx := context.Background()
	client, err := genai.NewClient(ctx, projectID, location)
	if err != nil {
		return fmt.Errorf("unable to create client: %w", err)
	}
	defer client.Close()

	model := client.GenerativeModel(modelName)

	// Build an OpenAPI schema, in memory
	paramsProduct := &genai.Schema{
		Type: genai.TypeObject,
		Properties: map[string]*genai.Schema{
			"productName": {
				Type:        genai.TypeString,
				Description: "Product name",
			},
		},
	}
	fundeclProductInfo := &genai.FunctionDeclaration{
		Name:        "getProductSku",
		Description: "Get the SKU for a product",
		Parameters:  paramsProduct,
	}
	paramsStore := &genai.Schema{
		Type: genai.TypeObject,
		Properties: map[string]*genai.Schema{
			"location": {
				Type:        genai.TypeString,
				Description: "Location",
			},
		},
	}
	fundeclStoreLocation := &genai.FunctionDeclaration{
		Name:        "getStoreLocation",
		Description: "Get the location of the closest store",
		Parameters:  paramsStore,
	}
	model.Tools = []*genai.Tool{
		{FunctionDeclarations: []*genai.FunctionDeclaration{
			fundeclProductInfo,
			fundeclStoreLocation,
		}},
	}
	model.SetTemperature(0.0)

	chat := model.StartChat()

	// Send a prompt for the first conversation turn that should invoke the getProductSku function
	prompt := "Do you have the Pixel 8 Pro in stock?"
	fmt.Fprintf(w, "Question: %s\n", prompt)
	resp, err := chat.SendMessage(ctx, genai.Text(prompt))
	if err != nil {
		return err
	}
	if len(resp.Candidates) == 0 ||
		len(resp.Candidates[0].Content.Parts) == 0 {
		return errors.New("empty response from model")
	}

	// The model has returned a function call to the declared function `getProductSku`
	// with a value for the argument `productName`.
	jsondata, err := json.MarshalIndent(resp.Candidates[0].Content.Parts[0], "\t", "  ")
	if err != nil {
		return fmt.Errorf("json.MarshalIndent: %w", err)
	}
	fmt.Fprintf(w, "function call generated by the model:\n\t%s\n", string(jsondata))

	// Create a function call response, to simulate the result of a call to a
	// real service
	funresp := &genai.FunctionResponse{
		Name: "getProductSku",
		Response: map[string]any{
			"sku":      "GA04834-US",
			"in_stock": "yes",
		},
	}
	jsondata, err = json.MarshalIndent(funresp, "\t", "  ")
	if err != nil {
		return fmt.Errorf("json.MarshalIndent: %w", err)
	}
	fmt.Fprintf(w, "function call response sent to the model:\n\t%s\n\n", string(jsondata))

	// And provide the function call response to the model
	resp, err = chat.SendMessage(ctx, funresp)
	if err != nil {
		return err
	}
	if len(resp.Candidates) == 0 ||
		len(resp.Candidates[0].Content.Parts) == 0 {
		return errors.New("empty response from model")
	}

	// The model has taken the function call response as input, and has
	// reformulated the response to the user.
	jsondata, err = json.MarshalIndent(resp.Candidates[0].Content.Parts[0], "\t", "  ")
	if err != nil {
		return fmt.Errorf("json.MarshalIndent: %w", err)
	}
	fmt.Fprintf(w, "Answer generated by the model:\n\t%s\n\n", string(jsondata))

	// Send a prompt for the second conversation turn that should invoke the getStoreLocation function
	prompt2 := "Is there a store in Mountain View, CA that I can visit to try it out?"
	fmt.Fprintf(w, "Question: %s\n", prompt2)

	resp, err = chat.SendMessage(ctx, genai.Text(prompt2))
	if err != nil {
		return err
	}
	if len(resp.Candidates) == 0 ||
		len(resp.Candidates[0].Content.Parts) == 0 {
		return errors.New("empty response from model")
	}

	// The model has returned a function call to the declared function `getStoreLocation`
	// with a value for the argument `location`.
	jsondata, err = json.MarshalIndent(resp.Candidates[0].Content.Parts[0], "\t", "  ")
	if err != nil {
		return fmt.Errorf("json.MarshalIndent: %w", err)
	}
	fmt.Fprintf(w, "function call generated by the model:\n\t%s\n", string(jsondata))

	// Create a function call response, to simulate the result of a call to a
	// real service
	funresp = &genai.FunctionResponse{
		Name: "getStoreLocation",
		Response: map[string]any{
			"store": "2000 N Shoreline Blvd, Mountain View, CA 94043, US",
		},
	}
	jsondata, err = json.MarshalIndent(funresp, "\t", "  ")
	if err != nil {
		return fmt.Errorf("json.MarshalIndent: %w", err)
	}
	fmt.Fprintf(w, "function call response sent to the model:\n\t%s\n\n", string(jsondata))

	// And provide the function call response to the model
	resp, err = chat.SendMessage(ctx, funresp)
	if err != nil {
		return err
	}
	if len(resp.Candidates) == 0 ||
		len(resp.Candidates[0].Content.Parts) == 0 {
		return errors.New("empty response from model")
	}

	// The model has taken the function call response as input, and has
	// reformulated the response to the user.
	jsondata, err = json.MarshalIndent(resp.Candidates[0].Content.Parts[0], "\t", "  ")
	if err != nil {
		return fmt.Errorf("json.MarshalIndent: %w", err)
	}
	fmt.Fprintf(w, "Answer generated by the model:\n\t%s\n\n", string(jsondata))
	return nil
}

REST (OpenAI)

You can call the Function Calling API by using the OpenAI library. For more information, see Call Vertex AI models by using the OpenAI library.

Before using any of the request data, make the following replacements:

  • PROJECT_ID: Your project ID.
  • LOCATION: The region to process the request.
  • MODEL_ID: The ID of the model being processed.

HTTP method and URL:

POST https://LOCATION-aiplatform.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/endpoints/openapi/chat/completions

JSON request body:

{
  "model": "google/MODEL_ID",
  "messages": [
    {
      "role": "user",
      "content": "What is the weather in Boston?"
    }
  ],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
          "type": "OBJECT",
          "properties": {
            "location": {
              "type": "string",
              "description": "The city and state, e.g. San Francisco, CA or a zip code e.g. 95616"
            }
          },
          "required": ["location"]
        }
      }
    }
  ],
  "tool_choice": "auto"
}

์š”์ฒญ์„ ๋ณด๋‚ด๋ ค๋ฉด ๋‹ค์Œ ์˜ต์…˜ ์ค‘ ํ•˜๋‚˜๋ฅผ ์„ ํƒํ•ฉ๋‹ˆ๋‹ค.

curl

์š”์ฒญ ๋ณธ๋ฌธ์„ request.json ํŒŒ์ผ์— ์ €์žฅํ•˜๊ณ  ๋‹ค์Œ ๋ช…๋ น์–ด๋ฅผ ์‹คํ–‰ํ•ฉ๋‹ˆ๋‹ค.

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://LOCATION-aiplatform.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/endpoints/openapi/chat/completions"

PowerShell

์š”์ฒญ ๋ณธ๋ฌธ์„ request.json ํŒŒ์ผ์— ์ €์žฅํ•˜๊ณ  ๋‹ค์Œ ๋ช…๋ น์–ด๋ฅผ ์‹คํ–‰ํ•ฉ๋‹ˆ๋‹ค.

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://LOCATION-aiplatform.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/endpoints/openapi/chat/completions" | Select-Object -Expand Content

Python (OpenAI)

You can call the Function Calling API by using the OpenAI library. For more information, see Call Vertex AI models by using the OpenAI library.

import vertexai
import openai

from google.auth import default, transport

# TODO(developer): Update & uncomment below line
# PROJECT_ID = "your-project-id"
location = "us-central1"

vertexai.init(project=PROJECT_ID, location=location)

# Programmatically get an access token
credentials, _ = default(scopes=["https://www.googleapis.com/auth/cloud-platform"])
auth_request = transport.requests.Request()
credentials.refresh(auth_request)

# OpenAI Client
client = openai.OpenAI(
    base_url=f"https://{location}-aiplatform.googleapis.com/v1beta1/projects/{PROJECT_ID}/locations/{location}/endpoints/openapi",
    api_key=credentials.token,
)

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA or a zip code e.g. 95616",
                    },
                },
                "required": ["location"],
            },
        },
    }
]

messages = []
messages.append(
    {
        "role": "system",
        "content": "Don't make assumptions about what values to plug into functions. Ask for clarification if a user request is ambiguous.",
    }
)
messages.append({"role": "user", "content": "What is the weather in Boston, MA?"})

response = client.chat.completions.create(
    model="google/gemini-1.5-flash-002",
    messages=messages,
    tools=tools,
    tool_choice="auto",
)

print("Function:", response.choices[0].message.tool_calls[0].function.name)
print("Arguments:", response.choices[0].message.tool_calls[0].function.arguments)
# Example response:
# Function: get_current_weather
# Arguments: {"location":"Boston"}
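
To complete the exchange with the OpenAI library, you can append the assistant message that contains the tool call plus a tool message carrying your function's output, and then call the endpoint again. The following sketch continues the example above; the weather payload is made up.

import json

tool_call = response.choices[0].message.tool_calls[0]

# Append the assistant turn with the tool call, then your function's result as a tool message.
messages.append(response.choices[0].message)
messages.append(
    {
        "role": "tool",
        "tool_call_id": tool_call.id,
        "content": json.dumps({"location": "Boston, MA", "temperature": "38", "unit": "F"}),
    }
)

follow_up = client.chat.completions.create(
    model="google/gemini-1.5-flash-002",
    messages=messages,
    tools=tools,
)
print(follow_up.choices[0].message.content)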

๋‹ค์Œ ๋‹จ๊ณ„

์ž์„ธํ•œ ๋ฌธ์„œ๋Š” ๋‹ค์Œ์„ ์ฐธ์กฐํ•˜์„ธ์š”.