
# GigaChat

https://developers.sber.ru/docs/ru/gigachat/api/overview

GigaChat is the large language model from Sber AI, Russia's leading LLM provider.

:::tip

We support ALL GigaChat models. Just prefix the model name with `gigachat/` (i.e. `model=gigachat/<any-model-on-gigachat>`) when sending LiteLLM requests.

:::

:::warning

The GigaChat API uses self-signed SSL certificates. You must pass `ssl_verify=False` in your requests.

:::

## Supported Features

| Feature | Supported |
|---------|-----------|
| Chat Completion | Yes |
| Streaming | Yes |
| Async | Yes |
| Function Calling / Tools | Yes |
| Structured Output (JSON Schema) | Yes (via function call emulation) |
| Image Input | Yes (base64 and URL); GigaChat-2-Max and GigaChat-2-Pro only |
| Embeddings | Yes |

## API Key

GigaChat uses OAuth authentication. Set your credentials as environment variables:

```python
import os

# Required: set credentials (base64-encoded "client_id:client_secret")
os.environ['GIGACHAT_CREDENTIALS'] = "your-credentials-here"

# Optional: set scope (defaults to GIGACHAT_API_PERS for personal use)
os.environ['GIGACHAT_SCOPE'] = "GIGACHAT_API_PERS"  # or GIGACHAT_API_B2B for business
```

Get your credentials at: https://developers.sber.ru/studio/
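The credentials string is the base64 encoding of your `client_id:client_secret` pair. A minimal sketch of building it yourself (the id and secret values below are placeholders):

```python
import base64

# Placeholder values; substitute the client_id and client_secret
# issued for your project in Sber's developer studio.
client_id = "my-client-id"
client_secret = "my-client-secret"

# GIGACHAT_CREDENTIALS is the base64 encoding of "client_id:client_secret".
credentials = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
print(credentials)
```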

## Sample Usage

```python
from litellm import completion
import os

os.environ['GIGACHAT_CREDENTIALS'] = "your-credentials-here"

response = completion(
    model="gigachat/GigaChat-2-Max",
    messages=[
        {"role": "user", "content": "Hello from LiteLLM!"}
    ],
    ssl_verify=False,  # Required for GigaChat
)
print(response)
```

## Sample Usage - Streaming

```python
from litellm import completion
import os

os.environ['GIGACHAT_CREDENTIALS'] = "your-credentials-here"

response = completion(
    model="gigachat/GigaChat-2-Max",
    messages=[
        {"role": "user", "content": "Hello from LiteLLM!"}
    ],
    stream=True,
    ssl_verify=False,  # Required for GigaChat
)

for chunk in response:
    print(chunk)
```
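Each streamed chunk carries an incremental delta in OpenAI format; you typically concatenate the deltas to rebuild the full reply. A sketch of that assembly (the chunks here are plain-dict stand-ins for what `completion(..., stream=True)` yields, since a live call needs credentials):

```python
# Stand-ins for streamed chunks; real chunks are objects in OpenAI's
# streaming format with the same nested shape.
chunks = [
    {"choices": [{"delta": {"content": "Hello"}}]},
    {"choices": [{"delta": {"content": " from"}}]},
    {"choices": [{"delta": {"content": " GigaChat"}}]},
    {"choices": [{"delta": {}}]},  # final chunk may carry no content
]

# Concatenate the content deltas, skipping empty/missing ones.
full_text = "".join(
    c["choices"][0]["delta"].get("content") or "" for c in chunks
)
print(full_text)  # Hello from GigaChat
```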

## Sample Usage - Function Calling

```python
from litellm import completion
import os

os.environ['GIGACHAT_CREDENTIALS'] = "your-credentials-here"

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"}
            },
            "required": ["city"]
        }
    }
}]

response = completion(
    model="gigachat/GigaChat-2-Max",
    messages=[{"role": "user", "content": "What's the weather in Moscow?"}],
    tools=tools,
    ssl_verify=False,  # Required for GigaChat
)
print(response)
```
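In the OpenAI-compatible format, the model's tool call arrives with its arguments as a JSON string, which you parse and dispatch to your own handler. A minimal dispatch sketch (the tool call is a hand-built stand-in for `response.choices[0].message.tool_calls[0]`, and `get_weather` is a hypothetical handler):

```python
import json

# Stand-in for a tool call returned by the request above,
# in OpenAI's tool_calls shape.
tool_call = {
    "function": {
        "name": "get_weather",
        "arguments": '{"city": "Moscow"}',
    }
}

def get_weather(city):
    # Hypothetical handler; replace with a real weather lookup.
    return f"Weather for {city}"

# Arguments arrive as a JSON string and must be parsed before dispatch.
args = json.loads(tool_call["function"]["arguments"])
result = get_weather(**args)
print(result)  # Weather for Moscow
```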

## Sample Usage - Structured Output

GigaChat supports structured output via JSON schema (emulated through function calling):

```python
from litellm import completion
import os

os.environ['GIGACHAT_CREDENTIALS'] = "your-credentials-here"

response = completion(
    model="gigachat/GigaChat-2-Max",
    messages=[{"role": "user", "content": "Extract info: John is 30 years old"}],
    response_format={
        "type": "json_schema",
        "json_schema": {
            "name": "person",
            "schema": {
                "type": "object",
                "properties": {
                    "name": {"type": "string"},
                    "age": {"type": "integer"}
                }
            }
        }
    },
    ssl_verify=False,  # Required for GigaChat
)
print(response)  # Returns JSON: {"name": "John", "age": 30}
```
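Since the structured output arrives as a JSON string in the message content, you parse it to get a native object. A small sketch (the `content` string stands in for `response.choices[0].message.content` from the call above):

```python
import json

# Stand-in for the JSON string the model returns for the request above.
content = '{"name": "John", "age": 30}'

# Parse into a dict matching the "person" schema.
person = json.loads(content)
print(person["name"], person["age"])  # John 30
```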

## Sample Usage - Image Input

GigaChat supports image input via base64 or URL (GigaChat-2-Max and GigaChat-2-Pro only):

```python
from litellm import completion
import os

os.environ['GIGACHAT_CREDENTIALS'] = "your-credentials-here"

response = completion(
    model="gigachat/GigaChat-2-Max",  # Vision requires GigaChat-2-Max or GigaChat-2-Pro
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "What's in this image?"},
            {"type": "image_url", "image_url": {"url": "https://example.com/image.jpg"}}
        ]
    }],
    ssl_verify=False,  # Required for GigaChat
)
print(response)
```
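To send a local image instead of a URL, the usual OpenAI-format approach is a base64 data URL in the `image_url` field. A sketch of building one (`image_bytes` stands in for the contents of a real JPEG file):

```python
import base64

# Stand-in bytes; in practice read these from a real JPEG file,
# e.g. open(path, "rb").read().
image_bytes = b"\xff\xd8\xff\xe0fake-jpeg-bytes"

# Base64-encode and wrap as a data URL for the "image_url" field.
data_url = "data:image/jpeg;base64," + base64.b64encode(image_bytes).decode()
print(data_url[:40])
```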

## Sample Usage - Embeddings

```python
from litellm import embedding
import os

os.environ['GIGACHAT_CREDENTIALS'] = "your-credentials-here"

response = embedding(
    model="gigachat/Embeddings",
    input=["Hello world", "How are you?"],
    ssl_verify=False,  # Required for GigaChat
)
print(response)
```
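A common use of the returned vectors is comparing texts by cosine similarity. A self-contained sketch (the toy vectors stand in for the embeddings in `response.data`, which have 1024 dimensions for this model):

```python
import math

# Toy 3-dimensional stand-ins for two returned embedding vectors.
v1 = [0.1, 0.2, 0.3]
v2 = [0.2, 0.1, 0.3]

def cosine(a, b):
    # Cosine similarity: dot product divided by the product of norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

print(round(cosine(v1, v2), 4))
```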

## Usage with LiteLLM Proxy

### 1. Set GigaChat Models on config.yaml

```yaml
model_list:
  - model_name: gigachat
    litellm_params:
      model: gigachat/GigaChat-2-Max
      api_key: "os.environ/GIGACHAT_CREDENTIALS"
      ssl_verify: false
  - model_name: gigachat-lite
    litellm_params:
      model: gigachat/GigaChat-2-Lite
      api_key: "os.environ/GIGACHAT_CREDENTIALS"
      ssl_verify: false
  - model_name: gigachat-embeddings
    litellm_params:
      model: gigachat/Embeddings
      api_key: "os.environ/GIGACHAT_CREDENTIALS"
      ssl_verify: false
```

### 2. Start Proxy

```shell
litellm --config config.yaml
```

### 3. Test it

```shell
curl --location 'http://0.0.0.0:4000/chat/completions' \
--header 'Content-Type: application/json' \
--data '{
    "model": "gigachat",
    "messages": [
        {
            "role": "user",
            "content": "Hello!"
        }
    ]
}'
```

## Supported Models

### Chat Models

| Model Name | Context Window | Vision | Description |
|------------|----------------|--------|-------------|
| gigachat/GigaChat-2-Lite | 128K | No | Fast, lightweight model |
| gigachat/GigaChat-2-Pro | 128K | Yes | Professional model with vision |
| gigachat/GigaChat-2-Max | 128K | Yes | Maximum capability model |

### Embedding Models

| Model Name | Max Input | Dimensions | Description |
|------------|-----------|------------|-------------|
| gigachat/Embeddings | 512 | 1024 | Standard embeddings |
| gigachat/Embeddings-2 | 512 | 1024 | Updated embeddings |
| gigachat/EmbeddingsGigaR | 4096 | 2560 | High-dimensional embeddings |
:::note

Available models may vary depending on your API access level (personal or business).

:::

## Limitations

- Only one function call per request (a GigaChat API limitation)
- Maximum of 1 image per message and 10 images total per conversation
- The GigaChat API uses self-signed SSL certificates, so `ssl_verify=False` is required