Overview
By default, every Lua agent uses Google Gemini 2.5 Flash. The `model` property on `LuaAgent` lets you choose a different model or select one dynamically based on the request context.
Lua manages the API credentials. You don't need to configure any API keys or provider accounts; Lua handles all LLM infrastructure on your behalf. Support for user-provided API keys (Bring Your Own Key) is coming in a future release.
Available Models
| Model string | Provider | Context window |
|---|---|---|
| `google/gemini-2.5-flash` | Google (Vertex AI) | 1M tokens (default) |
| `google/gemini-2.5-pro` | Google (Vertex AI) | 1M tokens |
| `google/gemini-2.0-flash` | Google (Vertex AI) | 1M tokens |
| `openai/gpt-4o` | OpenAI | 128K tokens |
| `openai/gpt-4o-mini` | OpenAI | 128K tokens |
| `openai/gpt-4.1` | OpenAI | 1M tokens |
| `openai/gpt-4.1-mini` | OpenAI | 1M tokens |
| `anthropic/claude-3.5-sonnet` | Anthropic | 200K tokens |
| `anthropic/claude-3.7-sonnet` | Anthropic | 200K tokens |
Static Model
The simplest form: one model for all requests.

Dynamic Model Resolver
Use a function to select the model per request. The resolver receives the full request with access to all platform APIs: User, Baskets, Products, Data, and more. It returns a `'provider/model'` string, either synchronously or asynchronously.
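A sketch of both forms follows. The `LuaAgent` constructor usage and the request shape here are assumptions for illustration; only the `'provider/model'` string format comes from the table above.

```typescript
// Hypothetical request shape: the real platform request object
// exposes more (User, Baskets, Products, Data, ...).
interface AgentRequest {
  user?: { tier?: string };
}

// Static form: one fixed model string for every request.
const staticConfig = { model: 'openai/gpt-4o' };

// Dynamic form: a resolver that picks a model per request.
// It may also return a Promise<string> for async lookups.
function resolveModel(req: AgentRequest): string {
  return req.user?.tier === 'premium'
    ? 'google/gemini-2.5-pro'
    : 'google/gemini-2.5-flash';
}

const dynamicConfig = { model: resolveModel };
```

Either object would then be passed when constructing the agent (e.g. `new LuaAgent(staticConfig)`), assuming that is how agents are created.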
Common Patterns
Premium vs free users
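A sketch of this pattern; the `user.tier` field is an assumption about the User API, not a documented property.

```typescript
interface TierRequest {
  user?: { tier?: 'free' | 'premium' };
}

// Premium users get the stronger (and costlier) model;
// free users stay on the fast default.
function modelForTier(req: TierRequest): string {
  return req.user?.tier === 'premium'
    ? 'anthropic/claude-3.7-sonnet'
    : 'google/gemini-2.5-flash';
}
```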
Channel-based selection
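A sketch of channel routing; the channel names below are illustrative, not the platform's actual channel identifiers.

```typescript
// Pick a model based on where the request came from.
function modelForChannel(channel: string): string {
  switch (channel) {
    case 'voice':
      // Latency-sensitive: prefer the fastest model.
      return 'google/gemini-2.5-flash';
    case 'email':
      // Long-form replies: a larger model is affordable here.
      return 'openai/gpt-4.1';
    default:
      return 'google/gemini-2.5-flash';
  }
}
```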
Content-based routing
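A sketch of routing on the message itself; the keyword list and length threshold are illustrative only.

```typescript
// Route by a rough complexity signal in the incoming message:
// sensitive topics or very long inputs go to the stronger model.
function modelForContent(text: string): string {
  const complex = /refund|legal|contract/i.test(text) || text.length > 2000;
  return complex ? 'google/gemini-2.5-pro' : 'google/gemini-2.5-flash';
}
```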
Environment-based
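A sketch of environment switching; reading `NODE_ENV` is a Node.js convention and an assumption here, since your deployment may expose environment differently.

```typescript
// Use the stronger model in production and a cheaper one
// everywhere else (development, staging, tests).
function modelForEnv(env: string | undefined): string {
  return env === 'production'
    ? 'google/gemini-2.5-pro'
    : 'google/gemini-2.0-flash';
}
```

In practice you would call this once at startup, e.g. `modelForEnv(process.env.NODE_ENV)`.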
Default Model
If you don't set `model`, your agent uses `google/gemini-2.5-flash`. This is a fast, capable model with a 1M token context window, suitable for most use cases.

