ConfigKeys
Canonical environment-variable names recognised by Llm4sConfig.
All LLM4S configuration is read from environment variables (or JVM system properties of the same name). This object centralises the names to avoid typos and keep references grep-friendly.
== Quick reference ==
- LLM_MODEL - required; format provider/model (e.g. "openai/gpt-4o").
- Provider API keys - required for cloud providers; see per-section comments.
- TRACING_MODE - optional; langfuse, opentelemetry, console, or none.
- EMBEDDING_MODEL - required when using embeddings; format provider/model.
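The lookup rule above (environment variable, or a JVM system property of the same name) can be sketched as follows. The fallback order shown is an assumption; check Llm4sConfig for the order it actually uses.

```scala
// Sketch of the documented lookup: resolve a key from the environment,
// falling back to a JVM system property with the same name.
def readKey(name: String): Option[String] =
  sys.env.get(name).orElse(sys.props.get(name))

// e.g. running with -DLLM_MODEL=openai/gpt-4o makes this return Some(...)
val model: Option[String] = readKey("LLM_MODEL")
```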
Supertypes: class Object, trait Matchable, class Any
Self type: ConfigKeys.type

== Concrete fields ==
Anthropic API key (sk-ant-...).
Overrides the Anthropic API base URL. Defaults to "https://api.anthropic.com".
Azure OpenAI deployment endpoint URL, e.g. "https://my-resource.openai.azure.com/...".
Azure API key for the deployment.
Azure OpenAI API version string, e.g. "2025-01-01-preview".
Brave Search API key. Required when using the Brave web-search tool.
Enables or disables document chunking (true/false). Default: true.
Token overlap between consecutive chunks. Default: 100.
Token count per chunk when splitting documents for embedding. Default: 1000.
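The chunk-size and overlap parameters above can be illustrated with a small windowing sketch. The defaults (1000 and 100) match the documented fallbacks; the function itself is an illustration of the overlap semantics, not LLM4S's actual chunker.

```scala
// Split a token sequence into windows of `size` tokens, each
// overlapping the previous window by `overlap` tokens.
def chunk(tokens: Vector[String], size: Int = 1000, overlap: Int = 100): Vector[Vector[String]] = {
  require(overlap < size, "overlap must be smaller than the chunk size")
  val step = size - overlap // how far each window advances
  tokens.sliding(size, step).toVector
}
```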
DeepSeek API key.
Overrides the DeepSeek API base URL. Defaults to "https://api.deepseek.com".
Path to the file or directory to embed.
Unified embedding provider and model selector.
Format: provider/model, e.g. "openai/text-embedding-3-small", "voyage/voyage-3", "ollama/nomic-embed-text". Takes precedence over the legacy EMBEDDING_PROVIDER variable.
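The provider/model convention shared by EMBEDDING_MODEL and LLM_MODEL splits on the first slash only, so model ids that themselves contain slashes survive intact. This is a sketch of that convention; the real parsing lives in Llm4sConfig.

```scala
// Split "provider/model" on the first '/' only, rejecting values
// with a missing provider or model part.
def parseModelKey(value: String): Option[(String, String)] =
  value.split("/", 2) match {
    case Array(provider, model) if provider.nonEmpty && model.nonEmpty =>
      Some((provider, model))
    case _ => None
  }
```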
Legacy embedding provider selector; superseded by EMBEDDING_MODEL.
Query string used when searching an embedding index.
Optional Langfuse environment tag (e.g. "production", "staging").
Langfuse public key (pk-lf-...).
Optional Langfuse release version tag.
Langfuse secret key (sk-lf-...).
Langfuse server URL. Defaults to "https://cloud.langfuse.com" when not set.
Optional Langfuse SDK version override.
Selects the LLM provider and model. Format: provider/model, e.g. "openai/gpt-4o".
Ollama server URL. Defaults to "http://localhost:11434" when not set.
Overrides the Ollama embedding base URL independently of OLLAMA_BASE_URL.
Selects the Ollama embedding model when using the legacy provider format.
OpenAI API key (sk-...). Also used for embeddings when no separate embedding key is set.
Overrides the OpenAI API base URL.
Defaults to "https://api.openai.com/v1". Set to an OpenRouter URL (containing "openrouter.ai") to route through OpenRouter without a separate config type — the same org.llm4s.llmconnect.config.OpenAIConfig is reused and the client detects OpenRouter from this URL.
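The OpenRouter detection described above can be sketched as a simple substring check on the configured base URL. This mirrors the description ("containing \"openrouter.ai\""), not the actual client code.

```scala
// Treat the configured base URL as OpenRouter when it contains
// "openrouter.ai", as the documentation above describes.
def isOpenRouter(baseUrl: String): Boolean =
  baseUrl.toLowerCase.contains("openrouter.ai")
```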
Overrides the base URL used for OpenAI embedding requests.
Useful when routing embeddings through a proxy or compatible endpoint independently of the LLM base URL.
Selects the OpenAI embedding model when using the legacy provider format.
Optional OpenAI organisation ID forwarded in the OpenAI-Organization header.
OpenRouter base URL alias.
OpenRouter uses the same OPENAI_BASE_URL variable — there is no separate OPENROUTER_BASE_URL. Set OPENAI_BASE_URL to your OpenRouter endpoint and LLM_MODEL to openrouter/model-name.
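Since configuration also reads JVM system properties, one way to apply the OpenRouter setup above from inside the JVM is to set both properties before constructing a client. The endpoint URL shown is OpenRouter's commonly used API base; verify it against OpenRouter's documentation, and "model-name" is a placeholder as in the text above.

```scala
// Route the OpenAI-compatible client through OpenRouter via
// system properties (same names as the environment variables).
sys.props("OPENAI_BASE_URL") = "https://openrouter.ai/api/v1"
sys.props("LLM_MODEL") = "openrouter/model-name"
```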
Voyage AI API key (pa-...).
Overrides the Voyage AI embedding base URL.
Selects the Voyage AI embedding model when using the legacy provider format.