Configuration for Azure OpenAI deployments.
Although Azure exposes an OpenAI-compatible API, it uses a different URL structure (per-deployment endpoint) and requires an apiVersion query parameter. org.llm4s.llmconnect.LLMConnect constructs an org.llm4s.llmconnect.provider.OpenAIClient internally; this config carries the Azure-specific fields that OpenAIConfig does not have.
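The URL difference can be sketched as follows. This is an illustrative helper, not the library's internals; the endpoint and api-version values are the example ones from this page:

```scala
// Illustrative sketch of how an Azure OpenAI request URL differs from the
// flat OpenAI one. Helper names are hypothetical, not org.llm4s internals.
object AzureUrlSketch {
  // Azure: per-deployment endpoint plus a required api-version query parameter.
  def azureChatUrl(endpoint: String, apiVersion: String): String =
    s"$endpoint/chat/completions?api-version=$apiVersion"

  // OpenAI proper: a single base URL; the model is selected in the request body.
  def openAIChatUrl(baseUrl: String = "https://api.openai.com/v1"): String =
    s"$baseUrl/chat/completions"

  def main(args: Array[String]): Unit = {
    println(azureChatUrl(
      "https://my-resource.openai.azure.com/openai/deployments/my-deploy",
      "2025-01-01-preview"
    ))
    println(openAIChatUrl())
  }
}
```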
Prefer AzureConfig.fromValues over the primary constructor; it resolves contextWindow and reserveCompletion automatically.
Value parameters
- apiKey: Azure API key; redacted in toString.
- apiVersion: Azure OpenAI API version string, e.g. "2025-01-01-preview".
- contextWindow: Model's total token capacity (prompt + completion combined).
- endpoint: Azure OpenAI deployment endpoint URL, e.g. "https://my-resource.openai.azure.com/openai/deployments/my-deploy".
- model: Deployment name used as the model identifier.
- reserveCompletion: Tokens held back from prompt history for the completion.
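A minimal construction sketch using the recommended factory. The parameter names and order here are assumptions inferred from the field list above, and the import path is assumed; check the companion object for the exact signature in your version of the library:

```scala
// Hedged sketch: assumed package path and parameter names, not a verified signature.
import org.llm4s.llmconnect.config.AzureConfig

val cfg = AzureConfig.fromValues(
  apiKey     = sys.env("AZURE_OPENAI_API_KEY"),
  apiVersion = "2025-01-01-preview",
  endpoint   = "https://my-resource.openai.azure.com/openai/deployments/my-deploy",
  model      = "my-deploy" // the deployment name doubles as the model identifier
)
// contextWindow and reserveCompletion are resolved automatically by fromValues,
// so they are not passed here.
```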
Attributes
- Companion: object
- Supertypes: trait Serializable, trait Product, trait Equals, trait ProviderConfig, class Object, trait Matchable, class Any