org.llm4s.llmconnect.config.OllamaConfig
See the OllamaConfig companion object
case class OllamaConfig(model: String, baseUrl: String, contextWindow: Int, reserveCompletion: Int) extends ProviderConfig
Configuration for a locally-running Ollama instance.
Ollama requires no API key — authentication is handled at the network level by controlling access to the Ollama endpoint. Prefer OllamaConfig.fromValues over the primary constructor.
Value parameters
- baseUrl: Ollama server URL, e.g. "http://localhost:11434".
- contextWindow: Model's total token capacity (prompt + completion combined).
- model: Model identifier as registered in Ollama, e.g. "llama3".
- reserveCompletion: Tokens held back from prompt history for the completion.
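A minimal sketch of how these parameters fit together. The case-class signature above defines the real type; the stand-in below mirrors it only so the snippet is self-contained (in a real project you would import `org.llm4s.llmconnect.config.OllamaConfig`, and per the note above prefer `OllamaConfig.fromValues`, whose signature is not shown here). The numeric values are illustrative assumptions, not recommended defaults.

```scala
// Stand-in mirroring the documented signature; use the real
// org.llm4s.llmconnect.config.OllamaConfig in an actual project.
case class OllamaConfig(model: String, baseUrl: String, contextWindow: Int, reserveCompletion: Int)

val cfg = OllamaConfig(
  model = "llama3",                   // model identifier as registered in Ollama
  baseUrl = "http://localhost:11434", // local Ollama endpoint; no API key needed
  contextWindow = 8192,               // total token capacity (prompt + completion)
  reserveCompletion = 1024            // tokens held back for the completion
)

// Tokens left for prompt history once the completion reserve is held back.
val promptBudget = cfg.contextWindow - cfg.reserveCompletion // 7168
```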
Attributes
- Companion: object
- Supertypes: trait Serializable, trait Product, trait Equals, trait ProviderConfig, class Object, trait Matchable, class Any