OllamaConfig

org.llm4s.llmconnect.config.OllamaConfig
See the OllamaConfig companion object
case class OllamaConfig(model: String, baseUrl: String, contextWindow: Int, reserveCompletion: Int) extends ProviderConfig

Configuration for a locally-running Ollama instance.

Ollama requires no API key — authentication is handled at the network level by controlling access to the Ollama endpoint. Prefer OllamaConfig.fromValues over the primary constructor.

Value parameters

baseUrl

Ollama server URL, e.g. "http://localhost:11434".

contextWindow

Model's total token capacity (prompt + completion combined).

model

Model identifier as registered in Ollama, e.g. "llama3".

reserveCompletion

Tokens held back from prompt history for the completion.
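The relationship between these parameters can be sketched as follows. This is a minimal, self-contained stand-in, not the llm4s source: the exact `fromValues` signature and the default values shown are assumptions for illustration, and the real library may differ.

```scala
// Hypothetical sketch of OllamaConfig, mirroring the documented fields.
// Assumption: fromValues supplies common defaults (local endpoint,
// illustrative context sizes); the actual llm4s factory may differ.
final case class OllamaConfig(
  model: String,          // model identifier as registered in Ollama, e.g. "llama3"
  baseUrl: String,        // Ollama server URL, e.g. "http://localhost:11434"
  contextWindow: Int,     // total token capacity (prompt + completion combined)
  reserveCompletion: Int  // tokens held back from prompt history for the completion
)

object OllamaConfig {
  // fromValues-style factory with illustrative (assumed) defaults,
  // preferred over the primary constructor per the class docs.
  def fromValues(
    model: String,
    baseUrl: String = "http://localhost:11434",
    contextWindow: Int = 8192,
    reserveCompletion: Int = 1024
  ): OllamaConfig =
    OllamaConfig(model, baseUrl, contextWindow, reserveCompletion)
}

val cfg = OllamaConfig.fromValues("llama3")

// Tokens available for prompt history once the completion reserve is set aside.
val promptBudget = cfg.contextWindow - cfg.reserveCompletion
```

With the assumed defaults above, `promptBudget` is 8192 - 1024 = 7168 tokens: the prompt history must fit in this budget so the model can still emit up to `reserveCompletion` tokens of output.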

Attributes

Companion
object
Supertypes
trait Serializable
trait Product
trait Equals
class Object
trait Matchable
class Any

Members list

Value members

Inherited methods

def productElementNames: Iterator[String]

Attributes

Inherited from:
Product
def productIterator: Iterator[Any]

Attributes

Inherited from:
Product