org.llm4s.llmconnect.provider

Members list

Type members

Classlikes

Attributes

Companion
object
Supertypes
trait LLMClient
class Object
trait Matchable
class Any

Attributes

Companion
class
Supertypes
class Object
trait Matchable
class Any
Self type

DeepSeek LLM client implementation using the OpenAI-compatible API.

Provides access to DeepSeek models including DeepSeek-Chat (V3) with 64K context and DeepSeek-Reasoner (R1) with 128K context for advanced reasoning tasks.

Uses the same request/response format as OpenAI, making it compatible with standard OpenAI tooling and client code patterns.
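Because the class signature is not shown on this page, the following construction sketch is illustrative only: the names `DeepSeekClient`, `DeepSeekConfig`, its field names, the `DEEPSEEK_API_KEY` environment variable, and `MetricsCollector.noop` are all assumptions, modeled on the `(config, metrics)` constructor pattern that `GeminiClient` and `OllamaClient` use elsewhere in this package.

```scala
import org.llm4s.llmconnect.provider.DeepSeekClient // hypothetical import path

// Hypothetical config shape: the docs say it carries API key, model,
// base URL, and context settings.
val config = DeepSeekConfig(
  apiKey = sys.env("DEEPSEEK_API_KEY"), // assumed env var name
  model  = "deepseek-chat"              // 64K-context V3 model per the docs
)

// metrics defaults to a no-op collector in the sibling clients,
// so the same is assumed here.
val client: LLMClient = new DeepSeekClient(config, metrics = MetricsCollector.noop)
```

Since the client speaks the OpenAI wire format, any request built for `OpenAIClient` should work unchanged once pointed at the DeepSeek base URL.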

Value parameters

config

DeepSeek configuration containing API key, model, base URL, and context settings

metrics

MetricsCollector for recording request metrics

Attributes

Companion
object
Supertypes
trait LLMClient
class Object
trait Matchable
class Any

Attributes

Companion
class
Supertypes
class Object
trait Matchable
class Any
Self type

Text embedding provider interface for generating vector representations.

Provides a unified interface for different embedding services (OpenAI, VoyageAI, Ollama). Each implementation handles provider-specific API calls and response formats.

Text content is the primary input; multimedia content (images, audio) should be processed through the UniversalEncoder façade which handles content extraction before embedding.

== Usage Example ==

val provider: EmbeddingProvider = OpenAIEmbeddingProvider.fromConfig(config)
val request = EmbeddingRequest(
  input = Seq("Hello world", "How are you?"),
  model = EmbeddingModelName("text-embedding-3-small")
)
val result: Result[EmbeddingResponse] = provider.embed(request)

Attributes

See also

OpenAIEmbeddingProvider for OpenAI text-embedding models

VoyageAIEmbeddingProvider for VoyageAI embedding models

OllamaEmbeddingProvider for local Ollama embedding models

Supertypes
class Object
trait Matchable
class Any
class GeminiClient(config: GeminiConfig, val metrics: MetricsCollector) extends LLMClient, MetricsRecording

LLMClient implementation for Google Gemini models.

Provides access to Gemini 2.0, 1.5 Pro, 1.5 Flash, and other Gemini models via Google's Generative AI API.

== Supported Features ==

  • Chat completions
  • Streaming responses
  • Tool/function calling
  • Large context windows (up to 1M+ tokens)

== Configuration ==

export LLM_MODEL=gemini/gemini-2.0-flash
export GOOGLE_API_KEY=your-api-key

== API Format ==

Gemini uses a different message format than OpenAI:

  • Messages have role (user/model) and parts (array of content)
  • System instructions are sent separately
  • Tool calls use functionDeclarations format
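Given the constructor signature `GeminiClient(config: GeminiConfig, val metrics: MetricsCollector)` shown above, a minimal construction sketch follows. The `GeminiConfig` field names and `MetricsCollector.noop` are assumptions for illustration; see `org.llm4s.llmconnect.config.GeminiConfig` for the actual options.

```scala
import org.llm4s.llmconnect.provider.GeminiClient

// Field names below are assumed, not taken from GeminiConfig's source.
val config = GeminiConfig(
  apiKey  = sys.env("GOOGLE_API_KEY"),   // matches the env var documented above
  model   = "gemini-2.0-flash",          // matches LLM_MODEL=gemini/gemini-2.0-flash
  baseUrl = "https://generativelanguage.googleapis.com"
)

// metrics defaults to noop per the value-parameter docs below.
val client = new GeminiClient(config, metrics = MetricsCollector.noop)
```

The client handles the translation between llm4s messages and Gemini's role/parts format internally, so callers work with the same message types as for any other `LLMClient`.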

Value parameters

config

Gemini configuration with API key, model, and base URL

metrics

metrics collector for observability (default: noop)

Attributes

See also

org.llm4s.llmconnect.config.GeminiConfig for configuration options

Companion
object
Supertypes
trait LLMClient
class Object
trait Matchable
class Any
object GeminiClient

Attributes

Companion
class
Supertypes
class Object
trait Matchable
class Any
Self type
sealed trait LLMProvider

Enumeration of supported LLM providers.

Defines the available language model service providers that can be used with llm4s. Each provider has specific configuration requirements and API characteristics.

Attributes

See also

org.llm4s.llmconnect.config.ProviderConfig for provider-specific configuration

Companion
object
Supertypes
class Object
trait Matchable
class Any
Known subtypes
object Anthropic
object Azure
object DeepSeek
object Gemini
object Ollama
object OpenAI
object OpenRouter
object Zai
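Because `LLMProvider` is sealed and its subtypes are the case objects listed above, a pattern match over it can be checked for exhaustiveness by the compiler. A sketch, assuming the objects are nested in the `LLMProvider` companion (their exact location is not shown on this page):

```scala
import org.llm4s.llmconnect.provider.LLMProvider

// Exhaustive match: adding a new provider object to the sealed hierarchy
// makes the compiler flag every match like this one as incomplete.
def providerName(p: LLMProvider): String = p match {
  case LLMProvider.Anthropic  => "anthropic"
  case LLMProvider.Azure      => "azure"
  case LLMProvider.DeepSeek   => "deepseek"
  case LLMProvider.Gemini     => "gemini"
  case LLMProvider.Ollama     => "ollama"
  case LLMProvider.OpenAI     => "openai"
  case LLMProvider.OpenRouter => "openrouter"
  case LLMProvider.Zai        => "zai"
}
```

The `Sum` and `Mirror` supertypes on the companion (listed below) mean Scala 3 derivation machinery can also enumerate these cases generically.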
object LLMProvider

Companion object providing LLM provider instances and utilities.

Attributes

Companion
trait
Supertypes
trait Sum
trait Mirror
class Object
trait Matchable
class Any
Self type

Helper trait for recording metrics consistently across all provider clients.

Extracts the common pattern of timing requests, observing outcomes, recording tokens, and calculating costs.
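The trait's members are not shown on this page, so the sketch below only illustrates the timing/outcome pattern just described; the method names on `MetricsCollector` (`observeSuccess`, `observeFailure`) and the `Either`-shaped `Result` are assumptions, not the real API.

```scala
// Illustrative sketch of the pattern, not the actual MetricsRecording trait.
trait MetricsRecordingSketch {
  def metrics: MetricsCollector

  // Times an operation, observes its outcome, and passes the result through.
  protected def recordTimed[A](provider: String)(op: => Result[A]): Result[A] = {
    val start   = System.nanoTime()
    val result  = op
    val elapsed = (System.nanoTime() - start) / 1000000 // millis
    result match {
      case Right(_) => metrics.observeSuccess(provider, elapsed) // hypothetical
      case Left(_)  => metrics.observeFailure(provider, elapsed) // hypothetical
    }
    result
  }
}
```

Centralizing this in one trait is what lets every provider client report latency, token counts, and cost the same way without duplicating the bookkeeping.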

Attributes

Supertypes
class Object
trait Matchable
class Any
Known subtypes
class OllamaClient(config: OllamaConfig, val metrics: MetricsCollector) extends LLMClient, MetricsRecording

Attributes

Companion
object
Supertypes
trait LLMClient
class Object
trait Matchable
class Any
object OllamaClient

Attributes

Companion
class
Supertypes
class Object
trait Matchable
class Any
Self type

Attributes

Supertypes
class Object
trait Matchable
class Any
Self type

LLMClient implementation supporting both OpenAI and Azure OpenAI services.

Provides a unified interface for interacting with OpenAI's API and Azure's OpenAI service. Handles message conversion between llm4s format and OpenAI format, completion requests, streaming responses, and tool calling (function calling) capabilities.

Uses Azure's OpenAI client library internally, which supports both direct OpenAI and Azure-hosted OpenAI endpoints.

== Extended Thinking / Reasoning Support ==

For OpenAI o1/o3/o4 models with reasoning capabilities, use OpenRouterClient instead, which fully supports the reasoning_effort parameter. The Azure SDK used by this client does not yet expose the reasoning_effort API parameter.

For Anthropic Claude models with extended thinking, use AnthropicClient which has full support for the thinking parameter with budget_tokens.
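The routing advice above can be summarized in code. This is a hedged sketch: the prefix checks and the idea of returning a client name are purely illustrative, not part of the library's API.

```scala
// Illustrative only: which client the docs above recommend per model family.
def recommendedClientFor(model: String): String =
  if (model.startsWith("o1") || model.startsWith("o3") || model.startsWith("o4"))
    "OpenRouterClient" // exposes the reasoning_effort parameter
  else if (model.startsWith("claude"))
    "AnthropicClient"  // supports extended thinking with budget_tokens
  else
    "OpenAIClient"     // standard chat completions via the Azure SDK
```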

Value parameters

client

configured Azure OpenAI client instance

config

provider configuration containing context window and reserve completion settings

metrics

metrics collector for observability (default: noop)

model

the model identifier (e.g., "gpt-4", "gpt-3.5-turbo")

Attributes

Companion
object
Supertypes
trait LLMClient
class Object
trait Matchable
class Any
object OpenAIClient

Factory methods for creating OpenAIClient instances.

Provides safe construction of OpenAI clients with error handling via Result type.

Attributes

Companion
class
Supertypes
class Object
trait Matchable
class Any
Self type

OpenAI embedding provider implementation.

Provides text embeddings using OpenAI's embedding API (text-embedding-3-small, text-embedding-3-large, text-embedding-ada-002). Supports batch embedding of multiple texts in a single request.

== Supported Models ==

  • text-embedding-3-small - Efficient, lower cost (recommended)
  • text-embedding-3-large - Higher quality, higher cost
  • text-embedding-ada-002 - Legacy model

== Token Usage ==

The response includes token usage information when available from the API.

Attributes

See also

EmbeddingProvider for the provider interface

Supertypes
class Object
trait Matchable
class Any
Self type

Attributes

Companion
object
Supertypes
trait LLMClient
class Object
trait Matchable
class Any

Attributes

Companion
class
Supertypes
class Object
trait Matchable
class Any
Self type

Attributes

Supertypes
class Object
trait Matchable
class Any
Self type
class ZaiClient(config: ZaiConfig, val metrics: MetricsCollector) extends LLMClient, MetricsRecording

Attributes

Companion
object
Supertypes
trait LLMClient
class Object
trait Matchable
class Any
object ZaiClient

Attributes

Companion
class
Supertypes
class Object
trait Matchable
class Any
Self type
ZaiClient.type