ToolRegistry

org.llm4s.toolapi.ToolRegistry
See the ToolRegistry companion object
class ToolRegistry(initialTools: Seq[ToolFunction[_, _]])

Registry for tool functions with execution capabilities.

Acts as the single source of truth for tools available to an agent. Supports synchronous, asynchronous, and batched execution with configurable concurrency strategies (see ToolExecutionStrategy):

  • execute() — synchronous, blocking execution
  • executeAsync() — asynchronous, non-blocking execution
  • executeAll() — batch execution with a configurable ToolExecutionStrategy

Create a registry by passing an initial set of ToolFunction instances:

val registry = new ToolRegistry(Seq(myTool, anotherTool))
// or use the convenience factories:
ToolRegistry.empty
BuiltinTools.coreSafe.map(new ToolRegistry(_))

Value parameters

initialTools

The tools available in this registry

Attributes

Companion
object
Supertypes
class Object
trait Matchable
class Any
Members list

Value members

Concrete methods

def addToAzureOptions(chatOptions: ChatCompletionsOptions): ChatCompletionsOptions

Adds the tools from this registry to an Azure OpenAI ChatCompletionsOptions.

Value parameters

chatOptions

The chat options to add the tools to

Attributes

Returns

The updated chat options
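
A minimal sketch of attaching a registry's tools to an Azure request. It assumes a populated `registry` and a `messages` value already in scope, and that the Azure SDK's ChatCompletionsOptions is on the classpath; everything besides the addToAzureOptions call itself is illustrative.

```scala
import com.azure.ai.openai.models.ChatCompletionsOptions

// `registry` and `messages` are assumed to exist in the surrounding code.
val baseOptions = new ChatCompletionsOptions(messages)

// Mutates/returns the options with this registry's tool definitions attached.
val optionsWithTools: ChatCompletionsOptions =
  registry.addToAzureOptions(baseOptions)
```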

def execute(request: ToolCallRequest): Either[ToolCallError, Value]

Executes a tool call synchronously, wrapping any thrown exception.

We use Try here (rather than org.llm4s.core.safety.Safety.safely) so that tool execution remains independent of the Safety API and returns Either for direct use by retry/timeout logic. Safety is still used elsewhere in the codebase (e.g. tracing, agent entry points).

Exceptions thrown inside the tool implementation are caught and converted to ToolCallError.ExecutionError with the original throwable preserved (so retry logic can treat e.g. IOException as retryable). Callers always receive a typed Either and never need to guard against unexpected exceptions from tool code.

Tool-returned Left values are propagated unchanged, so callers may receive any ToolCallError subtype that the tool itself produces (not only ExecutionError).

Value parameters

request

The tool name and pre-parsed JSON arguments.

Attributes

Returns

Right(result) on success; Left(ToolCallError.UnknownFunction) when no tool with the given name is registered; Left(ToolCallError.ExecutionError) when the tool throws an exception; or Left(error) with the tool's own ToolCallError when the tool returns a Left directly.
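
A sketch of the typical call site. The ToolCallRequest constructor shown here (tool name plus pre-parsed ujson arguments) is an assumption based on the parameter description; check the actual case class for the exact shape.

```scala
// Hypothetical request: tool name + pre-parsed JSON arguments.
val request = ToolCallRequest("get_weather", ujson.Obj("city" -> "Paris"))

// Errors arrive as typed Left values; no try/catch is needed around tool code.
registry.execute(request) match {
  case Right(value) => println(s"Tool result: $value")
  case Left(error)  => println(s"Tool call failed: $error")
}
```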

def execute(request: ToolCallRequest, config: ToolExecutionConfig)(implicit ec: ExecutionContext): Either[ToolCallError, Value]

Executes a tool call with optional per-tool timeout and retry.

When config is default (no timeout, no retry), behavior is identical to execute.

Value parameters

config

Optional timeout and retry policy

ec

ExecutionContext required when timeout or retry is used

request

The tool call request

Attributes
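
A sketch of execution with a per-call timeout and retry. The ToolExecutionConfig field names below (timeout, maxRetries) are assumptions for illustration; consult the actual case class. An implicit ExecutionContext must be in scope because timeout/retry run asynchronously under the hood.

```scala
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

// Field names are hypothetical; the real config may differ.
val config = ToolExecutionConfig(timeout = Some(5.seconds), maxRetries = 2)

val result: Either[ToolCallError, ujson.Value] =
  registry.execute(ToolCallRequest("get_weather", ujson.Obj("city" -> "Paris")), config)
```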

def executeAll(requests: Seq[ToolCallRequest], strategy: ToolExecutionStrategy, config: ToolExecutionConfig)(implicit ec: ExecutionContext): Future[Seq[Either[ToolCallError, Value]]]

Execute multiple tool calls with a configurable strategy.

Value parameters

ec

ExecutionContext for async execution

requests

The tool call requests to execute

strategy

Execution strategy (Sequential, Parallel, or ParallelWithLimit)

Attributes

Returns

Future containing results in the same order as requests
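
A sketch of batched execution with bounded parallelism. The ParallelWithLimit(4) constructor and ToolExecutionConfig.default are assumptions about the companion API; the ordering guarantee (results match request order) comes from the Returns description above.

```scala
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.Future

val requests = Seq(
  ToolCallRequest("get_weather", ujson.Obj("city" -> "Paris")),
  ToolCallRequest("get_weather", ujson.Obj("city" -> "Tokyo"))
)

// At most 4 tool calls in flight at once; results preserve request order.
val futureResults: Future[Seq[Either[ToolCallError, ujson.Value]]] =
  registry.executeAll(requests, ToolExecutionStrategy.ParallelWithLimit(4), ToolExecutionConfig.default)
```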

def executeAsync(request: ToolCallRequest)(implicit ec: ExecutionContext): Future[Either[ToolCallError, Value]]

Execute a tool call asynchronously.

Wraps synchronous execution in a Future for non-blocking operation. NOTE: tool execution typically involves blocking I/O, so the call is wrapped in scala.concurrent.blocking to hint the ExecutionContext to expand its pool if necessary.

Value parameters

ec

ExecutionContext for async execution

request

The tool call request

Attributes

Returns

Future containing the result
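
A sketch of a non-blocking call. Note that the Future succeeds with an Either rather than failing: tool errors stay in the value channel, so Future-level recover is only needed for infrastructure failures.

```scala
import scala.concurrent.ExecutionContext.Implicits.global

registry
  .executeAsync(ToolCallRequest("get_weather", ujson.Obj("city" -> "Paris")))
  .foreach {
    case Right(value) => println(s"Tool result: $value")
    case Left(error)  => println(s"Tool call failed: $error")
  }
```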

def executeAsync(request: ToolCallRequest, config: ToolExecutionConfig)(implicit ec: ExecutionContext): Future[Either[ToolCallError, Value]]

Execute a tool call asynchronously with optional timeout and retry.

Value parameters

config

Optional timeout and retry policy

ec

ExecutionContext for async execution

request

The tool call request

Attributes

def getOpenAITools(strict: Boolean): Arr

Generate OpenAI tool definitions for all tools.

Value parameters

strict

When true (default), all object properties are treated as required.

Attributes

Returns

A ujson.Arr containing one tool-definition object per registered tool
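
A sketch of embedding the generated definitions in an OpenAI-style request body. The surrounding body fields ("model", "tools") follow the OpenAI chat-completions wire format; only the getOpenAITools call is from this API.

```scala
// strict = true marks every object property as required, which strict
// function-calling modes expect.
val toolDefs: ujson.Arr = registry.getOpenAITools(strict = true)

val requestBody = ujson.Obj(
  "model" -> "gpt-4o",
  "tools" -> toolDefs
)
```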

def getTool(name: String): Option[ToolFunction[_, _]]

Get a specific tool by name; returns None when no tool with that name is registered.

Attributes
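
A sketch of looking a tool up before executing it, e.g. to confirm it is registered:

```scala
registry.getTool("get_weather") match {
  case Some(tool) => println(s"Found tool: $tool")
  case None       => println("Tool not registered")
}
```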

def getToolDefinitionsSafe(provider: String): Result[Value]

Generate tool definitions in the format expected by a specific LLM provider.

Currently all supported providers (openai, anthropic, gemini) use the same OpenAI-compatible format.

Value parameters

provider

Provider name (case-insensitive): "openai", "anthropic", "gemini"

Attributes

Returns

Right(tools) for supported providers, Left(ValidationError) for unsupported ones
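
A sketch of the provider-keyed lookup with typed error handling instead of exceptions (compare the deprecated getToolDefinitions below):

```scala
// Provider names are case-insensitive per the docs above.
registry.getToolDefinitionsSafe("anthropic") match {
  case Right(tools) => println(s"Tool definitions: $tools")
  case Left(err)    => println(s"Unsupported provider: $err")
}
```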

def tools: Seq[ToolFunction[_, _]]

All tools registered in this registry.

Attributes

Deprecated methods

def getToolDefinitions(provider: String): Value

Generate a specific format of tool definitions for a particular LLM provider.

Value parameters

provider

Provider name (case-insensitive): "openai", "anthropic", "gemini"

Attributes

Throws
java.lang.IllegalArgumentException

for unsupported provider names

Deprecated
[Since version 0.2.9] Use getToolDefinitionsSafe() which returns Result[ujson.Value] for safe error handling