StreamingResponseHandler

org.llm4s.llmconnect.streaming.StreamingResponseHandler
See the StreamingResponseHandler companion object

Handles streaming responses from LLM providers. Manages the streaming lifecycle, chunk accumulation, and error handling.
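A minimal usage sketch is shown below. The streamAll driver, the rawChunks iterator, the onChunk callback, and the model import path are illustrative names, not part of this API; the pattern match also assumes Result behaves like an Either with LLMError on the left.

```scala
import org.llm4s.llmconnect.streaming.StreamingResponseHandler
import org.llm4s.llmconnect.model.StreamedChunk // assumed import path for StreamedChunk

// Hypothetical driver: feeds raw provider payloads to a handler until the
// stream is complete, delegating failures back to the handler itself.
def streamAll(
    handler: StreamingResponseHandler,
    rawChunks: Iterator[String]
)(onChunk: StreamedChunk => Unit): Unit = {
  try {
    while (rawChunks.hasNext && !handler.isComplete) {
      handler.processChunk(rawChunks.next()) match {
        case Right(maybeChunk) => maybeChunk.foreach(onChunk) // Some when the payload carried content
        case Left(error)       => handler.handleError(error)  // let the handler record/react to the failure
      }
    }
  } finally {
    handler.cleanup() // always release resources, even on early termination
  }
}
```

The handler owns error handling and resource management, so a driver like this only routes chunks and guarantees cleanup() runs when the stream ends or is abandoned.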

Attributes

Companion
object StreamingResponseHandler
Supertypes
class Object
trait Matchable
class Any

Members list

Value members

Abstract methods

def cleanup(): Unit

Clean up resources

Attributes

Get the final completion after streaming is done

Attributes

def handleError(error: LLMError): Unit

Handle an error during streaming

Attributes

def isComplete: Boolean

Check if streaming is complete

Attributes

def processChunk(chunk: String): Result[Option[StreamedChunk]]

Process a streaming chunk

Attributes
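To make the accumulate-and-signal contract of processChunk concrete, here is a standalone sketch of the pattern. It deliberately does not extend the trait, and the empty-line keep-alive and "[DONE]" sentinel are assumptions modelled on SSE-style providers rather than behaviour documented here.

```scala
// Standalone sketch: accumulate text chunks and flag completion on a sentinel.
final class ChunkAccumulator {
  private val buffer   = new StringBuilder
  private var finished = false

  def isComplete: Boolean = finished

  // Returns Some(text) when the raw payload carried content,
  // None for keep-alives or the terminal sentinel.
  def processChunk(raw: String): Option[String] =
    raw.trim match {
      case ""       => None                  // keep-alive / empty line
      case "[DONE]" => finished = true; None // assumed terminal sentinel
      case text     => buffer.append(text); Some(text)
    }

  def accumulated: String = buffer.toString
}
```

In a real handler, Some(text) would correspond to returning Right(Some(chunk)) from processChunk, payloads with no content to Right(None), and parse failures to Left(error).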