CallOptions

Options for configuring an LLM call, including streaming and context information.

Properties:

- call (optional): Provides context for the LLM call, identifying which phase of agent execution is making the request. This determines the tokenType prefix in StreamEvents.
- provider: Carries the specific target provider and configuration for this call.
- session (optional): Optional session ID.
- step (optional): Step context for the execution phase, passed to StreamEvent for step identification.
- stream (optional): Requests a streaming response from the LLM provider. Adapters MUST check this flag.
- thread (required): The mandatory thread ID, used by the ReasoningEngine to fetch thread-specific configuration (e.g., model, params) via StateManager.
- trace (optional): Optional trace ID for correlation.
- user (optional): Optional user ID.
- Additional key-value pairs representing provider-specific parameters (e.g., temperature, max_tokens, top_p). These often override defaults set in ThreadConfig.
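
The options described above can be sketched as a TypeScript interface. This is an illustrative reconstruction, not the library's actual declaration: the property names `provider` and `thread`, all types, and the `isStreaming` helper are assumptions inferred from the descriptions.

```typescript
// Hypothetical sketch of the CallOptions shape described above.
// Names and types are assumptions; the real declaration may differ.
interface CallOptions {
  /** Mandatory thread ID; used to fetch thread-specific config via StateManager. */
  thread: string;
  /** Identifies which phase of agent execution is making the request. */
  call?: string;
  /** Target provider and configuration for this call (shape assumed). */
  provider?: unknown;
  /** Optional session ID. */
  session?: string;
  /** Step context, passed to StreamEvent for step identification. */
  step?: string;
  /** Request a streaming response; adapters must check this flag. */
  stream?: boolean;
  /** Optional trace ID for correlation. */
  trace?: string;
  /** Optional user ID. */
  user?: string;
  /** Provider-specific parameters, e.g. temperature, max_tokens, top_p. */
  [param: string]: unknown;
}

// Illustrative helper an adapter might use to honor the stream flag.
function isStreaming(options: CallOptions): boolean {
  return options.stream === true;
}

const opts: CallOptions = {
  thread: "thread-123",
  stream: true,
  temperature: 0.2, // provider-specific override of ThreadConfig defaults
};
```

Note that the index signature is what allows extra provider-specific keys such as `temperature` to pass the type check alongside the declared properties.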