Conversation
The Conversation class is the heart of the @obayd/agentic framework. It orchestrates the entire interaction between the user, the LLM, and any defined tools.
Initialization
You create a Conversation instance by providing an llmCallback function and optional configuration.
llmCallback (Required): An async function* (async generator function) that handles communication with your LLM. See LLM Integration for details.
options (Optional): An object for configuration.
initialEnabledToolpacks: An array of strings specifying the names of Toolpacks to be enabled from the start of the conversation.
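The exact llmCallback contract is covered in LLM Integration; as a minimal sketch (the `messages` parameter shape and the commented-out constructor call below are assumptions for illustration, not the library's confirmed signature), the callback is an async generator that yields chunks of the assistant's reply:

```javascript
// Sketch of an llmCallback: an async generator function that streams the
// assistant's reply. The `messages` parameter shape is an assumption here;
// see LLM Integration for the real contract.
async function* llmCallback(messages) {
  // A real implementation would call your LLM provider's streaming API.
  // This stub just yields a canned reply in two chunks.
  yield "Hello, ";
  yield "world!";
}

// The conversation would then be constructed roughly like this
// ("web" is a hypothetical toolpack name):
//
//   import { Conversation } from "@obayd/agentic";
//   const conversation = new Conversation(llmCallback, {
//     initialEnabledToolpacks: ["web"],
//   });

// Confirming the stub streams its chunks in order:
async function collect(gen) {
  let text = "";
  for await (const chunk of gen) text += chunk;
  return text;
}
collect(llmCallback([{ role: "user", content: "Hi" }])).then((reply) => {
  console.log(reply); // prints "Hello, world!"
});
```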
Defining Content (.content())
Before starting the conversation, you define the system prompt and available tools using the .content() method. This method accepts an array containing various types of items:
The order of items in the array generally determines their order in the final system prompt sent to the LLM.
Tool and Toolpack definitions are automatically formatted into instructions for the LLM.
If any Toolpack instances are included, the internal enable_toolpack tool is automatically added.
Using functions allows for dynamic adjustments to the system prompt or available tools based on runtime conditions or arguments passed to send(). See Dynamic Content.
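To illustrate the idea, here is a hedged sketch of a content array mixing a static string with a dynamic function. The `isAdmin` argument and the `resolveItems` helper are hypothetical, used only to show how a dynamic item might resolve; they are not framework API:

```javascript
// Sketch of a .content() array: a static string plus a dynamic function.
// The function would be re-evaluated at send() time (see Dynamic Content).
// `isAdmin` is a hypothetical argument forwarded from send().
const contentItems = [
  "You are a helpful assistant.",
  (isAdmin) => (isAdmin ? "The user is an administrator." : null),
];

// conversation.content(contentItems); // with a real Conversation instance

// Hypothetical helper showing how dynamic items could resolve for
// different runtime arguments (not part of the framework):
function resolveItems(items, ...args) {
  return items
    .map((item) => (typeof item === "function" ? item(...args) : item))
    .filter((item) => item != null);
}

console.log(resolveItems(contentItems, true));
// ["You are a helpful assistant.", "The user is an administrator."]
console.log(resolveItems(contentItems, false));
// ["You are a helpful assistant."]
```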
Sending Messages (.send(...))
The .send() method is used to send a user message and initiate the LLM response cycle. It returns an async generator that yields events as the conversation progresses.
messageContent: Can be a string, an array of content parts (for multimodal input), or a pre-formatted Message object ({ role: 'user', content: '...' }).
...args: Optional additional arguments passed down to dynamic content functions and tool action handlers. See Passing Arguments.
Return Value: An AsyncGenerator. You must iterate through this generator (e.g., using for await...of) to drive the conversation forward and receive results. See Streaming & Events for details on event types.
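Driving the generator might look like the following sketch. The `mockSend` generator stands in for conversation.send(), and the event shape (`{ type: "text", content: ... }`) is an assumption for illustration; see Streaming & Events for the real event types:

```javascript
// Mock stand-in for conversation.send(): an async generator yielding
// events as the turn progresses. Event shapes here are illustrative
// assumptions, not the framework's documented types.
async function* mockSend(messageContent) {
  yield { type: "text", content: "Thinking about: " + messageContent };
  yield { type: "text", content: " ...done." };
}

// You must iterate the generator (e.g., with for await...of) to drive
// the conversation forward and collect results:
async function run() {
  let transcript = "";
  for await (const event of mockSend("What is 2 + 2?")) {
    if (event.type === "text") transcript += event.content;
  }
  return transcript;
}

run().then((t) => console.log(t));
```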
Message History (.messages)
The conversation.messages property holds an array of all messages exchanged during the conversation, including user inputs, assistant responses, and tool calls/results.
Messages follow the Message interface structure.
tool messages contain details about the call (callId, name, params, raw) and the outcome (result, normalized content, optional error).
You can inspect this history, save it, or use it to restore a conversation later.
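Because the history is plain data, persisting it can be as simple as a JSON round-trip. This is a sketch: the sample messages below are illustrative, and the exact mechanism for restoring history into a new Conversation instance is an assumption to verify against the framework docs:

```javascript
// Example history following the Message interface shape described above
// (sample content is illustrative).
const messages = [
  { role: "user", content: "Hi" },
  { role: "assistant", content: "Hello! How can I help?" },
];

// Persist the history (to disk, a database, etc.):
const saved = JSON.stringify(messages);

// Later, restore it for inspection or to resume a conversation:
const restored = JSON.parse(saved);
console.log(restored.length); // 2
```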
Enabled Toolpacks (.enabledToolpacks)
The conversation.enabledToolpacks property is a Set containing the names of toolpacks currently active for the next LLM turn.
This set is automatically updated when the enable_toolpack tool successfully executes. You can modify it directly between send() calls, but changes only affect the subsequent call to send().
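Since it is a standard JavaScript Set, modifying it between turns uses the usual Set methods. A small sketch (the toolpack names are hypothetical, and `enabledToolpacks` here is a stand-in for the real conversation property):

```javascript
// Stand-in for conversation.enabledToolpacks: a standard Set of
// toolpack names ("search" and "calculator" are hypothetical).
const enabledToolpacks = new Set(["search"]);

// Between send() calls you can enable or disable toolpacks directly;
// the change only takes effect on the next call to send().
enabledToolpacks.add("calculator");
enabledToolpacks.delete("search");

console.log([...enabledToolpacks]); // ["calculator"]
```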