# Conversation

The `Conversation` class is the heart of the `@obayd/agentic` framework. It orchestrates the entire interaction between the user, the LLM, and any defined tools.
## Initialization

You create a `Conversation` instance by providing an `llmCallback` function and optional configuration.
```js
import { Conversation } from '@obayd/agentic';

// Your LLM callback (async generator function)
async function* llmCallback(messages, options) {
  // ... fetch response from LLM and yield string chunks ...
}

// Basic initialization
const conversation = new Conversation(llmCallback);

// Initialization with options
const conversationWithOptions = new Conversation(llmCallback, {
  initialEnabledToolpacks: ['web_search_tools'] // Pre-enable a toolpack (less common)
});
```
- `llmCallback` (Required): An `async function*` (async generator function) that handles communication with your LLM. See LLM Integration for details.
- `options` (Optional): An object for configuration.
  - `initialEnabledToolpacks`: An array of strings specifying the names of Toolpacks to be enabled from the start of the conversation.
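Since `llmCallback` is just an async generator, you can stub it out for local testing without calling a real model API. The sketch below is a hypothetical mock (not part of `@obayd/agentic`); it only demonstrates the expected shape: an async generator that receives the message history and yields string chunks.

```js
// A minimal mock llmCallback (hypothetical; for testing only).
// It ignores the incoming messages and streams a fixed reply chunk by chunk.
async function* mockLlmCallback(messages, options) {
  const reply = ['Hello ', 'from ', 'the ', 'mock ', 'model.'];
  for (const chunk of reply) {
    yield chunk;
  }
}

// Collect the streamed chunks into one string, as a consumer would.
async function collectReply(callback, messages) {
  let text = '';
  for await (const chunk of callback(messages, {})) {
    text += chunk;
  }
  return text;
}
```

Swapping this mock in for a real callback lets you exercise the rest of your conversation logic deterministically.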
## Defining Content (`.content()`)

Before starting the conversation, you define the system prompt and available tools using the `.content()` method. This method accepts an array containing various types of items:
```js
import { Tool, Toolpack } from '@obayd/agentic';

const myTool = Tool.make("tool_name");
const myToolpack = Toolpack.make("toolpack_name").add(myTool);

conversation.content([
  // 1. Simple strings for the system prompt
  "You are a helpful assistant.",
  "Use tools when necessary.",

  // 2. Tool instances (makes the tool directly available)
  Tool.make("get_time").action(async () => ({ time: new Date().toISOString() })),

  // 3. Toolpack instances (tools are available if the pack is enabled)
  myToolpack,

  // 4. Content objects (e.g., for multimodal input - passed directly to LLM)
  // { type: 'image_url', image_url: { url: '...' } },

  // 5. Async or static functions for dynamic content
  async (conv, ...args) => {
    // You can access conversation state here
    const userTier = args[0]?.userTier || 'free';
    if (userTier === 'premium') {
      return Tool.make('premium_feature').action(async () => ({ content: 'Premium access granted!' }));
    }
    return null; // Return null or undefined if nothing to add
  }
]);
```
- The order of items in the array generally determines their order in the final system prompt sent to the LLM.
- Tool and Toolpack definitions are automatically formatted into instructions for the LLM.
- If any Toolpack instances are included, the internal `enable_toolpack` tool is automatically added.
- Using functions allows for dynamic adjustments to the system prompt or available tools based on runtime conditions or arguments passed to `send()`. See Dynamic Content.
## Sending Messages (`.send(...)`)

The `.send()` method sends a user message and initiates the LLM response cycle. It returns an async generator that yields events as the conversation progresses.
```js
async function interact(userInput, ...extraArgs) {
  // extraArgs will be passed to dynamic content functions and tool actions
  const stream = conversation.send(userInput, ...extraArgs);
  for await (const event of stream) {
    // Process events: 'assistant', 'tool.calling', 'tool', 'error', etc.
    console.log(event);
  }
}

interact("What time is it?");
interact("Search for cats.", { userTier: 'premium' }); // Pass extra args
```
- `messageContent`: Can be a string, an array of content parts (for multimodal input), or a pre-formatted Message object (`{ role: 'user', content: '...' }`).
- `...args`: Optional additional arguments passed down to dynamic content functions and tool action handlers. See Passing Arguments.
- Return value: An `AsyncGenerator`. You must iterate through this generator (e.g., using `for await...of`) to drive the conversation forward and receive results. See Streaming & Events for details on event types.
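In practice a consumer branches on the event type while iterating. The helper below is a hedged sketch: the exact event shapes (a `type` field plus `content`/`name`/`error` payloads) are assumptions inferred from the event names above, not a confirmed schema, and the mock stream stands in for a real `conversation.send(...)` generator.

```js
// Assumed (hypothetical) event shapes: { type: 'assistant', content: string },
// { type: 'tool', name: string }, { type: 'error', error: any }.
// Accumulates assistant text and records which tools ran.
async function consumeEvents(stream) {
  let assistantText = '';
  const toolsUsed = [];
  for await (const event of stream) {
    if (event.type === 'assistant') {
      assistantText += event.content;
    } else if (event.type === 'tool') {
      toolsUsed.push(event.name);
    } else if (event.type === 'error') {
      throw new Error(String(event.error));
    }
  }
  return { assistantText, toolsUsed };
}

// A mock stream standing in for conversation.send(...), so the helper is runnable.
async function* mockStream() {
  yield { type: 'tool', name: 'get_time' };
  yield { type: 'assistant', content: 'The time is ' };
  yield { type: 'assistant', content: '10:00 AM UTC.' };
}
```

Check the actual event payloads documented in Streaming & Events before relying on any particular field names.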
## Message History (`.messages`)

The `conversation.messages` property holds an array of all messages exchanged during the conversation, including user inputs, assistant responses, and tool calls/results.
```js
// After interacting:
console.log(conversation.messages);
/* Example Output:
[
  { role: 'user', content: 'What time is it?' },
  {
    role: 'tool',
    callId: 'call_abc123',
    name: 'get_time',
    params: {},
    raw: null,
    result: { time: '2023-10-27T10:00:00.000Z' }, // Raw result
    content: [ { type: 'text', text: '{"time":"2023-10-27T10:00:00.000Z"}' } ] // Normalized
  },
  { role: 'assistant', content: 'The current time is 10:00 AM UTC.' }
]
*/
```
- Messages follow the Message interface structure.
- `tool` messages contain details about the call (`callId`, `name`, `params`, `raw`) and the outcome (`result`, normalized `content`, optional `error`).
- You can inspect this history, save it, or use it to restore a conversation.
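Because the messages are plain data, persistence can be as simple as a JSON round trip. This is a sketch under one assumption: the final, commented-out assignment is illustrative only, since the exact restore API is not specified here.

```js
// Serialize the message history (plain objects, so JSON suffices).
const history = [
  { role: 'user', content: 'What time is it?' },
  { role: 'assistant', content: 'The current time is 10:00 AM UTC.' }
];
const saved = JSON.stringify(history);

// Later: parse it back. In a real app, `saved` would come from disk or a database.
const restored = JSON.parse(saved);

// Illustrative only -- how the restored array is handed back to a Conversation
// depends on the library's restore API, which is not shown in this section:
// conversation.messages = restored;
```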
## Enabled Toolpacks (`.enabledToolpacks`)

The `conversation.enabledToolpacks` property is a `Set` containing the names of toolpacks currently active for the next LLM turn.
```js
// Check if a toolpack is enabled
if (conversation.enabledToolpacks.has('web_search_tools')) {
  console.log("Web search is enabled!");
}

// Manually enable a toolpack for the next turn (less common than using the enable_toolpack tool)
// conversation.enabledToolpacks.add('database_tools');

// Manually disable
// conversation.enabledToolpacks.delete('web_search_tools');
```
This set is automatically updated when the `enable_toolpack` tool successfully executes. You can modify it directly between `send()` calls, but changes only affect the subsequent call to `send()`.