# LLM Integration

`@obayd/agentic` is designed to be LLM-agnostic. It achieves this through a single, crucial integration point: the `llmCallback` function provided to the `Conversation` constructor.

## The `llmCallback` Function

This function is the bridge between the `@obayd/agentic` framework and your chosen Large Language Model API. It has a specific signature and behavior.
**Signature:**
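Based on the requirements below, the expected shape is roughly the following (a minimal sketch; parameter names are illustrative):

```javascript
// Sketch of the expected llmCallback shape.
async function* llmCallback(messages, options) {
    // Call your LLM API with `messages`, honoring `options` (e.g. { stream: true }),
    // then yield the assistant's reply as string chunks while they arrive.
    yield "chunk of assistant text";
}
```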
**Requirements:**

- **Async Generator:** It must be an `async function*` (an async generator function).
- **Parameters:**
  - `messages`: Receives an array of `Message` objects. This array is prepared by the `Conversation` class and includes the formatted system prompt, user messages, assistant responses, and tool results in the format expected by many LLM APIs (like OpenAI's). You should pass it directly to your LLM API call; see the example array after this list.
  - `options`: An object containing options. Currently, it primarily includes `{ stream: true }`, indicating that the `Conversation` expects a streaming response. You might use this or other potential future options to configure your API call.
- **Streaming Response:** Your function must initiate a streaming request to your LLM API.
- **Yielding Chunks:** It must iterate over the streaming response from the LLM and yield individual string chunks of the assistant's reply as they arrive.
- **Error Handling:** It should ideally include error handling for the API call itself (e.g., network errors, authentication issues) and potentially yield an informative error string if the connection fails.
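For reference, the `messages` argument resembles the chat format used by OpenAI-style APIs. The snippet below is illustrative only; the exact roles and content are prepared by the `Conversation` class:

```javascript
// Illustrative shape of the `messages` argument (OpenAI-style chat format).
const messages = [
    { role: "system", content: "..." }, // formatted system prompt
    { role: "user", content: "What's the capital of France?" },
    { role: "assistant", content: "The capital of France is Paris." },
];
```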
## Example Implementation (`fetch` + `fetchResponseToStream`)
Many LLM streaming APIs use Server-Sent Events (SSE). `@obayd/agentic` includes a utility function, `fetchResponseToStream`, to simplify handling these common streams.
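Below is a sketch of such a callback against a hypothetical OpenAI-compatible endpoint. The URL, model name, and the exact way `fetchResponseToStream` is invoked are assumptions; check the Utilities page for the helper's actual signature:

```javascript
import { fetchResponseToStream } from "@obayd/agentic";

// Sketch only: the endpoint, model name, and helper call shape are assumptions.
async function* llmCallback(messages, options) {
    try {
        const response = await fetch("https://api.example.com/v1/chat/completions", {
            method: "POST",
            headers: {
                "Content-Type": "application/json",
                "Authorization": `Bearer ${process.env.LLM_API_KEY}`,
            },
            body: JSON.stringify({
                model: "your-model-name",
                messages,
                stream: true, // the Conversation passes { stream: true } in options
            }),
        });

        // Assumed usage: the helper takes the fetch Response and yields the text
        // chunks it parses out of the SSE stream.
        for await (const chunk of fetchResponseToStream(response)) {
            yield chunk;
        }
    } catch (err) {
        // Per the requirements above: surface connection failures as an informative string.
        yield `[LLM request failed: ${err.message}]`;
    }
}
```

You would then pass this function to the `Conversation` constructor, as described above.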
- **Adaptation:** You must adapt the `fetch` call (URL, headers, body parameters like `model`, `max_tokens`, `temperature`, and potentially how tools are specified in the request body) to match the exact requirements of your specific LLM provider's API.
- **`fetchResponseToStream`:** This helper expects a standard SSE stream where text content is found in JSON like `data: {"choices":[{"delta":{"content":"..."}}]}`. If your API uses a different streaming format, you'll need to write custom logic to parse the stream and yield the text chunks (see the sketch below). See the Utilities page for more on this helper.
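If you do need custom parsing, a minimal hand-rolled reader might look like the sketch below, written for a hypothetical API that streams lines like `data: {"text":"..."}`; adapt the endpoint and the JSON path to your provider:

```javascript
// Minimal manual SSE parsing for a hypothetical API that streams
// lines like: data: {"text":"..."}
async function* llmCallback(messages, options) {
    const response = await fetch("https://api.example.com/v1/stream", { /* ... */ });
    const reader = response.body.getReader();
    const decoder = new TextDecoder();
    let buffer = "";

    while (true) {
        const { done, value } = await reader.read();
        if (done) break;
        buffer += decoder.decode(value, { stream: true });

        // SSE events are newline-separated; keep any partial line in the buffer.
        const lines = buffer.split("\n");
        buffer = lines.pop();

        for (const line of lines) {
            if (!line.startsWith("data: ")) continue;
            const payload = line.slice("data: ".length).trim();
            if (payload === "[DONE]") return;
            const text = JSON.parse(payload).text; // adapt to your provider's JSON shape
            if (text) yield text;
        }
    }
}
```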
## Compatibility Notes
- **Function Calling Format:** `@obayd/agentic` uses its own internal XML-like tag format (`<fn_...>...</fn_...>`) within the assistant's response to signal function calls. The system prompt instructs the LLM to use this format, which works well with many instruction-following models.
- **Native API Tool Use:** Some APIs (like OpenAI's and Anthropic's) have their own request/response formats for "tool use" or "function calling". You can simply ignore these, as `@obayd/agentic` uses its own format.
By providing this flexible `llmCallback` interface, `@obayd/agentic` allows you to integrate with a wide variety of LLM backends while maintaining a consistent framework for conversation and tool management.