Introduction to Agentic
Welcome to the documentation for `@obayd/agentic`!
`@obayd/agentic` is a powerful, lightweight framework designed to streamline the development of LLM-powered agents in Node.js environments. It focuses on simplifying the complexities around function calling (tool usage), conversation management, and asynchronous streaming, allowing you to focus on building intelligent agent behavior.
What Problem Does It Solve?
Building sophisticated LLM agents often involves several challenges:
- **Function Calling Logic:** Defining functions (tools) the LLM can call, parsing the LLM's requests, executing the functions, and feeding the results back requires significant boilerplate.
- **Conversation Flow:** Managing the back-and-forth between the user, the LLM, and tool executions can become complex, especially with multiple potential tool calls per turn.
- **Streaming Responses:** Handling streamed responses from LLMs effectively to provide a responsive user experience, while also parsing for potential function calls, requires careful state management.
- **Prompt Engineering:** Crafting effective system prompts that instruct the LLM on how and when to use available tools is crucial but can be tedious.
- **LLM Agnosticism:** Integrating with different LLM providers often means adapting to slightly different API request/response formats.
`@obayd/agentic` addresses these challenges by providing a structured and intuitive API.
Key Features
✨ **Fluent Tool Definition:** Define tools with parameters, descriptions, required flags, enums, and even raw text input using a clean, chainable API (`Tool.make().description().param()...`).

📦 **Toolpacks:** Group related tools into `Toolpack`s. Enable or disable entire sets of tools dynamically within a conversation using the built-in `enable_toolpack` tool.

🌊 **Streaming First:** Designed around async generators for handling LLM responses and tool events. Process information as it arrives for a more interactive feel.

🔌 **Flexible LLM Integration:** Connect to virtually any LLM that supports function calling and streaming responses by providing a simple `llmCallback` async generator function. A utility (`fetchResponseToStream`) is included for common SSE formats.

🗣️ **Automated Conversation Management:** The `Conversation` class manages message history, builds system prompts incorporating tool definitions, parses LLM responses for tool calls, executes tools, and formats results back for the LLM.

⚙️ **Dynamic System Prompts:** Define system prompt content using a mix of static strings, `Tool` instances, `Toolpack` instances, or even asynchronous functions that return content dynamically based on the conversation state.

🔒 **Type-Safe:** Comes with comprehensive TypeScript declaration files (`.d.ts`) for robust type checking and an excellent developer experience (autocompletion!) in both TypeScript and JavaScript projects.

☀️ **Zero-Dependency:** No external dependencies, making it easy to integrate into your existing project.
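To give a feel for how these pieces fit together, here is a minimal sketch. Only the names mentioned on this page (`Tool.make`, `.description()`, `.param()`, `Conversation`, `llmCallback`, `fetchResponseToStream`) come from the documentation itself; the `.action()` handler, the `Conversation` constructor arguments, the `send()` call, the event shape, and the endpoint URL are illustrative assumptions — check the API reference pages for the exact signatures.

```javascript
// Illustrative sketch only — signatures are assumptions, not the verified API.
import { Tool, Conversation, fetchResponseToStream } from "@obayd/agentic";

// Fluent tool definition: a name, a description, and one required parameter.
// The .action() handler shown here is a hypothetical way to attach behavior.
const weather = Tool.make("get_weather")
  .description("Look up the current weather for a city")
  .param("city", { description: "City name", required: true })
  .action(async ({ city }) => `It is sunny in ${city}.`);

// llmCallback: an async generator yielding streamed LLM text chunks.
// fetchResponseToStream (bundled) adapts a common SSE response format.
async function* llmCallback(messages) {
  const response = await fetch("https://example.com/v1/chat", { // placeholder endpoint
    method: "POST",
    body: JSON.stringify({ messages, stream: true }),
  });
  yield* fetchResponseToStream(response);
}

// The Conversation ties the LLM callback and tools together; it builds the
// system prompt, parses tool calls, runs them, and feeds results back.
const conversation = new Conversation(llmCallback, [weather]);

// Hypothetical streaming loop over conversation events.
for await (const event of conversation.send("What's the weather in Oslo?")) {
  if (event.type === "content") process.stdout.write(event.content);
}
```

The subsequent sections cover each of these building blocks in detail.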