# Utilities

`@obayd/agentic` includes a few utility functions, primarily for internal use but potentially helpful for users, especially when implementing the `llmCallback`.
## fetchResponseToStream
This is the utility you are most likely to use directly. It simplifies processing streaming responses from LLM APIs that use the Server-Sent Events (SSE) protocol, which is common (e.g., OpenAI, and Anthropic in some modes).
Signature:
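The exact signature is not reproduced here; inferred from the behavior described below, it is an async generator along these lines (an assumption, shown in TypeScript notation for clarity):

```typescript
// Assumed shape, inferred from the documented behavior — not copied from the
// library's source: takes a fetch Response, yields decoded text chunks.
declare function fetchResponseToStream(
  response: Response
): AsyncGenerator<string, void, unknown>;
```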
Functionality:
1. Takes a standard `Response` object obtained from a `fetch` call.
2. Checks that the response was successful (`response.ok`); throws an error with the status and body text if not.
3. Checks that the response has a body (`response.body`); throws if not.
4. Reads the response body as a stream (`ReadableStream`).
5. Decodes the stream chunks using `TextDecoder`.
6. Splits the decoded text by lines, handling potential partial lines across chunks.
7. Looks for lines starting with `data:`.
8. Attempts to JSON-parse the content following `data:`.
9. Looks for text content within the parsed JSON at the path `choices[0].delta.content` (common in OpenAI-like streams) and, if found, yields that text chunk.
10. If JSON parsing fails or that specific path isn't found, it currently yields the raw data content (after `data:`) as a fallback.
11. Stops when the stream ends or a `data: [DONE]` message is encountered.
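The steps above can be sketched as an async generator. This is an illustrative reconstruction of the described behavior, not the library's actual source:

```javascript
// Sketch of the SSE-parsing steps described above (not the library's source).
// Reads the body stream, splits it into lines, and extracts
// choices[0].delta.content from each "data:" event.
async function* fetchResponseToStreamSketch(response) {
  if (!response.ok) {
    throw new Error(`HTTP ${response.status}: ${await response.text()}`);
  }
  if (!response.body) {
    throw new Error("Response has no body");
  }
  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop(); // keep any partial line for the next chunk
    for (const line of lines) {
      if (!line.startsWith("data:")) continue;
      const data = line.slice(5).trim();
      if (data === "[DONE]") return; // end-of-stream sentinel
      try {
        const parsed = JSON.parse(data);
        const text = parsed?.choices?.[0]?.delta?.content;
        yield text ?? data; // path missing: fall back to the raw data content
      } catch {
        yield data; // JSON parse failed: fall back to the raw data content
      }
    }
  }
}
```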
Usage Example (in llmCallback):
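A sketch of how this might look inside an `llmCallback`. The callback shape, endpoint URL, and model name here are illustrative assumptions, not the library's documented API; the real import is shown commented out, with a stand-in stub so the sketch is self-contained:

```javascript
// In real code you would import the utility from the library:
// import { fetchResponseToStream } from "@obayd/agentic";
// Stand-in stub for illustration only — the real utility parses SSE events:
async function* fetchResponseToStream(response) {
  yield await response.text();
}

// Hedged usage sketch: the llmCallback signature, endpoint, and model name
// below are assumptions for illustration.
const llmCallback = async function* (messages) {
  const response = await fetch("https://api.example.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.LLM_API_KEY}`,
    },
    body: JSON.stringify({ model: "example-model", messages, stream: true }),
  });
  // Delegate SSE parsing to the utility and re-yield each text chunk.
  yield* fetchResponseToStream(response);
};
```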
Note: If your LLM API uses a different streaming format (e.g., newline-delimited JSON or a custom binary format), `fetchResponseToStream` will likely not work correctly, and you will need to implement your own stream-parsing logic within your `llmCallback`.
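For example, a custom parser for a newline-delimited JSON (NDJSON) stream might look like this. This is a sketch; the `text` field name is an assumption about the API's payload shape:

```javascript
// Sketch of custom parsing for an NDJSON stream, for APIs whose format
// fetchResponseToStream doesn't handle. Each line is a standalone JSON
// object; the `text` field name is an assumed payload shape.
async function* parseNdjsonStream(response) {
  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop(); // keep any partial line for the next chunk
    for (const line of lines) {
      if (!line.trim()) continue;
      const event = JSON.parse(line);
      if (event.text) yield event.text;
    }
  }
}
```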