The ChatBotKit API is a powerful tool for developers looking to integrate conversational AI functionality into their applications. This documentation provides a comprehensive guide to understanding and utilizing the various endpoints and features offered by the API. You can find the full V1 OpenAPI specification here.


The ChatBotKit API uses the following base configuration:

  • API Base URL:
  • API Version: v1

The current version of the API, v1, can be accessed at

Action-Based Approach to Resources

The ChatBotKit API follows an action-based approach to resources. Each resource within the API supports a set of available actions, including list, create, update, delete, search, and more. When interacting with an endpoint, you specify the desired action in the request path. Most actions are invoked with a POST request carrying a JSON payload, while some actions, such as list, are invoked with a GET request.

Example: Listing Conversations
GET /v1/conversation/list HTTP/1.1
Host:
Authorization: Token ...
Example: Creating a Conversation
POST /v1/conversation/create HTTP/1.1
Host:
Authorization: Token ...
Content-Type: application/json

...


Authorization

ChatBotKit supports token-based authorization. To access the API, developers need to create tokens from the developer tools available at
Tokens serve as the authorization mechanism for making calls to the API.

Some endpoints in the ChatBotKit API can generate scoped tokens. These tokens are designed to be limited to specific resources and actions. Developers can use scoped tokens to authorize calls to the API, ensuring that only the necessary permissions are granted.

To authorize a call to the API, include the token in the Authorization header of the request:

Authorization: Token {your_token_here}

Make sure to replace {your_token_here} with the actual token value.

User Assumption

In addition to the Authorization header, you can also pass in X-RunAs-UserId header if you want to switch the request context to a sub-account (also known as a child account) in a parent-child account relationship configuration. This is part of the Partner API. Here is an example:

Authorization: Token {your_token_here}
X-RunAs-UserId: {child_user_id_here}

Make sure to replace both {your_token_here} and {child_user_id_here} with the corresponding values.
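The two headers above can be assembled with a small helper. This is a sketch, not part of any SDK; the buildHeaders name and its parameters are illustrative.

```typescript
// Hypothetical helper: builds the request headers described above.
// The childUserId parameter is optional and only relevant for Partner API
// parent-child account configurations.
function buildHeaders(token: string, childUserId?: string): Record<string, string> {
  const headers: Record<string, string> = {
    Authorization: `Token ${token}`,
  };

  if (childUserId) {
    // Switch the request context to the given sub-account
    headers["X-RunAs-UserId"] = childUserId;
  }

  return headers;
}
```

The same headers object can then be passed to any HTTP client when calling the API.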


Pagination

Pagination is available only on list actions in the ChatBotKit API. It is based on cursors, which provide a way to navigate through a large set of results. Cursor-based pagination allows you to retrieve resources in smaller chunks, making it easier to handle large datasets.

When making a list call to the ChatBotKit API, you may include an optional cursor parameter. The cursor corresponds to the ID of the last resource from the previously retrieved page of results.

To use cursor-based pagination with the /v1/conversation/list action, you can make the following HTTP requests:

  1. Initial request without a cursor:

    GET /v1/conversation/list HTTP/1.1
    Host:
    Authorization: Token {your_token_here}

    This request will return the first page of results.

  2. Subsequent requests using the cursor parameter:

    GET /v1/conversation/list?cursor={last_resource_id} HTTP/1.1
    Host:
    Authorization: Token {your_token_here}

    Replace {last_resource_id} with the ID of the last resource from the previous response. This request will retrieve the next page of results.
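The cursor loop described above can be sketched as follows. The page shape (an array of resources with id fields) follows the description above, while the fetchPage callback stands in for an HTTP call to /v1/conversation/list and is an assumption, not a documented SDK function.

```typescript
interface Resource {
  id: string;
}

// Illustrative pagination loop: fetch pages until an empty page signals the
// end, threading the id of the last resource through as the next cursor.
async function listAll(
  fetchPage: (cursor?: string) => Promise<Resource[]>
): Promise<Resource[]> {
  const all: Resource[] = [];
  let cursor: string | undefined;

  for (;;) {
    const page = await fetchPage(cursor);
    if (page.length === 0) break; // no more results

    all.push(...page);
    // The next cursor is the id of the last resource on this page
    cursor = page[page.length - 1].id;
  }

  return all;
}
```

Note that the loop terminates on an empty page, so the final request returns no results by design.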


Streaming

In addition to pagination, the ChatBotKit API also supports streaming for certain actions. Streaming allows the caller to receive data in small chunks until the stream is exhausted. This can be useful for efficiently pulling large quantities of data from the API.

To activate streaming, the caller must supply an Accept header with the value application/jsonl. When using this streaming method, the information from the API will be delivered in a continuous stream of JSON lines.

Streaming Example: Conversation List

To stream data using the /v1/conversation/list route, you can make the following HTTP request:

GET /v1/conversation/list HTTP/1.1
Host:
Authorization: Token {your_token_here}
Accept: application/jsonl

This request specifies the Accept header with the value application/jsonl, indicating that the response should be streamed in JSON line format. The API will continuously send JSON lines until all the data has been retrieved.
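Because network chunks can split a JSON line at any byte, a streaming consumer needs to buffer incomplete trailing lines. The following decoder is a generic JSON Lines sketch, not ChatBotKit SDK code.

```typescript
// Minimal JSON Lines decoder for a streamed response body. Incomplete
// trailing lines are buffered until the next chunk arrives.
class JsonlDecoder {
  private buffer = "";

  // Feed one raw chunk; returns every complete JSON record it contained.
  push(chunk: string): unknown[] {
    this.buffer += chunk;
    const lines = this.buffer.split("\n");
    this.buffer = lines.pop() ?? ""; // keep the incomplete tail
    return lines
      .filter((line) => line.trim() !== "")
      .map((line) => JSON.parse(line));
  }
}
```

Feeding each chunk of the response body through push yields complete records as soon as they arrive, regardless of where the chunk boundaries fall.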

Streaming Example: Tokens and Events

Streaming can also be used to receive real-time updates and events from the ChatBotKit conversational AI engine. By streaming the /v1/conversation/{conversationId}/complete route, you can receive tokens and other events related to a specific conversation.

To stream tokens and events, you need to make a POST request to the /v1/conversation/{conversationId}/complete route. The request should include a JSON payload with a text parameter corresponding to the user's input:

POST /v1/conversation/{conversationId}/complete HTTP/1.1
Host:
Authorization: Token {your_token_here}
Content-Type: application/json

{
  "text": "User input goes here"
}

This request will initiate the streaming of tokens and events related to the specified conversation. The API will continuously send updates until the conversation is completed or terminated.

Streaming can be a powerful alternative to pagination, allowing you to receive data in real-time and efficiently handle large quantities of information.


Error Handling

Error handling in the ChatBotKit API is based on HTTP status codes. Each error corresponds to a specific status code, indicating the nature of the error. The following table provides a summary of the error definitions and their corresponding status codes:

Error Code              Status Code   Error Description
BAD_REQUEST             400           Request is malformed
NOT_AUTHENTICATED       401           Not authenticated
NOT_AUTHORIZED          401           Not authorized
NO_SUBSCRIPTION         402           No subscription
NOT_FOUND               404           Not found
METHOD_NOT_ALLOWED      405           Method not allowed
CONFLICT                409           There is a conflict when dealing with the resource
TOO_MANY_REQUESTS       429           Too many requests
LIMITS_REACHED          429           Subscription limits reached
INTERNAL_SERVER_ERROR   500           Internal server error

All errors returned by the ChatBotKit API include a JSON payload in the following format:

{
  "message": "error message",
  "code": "error code"
}

The message field provides a brief description of the error, while the code field specifies the error code associated with the error. This payload can help developers identify and handle errors appropriately when integrating the ChatBotKit API into their applications.
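One way to surface this payload in application code is to convert it into a typed error. The ApiError class and raiseForError helper below are illustrations, not part of the ChatBotKit SDK.

```typescript
// Hypothetical typed error wrapping the documented payload fields.
class ApiError extends Error {
  constructor(
    message: string,
    public readonly code: string,
    public readonly status: number
  ) {
    super(message);
  }
}

// Hypothetical response handler: raise ApiError for error responses.
function raiseForError(
  status: number,
  body: { message: string; code: string }
): void {
  if (status >= 400) {
    throw new ApiError(body.message, body.code, status);
  }
}
```

Callers can then branch on error.code (for example, NOT_FOUND versus NO_SUBSCRIPTION) rather than parsing message strings.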

In the ChatBotKit API, there is a specific class of errors that occurs when all available tokens are exhausted. These errors are related to limits and indicate that the maximum number of requests or actions allowed has been reached.

When encountering these errors, developers may receive one of the following error codes:

  • TOO_MANY_REQUESTS: Returned with status code 429 when the rate limit for making requests has been exceeded.
  • LIMITS_REACHED: Returned with status code 429 when the limits for certain actions or resources have been reached.

To handle these errors, developers should implement appropriate error handling mechanisms in their applications. This may involve implementing retry logic, adjusting the rate of requests, or contacting the API provider to inquire about increasing the limits for specific actions or resources.
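A common form of the retry logic mentioned above is exponential backoff on 429-class errors. The retryable-code check and delay schedule below are illustrative choices, not documented ChatBotKit behavior.

```typescript
// Retry sketch: re-attempt an operation when it fails with a limit-related
// error code, backing off exponentially between attempts.
async function withRetry<T>(
  operation: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 500
): Promise<T> {
  for (let attempt = 1; ; attempt++) {
    try {
      return await operation();
    } catch (error: any) {
      const retryable =
        error?.code === "TOO_MANY_REQUESTS" || error?.code === "LIMITS_REACHED";
      if (!retryable || attempt >= maxAttempts) throw error;

      // Exponential backoff: baseDelayMs, 2x, 4x, ...
      await new Promise((resolve) =>
        setTimeout(resolve, baseDelayMs * 2 ** (attempt - 1))
      );
    }
  }
}
```

Note that LIMITS_REACHED signals an exhausted subscription quota, so retrying it only helps once the quota resets or is raised; in practice it may be better to fail fast on that code.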

It is important to monitor and manage limits effectively to ensure smooth and uninterrupted integration with the ChatBotKit API.

Scoped Error Codes

Some API calls may return a scoped error. This is an error that originates within a subsystem, such as a remote store or a model. Scoped errors produce error codes similar to the ones above, but they are prefixed by the name of the system that exhibits the error. For example, if there is a rate limit enforced by Discord, the error will be DISCORD_TOO_MANY_REQUESTS instead of TOO_MANY_REQUESTS.
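Given this prefixing scheme, a scoped code can be split back into its subsystem and base code by matching against the table of base codes. The parseErrorCode helper is hypothetical and assumes the prefix is joined with an underscore, as in the Discord example.

```typescript
// Base error codes from the error table above.
const BASE_CODES = [
  "BAD_REQUEST", "NOT_AUTHENTICATED", "NOT_AUTHORIZED", "NO_SUBSCRIPTION",
  "NOT_FOUND", "METHOD_NOT_ALLOWED", "CONFLICT", "TOO_MANY_REQUESTS",
  "LIMITS_REACHED", "INTERNAL_SERVER_ERROR",
];

// Split a possibly scoped code into its subsystem prefix and base code.
function parseErrorCode(code: string): { scope?: string; base: string } {
  for (const base of BASE_CODES) {
    if (code === base) return { base };
    if (code.endsWith(`_${base}`)) {
      return { scope: code.slice(0, code.length - base.length - 1), base };
    }
  }
  return { base: code }; // unknown code, treat as unscoped
}
```

This lets generic handling (such as the retry logic for TOO_MANY_REQUESTS) apply regardless of which subsystem produced the error.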

It is important to note that the ChatBotKit platform has a comprehensive retry policy for all upstream services it interacts with, and it will attempt to fulfil every request. In rare circumstances, all retries may fail, in which case a scoped error is returned.


Features

The ChatBotKit API offers a range of powerful features that enhance the functionality and capabilities of conversational AI applications. These features enable developers to create interactive and engaging chatbots and virtual assistants that can handle complex dialogues and provide meaningful responses.


Continuation

The continuation feature in the ChatBotKit API enables any model with a limited context size to continue streaming indefinitely. With this feature, developers can overcome the constraints of context size and seamlessly generate responses that maintain consistency and coherence over extended conversations.

By utilizing the continuation feature, developers can extend the conversation beyond the context limitations of the model, ensuring a smooth and uninterrupted flow of dialogue. This is particularly useful in scenarios where maintaining context and generating coherent responses over long conversations is crucial.

The continuation feature empowers developers to create more interactive and engaging conversational AI experiences. It opens up possibilities for building chatbots and virtual assistants that can handle complex dialogues and provide meaningful responses, regardless of the length or complexity of the conversation.

Token Reconciliation

The ChatBotKit API includes a powerful feature called token reconciliation, which automatically adjusts messages and conversations to fit perfectly within the maximum allowed context size. This feature is particularly useful when working with models that have limitations on the number of tokens they can process.

When a conversation or message exceeds the maximum allowed context size, the API intelligently applies different strategies depending on the model being used. These strategies ensure that the conversation remains coherent and consistent, even when dealing with lengthy or complex interactions.

By leveraging token reconciliation, developers can confidently build conversational AI experiences that seamlessly handle extended conversations. This feature eliminates the need to manually truncate or modify messages to fit within context limitations, allowing for a more streamlined and natural dialogue flow.

Whether you are working with models that have strict token constraints or simply want to ensure optimal performance, token reconciliation in the ChatBotKit API provides a reliable solution for managing context size and generating high-quality responses.

Agents and Background Tasks

One of the powerful features of ChatBotKit is its ability to handle AI agents and background tasks. With this functionality, ChatBotKit can seamlessly perform lengthy processes that may take minutes to complete. This capability is especially useful for conducting multi-step interactions with a bot and performing complex tasks utilizing various skillsets.

By utilizing agents and background tasks, developers can create conversational AI experiences that go beyond simple question-and-answer interactions. The bot can perform tasks that involve multiple steps and require time to complete, such as retrieving information from external sources, processing large datasets, or executing complex algorithms.

This functionality enables the bot to handle complex user requests that go beyond the capabilities of traditional chatbots. It allows for more intricate and dynamic conversations, providing users with a richer and more interactive experience.

API Spec

The full API specification for the ChatBotKit API is available in OpenAPI format. You can access the API spec at

For a more interactive experience, you can explore the API spec at
This documentation provides detailed information about each endpoint, including request and response examples, parameter descriptions, and more.

Node SDK

The ChatBotKit Node SDK is a software development kit specifically designed for developers working with Node.js. This SDK provides a comprehensive set of tools and libraries that simplify the integration of the ChatBotKit API into Node.js applications.

With the Node SDK, developers can easily leverage the powerful features of the ChatBotKit API without having to manually handle low-level HTTP requests and responses. The SDK abstracts away the complexities of API communication, allowing developers to focus on building conversational AI functionality into their applications.

To get started with the ChatBotKit Node SDK, refer to the full documentation available at
This documentation provides detailed instructions on how to install the SDK, authenticate with the API, and make various API calls using the SDK's intuitive methods and functions.