We recommend you migrate any usage of /chat/send-message and /chat/send-message-simple-api to this new API by February 1st, 2026.
The /chat/send-chat-message API is used to send a message to Gorbit. It is the same API that the Gorbit frontend uses to send and receive messages. You have the option of receiving a streaming response or the complete response as a string. This guide explains all of the parameters you can pass to the API and provides a code sample.

Request Parameters

message
The user message to send to the Agent.
llm_override
Pass an object to override the default LLM settings for this request. If None, you will get the default Gorbit behavior.

You can pass or exclude any of the following fields:
model_provider
model_version
temperature

If you pass an invalid configuration (e.g., specifying claude-sonnet-4.5 when the default model_provider is OpenAI), your request will fail.
allowed_tool_ids
Agents are created with a set of Actions they are allowed to invoke. You can further configure this set for the current request using this parameter. See the list of Actions and their IDs via the GET /tool endpoint. Pass an empty list to disable all Actions, or None to allow every Action configured for the Agent.
forced_tool_id
Force the Agent to use a specific Action for this request. The Agent may run other Actions before returning its final response, but it is guaranteed to use this one. Leave empty to let the Agent decide which Actions to use.
file_descriptors
A list of files to include along with your request. File IDs can be found via the POST /user/projects/file/upload and the GET /user/projects/file/ endpoints.
search_filters
Filters to narrow down the internal search results used by the Agent. All filter arguments are optional and can be combined:

source_type – Source types like web, slack, google_drive, confluence
document_set – The name of the document sets to search within
time_cutoff – ISO 8601 format: YYYY-MM-DDTHH:MM:SSZ
tags – Format: {"tag_key": "tag_value"}
deep_research
Enables Deep Research mode for this request.

Note: This mode consumes significantly more tokens, so use it sparingly when calling the API.
parent_message_id
The ID of the parent message in the chat history (the primary key of the previous message in the chat history tree). If omitted, your new message is appended after the latest message.

Warning: If set to None, the chat history is reset and the new message is considered the first message in the chat history.
chat_session_id
To continue an existing conversation, pass in the chat session ID where the message should be sent. If left blank, a new chat session will be created according to chat_session_info.
chat_session_info
Details about the chat session which will be used for all messages in the session. Fields can be left blank to use defaults.

persona_id – The ID of the Agent to use for the chat session
project_id – ID of a Project if the chat should be scoped to a Project
stream
If true, the API responds with an SSE stream of individual packets (the same set used by the Gorbit UI). Fields like the answer, reasoning tokens, and iterative Tool Calls need to be pieced together from the streamed tokens.
include_citations
If true, responses will include citations for the sources used to generate the answer.
additional_context
A string of extra context injected into the LLM call for this request. The context is passed to the model but is not stored in the database and will not appear in chat history.
Use this to supply ephemeral, request-scoped information (e.g. the user’s current page URL, session metadata, or any runtime context) without polluting the persistent conversation history.
Pass null or omit the field to use no additional context.
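Several of the parameters above can be combined in one request body. The sketch below shows one plausible shape; the specific values (model names, tool IDs, tag keys) are placeholders, not real configuration:

```python
# Hypothetical request body combining several of the documented parameters.
# All concrete values here are illustrative placeholders.
payload = {
    "message": "Summarize last week's Slack discussion about the launch.",
    "llm_override": {
        "model_provider": "Anthropic",        # placeholder provider
        "model_version": "claude-sonnet-4.5",
        "temperature": 0.2,
    },
    "allowed_tool_ids": [1, 3],               # restrict to two Actions (IDs from GET /tool)
    "search_filters": {
        "source_type": ["slack"],
        "time_cutoff": "2026-01-01T00:00:00Z",  # ISO 8601, per the table above
        "tags": {"team": "launch"},
    },
    "stream": False,
}
```

Remember that an llm_override which contradicts the default configuration (e.g., a model version from a different provider) will cause the request to fail.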

Response Format

Streaming Response

Gorbit returns various types of packets in the streaming response depending on the LLM’s behavior. See our streaming_models.py on GitHub for the complete list of packet types and their corresponding fields.
class StreamingType(Enum):
    """Enum defining all streaming packet types."""

    SECTION_END = "section_end"
    STOP = "stop"
    TOP_LEVEL_BRANCHING = "top_level_branching"
    ERROR = "error"

    MESSAGE_START = "message_start"
    MESSAGE_DELTA = "message_delta"
    SEARCH_TOOL_START = "search_tool_start"
    SEARCH_TOOL_QUERIES_DELTA = "search_tool_queries_delta"
    SEARCH_TOOL_DOCUMENTS_DELTA = "search_tool_documents_delta"
    OPEN_URL_START = "open_url_start"
    OPEN_URL_URLS = "open_url_urls"
    OPEN_URL_DOCUMENTS = "open_url_documents"
    IMAGE_GENERATION_START = "image_generation_start"
    IMAGE_GENERATION_HEARTBEAT = "image_generation_heartbeat"
    IMAGE_GENERATION_FINAL = "image_generation_final"
    PYTHON_TOOL_START = "python_tool_start"
    PYTHON_TOOL_DELTA = "python_tool_delta"
    CUSTOM_TOOL_START = "custom_tool_start"
    CUSTOM_TOOL_DELTA = "custom_tool_delta"
    REASONING_START = "reasoning_start"
    REASONING_DELTA = "reasoning_delta"
    REASONING_DONE = "reasoning_done"
    CITATION_INFO = "citation_info"

    DEEP_RESEARCH_PLAN_START = "deep_research_plan_start"
    DEEP_RESEARCH_PLAN_DELTA = "deep_research_plan_delta"
    RESEARCH_AGENT_START = "research_agent_start"
    INTERMEDIATE_REPORT_START = "intermediate_report_start"
    INTERMEDIATE_REPORT_DELTA = "intermediate_report_delta"
    INTERMEDIATE_REPORT_CITED_DOCS = "intermediate_report_cited_docs"
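As a rough illustration of piecing the answer together from the stream: assuming each SSE data line carries a JSON packet with a type matching the enum above, and that MESSAGE_DELTA packets carry their text in a content field (the field names here are assumptions; check streaming_models.py for the real schema), assembly might look like:

```python
import json

def assemble_answer(sse_data_lines):
    """Piece the final answer together from streamed packets.

    Assumes each element is the JSON payload of one SSE `data:` line,
    with a `type` matching StreamingType and MESSAGE_DELTA packets
    carrying text in a `content` field -- verify the actual field
    names against streaming_models.py.
    """
    answer = []
    for raw in sse_data_lines:
        packet = json.loads(raw)
        if packet.get("type") == "message_delta":
            answer.append(packet.get("content", ""))
        elif packet.get("type") == "stop":
            break
    return "".join(answer)

# Synthetic packets, for illustration only
sample = [
    '{"type": "message_start"}',
    '{"type": "message_delta", "content": "Gorbit is "}',
    '{"type": "message_delta", "content": "an AI assistant."}',
    '{"type": "stop"}',
]
print(assemble_answer(sample))  # Gorbit is an AI assistant.
```

A real client would read the lines incrementally from the HTTP response (e.g., requests with stream=True and iter_lines) and handle the tool and reasoning packet types as well.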

Non-streaming Response

class ChatFullResponse(BaseModel):
    """Complete non-streaming response with all available data."""

    # Core response fields
    answer: str
    answer_citationless: str
    pre_answer_reasoning: str | None = None
    tool_calls: list[ToolCallResponse] = []

    # Documents & citations
    top_documents: list[SearchDoc]
    citation_info: list[CitationInfo]

    # Metadata
    message_id: int
    chat_session_id: UUID | None = None
    error_msg: str | None = None
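When working with the parsed JSON of a non-streaming response, the fields above map directly onto dictionary keys. A minimal sketch, assuming each CitationInfo entry exposes a document_id field (an assumption; check the actual model for its real fields):

```python
def summarize_response(data: dict) -> str:
    """Render the answer plus cited document identifiers from a
    ChatFullResponse-shaped payload. The `document_id` key on each
    citation is an assumption -- check CitationInfo for the real fields."""
    lines = [data["answer"]]
    if data.get("error_msg"):
        lines.append(f"(error: {data['error_msg']})")
    for i, cite in enumerate(data.get("citation_info", []), start=1):
        lines.append(f"[{i}] {cite.get('document_id', '?')}")
    return "\n".join(lines)

# Illustrative payload shaped like ChatFullResponse
example = {
    "answer": "Gorbit is an AI work platform.",
    "citation_info": [{"document_id": "doc-123"}],  # field name assumed
    "error_msg": None,
}
print(summarize_response(example))
```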

Sample Request

import requests

API_BASE_URL = "https://cloud.gorbit.app/api"  # or your own domain
API_KEY = "YOUR_KEY_HERE"

headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json"
}

response = requests.post(
    f"{API_BASE_URL}/chat/send-chat-message",
    headers=headers,
    json={
        "message": "What is Gorbit?",
    }
)

response.raise_for_status()  # surface HTTP errors before parsing
data = response.json()
print("Answer:", data["answer"])
print("Message ID:", data["message_id"])
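To continue the same conversation in a follow-up request, thread the chat_session_id and message_id from the previous response back in as chat_session_id and parent_message_id, per the parameter table above. A minimal sketch with illustrative values:

```python
def follow_up_payload(previous: dict, message: str) -> dict:
    """Build the next request body from a prior response's JSON.

    `previous` is the parsed response of the earlier send-chat-message
    call (the `data` dict in the sample above).
    """
    return {
        "message": message,
        "chat_session_id": previous["chat_session_id"],
        "parent_message_id": previous["message_id"],
    }

# Illustrative values standing in for a real prior response
previous = {"chat_session_id": "example-session-id", "message_id": 42}
body = follow_up_payload(previous, "Can you elaborate?")
```

The resulting body would be POSTed to the same /chat/send-chat-message endpoint with the same headers as the first request.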

Next Steps