Introduction
Build AI Copilots for Your Product
Production-ready AI Copilots for any product. Connect any LLM, deploy on your infrastructure, own your data. Built for speed and control.
What Makes This Different?
Prebuilt Copilot UI
Production-ready chat components. Streaming, markdown, code highlighting, and file attachments included.
Tool Execution
Define tools with Zod schemas. AI calls them, you handle the result. Full agentic loop support.
Plug & Play
Works out of the box. No configuration headaches. Just React hooks.
Multi-LLM
OpenAI, Anthropic, Google, xAI, and more. Swap providers without changing your code.
The Gist
```tsx
import { CopilotProvider } from '@yourgpt/copilot-sdk/react';
import { CopilotChat } from '@yourgpt/copilot-sdk/ui';

function App() {
  return (
    <CopilotProvider runtimeUrl="/api/chat">
      <CopilotChat />
    </CopilotProvider>
  );
}
```

That's a working AI chat. Want the AI to see your screen when users say "I have an error"?
```tsx
<CopilotProvider
  runtimeUrl="/api/chat"
  tools={{ screenshot: true, console: true, requireConsent: true }}
>
  <CopilotChat />
</CopilotProvider>
```

Done. The SDK handles the consent UI, captures the context, and sends it to the AI.
Packages
| Package | What it does |
|---|---|
| `@yourgpt/copilot-sdk` | React hooks, provider, UI components, and core utilities |
| `@yourgpt/llm-sdk` | Multi-provider LLM integration + streaming |
SDK Requirements
| Provider | SDK Required |
|---|---|
| OpenAI, Google, xAI | `openai` |
| Anthropic | `@anthropic-ai/sdk` |
Most providers expose OpenAI-compatible APIs, so you only need one of two SDKs. Learn more →
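In practice, "one SDK, many providers" means pointing the same `openai` client at a provider's OpenAI-compatible endpoint. A configuration sketch (the base URLs below are the providers' published compatibility endpoints; environment-variable names are illustrative):

```typescript
import OpenAI from 'openai';

// Default endpoint: OpenAI itself.
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Google Gemini via its OpenAI-compatibility endpoint.
const gemini = new OpenAI({
  apiKey: process.env.GEMINI_API_KEY,
  baseURL: 'https://generativelanguage.googleapis.com/v1beta/openai/',
});

// xAI's API is OpenAI-compatible as well.
const xai = new OpenAI({
  apiKey: process.env.XAI_API_KEY,
  baseURL: 'https://api.x.ai/v1',
});
```

Only the `baseURL` (and model name) changes per provider; request and response shapes stay the same.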
Quick Install
```shell
npm install @yourgpt/copilot-sdk @yourgpt/llm-sdk openai
# or
pnpm add @yourgpt/copilot-sdk @yourgpt/llm-sdk openai
# or
bun add @yourgpt/copilot-sdk @yourgpt/llm-sdk openai
```

The `openai` SDK works with OpenAI, Google Gemini, and xAI. For Anthropic, use `@anthropic-ai/sdk` instead. See all providers →
The Flow
```
User types message
        ↓
CopilotProvider sends to your /api/chat
        ↓
Runtime talks to OpenAI/Anthropic/etc.
        ↓
AI decides: respond OR call a tool
        ↓
Tool executes client-side → result sent back
        ↓
AI continues until done (agentic loop)
        ↓
Response streams to UI
```

All of this is handled. You just define tools and build UI.
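The agentic loop in the diagram can be sketched in plain TypeScript. This is a standalone illustration, not the SDK's internals: `fakeModel` is a stub standing in for the real provider call, and the tool names mirror the navigation example below.

```typescript
type Message = { role: 'user' | 'assistant' | 'tool'; content: string };
type ToolCall = { name: string; args: Record<string, unknown> };
type ModelReply = { text?: string; toolCall?: ToolCall };

// Stub standing in for the runtime's LLM call: first turn requests a tool,
// and once a tool result is present, it produces a final text answer.
function fakeModel(messages: Message[]): ModelReply {
  const last = messages[messages.length - 1];
  if (last.role === 'tool') {
    return { text: `Done: ${last.content}` };
  }
  return { toolCall: { name: 'navigate_to_page', args: { path: '/settings' } } };
}

// Client-side tool registry (illustrative).
const tools: Record<string, (args: any) => string> = {
  navigate_to_page: ({ path }) => `navigated to ${path}`,
};

function runLoop(userText: string): string {
  const messages: Message[] = [{ role: 'user', content: userText }];
  // Keep calling the model until it answers with text instead of a tool call,
  // feeding each tool result back into the conversation.
  for (let i = 0; i < 10; i++) {
    const reply = fakeModel(messages);
    if (reply.toolCall) {
      const result = tools[reply.toolCall.name](reply.toolCall.args);
      messages.push({ role: 'tool', content: result });
      continue;
    }
    return reply.text ?? '';
  }
  throw new Error('loop limit reached');
}

console.log(runLoop('take me to settings')); // → "Done: navigated to /settings"
```

The loop cap guards against a model that keeps requesting tools forever; real runtimes apply a similar max-iterations limit.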
Real Example: Navigation Tool
```tsx
import { useToolWithSchema } from '@yourgpt/copilot-sdk/react';
import { useNavigate } from 'react-router-dom';
import { z } from 'zod';

function NavigationTool() {
  const navigate = useNavigate();

  useToolWithSchema({
    name: 'navigate_to_page',
    description: 'Navigate user to a specific page',
    schema: z.object({
      path: z.string().describe('The URL path to navigate to'),
    }),
    handler: async ({ path }) => {
      navigate(path);
      return { success: true, navigatedTo: path };
    },
  });

  return null;
}
```

Now when a user says "take me to settings", the AI calls `navigate_to_page({ path: '/settings' })`.