react-ai-stream
Backend-agnostic AI streaming primitives for React.
react-ai-stream is a lightweight React SDK for building ChatGPT-style interfaces with any AI backend. Stream responses from OpenAI, Anthropic, Groq, FastAPI, Rails, Go, or custom SSE endpoints using a single useAIChat hook.
Designed for teams that want full control over their backend, UI, and streaming pipeline — without being locked into a specific provider or framework.
Key Features
- Single React Hook: Manage streaming messages, loading states, errors, and abort handling with a minimal API.
- Backend-Agnostic Architecture: Works with OpenAI, Anthropic, Groq, or any backend capable of streaming Server-Sent Events (SSE).
- Drop-in or Fully Custom UI: Use the included `<Chat />` component or build your own interface with Tailwind, shadcn/ui, Chakra, or any design system.
- Streaming Event Hooks: `onToken`, `onComplete`, and `onError` callbacks make analytics, voice interfaces, agents, and side effects easy without extra state management.
- TypeScript-First: Strictly typed APIs with ESM + CJS support and a lightweight bundle.
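The streaming event hooks map naturally onto a token stream. A framework-free sketch of the dispatch pattern they imply (the `onToken`/`onComplete`/`onError` names come from the feature list above; the surrounding types and function are illustrative, not the library's actual internals):

```typescript
type StreamHooks = {
  onToken?: (token: string) => void
  onComplete?: (message: string) => void
  onError?: (err: Error) => void
}

function dispatchStream(tokens: string[], hooks: StreamHooks): string {
  let message = ''
  try {
    for (const token of tokens) {
      message += token
      hooks.onToken?.(token) // fires once per streamed token
    }
    hooks.onComplete?.(message) // fires once with the assembled message
    return message
  } catch (err) {
    hooks.onError?.(err as Error)
    throw err
  }
}
```

Because side effects hang off the callbacks rather than component state, analytics or voice output can be wired up without re-rendering on every token.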
Who Is It For?
- SaaS teams embedding AI copilots into dashboards
- Developers building multi-model AI comparison tools
- Enterprise apps needing multiple isolated chat instances
- Teams using non-Node.js backends (FastAPI, Go, Rails, etc.)
- Developers who want streaming AI infrastructure without framework lock-in
Example
```tsx
'use client'

import { useAIChat } from '@react-ai-stream/react'
import { Chat } from '@react-ai-stream/ui'
import '@react-ai-stream/ui/styles'

export default function Page() {
  const { messages, sendMessage, loading, stop } = useAIChat({
    endpoint: '/api/chat',
  })

  return (
    <div style={{ height: '80vh' }}>
      <Chat
        messages={messages}
        onSend={sendMessage}
        onStop={stop}
        loading={loading}
      />
    </div>
  )
}
```
Why It’s Different
Most AI chat SDKs tightly couple your frontend to a specific provider or backend framework.
react-ai-stream treats AI streaming as a simple protocol boundary: the backend emits `text`, `done`, and `error` events over SSE.
Your React app never needs to know which LLM produced the stream.
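To make the protocol boundary concrete, here is a minimal parser for a three-event SSE stream. The `text`/`done`/`error` event names are from the description above; the exact wire format (standard `event:`/`data:` fields, frames separated by a blank line) is an assumption based on the SSE specification, not confirmed library behavior:

```typescript
type StreamEvent =
  | { type: 'text'; data: string }
  | { type: 'done' }
  | { type: 'error'; data: string }

function parseSSE(raw: string): StreamEvent[] {
  const events: StreamEvent[] = []
  // SSE frames are separated by a blank line.
  for (const frame of raw.split('\n\n')) {
    const lines = frame.split('\n').filter(Boolean)
    if (lines.length === 0) continue
    let type = 'message'
    let data = ''
    for (const line of lines) {
      if (line.startsWith('event:')) type = line.slice(6).trim()
      else if (line.startsWith('data:')) data += line.slice(5).trim()
    }
    if (type === 'text') events.push({ type, data })
    else if (type === 'done') events.push({ type: 'done' })
    else if (type === 'error') events.push({ type, data })
  }
  return events
}
```

Any backend that can write this shape of stream, whether FastAPI, Rails, Go, or a Node route handler, looks identical to the client.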
Highlights
- Lightweight: ~12 kB bundle
- Works with plain React and Next.js
- Supports multiple independent chat instances
- Markdown rendering + syntax highlighting included
- Built-in abort handling and streaming lifecycle management
- Fully documented with architecture diagrams and live examples
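Abort handling in a streaming chat typically means checking a cancellation signal between tokens. A minimal sketch, assuming the hook's `stop()` is backed by an `AbortController` (an implementation detail not confirmed by the docs; `streamWithAbort` and its parameters are hypothetical):

```typescript
async function streamWithAbort(
  chunks: string[],
  signal: AbortSignal,
  onToken: (t: string) => void,
): Promise<'done' | 'aborted'> {
  for (const chunk of chunks) {
    // Stop consuming mid-stream as soon as abort is requested.
    if (signal.aborted) return 'aborted'
    onToken(chunk)
  }
  return 'done'
}
```

In the real hook, the same signal would also be passed to `fetch` so the underlying HTTP request is torn down, not just ignored.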
react-ai-stream is open source, MIT licensed, and focused on composable AI streaming infrastructure for modern React applications.