Advanced Guide
In this guide, you’ll build production‑ready chat apps using framework adapters, secure server layers, and reliable streaming patterns.
Overview
This guide extends Quick Start with real‑world patterns you’ll use in production:
- Framework adapters for clean UIs without manual fetch plumbing.
- Secure server layers that keep keys off the client.
- Server‑side streaming consumption (assemble the final assistant message before responding).
Key Features
- Adapter‑based UI patterns for React, Vue, SvelteKit, Angular, and Nuxt.
- Two server strategies:
- Stream‑through (client renders tokens as they arrive).
- Server‑consume (server aggregates SSE to a final answer, then returns JSON).
UI integrations by framework
Use the official adapters to avoid manual fetch/state plumbing. Each adapter exposes a high‑level chat primitive (e.g., a useChat hook or a Chat store/class) that manages message state, streaming status, stop/regenerate controls, and error handling for you. Under the hood, these clients post to your server proxy (recommended: /api/chat) and render text as it streams in.
Your UI should render messages and their parts (text, tool invocations) and surface a Stop button while status is submitted or streaming. Prefer the adapter for your framework; fall back to raw fetch + SSE only for highly bespoke flows. If you need to keep secrets off the client or aggregate results, see the Server Side section below.
Note: DefaultChatTransport options like headers and body may be provided either as a static object (when values never change) or as an async function returning the object. Prefer the async function form when values are dynamic (e.g., session/user tokens, per‑request metadata); use static values only when they are truly constant.
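As a sketch of the two forms (getAuthToken and the token value here are hypothetical stand‑ins for your auth layer; you would pass either object as the DefaultChatTransport options):

```javascript
// Hypothetical helper standing in for your auth layer; the name and the
// returned token are illustrative, not part of the SDK.
async function getAuthToken() {
  return 'session-token-123';
}

// Static form: fine when the values never change for the lifetime of the page.
const staticOptions = {
  headers: { Authorization: 'Bearer build-time-token' },
  body: { participantEmail: 'user@mail.com' },
};

// Async function form: re-evaluated per request, so tokens stay fresh.
const dynamicOptions = {
  headers: async () => ({ Authorization: `Bearer ${await getAuthToken()}` }),
  body: { participantEmail: 'user@mail.com' },
};
```

The async form costs one extra function call per request but avoids stale credentials after a token refresh.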
Next.js / React (@ai-sdk/react)
First, install the required dependencies:
npm install ai @ai-sdk/react
# or
yarn add ai @ai-sdk/react
# or
pnpm add ai @ai-sdk/react
Then use it in your component:
// app/page.tsx
'use client';

import { useMemo } from 'react';
import { useChat } from '@ai-sdk/react';
import { DefaultChatTransport, generateId } from 'ai';

export default function Chat() {
  // Create the chat id and transport once per mount; recreating them on every
  // render would reset the conversation.
  const chatId = useMemo(() => generateId(), []);
  const networkTransport = useMemo(
    () =>
      new DefaultChatTransport({
        api: 'https://api.korinai.com/api/chat',
        headers: async () => {
          // Fetch a user/session token from your auth layer. Do not expose API keys to the client.
          // getAuthToken is your own helper that returns a fresh token (e.g. from your auth provider).
          // The async function form ensures the token is always up to date.
          const token = await getAuthToken();
          return { Authorization: `Bearer ${token}` };
        },
        body: {
          participantEmail: 'user@mail.com',
        },
      }),
    [],
  );

  const { error, status, sendMessage, messages, regenerate, stop } = useChat({
    id: chatId,
    transport: networkTransport,
  });

  return (
    <div className="flex flex-col w-full max-w-md py-12 mx-auto">
      {messages.map(m => (
        <div key={m.id} className="whitespace-pre-wrap">
          {m.role === 'user' ? 'User: ' : 'AI: '}
          {m.parts.map(part => (part.type === 'text' ? part.text : '')).join('')}
        </div>
      ))}
      {(status === 'submitted' || status === 'streaming') && (
        <div className="mt-4 text-muted-foreground">
          {status === 'submitted' && <div>Loading...</div>}
          <button type="button" className="px-3 py-1 border rounded" onClick={stop}>Stop</button>
        </div>
      )}
      {error && (
        <div className="mt-4">
          <div className="text-red-500">An error occurred.</div>
          <button type="button" className="px-3 py-1 border rounded" onClick={() => regenerate()}>
            Retry
          </button>
        </div>
      )}
      <form
        onSubmit={(e) => {
          e.preventDefault();
          const input = e.currentTarget.elements.namedItem('msg') as HTMLInputElement;
          if (!input.value) return;
          sendMessage({ text: input.value });
          input.value = '';
        }}
      >
        <input name="msg" className="mt-6 w-full p-2 border rounded" placeholder="Say something..." />
      </form>
    </div>
  );
}
SvelteKit (@ai-sdk/svelte)
First, install the required dependencies:
npm install ai @ai-sdk/svelte
# or
yarn add ai @ai-sdk/svelte
# or
pnpm add ai @ai-sdk/svelte
Then use it in your component:
<!-- src/routes/chat/+page.svelte -->
<script lang="ts">
  import { Chat } from '@ai-sdk/svelte';
  import { DefaultChatTransport, generateId } from 'ai';

  const networkTransport = new DefaultChatTransport({
    api: 'https://api.korinai.com/api/chat',
    headers: async () => {
      // Fetch a user/session token from your auth layer. Do not expose API keys to the client.
      // getAuthToken is your own helper that returns a fresh token.
      const token = await getAuthToken();
      return { Authorization: `Bearer ${token}` };
    },
    body: {
      participantEmail: 'user@mail.com',
    },
  });

  const chat = new Chat({ id: generateId(), transport: networkTransport });

  // Runes mode ($derived below) requires $state for reactive local variables.
  let input = $state('');
  const disabled = $derived(chat.status !== 'ready');

  function handleSubmit(e: Event) {
    e.preventDefault();
    if (!input) return;
    chat.sendMessage({ text: input });
    input = '';
  }
</script>

<div class="flex flex-col w-full max-w-md py-12 mx-auto">
  {#each chat.messages as m (m.id)}
    <div class="whitespace-pre-wrap">
      {m.role === 'user' ? 'User: ' : 'AI: '}
      {m.parts.map((p) => (p.type === 'text' ? p.text : '')).join('')}
    </div>
  {/each}
  {#if chat.status === 'submitted' || chat.status === 'streaming'}
    <div class="mt-4 text-gray-500">
      {#if chat.status === 'submitted'}<div>Loading...</div>{/if}
      <button type="button" class="px-3 py-1 border rounded" onclick={() => chat.stop()}>Stop</button>
    </div>
  {/if}
  {#if chat.error}
    <div class="mt-4">
      <div class="text-red-500">An error occurred.</div>
      <button type="button" class="px-3 py-1 border rounded" onclick={() => chat.regenerate()}>Retry</button>
    </div>
  {/if}
  <form onsubmit={handleSubmit}>
    <input bind:value={input} class="mt-6 w-full p-2 border rounded" placeholder="Say something..." disabled={disabled} />
  </form>
  <p>{chat.status}</p>
</div>
Angular (@ai-sdk/angular)
First, install the required dependencies:
npm install ai @ai-sdk/angular
# or
yarn add ai @ai-sdk/angular
# or
pnpm add ai @ai-sdk/angular
Then use it in your component:
// chat.component.ts (standalone component)
import { Component, inject } from '@angular/core';
import { CommonModule } from '@angular/common';
import { FormBuilder, FormGroup, ReactiveFormsModule, Validators } from '@angular/forms';
import { Chat } from '@ai-sdk/angular';
import { DefaultChatTransport, generateId } from 'ai';
import type { UIMessage } from 'ai';

const networkTransport = new DefaultChatTransport({
  api: 'https://api.korinai.com/api/chat',
  headers: async () => {
    // Get a user/session token from your auth layer (do not expose API keys in the client).
    // getAuthToken is your own helper that returns a fresh token.
    const token = await getAuthToken();
    return { Authorization: `Bearer ${token}` };
  },
  body: {
    participantEmail: 'user@mail.com',
  },
});

@Component({
  selector: 'app-chat',
  standalone: true,
  imports: [CommonModule, ReactiveFormsModule],
  template: `
    <div class="flex flex-col gap-3 max-w-xl">
      <ul class="prose dark:prose-invert">
        <li *ngFor="let m of chat.messages">
          <strong>{{ m.role }}:</strong> {{ messageText(m) }}
        </li>
      </ul>
      <form [formGroup]="chatForm" (ngSubmit)="sendMessage()" class="flex gap-2">
        <input class="flex-1" formControlName="userInput" placeholder="Say something" />
        <button type="submit" [disabled]="chat.status !== 'ready'">Send</button>
      </form>
    </div>
  `,
})
export class ChatComponent {
  private fb = inject(FormBuilder);

  public chat: Chat = new Chat({
    id: generateId(),
    transport: networkTransport,
  });

  chatForm: FormGroup = this.fb.group({ userInput: ['', Validators.required] });

  // Messages carry typed parts rather than a plain content string; join the
  // text parts for display.
  messageText(m: UIMessage): string {
    return m.parts.map((p) => (p.type === 'text' ? p.text : '')).join('');
  }

  sendMessage() {
    if (this.chatForm.invalid) return;
    const userInput = this.chatForm.value.userInput as string;
    this.chatForm.reset();
    this.chat.sendMessage(
      { text: userInput },
      {
        // Optional: forward extra payload to your server proxy
        body: { selectedModel: 'gpt-4.1' },
      },
    );
  }
}
Nuxt (@ai-sdk/vue)
First, install the required dependencies:
npm install ai @ai-sdk/vue
# or
yarn add ai @ai-sdk/vue
# or
pnpm add ai @ai-sdk/vue
Then use it in your component:
<!-- pages/index.vue -->
<script setup lang="ts">
import { Chat } from '@ai-sdk/vue';
import { computed, ref } from 'vue';
import { DefaultChatTransport, generateId } from 'ai';

const networkTransport = new DefaultChatTransport({
  api: 'https://api.korinai.com/api/chat',
  headers: async () => {
    // Fetch a user/session token from your auth layer. Do not expose API keys to the client.
    // getAuthToken is your own helper that returns a fresh token.
    const token = await getAuthToken();
    return { Authorization: `Bearer ${token}` };
  },
  body: {
    participantEmail: 'user@mail.com',
  },
});

const chat = new Chat({ id: generateId(), transport: networkTransport });

const input = ref('');
const disabled = computed(() => chat.status !== 'ready');

const handleSubmit = () => {
  if (!input.value) return;
  chat.sendMessage({ text: input.value });
  input.value = '';
};
</script>

<template>
  <div class="flex flex-col w-full max-w-md py-12 mx-auto">
    <div v-for="m in chat.messages" :key="m.id" class="whitespace-pre-wrap">
      {{ m.role === 'user' ? 'User: ' : 'AI: ' }}
      {{ m.parts.map(part => (part.type === 'text' ? part.text : '')).join('') }}
    </div>
    <div v-if="chat.status === 'submitted' || chat.status === 'streaming'" class="mt-4 text-gray-500">
      <div v-if="chat.status === 'submitted'">Loading...</div>
      <button type="button" class="px-3 py-1 border rounded" @click="() => chat.stop()">Stop</button>
    </div>
    <div v-if="chat.error" class="mt-4">
      <div class="text-red-500">An error occurred.</div>
      <button type="button" class="px-3 py-1 border rounded" @click="() => chat.regenerate()">Retry</button>
    </div>
    <form @submit.prevent="handleSubmit">
      <input class="mt-6 w-full p-2 border rounded" v-model="input" placeholder="Say something..." :disabled="disabled" />
    </form>
  </div>
</template>
Server Side
On the server side, your proxy keeps secrets off the client, standardizes headers, and decides how to deliver results: either stream-through SSE for interactive UIs or server-consume to aggregate the final assistant message and return JSON. It should validate inputs, enforce auth/rate limits, set CORS appropriately, and implement sensible timeouts and retries. Log upstream request IDs and finish reasons for observability. Choose stream-through for the best UX; choose server-consume for workflows that need a single finalized payload.
Note: Request body must include participantEmail and messages.
- participantEmail is the email of the participant to chat with. Use the authenticated user’s email to chat with yourself, or another user’s email to chat with that user or agent.
- See Quick Start for minimal request examples.
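The server examples below both server-consume the stream. For the stream-through strategy, the core is copying upstream bytes to the client response without buffering. A minimal sketch of that piping logic, assuming Node 18+ web streams (pipeThrough is a hypothetical name, and `sink` stands in for an Express `res`):

```javascript
// Stream-through sketch: copy an upstream web ReadableStream to a writable
// sink chunk-by-chunk, so the client sees tokens as they arrive.
async function pipeThrough(upstreamBody, sink) {
  const reader = upstreamBody.getReader();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    sink.write(value); // in Express, this would be res.write(value)
  }
  sink.end(); // close the client response when the upstream finishes
}
```

In an Express route you would call `pipeThrough(upstream.body, res)` after setting `Content-Type: text/event-stream` (and disabling response buffering/compression), instead of aggregating the deltas as the examples below do.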
Node.js (Express)
// server.js (Node 18+, which provides global fetch and web streams natively)
import express from 'express';

const app = express();
app.use(express.json());

app.post('/api/chat', async (req, res) => {
  // Forward the request upstream, then CONSUME the SSE server-side and return JSON
  const upstream = await fetch('https://api.korinai.com/api/chat', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${process.env.KORIN_API_KEY}`,
    },
    body: JSON.stringify(req.body),
  });

  if (!upstream.ok || !upstream.body) {
    return res.status(502).json({ error: 'Upstream request failed' });
  }

  const reader = upstream.body.getReader();
  const decoder = new TextDecoder();
  let buffer = '';
  let finalText = '';
  let metadata;

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split('\n');
    buffer = lines.pop() || ''; // keep any partial line for the next chunk
    for (const line of lines) {
      if (!line.startsWith('data: ')) continue;
      const data = line.slice(6);
      if (data === '[DONE]') break;
      try {
        const evt = JSON.parse(data);
        if (evt.type === 'text-delta' && typeof evt.delta === 'string') {
          finalText += evt.delta;
        } else if (evt.type === 'message-metadata') {
          metadata = evt.metadata;
        }
      } catch {
        // Ignore malformed lines (e.g. keep-alive comments)
      }
    }
  }

  return res.json({ text: finalText, metadata });
});

app.listen(3000, () => console.log('Listening on http://localhost:3000'));
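The aggregation loop in the route above can be factored into a small pure function, which makes it easy to unit-test independently of the HTTP plumbing. A sketch (aggregateSSE is a hypothetical name; it mirrors, but does not replace, the route code):

```javascript
// Aggregate raw SSE text into the final assistant message plus metadata.
// Mirrors the parsing loop used in the Express proxy example.
function aggregateSSE(raw) {
  let finalText = '';
  let metadata;
  for (const line of raw.split('\n')) {
    if (!line.startsWith('data: ')) continue;
    const data = line.slice(6);
    if (data === '[DONE]') break;
    try {
      const evt = JSON.parse(data);
      if (evt.type === 'text-delta' && typeof evt.delta === 'string') {
        finalText += evt.delta;
      } else if (evt.type === 'message-metadata') {
        metadata = evt.metadata;
      }
    } catch {
      // Ignore malformed lines (e.g. keep-alive comments)
    }
  }
  return { text: finalText, metadata };
}
```

The route then only handles buffering partial lines between chunks and feeding complete lines into this function.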
PHP (proxy endpoint)
<?php
// chat-proxy.php
// IMPORTANT: Never expose your API key to the browser.
// This endpoint proxies the request server-side, CONSUMES the SSE stream,
// and returns a single JSON payload (no SSE passthrough).
header('Content-Type: application/json; charset=utf-8');

// Read the JSON body from the client
$input = file_get_contents('php://input');
if ($input === false || $input === '') {
    http_response_code(400);
    echo json_encode(['error' => 'Invalid request body']);
    exit;
}

$ch = curl_init('https://api.korinai.com/api/chat');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, [
    'Content-Type: application/json',
    'Authorization: Bearer ' . getenv('KORIN_API_KEY'),
]);
curl_setopt($ch, CURLOPT_POSTFIELDS, $input);

// Stream the response and aggregate the final assistant text.
// A buffer carries partial lines across chunks, since curl may invoke the
// write callback mid-line.
$finalText = '';
$metadata = null;
$buffer = '';
curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $chunk) use (&$finalText, &$metadata, &$buffer) {
    $buffer .= $chunk;
    $lines = preg_split("/\r?\n/", $buffer);
    $buffer = array_pop($lines); // keep any partial line for the next chunk
    foreach ($lines as $line) {
        if (strpos($line, 'data: ') !== 0) continue;
        $data = substr($line, 6);
        if ($data === '[DONE]') continue;
        $evt = json_decode($data, true);
        if (json_last_error() === JSON_ERROR_NONE && is_array($evt)) {
            if (($evt['type'] ?? null) === 'text-delta' && isset($evt['delta'])) {
                $finalText .= (string)$evt['delta'];
            } elseif (($evt['type'] ?? null) === 'message-metadata' && isset($evt['metadata'])) {
                $metadata = $evt['metadata'];
            }
        }
    }
    return strlen($chunk);
});

$ok = curl_exec($ch);
if ($ok === false) {
    http_response_code(502);
    echo json_encode(['error' => curl_error($ch)]);
    curl_close($ch);
    exit;
}
curl_close($ch);

echo json_encode(['text' => $finalText, 'metadata' => $metadata]);
Best Practices
- Security (keys & surfaces)
  - Keep provider/API keys on the server only. Never expose them in client bundles or headers.
  - Prefer environment variables for secrets and rotate keys on a schedule. Scope keys to least privilege.
  - Validate input server‑side (length, type, file sizes) before forwarding to upstream.
- Streaming UX
  - Show typing/streaming status and provide a Stop button during generation.
  - Optimistically render the user message immediately; stream the assistant response incrementally.
  - For long answers, auto‑scroll to bottom while streaming and pause auto‑scroll when the user scrolls up.
- Error handling & retries
  - Distinguish user‑visible errors (401/403/429) from transient server errors (5xx). Show clear, actionable messages.
  - Include a Retry action; on retry, consider resending with the same room/conversation context.
  - Log upstream request IDs (when available) to correlate client and server logs.
- Rate limits & credits
  - Enforce per‑user/tenant quotas server‑side before calling upstream. Return 403 with a helpful message if exceeded.
  - Track usage per model/tool for analytics and cost control. Consider soft limits with warnings.
- Performance
  - Stream‑through for interactive UX; server‑consume for endpoints that must return a compact JSON payload.
  - Avoid heavy work inside the SSE read loop: append deltas, schedule expensive work after stream end.
  - Use HTTP keep‑alive and compression where appropriate; disable compression for SSE if it hurts latency.
- Orchestration & tools
  - Keep roles narrow and compose flows (Research → Draft → Review). Name connections clearly for audits.
  - Require confirmation for tools with side effects or network calls; log tool inputs/outputs.
  - Prefer adapter‑level affordances (status, stop, regenerate) rather than custom re‑implementations.
- Knowledge grounding
  - Keep files/notes small and well‑titled; ask for citations to improve trust.
  - Default to auto‑search across enabled knowledge; allow scoping to specific sources when precision matters.
- Observability
  - Log stream lifecycle: start, deltas, finish, abort, finish reason. Include conversation/room IDs for traceability.
  - Capture minimal metadata for privacy (timestamps, request IDs, model, duration, token counts).
- Testing & environments
  - Maintain staging vs production agents and tool toggles. Smoke‑test streaming, stop, and retry flows before release.
  - Add integration tests that simulate slow networks and large outputs; assert UI remains responsive.
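As a sketch of the slow-network test case (slowSSEStream and collectText are hypothetical helper names; assumes Node 18+ web streams and timers), you can simulate a throttled SSE upstream and assert that the aggregated output is unaffected by chunk timing:

```javascript
// Simulate a slow SSE upstream: each data line arrives after a delay.
function slowSSEStream(lines, delayMs) {
  let i = 0;
  return new ReadableStream({
    async pull(controller) {
      if (i >= lines.length) {
        controller.close();
        return;
      }
      await new Promise((resolve) => setTimeout(resolve, delayMs));
      controller.enqueue(lines[i++] + '\n');
    },
  });
}

// Read the whole stream and aggregate text deltas, as a server-consume proxy would.
async function collectText(stream) {
  const reader = stream.getReader();
  let text = '';
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    for (const line of value.split('\n')) {
      if (!line.startsWith('data: ')) continue;
      const data = line.slice(6);
      if (data === '[DONE]') continue;
      try {
        const evt = JSON.parse(data);
        if (evt.type === 'text-delta') text += evt.delta;
      } catch {
        // Ignore malformed lines
      }
    }
  }
  return text;
}
```

Raising `delayMs` and the number of lines lets the same harness cover both the slow-network and large-output cases.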