Step 6: History in Frontend - How I Build
Adding login, session sidebar, and fixing the streaming bug with useRef in the Next.js frontend.
The Goal
Connect the frontend to the auth system. Users can register, log in, see their past conversations in a sidebar, and resume any chat. Messages are saved during streaming, not after.
Login/Register Screen
A simple toggle between login and register forms. Nothing fancy, just functional.
"use client";

import { useState } from "react";
import { Button } from "@/components/ui/button";
import { Input } from "@/components/ui/input";

interface AuthScreenProps {
  onLogin: (token: string) => void;
}

export function AuthScreen({ onLogin }: AuthScreenProps) {
  const [mode, setMode] = useState<"login" | "register">("login");
  const [email, setEmail] = useState("");
  const [password, setPassword] = useState("");
  const [error, setError] = useState("");

  const handleSubmit = async (e: React.FormEvent) => {
    e.preventDefault();
    setError("");
    const endpoint = mode === "login" ? "/auth/login" : "/auth/register";
    try {
      const res = await fetch(`${process.env.NEXT_PUBLIC_API_URL}${endpoint}`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ email, password }),
      });
      if (!res.ok) {
        const data = await res.json();
        setError(data.detail || "Something went wrong");
        return;
      }
      const { token } = await res.json();
      localStorage.setItem("token", token);
      onLogin(token);
    } catch {
      setError("Cannot connect to server");
    }
  };

  return (
    <form onSubmit={handleSubmit} className="max-w-sm mx-auto mt-20 space-y-4">
      <h1 className="text-2xl font-bold">
        {mode === "login" ? "Sign In" : "Create Account"}
      </h1>
      <Input
        type="email"
        placeholder="Email"
        value={email}
        onChange={(e) => setEmail(e.target.value)}
        required
      />
      <Input
        type="password"
        placeholder="Password (min 8 chars)"
        value={password}
        onChange={(e) => setPassword(e.target.value)}
        minLength={8}
        required
      />
      {error && <p className="text-red-500 text-sm">{error}</p>}
      <Button type="submit" className="w-full">
        {mode === "login" ? "Sign In" : "Register"}
      </Button>
      <p className="text-sm text-center">
        {mode === "login" ? "No account? " : "Already have an account? "}
        <button
          type="button"
          onClick={() => setMode(mode === "login" ? "register" : "login")}
          className="underline"
        >
          {mode === "login" ? "Register" : "Sign In"}
        </button>
      </p>
    </form>
  );
}
Sidebar with Sessions
The sidebar fetches all sessions for the current user and displays them as a clickable list.
interface Session {
  id: number;
  title: string;
  updated_at: string;
}

export function Sidebar({
  sessions,
  activeId,
  onSelect,
  onNewChat,
}: {
  sessions: Session[];
  activeId: number | null;
  onSelect: (id: number) => void;
  onNewChat: () => void;
}) {
  return (
    <aside className="w-64 border-r h-screen flex flex-col">
      <div className="p-4">
        <Button onClick={onNewChat} className="w-full">
          + New Chat
        </Button>
      </div>
      <div className="flex-1 overflow-y-auto">
        {sessions.map((session) => (
          <button
            key={session.id}
            onClick={() => onSelect(session.id)}
            className={`w-full text-left px-4 py-2 text-sm truncate hover:bg-gray-100
              ${activeId === session.id ? "bg-gray-100 font-medium" : ""}`}
          >
            {session.title}
          </button>
        ))}
      </div>
    </aside>
  );
}
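The sidebar renders sessions in whatever order the API returns them. If your backend does not already sort, a small helper can order them by updated_at, newest first. This is a minimal sketch (sortSessions is my own helper name, not part of the backend contract), assuming updated_at is an ISO 8601 timestamp:

```typescript
interface Session {
  id: number;
  title: string;
  updated_at: string; // ISO 8601, e.g. "2024-05-01T12:00:00Z"
}

// Return a new array sorted by updated_at, most recent first.
// Copies the input instead of mutating it, so it is safe to call on React state.
function sortSessions(sessions: Session[]): Session[] {
  return [...sessions].sort(
    (a, b) => Date.parse(b.updated_at) - Date.parse(a.updated_at)
  );
}
```

Call it once after fetching, e.g. setSessions(sortSessions(data)), so newly updated chats bubble to the top of the list.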
The isChattingRef Bug
This was the most frustrating bug in the entire project. Here is what happened:
The problem: When a streaming response was in progress, the useEffect that loads session messages would fire and reset the message list to what was in the database. But the database only had the user's message (the assistant response was still streaming). So the streaming answer would disappear mid-sentence.
The wrong fix: Adding the streaming message to state and checking if (messages.length > 0). This caused infinite re-renders because setting state inside useEffect triggered the effect again.
The correct fix: A ref that tracks whether we are currently streaming.
import { useRef, useEffect, useState } from "react";

export function useChat(sessionId: number | null, token: string) {
  const [messages, setMessages] = useState<Message[]>([]);
  const isChattingRef = useRef(false);

  // Load messages when session changes - but NOT while streaming
  useEffect(() => {
    if (!sessionId || isChattingRef.current) return;
    async function loadMessages() {
      const res = await fetch(`${API_URL}/sessions/${sessionId}/messages`, {
        headers: { Authorization: `Bearer ${token}` },
      });
      const data = await res.json();
      setMessages(data);
    }
    loadMessages();
  }, [sessionId, token]);

  const sendMessage = async (content: string) => {
    // Set ref BEFORE starting the stream
    isChattingRef.current = true;
    // Add user message to UI immediately
    setMessages((prev) => [...prev, { role: "user", content }]);

    try {
      // Start streaming
      const res = await fetch(`${API_URL}/chat/stream`, {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          Authorization: `Bearer ${token}`,
        },
        body: JSON.stringify({ question: content, session_id: sessionId }),
      });
      const reader = res.body!.getReader();
      const decoder = new TextDecoder();
      let assistantMessage = "";

      // Add empty assistant message
      setMessages((prev) => [...prev, { role: "assistant", content: "" }]);

      while (true) {
        const { done, value } = await reader.read();
        if (done) break;
        // stream: true keeps multi-byte characters intact across chunk boundaries
        const text = decoder.decode(value, { stream: true });
        const lines = text.split("\n");
        for (const line of lines) {
          if (line.startsWith("data: ") && line !== "data: [DONE]") {
            try {
              const json = JSON.parse(line.slice(6));
              assistantMessage += json.content;
              // Update the last message in place
              setMessages((prev) => {
                const updated = [...prev];
                updated[updated.length - 1] = {
                  role: "assistant",
                  content: assistantMessage,
                };
                return updated;
              });
            } catch {
              // Skip malformed chunks
            }
          }
        }
      }
    } finally {
      // Stream complete (or failed) - allow the useEffect to load messages again.
      // Without the finally, a network error would leave the ref stuck at true.
      isChattingRef.current = false;
    }
  };

  return { messages, sendMessage };
}
Why useRef instead of useState:
- Refs do not trigger re-renders. If isChatting were state, changing it would re-render the component, which would re-run the effect, which would load messages, which would overwrite the stream. Circular.
- Refs are always current. Inside the useEffect closure, a state value is stale (captured at render time). A ref always gives you the latest value.
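One subtlety the streaming loop glosses over: a network chunk can end in the middle of a data: line, and splitting each chunk on newlines in isolation would then drop or mangle that event. A parser that carries the unfinished line over in a buffer avoids this. The sketch below is my own refactoring, not code from the hook, and createSSEParser is a name I made up:

```typescript
// Incremental SSE parser: feed() accepts raw text chunks and returns the
// JSON payloads of all complete "data: ..." events they contain.
// A trailing partial line is buffered until the next chunk completes it.
function createSSEParser() {
  let buffer = "";
  return {
    feed(chunk: string): { content: string }[] {
      buffer += chunk;
      const lines = buffer.split("\n");
      buffer = lines.pop() ?? ""; // keep the (possibly partial) last line
      const events: { content: string }[] = [];
      for (const line of lines) {
        if (line.startsWith("data: ") && line !== "data: [DONE]") {
          try {
            events.push(JSON.parse(line.slice(6)));
          } catch {
            // skip malformed events
          }
        }
      }
      return events;
    },
  };
}
```

Inside the read loop this replaces the manual split: for each decoded chunk, iterate parser.feed(text) and append each event's content to assistantMessage.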
Saving Messages During Streaming
Messages are saved to the database as they stream, not after. The backend handles this:
@app.post("/chat/stream")
async def chat_stream(request: ChatRequest, user_id: int = Depends(get_current_user)):
    """Stream response and save both messages to the database."""
    # Save user message immediately
    save_message(request.session_id, "user", request.question)

    async def generate():
        full_response = ""
        # ... stream chunks ...
        for chunk in response:
            if chunk.text:
                full_response += chunk.text
                yield f"data: {json.dumps({'content': chunk.text})}\n\n"
        # Save assistant message after stream completes
        save_message(request.session_id, "assistant", full_response)
        yield "data: [DONE]\n\n"

    return StreamingResponse(generate(), media_type="text/event-stream")
The user message is saved before streaming starts. The assistant message is saved after the full response is accumulated. If the stream breaks midway, you lose the partial response but keep the user's question.
localStorage Token Persistence
// On app mount: check for existing token
useEffect(() => {
  const saved = localStorage.getItem("token");
  if (saved) {
    setToken(saved);
    setIsAuthenticated(true);
  }
}, []);

// On login: save token
const handleLogin = (token: string) => {
  localStorage.setItem("token", token);
  setToken(token);
  setIsAuthenticated(true);
};

// On sign out: clear token
const handleSignOut = () => {
  localStorage.removeItem("token");
  setToken("");
  setIsAuthenticated(false);
};
For a POC, localStorage is fine. For production, I will move to httpOnly cookies to prevent XSS attacks from stealing the token.
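A related gap in the mount check above: a stored token may already be expired. If the token is a JWT (the Bearer header suggests it, though this post never shows the token format), the client can read the exp claim before trusting it. A hedged sketch; isTokenExpired is my own helper name, and it only reads the unverified payload, it does not validate the signature:

```typescript
// Decode the JWT payload (second dot-separated segment, base64url-encoded)
// and compare its exp claim (seconds since epoch) against the current time.
// This does NOT verify the signature - the server must still reject bad tokens.
function isTokenExpired(token: string, nowMs: number = Date.now()): boolean {
  try {
    const payloadB64 = token.split(".")[1];
    // base64url -> base64 (atob is available in browsers and Node 16+)
    const b64 = payloadB64.replace(/-/g, "+").replace(/_/g, "/");
    const payload = JSON.parse(atob(b64));
    if (typeof payload.exp !== "number") return false; // no exp claim
    return payload.exp * 1000 <= nowMs;
  } catch {
    return true; // unreadable token: treat as expired
  }
}
```

On mount, check the saved token with this before calling setIsAuthenticated(true), so the user lands on the login screen instead of a wall of 401s.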
Lessons Learned
- useRef for cross-render flags, useState for UI. If something needs to be visible, use state. If something needs to be checked inside closures without triggering re-renders, use a ref.
- Save during streaming, not after. If the user closes the tab mid-stream, you still have the question saved. Partial data is better than no data.
- localStorage is a POC pattern. It works, it is simple, but it is vulnerable to XSS. Know the tradeoff.
- Auto-scroll during streaming. One detail I missed initially: the chat container needs to scroll to the bottom as new tokens arrive. A simple scrollIntoView() on the last message element fixes this.
What is Next
Step 7 containerizes everything with Docker so it can be deployed anywhere with a single docker compose up.