AI Interface · Peaqock Financials · 2024

Enterprise AI Assistant

Built a ChatGPT-like assistant with streaming responses and contextual product previews — deployed inside an enterprise trade platform.

Overview

The platform was drowning users in data — thousands of products and companies with no fast way to get answers. The solution was an AI assistant built directly into the product: a streaming chat interface that understood user intent and surfaced the right information inline, without leaving the page.

The Challenge

Traditional search wasn't enough. Users needed something conversational and smart — able to retrieve company profiles, product specs, and documents, and render them as structured components inside the conversation. The hard part was making the AI interface feel instant, not merely functional.

My Role

I owned the entire frontend for the assistant — chat architecture, streaming token renderer, tool-calling system, inline preview cards, session state management, and the embeddable widget architecture that let the assistant be deployed across 5+ platform sections.

Architecture Decisions

SSE streaming over REST

Used Server-Sent Events instead of waiting for full responses — tokens rendered as they arrived, eliminating the frozen-loading feel and making the UI feel instant.
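A minimal sketch of the receiving side: SSE events are newline-delimited frames, and a network chunk can end mid-frame, so the parser buffers until it sees the blank line that terminates an event. The function name and callback shape here are illustrative, not the project's actual API.

```typescript
// Minimal SSE frame parser: feed raw network chunks in, get each
// complete `data:` payload out as soon as its event is terminated.
// (`createSseParser` is an illustrative name, not from the project.)
function createSseParser(onToken: (data: string) => void) {
  let buffer = "";
  return (chunk: string) => {
    buffer += chunk;
    // SSE events are separated by a blank line ("\n\n").
    const events = buffer.split("\n\n");
    buffer = events.pop() ?? ""; // keep the trailing partial event
    for (const event of events) {
      for (const line of event.split("\n")) {
        if (line.startsWith("data:")) onToken(line.slice(5).trimStart());
      }
    }
  };
}
```

Each emitted token can be appended to the message being rendered, which is what produces the typewriter effect instead of a frozen spinner.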

Tool-call driven rendering

Instead of plain text responses, designed a system where the model triggered frontend components (product card, company profile, document viewer) — keeping structured data structured inside chat.
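The dispatch pattern can be sketched as a registry keyed by tool name. In the real UI the renderers would return React components (ProductCard, CompanyProfile, and so on); plain strings stand in for them here, and all names are illustrative.

```typescript
// Tool-call driven rendering sketch: the model emits a named tool call,
// the frontend looks up a registered renderer for it. Strings stand in
// for the React components the real implementation would return.
type ToolCall = { name: string; args: Record<string, unknown> };

const toolRenderers: Record<string, (args: Record<string, unknown>) => string> = {
  product_card: (args) => `<ProductCard id=${args.id}>`,
  company_profile: (args) => `<CompanyProfile id=${args.id}>`,
};

function renderToolCall(call: ToolCall): string {
  const renderer = toolRenderers[call.name];
  // Unknown tools degrade to plain text instead of breaking the chat.
  return renderer ? renderer(call.args) : `[unsupported tool: ${call.name}]`;
}
```

The fallback branch matters: a model can emit a tool the client doesn't know, and degrading to text keeps the conversation usable.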

Embeddable widget architecture

Built the assistant as a self-contained React component with its own context provider — mountable anywhere on the platform without coupling to the host page's state or routing.
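The isolation idea can be shown without React: each mounted widget gets its own private state container, so two instances on the same page never share messages or loading flags. In the real implementation this lives behind a React context provider; the factory below is a framework-agnostic sketch with illustrative names.

```typescript
// Per-instance state sketch: nothing inside the factory touches the
// host page's state or routing, so the widget can mount anywhere.
interface AssistantState {
  messages: string[];
  loading: boolean;
}

function createAssistantWidget() {
  // Private, per-instance state captured in the closure.
  const state: AssistantState = { messages: [], loading: false };
  return {
    send(text: string) {
      state.messages.push(text);
    },
    getMessages: () => [...state.messages], // defensive copy
  };
}
```

Because all state is closed over per instance, deploying the assistant to a new platform section is a mount call, not an integration project.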

Session isolation with abort control

Each session had its own message history, loading state, and AbortController — so users could cancel mid-stream or start a new session without corrupting in-flight requests.
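A sketch of that session shape, assuming the standard AbortController API: every session owns its own controller, cancel aborts only that session's in-flight stream, and reset issues a fresh controller so the next request isn't born aborted. The class name is illustrative.

```typescript
// Per-session abort sketch: cancelling one session's in-flight stream
// cannot affect any other session's requests.
class ChatSession {
  readonly history: string[] = [];
  private controller = new AbortController();

  get signal(): AbortSignal {
    return this.controller.signal;
  }

  cancel() {
    // Aborts any fetch/stream started with this session's signal.
    this.controller.abort();
  }

  reset() {
    // Fresh controller: an aborted signal stays aborted forever,
    // so reuse would poison the session's next request.
    this.controller = new AbortController();
  }
}
```

Passing `session.signal` into each fetch is what ties a cancel button to exactly one stream.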

Hard Problems Solved

  1. Streaming partial JSON — tokens arrive mid-object. Built a buffer that waited for complete tool-call payloads before rendering preview components, preventing broken UI states.

  2. Cancel/retry without race conditions — multiple in-flight requests, abort signals, and optimistic UI rollback all had to work cleanly together.

  3. Token budget management — long conversations hit context limits. Built a compression strategy that summarized old messages while preserving user intent.
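The partial-JSON buffering above can be sketched in a few lines: accumulate streamed tokens and only hand a payload to the UI once the accumulated text parses as valid JSON. This is a simplified illustration (the function name is invented); a production version would also guard against payloads that never complete.

```typescript
// Buffering sketch for streamed tool-call payloads: JSON arrives
// token by token, so nothing is rendered until JSON.parse succeeds
// on the whole accumulated buffer.
function createJsonBuffer(onComplete: (payload: unknown) => void) {
  let buffer = "";
  return (token: string) => {
    buffer += token;
    try {
      const payload = JSON.parse(buffer);
      buffer = ""; // reset for the next tool-call payload
      onComplete(payload); // fires only on a complete, valid object
    } catch {
      // Still mid-object: keep buffering, render nothing.
    }
  };
}
```

This is what keeps a half-received product card from flashing as broken UI while tokens are still arriving.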

Impact

  • Users could explore thousands of products and companies through natural conversation — no page navigation required

  • Inline preview cards eliminated the majority of lookup friction for business queries

  • The embeddable architecture let the product team ship the assistant across 5+ sections within days of completion

Tech Stack

Next.js · React · TypeScript · OpenAI API · Server-Sent Events · React Query · Axios · Tailwind CSS