---
id: "ET-LLM-007"
title: "LLM Agent Module Frontend Specification"
type: "Technical Specification"
status: "Done"
priority: "High"
epic: "OQI-007"
project: "trading-platform"
version: "1.0.0"
created_date: "2025-12-15"
updated_date: "2026-01-25"
---
# ET-LLM-007: LLM Agent Module Frontend Specification
**Epic:** OQI-007 - LLM Strategy Agent
**Version:** 1.0
**Date:** 2025-12-15
**Status:** Implemented
**Priority:** P1 - High
---
## Summary
This specification defines the frontend architecture, components, and services for the LLM Strategy Agent module. The frontend provides an interactive conversational interface that lets users communicate with the LLM agent for trading analysis, strategy queries, and real-time signal visualization.
---
## General Architecture
```
┌──────────────────────────────────────────────────────────────────────┐
│ FRONTEND (React) │
│ Port: 3000 │
├──────────────────────────────────────────────────────────────────────┤
│ │
│ ┌────────────────────┐ ┌────────────────────┐ ┌────────────────┐ │
│ │ Pages │ │ Components │ │ Services │ │
│ ├────────────────────┤ ├────────────────────┤ ├────────────────┤ │
│ │ • AssistantPage │ │ • ChatWindow │ │ • llmAgentSvc │ │
│ │ • ChatPage │ │ • ChatMessage │ │ • llmProviders │ │
│ │ • StrategyPage │ │ • ChatInput │ │ • tokenService │ │
│ │ • AnalysisPage │ │ • SignalCard │ │ • storageService│ │
│ │ │ │ • ToolCallCard │ │ │ │
│ │ │ │ • MessageList │ │ │ │
│ │ │ │ • SidebarConv │ │ │ │
│ │ │ │ • LoadingSpinner │ │ │ │
│ │ │ │ • ErrorBoundary │ │ │ │
│ └────────────────────┘ └────────────────────┘ └────────────────┘ │
│ │ │ │ │
│ └──────────────────────┼──────────────────────┘ │
│ │ │
│ ┌────────────────────────▼──────────────────────────┐ │
│ │ Zustand Store │ │
│ │ ┌─────────────┐ ┌───────────┐ ┌─────────────┐ │ │
│ │ │ chatStore │ │ uiStore │ │ signalStore │ │ │
│ │ └─────────────┘ └───────────┘ └─────────────┘ │ │
│ └────────────────────┬───────────────────────────────┘ │
│ │ │
│ ┌────────────────────▼──────────────────────────────┐ │
│ │ Query Client (React Query) │ │
│ Caching, Synchronization, Invalidation │
│ └────────────────────┬───────────────────────────────┘ │
│ │ │
└──────────────────────────────┼───────────────────────────────────────┘
│
┌───────────────────┼───────────────────┐
│ │ │
┌──────────▼────────┐ ┌───────▼────────┐ ┌──────▼──────────┐
│ LLM Agent API │ │ Backend API │ │ WebSocket │
│ :3085 │ │ :3080 │ │ :3085/ws │
│ │ │ │ │ │
│ POST /chat │ │ GET /profiles │ │ message:send │
│ GET /history │ │ GET /signals │ │ agent:stream │
│ POST /strategies │ │ POST /orders │ │ agent:complete │
│ DELETE /convs │ │ GET /portfolio │ │ error │
│ │ │ │ │ │
└───────────────────┘ └────────────────┘ └─────────────────┘
```
---
## Frontend Technology Stack
```yaml
Core:
- React: 18.2.0
- TypeScript: 5.3.0
- Vite: 6.2.0
- React Router: 6.18.0
State Management:
- Zustand: 4.4.7
- React Query (TanStack Query): 5.14.0
Styling:
- Tailwind CSS: 3.3.0
- Headless UI: 1.7.0
- Radix UI: 1.0.0
Components & UI:
- React Markdown: 8.0.0
- Highlight.js: 11.8.0
- Recharts: 2.10.0 (simple charts)
- lightweight-charts: 4.1.1 (trading charts)
Utilities:
- axios: 1.6.0
- date-fns: 2.30.0
- clsx: 2.0.0
Testing:
- Vitest: 1.0.0
- @testing-library/react: 14.1.0
- @testing-library/user-event: 14.5.0
Dev Tools:
- @types/react: 18.2.0
- @types/node: 20.9.0
- ESLint: 8.50.0
- Prettier: 3.0.0
```
---
## Frontend Directory Structure
```
apps/frontend/src/
├── pages/
│ ├── AssistantPage.tsx # Main assistant page
│ ├── ChatPage.tsx # Conversational chat page
│ ├── StrategyPage.tsx # Strategy analysis
│ └── AnalysisPage.tsx # Detailed analysis
│
├── modules/
│ └── llm-agent/
│ ├── components/
│ │ ├── ChatWindow.tsx
│ │ ├── ChatMessage.tsx
│ │ ├── ChatInput.tsx
│ │ ├── SignalCard.tsx
│ │ ├── ToolCallCard.tsx
│ │ ├── MessageList.tsx
│ │ ├── SidebarConversations.tsx
│ │ ├── LoadingIndicator.tsx
│ │ └── ErrorBoundary.tsx
│ │
│ ├── hooks/
│ │ ├── useChat.ts
│ │ ├── useLlmAgent.ts
│ │ ├── useSignals.ts
│ │ ├── useWebSocket.ts
│ │ └── useMessageStream.ts
│ │
│ ├── services/
│ │ ├── llmAgentService.ts # LLM Agent API client
│ │ ├── websocketService.ts # WS connection
│ │ ├── llmProviderService.ts # OpenAI/Claude adapters
│ │ └── tokenService.ts # Token counting
│ │
│ ├── stores/
│ │ ├── chatStore.ts # Zustand chat store
│ │ ├── uiStore.ts # UI state
│ │ └── signalStore.ts # Signals and analysis
│ │
│ ├── types/
│ │ ├── index.ts
│ │ ├── chat.types.ts
│ │ ├── signal.types.ts
│ │ └── api.types.ts
│ │
│ ├── utils/
│ │ ├── messageFormatters.ts
│ │ ├── tokenCounter.ts
│ │ ├── errorHandlers.ts
│ │ └── conversationHelpers.ts
│ │
│ ├── hooks.test.ts
│ ├── services.test.ts
│ └── components.test.tsx
│
├── shared/
│ ├── components/
│ │ ├── Button.tsx
│ │ ├── Card.tsx
│ │ ├── Modal.tsx
│ │ └── Spinner.tsx
│ │
│ └── hooks/
│ └── useWindowSize.ts
│
├── stores/
│ └── authStore.ts # Global authentication state
│
├── services/
│ ├── api.ts # axios configuration
│ ├── authService.ts # JWT, login, logout
│ └── storageService.ts # LocalStorage helpers
│
├── App.tsx
├── main.tsx
├── index.css
└── types.d.ts
```
---
## Main Components
### 1. ChatWindow Component
**Location:** `apps/frontend/src/modules/llm-agent/components/ChatWindow.tsx`
**Responsibility:** Main container that orchestrates the conversational chat
```typescript
interface ChatWindowProps {
  conversationId: string;
  onClose?: () => void;
  initialMessage?: string;
  theme?: 'light' | 'dark';
}

export function ChatWindow({
  conversationId,
  onClose,
  initialMessage,
  theme = 'light'
}: ChatWindowProps) {
  // State management
  const { messages, isLoading, streamingContent } = useChatStore();
  const { connected, connect, disconnect } = useWebSocket();

  // Lifecycle
  useEffect(() => {
    // Connect and load history
  }, [conversationId]);

  // Handlers
  const handleSendMessage = (content: string) => {
    // Send message via WS
  };
  const handleCancel = () => {
    // Cancel generation
  };

  // Illustrative minimal layout; the spec does not prescribe the exact JSX
  return (
    <div className={`chat-window theme-${theme}`}>
      <div className="connection-indicator">
        {connected ? 'Connected' : 'Disconnected'}
      </div>
      <MessageList messages={messages} streamingContent={streamingContent} />
      <ChatInput onSend={handleSendMessage} onCancel={handleCancel} isLoading={isLoading} />
    </div>
  );
}
```
**Features:**
- Support for multiple conversations
- Auto-scroll to the latest messages
- Connection indicator
- Light/dark theme
- Responsive design (mobile-first)
---
### 2. ChatMessage Component
**Location:** `apps/frontend/src/modules/llm-agent/components/ChatMessage.tsx`
**Responsibility:** Render an individual message with support for markdown, code, and tool calls
```typescript
interface ChatMessageProps {
  message: Message;
  isStreaming?: boolean;
  onFeedback?: (rating: number, comment?: string) => void;
}

export function ChatMessage({
  message,
  isStreaming,
  onFeedback
}: ChatMessageProps) {
  // Illustrative markup; the spec does not prescribe the exact JSX
  return (
    <div className={`message message-${message.role}`}>
      <span className="avatar">{message.role === 'user' ? '🧑' : '🤖'}</span>

      <div className="message-bubble">
        {/* Render content as markdown */}
        <ReactMarkdown>{message.content}</ReactMarkdown>

        {/* Tool calls, if present */}
        {message.toolCalls?.map((tool) => (
          <ToolCallCard key={tool.id} toolCall={tool} />
        ))}

        {/* Signals, if present */}
        {message.signals?.map((signal) => (
          <SignalCard key={signal.id} signal={signal} compact />
        ))}

        {/* Feedback only for assistant messages */}
        {message.role === 'assistant' && onFeedback && (
          <div className="feedback">
            <button onClick={() => onFeedback(1)}>👍</button>
            <button onClick={() => onFeedback(-1)}>👎</button>
          </div>
        )}

        <span className="message-meta">
          {formatTime(message.createdAt)}
          {message.tokensInput !== undefined && (
            <> · {message.tokensInput + (message.tokensOutput ?? 0)} tokens</>
          )}
        </span>
      </div>
    </div>
  );
}
```
**Features:**
- Safe Markdown rendering
- Code highlighting with Highlight.js
- Math LaTeX support
- Feedback rating (thumbs up/down + comment)
- Copyable code blocks
- Display of consumed tokens
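The token display boils down to a small formatting rule: show nothing when input tokens were not recorded, otherwise sum input and output tokens. A minimal sketch, assuming a hypothetical `formatTokenTotal` helper (the component inlines this logic):

```typescript
// Hypothetical helper mirroring the token-count display on assistant
// messages; tokensOutput may still be missing while a reply streams.
export function formatTokenTotal(
  tokensInput?: number,
  tokensOutput?: number
): string | null {
  if (tokensInput === undefined) return null;
  return `${tokensInput + (tokensOutput ?? 0)} tokens`;
}
```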
---
### 3. ChatInput Component
**Location:** `apps/frontend/src/modules/llm-agent/components/ChatInput.tsx`
**Responsibility:** User input with support for multi-line text, attachments, and commands
```typescript
interface ChatInputProps {
onSend: (content: string, files?: File[]) => void;
onCancel?: () => void;
isLoading?: boolean;
disabled?: boolean;
maxLength?: number;
placeholder?: string;
}
export function ChatInput({
onSend,
onCancel,
isLoading,
disabled,
maxLength = 4000,
  placeholder = 'Type your query here...'
}: ChatInputProps) {
  const [content, setContent] = useState('');
  const [files, setFiles] = useState<File[]>([]);
  const textareaRef = useRef<HTMLTextAreaElement>(null);
// Auto-resize textarea
useEffect(() => {
if (textareaRef.current) {
textareaRef.current.style.height = 'auto';
textareaRef.current.style.height =
Math.min(textareaRef.current.scrollHeight, 200) + 'px';
}
}, [content]);
const handleSend = () => {
if (!content.trim()) return;
onSend(content, files);
setContent('');
setFiles([]);
};
  const handleKeyDown = (e: React.KeyboardEvent<HTMLTextAreaElement>) => {
    // Send with Ctrl+Enter or Cmd+Enter
if ((e.ctrlKey || e.metaKey) && e.key === 'Enter') {
e.preventDefault();
handleSend();
}
};
  // Illustrative markup; the spec does not prescribe the exact JSX
  return (
    <div className="chat-input">
      {/* File attachments preview */}
      {files.length > 0 && (
        <div className="attachments">
          {files.map((file) => (
            <span key={file.name} className="attachment">
              {file.name}
              <button onClick={() => setFiles((f) => f.filter((x) => x !== file))}>
                ×
              </button>
            </span>
          ))}
        </div>
      )}

      <textarea
        ref={textareaRef}
        value={content}
        maxLength={maxLength}
        placeholder={placeholder}
        disabled={disabled}
        onChange={(e) => setContent(e.target.value)}
        onKeyDown={handleKeyDown}
      />

      {isLoading ? (
        <button onClick={onCancel}>Cancel</button>
      ) : (
        <button onClick={handleSend} disabled={disabled || !content.trim()}>
          Send
        </button>
      )}
    </div>
  );
}
```
**Features:**
- Auto-expanding textarea
- Keyboard shortcuts (Ctrl+Enter to send)
- File attachments (CSV, images, PDFs)
- Character counter
- Cancel button while processing
- Appropriate disabled state
---
### 4. SignalCard Component
**Location:** `apps/frontend/src/modules/llm-agent/components/SignalCard.tsx`
**Responsibility:** Display trading signals recommended by the agent
```typescript
interface SignalCardProps {
  signal: TradingSignal;
  onExecute?: (signal: TradingSignal) => void;
  onAddToWatchlist?: (signal: TradingSignal) => void;
  compact?: boolean;
}

export function SignalCard({
  signal,
  onExecute,
  onAddToWatchlist,
  compact = false
}: SignalCardProps) {
  const { rating, confidence } = signal;
  const isPositive = signal.direction === 'LONG';

  // Illustrative markup; the spec does not prescribe the exact JSX
  return (
    <div className="signal-card">
      <div className="signal-header">
        <span className={`badge-direction ${isPositive ? 'badge-long' : 'badge-short'}`}>
          {signal.direction}
        </span>
        <span>{signal.symbol}</span>
        <span>{signal.timeframe}</span>
        <span>Confidence: {confidence}% · Rating: {rating}</span>
      </div>

      {!compact && (
        <div className="signal-body">
          <h4>Analysis</h4>
          <p>{signal.reasoning}</p>
        </div>
      )}

      {!compact && signal.indicators && (
        <div className="signal-body">
          <h4>Indicators</h4>
          <ul>
            {signal.indicators.map((indicator) => (
              <li key={indicator.name}>
                {indicator.name}: {indicator.value}
              </li>
            ))}
          </ul>
        </div>
      )}

      {onExecute && (
        <button onClick={() => onExecute(signal)}>Execute</button>
      )}
      {onAddToWatchlist && (
        <button onClick={() => onAddToWatchlist(signal)}>Add to watchlist</button>
      )}
    </div>
  );
}
```
**Features:**
- Confidence and rating metrics
- Entry, TP, and SL levels
- Risk/reward calculation
- Analysis and reasoning
- Technical indicators
- Quick actions (execute, add to watchlist)
- Compact/expanded mode
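The risk/reward figure on the card can be derived from the signal levels alone. A minimal sketch, assuming a hypothetical `computeRiskReward` helper and that TP/SL sit on the correct sides of entry:

```typescript
// Hypothetical helper for the risk/reward metric shown on SignalCard.
// For a LONG, reward is tp - entry and risk is entry - sl; for a
// SHORT the two differences are reversed.
interface SignalLevels {
  direction: 'LONG' | 'SHORT';
  entry: number;
  tp: number;
  sl: number;
}

export function computeRiskReward({ direction, entry, tp, sl }: SignalLevels): number {
  const reward = direction === 'LONG' ? tp - entry : entry - tp;
  const risk = direction === 'LONG' ? entry - sl : sl - entry;
  if (risk <= 0) throw new Error('SL must be on the losing side of entry');
  // Round to two decimals for display
  return Number((reward / risk).toFixed(2));
}
```

For example, a LONG at 100 with TP 110 and SL 95 yields a 2:1 ratio.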
---
## Frontend Services
### 1. LLM Agent Service
**Location:** `apps/frontend/src/modules/llm-agent/services/llmAgentService.ts`
**Responsibility:** HTTP client for the LLM Agent API
```typescript
class LLMAgentService {
  private baseUrl = 'http://localhost:3085';
  private client = axios.create({
    baseURL: this.baseUrl,
    timeout: 30000,
  });

  // Conversations
  async getConversations(limit = 20): Promise<Conversation[]> {
    const { data } = await this.client.get('/conversations', {
      params: { limit },
    });
    return data;
  }

  async createConversation(title?: string): Promise<Conversation> {
    const { data } = await this.client.post('/conversations', { title });
    return data;
  }

  async getConversation(id: string): Promise<Conversation> {
    const { data } = await this.client.get(`/conversations/${id}`);
    return data;
  }

  async deleteConversation(id: string): Promise<void> {
    await this.client.delete(`/conversations/${id}`);
  }

  async renameConversation(id: string, title: string): Promise<Conversation> {
    const { data } = await this.client.patch(`/conversations/${id}`, { title });
    return data;
  }

  // Messages
  async getMessages(
    conversationId: string,
    limit = 50,
    offset = 0
  ): Promise<Message[]> {
    const { data } = await this.client.get(
      `/conversations/${conversationId}/messages`,
      { params: { limit, offset } }
    );
    return data;
  }

  async sendMessage(
    conversationId: string,
    content: string
  ): Promise<Message> {
    const { data } = await this.client.post(
      `/conversations/${conversationId}/messages`,
      { content }
    );
    return data;
  }

  async provideFeedback(
    messageId: string,
    rating: number,
    comment?: string
  ): Promise<void> {
    await this.client.post(`/messages/${messageId}/feedback`, {
      rating,
      comment,
    });
  }

  // Strategies
  async analyzeStrategy(
    symbol: string,
    timeframe: string,
    priceData?: number[]
  ): Promise<unknown> {
    const { data } = await this.client.post('/strategies/analyze', {
      symbol,
      timeframe,
      priceData,
    });
    return data;
  }

  async getSignals(
    conversationId: string,
    limit = 10
  ): Promise<TradingSignal[]> {
    const { data } = await this.client.get(
      `/conversations/${conversationId}/signals`,
      { params: { limit } }
    );
    return data;
  }
}
export const llmAgentService = new LLMAgentService();
```
---
### 2. WebSocket Service
**Location:** `apps/frontend/src/modules/llm-agent/services/websocketService.ts`
**Responsibility:** Maintain the WebSocket connection with the backend
```typescript
class WebSocketService extends EventEmitter {
private socket: Socket | null = null;
private url = 'http://localhost:3085';
private reconnectDelay = 1000;
private maxReconnectAttempts = 5;
  connect(token: string): Promise<void> {
return new Promise((resolve, reject) => {
this.socket = io(this.url, {
auth: { token },
reconnection: true,
reconnectionDelay: this.reconnectDelay,
reconnectionDelayMax: 5000,
reconnectionAttempts: this.maxReconnectAttempts,
});
      this.socket.on('connect', () => {
        console.log('WebSocket connected');
        resolve();
      });
      this.socket.on('error', (error) => {
        console.error('WS error:', error);
        reject(error);
      });
this.setupListeners();
});
}
private setupListeners() {
this.socket?.on('message:saved', (message: Message) => {
this.emit('messageSaved', message);
});
this.socket?.on('agent:thinking', () => {
this.emit('agentThinking');
});
this.socket?.on('agent:stream', ({ chunk, toolCalls }: StreamChunk) => {
this.emit('agentStream', { chunk, toolCalls });
});
this.socket?.on('agent:complete', (message: Message) => {
this.emit('agentComplete', message);
});
this.socket?.on('agent:cancelled', () => {
this.emit('agentCancelled');
});
this.socket?.on('error', (error: any) => {
this.emit('error', error);
});
}
sendMessage(conversationId: string, content: string) {
this.socket?.emit('message:send', { conversationId, content });
}
cancelGeneration(conversationId: string) {
this.socket?.emit('message:cancel', { conversationId });
}
disconnect() {
this.socket?.disconnect();
this.socket = null;
}
isConnected(): boolean {
return this.socket?.connected ?? false;
}
}
export const wsService = new WebSocketService();
```
---
## Zustand Stores
### 1. Chat Store
**Location:** `apps/frontend/src/modules/llm-agent/stores/chatStore.ts`
```typescript
interface ChatState {
// State
conversations: Conversation[];
activeConversationId: string | null;
  messages: Map<string, Message[]>;
isLoading: boolean;
streamingContent: string;
streamingConversationId: string | null;
// Actions
setConversations: (convs: Conversation[]) => void;
selectConversation: (id: string) => void;
  createConversation: (title?: string) => Promise<void>;
  deleteConversation: (id: string) => Promise<void>;
  renameConversation: (id: string, title: string) => Promise<void>;
addMessage: (conversationId: string, message: Message) => void;
updateStreamingContent: (content: string) => void;
completeStream: (message: Message) => void;
clearMessages: () => void;
}
export const useChatStore = create<ChatState>()((set, get) => ({
// Initial state
conversations: [],
activeConversationId: null,
messages: new Map(),
isLoading: false,
streamingContent: '',
streamingConversationId: null,
// Actions
setConversations: (convs) => set({ conversations: convs }),
selectConversation: (id) => {
set({ activeConversationId: id });
},
createConversation: async (title) => {
try {
const conversation = await llmAgentService.createConversation(title);
set((state) => ({
conversations: [conversation, ...state.conversations],
activeConversationId: conversation.id,
}));
} catch (error) {
      console.error('Error creating conversation:', error);
}
},
deleteConversation: async (id) => {
try {
await llmAgentService.deleteConversation(id);
set((state) => ({
conversations: state.conversations.filter((c) => c.id !== id),
activeConversationId:
state.activeConversationId === id ? null : state.activeConversationId,
}));
} catch (error) {
      console.error('Error deleting conversation:', error);
}
},
addMessage: (conversationId, message) => {
set((state) => {
const messages = new Map(state.messages);
const conv = messages.get(conversationId) || [];
messages.set(conversationId, [...conv, message]);
return { messages };
});
},
updateStreamingContent: (content) => {
set((state) => ({
streamingContent: state.streamingContent + content,
}));
},
completeStream: (message) => {
set((state) => {
const conversationId = state.activeConversationId;
if (!conversationId) return state;
const messages = new Map(state.messages);
const conv = messages.get(conversationId) || [];
messages.set(conversationId, [...conv, message]);
return {
messages,
isLoading: false,
streamingContent: '',
};
});
},
clearMessages: () => set({ messages: new Map() }),
}));
```
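Stripped of Zustand, the streaming lifecycle the store implements is two pure transitions: accumulate chunks into `streamingContent`, then on completion move the final message into the list and reset. A framework-free sketch (action names mirror the store; the state shape is simplified, with the real store keying messages by conversation id):

```typescript
// Simplified state; the real store keys messages by conversation id.
export interface StreamState {
  streamingContent: string;
  messages: string[];
  isLoading: boolean;
}

// agent:stream → append the chunk and mark the chat as loading
export function applyChunk(state: StreamState, chunk: string): StreamState {
  return {
    ...state,
    isLoading: true,
    streamingContent: state.streamingContent + chunk,
  };
}

// agent:complete → commit the final message and clear the streaming buffer
export function applyComplete(state: StreamState, finalMessage: string): StreamState {
  return {
    messages: [...state.messages, finalMessage],
    streamingContent: '',
    isLoading: false,
  };
}
```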
---
## Types & Interfaces
**Location:** `apps/frontend/src/modules/llm-agent/types/`
```typescript
// chat.types.ts
export interface Message {
id: string;
conversationId: string;
role: 'user' | 'assistant' | 'system';
content: string;
toolCalls?: ToolCall[];
signals?: TradingSignal[];
createdAt: Date;
updatedAt: Date;
tokensInput?: number;
tokensOutput?: number;
feedbackRating?: number;
feedbackComment?: string;
}
export interface Conversation {
id: string;
userId: string;
title: string;
status: 'active' | 'archived';
messagesCount: number;
  metadata?: Record<string, unknown>;
createdAt: Date;
updatedAt: Date;
deletedAt?: Date;
}
export interface ToolCall {
id: string;
name: string;
  arguments: Record<string, unknown>;
result?: any;
}
export interface TradingSignal {
id: string;
conversationId: string;
messageId: string;
symbol: string;
direction: 'LONG' | 'SHORT';
entry: number;
tp: number;
sl: number;
timeframe: string;
confidence: number;
rating: number;
riskReward: number;
reasoning: string;
indicators?: Array<{ name: string; value: string }>;
type: 'TECHNICAL' | 'FUNDAMENTAL' | 'ML' | 'HYBRID';
createdAt: Date;
}
export interface StreamChunk {
type: 'text' | 'tool_result' | 'signal';
content: string;
toolCalls?: ToolCall[];
signals?: TradingSignal[];
}
```
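Since `StreamChunk` payloads arrive over the wire, a runtime check before casting is prudent. A minimal sketch of a hypothetical type guard (not part of the spec):

```typescript
// Hypothetical runtime guard for incoming StreamChunk payloads; accepts
// only the three chunk types declared in the StreamChunk interface.
export function isStreamChunk(
  value: unknown
): value is { type: 'text' | 'tool_result' | 'signal'; content: string } {
  if (typeof value !== 'object' || value === null) return false;
  const v = value as Record<string, unknown>;
  return (
    typeof v.content === 'string' &&
    (v.type === 'text' || v.type === 'tool_result' || v.type === 'signal')
  );
}
```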
---
## Custom Hooks
### useChat Hook
**Location:** `apps/frontend/src/modules/llm-agent/hooks/useChat.ts`
```typescript
export function useChat(conversationId: string) {
const {
activeConversationId,
messages,
isLoading,
streamingContent,
addMessage,
updateStreamingContent,
completeStream,
} = useChatStore();
const { isConnected, sendMessage, cancelGeneration } = useWebSocket();
const currentMessages = useMemo(
() => messages.get(conversationId) || [],
[messages, conversationId]
);
const handleSendMessage = useCallback(
(content: string) => {
if (!isConnected) {
        toast.error('Disconnected from server');
return;
}
sendMessage(conversationId, content);
},
[conversationId, isConnected, sendMessage]
);
const handleCancel = useCallback(() => {
cancelGeneration(conversationId);
}, [conversationId, cancelGeneration]);
return {
messages: currentMessages,
isLoading,
streamingContent,
isConnected,
sendMessage: handleSendMessage,
cancelGeneration: handleCancel,
};
}
```
---
## Styles
### CSS Structure with Tailwind + SCSS
```scss
// apps/frontend/src/modules/llm-agent/styles/index.scss
// Variables
$primary-color: #3b82f6;
$agent-bg: #f3f4f6;
$user-bg: #dbeafe;
$text-primary: #1f2937;
$text-secondary: #6b7280;
// Chat Window
.chat-window {
display: flex;
flex-direction: column;
height: 100%;
background: white;
border-radius: 8px;
box-shadow: 0 1px 3px rgba(0, 0, 0, 0.1);
.message-list {
flex: 1;
overflow-y: auto;
padding: 16px;
scroll-behavior: smooth;
}
.message {
display: flex;
gap: 12px;
margin-bottom: 16px;
animation: slideIn 0.3s ease-in;
&.message-assistant {
.message-bubble {
background: $agent-bg;
color: $text-primary;
}
}
&.message-user {
flex-direction: row-reverse;
.message-bubble {
background: $primary-color;
color: white;
}
}
}
}
// Signal Card
.signal-card {
border: 1px solid #e5e7eb;
border-radius: 8px;
padding: 16px;
margin: 12px 0;
background: #f9fafb;
.signal-header {
display: flex;
justify-content: space-between;
margin-bottom: 12px;
}
.signal-body {
margin: 12px 0;
padding: 12px;
background: white;
border-radius: 4px;
}
.badge-direction {
display: inline-block;
padding: 4px 8px;
border-radius: 4px;
font-size: 12px;
font-weight: 600;
&.badge-long {
background: #d1fae5;
color: #065f46;
}
&.badge-short {
background: #fee2e2;
color: #7f1d1d;
}
}
}
// Animations
@keyframes slideIn {
from {
opacity: 0;
transform: translateY(8px);
}
to {
opacity: 1;
transform: translateY(0);
}
}
// Responsive
@media (max-width: 768px) {
.chat-window {
.message-list {
padding: 12px;
}
.message {
gap: 8px;
}
.signal-card {
padding: 12px;
font-size: 14px;
}
}
}
```
---
## Testing
### Component Unit Tests
```typescript
// apps/frontend/src/modules/llm-agent/components/ChatMessage.test.tsx
import { render, screen } from '@testing-library/react';
import { ChatMessage } from './ChatMessage';
describe('ChatMessage Component', () => {
  const mockMessage: Message = {
    id: '1',
    conversationId: 'conv-1',
    role: 'assistant',
    content: 'Hello, I am your assistant',
    createdAt: new Date(),
    updatedAt: new Date(),
  };

  it('should render a text message', () => {
    render(<ChatMessage message={mockMessage} />);
    expect(screen.getByText(/Hello, I am your assistant/)).toBeInTheDocument();
  });

  it('should render markdown correctly', () => {
    const mdMessage: Message = {
      ...mockMessage,
      content: '**Bold** *italic*',
    };
    render(<ChatMessage message={mdMessage} />);
    // jsdom does not apply user-agent styles, so assert on the tag
    expect(screen.getByText('Bold').tagName).toBe('STRONG');
  });

  it('should display tool calls', () => {
    const msgWithTools: Message = {
      ...mockMessage,
      toolCalls: [
        {
          id: '1',
          name: 'get_price',
          arguments: { symbol: 'BTCUSDT' },
        },
      ],
    };
    render(<ChatMessage message={msgWithTools} />);
    expect(screen.getByText('get_price')).toBeInTheDocument();
  });
});
```
---
## Backend Integration
### Message Flow
```
┌─────────────┐
│ User        │
│ types in    │
│ ChatInput   │
└──────┬──────┘
       │
       ▼
┌─────────────────────────────────────────────┐
│ ChatInput.handleSend()                      │
│ ├─ Validate content                         │
│ ├─ Emit 'message:send' via WS               │
│ └─ Clear input                              │
└──────┬──────────────────────────────────────┘
       │
       ▼
┌─────────────────────────────────────────────┐
│ Backend (LLM Agent - :3085)                 │
│ ├─ Receive message                          │
│ ├─ Save to DB                               │
│ ├─ Process with LLM                         │
│ ├─ Emit 'agent:thinking'                    │
│ └─ Stream chunks via WS                     │
└──────┬──────────────────────────────────────┘
       │
       ├─ 'agent:stream' (chunk)
       │
       ▼
┌─────────────────────────────────────────────┐
│ Frontend WebSocket Handler                  │
│ ├─ useChatStore.updateStreamingContent()    │
│ ├─ Render chunk in real time                │
│ └─ Auto-scroll                              │
└──────┬──────────────────────────────────────┘
       │
       ├─ 'agent:complete' (full message)
       │
       ▼
┌─────────────────────────────────────────────┐
│ ChatMessage Component                       │
│ ├─ Display full message                     │
│ ├─ Render signals if present                │
│ ├─ Show tool calls                          │
│ └─ Allow feedback                           │
└─────────────────────────────────────────────┘
```
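The "Frontend WebSocket Handler" step in the flow above is essentially a dispatcher from socket events to store actions. A socket-free sketch of that wiring (event and action names follow the flow; payload shapes are simplified):

```typescript
// Discriminated union of the two events the handler dispatches on here.
type AgentEvent =
  | { kind: 'agent:stream'; chunk: string }
  | { kind: 'agent:complete'; message: string };

// Subset of chatStore actions the handler invokes.
interface StoreActions {
  updateStreamingContent(chunk: string): void;
  completeStream(message: string): void;
}

export function handleAgentEvent(event: AgentEvent, store: StoreActions): void {
  switch (event.kind) {
    case 'agent:stream':
      store.updateStreamingContent(event.chunk); // render chunk in real time
      break;
    case 'agent:complete':
      store.completeStream(event.message); // commit the full message
      break;
  }
}
```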
---
## Performance & Optimization
### 1. Code Splitting
```typescript
// apps/frontend/src/App.tsx
const ChatPage = lazy(() => import('./pages/ChatPage'));
const AssistantPage = lazy(() => import('./pages/AssistantPage'));
const StrategyPage = lazy(() => import('./pages/StrategyPage'));
// Illustrative routes; the paths are examples, not from the spec
<Suspense fallback={<Spinner />}>
  <Routes>
    <Route path="/chat" element={<ChatPage />} />
    <Route path="/assistant" element={<AssistantPage />} />
    <Route path="/strategy" element={<StrategyPage />} />
  </Routes>
</Suspense>
```
### 2. Message Virtualization
For long message lists, use `react-window`:
```typescript
import { FixedSizeList } from 'react-window';

// Illustrative sizes; tune itemSize to the message row height
<FixedSizeList height={600} width="100%" itemCount={messages.length} itemSize={96}>
  {({ index, style }) => (
    <div style={style}>
      <ChatMessage message={messages[index]} />
    </div>
  )}
</FixedSizeList>
```
### 3. React Query Caching
```typescript
const { data: conversations } = useQuery({
queryKey: ['conversations'],
queryFn: () => llmAgentService.getConversations(),
  staleTime: 5 * 60 * 1000, // 5 minutes
  gcTime: 10 * 60 * 1000, // 10 minutes
});
```
---
## Environment Configuration
**Ubicación:** `apps/frontend/.env.local`
```env
VITE_API_URL=http://localhost:3080
VITE_LLM_AGENT_URL=http://localhost:3085
VITE_WS_URL=http://localhost:3085
VITE_ENVIRONMENT=development
VITE_LOG_LEVEL=debug
VITE_MAX_MESSAGE_LENGTH=4000
VITE_CHAT_PAGE_SIZE=50
VITE_AUTO_SCROLL=true
```
---
## Build & Deploy
### Development
```bash
cd apps/frontend
npm install
npm run dev # Vite dev server on :3000
```
### Production
```bash
npm run build
npm run preview
# Output
# ├── dist/
# │   ├── index.html
# │   └── assets/
# │       ├── chunk-*.js
# │       ├── vendor-*.js
# │       └── style-*.css
```
---
## Frontend Roadmap
### Phase 1 (In Progress)
- [x] Chat Window base
- [x] Message rendering con markdown
- [x] WebSocket connection
- [ ] Feedback system
### Phase 2 (Next)
- [ ] Signal visualization mejorada
- [ ] Charts integrados
- [ ] File attachment handling
- [ ] Message search/filtering
### Phase 3 (Future)
- [ ] Voice input/output
- [ ] PDF export de conversaciones
- [ ] Collaboration features
- [ ] Mobile app
---
## References
- [ET-LLM-001: Chat System Architecture](./ET-LLM-001-arquitectura-chat.md)
- [ET-LLM-005: Tools Architecture](./ET-LLM-005-arquitectura-tools.md)
- [ET-LLM-006: Memory Management](./ET-LLM-006-gestion-memoria.md)
- [RF-LLM-001: Chat Interface Requirements](../requerimientos/RF-LLM-001-chat-interface.md)
---
## Implementation Notes
1. **CORS:** The frontend must run on port 3000; the backend on :3085 must have CORS enabled
2. **JWT token:** Obtained at login and sent in the `Authorization: Bearer ` header
3. **Rate limiting:** The server enforces limits; the frontend must surface an error when they are exceeded
4. **Reconnection:** Socket.IO handles automatic reconnection, but connection state must still be validated
5. **Mobile first:** Responsive design from the start, with tablet and desktop as enhancements
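Notes 2 and 3 imply the client must translate HTTP failures into user-facing messages. A minimal sketch of that mapping (the status codes are standard; the wording is illustrative, not taken from the spec):

```typescript
// Illustrative status-to-message mapping for expired sessions (401),
// rate limiting (429), and server errors (5xx).
export function errorMessageForStatus(status: number): string {
  if (status === 401) return 'Session expired. Please log in again.';
  if (status === 429) return 'Rate limit exceeded. Wait a moment and retry.';
  if (status >= 500) return 'Server error. Please try again later.';
  return 'Unexpected error. Please retry.';
}
```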
---
*Technical specification - NEXUS System v4.0*
*Trading Platform - OQI-007*
*Version 1.0 - Implemented*