id: ET-LLM-007
title: LLM Agent Module Frontend Specification
type: Technical Specification
status: Done
priority: High
epic: OQI-007
project: trading-platform
version: 1.0.0
created_date: 2025-12-15
updated_date: 2026-01-25

ET-LLM-007: LLM Agent Module Frontend Specification

Epic: OQI-007 - LLM Strategy Agent | Version: 1.0 | Date: 2025-12-15 | Status: Implemented | Priority: P1 - High


Summary

This specification defines the frontend architecture, components, and services for the LLM Strategy Agent module. The frontend provides an interactive conversational interface that lets users communicate with the LLM agent for trading analysis, strategy queries, and real-time signal visualization.


General Architecture

┌──────────────────────────────────────────────────────────────────────┐
│                          FRONTEND (React)                            │
│                          Port: 3000                                  │
├──────────────────────────────────────────────────────────────────────┤
│                                                                      │
│  ┌────────────────────┐  ┌────────────────────┐  ┌────────────────┐  │
│  │   Pages            │  │   Components       │  │   Services     │  │
│  ├────────────────────┤  ├────────────────────┤  ├────────────────┤  │
│  │ • AssistantPage    │  │ • ChatWindow       │  │ • llmAgentSvc  │  │
│  │ • ChatPage         │  │ • ChatMessage      │  │ • llmProviders │  │
│  │ • StrategyPage     │  │ • ChatInput        │  │ • tokenService │  │
│  │ • AnalysisPage     │  │ • SignalCard       │  │ • storageSvc   │  │
│  │                    │  │ • ToolCallCard     │  │                │  │
│  │                    │  │ • MessageList      │  │                │  │
│  │                    │  │ • SidebarConv      │  │                │  │
│  │                    │  │ • LoadingSpinner   │  │                │  │
│  │                    │  │ • ErrorBoundary    │  │                │  │
│  └────────────────────┘  └────────────────────┘  └────────────────┘  │
│           │                      │                      │            │
│           └──────────────────────┼──────────────────────┘            │
│                                  │                                   │
│         ┌────────────────────────▼─────────────────────────┐         │
│         │                  Zustand Store                   │         │
│         │  ┌─────────────┐  ┌───────────┐  ┌─────────────┐ │         │
│         │  │ chatStore   │  │ uiStore   │  │ signalStore │ │         │
│         │  └─────────────┘  └───────────┘  └─────────────┘ │         │
│         └────────────────────────┬─────────────────────────┘         │
│                                  │                                   │
│         ┌────────────────────────▼─────────────────────────┐         │
│         │            Query Client (React Query)            │         │
│         │      Caching, synchronization, invalidation      │         │
│         └────────────────────────┬─────────────────────────┘         │
│                                  │                                   │
└──────────────────────────────────┼───────────────────────────────────┘
                                   │
            ┌──────────────────────┼──────────────────────┐
            │                      │                      │
 ┌──────────▼────────┐     ┌───────▼────────┐      ┌──────▼──────────┐
 │  LLM Agent API    │     │  Backend API   │      │  WebSocket      │
 │  :3085            │     │  :3080         │      │  :3085/ws       │
 │                   │     │                │      │                 │
 │ POST /chat        │     │ GET /profiles  │      │ message:send    │
 │ GET /history      │     │ GET /signals   │      │ agent:stream    │
 │ POST /strategies  │     │ POST /orders   │      │ agent:complete  │
 │ DELETE /convs     │     │ GET /portfolio │      │ error           │
 │                   │     │                │      │                 │
 └───────────────────┘     └────────────────┘      └─────────────────┘

Frontend Technology Stack

Core:
  - React: 18.2.0
  - TypeScript: 5.3.0
  - Vite: 6.2.0
  - React Router: 6.18.0

State Management:
  - Zustand: 4.4.7
  - React Query (TanStack Query): 5.14.0

Styling:
  - Tailwind CSS: 3.3.0
  - Headless UI: 1.7.0
  - Radix UI: 1.0.0

Components & UI:
  - React Markdown: 8.0.0
  - Highlight.js: 11.8.0
  - Recharts: 2.10.0 (simple charts)
  - lightweight-charts: 4.1.1 (trading charts)

Utilities:
  - axios: 1.6.0
  - date-fns: 2.30.0
  - clsx: 2.0.0

Testing:
  - Vitest: 1.0.0
  - @testing-library/react: 14.1.0
  - @testing-library/user-event: 14.5.0

Dev Tools:
  - @types/react: 18.2.0
  - @types/node: 20.9.0
  - ESLint: 8.50.0
  - Prettier: 3.0.0

Frontend Directory Structure

apps/frontend/src/
├── pages/
│   ├── AssistantPage.tsx          # Main assistant page
│   ├── ChatPage.tsx               # Conversational chat page
│   ├── StrategyPage.tsx           # Strategy analysis
│   └── AnalysisPage.tsx           # Detailed analyses
│
├── modules/
│   └── llm-agent/
│       ├── components/
│       │   ├── ChatWindow.tsx
│       │   ├── ChatMessage.tsx
│       │   ├── ChatInput.tsx
│       │   ├── SignalCard.tsx
│       │   ├── ToolCallCard.tsx
│       │   ├── MessageList.tsx
│       │   ├── SidebarConversations.tsx
│       │   ├── LoadingIndicator.tsx
│       │   └── ErrorBoundary.tsx
│       │
│       ├── hooks/
│       │   ├── useChat.ts
│       │   ├── useLlmAgent.ts
│       │   ├── useSignals.ts
│       │   ├── useWebSocket.ts
│       │   └── useMessageStream.ts
│       │
│       ├── services/
│       │   ├── llmAgentService.ts     # LLM Agent API client
│       │   ├── websocketService.ts    # WS connection
│       │   ├── llmProviderService.ts  # OpenAI/Claude adapters
│       │   └── tokenService.ts        # Token counting
│       │
│       ├── stores/
│       │   ├── chatStore.ts           # Zustand chat store
│       │   ├── uiStore.ts             # UI state
│       │   └── signalStore.ts         # Signals and analyses
│       │
│       ├── types/
│       │   ├── index.ts
│       │   ├── chat.types.ts
│       │   ├── signal.types.ts
│       │   └── api.types.ts
│       │
│       ├── utils/
│       │   ├── messageFormatters.ts
│       │   ├── tokenCounter.ts
│       │   ├── errorHandlers.ts
│       │   └── conversationHelpers.ts
│       │
│       ├── hooks.test.ts
│       ├── services.test.ts
│       └── components.test.tsx
│
├── shared/
│   ├── components/
│   │   ├── Button.tsx
│   │   ├── Card.tsx
│   │   ├── Modal.tsx
│   │   └── Spinner.tsx
│   │
│   └── hooks/
│       └── useWindowSize.ts
│
├── stores/
│   └── authStore.ts                 # Global authentication state
│
├── services/
│   ├── api.ts                       # axios configuration
│   ├── authService.ts               # JWT, login, logout
│   └── storageService.ts            # LocalStorage helpers
│
├── App.tsx
├── main.tsx
├── index.css
└── types.d.ts
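The tree above lists a tokenCounter.ts utility that is never specified elsewhere in this document. A minimal sketch of what it might contain, assuming a rough four-characters-per-token heuristic (exact counts are provider-specific and should ultimately come from the API's usage fields):

```typescript
// tokenCounter.ts — hypothetical sketch. The ~4 chars/token ratio is a
// coarse heuristic for Latin-script text, not an exact tokenizer.
export function estimateTokens(text: string): number {
  if (!text) return 0;
  // Round up so short non-empty strings count as at least one token.
  return Math.ceil(text.length / 4);
}

// Estimate the total context size of a conversation before sending it,
// e.g. to warn the user when approaching the model's context window.
export function estimateConversationTokens(contents: string[]): number {
  return contents.reduce((sum, c) => sum + estimateTokens(c), 0);
}
```

A value like this can back the per-message token display in ChatMessage until the server reports authoritative counts.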

Main Components

1. ChatWindow Component

Location: apps/frontend/src/modules/llm-agent/components/ChatWindow.tsx

Responsibility: Main container that orchestrates the conversational chat

interface ChatWindowProps {
  conversationId: string;
  onClose?: () => void;
  initialMessage?: string;
  theme?: 'light' | 'dark';
}

export function ChatWindow({
  conversationId,
  onClose,
  initialMessage,
  theme = 'light'
}: ChatWindowProps) {
  // State management
  const { messages, isLoading, streamingContent } = useChatStore();
  const { connected, connect, disconnect } = useWebSocket();

  // Lifecycle
  useEffect(() => {
    // Connect and load history
  }, [conversationId]);

  // Handlers
  const handleSendMessage = (content: string) => {
    // Send message via WS
  };

  const handleCancel = () => {
    // Cancel generation
  };

  return (
    <div className={`chat-window theme-${theme}`}>
      <ChatHeader />
      <MessageList messages={messages} streaming={streamingContent} />
      <ChatInput onSend={handleSendMessage} onCancel={handleCancel} />
    </div>
  );
}

Features:

  • Support for multiple conversations
  • Auto-scroll to the latest messages
  • Connection indicator
  • Light/dark theme
  • Responsive design (mobile-first)

2. ChatMessage Component

Location: apps/frontend/src/modules/llm-agent/components/ChatMessage.tsx

Responsibility: Render an individual message with support for markdown, code, and tool calls

interface ChatMessageProps {
  message: Message;
  isStreaming?: boolean;
  onFeedback?: (rating: number, comment?: string) => void;
}

export function ChatMessage({
  message,
  isStreaming,
  onFeedback
}: ChatMessageProps) {
  const [showFeedback, setShowFeedback] = useState(false);

  // Forward the rating to the parent and hide the form afterwards
  const handleFeedback = (rating: number, comment?: string) => {
    onFeedback?.(rating, comment);
    setShowFeedback(false);
  };

  return (
    <div className={`message message-${message.role}`}>
      <div className="message-avatar">
        {message.role === 'user' ? <UserIcon /> : <AgentIcon />}
      </div>

      <div className="message-content">
        {/* Render the content */}
        <MessageContent message={message} isStreaming={isStreaming} />

        {/* Tool calls, if any */}
        {message.toolCalls && (
          <div className="tool-calls">
            {message.toolCalls.map((tool) => (
              <ToolCallCard key={tool.id} toolCall={tool} />
            ))}
          </div>
        )}

        {/* Signals, if any */}
        {message.signals && (
          <div className="signals">
            {message.signals.map((signal) => (
              <SignalCard key={signal.id} signal={signal} />
            ))}
          </div>
        )}

        {/* Feedback */}
        {message.role === 'assistant' && (
          <MessageFeedback
            onRating={handleFeedback}
            showForm={showFeedback}
          />
        )}
      </div>

      <div className="message-metadata">
        <span className="timestamp">
          {formatTime(message.createdAt)}
        </span>
        {message.tokensInput && (
          <span className="tokens">
            {message.tokensInput + (message.tokensOutput || 0)} tokens
          </span>
        )}
      </div>
    </div>
  );
}

Features:

  • Safe Markdown rendering
  • Code highlighting with Highlight.js
  • Support for mathematical LaTeX
  • Feedback rating (thumbs up/down + comment)
  • Copyable code blocks
  • Display of tokens consumed

3. ChatInput Component

Location: apps/frontend/src/modules/llm-agent/components/ChatInput.tsx

Responsibility: User input with multi-line support, attachments, and commands

interface ChatInputProps {
  onSend: (content: string, files?: File[]) => void;
  onCancel?: () => void;
  isLoading?: boolean;
  disabled?: boolean;
  maxLength?: number;
  placeholder?: string;
}

export function ChatInput({
  onSend,
  onCancel,
  isLoading,
  disabled,
  maxLength = 4000,
  placeholder = 'Type your query here...'
}: ChatInputProps) {
  const [content, setContent] = useState('');
  const [files, setFiles] = useState<File[]>([]);
  const textareaRef = useRef<HTMLTextAreaElement>(null);

  // Auto-resize textarea
  useEffect(() => {
    if (textareaRef.current) {
      textareaRef.current.style.height = 'auto';
      textareaRef.current.style.height =
        Math.min(textareaRef.current.scrollHeight, 200) + 'px';
    }
  }, [content]);

  const handleSend = () => {
    if (!content.trim()) return;
    onSend(content, files);
    setContent('');
    setFiles([]);
  };

  // Open a transient file picker and append the chosen files
  const handleFileAttach = () => {
    const input = document.createElement('input');
    input.type = 'file';
    input.multiple = true;
    input.onchange = () => {
      setFiles((f) => [...f, ...Array.from(input.files ?? [])]);
    };
    input.click();
  };

  const handleKeyDown = (e: React.KeyboardEvent<HTMLTextAreaElement>) => {
    // Send with Ctrl+Enter or Cmd+Enter
    if ((e.ctrlKey || e.metaKey) && e.key === 'Enter') {
      e.preventDefault();
      handleSend();
    }
  };

  return (
    <div className="chat-input-container">
      {/* File attachments preview */}
      {files.length > 0 && (
        <div className="attachments">
          {files.map((file) => (
            <FileAttachment
              key={file.name}
              file={file}
              onRemove={() => setFiles(f => f.filter(x => x !== file))}
            />
          ))}
        </div>
      )}

      <div className="input-toolbar">
        {/* Attach files */}
        <button
          className="btn-attach"
          onClick={handleFileAttach}
          disabled={isLoading || disabled}
          title="Attach file"
        >
          <AttachIcon />
        </button>

        <textarea
          ref={textareaRef}
          value={content}
          onChange={(e) => setContent(e.target.value)}
          onKeyDown={handleKeyDown}
          placeholder={placeholder}
          disabled={isLoading || disabled}
          maxLength={maxLength}
          className="input-textarea"
          rows={1}
        />

        <div className="input-actions">
          <span className="char-count">
            {content.length}/{maxLength}
          </span>

          {isLoading ? (
            <button
              className="btn-cancel"
              onClick={onCancel}
              title="Cancel"
            >
              <StopIcon />
            </button>
          ) : (
            <button
              className="btn-send"
              onClick={handleSend}
              disabled={!content.trim() || disabled}
              title="Send (Ctrl+Enter)"
            >
              <SendIcon />
            </button>
          )}
        </div>
      </div>
    </div>
  );
}

Features:

  • Auto-expand textarea
  • Keyboard shortcuts (Ctrl+Enter to send)
  • File attachments (CSV, images, PDFs)
  • Character counter
  • Cancel button while a response is being generated
  • Proper disabled state

4. SignalCard Component

Location: apps/frontend/src/modules/llm-agent/components/SignalCard.tsx

Responsibility: Display trading signals recommended by the agent

interface SignalCardProps {
  signal: TradingSignal;
  onExecute?: (signal: TradingSignal) => void;
  onAddToWatchlist?: (signal: TradingSignal) => void;
  compact?: boolean;
}

export function SignalCard({
  signal,
  onExecute,
  onAddToWatchlist,
  compact = false
}: SignalCardProps) {
  const { rating, confidence } = signal;

  return (
    <div className={`signal-card signal-${signal.type} ${compact ? 'compact' : ''}`}>
      <div className="signal-header">
        <div className="signal-title">
          <span className={`badge-direction badge-${signal.direction.toLowerCase()}`}>
            {signal.direction}
          </span>
          <h4>{signal.symbol}</h4>
          <span className="timeframe">{signal.timeframe}</span>
        </div>

        <div className="signal-metrics">
          <MetricBadge
            label="Confidence"
            value={confidence}
            type="confidence"
          />
          <MetricBadge
            label="Signal"
            value={rating}
            type="rating"
          />
        </div>
      </div>

      <div className="signal-body">
        <div className="signal-details">
          <DetailRow label="Entry Price" value={formatPrice(signal.entry)} />
          <DetailRow label="Take Profit" value={formatPrice(signal.tp)} />
          <DetailRow label="Stop Loss" value={formatPrice(signal.sl)} />
          <DetailRow label="Risk/Reward" value={signal.riskReward.toFixed(2)} />
        </div>

        {!compact && (
          <div className="signal-reasoning">
            <h5>Analysis</h5>
            <p>{signal.reasoning}</p>
          </div>
        )}

        {!compact && signal.indicators && (
          <div className="signal-indicators">
            <h5>Indicators</h5>
            <ul>
              {signal.indicators.map((indicator) => (
                <li key={indicator.name}>
                  <span className="indicator-name">{indicator.name}</span>
                  <span className="indicator-value">{indicator.value}</span>
                </li>
              ))}
            </ul>
          </div>
        )}
      </div>

      <div className="signal-footer">
        {onExecute && (
          <button
            className="btn-execute"
            onClick={() => onExecute(signal)}
          >
            Execute Order
          </button>
        )}

        {onAddToWatchlist && (
          <button
            className="btn-watchlist"
            onClick={() => onAddToWatchlist(signal)}
          >
            Add to Watchlist
          </button>
        )}
      </div>
    </div>
  );
}

Features:

  • Confidence and rating metrics
  • Entry, TP, and SL levels
  • Risk/reward calculation
  • Analysis and reasoning
  • Technical indicators
  • Quick actions (execute, add to watchlist)
  • Compact/expanded mode
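The riskReward value shown in the card arrives precomputed on the signal, but the relationship between entry, TP, and SL can be made explicit with a small helper. A sketch; computeRiskReward is a hypothetical name, not part of the spec's API:

```typescript
// Hypothetical helper: derive the risk/reward ratio from price levels.
// For a LONG, reward is the distance from entry up to TP and risk is the
// distance from entry down to SL; for a SHORT the distances are mirrored.
type Direction = 'LONG' | 'SHORT';

export function computeRiskReward(
  entry: number,
  tp: number,
  sl: number,
  direction: Direction
): number {
  const reward = direction === 'LONG' ? tp - entry : entry - tp;
  const risk = direction === 'LONG' ? entry - sl : sl - entry;
  if (reward <= 0 || risk <= 0) {
    throw new Error('Price levels are inconsistent with the direction');
  }
  return reward / risk;
}
```

A LONG with entry 100, TP 120, SL 90 yields 2.0, which matches the two-decimal display used by DetailRow.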

Frontend Services

1. LLM Agent Service

Location: apps/frontend/src/modules/llm-agent/services/llmAgentService.ts

Responsibility: HTTP client for the LLM Agent API

import axios from 'axios';

class LLMAgentService {
  // Base URL from the environment (VITE_LLM_AGENT_URL), falling back to
  // the local development port
  private baseUrl =
    import.meta.env.VITE_LLM_AGENT_URL ?? 'http://localhost:3085';
  private client = axios.create({
    baseURL: this.baseUrl,
    timeout: 30000,
  });

  // Conversations
  async getConversations(limit = 20): Promise<Conversation[]> {
    const { data } = await this.client.get('/conversations', {
      params: { limit },
    });
    return data;
  }

  async createConversation(title?: string): Promise<Conversation> {
    const { data } = await this.client.post('/conversations', { title });
    return data;
  }

  async getConversation(id: string): Promise<Conversation> {
    const { data } = await this.client.get(`/conversations/${id}`);
    return data;
  }

  async deleteConversation(id: string): Promise<void> {
    await this.client.delete(`/conversations/${id}`);
  }

  async renameConversation(id: string, title: string): Promise<Conversation> {
    const { data } = await this.client.patch(`/conversations/${id}`, { title });
    return data;
  }

  // Messages
  async getMessages(
    conversationId: string,
    limit = 50,
    offset = 0
  ): Promise<Message[]> {
    const { data } = await this.client.get(
      `/conversations/${conversationId}/messages`,
      { params: { limit, offset } }
    );
    return data;
  }

  async sendMessage(
    conversationId: string,
    content: string
  ): Promise<Message> {
    const { data } = await this.client.post(
      `/conversations/${conversationId}/messages`,
      { content }
    );
    return data;
  }

  async provideFeedback(
    messageId: string,
    rating: number,
    comment?: string
  ): Promise<void> {
    await this.client.post(`/messages/${messageId}/feedback`, {
      rating,
      comment,
    });
  }

  // Strategies
  async analyzeStrategy(
    symbol: string,
    timeframe: string,
    priceData?: number[]
  ): Promise<StrategyAnalysis> {
    const { data } = await this.client.post('/strategies/analyze', {
      symbol,
      timeframe,
      priceData,
    });
    return data;
  }

  async getSignals(
    conversationId: string,
    limit = 10
  ): Promise<TradingSignal[]> {
    const { data } = await this.client.get(
      `/conversations/${conversationId}/signals`,
      { params: { limit } }
    );
    return data;
  }
}

export const llmAgentService = new LLMAgentService();
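The client above sends no credentials, yet the implementation notes require an Authorization: Bearer <token> header on every request. One common wiring is an axios request interceptor; the sketch below keeps the header construction pure so it can be tested, and the getAccessToken name is an assumption (the spec's authService would own it):

```typescript
// Pure helper: build the auth header for a given JWT (or none).
export function buildAuthHeader(token: string | null): Record<string, string> {
  return token ? { Authorization: `Bearer ${token}` } : {};
}

// Hypothetical wiring inside LLMAgentService's constructor, assuming
// authService.getAccessToken() exists:
//
// this.client.interceptors.request.use((config) => {
//   Object.assign(config.headers, buildAuthHeader(authService.getAccessToken()));
//   return config;
// });
```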

2. WebSocket Service

Location: apps/frontend/src/modules/llm-agent/services/websocketService.ts

Responsibility: Maintain the WebSocket connection to the backend

// EventEmitter needs a browser polyfill under Vite (the 'events' package)
import { EventEmitter } from 'events';
import { io, Socket } from 'socket.io-client';

class WebSocketService extends EventEmitter {
  private socket: Socket | null = null;
  private url = 'http://localhost:3085';
  private reconnectDelay = 1000;
  private maxReconnectAttempts = 5;

  connect(token: string): Promise<void> {
    return new Promise((resolve, reject) => {
      this.socket = io(this.url, {
        auth: { token },
        reconnection: true,
        reconnectionDelay: this.reconnectDelay,
        reconnectionDelayMax: 5000,
        reconnectionAttempts: this.maxReconnectAttempts,
      });

      this.socket.on('connect', () => {
        console.log('WebSocket connected');
        resolve();
      });

      this.socket.on('error', (error) => {
        console.error('WS error:', error);
        reject(error);
      });

      this.setupListeners();
    });
  }

  private setupListeners() {
    this.socket?.on('message:saved', (message: Message) => {
      this.emit('messageSaved', message);
    });

    this.socket?.on('agent:thinking', () => {
      this.emit('agentThinking');
    });

    this.socket?.on('agent:stream', ({ content, toolCalls }: StreamChunk) => {
      this.emit('agentStream', { content, toolCalls });
    });

    this.socket?.on('agent:complete', (message: Message) => {
      this.emit('agentComplete', message);
    });

    this.socket?.on('agent:cancelled', () => {
      this.emit('agentCancelled');
    });

    this.socket?.on('error', (error: any) => {
      this.emit('error', error);
    });
  }

  sendMessage(conversationId: string, content: string) {
    this.socket?.emit('message:send', { conversationId, content });
  }

  cancelGeneration(conversationId: string) {
    this.socket?.emit('message:cancel', { conversationId });
  }

  disconnect() {
    this.socket?.disconnect();
    this.socket = null;
  }

  isConnected(): boolean {
    return this.socket?.connected ?? false;
  }
}

export const wsService = new WebSocketService();

Zustand Stores

1. Chat Store

Location: apps/frontend/src/modules/llm-agent/stores/chatStore.ts

interface ChatState {
  // State
  conversations: Conversation[];
  activeConversationId: string | null;
  messages: Map<string, Message[]>;
  isLoading: boolean;
  streamingContent: string;
  streamingConversationId: string | null;

  // Actions
  setConversations: (convs: Conversation[]) => void;
  selectConversation: (id: string) => void;
  createConversation: (title?: string) => Promise<void>;
  deleteConversation: (id: string) => Promise<void>;
  renameConversation: (id: string, title: string) => Promise<void>;

  addMessage: (conversationId: string, message: Message) => void;
  updateStreamingContent: (content: string) => void;
  completeStream: (message: Message) => void;
  clearMessages: () => void;
}

export const useChatStore = create<ChatState>((set, get) => ({
  // Initial state
  conversations: [],
  activeConversationId: null,
  messages: new Map(),
  isLoading: false,
  streamingContent: '',
  streamingConversationId: null,

  // Actions
  setConversations: (convs) => set({ conversations: convs }),

  selectConversation: (id) => {
    set({ activeConversationId: id });
  },

  createConversation: async (title) => {
    try {
      const conversation = await llmAgentService.createConversation(title);
      set((state) => ({
        conversations: [conversation, ...state.conversations],
        activeConversationId: conversation.id,
      }));
    } catch (error) {
      console.error('Error creating conversation:', error);
    }
  },

  deleteConversation: async (id) => {
    try {
      await llmAgentService.deleteConversation(id);
      set((state) => ({
        conversations: state.conversations.filter((c) => c.id !== id),
        activeConversationId:
          state.activeConversationId === id ? null : state.activeConversationId,
      }));
    } catch (error) {
      console.error('Error deleting conversation:', error);
    }
  },

  renameConversation: async (id, title) => {
    try {
      const conversation = await llmAgentService.renameConversation(id, title);
      set((state) => ({
        conversations: state.conversations.map((c) =>
          c.id === id ? conversation : c
        ),
      }));
    } catch (error) {
      console.error('Error renaming conversation:', error);
    }
  },

  addMessage: (conversationId, message) => {
    set((state) => {
      const messages = new Map(state.messages);
      const conv = messages.get(conversationId) || [];
      messages.set(conversationId, [...conv, message]);
      return { messages };
    });
  },

  updateStreamingContent: (content) => {
    set((state) => ({
      streamingContent: state.streamingContent + content,
    }));
  },

  completeStream: (message) => {
    set((state) => {
      const conversationId = state.activeConversationId;
      if (!conversationId) return state;

      const messages = new Map(state.messages);
      const conv = messages.get(conversationId) || [];
      messages.set(conversationId, [...conv, message]);

      return {
        messages,
        isLoading: false,
        streamingContent: '',
      };
    });
  },

  clearMessages: () => set({ messages: new Map() }),
}));

Types & Interfaces

Location: apps/frontend/src/modules/llm-agent/types/

// chat.types.ts
export interface Message {
  id: string;
  conversationId: string;
  role: 'user' | 'assistant' | 'system';
  content: string;
  toolCalls?: ToolCall[];
  signals?: TradingSignal[];
  createdAt: Date;
  updatedAt: Date;
  tokensInput?: number;
  tokensOutput?: number;
  feedbackRating?: number;
  feedbackComment?: string;
}

export interface Conversation {
  id: string;
  userId: string;
  title: string;
  status: 'active' | 'archived';
  messagesCount: number;
  metadata?: Record<string, any>;
  createdAt: Date;
  updatedAt: Date;
  deletedAt?: Date;
}

export interface ToolCall {
  id: string;
  name: string;
  arguments: Record<string, any>;
  result?: any;
}

export interface TradingSignal {
  id: string;
  conversationId: string;
  messageId: string;
  symbol: string;
  direction: 'LONG' | 'SHORT';
  entry: number;
  tp: number;
  sl: number;
  timeframe: string;
  confidence: number;
  rating: number;
  riskReward: number;
  reasoning: string;
  indicators?: Array<{ name: string; value: string }>;
  type: 'TECHNICAL' | 'FUNDAMENTAL' | 'ML' | 'HYBRID';
  createdAt: Date;
}

export interface StreamChunk {
  type: 'text' | 'tool_result' | 'signal';
  content: string;
  toolCalls?: ToolCall[];
  signals?: TradingSignal[];
}
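Payloads arriving over the WebSocket are untyped at runtime, so a narrow type guard can keep malformed frames out of the stores. A minimal sketch for signals, checking only the fields the UI dereferences unconditionally (the trimmed SignalShape and isSignalShape names are illustrative, not part of the spec):

```typescript
// Trimmed local copy of the fields SignalCard reads unconditionally.
interface SignalShape {
  id: string;
  symbol: string;
  direction: 'LONG' | 'SHORT';
  entry: number;
  tp: number;
  sl: number;
  riskReward: number;
}

// Runtime guard for untrusted WS payloads before they reach signalStore.
export function isSignalShape(x: unknown): x is SignalShape {
  if (typeof x !== 'object' || x === null) return false;
  const s = x as Record<string, unknown>;
  return (
    typeof s.id === 'string' &&
    typeof s.symbol === 'string' &&
    (s.direction === 'LONG' || s.direction === 'SHORT') &&
    typeof s.entry === 'number' &&
    typeof s.tp === 'number' &&
    typeof s.sl === 'number' &&
    typeof s.riskReward === 'number'
  );
}
```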

Custom Hooks

useChat Hook

Location: apps/frontend/src/modules/llm-agent/hooks/useChat.ts

export function useChat(conversationId: string) {
  const { messages, isLoading, streamingContent } = useChatStore();

  const { isConnected, sendMessage, cancelGeneration } = useWebSocket();

  const currentMessages = useMemo(
    () => messages.get(conversationId) || [],
    [messages, conversationId]
  );

  const handleSendMessage = useCallback(
    (content: string) => {
      if (!isConnected) {
        toast.error('Disconnected from the server');
        return;
      }

      sendMessage(conversationId, content);
    },
    [conversationId, isConnected, sendMessage]
  );

  const handleCancel = useCallback(() => {
    cancelGeneration(conversationId);
  }, [conversationId, cancelGeneration]);

  return {
    messages: currentMessages,
    isLoading,
    streamingContent,
    isConnected,
    sendMessage: handleSendMessage,
    cancelGeneration: handleCancel,
  };
}

Styles

CSS structure with Tailwind + SCSS

// apps/frontend/src/modules/llm-agent/styles/index.scss

// Variables
$primary-color: #3b82f6;
$agent-bg: #f3f4f6;
$user-bg: #dbeafe;
$text-primary: #1f2937;
$text-secondary: #6b7280;

// Chat Window
.chat-window {
  display: flex;
  flex-direction: column;
  height: 100%;
  background: white;
  border-radius: 8px;
  box-shadow: 0 1px 3px rgba(0, 0, 0, 0.1);

  .message-list {
    flex: 1;
    overflow-y: auto;
    padding: 16px;
    scroll-behavior: smooth;
  }

  .message {
    display: flex;
    gap: 12px;
    margin-bottom: 16px;
    animation: slideIn 0.3s ease-in;

    &.message-assistant {
      .message-bubble {
        background: $agent-bg;
        color: $text-primary;
      }
    }

    &.message-user {
      flex-direction: row-reverse;
      .message-bubble {
        background: $primary-color;
        color: white;
      }
    }
  }
}

// Signal Card
.signal-card {
  border: 1px solid #e5e7eb;
  border-radius: 8px;
  padding: 16px;
  margin: 12px 0;
  background: #f9fafb;

  .signal-header {
    display: flex;
    justify-content: space-between;
    margin-bottom: 12px;
  }

  .signal-body {
    margin: 12px 0;
    padding: 12px;
    background: white;
    border-radius: 4px;
  }

  .badge-direction {
    display: inline-block;
    padding: 4px 8px;
    border-radius: 4px;
    font-size: 12px;
    font-weight: 600;

    &.badge-long {
      background: #d1fae5;
      color: #065f46;
    }

    &.badge-short {
      background: #fee2e2;
      color: #7f1d1d;
    }
  }
}

// Animations
@keyframes slideIn {
  from {
    opacity: 0;
    transform: translateY(8px);
  }
  to {
    opacity: 1;
    transform: translateY(0);
  }
}

// Responsive
@media (max-width: 768px) {
  .chat-window {
    .message-list {
      padding: 12px;
    }

    .message {
      gap: 8px;
    }

    .signal-card {
      padding: 12px;
      font-size: 14px;
    }
  }
}

Testing

Unit Tests for Components

// apps/frontend/src/modules/llm-agent/components/ChatMessage.test.tsx

import { render, screen } from '@testing-library/react';
import { describe, it, expect } from 'vitest';
import { ChatMessage } from './ChatMessage';
import type { Message } from '../types';

describe('ChatMessage Component', () => {
  const mockMessage: Message = {
    id: '1',
    conversationId: 'conv-1',
    role: 'assistant',
    content: 'Hello, I am your assistant',
    createdAt: new Date(),
    updatedAt: new Date(),
  };

  it('renders a text message', () => {
    render(<ChatMessage message={mockMessage} />);
    expect(screen.getByText(/Hello, I am your assistant/)).toBeInTheDocument();
  });

  it('renders markdown correctly', () => {
    const mdMessage: Message = {
      ...mockMessage,
      content: '**Bold** *italic*',
    };
    render(<ChatMessage message={mdMessage} />);
    expect(screen.getByText('Bold')).toHaveStyle('font-weight: bold');
  });

  it('shows tool calls', () => {
    const msgWithTools: Message = {
      ...mockMessage,
      toolCalls: [
        {
          id: '1',
          name: 'get_price',
          arguments: { symbol: 'BTCUSDT' },
        },
      ],
    };
    render(<ChatMessage message={msgWithTools} />);
    expect(screen.getByText('get_price')).toBeInTheDocument();
  });
});

Backend Integration

Message Flow

┌─────────────┐
│    User     │
│  types in   │
│  ChatInput  │
└──────┬──────┘
       │
       ▼
┌──────────────────────────────────────────┐
│ ChatInput.handleSend()                   │
│ ├─ Validate content                      │
│ ├─ Emit 'message:send' via WS            │
│ └─ Clear input                           │
└──────┬───────────────────────────────────┘
       │
       ▼
┌──────────────────────────────────────────┐
│ Backend (LLM Agent - :3085)              │
│ ├─ Receive message                       │
│ ├─ Persist to DB                         │
│ ├─ Process with LLM                      │
│ ├─ Emit 'agent:thinking'                 │
│ └─ Stream chunks via WS                  │
└──────┬───────────────────────────────────┘
       │
       ├─ 'agent:stream' (chunk)
       │
       ▼
┌──────────────────────────────────────────┐
│ Frontend WebSocket Handler               │
│ ├─ useChatStore.updateStreamingContent() │
│ ├─ Render chunk in real time             │
│ └─ Auto-scroll                           │
└──────┬───────────────────────────────────┘
       │
       ├─ 'agent:complete' (full message)
       │
       ▼
┌──────────────────────────────────────────┐
│ ChatMessage Component                    │
│ ├─ Show the complete message             │
│ ├─ Render signals, if any                │
│ ├─ Show tool calls                       │
│ └─ Allow feedback                        │
└──────────────────────────────────────────┘
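The streaming steps above reduce to a simple accumulator: chunks grow a buffer, and completion flushes it in favor of the server's final message. A framework-free sketch of that state machine (the StreamAccumulator name is illustrative, not the store's API):

```typescript
// Illustrative accumulator mirroring the agent:stream / agent:complete flow.
export class StreamAccumulator {
  private buffer = '';

  // agent:stream — append a chunk and return the text rendered so far.
  push(chunk: string): string {
    this.buffer += chunk;
    return this.buffer;
  }

  // agent:complete — the server sends the authoritative final message;
  // the local buffer is discarded, as completeStream does in the store.
  complete(finalContent: string): string {
    this.buffer = '';
    return finalContent;
  }
}
```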

Performance & Optimization

1. Code Splitting

// apps/frontend/src/App.tsx
const ChatPage = lazy(() => import('./pages/ChatPage'));
const AssistantPage = lazy(() => import('./pages/AssistantPage'));
const StrategyPage = lazy(() => import('./pages/StrategyPage'));

<Suspense fallback={<LoadingSpinner />}>
  <Routes>
    <Route path="/chat" element={<ChatPage />} />
    <Route path="/assistant" element={<AssistantPage />} />
    <Route path="/strategy" element={<StrategyPage />} />
  </Routes>
</Suspense>

2. Message Virtualization

For long message lists, use react-window:

import { FixedSizeList } from 'react-window';

<FixedSizeList
  height={600}
  itemCount={messages.length}
  itemSize={120}
  width="100%"
>
  {({ index, style }) => (
    // ChatMessage has no style prop, so the row style goes on a wrapper
    <div style={style}>
      <ChatMessage message={messages[index]} />
    </div>
  )}
</FixedSizeList>

3. React Query Caching

const { data: conversations } = useQuery({
  queryKey: ['conversations'],
  queryFn: () => llmAgentService.getConversations(),
  staleTime: 5 * 60 * 1000, // 5 minutes
  gcTime: 10 * 60 * 1000, // 10 minutes
});

Environment Configuration

Location: apps/frontend/.env.local

VITE_API_URL=http://localhost:3080
VITE_LLM_AGENT_URL=http://localhost:3085
VITE_WS_URL=http://localhost:3085
VITE_ENVIRONMENT=development
VITE_LOG_LEVEL=debug
VITE_MAX_MESSAGE_LENGTH=4000
VITE_CHAT_PAGE_SIZE=50
VITE_AUTO_SCROLL=true
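Every value in .env.local arrives as a string, so parsing the numeric and boolean flags once with defaults avoids scattering Number() and === 'true' checks across components. A sketch; the parseEnv helper and ChatConfig type are assumptions, not part of the spec:

```typescript
// Typed view over the chat-related VITE_* variables, with defaults
// matching the values documented above.
export interface ChatConfig {
  maxMessageLength: number;
  chatPageSize: number;
  autoScroll: boolean;
}

// Parse the raw string map (import.meta.env in the app) into a config.
export function parseEnv(raw: Record<string, string | undefined>): ChatConfig {
  return {
    maxMessageLength: Number(raw.VITE_MAX_MESSAGE_LENGTH ?? 4000),
    chatPageSize: Number(raw.VITE_CHAT_PAGE_SIZE ?? 50),
    autoScroll: (raw.VITE_AUTO_SCROLL ?? 'true') === 'true',
  };
}
```

In the app this would be called once as parseEnv(import.meta.env) and shared from a config module.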

Build & Deploy

Development

cd apps/frontend
npm install
npm run dev  # Vite dev server on :3000

Production

npm run build
npm run preview

# Output
# ├── dist/
# │   ├── index.html
# │   └── assets/
# │       ├── chunk-*.js
# │       ├── vendor-*.js
# │       └── style-*.css

Frontend Roadmap

Phase 1 (In Progress)

  • Base chat window
  • Message rendering with markdown
  • WebSocket connection
  • Feedback system

Phase 2 (Next)

  • Improved signal visualization
  • Integrated charts
  • File attachment handling
  • Message search/filtering

Phase 3 (Future)

  • Voice input/output
  • PDF export of conversations
  • Collaboration features
  • Mobile app


Implementation Notes

  1. CORS: The frontend runs on port 3000; the backend on :3085 must have CORS enabled
  2. JWT token: Obtained at login and sent in the Authorization: Bearer <token> header
  3. Rate limiting: The server enforces limits; the frontend must surface an error when they are exceeded
  4. Reconnection: Socket.IO reconnects automatically, but connection state must still be validated
  5. Mobile first: Responsive design from the start, with tablet and desktop as enhancements
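Note 3 implies the client should back off rather than immediately retry after a 429. A minimal exponential-backoff delay schedule, as a sketch (the backoffDelayMs name and its defaults are assumptions; the actual 429 handling would live in an axios response interceptor):

```typescript
// Delay (ms) before retry attempt n (0-based): base * 2^n, capped at
// maxMs so repeated failures never produce unbounded waits.
export function backoffDelayMs(
  attempt: number,
  baseMs = 500,
  maxMs = 8000
): number {
  return Math.min(baseMs * 2 ** attempt, maxMs);
}
```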

Technical specification - NEXUS v4.0 Trading Platform System - OQI-007 Version 1.0 - Implemented