---
id: "INTEGRACION-TRADINGAGENT"
title: "Integracion TradingAgent - Trading Platform"
type: "Documentation"
project: "trading-platform"
version: "1.0.0"
updated_date: "2026-01-04"
---
# TradingAgent Integration - Trading Platform
**Version:** 1.0.0
**Last updated:** 2025-12-05
**Status:** Integration Plan
---
## Summary
This document details the plan for integrating the existing TradingAgent project (`[LEGACY: apps/ml-engine - migrado desde TradingAgent]`) into the new Trading Platform. The goal is to reuse the ML components that have already been developed and tested.
---
## Current State of TradingAgent
### Production-Ready Components
| Component | Location | Status | Metrics |
|------------|-----------|--------|----------|
| XGBoost GPU | `src/models/base/xgboost_model.py` | ✅ Production | MAE 0.24% |
| GRU Attention | `src/models/base/gru_model.py` | ✅ Production | - |
| Transformer | `src/models/base/transformer_model.py` | ✅ Production | - |
| RangePredictor | `src/models/range_predictor.py` | ✅ Production | 69.3% accuracy |
| TPSLClassifier | `src/models/tp_sl_classifier.py` | ✅ Production | 0.94 AUC |
| SignalGenerator | `src/models/signal_generator.py` | ✅ Production | JSON format |
| AMDDetector | `src/strategies/amd_detector.py` | ✅ Production | 4 phases |
| FastAPI Server | `src/api/server.py` | ✅ Production | REST + WS |
| Dashboard | `src/visualization/dashboard.py` | ✅ Production | Real-time |
| SignalLogger | `src/utils/signal_logger.py` | ✅ Production | LLM format |
### Available Data
```
Existing MySQL database:
- Host: 72.60.226.4
- Database: db_trading_meta
- Table: tickers_agg_data
- XAUUSD: 663,289 records (10 years)
- EURUSD: 755,896 records
- GBPUSD: 734,316 records
- USDJPY: 752,502 records
- Timeframe: 5 minutes
```
### Trained Models
```
models/phase2/
├── range_predictor/
│   ├── 15m/
│   │   ├── model_high.json    # XGBoost for delta_high
│   │   └── model_low.json     # XGBoost for delta_low
│   └── 1h/
│       ├── model_high.json
│       └── model_low.json
├── tpsl_classifier/
│   ├── 15m_rr_2_1.json        # TP/SL classifier R:R 2:1
│   └── 15m_rr_3_1.json        # TP/SL classifier R:R 3:1
├── feature_columns.txt        # List of 21 features
└── training_report.json       # Training metrics
```
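Before migrating, it is worth checking that the trained-model directory actually contains every file the tree above promises. A minimal sketch of such a check (the helper name is ours, not part of the existing codebase):

```python
from pathlib import Path

# Expected layout of models/phase2/, per the tree above.
EXPECTED_FILES = [
    "range_predictor/15m/model_high.json",
    "range_predictor/15m/model_low.json",
    "range_predictor/1h/model_high.json",
    "range_predictor/1h/model_low.json",
    "tpsl_classifier/15m_rr_2_1.json",
    "tpsl_classifier/15m_rr_3_1.json",
    "feature_columns.txt",
    "training_report.json",
]

def missing_model_files(base: Path) -> list[str]:
    """Return the expected relative paths that are absent under base."""
    return [rel for rel in EXPECTED_FILES if not (base / rel).is_file()]
```

Running `missing_model_files(Path("models/phase2"))` before the copy step should return an empty list; anything it reports is a blocker for the migration.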
---
## Integration Plan
### Phase 1: Code Migration
```
BEFORE (standalone TradingAgent):
[LEGACY: apps/ml-engine - migrado desde TradingAgent]/
├── src/
├── models/
├── config/
└── scripts/

AFTER (integrated into Trading Platform):
/home/isem/workspace/projects/trading-platform/
├── apps/
│   ├── frontend/          # React (new)
│   ├── backend/           # Express.js (new)
│   └── ml-engine/         # Python FastAPI (migrated from TradingAgent)
│       ├── app/
│       │   ├── api/           # FastAPI routes
│       │   ├── models/        # ML models (from TradingAgent)
│       │   ├── features/      # Feature engineering
│       │   ├── strategies/    # AMD and other strategies
│       │   └── services/      # Business services
│       ├── models/            # Trained models (.json)
│       └── config/            # Configuration
├── packages/
│   └── shared/            # Shared types
└── docs/
```
### Phase 2: API Adaptation
#### Current API (TradingAgent)
```python
# current src/api/server.py
@app.get("/api/predict/{symbol}")
async def get_prediction(symbol: str):
    # Returns the prediction directly
    return {
        "symbol": symbol,
        "predicted_high": ...,
        "predicted_low": ...
    }
```
#### New API (Trading Platform)
```python
# apps/ml-engine/app/api/routers/predictions.py
@router.post("/predictions")
async def create_prediction(
    request: PredictionRequest,
    api_key: str = Depends(validate_api_key)
):
    # API key validation
    # Per-user rate limiting
    # Standardized response format
    return PredictionResponse(
        success=True,
        data={
            "symbol": request.symbol,
            "horizon": request.horizon,
            "horizon_label": "intraday",
            "current_price": current_price,
            "predicted_high": prediction.high,
            "predicted_low": prediction.low,
            "delta_high_percent": delta_high,
            "delta_low_percent": delta_low,
            "confidence": {
                "mae": model_mae,
                "model_version": "v1.2.0"
            }
        },
        metadata={
            "request_id": request_id,
            "latency_ms": latency,
            "cached": was_cached
        }
    )
```
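The `PredictionRequest` / `PredictionResponse` schemas are not defined above. A minimal sketch of their shape, using stdlib dataclasses purely for illustration (in the real router these would be Pydantic models so FastAPI can validate the payload; field names follow the example above, everything else is an assumption):

```python
from dataclasses import dataclass, field
from typing import Any

@dataclass
class PredictionRequest:
    symbol: str    # e.g. "BTCUSDT"
    horizon: int   # prediction horizon, e.g. 18

@dataclass
class PredictionResponse:
    success: bool
    # Prediction payload: symbol, predicted_high, predicted_low, confidence, ...
    data: dict[str, Any] = field(default_factory=dict)
    # Request bookkeeping: request_id, latency_ms, cached
    metadata: dict[str, Any] = field(default_factory=dict)
```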
### Phase 3: Database Migration
```sql
-- Migration from MySQL to PostgreSQL

-- 1. Create the schema in PostgreSQL
CREATE SCHEMA ml;

-- 2. Market data table (migrated from MySQL)
CREATE TABLE ml.market_data (
    id BIGSERIAL PRIMARY KEY,
    symbol VARCHAR(20) NOT NULL,
    timestamp TIMESTAMPTZ NOT NULL,
    open DECIMAL(20, 8),
    high DECIMAL(20, 8),
    low DECIMAL(20, 8),
    close DECIMAL(20, 8),
    volume DECIMAL(30, 8),
    UNIQUE(symbol, timestamp)
);

-- Indexes for performance
CREATE INDEX idx_market_data_symbol_time
    ON ml.market_data(symbol, timestamp DESC);

-- 3. Predictions table
CREATE TABLE ml.predictions (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    symbol VARCHAR(20) NOT NULL,
    horizon INTEGER NOT NULL,
    predicted_high DECIMAL(20, 8),
    predicted_low DECIMAL(20, 8),
    actual_high DECIMAL(20, 8),
    actual_low DECIMAL(20, 8),
    mae_high DECIMAL(10, 6),
    mae_low DECIMAL(10, 6),
    created_at TIMESTAMPTZ DEFAULT NOW()
);

-- 4. Signals table
CREATE TABLE ml.signals (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    symbol VARCHAR(20) NOT NULL,
    horizon INTEGER NOT NULL,
    signal_type VARCHAR(10) NOT NULL,
    confidence DECIMAL(5, 4),
    phase_amd VARCHAR(20),
    entry_price DECIMAL(20, 8),
    stop_loss DECIMAL(20, 8),
    take_profit DECIMAL(20, 8),
    prob_tp_first DECIMAL(5, 4),
    created_at TIMESTAMPTZ DEFAULT NOW()
);

-- 5. Outcomes table (for LLM fine-tuning)
CREATE TABLE ml.signal_outcomes (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    signal_id UUID REFERENCES ml.signals(id),
    outcome VARCHAR(10),
    pnl_percent DECIMAL(10, 4),
    exit_price DECIMAL(20, 8),
    exit_reason VARCHAR(50),
    duration_minutes INTEGER,
    closed_at TIMESTAMPTZ
);
```
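The `mae_high` / `mae_low` columns in `ml.predictions` can be backfilled once the actual range is known. The exact formula is not specified here; one reasonable definition (an assumption on our part) is the absolute error as a percentage of the actual value:

```python
def mae_percent(predicted: float, actual: float) -> float:
    """Absolute error of a predicted price level, as a percent of the actual value."""
    if actual == 0:
        raise ValueError("actual price must be non-zero")
    return abs(predicted - actual) / abs(actual) * 100.0
```

For example, a predicted high of 2020 against an actual high of 2000 gives a 1% error, consistent with the "MAE 0.24%" figure quoted for the XGBoost component being a percentage metric.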
### Phase 4: Integration with the Express Backend
```typescript
// apps/backend/src/services/ml/ml-client.service.ts
import axios, { AxiosInstance } from 'axios';

export class MLClientService {
  private client: AxiosInstance;

  constructor() {
    this.client = axios.create({
      baseURL: process.env.ML_ENGINE_URL, // http://ml-engine:8000
      timeout: 30000,
      headers: {
        'X-API-Key': process.env.ML_API_KEY,
        'Content-Type': 'application/json',
      },
    });
  }

  async getPrediction(symbol: string, horizon: number) {
    const response = await this.client.post('/predictions', {
      symbol,
      horizon,
    });
    return response.data;
  }

  async getSignal(symbol: string, horizon: number) {
    const response = await this.client.post('/signals', {
      symbol,
      horizon,
      include_range: true,
      include_tpsl: true,
    });
    return response.data;
  }

  async getIndicators(symbol: string) {
    const response = await this.client.get(`/indicators?symbol=${symbol}`);
    return response.data;
  }
}
```
### Phase 5: WebSocket for Real-Time
```typescript
// apps/backend/src/services/ml/signal-subscriber.service.ts
import Redis from 'ioredis';
import { Server as SocketServer } from 'socket.io';

export class SignalSubscriberService {
  private subscriber: Redis;
  private io!: SocketServer;

  constructor() {
    this.subscriber = new Redis(process.env.REDIS_URL);
  }

  async start(io: SocketServer) {
    this.io = io;
    // Subscribe to the ML Engine's signal channels
    await this.subscriber.subscribe(
      'signals:BTCUSDT',
      'signals:ETHUSDT',
      'signals:XAUUSD'
    );
    this.subscriber.on('message', (channel, message) => {
      const signal = JSON.parse(message);
      const symbol = channel.split(':')[1];
      // Broadcast to subscribed clients
      this.io.to(`signals:${symbol}`).emit('signal', signal);
    });
  }
}
```
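On the ML Engine side, signals are published to those same channels. A minimal sketch of the publishing half, assuming the `signals:<SYMBOL>` channel naming used by the subscriber above (the payload fields are illustrative, not the definitive signal schema):

```python
import json

def build_signal_message(symbol: str, signal_type: str, confidence: float) -> tuple[str, str]:
    """Build the (channel, payload) pair the ML Engine publishes via Redis PUBLISH."""
    channel = f"signals:{symbol}"
    payload = json.dumps({
        "symbol": symbol,
        "signal_type": signal_type,  # e.g. "LONG" / "SHORT"
        "confidence": confidence,
    })
    return channel, payload

# In the real service this pair would go to redis_client.publish(channel, payload);
# the Express subscriber recovers the symbol with channel.split(':')[1].
```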
---
## Component Mapping
### ML Models
| TradingAgent | Trading Platform | Required Changes |
|--------------|-----------|-------------------|
| `XGBoostModel` | `RangePredictor` | Rename, add schemas |
| `TPSLClassifier` | `TPSLClassifier` | No changes |
| `SignalGenerator` | `SignalGenerator` | Adapt output format |
| `AMDDetector` | `MarketPhaseDetector` | Rename, document |
| `Meta-Model` | `EnsembleManager` | Reorganize |
### Features
```python
# Feature mapping: TradingAgent → Trading Platform
FEATURE_MAPPING = {
    # Volatility
    'volatility_10': 'volatility_10',
    'volatility_20': 'volatility_20',
    'atr_ratio': 'atr_14_ratio',
    # Momentum
    'rsi': 'rsi_14',
    'momentum_5': 'momentum_5',
    'momentum_10': 'momentum_10',
    # Trend
    'sma_10': 'sma_10',
    'sma_20': 'sma_20',
    'sma_50': 'sma_50',
    'close_sma20_ratio': 'price_sma_ratio_20',
    # Volume
    'volume_ratio': 'volume_ratio_20',
    # Candlestick
    'body_size': 'candle_body_size',
    'upper_wick': 'candle_upper_wick',
    'lower_wick': 'candle_lower_wick',
}
```
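Applying the mapping during migration is then a simple key rename per feature record. A hedged sketch (the helper name is ours) that renames legacy keys and leaves unmapped keys untouched:

```python
def rename_features(legacy: dict[str, float], mapping: dict[str, str]) -> dict[str, float]:
    """Rename legacy TradingAgent feature keys to their Trading Platform names.

    Keys not present in the mapping pass through unchanged.
    """
    return {mapping.get(name, name): value for name, value in legacy.items()}
```

Example: `rename_features({'rsi': 55.0}, {'rsi': 'rsi_14'})` yields `{'rsi_14': 55.0}`.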
### Strategies
| TradingAgent | Trading Platform Agent | Description |
|--------------|-----------------|-------------|
| AMD Accumulation | Atlas Entry | Entries during accumulation |
| AMD Distribution | Atlas Exit | Exits during distribution |
| Trend Following | Orion Core | Trend following |
| Breakout | Orion Breakout | Range breakouts |
| Momentum | Nova Core | Momentum trading |
| Altcoin Rotation | Nova Rotation | Altcoin rotation |
---
## Development Setup
### Environment Variables
```bash
# apps/ml-engine/.env

# API
API_HOST=0.0.0.0
API_PORT=8000
API_KEY=your-ml-api-key

# Database (new PostgreSQL)
DATABASE_URL=postgresql://user:pass@postgres:5432/trading
LEGACY_MYSQL_URL=mysql://root:pass@72.60.226.4:3306/db_trading_meta

# Redis
REDIS_URL=redis://redis:6379

# Exchange
BINANCE_API_KEY=your-binance-key
BINANCE_API_SECRET=your-binance-secret

# Models
MODEL_PATH=/app/models
SUPPORTED_SYMBOLS=BTCUSDT,ETHUSDT,XAUUSD
DEFAULT_HORIZONS=6,18,36,72

# GPU
CUDA_VISIBLE_DEVICES=0
```
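`SUPPORTED_SYMBOLS` and `DEFAULT_HORIZONS` are comma-separated strings, so the ML Engine needs to parse them at startup. A minimal sketch (function names are ours, not part of the existing code):

```python
def parse_symbols(raw: str) -> list[str]:
    """Split a comma-separated list, trimming whitespace and dropping empty entries."""
    return [item.strip() for item in raw.split(',') if item.strip()]

def parse_horizons(raw: str) -> list[int]:
    """Parse DEFAULT_HORIZONS, e.g. '6,18,36,72' -> [6, 18, 36, 72]."""
    return [int(item) for item in parse_symbols(raw)]
```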
### Docker Compose for the ML Engine
```yaml
# docker-compose.yml
services:
  ml-engine:
    build:
      context: ./apps/ml-engine
      dockerfile: Dockerfile
    ports:
      - "8000:8000"
    environment:
      - DATABASE_URL=${DATABASE_URL}
      - REDIS_URL=redis://redis:6379
      - API_KEY=${ML_API_KEY}
    volumes:
      - ./apps/ml-engine/models:/app/models
      - ./apps/ml-engine/config:/app/config
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]
    depends_on:
      - redis
      - postgres
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
      interval: 30s
      timeout: 10s
      retries: 3
```
### Dockerfile for the ML Engine
```dockerfile
# apps/ml-engine/Dockerfile
FROM nvidia/cuda:12.1-runtime-ubuntu22.04

# Python
RUN apt-get update && apt-get install -y \
    python3.11 \
    python3-pip \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /app

# Dependencies
COPY requirements.txt .
RUN pip3 install --no-cache-dir -r requirements.txt

# Application
COPY app/ ./app/
COPY models/ ./models/
COPY config/ ./config/

# Expose port
EXPOSE 8000

# Run
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
```
---
## Migration Scripts
### 1. Migrate Data from MySQL to PostgreSQL
```python
# scripts/migrate_market_data.py
import os

import pandas as pd
from sqlalchemy import create_engine

def migrate_market_data():
    # MySQL connection (source)
    mysql_engine = create_engine(os.getenv('LEGACY_MYSQL_URL'))
    # PostgreSQL connection (destination)
    pg_engine = create_engine(os.getenv('DATABASE_URL'))

    symbols = ['XAUUSD', 'EURUSD', 'GBPUSD', 'USDJPY']
    for symbol in symbols:
        print(f"Migrating {symbol}...")
        # Read from MySQL
        query = f"""
            SELECT
                '{symbol}' as symbol,
                date_curr as timestamp,
                open, high, low, close, volume
            FROM tickers_agg_data
            WHERE ticker = 'C:{symbol}'
            ORDER BY date_curr
        """
        df = pd.read_sql(query, mysql_engine)
        # Write to PostgreSQL
        df.to_sql(
            'market_data',
            pg_engine,
            schema='ml',
            if_exists='append',
            index=False,
            chunksize=10000
        )
        print(f"  {len(df)} records migrated")

if __name__ == '__main__':
    migrate_market_data()
```
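After the copy, row counts per symbol should match on both sides. A small pure helper to compare them (the function name is ours; the counts themselves would come from `SELECT COUNT(*)` against each database):

```python
def mismatched_symbols(source_counts: dict[str, int],
                       dest_counts: dict[str, int]) -> list[str]:
    """Return symbols whose row count differs between source and destination."""
    return sorted(
        symbol for symbol in source_counts
        if dest_counts.get(symbol, 0) != source_counts[symbol]
    )
```

An empty result means every symbol migrated completely; anything reported should be re-run before the legacy MySQL instance is retired.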
### 2. Copy Trained Models
```bash
#!/bin/bash
# scripts/copy_models.sh
SOURCE="[LEGACY: apps/ml-engine - migrado desde TradingAgent]/models/phase2"
DEST="/home/isem/workspace/projects/trading-platform/apps/ml-engine/models"

mkdir -p "$DEST"
cp -r "$SOURCE"/* "$DEST"/
echo "Models copied to $DEST"
ls -la "$DEST"
```
### 3. Validate the Integration
```python
# scripts/validate_integration.py
import requests

def validate_ml_engine():
    base_url = "http://localhost:8000"

    # Test health
    health = requests.get(f"{base_url}/health")
    assert health.status_code == 200
    print("✅ Health check passed")

    # Test prediction
    prediction = requests.post(
        f"{base_url}/predictions",
        json={"symbol": "BTCUSDT", "horizon": 18},
        headers={"X-API-Key": "test-key"}
    )
    assert prediction.status_code == 200
    data = prediction.json()
    assert "predicted_high" in data["data"]
    print("✅ Prediction endpoint working")

    # Test signal
    signal = requests.post(
        f"{base_url}/signals",
        json={
            "symbol": "BTCUSDT",
            "horizon": 18,
            "include_range": True,
            "include_tpsl": True
        },
        headers={"X-API-Key": "test-key"}
    )
    assert signal.status_code == 200
    data = signal.json()
    assert "signal" in data["data"]
    print("✅ Signal endpoint working")

    print("\n🎉 All validations passed!")

if __name__ == '__main__':
    validate_ml_engine()
```
---
## Integration Checklist
### Pre-Migration
- [ ] Full backup of TradingAgent
- [ ] Document current configurations
- [ ] List exact dependencies (requirements.txt)
- [ ] Verify trained models are available
### Code Migration
- [ ] Copy directory structure
- [ ] Adapt imports and paths
- [ ] Update configurations
- [ ] Create Dockerfile
- [ ] Create docker-compose.yml
### Data Migration
- [ ] Create PostgreSQL schema
- [ ] Migrate historical market data
- [ ] Copy trained models
- [ ] Validate data integrity
### Backend Integration
- [ ] Create MLClientService
- [ ] Implement per-user rate limiting
- [ ] Configure WebSocket for signals
- [ ] Add proxy endpoints
### Testing
- [ ] Unit tests for models
- [ ] API integration tests
- [ ] Performance tests
- [ ] E2E tests with the frontend
### Production
- [ ] Configure GPU on the server
- [ ] Set up monitoring (Prometheus)
- [ ] Set up logging (ELK)
- [ ] Set up alerts
---
## Estimated Timeline
| Phase | Duration | Dependencies |
|------|----------|--------------|
| Pre-Migration | 2 days | None |
| Code Migration | 3 days | Pre-Migration |
| Data Migration | 2 days | PostgreSQL schema |
| Backend Integration | 3 days | Code Migration |
| Testing | 3 days | All of the above |
| Production | 2 days | Testing complete |
**Total: ~15 working days**
---
## References
- [TradingAgent Source]([LEGACY: apps/ml-engine - migrado desde TradingAgent]/)
- [ARQUITECTURA-UNIFICADA](./ARQUITECTURA-UNIFICADA.md)
- [OQI-006: ML Signals](../02-definicion-modulos/OQI-006-ml-signals/)