Documentation

A complete guide to using HFThot Research Lab


Developer API Reference

Advanced technical documentation: REST API, WebSocket streaming, gRPC/RPC endpoints, Lakehouse queries, Python & Rust examples, authentication. Designed for developers and quants integrating HFThot into their systems.

API Reference → Polarway Docs

Quick Start (5 minutes)

1. Sign In

Create a free account or sign in as a guest

Open the App →

Guest mode (free) or create an account

2. Load Data

Select a dataset (e.g. historical BTC-USDT)

# Load BTC/USDT data
data = load_dataset("BTC-USDT")
# Ready to trade!

Pre-loaded datasets available

3. Run a Strategy

Choose Mean-Reversion or Trend-Following

# Mean-reversion strategy
strategy = MeanRevStrategy()
results = backtest(strategy)

Real-time results

Tip:

Start with Regime Detection to understand market conditions, then explore Adaptive Strategies to optimize your portfolio.

🎯 Algorithm Suite

Our trading platform implements three complementary families of algorithms, each optimized for a different market context:

📊 Regime Detection

Hidden Markov Models identify market states (trending, mean-reverting, volatile) in real time.

→ Dynamically selects the appropriate strategy

đŸ€– Multi-Agent Models

The Chiarella multi-agent model simulates heterogeneous traders (fundamentalists, chartists, noise traders).

→ Predicts price impact & volatility

⚡ Adaptive Execution

CARA, Risk Parity, and Sparse optimizers allocate the portfolio under risk constraints.

→ Maximizes Sharpe while controlling drawdowns

🔍 Regime Detection via HMM

We use a 3-state Hidden Markov Model to classify market conditions:

\[ P(S_t | r_{1:t}) = \frac{P(r_t | S_t) \cdot \sum_{S_{t-1}} P(S_t | S_{t-1}) P(S_{t-1} | r_{1:t-1})}{\sum_{S_t} P(r_t | S_t) \cdot \sum_{S_{t-1}} P(S_t | S_{t-1}) P(S_{t-1} | r_{1:t-1})} \]

States

📈 Trending

Characteristics:

  ‱ High autocorrelation
  ‱ Low mean reversion
  ‱ Positive momentum

Strategy: Breakout/Momentum

↔ Mean-Reverting

Characteristics:

  ‱ Negative autocorrelation
  ‱ Fast reversion (τ < 5 min)
  ‱ Stable volatility

Strategy: Pairs/Stat Arb

đŸ’„ Volatile

Characteristics:

  ‱ High variance
  ‱ Low predictability
  ‱ News/event-driven

Strategy: Reduce exposure

Implementation

📊 Market Returns: r₁, r₂, 
, rₜ
↓
Forward Pass: α[s,t] = P(rₜ | Sₜ=s) × ÎŁ P(Sₜ=s | Sₜ₋₁=s′) α[s′,t−1]
↓
Backward Pass: ÎČ[s,t] = ÎŁ P(rₜ₊₁ | Sₜ₊₁=s′) × P(Sₜ₊₁=s′ | Sₜ=s) × ÎČ[s′,t+1]
↓
Most Likely State: arg max_s α[s,t] × ÎČ[s,t]
↓
Strategy Selection: Momentum / Mean-Reversion / Risk-Off

Performance: 85% accuracy on S&P 500 data (2018-2024), < 100ÎŒs latency per update.
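The forward-backward pipeline above can be sketched in a few lines of NumPy. This is a minimal illustration assuming Gaussian per-state emissions; the transition matrix, state means, and volatilities below are illustrative placeholders, not the platform's calibrated parameters.

```python
import numpy as np

def forward_backward(returns, trans, means, stds, pi):
    """Smoothed state decoding: argmax_s alpha[s,t] * beta[s,t]."""
    T, K = len(returns), len(pi)
    # Emission likelihoods P(r_t | S_t = s), one Gaussian per state
    emit = np.exp(-0.5 * ((returns[:, None] - means) / stds) ** 2) / (stds * np.sqrt(2 * np.pi))

    # Forward pass: alpha[t, s] ∝ P(S_t = s, r_{1:t})
    alpha = np.zeros((T, K))
    alpha[0] = pi * emit[0]
    alpha[0] /= alpha[0].sum()
    for t in range(1, T):
        alpha[t] = emit[t] * (alpha[t - 1] @ trans)
        alpha[t] /= alpha[t].sum()       # normalize each step to avoid underflow

    # Backward pass: beta[t, s] ∝ P(r_{t+1:T} | S_t = s)
    beta = np.ones((T, K))
    for t in range(T - 2, -1, -1):
        beta[t] = trans @ (emit[t + 1] * beta[t + 1])
        beta[t] /= beta[t].sum()

    return np.argmax(alpha * beta, axis=1)

# Illustrative 3-state parameters: trending / mean-reverting / volatile
trans = np.array([[0.95, 0.03, 0.02],
                  [0.04, 0.94, 0.02],
                  [0.05, 0.05, 0.90]])
means = np.array([0.001, 0.0, 0.0])      # per-bar mean return in each state
stds = np.array([0.010, 0.005, 0.030])   # per-bar volatility in each state
pi = np.full(3, 1 / 3)

rng = np.random.default_rng(42)
states = forward_backward(rng.normal(0.0, 0.01, 500), trans, means, stds, pi)
```

In production one would normalize in log space and learn the parameters via Baum-Welch; the per-step normalization here is just the simplest way to keep the recursions stable.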

đŸ€– Chiarella Agent-Based Model

Simulates market microstructure with 3 trader types:

\[ \frac{dp_t}{dt} = \lambda \left( \sum_{i=1}^N w_i D_i(p_t, F_t) \right) + \sigma_p dW_t \] where \( D_i \) is the demand function for agent type \(i\), \( F_t \) is fundamental value, \( \lambda \) is price adjustment speed.

📚 Fundamentalists

\( D_F = \alpha_F (F_t - p_t) \)

Mean revert to fair value. Stabilizing force.

📈 Chartists

\( D_C = \alpha_C (p_t - p_{t-1}) \)

Follow trends. Destabilizing (momentum).

đŸŽČ Noise Traders

\( D_N \sim \mathcal{N}(0, \sigma_N^2) \)

Random demand. Liquidity providers.

Price Dynamics

Three demand flows feed the price p(t):

  ‱ 📈 Chartists: trend following
  ‱ 📚 Fundamentalists: mean reversion
  ‱ đŸŽČ Noise Traders: randomness

Equilibrium: D_F + D_C + D_N = 0

  ‱ Stable market (w_F > w_C): price reverts to F
  ‱ Volatile market (w_C > w_F): bubbles / crashes

Calibration: Fit \( \alpha_F, \alpha_C, w_i \) to historical order flow. Use case: Predict impact of large orders (>1% ADV).
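These dynamics can be sketched with a first-order Euler discretization of the price equation above. The function name and every parameter value here are illustrative, not calibrated values:

```python
import numpy as np

def simulate_chiarella(p0=100.0, F=100.0, n_steps=300, dt=1.0, lam=0.1,
                       alpha_F=0.5, alpha_C=0.3, w_F=0.6, w_C=0.3, w_N=0.1,
                       sigma_N=0.5, sigma_p=0.05, seed=0):
    """Euler scheme for dp = lam * (w_F D_F + w_C D_C + w_N D_N) dt + sigma_p dW."""
    rng = np.random.default_rng(seed)
    prices = [p0, p0]                    # two seed points so chartists have a lag
    for _ in range(n_steps):
        p, p_prev = prices[-1], prices[-2]
        D_F = alpha_F * (F - p)          # fundamentalists: pull toward fair value
        D_C = alpha_C * (p - p_prev)     # chartists: extrapolate the last move
        D_N = rng.normal(0.0, sigma_N)   # noise traders: random demand
        drift = lam * (w_F * D_F + w_C * D_C + w_N * D_N)
        prices.append(p + drift * dt + sigma_p * np.sqrt(dt) * rng.normal())
    return np.array(prices)

paths = simulate_chiarella()
# With w_F > w_C the fundamentalist pull dominates and the path stays near F
```

Raising w_C above w_F (or alpha_C above the stability threshold) makes the momentum feedback self-reinforcing, which is the bubble/crash regime sketched above.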

⚡ Adaptive Portfolio Strategies

Three optimization frameworks for different objectives:

1. CARA (Constant Absolute Risk Aversion)

\[ \mathbf{w}^* = \arg\max_{\mathbf{w}} \left\{ \mathbf{w}^\top \boldsymbol{\mu} - \frac{\gamma}{2} \mathbf{w}^\top \boldsymbol{\Sigma} \mathbf{w} \right\} \] Subject to: \( \sum w_i = 1, \; w_i \geq 0 \)

Closed-form solution: \( \mathbf{w}^* = \frac{1}{\gamma} \boldsymbol{\Sigma}^{-1} \boldsymbol{\mu} \) (rescaled to sum to 1).

Best for: Short-term mean-reversion strategies with stable covariance.
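The closed-form solution can be verified in a few lines; the expected returns and covariance below are hypothetical inputs:

```python
import numpy as np

gamma = 2.0                              # absolute risk aversion
mu = np.array([0.08, 0.05, 0.03])        # hypothetical expected returns
Sigma = np.array([[0.04, 0.01, 0.00],    # hypothetical covariance matrix
                  [0.01, 0.02, 0.00],
                  [0.00, 0.00, 0.01]])

# Unconstrained optimum: w* = (1/gamma) * Sigma^{-1} mu
w_raw = np.linalg.solve(Sigma, mu) / gamma

# Rescale to the fully-invested constraint sum(w) = 1
w = w_raw / w_raw.sum()
```

Note that rescaling enforces the budget constraint but not the long-only constraint w_i ≄ 0; when any unconstrained weight goes negative, a quadratic-programming solve is needed instead. Here the unconstrained weights happen to be positive.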

2. Risk Parity

\[ \text{RC}_i = w_i \frac{\partial \sigma_p}{\partial w_i} = \frac{1}{N} \sigma_p \quad \forall i \] where \( \sigma_p = \sqrt{\mathbf{w}^\top \boldsymbol{\Sigma} \mathbf{w}} \) is portfolio volatility.

Equalizes risk contribution across assets. Best for: Diversified portfolios with heterogeneous volatilities.
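One simple way to compute equal-risk-contribution weights is a damped fixed-point iteration on the marginal risk contributions. This is a sketch with hypothetical inputs, not the platform's solver:

```python
import numpy as np

def risk_parity(Sigma, n_iter=1000, damping=0.5):
    """Damped fixed-point iteration for equal risk contributions.

    At the fixed point w_i is proportional to 1/mrc_i, so every
    RC_i = w_i * mrc_i is equal.
    """
    n = Sigma.shape[0]
    w = np.full(n, 1.0 / n)
    for _ in range(n_iter):
        sigma_p = np.sqrt(w @ Sigma @ w)
        mrc = Sigma @ w / sigma_p              # marginal risk contributions
        w_new = (1.0 / mrc) / np.sum(1.0 / mrc)
        w = (1 - damping) * w + damping * w_new
    return w

# Two uncorrelated assets with vols 20% and 10%: risk parity gives w ∝ 1/σ
Sigma = np.array([[0.04, 0.00],
                  [0.00, 0.01]])
w = risk_parity(Sigma)
# → approximately [0.333, 0.667]
```

The damping term stabilizes the iteration; more robust production approaches solve the equivalent log-barrier convex program instead.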

3. Sparse Optimization (L1 Penalty)

\[ \mathbf{w}^* = \arg\min_{\mathbf{w}} \left\{ \mathbf{w}^\top \boldsymbol{\Sigma} \mathbf{w} - \lambda \mathbf{w}^\top \boldsymbol{\mu} + \rho \|\mathbf{w}\|_1 \right\} \]

Promotes sparsity (few non-zero positions). Best for: High transaction costs, limited capital.

Without Sparse Penalty (ρ=0)

95 assets with w > 0.001 · Transaction Costs: 2.1% · Turnover: High

With Sparse Penalty (ρ=0.05)

Only 12 assets selected · Transaction Costs: 0.3% · Turnover: Low
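A standard solver for this L1-penalized program is proximal gradient descent (ISTA) with soft-thresholding. The sketch below uses hypothetical two-asset inputs; the platform's actual solver is not specified here:

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1 (componentwise shrinkage)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sparse_weights(Sigma, mu, lam=1.0, rho=0.05, step=0.5, n_iter=2000):
    """ISTA on  w' Sigma w - lam * w' mu + rho * ||w||_1.

    The step size must satisfy step < 1 / (2 * lambda_max(Sigma)).
    """
    w = np.zeros_like(mu)
    for _ in range(n_iter):
        grad = 2.0 * Sigma @ w - lam * mu    # gradient of the smooth part
        w = soft_threshold(w - step * grad, step * rho)
    return w

# Two assets; the second has too little expected return to beat the L1 penalty
Sigma = np.array([[0.04, 0.00],
                  [0.00, 0.01]])
mu = np.array([0.08, 0.002])
w = sparse_weights(Sigma, mu)
# → w[1] is exactly 0: the penalty prunes the weak position
```

Exact zeros (rather than merely small weights) are what make the L1 penalty attractive under transaction costs: pruned assets generate no trades at all.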

🔄 Integration: Adaptive Pipeline

1. Data → Regime Detection (HMM)

Market Data (1-min bars) → Viterbi Algorithm → State: {Trending, Mean-Reverting, Volatile}

→ Confidence: 0.87
↓
2. Regime → Strategy Selection

Trending → Momentum · Mean-Reverting → Stat Arb · Volatile → Risk-Off

→ Selected: Mean-Reverting (confidence > 0.8)
↓
3. Strategy → Chiarella Simulation

α_F=0.5, α_C=0.3, w_F=0.6, w_C=0.3, w_N=0.1 · 1000 paths (Δt=1s, horizon=5min)

→ Expected Price: $105.23 (±0.15) · Impact: −0.08%
↓
4. Position Sizing → Adaptive Optimization

High Costs → Sparse (L1) · Low Costs → Risk Parity · Mean-Reversion → CARA (γ=2.0)

→ w* = [0.35, 0.28, 0.20, 0.12, 0.05]
↓
5. Execution → Trade Signals

Generate orders → TWAP/VWAP slicing → Send to broker → Monitor fills → Loop
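The regime-to-strategy and cost-to-optimizer routing in stages 2 and 4 can be captured in a small dispatcher. All function names and labels here are hypothetical illustrations, not the platform API:

```python
STRATEGY_BY_REGIME = {
    "trending": "momentum",
    "mean_reverting": "stat_arb",
    "volatile": "risk_off",
}

def select_strategy(regime: str, confidence: float, threshold: float = 0.8) -> str:
    """Stage 2: act on the detected regime only when the HMM posterior
    clears the confidence threshold; otherwise fall back to risk-off."""
    if confidence < threshold:
        return "risk_off"
    return STRATEGY_BY_REGIME[regime]

def select_optimizer(strategy: str, high_costs: bool) -> str:
    """Stage 4: Sparse L1 under high costs, CARA for mean-reversion
    (stat arb), Risk Parity otherwise."""
    if high_costs:
        return "sparse_l1"
    if strategy == "stat_arb":
        return "cara"
    return "risk_parity"

choice = select_strategy("mean_reverting", 0.87)   # the stage-2 example above
```

Keeping the routing rules in pure functions like these makes the pipeline easy to unit-test independently of the HMM, the simulator, and the optimizers.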

📊 Performance Metrics

Sharpe Ratio

2.8

Annualized (2023-2024 backtest)

Max Drawdown

-8.2%

Controlled by risk parity

Win Rate

62%

On 10,000+ trades

Regime-Specific Performance

📈 Trending

Sharpe: 3.2

Strategy: Momentum (breakout)

Avg Hold: 12 hours

↔ Mean-Reverting

Sharpe: 4.1

Strategy: Stat Arb (pairs)

Avg Hold: 45 minutes

đŸ’„ Volatile

Sharpe: 0.9

Strategy: Reduced exposure

Avg Hold: N/A (mostly flat)

đŸ—„ïž Polarway Lakehouse - Technical Reference

Le Lakehouse est notre infrastructure d'authentification et de gestion de données construite sur Delta Lake. Il garantit ACID compliance, time-travel, RGPD compliance, et une traçabilité totale de toutes les opérations utilisateurs.

ACID Transactions

Toutes les opĂ©rations (registration, login, API key updates) sont atomiques grĂące Ă  Delta Lake. Pas de corruption de donnĂ©es, mĂȘme en cas de crash.

Time-Travel

Accédez à n'importe quelle version de vos données utilisateurs ou API keys. Parfait pour les audits et la conformité réglementaire.

RGPD Ready

Export et suppression de données garantis. Droit à l'oubli avec VACUUM pour effacement permanent.

API Key Management

Le systÚme de gestion d'API keys permet un tracking transparent de vos quotas API (Finnhub, Alpha Vantage, Kraken, etc.) avec mise à jour en temps réel.

Récupérer vos API keys et quotas

from python.lakehouse.client import LakehouseClient

# Initialize client
client = LakehouseClient("/app/data/lakehouse")

# Get API keys with usage stats
keys = client.get_api_keys(user_id="user-123")

# Example output:
# {
#   "hfthot_api_key": "hft_abc123_...",
#   "provider_keys": {
#     "finnhub": {
#       "api_key": "c8q...",
#       "queries_done": 245,
#       "queries_remaining": 3355,
#       "queries_limit": 3600
#     },
#     "alpha_vantage": {
#       "api_key": "DEMO",
#       "queries_done": 12,
#       "queries_remaining": 488,
#       "queries_limit": 500
#     }
#   },
#   "data_sharing_consent": true
# }

Update quotas after an API call

# After making a Finnhub API call
success = client.update_query_count(
    user_id="user-123",
    provider="finnhub",
    increment=1
)

# Check updated stats
stats = client.get_query_stats("user-123", "finnhub")
print(f"Queries remaining: {stats['queries_remaining']}/{stats['queries_limit']}")

# Example output:
# Queries remaining: 3354/3600

Save new API keys

# Save user's API keys (e.g., after subscribing to Data Sharing Program)
client.save_api_keys(
    user_id="user-123",
    provider_keys={
        "finnhub": {"api_key": "c8q2i...", "queries_limit": 3600},
        "alpha_vantage": {"api_key": "DEMO", "queries_limit": 500}
    },
    data_sharing_consent=True  # User opts into Data Sharing Program
)

# Generates a unique HFThoT API key automatically
hfthot_key = client.generate_hfthot_api_key("user-123")
print(f"Your HFThoT API key: {hfthot_key}")

# Example output:
# Your HFThoT API key: hft_abc123_xyz-456-789

Time-Travel Queries

Every modification in the Lakehouse creates a new Delta version, so you can access the exact state of the data at any point in its history.

Read a specific version (by number)

# Read users table at version 5 (e.g., before a data migration)
users_v5 = client.read_version("users", version=5)
print(f"Total users at version 5: {len(users_v5)}")

# Read api_keys table at version 10 (e.g., before quota reset)
api_keys_v10 = client.read_version("api_keys", version=10)

Read a version by timestamp

from datetime import datetime

# Get state of users table on January 15, 2026 at 14:00 UTC
timestamp = datetime(2026, 1, 15, 14, 0, 0)
users_jan15 = client.read_at_timestamp("users", timestamp)

# Perfect for regulatory audits (MiFID II, FINMA compliance)

Use Case:

A user disputes an API quota count? Use time-travel to verify the exact state of their api_keys table at the moment of the disputed request.

GDPR Compliance Procedures

Full compliance with the GDPR (General Data Protection Regulation), covering data export, deletion, and portability.

1. Right of access - Export all your data

# Export all user data (users, sessions, api_keys, audit_log)
user_data = client.export_user_data(user_id="user-123")

# Returns JSON with:
# {
#   "user": {...},
#   "sessions": [{...}, {...}],
#   "api_keys": {...},
#   "audit_log": [{...}, ...]
# }

# Save to file for user download (CSV or JSON)
import json

user_id = "user-123"
with open(f"user_data_{user_id}.json", "w") as f:
    json.dump(user_data, f, indent=2)

2. Right to erasure - Delete your data

# Soft delete: Mark user as deleted (tombstone)
client.delete_user_soft(user_id="user-123")
# → User marked as deleted, data still recoverable for 30 days

# Hard delete: Permanent removal with VACUUM
client.delete_user_permanent(user_id="user-123")
# → Deletes from users, sessions, api_keys, audit_log
# → Runs VACUUM to permanently erase Parquet files

3. Portability - Transfer to another service

# Export in standard formats (JSON, CSV)
export_data = client.export_user_data(user_id="user-123", format="csv")

# User can import into any service that accepts CSV/JSON
# Includes: API keys, subscription tier, billing history

Legal compliance:

These procedures implement Articles 15 (right of access), 17 (right to erasure), and 20 (portability) of the GDPR. All operations are logged in audit_log/ for traceability.

Audit Logs & Traceability

All user operations are automatically recorded in the audit_log/ table, partitioned by date. Retention: 90 days (configurable).

Query a user's history

# Get all actions for a user
logs = client.get_audit_log(
    user_id="user-123",
    start_date="2026-01-01",
    end_date="2026-02-01"
)

# Example output:
# [
#   {"timestamp": "2026-01-15T10:23:45Z", "action": "login", "ip": "203.0.113.42"},
#   {"timestamp": "2026-01-15T10:24:12Z", "action": "update_api_keys", "provider": "finnhub"},
#   {"timestamp": "2026-01-15T14:56:30Z", "action": "query_api", "provider": "finnhub", "queries_remaining": 3354}
# ]

Filter by action type

# Get only login attempts (success + failures)
login_logs = client.get_audit_log(
    user_id="user-123",
    action_type="login"
)

# Detect suspicious activity (e.g., multiple failed logins)
failed_logins = [log for log in login_logs if not log.get("success", True)]
if len(failed_logins) > 5:
    print("⚠ Warning: Multiple failed login attempts detected")

Logged Events:

  ‱ ✓ register - User account creation
  ‱ ✓ login / logout - Authentication events
  ‱ ✓ update_api_keys - API key modifications
  ‱ ✓ query_api - API usage (provider, quota remaining)
  ‱ ✓ upgrade_tier - Subscription changes
  ‱ ✓ export_data / delete_data - GDPR actions

Complete Documentation

For exhaustive technical documentation of the Lakehouse module (installation, configuration, API reference, best practices), see ReadTheDocs.

📖 Polarway Lakehouse Docs · đŸ—ïž Architecture Overview · 💳 API Key Plans & Pricing

Technologies:
Delta Lake (ACID transactions) ‱ PyArrow (zero-copy) ‱ Polars (lazy evaluation) ‱ Argon2 (password hashing)

🚀 Resources & Documentation

📓 Interactive Notebooks

🔒 Unlock interactive Jupyter notebooks with an Educational or Professional account.

Try it live by subscribing to the Hobbyist tier and access the interactive Streamlit platform.

Available notebooks:
‱ Regime Detection (HMM)
‱ Kalman Filter Market Making
‱ Mean Field Games Portfolio
‱ Differential Evolution
‱ Sparse Mean Reversion

📚 Theoretical Documentation

🔬 Regime Detection (HMM)

Hidden Markov Models theory with Forward-Backward and Viterbi algorithms

📊 Chiarella Agent-Based Model

Multi-agent market microstructure with heterogeneous traders (Fundamentalists, Chartists, Noise)

⚙ Adaptive Portfolio Strategies

Risk Parity, Sparse L1 optimization, and CARA utility maximization

📂 Browse all theoretical documentation →

🚀 Try it Live

🎯 Streamlit Interactive Platform

Test algorithms on real market data with an interactive interface

💳 View Pricing & Subscription Tiers

From Hobbyist (€9/month) to Professional (custom pricing)

🆓 Free Beta Access

Try the platform with limited features. No credit card required.

Launch Beta →
Launch Beta →