Quick Start (5 minutes)
1. Sign In
Create a free account or log in as a guest
Open the App → Guest mode (free) or create an account
2. Load the Data
Select a dataset (e.g., historical BTC-USDT)
data = load_dataset("BTC-USDT")
# Ready to trade!
Pre-loaded datasets available
3. Run a Strategy
Choose Mean-Reversion or Trend-Following
strategy = MeanRevStrategy()
results = backtest(strategy)
Real-time results
Tip:
Start with Regime Detection to understand market conditions, then explore the Adaptive Strategies to optimize your portfolio.
Algorithm Suite
Our trading platform implements three complementary families of algorithms, optimized for different market contexts:
Regime Detection
Hidden Markov Models identify market states (trending, mean-reverting, volatile) in real time.
→ Dynamically adapted strategy selection
Multi-Agent Models
The Chiarella multi-agent model simulates heterogeneous traders (fundamentalists, chartists, noise).
→ Predicts price impact & volatility
Adaptive Execution
CARA/Risk Parity/Sparse optimize portfolio allocation under risk constraints.
→ Sharpe maximization with controlled drawdowns
Regime Detection via HMM
We use a 3-state Hidden Markov Model to classify market conditions:
States
Trending
Characteristics:
- High autocorrelation
- Low mean reversion
- Positive momentum
Strategy: Breakout/Momentum
Mean-Reverting
Characteristics:
- Negative autocorrelation
- Fast reversion (τ < 5 min)
- Stable volatility
Strategy: Pairs/Stat Arb
Volatile
Characteristics:
- High variance
- Low predictability
- News/event-driven
Strategy: Reduce exposure
Implementation
Performance: 85% accuracy on S&P 500 data (2018-2024), < 100 μs latency per update.
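A minimal NumPy sketch of the Viterbi decoding step used above; the transition matrix, state means, and volatilities below are illustrative placeholders, not the platform's calibrated parameters.

```python
import numpy as np

STATES = ["Trending", "Mean-Reverting", "Volatile"]

def viterbi(log_emissions, log_trans, log_init):
    """Most likely state path given per-step emission log-probabilities."""
    T, K = log_emissions.shape
    delta = log_init + log_emissions[0]       # best log-prob ending in each state
    back = np.zeros((T, K), dtype=int)        # backpointers
    for t in range(1, T):
        scores = delta[:, None] + log_trans   # scores[i, j]: from state i to state j
        back[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + log_emissions[t]
    path = np.empty(T, dtype=int)
    path[-1] = delta.argmax()
    for t in range(T - 2, -1, -1):            # backtrack
        path[t] = back[t + 1, path[t + 1]]
    return path

# Toy parameters: sticky regimes, Gaussian emissions on 1-min returns
trans = np.array([[0.98, 0.01, 0.01],
                  [0.01, 0.98, 0.01],
                  [0.02, 0.02, 0.96]])
means = np.array([0.001, 0.0, 0.0])           # per-state mean return
stds = np.array([0.001, 0.001, 0.01])         # per-state volatility
returns = np.array([0.001, 0.0012, 0.0009, -0.0001, 0.0001, 0.012, -0.015])

log_em = (-0.5 * ((returns[:, None] - means) / stds) ** 2
          - np.log(stds * np.sqrt(2 * np.pi)))
path = viterbi(log_em, np.log(trans), np.log(np.full(3, 1 / 3)))
print([STATES[s] for s in path])
```

The sticky diagonal of the transition matrix is what keeps the decoded path from flapping between states on every noisy bar; the large final moves decode as Volatile despite the switching penalty.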
Chiarella Agent-Based Model
Simulates market microstructure with 3 trader types:
Fundamentalists
Mean revert to fair value. Stabilizing force.
Chartists
Follow trends. Destabilizing (momentum).
Noise Traders
Random demand. Liquidity providers.
Price Dynamics
\( \frac{dp}{dt} = w_F\,\alpha_F\,\big(p_f - p(t)\big) + w_C\,\tanh\!\big(\alpha_C\,\dot{p}(t)\big) + w_N\,\epsilon_t \)
Calibration: Fit \( \alpha_F, \alpha_C, w_i \) to historical order flow. Use case: Predict impact of large orders (>1% ADV).
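As a sketch, the excess-demand dynamics can be simulated directly in discrete time. The fair value, noise scale, and trend-smoothing constant below are illustrative assumptions; the weights and reaction coefficients are borrowed from the pipeline section (α_F=0.5, α_C=0.3, w_F=0.6, w_C=0.3, w_N=0.1).

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_chiarella(p0=100.0, p_fair=100.0, alpha_F=0.5, alpha_C=0.3,
                       w_F=0.6, w_C=0.3, w_N=0.1, dt=1.0, n_steps=300):
    """Simulate one price path driven by three trader types."""
    p = np.empty(n_steps + 1)
    p[0] = p0
    trend = 0.0  # chartists' smoothed estimate of the price trend
    for t in range(n_steps):
        fundamentalist = alpha_F * (p_fair - p[t])     # pull toward fair value
        chartist = np.tanh(alpha_C * trend)            # bounded trend-following
        noise = rng.normal(0.0, 0.1)                   # random liquidity demand
        excess_demand = w_F * fundamentalist + w_C * chartist + w_N * noise
        p[t + 1] = p[t] + excess_demand * dt
        trend = 0.9 * trend + 0.1 * (p[t + 1] - p[t])  # EMA of price changes
    return p

# Monte Carlo over 100 paths (the doc's pipeline uses 1000)
paths = np.array([simulate_chiarella() for _ in range(100)])
print("terminal price mean/std:", paths[:, -1].mean(), paths[:, -1].std())
```

Because the fundamentalist term contracts the price toward p_fair while the chartist demand is bounded by tanh, the simulated paths stay in a band around fair value; raising w_C relative to w_F is what produces momentum bursts.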
Adaptive Portfolio Strategies
Three optimization frameworks for different objectives:
1. CARA (Constant Absolute Risk Aversion)
Closed-form solution: \( \mathbf{w}^* = \frac{1}{\gamma} \boldsymbol{\Sigma}^{-1} \boldsymbol{\mu} \) (rescaled to sum to 1).
Best for: Short-term mean-reversion strategies with stable covariance.
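A minimal sketch of the closed-form CARA allocation above, with made-up μ and Σ (not calibrated values):

```python
import numpy as np

def cara_weights(mu, Sigma, gamma=2.0):
    """w* = (1/gamma) Sigma^{-1} mu, rescaled to sum to 1."""
    w = np.linalg.solve(Sigma, mu) / gamma
    return w / w.sum()

# Illustrative expected returns and covariance matrix
mu = np.array([0.08, 0.05, 0.03])
Sigma = np.array([[0.040, 0.010, 0.000],
                  [0.010, 0.020, 0.000],
                  [0.000, 0.000, 0.010]])
w = cara_weights(mu, Sigma, gamma=2.0)
print(w.round(4), w.sum())
```

Note that once the weights are rescaled to sum to 1, γ cancels out; the risk-aversion parameter only matters when the budget constraint is left free (e.g., when cash is an allowed residual).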
2. Risk Parity
Equalizes risk contribution across assets. Best for: Diversified portfolios with heterogeneous volatilities.
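One simple way to approximate risk-parity weights is a damped fixed-point iteration on inverse marginal risks; this is a sketch with an illustrative covariance matrix, not the platform's solver.

```python
import numpy as np

def risk_parity_weights(Sigma, n_iter=5000, damping=0.5):
    """Iterate toward weights where each asset's risk contribution is equal."""
    n = Sigma.shape[0]
    w = np.full(n, 1.0 / n)
    for _ in range(n_iter):
        target = 1.0 / (Sigma @ w)                 # inverse marginal risk
        target /= target.sum()
        w = damping * w + (1 - damping) * target   # damped update for stability
    return w

# Illustrative covariance with heterogeneous volatilities
Sigma = np.array([[0.040, 0.010, 0.002],
                  [0.010, 0.020, 0.001],
                  [0.002, 0.001, 0.010]])
w = risk_parity_weights(Sigma)
rc = w * (Sigma @ w)          # each asset's risk contribution
print(w.round(4), (rc / rc.sum()).round(4))
```

At the fixed point, w_i is proportional to 1/(Σw)_i, which makes every product w_i(Σw)_i, i.e. every risk contribution, equal; the low-volatility asset ends up with the largest weight, which is the defining behavior of risk parity.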
3. Sparse Optimization (L1 Penalty)
Promotes sparsity (few non-zero positions). Best for: High transaction costs, limited capital.
Without Sparse Penalty (τ=0)
With Sparse Penalty (τ=0.05)
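The effect of the L1 penalty can be sketched with proximal gradient descent (ISTA) on a penalized mean-variance objective; the inputs and step size are illustrative assumptions, with τ matching the two cases above.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of the L1 norm: shrink toward zero by t."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sparse_weights(mu, Sigma, gamma=2.0, tau=0.05, lr=1.0, n_iter=2000):
    """Minimize (gamma/2) w'Σw - mu'w + tau*||w||_1 via ISTA."""
    w = np.zeros_like(mu)
    for _ in range(n_iter):
        grad = gamma * (Sigma @ w) - mu
        w = soft_threshold(w - lr * grad, lr * tau)
    return w

# Illustrative inputs (diagonal covariance keeps the solution easy to check)
mu = np.array([0.08, 0.05, 0.03, 0.01])
Sigma = np.diag([0.04, 0.02, 0.01, 0.01])
w0 = sparse_weights(mu, Sigma, tau=0.0)
ws = sparse_weights(mu, Sigma, tau=0.05)
print("tau=0:   ", w0.round(3))
print("tau=0.05:", ws.round(3))
```

With τ=0 every asset gets a non-zero position; with τ=0.05 only the asset whose expected return exceeds the penalty survives, which is exactly why the sparse variant suits high transaction costs and limited capital.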
Integration: Adaptive Pipeline
Data → Regime Detection (HMM)
Market Data (1-min bars) → Viterbi Algorithm → State: {Trending, Mean-Reverting, Volatile}
Regime → Strategy Selection
Trending → Momentum · Mean-Reverting → Stat Arb · Volatile → Risk-Off
Strategy → Chiarella Simulation
α_F=0.5, α_C=0.3, w_F=0.6, w_C=0.3, w_N=0.1 · 1000 paths (Δt=1s, horizon=5min)
Position Sizing → Adaptive Optimization
High Costs → Sparse (L1) · Low Costs → Risk Parity · Mean-Reversion → CARA (γ=2.0)
Execution → Trade Signals
Generate orders → TWAP/VWAP slicing → Send to broker → Monitor fills → Loop
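The regime-to-strategy and cost-to-optimizer routing above can be sketched as a small dispatch table; the function and key names here are illustrative, not the platform's API.

```python
# Map each detected regime to the strategy family named in the pipeline
REGIME_TO_STRATEGY = {
    "Trending": "momentum",
    "Mean-Reverting": "stat_arb",
    "Volatile": "risk_off",
}

def select_strategy(regime: str, transaction_costs: str = "low") -> dict:
    """Pick strategy and optimizer from regime and cost profile."""
    strategy = REGIME_TO_STRATEGY[regime]
    if strategy == "stat_arb":
        optimizer = "cara"          # mean-reversion → CARA (γ=2.0)
    elif transaction_costs == "high":
        optimizer = "sparse_l1"     # high costs → sparse (L1) allocation
    else:
        optimizer = "risk_parity"   # low costs → risk parity
    return {"strategy": strategy, "optimizer": optimizer}

print(select_strategy("Mean-Reverting"))
# {'strategy': 'stat_arb', 'optimizer': 'cara'}
```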
Performance Metrics
Sharpe Ratio
2.8
Annualized (2023-2024 backtest)
Max Drawdown
-8.2%
Controlled by risk parity
Win Rate
62%
On 10,000+ trades
Regime-Specific Performance
Trending
Sharpe: 3.2
Strategy: Momentum (breakout)
Avg Hold: 12 hours
Mean-Reverting
Sharpe: 4.1
Strategy: Stat Arb (pairs)
Avg Hold: 45 minutes
Volatile
Sharpe: 0.9
Strategy: Reduced exposure
Avg Hold: N/A (mostly flat)
Polarway Lakehouse - Technical Reference
The Lakehouse is our authentication and data-management infrastructure built on Delta Lake. It guarantees ACID compliance, time travel, GDPR compliance, and full traceability of all user operations.
ACID Transactions
All operations (registration, login, API key updates) are atomic thanks to Delta Lake. No data corruption, even in the event of a crash.
Time-Travel
Access any version of your user data or API keys. Ideal for audits and regulatory compliance.
GDPR Ready
Guaranteed data export and deletion.
Right to be forgotten, with VACUUM for permanent erasure.
API Key Management
The API key management system provides transparent tracking of your API quotas (Finnhub, Alpha Vantage, Kraken, etc.) with real-time updates.
Retrieve your API keys and quotas
from python.lakehouse.client import LakehouseClient
# Initialize client
client = LakehouseClient("/app/data/lakehouse")
# Get API keys with usage stats
keys = client.get_api_keys(user_id="user-123")
# Example output:
# {
#   "hfthot_api_key": "hft_abc123_...",
#   "provider_keys": {
#     "finnhub": {
#       "api_key": "c8q...",
#       "queries_done": 245,
#       "queries_remaining": 3355,
#       "queries_limit": 3600
#     },
#     "alpha_vantage": {
#       "api_key": "DEMO",
#       "queries_done": 12,
#       "queries_remaining": 488,
#       "queries_limit": 500
#     }
#   },
#   "data_sharing_consent": true
# }
Update quotas after an API request
# After making a Finnhub API call
success = client.update_query_count(
    user_id="user-123",
    provider="finnhub",
    increment=1
)
# Check updated stats
stats = client.get_query_stats("user-123", "finnhub")
print(f"Queries remaining: {stats['queries_remaining']}/{stats['queries_limit']}")
# Example output:
# Queries remaining: 3354/3600
Save new API keys
# Save user's API keys (e.g., after subscribing to Data Sharing Program)
client.save_api_keys(
    user_id="user-123",
    provider_keys={
        "finnhub": {"api_key": "c8q2i...", "queries_limit": 3600},
        "alpha_vantage": {"api_key": "DEMO", "queries_limit": 500}
    },
    data_sharing_consent=True  # User opts into Data Sharing Program
)
# Generates a unique HFThoT API key automatically
hfthot_key = client.generate_hfthot_api_key("user-123")
print(f"Your HFThoT API key: {hfthot_key}")
# Example output:
# Your HFThoT API key: hft_abc123_xyz-456-789
Time-Travel Queries
Every modification in the Lakehouse creates a new Delta version. You can access the exact state of the data at any point in the past.
Read a specific version (by number)
# Read users table at version 5 (e.g., before a data migration)
users_v5 = client.read_version("users", version=5)
print(f"Total users at version 5: {len(users_v5)}")
# Read api_keys table at version 10 (e.g., before quota reset)
api_keys_v10 = client.read_version("api_keys", version=10)
Read a version by timestamp
from datetime import datetime
# Get state of users table on January 15, 2026 at 14:00 UTC
timestamp = datetime(2026, 1, 15, 14, 0, 0)
users_jan15 = client.read_at_timestamp("users", timestamp)
# Perfect for regulatory audits (MiFID II, FINMA compliance)
Use Case:
A user disputes an API quota count? Use time travel to check the exact state of their api_keys table at the moment of the disputed request.
GDPR Compliance Procedures
Full compliance with the GDPR (General Data Protection Regulation), with data export, deletion, and portability.
1. Right of access - Export all your data
# Export all user data (users, sessions, api_keys, audit_log)
user_data = client.export_user_data(user_id="user-123")
# Returns JSON with:
# {
#   "user": {...},
#   "sessions": [{...}, {...}],
#   "api_keys": {...},
#   "audit_log": [{...}, ...]
# }
# Save to file for user download (CSV or JSON)
import json
user_id = "user-123"
with open(f"user_data_{user_id}.json", "w") as f:
    json.dump(user_data, f, indent=2)
2. Right to erasure - Delete your data
# Soft delete: mark user as deleted (tombstone)
client.delete_user_soft(user_id="user-123")
# → User marked as deleted, data still recoverable for 30 days
# Hard delete: permanent removal with VACUUM
client.delete_user_permanent(user_id="user-123")
# → Deletes from users, sessions, api_keys, audit_log
# → Runs VACUUM to permanently erase Parquet files
3. Portability - Transfer to another service
# Export in standard formats (JSON, CSV)
export_data = client.export_user_data(user_id="user-123", format="csv")
# User can import into any service that accepts CSV/JSON
# Includes: API keys, subscription tier, billing history
Legal compliance:
These procedures comply with GDPR Articles 15 (right of access), 17 (right to erasure), and 20 (portability).
All operations are logged in audit_log/ for traceability.
Audit Logs & Traceability
All user operations are automatically recorded in the audit_log/ table, partitioned by date. Retention: 90 days (configurable).
View a user's history
# Get all actions for a user
logs = client.get_audit_log(
    user_id="user-123",
    start_date="2026-01-01",
    end_date="2026-02-01"
)
# Example output:
# [
#   {"timestamp": "2026-01-15T10:23:45Z", "action": "login", "ip": "203.0.113.42"},
#   {"timestamp": "2026-01-15T10:24:12Z", "action": "update_api_keys", "provider": "finnhub"},
#   {"timestamp": "2026-01-15T14:56:30Z", "action": "query_api", "provider": "finnhub", "queries_remaining": 3354}
# ]
Filter by action type
# Get only login attempts (success + failures)
login_logs = client.get_audit_log(
    user_id="user-123",
    action_type="login"
)
# Detect suspicious activity (e.g., multiple failed logins)
failed_logins = [log for log in login_logs if log.get("success") is False]
if len(failed_logins) > 5:
    print("Warning: Multiple failed login attempts detected")
Logged Events:
- register: User account creation
- login/logout: Authentication events
- update_api_keys: API key modifications
- query_api: API usage (provider, quota remaining)
- upgrade_tier: Subscription changes
- export_data/delete_data: GDPR actions
Complete Documentation
For exhaustive technical documentation of the Lakehouse module (installation, configuration, API reference, best practices), see ReadTheDocs.
Technologies:
Delta Lake (ACID transactions) •
PyArrow (zero-copy) •
Polars (lazy evaluation) •
Argon2 (password hashing)
Resources & Documentation
Interactive Notebooks
Unlock interactive Jupyter notebooks with an Educational or Professional account.
Try it live by subscribing to the Hobbyist tier to access the interactive Streamlit platform.
Available notebooks:
• Regime Detection (HMM)
• Kalman Filter Market Making
• Mean Field Games Portfolio
• Differential Evolution
• Sparse Mean Reversion
Theoretical Documentation
Hidden Markov Models theory with Forward-Backward and Viterbi algorithms
Chiarella Agent-Based Model
Multi-agent market microstructure with heterogeneous traders (Fundamentalists, Chartists, Noise)
Adaptive Portfolio Strategies
Risk Parity, Sparse L1 optimization, and CARA utility maximization
Try it Live
Streamlit Interactive Platform
Test algorithms on real market data with an interactive interface
View Pricing & Subscription Tiers
From Hobbyist (€9/month) to Professional (custom pricing)
Free Beta Access
Try the platform with limited features. No credit card required.
Launch Beta →