Now in beta: Managed hyperparameter optimization

Tune parameters. Not infrastructure.

Hyperparameter optimization (HPO) automatically finds the best settings for your trading strategy, ML model, or simulation — testing hundreds of combinations so you don't have to. Bring your Docker container. We handle the rest.

Experiment #38 — atr-grid-v2 · Completed · 48 trials

#  params                   sharpe  max dd
1  lookback=50, atr=1.8     2.31    0.08
2  lookback=30, atr=2.0     1.94    0.11
3  lookback=20, atr=1.5     1.85    0.13
4  lookback=100, atr=1.2    1.62    0.09

Best config: lookback=50, atr=1.8 · sharpe 2.31 · max dd 0.08

10 free beta spots

Be among the first to try managed hyperparameter optimization. No credit card, no commitment — just sign up and start optimizing.

Apply for early access

Why HyperOptimizer?

Focus on strategy. We handle the search.

Smarter than brute force

Bayesian optimization learns from each trial result, focusing the next candidates on regions where performance peaks. Better configurations in fewer runs — not just random guessing.
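To make the contrast with random guessing concrete, here is a toy illustration only (not the platform's actual optimizer): pure random search versus a naive adaptive search that concentrates trials near the best result seen so far. The objective function and all numbers are made up for the sketch.

```python
# Toy comparison: random search vs. a naive "focus near the best so far"
# search on a 1-D objective with a peak at x = 0.7. Illustrative only.
import random

def objective(x):
    # Hypothetical score surface; higher is better, peak at x = 0.7.
    return -(x - 0.7) ** 2

random.seed(0)

# Pure random search: every trial ignores all earlier results.
random_best = max(objective(random.uniform(0, 1)) for _ in range(30))

# Adaptive search: sample around the best point seen so far,
# shrinking the search radius as trials accumulate.
best_x = random.uniform(0, 1)
best_y = objective(best_x)
for trial in range(1, 30):
    radius = 0.5 / (1 + trial * 0.3)  # narrow the region over time
    x = min(1.0, max(0.0, best_x + random.uniform(-radius, radius)))
    y = objective(x)
    if y > best_y:
        best_x, best_y = x, y

print(f"random search best:   {random_best:.5f}")
print(f"adaptive search best: {best_y:.5f}")
```

Real Bayesian optimization replaces the shrinking-radius heuristic with a surrogate model of the objective, but the idea is the same: each trial's result steers where the next candidates land.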

As simple as print("Hello World!")

No clusters to provision, no job queues, no teardown scripts. Push a Docker image, configure parameter ranges in the dashboard, and we handle everything else.

Your strategy stays private

Containers run in full isolation. We only ever see the metric lines you explicitly print to stdout. Zero access to your source code, trading logic, or data.

Results you can read

Leaderboard ranked by any metric, convergence plots showing how the optimizer is progressing, and Pareto frontiers for multi-objective experiments. Visual clarity, not a CSV dump.

How it works

Two changes to your code. We do the rest.

No SDK, no client library, no lock-in. Dockerize once, and we run hundreds of trials for you.

1

Parse our args, run your code

We inject --hpo-* flags at runtime. Parse them with argparse or any CLI library — no dependency on any SDK of ours.

$ python backtest.py \
--hpo-lookback-window=50 \
--hpo-atr-multiplier=1.8
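Inside backtest.py, the injected flags can be parsed with nothing but stdlib argparse. A minimal sketch, using the flag names from the command above (the defaults here are illustrative):

```python
# backtest.py — parse the injected --hpo-* flags with stdlib argparse.
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--hpo-lookback-window", type=int, default=20)
parser.add_argument("--hpo-atr-multiplier", type=float, default=1.0)

# In production this would be parser.parse_args(); here we simulate
# the flags the runner would inject at trial time.
args = parser.parse_args(["--hpo-lookback-window=50",
                          "--hpo-atr-multiplier=1.8"])

# argparse converts dashes to underscores in attribute names.
lookback = args.hpo_lookback_window
atr_mult = args.hpo_atr_multiplier
print(f"Running backtest with lookback={lookback}, atr={atr_mult}")
```

Any CLI library works the same way; the only contract is that your entrypoint accepts the `--hpo-*` flags.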
2

Print metrics to stdout

After your backtest finishes, emit metrics with the hpo.metrics. prefix. We collect them automatically — no SDK required.

Backtesting 2020–2024...
Trades: 412 Win rate: 54.1%
hpo.metrics.sharpe=2.31
hpo.metrics.max_drawdown=0.08
hpo.metrics.cagr=0.34
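Emitting those lines from Python is a one-liner per metric. A minimal sketch, with placeholder values standing in for your real backtest results:

```python
# After the backtest finishes, print each metric with the
# hpo.metrics. prefix; the runner scrapes these lines from stdout.
results = {
    "sharpe": 2.31,        # placeholder values for the sketch
    "max_drawdown": 0.08,
    "cagr": 0.34,
}

for name, value in results.items():
    print(f"hpo.metrics.{name}={value}")
```

Everything else you print (progress logs, trade counts) is ignored, so the prefix is the only convention to remember.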
3

View results in the dashboard

5 trials run in parallel by default. The optimizer suggests the next parameter set as results come in — view the live leaderboard as your experiment runs.

#  params          sharpe  max dd
1  w=50 atr=1.8    2.31    0.08
2  w=30 atr=2.0    1.94    0.11
3  w=20 atr=1.5    1.85    0.13


Join the private beta

We're opening 10 free spots for teams who want to optimize smarter. Try the platform, share your feedback, and help shape the product. Want us to support your framework or stack? Let us know.