Product

Hyperparameter optimization, managed for you

Define your search space and objective. We run trials, track metrics, and surface the best configuration in a clean dashboard. No infra needed.

Benefits

Focus on strategy. We handle the search.

Stop manually tuning hyperparameters. Let Bayesian optimization find the best configuration while you focus on what matters: your model, strategy, or simulation.

Bayesian optimization

Each trial result informs the next. The optimizer focuses on promising regions of your parameter space, finding better configurations in fewer runs than grid or random search.
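To make the idea concrete, here is a toy sketch of sequential, history-guided search. This is purely illustrative and is not HyperOptimizer's actual algorithm: it favors the neighborhood of the best result seen so far while still exploring at random, which is the intuition behind "each trial result informs the next."

```python
import random

def toy_optimizer(objective, low, high, n_trials=30, seed=0):
    """Toy sequential search: exploit near the best observed point,
    explore uniformly otherwise. Illustrative only."""
    rng = random.Random(seed)
    history = []  # (parameter, score) pairs from completed trials
    for _ in range(n_trials):
        if history and rng.random() < 0.7:
            # Exploit: sample near the best configuration seen so far.
            best_x, _ = min(history, key=lambda t: t[1])
            x = min(high, max(low, rng.gauss(best_x, (high - low) * 0.1)))
        else:
            # Explore: uniform random sample over the whole range.
            x = rng.uniform(low, high)
        history.append((x, objective(x)))
    return min(history, key=lambda t: t[1])

# Example: minimize (x - 3)^2 over [0, 10].
best_x, best_y = toy_optimizer(lambda x: (x - 3) ** 2, 0.0, 10.0)
```

A real Bayesian optimizer replaces the crude "sample near the best" heuristic with a probabilistic surrogate model of the objective, but the payoff is the same: fewer wasted trials than grid or random search.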

Zero infrastructure

No clusters to provision, no job queues, no teardown scripts. Push a Docker image, configure parameter ranges in the dashboard, and we handle everything else.

Full privacy & isolation

Containers run in complete isolation. We only see the metric lines you explicitly print to stdout. Zero access to your source code, trading logic, or data.

Visual results dashboard

Leaderboard ranked by any metric, convergence plots showing optimizer progress, and Pareto frontiers for multi-objective experiments. Visual clarity, not a CSV dump.

Parallel trial execution

Multiple trials run simultaneously across our infrastructure (5 by default). Experiments finish faster while the optimizer uses completed results to guide the search.

Flexible parameter spaces

Integer, float, categorical: define any combination of hyperparameters and their ranges. The optimizer handles the sampling and search strategy automatically.
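As a hypothetical illustration of such a mixed space (the names, keys, and structure below are made up for this sketch; you define the actual ranges in the dashboard):

```python
# Hypothetical representation of a mixed parameter space:
# one float range, one integer range, one categorical choice.
search_space = {
    "learning_rate": {"type": "float", "low": 1e-4, "high": 1e-1},
    "num_layers":    {"type": "int", "low": 1, "high": 8},
    "optimizer":     {"type": "categorical",
                      "choices": ["adam", "sgd", "rmsprop"]},
}
```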

How it works

Three steps. No SDK.

1

Build a Docker image

Package your code in a Docker container. Any language, any framework: if it runs in Docker, it works with us.

2

Parse args & print metrics

Read --hpo-* flags at runtime and emit metrics with the hpo.metrics. prefix.
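A minimal Python sketch of this step, using argparse. The specific flag names (--hpo-learning-rate, --hpo-batch-size) and the metric name are hypothetical examples; only the --hpo-* flag convention and the hpo.metrics. stdout prefix come from the steps above.

```python
import argparse

# Step 2 sketch: read the --hpo-* flags the optimizer passes in,
# train with them, then print metrics with the hpo.metrics. prefix.
parser = argparse.ArgumentParser()
parser.add_argument("--hpo-learning-rate", type=float, default=0.01)
parser.add_argument("--hpo-batch-size", type=int, default=32)

# Sample argv for illustration; in a real trial, the platform passes
# the suggested values on the command line and you call parse_args().
args, _ = parser.parse_known_args(
    ["--hpo-learning-rate", "0.001", "--hpo-batch-size", "64"]
)

# ... train using args.hpo_learning_rate and args.hpo_batch_size ...
loss = 0.42  # placeholder result standing in for your real metric

# Emit the metric line the platform reads from stdout.
print(f"hpo.metrics.loss={loss}")
```

Note that argparse converts --hpo-learning-rate to the attribute args.hpo_learning_rate automatically.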

3

View results in the dashboard

We run parallel trials, the optimizer suggests the next parameter set, and you see the live leaderboard as your experiment runs.

Strengths

Built for real workloads

No vendor lock-in, no proprietary SDK. Just Docker, stdout, and CLI args.

Docker-native

Any language, any framework. If it runs in Docker, it works with HyperOptimizer. No SDK, no client library, no lock-in.

No code changes needed

Parse CLI args with argparse (or any library) and print metrics to stdout. Two changes to your code. That's it.

Multi-objective optimization

Optimize for multiple metrics simultaneously. Pareto frontiers help you balance competing objectives like return vs. drawdown.
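To illustrate what a Pareto frontier is, here is a small self-contained sketch (not HyperOptimizer's implementation) that keeps only the non-dominated trials when maximizing return and minimizing drawdown:

```python
def pareto_frontier(trials):
    """trials: list of (return, drawdown) pairs.
    A trial is dominated if another trial has return at least as high
    AND drawdown at least as low, with at least one strictly better.
    The frontier is the set of non-dominated trials."""
    frontier = []
    for ret, dd in trials:
        dominated = any(
            (r2 >= ret and d2 <= dd) and (r2 > ret or d2 < dd)
            for r2, d2 in trials
        )
        if not dominated:
            frontier.append((ret, dd))
    return frontier

# Illustrative trial results as (return, drawdown):
trials = [(0.12, 0.05), (0.18, 0.09), (0.10, 0.04),
          (0.18, 0.12), (0.25, 0.20)]
frontier = pareto_frontier(trials)
# (0.18, 0.12) is dominated by (0.18, 0.09): same return, lower drawdown.
```

Every point on the frontier represents a different trade-off; no single one is best on both objectives at once.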

Fault-tolerant trials

Each trial runs independently. If one fails or times out, the others continue. Results from completed trials are always preserved.

Ready to optimize?

Join the private beta. 10 free spots, no credit card, no commitment. Want us to support your framework? Let us know.