<div align="center"><img src="https://raw.githubusercontent.com/optuna/optuna/master/docs/image/optuna-logo.png" width="800"/></div>

# Optuna: A hyperparameter optimization framework

[![Python](https://img.shields.io/badge/python-3.6%20%7C%203.7%20%7C%203.8%20%7C%203.9-blue)](https://www.python.org)
[![pypi](https://img.shields.io/pypi/v/optuna.svg)](https://pypi.python.org/pypi/optuna)
[![conda](https://img.shields.io/conda/vn/conda-forge/optuna.svg)](https://anaconda.org/conda-forge/optuna)
[![GitHub license](https://img.shields.io/badge/license-MIT-blue.svg)](https://github.com/optuna/optuna)
[![CircleCI](https://circleci.com/gh/optuna/optuna.svg?style=svg)](https://circleci.com/gh/optuna/optuna)
[![Read the Docs](https://readthedocs.org/projects/optuna/badge/?version=stable)](https://optuna.readthedocs.io/en/stable/)
[![Codecov](https://codecov.io/gh/optuna/optuna/branch/master/graph/badge.svg)](https://codecov.io/gh/optuna/optuna/branch/master)
[![Gitter chat](https://badges.gitter.im/optuna/gitter.svg)](https://gitter.im/optuna/optuna)

[**Website**](https://optuna.org/)
| [**Docs**](https://optuna.readthedocs.io/en/stable/)
| [**Install Guide**](https://optuna.readthedocs.io/en/stable/installation.html)
| [**Tutorial**](https://optuna.readthedocs.io/en/stable/tutorial/index.html)

*Optuna* is an automatic hyperparameter optimization software framework, particularly designed
for machine learning. It features an imperative, *define-by-run* style user API. Thanks to the
*define-by-run* API, code written with Optuna enjoys high modularity, and users can
dynamically construct the search spaces for the hyperparameters.

## News

Help us create the next version of Optuna!
Please take a few minutes to fill in this survey, and let us know how you use Optuna now and what improvements you'd like.

All questions are optional: https://forms.gle/mCAttqxVg5oUifKV8

## Key Features

Optuna has modern functionalities as follows:

- [Lightweight, versatile, and platform agnostic architecture](https://optuna.readthedocs.io/en/stable/tutorial/10_key_features/001_first.html)
  - Handle a wide variety of tasks with a simple installation that has few requirements.
- [Pythonic search spaces](https://optuna.readthedocs.io/en/stable/tutorial/10_key_features/002_configurations.html)
  - Define search spaces using familiar Python syntax including conditionals and loops.
- [Efficient optimization algorithms](https://optuna.readthedocs.io/en/stable/tutorial/10_key_features/003_efficient_optimization_algorithms.html)
  - Adopt state-of-the-art algorithms for sampling hyperparameters and efficiently pruning unpromising trials (see the first sketch after this list).
- [Easy parallelization](https://optuna.readthedocs.io/en/stable/tutorial/10_key_features/004_distributed.html)
  - Scale studies to tens or hundreds of workers with little or no changes to the code (see the second sketch after this list).
- [Quick visualization](https://optuna.readthedocs.io/en/stable/tutorial/10_key_features/005_visualization.html)
  - Inspect optimization histories from a variety of plotting functions.

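The following two sketches are hedged illustrations of the pruning and parallelization features linked above; the datasets, models, and parameter ranges are illustrative choices rather than prescribed defaults. The first reports intermediate scores of an iterative scikit-learn classifier so that a pruner can stop unpromising trials early:

```python
import numpy as np
import optuna
import sklearn.datasets
import sklearn.linear_model
import sklearn.model_selection


def objective(trial):
    X, y = sklearn.datasets.load_digits(return_X_y=True)
    X_train, X_val, y_train, y_val = sklearn.model_selection.train_test_split(X, y, random_state=0)

    alpha = trial.suggest_float("alpha", 1e-5, 1e-1, log=True)
    clf = sklearn.linear_model.SGDClassifier(alpha=alpha, random_state=0)

    for step in range(100):
        clf.partial_fit(X_train, y_train, classes=np.unique(y))

        # Report an intermediate value; the pruner uses it to judge the trial mid-training.
        accuracy = clf.score(X_val, y_val)
        trial.report(accuracy, step)
        if trial.should_prune():
            raise optuna.TrialPruned()

    return accuracy


study = optuna.create_study(direction="maximize", pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=20)
```

The second sketch shows parallelization through a shared storage: each worker process runs the same script and attaches to the same study, coordinating through the storage URL (`sqlite:///example.db` is a placeholder; a server-based database such as MySQL or PostgreSQL is preferable when many workers run concurrently):

```python
import optuna


def objective(trial):
    x = trial.suggest_float("x", -10, 10)
    return (x - 2) ** 2


# Workers share trial history through the study's storage and sample accordingly.
study = optuna.create_study(
    study_name="distributed-example",
    storage="sqlite:///example.db",
    load_if_exists=True,
)
study.optimize(objective, n_trials=100)
```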

## Basic Concepts

We use the terms *study* and *trial* as follows:

- Study: optimization based on an objective function
- Trial: a single execution of the objective function

Please refer to the sample code below. The goal of a *study* is to find out the optimal set of
hyperparameter values (e.g., `classifier` and `svm_c`) through multiple *trials* (e.g.,
`n_trials=100`). Optuna is a framework designed for automating and accelerating such
optimization *studies*.

[![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](http://colab.research.google.com/github/optuna/optuna-examples/blob/main/quickstart.ipynb)

```python
import optuna
import sklearn.datasets
import sklearn.ensemble
import sklearn.metrics
import sklearn.model_selection
import sklearn.svm


# Define an objective function to be minimized.
def objective(trial):

    # Invoke suggest methods of a Trial object to generate hyperparameters.
    regressor_name = trial.suggest_categorical('classifier', ['SVR', 'RandomForest'])
    if regressor_name == 'SVR':
        svr_c = trial.suggest_float('svr_c', 1e-10, 1e10, log=True)
        regressor_obj = sklearn.svm.SVR(C=svr_c)
    else:
        rf_max_depth = trial.suggest_int('rf_max_depth', 2, 32)
        regressor_obj = sklearn.ensemble.RandomForestRegressor(max_depth=rf_max_depth)

    # Note: load_boston was removed in scikit-learn 1.2; load_diabetes is a drop-in alternative.
    X, y = sklearn.datasets.load_boston(return_X_y=True)
    X_train, X_val, y_train, y_val = sklearn.model_selection.train_test_split(X, y, random_state=0)

    regressor_obj.fit(X_train, y_train)
    y_pred = regressor_obj.predict(X_val)

    error = sklearn.metrics.mean_squared_error(y_val, y_pred)

    return error  # An objective value linked with the Trial object.


study = optuna.create_study()  # Create a new study.
study.optimize(objective, n_trials=100)  # Invoke optimization of the objective function.
```

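After the optimization finishes, the best trial can be inspected directly on the study object, and the built-in plotting functions produce interactive figures of the optimization history. A brief usage sketch (the values in the comments are placeholders, not actual results; plotting requires Plotly to be installed):

```python
print(study.best_params)  # e.g. {'classifier': 'RandomForest', 'rf_max_depth': ...}
print(study.best_value)   # Lowest mean squared error observed across trials.

# Interactive figure of objective values over trials.
fig = optuna.visualization.plot_optimization_history(study)
fig.show()
```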
## Examples

Examples can be found in [optuna/optuna-examples](https://github.com/optuna/optuna-examples).

## Integrations

[Integration modules](https://optuna.readthedocs.io/en/stable/tutorial/10_key_features/003_efficient_optimization_algorithms.html#integration-modules-for-pruning), which allow pruning, or early stopping, of unpromising trials, are available for the following libraries (a hedged XGBoost sketch follows the list):

* [AllenNLP](https://github.com/optuna/optuna-examples/tree/main/allennlp)
* [Catalyst](https://github.com/optuna/optuna-examples/tree/main/pytorch/catalyst_simple.py)
* [CatBoost](https://github.com/optuna/optuna-examples/tree/main/catboost/catboost_simple.py)
* [Chainer](https://github.com/optuna/optuna-examples/tree/main/chainer/chainer_integration.py)
* FastAI ([V1](https://github.com/optuna/optuna-examples/tree/main/fastai/fastaiv1_simple.py), [V2](https://github.com/optuna/optuna-examples/tree/main/fastai/fastaiv2_simple.py))
* [Keras](https://github.com/optuna/optuna-examples/tree/main/keras/keras_integration.py)
* [LightGBM](https://github.com/optuna/optuna-examples/tree/main/lightgbm/lightgbm_integration.py)
* [MXNet](https://github.com/optuna/optuna-examples/tree/main/mxnet/mxnet_integration.py)
* [PyTorch](https://github.com/optuna/optuna-examples/tree/main/pytorch/pytorch_simple.py)
* [PyTorch Ignite](https://github.com/optuna/optuna-examples/tree/main/pytorch/pytorch_ignite_simple.py)
* [PyTorch Lightning](https://github.com/optuna/optuna-examples/tree/main/pytorch/pytorch_lightning_simple.py)
* [TensorFlow](https://github.com/optuna/optuna-examples/tree/main/tensorflow/tensorflow_estimator_integration.py)
* [tf.keras](https://github.com/optuna/optuna-examples/tree/main/tfkeras/tfkeras_integration.py)
* [XGBoost](https://github.com/optuna/optuna-examples/tree/main/xgboost/xgboost_integration.py)
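
As a hedged illustration of one of these integrations (assuming XGBoost is installed; the dataset, metric, and parameter ranges below are illustrative choices), a pruning callback reports each boosting round's validation metric to Optuna so that unpromising trials can be stopped early:

```python
import optuna
import sklearn.datasets
import sklearn.metrics
import sklearn.model_selection
import xgboost as xgb


def objective(trial):
    X, y = sklearn.datasets.load_breast_cancer(return_X_y=True)
    X_train, X_val, y_train, y_val = sklearn.model_selection.train_test_split(X, y, random_state=0)
    dtrain = xgb.DMatrix(X_train, label=y_train)
    dval = xgb.DMatrix(X_val, label=y_val)

    params = {
        "objective": "binary:logistic",
        "eval_metric": "auc",
        "max_depth": trial.suggest_int("max_depth", 2, 10),
        "eta": trial.suggest_float("eta", 1e-3, 1.0, log=True),
    }

    # The callback watches the "validation-auc" entry of the evaluation log and prunes the
    # trial via optuna.TrialPruned when the intermediate value looks unpromising.
    pruning_callback = optuna.integration.XGBoostPruningCallback(trial, "validation-auc")
    bst = xgb.train(params, dtrain, evals=[(dval, "validation")], callbacks=[pruning_callback])

    preds = bst.predict(dval)
    return sklearn.metrics.roc_auc_score(y_val, preds)


study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=20)
```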

## Web Dashboard (experimental)

The new Web dashboard is under development at [optuna-dashboard](https://github.com/optuna/optuna-dashboard).
It is still experimental, but already much better in many regards.
Feature requests and bug reports are welcome!

| Manage studies | Visualize with interactive graphs |
| -------------- | --------------------------------- |
| ![manage-studies](https://user-images.githubusercontent.com/5564044/97099702-4107be80-16cf-11eb-9d97-f5ceec98ce52.gif) | ![optuna-realtime-graph](https://user-images.githubusercontent.com/5564044/97099797-66e19300-16d0-11eb-826c-6977e3941fb0.gif) |

Install `optuna-dashboard` via pip:

```
$ pip install optuna-dashboard
$ optuna-dashboard sqlite:///db.sqlite3
...
Listening on http://localhost:8080/
Hit Ctrl-C to quit.
```

## Installation

Optuna is available at [the Python Package Index](https://pypi.org/project/optuna/) and on [Anaconda Cloud](https://anaconda.org/conda-forge/optuna).

```bash
# PyPI
$ pip install optuna
```

```bash
# Anaconda Cloud
$ conda install -c conda-forge optuna
```

Optuna supports Python 3.6 or newer.

We also provide Optuna Docker images on [DockerHub](https://hub.docker.com/r/optuna/optuna).

## Communication

- [GitHub Issues] for bug reports, feature requests and questions.
- [Gitter] for interactive chat with developers.
- [Stack Overflow] for questions.

[GitHub issues]: https://github.com/optuna/optuna/issues
[Gitter]: https://gitter.im/optuna/optuna
[Stack Overflow]: https://stackoverflow.com/questions/tagged/optuna

## Contribution

Any contributions to Optuna are more than welcome!

If you are new to Optuna, please check the [good first issues](https://github.com/optuna/optuna/labels/good%20first%20issue). They are relatively simple and well-defined, and they are good starting points for getting familiar with the contribution workflow and with other developers.

If you have already contributed to Optuna, we recommend the other [contribution-welcome issues](https://github.com/optuna/optuna/labels/contribution-welcome).

For general guidelines on how to contribute to the project, take a look at [CONTRIBUTING.md](./CONTRIBUTING.md).

## Reference

Takuya Akiba, Shotaro Sano, Toshihiko Yanase, Takeru Ohta, and Masanori Koyama. 2019.
Optuna: A Next-generation Hyperparameter Optimization Framework. In KDD ([arXiv](https://arxiv.org/abs/1907.10902)).