HPOBench is a library providing benchmarks for (multi-fidelity) hyperparameter optimization with a focus on reproducibility.

A list of benchmarks can be found in the [wiki](https://github.com/automl/HPOBench/wiki/Available-Containerized-Benchmarks), and a guide on how to contribute benchmarks is available [here](https://github.com/automl/HPOBench/wiki/How-to-add-a-new-benchmark-step-by-step).

## Installation

We recommend using a virtual environment. To install HPOBench, please run the following:

```
git clone https://github.com/automl/HPOBench.git
cd HPOBench
pip install .
```

**Note:** This does not install *singularity (version 3.5)*. Please follow the steps described here: [user-guide](https://sylabs.io/guides/3.5/user-guide/quick_start.html#quick-installation-steps).

## Containerized Benchmarks

We provide all benchmarks as containerized versions to (i) isolate their dependencies and (ii) keep them reproducible. Our containerized benchmarks do not rely on external dependencies and thus do not change over time. For this, we rely on [Singularity (version 3.5)](https://sylabs.io/guides/3.5/user-guide/) and, for now, upload all containers to a [gitlab registry](https://gitlab.tf.uni-freiburg.de/muelleph/hpobench-registry/container_registry).

The only other requirements are [ConfigSpace](https://github.com/automl/ConfigSpace), *scipy*, and *numpy*.
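
As a rough sketch of how such a container is used (the module path and `task_id` below are illustrative assumptions; the available containers are listed in the wiki linked above), containerized benchmarks are imported from `hpobench.container.benchmarks` instead of `hpobench.benchmarks` and then queried just like a locally installed benchmark (see the next section):

```python
# Sketch only: module path and task_id are illustrative assumptions --
# check the wiki list or the benchmark's docstring for the exact names.
from hpobench.container.benchmarks.ml.xgboost_benchmark import XGBoostBenchmark

benchmark = XGBoostBenchmark(task_id=167149)  # downloads/starts the container on first use
```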

### Run a Benchmark Locally

Each benchmark can also be run locally, but the dependencies must be installed manually and might conflict with other benchmarks. This can be arbitrarily complex, and further information can be found in the docstring of the benchmark.

A simple example is the XGBoost benchmark, which can be installed with `pip install .[xgboost]`:

```python
from hpobench.benchmarks.ml.xgboost_benchmark_old import XGBoostBenchmark
```
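
The example above shows only the import; a minimal continuation, assuming HPOBench's usual interface (`get_configuration_space`, `objective_function`) and illustrative `task_id` and fidelity values, could look like this:

```python
# Continuation sketch -- task_id and the fidelity values/names are placeholders;
# see the benchmark's docstring for the exact search and fidelity spaces.
benchmark = XGBoostBenchmark(task_id=167149)
config = benchmark.get_configuration_space(seed=1).sample_configuration()
result = benchmark.objective_function(configuration=config,
                                      fidelity={"n_estimators": 128, "dataset_fraction": 0.5},
                                      rng=1)
print(result["function_value"], result["cost"])  # benchmarks typically report these keys
```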

### Configure HPOBench

All of HPOBench's settings are stored in the `hpobenchrc` file. It is a YAML file, which is automatically generated on first use of HPOBench.

By default, it is placed in `$XDG_CONFIG_HOME` (or, if that is not set, in `'~/.config/hpobench'`). This file defines where to store containers, datasets, and much more. We highly recommend having a look at this file once it is created. Furthermore, please make sure you have write permission in these directories, or adapt them if necessary. For more information on where data is stored, please see the section on `HPOBench Data` below.

Furthermore, for running containers, we rely on Unix sockets, which by default are located in `$TEMP_DIR` (or, if that is not set, in `/tmp`).
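
To make these defaults concrete, here is a minimal sketch of how the locations described above resolve (the exact file name that HPOBench generates inside the config directory is not shown; inspect the directory after the first run):

```python
# Minimal sketch mirroring the defaults described above.
import os
import tempfile

config_dir = os.environ.get("XDG_CONFIG_HOME", os.path.expanduser("~/.config/hpobench"))
socket_dir = os.environ.get("TEMP_DIR", tempfile.gettempdir())  # usually /tmp

print("hpobenchrc location:", config_dir)
print("Unix sockets:", socket_dir)
```
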
### Remove all data, containers, and caches

Feel free to use `hpobench/util/clean_up_script.py` to remove all data, downloaded containers, and caches:

```bash
python ./hpobench/util/clean_up_script.py --help
```

If you like to delete only specific parts, e.g. a single container, you can find the benchmarks' data, containers, and caches in the following directories:

#### HPOBench Data
HPOBench stores downloaded containers and datasets at the following locations: