
Commit a974b11

Update README.md
1 parent 23e9365 commit a974b11


README.md

Lines changed: 38 additions & 45 deletions
@@ -2,6 +2,8 @@
 
 HPOBench is a library for providing benchmarks for (multi-fidelity) hyperparameter optimization with a focus on reproducibility.
 
+A list of benchmarks can be found in the [wiki](https://github.com/automl/HPOBench/wiki/Available-Containerized-Benchmarks) and a guide on how to contribute benchmarks is available [here](https://github.com/automl/HPOBench/wiki/How-to-add-a-new-benchmark-step-by-step).
+
 ## Status
 
 Status for Master Branch:
@@ -32,30 +34,11 @@ result_dict = b.objective_function(configuration=config, fidelity={"n_estimators
 result_dict = b.objective_function(configuration=config, rng=1)
 ```
 
-Containerized benchmarks do not rely on external dependencies and thus do not change. To do so, we rely on [Singularity (version 3.5)](https://sylabs.io/guides/3.5/user-guide/).
-**Note:** All containers are uploaded to a [gitlab registry](https://gitlab.tf.uni-freiburg.de/muelleph/hpobench-registry/container_registry)
-
-Further requirements are: [ConfigSpace](https://github.com/automl/ConfigSpace), *scipy* and *numpy*
-
-**Note:** Each benchmark can also be run locally, but the dependencies must be installed manually and might conflict with other benchmarks.
-This can be arbitrarily complex and further information can be found in the docstring of the benchmark.
-
-A simple example is the XGBoost benchmark which can be installed with `pip install .[xgboost]`
-
-```python
-from hpobench.benchmarks.ml.xgboost_benchmark_old import XGBoostBenchmark
-
-b = XGBoostBenchmark(task_id=167149)
-config = b.get_configuration_space(seed=1).sample_configuration()
-result_dict = b.objective_function(configuration=config,
-                                   fidelity={"n_estimators": 128, "dataset_fraction": 0.5}, rng=1)
-
-```
+For more examples see `/example/`.
 
 ## Installation
 
-Before we start, we recommend using a virtual environment. To run any benchmark using its singularity container,
-run the following:
+We recommend using a virtual environment. To install HPOBench, please run the following:
 ```
 git clone https://github.com/automl/HPOBench.git
 cd HPOBench
@@ -64,24 +47,31 @@ pip install .
 
 **Note:** This does not install *singularity (version 3.5)*. Please follow the steps described here: [user-guide](https://sylabs.io/guides/3.5/user-guide/quick_start.html#quick-installation-steps).
 
-## Further Notes
+## Containerized Benchmarks
+
+We provide all benchmarks as containerized versions to (i) isolate their dependencies and (ii) keep them reproducible. Our containerized benchmarks do not rely on external dependencies and thus do not change over time. For this, we rely on [Singularity (version 3.5)](https://sylabs.io/guides/3.5/user-guide/) and, for now, upload all containers to a [gitlab registry](https://gitlab.tf.uni-freiburg.de/muelleph/hpobench-registry/container_registry).
 
-### Configure the HPOBench
+The only other requirements are [ConfigSpace](https://github.com/automl/ConfigSpace), *scipy*, and *numpy*.
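A containerized benchmark is used in the same way as the local one shown in the "Run a Benchmark Locally" section. A minimal sketch, assuming the containerized XGBoost benchmark is exposed under `hpobench.container.benchmarks` (the exact module path is an assumption; see the wiki list of available containers):

```python
# Sketch only: the import path of the containerized benchmark is an assumption;
# the call signature mirrors the local XGBoost example in this README.
from hpobench.container.benchmarks.ml.xgboost_benchmark import XGBoostBenchmark

b = XGBoostBenchmark(task_id=167149)  # fetches the container from the registry on first use
config = b.get_configuration_space(seed=1).sample_configuration()
result_dict = b.objective_function(configuration=config,
                                   fidelity={"n_estimators": 128, "dataset_fraction": 0.5}, rng=1)
```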
 
-All of HPOBench's settings are stored in a file, the `hpobenchrc`-file.
-It is a yaml file, which is automatically generated at the first use of HPOBench.
-By default, it is placed in `$XDG_CONFIG_HOME`. If `$XDG_CONFIG_HOME` is not set, then the
-`hpobenchrc`-file is saved to `'~/.config/hpobench'`. When using the containerized benchmarks, the Unix socket is
-defined via `$TEMP_DIR`. This is by default `\tmp`. Make sure to have write permissions in those directories.
+### Run a Benchmark Locally
 
-In the `hpobenchrc`, you can specify for example the directory, in that the benchmark containers are
-downloaded. We encourage you to take a look into the `hpobenchrc`, to find out more about all
-possible settings.
+Each benchmark can also be run locally, but the dependencies must be installed manually and might conflict with other benchmarks. This can be arbitrarily complex, and further information can be found in the docstring of the benchmark.
+
+A simple example is the XGBoost benchmark, which can be installed with `pip install .[xgboost]`:
+
+```python
+from hpobench.benchmarks.ml.xgboost_benchmark_old import XGBoostBenchmark
+
+b = XGBoostBenchmark(task_id=167149)
+config = b.get_configuration_space(seed=1).sample_configuration()
+result_dict = b.objective_function(configuration=config,
+                                   fidelity={"n_estimators": 128, "dataset_fraction": 0.5}, rng=1)
 
+```
 
-### How to build a container locally
+### How to Build a Container Locally
 
-With singularity installed run the following to built the xgboost container
+With singularity installed, run the following to build, e.g., the xgboost container:
 
 ```bash
 cd hpobench/container/recipes/ml
@@ -98,18 +88,23 @@ config = b.get_configuration_space(seed=1).sample_configuration()
 result_dict = b.objective_function(config, fidelity={"n_estimators": 128, "dataset_fraction": 0.5})
 ```
 
+## Configure HPOBench
+
+All of HPOBench's settings are stored in a file, the `hpobenchrc`-file. It is a yaml file, which is automatically generated at the first use of HPOBench.
+By default, it is placed in `$XDG_CONFIG_HOME` (or, if that is not set, in `'~/.config/hpobench'`). This file defines where to store containers and datasets and much more. We highly recommend having a look at this file once it is created. Please also make sure you have write permissions in these directories, or adapt the paths if necessary. For more information on where data is stored, please see the section on `HPOBench Data` below.
+
+For running containers, we additionally rely on Unix sockets, which by default are located in `$TEMP_DIR` (or, if that is not set, in `/tmp`).
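As a quick illustration of these defaults, the following standalone sketch resolves both locations using only the standard library (the directory names come from the paragraph above, not from HPOBench's internals):

```python
import os
from pathlib import Path

# hpobenchrc location: $XDG_CONFIG_HOME, falling back to ~/.config/hpobench
config_dir = Path(os.environ.get("XDG_CONFIG_HOME", Path.home() / ".config")) / "hpobench"

# Unix socket location for containerized benchmarks: $TEMP_DIR, falling back to /tmp
socket_dir = Path(os.environ.get("TEMP_DIR", "/tmp"))

print(f"hpobenchrc directory: {config_dir}")
print(f"socket directory:     {socket_dir}")
```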
+
 ### Remove all data, containers, and caches
 
-Update: In version 0.0.8, we have added the script `hpobench/util/clean_up_script.py`. It allows to easily remove all
-data, downloaded containers, and caches. To get more information, you can use the following command.
+Feel free to use `hpobench/util/clean_up_script.py` to remove all data, downloaded containers, and caches:
 ```bash
 python ./hpobench/util/clean_up_script.py --help
 ```
 
-If you like to delete only specific parts, i.e. a single container,
-you can find the benchmark's data, container, and caches in the following directories:
+If you would like to delete only specific parts, e.g. a single container, you can find the benchmark's data, containers, and caches in the following directories:
 
-#### HPOBench data
+#### HPOBench Data
 HPOBench stores downloaded containers and datasets at the following locations:
 
 ```bash
@@ -120,20 +115,20 @@ $XDG_DATA_HOME # ~/.local/share/hpobench
 
 For crashes or when not properly shutting down containers, there might be socket files left under `/tmp/hpobench_socket`.
 
-#### OpenML data
+#### OpenML Data
 
 OpenML data additionally maintains its cache, which is located at `~/.openml/`
 
-#### Singularity container
+#### Singularity Containers
 
 Singularity additionally maintains its cache, which can be removed with `singularity cache clean`
 
-### Use HPOBench benchmarks in research projects
+### Use HPOBench Benchmarks in Research Projects
 
 If you use a benchmark in your experiments, please specify the version number of HPOBench as well as the version of
-the used container. When starting an experiment, HPOBench writes automatically the 2 version numbers to the log.
+the used container to ensure reproducibility. When starting an experiment, HPOBench automatically writes these two version numbers to the log.
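To look up the HPOBench side of this yourself, a one-line sketch (assuming the package exposes a `__version__` attribute, as most Python packages do):

```python
import hpobench

# Report the installed HPOBench version alongside the container version in your experiment logs.
print(hpobench.__version__)  # assumes hpobench defines __version__
```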
 
-### Troubleshooting
+### Troubleshooting and Further Notes
 
 - **Singularity throws an 'Invalid Image format' exception**
   Use a singularity version > 3. For users of the Meta-Cluster in Freiburg, you have to set the following path:
@@ -145,5 +140,3 @@ See whether in `~/.singularity/instances/sing/$HOSTNAME/*/` there is a file that
 
 **Note:** If you are looking for a different or older version of our benchmarking library, you might be looking for
 [HPOlib1.5](https://github.com/automl/HPOlib1.5)
-
-
0 commit comments