High Per Parameter

Code accompanying the paper: M. Sipper, "High Per Parameter: A Large-Scale Study of Hyperparameter Tuning for Machine Learning Algorithms", Algorithms 2022, 15, 315.

  • main.py: main module (Algorithm 1 in the paper)
  • datasets.py: handles the datasets used in the study
  • tune.py: hyperparameter tuning with Optuna
  • score.py: computes performance metrics
  • hp.py: defines hyperparameter ranges/sets per algorithm
  • stats.py: computes statistics and hp_score
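
The repository uses Optuna for tuning (tune.py); as a rough stdlib-only illustration of what such a tuning loop does, the sketch below substitutes plain random search over a hypothetical hyperparameter space — the space, the score function, and all names here are illustrative, not taken from the repo:

```python
import random

# Hypothetical hyperparameter space, in the spirit of per-algorithm
# ranges/sets (hp.py). These names and values are illustrative only.
SPACE = {
    "n_estimators": list(range(10, 201, 10)),
    "max_depth": [2, 4, 8, 16, None],
    "learning_rate": [0.01, 0.05, 0.1, 0.3],
}

def sample_config(space, rng):
    """Draw one random configuration from the space."""
    return {name: rng.choice(values) for name, values in space.items()}

def random_search(score_fn, space, n_trials=50, seed=0):
    """Return the best-scoring configuration found (higher is better)."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(n_trials):
        cfg = sample_config(space, rng)
        s = score_fn(cfg)
        if s > best_score:
            best_cfg, best_score = cfg, s
    return best_cfg, best_score

# Toy score function standing in for a real cross-validated model metric:
# peaks at learning_rate == 0.1, penalizes any max_depth other than 8.
def toy_score(cfg):
    return -abs(cfg["learning_rate"] - 0.1) - (0 if cfg["max_depth"] == 8 else 0.5)

best, score = random_search(toy_score, SPACE, n_trials=100)
```

In the paper's setting, the score function would instead fit the given ML algorithm with the sampled hyperparameters and return a cross-validated metric, and Optuna's samplers would replace the uniform random draws.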

Citation

Citations are always appreciated 😊:

@Article{Sipper2022Hyper,
  AUTHOR = {Sipper, Moshe},
  TITLE = {High Per Parameter: A Large-Scale Study of Hyperparameter Tuning for Machine Learning Algorithms},
  JOURNAL = {Algorithms},
  VOLUME = {15},
  YEAR = {2022},
  NUMBER = {9},
  ARTICLE-NUMBER = {315},
}
