I am using Optuna for hyperparameter tuning. I get messages as shown below:

Trial 15 finished with value: 6.226334123011727 and parameters: {'iterations': 1100, 'learning_rate': 0.04262148853587423, 'depth': 6, 'l2_leaf_reg': 6.63997127673657, 'border_count': 46, 'bagging_temperature': 4.932254276656362, 'random_strength': 3.499938575269665}. 

So, my question is: are the printed parameters exactly the ones Optuna sends to all k folds, or can values from one or more folds (e.g. an n_estimators chosen by early stopping) later update the printed data?

Thanks in advance for any help.

  • Would you add a Minimal Reproducible Example of your code showing how you organize the work with folds during an Optuna trial? Usually, a trial deals with an aggregated metric after fitting all k folds with the current set of hyperparameters, which is fixed across all k folds within that trial. Commented Sep 25 at 13:22
  • I can't add an example. I posted here to find out what happens, since ChatGPT suggested something like this might occur, and I wanted to know what actually happens in such a case. Commented Oct 4 at 11:47
