
How to tune the number of epochs and batch_size in Keras-tuner?

This can be done by subclassing the Tuner class you are using and overriding run_trial. (Note that Hyperband sets the number of epochs to train for via its own logic, so if you're using Hyperband you shouldn't tune the epochs.) Here's an example with kerastuner.tuners.BayesianOptimization:

import kerastuner

class MyTuner(kerastuner.tuners.BayesianOptimization):
    def run_trial(self, trial, *args, **kwargs):
        # You can add additional HyperParameters for preprocessing and custom
        # training loops by overriding `run_trial`.
        kwargs['batch_size'] = trial.hyperparameters.Int('batch_size', 32, 256, step=32)
        kwargs['epochs'] = trial.hyperparameters.Int('epochs', 10, 30)
        super(MyTuner, self).run_trial(trial, *args, **kwargs)

# Takes the same arguments as the BayesianOptimization Tuner.
tuner = MyTuner(...)
# Don't pass epochs or batch_size here; let the Tuner tune them.
tuner.search(...)
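
To make the snippet concrete, here is a minimal sketch of how the subclassed tuner might be wired up. The build_model function, the data names (x_train, y_train, x_val, y_val), and the constructor arguments (objective, max_trials, directory, project_name) are illustrative assumptions, not part of the original answer:

import kerastuner
import tensorflow as tf

def build_model(hp):
    # Hypothetical model-building function; replace with your own architecture.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(hp.Int('units', 32, 128, step=32), activation='relu'),
        tf.keras.layers.Dense(1)
    ])
    model.compile(optimizer='adam', loss='mse')
    return model

tuner = MyTuner(
    build_model,
    objective='val_loss',
    max_trials=20,
    directory='tuning_dir',          # assumed output directory
    project_name='batch_and_epochs') # assumed project name

# batch_size and epochs are injected inside run_trial, so they are not passed here.
tuner.search(x_train, y_train, validation_data=(x_val, y_val))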

For epochs specifically, I'd alternatively recommend using early stopping during training by passing the tf.keras.callbacks.EarlyStopping callback, if it fits your use case. It can be configured to stop training as soon as the validation loss stops improving (note that monitoring val_loss requires validation data, e.g. via validation_data or validation_split). You can pass Keras callbacks like this to search:

# Will stop training if the "val_loss" hasn't improved in 3 epochs.
tuner.search(x, y, epochs=30, callbacks=[tf.keras.callbacks.EarlyStopping('val_loss', patience=3)])
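
If you tune batch_size and epochs through the subclassed tuner above, one way to read the winning values back once the search finishes is from the best trial's hyperparameters (a small sketch using the tuner's get_best_hyperparameters method):

# Read back the best values found during the search.
best_hp = tuner.get_best_hyperparameters(num_trials=1)[0]
print('best batch_size:', best_hp.get('batch_size'))
print('best epochs:', best_hp.get('epochs'))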

Source: https://github.com/keras-team/keras-tuner/issues/122