How do I tune the epochs and batch size of my model with cross-validation in Keras Tuner?


I want to tune my Keras model with Keras Tuner. I came across code snippets for tuning the batch size and number of epochs, and separate snippets for doing K-fold cross-validation, and I would like to do both at the same time.

Code for tuning batch size and epochs:

import kerastuner

class MyTuner(kerastuner.tuners.BayesianOptimization):
  def run_trial(self, trial, *args, **kwargs):
    # You can add additional HyperParameters for preprocessing and custom training loops
    # via overriding `run_trial`
    kwargs['batch_size'] = trial.hyperparameters.Int('batch_size', 32, 256, step=32)
    kwargs['epochs'] = trial.hyperparameters.Int('epochs', 10, 30)
    super(MyTuner, self).run_trial(trial, *args, **kwargs)

# Uses same arguments as the BayesianOptimization Tuner.
tuner = MyTuner(...)
# Don't pass epochs or batch_size here, let the Tuner tune them.
tuner.search(...) 

Cross-validation code:

import kerastuner
import numpy as np
from sklearn import model_selection

class CVTuner(kerastuner.engine.tuner.Tuner):
  def run_trial(self, trial, x, y, batch_size=32, epochs=1):
    cv = model_selection.KFold(5)
    val_losses = []
    for train_indices, test_indices in cv.split(x):
      x_train, x_test = x[train_indices], x[test_indices]
      y_train, y_test = y[train_indices], y[test_indices]
      model = self.hypermodel.build(trial.hyperparameters)
      model.fit(x_train, y_train, batch_size=batch_size, epochs=epochs)
      val_losses.append(model.evaluate(x_test, y_test))
    self.oracle.update_trial(trial.trial_id, {'val_loss': np.mean(val_losses)})
    self.save_model(trial.trial_id, model)

tuner = CVTuner(
  hypermodel=my_build_model,
  oracle=kerastuner.oracles.BayesianOptimization(
    objective='val_loss',
    max_trials=40))

x, y = ...  # NumPy data
tuner.search(x, y, batch_size=64, epochs=30)

How do I change run_trial so that both approaches are carried out together?
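For reference, here is a minimal sketch of what a merged run_trial could look like, assuming the same (older) kerastuner API used in the snippets above. The batch_size and epochs are registered as hyperparameters on the trial (as in the first snippet) and each trial is scored by the mean K-fold validation loss (as in the second snippet). The class name CVBatchEpochTuner is just illustrative; my_build_model is your existing model-building function.

import kerastuner
import numpy as np
from sklearn import model_selection

class CVBatchEpochTuner(kerastuner.engine.tuner.Tuner):
  def run_trial(self, trial, x, y):
    # Let the oracle choose batch_size and epochs for this trial.
    hp = trial.hyperparameters
    batch_size = hp.Int('batch_size', 32, 256, step=32)
    epochs = hp.Int('epochs', 10, 30)

    cv = model_selection.KFold(5)
    val_losses = []
    for train_indices, test_indices in cv.split(x):
      x_train, x_test = x[train_indices], x[test_indices]
      y_train, y_test = y[train_indices], y[test_indices]
      # Build a fresh model for every fold so the folds don't share weights.
      model = self.hypermodel.build(hp)
      model.fit(x_train, y_train, batch_size=batch_size, epochs=epochs)
      val_losses.append(model.evaluate(x_test, y_test))
    # Report the mean fold loss to the oracle and keep the last fold's model.
    self.oracle.update_trial(trial.trial_id, {'val_loss': np.mean(val_losses)})
    self.save_model(trial.trial_id, model)

tuner = CVBatchEpochTuner(
  hypermodel=my_build_model,
  oracle=kerastuner.oracles.BayesianOptimization(
    objective='val_loss',
    max_trials=40))

# Don't pass batch_size or epochs here; the tuner picks them per trial.
x, y = ...  # NumPy data
tuner.search(x, y)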
