<p>The last answer no longer works.</p>
<p>Another approach is to return a function from <code>create_model</code>, since <code>KerasClassifier</code>'s <code>build_fn</code> expects a callable:</p>
<pre><code>from keras.models import Sequential
from keras.layers import Dense

def create_model(input_dim=None):
    def model():
        # create model
        nn = Sequential()
        nn.add(Dense(12, input_dim=input_dim, kernel_initializer='uniform', activation='relu'))
        nn.add(Dense(6, kernel_initializer='uniform', activation='relu'))
        nn.add(Dense(1, kernel_initializer='uniform', activation='sigmoid'))
        # Compile model
        nn.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
        return nn
    return model
</code></pre>
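<p>The closure pattern itself can be checked without Keras at all; a minimal sketch (the <code>make_builder</code> name and the dict stand-in for a compiled model are illustrative, not part of the Keras API):</p>

```python
def make_builder(input_dim=8):
    # The outer call bakes input_dim in; the returned function takes
    # no arguments, which is what a build_fn-style callback expects.
    def build():
        # stand-in for constructing and compiling a real network
        return {'layer_sizes': [input_dim, 12, 6, 1]}
    return build

build_fn = make_builder(input_dim=20)
model = build_fn()  # invoked with no arguments, as KerasClassifier does
```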
<p>Or, better, according to the <a href="https://keras.io/scikit-learn-api/#wrappers-for-the-scikit-learn-api" rel="nofollow noreferrer">documentation</a>:</p>
<blockquote>
<p>sk_params takes both model parameters and fitting parameters. Legal model parameters are the arguments of build_fn. Note that like all other estimators in scikit-learn, build_fn should provide <em>default values</em> for its arguments, so that you could create the estimator without passing any values to sk_params</p>
</blockquote>
<p>So you can define your function like this:</p>
<pre><code>def create_model(number_of_features=10):  # 10 is the *default value*
    # create model
    nn = Sequential()
    nn.add(Dense(12, input_dim=number_of_features, kernel_initializer='uniform', activation='relu'))
    nn.add(Dense(6, kernel_initializer='uniform', activation='relu'))
    nn.add(Dense(1, kernel_initializer='uniform', activation='sigmoid'))
    # Compile model
    nn.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
    return nn
</code></pre>
<p>And then create the wrapper:</p>
<pre><code>KerasClassifier(build_fn=create_model, number_of_features=20, epochs=25, batch_size=1000, ...)
</code></pre>