<p>The first layer uses a sigmoid activation, the second layer is also sigmoid, and the final layer produces a single output.</p>
<p>Change the activations to <code>relu</code>, because <code>sigmoid</code> squashes values into the range 0 to 1. This makes the numbers very small and leads to a vanishing-gradient problem across the two hidden layers.</p>
<pre><code>from keras.models import Sequential
from keras.layers import Dense

def base_model():
    model = Sequential()
    # Two hidden layers with relu activations to avoid the vanishing-gradient issue
    model.add(Dense(11, input_dim=11, kernel_initializer='normal', activation='relu'))
    model.add(Dense(7, kernel_initializer='normal', activation='relu'))
    # Single linear output unit for regression
    model.add(Dense(1, kernel_initializer='normal'))
    model.compile(loss='mean_squared_error', optimizer='adam')
    return model
</code></pre>
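<p>A minimal usage sketch of the fixed model; the array shapes, epoch count, and batch size below are illustrative assumptions, not values from the original question:</p>
<pre><code>import numpy as np

# Placeholder data: 11 input features per sample, one continuous target.
X = np.random.rand(500, 11)
y = np.random.rand(500)

model = base_model()
model.fit(X, y, epochs=100, batch_size=32, validation_split=0.2, verbose=0)
predictions = model.predict(X)
</code></pre>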