I don't think we can have it, since it isn't supported at the lower level (i.e., in cuDNN). From François Chollet, the creator of Keras:

"Recurrent dropout is not implemented in the cuDNN RNN ops, so at the cuDNN level we can't have it in Keras. The dropout option in the cuDNN API is not recurrent dropout (unlike in Keras), so it is essentially useless there: regular dropout doesn't work with RNNs, and actually using such dropout in a stacked RNN will wreck training."

As an alternative, you can use a kernel regularizer and a recurrent regularizer to prevent overfitting; I am using an L2 regularizer and getting good results.
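A minimal sketch of that workaround: an LSTM with L2 kernel and recurrent regularizers instead of `recurrent_dropout`. The layer sizes, input shape, and the `1e-4` regularization factor here are illustrative assumptions, not values from the answer above.

```python
import tensorflow as tf
from tensorflow.keras import layers, models, regularizers

# Build a small model whose LSTM uses L2 regularization on both the
# input kernel and the recurrent kernel, rather than recurrent_dropout.
# Keeping recurrent_dropout at its default of 0 leaves the layer
# eligible for the fast cuDNN kernel on GPU.
model = models.Sequential([
    layers.Input(shape=(50, 32)),  # (timesteps, features) - illustrative
    layers.LSTM(
        64,
        kernel_regularizer=regularizers.l2(1e-4),
        recurrent_regularizer=regularizers.l2(1e-4),
    ),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

Note that passing `recurrent_dropout > 0` to the layer would force Keras to fall back to a non-cuDNN implementation, which is exactly the limitation the quote describes.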