Keras custom loss function (Elastic Net)

Posted on 2024-09-27 18:08:53


I am trying to implement an Elastic Net. The loss looks like this:

Loss = MSE + λ1 * Σ|w_i| + λ2 * Σ(w_i^2)   (the elastic net penalty)

I want to use this loss function with my Keras model:

from keras.layers import Input, BatchNormalization, Flatten, Dense
from keras.models import Model

def nn_weather_model():
    ip_weather = Input(shape=(30, 38, 5))
    x_weather = BatchNormalization(name='weather1')(ip_weather)
    x_weather = Flatten()(x_weather)
    Dense100_1 = Dense(100, activation='relu', name='weather2')(x_weather)
    Dense100_2 = Dense(100, activation='relu', name='weather3')(Dense100_1)
    Dense18 = Dense(18, activation='linear', name='weather5')(Dense100_2)
    model_weather = Model(inputs=[ip_weather], outputs=[Dense18])
    # return the model together with its input and output tensors
    return model_weather, ip_weather, Dense18

My loss function is MSE + L1 + L2.

L1 and L2 are computed as:

# kernels of the three Dense layers (model.layers[3], [4], [5] are weather2, weather3, weather5)
weight1 = model.layers[3].get_weights()[0]
weight2 = model.layers[4].get_weights()[0]
weight3 = model.layers[5].get_weights()[0]
L1 = Calculate_L1(weight1, weight2, weight3)
L2 = Calculate_L2(weight1, weight2, weight3)

I use the Calculate_L1 function to compute a sum over the weights of dense1, dense2 and dense3, and Calculate_L2 does the same computation again for the L2 term.
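
The original cost_function code is not shown above, so the following is only a minimal sketch of what it presumably looks like, assuming Calculate_L1 sums absolute weight values, Calculate_L2 sums squared weight values, and lambda1/lambda2 are hand-picked penalty strengths (those two names are hypothetical, not from the post):

import numpy as np
from keras import backend as K

lambda1, lambda2 = 0.01, 0.01  # hypothetical penalty strengths

def Calculate_L1(*kernels):
    # sum of absolute weight values over the given kernel matrices
    return float(np.sum([np.abs(k).sum() for k in kernels]))

def Calculate_L2(*kernels):
    # sum of squared weight values over the given kernel matrices
    return float(np.sum([np.square(k).sum() for k in kernels]))

def cost_function():
    # L1 and L2 are the plain Python floats computed above via get_weights();
    # they are baked into the graph at compile time, which is why they never
    # change from batch to batch during training
    def loss(y_true, y_pred):
        mse = K.mean(K.square(y_pred - y_true), axis=-1)
        return mse + lambda1 * L1 + lambda2 * L2
    return loss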

When I train with RB_model.compile(loss=cost_function(), optimizer='RMSprop'), the L1 and L2 variables are not updated at each batch. So I tried to use a callback that runs at the beginning of every batch:

from keras.callbacks import Callback

class update_L1L2weight(Callback):
    def __init__(self):
        super(update_L1L2weight, self).__init__()

    def on_batch_begin(self, batch, logs=None):
        # recompute the penalties from the current Dense-layer kernels
        weight1 = self.model.layers[3].get_weights()[0]
        weight2 = self.model.layers[4].get_weights()[0]
        weight3 = self.model.layers[5].get_weights()[0]
        L1 = Calculate_L1(weight1, weight2, weight3)
        L2 = Calculate_L2(weight1, weight2, weight3)

How can I use a callback at the beginning of each batch to finish computing L1 and L2, and then pass the L1 and L2 variables into the loss function?
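
For completeness, one common way to wire the callback and the loss together (a hedged sketch, not from the original post) is to keep L1 and L2 in Keras backend variables: the loss closes over the symbolic variables, and the callback overwrites their values with K.set_value at the start of every batch.

from keras import backend as K
from keras.callbacks import Callback

# symbolic containers shared between the loss function and the callback
L1_var = K.variable(0.0)
L2_var = K.variable(0.0)

def cost_function(lambda1=0.01, lambda2=0.01):  # hypothetical penalty strengths
    def loss(y_true, y_pred):
        mse = K.mean(K.square(y_pred - y_true), axis=-1)
        # the backend variables are part of the graph, so updated values are picked up
        return mse + lambda1 * L1_var + lambda2 * L2_var
    return loss

class update_L1L2weight(Callback):
    def on_batch_begin(self, batch, logs=None):
        kernels = [self.model.layers[i].get_weights()[0] for i in (3, 4, 5)]
        K.set_value(L1_var, Calculate_L1(*kernels))
        K.set_value(L2_var, Calculate_L2(*kernels))

The model would then be compiled with loss=cost_function() and trained with callbacks=[update_L1L2weight()]. The answer below avoids this bookkeeping entirely.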


1 Answer

Answered on 2024-09-27 18:08:53

You can simply use the built-in weight regularization in Keras for each layer. To do this, use the layer's kernel_regularizer argument and pass it a regularizer instance. For example:

from keras import regularizers

model.add(Dense(..., kernel_regularizer=regularizers.l2(0.1)))

These regularizers create a loss tensor that is added to the overall loss function, as implemented in the Keras source code.

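Applied to the model from the question, an elastic-net penalty on the three Dense layers could look like the sketch below; regularizers.l1_l2 combines the L1 and L2 terms, and the 0.01 factors are placeholder penalty strengths rather than values from the answer.

from keras import regularizers
from keras.layers import Input, BatchNormalization, Flatten, Dense
from keras.models import Model

def nn_weather_model_elastic(l1=0.01, l2=0.01):
    reg = regularizers.l1_l2(l1=l1, l2=l2)  # elastic net = L1 + L2 penalty
    ip_weather = Input(shape=(30, 38, 5))
    x = BatchNormalization(name='weather1')(ip_weather)
    x = Flatten()(x)
    x = Dense(100, activation='relu', kernel_regularizer=reg, name='weather2')(x)
    x = Dense(100, activation='relu', kernel_regularizer=reg, name='weather3')(x)
    out = Dense(18, activation='linear', kernel_regularizer=reg, name='weather5')(x)
    model = Model(inputs=[ip_weather], outputs=[out])
    # a plain MSE loss is enough; the penalties are added to it automatically
    model.compile(loss='mse', optimizer='RMSprop')
    return model

With this, the custom cost_function and the callback are no longer needed.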
