Keras – ANN error when using a custom activation function

Posted 2024-07-01 06:43:04


I am creating an artificial neural network (ANN) with Keras's functional API. Link to the data CSV file: https://github.com/dpintof/SPX_Options_ANN/blob/master/MLP3/call_df.csv. Here is the relevant part of the code that reproduces the problem:

import pandas as pd
from sklearn.model_selection import train_test_split
from tensorflow import keras
import tensorflow as tf
from tensorflow.keras import layers


# Data 
call_df = pd.read_csv("call_df.csv")

call_X_train, call_X_test, call_y_train, call_y_test = train_test_split(call_df.drop(["Option_Average_Price"],
                    axis = 1), call_df.Option_Average_Price, test_size = 0.01)


# Hyperparameters
n_hidden_layers = 2 # Number of hidden layers.
n_units = 128 # Number of neurons of the hidden layers.

# Create input layer
inputs = keras.Input(shape = (call_X_train.shape[1],))
x = layers.LeakyReLU(alpha = 1)(inputs)

"""
Function that creates a hidden layer by taking a tensor as input and applying a
modified ELU (MELU) activation function.
"""
def hl(tensor):
    # Create custom MELU activation function
    def melu(z):
        return tf.cond(z > 0, lambda: ((z**2)/2 + 0.02*z) / (z - 2 + 1/0.49), 
                        lambda: 0.49*(keras.activations.exponential(z)-1))
    
    y = layers.Dense(n_units, activation = melu)(tensor)
    return y

# Create hidden layers
for _ in range(n_hidden_layers):
    x = hl(x)

# Create output layer
outputs = layers.Dense(1, activation = keras.activations.softplus)(x)

# Actually create the model
model = keras.Model(inputs=inputs, outputs=outputs)


# QUICK TEST
model.compile(loss = "mse", optimizer = keras.optimizers.Adam())
history = model.fit(call_X_train, call_y_train, 
                    batch_size = 4096, epochs = 1,
                    validation_split = 0.01, verbose = 1)

This is the error I get when executing model.fit(…) (note that 4096 is my batch size and 128 is the number of neurons in the hidden layers):

InvalidArgumentError:  The second input must be a scalar, but it has shape [4096,128]
     [[{{node dense/cond/dense/BiasAdd/_5}}]] [Op:__inference_keras_scratch_graph_1074]

Function call stack:
keras_scratch_graph

I know the problem is related to the custom activation function, because the program runs fine if I use the following hl function instead:

def hl(tensor):
    lr = layers.Dense(n_units, activation = layers.LeakyReLU())(tensor)
    return lr

I got the same error when trying to define melu(z) like this:

@tf.function
def melu(z):
    if z > 0:
        return ((z**2)/2 + 0.02*z) / (z - 2 + 1/0.49)
    else:
        return 0.49*(keras.activations.exponential(z)-1)

Starting from How do you create a custom activation function with Keras?, I also tried the following, without success:

def hl(tensor):
    # Create custom MELU activation function
    def melu(z):
        return tf.cond(z > 0, lambda: ((z**2)/2 + 0.02*z) / (z - 2 + 1/0.49), 
                        lambda: 0.49*(keras.activations.exponential(z)-1))
    
    from keras.utils.generic_utils import get_custom_objects
    get_custom_objects().update({'melu': layers.Activation(melu)})
 
    x = layers.Dense(n_units)(tensor)
    y = layers.Activation(melu)(x)
    return y

1 Answer
User
#1 · Posted 2024-07-01 06:43:04

This happens because tf.cond requires its condition argument to be a scalar (not a multi-dimensional tensor). You can use tf.where instead, which applies the condition element-wise.
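
The scalar requirement can be reproduced in isolation. Here is a minimal sketch, assuming TensorFlow 2.x, with shapes mirroring the batch_size × n_units activations from the question (inside a @tf.function, AutoGraph rewrites a Python if on a tensor into the same tf.cond, which is presumably why the decorated variant failed identically):

import tensorflow as tf

scalar_pred = tf.constant(3.0) > 0               # shape (): a scalar predicate
tensor_pred = tf.random.normal((4096, 128)) > 0  # shape [4096, 128]

tf.cond(scalar_pred, lambda: 1.0, lambda: 0.0)   # OK: the predicate is a scalar
# tf.cond(tensor_pred, lambda: 1.0, lambda: 0.0) # fails: the predicate has
#                                                # shape [4096, 128], matching
#                                                # the traceback above

# tf.where has no such restriction; it selects element-wise:
x = tf.random.normal((4096, 128))
y = tf.where(x > 0, x, tf.zeros_like(x))         # shape [4096, 128]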

For example, you can define melu as follows:

def melu(z):
    return tf.where(z > 0, ((z**2)/2 + 0.02*z) / (z - 2 + 1/0.49), 
                           0.49*(keras.activations.exponential(z)-1))

Note: untested.
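
One caveat, under the same untested disclaimer: unlike tf.cond, tf.where evaluates both branch tensors for every element, so the division in the positive branch is also computed where z <= 0. The denominator z - 2 + 1/0.49 crosses zero near z ≈ -0.041, which can inject inf/NaN values (and NaN gradients) during training. A common workaround is to mask the input before computing the risky branch, e.g.:

def melu(z):
    # Give the positive branch a safe stand-in where z <= 0 so its
    # division never hits the zero of the denominator.
    safe_z = tf.where(z > 0, z, tf.ones_like(z))
    pos = ((safe_z**2)/2 + 0.02*safe_z) / (safe_z - 2 + 1/0.49)
    neg = 0.49 * (keras.activations.exponential(z) - 1)
    return tf.where(z > 0, pos, neg)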
