Multilayer perceptron not working

I am trying to implement a multilayer perceptron using plain numpy, but I have hit a roadblock. It is quite possible that I have not implemented it correctly, since in the past I have always relied on libraries for this. I would be very grateful if you could help me debug the code. It is under 100 lines, so hopefully it will not take up too much of your time. Thanks!

The details of my perceptron are as follows:

  1. Inputs = 1
  2. Outputs = 1
  3. Hidden units = 5
  4. Loss = squared error (see the short sketch right after this list)
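
For reference, the loss is the halved squared error on a single sample, so its derivative with respect to the prediction is just the residual. A minimal sketch (the names pred and expected match the code below):

import numpy as np

#Halved squared error for one sample: the 1/2 factor makes the derivative
#with respect to the prediction simply (pred - expected), which is what the
#backward pass uses as delta_3.
def squared_error(pred, expected):
    return np.square(pred - expected) / 2

def squared_error_grad(pred, expected):
    return pred - expected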

Here is my code:

(I have added comments wherever necessary.)

import numpy as np
import matplotlib.pyplot as plt

#Sampling 100 random values uniformly distributed b/w 0 and 5.
x=np.random.uniform(low=0, high=5, size=(100,))
y=np.multiply(x,x)
#Storing the random values and their squares in x and y
x=np.reshape(x,(-1,1))
y=np.reshape(y,(-1,1))
# plt.plot(x,y, 'ro')
# plt.show()

#Network Initialisation
hSize=5
inputSize=1
outputSize=1
Wxh=np.random.rand(hSize, inputSize+1)
Woh=np.random.rand(outputSize, hSize+1)


#+++++++++++++Back-propagation++++++++++++++
iterations=100
WohGrad=np.zeros(Woh.shape)
WxhGrad=np.zeros(Wxh.shape)
for i in range(0, iterations):
    #+++++++++++++Forward Pass++++++++++++++
    #Input Layer
    z1=x[i]
    a1=z1
    h1=np.append([1], a1)
    #Hidden Layer-1
    z2=np.dot(Wxh, h1)
    a2=1/(1+np.exp(-z2))
    h2=np.append([1], a2)
    #Output Layer
    z3=np.dot(Woh, h2)
    a3=z3


    #+++++++++++++Backward Pass++++++++++++++
    #Squared Error
    pred=a3
    expected=y[i]
    loss=np.square(pred-expected)/2

    #Delta Values
    delta_3=(pred-expected)
    delta_2=np.multiply(np.dot(np.transpose(Woh), delta_3)[1:], 1/(1+np.exp(-z2) ))

    #Parameter Gradients and Update
    WohGrad=WohGrad+np.dot(delta_3,(h2.reshape(1,-1)))
    WxhGrad=WxhGrad+np.dot(delta_2.reshape(hSize,-1),(h1.reshape(1,-1)))

#Parameter Update
learningRate=0.01
L2_regularizer=0.01
WohGrad=WohGrad/iterations+L2_regularizer*Woh
WxhGrad=WxhGrad/iterations+L2_regularizer*Wxh
Wxh=Wxh-learningRate*WxhGrad
Woh=Woh-learningRate*WohGrad


#++++++++Testing++++++++++
#Forward Pass
#Input Layer
z1=np.array([2.5])
a1=z1
h1=np.append([1], a1)


#Hidden Layer-1
z2=np.dot(Wxh, h1)
a2=1/(1+np.exp(-z2))
h2=np.append([1], a2)
#Output Layer
z3=np.dot(Woh, h2)
a3=z3
print(a3)
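
In case it helps with debugging, below is a minimal numerical gradient-check sketch. The helpers forward_loss and numeric_grad are my own illustrative names, not from any library; they repeat the forward pass above and estimate the loss gradients by central differences, so the per-sample analytic terms added to WohGrad and WxhGrad in the loop can be compared against them.

import numpy as np

#Recompute the same forward pass as in the training loop and return the
#halved squared error for a single (xi, yi) pair.
def forward_loss(Wxh, Woh, xi, yi):
    h1=np.append([1], xi)          #input plus bias term
    z2=np.dot(Wxh, h1)
    a2=1/(1+np.exp(-z2))           #sigmoid hidden layer
    h2=np.append([1], a2)          #hidden activations plus bias term
    a3=np.dot(Woh, h2)             #linear output
    return (np.square(a3-yi)/2).item()

#Central-difference estimate of dLoss/dW, one weight at a time.
#W must be the same array object as either Wxh or Woh.
def numeric_grad(W, Wxh, Woh, xi, yi, eps=1e-5):
    grad=np.zeros_like(W)
    for idx in np.ndindex(W.shape):
        old=W[idx]
        W[idx]=old+eps
        loss_plus=forward_loss(Wxh, Woh, xi, yi)
        W[idx]=old-eps
        loss_minus=forward_loss(Wxh, Woh, xi, yi)
        W[idx]=old
        grad[idx]=(loss_plus-loss_minus)/(2*eps)
    return grad

#Example: numeric gradients for the first training sample, to compare with
#the per-sample terms accumulated into WohGrad and WxhGrad inside the loop.
#print(numeric_grad(Woh, Wxh, Woh, x[0], y[0]))
#print(numeric_grad(Wxh, Wxh, Woh, x[0], y[0]))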
