RuntimeWarning: divide by zero encountered in log while running logistic regression

Posted 2024-09-28 22:25:26


I am trying to implement a logistic regression algorithm, but the sigmoid curve does not fit the dataset correctly, and after a number of iterations the code also gives me these two warning lines:

RuntimeWarning: divide by zero encountered in log cost[i] = -y[i]*np.log(pridicted_y) - (1-y[i])*np.log(1-pridicted_y)

RuntimeWarning: invalid value encountered in double_scalars cost[i] = -y[i]*np.log(pridicted_y) - (1-y[i])*np.log(1-pridicted_y)
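
I suspect the warnings come from the sigmoid output saturating to exactly 0.0 or 1.0 once m*x+b gets large, so that np.log receives 0. This tiny snippet is not part of my program, just my attempt to reproduce the message, and it gives the same kind of warnings:

import numpy as np

z = 1000.0
p = 1/(1+np.exp(-z))                       # np.exp(-1000) underflows to 0.0, so p is exactly 1.0
print(np.log(1-p))                         # -inf, with "divide by zero encountered in log"
print(-1*np.log(p) - (1-1)*np.log(1-p))    # 0*(-inf) gives nan, which I think triggers the "invalid value" warning

If that is the cause, I am not sure how to keep the cost finite during the iterations.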

Here is my full code:

import matplotlib.pyplot as plt
import numpy as np

def line(m,b):
    # plot the sigmoid curve 1/(1+exp(-(m*x+b))) for the current parameters
    x = np.arange(-10,10)
    y = np.zeros(len(x))
    for i in range(len(x)):
        y[i] = 1/(1+np.exp(-(m*x[i]+b)))
    plt.plot(x,y,alpha=0.3)

def LogCost(m,b,xpoints,ypoints,flag):
    # cross-entropy cost summed over all points; flag selects whether the
    # linear score is additionally multiplied by the x value
    cost = np.zeros(len(xpoints))
    Totalcost = 0
    for i in range(len(xpoints)):
        if flag==0:
            pridicted_y = 1/(1+np.exp(-(m*xpoints[i]+b)))
        else:
            pridicted_y = 1/(1+np.exp(-(m*xpoints[i]+b)*xpoints[i]))
        cost[i] = -ypoints[i]*np.log(pridicted_y) - (1-ypoints[i])*np.log(1-pridicted_y)
        Totalcost = Totalcost + cost[i]
        # plt.plot([xpoints[i],xpoints[i]],[pridicted_y,ypoints[i]],alpha=0.2)
    return Totalcost
    
def gradientDescent(m,b,xpoints,ypoints,alpha):
    # repeatedly update m and b; the update uses the summed cost itself,
    # not the partial derivatives of the cost
    n = len(xpoints)
    for i in range(2000):
        M = m - alpha*(1/n)*LogCost(m,b,xpoints,ypoints,0)
        B = b - alpha*(1/n)*LogCost(m,b,xpoints,ypoints,1)
        m = M
        b = B
        line(m,b)
    return m,b

# training data: 20 x values and their 0/1 labels
x = np.arange(-10,10)
y = np.array([0,0,0,0,0,0,0,1,1,0,1,1,0,1,1,1,1,1,1,1])
m = 0
b = 0
alpha = 0.01
plt.plot(x,y,"ro")
m,b = gradientDescent(m,b,x,y,alpha)
line(m,b)
plt.show()

Here is the resulting plot:

[plot image]
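
For reference, my understanding of what a standard gradient-descent update for logistic regression should look like is the sketch below (vectorized; the names sigmoid and gradient_step are just my own, and I am not sure whether my flag-based LogCost calls above compute the same thing):

import numpy as np

def sigmoid(z):
    return 1/(1+np.exp(-z))

def gradient_step(m, b, xpoints, ypoints, alpha):
    # one update using the usual gradients of the mean cross-entropy:
    # dJ/dm = mean((p - y) * x),  dJ/db = mean(p - y)
    p = sigmoid(m*xpoints + b)
    grad_m = np.mean((p - ypoints)*xpoints)
    grad_b = np.mean(p - ypoints)
    return m - alpha*grad_m, b - alpha*grad_b

I am not sure whether switching to an update like this would also make the log warnings go away.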

Any help would be appreciated.

