How to implement least absolute deviations regression as a linear optimization problem

Posted 2024-09-30 08:26:12


I'm a beginner at this, so I'm trying to implement the least absolute deviations (LAD) regression algorithm as a linear optimization problem, following a linear formulation of the problem.
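As far as I understand it, the standard linear-programming formulation introduces auxiliary variables t_i that bound each absolute residual:

    minimize    t_1 + t_2 + ... + t_n
    subject to  y_i - sum_{j=1..m} x_ij * b_j <=  t_i    for i = 1, ..., n
                y_i - sum_{j=1..m} x_ij * b_j >= -t_i    for i = 1, ..., n
                t_i >= 0                                 for i = 1, ..., n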

Updated code: this is what I have so far, but something must be wrong with it, because I get the following error:

ERROR: Rule failed when generating expression for constraint neg_Constraint with index 1: AttributeError: 'NoneType' object has no attribute 'getname'

# Setting up my variables (synthetic data: Y = X * beta + noise)
import numpy as np

nsample = 100
nvariables = 2
beta = np.random.randint(-5, 5, size=(nvariables + 1, 1))   # true coefficients
x0 = np.ones([nsample, 1])                                   # intercept column
x1 = np.random.uniform(0, 5, [nsample, nvariables])          # explanatory variables
X = np.concatenate([x0, x1], axis=1)
error = np.random.normal(0, 1, (nsample, 1))
Y = np.dot(X, beta) + error

# Defining my model
from pyomo.environ import *

opt = SolverFactory("glpk")

def la_reg_opt(n,m,x,y):

    model = AbstractModel()
    model.n = n
    model.m = m

    model.I = RangeSet(1, model.n)
    model.J = RangeSet(1, model.m)

    model.y = y
    model.x = x
    model.b = Var(model.J, domain=NonNegativeReals)   # regression coefficients
    model.t = Var(model.I, domain=NonNegativeReals)   # auxiliary variables (third constraint)

    # Objective: minimize the sum of the auxiliary variables
    def obj_funct(model):
        return summation(model.t)
    model.OBJ = Objective(rule=obj_funct)

    def lower_bound(model, i):
        return model.y[i] - sum(model.x[i,j]*model.b[j] for j in model.J) >= -model.t[i]
    model.neg_Constraint = Constraint(model.I, rule=lower_bound)

    def upper_bound(model, i):
        return model.y[i] - sum(model.x[i,j]*model.b[j] for j in model.J) >= model.t[i]
    model.pos_Constraint = Constraint(model.I, rule=upper_bound)

    instance = model.create_instance()
    instance.dual = Suffix(direction=Suffix.IMPORT)
    results = opt.solve(instance)

    return results


# Calling my model
res = la_reg_opt(nsample, nvariables, X, Y)
print(res)

What am I doing wrong? Any ideas?
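For reference, I think the same model could also be written as a ConcreteModel, so the NumPy data is used directly and no create_instance step is needed. This is only an untested sketch: the name lad_fit, the Reals domain for b, and the <= direction of the second constraint follow the standard formulation above rather than my code.

from pyomo.environ import (ConcreteModel, Var, Objective, Constraint, RangeSet,
                           NonNegativeReals, Reals, SolverFactory, minimize, value)

def lad_fit(n, m, x, y):
    model = ConcreteModel()
    model.I = RangeSet(1, n)   # observations
    model.J = RangeSet(1, m)   # columns of x (including the intercept column)

    model.b = Var(model.J, domain=Reals)             # coefficients may be negative
    model.t = Var(model.I, domain=NonNegativeReals)  # bounds on |residual_i|

    model.OBJ = Objective(expr=sum(model.t[i] for i in model.I), sense=minimize)

    def residual(model, i):
        # Pyomo RangeSets are 1-based, NumPy arrays are 0-based; float() keeps
        # NumPy scalars out of the Pyomo expressions.
        return float(y[i-1, 0]) - sum(float(x[i-1, j-1]) * model.b[j] for j in model.J)

    def lower_bound(model, i):
        return residual(model, i) >= -model.t[i]
    model.neg_Constraint = Constraint(model.I, rule=lower_bound)

    def upper_bound(model, i):
        return residual(model, i) <= model.t[i]
    model.pos_Constraint = Constraint(model.I, rule=upper_bound)

    SolverFactory("glpk").solve(model)
    return [value(model.b[j]) for j in model.J]

# The second argument is the number of columns of X, which is nvariables + 1
# here because of the intercept column; the result should be close to beta.
print(lad_fit(nsample, nvariables + 1, X, Y))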

Thanks in advance.

