Is there a way to retrieve the minimized error directly after scipy.minimize converges, or does it have to be coded into the cost function itself?
I can only seem to retrieve the coefficients that it converged to.
import numpy as np
import scipy.optimize
from sklearn.metrics import mean_squared_log_error  # loss passed in below

# HoltWinters is assumed to be defined elsewhere in the script

def errorFunction(params, series, loss_function, slen=12):
    alpha, beta, gamma = params
    breakUps = int(len(series) / slen)
    end = breakUps * slen
    test = series[end:]
    errors = []
    for i in range(2, breakUps + 1):
        # fit on the first i seasons, predict over the hold-out tail
        model = HoltWinters(series=series[:i * slen], slen=slen,
                            alpha=alpha, beta=beta, gamma=gamma,
                            n_preds=len(test))
        model.triple_exponential_smoothing()
        predictions = model.result[-len(test):]
        actual = test
        error = loss_function(predictions, actual)
        errors.append(error)
    return np.mean(np.array(errors))
opt = scipy.optimize.minimize(errorFunction, x0=x,
                              args=(train, mean_squared_log_error),
                              method="L-BFGS-B",
                              bounds=((0, 1), (0, 1), (0, 1)))

# gets the converged values
optimal_values = opt.x
# I would like to know the error from errorFunction at the opt.x values,
# without having to manually run the script again.
# Is the minimum error stored somewhere in the returned object opt?
From my reading of the documentation for scipy.optimize.minimize, the result is returned as an OptimizeResult object. According to that class's documentation (here), it has an attribute fun, the "value of the objective function". So opt.fun should give you exactly what you want. (You can retrieve further values as well, such as the Jacobian via opt.jac and the Hessian via opt.hess, as described in the documentation.)
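As a minimal self-contained sketch (using a toy quadratic objective in place of the Holt-Winters error function, which needs external data), the relevant OptimizeResult fields look like this:

```python
import numpy as np
from scipy.optimize import minimize

# Toy objective standing in for errorFunction: its minimum is at (0.2, 0.5, 0.8).
def objective(params):
    alpha, beta, gamma = params
    return (alpha - 0.2) ** 2 + (beta - 0.5) ** 2 + (gamma - 0.8) ** 2

opt = minimize(objective, x0=[0.5, 0.5, 0.5],
               method="L-BFGS-B", bounds=((0, 1), (0, 1), (0, 1)))

print(opt.x)    # converged parameters
print(opt.fun)  # objective value at opt.x, i.e. the minimized error
print(opt.jac)  # gradient estimate at the solution (available for L-BFGS-B)

# opt.fun matches re-evaluating the objective at opt.x, so no second run is needed
assert np.isclose(opt.fun, objective(opt.x))
```

So in the question's code, `opt.fun` holds the mean error that `errorFunction` returned at the converged `opt.x`, with no need to call `errorFunction` again.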