Problem Description
I have a dataset of (x1, x2, y) points, and I want to train a polynomial regression model that fits the y values with a quadratic or cubic surface. I then want to solve:
min y
s.t. W1 * x1 + W2 * x2 = C
where W1 and W2 are known constants, and C is given at runtime (it may vary, but it is always known).
Here is my current progress:
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score

degree = 3
test_size = 0  # note: recent scikit-learn rejects test_size=0; use a small positive value if it raises
X_train, X_test, y_train, y_test = train_test_split(data_x, data_y, test_size=test_size, random_state=1)
poly_features = PolynomialFeatures(degree=degree)  # class name is PolynomialFeatures, not polynomialFeatures
X_train_poly = poly_features.fit_transform(X_train)
poly_model = LinearRegression()
poly_model.fit(X_train_poly, y_train)
y_train_predicted = poly_model.predict(X_train_poly)
# evaluating the model on training dataset
rmse_train = np.sqrt(mean_squared_error(y_train,y_train_predicted))
r2_train = r2_score(y_train,y_train_predicted)
print("The model performance for the training set")
print("-------------------------------------------")
print("RMSE of training set is {}".format(rmse_train))
print("R2 score of training set is {}".format(r2_train))
print("\n")
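For reference, here is a self-contained version of the fit step above. Since the original `data_x`/`data_y` are not shown, I generate a synthetic quadratic surface; the data and shapes are my own assumptions, not the asker's:

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score

# Synthetic (x1, x2, y) data: a noiseless quadratic surface (assumed example).
rng = np.random.default_rng(1)
data_x = rng.uniform(0, 10, size=(200, 2))
data_y = 1.5 * data_x[:, 0] ** 2 + 0.5 * data_x[:, 1] ** 2 - data_x[:, 0] * data_x[:, 1]

# Degree-3 features include all degree-2 terms, so this surface is representable exactly.
poly_features = PolynomialFeatures(degree=3)
X_poly = poly_features.fit_transform(data_x)
poly_model = LinearRegression().fit(X_poly, data_y)

pred = poly_model.predict(X_poly)
rmse = np.sqrt(mean_squared_error(data_y, pred))
print("RMSE:", rmse, "R2:", r2_score(data_y, pred))
```

On this noiseless data the fit is essentially exact, which is a useful sanity check before moving on to the optimization step.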
Once the model is fitted, I start the optimization here:
from scipy.optimize import minimize, LinearConstraint

def objective(x):
    # use transform, not fit_transform: the feature map was already fitted on the training data
    X = poly_features.transform([[x[0], x[1]]])
    return poly_model.predict(X)[0]  # return a scalar, not a 1-element array

# W1 = 0.01, W2 = 0.09, and C = 100
# LinearConstraint(A, lb, ub): with only lb given, ub defaults to +inf, i.e. the
# inequality 0.01*x1 + 0.09*x2 >= 100; for an equality, pass C as both bounds.
linear_constraint1 = LinearConstraint([[0.01, 0.09]], 100, 100)
res = minimize(objective, [0, 0], method='trust-constr',
               constraints=[linear_constraint1],
               bounds=[(0, None), (0, None)])
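One way I would expect this call to behave once the constraint is an equality (C passed as both the lower and upper bound) and the objective returns a scalar: the solver lands on the constraint plane W1*x1 + W2*x2 = C. A sketch with a stand-in convex quadratic in place of the fitted model (the surface and its minimum at (3, 4) are my own assumptions):

```python
import numpy as np
from scipy.optimize import minimize, LinearConstraint

# Stand-in for poly_model.predict(...): a convex quadratic surface (assumption).
def objective(x):
    return (x[0] - 3.0) ** 2 + (x[1] - 4.0) ** 2

W1, W2, C = 0.01, 0.09, 100.0
# Equality constraint: lb == ub == C pins W1*x1 + W2*x2 to exactly C.
eq_constraint = LinearConstraint([[W1, W2]], C, C)

res = minimize(objective, [0.0, 0.0], method='trust-constr',
               constraints=[eq_constraint],
               bounds=[(0, None), (0, None)])
print(res.x)
```

The returned point satisfies the weighted-sum equality to solver tolerance, which is the behavior the question is after.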
I am running into the following problems:
- I keep getting this warning:
UserWarning: delta_grad == 0.0. Check if the approximated function is linear. If the function is linear better results can be obtained by defining the Hessian as zero instead of using quasi-Newton approximations. 'approximations.',UserWarning)
- The results do not match my expectations: x1 is always close to zero while x2 is large.
The output I am interested in is the (x1, x2) that satisfies both conditions (minimum y and the weighted-sum constraint). Please advise, thanks.