Curve fitting in Python for my curve

Problem Description

import numpy as np
from scipy import optimize

def sigmoid_function(x1, k, xo, a, c):
    return (a / (1 + np.exp(-k * (x1 - xo)))) + c

x_data = [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35]

y_data = [0.08965066,0.08990541,0.09007396,0.09013885,0.09021248,0.09038204,0.09044601,0.09062396,0.09074469,0.09097924,0.09101625,0.09110833,0.09130073,0.09153685,0.09165991,0.09189038,0.09236043,0.09329333,0.09470363,0.09750811,0.10305867,0.11295684,0.12767181,0.14647349,0.16744916,0.18869261,0.20908784,0.22828775,0.2459888,0.262817,0.27898482,0.29499955,0.31033699,0.32526762,0.33972489]

result, covariance = optimize.curve_fit(sigmoid_function, x_data, y_data, maxfev=10000)

[Images: curve with exact data; curve-fit result]

I'm new to ML. Please tell me whether I can change any of the parameters in curve_fit().

Solution

If you look at the optimization on the scale of your observations, the fitted function does not appear to work well.

[Image: view of curve fitting]

However, if you zoom out to the scale of the optimized function, the picture is quite different.

[Image: larger view of curve fitting]

When curve_fit is given no bounds or method, it uses Levenberg-Marquardt, which may not find the global solution.
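As an aside, supplying bounds makes curve_fit switch from the default 'lm' (Levenberg-Marquardt) to the Trust Region Reflective method ('trf'), which supports constraints. A minimal sketch, where the bound values are illustrative assumptions and not from the question:

```python
import numpy as np
from scipy import optimize

def sigmoid_function(x1, k, xo, a, c):
    return a / (1 + np.exp(-k * (x1 - xo))) + c

x_data = np.arange(1, 36)
y_data = np.array([
    0.08965066, 0.08990541, 0.09007396, 0.09013885, 0.09021248,
    0.09038204, 0.09044601, 0.09062396, 0.09074469, 0.09097924,
    0.09101625, 0.09110833, 0.09130073, 0.09153685, 0.09165991,
    0.09189038, 0.09236043, 0.09329333, 0.09470363, 0.09750811,
    0.10305867, 0.11295684, 0.12767181, 0.14647349, 0.16744916,
    0.18869261, 0.20908784, 0.22828775, 0.2459888,  0.262817,
    0.27898482, 0.29499955, 0.31033699, 0.32526762, 0.33972489])

# Illustrative box constraints on (k, xo, a, c); with bounds given,
# curve_fit selects the 'trf' method instead of 'lm'.
lower = [0.0, 0.0, 0.0, 0.0]
upper = [5.0, 60.0, 5.0, 1.0]
popt, pcov = optimize.curve_fit(sigmoid_function, x_data, y_data,
                                bounds=(lower, upper), maxfev=10000)
```

The 'trf' solver guarantees the returned parameters stay inside the given box, which is another way to keep the optimizer out of implausible regions.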

In cases with only one minimum, an uninformed standard guess like β = (1, 1, …, 1) will work fine; in cases with multiple minima, the algorithm converges to the global minimum only if the initial guess is already somewhat close to the final solution.

What you are seeing is the optimization getting stuck in a local minimum. As noted above, you can fix this by supplying initial parameters closer to the solution, so that the optimization can avoid that minimum trap. For example, by doing the following:

p0 = [0.1, 0.1, 0.1, 0.1]  # one initial guess per parameter (k, xo, a, c)
result, covariance = optimize.curve_fit(sigmoid_function, x_data, y_data, p0)

the optimization behaves as you expect:
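Putting the question's setup and the suggested initial guess together, a minimal end-to-end sketch (the p0 values are illustrative starting points, not tuned constants):

```python
import numpy as np
from scipy import optimize

def sigmoid_function(x1, k, xo, a, c):
    return a / (1 + np.exp(-k * (x1 - xo))) + c

x_data = np.arange(1, 36)
y_data = np.array([
    0.08965066, 0.08990541, 0.09007396, 0.09013885, 0.09021248,
    0.09038204, 0.09044601, 0.09062396, 0.09074469, 0.09097924,
    0.09101625, 0.09110833, 0.09130073, 0.09153685, 0.09165991,
    0.09189038, 0.09236043, 0.09329333, 0.09470363, 0.09750811,
    0.10305867, 0.11295684, 0.12767181, 0.14647349, 0.16744916,
    0.18869261, 0.20908784, 0.22828775, 0.2459888,  0.262817,
    0.27898482, 0.29499955, 0.31033699, 0.32526762, 0.33972489])

# Four model parameters, so p0 must have four entries: (k, xo, a, c).
p0 = [0.1, 0.1, 0.1, 0.1]
popt, pcov = optimize.curve_fit(sigmoid_function, x_data, y_data,
                                p0=p0, maxfev=10000)
print(popt)  # fitted k, xo, a, c
```

Note that p0 must match the number of free parameters of the model function; a three-element guess for this four-parameter sigmoid would raise an error.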

[Image: resulting curve fit]
