This article walks through how to write basic linear-regression code. Many readers have questions about this topic in day-to-day work, so we consulted various sources and put together a simple, practical walkthrough; hopefully it resolves those questions. Note that although the title mentions Java, the example below is written in Python, using scikit-learn, NumPy, pandas, and Matplotlib. Let's learn together!
# Use a linear model to fit this data.
from sklearn.linear_model import LinearRegression
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

pga = pd.read_csv('pga.csv')  # dataset provided below

lr = LinearRegression()
lr.fit(pga[['distance']], pga['accuracy'])  # equivalently: pga.distance.values[:, np.newaxis]
theta0 = lr.intercept_
theta1 = lr.coef_
print(theta0)
print(theta1)

# Compute the cost function (average accumulated squared error) for each theta1
def cost(x, y, theta0, theta1):
    J = 0
    for i in range(len(x)):
        mse = (x[i] * theta1 + theta0 - y[i]) ** 2
        J += mse
    return J / (2 * len(x))

theta0 = 100
theta1s = np.linspace(-3, 2, 197)
costs = []
for theta1 in theta1s:
    costs.append(cost(pga['distance'], pga['accuracy'], theta0, theta1))
plt.plot(theta1s, costs)
plt.show()
print(pga.distance)

# Adjust theta via the partial derivatives of the cost
def partial_cost_theta0(x, y, theta0, theta1):
    # Our model is the linear fit y = theta1*x + theta0, not a sigmoid;
    # a sigmoid could be used instead when the relationship is non-linear.
    # Operate on the whole x Series at once rather than element by element,
    # then sum and average.
    h = theta1 * x + theta0
    diff = h - y
    partial = diff.sum() / len(diff)
    return partial

partial0 = partial_cost_theta0(pga.distance, pga.accuracy, 1, 1)

def partial_cost_theta1(x, y, theta0, theta1):
    # Same linear model y = theta1*x + theta0, not a sigmoid
    h = theta1 * x + theta0
    diff = (h - y) * x
    partial = diff.sum() / len(diff)
    return partial

partial1 = partial_cost_theta1(pga.distance, pga.accuracy, 0, 5)
print(partial0)
print(partial1)

def gradient_descent(x, y, alpha=0.1, theta0=0, theta1=0):
    # Default hyperparameters; each iteration computes the cost, updates the
    # weights, then checks whether the cost has converged or the maximum
    # number of iterations has been reached.
    most_iterations = 1000
    convergence_thres = 0.000001
    c = cost(x, y, theta0, theta1)
    costs = [c]
    cost_pre = c + convergence_thres + 1.0
    counter = 0
    while (np.abs(c - cost_pre) > convergence_thres) & (counter < most_iterations):
        update0 = alpha * partial_cost_theta0(x, y, theta0, theta1)
        update1 = alpha * partial_cost_theta1(x, y, theta0, theta1)
        theta0 -= update0
        theta1 -= update1
        cost_pre = c
        c = cost(x, y, theta0, theta1)
        costs.append(c)
        counter += 1
    return {'theta0': theta0, 'theta1': theta1, 'costs': costs}

# With raw distances around 280 yards, alpha=0.1 makes the updates blow up;
# standardize the feature first so gradient descent converges.
x_scaled = (pga.distance - pga.distance.mean()) / pga.distance.std()
print("Theta1 =", gradient_descent(x_scaled, pga.accuracy)['theta1'])
costs = gradient_descent(x_scaled, pga.accuracy, alpha=.01)['costs']  # the key is 'costs', not 'cost'
print(gradient_descent(x_scaled, pga.accuracy, alpha=.01)['theta1'])
plt.scatter(range(len(costs)), costs)
plt.show()
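As a quick sanity check of the gradient-descent update rule used above, here is a minimal, self-contained sketch that runs the same updates on synthetic data (the data and helper names here are made up for illustration, so it does not need pga.csv) and compares the learned slope against NumPy's least-squares fit:

```python
import numpy as np

def cost(x, y, theta0, theta1):
    # Mean squared error with the conventional 1/(2m) factor
    return np.sum((theta1 * x + theta0 - y) ** 2) / (2 * len(x))

def gradient_descent(x, y, alpha=0.1, theta0=0.0, theta1=0.0,
                     max_iter=10000, tol=1e-9):
    c = cost(x, y, theta0, theta1)
    for _ in range(max_iter):
        h = theta1 * x + theta0
        theta0 -= alpha * np.mean(h - y)         # d(cost)/d(theta0)
        theta1 -= alpha * np.mean((h - y) * x)   # d(cost)/d(theta1)
        c_new = cost(x, y, theta0, theta1)
        if abs(c_new - c) < tol:                 # converged
            break
        c = c_new
    return theta0, theta1

# Synthetic data from y = 2x + 1 plus a little noise
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 200)
y = 2.0 * x + 1.0 + rng.normal(0, 0.05, 200)

t0, t1 = gradient_descent(x, y)
slope, intercept = np.polyfit(x, y, 1)
print(t0, t1)  # both parameters should land near the closed-form fit
```

Because the synthetic feature is already centered near zero, alpha=0.1 is stable here; on raw, large-valued features (like the 260–315 yard distances in the article's dataset) the feature should be standardized first.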
Dataset:
Copy the data below and save it as pga.csv:
distance,accuracy
290.3,59.5
302.1,54.7
287.1,62.4
282.7,65.4
299.1,52.8
300.2,51.1
300.9,58.3
279.5,73.9
287.8,67.6
284.7,67.2
296.7,60
283.3,59.4
284,72.2
292,62.1
282.6,66.5
287.9,60.9
279.2,67.3
291.7,64.8
289.9,58.1
289.8,61.7
298.8,56.4
280.8,60.5
294.9,57.5
287.5,61.8
282.7,56
277.7,72.5
270.5,71.7
285.2,66
315.1,55.2
281.9,67.6
293.3,58.2
286,59.9
285.6,58.2
289.9,65.7
277.5,59
293.6,56.8
301.1,65.4
300.8,63.4
287.4,67.3
281.8,72.6
277.4,63.1
279.1,66.5
287.4,66.4
280.9,62.3
287.8,57.2
261.4,69.2
272.6,69.4
291.3,65.3
294.2,52.8
285.5,49
287.9,61.1
282.2,65.6
301.3,58.2
276.2,61.7
281.6,68.1
275.5,61.2
309.7,53.1
287.7,56.4
291.6,56.9
284.1,65
299.6,57.5
282.7,60
271.5,72
292.1,58.2
295,59.4
274.9,69
273.6,68.7
299.9,60.1
279.9,74
289.9,66
283.6,59.8
310.3,52.4
291.7,65.6
284.2,63.2
295,53.5
298.6,55.1
297.4,60.4
299.7,67.7
284.4,69.7
286.4,72.4
285.9,66.9
297.6,54.3
272.5,62
277,66.2
287.6,60.9
280.4,69.4
280,63.7
295.4,52.8
274.4,68.8
286.5,73.1
287.7,65.2
291.5,65.9
279,69.4
299,65.2
290.1,69.1
288.9,67.9
288.8,68.2
283.2,61
293.2,58.4
285.3,67.3
284.1,65.7
281.4,67.7
286.1,61.4
284.9,62.3
284.8,68.1
296,62
282.9,71.8
280.9,67.8
291.2,62
292.8,62.2
291,61.9
285.7,62.4
283.9,62.9
298.4,61.5
285.1,65.3
286.1,60.1
283.1,65.4
289.4,58.3
284.6,70.7
296.6,62.3
295.9,64.9
295.2,62.8
293.9,54.5
275,65.5
286.8,69.5
291.1,64.4
284.8,62.5
283.7,59.5
295.4,66.9
291.8,62.7
274.9,72.3
302.9,61.2
272.1,80.4
274.9,74.9
296.3,59.4
286.2,58.8
294.2,63.3
284.1,66.5
299.2,62.4
275.4,71
273.2,70.9
281.6,65.9
295.7,55.3
287.1,56.8
287.7,66.9
296.7,53.7
282.2,64.2
291.7,65.6
281.6,73.4
311,56.2
278.6,64.7
288,65.7
276.7,72.1
292,62
286.4,69.9
292.7,65.7
294.2,62.9
278.6,59.6
283.1,69.2
284.1,66
278.6,73.6
291.1,60.4
294.6,59.4
274.3,70.5
274,57.1
283.8,62.7
272.7,66.9
303.2,58.3
282,70.4
281.9,61
287,59.9
293.5,63.8
283.6,56.3
296.9,55.3
290.9,58.2
303,58.1
292.8,61.1
281.1,65
293,61.1
284,66.5
279.8,66.7
292.9,65.4
284,66.9
282,64.5
280.6,64
287.7,63.4
287.7,63.4
298.3,59.5
299.6,53.4
291.3,62.5
295.2,61.4
288,62.4
297.8,59.5
286,62.6
285.3,66.2
286.9,63.4
275.1,73.7
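As a quick check that the saved file has the right shape, the short pandas sketch below (assuming pandas is installed; it reads the first three records from an in-memory string, but `pd.read_csv('pga.csv')` works identically on the saved file) confirms the two-column layout:

```python
import io
import pandas as pd

# First few records from the dataset above, in the same two-column format
sample = "distance,accuracy\n290.3,59.5\n302.1,54.7\n287.1,62.4\n"
pga = pd.read_csv(io.StringIO(sample))  # pd.read_csv('pga.csv') for the real file

print(list(pga.columns))  # ['distance', 'accuracy']
print(len(pga))           # 3
```

If the columns come back fused or misaligned, the paste likely lost its line breaks; each record must sit on its own line.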
That wraps up our look at writing basic linear-regression code; hopefully it has answered your questions. Pairing theory with practice is the best way to learn, so give it a try! For more practical articles like this one, keep following 编程网.
Title: How to write basic linear-regression code in Java
Link: https://www.lsjlt.com/news/228586.html (please credit the source when reposting)
2024-05-24