Although logistic regression has "regression" in its name, it is actually a binary classification model. Keep in mind that a regression model outputs continuous values while a classification model outputs discrete labels. A simple way to understand logistic regression is that it adds a sigmoid function on top of linear regression:

logistic regression = linear regression + sigmoid
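The sigmoid function $\sigma(z) = \frac{1}{1+e^{-z}}$ squashes any real number into the open interval (0, 1), which is what lets the linear output be read as a probability. A minimal sketch (the `sigmoid` helper name here is illustrative, not from the script below):

```python
import numpy as np

def sigmoid(z):
    # Maps any real input into the open interval (0, 1)
    return 1 / (1 + np.exp(-z))

print(sigmoid(0))    # 0.5: exactly on the decision boundary
print(sigmoid(10))   # close to 1 -> predict class 1
print(sigmoid(-10))  # close to 0 -> predict class 0
```

Values above 0.5 are classified as the positive class, values below as the negative class.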
Gradient descent: $\Delta\theta_j = \frac{1}{m}X^T(h-y)$

    deltatheta = (1.0 / m) * X.T.dot(h - y)

Parameter update: $\theta_j = \theta_j - \alpha\,\Delta\theta_j$

    theta = theta - alpha * deltatheta
import numpy as np
import matplotlib.pyplot as plt

# Load the data: feature columns followed by a 0/1 label column
data = np.loadtxt('ex2data1.txt', delimiter=',')
x = data[:, :-1]
y = data[:, -1]

# Standardize the features (zero mean, unit variance)
x -= np.mean(x, axis=0)
x /= np.std(x, axis=0)

# Prepend a column of ones for the intercept term
X = np.c_[np.ones(len(x)), x]

def mov(theta):
    # Hypothesis: sigmoid of the linear combination X @ theta
    z = np.dot(X, theta)
    h = 1 / (1 + np.exp(-z))
    return h

def cos(h):
    # Cross-entropy (log-loss) cost
    j = -np.mean(y * np.log(h) + (1 - y) * np.log(1 - h))
    return j

def tidu(sus=10000, aphe=0.1):
    # Batch gradient descent: sus iterations, learning rate aphe
    m, n = X.shape
    theta = np.zeros(n)
    j = np.zeros(sus)
    for i in range(sus):
        h = mov(theta)
        j[i] = cos(h)
        te = (1 / m) * X.T.dot(h - y)  # gradient
        theta -= te * aphe             # update step
    return h, j, theta

if __name__ == '__main__':
    h, j, theta = tidu()
    print(j)
    plt.plot(j)  # the cost curve should decrease as training proceeds
    plt.show()
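Once training finishes, the learned theta classifies a point by thresholding the sigmoid output at 0.5. The script above depends on `ex2data1.txt`, so here is a self-contained sketch on synthetic data showing the same training loop plus the prediction step (all names below are local to this example):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 1-D data: label is 1 exactly when the feature is positive
x = rng.normal(size=(200, 1))
y = (x[:, 0] > 0).astype(float)
X = np.c_[np.ones(len(x)), x]  # add intercept column

# Batch gradient descent, same update rule as above
theta = np.zeros(X.shape[1])
for _ in range(5000):
    h = 1 / (1 + np.exp(-X.dot(theta)))
    theta -= 0.1 * X.T.dot(h - y) / len(y)

# Predict by thresholding the probability at 0.5
prob = 1 / (1 + np.exp(-X.dot(theta)))
pred = (prob >= 0.5).astype(float)
print('accuracy:', np.mean(pred == y))
```

Because the synthetic data is linearly separable, the training accuracy should come out close to 1.0.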