=Week1=
==Cost Function==
Squared error function / mean squared error: <math>J(\theta)=\frac{1}{2m}\sum_{i=1}^m(h_\theta(x_i)-y_i)^2</math>

Cross entropy: <math>J(\theta)=-\frac{1}{m}\sum_{i=1}^m\left[y^{(i)}\log h_\theta(x^{(i)})+(1-y^{(i)})\log(1-h_\theta(x^{(i)}))\right]</math>
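Both costs are direct to compute. A minimal numpy sketch (the function names are made up here; preds stands for the model outputs <math>h_\theta(x^{(i)})</math>):

<syntaxhighlight lang="python">
import numpy as np

def mse_cost(preds, y):
    # J(theta) = 1/(2m) * sum_i (h_theta(x_i) - y_i)^2
    m = len(y)
    return np.sum((preds - y) ** 2) / (2 * m)

def cross_entropy_cost(preds, y):
    # J(theta) = -1/m * sum_i [y_i*log(p_i) + (1 - y_i)*log(1 - p_i)],
    # for binary labels y_i in {0, 1} and predictions p_i in (0, 1)
    m = len(y)
    return -np.sum(y * np.log(preds) + (1 - y) * np.log(1 - preds)) / m
</syntaxhighlight>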
==Gradient Descent==
<math>\theta_j := \theta_j - \alpha\frac{\partial}{\partial\theta_j}J(\theta)</math>

For a linear model the cost function is the mean squared error, so we have (here the training input x is an m*n matrix, the parameter vector <math>\theta</math> is n*1, <math>x_i</math> denotes the i-th row of the training matrix, and <math>x_{ik}</math> the entry in row i, column k):
:<math>\frac{\partial}{\partial\theta_j}J(\theta) = \frac{\partial}{\partial\theta_j}\left(\frac{1}{2m}\sum_{i=1}^m(h_\theta(x_i)-y_i)^2\right)</math>
:<math>= \frac{1}{2m}\frac{\partial}{\partial\theta_j}\left(\sum_{i=1}^m(h_\theta(x_i)-y_i)^2\right)</math>
:<math>= \frac{1}{2m}\sum_{i=1}^m\frac{\partial}{\partial\theta_j}(h_\theta(x_i)-y_i)^2</math>
:<math>= \frac{1}{m}\sum_{i=1}^m\left((h_\theta(x_i)-y_i)\frac{\partial}{\partial\theta_j}h_\theta(x_i)\right)</math> (chain rule)
:<math>= \frac{1}{m}\sum_{i=1}^m\left((h_\theta(x_i)-y_i)\frac{\partial}{\partial\theta_j}(x_i\theta)\right)</math>
:<math>= \frac{1}{m}\sum_{i=1}^m\left((h_\theta(x_i)-y_i)\frac{\partial}{\partial\theta_j}\sum_{k=0}^{n-1}x_{ik}\theta_k\right)</math>
For j ≥ 1:
:<math>= \frac{1}{m}\sum_{i=1}^m\left((h_\theta(x_i)-y_i)x_{ij}\right)</math>
:<math>= \frac{1}{m}(h_\theta(x)-y)^T x_j</math>
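The last line is the component form of the full gradient <math>\frac{1}{m}x^T(h_\theta(x)-y)</math>, so the whole parameter vector can be updated with one matrix product per iteration. A minimal batch gradient descent sketch along those lines (the learning rate and iteration count are illustrative defaults, not values from these notes):

<syntaxhighlight lang="python">
import numpy as np

def gradient_descent(X, y, alpha=0.01, iters=1000):
    # X: m*n training matrix (column 0 assumed to be all ones for the bias term),
    # y: length-m target vector. Returns the fitted n-vector theta.
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(iters):
        grad = X.T @ (X @ theta - y) / m  # (1/m) * x^T (h_theta(x) - y)
        theta -= alpha * grad             # theta_j := theta_j - alpha * dJ/dtheta_j
    return theta
</syntaxhighlight>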
=Week2=
==Multivariate linear regression==
<math>h_\theta(x) = \theta^T x</math>
where
:<math>
x=\begin{bmatrix}
x_0 \\
x_1 \\
x_2 \\
\vdots \\
x_{n-1}
\end{bmatrix},\qquad
\theta=\begin{bmatrix}
\theta_0 \\
\theta_1 \\
\theta_2 \\
\vdots \\
\theta_{n-1}
\end{bmatrix}
</math>
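Assuming the usual convention that <math>x_0 = 1</math> (so <math>\theta_0</math> acts as the bias), the hypothesis is a single dot product. A tiny illustration with made-up numbers:

<syntaxhighlight lang="python">
import numpy as np

theta = np.array([1.0, 0.5, -2.0])  # theta_0 (bias), theta_1, theta_2
x = np.array([1.0, 3.0, 0.7])       # x_0 = 1 by convention, then the raw features
h = theta @ x                       # h_theta(x) = theta^T x
print(h)                            # 1.0 + 0.5*3.0 - 2.0*0.7 = 1.1
</syntaxhighlight>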