Logistic Regression
1.2.1 Warmup exercise: sigmoid function
The sigmoid function is defined as

$$g(z) = \dfrac{1}{1 + e^{-z}},$$

which in Octave/MATLAB is:
```matlab
g = 1 ./ (1 + exp(-z));
```
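Because `./` and `exp` operate element-wise, this one line handles scalars, vectors, and matrices alike. A quick sanity check, assuming `sigmoid` is the exercise's `sigmoid.m` wrapping the line above (the expected values follow directly from the definition):

```matlab
sigmoid(0)            % 0.5, since g(0) = 1/(1+1)
sigmoid([-10 0 10])   % ≈ [0  0.5  1], saturating at the tails
```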
1.2.2 Cost function and gradient
```matlab
J = (-y' * log(sigmoid(X * theta)) - (1 - y)' * log(1 - sigmoid(X * theta))) / m;
```
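The heading also asks for the gradient, but the snippet above only computes $J$. A minimal vectorized sketch, assuming the same `X`, `y`, `theta`, and `m` as above (this is the standard formula $\frac{1}{m}X^{T}(h_{\theta}(x) - y)$, not necessarily the exact submitted code):

```matlab
grad = (X' * (sigmoid(X * theta) - y)) / m;   % one partial derivative per theta(j)
```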
2.3 Cost function and gradient (regularized)
$$\dfrac{\partial J(\theta)}{\partial \theta_{j}} = \dfrac{1}{m}\sum_{i=1}^{m}\left(h_{\theta}(x^{(i)}) - y^{(i)}\right)x_{j}^{(i)} + \dfrac{\lambda}{m}\theta_{j}$$
for $j \geq 1$.
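For $j = 0$ the penalty term is dropped, since by convention the bias term $\theta_{0}$ is not regularized:

$$\dfrac{\partial J(\theta)}{\partial \theta_{0}} = \dfrac{1}{m}\sum_{i=1}^{m}\left(h_{\theta}(x^{(i)}) - y^{(i)}\right)x_{0}^{(i)}$$

This is why the code below sums over `theta(2:end)` only: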
```matlab
r1 = (lambda / (2 * m)) * sum(theta(2:end) .^ 2);   % regularization term for J; theta(1) is excluded
```
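Putting the pieces together, a sketch of the full regularized cost and gradient, assuming the same variables as above (`r1` is the term just computed, and the gradient implements the two cases of the formula):

```matlab
h = sigmoid(X * theta);                                      % hypothesis for all m examples
J = (-y' * log(h) - (1 - y)' * log(1 - h)) / m + r1;         % regularized cost

grad = (X' * (h - y)) / m;                                   % unregularized gradient (the j = 0 case)
grad(2:end) = grad(2:end) + (lambda / m) * theta(2:end);     % add penalty for j >= 1
```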
Yes, yes, I know I passed. 😄
But God knows what happened. 🤔