Logistic Regression

1.2.1 Warmup exercise: sigmoid function

The sigmoid function is defined as

$$g(z) = \dfrac{1}{1 + e^{-z}}$$

which is a one-liner in Octave (the element-wise division and exp make it work on scalars, vectors, and matrices alike):

g = 1./(1+exp(-z))
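A quick sanity check at the Octave prompt (just an illustration, not part of the graded file): $g(0)$ should be $0.5$, with large positive and negative inputs saturating toward 1 and 0.

z = [-10 0 10];
g = 1 ./ (1 + exp(-z))   % prints roughly 0.0000  0.5000  1.0000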

1.2.2 Cost function and gradient
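The code below is the vectorized form of the (unregularized) cost and gradient from the exercise:

$$J(\theta) = \dfrac{1}{m}\sum_{i=1}^{m}\left[-y^{(i)}\log(h_{\theta}(x^{(i)})) - (1-y^{(i)})\log(1-h_{\theta}(x^{(i)}))\right]$$

$$\dfrac{\partial J(\theta)}{\partial \theta_{j}} = \dfrac{1}{m}\sum_{i=1}^{m}(h_{\theta}(x^{(i)}) - y^{(i)})x_{j}^{(i)}$$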

J = (-y'*log(sigmoid(X*theta)) - (1-y)'*log(1-sigmoid(X*theta)))/m   % vectorized cost
grad = (X'*(sigmoid(X*theta) - y))/m                                 % vectorized gradient

2.3 Cost function and gradient
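This is Part 2 of the exercise, regularized logistic regression. The regularized cost adds a penalty on every parameter except the bias:

$$J(\theta) = \dfrac{1}{m}\sum_{i=1}^{m}\left[-y^{(i)}\log(h_{\theta}(x^{(i)})) - (1-y^{(i)})\log(1-h_{\theta}(x^{(i)}))\right] + \dfrac{\lambda}{2m}\sum_{j=1}^{n}\theta_{j}^{2}$$

and the gradient picks up a matching term for $j \geq 1$: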

$$\dfrac{\partial J(\theta)}{\partial \theta_{j}} = \dfrac{1}{m}\sum_{i=1}^{m}(h_{\theta}(x^{(i)}) - y^{(i)})x_{j}^{(i)} + \dfrac{\lambda}{m}\theta_{j}$$

For $j = 0$, the $\dfrac{\lambda}{m}\theta_{j}$ part is 0: the bias term $\theta_{0}$ is not regularized. The r2 mask in the code below handles exactly that.

r1 = sum(theta(2:end).^2)*lambda/2/m   % penalty term, skipping theta(1) = theta_0
J = (-y'*log(sigmoid(X*theta)) - (1-y)'*log(1-sigmoid(X*theta)))/m + r1

r2 = ones(size(theta))   % mask that zeroes out the regularization on theta_0
r2(1) = 0
grad = (X'*(sigmoid(X*theta) - y))/m + (theta.*r2)*lambda/m

That completes costFunctionReg.m.
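For context, the driver script then minimizes this cost with fminunc. Roughly (a sketch; the exact call lives in the provided ex2_reg.m, and lambda = 1 is just the assignment's default):

% Sketch of the optimization call, not part of costFunctionReg.m itself
initial_theta = zeros(size(X, 2), 1);
lambda = 1;
options = optimset('GradObj', 'on', 'MaxIter', 400);
[theta, J] = fminunc(@(t)(costFunctionReg(t, X, y, lambda)), initial_theta, options);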

Yes, yes, I know I passed. 😄

But God knows what happened there. 🤔
