Machine Learning by Stanford University
This is my record of working through the course while trying to learn machine learning.
I heard about artificial intelligence a year ago, but as an older person who finds it hard to take up new things, I never really studied it. Now it looks like I may lose my job in this AI revolution, so I am making the effort to learn it. Still, I am old and slow to learn, and I have no confidence that I will ever master it.
1 Linear Regression with One Variable
2 Linear Regression with Multiple Variables
ex1
The coding part is my favorite; I like coding.
ComputeCost
You see, the cost function here is

$$J(\theta) = \cfrac{1}{2m}\sum\limits_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)^2$$

(typing that formula out in Markdown nearly melted my brain).
I found that in the actual Octave calculation the predictions are computed as X * theta rather than theta' * x as written in the formula. X stores one training example per row (with the intercept column of ones prepended), so X * theta produces the whole vector of predictions h_\theta(x^{(i)}) in one multiplication.
```octave
J = sum((X * theta - y) .^ 2) / (2 * m);
```
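To see that the one-liner matches the formula, here is a tiny sanity check with made-up numbers (a sketch; these values are not from the course's ex1 data):

```octave
% Toy data that lies exactly on y = 2x, so the right theta gives zero cost.
x = [1; 2; 3];            % one feature, three training examples
y = [2; 4; 6];
m = length(y);
X = [ones(m, 1) x];       % prepend the intercept column: X is m-by-2
theta = [0; 2];           % h_theta(x) = 0 + 2x predicts every example exactly
J = sum((X * theta - y) .^ 2) / (2 * m)   % prints 0
```

Changing theta to [0; 0] makes the same line print roughly 9.33, so the cost really does grow as the fit gets worse.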
GradientDescent
Similarly, here is the gradient descent update:

repeat until convergence: {

$$\theta_j := \theta_j - \alpha\cfrac{1}{m}\sum\limits_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)x_j^{(i)}$$

} (updating every $\theta_j$ simultaneously)
```octave
theta = theta - (alpha / m) * (X' * (X * theta - y));
```
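Here is that update run end to end on the same kind of toy data (a sketch; alpha, the iteration count, and the data are made up, not the ex1 values):

```octave
% Fit y = 2x with batch gradient descent, using the vectorized update above.
x = [1; 2; 3];
y = [2; 4; 6];
m = length(y);
X = [ones(m, 1) x];
theta = zeros(2, 1);      % start from [0; 0]
alpha = 0.1;              % learning rate (made up for this toy problem)
for iter = 1:1000
  theta = theta - (alpha / m) * (X' * (X * theta - y));
end
theta                     % approaches [0; 2], i.e. h_theta(x) = 2x
```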
It took me several hours to understand why X' * (X * theta - y) is used. X is a matrix while theta and y are column vectors, so X * theta - y is the column of errors h_\theta(x^{(i)}) - y^{(i)}; multiplying it by X' computes, for every j at once, \sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)x_j^{(i)}, which is exactly the sum() part of the formula.
Although I passed the test, I still don't fully understand it.
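What finally made it click for me was comparing the vectorized expression against an explicit double loop written straight from the formula (a sketch with made-up numbers):

```octave
X = [1 5; 1 2; 1 3];           % m = 3 examples: intercept column plus one feature
y = [7; 6; 4];
theta = [0.5; 0.5];
errors = X * theta - y;        % column vector of h_theta(x^(i)) - y^(i)

% The sum from the formula, spelled out loop by loop.
grad_loop = zeros(size(theta));
for j = 1:length(theta)
  for i = 1:size(X, 1)
    grad_loop(j) = grad_loop(j) + errors(i) * X(i, j);
  end
end

grad_vec = X' * errors;        % the vectorized version used in the update

grad_loop - grad_vec           % prints [0; 0]: both are the same sum
```

So the j-th entry of X' * (X * theta - y) is exactly the sum over the m examples of the error times x_j^{(i)}; the transpose is just what lines the dimensions up.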
3 Logistic Regression
To Be Continued…
Translated by gpt-3.5-turbo