Machine Learning by Stanford University

These are my notes from working through the course as I try to learn machine learning.

I first heard about artificial intelligence a year ago, but as an older person who finds it hard to pick up new things, I never really studied it. However, it now seems I may lose my job in this AI revolution, so I will make an effort to learn. Still, I am old and slow to learn, and I have no confidence that I will ever master it. 😂😂😂


1 Linear Regression with One Variable

2 Linear Regression with Multiple Variables

ex1

The coding part is my favorite; I like coding 😄

ComputeCost

You see, here is the cost function (it took some effort to write this formula in markdown 🤯):

$$J(\theta) = \frac{1}{2m}\sum\limits_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)^2$$

I found that in the actual Octave calculation it is `X * theta` rather than the `theta' * x` shown in the formula, because each training example is stored as a row of `X`, so multiplying `X` by `theta` computes the hypothesis for all examples at once.

```octave
J = sum(((X * theta) - y).^2) / 2 / m;
```
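To check my understanding, here is the same computation sketched in NumPy (my own translation of the Octave line, not part of the assignment); the vectorized version matches a plain loop over the training examples:

```python
import numpy as np

def compute_cost(X, theta, y):
    """Vectorized cost: J = 1/(2m) * sum((X @ theta - y)^2)."""
    m = len(y)
    residual = X @ theta - y          # h_theta(x^(i)) - y^(i) for all i at once
    return (residual ** 2).sum() / (2 * m)

# Toy data: a column of ones for the intercept term, then one feature.
X = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([1.0, 2.0, 3.0])
theta = np.array([0.0, 0.0])

# Loop version for comparison: sum over the m examples explicitly.
m = len(y)
J_loop = sum((X[i] @ theta - y[i]) ** 2 for i in range(m)) / (2 * m)

print(compute_cost(X, theta, y))  # equals J_loop
```

The point is that `X @ theta` stacks all m hypothesis values into one vector, so no explicit loop is needed.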

GradientDescent

Similarly, here is the gradient descent update:

repeat until convergence: {

$$\theta_j := \theta_j - \alpha\frac{1}{m}\sum\limits_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)x_j^{(i)}$$

}

```octave
theta = theta - alpha/m * (X' * ((X * theta) - y));
```

It took me several hours to understand why `X' * ((X * theta) - y)` is used. It is because `X` is a matrix while `theta` and `y` are vectors, so multiplying by `X'` carries out the `sum()` over training examples from the formula, for every feature j at once.
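To convince myself, here is a small NumPy sketch (my own illustration, not course code) showing that `X.T @ (X @ theta - y)` really equals the per-feature sums from the formula, and that repeating the update drives theta toward the true parameters:

```python
import numpy as np

def gradient_step(X, y, theta, alpha):
    """One vectorized update: theta := theta - alpha/m * X' * (X*theta - y)."""
    m = len(y)
    return theta - (alpha / m) * (X.T @ (X @ theta - y))

X = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([2.0, 3.0, 4.0])   # exactly y = 1 + 1*x
theta = np.zeros(2)

# X.T @ residual is the sum over examples of residual_i * x_j^(i), per feature j:
residual = X @ theta - y
manual = np.array([sum(residual[i] * X[i, j] for i in range(3))
                   for j in range(2)])
print(np.allclose(X.T @ residual, manual))  # True

# Repeated updates converge toward theta = [1, 1] on this toy data.
for _ in range(2000):
    theta = gradient_step(X, y, theta, alpha=0.1)
print(theta)
```

So the matrix product is just a compact way of computing all the per-feature sums in one shot.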

Although I passed the test, I still don’t fully understand it 😡

3 Logistic Regression

To Be Continued…


Translated by gpt-3.5-turbo