Deep Learning Course 5

Wow, I completed the assignments before the deadline.

😃😃😃

Deep Learning Specialization Certificate

Week 1

Building a Recurrent Neural Network - Step by Step

1.1 - RNN Cell

a_next = np.tanh(np.matmul(Waa, a_prev) + np.matmul(Wax, xt) + ba)  # next hidden state
yt_pred = softmax(np.matmul(Wya, a_next) + by)                      # prediction at time step t
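For context, here is a minimal self-contained sketch of the whole cell. The softmax helper and the exact parameter names are assumptions based on the standard layout of the assignment, not the notebook's own scaffolding:

import numpy as np

def softmax(z):
    # numerically stable softmax over the class axis (axis 0)
    e = np.exp(z - np.max(z, axis=0, keepdims=True))
    return e / np.sum(e, axis=0, keepdims=True)

def rnn_cell_forward(xt, a_prev, parameters):
    # xt: (n_x, m) input at step t; a_prev: (n_a, m) previous hidden state
    Waa, Wax, Wya = parameters['Waa'], parameters['Wax'], parameters['Wya']
    ba, by = parameters['ba'], parameters['by']
    a_next = np.tanh(np.matmul(Waa, a_prev) + np.matmul(Wax, xt) + ba)
    yt_pred = softmax(np.matmul(Wya, a_next) + by)
    cache = (a_next, a_prev, xt, parameters)  # kept for the backward pass
    return a_next, yt_pred, cache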

1.2 - RNN Forward Pass

for t in range(T_x):                                                   # loop over the T_x time steps
    xt = x[:,:,t]                                                      # input slice at step t
    a_next, yt_pred, cache = rnn_cell_forward(xt, a_next, parameters)  # one cell step
    a[:,:,t] = a_next                                                  # store hidden state
    y_pred[:,:,t] = yt_pred                                            # store prediction
    caches.append(cache)                                               # keep cache for backprop
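As a quick smoke test of the loop above, using the rnn_cell_forward sketch from section 1.1 (all dimensions here are arbitrary toy values, not the assignment's):

np.random.seed(1)
n_x, n_a, n_y, m, T_x = 3, 5, 2, 10, 4
x = np.random.randn(n_x, m, T_x)
a0 = np.random.randn(n_a, m)
parameters = {
    'Waa': np.random.randn(n_a, n_a), 'Wax': np.random.randn(n_a, n_x),
    'Wya': np.random.randn(n_y, n_a),
    'ba': np.zeros((n_a, 1)), 'by': np.zeros((n_y, 1)),
}
a = np.zeros((n_a, m, T_x))
y_pred = np.zeros((n_y, m, T_x))
caches = []
a_next = a0
for t in range(T_x):
    a_next, yt_pred, cache = rnn_cell_forward(x[:,:,t], a_next, parameters)
    a[:,:,t] = a_next
    y_pred[:,:,t] = yt_pred
    caches.append(cache)
print(a.shape, y_pred.shape)  # (5, 10, 4) (2, 10, 4)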

2 - Long Short-Term Memory (LSTM) Network

2.1 - LSTM Cell

concat = np.concatenate((a_prev, xt), axis=0) # stack [a_prev; xt] along the feature axis

ft = sigmoid(np.matmul(Wf, concat) + bf) # forget gate
it = sigmoid(np.matmul(Wi, concat) + bi) # update gate
cct = np.tanh(np.matmul(Wc, concat) + bc) # candidate value
c_next = ft*c_prev + it*cct # cell state
ot = sigmoid(np.matmul(Wo, concat) + bo) # output gate
a_next = ot * np.tanh(c_next) # hidden state

yt_pred = softmax(np.matmul(Wy, a_next) + by) # prediction at time step t
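One detail worth calling out: stacking a_prev on top of xt along axis 0 lets each gate use a single weight matrix of shape (n_a, n_a + n_x) instead of two separate matrices. A small check of that equivalence (shapes here are arbitrary):

import numpy as np

np.random.seed(0)
n_a, n_x, m = 5, 3, 10
a_prev, xt = np.random.randn(n_a, m), np.random.randn(n_x, m)
Wf = np.random.randn(n_a, n_a + n_x)

concat = np.concatenate((a_prev, xt), axis=0)  # shape (n_a + n_x, m)
lhs = np.matmul(Wf, concat)                    # single fused multiply
rhs = np.matmul(Wf[:, :n_a], a_prev) + np.matmul(Wf[:, n_a:], xt)
print(np.allclose(lhs, rhs))  # True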

2.2 - Forward Pass for LSTM

Wy = parameters['Wy']
n_x, m, T_x = x.shape          # input dims: features, batch, time steps
n_y, n_a = Wy.shape            # output and hidden dims

a = np.zeros((n_a, m, T_x))    # hidden states
c = np.zeros((n_a, m, T_x))    # cell states
y = np.zeros((n_y, m, T_x))    # predictions
caches = []

a_next = a0
c_next = c[:,:,0]              # initial cell state of zeros

for t in range(T_x):
    xt = x[:,:,t]
    a_next, c_next, yt, cache = lstm_cell_forward(xt, a_next, c_next, parameters)
    a[:,:,t] = a_next
    c[:,:,t] = c_next
    y[:,:,t] = yt
    caches.append(cache)
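For completeness, a hedged usage sketch: assuming the cell and the loop above are wrapped into functions lstm_cell_forward(xt, a_prev, c_prev, parameters) and lstm_forward(x, a0, parameters) returning (a, y, c, caches) as in the assignment, the parameter shapes and a toy call would look like this (all dimensions are arbitrary):

np.random.seed(2)
n_x, n_a, n_y, m, T_x = 3, 5, 2, 10, 7
parameters = {'Wy': np.random.randn(n_y, n_a), 'by': np.zeros((n_y, 1))}
for W in ('Wf', 'Wi', 'Wc', 'Wo'):
    parameters[W] = np.random.randn(n_a, n_a + n_x)  # each gate acts on the stacked [a_prev; xt]
for b in ('bf', 'bi', 'bc', 'bo'):
    parameters[b] = np.zeros((n_a, 1))

x = np.random.randn(n_x, m, T_x)
a0 = np.zeros((n_a, m))
a, y, c, caches = lstm_forward(x, a0, parameters)  # hypothetical wrapper around the loop above
print(a.shape, y.shape, c.shape)  # (5, 10, 7) (2, 10, 7) (5, 10, 7)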

…

There is too much code to write it all up here, and I didn't have time to record the rest since I had already received the certificate.
