Deep Learning Course 5

Wow, I finished them all before the deadline.

😃😃😃

Deep Learning Specialization Certificate

Week 1

Building a Recurrent Neural Network - Step by Step

1.1 - RNN cell

a_next = np.tanh(np.matmul(Waa, a_prev) + np.matmul(Wax, xt) + ba)  # new hidden state

yt_pred = softmax(np.matmul(Wya, a_next) + by)  # prediction at the current time step
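These two lines are the graded core of the assignment's rnn_cell_forward. For reference, a minimal self-contained sketch of the whole cell; the softmax helper here is my own stand-in for the one the notebook provides:

import numpy as np

def softmax(z):
    # numerically stable softmax over the feature axis
    # (stand-in for the helper shipped with the notebook)
    e = np.exp(z - np.max(z, axis=0, keepdims=True))
    return e / np.sum(e, axis=0, keepdims=True)

def rnn_cell_forward(xt, a_prev, parameters):
    # single forward step of a vanilla RNN cell
    Waa, Wax = parameters["Waa"], parameters["Wax"]
    Wya, ba, by = parameters["Wya"], parameters["ba"], parameters["by"]
    a_next = np.tanh(np.matmul(Waa, a_prev) + np.matmul(Wax, xt) + ba)
    yt_pred = softmax(np.matmul(Wya, a_next) + by)
    cache = (a_next, a_prev, xt, parameters)  # stored for backpropagation
    return a_next, yt_pred, cache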

1.2 - RNN forward pass

for t in range(T_x):
    xt = x[:,:,t]  # slice out the input for time step t
    a_next, yt_pred, cache = rnn_cell_forward(xt, a_next, parameters)
    a[:,:,t] = a_next        # store the hidden state
    y_pred[:,:,t] = yt_pred  # store the prediction
    caches.append(cache)     # keep the cache for the backward pass
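The snippet above is just the loop body; here is a sketch of the enclosing rnn_forward, mirroring the structure of the LSTM forward pass shown below (the exact scaffold in the notebook may differ slightly):

def rnn_forward(x, a0, parameters):
    # run the RNN cell over all T_x time steps
    caches = []
    n_x, m, T_x = x.shape
    n_y, n_a = parameters["Wya"].shape
    a = np.zeros((n_a, m, T_x))       # hidden states for every step
    y_pred = np.zeros((n_y, m, T_x))  # predictions for every step
    a_next = a0
    for t in range(T_x):
        xt = x[:,:,t]
        a_next, yt_pred, cache = rnn_cell_forward(xt, a_next, parameters)
        a[:,:,t] = a_next
        y_pred[:,:,t] = yt_pred
        caches.append(cache)
    return a, y_pred, caches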

2 - Long Short-Term Memory (LSTM) network

2.1 - LSTM cell

concat = np.concatenate((a_prev, xt), axis=0)  # stack hidden state on top of input

ft = sigmoid(np.matmul(Wf, concat) + bf)   # forget gate
it = sigmoid(np.matmul(Wi, concat) + bi)   # update gate
cct = np.tanh(np.matmul(Wc, concat) + bc)  # candidate value
c_next = ft * c_prev + it * cct            # new cell state
ot = sigmoid(np.matmul(Wo, concat) + bo)   # output gate
a_next = ot * np.tanh(c_next)              # new hidden state

yt_pred = softmax(np.matmul(Wy, a_next) + by)  # prediction at the current time step
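Stacking a_prev and xt into one concat vector is the key design trick here: each gate then needs only a single weight matrix of shape (n_a, n_a + n_x) instead of separate matrices for the hidden state and the input. The gates also rely on a sigmoid helper that the notebook supplies; a minimal stand-in if you want to run the cell on its own:

def sigmoid(z):
    # elementwise logistic function; squashes gate pre-activations into (0, 1)
    # (stand-in for the helper shipped with the notebook)
    return 1.0 / (1.0 + np.exp(-z))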

2.2 - Forward pass for LSTM

caches = []
n_x, m, T_x = x.shape
n_y, n_a = parameters['Wy'].shape

# initialize hidden states, cell states, and predictions for every step
a = np.zeros((n_a, m, T_x))
c = np.zeros((n_a, m, T_x))
y = np.zeros((n_y, m, T_x))

a_next = a0        # initial hidden state
c_next = c[:,:,0]  # initial cell state (all zeros)

for t in range(T_x):
    xt = x[:,:,t]  # input at time step t
    a_next, c_next, yt, cache = lstm_cell_forward(xt, a_next, c_next, parameters)
    a[:,:,t] = a_next
    c[:,:,t] = c_next
    y[:,:,t] = yt
    caches.append(cache)
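A quick shape check with hypothetical toy dimensions (the sizes and random parameters below are illustrative, not from the assignment; this assumes lstm_forward returns a, y, c, caches as in the notebook scaffold):

import numpy as np

np.random.seed(1)
n_x, n_a, n_y, m, T_x = 3, 5, 2, 10, 7  # toy dimensions, chosen for illustration
x = np.random.randn(n_x, m, T_x)
a0 = np.random.randn(n_a, m)
parameters = {
    "Wf": np.random.randn(n_a, n_a + n_x), "bf": np.zeros((n_a, 1)),
    "Wi": np.random.randn(n_a, n_a + n_x), "bi": np.zeros((n_a, 1)),
    "Wc": np.random.randn(n_a, n_a + n_x), "bc": np.zeros((n_a, 1)),
    "Wo": np.random.randn(n_a, n_a + n_x), "bo": np.zeros((n_a, 1)),
    "Wy": np.random.randn(n_y, n_a), "by": np.zeros((n_y, 1)),
}
a, y, c, caches = lstm_forward(x, a0, parameters)
print(a.shape, y.shape, c.shape)  # (5, 10, 7) (2, 10, 7) (5, 10, 7)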

There is too much code to write up in full, and since I have already earned the certificate, I will leave it at these highlights.