Hi,
eqn (10): Z = WX + B
eqn (11): Ŷ = g(Z)
Here the dimensions of Ŷ are 1×12288. In this way, we can write the same equations in vectorized form, equivalent to the equations we derived in the previous posts.
eqn (12): J = np.sum(-Ylog(Ŷ) - (1 - Y)log(1 - Ŷ))/m
(Here np.sum means adding up all the elements of the matrix.)
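To see what np.sum does on a matrix, here is a quick check (the 2×2 matrix is just a toy example of my own):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# np.sum adds every element of the matrix into a single scalar
total = np.sum(A)
print(total)  # 10.0
```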
eqn (13): ∂W = (Ŷ - Y)Xᵀ/m
eqn (14): ∂B = np.sum(Ŷ - Y)/m
eqn (15): W = W - α∂W
eqn (16): B = B - α∂B
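Putting eqns (10)–(16) together, one full loop of gradient descent can be sketched in NumPy as below. The shapes (n features, m examples), the random sample data, the sigmoid as g, and the iteration count are my own assumptions for illustration, not values from this post:

```python
import numpy as np

def sigmoid(z):
    # g(z) = 1 / (1 + e^(-z)), the activation used in eqn (11)
    return 1.0 / (1.0 + np.exp(-z))

# Assumed toy shapes: n features, m training examples
n, m = 4, 5
rng = np.random.default_rng(0)
X = rng.standard_normal((n, m))      # inputs, one column per example
Y = rng.integers(0, 2, size=(1, m))  # labels, 0 or 1
W = np.zeros((1, n))                 # weights
B = 0.0                              # bias
alpha = 0.1                          # learning rate

for _ in range(100):
    Z = W @ X + B                    # eqn (10)
    Y_hat = sigmoid(Z)               # eqn (11)
    # eqn (12): cross-entropy cost averaged over the m examples
    J = np.sum(-Y * np.log(Y_hat) - (1 - Y) * np.log(1 - Y_hat)) / m
    dW = (Y_hat - Y) @ X.T / m       # eqn (13)
    dB = np.sum(Y_hat - Y) / m       # eqn (14)
    W = W - alpha * dW               # eqn (15)
    B = B - alpha * dB               # eqn (16)
```

Starting from W = 0 and B = 0, the initial cost is -log(0.5) ≈ 0.693, and each update should push J below that.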
In the next post, I will explain what the learning rate is and its effect on the neural network.
Please feel free to raise any concerns or suggestions about this blog post. Let's meet in the next post.
My previous blogs:
How did I learn Machine Learning : part 1 - Create the coding environment
How did I learn Machine Learning : part 2 - Setup conda environment in PyCharm
How did I learn Machine Learning : part 3 - Implement a simple neural network from scratch I
How did I learn Machine Learning : part 3 - Implement a simple neural network from scratch II
How did I learn Machine Learning : part 3 - Implement a simple neural network from scratch III
How did I learn Machine Learning : part 3 - Implement a simple neural network from scratch IV
References:
Coursera