Lec06 Derivatives
• In particular, we first compute the derivative with respect to v. That then
becomes useful for computing the derivatives with respect to a and u. The
derivative with respect to u, in turn, becomes useful for computing the
derivatives with respect to b and c.
Exercise: Compute
1. dJ/db?
2. dJ/dc?
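The right-to-left pass described above can be sketched in code. This is a minimal sketch that assumes the usual worked example for this graph, J = 3(a + bc) with u = bc and v = a + u; the exact formula is an assumption here, though the variable names a, b, c, u, v, and J follow the text:

```python
# Computation graph example (assumed): u = b*c, v = a + u, J = 3*v
def forward(a, b, c):
    u = b * c
    v = a + u
    J = 3 * v
    return u, v, J

def backward(a, b, c):
    # Right-to-left pass: start from dJ/dv and reuse it downstream.
    dJ_dv = 3.0              # J = 3v
    dJ_da = dJ_dv * 1.0      # v = a + u
    dJ_du = dJ_dv * 1.0
    dJ_db = dJ_du * c        # u = b*c, so dJ/db = dJ/du * c
    dJ_dc = dJ_du * b        # and dJ/dc = dJ/du * b
    return dJ_da, dJ_db, dJ_dc

# Numerical check of dJ/db at (a, b, c) = (5, 3, 2):
eps = 1e-6
_, _, J1 = forward(5, 3 + eps, 2)
_, _, J0 = forward(5, 3, 2)
print((J1 - J0) / eps)       # approximately 6.0, matching dJ/db = 3*c
```

Bumping b by a small eps and re-running the forward pass gives a finite-difference estimate that agrees with the chain-rule answer.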
Practical Example
Derivatives with a Computation Graph
We have discussed the computation graph and how it involves
1. a forward or left to right calculation to compute the cost function such as J
that you might want to optimize, and
2. a backwards or a right to left calculation to compute derivatives.
These concepts will be applied to compute derivatives of the logistic
regression model.
Logistic Regression Gradient Descent
Recap: We had set up logistic regression as follows:
• Let's write this out as a computation graph; for this example, suppose we
have only two features, X1 and X2.
So, compute these derivatives on each of the training examples, as shown, and
average them to obtain the overall gradient used to implement gradient descent.
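The single-example pass recapped above can be sketched as follows. This is a sketch under the standard logistic regression setup (sigmoid activation, cross-entropy loss); the function name is hypothetical, and naming dJ/dz simply `dz`, dJ/dw1 `dw1`, and so on follows the course's convention:

```python
import math

def grad_single_example(w1, w2, b, x1, x2, y):
    # Forward (left-to-right): z -> a -> loss
    z = w1 * x1 + w2 * x2 + b
    a = 1.0 / (1.0 + math.exp(-z))                      # sigmoid
    loss = -(y * math.log(a) + (1 - y) * math.log(1 - a))
    # Backward (right-to-left) through the graph
    dz = a - y          # dL/dz for sigmoid + cross-entropy simplifies to a - y
    dw1 = x1 * dz       # dL/dw1
    dw2 = x2 * dz       # dL/dw2
    db = dz             # dL/db
    return loss, dw1, dw2, db
```

Note that the whole backward pass reduces to computing `dz = a - y` once and scaling it by each input.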
Logistic Regression Gradient Descent on m Examples
What we're going to do is use a for loop over the training set:
• Compute the derivatives with respect to each training example, and then add
them up.
After these computations, dw1 is the derivative of just the cost function J
w.r.t. w1.

So, with all of these calculations, you've computed the derivatives of the
cost function J with respect to each of your parameters w_1, w_2, and b. We're
using dw_1, dw_2, and db as accumulators, so that after this computation, dw_1
is equal to the derivative of your overall cost function with respect to w_1,
and similarly for dw_2 and db.

Finally, to implement one step of gradient descent, update the learnable
parameters as:

w_1 := w_1 − α · dw_1
w_2 := w_2 − α · dw_2
b := b − α · db
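Putting the loop, the accumulators, and the update together gives the for-loop version described above. A minimal sketch, assuming two features per example and a learning rate `alpha` (both the function name and the dataset layout are assumptions):

```python
import math

def gradient_descent_step(w1, w2, b, X, Y, alpha=0.1):
    m = len(Y)
    J = dw1 = dw2 = db = 0.0        # accumulators
    for i in range(m):
        x1, x2 = X[i]
        # Forward pass for example i
        z = w1 * x1 + w2 * x2 + b
        a = 1.0 / (1.0 + math.exp(-z))
        J += -(Y[i] * math.log(a) + (1 - Y[i]) * math.log(1 - a))
        # Backward pass: accumulate the per-example derivatives
        dz = a - Y[i]
        dw1 += x1 * dz
        dw2 += x2 * dz
        db += dz
    # Average over the m examples
    J /= m; dw1 /= m; dw2 /= m; db /= m
    # One step of gradient descent
    w1 -= alpha * dw1
    w2 -= alpha * dw2
    b -= alpha * db
    return w1, w2, b, J
```

Each call performs one full pass over the training set and one parameter update; the returned J is the cost at the parameters passed in, so it should decrease across successive calls.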