Recap
Logistic Regression:
$\hat{y}=\sigma ( w^{T}x + b )$, $\sigma(z) = \dfrac{1}{1+e^{-z}}$
Want to find $w,b$ that minimize $J(w,b)$
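Here $J$ is presumably the standard cross-entropy cost averaged over $m$ training examples (the notes reference $J(w,b)$ without defining it):

$$J(w,b) = -\dfrac{1}{m}\sum_{i=1}^{m}\left[ y^{(i)}\log \hat{y}^{(i)} + \left(1-y^{(i)}\right)\log\left(1-\hat{y}^{(i)}\right) \right]$$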
Gradient Descent
Suppose $\alpha$ is the learning rate; then repeat until convergence (updating $w$ and $b$ simultaneously):
$w = w - \alpha \dfrac{\partial J(w,b)}{\partial w}$
$b = b - \alpha \dfrac{\partial J(w,b)}{\partial b}$
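As a concrete illustration, here is a minimal NumPy sketch of this update loop, assuming the cross-entropy cost above; the names `sigmoid` and `gradient_descent` and the default `alpha`/`num_iters` values are illustrative, not from the notes.

```python
import numpy as np

def sigmoid(z):
    """Logistic function sigma(z) = 1 / (1 + e^{-z})."""
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent(X, y, alpha=0.1, num_iters=1000):
    """Fit logistic regression by gradient descent.

    X: (m, n) feature matrix; y: (m,) array of 0/1 labels.
    Returns the learned weights w (n,) and bias b (scalar).
    """
    m, n = X.shape
    w = np.zeros(n)
    b = 0.0
    for _ in range(num_iters):
        y_hat = sigmoid(X @ w + b)       # predictions, shape (m,)
        dw = X.T @ (y_hat - y) / m       # dJ/dw for the cross-entropy cost
        db = np.mean(y_hat - y)          # dJ/db
        w -= alpha * dw                  # simultaneous updates of w and b
        b -= alpha * db
    return w, b
```

The vectorized gradients $\frac{\partial J}{\partial w} = \frac{1}{m} X^{T}(\hat{y}-y)$ and $\frac{\partial J}{\partial b} = \frac{1}{m}\sum_i (\hat{y}^{(i)}-y^{(i)})$ follow from differentiating the cross-entropy cost through the sigmoid.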