Logistic Regression

1. Sigmoid Function

The formula for the sigmoid function is:

$$ g(z) = \frac{1}{1+e^{-z}} \tag{1}$$

In the case of logistic regression, $z$ (the input to the sigmoid function) is the output of a linear regression model, $z = \mathbf{w} \cdot \mathbf{x} + b$.

NumPy has a function called exp(), which offers a convenient way to calculate the exponential ( $e^{z}$) of all elements in the input array (z).
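A minimal sketch of the sigmoid implementation in ii), built on np.exp() (the function name `sigmoid` and the handling of both scalar and array inputs are assumptions, not taken from the original notebook):

```python
import numpy as np

def sigmoid(z):
    """Compute the sigmoid of z, element-wise.

    Args:
        z: scalar or ndarray of any shape.
    Returns:
        g(z) = 1 / (1 + e^(-z)), same shape as z.
    """
    return 1.0 / (1.0 + np.exp(-z))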

i) Plot Exponential Function np.exp(x)

ii) Sigmoid Function Implementation

iii) Plot Sigmoid Function
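The plots in i) and iii) can be reproduced with a short matplotlib sketch along these lines (the value range of `z` and the labels are illustrative assumptions; `sigmoid` refers to the sketch above):

```python
import numpy as np
import matplotlib.pyplot as plt

z = np.linspace(-10, 10, 200)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Exponential function
ax1.plot(z, np.exp(z))
ax1.set_title("np.exp(z)")
ax1.set_xlabel("z")

# Sigmoid function
ax2.plot(z, sigmoid(z))
ax2.set_title("sigmoid(z)")
ax2.set_xlabel("z")

plt.show()
```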

2. Cost Function for Logistic Regression

i) Dataset

ii) Define Cost Function

iii) Check Cost
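For reference, the cost defined in ii) is the standard logistic (binary cross-entropy) cost

$$J(\mathbf{w},b) = -\frac{1}{m} \sum\limits_{i = 0}^{m-1} \left[ y^{(i)} \log\left(f_{\mathbf{w},b}(\mathbf{x}^{(i)})\right) + \left(1 - y^{(i)}\right) \log\left(1 - f_{\mathbf{w},b}(\mathbf{x}^{(i)})\right) \right]$$

where $f_{\mathbf{w},b}(\mathbf{x}^{(i)}) = g(\mathbf{w} \cdot \mathbf{x}^{(i)} + b)$. A minimal NumPy sketch (the function name `compute_cost_logistic` and the vectorized formulation are assumptions; the original notebook may loop over examples instead):

```python
import numpy as np

def compute_cost_logistic(X, y, w, b):
    """Logistic (cross-entropy) cost.

    Args:
        X: (m, n) array of m examples with n features.
        y: (m,) array of 0/1 targets.
        w: (n,) array of weights.
        b: scalar bias.
    Returns:
        Scalar cost J(w, b).
    """
    m = X.shape[0]
    z = X @ w + b                  # linear model output
    f = 1.0 / (1.0 + np.exp(-z))   # sigmoid of z
    cost = -(y * np.log(f) + (1 - y) * np.log(1 - f))
    return cost.sum() / m
```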

3. Gradient Descent for Logistic Regression

Recall the gradient descent algorithm utilizes the gradient calculation: $$\begin{align*} &\text{repeat until convergence:} \; \lbrace \\ & \; \; \;w_j = w_j - \alpha \frac{\partial J(\mathbf{w},b)}{\partial w_j} \tag{1} \; & \text{for j := 0..n-1} \\ & \; \; \; \; \;b = b - \alpha \frac{\partial J(\mathbf{w},b)}{\partial b} \\ &\rbrace \end{align*}$$

Where each iteration performs simultaneous updates on $w_j$ for all $j$, where $$\begin{align*} \frac{\partial J(\mathbf{w},b)}{\partial w_j} &= \frac{1}{m} \sum\limits_{i = 0}^{m-1} (f_{\mathbf{w},b}(\mathbf{x}^{(i)}) - y^{(i)})x_{j}^{(i)} \tag{2} \\ \frac{\partial J(\mathbf{w},b)}{\partial b} &= \frac{1}{m} \sum\limits_{i = 0}^{m-1} (f_{\mathbf{w},b}(\mathbf{x}^{(i)}) - y^{(i)}) \tag{3} \end{align*}$$
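A minimal NumPy sketch of these two partial derivatives (the function name `compute_gradient_logistic` and the vectorized formulation are assumptions; the original notebook may compute the sums with explicit loops):

```python
import numpy as np

def compute_gradient_logistic(X, y, w, b):
    """Gradients of the logistic cost, equations (2) and (3).

    Args:
        X: (m, n) array of m examples with n features.
        y: (m,) array of 0/1 targets.
        w: (n,) array of weights.
        b: scalar bias.
    Returns:
        dj_dw: (n,) gradient of the cost with respect to w.
        dj_db: scalar gradient of the cost with respect to b.
    """
    m = X.shape[0]
    f = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # model predictions f_{w,b}(x)
    err = f - y                             # prediction error
    dj_dw = (X.T @ err) / m                 # equation (2)
    dj_db = err.sum() / m                   # equation (3)
    return dj_dw, dj_db
```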

i) Compute Gradient

Expected output:

dj_db: 0.49861806546328574
dj_dw: [0.498333393278696, 0.49883942983996693]

ii) Gradient Descent Code

iii) Check Gradient Descent
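The update loop in ii) can be sketched as follows (a minimal version of the repeat-until-convergence algorithm above; the learning rate `alpha`, the fixed iteration count, and the helper names `compute_gradient_logistic` / `compute_cost_logistic` refer to the sketches above and are assumptions):

```python
def gradient_descent(X, y, w_init, b_init, alpha, num_iters):
    """Run batch gradient descent for logistic regression.

    Performs the simultaneous updates
        w_j := w_j - alpha * dJ/dw_j
        b   := b   - alpha * dJ/db
    for num_iters iterations.
    """
    w = w_init.copy()
    b = b_init
    cost_history = []
    for _ in range(num_iters):
        dj_dw, dj_db = compute_gradient_logistic(X, y, w, b)
        w = w - alpha * dj_dw   # simultaneous update of all w_j
        b = b - alpha * dj_db
        cost_history.append(compute_cost_logistic(X, y, w, b))
    return w, b, cost_history
```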

4. Logistic Regression using Scikit-Learn
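A minimal scikit-learn sketch of the same workflow (the toy dataset below is illustrative only, not the notebook's data):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative toy dataset: two features, binary labels.
X = np.array([[0.5, 1.5], [1.0, 1.0], [1.5, 0.5],
              [3.0, 0.5], [2.0, 2.0], [1.0, 2.5]])
y = np.array([0, 0, 0, 1, 1, 1])

model = LogisticRegression()
model.fit(X, y)

print("Predictions:", model.predict(X))
print("Training accuracy:", model.score(X, y))
```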
