Understanding Gradient Descent

With video explanation | Data Series | Episode 4.2

Mazen Ahmed
6 min read · Jul 26, 2020

This article expands on episode 4.1, explaining gradient descent and how it is used to minimise the cost function in linear regression. Knowledge of derivatives and partial derivatives will be helpful.

Linear Regression Recap

In the previous episode we calculated the regression line for our humidity and temperature data to be:

We obtained these parameter values from the cost function graph shown below.

The algorithm we use in order to obtain the parameter values that give this minimum cost is called gradient descent.
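The cost function image from the previous episode isn't reproduced here, but for reference, the standard mean-squared-error cost for a line h(x) = θ₀ + θ₁x (the form used throughout this series) is:

```latex
J(\theta_0, \theta_1) = \frac{1}{2m} \sum_{i=1}^{m} \left( \theta_0 + \theta_1 x_i - y_i \right)^2
```

Gradient descent searches for the values of θ₀ and θ₁ that minimise J.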

Overview

The idea of gradient descent is that we start at a random point on our cost function graph, for example here:

We then use the partial derivatives of the cost function to make our way down to the minimum.
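The idea above can be sketched in code. This is a minimal illustration, not the article's actual implementation: the humidity/temperature numbers below are made up, and the learning rate and iteration count are arbitrary choices. At each step we compute the partial derivatives of the cost with respect to θ₀ and θ₁ and move both parameters a small step in the downhill direction.

```python
import numpy as np

# Hypothetical humidity (x) and temperature (y) data; the article's
# real dataset is not shown here.
x = np.array([30.0, 45.0, 60.0, 75.0, 90.0])
y = np.array([25.0, 22.0, 19.0, 16.0, 13.0])

def gradient_descent(x, y, lr=0.0004, epochs=200_000):
    """Fit y ≈ theta0 + theta1 * x by minimising the mean squared error."""
    theta0, theta1 = 0.0, 0.0            # start at an arbitrary point
    m = len(x)
    for _ in range(epochs):
        error = theta0 + theta1 * x - y  # prediction minus target
        # Partial derivatives of the cost J with respect to each parameter
        grad0 = error.sum() / m
        grad1 = (error * x).sum() / m
        theta0 -= lr * grad0             # step downhill
        theta1 -= lr * grad1
    return theta0, theta1

theta0, theta1 = gradient_descent(x, y)
print(theta0, theta1)
```

Because the toy data lie exactly on the line y = 31 − 0.2x, the parameters converge towards θ₀ ≈ 31 and θ₁ ≈ −0.2. Note that the learning rate matters: too large and the steps overshoot and diverge, too small and convergence is very slow.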
