
ft_linear_regression

This project is a first step into AI and Machine Learning.
We start with a simple, basic machine learning algorithm.
We have to create a program that predicts the price of a car using a linear function trained with a gradient descent algorithm.

Estimated price of a car with 100000 km:


Introduction

Linear Regression is a supervised machine learning algorithm where the predicted output is continuous and has a constant slope. It is used to predict values within a continuous range (e.g. sales, price) rather than to classify them into categories (e.g. cat, dog). There are two main types:

  • Simple regression: Simple linear regression uses the traditional slope-intercept form, where w and b are the variables our algorithm will try to “learn” to produce the most accurate predictions. x represents our input data and y represents our prediction. $$y=wx+b$$

  • Multivariable regression: A more complex, multivariable linear equation might look like this, where the $w$ terms are the coefficients, or weights, our model will try to learn (see the sketch just after this list). $$f(x,y,z)= w_1x + w_2y + w_3z$$
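
A minimal sketch of the multivariable form in Python, assuming NumPy is available; the function and variable names below are illustrative, not taken from the project:

```python
import numpy as np

def predict_multi(weights: np.ndarray, features: np.ndarray) -> float:
    """Multivariable hypothesis: f(x) = w1*x1 + w2*x2 + ... + wn*xn (no bias term here)."""
    return float(np.dot(weights, features))

# Example with three features, matching f(x, y, z) = w1*x + w2*y + w3*z.
print(predict_multi(np.array([0.5, -0.2, 0.1]), np.array([1.0, 2.0, 3.0])))
```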

Making predictions

Our prediction function outputs an estimate of price given the mileage of the car and our current values for Weight and Bias. $$Price = Weight \times Mileage + Bias$$

Weight: the coefficient, or slope of the line.

Mileage: the independent variable. We call these input variables features.

Bias: the intercept, where our line crosses the y-axis.
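
A minimal sketch of this prediction function in Python; the function name `estimate_price` and its parameters are illustrative, not taken from the project sources:

```python
def estimate_price(mileage: float, weight: float, bias: float) -> float:
    """Estimate a car's price from its mileage: price = weight * mileage + bias."""
    return weight * mileage + bias

# Before training, weight and bias are typically 0, so every estimate is 0.
print(estimate_price(100000, weight=0.0, bias=0.0))
```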

Loss Function: Mean Squared Error (MSE)

For the linear regression model, the predicted value is the weight multiplied by the input $x$, plus a bias term: $$\hat{y}=wx+b$$

We will use the mean squared error (MSE) function as our cost function; it is calculated as follows:

$$MSE = \frac{1}{n} \sum_{i=1}^{n} (y_i - \hat{y}_i)^2$$

$$MSE = \frac{1}{n} \sum_{i=1}^{n} (y_i - (wx_i + b))^2$$
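
A small sketch of the cost computation, assuming the dataset is held in two plain Python lists of mileages and prices (names illustrative):

```python
def mean_squared_error(mileages, prices, weight, bias):
    """MSE = (1/n) * sum((y_i - (w * x_i + b))^2) over the n data points."""
    n = len(mileages)
    return sum((y - (weight * x + bias)) ** 2 for x, y in zip(mileages, prices)) / n
```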

Loss Function in Terms of w

$$ \frac{\partial}{\partial w}(MSE) = \frac{\partial}{\partial w} \left[ \frac{1}{n} \sum_{i=1}^{n} (y_i - (wx_i + b))^2 \right] $$

$$ \hspace{1.75cm} = \frac{1}{n} \sum_{i=1}^{n} \frac{\partial}{\partial w} \left[ (y_i - (wx_i + b))^2 \right] $$

$$ \hspace{1.75cm} = \frac{2}{n} \sum_{i=1}^{n} (y_i - (wx_i + b)) \frac{\partial}{\partial w} \left[ y_i - (wx_i + b) \right] $$

$$ \hspace{1.75cm} = \frac{2}{n} \sum_{i=1}^{n} (y_i - (wx_i + b)) (-x_i)$$

$$ \hspace{1.75cm} = -\frac{2}{n} \sum_{i=1}^{n} x_i(y_i - (wx_i + b))$$
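
A quick, purely illustrative sanity check: the analytic derivative above can be compared with a finite-difference approximation of the same quantity (all names below are hypothetical):

```python
def dmse_dw(mileages, prices, w, b):
    """Analytic partial derivative of the MSE with respect to w."""
    n = len(mileages)
    return -2.0 / n * sum(x * (y - (w * x + b)) for x, y in zip(mileages, prices))

def numerical_dw(mileages, prices, w, b, eps=1e-6):
    """Central finite-difference approximation of the same derivative."""
    def mse(w_):
        return sum((y - (w_ * x + b)) ** 2 for x, y in zip(mileages, prices)) / len(mileages)
    return (mse(w + eps) - mse(w - eps)) / (2 * eps)

# The two values should agree closely on any small example.
xs, ys = [1.0, 2.0, 3.0], [2.0, 4.0, 6.0]
print(dmse_dw(xs, ys, 0.5, 0.0), numerical_dw(xs, ys, 0.5, 0.0))
```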

Loss Function in Terms of b

$$ \frac{\partial}{\partial b}(MSE) = \frac{\partial}{\partial b} \left[ \frac{1}{n} \sum_{i=1}^{n} (y_i - (wx_i + b))^2 \right] $$

$$ \hspace{1.75cm} = \frac{1}{n} \sum_{i=1}^{n} \frac{\partial}{\partial b} \left[ (y_i - (wx_i + b))^2 \right] $$

$$ \hspace{1.75cm} = \frac{2}{n} \sum_{i=1}^{n} (y_i - (wx_i + b)) \frac{\partial}{\partial b} \left[ y_i - (wx_i + b) \right] $$

$$ \hspace{1.75cm} = \frac{2}{n} \sum_{i=1}^{n} (y_i - (wx_i + b)) (-1)$$

$$ \hspace{1.75cm} = -\frac{2}{n} \sum_{i=1}^{n} (y_i - (wx_i + b))$$
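
Putting the two partial derivatives together, one possible training loop is sketched below. This is plain batch gradient descent on the MSE, not necessarily the exact update formulation required by the project; the learning rate, iteration count, and normalization step are illustrative choices:

```python
def fit(mileages, prices, learning_rate=0.1, iterations=10000):
    """Batch gradient descent on MSE for the model y = w * x + b.

    Uses the derivatives derived above:
      dMSE/dw = -(2/n) * sum(x_i * (y_i - (w * x_i + b)))
      dMSE/db = -(2/n) * sum(y_i - (w * x_i + b))
    """
    n = len(mileages)
    w, b = 0.0, 0.0
    for _ in range(iterations):
        errors = [y - (w * x + b) for x, y in zip(mileages, prices)]
        dw = -2.0 / n * sum(x * e for x, e in zip(mileages, errors))
        db = -2.0 / n * sum(errors)
        # Step against the gradient to reduce the loss.
        w -= learning_rate * dw
        b -= learning_rate * db
    return w, b

# Usage sketch with made-up data: mileages are normalized so the steps stay stable.
kms = [22000.0, 60000.0, 150000.0, 240000.0]
prices = [7500.0, 6000.0, 4500.0, 3500.0]
max_km = max(kms)
w_scaled, b = fit([x / max_km for x in kms], prices)
print(w_scaled / max_km, b)  # weight rescaled back to "per km", plus the bias
```

Normalizing the mileages matters here: with raw values on the order of 10^5 km, the same learning rate would make the updates diverge, so the feature is divided by its maximum before training and the learned weight is rescaled afterwards.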

