Simple Linear Regression
• Regression is a statistical and machine learning technique used for modelling the
relationship between a dependent variable (target) and one or more independent
variables (predictors or features).
• The primary goal of regression analysis is to understand how changes in the
independent variables are associated with changes in the dependent variable.
• In regression, we typically assume that there is a linear or nonlinear relationship
between the independent variables and the dependent variable.
• The resulting model can be used for various purposes, including prediction,
forecasting, and understanding the underlying relationships in the data.
• In regression, the output is continuous.
Regression: Linear vs Non-Linear
Types of Regression
• We need to check how the loss function changes with small changes in the values of
the parameters θ0 and θ1.
Simple Linear Regression
• The gradient descent algorithm is used to find the values of the unknown parameters.
Simple Linear Regression
• We choose random values of θ0 and θ1 and then calculate the value of the loss
function.
• For ease of understanding, we assume that θ0 = 0 and calculate the value of the loss
function.
• If we now plot L(θ1) against θ1, the initial value of L(θ1) may lie at point A or at
point B, depending on the initially chosen value of θ1. We also assume that L(θ1) is
minimum at point C.
• The question then arises: how do we reach point C from A or from B?

[Figure: plot of L(θ1) versus θ1, with starting points A and B on the curve and the minimum at point C.]
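The situation above can be sketched in code. The following is a minimal illustration with hypothetical toy data (not from the slides): with θ0 fixed at 0, the loss L(θ1) is evaluated at several values of θ1, and it is smallest at the true slope, which plays the role of point C.

```python
def loss(theta1, xs, ys):
    """Mean squared error loss L(theta1) for the model y = theta1 * x (theta0 = 0)."""
    n = len(xs)
    return sum((y - theta1 * x) ** 2 for x, y in zip(xs, ys)) / (2 * n)

# Toy data generated from y = 2x, so the minimum of L (point C) is at theta1 = 2.
xs = [1, 2, 3, 4]
ys = [2, 4, 6, 8]

for theta1 in [0.0, 1.0, 2.0, 3.0]:
    # Values of theta1 away from 2 (like starting points A or B) give a larger loss.
    print(theta1, loss(theta1, xs, ys))
```

Starting at θ1 = 1 or θ1 = 3 corresponds to starting at point A or B; both give a larger loss than θ1 = 2.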
Simple Linear Regression
• We differentiate the loss function L with respect to the parameter θ1, i.e. we
calculate dL(θ1)/dθ1.

[Figure: the same plot of L(θ1) versus θ1, with points A and B marked on the curve.]
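This derivative can be checked numerically. The sketch below (using the same hypothetical toy data as above, with θ0 = 0) compares the analytic derivative of L(θ1) = (1/2n) Σ (yi − θ1·xi)² with a finite-difference estimate:

```python
def loss(theta1, xs, ys):
    """L(theta1) = (1/2n) * sum (y - theta1*x)^2, with theta0 = 0."""
    n = len(xs)
    return sum((y - theta1 * x) ** 2 for x, y in zip(xs, ys)) / (2 * n)

def dloss(theta1, xs, ys):
    """Analytic derivative: dL/dtheta1 = -(1/n) * sum (y - theta1*x) * x."""
    n = len(xs)
    return -sum((y - theta1 * x) * x for x, y in zip(xs, ys)) / n

xs = [1, 2, 3, 4]
ys = [2, 4, 6, 8]

theta1, h = 1.5, 1e-6
# Central finite difference approximates the slope of L at theta1.
numeric = (loss(theta1 + h, xs, ys) - loss(theta1 - h, xs, ys)) / (2 * h)
print(dloss(theta1, xs, ys), numeric)  # the two values agree closely
```

The sign of dL/dθ1 tells us which direction to move: negative at θ1 = 1.5 (left of the minimum), so θ1 must increase to approach point C.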
• In simple linear regression, the goal is to find the best-fitting line (a straight line)
that minimizes the sum of the squared differences between the predicted values
and the actual values of a dependent variable (Y) based on an independent variable
(X).
• Gradient descent is one of the optimization techniques used to find the parameters
(slope and intercept) of this line.
Gradient Descent for Simple Linear Regression
• Here are the mathematical steps for finding the parameters (slope and intercept) in
simple linear regression using gradient descent:
1. Initialize Parameters: Start by initializing the slope (m) and intercept (b) with some initial
values. These values can be set randomly or to some initial guess.
2. Define the Cost Function: The cost function represents how far off your predictions are from
the actual values. In simple linear regression, the cost function (often called the Mean Squared
Error or MSE) is defined as: J(m, b) = (1 / 2n) * Σ(i=1 to n) [(Yi - (m * Xi + b))^2]
Where:
• J(m, b) is the cost function.
• m is the slope (parameter to be updated).
• b is the intercept (parameter to be updated).
• n is the number of data points.
• Xi is the independent variable (feature) for the i-th data point.
• Yi is the actual dependent variable (target) for the i-th data point.
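The steps above can be sketched as follows. This is a minimal illustration, not a definitive implementation: the toy data, learning rate, and iteration count are illustrative assumptions.

```python
def cost(m, b, X, Y):
    """Step 2: J(m, b) = (1/2n) * sum (Yi - (m*Xi + b))^2."""
    n = len(X)
    return sum((y - (m * x + b)) ** 2 for x, y in zip(X, Y)) / (2 * n)

def gradient_descent(X, Y, lr=0.05, steps=2000):
    m, b = 0.0, 0.0                     # step 1: initialize parameters
    n = len(X)
    for _ in range(steps):
        # Partial derivatives of J(m, b) with respect to m and b.
        dm = -sum((y - (m * x + b)) * x for x, y in zip(X, Y)) / n
        db = -sum((y - (m * x + b)) for x, y in zip(X, Y)) / n
        m -= lr * dm                    # move against the gradient
        b -= lr * db
    return m, b

X = [1, 2, 3, 4, 5]
Y = [3, 5, 7, 9, 11]                    # generated from y = 2x + 1

m, b = gradient_descent(X, Y)
print(m, b)                             # close to the true slope 2 and intercept 1
```

With a suitable learning rate, repeating the update shrinks J(m, b) until m and b settle near the best-fitting line.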
Gradient Descent for Simple Linear Regression