Understanding the OLS method for Simple Linear Regression
Valentina Alto
Aug 17, 2019·4 min read
Linear Regression is a family of algorithms employed in supervised machine learning tasks (to learn more about supervised learning, you can read my former article here). Since supervised ML tasks are normally divided into classification and regression, we can place Linear Regression algorithms in the latter category. Regression differs from classification because of the nature of the target variable: in classification, the target is a categorical value (‘yes/no’, ‘red/blue/green’, ‘spam/not spam’…); regression, on the other hand, involves a numerical, continuous target, hence the algorithm is asked to predict a continuous number rather than a class or category. For example, imagine you want to predict the price of a house based on some of its features: the output of your model will be the price, hence a continuous number.
Regression tasks can be divided into two main groups: those which use only one feature to predict the target, and those which use more than one feature for that purpose. To give you an example, let’s consider the house task above: if you want to predict the price based only on the house’s square meters, you fall into the first situation (one feature); if you predict the price based on, say, its square meters, its location and the liveability of the surrounding environment, you fall into the second situation (multiple features, in this case three).
In the first scenario, the algorithm you are likely to employ is Simple Linear Regression, which is the one we are going to talk about in this article. On the other hand, whenever you have more than one feature able to explain the target variable, you are likely to employ Multiple Linear Regression.
Simple Linear Regression is a statistical model, widely used in ML regression tasks, based on the idea that the relationship between two variables can be explained by the following formula:

yᵢ = α + β·xᵢ + εᵢ

Where εᵢ is the error term, and α, β are the true (but unobserved) parameters of the regression. The parameter β represents the variation of the dependent variable when the independent variable has a unitary variation: namely, if my parameter is equal to 0.75, when my x increases by 1, my dependent variable will increase by 0.75. On the other hand, the parameter α represents the value of the dependent variable when the independent one is equal to zero.
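To make this interpretation concrete, here is a minimal sketch in Python; the values α = 20 and β = 0.75 are made up purely for illustration and do not come from any dataset.

```python
# Minimal sketch: interpreting the intercept (alpha) and slope (beta).
# alpha = 20 and beta = 0.75 are made-up values, used only for illustration.
alpha, beta = 20.0, 0.75

def predict(x):
    """Prediction of the simple linear model: y = alpha + beta * x."""
    return alpha + beta * x

print(predict(0))                 # 20.0 -> alpha: the value of y when x = 0
print(predict(10))                # 27.5
print(predict(11) - predict(10))  # 0.75 -> beta: the change in y for a one-unit change in x
```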
Let’s visualize it graphically:
Now, the idea of Simple Linear Regression is to find those parameters α and β for which the error term is minimized. To be more precise, the model minimizes the squared errors: indeed, we do not want positive errors to be compensated by negative ones, since both are equally penalizing for our model.
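As a quick numerical illustration of why we square the errors (the residual values below are invented just for the demo), errors of opposite sign cancel out in a plain sum, while the squared sum still reflects the real misfit:

```python
# Invented residuals with opposite signs, chosen only to illustrate the point.
residuals = [4.0, -4.0, 2.5, -2.5]

sum_of_errors = sum(residuals)                          # 0.0  -> looks "perfect", but isn't
sum_of_squared_errors = sum(e ** 2 for e in residuals)  # 44.5 -> reflects the real misfit

print(sum_of_errors, sum_of_squared_errors)
```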
This procedure is called Ordinary Least Squares (OLS).
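To show what OLS computes in practice, here is a minimal NumPy sketch on synthetic data; the true intercept 2.0, slope 0.75, noise level and sample size are arbitrary choices for the demo, and the closed-form expressions used below are the standard estimators that the minimization described next leads to.

```python
import numpy as np

# Synthetic data: y = 2.0 + 0.75 * x + noise (all values chosen arbitrarily for the demo).
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=200)
y = 2.0 + 0.75 * x + rng.normal(scale=0.5, size=200)

# Standard closed-form OLS estimates for simple linear regression:
#   beta_hat  = sum((x - x_mean) * (y - y_mean)) / sum((x - x_mean)^2)
#   alpha_hat = y_mean - beta_hat * x_mean
beta_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
alpha_hat = y.mean() - beta_hat * x.mean()

print(alpha_hat, beta_hat)  # should land close to 2.0 and 0.75
```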
Let’s work through this optimization problem step by step. If we write our squared error sum as follows:

SSE(α, β) = Σᵢ εᵢ² = Σᵢ (yᵢ − α − β·xᵢ)²
We can set our optimization problem as follows:

(α̂, β̂) = argmin over α, β of Σᵢ (yᵢ − α − β·xᵢ)²
So let’s start with β: we take the partial derivative of the sum of squared errors with respect to β and set it equal to zero.
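For reference, here is a sketch of the standard algebra these first-order conditions lead to (the textbook result, with x̄ and ȳ denoting the sample means of x and y, not the article’s own step-by-step working):

```latex
% Sketch of the standard OLS first-order conditions for simple linear regression.
% Setting the partial derivatives of the sum of squared errors to zero:
\frac{\partial}{\partial \beta}  \sum_i (y_i - \alpha - \beta x_i)^2
    = -2 \sum_i x_i \,(y_i - \alpha - \beta x_i) = 0
\qquad
\frac{\partial}{\partial \alpha} \sum_i (y_i - \alpha - \beta x_i)^2
    = -2 \sum_i (y_i - \alpha - \beta x_i) = 0

% Solving the two conditions together yields the closed-form estimators:
\hat{\beta} = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2},
\qquad
\hat{\alpha} = \bar{y} - \hat{\beta}\,\bar{x}
```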