# Linear Regression Algorithm without Scikit-Learn

In this article, I will show you how to build your own algorithms instead of relying on a package like Scikit-Learn. I will create a Linear Regression algorithm from its mathematical equations, without using Scikit-Learn at any point.

With Scikit-Learn, you can easily use many algorithms already built by well-known researchers, data scientists, and other machine learning experts. But have you ever thought of building an algorithm yourself instead of importing it from a module? The algorithms Scikit-Learn provides are easy to use, but to work as a machine learning expert at a company like Google or Microsoft, you need to be able to build algorithms yourself, so that you can adapt them to your exact needs.

The role of a data scientist or machine learning expert is not just fitting, training, and testing a model. Those are only the basics you need to know; without them, you cannot call yourself a machine learning practitioner. But once you start building your own algorithms, you move well beyond that baseline.


## What is a Linear Regression Algorithm?

A Linear Regression algorithm makes a prediction by computing a weighted sum of the input features, plus a constant called the bias term. In mathematical notation, the prediction looks like:

ŷ = θ₀ + θ₁x₁ + θ₂x₂ + ⋯ + θₙxₙ

where ŷ is the predicted value, n is the number of features, xᵢ is the i-th feature value, and θⱼ is the j-th model parameter (θ₀ being the bias term).
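To make the weighted sum concrete, here is a minimal sketch in Python; the parameter and feature values are made-up numbers chosen only for illustration:

```python
import numpy as np

theta = np.array([4.0, 3.0])  # [bias term, feature weight] -- illustrative values
x = np.array([1.0, 1.5])      # [x0 = 1, x1]; x0 = 1 pairs with the bias term
y_hat = theta.dot(x)          # weighted sum of the features plus the bias
print(y_hat)                  # 4.0 * 1 + 3.0 * 1.5 = 8.5
```

Prepending a constant 1 to the feature vector is a common trick: it lets the bias term be handled by the same dot product as the feature weights, which is exactly what we will do below.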

## Linear Regression Algorithm without Scikit-Learn

Let’s create our own linear regression algorithm, I will first create this algorithm using the mathematical equation. Then I will visualize our algorithm using the Matplotlib module in Python. I will only use the NumPy module in Python to build our algorithm because NumPy is used in all the mathematical computations in Python. I will start here by creating linear-looking data so that I can use that data in creating my Linear Regression Algorithm:

```python
import numpy as np

X = 2 * np.random.rand(100, 1)
y = 4 + 3 * X + np.random.randn(100, 1)
```

Before moving forward let’s visualize this data:

```python
import matplotlib.pyplot as plt

plt.plot(X, y, "b.")
plt.xlabel("$x_1$", fontsize=18)
plt.ylabel("$y$", rotation=0, fontsize=18)
plt.axis([0, 2, 0, 15])
plt.show()
```

Now, let's move forward by implementing the linear regression math directly. The closed-form solution is the Normal Equation, θ̂ = (XᵀX)⁻¹ Xᵀ y. I will use the inv() function from NumPy's linear algebra module (np.linalg) to compute the matrix inverse, and the dot() method for matrix multiplication:

```python
X_b = np.c_[np.ones((100, 1)), X]  # add x0 = 1 to each instance
linear = np.linalg.inv(X_b.T.dot(X_b)).dot(X_b.T).dot(y)
```
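As a side note, explicitly inverting XᵀX can be numerically fragile when features are strongly correlated. A more stable sketch uses NumPy's least-squares solver instead; the data is regenerated here (with a seeded generator, an assumption for reproducibility) so the block is self-contained:

```python
import numpy as np

rng = np.random.default_rng(42)  # seeded generator so the run is reproducible
X = 2 * rng.random((100, 1))
y = 4 + 3 * X + rng.standard_normal((100, 1))

X_b = np.c_[np.ones((100, 1)), X]  # add x0 = 1 to each instance

# Least-squares solve; avoids forming and inverting X_b.T @ X_b explicitly
theta, residuals, rank, sv = np.linalg.lstsq(X_b, y, rcond=None)
print(theta.ravel())  # close to [4, 3], up to the Gaussian noise
```

The solution is the same as the Normal Equation's in exact arithmetic; lstsq simply computes it in a better-conditioned way.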

The function that we used to generate the data is y = 4 + 3x₁ + Gaussian noise, so ideally we would recover θ₀ = 4 and θ₁ = 3. Let's see what our algorithm found:

```python
linear
```
```
array([[4.21509616],
       [2.77011339]])
```

That looks close: the Gaussian noise makes it impossible to recover the exact parameters of the generating function, but 4.215 and 2.770 are reasonable estimates of 4 and 3. Now let's make predictions using our algorithm:

```python
X_new = np.array([[0], [2]])
X_new_b = np.c_[np.ones((2, 1)), X_new]  # add x0 = 1 to each instance
y_predict = X_new_b.dot(linear)
y_predict
```
```
array([[4.21509616],
       [9.75532293]])
```

Now, let’s plot the predictions of our linear regression:

```python
plt.plot(X_new, y_predict, "r-")
plt.plot(X, y, "b.")
plt.axis([0, 2, 0, 15])
plt.show()
```

Now let's fit the same model with the linear regression algorithm provided by Scikit-Learn.

## Linear Regression with Scikit-Learn

You saw above how to create your own algorithm. A good way to practice is to re-implement an algorithm that already exists, because then you can evaluate your implementation against the established one. Here I will cross-check the linear regression algorithm I built against the one Scikit-Learn provides. The predictions of my algorithm were:

```
array([[4.21509616],
       [9.75532293]])
```

Now, let’s see what results we get from the scikit-learn linear regression model:

```python
from sklearn.linear_model import LinearRegression

lin_reg = LinearRegression()
lin_reg.fit(X, y)
lin_reg.intercept_, lin_reg.coef_
lin_reg.predict(X_new)
```
```
array([[4.21509616],
       [9.75532293]])
```

The predictions match exactly: Scikit-Learn's LinearRegression finds the same parameters as our Normal Equation implementation.
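For completeness, here is a self-contained sketch verifying that Scikit-Learn's fitted parameters agree with the Normal Equation solution. The data is regenerated with a seeded generator (an assumption for reproducibility), so the exact numbers will differ from those above:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = 2 * rng.random((100, 1))
y = 4 + 3 * X + rng.standard_normal((100, 1))

# Normal Equation solution
X_b = np.c_[np.ones((100, 1)), X]  # add x0 = 1 to each instance
theta = np.linalg.inv(X_b.T @ X_b) @ X_b.T @ y

# Scikit-Learn solution
lin_reg = LinearRegression().fit(X, y)

print(theta.ravel())                              # [intercept, slope]
print(lin_reg.intercept_, lin_reg.coef_.ravel())  # should match the line above
```

Both approaches minimize the same sum-of-squares objective, so up to floating-point error they must produce identical parameters.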