Week 2 Forum

Why do we use Feature Scaling?

by hasibur nirob

Feature scaling is the process of normalising the range of features in a dataset. Real-world datasets often contain features that vary in magnitude, range, and units, so for machine learning models to interpret these features on the same scale, we need to perform feature scaling.

In science we all know the importance of comparing apples to apples, and yet many people, especially beginners, tend to overlook feature scaling as part of their data preprocessing for machine learning. As we will see, this can cause models to make inaccurate predictions. In this post, we will discuss:

  • Why feature scaling is important
  • The difference between normalisation and standardisation (the basic formulas are sketched just after this list)
  • Why and how feature scaling affects model performance
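
For reference, the two transformations are usually defined as follows (these are the standard definitions, not specific to any one library): normalisation, or min-max scaling, maps each feature to a fixed range, typically x' = (x - x_min) / (x_max - x_min), while standardisation rescales each feature to zero mean and unit variance, z = (x - μ) / σ.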

More specifically, we will look at three different scalers for feature scaling in the Scikit-learn library, with a short usage sketch of all three after the list:

  1. MinMaxScaler
  2. StandardScaler
  3. RobustScaler
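
To make the comparison concrete, here is a minimal sketch applying all three scalers to the same data. The array values below are made-up sample data, chosen so the last row acts as an outlier in the second feature:

import numpy as np
from sklearn.preprocessing import MinMaxScaler, StandardScaler, RobustScaler

# Made-up sample data: two features on very different scales,
# with an outlier in the second feature (last row).
X = np.array([[1.0,  100.0],
              [2.0,  110.0],
              [3.0,  120.0],
              [4.0, 1000.0]])

# MinMaxScaler rescales each feature to the [0, 1] range.
print(MinMaxScaler().fit_transform(X))

# StandardScaler removes each feature's mean and scales it to unit variance.
print(StandardScaler().fit_transform(X))

# RobustScaler centres each feature on its median and scales by the
# interquartile range, so it is much less affected by the outlier.
print(RobustScaler().fit_transform(X))

Comparing the three outputs shows why the choice of scaler matters: the outlier squeezes the MinMaxScaler output for the second feature towards zero and inflates the variance used by StandardScaler, while RobustScaler keeps the inlier rows on a sensible scale and still lets the outlier stand out.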