  • Published 3 years ago by Prof. Ryan Ahmed

Normalization Vs. Standardization (Feature Scaling in Machine Learning)

In this video, we cover the difference between normalization and standardization. Feature scaling is an important step to take before training machine learning models, so that all features are on the same scale. Normalization rescales feature values to the range 0 to 1. Standardization transforms the data to have a mean of zero and a standard deviation of 1; it is also known as Z-score normalization, because the transformed features behave like a standard normal distribution. A short code sketch illustrating both transforms follows at the end of this description.

Check top-rated Udemy courses below:

  • 10 days of No Code AI Bootcamp
  • Modern Artificial Intelligence with Zero Coding
  • Python & Machine Learning for Financial Analysis
  • Modern Artificial Intelligence Masterclass: Build 6 Projects
  • AWS SageMaker Practical for Beginners | Build 6 Projects
  • Data Science for Business | 6 Real-world Case Studies
  • AWS Machine Learning Certification Exam | Complete Guide
  • TensorFlow 2.0 Practical
  • TensorFlow 2.0 Practical Advanced
  • Machine Learning Regression Masterclass in Python
  • Machine Learning Practical Workout | 8 Real-World Projects
  • Machine Learning Classification Bootcamp in Python
  • MATLAB/SIMULINK Bible | Go From Zero to Hero!
  • Python 3 Programming: Beginner to Pro Masterclass
  • Autonomous Cars: Deep Learning and Computer Vision in Python
  • Control Systems Made Simple | Beginner's Guide
  • Artificial Intelligence in Arabic | الذكاء الصناعي مبتدئ لمحترف
  • The Complete MATLAB Computer Programming Bootcamp

Thanks and see you in future videos! #featurescaling #normalization
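
The two transforms described above map directly onto standard preprocessing utilities. Below is a minimal sketch using scikit-learn's MinMaxScaler and StandardScaler; the library choice and the toy data are assumptions for illustration, since the video does not prescribe a specific implementation.

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler, StandardScaler

# Toy feature matrix (hypothetical data): two features on very different scales.
X = np.array([[1.0, 200.0],
              [2.0, 300.0],
              [3.0, 400.0],
              [4.0, 500.0]])

# Normalization: (x - min) / (max - min) -> each feature rescaled to [0, 1]
X_norm = MinMaxScaler().fit_transform(X)

# Standardization (Z-score): (x - mean) / std -> mean 0, standard deviation 1
X_std = StandardScaler().fit_transform(X)

print("Normalized:\n", X_norm)
print("Standardized:\n", X_std)
print("Means after standardization:", X_std.mean(axis=0))    # approximately 0
print("Std devs after standardization:", X_std.std(axis=0))   # approximately 1
```

In practice, the scaler is fit on the training set only and then applied to the validation and test sets, so that no statistics from held-out data leak into training.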