Building Features from Nominal Data
Janani Ravi
2:39:16
Description
This course covers various techniques for encoding categorical data, starting with the familiar forms of one-hot and label encoding, before moving to contrast coding schemes such as simple coding, Helmert coding, and orthogonal polynomial coding.
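For orientation, here is a minimal sketch of the two familiar encodings mentioned above, one-hot and label encoding, using pandas and scikit-learn. The column name and category values are invented for illustration and are not taken from the course.

```python
import pandas as pd
from sklearn.preprocessing import LabelEncoder

# Toy nominal column; the category names are made up for illustration.
df = pd.DataFrame({"color": ["red", "green", "blue", "green", "red"]})

# One-hot encoding: one indicator column per category.
one_hot = pd.get_dummies(df["color"], prefix="color")
print(one_hot)

# Label encoding: each category mapped to an arbitrary integer code
# (here alphabetical: blue=0, green=1, red=2).
label_encoded = LabelEncoder().fit_transform(df["color"])
print(label_encoded)
```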
What You'll Learn
The quality of the preprocessing that numeric data is subjected to is an important determinant of the results of machine learning models built using that data. In this course, Building Features from Nominal Data, you will gain the ability to encode categorical data in ways that increase the statistical power of models. First, you will learn about the different types of continuous and categorical data, including the differences between ratio and interval scale data and between nominal and ordinal data. Next, you will discover how to encode categorical data using one-hot and label encoding, and how to avoid the dummy variable trap in linear regression. Finally, you will explore how to implement different forms of contrast coding, such as simple, Helmert, and orthogonal polynomial coding, so that regression results closely mirror the hypotheses you wish to test. When you’re finished with this course, you will have the skills and knowledge of encoding categorical data needed to increase the statistical power of linear regression models that include such data.
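As a rough sketch of how that material might look in code, the snippet below avoids the dummy variable trap by dropping one indicator column, then fits regressions with Helmert and orthogonal polynomial contrasts. The data, column names, and the use of pandas and statsmodels (with patsy-style formulas) are assumptions made for illustration; the course's own examples may use different tools.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Toy data (invented): a three-level factor and a numeric response.
df = pd.DataFrame({
    "dose": rng.choice(["low", "medium", "high"], size=120),
    "y": rng.normal(size=120),
})

# Give the factor an explicit order so polynomial contrasts are meaningful.
df["dose"] = pd.Categorical(
    df["dose"], categories=["low", "medium", "high"], ordered=True
)

# Dummy variable trap: with an intercept in the model, keep k-1 indicator
# columns rather than k, otherwise the design matrix is perfectly collinear.
dummies = pd.get_dummies(df["dose"], prefix="dose", drop_first=True)
print(dummies.head())

# Contrast coding via formulas: each scheme re-parameterizes the same model
# so the coefficients test different hypotheses about the group means.
helmert_fit = smf.ols("y ~ C(dose, Helmert)", data=df).fit()
poly_fit = smf.ols("y ~ C(dose, Poly)", data=df).fit()

print(helmert_fit.params)  # Helmert: each level vs. the mean of earlier levels
print(poly_fit.params)     # Poly: linear and quadratic trend components
```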
Pluralsight
- Language: English
- Training sessions: 43
- Duration: 2:39:16
- Level: Intermediate
- Release Date: 2023/10/11