
Applied Bayes' Theorem and Naive Bayes Classifiers

  • 1 - L01-The-Purpose-and-the-Structure-of-the-Course.pdf
  • 1 - The Purpose and Structure of this Course.mp4
    16:41
  • 2 - Fundamental Definitions of Probability.mp4
    20:36
  • 2 - L02-Fundamental-Definitions-of-Probability.pdf
  • 3 - L03-01-Distribution-of-Application-Scores-Single-Category.xlsx
  • 3 - L03-02-The-Binomial-Distribution.xlsx
  • 3 - L03-03-Distribution-Table-Client-Rating-by-Size-of-Customer.xlsx
  • 3 - L03-Probability-Distribution-Tables-Contingency.pdf
  • 3 - Probability Distribution Tables Contingency.mp4
    17:41
  • 4 - L04-Probability-Rule-1-Intersection-Rule-for-Independent-Events-JOIN.pdf
  • 4 - Probability Rule 1 Intersection Rule for Independent Events JOIN.mp4
    20:44
  • 5 - L05-Probability-Rule-2-The-Union-Rule-OR-or-UNION.pdf
  • 5 - L05-Solution-to-the-Hiking-Route-Puzzle.pptx
  • 5 - L05-Solution-to-the-Hiking-Route-Puzzle.xlsx
  • 5 - Probability Rule 2 The Union Rule OR or UNION.mp4
    20:01
  • 6 - L06-Probability-Rule-3-The-Union-Rule-XOR-or-Exclusive-UNION.pdf
  • 6 - Probability Rule 3 The Union Rule XOR or Exclusive UNION.mp4
    04:30
  • 7 - L07-01-Distribution-Table-Education-by-Gender.xlsx
  • 7 - L07-Probability-Rule-4-The-Intersection-Rule-AND-or-CONDITIONAL-for-Dependent-Events.pdf
  • 7 - Probability Rule 4 The Intersection Rule AND or CONDITIONAL for Dependent Eve.mp4
    22:27
  • 8 - L08-01-Reliability-Data.xlsx
  • 8 - Probability Rule 4 The Intersection Rule for Dependent Events Examples.mp4
    16:44
  • 9 - L09-Probability-Decision-Trees-and-Probability.pdf
  • 9 - Probability Decision Trees and Probability.mp4
    10:11
  • 10 - L10-Probability-Rule-5-The-Meaning-of-Independence-and-Mutual-Exclusivity.pdf
  • 10 - Probability Rule 5 The Meaning of Independence and Mutual Exclusivity.mp4
    07:39
  • 11 - L11-01-Total-Probability-Three-Bags-and-Two-Colors.xlsx
  • 11 - L11-02-Total-Probability-Two-by-Two-Table.xlsx
  • 11 - L11-Probability-Rule-6-Total-Probability.pdf
  • 11 - Probability Rule 6 Total Probability.mp4
    08:50
  • 12 - L12-Probability-Rule-7-The-Chain-Rule-of-Probability.pdf
  • 12 - Probability Rule 7 The Chain Rule of Probability.mp4
    09:46
  • 13 - Evaluating Classifiers with the Confusion Matrix and its KPIs.mp4
    22:39
  • 13 - L13-Evaluating-Classifiers-with-the-Confusion-Matrix-and-its-KPIs.pdf
  • 13 - l13-01-customer-loan-data-with-the-confusion-matrix.zip
  • 14 - Extracting KPIs from the Confusion Matrix.mp4
    20:21
  • 14 - L14-00-Support-Confusion-Matrix-KPIs-List.xlsx
  • 14 - L14-03-Confusion-Matrix-Indicators-from-a-Contingency-Table.xlsx
  • 14 - L14-Extracting-KPIs-from-the-Confusion-Matrix.pdf
  • 14 - l14-01-customer-loan-data-with-the-confusion-matrix.zip
  • 14 - l14-02-to-play-or-not-to-play-tennis-use-countifs.zip
  • 15 - Evaluating Classifiers in the Case of Multiple Classes or Multiple Labels.mp4
    30:07
  • 15 - L15-Evaluating-Classifiers-in-the-Case-of-Multiple-Classes-or-Multiple-Labels.pdf
  • 15 - l15-01-multiclass-per-item-over-2.zip
  • 15 - l15-02-multilevel-per-item-over-2.zip
  • 16 - Bayes Theorem Rationale and Derivation of Theorem.mp4
    18:38
  • 16 - L16-Bayes-Theorem-Rationale-and-Derivation-of-Theorem.pdf
  • 17 - Bayes Theorem A Bayesian Story an Example without Formulas.mp4
    29:24
  • 17 - L17-01-Bayes-Theorem-Mother-Board-Testing.xlsx
  • 17 - L17-Bayes-Theorem-A-Bayesian-Story-an-Example-without-Formulas.pdf
  • 18 - Bayes Theorem Defining the Factors in Bayes Theorem.mp4
    18:37
  • 18 - L18-Bayes-Theorem-Defining-the-Factors-in-Bayes-Theorem.pdf
  • 19 - Bayes Theorem W1 The Famous HIV Test.mp4
    18:49
  • 19 - L19-00-Support-List-of-Workouts.xlsx
  • 19 - L19-02-Bayes-Theorem-HIV-Testing.xlsx
  • 19 - L19-Bayes-Theorem-W1-The-Famous-HIV-Test.pdf
  • 20 - Bayes Theorem W2W4 Spam Testing 3 Events Defective Machines and HIV Hori.mp4
    09:56
  • 20 - L20-02-Bayes-Theorem-Spam-Testing-with-Three-Events.xlsx
  • 20 - L20-03-Bayes-Theorem-Three-Machines-with-Defectives.xlsx
  • 20 - L20-04-Bayes-Theorem-HIV-Testing-using-the-Horizontal-Table-Format.xlsx
  • 20 - L20-Bayes-Theorem-W2-W4-Spam-Testing-3-Events-Defective-Machines-and-HIV-Horizontal-Table.pdf
  • 21 - Bayes Theorem W5 Spam Testing Contingency Table and Graphic Solutions.mp4
    21:24
  • 21 - L21-05-Bayes-Theorem-Spam-Detection-Graphics.xlsx
  • 21 - L21-Bayes-Theorem-W5-Spam-Testing-Contingency-Table-and-Graphic-Solutions.pdf
  • 22 - Bayes Theorem W5 Spam Testing Continued Deriving Posteriors with the Con.mp4
    06:07
  • 22 - L22-05-Bayes-Theorem-Spam-Detection-Graphics-Duplicate-of-L21-01.xlsx
  • 22 - L22-Bayes-Theorem-W5-Spam-Testing-Continued-Deriving-Posteriors-with-the-Confusion-Matrix.pdf
  • 23 - Bayes Theorem W6W8 Predicting Rain and Identifying Product Suppliers Mon.mp4
    11:59
  • 23 - L23-06-Bayes-Theorem-Predicting-Rain-or-Shine.xlsx
  • 23 - L23-07-Bayes-Theorem-Identifying-Production-Brands.xlsx
  • 23 - L23-Bayes-Theorem-W6-W8-Predicting-Rain-and-Identifying-Product-Suppliers-Monty-Hall.pdf
  • 24 - L24-01-Examples-of-Supervised-Tables.xlsx
  • 24 - L24-Naive-Bayes-Classifiers-Introduction.pdf
  • 24 - Naive Bayes Classifiers Introduction.mp4
    24:40
  • 25 - L25-Naive-Bayes-Classifiers-Introducing-the-Algorithm.pdf
  • 25 - Naive Bayes Classifiers Introducing the Algorithm.mp4
    11:51
  • 26 - L26-01-Flight-Status.xlsx
  • 26 - L26-Naive-Bayes-Classifiers-The-Algorithm-thru-a-Short-Example.pdf
  • 26 - Naive Bayes Classifiers The Algorithm thru a Short Example.mp4
    33:58
  • 27 - L27-Naive-Bayes-Classifiers-The-Naive-Bayes-Classifier-Procedure-Applied-on-a-Categorical-Example.pdf
  • 27 - Naive Bayes Classifiers The Naive Bayes Classifier Procedure Applied on a Cate.mp4
    28:28
  • 28 - L28-01-Can-Person-Buy-a-Laptop.xlsx
  • 28 - L28-02-Transportation-Mode.xlsx
  • 28 - L28-Naive-Bayes-Classifiers-More-Categorical-Examples.pdf
  • 28 - Naive Bayes Classifiers More Categorical Examples.mp4
    05:25
  • 29 - L29-01-To-Play-or-Not-to-Play-Tennis-Laplace.xlsx
  • 29 - L29-02-Transport-Mode-Laplace.xlsx
  • 29 - L29-Naive-Bayes-Classifiers-Laplace-Smoothing-Correction.pdf
  • 29 - Naive Bayes Classifiers Laplace Smoothing Correction.mp4
    21:59
  • 30 - L30-01-Can-Person-Buy-a-Laptop-M-Estimate.xlsx
  • 30 - Naive Bayes Classifiers Correction using MEstimates.mp4
    12:34
  • 31 - L31-Naive-Bayes-Classifiers-Gaussian-and-Continuous.pdf
  • 31 - Naive Bayes Classifiers Gaussian and Continuous.mp4
    16:12
  • 32 - L32-01-Classification-of-Clients-by-Rating.xlsx
  • 32 - L32-02-Using-Gaussian-Features-in-the-Tennis-Model.xlsx
  • 32 - L32-03-The-IRIS-Dataset.xlsx
  • 32 - L32-03-The-IRIS-Original-Dataset.xlsx
  • 32 - L32-Naive-Bayes-Classifiers-Three-Gaussian-Examples.pdf
  • 32 - Naive Bayes Classifiers Three Gaussian Examples.mp4
    24:55
  • 33 - L33-01-Example-of-Variance-Regularization.xlsx
  • 33 - L33-02-The-Regularized-Client-Rating-Dataset.xlsx
  • 33 - L33-03-The-Regularized-IRIS-Dataset.xlsx
  • 33 - L33-Naive-Bayes-Classifiers-Some-Extensions-to-Continuous-Features.pdf
  • 33 - Naive Bayes Classifiers Some Extensions to Continuous Features.mp4
    25:02
  • 33 - l33-02-the-regularized-client-rating-dataset-using-vba.zip
  • 34 - L34-00-Support-Beta-Distribution-Shapes.xlsx
  • 34 - L34-01-Personal-Loans-and-the-Beta-Distribution.xlsx
  • 34 - L34-Naive-Bayes-Classifiers-Handling-Non-Gaussian-Continuous-Features-Beta-Distribution.pdf
  • 34 - Naive Bayes Classifiers Handling NonGaussian Continuous Features Beta Distri.mp4
    14:39
  • 35 - L35-01-Applying-the-Non-Gaussian-Beta-Procedure-to-Personal-Loans.xlsx
  • 35 - L35-Naive-Bayes-Classifiers-Applying-the-Non-Gaussian-Beta-Procedure-to-Personal-Loans.pdf
  • 35 - Naive Bayes Classifiers Applying the NonGaussian Beta Procedure to Personal.mp4
    21:00
  • 36 - L36-Naive-Bayes-Classifiers-Discrete-Distributions-Bernoulli-and-Categorical.pdf
  • 36 - Naive Bayes Classifiers Discrete Distributions Bernoulli and Categorical.mp4
    10:07
  • 37 - L37-01-The-Binomial-Distribution.xlsx
  • 37 - L37-Naive-Bayes-Classifiers-Discrete-Distribution-Binomial.pdf
  • 37 - Naive Bayes Classifiers Discrete Distribution Binomial.mp4
    18:09
  • 38 - L38-Naive-Bayes-Classifiers-Discrete-Distribution-Multinomial.pdf
  • 38 - Naive Bayes Classifiers Discrete Distribution Multinomial.mp4
    11:15
  • 39 - L39-01-Bernoulli-A-Generic-Document-Example.xlsx
  • 39 - L39-02-Bernoulli-Classifying-Reviews-of-a-Novel.xlsx
  • 39 - L39-Naive-Bayes-Classifiers-Bernoulli-Naive-Bayes-Examples.pdf
  • 39 - Naive Bayes Classifiers Bernoulli Naive Bayes Examples.mp4
    28:37
  • 40 - L40-01-Classifying-Locations-by-Type-of-Sport-Multinomial-Formula.xlsx
  • 40 - L40-02-Classifying-Reviews-of-a-Novel-Multinomial-Formula.xlsx
  • 40 - L40-03-Classifying-Motherboards-by-Defect-Type-Multinomial-Formula.xlsx
  • 40 - L40-Naive-Bayes-Classifiers-Multinomial-Naive-Bayes-Examples.pdf
  • 40 - Naive Bayes Classifiers Multinomial Naive Bayes Examples.mp4
    16:52
  • 41 - L41-01-Contamination-Tests-Non-Weighted-Gaussian.xlsx
  • 41 - L41-02-Contamination-Tests-Gaussian-Weighted-Likelihoods.xlsx
  • 41 - L41-03-Contamination-Tests-Gaussian-Weighted-Likelihoods-and-Classes.xlsx
  • 41 - L41-04-Modified-Bernoulli-Classifying-Reviews-of-a-Novel.xlsx
  • 41 - L41-Naive-Bayes-Classifiers-Weighted-Naive-Bayes.pdf
  • 41 - Naive Bayes Classifiers Weighted Naive Bayes.mp4
    20:07
  • 42 - L42-01-Complement-NBC-Classifying-Locations-by-Type-of-Sport.xlsx
  • 42 - L42-Naive-Bayes-Classifiers-Complement-Classifier.pdf
  • 42 - Naive Bayes Classifiers Complement Classifier.mp4
    10:04
  • 43 - L43-01-To-Play-or-Not-to-Play-Tennis-Entropy-Calculations.xlsx
  • 43 - L43-Naive-Bayes-Classifiers-Entropy-and-Information.pdf
  • 43 - Naive Bayes Classifiers Entropy and Information.mp4
    22:55
  • 44 - L44-01-Client-Rating-Kononenko-Algorithm.xlsx
  • 44 - L44-02-How-to-Calculate-Weighted-Averages.xlsx
  • 44 - L44-03-Can-Person-Buy-a-Laptop-Kononenko-Algorithm.xlsx
  • 44 - L44-Naive-Bayes-Classifiers-Kononenko-Information-Gain-and-Evaluation-Feature-Influence.pdf
  • 44 - Naive Bayes Classifiers Kononenko Information Gain and Evaluation Feature Infl.mp4
    20:23
  • 45 - L45-01-Relating-Probability-to-Odds.xlsx
  • 45 - L45-02-Log-of-Odds-Ratio-Tennis.xlsx
  • 45 - L45-Naive-Bayes-Classifiers-Log-Odds-Ratio-and-Nomograms.pdf
  • 45 - Naive Bayes Classifiers Log Odds Ratio and Nomograms.mp4
    16:26
  • 46 - L46-01-Discrete-Probability-Transactions-per-Day.xlsx
  • 46 - L46-Naive-Bayes-Classifiers-Kernel-Distance-Estimation-Discrete-and-Continuous-Distributions.pdf
  • 46 - Naive Bayes Classifiers Kernel Distance Estimation Discrete and Continuous Di.mp4
    23:13
  • 47 - L47-01-Client-Rating-with-KERNEL.xlsx
  • 47 - L47-Naive-Bayes-Classifiers-Kernel-Distance-Estimation-and-Application-of-Naive-Bayes-Classifiers.pdf
  • 47 - Naive Bayes Classifiers Kernel Distance Estimation and Application of Naive Ba.mp4
    18:40
  • 48 - L48-01-Overfitting-and-Underfitting.xlsx
  • 48 - L48-Naive-Bayes-Classifiers-Kernel-Distance-Estimation-Estimating-the-Bandwidth-h.pdf
  • 48 - Naive Bayes Classifiers Kernel Distance Estimation Estimating the Bandwidth.mp4
    05:54


    Learn the fundamentals needed to better develop or acquire machine learning methods based on Bayes' Theorem

    What You'll Learn?


    • Detailed and fundamental probability principles, rules and procedures
    • The Confusion Matrix and its KPIs, used to evaluate Naïve Bayes Classifiers
    • Variants of the Confusion Matrix for datasets with multiple classes or multiple labels per class
    • The theory and principles behind the Bayes’ Theorem
    • How to develop the inference procedures in Bayes’ Theorem using vertical and horizontal tables, contingency tables and decision trees
    • The theoretical basis of Naïve Bayes Classifiers
    • Categorical Naïve Bayes Classifiers
    • Laplace Smoothing Correction and M-Estimates
    • Continuous Naïve Bayes Classifiers based on Gaussian distributions
    • Continuous Naïve Bayes Classifiers based on non-Gaussian distributions (in this case, the Beta Distribution)
    • The Beta Distribution and how to derive its parameters from our data
    • The four discrete distributions used in Bayesian classification: Categorical, Bernoulli, Binomial and Multinomial
    • Bernoulli Naïve Bayes Classifiers
    • Multinomial Naïve Bayes Classifiers
    • Weighted Naïve Bayes Classifiers
    • Complemented Naïve Bayes Classifiers
    • Entropy and Information Gain for better classification
    • The Kononenko Information Gain and its application in Naïve Bayes Classifiers
    • The Log Odds Ratio and its application in Naïve Bayes Classifiers
    • Kernel Density Estimation and its application in Naïve Bayes Classifiers
    • The optimization of the bandwidth, h, in Kernel Density Estimation

    Who is this for?


  • Data Scientists and Analysts
  • Machine Learning Engineers
  • Artificial Intelligence Researchers
  • Software Developers
  • Business Analysts
  • Market, Healthcare, Education and Financial professionals
  • Cybersecurity Experts
  • Natural Language Processing (NLP) Specialists
  • Product Managers
  • Business Improvement Experts
  • Quality Assurance Professionals

    What You Need to Know?


  • A working knowledge of Excel
  • A beginner's knowledge of VBA (in Excel)
  • No Python or R is needed
  • All statistical methods will be presented in the course


    Description

    A) The Purpose of the Course

    Most courses on this subject are aimed at Machine Learning and Data Science experts. Often, they are presented for use with specialized development platforms or even as part of advanced off-the-shelf applications. Yet Bayes' Theorem and its applications rest on statistical principles and concepts that are not often clearly explained.

    The purpose of this course is educational. The techniques, algorithms and procedures presented here aim at making machine learning methods based on the Theorem easier to understand, rather than at producing production-ready implementations.

    Bayes' Theorem is one of those theorems to which the proverb "still waters run deep" applies. It first appeared in an essay by Thomas Bayes, published posthumously in 1763, and in due course found its way into a wide variety of statistical applications. The Theorem itself is a tool of inference. From there on, and specifically with the advent of Machine Learning algorithms, the Theorem was extended to form the core of a wide variety of applications such as Classification, Bayesian Networks and Optimization.

    The Theorem and its applications are best developed using specialized programming environments, simply because those applications involve large datasets and performance-intensive computation.

    B) So, Why Do We Present a Course Based on Excel?

    Analysts who need to use or develop such applications have the following environments available to them:

    · Off-the-shelf applications, ready-made and commercially available.

    · Open-source or free integrated development environments (IDEs) that host a large number of scientific and statistical libraries for use in such applications.

    In both cases, the Analyst faces a steep, often insurmountable learning curve. Whether the objective is to use off-the-shelf products or to develop one's own applications, neither environment lends itself to learning how the methods actually work.

    This course therefore uses Excel strictly for educational purposes, not as a machine learning tool. Excel is known to almost everyone and, where it is not, it is easy to learn. It is also highly flexible in exposing how things work. The course exploits these facilities to walk the Analyst, in a common-sense and step-by-step manner, through the basis and procedures of these algorithms.

    C) What Does the Course Cover?

    The course is made up of five sections, the first of which is a short introduction.

    Section 1: Introducing the Course

    This section consists of one lecture that presents the objectives of the course, its structure and resources as well as what to expect and what not to expect.

    Section 2: An In-Depth Presentation of Probability Rules and Practices

    The section starts with lectures that give a detailed exposition of the fundamentals and practices of probability rules. Bayes' Theorem is tightly linked to these rules; analysts embarking on its use (and on understanding its extensions) cannot learn and apply these algorithms without a solid grounding in probability.

    The section uses common sense to clarify often obscure concepts in probability. Many examples are presented and explained in detail.
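
    For orientation, the rules covered in Lectures 4 through 12 can be stated compactly as follows; the numbering mirrors the lecture titles (Rule 5, on the meaning of independence and mutual exclusivity, is a pair of definitions rather than a formula):

    Rule 1 (independent events): $P(A \cap B) = P(A)\,P(B)$
    Rule 2 (union, OR): $P(A \cup B) = P(A) + P(B) - P(A \cap B)$
    Rule 3 (exclusive union, XOR): $P(A \oplus B) = P(A) + P(B) - 2\,P(A \cap B)$
    Rule 4 (dependent events): $P(A \cap B) = P(A \mid B)\,P(B)$
    Rule 6 (total probability): $P(A) = \sum_i P(A \mid B_i)\,P(B_i)$
    Rule 7 (chain rule): $P(A_1 \cap \dots \cap A_n) = P(A_1)\,P(A_2 \mid A_1) \cdots P(A_n \mid A_1, \dots, A_{n-1})$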

    Section 3: The Use of the Confusion Matrix for Evaluating Bayesian Results

    Some might wonder why we introduce the Confusion Matrix and its useful KPIs in this course. The answer is that in both Sections 4 and 5 we will need to evaluate our results in terms of precision, accuracy, error rates, and so on. The Confusion Matrix is a contingency table of four counts obtained by comparing the algorithm's outcome with the historically known outcome of the classes in a Test Table. The four counts are True Positives, True Negatives, False Positives and False Negatives, and they can be combined in a variety of ways to measure KPIs such as accuracy, precision and error rates. (The Confusion Matrix is also used with a variety of other classification methods: logistic regression, decision trees, etc.)
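
    For reference, the most common KPIs built from these four counts are the textbook definitions:

    $\text{Accuracy} = \dfrac{TP + TN}{TP + TN + FP + FN}$, $\text{Precision} = \dfrac{TP}{TP + FP}$, $\text{Recall} = \dfrac{TP}{TP + FN}$, $F_1 = \dfrac{2\,\text{Precision} \cdot \text{Recall}}{\text{Precision} + \text{Recall}}$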

    Section 4: The Fundamental Application of Bayes’ Theorem

    This section presents Bayes' Theorem, first by running through a common-sense example. This is followed by the derivation of the Theorem and a clear explanation of the terms used in its formula. A set of 8 major workouts presents the use of the Theorem in different formats (vertical and horizontal tables, decision trees and graphic solutions). The last 3 workouts feed their results into a Confusion Matrix and show how it can be used to evaluate the results of the Theorem.
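
    The statement of the Theorem around which these workouts revolve is the standard one:

    $P(H \mid E) = \dfrac{P(E \mid H)\,P(H)}{P(E)}$, read as $\text{posterior} = \dfrac{\text{likelihood} \times \text{prior}}{\text{evidence}}$,

    where the evidence in the denominator is obtained from the total probability rule, $P(E) = \sum_i P(E \mid H_i)\,P(H_i)$.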

    Section 5: How to Use the Naïve Bayes Classifiers

    This is the heart of the course. It presents a wide variety of algorithms whose purpose is the supervised classification of data. The Naïve Bayes Classifiers are a family of algorithms based on Bayes' Theorem; the variants differ from one another in various ways and are listed below.

    Interleaved with the lectures detailing these algorithms through clear examples are "support" lectures that present topics the algorithms rely on.

    After two opening lectures that present the fundamentals of Naïve Bayes Classifiers and the required theory, the course proceeds with a set of lectures covering 8 Naïve Bayes Classifier variants (all of which share the decision rule sketched after the list below):

    1) Categorical Naïve Bayes Classifiers

    2) Gaussian and Continuous Naïve Bayes Classifiers

    3) Non-Gaussian Continuous Naïve Bayes Classifiers

    4) Bernoulli Naïve Bayes Classifier

    5) Multinomial Naïve Bayes Classifier

    6) Weighted Naïve Bayes Classification

    7) Complement Naïve Bayes Classification

    8) Kernel Distance Estimation and Naïve Bayes Classification
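
    All eight variants share the same decision rule and differ mainly in how the per-feature likelihoods $P(x_i \mid c)$ are modeled (categorical counts, Gaussian or Beta densities, Bernoulli or Multinomial counts, weights, complements or kernels). As a sketch, a new item with features $x_1, \dots, x_n$ is assigned the class

    $\hat{c} = \arg\max_c \; P(c) \prod_{i=1}^{n} P(x_i \mid c)$,

    or, equivalently and more stable numerically, $\hat{c} = \arg\max_c \left[\log P(c) + \sum_{i=1}^{n} \log P(x_i \mid c)\right]$. In the Gaussian variant, for instance, $P(x_i \mid c)$ is the normal density $\mathcal{N}(x_i;\, \mu_{i,c},\, \sigma^2_{i,c})$ estimated from the training data.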

    To support the presentations above, the course interleaves detailed treatments of the following methods, topics and procedures (the first and the last are sketched in formulas after the list):

    1) Laplace Smoothing Correction

    2) Extensions to Continuous Features: checking for normality, checking for independence of features, smoothing corrections for Gaussian features

    3) Two Discrete Distributions - Bernoulli and Categorical

    4) Two Discrete Distributions - Binomial and Multinomial

    5) Entropy and Information, and how they are used in Naïve Bayes Classification

    6) Kononenko Information Gain and Evaluation of Classifiers

    7) Log Odds Ratio and Nomograms used in Bayes Classification

    8) Kernel Distance Estimation - Estimating the Bandwidth h
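
    Two of these support topics lend themselves to compact formulas. Laplace smoothing (item 1) replaces a raw frequency estimate, which may be zero, with

    $\hat{P}(x_i = v \mid c) = \dfrac{N_{v,c} + 1}{N_c + k}$,

    where $N_{v,c}$ counts training items of class $c$ with feature value $v$, $N_c$ counts items of class $c$, and $k$ is the number of distinct values the feature can take. The kernel estimate of item 8 has the standard form

    $\hat{f}(x) = \dfrac{1}{nh} \sum_{j=1}^{n} K\!\left(\dfrac{x - x_j}{h}\right)$, with e.g. the Gaussian kernel $K(u) = \dfrac{1}{\sqrt{2\pi}}\, e^{-u^2/2}$,

    where the bandwidth $h$ is the parameter whose estimation closes the course.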

    Resources

    All lectures will be supported by a variety of resources:

    · Solved and documented workouts in Excel

    · Dedicated workbooks that animate and describe various probability distributions

    · Links to interesting articles and books


    Course Details

    • Language: English
    • Training sessions: 48
    • Duration: 14:07:16
    • Release Date: 2025/03/08