
Machine Learning: Neural networks from scratch


Maxime Vandegar

4:54:46

  • 1. Introduction.mp4
    02:28
  • 2. Neural networks intuitive explanation.mp4
    06:07
  • 1. Forward propagation explanation.mp4
    08:05
  • 2. Forward propagation implementation.mp4
    10:31
  • 3. Activation function ReLU.mp4
    15:16
  • 4. Sequential neural networks.mp4
    06:09
  • 1. Saving and loading neural network parameters.mp4
    12:46
  • 2. Image classification part 1.mp4
    11:01
  • 3. Image classification part 2.mp4
    09:11
  • 4. Activation function Softmax.mp4
    04:22
  • 1. Optimization by gradient descent.mp4
    10:58
  • 2. Jacobian matrix.mp4
    07:38
  • 3. Jacobian matrix implementation.mp4
    09:55
  • 4. Chain rule.mp4
    04:53
  • 5. Chain rule implementation.mp4
    09:59
  • 1. Mean Square Error Loss.mp4
    05:51
  • 2. Testing.mp4
    14:13
  • 3. Neural network training.mp4
    09:26
  • 4. Optimizers.mp4
    07:39
  • 5. Regression problem quantitative measure of diabetes progression.mp4
    15:22
  • 6. Activation function LogSoftmax.mp4
    13:35
  • 7. The Log-Sum-Exp Trick.mp4
    15:43
  • 8. Negative Log-Likelihood Loss.mp4
    12:15
  • 1. Batching Multilayer Perceptron (MLP).mp4
    09:19
  • 2. Batching losses.mp4
    09:16
  • 3. Batching activation functions.mp4
    18:49
  • 4. Jacobian-vector product.mp4
    11:32
  • 5. Xavier Initialization.mp4
    06:20
  • 1. Image classification.mp4
    14:45
  • 2. Conclusion.mp4
    01:22


    Implementation of neural networks from scratch (Python)

    What You'll Learn?


• What neural networks are
• How to implement a neural network from scratch (Python, Java, C, ...)
• How to train neural networks
• Activation functions and the universal approximation theorem
• Strengthen your knowledge in Machine Learning and Data Science
• Implementation tricks: Jacobian-vector product & log-sum-exp trick

    Who is this for?


  • For developers who would like to implement a neural network without using dedicated libraries
  • For those who study machine learning and would like to strengthen their knowledge of neural networks and automatic differentiation frameworks
  • For those preparing for job interviews in data science
  • For artificial intelligence enthusiasts

    What You Need to Know?


  • Basic knowledge of programming, algebra and analysis


    Description

    In this course, we will implement a neural network from scratch, without dedicated libraries. Although we will use the Python programming language, by the end of this course you will be able to implement a neural network in any programming language.
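As a taste of what "from scratch" means here, the following is a minimal sketch (our own illustration, not the course's code) of a two-layer forward pass written in plain Python, with no libraries at all:

```python
def linear(x, weights, bias):
    """Dense layer: y_j = sum_i W[j][i] * x_i + b_j, using plain lists."""
    return [sum(w * xi for w, xi in zip(row, x)) + b
            for row, b in zip(weights, bias)]

def relu(x):
    """ReLU activation applied element-wise."""
    return [max(0.0, v) for v in x]

# Toy network: 2 inputs -> 3 hidden units (ReLU) -> 1 output.
# Weights are arbitrary illustrative values.
x  = [1.0, -2.0]
W1 = [[0.5, -0.5], [1.0, 0.0], [0.0, 1.0]]
b1 = [0.0, 0.1, -0.1]
W2 = [[1.0, -1.0, 0.5]]
b2 = [0.2]

h = relu(linear(x, W1, b1))   # hidden activations
y = linear(h, W2, b2)         # network output
```

The same two functions, written in any language with loops and arrays, give the same network, which is why the "from scratch" skills transfer across languages.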


    We will see how neural networks work, first intuitively and then mathematically. We will also cover two important tricks: the log-sum-exp trick, which stabilizes the training of neural networks, and the Jacobian-vector product, which prevents the memory used during training from growing exponentially. Without these tricks, most neural networks could not be trained.
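The log-sum-exp trick, for instance, fits in a few lines. A naive log(sum(exp(x))) overflows for inputs around 1000, while the shifted version below stays finite (a minimal illustration, not the course's implementation):

```python
import math

def logsumexp(xs):
    """Numerically stable log(sum(exp(x))) via the log-sum-exp trick:
    factoring out max(xs) keeps every exp() argument <= 0."""
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

scores = [1000.0, 1000.0]
# math.log(sum(math.exp(x) for x in scores))  # OverflowError in the naive form
stable = logsumexp(scores)                    # 1000 + log(2), no overflow
```

This identity, log Σ exp(xᵢ) = m + log Σ exp(xᵢ − m), is exact in real arithmetic; the numerical payoff is that no individual exp() ever overflows.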


    We will train our neural networks on real image classification and regression problems. To do so, we will implement different cost functions, as well as several activation functions.
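As an example of the kind of cost function we will build, here is a minimal sketch of a mean squared error loss together with its gradient, in plain Python (function names are ours, not necessarily the course's):

```python
def mse_loss(pred, target):
    """Mean squared error: L = (1/n) * sum_i (pred_i - target_i)^2."""
    n = len(pred)
    return sum((p - t) ** 2 for p, t in zip(pred, target)) / n

def mse_grad(pred, target):
    """Gradient of the MSE w.r.t. each prediction: dL/dpred_i = 2 (pred_i - target_i) / n."""
    n = len(pred)
    return [2.0 * (p - t) / n for p, t in zip(pred, target)]
```

Pairing every loss with its hand-derived gradient is exactly what gradient descent needs, and it is the pattern repeated for each cost and activation function in the course.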


    This course is aimed at developers who would like to implement a neural network from scratch as well as those who want to understand how a neural network works from A to Z.


    This course is taught using the Python programming language and requires basic programming skills. If you do not have the required background, I recommend that you brush up on your programming skills by taking a crash course in programming. It is also recommended that you have some knowledge of Algebra and Analysis to get the most out of this course.


    Concepts covered:

    • Neural networks

    • Implementing neural networks from scratch

    • Gradient descent and Jacobian matrix

    • Creating Modules that can be nested to build complex neural architectures

    • The log-sum-exp trick

    • Jacobian-vector product

    • Activation functions (ReLU, Softmax, LogSoftmax, ...)

    • Cost functions (MSELoss, NLLLoss, ...)
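To illustrate the nested-Module idea from the list above, here is a minimal sketch in plain Python of layers that compose into a Sequential container which is itself a Module, so containers can be nested arbitrarily (class names follow the PyTorch-style convention as an assumption; the course's own code may differ):

```python
class Module:
    """Base class: anything with a forward() can be composed."""
    def forward(self, x):
        raise NotImplementedError
    def __call__(self, x):
        return self.forward(x)

class Linear(Module):
    """Dense layer over plain Python lists."""
    def __init__(self, weights, bias):
        self.weights, self.bias = weights, bias
    def forward(self, x):
        return [sum(w * xi for w, xi in zip(row, x)) + b
                for row, b in zip(self.weights, self.bias)]

class ReLU(Module):
    def forward(self, x):
        return [max(0.0, v) for v in x]

class Sequential(Module):
    """A Sequential is itself a Module, so Sequentials can contain
    Sequentials -- this nesting is what allows complex architectures."""
    def __init__(self, *modules):
        self.modules = modules
    def forward(self, x):
        for m in self.modules:
            x = m(x)
        return x

net = Sequential(Linear([[1.0, -1.0]], [0.0]), ReLU())
nested = Sequential(net, ReLU())  # a Sequential inside a Sequential
```

Because every building block shares the same interface, swapping an activation or stacking two sub-networks is a one-line change.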


    This course will be frequently updated, with the addition of bonuses.


    Don't wait any longer before launching yourself into the world of machine learning!



    Maxime Vandegar
    A recently graduated engineer, I am currently a researcher at Stanford University and a collaborating scientist at CERN. My research combines artificial intelligence (mainly deep learning) with fundamental physics. During my studies, I ran exercise sessions for several university courses (mechanics of materials, digital electronics, signals and systems, ...) and I regularly give advanced coaching sessions in Python.
    • Language: English
    • Training sessions: 30
    • Duration: 4:54:46
    • Release date: 2022/11/17