
PyTorch: Deep Learning and Artificial Intelligence


Lazy Programmer Inc., Lazy Programmer Team

Duration: 24:18:06

  • 1. Welcome.mp4
    04:03
  • 2. Overview and Outline.mp4
    13:14
  • 1.1 Data Links.html
  • 1.2 Github Link.html
  • 1. Get Your Hands Dirty, Practical Coding Experience, Data Links.mp4
    08:33
  • 2. How to use Github & Extra Coding Tips (Optional).mp4
    11:12
  • 3.1 Code Link.html
  • 3.2 Data Links.html
  • 3.3 Github Link.html
  • 3. Where to get the code, notebooks, and data.mp4
    03:23
  • 4. How to Succeed in This Course.mp4
    03:04
  • 5. Temporary 403 Errors.mp4
    02:57
  • 1. Intro to Google Colab, how to use a GPU or TPU for free.mp4
    12:33
  • 2. Uploading your own data to Google Colab.mp4
    13:12
  • 3. Where can I learn about Numpy, Scipy, Matplotlib, Pandas, and Scikit-Learn.mp4
    11:24
  • 1. What is Machine Learning.mp4
    14:26
  • 2. Regression Basics.mp4
    14:39
  • 3. Regression Code Preparation.mp4
    11:45
  • 4. Regression Notebook.mp4
    13:14
  • 5. Moores Law.mp4
    06:57
  • 6. Moores Law Notebook.mp4
    13:51
  • 7. Linear Classification Basics.mp4
    15:06
  • 8. Classification Code Preparation.mp4
    06:57
  • 9. Classification Notebook.mp4
    12:00
  • 10. Saving and Loading a Model.mp4
    05:21
  • 11. A Short Neuroscience Primer.mp4
    09:51
  • 12. How does a model learn.mp4
    10:50
  • 13. Model With Logits.mp4
    04:18
  • 14. Train Sets vs. Validation Sets vs. Test Sets.mp4
    10:12
  • 15. Suggestion Box.mp4
    03:10
  • 1. Artificial Neural Networks Section Introduction.mp4
    06:00
  • 2. Forward Propagation.mp4
    09:40
  • 3. The Geometrical Picture.mp4
    09:43
  • 4. Activation Functions.mp4
    17:18
  • 5. Multiclass Classification.mp4
    09:40
  • 6. How to Represent Images.mp4
    12:21
  • 7. Color Mixing Clarification.mp4
    00:54
  • 8. Code Preparation (ANN).mp4
    14:57
  • 9. ANN for Image Classification.mp4
    18:28
  • 10. ANN for Regression.mp4
    10:55
  • 11. How to Choose Hyperparameters.mp4
    06:24
  • 1. What is Convolution (part 1).mp4
    16:38
  • 2. What is Convolution (part 2).mp4
    05:56
  • 3. What is Convolution (part 3).mp4
    06:41
  • 4. Convolution on Color Images.mp4
    15:58
  • 5. CNN Architecture.mp4
    20:53
  • 6. CNN Code Preparation (part 1).mp4
    17:42
  • 7. CNN Code Preparation (part 2).mp4
    08:00
  • 8. CNN Code Preparation (part 3).mp4
    05:40
  • 9. CNN for Fashion MNIST.mp4
    11:32
  • 10. CNN for CIFAR-10.mp4
    08:05
  • 11. Data Augmentation.mp4
    09:45
  • 12. Batch Normalization.mp4
    05:14
  • 13. Improving CIFAR-10 Results.mp4
    10:46
  • 1. Sequence Data.mp4
    22:14
  • 2. Forecasting.mp4
    10:58
  • 3. Autoregressive Linear Model for Time Series Prediction.mp4
    12:15
  • 4. Proof that the Linear Model Works.mp4
    04:12
  • 5. Recurrent Neural Networks.mp4
    21:31
  • 6. RNN Code Preparation.mp4
    13:49
  • 7. RNN for Time Series Prediction.mp4
    09:29
  • 8. Paying Attention to Shapes.mp4
    09:33
  • 9. GRU and LSTM (pt 1).mp4
    17:35
  • 10. GRU and LSTM (pt 2).mp4
    11:45
  • 11. A More Challenging Sequence.mp4
    10:28
  • 12. RNN for Image Classification (Theory).mp4
    04:41
  • 13. RNN for Image Classification (Code).mp4
    02:48
  • 14. Stock Return Predictions using LSTMs (pt 1).mp4
    12:24
  • 15. Stock Return Predictions using LSTMs (pt 2).mp4
    06:16
  • 16. Stock Return Predictions using LSTMs (pt 3).mp4
    11:46
  • 17. Other Ways to Forecast.mp4
    05:14
  • 1. Embeddings.mp4
    13:12
  • 2. Neural Networks with Embeddings.mp4
    03:45
  • 3. Text Preprocessing Concepts.mp4
    13:33
  • 4.1 Why bad programmers always need the latest version.html
  • 4. Beginner Blues - PyTorch NLP Version.mp4
    10:36
  • 5. (Legacy) Text Preprocessing Code Preparation.mp4
    11:53
  • 6. (Legacy) Text Preprocessing Code Example.mp4
    07:53
  • 7. Text Classification with LSTMs (V2).mp4
    17:42
  • 8. CNNs for Text.mp4
    12:07
  • 9. Text Classification with CNNs (V2).mp4
    07:16
  • 10. (Legacy) VIP Making Predictions with a Trained NLP Model.mp4
    07:37
  • 11. VIP Making Predictions with a Trained NLP Model (V2).mp4
    04:21
  • 1. Recommender Systems with Deep Learning Theory.mp4
    10:26
  • 2. Recommender Systems with Deep Learning Code Preparation.mp4
    09:38
  • 3. Recommender Systems with Deep Learning Code (pt 1).mp4
    08:52
  • 4. Recommender Systems with Deep Learning Code (pt 2).mp4
    12:31
  • 5. VIP Making Predictions with a Trained Recommender Model.mp4
    04:51
  • 1. Transfer Learning Theory.mp4
    08:12
  • 2. Some Pre-trained Models (VGG, ResNet, Inception, MobileNet).mp4
    04:05
  • 3. Large Datasets.mp4
    07:11
  • 4. 2 Approaches to Transfer Learning.mp4
    04:52
  • 5. Transfer Learning Code (pt 1).mp4
    09:36
  • 6. Transfer Learning Code (pt 2).mp4
    07:40
  • 1. GAN Theory.mp4
    16:03
  • 2. GAN Code Preparation.mp4
    06:18
  • 3. GAN Code.mp4
    09:21
  • 1. Deep Reinforcement Learning Section Introduction.mp4
    06:34
  • 2. Elements of a Reinforcement Learning Problem.mp4
    20:18
  • 3. States, Actions, Rewards, Policies.mp4
    09:24
  • 4. Markov Decision Processes (MDPs).mp4
    10:07
  • 5. The Return.mp4
    04:56
  • 6. Value Functions and the Bellman Equation.mp4
    09:53
  • 7. What does it mean to learn.mp4
    07:18
  • 8. Solving the Bellman Equation with Reinforcement Learning (pt 1).mp4
    09:48
  • 9. Solving the Bellman Equation with Reinforcement Learning (pt 2).mp4
    12:04
  • 10. Epsilon-Greedy.mp4
    06:09
  • 11. Q-Learning.mp4
    14:16
  • 12. Deep Q-Learning DQN (pt 1).mp4
    14:05
  • 13. Deep Q-Learning DQN (pt 2).mp4
    10:25
  • 14. How to Learn Reinforcement Learning.mp4
    05:57
  • 1. Reinforcement Learning Stock Trader Introduction.mp4
    05:14
  • 2. Data and Environment.mp4
    12:22
  • 3. Replay Buffer.mp4
    05:40
  • 4. Program Design and Layout.mp4
    06:56
  • 5. Code pt 1.mp4
    09:22
  • 6. Code pt 2.mp4
    09:40
  • 7. Code pt 3.mp4
    06:54
  • 8. Code pt 4.mp4
    07:25
  • 9. Reinforcement Learning Stock Trader Discussion.mp4
    03:36
  • 1. Custom Loss and Estimating Prediction Uncertainty.mp4
    09:36
  • 2. Estimating Prediction Uncertainty Code.mp4
    07:12
  • 1. Facial Recognition Section Introduction.mp4
    03:39
  • 2. Siamese Networks.mp4
    10:17
  • 3. Code Outline.mp4
    05:05
  • 4. Loading in the data.mp4
    05:52
  • 5. Splitting the data into train and test.mp4
    04:27
  • 6. Converting the data into pairs.mp4
    05:04
  • 7. Generating Generators.mp4
    05:06
  • 8. Creating the model and loss.mp4
    04:28
  • 9. Accuracy and imbalanced classes.mp4
    07:48
  • 10. Facial Recognition Section Summary.mp4
    03:32
  • 1. Mean Squared Error.mp4
    09:11
  • 2. Binary Cross Entropy.mp4
    05:58
  • 3. Categorical Cross Entropy.mp4
    08:06
  • 1. Gradient Descent.mp4
    07:52
  • 2. Stochastic Gradient Descent.mp4
    04:36
  • 3. Momentum.mp4
    06:10
  • 4. Variable and Adaptive Learning Rates.mp4
    11:45
  • 5. Adam (pt 1).mp4
    13:15
  • 6. Adam (pt 2).mp4
    11:14
  • 1. Where Are The Exercises.mp4
    04:03
  • 1. Pre-Installation Check.mp4
    04:12
  • 2. How to install Numpy, Scipy, Matplotlib, Pandas, IPython, Theano, and TensorFlow.mp4
    17:30
  • 3. Anaconda Environment Setup.mp4
    20:20
  • 4. Installing NVIDIA GPU-Accelerated Deep Learning Libraries on your Home Computer.mp4
    22:15
  • 1. Beginners Coding Tips.mp4
    13:21
  • 2. How to Code Yourself (part 1).mp4
    15:54
  • 3. How to Code Yourself (part 2).mp4
    09:23
  • 4. Proof that using Jupyter Notebook is the same as not using it.mp4
    12:29
  • 1. How to Succeed in this Course (Long Version).mp4
    10:24
  • 2. Is this for Beginners or Experts Academic or Practical Fast or slow-paced.mp4
    22:04
  • 3. Machine Learning and AI Prerequisite Roadmap (pt 1).mp4
    11:18
  • 4. Machine Learning and AI Prerequisite Roadmap (pt 2).mp4
    16:07
  • 1. What is the Appendix.mp4
    02:48
  • 2. BONUS.mp4
    05:31


    Neural Networks for Computer Vision, Time Series Forecasting, NLP, GANs, Reinforcement Learning, and More!

    What You'll Learn?


    • Artificial Neural Networks (ANNs) / Deep Neural Networks (DNNs)
    • Predict Stock Returns
    • Time Series Forecasting
    • Computer Vision
    • How to build a Deep Reinforcement Learning Stock Trading Bot
    • GANs (Generative Adversarial Networks)
    • Recommender Systems
    • Image Recognition
    • Convolutional Neural Networks (CNNs)
    • Recurrent Neural Networks (RNNs)
    • Natural Language Processing (NLP) with Deep Learning
    • Demonstrate Moore's Law using Code
    • Transfer Learning to create state-of-the-art image classifiers

    Who is this for?


  • Beginners to advanced students who want to learn about deep learning and AI in PyTorch
    What You Need to Know?


  • Know how to code in Python and Numpy
  • For the theoretical parts (optional), understand derivatives and probability


    Description

    Welcome to PyTorch: Deep Learning and Artificial Intelligence!


    Although Google's deep learning library, TensorFlow, has gained massive popularity over the past few years, PyTorch has been the library of choice for professionals and researchers around the globe for deep learning and artificial intelligence.

    Is it possible that TensorFlow is popular only because Google is popular and used effective marketing?

    Why did TensorFlow change so significantly between version 1 and version 2? Was there something deeply flawed with it, and are there still potential problems?

    It is less well-known that PyTorch is backed by another Internet giant, Facebook (specifically, the Facebook AI Research Lab - FAIR). So if you want a popular deep learning library backed by billion-dollar companies and lots of community support, you can't go wrong with PyTorch. And maybe it's a bonus that the library won't completely ruin all your old code when it advances to the next version. ;)

    On the flip side, it is very well-known that all the top AI shops (e.g., OpenAI, Apple, and JPMorgan Chase) use PyTorch. OpenAI switched to PyTorch in 2020, a strong sign that PyTorch is picking up steam.

    If you are a professional, you will quickly recognize that building and testing new ideas is extremely easy with PyTorch, while it can be pretty hard in other libraries that try to do everything for you. Oh, and it's faster.
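
    To make that claim concrete, here is a minimal, purely illustrative sketch (not code from the course; the layer sizes and random data are placeholder assumptions) of defining a small model, choosing a loss and optimizer, and running one training step:

        # Minimal illustrative sketch (not course code): define a model, pick a
        # loss and an optimizer, and run one gradient step on random placeholder data.
        import torch
        import torch.nn as nn

        model = nn.Sequential(            # a tiny fully-connected network
            nn.Linear(10, 32),
            nn.ReLU(),
            nn.Linear(32, 1),
        )
        loss_fn = nn.MSELoss()            # regression loss
        optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

        X = torch.randn(64, 10)           # fake batch: 64 samples, 10 features
        y = torch.randn(64, 1)

        optimizer.zero_grad()             # clear old gradients
        loss = loss_fn(model(X), y)       # forward pass
        loss.backward()                   # backpropagation
        optimizer.step()                  # parameter update
        print(loss.item())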


    Deep Learning has been responsible for some amazing achievements recently, such as:

    • Generating beautiful, photo-realistic images of people and things that never existed (GANs)

    • Beating world champions in the strategy game Go, and complex video games like CS:GO and Dota 2 (Deep Reinforcement Learning)

    • Self-driving cars (Computer Vision)

    • Speech recognition (e.g. Siri) and machine translation (Natural Language Processing)

    • Even creating videos of people doing and saying things they never did (DeepFakes - a potentially nefarious application of deep learning)


    This course is for beginner-level students all the way up to expert-level students. How can this be?

    If you've just taken my free Numpy prerequisite, then you know everything you need to jump right in. We will start with some very basic machine learning models and advance to state-of-the-art concepts.

    Along the way, you will learn about all of the major deep learning architectures, such as Deep Neural Networks, Convolutional Neural Networks (image processing), and Recurrent Neural Networks (sequence data).
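
    As a purely illustrative sketch (the layer sizes below are assumptions, not course material), this is roughly what those three architecture families look like when declared in PyTorch:

        # Illustrative sketch only (not course code): roughly what each architecture
        # family looks like when declared in PyTorch; sizes are arbitrary placeholders.
        import torch.nn as nn

        dnn = nn.Sequential(                        # fully-connected (dense) network
            nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))

        cnn = nn.Sequential(                        # convolutional network, assuming 3x32x32 images
            nn.Conv2d(3, 32, kernel_size=3), nn.ReLU(),   # -> 32 x 30 x 30
            nn.MaxPool2d(2), nn.Flatten(),                # -> 32 * 15 * 15 features
            nn.Linear(32 * 15 * 15, 10))

        rnn = nn.LSTM(input_size=20, hidden_size=64, batch_first=True)  # recurrent net for sequence data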

    Current projects include:

    • Natural Language Processing (NLP)

    • Recommender Systems

    • Transfer Learning for Computer Vision

    • Generative Adversarial Networks (GANs)

    • Deep Reinforcement Learning Stock Trading Bot

    Even if you've taken all of my previous courses already, you will still learn how to convert your previous code so that it uses PyTorch, and there are all-new, never-before-seen projects in this course, such as time series forecasting and stock predictions.

    This course is designed for students who want to learn fast, but there are also "in-depth" sections in case you want to dig a little deeper into the theory (like what is a loss function, and what are the different types of gradient descent approaches).
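
    For readers who want a preview of how those in-depth topics map to code, the sketch below lists the standard PyTorch names for the losses and optimizers covered there; the placeholder model and hyperparameter values are illustrative assumptions only:

        # Illustrative sketch only: PyTorch equivalents of the losses and optimizers
        # discussed in the theory sections. The model and hyperparameters are placeholders.
        import torch
        import torch.nn as nn

        model = nn.Linear(20, 5)             # placeholder model

        mse = nn.MSELoss()                   # mean squared error (regression)
        bce = nn.BCEWithLogitsLoss()         # binary cross entropy (on logits)
        cce = nn.CrossEntropyLoss()          # categorical cross entropy (on logits)

        sgd = torch.optim.SGD(model.parameters(), lr=0.1)                         # plain (stochastic) gradient descent
        sgd_momentum = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)  # with momentum
        adam = torch.optim.Adam(model.parameters(), lr=1e-3)                      # adaptive learning rates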

    I'm taking the approach that even if you are not 100% comfortable with the mathematical concepts, you can still do this! In this course, we focus more on the PyTorch library, rather than deriving any mathematical equations. I have tons of courses for that already, so there is no need to repeat that here.


    Instructor's Note: This course focuses on breadth rather than depth, with less theory in favor of building more cool stuff. If you are looking for a more theory-dense course, this is not it. Generally, for each of these topics (recommender systems, natural language processing, reinforcement learning, computer vision, GANs, etc.) I already have courses singularly focused on those topics.


    Thanks for reading, and I’ll see you in class!


    WHAT ORDER SHOULD I TAKE YOUR COURSES IN?:

    • Check out the lecture "Machine Learning and AI Prerequisite Roadmap" (available in the FAQ of any of my courses, including the free Numpy course)


    UNIQUE FEATURES

    • Every line of code explained in detail - email me any time if you disagree

    • No wasted time "typing" on the keyboard like other courses - let's be honest, nobody can really write code worth learning about in just 20 minutes from scratch

    • Not afraid of university-level math - get important details about algorithms that other courses leave out


    Lazy Programmer Inc.

    Today, I spend most of my time as an artificial intelligence and machine learning engineer with a focus on deep learning, although I have also been known as a data scientist, big data engineer, and full stack software engineer.

    I received my first master's degree over a decade ago in computer engineering with a specialization in machine learning and pattern recognition. I received my second master's degree in statistics with applications to financial engineering.

    My experience includes online advertising and digital media, as both a data scientist (optimizing click and conversion rates) and a big data engineer (building data processing pipelines). Some big data technologies I frequently use are Hadoop, Pig, Hive, MapReduce, and Spark. I've created deep learning models to predict click-through rate and user behavior, as well as for image and signal processing and modeling text. My work in recommendation systems has applied Reinforcement Learning and Collaborative Filtering, and we validated the results using A/B testing.

    I have taught undergraduate and graduate students in data science, statistics, machine learning, algorithms, calculus, computer graphics, and physics at universities such as Columbia University, NYU, Hunter College, and The New School.

    Multiple businesses have benefitted from my web programming expertise. I do all the backend (server), frontend (HTML/JS/CSS), and operations/deployment work. Some of the technologies I've used are: Python, Ruby/Rails, PHP, Bootstrap, jQuery (Javascript), Backbone, and Angular. For storage/databases I've used MySQL, Postgres, Redis, MongoDB, and more.
    Lazy Programmer Team
    • Language: English
    • Training sessions: 150
    • Duration: 24:18:06
    • English subtitles: yes
    • Release date: 2023/12/16