
Introduction to Transformer Models for NLP

Duration: 10:13:27
  • 001. Introduction to Transformer Models for NLP Introduction.mp4
    03:08
  • 001. Topics.mp4
    00:53
  • 002. 1.1 A brief history of NLP.mp4
    08:06
  • 003. 1.2 Paying attention with attention.mp4
    07:58
  • 004. 1.3 Encoder-decoder architectures.mp4
    07:04
  • 005. 1.4 How language models look at text.mp4
    07:05
  • 001. Topics.mp4
    01:04
  • 002. 2.1 Introduction to transformers.mp4
    02:33
  • 003. 2.2 Scaled dot product attention.mp4
    28:49
  • 004. 2.3 Multi-headed attention.mp4
    16:50
  • 001. Topics.mp4
    00:32
  • 002. 3.1 Introduction to Transfer Learning.mp4
    14:40
  • 003. 3.2 Introduction to PyTorch.mp4
    10:13
  • 004. 3.3 Fine-tuning transformers with PyTorch.mp4
    06:47
  • 001. Topics.mp4
    00:39
  • 002. 4.1 Introduction to BERT.mp4
    21:51
  • 003. 4.2 Wordpiece tokenization.mp4
    24:23
  • 004. 4.3 The many embeddings of BERT.mp4
    15:17
  • 001. Topics.mp4
    00:43
  • 002. 5.1 The Masked Language Modeling Task.mp4
    10:25
  • 003. 5.2 The Next Sentence Prediction Task.mp4
    10:40
  • 004. 5.3 Fine-tuning BERT to solve NLP tasks.mp4
    14:57
  • 001. Topics.mp4
    00:40
  • 002. 6.1 Flavors of BERT.mp4
    13:00
  • 003. 6.2 BERT for sequence classification.mp4
    38:08
  • 004. 6.3 BERT for token classification.mp4
    17:39
  • 005. 6.4 BERT for question answering.mp4
    21:59
  • 001. Topics.mp4
    00:51
  • 002. 7.1 Introduction to the GPT family.mp4
    18:10
  • 003. 7.2 Masked multi-headed attention.mp4
    27:41
  • 004. 7.3 Pre-training GPT.mp4
    09:02
  • 005. 7.4 Few-shot learning.mp4
    13:28
  • 001. Topics.mp4
    00:39
  • 002. 8.1 GPT for style completion.mp4
    15:57
  • 003. 8.2 GPT for code dictation.mp4
    25:18
  • 001. Topics.mp4
    00:29
  • 002. 9.1 Siamese BERT-networks for semantic search.mp4
    36:02
  • 003. 9.2 Teaching GPT multiple tasks at once with.mp4
    19:06
  • 001. Topics.mp4
    00:42
  • 002. 10.1 Encoders and decoders welcome: T5's architecture.mp4
    07:29
  • 003. 10.2 Cross-attention.mp4
    05:38
  • 001. Topics.mp4
    00:26
  • 002. 11.1 Off the shelf results with T5.mp4
    14:10
  • 003. 11.2 Using T5 for abstractive summarization.mp4
    15:48
  • 001. Topics.mp4
    00:37
  • 002. 12.1 Introduction to the Vision Transformer (ViT).mp4
    12:44
  • 003. 12.2 Fine-tuning an image captioning system.mp4
    29:10
  • 001. Topics.mp4
    00:36
  • 002. 13.1 Introduction to MLOps.mp4
    17:04
  • 003. 13.2 Sharing our models on HuggingFace.mp4
    16:35
  • 004. 13.3 Deploying a fine-tuned BERT model using Fast.mp4
    17:42
  • 001. Introduction to Transformer Models for NLP Summary.mp4
    02:00


    Pearson's video training library is an indispensable learning tool for today's competitive job market. Having essential technology training and certifications can open doors for career advancement and life enrichment. We take learning personally. We've published hundreds of up-to-date videos on a wide variety of key topics for professionals and IT certification candidates. Now you can learn from renowned industry experts from anywhere in the world, without leaving home.
    • Language: English
    • Training sessions: 52
    • Duration: 10:13:27
    • Subtitles: English
    • Release date: 2023/03/28