Sequence Models | Deep Learning Specialization | Coursera

Course plan

Week 1: Recurrent neural networks

Learn about recurrent neural networks. This type of model has proven to perform extremely well on temporal data. It has several variants, including LSTMs, GRUs, and bidirectional RNNs, which you will learn about in this section.
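
As a taste of what C5W1L03 covers, here is a minimal NumPy sketch of a single RNN time step: the hidden state a_t = tanh(Waa a_{t-1} + Wax x_t + ba) is carried forward through time, and a prediction y_t = softmax(Wya a_t + by) is read off at each step. All dimensions and weight values below are toy choices for illustration, not trained parameters.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax along the first axis."""
    e = np.exp(z - z.max(axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)

def rnn_cell_forward(x_t, a_prev, Waa, Wax, Wya, ba, by):
    """One RNN time step: a_t = tanh(Waa @ a_prev + Wax @ x_t + ba),
    y_t = softmax(Wya @ a_t + by)."""
    a_t = np.tanh(Waa @ a_prev + Wax @ x_t + ba)
    y_t = softmax(Wya @ a_t + by)
    return a_t, y_t

rng = np.random.default_rng(0)
n_x, n_a, n_y = 3, 5, 2           # toy input, hidden, and output sizes
params = dict(
    Waa=rng.standard_normal((n_a, n_a)) * 0.1,
    Wax=rng.standard_normal((n_a, n_x)) * 0.1,
    Wya=rng.standard_normal((n_y, n_a)) * 0.1,
    ba=np.zeros((n_a, 1)),
    by=np.zeros((n_y, 1)),
)

a = np.zeros((n_a, 1))            # initial hidden state a_0
for t in range(4):                # unroll over a short random sequence
    x_t = rng.standard_normal((n_x, 1))
    a, y = rnn_cell_forward(x_t, a, **params)
    print(f"t={t}  y_t={y.ravel().round(3)}")
```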

  • Recurrent neural networks
    • C5W1L01 Why sequence models
    • C5W1L02 Notation
    • C5W1L03 Recurrent neural network model
    • C5W1L04 Backpropagation through time
    • C5W1L05 Different types of RNNs
    • C5W1L06 Language model and sequence generation
    • C5W1L07 Sampling novel sequences
    • C5W1L08 Vanishing gradients with RNNs
    • C5W1L09 Gated recurrent unit (GRU)
    • C5W1L10 Long short-term memory (LSTM)
    • C5W1L11 Bidirectional RNN
    • C5W1L12 Deep RNNs
  • Practice questions
    • C5W1Q1 Recurrent neural networks
  • Programming assignments
    • C5W1P1 Building a recurrent neural network – Step by step
    • C5W1P2 Dinosaur island – Character-level language modeling
Week 2: Natural language processing & word embeddings

Natural language processing with deep learning is an important combination. Using word vector representations and embedding layers, you can train recurrent neural networks that achieve outstanding performance across a wide variety of industries. Example applications are sentiment analysis, named entity recognition, and machine translation.
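
As a taste of C5W2L03 (properties of word embeddings), here is a small NumPy sketch of solving a word analogy with cosine similarity. The four-dimensional vectors are hand-made toys chosen so the analogy works; real embeddings such as Word2vec or GloVe are learned from large corpora and typically have 50-300 dimensions.

```python
import numpy as np

# Toy 4-dimensional embeddings, hand-made for illustration only.
E = {
    "king":  np.array([0.8, 0.7, 0.1, 0.0]),
    "queen": np.array([0.8, 0.1, 0.7, 0.0]),
    "man":   np.array([0.2, 0.9, 0.1, 0.1]),
    "woman": np.array([0.2, 0.1, 0.9, 0.1]),
    "apple": np.array([0.0, 0.1, 0.1, 0.9]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between u and v; 1.0 means same direction."""
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

# "man is to woman as king is to ?"  ->  find the word whose embedding
# is closest to e_king - e_man + e_woman, excluding the query words.
target = E["king"] - E["man"] + E["woman"]
best = max(
    (w for w in E if w not in {"king", "man", "woman"}),
    key=lambda w: cosine_similarity(E[w], target),
)
print(best)   # -> queen (with these toy vectors)
```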

  • Introduction to word embeddings
    • C5W2L01 Word representation
    • C5W2L02 Using word embeddings
    • C5W2L03 Properties of word embeddings
    • C5W2L04 Embedding matrix
  • Learning word embeddings: Word2vec & GloVe
    • C5W2L05 Learning word embeddings
    • C5W2L06 Word2vec
    • C5W2L07 Negative sampling
    • C5W2L08 GloVe word vectors
  • Applications using word embeddings
    • C5W2L09 Sentiment classification
    • C5W2L10 Debiasing word embeddings
  • Practice questions
    • C5W2Q1 Natural language processing & word embeddings
  • Programming assignments
    • C5W2P1 Operations on word vectors – Debiasing
    • C5W2P2 Emojify
Week 3: Sequence models & attention mechanism

Sequence models can be augmented with an attention mechanism, which helps the model learn where to focus given a sequence of inputs. This week, you will also learn about speech recognition and how to deal with audio data.
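
As a preview of the attention lectures (C5W3L07-08), here is a minimal NumPy sketch of the core mechanism: score each encoder hidden state against the current decoder state, softmax the scores into attention weights, and take the weighted sum of encoder states as the context vector. Plain dot-product scoring is used here for brevity; the model in the lectures instead learns the alignment scores with a small neural network.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def attention_context(encoder_states, decoder_state):
    """encoder_states: (Tx, n) array of encoder hidden states;
    decoder_state: (n,) current decoder hidden state.
    Returns the attention weights and the context vector."""
    scores = encoder_states @ decoder_state    # one score per input step
    alphas = softmax(scores)                   # weights that sum to 1
    context = alphas @ encoder_states          # weighted sum of states
    return alphas, context

rng = np.random.default_rng(0)
Tx, n = 6, 8                                   # 6 input steps, 8-dim states
encoder_states = rng.standard_normal((Tx, n))
decoder_state = rng.standard_normal(n)

alphas, context = attention_context(encoder_states, decoder_state)
print("attention weights:", alphas.round(3))   # where the model "looks"
print("context vector shape:", context.shape)  # (8,)
```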

  • Various sequence to sequence architectures
    • C5W3L01 Basic models
    • C5W3L02 Picking the most likely sentence
    • C5W3L03 Beam search
    • C5W3L04 Refinements to beam search
    • C5W3L05 Error analysis in beam search
    • C5W3L06 BLEU score (optional)
    • C5W3L07 Attention model intuition
    • C5W3L08 Attention model
  • Speech recognition – Audio data
    • C5W3L09 Speech recognition
    • C5W3L10 Trigger word detection
  • Conclusion
    • C5W3L11 Conclusion and thank you
  • Practice questions
    • C5W3Q1 Sequence models & attention mechanism
  • Programming assignments
    • C5W3P1 Neural machine translation with attention
    • C5W3P2 Trigger word detection
