These are handwritten notes from the first session (READ_2)


What I learned:

  1. Attention

  2. Encoder & Decoder model (architecture)

  3. BERT Model

  4. Context Length
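A minimal sketch of item 1, the attention mechanism, as scaled dot-product attention in NumPy (my own illustrative code, not from the lecture; the function name and toy shapes are assumptions):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for 2-D query/key/value arrays."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # query-key similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
    return weights @ V                              # weighted sum of values

# Toy example (hypothetical sizes): 2 queries, 3 key/value pairs, dim 4
rng = np.random.default_rng(0)
Q = rng.standard_normal((2, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4): one output vector per query
```

Each output row is a convex combination of the value rows, weighted by how well the query matches each key.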

Lecture_transformer_02.pdf