# Chapter 3: Coding Attention Mechanisms

## Main Chapter Code

- [01_main-chapter-code](01_main-chapter-code) contains the main chapter code

## Bonus Materials

- [02_bonus_efficient-multihead-attention](02_bonus_efficient-multihead-attention) implements and compares different implementation variants of multi-head attention
- [03_understanding-buffers](03_understanding-buffers) explains the idea behind PyTorch buffers, which are used to implement the causal attention mechanism in chapter 3
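As a quick illustration of the buffer idea mentioned above, the sketch below registers a causal mask via `register_buffer`, so it is saved in the `state_dict` and moved across devices with the module without being treated as a trainable parameter. This is a minimal, illustrative version only; the class name `CausalAttention` and the chosen dimensions are assumptions here, and the chapter code is the authoritative implementation.

```python
import torch
import torch.nn as nn


class CausalAttention(nn.Module):
    """Minimal causal self-attention sketch (illustrative, not the chapter code)."""

    def __init__(self, d_in, d_out, context_length):
        super().__init__()
        self.W_query = nn.Linear(d_in, d_out, bias=False)
        self.W_key = nn.Linear(d_in, d_out, bias=False)
        self.W_value = nn.Linear(d_in, d_out, bias=False)
        # Buffer: stored in state_dict and moved with .to(device),
        # but excluded from the optimizer's parameter list
        self.register_buffer(
            "mask",
            torch.triu(torch.ones(context_length, context_length), diagonal=1),
        )

    def forward(self, x):
        b, num_tokens, _ = x.shape
        queries = self.W_query(x)
        keys = self.W_key(x)
        values = self.W_value(x)
        scores = queries @ keys.transpose(1, 2)
        # Mask out future positions so each token attends only to itself and the past
        scores.masked_fill_(self.mask.bool()[:num_tokens, :num_tokens], -torch.inf)
        weights = torch.softmax(scores / keys.shape[-1] ** 0.5, dim=-1)
        return weights @ values


x = torch.randn(2, 4, 8)  # (batch, tokens, embedding dim)
attn = CausalAttention(d_in=8, d_out=16, context_length=4)
print(attn(x).shape)  # torch.Size([2, 4, 16])
```

Note that `"mask"` appears in `attn.state_dict()` even though it is not a parameter, which is exactly the behavior the buffers bonus material explores.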