Mirror of https://github.com/rasbt/LLMs-from-scratch.git (synced 2025-08-15 04:01:44 +00:00)

# Chapter 3: Coding Attention Mechanisms

## Main Chapter Code
- `01_main-chapter-code` contains the main chapter code.
## Bonus Materials
- `02_bonus_efficient-multihead-attention` implements and compares several implementation variants of multi-head attention.
- `03_understanding-buffers` explains the idea behind PyTorch buffers, which are used to implement the causal attention mechanism in Chapter 3.
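To give a flavor of what "comparing implementation variants" means, here is a minimal sketch (not the bonus notebook's actual code) contrasting a hand-written scaled dot-product attention with PyTorch's fused `torch.nn.functional.scaled_dot_product_attention`; with no mask and no dropout, the two should agree numerically:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
# Illustrative shapes: (batch, num_heads, num_tokens, head_dim)
q = torch.randn(1, 2, 6, 8)
k = torch.randn(1, 2, 6, 8)
v = torch.randn(1, 2, 6, 8)

# Variant 1: manual scaled dot-product attention
scores = q @ k.transpose(-2, -1) / k.shape[-1] ** 0.5
manual = torch.softmax(scores, dim=-1) @ v

# Variant 2: PyTorch's fused implementation (may dispatch to an optimized kernel)
fused = F.scaled_dot_product_attention(q, k, v)

print(torch.allclose(manual, fused, atol=1e-6))  # True
```

The bonus material benchmarks variants like these, since the fused kernel can be substantially faster while producing the same result.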
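As a rough illustration of the buffer idea (a sketch under assumed module and variable names, not the book's exact class): registering the causal mask with `register_buffer` stores it in the module's `state_dict` and moves it along with `model.to(device)`, without making it a trainable parameter:

```python
import torch
import torch.nn as nn

class CausalAttention(nn.Module):
    def __init__(self, d_in, d_out, context_length):
        super().__init__()
        self.W_query = nn.Linear(d_in, d_out, bias=False)
        self.W_key = nn.Linear(d_in, d_out, bias=False)
        self.W_value = nn.Linear(d_in, d_out, bias=False)
        # A buffer, not a parameter: saved/loaded with the model and moved by
        # .to(device), but never updated by the optimizer.
        self.register_buffer(
            "mask",
            torch.triu(torch.ones(context_length, context_length), diagonal=1).bool(),
        )

    def forward(self, x):
        b, num_tokens, _ = x.shape
        q, k, v = self.W_query(x), self.W_key(x), self.W_value(x)
        scores = q @ k.transpose(1, 2) / k.shape[-1] ** 0.5
        # Mask out future positions so each token attends only to itself and the past
        scores = scores.masked_fill(self.mask[:num_tokens, :num_tokens], float("-inf"))
        weights = torch.softmax(scores, dim=-1)
        return weights @ v

torch.manual_seed(0)
attn = CausalAttention(d_in=4, d_out=4, context_length=8)
out = attn(torch.randn(1, 6, 4))
print(out.shape)  # torch.Size([1, 6, 4])
```

Because the mask lives in a buffer rather than a plain attribute, it appears in `attn.named_buffers()` but not in `attn.named_parameters()`.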