
Chapter 2: Working with Text Data

Main Chapter Code

Bonus Materials

  • 02_bonus_bytepair-encoder contains optional (bonus) code to benchmark different byte pair encoder implementations.

  • 03_bonus_embedding-vs-matmul contains optional (bonus) code to explain that embedding layers and fully connected layers applied to one-hot encoded vectors are equivalent.

  • 04_bonus_dataloader-intuition contains optional (bonus) code to explain the data loader more intuitively with simple numbers rather than text.

  • 05_bpe-from-scratch contains (bonus) code that implements and trains a GPT-2 BPE tokenizer from scratch.
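The equivalence described in 03_bonus_embedding-vs-matmul can be sketched in a few lines of plain Python: an embedding layer is just a row lookup into a weight matrix, which is exactly what a matrix multiplication with a one-hot vector computes. The toy weight matrix and helper names below are illustrative, not taken from the bonus notebook:

```python
# Toy weight matrix: vocab_size=4 rows, embed_dim=3 columns
W = [[0.1, 0.2, 0.3],
     [0.4, 0.5, 0.6],
     [0.7, 0.8, 0.9],
     [1.0, 1.1, 1.2]]

def embedding_lookup(token_id, weights):
    # An embedding layer simply returns the weight row for the given token ID
    return weights[token_id]

def matmul_onehot(token_id, weights):
    # Equivalent view: one-hot encode the ID, then multiply by the weight matrix
    onehot = [1.0 if i == token_id else 0.0 for i in range(len(weights))]
    return [sum(onehot[i] * weights[i][j] for i in range(len(weights)))
            for j in range(len(weights[0]))]

# Both paths produce the same embedding vector for every token ID
for tid in range(len(W)):
    assert embedding_lookup(tid, W) == matmul_onehot(tid, W)
```

The lookup is simply much cheaper, since it skips multiplying by all the zeros in the one-hot vector.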
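The data-loader intuition from 04_bonus_dataloader-intuition boils down to sliding a fixed-size window over the token sequence, with the target window shifted one position to the right. A minimal sketch with consecutive numbers as stand-in token IDs (the variable names are mine, chosen to mirror common PyTorch DataLoader parameters):

```python
# Use consecutive numbers as "token IDs" so the windows are easy to follow
tokens = list(range(10))  # [0, 1, ..., 9]
max_length = 4            # context size of each input window
stride = 4                # how far the window moves between samples

inputs, targets = [], []
for i in range(0, len(tokens) - max_length, stride):
    inputs.append(tokens[i:i + max_length])          # input window
    targets.append(tokens[i + 1:i + max_length + 1])  # shifted by one

print(inputs)   # [[0, 1, 2, 3], [4, 5, 6, 7]]
print(targets)  # [[1, 2, 3, 4], [5, 6, 7, 8]]
```

With stride equal to max_length the windows tile the sequence without overlap; a smaller stride yields overlapping training samples.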
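The core of the from-scratch BPE training in 05_bpe-from-scratch is a repeated merge step: count adjacent token pairs, then replace the most frequent pair with a new token ID. A stripped-down sketch of one such step, with hypothetical token IDs (this is a simplified illustration of the general technique, not the bonus notebook's actual implementation):

```python
from collections import Counter

def most_frequent_pair(ids):
    # Count adjacent ID pairs; the most frequent pair is merged next
    return Counter(zip(ids, ids[1:])).most_common(1)[0][0]

def merge(ids, pair, new_id):
    # Replace every occurrence of `pair` with the new token ID
    out, i = [], 0
    while i < len(ids):
        if i < len(ids) - 1 and (ids[i], ids[i + 1]) == pair:
            out.append(new_id)
            i += 2
        else:
            out.append(ids[i])
            i += 1
    return out

ids = [1, 2, 3, 1, 2, 4]        # hypothetical starting token IDs
pair = most_frequent_pair(ids)  # (1, 2) occurs twice
ids = merge(ids, pair, new_id=5)
print(ids)                      # [5, 3, 5, 4]
```

Training repeats this step until the vocabulary reaches the desired size, recording each merge so encoding can replay them in order.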

In the video below, I provide a code-along session that covers some of the chapter contents as supplementary material.



Link to the video