12 Commits

Author · SHA1 · Date
    Message
casinca · 564e986496 · 2025-06-22 12:15:01 -05:00
    fix issue #664 - inverted token and pos emb layers (#665)
    * fix inverted token and pos layers
    * remove redundant code
    Co-authored-by: rasbt <mail@sebastianraschka.com>

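Commit 564e986496 fixes swapped token and positional embedding layers. As a minimal sketch of why the two layers are not interchangeable (layer names and the GPT-2-like sizes here are assumptions, not the repository's exact code): the token table is indexed by token id, the positional table by position, so their constructor arguments differ.

```python
import torch
import torch.nn as nn

vocab_size, context_length, emb_dim = 50257, 1024, 768  # assumed GPT-2-like sizes

# Token embeddings are looked up by token id; positional embeddings by position.
# Swapping the two layers' sizes (the kind of inversion the commit fixes) would
# make out-of-range lookups fail or silently use the wrong table.
tok_emb = nn.Embedding(vocab_size, emb_dim)
pos_emb = nn.Embedding(context_length, emb_dim)

token_ids = torch.tensor([[15496, 11, 995]])  # batch of 1, sequence length 3
seq_len = token_ids.shape[1]

# Broadcast-add per-position vectors onto per-token vectors.
x = tok_emb(token_ids) + pos_emb(torch.arange(seq_len))
print(x.shape)  # torch.Size([1, 3, 768])
```

The positional table is sized by `context_length`, so any sequence up to that length can be embedded with the same two layers.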
Sebastian Raschka · 08040f024c · 2024-07-24 21:53:41 -05:00
    Test code in pytorch 2.4 (#285)
    * test code in pytorch 2.4
    * update

rasbt · 39c4a887eb · 2024-06-09 06:04:02 -05:00
    add allowed_special={"<|endoftext|>"}

Sebastian Raschka · 72a073bbbf · 2024-06-08 14:57:34 -05:00
    Remove leftover instances of self.tokenizer (#201)
    * Remove leftover instances of self.tokenizer
    * add endoftext token

rasbt · 98d453b666 · 2024-05-24 07:20:37 -05:00
    update formatting

James Holcombe · 05718c6b94 · 2024-04-10 21:16:19 -04:00
    Use instance tokenizer (#116)
    * Use instance tokenizer
    * consistency updates
    Co-authored-by: Sebastian Raschka <mail@sebastianraschka.com>

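Commit 05718c6b94 (and the follow-up 72a073bbbf above) concern passing the tokenizer into the dataset instance rather than relying on a shared or module-level one. A minimal sketch of that dependency-injection pattern — the class name, constructor signature, and toy tokenizer here are assumptions for illustration, not the repository's exact code:

```python
class GPTDatasetV1:
    """Sketch: the tokenizer is injected and stored on the instance."""

    def __init__(self, txt, tokenizer, max_length, stride):
        self.tokenizer = tokenizer  # instance tokenizer, not a global
        token_ids = self.tokenizer.encode(txt)
        # Slide a window of max_length over the ids, advancing by stride.
        self.input_ids = [
            token_ids[i:i + max_length]
            for i in range(0, len(token_ids) - max_length, stride)
        ]

    def __len__(self):
        return len(self.input_ids)


# Tiny stand-in tokenizer so the sketch runs without external dependencies.
class ToyTokenizer:
    def encode(self, txt):
        return [ord(c) for c in txt]


ds = GPTDatasetV1("abcdefghij", ToyTokenizer(), max_length=4, stride=2)
print(len(ds))  # 3 windows: positions 0, 2, and 4
```

Injecting the tokenizer keeps the dataset testable with any encoder that provides an `encode()` method, which is what makes the stand-in above possible.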
Sebastian Raschka · 2de60d1bfb · 2024-04-04 07:27:41 -05:00
    Rename variable to context_length to make it easier on readers (#106)
    * rename to context length
    * fix spacing

Sebastian Raschka · a2cd8436cb · 2024-03-19 09:26:26 -05:00
    Ch05 supplementary code (#81)

rasbt · cc2383c4de · 2024-03-02 16:44:36 -06:00
    remove duplicated exercise code

rasbt · c400f77f26 · 2024-01-13 14:49:02 -06:00
    update exercise solutions

rasbt · 4f161bd549 · 2023-12-28 19:05:06 +01:00
    use block size variable in positional embedding layer

rasbt · c8825b7c22 · 2023-10-27 06:19:40 -05:00
    add exercise solutions