Add cover and book info
Commit d29bb5a01e (parent 220df4ffb3)
README.md (30 changes)

@@ -1,6 +1,23 @@
-# Large Language Models from Scratch
+# Build a Large Language Model (From Scratch)
 
-Details will follow ...
+<br>
+
+<br>
+
+<a href="http://mng.bz/orYv"><img src="images/cover.jpg" width="250px"></a>
+
+In [*Build a Large Language Model (from Scratch)*](http://mng.bz/orYv), you'll discover how LLMs work from the inside out. In this book, I'll guide you step by step through creating your own LLM, explaining each stage with clear text, diagrams, and examples.
+
+The method described in this book for training and developing your own small-but-functional model for educational purposes mirrors the approach used in creating large-scale foundational models such as GPT-4.
+
+- Link to the official [source code repository](https://github.com/rasbt/LLMs-from-scratch)
+- [Link to the early access version](http://mng.bz/orYv) at Manning
+- ISBN 9781633437166
+- Publication in Early 2025 (estimated)
+
+<br>
+
+<br>
 
 ## Table of Contents
 
@@ -10,4 +27,11 @@ Details will follow ...
 | Ch 2: Working with Text Data | [ch02.ipynb](ch02/01_main-chapter-code/ch02.ipynb)<br />[dataloader.ipynb](ch02/01_main-chapter-code/dataloader.ipynb) | [./ch02](./ch02) |
 | Ch 3: Understanding Attention Mechanisms | [ch03.ipynb](ch03/01_main-chapter-code/ch03.ipynb)<br />[multihead-attention.ipynb](ch03/01_main-chapter-code/multihead-attention.ipynb) | [./ch03](./ch03) |
 | ... | ... | ... |
 | Appendix A: Introduction to PyTorch | [code-part1.ipynb](03_main-chapter-code/01_main-chapter-code/code-part1.ipynb)<br />[code-part2.ipynb](03_main-chapter-code/01_main-chapter-code/code-part2.ipynb)<br />[DDP-script.py](03_main-chapter-code/01_main-chapter-code/DDP-script.py) | [./appendix-A](./appendix-A) |
+
+<br>
+
+<br>
+
+<img src="images/mental-model.png" width="500px">
+
+(A mental model summarizing the contents covered in this book.)
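For readers skimming this commit: chapter 2's dataloader.ipynb (linked in the table above) centers on batching tokenized text for next-token prediction. As a rough, hypothetical sketch of that idea only (this is not the notebook's code; the class and parameter names here are invented), a sliding-window dataset in PyTorch might look like:

```python
# Hypothetical sketch of a sliding-window next-token dataset;
# not the book's code, just an illustration of the ch02 topic.
import torch
from torch.utils.data import Dataset, DataLoader

class NextTokenDataset(Dataset):
    def __init__(self, token_ids, context_length, stride):
        self.inputs = []
        self.targets = []
        # Slide a fixed-size window over the token stream; the target
        # sequence is the input sequence shifted right by one position.
        for i in range(0, len(token_ids) - context_length, stride):
            self.inputs.append(torch.tensor(token_ids[i:i + context_length]))
            self.targets.append(torch.tensor(token_ids[i + 1:i + context_length + 1]))

    def __len__(self):
        return len(self.inputs)

    def __getitem__(self, idx):
        return self.inputs[idx], self.targets[idx]

# Toy usage with made-up token IDs (a real pipeline would tokenize text first)
token_ids = list(range(50))
loader = DataLoader(NextTokenDataset(token_ids, context_length=8, stride=4),
                    batch_size=2, shuffle=True)
x, y = next(iter(loader))
print(x.shape, y.shape)  # torch.Size([2, 8]) torch.Size([2, 8])
```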
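Likewise, chapter 3's multihead-attention.ipynb covers attention mechanisms. A generic causal multi-head self-attention module, sketched here purely as an illustration (it is not the book's implementation, and the module name is assumed), could read:

```python
# Hypothetical minimal multi-head self-attention with a causal mask;
# a generic sketch of the ch03 topic, not the book's implementation.
import torch
import torch.nn as nn

class MultiHeadAttention(nn.Module):
    def __init__(self, d_model, num_heads):
        super().__init__()
        assert d_model % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = d_model // num_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)   # joint Q, K, V projection
        self.out = nn.Linear(d_model, d_model)       # output projection

    def forward(self, x):
        b, t, d = x.shape
        # Project to queries, keys, values and split into heads
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q, k, v = (z.view(b, t, self.num_heads, self.head_dim).transpose(1, 2)
                   for z in (q, k, v))
        # Scaled dot-product attention, masking out future positions
        scores = q @ k.transpose(-2, -1) / self.head_dim ** 0.5
        mask = torch.triu(torch.ones(t, t, dtype=torch.bool, device=x.device),
                          diagonal=1)
        scores = scores.masked_fill(mask, float("-inf"))
        weights = torch.softmax(scores, dim=-1)
        # Merge heads back into the model dimension
        context = (weights @ v).transpose(1, 2).reshape(b, t, d)
        return self.out(context)

attn = MultiHeadAttention(d_model=64, num_heads=4)
print(attn(torch.randn(2, 10, 64)).shape)  # torch.Size([2, 10, 64])
```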
BIN images/cover.jpg (new file; binary not shown; size: 275 KiB)
BIN images/mental-model.png (new file; binary not shown; size: 231 KiB)