Sebastian Raschka 2024-05-24 06:58:12 -05:00 committed by GitHub
parent ecb1788a9a
commit ee6afe260a

@ -11,7 +11,7 @@ This repository contains the code for coding, pretraining, and finetuning a GPT-
<br>
-In [*Build a Large Language Model (From Scratch)*](http://mng.bz/orYv), you'll discover how LLMs work from the inside out. In this book, I'll guide you step by step through creating your own LLM, explaining each stage with clear text, diagrams, and examples.
+In [*Build a Large Language Model (From Scratch)*](http://mng.bz/orYv), you'll learn and understand how large language models (LLMs) work from the inside out by coding them from the ground up, step by step. In this book, I'll guide you through creating your own LLM, explaining each stage with clear text, diagrams, and examples.
The method described in this book for training and developing your own small-but-functional model for educational purposes mirrors the approach used in creating large-scale foundational models such as those behind ChatGPT.