Update README.md

Author: Sebastian Raschka (committed by GitHub)
Date: 2024-10-29 20:20:48 -05:00
parent cd24a27161
commit b5f2aa3500
GPG Key ID: B5690EEEBB952194

@@ -1,11 +1,11 @@
 # Build a Large Language Model (From Scratch)
-This repository contains the code for developing, pretraining, and finetuning a GPT-like LLM and is the official code repository for the book [Build a Large Language Model (From Scratch)](http://mng.bz/orYv).
+This repository contains the code for developing, pretraining, and finetuning a GPT-like LLM and is the official code repository for the book [Build a Large Language Model (From Scratch)](https://amzn.to/4fqvn0D).
 <br>
 <br>
-<a href="http://mng.bz/orYv"><img src="https://sebastianraschka.com/images/LLMs-from-scratch-images/cover.jpg?123" width="250px"></a>
+<a href="https://amzn.to/4fqvn0D"><img src="https://sebastianraschka.com/images/LLMs-from-scratch-images/cover.jpg?123" width="250px"></a>
 <br>
@@ -14,8 +14,8 @@ In [*Build a Large Language Model (From Scratch)*](http://mng.bz/orYv), you'll l
 The method described in this book for training and developing your own small-but-functional model for educational purposes mirrors the approach used in creating large-scale foundational models such as those behind ChatGPT. In addition, this book includes code for loading the weights of larger pretrained models for finetuning.
 - Link to the official [source code repository](https://github.com/rasbt/LLMs-from-scratch)
-- [Link to the book at Manning](http://mng.bz/orYv)
-- [Link to the book page on Amazon](https://www.amazon.com/gp/product/1633437167)
+- [Link to the book at Manning (the publisher's website)](http://mng.bz/orYv)
+- [Link to the book page on Amazon.com](https://www.amazon.com/gp/product/1633437167)
 - ISBN 9781633437166
 <a href="http://mng.bz/orYv#reviews"><img src="https://sebastianraschka.com//images/LLMs-from-scratch-images/other/reviews.png" width="220px"></a>