
# Chapter 5: Pretraining on Unlabeled Data
- 01_main-chapter-code contains the main chapter code
- 02_alternative_weight_loading contains code to load the GPT model weights from alternative sources in case the model weights become unavailable from OpenAI
- 03_bonus_pretraining_on_gutenberg contains code to pretrain the LLM longer on the whole corpus of books from Project Gutenberg
- 04_learning_rate_schedulers contains code implementing a more sophisticated training function that includes learning rate schedulers and gradient clipping
- 05_hparam_tuning contains an optional hyperparameter tuning script
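
The learning rate scheduling mentioned above commonly combines a linear warmup phase with a cosine decay toward a minimum learning rate. As a minimal sketch of that idea (the function name and all parameter values here are hypothetical, not taken from the bonus code), the per-step learning rate can be computed like this:

```python
import math

def lr_at_step(step, peak_lr=5e-4, min_lr=1e-6, warmup_steps=20, total_steps=200):
    # Hypothetical illustration: linear warmup from 0 to peak_lr,
    # then cosine decay from peak_lr down to min_lr.
    if step < warmup_steps:
        # Warmup: learning rate grows linearly with the step count
        return peak_lr * (step + 1) / warmup_steps
    # Cosine decay: progress goes from 0.0 (end of warmup) to 1.0 (last step)
    progress = (step - warmup_steps) / (total_steps - warmup_steps)
    return min_lr + 0.5 * (peak_lr - min_lr) * (1 + math.cos(math.pi * progress))
```

In an actual PyTorch training loop, this value would be assigned to each parameter group's `"lr"` before the optimizer step, and gradient clipping would typically be applied with `torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)` after `loss.backward()`.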