
# Chapter 5: Pretraining on Unlabeled Data

## Main Chapter Code
- 01_main-chapter-code contains the main chapter code
## Bonus Materials
- 02_alternative_weight_loading contains code to load the GPT model weights from alternative sources in case the model weights become unavailable from OpenAI
- 03_bonus_pretraining_on_gutenberg contains code to pretrain the LLM longer on the whole corpus of books from Project Gutenberg
- 04_learning_rate_schedulers contains code implementing a more sophisticated training function with learning rate schedulers and gradient clipping (see the first sketch after this list)
- 05_bonus_hparam_tuning contains an optional hyperparameter tuning script (see the second sketch below)
- 06_user_interface implements an interactive user interface for the pretrained LLM
- 07_gpt_to_llama contains a step-by-step guide for converting a GPT architecture implementation to Llama 3.2 and loading the pretrained weights from Meta AI
- 08_memory_efficient_weight_loading contains a bonus notebook showing how to load model weights via PyTorch's `load_state_dict` method more efficiently (see the third sketch below)
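
To give a sense of what the more sophisticated training function in 04_learning_rate_schedulers involves, here is a minimal, self-contained sketch of linear warmup followed by cosine decay, combined with gradient clipping. The model, data, loss, and hyperparameter values below are placeholders for illustration, not the ones used in the bonus code:

```python
import math
import torch

model = torch.nn.Linear(10, 10)  # stand-in for the GPT model
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-4)

peak_lr, min_lr = 5e-4, 5e-5     # placeholder values
total_steps, warmup_steps = 100, 10

for step in range(total_steps):
    # Linear warmup, then cosine decay from peak_lr down to min_lr
    if step < warmup_steps:
        lr = peak_lr * (step + 1) / warmup_steps
    else:
        progress = (step - warmup_steps) / (total_steps - warmup_steps)
        lr = min_lr + 0.5 * (peak_lr - min_lr) * (1 + math.cos(math.pi * progress))
    for group in optimizer.param_groups:
        group["lr"] = lr

    optimizer.zero_grad()
    loss = model(torch.randn(8, 10)).pow(2).mean()  # dummy loss
    loss.backward()
    # Clip gradients to a maximum global norm of 1.0 before the update
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
    optimizer.step()
```

Warmup avoids large, destabilizing updates at the start of training, and clipping bounds the gradient norm when the loss occasionally spikes.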
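
The hyperparameter tuning script in 05_bonus_hparam_tuning boils down to searching over training configurations; the sketch below shows the basic grid-search pattern. The grid values and the `train_and_eval` stub are hypothetical placeholders, not the search space used by the bonus script:

```python
import itertools

# Hypothetical grid; names and values are placeholders
grid = {
    "learning_rate": [1e-4, 5e-4],
    "batch_size": [4, 8],
    "drop_rate": [0.0, 0.1],
}

def train_and_eval(cfg):
    # Placeholder: train briefly with cfg and return a validation loss
    return cfg["learning_rate"] * cfg["batch_size"] + cfg["drop_rate"]

best_cfg, best_loss = None, float("inf")
for values in itertools.product(*grid.values()):
    cfg = dict(zip(grid.keys(), values))
    loss = train_and_eval(cfg)
    if loss < best_loss:
        best_cfg, best_loss = cfg, loss

print(f"Best config: {best_cfg} (val loss {best_loss:.4f})")
```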
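
For the memory-efficient loading notebook, one technique along these lines is to memory-map the checkpoint and assign the loaded tensors directly instead of copying them. The sketch below assumes a recent PyTorch version (2.1 or newer, which added `mmap=` in `torch.load` and `assign=` in `load_state_dict`); the file name and model are placeholders:

```python
import torch

model = torch.nn.Linear(10, 10)  # stand-in for the GPT model
torch.save(model.state_dict(), "model.pth")  # create a checkpoint to load

# mmap=True memory-maps the checkpoint file instead of reading it all
# into RAM at once; assign=True reuses the loaded tensors directly
# rather than copying them into the pre-allocated parameters
state_dict = torch.load(
    "model.pth", map_location="cpu", mmap=True, weights_only=True
)
model.load_state_dict(state_dict, assign=True)
```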