Chapter 5: Pretraining on Unlabeled Data
Main Chapter Code
- ch05.ipynb contains all the code as it appears in the chapter
- previous_chapters.py is a Python module that contains the MultiHeadAttention module and GPTModel class from the previous chapters, which we import in ch05.ipynb to pretrain the GPT model
- gpt_download.py contains the utility functions for downloading the pretrained GPT model weights
- exercise-solutions.ipynb contains the exercise solutions for this chapter
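For orientation, the pretraining code in ch05.ipynb revolves around instantiating the GPTModel class with a configuration dictionary. The sketch below shows such a dictionary with the GPT-2 small (124M-parameter) settings used in the chapter; the exact variable name and comments are illustrative:

```python
# Configuration for the 124M-parameter GPT model (GPT-2 small settings)
GPT_CONFIG_124M = {
    "vocab_size": 50257,     # vocabulary size of the BPE tokenizer
    "context_length": 1024,  # maximum number of input tokens
    "emb_dim": 768,          # embedding dimension
    "n_heads": 12,           # number of attention heads
    "n_layers": 12,          # number of transformer blocks
    "drop_rate": 0.1,        # dropout rate
    "qkv_bias": False,       # bias in the query/key/value projections
}

# In ch05.ipynb, a dictionary like this is passed to the GPTModel class
# imported from previous_chapters.py, roughly:
#   from previous_chapters import GPTModel
#   model = GPTModel(GPT_CONFIG_124M)
```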
Optional Code
- gpt_train.py is a standalone Python script file with the code that we implemented in ch05.ipynb to train the GPT model (you can think of it as a code file summarizing this chapter)
- gpt_generate.py is a standalone Python script file with the code that we implemented in ch05.ipynb to load and use the pretrained model weights from OpenAI