Mirror of https://github.com/rasbt/LLMs-from-scratch.git (synced 2025-08-15 04:01:44 +00:00)

Commit 283397aaf2 (parent 85827e0a0b): add main and optional sections
@@ -1,5 +1,9 @@

# Chapter 2: Working with Text Data

### Main Chapter Code

- [ch02.ipynb](ch02.ipynb) contains all the code as it appears in the chapter

### Optional Code

- [dataloader.ipynb](dataloader.ipynb) is a minimal notebook with the main data loading pipeline implemented in this chapter (a sketch of this pipeline follows below)
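The data loading pipeline in [dataloader.ipynb](dataloader.ipynb) turns raw text into overlapping input/target token windows. Below is a minimal sketch of this sliding-window approach, assuming the `tiktoken` GPT-2 tokenizer and PyTorch's `Dataset`/`DataLoader`; the class and function names are illustrative, not necessarily those used in the notebook.

```python
# A minimal sketch of a sliding-window data loading pipeline
# (illustrative names; assumes `pip install tiktoken torch`).
import tiktoken
import torch
from torch.utils.data import Dataset, DataLoader


class GPTDataset(Dataset):
    """Chunks token IDs into overlapping input/target windows."""

    def __init__(self, text, tokenizer, max_length=256, stride=128):
        token_ids = tokenizer.encode(text)
        self.inputs, self.targets = [], []
        # Slide a fixed-size window over the token stream; the target
        # window is the input window shifted right by one token.
        for i in range(0, len(token_ids) - max_length, stride):
            self.inputs.append(torch.tensor(token_ids[i:i + max_length]))
            self.targets.append(torch.tensor(token_ids[i + 1:i + max_length + 1]))

    def __len__(self):
        return len(self.inputs)

    def __getitem__(self, idx):
        return self.inputs[idx], self.targets[idx]


def create_dataloader(text, batch_size=4, max_length=256, stride=128):
    tokenizer = tiktoken.get_encoding("gpt2")  # BPE tokenizer used by GPT-2
    dataset = GPTDataset(text, tokenizer, max_length, stride)
    return DataLoader(dataset, batch_size=batch_size, shuffle=True, drop_last=True)
```

The one-token shift between inputs and targets is what sets up the next-token prediction objective used for pretraining in later chapters.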
@@ -1,5 +1,10 @@

# Chapter 3: Coding Attention Mechanisms

### Main Chapter Code

- [ch03.ipynb](ch03.ipynb) contains all the code as it appears in the chapter

### Optional Code

- [multihead-attention.ipynb](multihead-attention.ipynb) is a minimal notebook with the main multi-head attention implementation from this chapter (a sketch follows below)
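For orientation, here is a compact sketch of the kind of causal multi-head attention module this chapter builds; the exact class in the notebook may differ in naming and in details such as bias terms and dropout placement.

```python
# A minimal sketch of causal multi-head self-attention in PyTorch
# (illustrative; the notebook's own implementation may differ in details).
import torch
import torch.nn as nn


class MultiHeadAttention(nn.Module):
    def __init__(self, d_in, d_out, context_length, num_heads, dropout=0.0):
        super().__init__()
        assert d_out % num_heads == 0, "d_out must be divisible by num_heads"
        self.num_heads = num_heads
        self.head_dim = d_out // num_heads
        self.W_query = nn.Linear(d_in, d_out, bias=False)
        self.W_key = nn.Linear(d_in, d_out, bias=False)
        self.W_value = nn.Linear(d_in, d_out, bias=False)
        self.out_proj = nn.Linear(d_out, d_out)
        self.dropout = nn.Dropout(dropout)
        # Upper-triangular mask hides future tokens (causal attention)
        self.register_buffer(
            "mask",
            torch.triu(torch.ones(context_length, context_length), diagonal=1).bool(),
        )

    def forward(self, x):
        b, num_tokens, _ = x.shape
        # Project and split into heads: (b, heads, tokens, head_dim)
        q = self.W_query(x).view(b, num_tokens, self.num_heads, self.head_dim).transpose(1, 2)
        k = self.W_key(x).view(b, num_tokens, self.num_heads, self.head_dim).transpose(1, 2)
        v = self.W_value(x).view(b, num_tokens, self.num_heads, self.head_dim).transpose(1, 2)

        # Scaled dot-product attention with the causal mask applied
        scores = q @ k.transpose(2, 3) / self.head_dim**0.5
        scores = scores.masked_fill(self.mask[:num_tokens, :num_tokens], float("-inf"))
        weights = self.dropout(torch.softmax(scores, dim=-1))

        # Merge heads back into a single embedding dimension
        context = (weights @ v).transpose(1, 2).reshape(b, num_tokens, -1)
        return self.out_proj(context)
```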
@@ -1,6 +1,11 @@

# Chapter 4: Implementing a GPT Model from Scratch To Generate Text

### Main Chapter Code

- [ch04.ipynb](ch04.ipynb) contains all the code as it appears in the chapter
- [previous_chapters.py](previous_chapters.py) is a Python module that contains the `MultiHeadAttention` module from the previous chapter, which we import in [ch04.ipynb](ch04.ipynb) to create the GPT model (see the sketch after this list)

### Optional Code

- [gpt.py](gpt.py) is a standalone Python script file with the code that we implemented thus far, including the GPT model we coded in this chapter
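Reusing the previous chapter's attention code is a plain Python import from `previous_chapters.py`. The sketch below shows how such a module could be wired into a generic pre-LayerNorm transformer block; the block layout and the `MultiHeadAttention` signature (matching the chapter 3 sketch above) are illustrative assumptions, not the notebook's exact code.

```python
# Sketch: importing the previous chapter's attention module to build a
# GPT-style transformer block (generic pre-LayerNorm layout; illustrative).
import torch.nn as nn
from previous_chapters import MultiHeadAttention  # module from chapter 3


class TransformerBlock(nn.Module):
    def __init__(self, emb_dim, context_length, num_heads, dropout=0.1):
        super().__init__()
        self.att = MultiHeadAttention(emb_dim, emb_dim, context_length, num_heads, dropout)
        # Position-wise feed-forward network with the usual 4x expansion
        self.ff = nn.Sequential(
            nn.Linear(emb_dim, 4 * emb_dim), nn.GELU(), nn.Linear(4 * emb_dim, emb_dim)
        )
        self.norm1 = nn.LayerNorm(emb_dim)
        self.norm2 = nn.LayerNorm(emb_dim)
        self.drop = nn.Dropout(dropout)

    def forward(self, x):
        x = x + self.drop(self.att(self.norm1(x)))  # attention + residual
        x = x + self.drop(self.ff(self.norm2(x)))   # feed-forward + residual
        return x
```

A full GPT model stacks a number of such blocks between a token/position embedding layer and a final linear output head over the vocabulary.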
@@ -4,6 +4,7 @@

- [01_main-chapter-code](01_main-chapter-code) contains the main chapter code.

### Optional Code

- [02_performance-analysis](02_performance-analysis) contains optional code analyzing the performance of the GPT model(s) implemented in the main chapter.
@@ -1,9 +1,14 @@

# Chapter 5: Pretraining on Unlabeled Data

### Main Chapter Code

- [ch05.ipynb](ch05.ipynb) contains all the code as it appears in the chapter
- [previous_chapters.py](previous_chapters.py) is a Python module that contains the `MultiHeadAttention` module and `GPTModel` class from the previous chapters, which we import in [ch05.ipynb](ch05.ipynb) to pretrain the GPT model
- [gpt_download.py](gpt_download.py) contains the utility functions for downloading the pretrained GPT model weights
- [exercise-solutions.ipynb](exercise-solutions.ipynb) contains the exercise solutions for this chapter

### Optional Code

- [gpt_train.py](gpt_train.py) is a standalone Python script file with the code that we implemented in [ch05.ipynb](ch05.ipynb) to train the GPT model (you can think of it as a code file summarizing this chapter; the core training step is sketched below)
- [gpt_generate.py](gpt_generate.py) is a standalone Python script file with the code that we implemented in [ch05.ipynb](ch05.ipynb) to load and use the pretrained model weights from OpenAI
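At its core, pretraining as implemented in [gpt_train.py](gpt_train.py) optimizes a next-token prediction objective. A minimal sketch of one training step, assuming a model that maps token IDs to logits and the input/target batches produced by the chapter 2 dataloader (function and variable names illustrative):

```python
# Sketch of the core pretraining step: next-token prediction with
# cross-entropy loss (illustrative; see gpt_train.py for the real script).
import torch.nn.functional as F


def train_step(model, optimizer, inputs, targets, device="cpu"):
    inputs, targets = inputs.to(device), targets.to(device)
    optimizer.zero_grad()
    logits = model(inputs)  # shape: (batch, seq_len, vocab_size)
    # Flatten so each token position counts as one classification example
    loss = F.cross_entropy(logits.flatten(0, 1), targets.flatten())
    loss.backward()
    optimizer.step()
    return loss.item()
```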
@@ -1,8 +1,13 @@

# Chapter 6: Finetuning for Classification

### Main Chapter Code

- [ch06.ipynb](ch06.ipynb) contains all the code as it appears in the chapter
- [previous_chapters.py](previous_chapters.py) is a Python module that contains the GPT model we coded and trained in previous chapters, alongside many utility functions, which we reuse in this chapter
- [gpt_download.py](gpt_download.py) contains the utility functions for downloading the pretrained GPT model weights
- [exercise-solutions.ipynb](exercise-solutions.ipynb) contains the exercise solutions for this chapter

### Optional Code

- [gpt-class-finetune.py](gpt-class-finetune.py) is a standalone Python script file with the code that we implemented in [ch06.ipynb](ch06.ipynb) to finetune the GPT model (you can think of it as a chapter summary; the key adaptation is sketched below)
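The key adaptation in classification finetuning is replacing the vocabulary-sized language-model head with a small classification head and reading the prediction off the last token. A hedged sketch follows; the `out_head` attribute and the tiny stand-in model are assumptions for illustration, not the exact interface of [gpt-class-finetune.py](gpt-class-finetune.py).

```python
# Sketch: adapting a pretrained GPT for binary classification.
# The `out_head` attribute is an assumption for illustration; a tiny
# stand-in model makes the snippet self-contained and runnable.
import torch
import torch.nn as nn


class TinyGPT(nn.Module):  # stand-in for the pretrained GPT model
    def __init__(self, vocab_size=50257, emb_dim=768):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, emb_dim)
        self.out_head = nn.Linear(emb_dim, vocab_size)  # language-model head

    def forward(self, x):
        return self.out_head(self.tok_emb(x))


model = TinyGPT()
# 1) Swap the vocab-sized LM head for a small classification head:
model.out_head = nn.Linear(768, 2)  # e.g., spam vs. not spam
# 2) Classify from the logits of the LAST token, which has attended
#    to the entire input sequence:
input_batch = torch.randint(0, 50257, (4, 12))  # dummy batch of token IDs
with torch.no_grad():
    logits = model(input_batch)[:, -1, :]       # shape: (4, 2)
predictions = torch.argmax(logits, dim=-1)
```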
@@ -1,3 +1,11 @@

# Chapter 7: Finetuning to Follow Instructions

### Main Chapter Code

- [ch07.ipynb](ch07.ipynb) contains all the code as it appears in the chapter
- [previous_chapters.py](previous_chapters.py) is a Python module that contains the GPT model we coded and trained in previous chapters, alongside many utility functions, which we reuse in this chapter
- [gpt_download.py](gpt_download.py) contains the utility functions for downloading the pretrained GPT model weights

### Optional Code

In progress ...
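Instruction finetuning typically begins by rendering each example into a prompt template. Since this chapter's optional code is still in progress, here is only a hedged sketch of an Alpaca-style formatter; the `format_input` helper and the `instruction`/`input`/`output` field names are illustrative assumptions.

```python
# Sketch: Alpaca-style prompt formatting for instruction finetuning
# (the helper name and dict fields are illustrative assumptions).
def format_input(entry):
    instruction_text = (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request."
        f"\n\n### Instruction:\n{entry['instruction']}"
    )
    # Some examples carry an extra input field, some don't
    input_text = f"\n\n### Input:\n{entry['input']}" if entry.get("input") else ""
    return instruction_text + input_text


entry = {"instruction": "Rewrite the sentence in passive voice.",
         "input": "The chef cooked the meal.",
         "output": "The meal was cooked by the chef."}
prompt = format_input(entry) + "\n\n### Response:\n"
print(prompt)  # the model is then trained to continue with entry["output"]
```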