
With `total_training_iters=20` and `warmup_iters=20` (both equal to `len(train_loader)` of 4 multiplied by `n_epochs` of 5), the denominator `total_training_iters - warmup_iters` becomes zero, and a `ZeroDivisionError` occurred:

```shell
Traceback (most recent call last):
  File "LLMs-from-scratch/ch05/05_bonus_hparam_tuning/hparam_search.py", line 191, in <module>
    train_loss, val_loss = train_model(
                           ^^^^^^^^^^^^
  File "/mnt/raid1/docker/ai/LLMs-from-scratch/ch05/05_bonus_hparam_tuning/hparam_search.py", line 90, in train_model
    progress = (global_step - warmup_iters) / (total_training_iters - warmup_iters)
               ~~~~~~~~~~~~~~~~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
ZeroDivisionError: division by zero
```
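The denominator is zero whenever warmup covers the entire run. Below is a minimal sketch of a guard for this edge case, using the variable names from the traceback above; the warmup and cosine-decay formulas are illustrative and not necessarily the script's exact ones:

```python
import math

def get_lr(global_step, warmup_iters, total_training_iters, peak_lr, min_lr=0.0):
    # Linear warmup phase
    if global_step < warmup_iters:
        return peak_lr * (global_step + 1) / warmup_iters
    decay_iters = total_training_iters - warmup_iters
    # Guard: if warmup spans the whole run, there is nothing left to decay,
    # so return peak_lr instead of dividing by zero.
    if decay_iters <= 0:
        return peak_lr
    progress = (global_step - warmup_iters) / decay_iters
    # Cosine decay from peak_lr down to min_lr
    return min_lr + 0.5 * (peak_lr - min_lr) * (1 + math.cos(math.pi * progress))
```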
# Chapter 5: Pretraining on Unlabeled Data

## Main Chapter Code

- [01_main-chapter-code](01_main-chapter-code) contains the main chapter code

## Bonus Materials
- [02_alternative_weight_loading](02_alternative_weight_loading) contains code to load the GPT model weights from alternative sources in case the model weights become unavailable from OpenAI
- [03_bonus_pretraining_on_gutenberg](03_bonus_pretraining_on_gutenberg) contains code to pretrain the LLM for longer on the whole corpus of books from Project Gutenberg
- [04_learning_rate_schedulers](04_learning_rate_schedulers) contains code implementing a more sophisticated training function with learning rate schedulers and gradient clipping (a minimal sketch of these techniques follows this list)
- [05_bonus_hparam_tuning](05_bonus_hparam_tuning) contains an optional hyperparameter tuning script (a grid-search sketch is shown below)
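
For a rough sense of what a learning-rate schedule with gradient clipping looks like in practice, here is a minimal, self-contained PyTorch sketch; the tiny linear model, loss, and random data below are toy placeholders, not the bonus material's GPT training code:

```python
import math
import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy stand-ins so the sketch runs end to end; the bonus material trains
# the GPT model instead.
model = torch.nn.Linear(10, 1)
optimizer = torch.optim.AdamW(model.parameters(), lr=0.0)
loss_fn = torch.nn.MSELoss()
train_loader = DataLoader(
    TensorDataset(torch.randn(16, 10), torch.randn(16, 1)), batch_size=4
)

peak_lr, min_lr = 5e-4, 1e-5
n_epochs = 5
total_iters = len(train_loader) * n_epochs
warmup_iters = int(0.2 * total_iters)  # keep warmup strictly shorter than the run

global_step = 0
for epoch in range(n_epochs):
    for inputs, targets in train_loader:
        # Piecewise schedule: linear warmup, then cosine decay down to min_lr
        if global_step < warmup_iters:
            lr = peak_lr * (global_step + 1) / warmup_iters
        else:
            progress = (global_step - warmup_iters) / (total_iters - warmup_iters)
            lr = min_lr + 0.5 * (peak_lr - min_lr) * (1 + math.cos(math.pi * progress))
        for param_group in optimizer.param_groups:
            param_group["lr"] = lr

        optimizer.zero_grad()
        loss = loss_fn(model(inputs), targets)
        loss.backward()
        # Clip gradients to a maximum global L2 norm of 1.0
        torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
        optimizer.step()
        global_step += 1
```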
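
And the following is a minimal sketch of a hyperparameter grid search in the spirit of the tuning script; the `train_model` function and the search space here are hypothetical placeholders, not `hparam_search.py`'s actual signature or grid:

```python
import itertools

def train_model(peak_lr, warmup_ratio, weight_decay):
    # Placeholder: a real version would train the model with these
    # hyperparameters and return the final (train_loss, val_loss).
    return 0.0, peak_lr * 100 + warmup_ratio + weight_decay

search_space = {
    "peak_lr": [1e-4, 5e-4],
    "warmup_ratio": [0.1, 0.2],  # fraction of total iters, so warmup < total
    "weight_decay": [0.0, 0.1],
}

best_val_loss, best_config = float("inf"), None
# Evaluate every combination in the grid and keep the best validation loss
for values in itertools.product(*search_space.values()):
    config = dict(zip(search_space.keys(), values))
    _, val_loss = train_model(**config)
    if val_loss < best_val_loss:
        best_val_loss, best_config = val_loss, config

print(f"Best config: {best_config} (val loss {best_val_loss:.4f})")
```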