Optimizing Hyperparameters for Pretraining

The hparam_search.py script, based on the extended training function in Appendix D: Adding Bells and Whistles to the Training Loop, is designed to find optimal hyperparameters via grid search.

Note

This script will take a long time to run. You may want to reduce the number of hyperparameter configurations explored in the HPARAM_GRID dictionary at the top.
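For orientation, the sketch below shows the general shape of a grid search over such a hyperparameter dictionary: take the Cartesian product of all value lists, evaluate each configuration, and keep the best one. The grid values and the evaluate_config placeholder here are illustrative only; the actual HPARAM_GRID contents and training call live in hparam_search.py and the Appendix D training loop.

import itertools

# Hypothetical, reduced hyperparameter grid; the real HPARAM_GRID in
# hparam_search.py contains more keys and more values per key.
HPARAM_GRID = {
    "batch_size": [4, 8],
    "learning_rate": [1e-4, 5e-4],
    "weight_decay": [0.0, 0.1],
}

def evaluate_config(config):
    # Stand-in for the real objective: in hparam_search.py this would train
    # the model with the given settings (using the extended training loop
    # from Appendix D) and return the validation loss. Here we return a
    # dummy score so the sketch runs end to end.
    return sum(abs(v) for v in config.values())

best_loss = float("inf")
best_config = None

# Grid search: iterate over the Cartesian product of all value lists.
for values in itertools.product(*HPARAM_GRID.values()):
    config = dict(zip(HPARAM_GRID.keys(), values))
    val_loss = evaluate_config(config)
    if val_loss < best_loss:
        best_loss, best_config = val_loss, config

print("Best config:", best_config, "with validation loss", best_loss)

Because the number of configurations grows multiplicatively with each added value, trimming even one or two entries per key in HPARAM_GRID can shorten the run considerably.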