Chapter 5: Pretraining on Unlabeled Data

  • 01_main-chapter-code contains the main chapter code
  • 02_alternative_weight_loading contains code to load the GPT model weights from alternative sources in case the model weights become unavailable from OpenAI
  • 03_bonus_pretraining_on_gutenberg contains code to pretrain the LLM longer on the whole corpus of books from Project Gutenberg
  • 04_learning_rate_schedulers contains code implementing a more sophisticated training function including learning rate schedulers and gradient clipping (a minimal sketch follows this list)
  • 05_hparam_tuning contains an optional hyperparameter tuning script
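The 04_learning_rate_schedulers folder contains the chapter's actual implementation; as a rough illustration only, the sketch below shows the general pattern of a training step with linear warmup, cosine decay, and gradient clipping in PyTorch. The toy model, batch shapes, and hyperparameter values (peak_lr, warmup_steps, and so on) are placeholder assumptions, not the repository's settings.

```python
# A minimal sketch (not the chapter's actual code) of a training loop with
# linear warmup, cosine decay, and gradient clipping, using a toy model.
import math
import torch

torch.manual_seed(123)

model = torch.nn.Linear(10, 10)  # stand-in for the GPT model
optimizer = torch.optim.AdamW(model.parameters(), weight_decay=0.1)

peak_lr = 5e-4       # placeholder values; the chapter's script may differ
min_lr = 1e-5
warmup_steps = 20
total_steps = 200

for step in range(total_steps):
    # Learning rate schedule: linear warmup followed by cosine decay
    if step < warmup_steps:
        lr = peak_lr * (step + 1) / warmup_steps
    else:
        progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
        lr = min_lr + (peak_lr - min_lr) * 0.5 * (1 + math.cos(math.pi * progress))
    for param_group in optimizer.param_groups:
        param_group["lr"] = lr

    # Dummy batch standing in for tokenized training data
    inputs = torch.randn(8, 10)
    targets = torch.randn(8, 10)

    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(model(inputs), targets)
    loss.backward()

    # Gradient clipping caps the global gradient norm at 1.0 before the
    # optimizer step, which helps stabilize pretraining
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
    optimizer.step()
```

Warmup avoids large, destabilizing updates while the optimizer statistics are still cold, and the cosine schedule then anneals the learning rate smoothly toward min_lr over the remaining steps.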