8 Commits

Author | SHA1 | Message | Date
Sebastian Raschka | 75ede3e340 | RoPE theta rescaling (#419) | 2024-10-25 15:27:23 -05:00
    * rope fixes
    * update
    * update
    * cleanup
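"RoPE theta rescaling" here most likely refers to scaling RoPE's base frequency (theta) in proportion to a context-length change, as done for Llama 3.1-style context extension. A minimal sketch of that idea; the function name and the numbers in the example are illustrative, not taken from the commit itself:

```python
def rescale_theta(theta_old: float, context_length_old: int,
                  context_length_new: int) -> float:
    """Scale the RoPE base frequency in proportion to a context-length change."""
    scaling_factor = context_length_new / context_length_old
    return theta_old * scaling_factor

# Example: going from an 8,192-token context to 131,072 tokens (16x)
new_theta = rescale_theta(500_000.0, 8192, 131_072)  # 8,000,000.0
```

Keeping theta proportional to the context length preserves the relative rotation frequencies of the position embeddings across the longer window.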
Daniel Kleine | 8b60460319 | Updated Llama 2 to 3 paths (#413) | 2024-10-24 07:40:08 -05:00
    * llama 2 and 3 path fixes
    * updated llama 3, 3.1 and 3.2 paths
    * updated .gitignore
    * Typo fix
    Co-authored-by: Sebastian Raschka <mail@sebastianraschka.com>
Sebastian Raschka | f8bdfe12e1 | RoPE updates (#412) | 2024-10-23 18:07:49 -05:00
    * RoPE updates
    * Apply suggestions from code review
    * updates
    * updates
    * updates
Sebastian Raschka | 9726ca6546 | RoPE increase (#407) | 2024-10-21 19:58:38 -05:00
Sebastian Raschka | 06604f4b84 | Introduce buffers to improve Llama 3.2 efficiency (#389) | 2024-10-06 12:49:04 -05:00
    * Introduce buffers to improve Llama 3.2 efficiency
    * update
    * update
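The buffer change above likely means precomputing RoPE's cos/sin tables once and storing them with PyTorch's `register_buffer`, instead of recomputing them in every forward pass. A minimal sketch of that pattern (the class name and parameter defaults are illustrative assumptions, not code from the commit):

```python
import torch
import torch.nn as nn

class RoPECache(nn.Module):
    # Illustrative module: precomputes RoPE angle tables once at init time.
    def __init__(self, head_dim: int, context_length: int, theta_base: float = 10_000.0):
        super().__init__()
        # Inverse frequencies for each pair of dimensions
        inv_freq = 1.0 / (theta_base ** (torch.arange(0, head_dim, 2).float() / head_dim))
        positions = torch.arange(context_length).float()
        angles = positions[:, None] * inv_freq[None, :]  # (context_length, head_dim // 2)
        # Buffers travel with .to(device)/state_dict but receive no gradients,
        # so the tables are computed once rather than per forward pass
        self.register_buffer("cos", torch.cos(angles))
        self.register_buffer("sin", torch.sin(angles))
```

Sharing one such cache across all transformer blocks additionally avoids duplicating the tables per layer.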
Daniel Kleine | 4f9775d91c | fixed Llama 2 to 3.2 NBs (#388) | 2024-10-06 09:56:55 -05:00
    * updated requirements
    * fixes llama2 to llama3
    * fixed llama 3.2 standalone
    * fixed typo
    * fixed rope formula
    * Update requirements-extra.txt
    * Update ch05/07_gpt_to_llama/converting-llama2-to-llama3.ipynb
    * Update ch05/07_gpt_to_llama/converting-llama2-to-llama3.ipynb
    * Update ch05/07_gpt_to_llama/standalone-llama32.ipynb
    Co-authored-by: Sebastian Raschka <mail@sebastianraschka.com>
Sebastian Raschka | 81053ccadd | Add a note about weight tying in Llama 3.2 (#386) | 2024-10-05 09:20:54 -05:00
Sebastian Raschka | 6f86c78763 | Implement Llama 3.2 (#383) | 2024-10-05 07:30:47 -05:00