Sebastian Raschka
3bdf18a599
Update Llama 3 table for consistency with Qwen3
2025-06-23 18:33:04 -05:00
Sebastian Raschka
81eda38d3b
Improve KV cache code for torch.compile ( #705 )
* Improve KV cache code for torch.compile
* cleanup
* cleanup
2025-06-23 18:08:49 -05:00
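The torch.compile-oriented KV-cache rework above points at a common pattern: preallocate fixed-size key/value buffers and write into them in place, so tensor shapes stay static across decoding steps and the compiler does not re-trace on every generated token. A minimal sketch (class and names are illustrative, not the repository's actual code):

```python
import torch

class StaticKVCache:
    """Preallocated KV cache: fixed shapes avoid torch.cat, whose
    growing tensors force torch.compile to recompile (sketch only)."""
    def __init__(self, batch, n_heads, max_len, head_dim, dtype=torch.float32):
        self.k = torch.zeros(batch, n_heads, max_len, head_dim, dtype=dtype)
        self.v = torch.zeros(batch, n_heads, max_len, head_dim, dtype=dtype)
        self.pos = 0  # next write position along the sequence axis

    def update(self, k_new, v_new):
        # Write new keys/values in place instead of concatenating,
        # so buffer shapes stay constant across decoding steps.
        t = k_new.shape[2]
        self.k[:, :, self.pos:self.pos + t] = k_new
        self.v[:, :, self.pos:self.pos + t] = v_new
        self.pos += t
        # Return views over the filled portion for attention.
        return self.k[:, :, :self.pos], self.v[:, :, :self.pos]
```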
Sebastian Raschka
01be5a42e4
Use more recent sentencepiece tokenizer API ( #696 )
2025-06-22 13:52:30 -05:00
Sebastian Raschka
0a2e8c39c4
Qwen3 KV cache ( #688 )
2025-06-21 17:34:39 -05:00
Sebastian Raschka
3be0f3202a
Llama 3 KV Cache ( #685 )
* Llama 3 KV Cache
* skip expensive tests on GitHub Actions
* Update __init__.py
2025-06-21 10:55:20 -05:00
casinca
58b8672452
removed old args in GQA class ( #674 )
2025-06-17 13:09:53 -05:00
Daniel Kleine
c2cfb47b1a
fixed gqa qkv code comments ( #660 )
2025-06-13 08:21:28 -05:00
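The two GQA cleanups above concern grouped-query attention, in which several query heads share a single key/value head, shrinking the KV projection and cache. The core shape manipulation can be sketched as follows (names are illustrative; the divisibility assert is the standard GQA constraint, also echoed by the later `num_heads % num_kv_groups` comment in #441):

```python
import torch

def expand_kv(k, num_heads):
    """Expand grouped KV heads so each group of query heads sees
    its shared key/value head (illustrative sketch, not repo code)."""
    batch, num_kv_heads, seq, head_dim = k.shape
    assert num_heads % num_kv_heads == 0  # each KV head serves a whole group
    group_size = num_heads // num_kv_heads
    # Repeat each KV head group_size times along the head axis.
    return k.repeat_interleave(group_size, dim=1)
```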
Sebastian Raschka
c4cde1c21b
Reduce Llama 3 RoPE memory requirements ( #658 )
* Llama3 from scratch improvements
* Fix Llama 3 expensive RoPE memory issue
* updates
* update package
* benchmark
* remove unused rescale_theta
2025-06-12 11:08:02 -05:00
Sebastian Raschka
47c036058d
Llama3 from scratch improvements ( #621 )
* Llama3 from scratch improvements
* restore
2025-04-16 18:08:26 -05:00
Sebastian Raschka
67e0680210
Disable mask saving as weight in Llama 3 model ( #604 )
* Disable mask saving as weight
* update pixi
* update pixi
2025-04-06 09:33:36 -05:00
Sebastian Raschka
f1434652f2
reformat nbs ( #602 )
2025-04-05 16:18:27 -05:00
Sebastian Raschka
d4c8d8f2c9
Fix Llama language typo in bonus materials ( #597 )
2025-04-02 21:41:36 -05:00
Sebastian Raschka
aedad7efc3
Add Llama 3.2 to pkg ( #591 )
* Add Llama 3.2 to pkg
* remove redundant attributes
* update tests
* updates
* updates
* updates
* fix link
* fix link
2025-03-31 18:59:47 -05:00
casinca
152a087a37
removing unused RoPE parameters ( #590 )
* removing unused RoPE parameters
* remove redundant context_length in GQA
---------
Co-authored-by: Sebastian Raschka <mail@sebastianraschka.com>
2025-03-31 17:10:39 -05:00
Sebastian Raschka
0f6894f41e
Memory optimized Llama ( #588 )
* Memory optimized Llama
* re-add login
2025-03-30 15:18:12 -05:00
Sebastian Raschka
c21bfe4a23
Add PyPI package ( #576 )
* Add PyPI package
* fixes
* fixes
2025-03-23 19:28:49 -05:00
Sebastian Raschka
a08d7aaa84
Uv workflow improvements ( #531 )
* Uv workflow improvements
* linter improvements
* pyproject.toml fixes (several iterations)
* windows / win32 fixes (many iterations)
2025-02-16 13:16:51 -06:00
Sebastian Raschka
4bfbcd069d
Auto download DPO dataset if not already available in path ( #479 )
* Auto download DPO dataset if not already available in path
* update tests to account for latest HF transformers release in unit tests
* pep 8
2025-01-12 12:27:28 -06:00
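The auto-download commit above describes a standard fetch-if-missing pattern: check for the dataset on disk before hitting the network. A minimal sketch (the URL and path below are placeholders, not the repository's actual values):

```python
import os
import urllib.request

def download_if_missing(url, path):
    # Download the file only when it isn't already present locally;
    # url/path are illustrative placeholders, not the repo's values.
    if not os.path.exists(path):
        urllib.request.urlretrieve(url, path)
    return path
```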
casinca
bb31de8999
[minor] typo & comments ( #441 )
* typo & comment
- safe -> save
- commenting code: batch_size, seq_len = in_idx.shape
* comment
- adding # NEW for assert num_heads % num_kv_groups == 0
* update memory wording
---------
Co-authored-by: rasbt <mail@sebastianraschka.com>
2024-11-18 19:52:42 +09:00
Daniel Kleine
81eed9afe2
updated RoPE statement ( #423 )
* updated RoPE statement
* updated .gitignore
* Update ch05/07_gpt_to_llama/converting-gpt-to-llama2.ipynb
---------
Co-authored-by: Sebastian Raschka <mail@sebastianraschka.com>
2024-10-30 08:00:08 -05:00
ROHAN WINSOR
cd24a27161
Fix argument name in LlamaTokenizer constructor ( #421 )
This PR addresses an oversight in the LlamaTokenizer class by changing the constructor argument from filepath to tokenizer_file.
2024-10-29 18:01:36 -05:00
Daniel Kleine
e8c2f962e9
minor fixes: Llama 3.2 standalone ( #420 )
* minor fixes
* reformat rope base as float
---------
Co-authored-by: rasbt <mail@sebastianraschka.com>
2024-10-25 21:08:06 -05:00
Sebastian Raschka
1516de54a5
RoPE theta rescaling ( #419 )
* rope fixes
* update
* update
* cleanup
2024-10-25 15:27:23 -05:00
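The theta-rescaling commit above (and the `rescale_theta` helper removed in #658) adjusts the RoPE base in proportion to the context-length change, so positional frequencies map onto the new window. A sketch consistent with the helper's name; the repository's exact implementation may differ:

```python
def rescale_theta(theta_old, context_length_old, context_length_new):
    # Scale the RoPE base in proportion to the context-length change
    # (sketch of the idea behind the commit, not verified repo code).
    scaling_factor = context_length_new / context_length_old
    return theta_old * scaling_factor
```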
Daniel Kleine
5ff72c2850
fixed typos ( #414 )
* fixed typos
* fixed formatting
* Update ch03/02_bonus_efficient-multihead-attention/mha-implementations.ipynb
* del weights after load into model
---------
Co-authored-by: Sebastian Raschka <mail@sebastianraschka.com>
2024-10-24 18:23:53 -05:00
Daniel Kleine
d38083c401
Updated Llama 2 to 3 paths ( #413 )
* llama 2 and 3 path fixes
* updated llama 3, 3.1 and 3.2 paths
* updated .gitignore
* Typo fix
---------
Co-authored-by: Sebastian Raschka <mail@sebastianraschka.com>
2024-10-24 07:40:08 -05:00
Sebastian Raschka
e1dfd2cb7a
Update test-requirements-extra.txt
2024-10-23 19:19:58 -05:00
Sebastian Raschka
7cd6a670ed
RoPE updates ( #412 )
* RoPE updates
* Apply suggestions from code review
* updates
* updates
* updates
2024-10-23 18:07:49 -05:00
Sebastian Raschka
4f9c9fb703
Update tests.py
2024-10-23 07:48:33 -05:00
Sebastian Raschka
534a704364
RoPE increase ( #407 )
2024-10-21 19:58:38 -05:00
Sebastian Raschka
ec18b6a8a3
Add Llama 3.2 RoPE to CI ( #391 )
* add Llama 3.2 RoPE to CI
* update
2024-10-08 08:28:34 -05:00
Sebastian Raschka
1eb0b3810a
Introduce buffers to improve Llama 3.2 efficiency ( #389 )
* Introduce buffers to improve Llama 3.2 efficiency
* update
* update
2024-10-06 12:49:04 -05:00
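The buffers commit above refers to registering precomputed tensors (RoPE cos/sin tables, attention masks) as PyTorch buffers: they then follow the module through `.to(device)` without being recomputed, and with `persistent=False` they stay out of the state dict, which is also the idea behind the later "Disable mask saving as weight" commit. A minimal sketch, not the repository's exact code:

```python
import torch
import torch.nn as nn

class RoPEBuffers(nn.Module):
    """Register cos/sin tables as non-persistent buffers: they move
    with .to(device) but are not saved as weights (sketch only)."""
    def __init__(self, context_length=16, head_dim=8, theta_base=10_000.0):
        super().__init__()
        # Standard RoPE inverse frequencies for even dimensions.
        inv_freq = 1.0 / theta_base ** (torch.arange(0, head_dim, 2).float() / head_dim)
        angles = torch.arange(context_length).float()[:, None] * inv_freq[None, :]
        self.register_buffer("cos", torch.cos(angles), persistent=False)
        self.register_buffer("sin", torch.sin(angles), persistent=False)
```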
Daniel Kleine
a0c0c765a8
fixed Llama 2 to 3.2 NBs ( #388 )
* updated requirements
* fixes llama2 to llama3
* fixed llama 3.2 standalone
* fixed typo
* fixed rope formula
* Update requirements-extra.txt
* Update ch05/07_gpt_to_llama/converting-llama2-to-llama3.ipynb
* Update ch05/07_gpt_to_llama/converting-llama2-to-llama3.ipynb
* Update ch05/07_gpt_to_llama/standalone-llama32.ipynb
---------
Co-authored-by: Sebastian Raschka <mail@sebastianraschka.com>
2024-10-06 09:56:55 -05:00
Sebastian Raschka
0972ded530
Add a note about weight tying in Llama 3.2 ( #386 )
2024-10-05 09:20:54 -05:00
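The weight-tying note above refers to Llama 3.2 reusing the token-embedding matrix as the output projection, saving vocab_size × emb_dim parameters. A minimal PyTorch sketch with toy sizes:

```python
import torch.nn as nn

# Weight tying: the output head reuses the embedding matrix.
# Sizes below are toy values, not Llama 3.2's actual dimensions.
vocab_size, emb_dim = 100, 32
tok_emb = nn.Embedding(vocab_size, emb_dim)   # weight: (vocab, emb)
out_head = nn.Linear(emb_dim, vocab_size, bias=False)  # weight: (vocab, emb)
out_head.weight = tok_emb.weight  # now one shared parameter tensor
```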
Sebastian Raschka
8553644440
Llama 3.2 requirements file
2024-10-05 07:32:43 -05:00
Sebastian Raschka
b44096acef
Implement Llama 3.2 ( #383 )
2024-10-05 07:30:47 -05:00
Sebastian Raschka
a5405c255d
Cos-sin fix in Llama 2 bonus notebook ( #381 )
2024-10-03 20:45:40 -05:00
Sebastian Raschka
b993c2b25b
Improve rope settings for llama3 ( #380 )
2024-10-03 08:29:54 -05:00
rasbt
278a50a348
add section numbers
2024-09-30 08:42:22 -05:00
Sebastian Raschka
b56d0b2942
Add llama2 unit tests ( #372 )
* add llama2 unit tests
* update
* updates
* updates
* update file path
* update requirements file
* rmsnorm test
* update
2024-09-25 19:40:36 -05:00
rasbt
a6d8e93da3
improve formatting
2024-09-24 18:49:17 -05:00
Daniel Kleine
ff31b345b0
ch05/07 gpt_to_llama text improvements ( #369 )
* fixed typo
* fixed RMSnorm formula
* fixed SwiGLU formula
* temperature=0 for untrained model for reproducibility
* added extra info hf token
2024-09-24 18:45:49 -05:00
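One of the bullets above fixes the RMSNorm formula in the notebook text. RMSNorm, used throughout Llama in place of LayerNorm, normalizes by the root mean square of the features without subtracting the mean. A minimal sketch of the formula (not the notebook's class):

```python
import torch

def rmsnorm(x, weight, eps=1e-5):
    # Normalize by the root mean square over the feature dimension;
    # no mean subtraction, unlike LayerNorm (sketch of the formula).
    rms = torch.sqrt(x.pow(2).mean(dim=-1, keepdim=True) + eps)
    return weight * (x / rms)
```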
rasbt
d144bd5b7a
add json import
2024-09-23 09:12:35 -05:00
rasbt
6bc3de165c
move access token to config.json
2024-09-23 08:56:16 -05:00
rasbt
58df945ed4
add llama3 comparison
2024-09-23 08:17:10 -05:00
Sebastian Raschka
0467c8289b
GPT to Llama ( #368 )
* GPT to Llama
* fix urls
2024-09-23 07:34:06 -05:00