Mirror of https://github.com/rasbt/LLMs-from-scratch.git (synced 2025-08-16 20:51:51 +00:00)

Commit d18f92fa34 (parent c93c90eb1e): add additional lora figure
@@ -615,7 +615,9 @@
     "id": "4D21Jk7Vw3nG"
    },
    "source": [
-    "- To try LoRA on the GPT model we defined earlier, we define a `replace_linear_with_lora` function to replace all `Linear` layers in the model with the new `LinearWithLoRA` layers"
+    "- To try LoRA on the GPT model we defined earlier, we define a `replace_linear_with_lora` function to replace all `Linear` layers in the model with the new `LinearWithLoRA` layers\n",
+    "\n",
+    "<img src=\"https://sebastianraschka.com/images/LLMs-from-scratch-images/appendix-e_compressed/lora-4.webp\" width=\"400px\">"
    ]
   },
   {
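For context, the `replace_linear_with_lora` function referenced in the cell above can be sketched as follows. This is a minimal sketch assuming the standard LoRA setup, where each `nn.Linear` is wrapped by a `LinearWithLoRA` module that adds a low-rank update `alpha * (x @ A @ B)` to the frozen linear output; the `LoRALayer` internals, initialization, and hyperparameter names are assumptions for illustration, not taken verbatim from the notebook.

```python
import math
import torch
import torch.nn as nn


class LoRALayer(nn.Module):
    """Low-rank update alpha * (x @ A @ B); A and B are the only new parameters."""
    def __init__(self, in_dim, out_dim, rank, alpha):
        super().__init__()
        self.A = nn.Parameter(torch.empty(in_dim, rank))
        nn.init.kaiming_uniform_(self.A, a=math.sqrt(5))   # assumed initialization
        self.B = nn.Parameter(torch.zeros(rank, out_dim))  # zero init: no change at start
        self.alpha = alpha

    def forward(self, x):
        return self.alpha * (x @ self.A @ self.B)


class LinearWithLoRA(nn.Module):
    """Wraps an existing nn.Linear and adds the LoRA update to its output."""
    def __init__(self, linear, rank, alpha):
        super().__init__()
        self.linear = linear
        self.lora = LoRALayer(linear.in_features, linear.out_features, rank, alpha)

    def forward(self, x):
        return self.linear(x) + self.lora(x)


def replace_linear_with_lora(model, rank, alpha):
    """Recursively swap every nn.Linear submodule for a LinearWithLoRA wrapper."""
    for name, module in model.named_children():
        if isinstance(module, nn.Linear):
            setattr(model, name, LinearWithLoRA(module, rank, alpha))
        else:
            replace_linear_with_lora(module, rank, alpha)
```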
@@ -642,7 +644,8 @@
    "id": "8c172164-cdde-4489-b7d7-aaed9cc2f5f2",
    "metadata": {},
    "source": [
-    "- We then freeze the original model parameter and use the `replace_linear_with_lora` to replace the said `Linear` layers below"
+    "- We then freeze the original model parameter and use the `replace_linear_with_lora` to replace the said `Linear` layers using the code below\n",
+    "- This will replace the `Linear` layers in the "
    ]
   },
   {
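The second edited cell describes freezing the original weights before applying `replace_linear_with_lora`. A minimal usage sketch is shown below; it assumes `model` is the GPT model instance built earlier in the notebook, and the `rank`/`alpha` values are placeholders rather than the notebook's actual settings.

```python
# `model` is assumed to be the GPT model instance built earlier in the notebook.
total = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"Trainable parameters before: {total:,}")

# Freeze all original model parameters ...
for param in model.parameters():
    param.requires_grad = False

# ... then swap the Linear layers; only the new LoRA matrices (A, B) stay trainable.
replace_linear_with_lora(model, rank=16, alpha=16)  # placeholder hyperparameters

total = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"Trainable LoRA parameters after: {total:,}")
```

Because only the low-rank matrices require gradients afterwards, the number of trainable parameters drops to a small fraction of the full model.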