add clarifying note about GELU

rasbt 2024-06-29 07:14:36 -05:00
parent 1e61943bf2
commit 1ffb7500e4


@@ -667,7 +667,7 @@
"metadata": {},
"source": [
"- As we can see, ReLU is a piecewise linear function that outputs the input directly if it is positive; otherwise, it outputs zero\n",
"- GELU is a smooth, non-linear function that approximates ReLU but with a non-zero gradient for negative values\n",
"- GELU is a smooth, non-linear function that approximates ReLU but with a non-zero gradient for negative values (except at approximately -0.75)\n",
"\n",
"- Next, let's implement the small neural network module, `FeedForward`, that we will be using in the LLM's transformer block later:"
]
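To illustrate the note added in this commit, here is a minimal sketch (not part of the commit itself) that assumes PyTorch and the tanh-based GELU approximation used elsewhere in the chapter. It evaluates the gradient of GELU at a few negative inputs, showing that the gradient is small but non-zero for negative values in general and approximately zero near x ≈ -0.75, where GELU reaches its minimum:

```python
import torch
import torch.nn as nn

class GELU(nn.Module):
    def forward(self, x):
        # tanh-based GELU approximation (as used in GPT-2)
        return 0.5 * x * (1 + torch.tanh(
            torch.sqrt(torch.tensor(2.0 / torch.pi)) *
            (x + 0.044715 * torch.pow(x, 3))
        ))

gelu = GELU()
x = torch.tensor([-3.0, -0.75, 0.0, 3.0], requires_grad=True)
y = gelu(x)
y.sum().backward()

# Negative inputs generally have small but non-zero gradients;
# near x ≈ -0.75 (the minimum of GELU) the gradient is approximately zero.
print(x.grad)
```

This is only meant to make the "(except at approximately -0.75)" remark concrete; the exact location of the zero gradient shifts slightly depending on whether the exact GELU or the tanh approximation is used.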