update ollama instructions

rasbt 2024-06-06 21:03:40 -05:00
parent de36026e5a
commit 017f73d50c


@ -100,15 +100,22 @@
"id": "9558a522-650d-401a-84fc-9fd7b1f39da7",
"metadata": {},
"source": [
"- Now let's test if ollama is set up correctly\n",
"- For macOS and Windows users, click on the ollama application you downloaded; if it prompts you to install the command line usage, say \"yes\"\n",
"- Linux users can use the installation command provided on the ollama website\n",
"\n",
"- In general, before we can use ollama from the command line, we have to either start the ollama application or run `ollama serve` in a separate terminal\n",
"\n",
"<img src=\"https://sebastianraschka.com/images/LLMs-from-scratch-images/bonus/ollama-eval/ollama-serve.webp?1\">\n",
"\n",
"\n",
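Before proceeding, you can programmatically check whether the local server is reachable; a minimal sketch (the helper name `ollama_server_running` is hypothetical; port 11434 is ollama's default):

```python
import urllib.request
import urllib.error

def ollama_server_running(url="http://localhost:11434"):
    # ollama's root endpoint replies with a short status message when the server is up
    try:
        with urllib.request.urlopen(url, timeout=2) as response:
            return response.getcode() == 200
    except (urllib.error.URLError, OSError):
        return False
```

If this returns `False`, start the ollama application or run `ollama serve` first.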
"- With the ollama application or `ollama serve` running, execute the following command in a different terminal to try out the 8-billion-parameter Llama 3 model (the model, which takes up 4.7 GB of storage space, will be automatically downloaded the first time you run this command)\n",
"\n",
"```bash\n",
"# 8B model\n",
"ollama run llama3\n",
"```\n",
"\n",
"\n",
"The output looks as follows:\n",
"\n",
"```\n",
@ -165,7 +172,9 @@
"metadata": {},
"source": [
"- Now, an alternative way to interact with the model is via its REST API in Python via the following function\n",
"- Before you run the next cells in this notebook, make sure that ollama is still running, as described above, via\n",
" - `ollama serve` in a terminal\n",
" - the ollama application\n",
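Under the hood, such a REST API call can be sketched as follows (a minimal sketch using only Python's standard library; the `/api/chat` endpoint and `stream` field are part of ollama's REST API, but the exact function used in this notebook may differ):

```python
import json
import urllib.request

def build_chat_payload(model, prompt):
    # JSON body expected by ollama's /api/chat endpoint
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # request one complete response instead of a token stream
    }

def query_model(prompt, model="llama3", url="http://localhost:11434/api/chat"):
    # POST the payload to the locally running ollama server and return the reply text
    data = json.dumps(build_chat_payload(model, prompt)).encode("utf-8")
    request = urllib.request.Request(url, data=data, method="POST")
    request.add_header("Content-Type", "application/json")
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["message"]["content"]
```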
"- Next, run the following code cell to query the model"
]
},