mirror of
https://github.com/rasbt/LLMs-from-scratch.git
synced 2025-07-16 21:41:11 +00:00
{
|
||
"cells": [
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "c024bfa4-1a7a-4751-b5a1-827225a3478b",
|
||
"metadata": {
|
||
"id": "c024bfa4-1a7a-4751-b5a1-827225a3478b"
|
||
},
|
||
"source": [
|
||
"<font size=\"1\">\n",
|
||
"Supplementary code for \"Build a Large Language Model From Scratch\": <a href=\"https://www.manning.com/books/build-a-large-language-model-from-scratch\">https://www.manning.com/books/build-a-large-language-model-from-scratch</a> by <a href=\"https://sebastianraschka.com\">Sebastian Raschka</a><br>\n",
|
||
"Code repository: <a href=\"https://github.com/rasbt/LLMs-from-scratch\">https://github.com/rasbt/LLMs-from-scratch</a>\n",
|
||
"</font>"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "bfabadb8-5935-45ff-b39c-db7a29012129",
|
||
"metadata": {
|
||
"id": "bfabadb8-5935-45ff-b39c-db7a29012129"
|
||
},
|
||
"source": [
|
||
"# Chapter 6: Finetuning for Text Classification"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 1,
|
||
"id": "5b7e01c2-1c84-4f2a-bb51-2e0b74abda90",
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/"
|
||
},
|
||
"id": "5b7e01c2-1c84-4f2a-bb51-2e0b74abda90",
|
||
"outputId": "9495f150-9d79-4910-d6e7-6c0d9aae4a41"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"matplotlib version: 3.8.2\n",
|
||
"numpy version: 1.26.0\n",
|
||
"tiktoken version: 0.5.1\n",
|
||
"torch version: 2.2.2\n",
|
||
"tensorflow version: 2.15.0\n",
|
||
"pandas version: 2.2.1\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"from importlib.metadata import version\n",
|
||
"\n",
|
||
"pkgs = [\"matplotlib\",\n",
|
||
" \"numpy\",\n",
|
||
" \"tiktoken\",\n",
|
||
" \"torch\",\n",
|
||
" \"tensorflow\", # For OpenAI's pretrained weights\n",
|
||
" \"pandas\" # Dataset loading\n",
|
||
" ]\n",
|
||
"for p in pkgs:\n",
|
||
" print(f\"{p} version: {version(p)}\")"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "a445828a-ff10-4efa-9f60-a2e2aed4c87d",
|
||
"metadata": {},
|
||
"source": [
|
||
"<img src=\"https://sebastianraschka.com/images/LLMs-from-scratch-images/ch06_compressed/chapter-overview.webp\" width=500px>"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "3a84cf35-b37f-4c15-8972-dfafc9fadc1c",
|
||
"metadata": {
|
||
"id": "3a84cf35-b37f-4c15-8972-dfafc9fadc1c"
|
||
},
|
||
"source": [
|
||
"## 6.1 Different categories of finetuning"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "ede3d731-5123-4f02-accd-c670ce50a5a3",
|
||
"metadata": {
|
||
"id": "ede3d731-5123-4f02-accd-c670ce50a5a3"
|
||
},
|
||
"source": [
|
||
"- No code in this section"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "ac45579d-d485-47dc-829e-43be7f4db57b",
|
||
"metadata": {},
|
||
"source": [
|
||
"- The most common ways to finetune language models are instruction-finetuning and classification finetuning\n",
|
||
"- Instruction-finetuning, depicted below, is the topic of the next chapter"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "6c29ef42-46d9-43d4-8bb4-94974e1665e4",
|
||
"metadata": {},
|
||
"source": [
|
||
"<img src=\"https://sebastianraschka.com/images/LLMs-from-scratch-images/ch06_compressed/instructions.webp\" width=500px>"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "a7f60321-95b8-46a9-97bf-1d07fda2c3dd",
|
||
"metadata": {},
|
||
"source": [
|
||
"- Classification finetuning, the topic of this chapter, is a procedure you may already be familiar with if you have a background in machine learning -- it's similar to training a convolutional network to classify handwritten digits, for example\n",
|
||
"- In classification finetuning, we have a specific number of class labels (for example, \"spam\" and \"not spam\") that the model can output\n",
|
||
"- A classification finetuned model can only predict classes it has seen during training (for example, \"spam\" or \"not spam\", whereas an instruction-finetuned model can usually perform many tasks\n",
|
||
"- We can think of a classification-finetuned model as a very specialized model; in practice, it is much easier to create a specialized model than a generalist model that performs well on many different tasks"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "0b37a0c4-0bb1-4061-b1fe-eaa4416d52c3",
|
||
"metadata": {},
|
||
"source": [
|
||
"<img src=\"https://sebastianraschka.com/images/LLMs-from-scratch-images/ch06_compressed/spam-non-spam.webp\" width=500px>"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "8c7017a2-32aa-4002-a2f3-12aac293ccdf",
|
||
"metadata": {
|
||
"id": "8c7017a2-32aa-4002-a2f3-12aac293ccdf"
|
||
},
|
||
"source": [
|
||
"## 6.2 Preparing the dataset"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "5f628975-d2e8-4f7f-ab38-92bb868b7067",
|
||
"metadata": {},
|
||
"source": [
|
||
"<img src=\"https://sebastianraschka.com/images/LLMs-from-scratch-images/ch06_compressed/overview-1.webp\" width=500px>"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "9fbd459f-63fa-4d8c-8499-e23103156c7d",
|
||
"metadata": {
|
||
"id": "9fbd459f-63fa-4d8c-8499-e23103156c7d"
|
||
},
|
||
"source": [
|
||
"- This section prepares the dataset we use for classification finetuning\n",
|
||
"- We use a dataset consisting of spam and non-spam text messages to finetune the LLM to classify them\n",
|
||
"- First, we download and unzip the dataset"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 2,
|
||
"id": "def7c09b-af9c-4216-90ce-5e67aed1065c",
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/"
|
||
},
|
||
"id": "def7c09b-af9c-4216-90ce-5e67aed1065c",
|
||
"outputId": "424e4423-f623-443c-ab9e-656f9e867559"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"sms_spam_collection/SMSSpamCollection.tsv already exists. Skipping download and extraction.\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"import urllib.request\n",
|
||
"import zipfile\n",
|
||
"import os\n",
|
||
"from pathlib import Path\n",
|
||
"\n",
|
||
"url = \"https://archive.ics.uci.edu/static/public/228/sms+spam+collection.zip\"\n",
|
||
"zip_path = \"sms_spam_collection.zip\"\n",
|
||
"extracted_path = \"sms_spam_collection\"\n",
|
||
"data_file_path = Path(extracted_path) / \"SMSSpamCollection.tsv\"\n",
|
||
"\n",
|
||
"def download_and_unzip_spam_data(url, zip_path, extracted_path, data_file_path):\n",
|
||
" if data_file_path.exists():\n",
|
||
" print(f\"{data_file_path} already exists. Skipping download and extraction.\")\n",
|
||
" return\n",
|
||
"\n",
|
||
" # Downloading the file\n",
|
||
" with urllib.request.urlopen(url) as response:\n",
|
||
" with open(zip_path, \"wb\") as out_file:\n",
|
||
" out_file.write(response.read())\n",
|
||
"\n",
|
||
" # Unzipping the file\n",
|
||
" with zipfile.ZipFile(zip_path, \"r\") as zip_ref:\n",
|
||
" zip_ref.extractall(extracted_path)\n",
|
||
"\n",
|
||
" # Add .tsv file extension\n",
|
||
" original_file_path = Path(extracted_path) / \"SMSSpamCollection\"\n",
|
||
" os.rename(original_file_path, data_file_path)\n",
|
||
" print(f\"File downloaded and saved as {data_file_path}\")\n",
|
||
"\n",
|
||
"download_and_unzip_spam_data(url, zip_path, extracted_path, data_file_path)"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "6aac2d19-06d0-4005-916b-0bd4b1ee50d1",
|
||
"metadata": {
|
||
"id": "6aac2d19-06d0-4005-916b-0bd4b1ee50d1"
|
||
},
|
||
"source": [
|
||
"- The dataset is saved as a tab-separated text file, which we can load into a pandas DataFrame"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 3,
|
||
"id": "da0ed4da-ac31-4e4d-8bdd-2153be4656a4",
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/",
|
||
"height": 423
|
||
},
|
||
"id": "da0ed4da-ac31-4e4d-8bdd-2153be4656a4",
|
||
"outputId": "a16c5cde-d341-4887-a93f-baa9bec542ab"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<div>\n",
|
||
"<style scoped>\n",
|
||
" .dataframe tbody tr th:only-of-type {\n",
|
||
" vertical-align: middle;\n",
|
||
" }\n",
|
||
"\n",
|
||
" .dataframe tbody tr th {\n",
|
||
" vertical-align: top;\n",
|
||
" }\n",
|
||
"\n",
|
||
" .dataframe thead th {\n",
|
||
" text-align: right;\n",
|
||
" }\n",
|
||
"</style>\n",
|
||
"<table border=\"1\" class=\"dataframe\">\n",
|
||
" <thead>\n",
|
||
" <tr style=\"text-align: right;\">\n",
|
||
" <th></th>\n",
|
||
" <th>Label</th>\n",
|
||
" <th>Text</th>\n",
|
||
" </tr>\n",
|
||
" </thead>\n",
|
||
" <tbody>\n",
|
||
" <tr>\n",
|
||
" <th>0</th>\n",
|
||
" <td>ham</td>\n",
|
||
" <td>Go until jurong point, crazy.. Available only ...</td>\n",
|
||
" </tr>\n",
|
||
" <tr>\n",
|
||
" <th>1</th>\n",
|
||
" <td>ham</td>\n",
|
||
" <td>Ok lar... Joking wif u oni...</td>\n",
|
||
" </tr>\n",
|
||
" <tr>\n",
|
||
" <th>2</th>\n",
|
||
" <td>spam</td>\n",
|
||
" <td>Free entry in 2 a wkly comp to win FA Cup fina...</td>\n",
|
||
" </tr>\n",
|
||
" <tr>\n",
|
||
" <th>3</th>\n",
|
||
" <td>ham</td>\n",
|
||
" <td>U dun say so early hor... U c already then say...</td>\n",
|
||
" </tr>\n",
|
||
" <tr>\n",
|
||
" <th>4</th>\n",
|
||
" <td>ham</td>\n",
|
||
" <td>Nah I don't think he goes to usf, he lives aro...</td>\n",
|
||
" </tr>\n",
|
||
" <tr>\n",
|
||
" <th>...</th>\n",
|
||
" <td>...</td>\n",
|
||
" <td>...</td>\n",
|
||
" </tr>\n",
|
||
" <tr>\n",
|
||
" <th>5567</th>\n",
|
||
" <td>spam</td>\n",
|
||
" <td>This is the 2nd time we have tried 2 contact u...</td>\n",
|
||
" </tr>\n",
|
||
" <tr>\n",
|
||
" <th>5568</th>\n",
|
||
" <td>ham</td>\n",
|
||
" <td>Will ü b going to esplanade fr home?</td>\n",
|
||
" </tr>\n",
|
||
" <tr>\n",
|
||
" <th>5569</th>\n",
|
||
" <td>ham</td>\n",
|
||
" <td>Pity, * was in mood for that. So...any other s...</td>\n",
|
||
" </tr>\n",
|
||
" <tr>\n",
|
||
" <th>5570</th>\n",
|
||
" <td>ham</td>\n",
|
||
" <td>The guy did some bitching but I acted like i'd...</td>\n",
|
||
" </tr>\n",
|
||
" <tr>\n",
|
||
" <th>5571</th>\n",
|
||
" <td>ham</td>\n",
|
||
" <td>Rofl. Its true to its name</td>\n",
|
||
" </tr>\n",
|
||
" </tbody>\n",
|
||
"</table>\n",
|
||
"<p>5572 rows × 2 columns</p>\n",
|
||
"</div>"
|
||
],
|
||
"text/plain": [
|
||
" Label Text\n",
|
||
"0 ham Go until jurong point, crazy.. Available only ...\n",
|
||
"1 ham Ok lar... Joking wif u oni...\n",
|
||
"2 spam Free entry in 2 a wkly comp to win FA Cup fina...\n",
|
||
"3 ham U dun say so early hor... U c already then say...\n",
|
||
"4 ham Nah I don't think he goes to usf, he lives aro...\n",
|
||
"... ... ...\n",
|
||
"5567 spam This is the 2nd time we have tried 2 contact u...\n",
|
||
"5568 ham Will ü b going to esplanade fr home?\n",
|
||
"5569 ham Pity, * was in mood for that. So...any other s...\n",
|
||
"5570 ham The guy did some bitching but I acted like i'd...\n",
|
||
"5571 ham Rofl. Its true to its name\n",
|
||
"\n",
|
||
"[5572 rows x 2 columns]"
|
||
]
|
||
},
|
||
"execution_count": 3,
|
||
"metadata": {},
|
||
"output_type": "execute_result"
|
||
}
|
||
],
|
||
"source": [
|
||
"import pandas as pd\n",
|
||
"\n",
|
||
"df = pd.read_csv(data_file_path, sep=\"\\t\", header=None, names=[\"Label\", \"Text\"])\n",
|
||
"df"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "e7b6e631-4f0b-4aab-82b9-8898e6663109",
|
||
"metadata": {
|
||
"id": "e7b6e631-4f0b-4aab-82b9-8898e6663109"
|
||
},
|
||
"source": [
|
||
"- When we check the class distribution, we see that the data contains \"ham\" (i.e., \"not spam\") much more frequently than \"spam\""
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 4,
|
||
"id": "495a5280-9d7c-41d4-9719-64ab99056d4c",
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/"
|
||
},
|
||
"id": "495a5280-9d7c-41d4-9719-64ab99056d4c",
|
||
"outputId": "761e0482-43ba-4f46-f4b7-6774dae51b38"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"Label\n",
|
||
"ham 4825\n",
|
||
"spam 747\n",
|
||
"Name: count, dtype: int64\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"print(df[\"Label\"].value_counts())"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "f773f054-0bdc-4aad-bbf6-397621bf63db",
|
||
"metadata": {
|
||
"id": "f773f054-0bdc-4aad-bbf6-397621bf63db"
|
||
},
|
||
"source": [
|
||
"- For simplicity, and because we prefer a small dataset for educational purposes anyway (it will make it possible to finetune the LLM faster), we subsample (undersample) the dataset so that it contains 747 instances from each class\n",
|
||
"- (Next to undersampling, there are several other ways to deal with class balances, but they are out of the scope of a book on LLMs; you can find examples and more information in the [`imbalanced-learn` user guide](https://imbalanced-learn.org/stable/user_guide.html))"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 5,
|
||
"id": "7be4a0a2-9704-4a96-b38f-240339818688",
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/"
|
||
},
|
||
"id": "7be4a0a2-9704-4a96-b38f-240339818688",
|
||
"outputId": "396dc415-cb71-4a88-e85d-d88201c6d73f"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"Label\n",
|
||
"ham 747\n",
|
||
"spam 747\n",
|
||
"Name: count, dtype: int64\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"def create_balanced_dataset(df):\n",
|
||
" \n",
|
||
" # Count the instances of \"spam\"\n",
|
||
" num_spam = df[df[\"Label\"] == \"spam\"].shape[0]\n",
|
||
" \n",
|
||
" # Randomly sample \"ham\" instances to match the number of \"spam\" instances\n",
|
||
" ham_subset = df[df[\"Label\"] == \"ham\"].sample(num_spam, random_state=123)\n",
|
||
" \n",
|
||
" # Combine ham \"subset\" with \"spam\"\n",
|
||
" balanced_df = pd.concat([ham_subset, df[df[\"Label\"] == \"spam\"]])\n",
|
||
"\n",
|
||
" return balanced_df\n",
|
||
"\n",
|
||
"balanced_df = create_balanced_dataset(df)\n",
|
||
"print(balanced_df[\"Label\"].value_counts())"
|
||
]
|
||
},
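{
 "cell_type": "markdown",
 "metadata": {},
 "source": [
  "- As an optional aside (not used in the rest of this chapter), the cell below is a minimal sketch of random oversampling, which balances the classes by duplicating \"spam\" messages instead of discarding \"ham\" messages"
 ]
},
{
 "cell_type": "code",
 "execution_count": null,
 "metadata": {},
 "outputs": [],
 "source": [
  "# Optional sketch: random oversampling of the minority class (\"spam\")\n",
  "# instead of undersampling the majority class; for illustration only\n",
  "def create_oversampled_dataset(df):\n",
  "    num_ham = df[df[\"Label\"] == \"ham\"].shape[0]\n",
  "    # Sample \"spam\" rows with replacement until they match the \"ham\" count\n",
  "    spam_oversampled = df[df[\"Label\"] == \"spam\"].sample(num_ham, replace=True, random_state=123)\n",
  "    return pd.concat([df[df[\"Label\"] == \"ham\"], spam_oversampled])\n",
  "\n",
  "print(create_oversampled_dataset(df)[\"Label\"].value_counts())"
 ]
},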
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "d3fd2f5a-06d8-4d30-a2e3-230b86c559d6",
|
||
"metadata": {
|
||
"id": "d3fd2f5a-06d8-4d30-a2e3-230b86c559d6"
|
||
},
|
||
"source": [
|
||
"- Next, we change the string class labels \"ham\" and \"spam\" into integer class labels 0 and 1:"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 6,
|
||
"id": "c1b10c3d-5d57-42d0-8de8-cf80a06f5ffd",
|
||
"metadata": {
|
||
"id": "c1b10c3d-5d57-42d0-8de8-cf80a06f5ffd"
|
||
},
|
||
"outputs": [],
|
||
"source": [
|
||
"balanced_df[\"Label\"] = balanced_df[\"Label\"].map({\"ham\": 0, \"spam\": 1})"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "5715e685-35b4-4b45-a86c-8a8694de9d6f",
|
||
"metadata": {
|
||
"id": "5715e685-35b4-4b45-a86c-8a8694de9d6f"
|
||
},
|
||
"source": [
|
||
"- Let's now define a function that randomly divides the dataset into a training, validation, and test subset"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 7,
|
||
"id": "uQl0Psdmx15D",
|
||
"metadata": {
|
||
"id": "uQl0Psdmx15D"
|
||
},
|
||
"outputs": [],
|
||
"source": [
|
||
"def random_split(df, train_frac, validation_frac):\n",
|
||
" # Shuffle the entire DataFrame\n",
|
||
" df = df.sample(frac=1, random_state=123).reset_index(drop=True)\n",
|
||
"\n",
|
||
" # Calculate split indices\n",
|
||
" train_end = int(len(df) * train_frac)\n",
|
||
" validation_end = train_end + int(len(df) * validation_frac)\n",
|
||
"\n",
|
||
" # Split the DataFrame\n",
|
||
" train_df = df[:train_end]\n",
|
||
" validation_df = df[train_end:validation_end]\n",
|
||
" test_df = df[validation_end:]\n",
|
||
"\n",
|
||
" return train_df, validation_df, test_df\n",
|
||
"\n",
|
||
"train_df, validation_df, test_df = random_split(balanced_df, 0.7, 0.1)\n",
|
||
"# Test size is implied to be 0.2 as the remainder\n",
|
||
"\n",
|
||
"train_df.to_csv(\"train.csv\", index=None)\n",
|
||
"validation_df.to_csv(\"validation.csv\", index=None)\n",
|
||
"test_df.to_csv(\"test.csv\", index=None)"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "a8d7a0c5-1d5f-458a-b685-3f49520b0094",
|
||
"metadata": {},
|
||
"source": [
|
||
"## 6.3 Creating data loaders"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "7126108a-75e7-4862-b0fb-cbf59a18bb6c",
|
||
"metadata": {
|
||
"id": "7126108a-75e7-4862-b0fb-cbf59a18bb6c"
|
||
},
|
||
"source": [
|
||
"- Note that the text messages have different lengths; if we want to combine multiple training examples in a batch, we have to either\n",
|
||
" - 1. truncate all messages to the length of the shortest message in the dataset or batch\n",
|
||
" - 2. pad all messages to the length of the longest message in the dataset or batch\n",
|
||
"\n",
|
||
"- We choose option 2 and pad all messages to the longest message in the dataset\n",
|
||
"- For that, we use `<|endoftext|>` as a padding token, as discussed in chapter 2"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "0829f33f-1428-4f22-9886-7fee633b3666",
|
||
"metadata": {},
|
||
"source": [
|
||
"<img src=\"https://sebastianraschka.com/images/LLMs-from-scratch-images/ch06_compressed/pad-input-sequences.webp?123\" width=500px>"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 8,
|
||
"id": "74c3c463-8763-4cc0-9320-41c7eaad8ab7",
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/"
|
||
},
|
||
"id": "74c3c463-8763-4cc0-9320-41c7eaad8ab7",
|
||
"outputId": "b5b48439-32c8-4b37-cca2-c9dc8fa86563"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"[50256]\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"import tiktoken\n",
|
||
"\n",
|
||
"tokenizer = tiktoken.get_encoding(\"gpt2\")\n",
|
||
"print(tokenizer.encode(\"<|endoftext|>\", allowed_special={\"<|endoftext|>\"}))"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "04f582ff-68bf-450e-bd87-5fb61afe431c",
|
||
"metadata": {
|
||
"id": "04f582ff-68bf-450e-bd87-5fb61afe431c"
|
||
},
|
||
"source": [
|
||
"- The `SpamDataset` class below identifies the longest sequence in the training dataset and adds the padding token to the others to match that sequence length"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 9,
|
||
"id": "d7791b52-af18-4ac4-afa9-b921068e383e",
|
||
"metadata": {
|
||
"id": "d7791b52-af18-4ac4-afa9-b921068e383e"
|
||
},
|
||
"outputs": [],
|
||
"source": [
|
||
"import torch\n",
|
||
"from torch.utils.data import Dataset\n",
|
||
"\n",
|
||
"\n",
|
||
"class SpamDataset(Dataset):\n",
|
||
" def __init__(self, csv_file, tokenizer, max_length=None, pad_token_id=50256):\n",
|
||
" self.data = pd.read_csv(csv_file)\n",
|
||
"\n",
|
||
" # Pre-tokenize texts\n",
|
||
" self.encoded_texts = [\n",
|
||
" tokenizer.encode(text) for text in self.data[\"Text\"]\n",
|
||
" ]\n",
|
||
"\n",
|
||
" if max_length is None:\n",
|
||
" self.max_length = self._longest_encoded_length()\n",
|
||
" else:\n",
|
||
" self.max_length = max_length\n",
|
||
" # Truncate sequences if they are longer than max_length\n",
|
||
" self.encoded_texts = [\n",
|
||
" encoded_text[:self.max_length]\n",
|
||
" for encoded_text in self.encoded_texts\n",
|
||
" ]\n",
|
||
"\n",
|
||
" # Pad sequences to the longest sequence\n",
|
||
" self.encoded_texts = [\n",
|
||
" encoded_text + [pad_token_id] * (self.max_length - len(encoded_text))\n",
|
||
" for encoded_text in self.encoded_texts\n",
|
||
" ]\n",
|
||
"\n",
|
||
" def __getitem__(self, index):\n",
|
||
" encoded = self.encoded_texts[index]\n",
|
||
" label = self.data.iloc[index][\"Label\"]\n",
|
||
" return (\n",
|
||
" torch.tensor(encoded, dtype=torch.long),\n",
|
||
" torch.tensor(label, dtype=torch.long)\n",
|
||
" )\n",
|
||
"\n",
|
||
" def __len__(self):\n",
|
||
" return len(self.data)\n",
|
||
"\n",
|
||
" def _longest_encoded_length(self):\n",
|
||
" max_length = 0\n",
|
||
" for encoded_text in self.encoded_texts:\n",
|
||
" encoded_length = len(encoded_text)\n",
|
||
" if encoded_length > max_length:\n",
|
||
" max_length = encoded_length\n",
|
||
" return max_length"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 10,
|
||
"id": "uzj85f8ou82h",
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/"
|
||
},
|
||
"id": "uzj85f8ou82h",
|
||
"outputId": "d08f1cf0-c24d-445f-a3f8-793532c3716f"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"120\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"train_dataset = SpamDataset(\n",
|
||
" csv_file=\"train.csv\",\n",
|
||
" max_length=None,\n",
|
||
" tokenizer=tokenizer\n",
|
||
")\n",
|
||
"\n",
|
||
"print(train_dataset.max_length)"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "15bdd932-97eb-4b88-9cf9-d766ea4c3a60",
|
||
"metadata": {},
|
||
"source": [
|
||
"- We also pad the validation and test set to the longest training sequence\n",
|
||
"- Note that validation and test set samples that are longer than the longest training example are being truncated via `encoded_text[:self.max_length]` in the `SpamDataset` code\n",
|
||
"- This behavior is entirely optional, and it would also work well if we set `max_length=None` in both the validation and test set cases"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 11,
|
||
"id": "bb0c502d-a75e-4248-8ea0-196e2b00c61e",
|
||
"metadata": {
|
||
"id": "bb0c502d-a75e-4248-8ea0-196e2b00c61e"
|
||
},
|
||
"outputs": [],
|
||
"source": [
|
||
"val_dataset = SpamDataset(\n",
|
||
" csv_file=\"validation.csv\",\n",
|
||
" max_length=train_dataset.max_length,\n",
|
||
" tokenizer=tokenizer\n",
|
||
")\n",
|
||
"test_dataset = SpamDataset(\n",
|
||
" csv_file=\"test.csv\",\n",
|
||
" max_length=train_dataset.max_length,\n",
|
||
" tokenizer=tokenizer\n",
|
||
")"
|
||
]
|
||
},
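{
 "cell_type": "markdown",
 "metadata": {},
 "source": [
  "- As a brief, optional check of the note above: passing `max_length=None` lets the validation set determine its own padding length (the variable name below is only for illustration and is not used later)"
 ]
},
{
 "cell_type": "code",
 "execution_count": null,
 "metadata": {},
 "outputs": [],
 "source": [
  "# Optional check (illustration only): pad the validation set to its own\n",
  "# longest sequence instead of reusing the training set's max_length\n",
  "val_dataset_own_padding = SpamDataset(\n",
  "    csv_file=\"validation.csv\",\n",
  "    max_length=None,\n",
  "    tokenizer=tokenizer\n",
  ")\n",
  "print(val_dataset_own_padding.max_length)"
 ]
},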
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "20170d89-85a0-4844-9887-832f5d23432a",
|
||
"metadata": {},
|
||
"source": [
|
||
"- Next, we use the dataset to instantiate the data loaders, which is similar to creating the data loaders in previous chapters"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "64bcc349-205f-48f8-9655-95ff21f5e72f",
|
||
"metadata": {},
|
||
"source": [
|
||
"<img src=\"https://sebastianraschka.com/images/LLMs-from-scratch-images/ch06_compressed/batch.webp\" width=500px>"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 12,
|
||
"id": "8681adc0-6f02-4e75-b01a-a6ab75d05542",
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/"
|
||
},
|
||
"id": "8681adc0-6f02-4e75-b01a-a6ab75d05542",
|
||
"outputId": "3266c410-4fdb-4a8c-a142-7f707e2525ab"
|
||
},
|
||
"outputs": [],
|
||
"source": [
|
||
"from torch.utils.data import DataLoader\n",
|
||
"\n",
|
||
"num_workers = 0\n",
|
||
"batch_size = 8\n",
|
||
"\n",
|
||
"torch.manual_seed(123)\n",
|
||
"\n",
|
||
"train_loader = DataLoader(\n",
|
||
" dataset=train_dataset,\n",
|
||
" batch_size=batch_size,\n",
|
||
" shuffle=True,\n",
|
||
" num_workers=num_workers,\n",
|
||
" drop_last=True,\n",
|
||
")\n",
|
||
"\n",
|
||
"val_loader = DataLoader(\n",
|
||
" dataset=val_dataset,\n",
|
||
" batch_size=batch_size,\n",
|
||
" num_workers=num_workers,\n",
|
||
" drop_last=False,\n",
|
||
")\n",
|
||
"\n",
|
||
"test_loader = DataLoader(\n",
|
||
" dataset=test_dataset,\n",
|
||
" batch_size=batch_size,\n",
|
||
" num_workers=num_workers,\n",
|
||
" drop_last=False,\n",
|
||
")"
|
||
]
|
||
},
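{
 "cell_type": "markdown",
 "metadata": {},
 "source": [
  "- As an optional aside, the sketch below illustrates the per-batch padding alternative mentioned at the beginning of this section: the dataset would return un-padded token lists, and a custom `collate_fn` would pad each batch to its own longest sequence (the names below are illustrative and are not used elsewhere in this chapter)"
 ]
},
{
 "cell_type": "code",
 "execution_count": null,
 "metadata": {},
 "outputs": [],
 "source": [
  "# Optional sketch of per-batch (dynamic) padding; with this approach, the\n",
  "# Dataset would return un-padded token lists and the DataLoader would be\n",
  "# created with collate_fn=collate_with_dynamic_padding\n",
  "def collate_with_dynamic_padding(batch, pad_token_id=50256):\n",
  "    max_len = max(len(token_ids) for token_ids, label in batch)\n",
  "    inputs = torch.tensor([\n",
  "        token_ids + [pad_token_id] * (max_len - len(token_ids))\n",
  "        for token_ids, label in batch\n",
  "    ])\n",
  "    labels = torch.tensor([label for token_ids, label in batch])\n",
  "    return inputs, labels\n",
  "\n",
  "# Tiny demo with two variable-length token sequences\n",
  "example_batch = [\n",
  "    (tokenizer.encode(\"Hello world\"), 0),\n",
  "    (tokenizer.encode(\"You have won a prize, call now!\"), 1),\n",
  "]\n",
  "inputs, labels = collate_with_dynamic_padding(example_batch)\n",
  "print(inputs.shape, labels)"
 ]
},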
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "ab7335db-e0bb-4e27-80c5-eea11e593a57",
|
||
"metadata": {},
|
||
"source": [
|
||
"- As a verification step, we iterate through the data loaders and ensure that the batches contain 8 training examples each, where each training example consists of 120 tokens"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 13,
|
||
"id": "4dee6882-4c3a-4964-af15-fa31f86ad047",
|
||
"metadata": {},
|
||
"outputs": [
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"Train loader:\n",
|
||
"Input batch dimensions: torch.Size([8, 120])\n",
|
||
"Label batch dimensions torch.Size([8])\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"print(\"Train loader:\")\n",
|
||
"for input_batch, target_batch in train_loader:\n",
|
||
" pass\n",
|
||
"\n",
|
||
"print(\"Input batch dimensions:\", input_batch.shape)\n",
|
||
"print(\"Label batch dimensions\", target_batch.shape)"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "5cdd7947-7039-49bf-8a5e-c0a2f4281ca1",
|
||
"metadata": {},
|
||
"source": [
|
||
"- Lastly, let's print the total number of batches in each dataset"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 14,
|
||
"id": "IZfw-TYD2zTj",
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/"
|
||
},
|
||
"id": "IZfw-TYD2zTj",
|
||
"outputId": "6934bbf2-9797-4fbe-d26b-1a246e18c2fb"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"130 training batches\n",
|
||
"19 validation batches\n",
|
||
"38 test batches\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"print(f\"{len(train_loader)} training batches\")\n",
|
||
"print(f\"{len(val_loader)} validation batches\")\n",
|
||
"print(f\"{len(test_loader)} test batches\")"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "d1c4f61a-5f5d-4b3b-97cf-151b617d1d6c",
|
||
"metadata": {
|
||
"id": "d1c4f61a-5f5d-4b3b-97cf-151b617d1d6c"
|
||
},
|
||
"source": [
|
||
"## 6.4 Initializing a model with pretrained weights"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "97e1af8b-8bd1-4b44-8b8b-dc031496e208",
|
||
"metadata": {},
|
||
"source": [
|
||
"- In this section, we initialize the pretrained model we worked with in the previous chapter\n",
|
||
"\n",
|
||
"<img src=\"https://sebastianraschka.com/images/LLMs-from-scratch-images/ch06_compressed/overview-2.webp\" width=500px>"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 15,
|
||
"id": "2992d779-f9fb-4812-a117-553eb790a5a9",
|
||
"metadata": {
|
||
"id": "2992d779-f9fb-4812-a117-553eb790a5a9"
|
||
},
|
||
"outputs": [],
|
||
"source": [
|
||
"CHOOSE_MODEL = \"gpt2-small (124M)\"\n",
|
||
"INPUT_PROMPT = \"Every effort moves\"\n",
|
||
"\n",
|
||
"BASE_CONFIG = {\n",
|
||
" \"vocab_size\": 50257, # Vocabulary size\n",
|
||
" \"context_length\": 1024, # Context length\n",
|
||
" \"drop_rate\": 0.0, # Dropout rate\n",
|
||
" \"qkv_bias\": True # Query-key-value bias\n",
|
||
"}\n",
|
||
"\n",
|
||
"model_configs = {\n",
|
||
" \"gpt2-small (124M)\": {\"emb_dim\": 768, \"n_layers\": 12, \"n_heads\": 12},\n",
|
||
" \"gpt2-medium (355M)\": {\"emb_dim\": 1024, \"n_layers\": 24, \"n_heads\": 16},\n",
|
||
" \"gpt2-large (774M)\": {\"emb_dim\": 1280, \"n_layers\": 36, \"n_heads\": 20},\n",
|
||
" \"gpt2-xl (1558M)\": {\"emb_dim\": 1600, \"n_layers\": 48, \"n_heads\": 25},\n",
|
||
"}\n",
|
||
"\n",
|
||
"BASE_CONFIG.update(model_configs[CHOOSE_MODEL])"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 16,
|
||
"id": "022a649a-44f5-466c-8a8e-326c063384f5",
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/"
|
||
},
|
||
"id": "022a649a-44f5-466c-8a8e-326c063384f5",
|
||
"outputId": "7091e401-8442-4f47-a1d9-ecb42a1ef930"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"File already exists and is up-to-date: gpt2/124M/checkpoint\n",
|
||
"File already exists and is up-to-date: gpt2/124M/encoder.json\n",
|
||
"File already exists and is up-to-date: gpt2/124M/hparams.json\n",
|
||
"File already exists and is up-to-date: gpt2/124M/model.ckpt.data-00000-of-00001\n",
|
||
"File already exists and is up-to-date: gpt2/124M/model.ckpt.index\n",
|
||
"File already exists and is up-to-date: gpt2/124M/model.ckpt.meta\n",
|
||
"File already exists and is up-to-date: gpt2/124M/vocab.bpe\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"from gpt_download import download_and_load_gpt2\n",
|
||
"from previous_chapters import GPTModel, load_weights_into_gpt\n",
|
||
"\n",
|
||
"model_size = CHOOSE_MODEL.split(\" \")[-1].lstrip(\"(\").rstrip(\")\")\n",
|
||
"settings, params = download_and_load_gpt2(model_size=model_size, models_dir=\"gpt2\")\n",
|
||
"\n",
|
||
"model = GPTModel(BASE_CONFIG)\n",
|
||
"load_weights_into_gpt(model, params)\n",
|
||
"model.eval();"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "ab8e056c-abe0-415f-b34d-df686204259e",
|
||
"metadata": {},
|
||
"source": [
|
||
"- To ensure that the model was loaded corrected, let's double-check that it generates coherent text"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 17,
|
||
"id": "d8ac25ff-74b1-4149-8dc5-4c429d464330",
|
||
"metadata": {},
|
||
"outputs": [
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"Every effort moves you forward.\n",
|
||
"\n",
|
||
"The first step is to understand the importance of your work\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"from previous_chapters import (\n",
|
||
" generate_text_simple,\n",
|
||
" text_to_token_ids,\n",
|
||
" token_ids_to_text\n",
|
||
")\n",
|
||
"\n",
|
||
"\n",
|
||
"text_1 = \"Every effort moves you\"\n",
|
||
"\n",
|
||
"token_ids = generate_text_simple(\n",
|
||
" model=model,\n",
|
||
" idx=text_to_token_ids(text_1, tokenizer),\n",
|
||
" max_new_tokens=15,\n",
|
||
" context_size=BASE_CONFIG[\"context_length\"]\n",
|
||
")\n",
|
||
"\n",
|
||
"print(token_ids_to_text(token_ids, tokenizer))"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "69162550-6a02-4ece-8db1-06c71d61946f",
|
||
"metadata": {},
|
||
"source": [
|
||
"- Before we finetune the model as a classifier, let's see if the model can perhaps already classify spam messages via prompting"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 18,
|
||
"id": "94224aa9-c95a-4f8a-a420-76d01e3a800c",
|
||
"metadata": {},
|
||
"outputs": [
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"Is the following text 'spam'? Answer with 'yes' or 'no': 'You are a winner you have been specially selected to receive $1000 cash or a $2000 award.' Answer with 'yes' or 'no'. Answer with 'yes' or 'no'. Answer with 'yes' or 'no'. Answer with 'yes'\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"text_2 = (\n",
|
||
" \"Is the following text 'spam'? Answer with 'yes' or 'no':\"\n",
|
||
" \" 'You are a winner you have been specially\"\n",
|
||
" \" selected to receive $1000 cash or a $2000 award.'\"\n",
|
||
" \" Answer with 'yes' or 'no'.\"\n",
|
||
")\n",
|
||
"\n",
|
||
"token_ids = generate_text_simple(\n",
|
||
" model=model,\n",
|
||
" idx=text_to_token_ids(text_2, tokenizer),\n",
|
||
" max_new_tokens=23,\n",
|
||
" context_size=BASE_CONFIG[\"context_length\"]\n",
|
||
")\n",
|
||
"\n",
|
||
"print(token_ids_to_text(token_ids, tokenizer))"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "1ce39ed0-2c77-410d-8392-dd15d4b22016",
|
||
"metadata": {},
|
||
"source": [
|
||
"- As we can see, the model is not very good at following instructions\n",
|
||
"- This is expected, since it has only been pretrained and not instruction-finetuned (instruction finetuning will be covered in the next chapter)"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "4c9ae440-32f9-412f-96cf-fd52cc3e2522",
|
||
"metadata": {
|
||
"id": "4c9ae440-32f9-412f-96cf-fd52cc3e2522"
|
||
},
|
||
"source": [
|
||
"## 6.5 Adding a classification head"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "d6e9d66f-76b2-40fc-9ec5-3f972a8db9c0",
|
||
"metadata": {},
|
||
"source": [
|
||
"<img src=\"https://sebastianraschka.com/images/LLMs-from-scratch-images/ch06_compressed/lm-head.webp\" width=500px>"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "217bac05-78df-4412-bd80-612f8061c01d",
|
||
"metadata": {},
|
||
"source": [
|
||
"- In this section, we are modifying the pretrained LLM to make it ready for classification finetuning\n",
|
||
"- Let's take a look at the model architecture first"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 19,
|
||
"id": "b23aff91-6bd0-48da-88f6-353657e6c981",
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/"
|
||
},
|
||
"id": "1d8f7a01-b7c0-48d4-b1e7-8c12cc7ad932",
|
||
"outputId": "b6a5b9b5-a92f-498f-d7cb-b58dd99e4497"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"GPTModel(\n",
|
||
" (tok_emb): Embedding(50257, 768)\n",
|
||
" (pos_emb): Embedding(1024, 768)\n",
|
||
" (drop_emb): Dropout(p=0.0, inplace=False)\n",
|
||
" (trf_blocks): Sequential(\n",
|
||
" (0): TransformerBlock(\n",
|
||
" (att): MultiHeadAttention(\n",
|
||
" (W_query): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (W_key): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (W_value): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (out_proj): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (dropout): Dropout(p=0.0, inplace=False)\n",
|
||
" )\n",
|
||
" (ff): FeedForward(\n",
|
||
" (layers): Sequential(\n",
|
||
" (0): Linear(in_features=768, out_features=3072, bias=True)\n",
|
||
" (1): GELU()\n",
|
||
" (2): Linear(in_features=3072, out_features=768, bias=True)\n",
|
||
" )\n",
|
||
" )\n",
|
||
" (norm1): LayerNorm()\n",
|
||
" (norm2): LayerNorm()\n",
|
||
" (drop_resid): Dropout(p=0.0, inplace=False)\n",
|
||
" )\n",
|
||
" (1): TransformerBlock(\n",
|
||
" (att): MultiHeadAttention(\n",
|
||
" (W_query): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (W_key): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (W_value): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (out_proj): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (dropout): Dropout(p=0.0, inplace=False)\n",
|
||
" )\n",
|
||
" (ff): FeedForward(\n",
|
||
" (layers): Sequential(\n",
|
||
" (0): Linear(in_features=768, out_features=3072, bias=True)\n",
|
||
" (1): GELU()\n",
|
||
" (2): Linear(in_features=3072, out_features=768, bias=True)\n",
|
||
" )\n",
|
||
" )\n",
|
||
" (norm1): LayerNorm()\n",
|
||
" (norm2): LayerNorm()\n",
|
||
" (drop_resid): Dropout(p=0.0, inplace=False)\n",
|
||
" )\n",
|
||
" (2): TransformerBlock(\n",
|
||
" (att): MultiHeadAttention(\n",
|
||
" (W_query): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (W_key): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (W_value): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (out_proj): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (dropout): Dropout(p=0.0, inplace=False)\n",
|
||
" )\n",
|
||
" (ff): FeedForward(\n",
|
||
" (layers): Sequential(\n",
|
||
" (0): Linear(in_features=768, out_features=3072, bias=True)\n",
|
||
" (1): GELU()\n",
|
||
" (2): Linear(in_features=3072, out_features=768, bias=True)\n",
|
||
" )\n",
|
||
" )\n",
|
||
" (norm1): LayerNorm()\n",
|
||
" (norm2): LayerNorm()\n",
|
||
" (drop_resid): Dropout(p=0.0, inplace=False)\n",
|
||
" )\n",
|
||
" (3): TransformerBlock(\n",
|
||
" (att): MultiHeadAttention(\n",
|
||
" (W_query): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (W_key): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (W_value): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (out_proj): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (dropout): Dropout(p=0.0, inplace=False)\n",
|
||
" )\n",
|
||
" (ff): FeedForward(\n",
|
||
" (layers): Sequential(\n",
|
||
" (0): Linear(in_features=768, out_features=3072, bias=True)\n",
|
||
" (1): GELU()\n",
|
||
" (2): Linear(in_features=3072, out_features=768, bias=True)\n",
|
||
" )\n",
|
||
" )\n",
|
||
" (norm1): LayerNorm()\n",
|
||
" (norm2): LayerNorm()\n",
|
||
" (drop_resid): Dropout(p=0.0, inplace=False)\n",
|
||
" )\n",
|
||
" (4): TransformerBlock(\n",
|
||
" (att): MultiHeadAttention(\n",
|
||
" (W_query): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (W_key): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (W_value): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (out_proj): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (dropout): Dropout(p=0.0, inplace=False)\n",
|
||
" )\n",
|
||
" (ff): FeedForward(\n",
|
||
" (layers): Sequential(\n",
|
||
" (0): Linear(in_features=768, out_features=3072, bias=True)\n",
|
||
" (1): GELU()\n",
|
||
" (2): Linear(in_features=3072, out_features=768, bias=True)\n",
|
||
" )\n",
|
||
" )\n",
|
||
" (norm1): LayerNorm()\n",
|
||
" (norm2): LayerNorm()\n",
|
||
" (drop_resid): Dropout(p=0.0, inplace=False)\n",
|
||
" )\n",
|
||
" (5): TransformerBlock(\n",
|
||
" (att): MultiHeadAttention(\n",
|
||
" (W_query): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (W_key): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (W_value): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (out_proj): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (dropout): Dropout(p=0.0, inplace=False)\n",
|
||
" )\n",
|
||
" (ff): FeedForward(\n",
|
||
" (layers): Sequential(\n",
|
||
" (0): Linear(in_features=768, out_features=3072, bias=True)\n",
|
||
" (1): GELU()\n",
|
||
" (2): Linear(in_features=3072, out_features=768, bias=True)\n",
|
||
" )\n",
|
||
" )\n",
|
||
" (norm1): LayerNorm()\n",
|
||
" (norm2): LayerNorm()\n",
|
||
" (drop_resid): Dropout(p=0.0, inplace=False)\n",
|
||
" )\n",
|
||
" (6): TransformerBlock(\n",
|
||
" (att): MultiHeadAttention(\n",
|
||
" (W_query): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (W_key): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (W_value): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (out_proj): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (dropout): Dropout(p=0.0, inplace=False)\n",
|
||
" )\n",
|
||
" (ff): FeedForward(\n",
|
||
" (layers): Sequential(\n",
|
||
" (0): Linear(in_features=768, out_features=3072, bias=True)\n",
|
||
" (1): GELU()\n",
|
||
" (2): Linear(in_features=3072, out_features=768, bias=True)\n",
|
||
" )\n",
|
||
" )\n",
|
||
" (norm1): LayerNorm()\n",
|
||
" (norm2): LayerNorm()\n",
|
||
" (drop_resid): Dropout(p=0.0, inplace=False)\n",
|
||
" )\n",
|
||
" (7): TransformerBlock(\n",
|
||
" (att): MultiHeadAttention(\n",
|
||
" (W_query): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (W_key): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (W_value): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (out_proj): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (dropout): Dropout(p=0.0, inplace=False)\n",
|
||
" )\n",
|
||
" (ff): FeedForward(\n",
|
||
" (layers): Sequential(\n",
|
||
" (0): Linear(in_features=768, out_features=3072, bias=True)\n",
|
||
" (1): GELU()\n",
|
||
" (2): Linear(in_features=3072, out_features=768, bias=True)\n",
|
||
" )\n",
|
||
" )\n",
|
||
" (norm1): LayerNorm()\n",
|
||
" (norm2): LayerNorm()\n",
|
||
" (drop_resid): Dropout(p=0.0, inplace=False)\n",
|
||
" )\n",
|
||
" (8): TransformerBlock(\n",
|
||
" (att): MultiHeadAttention(\n",
|
||
" (W_query): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (W_key): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (W_value): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (out_proj): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (dropout): Dropout(p=0.0, inplace=False)\n",
|
||
" )\n",
|
||
" (ff): FeedForward(\n",
|
||
" (layers): Sequential(\n",
|
||
" (0): Linear(in_features=768, out_features=3072, bias=True)\n",
|
||
" (1): GELU()\n",
|
||
" (2): Linear(in_features=3072, out_features=768, bias=True)\n",
|
||
" )\n",
|
||
" )\n",
|
||
" (norm1): LayerNorm()\n",
|
||
" (norm2): LayerNorm()\n",
|
||
" (drop_resid): Dropout(p=0.0, inplace=False)\n",
|
||
" )\n",
|
||
" (9): TransformerBlock(\n",
|
||
" (att): MultiHeadAttention(\n",
|
||
" (W_query): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (W_key): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (W_value): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (out_proj): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (dropout): Dropout(p=0.0, inplace=False)\n",
|
||
" )\n",
|
||
" (ff): FeedForward(\n",
|
||
" (layers): Sequential(\n",
|
||
" (0): Linear(in_features=768, out_features=3072, bias=True)\n",
|
||
" (1): GELU()\n",
|
||
" (2): Linear(in_features=3072, out_features=768, bias=True)\n",
|
||
" )\n",
|
||
" )\n",
|
||
" (norm1): LayerNorm()\n",
|
||
" (norm2): LayerNorm()\n",
|
||
" (drop_resid): Dropout(p=0.0, inplace=False)\n",
|
||
" )\n",
|
||
" (10): TransformerBlock(\n",
|
||
" (att): MultiHeadAttention(\n",
|
||
" (W_query): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (W_key): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (W_value): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (out_proj): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (dropout): Dropout(p=0.0, inplace=False)\n",
|
||
" )\n",
|
||
" (ff): FeedForward(\n",
|
||
" (layers): Sequential(\n",
|
||
" (0): Linear(in_features=768, out_features=3072, bias=True)\n",
|
||
" (1): GELU()\n",
|
||
" (2): Linear(in_features=3072, out_features=768, bias=True)\n",
|
||
" )\n",
|
||
" )\n",
|
||
" (norm1): LayerNorm()\n",
|
||
" (norm2): LayerNorm()\n",
|
||
" (drop_resid): Dropout(p=0.0, inplace=False)\n",
|
||
" )\n",
|
||
" (11): TransformerBlock(\n",
|
||
" (att): MultiHeadAttention(\n",
|
||
" (W_query): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (W_key): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (W_value): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (out_proj): Linear(in_features=768, out_features=768, bias=True)\n",
|
||
" (dropout): Dropout(p=0.0, inplace=False)\n",
|
||
" )\n",
|
||
" (ff): FeedForward(\n",
|
||
" (layers): Sequential(\n",
|
||
" (0): Linear(in_features=768, out_features=3072, bias=True)\n",
|
||
" (1): GELU()\n",
|
||
" (2): Linear(in_features=3072, out_features=768, bias=True)\n",
|
||
" )\n",
|
||
" )\n",
|
||
" (norm1): LayerNorm()\n",
|
||
" (norm2): LayerNorm()\n",
|
||
" (drop_resid): Dropout(p=0.0, inplace=False)\n",
|
||
" )\n",
|
||
" )\n",
|
||
" (final_norm): LayerNorm()\n",
|
||
" (out_head): Linear(in_features=768, out_features=50257, bias=False)\n",
|
||
")\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"print(model)"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "3f640a76-dd00-4769-9bc8-1aed0cec330d",
|
||
"metadata": {},
|
||
"source": [
|
||
"- Above, we can see the architecture we implemented in chapter 4 neatly laid out\n",
|
||
"- The goal is to replace and finetune the output layer\n",
|
||
"- To achieve this, we first freeze the model, meaning that we make all layers non-trainable"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 20,
|
||
"id": "fkMWFl-0etea",
|
||
"metadata": {
|
||
"id": "fkMWFl-0etea"
|
||
},
|
||
"outputs": [],
|
||
"source": [
|
||
"for param in model.parameters():\n",
|
||
" param.requires_grad = False"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "72155f83-87d9-476a-a978-a15aa2d44147",
|
||
"metadata": {},
|
||
"source": [
|
||
"- Then, we replace the output layer (`model.out_head`), which originally maps the layer inputs to 50,257 dimensions (the size of the vocabulary)\n",
|
||
"- Since we finetune the model for binary classification (predicting 2 classes, \"spam\" and \"not spam\"), we can replace the output layer as shown below, which will be trainable by default\n",
|
||
"- Note that we use `BASE_CONFIG[\"emb_dim\"]` (which is equal to 768 in the `\"gpt2-small (124M)\"` model) to keep the code below more general"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 21,
|
||
"id": "7e759fa0-0f69-41be-b576-17e5f20e04cb",
|
||
"metadata": {},
|
||
"outputs": [],
|
||
"source": [
|
||
"torch.manual_seed(123)\n",
|
||
"\n",
|
||
"num_classes = 2\n",
|
||
"model.out_head = torch.nn.Linear(in_features=BASE_CONFIG[\"emb_dim\"], out_features=num_classes)"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "30be5475-ae77-4f97-8f3e-dec462b1339f",
|
||
"metadata": {},
|
||
"source": [
|
||
"- Technically, it's sufficient to only train the output layer\n",
|
||
"- However, as I found in [experiments finetuning additional layers](https://magazine.sebastianraschka.com/p/finetuning-large-language-models) can noticeably improve the performance\n",
|
||
"- So, we are also making the last transformer block and the final `LayerNorm` module connecting the last transformer block to the output layer trainable"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "0be7c1eb-c46c-4065-8525-eea1b8c66d10",
|
||
"metadata": {},
|
||
"source": [
|
||
"<img src=\"https://sebastianraschka.com/images/LLMs-from-scratch-images/ch06_compressed/trainable.webp\" width=500px>"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 22,
|
||
"id": "2aedc120-5ee3-48f6-92f2-ad9304ebcdc7",
|
||
"metadata": {
|
||
"id": "2aedc120-5ee3-48f6-92f2-ad9304ebcdc7"
|
||
},
|
||
"outputs": [],
|
||
"source": [
|
||
"for param in model.trf_blocks[-1].parameters():\n",
|
||
" param.requires_grad = True\n",
|
||
"\n",
|
||
"for param in model.final_norm.parameters():\n",
|
||
" param.requires_grad = True"
|
||
]
|
||
},
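{
 "cell_type": "markdown",
 "metadata": {},
 "source": [
  "- As a quick, optional sanity check, we can count how many of the model's parameters are now trainable; only the new output head, the last transformer block, and the final `LayerNorm` should contribute"
 ]
},
{
 "cell_type": "code",
 "execution_count": null,
 "metadata": {},
 "outputs": [],
 "source": [
  "# Optional sanity check: compare the total vs. trainable parameter counts\n",
  "total_params = sum(p.numel() for p in model.parameters())\n",
  "trainable_params = sum(p.numel() for p in model.parameters() if p.requires_grad)\n",
  "print(f\"Total parameters: {total_params:,}\")\n",
  "print(f\"Trainable parameters: {trainable_params:,}\")"
 ]
},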
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "f012b899-8284-4d3a-97c0-8a48eb33ba2e",
|
||
"metadata": {},
|
||
"source": [
|
||
"- We can still use this model similar to before in previous chapters\n",
|
||
"- For example, let's feed it some text input"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 23,
|
||
"id": "f645c06a-7df6-451c-ad3f-eafb18224ebc",
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/"
|
||
},
|
||
"id": "f645c06a-7df6-451c-ad3f-eafb18224ebc",
|
||
"outputId": "27e041b1-d731-48a1-cf60-f22d4565304e"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"Inputs: tensor([[5211, 345, 423, 640]])\n",
|
||
"Inputs dimensions: torch.Size([1, 4])\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"inputs = tokenizer.encode(\"Do you have time\")\n",
|
||
"inputs = torch.tensor(inputs).unsqueeze(0)\n",
|
||
"print(\"Inputs:\", inputs)\n",
|
||
"print(\"Inputs dimensions:\", inputs.shape) # shape: (batch_size, num_tokens)"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "fbbf8481-772d-467b-851c-a62b86d0cb1b",
|
||
"metadata": {},
|
||
"source": [
|
||
"- What's different compared to previous chapters is that it now has two output dimensions instead of 50,257"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 24,
|
||
"id": "48dc84f1-85cc-4609-9cee-94ff539f00f4",
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/"
|
||
},
|
||
"id": "48dc84f1-85cc-4609-9cee-94ff539f00f4",
|
||
"outputId": "9cae7448-253d-4776-973e-0af190b06354"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"Outputs:\n",
|
||
" tensor([[[-1.5854, 0.9904],\n",
|
||
" [-3.7235, 7.4548],\n",
|
||
" [-2.2661, 6.6049],\n",
|
||
" [-3.5983, 3.9902]]])\n",
|
||
"Outputs dimensions: torch.Size([1, 4, 2])\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"with torch.no_grad():\n",
|
||
" outputs = model(inputs)\n",
|
||
"\n",
|
||
"print(\"Outputs:\\n\", outputs)\n",
|
||
"print(\"Outputs dimensions:\", outputs.shape) # shape: (batch_size, num_tokens, num_classes)"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "75430a01-ef9c-426a-aca0-664689c4f461",
|
||
"metadata": {},
|
||
"source": [
|
||
"- As discussed in previous chapters, for each input token, there's one output vector\n",
|
||
"- Since we fed the model a text sample with 4 input tokens, the output consists of 4 2-dimensional output vectors above"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "7df9144f-6817-4be4-8d4b-5d4dadfe4a9b",
|
||
"metadata": {},
|
||
"source": [
|
||
"<img src=\"https://sebastianraschka.com/images/LLMs-from-scratch-images/ch06_compressed/input-and-output.webp\" width=500px>"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "e3bb8616-c791-4f5c-bac0-5302f663e46a",
|
||
"metadata": {},
|
||
"source": [
|
||
"- In chapter 3, we discussed the attention mechanism, which connects each input token to each other input token\n",
|
||
"- In chapter 3, we then also introduced the causal attention mask that is used in GPT-like models; this causal mask lets a current token only attend to the current and previous token positions\n",
|
||
"- Based on this causal attention mechanism, the 4th (last) token contains the most information among all tokens because it's the only token that includes information about all other tokens\n",
|
||
"- Hence, we are particularly interested in this last token, which we will finetune for the spam classification task"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 25,
|
||
"id": "49383a8c-41d5-4dab-98f1-238bca0c2ed7",
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/"
|
||
},
|
||
"id": "49383a8c-41d5-4dab-98f1-238bca0c2ed7",
|
||
"outputId": "e79eb155-fa1f-46ed-ff8c-d828c3a3fabd"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"Last output token: tensor([[-3.5983, 3.9902]])\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"print(\"Last output token:\", outputs[:, -1, :])"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "8df08ae0-e664-4670-b7c5-8a2280d9b41b",
|
||
"metadata": {},
|
||
"source": [
|
||
"<img src=\"https://sebastianraschka.com/images/LLMs-from-scratch-images/ch06_compressed/attention-mask.webp\" width=200px>"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "32aa4aef-e1e9-491b-9adf-5aa973e59b8c",
|
||
"metadata": {},
|
||
"source": [
|
||
"## 6.6 Calculating the classification loss and accuracy"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "669e1fd1-ace8-44b4-b438-185ed0ba8b33",
|
||
"metadata": {},
|
||
"source": [
|
||
"<img src=\"https://sebastianraschka.com/images/LLMs-from-scratch-images/ch06_compressed/overview-3.webp?123\" width=500px>"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "7a7df4ee-0a34-4a4d-896d-affbbf81e0b3",
|
||
"metadata": {},
|
||
"source": [
|
||
"- Before explaining the loss calculation, let's have a brief look at how the model outputs are turned into class labels"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "557996dd-4c6b-49c4-ab83-f60ef7e1d69e",
|
||
"metadata": {},
|
||
"source": [
|
||
"<img src=\"https://sebastianraschka.com/images/LLMs-from-scratch-images/ch06_compressed/class-argmax.webp\" width=600px>"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 26,
|
||
"id": "c77faab1-3461-4118-866a-6171f2b89aa0",
|
||
"metadata": {},
|
||
"outputs": [
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"Last output token: tensor([[-3.5983, 3.9902]])\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"print(\"Last output token:\", outputs[:, -1, :])"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "7edd71fa-628a-4d00-b81d-6d8bcb2c341d",
|
||
"metadata": {},
|
||
"source": [
|
||
"- Similar to chapter 5, we convert the outputs (logits) into probability scores via the `softmax` function and then obtain the index position of the largest probability value via the `argmax` function"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 27,
|
||
"id": "b81efa92-9be1-4b9e-8790-ce1fc7b17f01",
|
||
"metadata": {},
|
||
"outputs": [
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"Class label: 1\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"probas = torch.softmax(outputs[:, -1, :], dim=-1)\n",
|
||
"label = torch.argmax(probas)\n",
|
||
"print(\"Class label:\", label.item())"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "414a6f02-307e-4147-a416-14d115bf8179",
|
||
"metadata": {},
|
||
"source": [
|
||
"- Note that the softmax function is optional here, as explained in chapter 5, because the largest outputs correspond to the largest probability scores"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 28,
|
||
"id": "f9f9ad66-4969-4501-8239-3ccdb37e71a2",
|
||
"metadata": {},
|
||
"outputs": [
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"Class label: 1\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"logits = outputs[:, -1, :]\n",
|
||
"label = torch.argmax(logits)\n",
|
||
"print(\"Class label:\", label.item())"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "dcb20d3a-cbba-4ab1-8584-d94e16589505",
|
||
"metadata": {},
|
||
"source": [
|
||
"- We can apply this concept to calculate the so-called classification accuracy, which computes the percentage of correct predictions in a given dataset\n",
|
||
"- To calculate the classification accuracy, we can apply the preceding `argmax`-based prediction code to all examples in a dataset and calculate the fraction of correct predictions as follows:"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 29,
|
||
"id": "3ecf9572-aed0-4a21-9c3b-7f9f2aec5f23",
|
||
"metadata": {},
|
||
"outputs": [],
|
||
"source": [
|
||
"def calc_accuracy_loader(data_loader, model, device, num_batches=None):\n",
|
||
" model.eval()\n",
|
||
" correct_predictions, num_examples = 0, 0\n",
|
||
"\n",
|
||
" if num_batches is None:\n",
|
||
" num_batches = len(data_loader)\n",
|
||
" else:\n",
|
||
" num_batches = min(num_batches, len(data_loader))\n",
|
||
" for i, (input_batch, target_batch) in enumerate(data_loader):\n",
|
||
" if i < num_batches:\n",
|
||
" input_batch, target_batch = input_batch.to(device), target_batch.to(device)\n",
|
||
"\n",
|
||
" with torch.no_grad():\n",
|
||
" logits = model(input_batch)[:, -1, :] # Logits of last output token\n",
|
||
" predicted_labels = torch.argmax(logits, dim=-1)\n",
|
||
"\n",
|
||
" num_examples += predicted_labels.shape[0]\n",
|
||
" correct_predictions += (predicted_labels == target_batch).sum().item()\n",
|
||
" else:\n",
|
||
" break\n",
|
||
" return correct_predictions / num_examples"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "7165fe46-a284-410b-957f-7524877d1a1a",
|
||
"metadata": {},
|
||
"source": [
|
||
"- Let's apply the function to calculate the classification accuracies for the different datasets:"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 30,
|
||
"id": "390e5255-8427-488c-adef-e1c10ab4fb26",
|
||
"metadata": {},
|
||
"outputs": [
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"Training accuracy: 46.25%\n",
|
||
"Validation accuracy: 45.00%\n",
|
||
"Test accuracy: 48.75%\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"device = torch.device(\"cuda\" if torch.cuda.is_available() else \"cpu\")\n",
|
||
"model.to(device) # no assignment model = model.to(device) necessary for nn.Module classes\n",
|
||
"\n",
|
||
"torch.manual_seed(123) # For reproducibility due to the shuffling in the training data loader\n",
|
||
"\n",
|
||
"train_accuracy = calc_accuracy_loader(train_loader, model, device, num_batches=10)\n",
|
||
"val_accuracy = calc_accuracy_loader(val_loader, model, device, num_batches=10)\n",
|
||
"test_accuracy = calc_accuracy_loader(test_loader, model, device, num_batches=10)\n",
|
||
"\n",
|
||
"print(f\"Training accuracy: {train_accuracy*100:.2f}%\")\n",
|
||
"print(f\"Validation accuracy: {val_accuracy*100:.2f}%\")\n",
|
||
"print(f\"Test accuracy: {test_accuracy*100:.2f}%\")"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"id": "30345e2a-afed-4d22-9486-f4010f90a871",
|
||
"metadata": {},
|
||
"source": [
|
||
"- As we can see, the prediction accuracies are not very good, since we haven't finetuned the model, yet"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
"id": "4f4a9d15-8fc7-48a2-8734-d92a2f265328",
"metadata": {},
"source": [
"- Before we can start finetuning (/training), we first have to define the loss function we want to optimize during training\n",
"- The goal is to maximize the spam classification accuracy of the model; however, classification accuracy is not a differentiable function\n",
"- Hence, instead, we minimize the cross entropy loss as a proxy for maximizing the classification accuracy (you can learn more about this topic in lecture 8 of my freely available [Introduction to Deep Learning](https://sebastianraschka.com/blog/2021/dl-course.html#l08-multinomial-logistic-regression--softmax-regression) class)\n",
"\n",
"- The `calc_loss_batch` function is the same here as in chapter 5, except that we are only interested in optimizing the last token `model(input_batch)[:, -1, :]` instead of all tokens `model(input_batch)`"
]
},
{
"cell_type": "code",
"execution_count": 31,
"id": "2f1e9547-806c-41a9-8aba-3b2822baabe4",
"metadata": {
"id": "2f1e9547-806c-41a9-8aba-3b2822baabe4"
},
"outputs": [],
"source": [
"def calc_loss_batch(input_batch, target_batch, model, device):\n",
" input_batch, target_batch = input_batch.to(device), target_batch.to(device)\n",
" logits = model(input_batch)[:, -1, :] # Logits of last output token\n",
" loss = torch.nn.functional.cross_entropy(logits, target_batch)\n",
" return loss"
]
},
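{
"cell_type": "markdown",
"id": "added-cross-entropy-note",
"metadata": {},
"source": [
"- To illustrate why the cross entropy loss works as a proxy, the short example below uses made-up logits for a single example: the loss equals the negative log-probability that the softmax assigns to the correct class, so lowering the loss directly raises that probability:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "added-cross-entropy-code",
"metadata": {},
"outputs": [],
"source": [
"example_logits = torch.tensor([[2.0, -1.0]]) # Made-up logits for one example with 2 classes\n",
"example_target = torch.tensor([0]) # Index of the correct class\n",
"\n",
"loss = torch.nn.functional.cross_entropy(example_logits, example_target)\n",
"log_prob_correct = torch.log_softmax(example_logits, dim=-1)[0, example_target[0]]\n",
"print(loss.item(), (-log_prob_correct).item()) # Both values are identical"
]
},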
{
"cell_type": "markdown",
"id": "a013aab9-f854-4866-ad55-5b8350adb50a",
"metadata": {},
"source": [
"The `calc_loss_loader` is exactly the same as in chapter 5"
]
},
{
"cell_type": "code",
"execution_count": 32,
"id": "b7b83e10-5720-45e7-ac5e-369417ca846b",
"metadata": {},
"outputs": [],
"source": [
"# Same as in chapter 5\n",
"def calc_loss_loader(data_loader, model, device, num_batches=None):\n",
" total_loss = 0.\n",
" if len(data_loader) == 0:\n",
" return float(\"nan\")\n",
" elif num_batches is None:\n",
" num_batches = len(data_loader)\n",
" else:\n",
" # Reduce the number of batches to match the total number of batches in the data loader\n",
" # if num_batches exceeds the number of batches in the data loader\n",
" num_batches = min(num_batches, len(data_loader))\n",
" for i, (input_batch, target_batch) in enumerate(data_loader):\n",
" if i < num_batches:\n",
" loss = calc_loss_batch(input_batch, target_batch, model, device)\n",
" total_loss += loss.item()\n",
" else:\n",
" break\n",
" return total_loss / num_batches"
]
},
{
"cell_type": "markdown",
"id": "56826ecd-6e74-40e6-b772-d3541e585067",
"metadata": {},
"source": [
"- Using the `calc_closs_loader`, we compute the initial training, validation, and test set losses before we start training"
]
},
{
"cell_type": "code",
"execution_count": 33,
"id": "f6f00e53-5beb-4e64-b147-f26fd481c6ff",
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "f6f00e53-5beb-4e64-b147-f26fd481c6ff",
"outputId": "49df8648-9e38-4314-854d-9faacd1b2e89"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Training loss: 2.453\n",
"Validation loss: 2.583\n",
"Test loss: 2.322\n"
]
}
],
"source": [
"with torch.no_grad(): # Disable gradient tracking for efficiency because we are not training, yet\n",
" train_loss = calc_loss_loader(train_loader, model, device, num_batches=5)\n",
" val_loss = calc_loss_loader(val_loader, model, device, num_batches=5)\n",
" test_loss = calc_loss_loader(test_loader, model, device, num_batches=5)\n",
"\n",
"print(f\"Training loss: {train_loss:.3f}\")\n",
"print(f\"Validation loss: {val_loss:.3f}\")\n",
"print(f\"Test loss: {test_loss:.3f}\")"
]
},
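{
"cell_type": "markdown",
"id": "added-uniform-loss-note",
"metadata": {},
"source": [
"- For reference, with two classes, a model that assigned a 50% probability to each class would have a cross entropy loss of about 0.693 (computed below); the initial losses above are well above that value, which is expected because the newly added output head is still randomly initialized"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "added-uniform-loss-code",
"metadata": {},
"outputs": [],
"source": [
"import math\n",
"\n",
"# Cross entropy loss of a classifier that assigns equal probability to both classes\n",
"print(f\"Loss for uniform (50/50) predictions: {-math.log(0.5):.3f}\")"
]
},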
{
"cell_type": "markdown",
"id": "e04b980b-e583-4f62-84a0-4edafaf99d5d",
"metadata": {},
"source": [
"- In the next section, we train the model to improve the loss values and consequently the classification accuracy"
]
},
{
"cell_type": "markdown",
"id": "456ae0fd-6261-42b4-ab6a-d24289953083",
"metadata": {
"id": "456ae0fd-6261-42b4-ab6a-d24289953083"
},
"source": [
"## 6.7 Finetuning the model on supervised data"
]
},
{
"cell_type": "markdown",
"id": "6a9b099b-0829-4f72-8a2b-4363e3497026",
"metadata": {},
"source": [
"- In this section, we define and use the training function to improve the classification accuracy of the model\n",
"- The `train_classifier_simple` function below is practically the same as the `train_model_simple` function we used for pretraining the model in chapter 5\n",
"- The only two differences are that we now \n",
" 1. track the number of training examples seen (`examples_seen`) instead of the number of tokens seen\n",
" 2. calculate the accuracy after each epoch instead of printing a sample text after each epoch"
]
},
{
"cell_type": "markdown",
"id": "979b6222-1dc2-4530-9d01-b6b04fe3de12",
"metadata": {},
"source": [
"<img src=\"https://sebastianraschka.com/images/LLMs-from-scratch-images/ch06_compressed/training-loop.webp\" width=500px>"
]
},
{
"cell_type": "code",
"execution_count": 31,
"id": "Csbr60to50FL",
"metadata": {
"id": "Csbr60to50FL"
},
"outputs": [],
"source": [
"# Overall the same as `train_model_simple` in chapter 5\n",
"def train_classifier_simple(model, train_loader, val_loader, optimizer, device, num_epochs,\n",
" eval_freq, eval_iter, tokenizer):\n",
" # Initialize lists to track losses and tokens seen\n",
" train_losses, val_losses, train_accs, val_accs = [], [], [], []\n",
" examples_seen, global_step = 0, -1\n",
"\n",
" # Main training loop\n",
" for epoch in range(num_epochs):\n",
" model.train() # Set model to training mode\n",
"\n",
" for input_batch, target_batch in train_loader:\n",
" optimizer.zero_grad() # Reset loss gradients from previous epoch\n",
" loss = calc_loss_batch(input_batch, target_batch, model, device)\n",
" loss.backward() # Calculate loss gradients\n",
" optimizer.step() # Update model weights using loss gradients\n",
" examples_seen += input_batch.shape[0] # New: track examples instead of tokens\n",
" global_step += 1\n",
"\n",
" # Optional evaluation step\n",
" if global_step % eval_freq == 0:\n",
" train_loss, val_loss = evaluate_model(\n",
" model, train_loader, val_loader, device, eval_iter)\n",
" train_losses.append(train_loss)\n",
" val_losses.append(val_loss)\n",
" print(f\"Ep {epoch+1} (Step {global_step:06d}): \"\n",
" f\"Train loss {train_loss:.3f}, Val loss {val_loss:.3f}\")\n",
"\n",
" # Calculate accuracy after each epoch\n",
" train_accuracy = calc_accuracy_loader(train_loader, model, device, num_batches=eval_iter)\n",
" val_accuracy = calc_accuracy_loader(val_loader, model, device, num_batches=eval_iter)\n",
" print(f\"Training accuracy: {train_accuracy*100:.2f}% | \", end=\"\")\n",
" print(f\"Validation accuracy: {val_accuracy*100:.2f}%\")\n",
" train_accs.append(train_accuracy)\n",
" val_accs.append(val_accuracy)\n",
"\n",
" return train_losses, val_losses, train_accs, val_accs, examples_seen"
]
},
{
"cell_type": "markdown",
"id": "9624cb30-3e3a-45be-b006-c00475b58ae8",
"metadata": {},
"source": [
"- The `evaluate_model` function used in the `train_classifier_simple` is the same as the one we used in chapter 5"
]
},
{
"cell_type": "code",
"execution_count": 32,
"id": "bcc7bc04-6aa6-4516-a147-460e2f466eab",
"metadata": {},
"outputs": [],
"source": [
"# Same as chapter 5\n",
"def evaluate_model(model, train_loader, val_loader, device, eval_iter):\n",
" model.eval()\n",
" with torch.no_grad():\n",
" train_loss = calc_loss_loader(train_loader, model, device, num_batches=eval_iter)\n",
" val_loss = calc_loss_loader(val_loader, model, device, num_batches=eval_iter)\n",
" model.train()\n",
" return train_loss, val_loss"
]
},
{
"cell_type": "markdown",
"id": "e807bfe9-364d-46b2-9e25-3b000c3ef6f9",
"metadata": {},
"source": [
"- The training takes about 5 minutes on a M3 MacBook Air laptop computer and less than half a minute on a V100 or A100 GPU"
]
},
{
"cell_type": "code",
"execution_count": 33,
"id": "X7kU3aAj7vTJ",
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "X7kU3aAj7vTJ",
"outputId": "504a033e-2bf8-41b5-a037-468309845513"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Ep 1 (Step 000000): Train loss 2.153, Val loss 2.392\n",
"Ep 1 (Step 000050): Train loss 0.617, Val loss 0.637\n",
"Ep 1 (Step 000100): Train loss 0.523, Val loss 0.557\n",
"Training accuracy: 70.00% | Validation accuracy: 72.50%\n",
"Ep 2 (Step 000150): Train loss 0.561, Val loss 0.489\n",
"Ep 2 (Step 000200): Train loss 0.419, Val loss 0.397\n",
"Ep 2 (Step 000250): Train loss 0.409, Val loss 0.353\n",
"Training accuracy: 82.50% | Validation accuracy: 85.00%\n",
"Ep 3 (Step 000300): Train loss 0.333, Val loss 0.320\n",
"Ep 3 (Step 000350): Train loss 0.340, Val loss 0.306\n",
"Training accuracy: 90.00% | Validation accuracy: 90.00%\n",
"Ep 4 (Step 000400): Train loss 0.136, Val loss 0.200\n",
"Ep 4 (Step 000450): Train loss 0.153, Val loss 0.132\n",
"Ep 4 (Step 000500): Train loss 0.222, Val loss 0.137\n",
"Training accuracy: 100.00% | Validation accuracy: 97.50%\n",
"Ep 5 (Step 000550): Train loss 0.207, Val loss 0.143\n",
"Ep 5 (Step 000600): Train loss 0.083, Val loss 0.074\n",
"Training accuracy: 100.00% | Validation accuracy: 97.50%\n",
"Training completed in 5.65 minutes.\n"
]
}
],
"source": [
"import time\n",
"\n",
"start_time = time.time()\n",
"\n",
"torch.manual_seed(123)\n",
"\n",
"optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5, weight_decay=0.1)\n",
"\n",
"num_epochs = 5\n",
"train_losses, val_losses, train_accs, val_accs, examples_seen = train_classifier_simple(\n",
" model, train_loader, val_loader, optimizer, device,\n",
" num_epochs=num_epochs, eval_freq=50, eval_iter=5,\n",
" tokenizer=tokenizer\n",
")\n",
"\n",
"end_time = time.time()\n",
"execution_time_minutes = (end_time - start_time) / 60\n",
"print(f\"Training completed in {execution_time_minutes:.2f} minutes.\")"
]
},
{
"cell_type": "markdown",
"id": "1261bf90-3ce7-4591-895a-044a05538f30",
"metadata": {},
"source": [
"- Similar to chapter 5, we use matplotlib to plot the loss function for the training and validation set"
]
},
{
"cell_type": "code",
"execution_count": 34,
"id": "cURgnDqdCeka",
"metadata": {
"id": "cURgnDqdCeka"
},
"outputs": [],
"source": [
"import matplotlib.pyplot as plt\n",
"\n",
"def plot_values(epochs_seen, examples_seen, train_values, val_values, label=\"loss\"):\n",
" fig, ax1 = plt.subplots(figsize=(5, 3))\n",
"\n",
" # Plot training and validation loss against epochs\n",
" ax1.plot(epochs_seen, train_values, label=f\"Training {label}\")\n",
" ax1.plot(epochs_seen, val_values, linestyle=\"-.\", label=f\"Validation {label}\")\n",
" ax1.set_xlabel(\"Epochs\")\n",
" ax1.set_ylabel(label.capitalize())\n",
" ax1.legend()\n",
"\n",
" # Create a second x-axis for tokens seen\n",
" ax2 = ax1.twiny() # Create a second x-axis that shares the same y-axis\n",
" ax2.plot(examples_seen, train_values, alpha=0) # Invisible plot for aligning ticks\n",
" ax2.set_xlabel(\"Examples seen\")\n",
"\n",
" fig.tight_layout() # Adjust layout to make room\n",
" plt.savefig(f\"{label}-plot.pdf\")\n",
" plt.show()"
]
},
{
"cell_type": "code",
"execution_count": 35,
"id": "OIqRt466DiGk",
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
"height": 307
},
"id": "OIqRt466DiGk",
"outputId": "b16987cf-0001-4652-ddaf-02f7cffc34db"
},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAeoAAAEiCAYAAAA21pHjAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8pXeV/AAAACXBIWXMAAA9hAAAPYQGoP6dpAABXi0lEQVR4nO3deVxU9f748dfMwAz7viOCyuIK7uZOSamVZatfr7e0LG+FlZkt3krNfkWL3awsK7vJrVtZWVq3XELc9xUFF9wBlc2FVRhg5vz+GBidxAUEZsD38/E4D+Z8zuec855P5JvzOZ9zPipFURSEEEIIYZPU1g5ACCGEEJcniVoIIYSwYZKohRBCCBsmiVoIIYSwYZKohRBCCBsmiVoIIYSwYZKohRBCCBsmiVoIIYSwYZKohRBCCBsmiVoIcU1iY2OZNGmStcMQ4oYjiVqIJjJu3DhUKtUly7Bhw6wdmhDChtlZOwAhbiTDhg1j/vz5FmU6nc5K0QghmgO5ohaiCel0OgICAiwWT09PAFavXo1Wq2XdunXm+u+++y5+fn7k5uYCsGzZMgYMGICHhwfe3t7ceeedHDlyxFz/+PHjqFQqfvzxRwYOHIijoyO9evXi4MGDbNu2jZ49e+Li4sLw4cPJz8837zdu3DhGjhzJ66+/jq+vL25ubjzxxBNUVFRc9rvo9XqmTJlCcHAwzs7O9OnTh9WrV5u3Z2RkMGLECDw9PXF2dqZTp04sWbLkssf79NNPiYiIwMHBAX9/f+6//37zNqPRSEJCAm3atMHR0ZGYmBgWLlxosX9aWhrDhw/HxcUFf39/HnroIU6fPm3eHhsbyzPPPMOLL76Il5cXAQEBzJgx47LxCGErJFELYSNq7gE/9NBDFBYWsmvXLl577TW+/PJL/P39ASgtLWXy5Mls376d5ORk1Go199xzD0aj0eJY06dP59VXX2Xnzp3Y2dnxt7/9jRdffJEPP/yQdevWcfjwYaZNm2axT3JyMvv372f16tV8//33/PLLL7z++uuXjXfixIls2rSJBQsWsGfPHh544AGGDRvGoUOHAIiPj0ev17N27VpSU1N55513cHFxqfVY27dv55lnnmHmzJmkp6ezbNkyBg0aZN6ekJDA119/zWeffcbevXt57rnn+Pvf/86aNWsAKCgo4JZbbqFbt25s376dZcuWkZuby4MPPmhxnv/85z84OzuzZcsW3n33XWbOnElSUtI1/hcSwkoUIUSTGDt2rKLRaBRnZ2eL5c033zTX0ev1SteuXZUHH3xQ6dixo/L4449f8Zj5+fkKoKSmpiqKoijHjh1TAOXLL7801/n+++8VQElOTjaXJSQkKFFRURaxeXl5KaWlpeayuXPnKi4uLorBYFAURVEGDx6sPPvss4qiKEpGRoai0WiUkydPWsQzZMgQZerUqYqiKEqXLl2UGTNmXFPb/Pzzz4qbm5tSVFR0ybby8nLFyclJ2bhxo0X5+PHjldGjRyuKoihvvPGGctttt1lsz8rKUgAlPT3dHP+AAQMs6vTq1Ut56aWXrilGIaxF7lEL0YRuvvlm5s6da1Hm5eVl/qzVavn222+Jjo4mNDSUDz74wKLuoUOHmDZtGlu2bOH06dPmK+nMzEw6d+5srhcdHW3+XHM13qVLF4uyvLw8i2PHxMTg5ORkXu/bty8lJSVkZWURGhpqUTc1NRWDwUBkZKRFuV6vx9vbG4BnnnmGJ598kj///JO4uDjuu+8+i7guduuttxIaGkrbtm0ZNmwYw4YN45577sHJyYnDhw9z/vx5br31Vot9Kioq6NatGwC7d+9m1apVtV6xHzlyxBznX88fGBh4STsIYWskUQvRhJydnQkPD79inY0bNwJw9uxZzp49i7Ozs3nbiBEjCA0NZd68eQQFBWE0GuncufMl95Lt7e3Nn1UqVa1lf+0ur4uSkhI0Gg07duxAo9FYbKtJlo899hhDhw7ljz/+4M8//yQhIYH333+fp59++pLjubq6snPnTlavXs2ff/7JtGnTmDFjBtu2baOkpASAP/74g+DgYIv9agbilZSUMGLECN55551Ljh0YGGj+fHEbwPW3gxBNQRK1EDbkyJEjPPfcc8ybN48ffviBsWPHsmLFCtRqNWfOnCE9PZ158+YxcOBAANavX99g5969ezdlZWU4OjoCsHnzZlxcXAgJCbmkbrdu3TAYDOTl5ZljqU1ISAhPPPEETzzxBFOnTmXevHm1JmoAOzs74uLiiIuLY/r06Xh4eLBy5UpuvfVWdDodmZmZDB48uNZ9u3fvzs8//0xYWBh2dvLPmmhZ5DdaiCak1+vJycmxKLOzs8PHxweDwcDf//53hg4dyiOPPMKwYcPo0qUL77//Pi+88AKenp54e3vzxRdfEBgYSGZmJi+//HKDxVZRUcH48eN59dVXOX78ONOnT2fixImo1ZeOOY2MjGTMmDE8/PDDvP/++3Tr1o38/HySk5OJjo7mjjvuYNKkSQwfPpzIyEjOnTvHqlWr6NChQ63n/v333zl69CiDBg3C09OTJUuWYDQaiYqKwtXVlSlTpvDcc89hNBoZMGAAhYWFbNiwATc3N8aOHUt8fDzz5s1j9OjR5lHdhw8fZsGCBXz55ZeXXPUL0ZxIohaiCS1btsyiKxYgKiqKAwcO8Oabb5KRkcHvv/8OmLpsv/jiC0aPHs1tt91GTEwMCxYs4JlnnqFz585ERUXx0UcfERsb2yCxDRkyhIiICAYNGoRer2f06NFXfHxp/vz5/L//9/94/vnnOXnyJD4+Ptx0003ceeedABgMBuLj4zlx4gRubm4MGzbsknvuNTw8PPjll1+YMWMG5eXlRERE8P3339OpUycA3njjDXx9fUlISODo0aN4eHjQvXt3/vnPfwIQFBTEhg0beOmll7jtttvQ6/WEhoYybNiwWv/QEKI5USmKolg7CCGEdY0bN46CggIWL15s7VCEEH8hf2oKIYQQNkwStRBCCGHDpOtbCCGEsGFyRS2EEELYMEnUQgghhA2TRC2EEELYMEnU1+GTTz4hLCwMBwcH+vTpw9atW60dUqNZu3YtI0aMICgoCJVKdcljPIqiMG3aNAIDA3F0dCQuLs48i1KNs2fPMmbMGNzc3PDw8GD8+PHm10PW2LNnDwMHDsTBwYGQkBDefffdxv5qDSIhIYFevXrh6uqKn58fI0eOJD093aJOeXk58fHxeHt74+Liwn333WeevrJGZmYmd9xxB05OTvj5+fHCCy9QVVVlUWf16tV0794dnU5HeHg4iYmJjf31GsTcuXOJjo7Gzc0NNzc3+vbty9KlS83bb/T2qc3bb7+NSqVi0qRJ5jJpJ5gxYwYqlcpiad++vXl7i2sjq04J0owtWLBA0Wq1yldffaXs3btXefzxxxUPDw8lNzfX2qE1iiVLliivvPKK8ssvvyiAsmjRIovtb7/9tuLu7q4sXrxY2b17t3LXXXcpbdq0UcrKysx1hg0bpsTExCibN29W1q1bp4SHh5tnP1IURSksLFT8/f2VMWPGKGlpacr33
3+vODo6Kp9//nlTfc16Gzp0qDJ//nwlLS1NSUlJUW6//XaldevWSklJibnOE088oYSEhCjJycnK9u3blZtuuknp16+feXtVVZXSuXNnJS4uTtm1a5eyZMkSxcfHxzwblaIoytGjRxUnJydl8uTJyr59+5SPP/5Y0Wg0yrJly5r0+9bHb7/9pvzxxx/KwYMHlfT0dOWf//ynYm9vr6SlpSmKIu3zV1u3blXCwsKU6Oho86xliiLtpCiKMn36dKVTp05Kdna2ecnPzzdvb2ltJIm6nnr37q3Ex8eb1w0GgxIUFKQkJCRYMaqm8ddEbTQalYCAAOW9994zlxUUFCg6nU75/vvvFUVRlH379imAsm3bNnOdpUuXKiqVyjxV4qeffqp4enoqer3eXOell16ymI6xucjLy1MAZc2aNYqimNrD3t5e+emnn8x19u/frwDKpk2bFEUx/TGkVquVnJwcc525c+cqbm5u5jZ58cUXlU6dOlmca9SoUcrQoUMb+ys1Ck9PT+XLL7+U9vmL4uJiJSIiQklKSrKYXlTayWT69OlKTExMrdtaYhtJ13c9VFRUsGPHDuLi4sxlarWauLg4Nm3aZMXIrOPYsWPk5ORYtIe7uzt9+vQxt8emTZvw8PCgZ8+e5jpxcXGo1Wq2bNlirjNo0CC0Wq25ztChQ0lPT+fcuXNN9G0aRmFhIXBhCssdO3ZQWVlp0Ubt27endevWFm3UpUsX87SUYPr+RUVF7N2711zn4mPU1Gluv3cGg4EFCxZQWlpK3759pX3+Ij4+njvuuOOS7yLtdMGhQ4cICgqibdu2jBkzhszMTKBltpEk6no4ffo0BoPB4j8ymOb4/euECzeCmu98pfbIycnBz8/PYrudnR1eXl4WdWo7xsXnaA6MRiOTJk2if//+5jmic3Jy0Gq1eHh4WNT9axtd7ftfrk5RURFlZWWN8XUaVGpqKi4uLuh0Op544gkWLVpEx44dpX0usmDBAnbu3ElCQsIl26SdTPr06UNiYiLLli1j7ty5HDt2jIEDB1JcXNwi20gm5RCigcXHx5OWltagU1C2FFFRUaSkpFBYWMjChQsZO3Ysa9assXZYNiMrK4tnn32WpKQkHBwcrB2OzRo+fLj5c3R0NH369CE0NJQff/zRPE1rSyJX1PXg4+ODRqO5ZBRhbm4uAQEBVorKemq+85XaIyAggLy8PIvtVVVVnD171qJObce4+By2buLEifz++++sWrWKVq1amcsDAgKoqKigoKDAov5f2+hq3/9yddzc3JrFP1BarZbw8HB69OhBQkICMTExfPjhh9I+1Xbs2EFeXh7du3fHzs4OOzs71qxZw0cffYSdnR3+/v7STrXw8PAgMjKSw4cPt8jfJUnU9aDVaunRowfJycnmMqPRSHJyMn379rViZNbRpk0bAgICLNqjqKiILVu2mNujb9++FBQUsGPHDnOdlStXYjQa6dOnj7nO2rVrqaysNNdJSkoiKioKT0/PJvo29aMoChMnTmTRokWsXLmSNm3aWGzv0aMH9vb2Fm2Unp5OZmamRRulpqZa/EGTlJSEm5sbHTt2NNe5+Bg1dZrr753RaESv10v7VBsyZAipqamkpKSYl549ezJmzBjzZ2mnS5WUlHDkyBECAwNb5u9Skw9fayEWLFig6HQ6JTExUdm3b58yYcIExcPDw2IUYUtSXFys7Nq1S9m1a5cCKP/617+UXbt2KRkZGYqimB7P8vDwUH799Vdlz549yt13313r41ndunVTtmzZoqxfv16JiIiweDyroKBA8ff3Vx566CElLS1NWbBggeLk5NQsHs968sknFXd3d2X16tUWj4ycP3/eXOeJJ55QWrduraxcuVLZvn270rdvX6Vv377m7TWPjNx2221KSkqKsmzZMsXX17fWR0ZeeOEFZf/+/conn3zSbB6refnll5U1a9Yox44dU/bs2aO8/PLLikqlUv78809FUaR9LufiUd+KIu2kKIry/PPPK6tXr1aOHTumbNiwQYmLi1N8fHyUvLw8RVFaXhtJor4OH3/8sdK6dWtFq9UqvXv3VjZv3mztkBrNqlWrFOCSZezYsYqimB7Reu211xR/f39Fp9MpQ4YMUdLT0y2OcebMGWX06NGKi4uL4ubmpjzyyCNKcXGxRZ3du3crAwYMUHQ6nRIcHKy8/fbbTfUVr0ttbQMo8+fPN9cpKytTnnrqKcXT01NxcnJS7rnnHiU7O9viOMePH1eGDx+uODo6Kj4+Psrzzz+vVFZWWtRZtWqV0rVrV0Wr1Spt27a1OIcte/TRR5XQ0FBFq9Uqvr6+ypAhQ8xJWlGkfS7nr4la2sn0mFRgYKCi1WqV4OBgZdSoUcrhw4fN21taG8nsWUIIIYQNk3vUQgghhA2TRC2EEELYMEnUQgghhA2TRC2EEELYMEnUQgghhA2TRC2EEELYMEnU10Gv1zNjxgz0er21Q7Fp0k5XJ210ddJGVydtdHXNsY2s+hx1QkICv/zyCwcOHMDR0ZF+/frxzjvvEBUVddl9EhMTeeSRRyzKdDod5eXljR3uJYqKinB3d6ewsBA3N7cmP39zIe10ddJGVydtdHXSRlfXHNvIqlfUa9asIT4+ns2bN5OUlERlZSW33XYbpaWlV9zPzc2N7Oxs85KRkdFEEQshhBBNy6rTXC5btsxiPTExET8/P3bs2MGgQYMuu59KpWo2sykJIYQQ18Om5qMuLCwEwMvL64r1SkpKCA0NxWg00r17d9566y06dep0Teeoqqpi165d+Pv7o1ZfX4dCcXExACdPnqSoqOi6jtWSSTtdnbTR1UkbXZ200dXZShsZjUZyc3Pp1q0bdnZXTsU2865vo9HIXXfdRUFBAevXr79svU2bNnHo0CGio6MpLCxk1qxZrF27lr1791rM/1tDr9dbDBrYsWMHt9xyS6N8ByGEEKIutm7dSq9eva5Yx2YS9ZNPPsnSpUtZv359rQn3ciorK+nQoQOjR4/mjTfeuGT7jBkzeP311y8p37p1K4GBgdcVsxBCCFEf2dnZ9O7dm4yMDFq3bn3FujaRqCdOnMivv/7K2rVradOmTZ33f+CBB7Czs+P777+/ZNtfr6hPnjxJx44dycrKqtMfBEIIIURDOXHiBCEhIdeUi6w66ltRFCZOnMiiRYtYuXJlvZK0wWAgNTX1slfHOp0ONzc38+Lq6nq9YQshhBBNxqqDyeLj4/nuu+/49ddfcXV1JScnBwB3d3ccHR0BePjhhwkODiYhIQGAmTNnctNNNxEeHk5BQQHvvfceGRkZPPbYY1b7HkIIIURjsWqinjt3LgCxsbEW5fPnz2fcuHEAZGZmWozOPnfuHI8//jg5OTl4enrSo0cPNm7cSMeOHZsqbCGEEKLJ2MQ96qZUl/sCQogbj8FgoLKy0tphiGbO3t4ejUZz2e11yUU29Ry1EEJYi6Io5OTkUFBQYO1QRAvh4eFBQEAAKpXquo4jifp6lBVA5mZwbwUBna0djRDiOtQkaT8/P5ycnK77H1dx41IUhfPnz5OXlwdw3Y8CS6K+Hiv/H2yb
B32egOHvWDsaIUQ9GQwGc5L29va2djiiBagZEJ2Xl4efn98Vu8GvRqa5vB5h/U0/j2+wbhxCiOtSc0/aycnJypGIlqTm9+l6xzxIor4eodWJOjcNzp+1bixCiOsm3d2iITXU75Mk6uvh4gc+kYACmZusHY0QQogWSBL19QobYPop3d9CiBYiLCyM2bNnX3P91atXo1KpGn3EfGJiIh4eHo16Dlskifp61XR/H19n3TiEEDcclUp1xWXGjBn1Ou62bduYMGHCNdfv168f2dnZuLu71+t84spk1Pf1qrmizkk1Pa7l6GHNaIQQN5Ds7Gzz5x9++IFp06aRnp5uLnNxcTF/VhQFg8Fw1bmPAXx9fesUh1arJSAgoE77iGsnV9TXyzUAvMMx3afebO1ohBA3kICAAPPi7u6OSqUyrx84cABXV1eWLl1Kjx490Ol0rF+/niNHjnD33Xfj7++Pi4sLvXr1YsWKFRbH/WvXt0ql4ssvv+See+7BycmJiIgIfvvtN/P2v3Z913RRL1++nA4dOuDi4sKwYcMs/rCoqqrimWeewcPDA29vb1566SXGjh3LyJEj69QGc+fOpV27dmi1WqKiovjmm2/M2xRFYcaMGbRu3RqdTkdQUBDPPPOMefunn35KREQEDg4O+Pv7c//999fp3E1FEnVDkO5vIVocRVE4X1FllaUh3+z88ssv8/bbb7N//36io6MpKSnh9ttvJzk5mV27djFs2DBGjBhBZmbmFY/z+uuv8+CDD7Jnzx5uv/12xowZw9mzl3/a5fz588yaNYtvvvmGtWvXkpmZyZQpU8zb33nnHb799lvmz5/Phg0bKCoqYvHixXX6bosWLeLZZ5/l+eefJy0tjX/84x888sgjrFq1CoCff/6ZDz74gM8//5xDhw6xePFiunTpAsD27dt55plnmDlzJunp6SxbtoxBgwbV6fxNRbq+G0LYANj5H8iQAWVCtBRllQY6TltulXPvmzkUJ23D/PM8c+ZMbr31VvO6l5cXMTEx5vU33niDRYsW8dtvvzFx4sTLHmfcuHGMHj0agLfeeouPPvqIrVu3MmzYsFrrV1ZW8tlnn9GuXTsAJk6cyMyZM83bP/74Y6ZOnco999wDwJw5c1iyZEmdvtusWbMYN24cTz31FACTJ09m8+bNzJo1i5tvvpnMzEwCAgKIi4vD3t6e1q1b07t3b8A04ZOzszN33nknrq6uhIaG0q1btzqdv6nIFXVDqLmizt4N5YXWjUUIIS7Ss2dPi/WSkhKmTJlChw4d8PDwwMXFhf3791/1ijo6Otr82dnZGTc3N/MrMmvj5ORkTtJgeo1mTf3CwkJyc3PNSRNAo9HQo0ePOn23/fv3079/f4uy/v37s3//fgAeeOABysrKaNu2LY8//jiLFi2iqqoKgFtvvZXQ0FDatm3LQw89xLfffsv58+frdP6mIlfUDcE9GDzbwLljkLkFIm+zdkRCiOvkaK9h38yhVjt3Q3F2drZYnzJlCklJScyaNYvw8HAcHR25//77qaiouOJx7O3tLdZVKhVGo7FO9Zt6ssaQkBDS09NZsWIFSUlJPPXUU7z33nusWbMGV1dXdu7cyerVq/nzzz+ZNm0aM2bMYNu2bTb3CJhcUTeUqNshchhona9eVwhh81QqFU5aO6ssjfmGtA0bNjBu3DjuueceunTpQkBAAMePH2+089XG3d0df39/tm3bZi4zGAzs3LmzTsfp0KEDGzZY3nLcsGEDHTt2NK87OjoyYsQIPvroI1avXs2mTZtITU0FwM7Ojri4ON5991327NnD8ePHWbly5XV8s8YhV9QNZdhb1o5ACCGuKiIigl9++YURI0agUql47bXXrnhl3FiefvppEhISCA8Pp3379nz88cecO3euTn+kvPDCCzz44IN069aNuLg4/ve///HLL7+YR7EnJiZiMBjo06cPTk5O/Pe//8XR0ZHQ0FB+//13jh49yqBBg/D09GTJkiUYjUaioqIa6yvXmyRqIYS4gfzrX//i0UcfpV+/fvj4+PDSSy9RVFTU5HG89NJL5OTk8PDDD6PRaJgwYQJDhw6t0yxTI0eO5MMPP2TWrFk8++yztGnThvnz5xMbGwuY5oN+++23mTx5MgaDgS5duvC///0Pb29vPDw8+OWXX5gxYwbl5eVERETw/fff06lTp0b6xvWnUpr6poGVnThxgpCQELKysmjVqtV1H6/KYESjVl34K7AgC9R24HZ9848KIZpOeXk5x44do02bNjg4OFg7nBuS0WikQ4cOPPjgg7zxxhvWDqdBXOn3qi65SO5RX4cXF+6m+xtJpJ2s/mt02T9hdmfY+oV1AxNCCBuXkZHBvHnzOHjwIKmpqTz55JMcO3aMv/3tb9YOzeZIor4O585XUlRexZqD1Y8o+HcClQbOn7FuYEIIYePUajWJiYn06tWL/v37k5qayooVK+jQoYO1Q7M5co/6OgyO9CVpXy5rDuYz8ZYI6DQSOt4FOldrhyaEEDYtJCTkkhHbonaSqK/D4EjTi+t3ZhZQWFaJu6M8miWEEKJhSdf3dQjxcqKdrzMGo8KGw6ctN1rhcQchhBAtjyTq6zQ40g+ANen5poKTO2DeLfD1XVaMSgghREshifo6DY4ydX+vOZhvej2eg4cpWWdtgcoy6wYnhBCi2ZNEfZ36tPFCZ6cmp6ic9Nxi8GoLroFgqIAT265+ACGEEOIKrJqoExIS6NWrF66urvj5+TFy5EjS09Ovut9PP/1E+/btcXBwoEuXLnWeGq0hOdhr6NvOG6ju/lapTNNeAhyXEY1CCCGuj1UT9Zo1a4iPj2fz5s0kJSVRWVnJbbfdRmlp6WX32bhxI6NHj2b8+PHs2rWLkSNHMnLkSNLS0powcks1o7/XHKy+T10z7eXx9VaKSAghrl1sbCyTJk0yr4eFhTF79uwr7qNSqVi8ePF1n7uhjnMlM2bMoGvXro16jsZk1US9bNkyxo0bR6dOnYiJiSExMZHMzEx27Nhx2X0+/PBDhg0bxgsvvECHDh1444036N69O3PmzGnCyC3VJOptx89Sqq+6cEV9YhtUllstLiFEyzZixAiGDRtW67Z169ahUqnYs2dPnY+7bds2JkyYcL3hWbhcsszOzmb48OENeq6WxqbuURcWFgLg5eV12TqbNm0iLi7Oomzo0KFs2rSp1vp6vZ6ioiLzUlxc3HABV2vj40xrLycqDQobj5wB73Bw8QeD3jSwTAghGsH48eNJSkrixIkTl2ybP38+PXv2JDo6us7H9fX1xcnJqSFCvKqAgAB0Ol2TnKu5splEbTQamTRpEv3796dz586XrZeTk4O/v79Fmb+/Pzk5ObXWT0hIwN3d3bxcPE9pQ1GpVBd1f+eZ7lNL97cQopHdeeed+Pr6kpiYaFFeUlLCTz/9xPjx4zlz5gyjR48mODgYJycnunTpwvfff3/F4/616/vQoUMMGjQIBwcHOnbsSFJS0iX7vPTSS0RGRuLk5ETbtm157bXXqKysBEzTTb7++uvs3r0blco0iVFNzH/t+k5NTeWWW27B0dE
Rb29vJkyYQElJiXn7uHHjGDlyJLNmzSIwMBBvb2/i4+PN57oWRqORmTNn0qpVK3Q6HV27dmXZsmXm7RUVFUycOJHAwEAcHBwIDQ0lISEBAEVRmDFjBq1bt0an0xEUFMQzzzxzzeeuD5tJ1PHx8aSlpbFgwYIGPe7UqVMpLCw0L/v27WvQ49eoSdSr06sf0wqrTtQZkqiFaNYqSuu+GKou7G+oMpX99XHNy+1bB3Z2djz88MMkJiZy8USIP/30EwaDgdGjR1NeXk6PHj34448/SEtLY8KECTz00ENs3br1ms5hNBq599570Wq1bNmyhc8++4yXXnrpknqurq4kJiayb98+PvzwQ+bNm8cHH3wAwKhRo3j++efp1KkT2dnZZGdnM2rUqEuOUVpaytChQ/H09GTbtm389NNPrFixgokTJ1rUW7VqFUeOHGHVqlX85z//ITEx8ZI/Vq7kww8/5P3332fWrFns2bOHoUOHctddd3Ho0CEAPvroI3777Td+/PFH0tPT+fbbbwkLCwPg559/5oMPPuDzzz/n0KFDLF68mC5dulzzuevDJl4hOnHiRH7//XfWrl171em+AgICyM3NtSjLzc0lICCg1vo6nc6iW6Wx5l3t284brUbNiXNlHD1dSrvQ6vvUWdugSg920rUjRLP0VlDd93kgETrdY/p84H/w0zgIHQCP/HGhzuwutU/gM6OwTqd69NFHee+991izZo15Hub58+dz3333mXsSp0yZYq7/9NNPs3z5cn788Ud69+591eOvWLGCAwcOsHz5coKCTG3x1ltvXXJf+dVXXzV/DgsLY8qUKSxYsIAXX3wRR0dHXFxcsLOzu+y/1QDfffcd5eXlfP311zg7m17JPGfOHEaMGME777xj7k319PRkzpw5aDQa2rdvzx133EFycjKPP/74NbXZrFmzeOmll/i///s/AN555x1WrVrF7Nmz+eSTT8jMzCQiIoIBAwagUqkIDQ0175uZmUlAQABxcXHY29vTunXra2rH62HVK2pFUZg4cSKLFi1i5cqVtGnT5qr79O3bl+TkZIuypKQk+vbt21hhXhNnnR292ngC1Y9p+UaBkw9UlcHJnVaNTQjRcrVv355+/frx1VdfAXD48GHWrVvH+PHjATAYDLzxxht06dIFLy8vXFxcWL58OZmZmdd0/P379xMSEmJO0kCt/97+8MMP9O/fn4CAAFxcXHj11Vev+RwXnysmJsacpAH69++P0Wi0eHS3U6dOaDQa83pgYCB5eXnXdI6ioiJOnTpF//79Lcr79+/P/v37AVP3ekpKClFRUTzzzDP8+eef5noPPPAAZWVltG3blscff5xFixZRVVVFY7LqFXV8fDzfffcdv/76K66urub7zO7u7jg6OgLw8MMPExwcbL4/8OyzzzJ48GDef/997rjjDhYsWMD27dv54gvrzwE9ONKXDYfPsOZgPo8OaGPq/t73q6n7O9S6f0gIIerpn6fqvo/moh609iNMx1D95bpoUur1xXWR8ePH8/TTT/PJJ58wf/582rVrx+DBgwF47733+PDDD5k9ezZdunTB2dmZSZMmUVFR0WDn37RpE2PGjOH1119n6NChuLu7s2DBAt5///0GO8fF7O3tLdZVKhXGBpxfoXv37hw7doylS5eyYsUKHnzwQeLi4li4cCEhISGkp6ezYsUKkpKSeOqpp8w9Gn+Nq6FY9Yp67ty5FBYWEhsbS2BgoHn54YcfzHUyMzPJzs42r/fr14/vvvuOL774gpiYGBYuXMjixYuvOACtqcRGmd77vfnoGcorDaauLjB1fwshmietc90XzUXXQBo7U5m947Udtx4efPBB1Go13333HV9//TWPPvooKpUKgA0bNnD33Xfz97//nZiYGNq2bcvBgwev+dgdOnQgKyvL4t/hzZs3W9TZuHEjoaGhvPLKK/Ts2ZOIiAgyMjIsv65Wi8FguOq5du/ebfEujQ0bNqBWq4mKirrmmK/Ezc2NoKCgS6bY3LBhg8VgYzc3N0aNGsW8efP44Ycf+Pnnnzl79iwAjo6OjBgxgo8++ojVq1ezadMmUlMb7g+vv7LqFfXFgx8uZ/Xq1ZeUPfDAAzzwwAONENH1ifBzIdDdgezCcjYfPUNsx7shuAcExlg7NCFEC+bi4sKoUaOYOnUqRUVFjBs3zrwtIiKChQsXsnHjRjw9PfnXv/5Fbm7uNT8BExcXR2RkJGPHjuW9996jqKiIV155xaJOREQEmZmZLFiwgF69evHHH3+waNEiizphYWEcO3aMlJQUWrVqhaur6yWPZY0ZM4bp06czduxYZsyYQX5+Pk8//TQPPfTQJU/7XI8XXniB6dOn065dO7p27cr8+fNJSUnh22+/BeBf//oXgYGBdOvWDbVazU8//URAQAAeHh4kJiZiMBjo06cPTk5O/Pe//8XR0dHiPnZDs5lR3y2B5WNa+eDqD616WP51LYQQjWD8+PGcO3eOoUOHWtxPfvXVV+nevTtDhw4lNjaWgIAARo4cec3HVavVLFq0iLKyMnr37s1jjz3Gm2++aVHnrrvu4rnnnmPixIl07dqVjRs38tprr1nUue+++xg2bBg333wzvr6+tT4i5uTkxPLlyzl79iy9evXi/vvvZ8iQIQ3+QqtnnnmGyZMn8/zzz9OlSxeWLVvGb7/9RkREBGAawf7uu+/Ss2dPevXqxfHjx1myZAlqtRoPDw/mzZtH//79iY6OZsWKFfzvf//D29u7QWO8mEq5lsvaFuTEiROEhISQlZV11RHm9bE0NZsnv91JW19nVj4f2+DHF0I0vPLyco4dO0abNm1wcHCwdjiihbjS71VdcpFc6jWw/hE+aNQqjuaXknX2PCGGE7DpY1BpYMRsa4cnhBCimZGu7wbm5mBPj9amx7RWH8w3vUZ059eQ+pPlSxCEEEKIayCJuhEMjqq+T52eD36dYMBkuP8r4Ia6yyCEEKIBSKJuBDUDyjYeOY3eqEDcdIgcCprGecZOCCFEyyWJuhF0DHTDx0XH+QoDO46fs3Y4QgghmjFJ1I1ArVYxKNIHqH5My2iAw8mw8k3TZyGETWrIt1sJ0VC/TzLqu5HERvnxy86TrDmYz9RhkfDTI6AvhPa3Q1A3a4cnhLiIVqtFrVZz6tQpfH190Wq15jd7CVFXiqJQUVFBfn4+arUarVZ7XceTRN1IBob7oFLBgZxisosrCAztCweXwfENkqiFsDFqtZo2bdqQnZ3NqVP1eLe3ELVwcnKidevWqNXX13ktibqReDpriWnlQUpWAWsP5jMqtH91ol4P/SZe/QBCiCal1Wpp3bo1VVVVV30ntRBXo9FosLOza5CeGUnUjWhwpC8pWQWsOZjPqNjqKdUyN5ruU6s1V95ZCNHkVCoV9vb2jTYLkhD1IYPJGlFs9fPU6w6dpsqvC2hdobwQcvdaOTIhhBDNhSTqRhTdygMPJ3uKy6vYdbIEWt9k2nB8vXUDE0II0WxIom5EGrWKgREXvaUsrLr7O2PDFfYSQgghLpBE3chiq99StvpgHoQOMBVmbAB5XlMIIcQ1kE
TdyAZWv/gk7WQR+a4dwN4Zys5B3j4rRyaEEKI5kETdyPxcHegU5AbAuqMF0LqPaYN0fwshhLgGkqibQM3o7zUH8yG0+j61DCgTQghxDSRRN4HBkX4ArD2Yj+Hi+9SKTHsphBDiyuSFJ02gW2sPXHV2nDtfSZrSlpiIoaYu8Co92DtYOzwhhBA2TBJ1E7DXqBkQ4cPStBxWHy4kZsyP1g5JCCFEMyFd301k8MWPaQkhhBDXSBJ1ExlUnah3ZxVwrrQCinNh72K5Ty2EEOKKJFE3kSAPRyL9XTAqsOHgKfgwGn4aC2cOWzs0IYQQNsyqiXrt2rWMGDGCoKAgVCoVixcvvmL91atXo1KpLllycnKaJuDrFBtlGv29+nAhhPSBgGg4f9bKUQkhhLBlVk3UpaWlxMTE8Mknn9Rpv/T0dLKzs82Ln59fI0XYsGruU685mI9xzM/wxLoLL0ARQgghamHVUd/Dhw9n+PDhdd7Pz88PDw+Phg+okfUM88RJqyG/WM/+vPN0CnK3dkhCCCFsXLO8R921a1cCAwO59dZb2bCh+byKU2enoV87b6D6LWUAlWVQcd6KUQkhhLBlzSpRBwYG8tlnn/Hzzz/z888/ExISQmxsLDt37rzsPnq9nqKiIvNSXFzchBFfyvyYVno+LHkR3m4NqT9ZNSYhhBC2q1m98CQqKoqoqCjzer9+/Thy5AgffPAB33zzTa37JCQk8PrrrzdViFdlep3oXnZmnEPfxgWdocL0OtEeY60dmhBCCBvUrK6oa9O7d28OH778I05Tp06lsLDQvOzbZ93pJVt7O9HWx5kqo8JuTRdT4fH18jy1EEKIWjX7RJ2SkkJgYOBlt+t0Otzc3MyLq6trE0ZXu5qXn/x+rhWo7aHoJJw7bt2ghBBC2CSrJuqSkhJSUlJISUkB4NixY6SkpJCZmQmYroYffvhhc/3Zs2fz66+/cvjwYdLS0pg0aRIrV64kPj7eGuHX2+DqaS9XHCpCCe5uKpRpL4UQQtTCqveot2/fzs0332xenzx5MgBjx44lMTGR7Oxsc9IGqKio4Pnnn+fkyZM4OTkRHR3NihUrLI7RHPRt643OTs2pwnLOdeqNV9YW033q7g9ZOzQhhBA2RqUoN9bN0RMnThASEkJWVhatWrWyWhwPf7WVtQfzmXtTAcNTngL31vBcqtXiEUII0XTqkoua/T3q5qrmMa2FecGg0kBhJpzLsHJUQgghbI0kaiupSdTrMsowBHUzFWY0n5e3CCGEaBr1StRZWVmcOHHCvL5161YmTZrEF1980WCBtXTtfJ1p5elIhcHICbfqRH1cErUQQghL9UrUf/vb31i1ahUAOTk53HrrrWzdupVXXnmFmTNnNmiALZVKpTJfVa/VV7/E5fg6K0YkhBDCFtUrUaelpdG7d28AfvzxRzp37szGjRv59ttvSUxMbMj4WrSaRP1dTpDpPnVBBhSeuMpeQgghbiT1StSVlZXodDoAVqxYwV133QVA+/btyc7ObrjoWrh+4T7Ya1TsPwt63y5g5wD56dYOSwghhA2pV6Lu1KkTn332GevWrSMpKYlhw4YBcOrUKby9vRs0wJbMRWdHz1AvAP4XlQAvZ0L4ECtHJYQQwpbUK1G/8847fP7558TGxjJ69GhiYmIA+O2338xd4uLa1Lyl7I9MO7DTWTkaIYQQtqZebyaLjY3l9OnTFBUV4enpaS6fMGECTk5ODRbcjSA2ype3lx5g09EzlFcacLDXmCboUKmsHZoQQggbUK8r6rKyMvR6vTlJZ2RkMHv2bNLT0/Hz82vQAFu6KH9X/N10lFcaObXkXfjkJkj72dphCSGEsBH1StR33303X3/9NQAFBQX06dOH999/n5EjRzJ37twGDbClu/gxrdxTGZC/XyboEEIIYVavRL1z504GDhwIwMKFC/H39ycjI4Ovv/6ajz76qEEDvBHERpl6Ib4quQke/AZuec3KEQkhhLAV9UrU58+fN8/r/Oeff3LvvfeiVqu56aabyMiQ91XXVf9wHzRqFUlnfDkRGAfOMnJeCCGESb0SdXh4OIsXLyYrK4vly5dz2223AZCXl4ebm1uDBngjcHe0p1uIBwBrD562bjBCCCFsSr0S9bRp05gyZQphYWH07t2bvn37Aqar627dujVogDeKmvvU+1J3wOq3YcvnVo5ICCGELahXor7//vvJzMxk+/btLF++3Fw+ZMgQPvjggwYL7kZSc5+6JCsVVifA9q+sHJEQQghbUK/nqAECAgIICAgwz6LVqlUrednJdegU5Ia3s5Y1pRHgAOQfgNLT4Oxj7dCEEEJYUb2uqI1GIzNnzsTd3Z3Q0FBCQ0Px8PDgjTfewGg0NnSMNwS1WsWgSF/O4UaeYztTocxPLYQQN7x6JepXXnmFOXPm8Pbbb7Nr1y527drFW2+9xccff8xrr8mjRfUVW/060c3GDqYCeZ5aCCFuePXq+v7Pf/7Dl19+aZ41CyA6Oprg4GCeeuop3nzzzQYL8EYyINwHlQqWFrfjLi1wXK6ohRDiRlevK+qzZ8/Svn37S8rbt2/P2bNnrzuoG5W3i47oYHe2GqvbNm8vnJf2FEKIG1m9EnVMTAxz5sy5pHzOnDlER0dfd1A3ssFRfpzBnWxtqKkgY6N1AxJCCGFV9er6fvfdd7njjjtYsWKF+RnqTZs2kZWVxZIlSxo0wBvN4EhfPko+xNqKKEaRYbpP3eFOa4clhBDCSup1RT148GAOHjzIPffcQ0FBAQUFBdx7773s3buXb775pqFjvKHEtHLH3dGedRVRpoIMGVAmhBA3sno/Rx0UFHTJoLHdu3fz73//my+++OK6A7tR2WnUDIjwYcue6pHfOWlQdg4cPa+8oxBCiBapXlfUonHFRvqSjwcnNK0ABTI2WTskIYQQVmLVRL127VpGjBhBUFAQKpWKxYsXX3Wf1atX0717d3Q6HeHh4SQmJjZ6nE2t5r3faysiTQXy4hMhhLhhWTVRl5aWEhMTwyeffHJN9Y8dO8Ydd9zBzTffTEpKCpMmTeKxxx6zeN94S+Dn5kCHQDf+Z+jL/qh46HyftUMSQghhJXW6R33vvfdecXtBQUGdTj58+HCGDx9+zfU/++wz2rRpw/vvvw9Ahw4dWL9+PR988AFDhw6t07ltXWyUL3OzO/GFOpgPgrtaOxwhhBBWUqcrand39ysuoaGhPPzww40VK5s2bSIuLs6ibOjQoWza1PLu4Zq7vw/mYzQqVo5GCCGEtdTpinr+/PmNFcc1ycnJwd/f36LM39+foqIiysrKcHR0vGQfvV6PXq83rxcXFzd6nA2hR6gnLjo7KkvPkrXxR0J9XKH97dYOSwghRBNr8aO+ExISLK76O3bsaO2Qrom9Rk3/cG9uUacQumICrJtl7ZCEEEJYQbNK1AEBAeTm5lqU5ebm4ubmVuvVNMDUqVMpLCw0L/v27WuKUBvE4Eg/thg7kKUJgeCeoEgXuBBC3GiaVaLu27cvycnJFmVJSUnm15jWRqfT4ebmZl5cXV0bO8wGMzjKl2y8GXz+HQpj3wSVytohC
SGEaGJWTdQlJSWkpKSQkpICmB6/SklJITMzEzBdDV88OO2JJ57g6NGjvPjiixw4cIBPP/2UH3/8keeee84a4Te6YA9HIvxcMCqw/vBpa4cjhBDCCqyaqLdv3063bt3o1q0bAJMnT6Zbt25MmzYNgOzsbHPSBmjTpg1//PEHSUlJxMTE8P777/Pll1+2uEezLlYz+nv9gZOQk2rlaIQQQjQ1laLcWDc+T5w4QUhICFlZWbRq1cra4VzVukP5TPp3EhscnkWnNqJ6ORO0ztYOSwghxHWoSy5qVveob0S9wrw4b+/FacUNlbEKsrZYOyQhhBBNSBK1jXOw19C3nTdbjO1NBcdl2kshhLiRSKJuBgZH+rLZWP3893GZoEMIIW4kkqibgcGRvmwxmuanVk7ugIrzVo5ICCFEU5FE3QyE+Tij9gwjW/FCZayEE9usHZIQQogmIom6mRgc5cfm6qtquU8thBA3DknUzcTgqIu6vzMkUQshxI1CEnUzcVNbb3aqTAPKlBM7oLLcyhEJIYRoCpKomwknrR3+YZ3IVTxQG/Rwcru1QxJCCNEEJFE3I4Oj/Mzd33KfWgghbgySqJuR2IvuUxuOSaIWQogbgSTqZqSdrwtHnbtRoWgo1BtlfmohhLgBSKJuRlQqFWFRXYnWf8lHQe/J/NRCCHEDkETdzAyO8qMcHWsP5ls7FCGEEE1AEnUz0z/cGzu1iqOnS8nKOW3tcIQQQjQySdTNjKuDPbGtVPyu/ScB87pAVYW1QxJCCNGIJFE3Q907hBOoOoO94Tzkplk7HCGEEI1IEnUzFBvlzz8qnmOwcS56/xhrhyOEEKIRSaJuhjoEupLhEkNGhTvbj5+zdjhCCCEakZ21AxB1p1KpGBzpy8IdJ9Cvfh/Wp4J/ZwjoDAFdwLc92OmsHaYQQogGIIm6mYqNMiVqx+wtYNgBx9dd2Ki2A5/IC8nbvzqBu/hZL2AhhBD1Iom6mRoQ7oOdWsX08w8Sre5JR3UmPXQniVCO42Qogrx9piX1xws7OfuZEnf0/0HMKOsFL4QQ4ppJom6mPJy0fDS6G4t3+bEmK5yFxXqoBFAI4Cwd1Rl0056gt9MpIozH8SzPQlWaB0dWQut+Fw50LgN++DsE94ARs630bYQQQlyOJOpm7PYugdzeJRBFUThVWE5KZgG7Ms+RkuXFhpO+rCzvDtXTVjtSTpTqBANcszEeb4u//XG6tfagQ+Ee7HP2AH95b/h31Vfc5u7zLuDVBtSaJv2OQghxo5NE3QKoVCqCPRwJ9nDkjuhAACoNRg5kF5OSdY5dWQWkZBaQctqBlKJwKAL27wXA3+4893q/ShtnZxx3n6Jbaw+CXe1QHVkJhgo4uOzCieydwK+j5X1v3/bg6NHo31FRFIr1VZwpqeBMiZ7TJRWUVVbRK8yLVp5OjX5+IYSwFpWi3FhTMJ04cYKQkBCysrJo1aqVtcNpUgXnK0jJKjAvuzILKCyrvKSev7Md9/qf4ianbNpzDJ/SQ2jyD0BVWe0HdvE3DV6Lex1a9TCVGSpNg9quMHGIvsrA2dIKzpRUcLpEb0rCpfrqddNnc3lJBRUGY63HiWnlzrDOgQzvHECYj3Od20UIIZpaXXKRTSTqTz75hPfee4+cnBxiYmL4+OOP6d27d611ExMTeeSRRyzKdDod5eXl13SuGzlR/5WiKBw/c766u9yUvPedKqLKaPkroVJBe18n4vxLuMk5m/aqDLyKD6LKTYPiU+Z6xsdWUejZmTOlejTb5hGy6z3Sg+9jeatnOFOi50yxHm3hEfaVe5NbaqC4vKrOMbvo7PB20eLtrEUBUrIKLGb77BDoxu2dAxjeJYBwP9f6No0QQjSquuQiq3d9//DDD0yePJnPPvuMPn36MHv2bIYOHUp6ejp+frU/TuTm5kZ6erp5XSXTPdaLSqWijY8zbXycube76RelvNLA3lOF7MosMHeZnywoY3/eefbnqfmYYCAYZ+1AurRyx9W1DMeio3iWHefnT49TYswGYKbdRh62O8/aIwV8lH4IAF/Osc0hnkpFQ4bizxH7II4STK62Neec2nDerS0ubp54O2vxdtHh7aLFx0WLt7MOH1cd3s5aHOwt75HnF+v5c18OS1Nz2HT0DPuzi9ifXcT7SQeJ8HNheOcAhncJpH2Aq/yeCCGaJatfUffp04devXoxZ84cAIxGIyEhITz99NO8/PLLl9RPTExk0qRJFBQU1Ot8ckVdd3nF1QPVqhP3nhMFlFYYLlvf3dEef2cVnRzO4ujsisazNd4uWiINh7ht62PYGc5f/mSuQeAbaepKr1lC+oC9w1XjPFdaQdK+XJakZbPh8GkqDRd+tcO8nRjexdQ93iXYXZK2EMKqmk3Xd0VFBU5OTixcuJCRI0eay8eOHUtBQQG//vrrJfskJiby2GOPERwcjNFopHv37rz11lt06tSp1nPo9Xr0er15/eTJk3Ts2FES9XUwGBUO5RWTeqIQO40Kb+eaq18dnk5atHZXeDOt0WjqLs9Ph9OH4HT1z/x0KM2rfZ8phy68rCXtFyjIhIhbwb/2/+YAhWWVJO/PZWlaDmsO5lNRdeH+drCHo/lKu1uIB2q1JG0hRNNqNl3fp0+fxmAw4O/vb1Hu7+/PgQMHat0nKiqKr776iujoaAoLC5k1axb9+vVj7969tX7ZhIQEXn/99UaJ/0alUatoH+BG+wC3uu+sVoN7K9MSPsRyW9m56uR9sDqRH4TiHHD2vVBnzw+mkeha5wuJ+uwx2PVf07PgwT3A1R93R3vu7d6Ke7u3okRfxaoDeSxNy2bVgXxOFpTx5fpjfLn+GAFuDgzrHMCwzgH0CvNCI0lbCGFjrHpFferUKYKDg9m4cSN9+/Y1l7/44ousWbOGLVu2XPUYlZWVdOjQgdGjR/PGG29csl2uqFuYrfMgczP0fcqUlAF2fgO/TbxQxz0EgrtfSNyBXUHnAkBZhYE1B/NYmpZD8v48SvQXBrT5uGi5rVMAt3cOpE9bL+w1MmeNEKJxNJsrah8fHzQaDbm5uRblubm5BAQEXNMx7O3t6datG4cPH651u06nQ6e7MEFFUVFR/QMW1tf7cdNyMe920O3vcHIn5O2HwizTsq/61olKDb4dILg7jsE9GBbcg2EPdKHcqGLD4dMsSc0haV8Op0sq+G5LJt9tycTDyZ7bOvozvHMg/cN9rtydL4QQjciqiVqr1dKjRw+Sk5PN96iNRiPJyclMnDjxyjtXMxgMpKamcvvttzdipMKmhfYzLQD6YsjeDSe2w8kdpuRddALy9pqWXd+Y6gV1w2HCaoZ08GdIB38qCvzYlKth2d4clu/N5WxpBT9uP8GP20/g6mBHXAd/hncOYFCk7yUjz4UQojFZ/fGsyZMnM3bsWHr27Env3r2ZPXs2paWl5melH374YYKDg0lISABg5syZ3HTTTYSHh1NQUMB7771HRkYGjz32mDW/hrAVOlcIG2BaahTnVCftHReS98UD0QyVaOd0ZbDWhcFPrOeNuzuz9dhZlqeeYMm+0+QX61m06ySLdp3ESavhlvZ+
DO8cyIBwH5x0GuzUKhlFLoRoNFZP1KNGjSI/P59p06aRk5ND165dWbZsmXmAWWZmJmr1hW7Hc+fO8fjjj5OTk4Onpyc9evRg48aNdOzY0VpfQdg61wBof4dpAdPI88rSC9vPHgOjAYyV4OKPnVpNv3Af+qW8yAzXFM6GdGZrZRt+zvFnXXEgv+/J5vc92Ran0GrU2GtU2NupsdeoL6xrTOv2dmq0F69r1GjtVNipL3y22FZT1+4v6xcdy9dVR6S/K64O9k3YmEKIpmb156ibmjxHLWpVWW567Ms38kLZ7GgoyLCoZlTbk+sYziZ9KFvLWpGjeJKneJKreHIWVxSa/l52sIcjUQGupsXf9LOtrzM6O+miF8JWNZvnqK1BErW4ZufPwqldpq7yk9tN973Pn75sdUVtR/6ANzjd/u9UGoyoCjLxOPwLJS5hnAoeTqXBSIXBSGWVkUqjQqXBSKWh+meVsXp7TXn1etVf1g0KlVWm45w8V0ZOUe2vzrVTm946FxngSnt/V9PPAFdCPJ3kuXEhbECzGfUthE1z8jI9613zvLeimEaTn9xhStr56VCSA8W5UJqPyliFn68ffkHVz5eXboTdH0BQdzreOu7Ccef0gorzpi75ixevQHCpWQ80nf8q974LzldwMLeE9Jwi0nOLSc8p5kBOMcXlVRzKK+FQXgl/cKGb3tFeQ6S/C1EBrkT6u9I+wI3IABd8XXRyn10IGyWJWohrpVKBR2vT0ukey22GSijJA4eLXgLj6g/dHjLVr6EoUJBlmoms6MSVz6e2v5DEBz4PUcNN5aWn4VQKeITg4RtF7zZe9G7jddEpFHKKyknPKb6w5BZzKK+EskoDu08UsvtEocWpvJy1RPq7mBJ3dfd5VIArLjr5J0IIa5P/C4VoCBp7cA+2LKt54cpfPb3dNBK9OAeKs6Ek1/SzuPrqvDjb1MVurLzwTHjlRe9Hz9wMP4yBVr3hsaQL5fNugSo9KicvAh29CHTyItbJG1p7QXsvDA6eZFe6cLhYy94Ce1LzjRzMK+H4mVLOllaw+ehZNh89a/kVPBxpH3Ch6zzS35V2vi71fq5cURQMRoUq419/Gk0/DbWXG/5S38FeI69/FTcMSdRCNCWV6sIrVK+kqsL07vOahB7c/cI2tQb8OoF3uOU+ufsuP2c4oAFaVS+xYJov/M7ZlHf5G4fzSjh1aBe+e//N/spAPjo/lJyick4WlOFeuJ9j6VoWKC4U4oJarSHU2wkHe02tSdScdI0KBoNlubEBR8S083XmH4PbMbJrsLyQRrRoMphMiJZAUUwD38rOwvlzcP5M9eezf/l81vS55gr9/q+g832mz/t+gx8fMs1WNv5PCs5XkJ5TTOcfbsJZb5owxYiKIsWJc4oLZTigx55yRWv6iemnXrFnkXEAm4ymZ9UDOcOdmk3kKR78arzwfHsv1QHsVAb0ij16tFSpaxYHqlRaDGodRrU9Go0ajVqFnVpV/VPNqYIyiqtf/xro7sBjA9vyf71CcJauetFMyGAyIW40KpXlVffVVJaZkraD+4Uyn0i4+VXzTGUeTlr6tPUGNy8o0oO+EDUKHqpSPFSllzmwyaBBwynpPBg7tQqnE+vwW/wdVT4dmDZuBnZqNRqNCqcvpqM+c+jyBzECRhXgACodqB2h/7Nw05MUl1eyYNMRDqz/hRWF7Xjj93I+XnmIsX3DGNcvDE9n7bW3hRA2ThK1EDcie8dL76n7tTctfxVfPTmOodI0w5n5qrwMqsqrF331uh6qygkI7w9+polQqGoF0aOwcw3E2+XCe/fxamvqxr9oP/Nippi686vKoLzAvM3VwZ7HI0pgzTvoXT0Yaj+f42fL+DD5EN+s3cfdvSN4fGBbgjwcG6zJhLAWSdRCiGujsTddbdfMDX6tAjrDvV9cWj7mx9rrKwoYKv6SwPWmZO1y0ZS45YXgE4XOJ4LkB29mWVoOn646xOdnH6F8m5Y1WztgbN2PfkNG0KZtVN1iFsKGyD1qIUTzZqg0/REBKIUnUX1w6euE8+0CUbfpj3fHWyCsP3iEXvUZdSEak9yjFkLcODQX3nWucg+GF49B5mbyUpM5f3gdIeUH8a3KhkMLTQuguAWjCu1vmnUtbIBpBL0kbmGjJFELIVoWJy9ofzt+7U1T3x45cYpVf/6PqmPr6aXaT7TqKPZFJyH1R9MC8NzeC4/MlZ0DnTuo5ZEvYRskUQshWrR2rYJo9+g/OFXwMF+uO8ZjWw/RwXCAPuoDDNam08axDAfnQMzD3H75B5zYCnfNgQ53WjN0m1Cir+JIXgmH80o4nF9C1tnz2KlVONhrLlrUOF70+eJtjheVOdpr0F302V4jfwxdC0nUQogbQpCHI9NGdOTpW8L5z6YOzN94nA/OV6I6b8T3nVWMH9CGv/UOwTUn1XRVffGo+LRfYPf3pq7y0P4Q2BXsWs4jYIqicLqkwpyMaxLzkfwSsgtrn/ilIWjUKhzs1DhqNejsqhO+VoODneUfARcnfC9nHf3Dvekc5H7DvJlOBpMJIW5IpfoqFmzL4st1R83JyNXBjnF9ghnfrhCPdn1AU30t82s87PrvhZ3VdqbHy3wi/7KEWz6bbmOMRoUT58o4nF9sSsR5pRzONyXlwrLKy+7n46Ij3M+ZcD8XwrydASirMFBeZaC80khZpYHySgP6iz6XVxooqzSiN3821S2vMtAQWcfbWcugSF9io3wZGOGLVzN7dl6mubwCSdRCiItVVBn5NeUkn605wpF804tcdHZqHuwZwoRBbQnxcoK8/XBkFWRsMC1l5y5/QJcA8ImA9nfATU9eKFeUJhuwpq8ycOx0qSkRV18lH84r4Wh+CfoqY637qFQQ4ulEuJ8L7XxNSTncz4VwX1fcnexr3ac+FEVBX2VEX520LRJ+9Wf9xYn9os/6StP32njkDCXVb6ariT2mlQexUb4MjvQlupUHGhu/2pZEfQWSqIUQtTEaFZL25/Lp6iPszioATF2zI6IDeSK2He0D3GoqQvEp0zSnpw/B6YPVyyHTtKc1ej4Kd35g+lxRCrMiTVfhjy4HrZOpvDjXdAVu71CvmIvKKy3uH9d8zjx7/rLvVddq1LT1daadrwvtzMnYhba+zjjYa+oVR1OrqDKyI+Mcqw/msSY9nwM5xRbbPZ3szVfbgyJ8LV+0YyMkUV+BJGohxJUoisKmo2eYu/oI6w6dNpff0t6PJ2Pb0SvM6/I7lxfC6cOmxO3VBlrfZCrP3g2fDwInH3jxCJUGUxexbsEotMdXUukWQplbO0pd21Lk0oZzTmGc1oVSqHKjvMp0pVlWfWVZVmEg8+x5DueVkFesv2worjq7C4m4OhmH+7kQ4uVk81ebdZVTWM6ag3msTs9n/aHT5vfAg+lqu0uwO7GRvgyO8qNriG1cbUuivgJJ1EKIa5V2spC5a46wJDXbfF+1Z6gnd0YHUmU0deFenETL/5JQa7ptKysq8ao8hUvlWTZURlJVfbn7h3YqndQZlz3/OcWFI0oQR4xBHFaCOKIEkWpsQz6eAOioIMqljBAfN7wDw8wJOco
+Fy+dEZViBMVo6gVQjKAYwGi48Pnibd7tTAtAWQEcSQaNznLke/pS0zSsxurjGKsuWi6zHtoPOo007X/+LCyZAqjg/n9fOO7KNyFz0xWOWXlhXaM1PfcecSv0+cclbVZpMLIz4xxrDuazOj2ffdlFFtvdHe0ZGOFDbJQfgyN98XW1ztW2JOorkEQthKirY6dL+WLtEX7ecZIKQ+33eOtDpVJoZV9Ce7scItTZtFOfIkw5SYjxBD6GPNRc+s/zxrB4TnZ+knA/FyJLt+P8w/3g3xme3HCh0kfd4eyRugVz86sw+AXT55xU+GyA6ZWtUw5eqPPv2yBrS92O2/sfcPu7ps/FOfB+FKg0MP2iuc8XjIEDv9ftuF3HwMhPTZ+rKuDDGNMfGv/3HThU36aoKCWvTM3qQ6dZczCfdQfzKSqvsjhM52A3YiP9GBzlS7cQD+ya6JExeTOZEEI0oDY+ziTcG82kuEj+s/E4h/JKcKx+ZMhRe+F5YUet2uL54Uu3XyjX2avR2alRXW6AWcV5U7Ktuf9dfS+8X99BEBViqnPMAewcLN7OBoCzD1SUgEptSooqtekFLhbrmurPKtPni9/hrnWBsIHg6Gl53NB+pu57tcY08l1jb/pZs25eLlpv1evC/jo3GPa2qfxifeOh872XOYa9ZVlFSfWthbYX9j971DRuQF8MOtcL5Yv+gd/RNTzoE8mDvlEYhkRwlGDWnPHit0w79pwqJe1kEWkni5iz6jBuDnYMjPBlcJQvsZG++LnVb+xAQ5MraiGEEM1blR5y06AkH6KGXSj/5CbI31/7PhodVV7tyLYPJVXvz+qznuwu9+eYEkgFpj98OgS6EVudtLuHejboC1qk6/sKJFELIcQNokoPZ47A6XTIP1j9s3q0vqH2gXibWz1KQvl97DlZiLtSzC3qXaQrIWRqIxgQ4cPgSF/ujAnCRXd9HdLS9S2EEELY6cC/o2m5mNEABRkXJe+DkH8ATh/kpj79+bXLAM6U6Dmw/hf6b/6MowRzS/l7LE3LYfneHIZ2CoAmHIMmiVoIIcSNRa0x3eP2amvZVa4ophHwgLeLjv6RQZAzkDCvdizu1p/V6XnkFpXj2cRvQbOJN6J/8sknhIWF4eDgQJ8+fdi6desV6//000+0b98eBwcHunTpwpIlS5ooUiGEEC1WzcC6Gm0Hw7jfUd/1IV1DPJgUF0nCvdFNHpbVE/UPP/zA5MmTmT59Ojt37iQmJoahQ4eSl5dXa/2NGzcyevRoxo8fz65duxg5ciQjR44kLS2tiSMXQgghGp/VB5P16dOHXr16MWfOHACMRiMhISE8/fTTvPzyy5fUHzVqFKWlpfz++4Vn7m666Sa6du3KZ599dtXzyWAyIYQQ1laXXGTVK+qKigp27NhBXFycuUytVhMXF8emTZtq3WfTpk0W9QGGDh162fpCCCFEc2bVwWSnT5/GYDDg7+9vUe7v78+BAwdq3ScnJ6fW+jk5ObXW1+v16PUXhuEXFxfXWk8IIYSwRVa/R93YEhIScHd3Ny8dO3a8+k5CCCGEjbBqovbx8UGj0ZCbm2tRnpubS0BAQK37BAQE1Kn+1KlTKSwsNC/79u1rmOCFEEKIJmDVrm+tVkuPHj1ITk5m5MiRgGkwWXJyMhMnTqx1n759+5KcnMykSZPMZUlJSfTt27fW+jqdDp3uwpPpBQUFAGRnZzfIdxBCCCHqqiYHGY3XMMmLYmULFixQdDqdkpiYqOzbt0+ZMGGC4uHhoeTk5CiKoigPPfSQ8vLLL5vrb9iwQbGzs1NmzZql7N+/X5k+fbpib2+vpKamXtP5tm7dqgCyyCKLLLLIYvVl69atV81bVn8z2ahRo8jPz2fatGnk5OTQtWtXli1bZh4wlpmZiVp9oYe+X79+fPfdd7z66qv885//JCIigsWLF9O5c+drOl+3bt3YunUr/v7+Fsetj+LiYjp27Mi+fftwdXW9+g43OGmvupM2qxtpr7qR9qqbhmwvo9FIbm4u3bp1u2pdqz9H3ZwVFRXh7u5OYWEhbm5u1g7H5kl71Z20Wd1Ie9WNtFfdWKu9WvyobyGEEKI5k0QthBBC2DBJ1NdBp9Mxffp0i1Hl4vKkvepO2qxupL3qRtqrbqzVXnKPWgghhLBhckUthBBC2DBJ1EIIIYQNk0QthBBC2DBJ1Nfhk08+ISwsDAcHB/r06cPWrVutHZLNWrt2LSNGjCAoKAiVSsXixYutHZLNSkhIoFevXri6uuLn58fIkSNJT0+3dlg2a+7cuURHR+Pm5oabmxt9+/Zl6dKl1g6r2Xj77bdRqVQWr2UWlmbMmIFKpbJY2rdv32Tnl0RdTz/88AOTJ09m+vTp7Ny5k5iYGIYOHUpeXp61Q7NJpaWlxMTE8Mknn1g7FJu3Zs0a4uPj2bx5M0lJSVRWVnLbbbdRWlpq7dBsUqtWrXj77bfZsWMH27dv55ZbbuHuu+9m79691g7N5m3bto3PP/+c6Ohoa4di8zp16kR2drZ5Wb9+fdOdvO5v5xaKoii9e/dW4uPjzesGg0EJCgpSEhISrBhV8wAoixYtsnYYzUZeXp4CKGvWrLF2KM2Gp6en8uWXX1o7DJtWXFysREREKElJScrgwYOVZ5991toh2azp06crMTExVju/XFHXQ0VFBTt27CAuLs5cplariYuLY9OmTVaMTLREhYWFAHh5eVk5EttnMBhYsGABpaWll51RT5jEx8dzxx13WPw7Ji7v0KFDBAUF0bZtW8aMGUNmZmaTndvqk3I0R6dPn8ZgMJgnDqnh7+/PgQMHrBSVaImMRiOTJk2if//+1zzxzI0oNTWVvn37Ul5ejouLC4sWLaJjx47WDstmLViwgJ07d7Jt2zZrh9Is9OnTh8TERKKiosjOzub1119n4MCBpKWlNclkJpKohbBh8fHxpKWlNe39sGYoKiqKlJQUCgsLWbhwIWPHjmXNmjWSrGuRlZXFs88+S1JSEg4ODtYOp1kYPny4+XN0dDR9+vQhNDSUH3/8kfHjxzf6+SVR14OPjw8ajYbc3FyL8tzcXAICAqwUlWhpJk6cyO+//87atWtp1aqVtcOxaVqtlvDwcAB69OjBtm3b+PDDD/n888+tHJnt2bFjB3l5eXTv3t1cZjAYWLt2LXPmzEGv16PRaKwYoe3z8PAgMjKSw4cPN8n55B51PWi1Wnr06EFycrK5zGg0kpycLPfFxHVTFIWJEyeyaNEiVq5cSZs2bawdUrNjNBrR6/XWDsMmDRkyhNTUVFJSUsxLz549GTNmDCkpKZKkr0FJSQlHjhwhMDCwSc4nV9T1NHnyZMaOHUvPnj3p3bs3s2fPprS0lEceecTaodmkkpISi78+jx07RkpKCl5eXrRu3dqKkdme+Ph4vvvuO3799VdcXV3JyckBwN3dHUdHRytHZ3umTp3K8OHDad26NcXFxXz33XesXr2a5cuXWzs0m+Tq6nrJeAdnZ2e8vb1lHMRlTJkyhREjRhAaGsqpU6eYPn06Go2G0aNHN8n5JVHX06hRo8jPz2fatGnk5OTQtWtXli1bds
kAM2Gyfft2br75ZvP65MmTARg7diyJiYlWiso2zZ07F4DY2FiL8vnz5zNu3LimD8jG5eXl8fDDD5OdnY27uzvR0dEsX76cW2+91dqhiRbixIkTjB49mjNnzuDr68uAAQPYvHkzvr6+TXJ+mT1LCCGEsGFyj1oIIYSwYZKohRBCCBsmiVoIIYSwYZKohRBCCBsmiVoIIYSwYZKohRBCCBsmiVoIIYSwYZKohRBCCBsmiVoI0WhUKhWLFy+2dhhCNGuSqIVoocaNG4dKpbpkGTZsmLVDE0LUgbzrW4gWbNiwYcyfP9+iTKfTWSkaIUR9yBW1EC2YTqcjICDAYvH09ARM3dJz585l+PDhODo60rZtWxYuXGixf2pqKrfccguOjo54e3szYcIESkpKLOp89dVXdOrUCZ1OR2BgIBMnTrTYfvr0ae655x6cnJyIiIjgt99+M287d+4cY8aMwdfXF0dHRyIiIi75w0KIG50kaiFuYK+99hr33Xcfu3fvZsyYMfzf//0f+/fvB6C0tJShQ4fi6enJtm3b+Omnn1ixYoVFIp47dy7x8fFMmDCB1NRUfvvtN8LDwy3O8frrr/Pggw+yZ88ebr/9dsaMGcPZs2fN59+3bx9Lly5l//79zJ07Fx8fn6ZrACGaA0UI0SKNHTtW0Wg0irOzs8Xy5ptvKoqiKIDyxBNPWOzTp08f5cknn1QURVG++OILxdPTUykpKTFv/+OPPxS1Wq3k5OQoiqIoQUFByiuvvHLZGADl1VdfNa+XlJQogLJ06VJFURRlxIgRyiOPPNIwX1iIFkruUQvRgt18883m+a1reHl5mT/37dvXYlvfvn1JSUkBYP/+/cTExODs7Gze3r9/f4xGI+np6ahUKk6dOsWQIUOuGEN0dLT5s7OzM25ubuTl5QHw5JNPct9997Fz505uu+02Ro4cSb9+/er1XYVoqSRRC9GCOTs7X9IV3VAcHR2vqZ69vb3Fukqlwmg0AjB8+HAyMjJYsmQJSUlJDBkyhPj4eGbNmtXg8QrRXMk9aiFuYJs3b75kvUOHDgB06NCB3bt3U1paat6+YcMG1Go1UVFRuLq6EhYWRnJy8nXF4Ovry9ixY/nvf//L7Nmz+eKLL67reEK0NHJFLUQLptfrycnJsSizs7MzD9j66aef6NmzJwMGDODbb79l69at/Pvf/wZgzJgxTJ8+nbFjxzJjxgzy8/N5+umneeihh/D39wdgxowZPPHEE/j5+TF8+HCKi4vZsGEDTz/99DXFN23aNHr06EGnTp3Q6/X8/vvv5j8UhBAmkqiFaMGWLVtGYGCgRVlUVBQHDhwATCOyFyxYwFNPPUVgYCDff/89HTt2BMDJyYnly5fz7LPP0qtXL5ycnLjvvvv417/+ZT7W2LFjKS8v54MPPmDKlCn4+Phw//33X3N8Wq2WqVOncvz4cRwdHRk4cCALFixogG8uRMuhUhRFsXYQQoimp1KpWLRoESNHjrR2KEKIK5B71EIIIYQNk0QthBBC2DC5Ry3EDUruegnRPMgVtRBCCGHDJFELIYQQNkwStRBCCGHDJFELIYQQNkwStRBCCGHDJFELIYQQNkwStRBCCGHDJFELIYQQNkwStRBCCGHD/j8BzPcReWf3bQAAAABJRU5ErkJggg==",
"text/plain": [
"<Figure size 500x300 with 2 Axes>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"epochs_tensor = torch.linspace(0, num_epochs, len(train_losses))\n",
"examples_seen_tensor = torch.linspace(0, examples_seen, len(train_losses))\n",
"\n",
"plot_values(epochs_tensor, examples_seen_tensor, train_losses, val_losses)"
]
},
{
"cell_type": "markdown",
"id": "dbd28174-1836-44ba-b6c0-7e0be774fadc",
"metadata": {},
"source": [
"- Above, based on the downward slope, we see that the model learns well\n",
"- Furthermore, the fact that the training and validation loss are very close indicates that the model does not tend to overfit the training data\n",
"- Similarly, we can plot the accuracy below"
]
},
{
"cell_type": "code",
"execution_count": 36,
"id": "yz8BIsaF0TUo",
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
"height": 307
},
"id": "yz8BIsaF0TUo",
"outputId": "3a7ed967-1f2a-4c6d-f4a3-0cc8cc9d6c5f"
},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAeEAAAEiCAYAAADONmoUAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8pXeV/AAAACXBIWXMAAA9hAAAPYQGoP6dpAABdB0lEQVR4nO3deVhU1f/A8fcMOOyrIIIiouKuiBthbrmESyRmaWaJS/rTXDPTLPcWysosNU0tbXNPzW+4RLjvKyou5IKiCLjLomwz9/fH5OgIKoPoIHxezzPPM3Puued+5oh8uPeee45KURQFIYQQQjx1anMHIIQQQpRUkoSFEEIIM5EkLIQQQpiJJGEhhBDCTCQJCyGEEGYiSVgIIYQwE0nCQgghhJlIEhZCCCHMRJKwEEIIYSaShIUQeWrZsiXDhw83dxhCFGuShIV4Qnr16oVKpcr1ateunblDE0IUEZbmDkCI4qxdu3bMnz/fqMzKyspM0Qghiho5ExbiCbKysqJs2bJGLxcXFwA2bdqERqNh69athvpTpkyhTJkyJCcnA7Bu3TqaNm2Ks7MzpUuX5qWXXuL06dOG+mfPnkWlUrF06VKaNWuGjY0NjRo14t9//2Xv3r00bNgQe3t72rdvz+XLlw379erVi9DQUCZNmoS7uzuOjo4MGDCArKysB36XzMxMRo4cSbly5bCzsyMwMJBNmzYZtp87d46QkBBcXFyws7OjVq1arFmz5oHtff/99/j5+WFtbY2HhwevvvqqYZtOpyM8PBxfX19sbGzw9/dn+fLlRvvHxMTQvn177O3t8fDw4K233uLKlSuG7S1btmTo0KGMGjUKV1dXypYty8SJEx8YjxDmIElYCDO5c8/1rbfe4ubNmxw8eJBx48Yxb948PDw8AEhPT2fEiBHs27ePqKgo1Go1nTt3RqfTGbU1YcIExo4dy4EDB7C0tOSNN95g1KhRfPvtt2zdupVTp04xfvx4o32ioqI4fvw4mzZtYtGiRaxYsYJJkyY9MN7Bgwezc+dOFi9ezOHDh3nttddo164dJ0+eBGDQoEFkZmayZcsWjhw5whdffIG9vX2ebe3bt4+hQ4cyefJkYmNjWbduHc2bNzdsDw8P55dffmH27NkcPXqUd999lzfffJPNmzcDcOPGDVq1akVAQAD79u1j3bp1JCcn07VrV6Pj/Pzzz9jZ2bF7926mTJnC5MmTiYyMzOe/kBBPgSKEeCLCwsIUCwsLxc7Ozuj16aefGupkZmYq9erVU7p27arUrFlT6dev30PbvHz5sgIoR44cURRFUeLi4hRAmTdvnqHOokWLFECJiooylIWHhyvVqlUzis3V1VVJT083lM2aNUuxt7dXtFqtoiiK0qJFC2XYsGGKoijKuXPnFAsLCyUhIcEontatWytjxoxRFEVR6tSpo0ycODFfffPHH38ojo6OSkpKSq5tGRkZiq2trbJjxw6j8r59+yrdu3dXFEVRPv74Y+XFF1802n7+/HkFUGJjYw3xN23a1KhOo0aNlNGjR+crRiGeBrknLMQT9MILLzBr1iyjMldXV8N7jUbD77//Tt26dfHx8eGbb74xqnvy5EnGjx/P7t27uXLliuEMOD4+ntq1axvq1a1b1/D+zll0nTp1jMouXbpk1La/vz+2traGz0FBQaSlpXH+/Hl8fHyM6h45cgStVkvVqlWNyjMzMyldujQAQ4cOZeDAgfz999+0adOGLl26GMV1r7Zt2+Lj40OlSpVo164d7dq1o3Pnztja2nLq1Clu3bpF27ZtjfbJysoiICAAgEOHDrFx48Y8z7RPnz5tiPP+43t6eubqByHMSZKwEE+QnZ0dVapUeWidHTt2AHDt2jWuXbuGnZ2dYVtISAg+Pj7MnTsXLy8vdDodtWvXznXvtlSpUob3KpUqz7L7L2GbIi0tDQsLC/bv34+FhYXRtjuJ8O233yY4OJiIiAj+/vtvwsPD+frrrxkyZEiu9hwcHDhw4ACbNm3i77//Zvz48UycOJG9e/eSlpYGQEREBOXKlTPa786gtrS0NEJCQvjiiy9yte3p6Wl4f28fwOP3gxCFTZKwEGZ0+vRp3n33XebOncuSJUsICwvjn3/+Qa1Wc/XqVWJjY5k7dy7NmjUDYNu2bYV27EOHDnH79m1sbGwA2LVrF/b29nh7e+eqGxAQgFar5dKlS4ZY8uLt7c2AAQMYMGAAY8aMYe7cuXkmYQBLS0vatGlDmzZtmDBhAs7OzmzYsIG2bdtiZWVFfHw8LVq0yHPf+vXr88cff1CxYkUsLeXXmHh2yU+vEE9QZmYmSUlJRmWWlpa4ubmh1Wp58803CQ4Opnfv3rRr1446derw9ddf8/777+Pi4kLp0qWZM2cOnp6exMfH88EHHxRabFlZWfTt25exY8dy9uxZJkyYwODBg1Grc4/XrFq1Kj169KBnz558/fXXBAQEcPnyZaKioqhbty4dO3Zk+PDhtG/fnqpVq3L9+nU2btxIjRo18jz2X3/9xZkzZ2jevDkuLi6sWbMGnU5HtWrVcHBwYOTIkbz77rvodDqaNm3KzZs32b59O46OjoSFhTFo0CDmzp1L9+7dDaOfT506xeLFi5k3b16us3UhiipJwkI8QevWrTO6PApQrVo1Tpw4waeffsq5c+f466+/AP1l1Dlz5tC9e3defPFF/P39Wbx4MUOHDqV27dpUq1aN7777jpYtWxZKbK1bt8bPz4/mzZuTmZlJ9+7dH/oIz/z58/nkk0947733SEhIwM3Njeeee46XXnoJAK1Wy6BBg7hw4QKOjo60a9cu1z3uO5ydnVmxYgUTJ04kIyMDPz8/Fi1aRK1atQD4+OOPcXd3Jzw8nDNnzuDs7Ez9+vX58MMPAfDy8mL79u2MHj2aF198kczMTHx8fGjXrl2ef0QIUVSpFEVRzB2EEOLp6tWrFzdu3GDVqlXmDkWIEk3+ZBRCCCHMRJKwEEIIYSZyOVoIIYQwEzkTFkIIIcxEkrAQQghhJpKEhRBCCDORJFxAM2fOpGLFilhbWxMYGMiePXvMHdITsWXLFkJCQvDy8kKlUuV6pEVRFMaPH4+npyc2Nja0adPGsKrOHdeuXaNHjx44Ojri7OxM3759DVMT3nH48GGaNWuGtbU13t7eTJky5Ul/tccWHh5Oo0aNcHBwoEyZMoSGhhIbG2tUJyMjg0GDBlG6dGns7e3p0qWLYZnCO+Lj4+nYsSO2traUKVOG999/n5ycHKM6mzZton79+lhZWVGlShUWLFjwpL/eY5k1axZ169bF0dERR0dHgoKCWLt2rWF7Se2XB/n8889RqVQMHz7cUFaS+2jixImoVCqjV/Xq1Q3bi1XfmHX5iGfU4sWLFY1Go/z000/K0aNHlX79+inOzs5KcnKyuUMrdGvWrFE++ugjZcWKFQqgrFy50mj7559/rjg5OSmrVq1SDh06pLz88suKr6+vcvv2bUOddu3aKf7+/squXbuUrVu3KlWqVDGshqMoinLz5k3Fw8ND6dGjhxITE6MsWrRIsbGxUX744Yen9
TULJDg4WJk/f74SExOjREdHKx06dFAqVKigpKWlGeoMGDBA8fb2VqKiopR9+/Ypzz33nNKkSRPD9pycHKV27dpKmzZtlIMHDypr1qxR3NzcDCsTKYqinDlzRrG1tVVGjBihHDt2TJk+fbpiYWGhrFu37ql+X1OsXr1aiYiIUP79918lNjZW+fDDD5VSpUopMTExiqKU3H7Jy549e5SKFSsqdevWNaxapSglu48mTJig1KpVS0lMTDS8Ll++bNhenPpGknABNG7cWBk0aJDhs1arVby8vJTw8HAzRvXk3Z+EdTqdUrZsWeXLL780lN24cUOxsrJSFi1apCiKohw7dkwBlL179xrqrF27VlGpVIZl8b7//nvFxcVFyczMNNQZPXq00dJ7z4JLly4pgLJ582ZFUfR9UapUKWXZsmWGOsePH1cAZefOnYqi6P/IUavVSlJSkqHOrFmzFEdHR0N/jBo1SqlVq5bRsbp166YEBwc/6a9UqFxcXJR58+ZJv9wjNTVV8fPzUyIjI42WjizpfTRhwgTF398/z23FrW/kcrSJsrKy2L9/P23atDGUqdVq2rRpw86dO80Y2dMXFxdHUlKSUV84OTkRGBho6IudO3fi7OxMw4YNDXXatGmDWq1m9+7dhjrNmzdHo9EY6gQHBxMbG8v169ef0rd5fDdv3gTuLlW4f/9+srOzjfqnevXqVKhQwah/6tSpY1h+EPTfPSUlhaNHjxrq3NvGnTrPys+bVqtl8eLFpKenExQUJP1yj0GDBtGxY8dc30P6SL+Mp5eXF5UqVaJHjx7Ex8cDxa9vJAmb6MqVK2i1WqN/XNCv13r/RP3F3Z3v+7C+SEpKokyZMkbbLS0tcXV1NaqTVxv3HqOo0+l0DB8+nOeff96wzm9SUhIajQZnZ2ejuvf3z6O++4PqpKSkcPv27SfxdQrFkSNHsLe3x8rKigEDBrBy5Upq1qxZ4vvljsWLF3PgwAHCw8NzbSvpfRQYGMiCBQtYt24ds2bNIi4ujmbNmpGamlrs+kYWcBCiEAwaNIiYmJhCXWrwWVetWjWio6O5efMmy5cvJywsjM2bN5s7rCLh/PnzDBs2jMjISKytrc0dTpHTvn17w/u6desSGBiIj48PS5cuNSy9WVzImbCJ3NzcsLCwyDUSLzk5mbJly5opKvO4830f1hdly5bl0qVLRttzcnK4du2aUZ282rj3GEXZ4MGD+euvv9i4cSPly5c3lJctW5asrCxu3LhhVP/+/nnUd39QHUdHxyL9C0mj0VClShUaNGhAeHg4/v7+fPvttyW+X0B/SfXSpUvUr18fS0tLLC0t2bx5M9999x2WlpZ4eHiU+D66l7OzM1WrVuXUqVPF7udHkrCJNBoNDRo0ICoqylCm0+mIiooiKCjIjJE9fb6+vpQtW9aoL1JSUti9e7ehL4KCgrhx4wb79+831NmwYQM6nY7AwEBDnS1btpCdnW2oExkZSbVq1XBxcXlK38Z0iqIwePBgVq5cyYYNG/D19TXa3qBBA0qVKmXUP7GxscTHxxv1z5EjR4z+UImMjMTR0ZGaNWsa6tzbxp06z9rPm06nIzMzU/oF/TKSR44cITo62vBq2LAhPXr0MLwv6X10r7S0NE6fPo2np2fx+/l5qsPAionFixcrVlZWyoIFC5Rjx44p/fv3V5ydnY1G4hUXqampysGDB5WDBw8qgDJ16lTl4MGDyrlz5xRF0T+i5OzsrPz555/K4cOHlU6dOuX5iFJAQICye/duZdu2bYqfn5/RI0o3btxQPDw8lLfeekuJiYlRFi9erNja2hb5R5QGDhyoODk5KZs2bTJ6lOLWrVuGOgMGDFAqVKigbNiwQdm3b58SFBSkBAUFGbbfeZTixRdfVKKjo5V169Yp7u7ueT5K8f777yvHjx9XZs6cWeQfM/nggw+UzZs3K3Fxccrhw4eVDz74QFGpVMrff/+tKErJ7ZeHuXd0tKKU7D567733lE2bNilxcXHK9u3blTZt2ihubm7KpUuXFEUpXn0jSbiApk+frlSoUEHRaDRK48aNlV27dpk7pCdi48aNCpDrFRYWpiiK/jGlcePGKR4eHoqVlZXSunVrJTY21qiNq1evKt27d1fs7e0VR0dHpXfv3kpqaqpRnUOHDilNmzZVrKyslHLlyimff/750/qKBZZXvwDK/PnzDXVu376tvPPOO4qLi4tia2urdO7cWUlMTDRq5+zZs0r79u0VGxsbxc3NTXnvvfeU7OxsozobN25U6tWrp2g0GqVSpUpGxyiK+vTpo/j4+CgajUZxd3dXWrdubUjAilJy++Vh7k/CJbmPunXrpnh6eioajUYpV66c0q1bN+XUqVOG7cWpb2QVJSGEEMJM5J6wEEIIYSaShIUQQggzkSQshBBCmIkkYSGEEMJMJAkLIYQQZiJJWAghhDATScKPITMzk4kTJ5KZmWnuUIok6Z8Hk755OOmfh5P+ebBnrW/kOeHHkJKSgpOTEzdv3sTR0dHc4RQ50j8PJn3zcNI/Dyf982DPWt/ImbAQQghhJpKEhRBCCDMpcesJ5+TkcPDgQTw8PFCrH+9vkNTUVAASEhJISUkpjPCKFemfB5O+eTjpn4eT/nmwotA3Op2O5ORkAgICsLR8eJotcfeE9+7dS+PGjc0dhhBCiGJuz549NGrU6KF1StyZsIeHB6DvHE9PTzNHI4QQorhJTEykcePGhnzzMCUuCd+5BO3p6Un58uXNHI0QQojiKj+3PGVglhBCCGEmZk3CW7ZsISQkBC8vL1QqFatWrXrkPps2baJ+/fpYWVlRpUoVFixY8MTjFEIIIZ4Esybh9PR0/P39mTlzZr7qx8XF0bFjR1544QWio6MZPnw4b7/9NuvXr3/CkQohhBCFz6z3hNu3b0/79u3zXX/27Nn4+vry9ddfA1CjRg22bdvGN998Q3BwcKHGptVqyc7OLtQ2hSgKNBrNYz+eJ4QoHM/UwKydO3fSpk0bo7Lg4GCGDx9eaMdQFIWkpCRu3LhRaG0KUZSo1Wp8fX3RaDTmDkU8QEa2ln1nr5Ot1Zk7lBLH3cGK2uWcntrxnqkknJSUlGvIt4eHBykpKdy+fRsbG5tc+2RmZhpN5H3nQe6HHePGjRuUKVMGW1tbVCpV4QQvRBGg0+m4ePEiiYmJVKhQQX6+i6ANJ5KZsPoo56/dNncoJdJLdT2Z8Ub9p3a8ZyoJF0R4eDiTJk3KV12tVmtIwKVLl37CkQlhHu7u7ly8eJGcnBxKlSpl7nDEfy5cv8Wk/x0j8lgyAG72Grycc59YiCergqvtUz3eM5WEy5YtS3JyslFZcnIyjo6OeZ4FA4wZM4YRI0YYPickJFCzZs086965B2xr+3T/EYR4mu5chtZqtZKEi4DMHC3ztsYxfcNJMrJ1WKpV9G3qy9DWfthZPVO/okUBPFP/wkFBQaxZs8aoLDIykqCgoAfuY2VlhZWVleFzfuYSlUt0ojiTn++iY/upK4z7M4Yzl9MBCPR15ePQ2lT1cDBzZOJpMWsSTktL49SpU4bPcXFxREdH4+rqSoUK
FRgzZgwJCQn88ssvAAwYMIAZM2YwatQo+vTpw4YNG1i6dCkRERHm+gpCCGGy5JQMPv7rGH8dTgTAzd6KsR1r0Kmel/yRVMKY9TmFffv2ERAQQEBAAAAjRowgICCA8ePHA/r5N+Pj4w31fX19iYiIIDIyEn9/f77++mvmzZtX6I8nCb2KFSsybdq0fNfftGkTKpVKRpYL8QA5Wh3ztp6h9deb+etwImoV9GpSkaj3WhAaUE4ScAlk1jPhli1b8rBFnPKaDatly5YcPHjwCUb17HnUf9wJEyYwceJEk9vdu3cvdnZ2+a7fpEkTEhMTcXJ6esP7hXhW7D17jXGrYjiRpH9CI6CCMx93qv1UH4cRRc8zdU9Y5C0xMdHwfsmSJYwfP57Y2FhDmb29veG9oihotdpHrnEJ+lG0ptBoNJQtW9akfYqLrKwsee5W5OlKWibha07wx4ELALjYluKD9tV5rYE3arWc+ZZ0Mm1OMVC2bFnDy8nJCZVKZfh84sQJHBwcWLt2LQ0aNMDKyopt27Zx+vRpOnXqhIeHB/b29jRq1Ih//vnHqN37L0erVCrmzZtH586dsbW1xc/Pj9WrVxu23385esGCBTg7O7N+/Xpq1KiBvb097dq1M/qjIScnh6FDh+Ls7Ezp0qUZPXo0YWFhhIaGPvD7Xr16le7du1OuXDlsbW2pU6cOixYtMqqj0+mYMmUKVapUwcrKigoVKvDpp58atl+4cIHu3bvj6uqKnZ0dDRs2ZPfu3QD06tUr1/GHDx9Oy5YtDZ9btmzJ4MGDGT58OG5uboZbIlOnTqVOnTrY2dnh7e3NO++8Q1pamlFb27dvp2XLltja2uLi4kJwcDDXr1/nl19+oXTp0kbPtQOEhoby1ltvPbA/RNGk1Sn8uuscrb7aZEjA3Rt7s+G9lnRrVEESsAAkCT+Soijcysoxy+thl+pN9cEHH/D5559z/Phx6tatS1paGh06dCAqKoqDBw/Srl07QkJCjO7B52XSpEl07dqVw4cP06FDB3r06MG1a9ceWP/WrVt89dVX/Prrr2zZsoX4+HhGjhxp2P7FF1/w+++/M3/+fLZv305KSsojF/LIyMigQYMGREREEBMTQ//+/XnrrbfYs2ePoc6YMWP4/PPPGTduHMeOHWPhwoWGiV7S0tJo0aIFCQkJrF69mkOHDjFq1Ch0OtNmJ/r555/RaDRs376d2bNnA/rZqL777juOHj3Kzz//zIYNGxg1apRhn+joaFq3bk3NmjXZuXMn27ZtIyQkBK1Wy2uvvYZWqzX6w+bSpUtERETQp08fk2IT5nXo/A06f7+dcatiSMnIoZaXIyveaUL4K3VxsZMrJuIuuRz9CLeztdQcb54FIo5NDsZWUzj/RJMnT6Zt27aGz66urvj7+xs+f/zxx6xcuZLVq1czePDgB7bTq1cvunfvDsBnn33Gd999x549e2jXrl2e9bOzs5k9ezaVK1cGYPDgwUyePNmwffr06YwZM4bOnTsDMGPGjFyPod2vXLlyRol8yJAhrF+/nqVLl9K4cWNSU1P59ttvmTFjBmFhYQBUrlyZpk2bArBw4UIuX77M3r17cXV1BaBKlSoPPWZe/Pz8mDJlilHZvVOoVqxYkU8++YQBAwbw/fffAzBlyhQaNmxo+AxQq1Ytw/s33niD+fPn89prrwHw22+/UaFCBaOzcFF03biVxZfrY1m4Jx5FAQdrS0a+WI03n/PBQs58RR4kCZcQDRs2NPqclpbGxIkTiYiIIDExkZycHG7fvv3IM+G6desa3tvZ2eHo6MilS5ceWN/W1taQgAE8PT0N9W/evElycjKNGzc2bLewsKBBgwYPPSvVarV89tlnLF26lISEBLKyssjMzDRMsnL8+HEyMzNp3bp1nvtHR0cTEBBgSMAF1aBBg1xl//zzD+Hh4Zw4cYKUlBRycnLIyMjg1q1b2NraEh0dbUiweenXrx+NGjUiISGBcuXKsWDBAnr16iWjZos4nU5h+YELfL72BNfSswB4JaAcYzrUwN3B6hF7i5JMkvAj2JSy4Nhk8zwCZVPKotDaun+U88iRI4mMjOSrr76iSpUq2NjY8Oqrr5KVlfXQdu6fYUmlUj00YeZV/3Evs3/55Zd8++23TJs2zXD/dfjw4YbYHzR72h2P2q5Wq3PFmNeKWvf36dmzZ3nppZcYOHAgn376Ka6urmzbto2+ffuSlZWFra3tI48dEBCAv78/v/zyCy+++CJHjx6V5+CLuGMXUxj3Zwz7z10HoKqHPR93qk1gJZn6VjyaJOFHUKlUhXZJuCjZvn07vXr1MlwGTktL4+zZs081BicnJzw8PNi7dy/NmzcH9Ge5Bw4coF69eg/cb/v27XTq1Ik333wT0A/C+vfffw3Tkfr5+WFjY0NUVBRvv/12rv3r1q3LvHnzuHbtWp5nw+7u7sTExBiVRUdHP3KKx/3796PT6fj6668NSwUuXbo017GjoqIeOp/522+/zbRp00hISKBNmzZ4e3s/9LjCPFIzsvkm8iQ/7zyLVqdgq7FgeBs/ej/vSymLxxxuo9PB9TjQ5rGcqlM5sPpvRq3bNyA1CTS24Fzhbp3L/4Ji4gpMDh5g46J/n5kGNy+ApRW4+t6tc/V03jE9jJ072P33B0n2bbh+DtSW4HbPLaDrZyE7w7R2bVz0MYM+pqunQaUC92p369w4D1np+W/T2gkcPU2L4zEVv+wi8sXPz48VK1YQEhKCSqVi3LhxJg9MKgxDhgwhPDycKlWqUL16daZPn87169cfevnVz8+P5cuXs2PHDlxcXJg6dSrJycmGJGxtbc3o0aMZNWoUGo2G559/nsuXL3P06FH69u1L9+7d+eyzzwgNDSU8PBxPT08OHjyIl5cXQUFBtGrVii+//JJffvmFoKAgfvvtN2JiYgyTyjxIlSpVyM7OZvr06YSEhBgN2LpjzJgx1KlTh3feeYcBAwag0WjYuHEjr732Gm5uboD+vvDIkSOZO3euYbY4UXQoisLqQxf5NOI4l1L1I9k71vFk7Es18HR6zAUXsjPgyFLYMQOuxOZdp/tiqPbfOuz/roOV/weVW8NbK+7WmfsCZKXlvf+DvDwd6vfUv4/fBb93AU9/+L8td+v89oo+YZqi9QRo9t/8/ZdPwJyW4FgORhy7W2d5X0jYZ1q7QYMh+L8nHtKS4ftAsLCCcffcHlszUt9H+RXwJnSaaVocj0mScAk1depU+vTpQ5MmTXBzc2P06NH5mle7sI0ePZqkpCR69uyJhYUF/fv3Jzg4GAuLB1+KHzt2LGfOnCE4OBhbW1v69+9PaGgoN2/eNNQZN24clpaWjB8/nosXL+Lp6cmAAQMA/fPMf//9N++99x4dOnQgJyeHmjVrMnOm/j9fcHAw48aNY9SoUWRkZNCnTx969uzJkSNHHvpd/P39mTp1Kl988QVjxoyhefPmhIeH07NnT0OdqlWr8vfff/Phhx/SuHFjbGxsCAwMNAx2A/0Vgi5duhAREfHQR7XE03fqUirj/zzKjtNXAfB1s2PSy7VoXtW0Z+pzuXUN9v0Iu+d
A+n9JxMIKrOxz17W454qMhQZsS4O1o3EdG1f9WawpLK3vadfyv3bvm0jExgUyH74cbC6l7vnDRP1fu3fOuO+wdtKXm9TuPQvtqNT6/S3u+85WDqa1q8mjv58wlVKYz8E8Ay5cuIC3tzfnz5+nfPnyRtsyMjKIi4vD19cXa2vrB7QgniSdTkeNGjXo2rUrH3/8sbnDMZvWrVtTq1Ytvvvuu0JvW37OTXcrK4fpG04xb+sZsrUKVpZqBr9Qhf4tKmFl+RhjN66fhZ3fw8FfIfuWvsyxPDw3UH9Wen9yFc+Eh+WZ+8mZsDCrc+fO8ffff9OiRQsyMzOZMWMGcXFxvPHGG+YOzSyuX7/Opk2b2LRpk9FjTMI8FEVh/dFkPv7rGAk3bgPQpkYZJoTUwrsw1p3dMQP2ztW/96gDzw+FWp2Nz3ZFsSZJWJiVWq1mwYIFjBw5EkVRqF27Nv/88w81atQwd2hmERAQwPXr1/niiy+oVq3ao3cQT8y5q+lMWH2UTbGXASjnbMPEl2vRtqZHwRrU6eBUpP5+aNna+rKgd/QDsIIGQ6WW+oFFokSRJCzMytvbm+3bt5s7jCLjaY9QF7llZGuZvfk03286TVaOjlIWKv6veWUGvVAFG81jXHre8DFsmwo1XoZuv+rLXCvBm38UTuDimSRJWAgh/rMx9hITVx/l3FX9/dmmVdyY1KkWld0LMGDn1jXIybz7yEvdrrD3R33iVRQ56xWAJGEhhODijdtM/t8x1h1NAsDD0YpxL9WkYx1P02cru34Wds2CA79CjRB45Qd9eZkaMDLWeLSwKPEkCQshSqysHB0/bovju6iT3M7WYqFW0ef5igxrUxV7KxN/PSYcgB3T4diquxNlXIkFbY7+kR+QBCxykSQshCiRdpy+wvg/j3Lqkn5Si8YVXZkcWovqZU14LOjOYKsd0+Hs1rvllVtBk6Ey2Eo8kiRhIUSJciklg0/XHOfP6IsAuNlrGNO+Bq/UL5f/S885mXB4KeycoZ8FCvQTUdR+FZoMuTv6WYhHkCQshCgRcrQ6ftl5jm8i/yU1MweVCt56zof3XqyGk00+n8u9fR32/QS7f9BPlQhg5QgNekHgAP28zkKY4DFnGRfFScuWLXOthztt2rSH7qNSqVi1atVjH7uw2hEiL/vPXSdkxnYm/3WM1Mwc/L2dWT2oKZM71c5/AgZY0R+iJusTsGM5ePETeDcGXvxYErAoEDkTLgZCQkLIzs5m3brcE5Vv3bqV5s2bc+jQIaO1gPNj7969uZbre1wTJ05k1apVREdHG5UnJibi4uKS905CFNDVtEy+WHeCpfsuAOBkU4rR7arzeiNv1Op8XHq+eBCcvMFOv7gGjfpBSqL+knPtV2RmK/HYJAkXA3379qVLly5cuHAh1zyl8+fPp2HDhiYnYNAv6fe0lC1b9qkdqyjJyspCo9GYO4xiR6dTWLQ3ninrYrl5W7/0XreG3oxuXx1Xu3z295pRsOcHaDEaXvhQX+bXVv+SwVaikMjl6GLgpZdewt3dnQULFhiVp6WlsWzZMvr27cvVq1fp3r075cqVw9bWljp16rBo0aKHtnv/5eiTJ0/SvHlzrK2tqVmzJpGRkbn2GT16NFWrVsXW1pZKlSoxbtw4srP1vwQXLFjApEmTOHToECqVCpVKZYj5/svRR44coVWrVtjY2FC6dGn69+9PWtrdpdl69epFaGgoX331FZ6enpQuXZpBgwYZjpWX06dP06lTJzw8PLC3t6dRo0b8888/RnUyMzMZPXo03t7eWFlZUaVKFX788UfD9qNHj/LSSy/h6OiIg4MDzZo14/Tp00Duy/kAoaGh9OrVy6hPP/74Y3r27ImjoyP9+/d/ZL/d8b///Y9GjRphbW2Nm5ubYS3oyZMnU7t27oFA9erVY9y4cQ/sj+LqyIWbdJ61g49WxnDzdjY1PB35Y2AQX7xa9+EJOCcTsm7d/ewTpB9slXF3dS5UKknAolDJmXB+mbIw9B0WVnefD9TmgDZTv+TWvc8KPqhdTf4vA1taWtKzZ08WLFjARx99ZBjhuWzZMrRaLd27dyctLY0GDRowevRoHB0diYiI4K233qJy5co0btz4kcfQ6XS88soreHh4sHv3bm7evJkr4QA4ODiwYMECvLy8OHLkCP369cPBwYFRo0bRrVs3YmJiWLdunSH5OTk55WojPT2d4OBggoKC2Lt3L5cuXeLtt99m8ODBRn9obNy4EU9PTzZu3MipU6fo1q0b9erVo1+/fnl+h7S0NDp06MCnn36KlZUVv/zyCyEhIcTGxlKhgn5B9J49e7Jz506+++47/P39iYuL48qVKwAkJCTQvHlzWrZsyYYNG3B0dGT79u3k5OQ8sv/u9dVXXzF+/HgmTJiQr34DiIiIoHPnznz00Uf88ssvZGVlsWbNGgD69OnDpEmT2Lt3L40aNQLg4MGDHD58mBUrVuQOoJi6eSubr/6O5bfd51AUsLey5L0Xq/LWcz5YWjzkfOPewVbPDYSm7+rLa7wMww7LvV7xZCklzPnz5xVAOX/+fK5tt2/fVo4dO6bcvn07944THE1/xay4u3/MCn3ZTx2M2/3CN+99TXT8+HEFUDZu3Ggoa9asmfLmm28+cJ+OHTsq7733nuFzixYtlGHDhhk++/j4KN98842iKIqyfv16xdLSUklISDBsX7t2rQIoK1eufOAxvvzyS6VBgwaGzxMmTFD8/f1z1bu3nTlz5iguLi5KWlqaYXtERISiVquVpKQkRVEUJSwsTPHx8VFycnIMdV577TWlW7duD4wlL7Vq1VKmT5+uKIqixMbGKoASGRmZZ90xY8Yovr6+SlZWVp7b7+8/RVGUTp06KWFhYYbPPj4+Smho6CPjur/fgoKClB49ejywfvv27ZWBAwcaPg8ZMkRp2bJlnnUf+nP+DNLpdMryfeeV+pP/VnxG/6X4jP5LGbrogJJ88xHf79pZRVkzWlE+8bz7/+6Hloqi0z2dwEWx9bA8cz85Ey4mqlevTpMmTfjpp59o2bIlp06dYuvWrUyePBkArVbLZ599xtKlS0lISCArK4vMzExsbfO3HNvx48fx9vbGy8vLUBYUFJSr3pIlS/juu+84ffo0aWlp5OTk4Oho2pqox48fx9/f32hQ2PPPP49OpyM2NhYPD/0qNrVq1cLC4u6E+p6enhw5cuSB7aalpTFx4kQiIiJITEwkJyeH27dvEx8fD0B0dDQWFha0aNEiz/2jo6Np1qwZpUo93mCchg0b5ip7VL9FR0c/8AwfoF+/fvTp04epU6eiVqtZuHAh33zzzWPF+Sw4kZTC+FVH2XP2GgBVytgzuVMtmlR2e/BOFw/qJ9c4ugoUrb6sTK3/lhF8RS43i6dKknB+fXjR9H0srO6+rx6ib0N132Wx4Q9OGqbq27cvQ4YMYebMmcyfP5/KlSsbEsqXX37Jt99+y7Rp06hTpw52dnYMHz6crKysQjv+zp076dGjB5MmTSI4OBgnJycWL17M119/XWjHuNf9yVClUqHT6R5Yf+TIkURGRvLVV19RpUoVbGxseP
XVVw19YGPz8CkFH7VdrVajKIpRWV73qO8fcZ6ffnvUsUNCQrCysmLlypVoNBqys7N59dVXH7rPsywtM4dpkf8yf8dZtDoFm1IWDGvjR5/nfdFY5nHpWaeDU//Aju+MZ7aq1FI/s1XlVpJ8hVlIEs4vE+7R5snC8u794cJs9x5du3Zl2LBhLFy4kF9++YWBAwca7g9v376dTp068eabbwL6e7z//vsvNWvWzFfbNWrU4Pz58yQmJuLpqV8VZteuXUZ1duzYgY+PDx999JGh7Ny5c0Z1NBoNWq32kcdasGAB6enphoS1fft21Gr1Y62xu337dnr16mUY0JSWlma0dGCdOnXQ6XRs3ryZNm3a5Nq/bt26/Pzzz2RnZ+d5Nuzu7k5iYqLhs1arJSYmhhdeeOGhceWn3+rWrUtUVBS9e/fOsw1LS0vCwsKYP38+Go2G119//ZGJ+1mkKAoRRxL5+K9jJKdkAtCuVlnGhdSknHMe3zcnE44s05/53pnZSmUBtbvoHzPyNP2pASEKk4yOLkbs7e3p1q0bY8aMITEx0WhUrp+fH5GRkezYsYPjx4/zf//3fyQnJ+e77TZt2lC1alXCwsI4dOgQW7duNUoad44RHx/P4sWLOX36NN999x0rV640qlOxYkXi4uKIjo7mypUrZGZm5jpWjx49sLa2JiwsjJiYGDZu3MiQIUN46623DJeiC8LPz48VK1YQHR3NoUOHeOONN4zOnCtWrEhYWBh9+vRh1apVxMXFsWnTJpYuXQrA4MGDSUlJ4fXXX2ffvn2cPHmSX3/9ldjYWABatWpFREQEERERnDhxgoEDB3Ljxo18xfWofpswYQKLFi1iwoQJHD9+nCNHjvDFF18Y1Xn77bfZsGED69ato0+fPgXup6Lq9OU03vpxD4MXHiQ5JROf0rYs6N2I2W81yDsBA/zUDv4cpE/AGnsIGgzDDkGXuZKARZEgSbiY6du3L9evXyc4ONjo/u3YsWOpX78+wcHBtGzZkrJlyxIaGprvdtVqNStXruT27ds0btyYt99+m08//dSozssvv8y7777L4MGDqVevHjt27Mj1iEyXLl1o164dL7zwAu7u7nk+JmVra8v69eu5du0ajRo14tVXX6V169bMmDHDtM64z9SpU3FxcaFJkyaEhIQQHBxM/fr1jerMmjWLV199lXfeeYfq1avTr18/0tP1I9hLly7Nhg0bSEtLo0WLFjRo0IC5c+cazor79OlDWFgYPXv2pEWLFlSqVOmRZ8GQv35r2bIly5YtY/Xq1dSrV49WrVqxZ88eozp+fn40adKE6tWrExgY+DhdVaTcztLy1fpY2k3bwrZTV9BYqhnexo/1w5vTsloZ48o34vVPItxRKxQcPKHtZHj3KAR/Cs7eTzV+IR5Gpdx/E6uYu3DhAt7e3pw/fz7XxBYZGRnExcXh6+uLtbW1mSIUomAURcHPz4933nmHESNGPLDes/RzHnksmYmrj5Jw4zYAL1RzZ+LLtfApncdtnDXvw94f9We5tbvoy7Jv6y8/W8qEKOLpeVieuZ/cExaiGLh8+TKLFy8mKSnpgfeNnyXnr91i4uqjRJ24BEA5ZxvGh9TkxZoed1c6unP+cOezbWn9aOfze+4mYVm/VxRxkoSFKAbKlCmDm5sbc+bMeabn4M7M0fLD5jPM3HiKzBwdpSxUvN2sEkNaVcFW89+vq5ws/WCrnTOg9QSo1k5f3rg/VGsPnv7m+wJCmEiSsBDFQHG4q7Tl38tMWH2UuCv6e/BNKpdmcqfaVCljr69w+wbsn6+f2Sr1v1Hoe+feTcK2rvqXEM8QScJCCLNKvHmbj/86xpojSQCUcbBi7Es1Canrqb/0fCMeds2GAz9D1n/zhzt46tfvbdDLfIELUQgkCQshzCJbq2P+9jim/XOSW1laLNQqwoIq8m5bPxysS0HiIf3zvTErjGe2ajJEf89XBluJYkCScB4eNuuSEM+6onDpeteZq4z/M4Z/k/Vntg19XJjcqTY1PR3gVJR+Zqu4zXd3qNRSn3wrt5aZrUSxIkn4HhqNBrVazcWLF3F3d0ej0dwdiSlEMaAoCpcvX0alUj32HNgFcSk1g/A1J1h5MAEAVzsNY9pXp0v98qgVLcxpCYnR+sqGma0Gy2ArUWxJEr6HWq3G19eXxMRELl4swFzRQjwDVCoV5cuXN1r84knT6hR+23WOr9bHkpqZg0oFbzSuwPsvlMfZ+c5obksoUxOuntLf6w0cIBNriGJPkvB9NBoNFSpUICcn55FzHAvxLCpVqtRTTcAH4q8zblUMRy+mAFCnnBOfdKqF/4mv4fsF0GcdlK2tr9xmArQLBxvnpxafEOYkSTgPdy7VmeNynRDFxfX0LKasP8GiPecBcLS25P121XmjcQUs1CrYFQ9ZqRCz/G4SdihrxoiFePokCQshCpVOp7B033m+WHeC67eyAYUPqyXSi/+h8fsG1P+Ns2jxAQT0hCqtzRqvEOYkSVgIUWhiEm4y7s8YDsbfoBQ5DHY9wDuatdie0680xc6Z8NJU/XuPmvqXECWYJGEhxGNLychm6t//8svOs9gr6QzRbGSATSR2ty7DLfTLCNYPg+cGmjtUIYoUScJCiAJTFIVV0Ql8GnECTVoCYyzX8aZmEza6W5CJ8cxWMthKiFwkCQshCuTf5FTGrYoh7ewBPrKM4GXrnVigAx36R42aDIHar8rMVkI8hNrcAcycOZOKFStibW1NYGBgroXK75Wdnc3kyZOpXLky1tbW+Pv7s27duqcYrRAiPTOH8DXH6frteoZceI8Iqw/pbLFdn4B9W0CPP2DgDqj3hiRgIR7BrGfCS5YsYcSIEcyePZvAwECmTZtGcHAwsbGxlClTJlf9sWPH8ttvvzF37lyqV6/O+vXr6dy5Mzt27CAgIMAM30CIkkNRFNYeSeTjiOMk3swArCnvkIOSZYGq9isQNBi86pk7TCGeKSrFjBPJBgYG0qhRI2bMmAHo52z29vZmyJAhfPDBB7nqe3l58dFHHzFo0CBDWZcuXbCxseG3337L1zEvXLiAt7c358+fp3z58oXzRYQo5uKSr7Nr4Sc0vL6WLlkTcXJ1Y9LLtWjlmKhfPtC5grlDFKLIMCXPmHwmXLFiRfr06UOvXr2oUKHg//GysrLYv38/Y8aMMZSp1WratGnDzp0789wnMzMTa2trozIbGxu2bdv2wONkZmaSmZlp+JyamlrgmIUoUXKyuJim5adtcfyy8yz/s1iHnzqBb2scI+iNcViXsgA8zB2lEM80k+8JDx8+nBUrVlCpUiXatm3L4sWLjZJcfl25cgWtVouHh/F/Yg8PD5KSkvLcJzg4mKlTp3Ly5El0Oh2RkZGsWLGCxMTEBx4nPDwcJycnw6tmTXkuUYgHSkmEffNJ/ekVMj7z4aUp/2PetjiytAoRZfpzufU3vNBjzH8JWAjxuAqUhKOjo9mzZw81atRgyJAheHp6MnjwYA4cOPAkYjT49ttv8fPzo3r16mg0GgYPHkzv3r1Rqx/8NcaMGcPNmzcNr2PHjj3RGIV4pigKJB2Bz
VNQ5rwAU6vDX8NxiI/CWneLxhwlqFJp5vduxLuDhuLerA9YWpk7aiGKjQIPzKpfvz7169fn66+/5vvvv2f06NHMmjWLOnXqMHToUHr37v3QZQDd3NywsLAgOTnZqDw5OZmyZfOeP9bd3Z1Vq1aRkZHB1atX8fLy4oMPPqBSpUoPPI6VlRVWVnd/aaSkpJj4TYUoZnIy4ew2iF2rf6VcAODO/9ZoXWWidA3IrNKOQa1bU8fb2WyhClHcFTgJZ2dns3LlSubPn09kZCTPPfccffv25cKFC3z44Yf8888/LFy48IH7azQaGjRoQFRUFKGhoYB+YFZUVBSDBw9+6LGtra0pV64c2dnZ/PHHH3Tt2rWgX0OIkuPoSv3rVBRkpRmKM9CwVVuHf3T12WnRgDaN/On9fEW8XW3NGKwQJYPJSfjAgQPMnz+fRYsWoVar6dmzJ9988w3Vq1c31OncuTONGjV6ZFsjRowgLCyMhg0b0rhxY6ZNm0Z6ejq9e/cGoGfPnpQrV47w8HAAdu/eTUJCAvXq1SMhIYGJEyei0+kYNWqUqV9DiOLv2hlwvecq0ZHlcOIvAFJLubEuy5+12QHs0NXCwcGR3s9X5MPGPjjZyuphQjwtJifhRo0a0bZtW2bNmkVoaGiey/35+vry+uuvP7Ktbt26cfnyZcaPH09SUhL16tVj3bp1hsFa8fHxRvd7MzIyGDt2LGfOnMHe3p4OHTrw66+/4uzsbOrXEKL40ubA7KZw+TgM3g9uVQCI9+nCicuuzEqqSnRGRRTU+JWxZ3LzSnSq54WVpQy2EuJpM/k54XPnzuHj4/Ok4nni5DlhUaxkpMCpf+DScWj10d3yn1+GcztQXpnLVk1T5m49w9aTVwybgyqVpn/zSrSo6o5a/eCxG0II0z3R54QvXbpEUlISgYGBRuW7d+/GwsKChg0bmtqkEMIUN+Ihdh3ErtEPsNJl68sb9QUH/aDGrPbfsC4um+//ucSJJP1UsBZqFR3qeNKvmS91yzubKXghxL1MTsKDBg1i1KhRuZJwQkICX3zxBbt37y604IQQgE4HFw/Cv/+NZk6OMd5eugpUaw+KQkpGNov3xPPTtrMkpWQAYKuxoFsjb/o87yuDrYQoYkxOwseOHaN+/fq5ygMCAuQZXCEKS9YtiNusT7r/roO0ex7lU6mhQhBUbadPvm5+XLxxmwXbzrJw92HSMnMAcHewoleTirwZKIOthCiqTE7CVlZWJCcn53o2NzExEUtLWRlRiMemKDCjkeH5XQA0DlClNVTrAH5t9fM1A8cupjB3STT/O3SRHJ1+eIdfGXv6yWArIZ4JJmfNF198kTFjxvDnn3/i5OQEwI0bN/jwww9p27ZtoQcoRLGWkQJ7foAL+6H7IlCp9K+KTeHcdv2ZbrX24NPUsCygoihsO3mZOVuMB1s9V8mV/2teWQZbCfEMMTkJf/XVVzRv3hwfHx/D8oHR0dF4eHjw66+/FnqAQhQrOVn6M9w7z+9aaGDrVMi+BUmHwdNfX97xa9DY6RPyf7K1Ov536CJztpzhRJJ+IRK1CjrW9ZLBVkI8o0xOwuXKlePw4cP8/vvvHDp0CBsbG3r37k337t3zfGZYiBLv1jU4GakfWHUqChy9YNB/AxhLWUPzkWBbGpy87+5jZW94m5qRzaI98czffva/dXxlsJUQxUWBbuLa2dnRv3//wo5FiOLjyqm7o5njd4GivbvtlrU+Mf93X5dm7+XZROLN28zffpZFu+NJvW+wVY/ACjjbap70txBCPGEFHkl17Ngx4uPjycrKMip/+eWXHzsoIZ452hy4sOfuoghXTxpvL1Prv/u7HcArAB6y8texiynM23qG1fcMtqpSxp7+zSrRKUAGWwlRnJichM+cOUPnzp05cuQIKpWKOxNu3VkxSavVPmx3IYqXzFSIGAkn/4bb1+6Wq0vpB1dVa69/lMjl4bPMKYrCtlNXcg22CvR15f9aVKJl1TIy2EqIYsjkJDxs2DB8fX2JiorC19eXPXv2cPXqVd577z2++uqrJxGjEEXHjfNw9RRUfkH/WWMPZ7fqE7C1M1QN1ifeyq3B2vGRzWVrdfx1+CJztsRxPFG/zKZaxX8zW1XCX5YRFKJYMzkJ79y5kw0bNuDm5oZarUatVtO0aVPCw8MZOnQoBw8efBJxCmF+F/bDvFZg4wrvnwK1hX70crvP9QOrvAPBIn//pVIzslm85zw/bY8zDLayKaUfbNW3qQy2EqKkMDkJa7VaHBwcAHBzc+PixYtUq1YNHx8fYmNjCz1AIZ667NtwZrN+YJWDJ7T8QF/u6Q+2buDmB+mXDfM0UzP/4yASb95mwfazLLxnsJWbvRW9n5fBVkKURCYn4dq1a3Po0CF8fX0JDAxkypQpaDQa5syZk2sWLSGeGWmX9NNDxq6F0xsh57a+3MkbWozWn/FaWMLwI6Ax/Sz1eGIKc7eeYXX03cFWld3t6N+8Ep3qlcO6lAy2EqIkMjkJjx07lvT0dAAmT57MSy+9RLNmzShdujRLliwp9ACFeCIUBS4duzuaOWE/cM+qno7l785WpSh3J80wIQErisL2U1eZs/UMW/69bCgP9HWlf/NKvFBNBlsJUdKZnISDg4MN76tUqcKJEye4du0aLi4uhhHSQhRJOVn6qSD//W8ZwBvxxtu9AvSPEFVrDx61jWarMkW2VkfE4UTmbDnDsXsGW7Wv40l/GWwlhLiHSUk4OzsbGxsboqOjqV27tqHc1dW10AMTolDodHefyb15Hn4NvbvN0hp8W9x9jMjR87EOlZqRzZK95/lpWxwXZbCVECIfTErCpUqVokKFCvIssCj6zu+BqMlg5wavLdCXla6sXwjBtaL+jLdSS/38zI8p6WYG87fHyWArIYTJTL4c/dFHH/Hhhx/y66+/yhmwKBp0WriwV//Mbtn/rtBYlNI/v1vKTn8Z+r8ViOgdUWiHPZGUwpwtMthKCFFwJifhGTNmcOrUKby8vPDx8cHOzvhM4sCBA4UWnBAPlJkKpzdA7Do4uR5uXQX/N6DzLP12z3rQcap+DV7LwjsTlcFWQojCZHISDg0NfQJhCJFPx/6EA79A3BbQ3jNvubUTWDnc/axSQaO+hXbYhw226tesEvVksJUQogBMTsITJkx4EnEI8XDZt2HN+3DwnjWrXXzvjmau8Jz+EnQhe9hgqz7P+1KhtAy2EkIUXIFXURLiqblyCpaFQXIMoIImQyDgTXCrWuDHiB4l6WYG83f8N9gq4+5gq15NfOgR6IOLnQy2EkI8PpOTsFqtfujzwDJyWhSqmBWweihkpYKdO3SZpx/V/IScSEph7pY4Vh9KIFt7d7BVv2aVCA2QwVZCiMJlchJeuXKl0efs7GwOHjzIzz//zKRJkwotMCHYNRvWjda/93keuvz42M/y5kVRFHacvsqcLWfYfM9gq8a+rvRvVolW1WWwlRDiyTA5CXfq1ClX2auvvkqtWrVYsmQJffsW3mAYUcLVeAm2TIH6PeGFsfleoSi/srU61hzRD7Y6evGewVa1PXm7mS8BFVwK9XhCCHG/
Qvut9txzz9G/f//Cak6UVJdjwb2a/r1TeRi8D2wL93n0tMwcFu+JZ/72syTc0C/UYFPKgq4Ny9OnqS8+pR9/Ag8hhMiPQknCt2/f5rvvvqNcuXKF0ZwoiRQFIsfDjunw+kKo3kFfXogJODklg5+23z/YSkNYUEXefE4GWwkhnj6Tk/D9CzUoikJqaiq2trb89ttvhRqcKEFUKtDlAApcPHg3CReC2KRU5m49w5/RdwdbVfpvsFVnGWwlhDAjk5PwN998Y5SE1Wo17u7uBAYG4uIi99CEibQ5d+/1tpkEfm2hcqvHblZRFHaevsoP9w+2qqif2UoGWwkhigKTk3CvXr2eQBiixNFpYVM4nNsJPf/UJ2JLzWMn4DuDreZuPUNMwt3BVu1ql6Vfs0oy2EoIUaSYnITnz5+Pvb09r732mlH5smXLuHXrFmFhYYUWnCimUpPhj776BRYA/l0LNUIeq8m8BltZl1LTraG3DLYSQhRZJifh8PBwfvjhh1zlZcqUoX///pKExcPFbYHlfSH9kn6Fo5BvHysBJ6dkMH/7WX7ffU4GWwkhnjkmJ+H4+Hh8fX1zlfv4+BAfH18oQYliSKeDbV/Dxs9A0YF7Dej6C7hXLVBzMthKCFEcmJyEy5Qpw+HDh6lYsaJR+aFDhyhdunRhxSWKk/SrsKIfnI7Sf67XAzp8BRrTFz/Yf+460zecZFOs8WCrfs0r0VoGWwkhnjEmJ+Hu3bszdOhQHBwcaN68OQCbN29m2LBhvP7664UeoHjGxe+G5b0hJQEsbaDjV/rFF0yk0yl8v+kUUyP/RafIYCshRPFgchL++OOPOXv2LK1bt8bSUr+7TqejZ8+efPbZZ4UeoHhGKQrsnAH/TNQ//1vaD7r+DB61TG7qWnoWw5dEs+W/R41C63nxbtuqMthKCPHMMzkJazQalixZwieffEJ0dDQ2NjbUqVMHHx+fJxGfeBbdvgGr3oHYCP3n2l30A7CsHExuat/ZawxeeJCklAysS6mZ3Kk2XRt6F268QghhJgWettLPzw8/P7/CjEUUF2oLuBILFhpo9zk07GPyur+KojBvaxyfrzuBVqdQyd2O73vUp3pZxycUtBBCPH0mJ+EuXbrQuHFjRo8ebVQ+ZcoU9u7dy7JlywotOPEMUfQjlFGp9Ge8XX8FbRZ41TO5qZu3shm5/BCRx5IBeNnfi89eqYO9VeGuoiSEEOamNnWHLVu20KFD7nl927dvz5YtWwolKPGMyUjRD77aNetumUfNAiXgwxdu0HH6ViKPJaOxUPNJaG2+fb2eJGAhRLFk8m+2tLQ0NJrcEyCUKlWKlJSUQglKPGOO/w+OroTYdVC3K9i5mdyEoij8uuscn/x1nCytjgqutnzfoz61yzk9gYCFEKJoMPlMuE6dOixZsiRX+eLFi6lZs2ahBCWeMfXegMCBELa6QAk4NSObwYsOMv7Po2RpdQTX8uB/Q5pKAhZCFHsmnwmPGzeOV155hdOnT9OqlX6y/aioKBYuXMjy5csLPUBRBGWlw6bPoflIsHbS3wdu/3mBmjp2MYVBCw8QdyUdS7WKMR1q0Of5ikYrdQkhRHFlchIOCQlh1apVfPbZZyxfvhwbGxv8/f3ZsGEDrq6FtwC7KKIux8LSMLh8HG7E65/9LQBFUVi67zzj/zxKZo4OLydrZvSoT32ZeEMIUYKYfDkaoGPHjmzfvp309HTOnDlD165dGTlyJP7+/ia3NXPmTCpWrIi1tTWBgYHs2bPnofWnTZtGtWrVsLGxwdvbm3fffZeMjIyCfA1hqsNLYc4L+gRs7wGN+xWomVtZOby37BCj/zhCZo6OF6q5EzG0mSRgIUSJU+Ahp1u2bOHHH3/kjz/+wMvLi1deeYWZM2ea1MaSJUsYMWIEs2fPJjAwkGnTphEcHExsbCxlypTJVX/hwoV88MEH/PTTTzRp0oR///2XXr16oVKpmDp1akG/iniU7AxYNxr2L9B/9m0BXeaBfe5/o0c5dSmVgb8d4OSlNNQqGBlcjQHNK8ucz0KIEsmkJJyUlMSCBQv48ccfSUlJoWvXrmRmZrJq1aoCDcqaOnUq/fr1o3fv3gDMnj2biIgIfvrpJz744INc9Xfs2MHzzz/PG2+8AUDFihXp3r07u3fvNvnYIp+unoZlYZB0BFBBi1HQYrR+Qg4TrTqYwIcrj3ArS0sZByu+6x7Ac5Vk0Q8hRMmV78vRISEhVKtWjcOHDzNt2jQuXrzI9OnTC3zgrKws9u/fT5s2be4Go1bTpk0bdu7cmec+TZo0Yf/+/YZL1mfOnGHNmjV5PrcsCsGxP2FOS30Cti0Nb/4BL3xocgLOyNYyZsURhi+J5laWluerlCZiaDNJwEKIEi/fZ8Jr165l6NChDBw4sFCmq7xy5QparRYPDw+jcg8PD06cOJHnPm+88QZXrlyhadOmKIpCTk4OAwYM4MMPP3zgcTIzM8nMzDR8Tk1NfezYi72cLIgcD7v/m3yjQhC8+hM4epnc1Nkr6bzz+wGOJaagUsHQVn4Mbe2HhVx+FkKI/J8Jb9u2jdTUVBo0aEBgYCAzZszgypUrTzK2XDZt2sRnn33G999/z4EDB1ixYgURERF8/PHHD9wnPDwcJycnw0ueZX6EG/Ewv93dBPz8MAj7X4ES8Nojibw0fRvHElMobafhlz6NebdtVUnAQgjxH5Wi3Jn0N3/S09NZsmQJP/30E3v27EGr1TJ16lT69OmDg0P+V8nJysrC1taW5cuXExoaaigPCwvjxo0b/Pnnn7n2adasGc899xxffvmloey3336jf//+pKWloVbn/pvi/jPhhIQEatasyfnz5ylfvny+4y0xlrwFx1eDtTN0ng3V2pvcRFaOjvC1x5m//SwAjSq6ML17fco6WRdurEIIUQRduHABb2/vfOUZkx9RsrOzo0+fPmzbto0jR47w3nvv8fnnn1OmTBlefvnlfLej0Who0KABUVFRhjKdTkdUVBRBQUF57nPr1q1cidbCQn9/8kF/S1hZWeHo6Gh4mfKHQonU4Suo1gH+b0uBEvCF67d47YedhgQ8oEVlFvV7ThKwEELkoUDPCd9RrVo1pkyZwoULF1i0aJHJ+48YMYK5c+fy888/c/z4cQYOHEh6erphtHTPnj0ZM2aMoX5ISAizZs1i8eLFxMXFERkZybhx4wgJCTEkY2GilETY/cPdzw4e0H0RuJi+PnTU8WQ6freNQ+dv4GRTih/DGvJB++pYWjzWj5kQQhRbhbI0jYWFBaGhoUaXlfOjW7duXL58mfHjx5OUlES9evVYt26dYbBWfHy80Znv2LFjUalUjB07loSEBNzd3QkJCeHTTz8tjK9R8mTchB+aQ/ol/ejnOq8WqJkcrY4v/47lh81nAPD3dmbmGwGUd7EtzGiFEKLYMfme8LPOlGv1JULUx/Dvev30k6Urm7x70s0Mhi46yJ6z1wDo/XxFxrSvgcZSzn6FECWTKXlGFmktadIuQ04GOHvrP7cco1+IoZSNyU1tPXmZ4YujuZqehb2VJVNerUuHOp6FHLAQQhRfkoRLkrP
bYXkfcCgLff8GSyuwsNS/TKDVKXwbdZLpG06iKFDT05Hve9SnopvdEwpcCCGKJ0nCJYFOBzu+1V96VrT65QfTL4OT6ZfjL6dmMnzJQbafugpA98YVmBBSE+tSMjBOCCFMJUm4uLt1DVb+H5z8W/+57uvw0lTQmH7WuvvMVYYsOsil1ExsNRZ81rkOoQHlCjlgIYQoOSQJF2fn98KyXpByASytof0UqN8TVKbNWKXTKczecpqv1seiU8CvjD2z3qxPlTLyzLUQQjwOScLFkaLArlkQOQ50OeBaCbr+AmXrmNzU9fQsRiyNZmPsZQBeCSjHJ51rY6uRHx0hhHhc8pu0uLl9A/4cBCf+0n+uGQovTwdrR5ObOhB/ncG/H+DizQysLNVM7lSLrg29UZl4Ji2EECJvkoSLk4vR+rV/r58FdSkI/gwa9zP58rOiKPy0/Szha46To1PwdbNj5hv1qelleiIXQgjxYJKEi4u4rfBbF9BmglMF6LoAyjUwuZmbt7MZtfwQ648mA9Cxjiefd6mDg3WpQg5YCCGEJOHionxDcPMDJ28I/R5sXU1uIibhJu/8foD4a7coZaFi3Es1ees5H7n8LIQQT4gk4WfZtTPgXBHUav2MV2H/AxuXAl1+/n13PJP/d4wsrY7yLjbMfKM+/t7OTyRsIYQQejLB77Pq0BL4vgls/fpuma2ryQk4LTOHYYujGbsqhiytjjY1PIgY0kwSsBBCPAVyJvys0uVAzm04v1s/I5ba9L+nTiSl8M7vBzhzOR0LtYoP2lXn7Wa+cvlZCCGeEknCzxKdFtT/TQ8Z0EN/6blqcIES8LJ95xn3ZwwZ2TrKOloz440AGlY0/T6yEEKIgpPL0c+KmD/g+yBIv3q3rHqHu0k5n25naXl/2SHeX36YjGwdzau6EzG0qSRgIYQwAzkTLupyMmH9h7B3nv7zrpnQenyBmjp9OY1Bvx/gRFIqahWMaFuVd1pWQa2Wy89CCGEOkoSLsmtx+rmfE6P1n5uN1K//WwCrD11kzB+HSc/S4mZvxXfd69GksluhhSqEEMJ0koSLquN/wap3IPMm2LjCK3PAr63JzWRka/kk4hi/7YoH4LlKrnzXPYAyDtaFHbEQQggTSRIuarTZ8M9E2DlD/7l8Y3htfoHW/o2/eot3Fu4nJiEFgCGtqjCstR+WFjIUQAghigJJwkXJzQuwrDdc2KP/HDQY2kwEC9OnjFx/NImRyw6RmpGDi20pvulWj5bVyhRuvEIIIR6LJOGi4mQkrOgPt6+BlZN+6skaL5ncTLZWxxdrTzBvWxwADXxcmN49AC9nm8KOWAghxGOSJGxuOi1s/PTuzFee9eC1BeDqa3JTCTduM3jhAQ7G3wCgXzNfRrWrTim5/CyEEEWSJGGzU0HyUf3bRm/rlx+0tDK5lY2xl3h3STQ3bmXjaG3JV6/582KtsoUcqxBCiMIkSdhcFEU/z7NaDaGz4OxWqNnJ5GZytDq++edfZm48DUDd8k7MfKM+3q62hR2xEEKIQiZJ+GnT6fSXnq+fhU4z9InY1rVACfhSSgZDFh1kd9w1AHoG+fBRxxpYWZo2i5YQQgjzkCT8tCUfgU2fgaKDet2hYtMCNbPj1BWGLj7IlbQs7DQWfN6lLiH+XoUcrBBCiCdJkvDT5ukPbT/WL75QgASs0ynM2HiKb/75F0WB6mUd+L5HfSq52z+BYIUQQjxJkoSfNEWBnTPB70Vwr6ovazK4QE1dTctk+JJotp68AkC3ht5M6lQL61Jy+VkIIZ5FkoSfpNvXYeVA+HctHPwN+m+CUgWbLnLv2WsMWXiQpJQMrEup+SS0Dq82MH0WLSGEEEWHJOEnJWG/fvGFG/FgoYHA/gV69EinU5i79QxT1sei1SlUdrfj+x4NqFbWofBjFkII8VRJEi5sigJ75uqXH9Rlg0tFeO1n8KpnclM3bmUxctkh/jl+CYBO9bz4rHMd7Kzkn00IIYoD+W1emDJSYPUQOLZK/7lGCHSaCdZOJjcVff4Gg34/QMKN22gs1UwMqUX3xt6oVLL2rxBCFBeShAtL0hFYGgbXToPaEl78BAIH6J8DNoGiKPy84yyfrjlOtlbBp7QtM9+oT+1ypidyIYQQRZsk4celKHDgF1g7CnIywLG8fu5n70YmN5WSkc0HfxxmzZEkANrXLssXr9bF0dr0VZSEEEIUfZKEH0dWOvw1Ag4v1n/2exE6/6CfActERy/eZNDvBzh79RalLFR82KEGvZpUlMvPQghRjEkSfhy7Z+sTsMoCWo+DJsP0c0GbQFEUFu89z4TVR8nK0VHO2YYZbwQQUMHlCQUthBCiqJAk/DiChkDCAXjuHaj4vMm7p2fmMHZVDCsPJgDQqnoZpnb1x9lWU9iRCiGEKIIkCT8OSw28/nuBdj2ZnMrA3w9w6lIaFmoV7wdXo3+zSqjVcvlZCCFKCknCZrDiwAU+WhnD7WwtHo5WTO9en8a+pt9HFkII8WyTJPwUZWRrmfS/oyzacx6AplXcmPZ6PdzsTZ9JSwghxLNPkvBTEnclnXd+P8DxxBRUKhjW2o8hrfywkMvPQghRYkkSfgoiDicy+o/DpGXmUNpOw7evB9DUz83cYQkhhDAzScJPUGaOls8ijvPzznMANPZ1ZXr3ADwcC7aSkhBCiOJFkvATcv7aLQYvPMChCzcBGNiyMu+1rYqlhWnPEQshhCi+JAk/AZHHknlvaTQpGTk42ZTim27+tKruYe6whBBCFDGShAtRtlbHV+tj+WHLGQDqeTsz440AyrvYmjkyIYQQRVGRuDY6c+ZMKlasiLW1NYGBgezZs+eBdVu2bIlKpcr16tix41OMOLfEm7fpPmeXIQH3ed6Xpf8XJAlYCCHEA5n9THjJkiWMGDGC2bNnExgYyLRp0wgODiY2NpYyZcrkqr9ixQqysrIMn69evYq/vz+vvfba0wzbyJZ/LzN8STTX0rNwsLLky9fq0q62p9niEUII8Www+5nw1KlT6devH71796ZmzZrMnj0bW1tbfvrppzzru7q6UrZsWcMrMjISW1tbsyRhrU5h6t+xhM3fw7X0LGp5OfLX0KaSgIUQQuSLWc+Es7Ky2L9/P2PGjDGUqdVq2rRpw86dO/PVxo8//sjrr7+OnZ1dntszMzPJzMw0fE5NTX28oP9zKTWDYYui2XnmKgA9Aisw7qWaWJeyKJT2hRBCFH9mPRO+cuUKWq0WDw/jkcMeHh4kJSU9cv89e/YQExPD22+//cA64eHhODk5GV41a9Z87LgBzl+7zd6z17DVWPDt6/X4tHMdScBCCCFMYvbL0Y/jxx9/pE6dOjRu3PiBdcaMGcPNmzcNr2PHjhXKsRv4uDDl1bqsHtyUTvXKFUqbQgghShazXo52c3PDwsKC5ORko/Lk5GTKli370H3T09NZvHgxkydPfmg9KysrrKzuLpCQkpJS8IDv80r98oXWlhBCiJLHrGfCGo2GBg0aEBUVZSjT6XRERUURFBT00H2XLVtGZmYmb7755pMOUwghhHgizP6I0ogRIwgLC6Nhw4Y0btyYadOmkZ
6eTu/evQHo2bMn5cqVIzw83Gi/H3/8kdDQUEqXLm2OsIUQQojHZvYk3K1bNy5fvsz48eNJSkqiXr16rFu3zjBYKz4+HrXa+IQ9NjaWbdu28ffff5sjZCGEEKJQqBRFUcwdxNN04cIFvL29OX/+POXLyz1dIYQQhcuUPPNMj44WQgghnmVmvxz9tOl0OgASExPNHIkQQoji6E5+uZNvHqbEJeE7j0M97NliIYQQ4nElJydToUKFh9YpcfeEc3JyOHjwIB4eHrkGfJkqNTWVmjVrcuzYMRwcHAopwuJH+in/pK/yT/oqf6Sf8q+w+kqn05GcnExAQACWlg8/1y1xSbgwpaSk4OTkxM2bN3F0dDR3OEWW9FP+SV/ln/RV/kg/5Z85+koGZgkhhBBmIklYCCGEMBNJwo/BysqKCRMmGM1NLXKTfso/6av8k77KH+mn/DNHX8k9YSGEEMJM5ExYCCGEMBNJwkIIIYSZSBIWQgghzESScAHNnDmTihUrYm1tTWBgIHv27DF3SEXSli1bCAkJwcvLC5VKxapVq8wdUpEUHh5Oo0aNcHBwoEyZMoSGhhIbG2vusIqcWbNmUbduXRwdHXF0dCQoKIi1a9eaO6wi7/PPP0elUjF8+HBzh1LkTJw4EZVKZfSqXr36Uzu+JOECWLJkCSNGjGDChAkcOHAAf39/goODuXTpkrlDK3LS09Px9/dn5syZ5g6lSNu8eTODBg1i165dREZGkp2dzYsvvkh6erq5QytSypcvz+eff87+/fvZt28frVq1olOnThw9etTcoRVZe/fu5YcffqBu3brmDqXIqlWrFomJiYbXtm3bnt7BFWGyxo0bK4MGDTJ81mq1ipeXlxIeHm7GqIo+QFm5cqW5w3gmXLp0SQGUzZs3mzuUIs/FxUWZN2+eucMoklJTUxU/Pz8lMjJSadGihTJs2DBzh1TkTJgwQfH39zfb8eVM2ERZWVns37+fNm3aGMrUajVt2rRh586dZoxMFCc3b94EwNXV1cyRFF1arZbFixeTnp5OUFCQucMpkgYNGkTHjh2Nfl+J3E6ePImXlxeVKlWiR48exMfHP7Vjl7hVlB7XlStX0Gq1eHh4GJV7eHhw4sQJM0UlihOdTsfw4cN5/vnnqV27trnDKXKOHDlCUFAQGRkZ2Nvbs3LlSmrWrGnusIqcxYsXc+DAAfbu3WvuUIq0wMBAFixYQLVq1UhMTGTSpEk0a9aMmJiYp7LghSRhIYqYQYMGERMT83TvSz1DqlWrRnR0NDdv3mT58uWEhYWxefNmScT3OH/+PMOGDSMyMhJra2tzh1OktW/f3vC+bt26BAYG4uPjw9KlS+nbt+8TP74kYRO5ublhYWFhWJf4juTkZMqWLWumqERxMXjwYP766y+2bNlC+fLlzR1OkaTRaKhSpQoADRo0YO/evXz77bf88MMPZo6s6Ni/fz+XLl2ifv36hjKtVsuWLVuYMWMGmZmZWFhYmDHCosvZ2ZmqVaty6tSpp3I8uSdsIo1GQ4MGDYiKijKU6XQ6oqKi5L6UKDBFURg8eDArV65kw4YN+Pr6mjukZ4ZOpyMzM9PcYRQprVu35siRI0RHRxteDRs2pEePHkRHR0sCfoi0tDROnz6Np6fnUzmenAkXwIgRIwgLC6Nhw4Y0btyYadOmkZ6eTu/evc0dWpGTlpZm9BdlXFwc0dHRuLq6UqFCBTNGVrQMGjSIhQsX8ueff+Lg4EBSUhIATk5O2NjYmDm6omPMmDG0b9+eChUqkJqaysKFC9m0aRPr1683d2hFioODQ67xBHZ2dpQuXVrGGdxn5MiRhISE4OPjw8WLF5kwYQIWFhZ07979qRxfknABdOvWjcuXLzN+/HiSkpKoV68e69atyzVYS8C+fft44YUXDJ9HjBgBQFhYGAsWLDBTVEXPrFmzAGjZsqVR+fz58+nVq9fTD6iIunTpEj179iQxMREnJyfq1q3L+vXradu2rblDE8+oCxcu0L17d65evYq7uztNmzZl165duLu7P5XjyypKQgghhJnIPWEhhBDCTCQJCyGEEGYiSVgIIYQwE0nCQgghhJlIEhZCCCHMRJKwEEIIYSaShIUQQggzkSQshBBCmIkkYSFEoVGpVKxatcrcYQjxzJAkLEQx0atXL1QqVa5Xu3btzB2aEOIBZO5oIYqRdu3aMX/+fKMyKysrM0UjhHgUORMWohixsrKibNmyRi8XFxdAf6l41qxZtG/fHhsbGypVqsTy5cuN9j9y5AitWrXCxsaG0qVL079/f9LS0ozq/PTTT9SqVQsrKys8PT0ZPHiw0fYrV67QuXNnbG1t8fPzY/Xq1YZt169fp0ePHri7u2NjY4Ofn1+uPxqEKEkkCQtRgowbN44uXbpw6NAhevToweuvv87x48cBSE9PJzg4GBcXF/bu3cuyZcv4559/jJLsrFmzGDRoEP379+fIkSOsXr2aKlWqGB1j0qRJdO3alcOHD9OhQwd69OjBtWvXDMc/duwYa9eu5fjx48yaNQs3N7en1wFCFDWKEKJYCAsLUywsLBQ7Ozuj16effqooiqIAyoABA4z2CQwMVAYOHKgoiqLMmTNHcXFxUdLS0gzbIyIiFLVarSQlJSmKoiheXl7KRx999MAYAGXs2LGGz2lpaQqgrF27VlEURQkJCVF69+5dOF9YiGJA7gkLUYy88MILhrWJ73B1dTW8DwoKMtoWFBREdHQ0AMePH8ff3x87OzvD9ueffx6dTkdsbCwqlYqLFy/SunXrh8ZQt25dw3s7OzscHR25dOkSAAMHDqRLly4cOHCAF198kdDQUJo0aVKg7ypEcSBJWIhixM7OLtfl4cJiY2OTr3qlSpUy+qxSqdDpdAC0b9+ec+fOsWbNGiIjI2ndujWDBg3iq6++KvR4hXgWyD1hIUqQXbt25fpco0YNAGrUqMGhQ4dIT083bN++fTtqtZpq1arh4OBAxYoViYqKeqwY3N3dCQsL47fffmPatGnMmTPnsdoT4lkmZ8JCFCOZmZkkJSUZlVlaWhoGPy1btoyGDRvStGlTfv/9d/bs2cOPP/4IQI8ePZgwYQJhYWFMnDiRy5cvM2TIEN566y08PDwAmDhxIgMGDKBMmTK0b9+e1NRUtm/fzpAhQ/IV3/jx42nQoAG1atUiMzOTv/76y/BHgBAlkSRhIYqRdevW4enpaVRWrVo1Tpw4AehHLi9evJh33nkHT09PFi1aRM2aNQGwtbVl/fr1DBs2jEaNGmFra0uXLl2YOnWqoa2wsDAyMjL45ptvGDlyJG5ubrz66qv5jk+j0TBmzBjOnj2LjY0NzZo1Y/HixYXwzYV4NqkURVHMHYQQ4slTqVSsXLmS0NBQc4cihPiP3BMWQgghzESSsBBCCGEmck9YiBJC7jwJUfTImbAQQghhJpKEhRBCCDORJCyEEEKYiSRhIYQQwkwkCQshhBBmIklYCCGEMBNJwkIIIYSZSBIWQgghzESSsBBCCGEm/w+mswi2yPdp7QAAAABJRU5ErkJggg==",
"text/plain": [
"<Figure size 500x300 with 2 Axes>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"epochs_tensor = torch.linspace(0, num_epochs, len(train_accs))\n",
"examples_seen_tensor = torch.linspace(0, examples_seen, len(train_accs))\n",
"\n",
"plot_values(epochs_tensor, examples_seen_tensor, train_accs, val_accs, label=\"accuracy\")"
]
},
{
"cell_type": "markdown",
"id": "90aba699-21bc-42de-a69c-99f370bb0363",
"metadata": {},
"source": [
"- Based on the accuracy plot above, we can see that the model achieves a relatively high training and validation accuracy during epochs 4 and 5\n",
"- However, we have to keep in mind that we specified `eval_iter=5` in the training function earlier, which means that the training and validation set performances above were only estimated from 5 batches at a time\n",
"- We can compute the training, validation, and test set performances over the complete dataset as follows"
]
},
{
"cell_type": "code",
"execution_count": 37,
"id": "UHWaJFrjY0zW",
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "UHWaJFrjY0zW",
"outputId": "e111e6e6-b147-4159-eb9d-19d4e809ed34"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Training accuracy: 97.21%\n",
"Validation accuracy: 97.32%\n",
"Test accuracy: 95.67%\n"
]
}
],
"source": [
"train_accuracy = calc_accuracy_loader(train_loader, model, device)\n",
"val_accuracy = calc_accuracy_loader(val_loader, model, device)\n",
"test_accuracy = calc_accuracy_loader(test_loader, model, device)\n",
"\n",
"print(f\"Training accuracy: {train_accuracy*100:.2f}%\")\n",
"print(f\"Validation accuracy: {val_accuracy*100:.2f}%\")\n",
"print(f\"Test accuracy: {test_accuracy*100:.2f}%\")"
]
},
{
"cell_type": "markdown",
"id": "6882649f-dc7b-401f-84d2-024ff79c74a1",
"metadata": {},
"source": [
"- We can see that the training and validation set performances are practically identical\n",
"- However, based on the slightly lower test set performance, we can see that the model overfits the training data to a very small degree, as well as the validation data, which was used for tweaking some of the hyperparameters, such as the learning rate\n",
"- This is normal, however, and the gap could potentially be reduced further by increasing the model's dropout rate (`drop_rate`) or the `weight_decay` in the optimizer settings; the optional cell below sketches what such a change could look like"
]
},
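{
"cell_type": "markdown",
"id": "regularization-sketch-note",
"metadata": {},
"source": [
"- The following optional cell is not part of the original training run; it is a minimal sketch of the two regularization knobs mentioned above, and the concrete values in it are illustrative assumptions rather than tuned settings"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "regularization-sketch-code",
"metadata": {},
"outputs": [],
"source": [
"# Illustrative sketch only -- the values below are assumptions, not the settings used in this chapter\n",
"\n",
"# 1) More dropout: increase the `drop_rate` entry of the model config *before*\n",
"#    instantiating the model, so that all dropout layers use the higher rate\n",
"# 2) More weight decay: pass a larger `weight_decay` to the AdamW optimizer\n",
"#    used for finetuning\n",
"optimizer_with_more_decay = torch.optim.AdamW(\n",
"    model.parameters(),\n",
"    lr=5e-5,            # learning rate (illustrative)\n",
"    weight_decay=0.2    # stronger weight-decay penalty (illustrative)\n",
")"
]
},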
{
"cell_type": "markdown",
"id": "a74d9ad7-3ec1-450e-8c9f-4fc46d3d5bb0",
"metadata": {},
"source": [
"## 6.8 Using the LLM as a spam classifier"
]
},
{
"cell_type": "markdown",
"id": "72ebcfa2-479e-408b-9cf0-7421f6144855",
"metadata": {},
"source": [
"<img src=\"https://sebastianraschka.com/images/LLMs-from-scratch-images/ch06_compressed/overview-4.webp\" width=500px>"
]
},
{
"cell_type": "markdown",
"id": "fd5408e6-83e4-4e5a-8503-c2fba6073f31",
"metadata": {},
"source": [
"- Finally, let's see the finetuned GPT model in action\n",
"- The `classify_review` function below implements data preprocessing steps similar to those in the `SpamDataset` we implemented earlier\n",
"- The function then obtains the predicted integer class label from the model and returns the corresponding class name (an optional cell after the usage examples below also shows how to inspect the underlying softmax probabilities)"
]
},
{
"cell_type": "code",
"execution_count": 38,
"id": "aHdn6xvL-IW5",
"metadata": {
"id": "aHdn6xvL-IW5"
},
"outputs": [],
"source": [
"def classify_review(text, model, tokenizer, device, max_length=None, pad_token_id=50256):\n",
"    model.eval()\n",
"\n",
"    # Prepare inputs to the model\n",
"    input_ids = tokenizer.encode(text)\n",
"    supported_context_length = model.pos_emb.weight.shape[1]\n",
"    if max_length is None:  # fall back to the model's supported context length\n",
"        max_length = supported_context_length\n",
"\n",
"    # Truncate sequences if they are too long\n",
"    input_ids = input_ids[:min(max_length, supported_context_length)]\n",
"\n",
"    # Pad sequences to the longest sequence\n",
"    input_ids += [pad_token_id] * (max_length - len(input_ids))\n",
"    input_tensor = torch.tensor(input_ids, device=device).unsqueeze(0)  # add batch dimension\n",
"\n",
"    # Model inference\n",
"    with torch.no_grad():\n",
"        logits = model(input_tensor)[:, -1, :]  # Logits of the last output token\n",
"    predicted_label = torch.argmax(logits, dim=-1).item()\n",
"\n",
"    # Return the classified result\n",
"    return \"spam\" if predicted_label == 1 else \"not spam\""
]
},
{
"cell_type": "markdown",
"id": "f29682d8-a899-4d9b-b973-f8d5ec68172c",
"metadata": {},
"source": [
"- Let's try it out on a few examples below"
]
},
{
"cell_type": "code",
"execution_count": 39,
"id": "apU_pf51AWSV",
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "apU_pf51AWSV",
"outputId": "d0fde0a5-e7a3-4dbe-d9c5-0567dbab7e62"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"spam\n"
]
}
],
"source": [
"text_1 = (\n",
"    \"You are a winner you have been specially\"\n",
"    \" selected to receive $1000 cash or a $2000 award.\"\n",
")\n",
"\n",
"print(classify_review(\n",
"    text_1, model, tokenizer, device, max_length=train_dataset.max_length\n",
"))"
]
},
{
"cell_type": "code",
"execution_count": 40,
"id": "1g5VTOo_Ajs5",
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "1g5VTOo_Ajs5",
"outputId": "659b08eb-b6a9-4a8a-9af7-d94c757e93c2"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"not spam\n"
]
}
],
"source": [
"text_2 = (\n",
"    \"Hey, just wanted to check if we're still on\"\n",
"    \" for dinner tonight? Let me know!\"\n",
")\n",
"\n",
"print(classify_review(\n",
"    text_2, model, tokenizer, device, max_length=train_dataset.max_length\n",
"))"
]
},
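{
"cell_type": "markdown",
"id": "confidence-sketch-note",
"metadata": {},
"source": [
"- As an optional aside (not part of the original chapter code), the cell below sketches how to inspect the class probabilities behind a prediction by applying a softmax to the same last-token logits that `classify_review` feeds into the `argmax`"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "confidence-sketch-code",
"metadata": {},
"outputs": [],
"source": [
"# Optional sketch: look at the model's confidence for an example text\n",
"# (reuses the truncation and padding convention from classify_review above)\n",
"example_text = \"You are a winner you have been specially selected to receive $1000 cash.\"\n",
"\n",
"token_ids = tokenizer.encode(example_text)[:train_dataset.max_length]\n",
"token_ids += [50256] * (train_dataset.max_length - len(token_ids))  # pad with the <|endoftext|> token id\n",
"inputs = torch.tensor(token_ids, device=device).unsqueeze(0)\n",
"\n",
"with torch.no_grad():\n",
"    last_token_logits = model(inputs)[:, -1, :]\n",
"\n",
"probas = torch.softmax(last_token_logits, dim=-1)\n",
"print(\"P(not spam), P(spam):\", probas.squeeze().tolist())"
]
},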
{
"cell_type": "markdown",
"id": "bf736e39-0d47-40c1-8d18-1f716cf7a81e",
"metadata": {},
"source": [
"- Finally, let's save the model in case we want to reuse it later without having to train it again"
]
},
{
"cell_type": "code",
"execution_count": 41,
"id": "mYnX-gI1CfQY",
"metadata": {
"id": "mYnX-gI1CfQY"
},
"outputs": [],
"source": [
"torch.save(model.state_dict(), \"review_classifier.pth\")"
]
},
{
"cell_type": "markdown",
"id": "ba78cf7c-6b80-4f71-a50e-3ccc73839af6",
"metadata": {},
"source": [
"- Then, in a new session, we could load the model as follows"
]
},
{
"cell_type": "code",
"execution_count": 42,
"id": "cc4e68a5-d492-493b-87ef-45c475f353f5",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"<All keys matched successfully>"
]
},
"execution_count": 42,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"model_state_dict = torch.load(\"review_classifier.pth\")\n",
"model.load_state_dict(model_state_dict)"
]
},
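{
"cell_type": "markdown",
"id": "load-device-sketch-note",
"metadata": {},
"source": [
"- Note that the cell above reuses the `model` object that already exists in this session; the optional cell below is a minimal sketch (not from the chapter) of what loading in a truly fresh session could look like, where the model architecture has to be re-created first and the saved weights are mapped onto whatever device is available"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "load-device-sketch-code",
"metadata": {},
"outputs": [],
"source": [
"# Sketch for a fresh session (illustrative):\n",
"# 1) re-create the GPT backbone and replace its output layer with the\n",
"#    2-class classification head, exactly as earlier in this chapter\n",
"# 2) then load the saved weights, mapping them onto the available device\n",
"device = torch.device(\"cuda\" if torch.cuda.is_available() else \"cpu\")\n",
"\n",
"state_dict = torch.load(\"review_classifier.pth\", map_location=device)\n",
"model.load_state_dict(state_dict)\n",
"model.to(device)\n",
"model.eval();"
]
},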
{
"cell_type": "markdown",
"id": "5b70ac71-234f-4eeb-b33d-c62726d50cd4",
"metadata": {
"id": "5b70ac71-234f-4eeb-b33d-c62726d50cd4"
},
"source": [
"## Summary and takeaways"
]
},
{
"cell_type": "markdown",
"id": "dafdc910-d616-47ab-aa85-f90c6e7ed80e",
"metadata": {},
"source": [
"- Interested readers can find an introduction to parameter-efficient training with low-rank adaptation (LoRA) in appendix E; the optional cell below only gives a brief, illustrative flavor of the idea\n"
]
}
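,
{
"cell_type": "markdown",
"id": "lora-teaser-note",
"metadata": {},
"source": [
"- The optional cell below is an illustrative sketch, not the appendix E implementation: a pretrained `Linear` layer is left in place, and a small trainable low-rank update (scaled by `alpha`) is added on top of it; the class and parameter names are placeholders"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "lora-teaser-code",
"metadata": {},
"outputs": [],
"source": [
"# Minimal, illustrative LoRA sketch (see appendix E for the implementation used in this book);\n",
"# class and parameter names here are placeholders, not the appendix's exact code\n",
"import torch.nn as nn\n",
"\n",
"class LoRALayer(nn.Module):\n",
"    def __init__(self, in_dim, out_dim, rank, alpha):\n",
"        super().__init__()\n",
"        self.A = nn.Parameter(torch.randn(in_dim, rank) * 0.01)  # low-rank factor A\n",
"        self.B = nn.Parameter(torch.zeros(rank, out_dim))        # low-rank factor B (starts at zero)\n",
"        self.alpha = alpha\n",
"\n",
"    def forward(self, x):\n",
"        return self.alpha * (x @ self.A @ self.B)\n",
"\n",
"class LinearWithLoRA(nn.Module):\n",
"    def __init__(self, linear, rank=8, alpha=16):\n",
"        super().__init__()\n",
"        self.linear = linear  # pretrained layer; during LoRA finetuning only self.lora's parameters would be trained\n",
"        self.lora = LoRALayer(linear.in_features, linear.out_features, rank, alpha)\n",
"\n",
"    def forward(self, x):\n",
"        return self.linear(x) + self.lora(x)"
]
}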
],
"metadata": {
"accelerator": "GPU",
"colab": {
"gpuType": "V100",
"provenance": []
},
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.6"
}
},
"nbformat": 4,
"nbformat_minor": 5
}