{
|
||
"cells": [
|
||
{
|
||
"cell_type": "markdown",
|
||
"metadata": {
|
||
"id": "view-in-github",
|
||
"colab_type": "text"
|
||
},
|
||
"source": [
|
||
"<a href=\"https://colab.research.google.com/github/liususan091219/FLAML-pub/blob/main/notebook/automl_nlp.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"metadata": {
|
||
"id": "43f7-wG-Tjg_"
|
||
},
|
||
"source": [
"# Fine-Tuning NLP Models with the FLAML Library\n",
"\n",
"\n",
"## 1. Introduction\n",
"\n",
"FLAML is a Python library (https://github.com/microsoft/FLAML) designed to automatically produce accurate machine learning models \n",
"with low computational cost. It is fast and economical. The simple and lightweight design makes it easy to use and extend, for example by adding new learners. FLAML can\n",
"- serve as an economical AutoML engine,\n",
"- be used as a fast hyperparameter tuning tool, or\n",
"- be embedded in self-tuning software that requires low latency and few resources in repetitive\n",
"  tuning tasks.\n",
"\n",
"In this notebook, we demonstrate how to use the FLAML library to fine-tune an NLP language model with hyperparameter search. We will use [flaml.tune](https://microsoft.github.io/FLAML/docs/Use-Cases/Tune-User-Defined-Function) with the built-in GPU in Colab for the tuning. However, if you have a machine with more than one GPU, you can also use FLAML's [parallel tuning](https://microsoft.github.io/FLAML/docs/Use-Cases/Task-Oriented-AutoML#parallel-tuning) with the Ray Tune option.\n",
"\n",
"FLAML requires `Python>=3.7`. To run this notebook example, please install flaml with the `nlp`, `notebook`, and `blendsearch` options:\n",
"```bash\n",
"pip install flaml[nlp,notebook,blendsearch]==1.0.11\n",
"```"
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 10,
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/"
|
||
},
|
||
"id": "Q8c3VMy6TjhC",
|
||
"outputId": "0eaa0dd7-e163-46c6-a637-a982ca62fff2"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stdout",
|
||
"text": [
|
||
"Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/\n",
|
||
"Requirement already satisfied: flaml[blendsearch,nlp,notebook] in /usr/local/lib/python3.7/dist-packages (1.0.11)\n",
|
||
"Requirement already satisfied: tenacity>=6.2.0 in /usr/local/lib/python3.7/dist-packages (from plotly->catboost>=0.26->flaml[blendsearch,nlp,notebook]) (8.0.1)\n",
|
||
"Requirement already satisfied: qtpy>=2.0.1 in /usr/local/lib/python3.7/dist-packages (from qtconsole->jupyter->flaml[blendsearch,nlp,notebook]) (2.2.0)\n",
|
||
"Requirement already satisfied: absl-py in /usr/local/lib/python3.7/dist-packages (from rouge-score->flaml[blendsearch,nlp,notebook]) (1.2.0)\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"%pip install flaml[nlp,notebook,blendsearch]==1.0.11;"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"metadata": {
|
||
"id": "efPlAWTdTjhD"
|
||
},
|
||
"source": [
"Let's run some examples. To use Colab's built-in GPU, you need to select Runtime -> Change runtime type and choose GPU. Then you can print the device information using:"
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 11,
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/"
|
||
},
|
||
"id": "2kx9QbI7uaU8",
|
||
"outputId": "7b6f5fc2-1406-4460-e3f3-22a23751e136"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stdout",
|
||
"text": [
|
||
"[<torch.cuda.device object at 0x7fa1e4115190>]\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"import torch\n",
|
||
"print([torch.cuda.device(i) for i in range(torch.cuda.device_count())])"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"metadata": {
|
||
"id": "-yEuLXoHua-f"
|
||
},
|
||
"source": [
"Note: throughout this notebook, you may see a few ModuleNotFoundErrors. As long as the cell executes successfully, you can ignore those errors."
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"metadata": {
|
||
"id": "ZBr83DYlTjhD"
|
||
},
|
||
"source": [
"## 2. Sentiment Classification Example\n",
"### Load data and preprocess\n",
"\n",
"The Stanford Sentiment Treebank (SST-2) dataset is a benchmark for sentiment classification. First, let's load it into pandas DataFrames:"
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 12,
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/",
|
||
"height": 106,
|
||
"referenced_widgets": [
|
||
"16fba9eb9e4542bc9d34eca00d71cc14",
|
||
"c81f11a99b9d4d1d95533f24bea1d5ac",
|
||
"e4ce5cf6ea174583a14675a75d31992d",
|
||
"0c8473019e434db0ae34d58b69605a69",
|
||
"814d3f2b7212461ca51f8635b5106783",
|
||
"129488cadbb9477ca593ae106ee8e9f7",
|
||
"c7aaa1fbd10942649c90044c0c901d99",
|
||
"6fe78cfd377b4c10a626588f46a569cf",
|
||
"2aa0f33b8d3a4bc7bedf3b66e06b62f0",
|
||
"1794a34790b647b3a8a845c55e5e0744",
|
||
"13d813116e1846a3a0e42a5e8423f80e",
|
||
"2aa02213244048fead33ea157c17837b",
|
||
"dfee126a02ef4934a0f654a101dadc1b",
|
||
"8e58d795d528405f8d1c48bfc2afe399",
|
||
"e23212ae504e493a85e4b2524f0217e1",
|
||
"b84af540de4741ceb206456d2f05fa4b",
|
||
"9ba620aa4e51456c9b3b2469c2c887c3",
|
||
"cc8f1bc7322542828205777903530f1e",
|
||
"c5f0ba4cda014a63a99ecc989f72f731",
|
||
"44803ba97fd54ceab2afe0555e21dfe8",
|
||
"f4ca1b7b5868446da945574f4db4373b",
|
||
"f7070757d4784c8099ef7dd9bd280ed3",
|
||
"6a8be136bafe40eea3430dda4063e6db",
|
||
"f28c1b9dee064db99c389beb98306f86",
|
||
"b2a5550cd6474de7a46eab6a973305e0",
|
||
"a4e9b7b28055406c9569e585296850c6",
|
||
"ac534b02efb34100b53999031767e8a3",
|
||
"5f9f744825e44fdbae9c126837e40efe",
|
||
"c3f7a2bb90b44a21a43939f78914f9b8",
|
||
"4ec28fbc9433413f8355e0c976839a94",
|
||
"fab424da92b541cfac6b3bb05ee4e17b",
|
||
"8cfa8c0a28f549c19649ec9b390aa528",
|
||
"470b17af8c8442e49757dc4e385d16f0",
|
||
"2c31d2eb9ae44ffbb0f02ad1b1e7937a",
|
||
"cd5ec1bd9cf54dbfbfd577a096a9e588",
|
||
"d1ba6870db484696879f0d6e5d3a9d70",
|
||
"e7405147ca374cc4998ca947be069652",
|
||
"fb72bbb82aec4184a8e0a510177433cf",
|
||
"f37c66b863854588b7a8891720372dc6",
|
||
"f48c7243c49a43acac4d1ba3a6fe674f",
|
||
"95ad5a87c66f405599a710b8a5fa0a9d",
|
||
"de103ee4780843db8502ebe64f5d2b28",
|
||
"428db79c8cd74257ad09539518a21835",
|
||
"3c95b6d54b294fe2a958056c463ce541"
|
||
]
|
||
},
|
||
"id": "hGP2eqTBTjhD",
|
||
"outputId": "56245bac-5fe2-4dcc-c7ae-714d2a11be5d"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stdout",
|
||
"text": [
|
||
"Downloading and preparing dataset glue/sst2 (download: 7.09 MiB, generated: 4.81 MiB, post-processed: Unknown size, total: 11.90 MiB) to /root/.cache/huggingface/datasets/glue/sst2/1.0.0/dacbe3125aa31d7f70367a07a8a9e72a5a0bfeb5fc42e75c9db75b96da6053ad...\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "display_data",
|
||
"data": {
|
||
"text/plain": [
|
||
"Downloading data: 0%| | 0.00/7.44M [00:00<?, ?B/s]"
|
||
],
|
||
"application/vnd.jupyter.widget-view+json": {
|
||
"version_major": 2,
|
||
"version_minor": 0,
|
||
"model_id": "16fba9eb9e4542bc9d34eca00d71cc14"
|
||
},
|
||
"application/json": {
|
||
"n": 0,
|
||
"total": 7439277,
|
||
"elapsed": 0.023516416549682617,
|
||
"ncols": null,
|
||
"nrows": null,
|
||
"prefix": "Downloading data",
|
||
"ascii": false,
|
||
"unit": "B",
|
||
"unit_scale": true,
|
||
"rate": null,
|
||
"bar_format": null,
|
||
"postfix": null,
|
||
"unit_divisor": 1000,
|
||
"initial": 0,
|
||
"colour": null
|
||
}
|
||
},
|
||
"metadata": {}
|
||
},
|
||
{
|
||
"output_type": "display_data",
|
||
"data": {
|
||
"text/plain": [
|
||
"Generating train split: 0%| | 0/67349 [00:00<?, ? examples/s]"
|
||
],
|
||
"application/vnd.jupyter.widget-view+json": {
|
||
"version_major": 2,
|
||
"version_minor": 0,
|
||
"model_id": "2aa02213244048fead33ea157c17837b"
|
||
},
|
||
"application/json": {
|
||
"n": 0,
|
||
"total": 67349,
|
||
"elapsed": 0.02047133445739746,
|
||
"ncols": null,
|
||
"nrows": null,
|
||
"prefix": "Generating train split",
|
||
"ascii": false,
|
||
"unit": " examples",
|
||
"unit_scale": false,
|
||
"rate": null,
|
||
"bar_format": null,
|
||
"postfix": null,
|
||
"unit_divisor": 1000,
|
||
"initial": 0,
|
||
"colour": null
|
||
}
|
||
},
|
||
"metadata": {}
|
||
},
|
||
{
|
||
"output_type": "display_data",
|
||
"data": {
|
||
"text/plain": [
|
||
"Generating validation split: 0%| | 0/872 [00:00<?, ? examples/s]"
|
||
],
|
||
"application/vnd.jupyter.widget-view+json": {
|
||
"version_major": 2,
|
||
"version_minor": 0,
|
||
"model_id": "6a8be136bafe40eea3430dda4063e6db"
|
||
},
|
||
"application/json": {
|
||
"n": 0,
|
||
"total": 872,
|
||
"elapsed": 0.020740985870361328,
|
||
"ncols": null,
|
||
"nrows": null,
|
||
"prefix": "Generating validation split",
|
||
"ascii": false,
|
||
"unit": " examples",
|
||
"unit_scale": false,
|
||
"rate": null,
|
||
"bar_format": null,
|
||
"postfix": null,
|
||
"unit_divisor": 1000,
|
||
"initial": 0,
|
||
"colour": null
|
||
}
|
||
},
|
||
"metadata": {}
|
||
},
|
||
{
|
||
"output_type": "display_data",
|
||
"data": {
|
||
"text/plain": [
|
||
"Generating test split: 0%| | 0/1821 [00:00<?, ? examples/s]"
|
||
],
|
||
"application/vnd.jupyter.widget-view+json": {
|
||
"version_major": 2,
|
||
"version_minor": 0,
|
||
"model_id": "2c31d2eb9ae44ffbb0f02ad1b1e7937a"
|
||
},
|
||
"application/json": {
|
||
"n": 0,
|
||
"total": 1821,
|
||
"elapsed": 0.0380253791809082,
|
||
"ncols": null,
|
||
"nrows": null,
|
||
"prefix": "Generating test split",
|
||
"ascii": false,
|
||
"unit": " examples",
|
||
"unit_scale": false,
|
||
"rate": null,
|
||
"bar_format": null,
|
||
"postfix": null,
|
||
"unit_divisor": 1000,
|
||
"initial": 0,
|
||
"colour": null
|
||
}
|
||
},
|
||
"metadata": {}
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stdout",
|
||
"text": [
|
||
"Dataset glue downloaded and prepared to /root/.cache/huggingface/datasets/glue/sst2/1.0.0/dacbe3125aa31d7f70367a07a8a9e72a5a0bfeb5fc42e75c9db75b96da6053ad. Subsequent calls will reuse this data.\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stderr",
|
||
"text": [
|
||
"WARNING:datasets.builder:Reusing dataset glue (/root/.cache/huggingface/datasets/glue/sst2/1.0.0/dacbe3125aa31d7f70367a07a8a9e72a5a0bfeb5fc42e75c9db75b96da6053ad)\n",
|
||
"WARNING:datasets.builder:Reusing dataset glue (/root/.cache/huggingface/datasets/glue/sst2/1.0.0/dacbe3125aa31d7f70367a07a8a9e72a5a0bfeb5fc42e75c9db75b96da6053ad)\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"from datasets import load_dataset\n",
|
||
"\n",
|
||
"train_dataset = load_dataset(\"glue\", \"sst2\", split=\"train\").to_pandas()\n",
|
||
"dev_dataset = load_dataset(\"glue\", \"sst2\", split=\"validation\").to_pandas()\n",
|
||
"test_dataset = load_dataset(\"glue\", \"sst2\", split=\"test\").to_pandas()"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"metadata": {
|
||
"id": "Nb7SAWVLTjhE"
|
||
},
|
||
"source": [
|
||
"Take a look at the first 5 examples of this dataset:"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": null,
|
||
"metadata": {
|
||
"id": "65mLkoJhTjhE",
|
||
"outputId": "2ec2ba75-caeb-4e6e-e1f8-78ee900f525d"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<div>\n",
|
||
"<style scoped>\n",
|
||
" .dataframe tbody tr th:only-of-type {\n",
|
||
" vertical-align: middle;\n",
|
||
" }\n",
|
||
"\n",
|
||
" .dataframe tbody tr th {\n",
|
||
" vertical-align: top;\n",
|
||
" }\n",
|
||
"\n",
|
||
" .dataframe thead th {\n",
|
||
" text-align: right;\n",
|
||
" }\n",
|
||
"</style>\n",
|
||
"<table border=\"1\" class=\"dataframe\">\n",
|
||
" <thead>\n",
|
||
" <tr style=\"text-align: right;\">\n",
|
||
" <th></th>\n",
|
||
" <th>sentence</th>\n",
|
||
" <th>label</th>\n",
|
||
" <th>idx</th>\n",
|
||
" </tr>\n",
|
||
" </thead>\n",
|
||
" <tbody>\n",
|
||
" <tr>\n",
|
||
" <th>0</th>\n",
|
||
" <td>hide new secretions from the parental units</td>\n",
|
||
" <td>0</td>\n",
|
||
" <td>0</td>\n",
|
||
" </tr>\n",
|
||
" <tr>\n",
|
||
" <th>1</th>\n",
|
||
" <td>contains no wit , only labored gags</td>\n",
|
||
" <td>0</td>\n",
|
||
" <td>1</td>\n",
|
||
" </tr>\n",
|
||
" <tr>\n",
|
||
" <th>2</th>\n",
|
||
" <td>that loves its characters and communicates som...</td>\n",
|
||
" <td>1</td>\n",
|
||
" <td>2</td>\n",
|
||
" </tr>\n",
|
||
" <tr>\n",
|
||
" <th>3</th>\n",
|
||
" <td>remains utterly satisfied to remain the same t...</td>\n",
|
||
" <td>0</td>\n",
|
||
" <td>3</td>\n",
|
||
" </tr>\n",
|
||
" <tr>\n",
|
||
" <th>4</th>\n",
|
||
" <td>on the worst revenge-of-the-nerds clichés the ...</td>\n",
|
||
" <td>0</td>\n",
|
||
" <td>4</td>\n",
|
||
" </tr>\n",
|
||
" </tbody>\n",
|
||
"</table>\n",
|
||
"</div>"
|
||
],
|
||
"text/plain": [
|
||
" sentence label idx\n",
|
||
"0 hide new secretions from the parental units 0 0\n",
|
||
"1 contains no wit , only labored gags 0 1\n",
|
||
"2 that loves its characters and communicates som... 1 2\n",
|
||
"3 remains utterly satisfied to remain the same t... 0 3\n",
|
||
"4 on the worst revenge-of-the-nerds clichés the ... 0 4"
|
||
]
|
||
},
|
||
"execution_count": 3,
|
||
"metadata": {},
|
||
"output_type": "execute_result"
|
||
}
|
||
],
|
||
"source": [
|
||
"train_dataset.head(5)"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"metadata": {
|
||
"id": "ENcUQbOgTjhE"
|
||
},
|
||
"source": [
|
||
"Separate the data into X and y:"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 13,
|
||
"metadata": {
|
||
"id": "GA0VH9URTjhF"
|
||
},
|
||
"outputs": [],
|
||
"source": [
|
||
"custom_sent_keys = [\"sentence\"] # specify the column names of the input sentences\n",
|
||
"label_key = \"label\" # specify the column name of the label\n",
|
||
"\n",
|
||
"X_train, y_train = train_dataset[custom_sent_keys], train_dataset[label_key]\n",
|
||
"X_val, y_val = dev_dataset[custom_sent_keys], dev_dataset[label_key]\n",
|
||
"X_test = test_dataset[custom_sent_keys]"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"metadata": {
|
||
"id": "NpRqB153TjhF"
|
||
},
|
||
"source": [
|
||
"### Run FLAML"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"metadata": {
|
||
"id": "2kXabqxZuzQl"
|
||
},
|
||
"source": [
|
||
"Now we can run AutoML with FLAML:"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 14,
|
||
"metadata": {
|
||
"id": "asYbkzrXTjhF"
|
||
},
|
||
"outputs": [],
|
||
"source": [
|
||
"from flaml import AutoML\n",
|
||
"automl = AutoML()\n"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"metadata": {
|
||
"id": "2XZmrBRru_A0"
|
||
},
|
||
"source": [
"Let's run FLAML for 30 minutes. Here we use Electra's [small model](https://huggingface.co/google/electra-small-discriminator) for the tuning. We set gpu_per_trial to 1 and n_concurrent_trials (the number of trials running at the same time) to 1. Make sure gpu_per_trial * n_concurrent_trials does not exceed the number of GPUs you have; a quick sanity check for this follows the settings cell below. While the run is in progress, you can observe the resource usage (including the GPU) on the right."
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 15,
|
||
"metadata": {
|
||
"id": "QEvR2bZiTjhG"
|
||
},
|
||
"outputs": [],
|
||
"source": [
|
||
"TIME_BUDGET=1800\n",
|
||
"automl_settings = {\n",
|
||
" \"time_budget\": TIME_BUDGET, # setting the time budget\n",
|
||
" \"task\": \"seq-classification\", # setting the task as seq-classification\n",
|
||
" \"fit_kwargs_by_estimator\": {\n",
|
||
" \"transformer\": {\n",
|
||
" \"output_dir\": \"data/output/\", # setting the output directory\n",
|
||
" \"model_path\": \"google/electra-small-discriminator\", # if model_path is not set, the default model is facebook/muppet-roberta-base: https://huggingface.co/facebook/muppet-roberta-base\n",
|
||
" }\n",
|
||
" },\n",
|
||
" \"gpu_per_trial\": 1, # using 1 GPU for each trial\n",
|
||
" \"log_file_name\": \"seqclass.log\", # set the file to save the log for HPO\n",
|
||
" \"log_type\": \"all\", # the log type for trials: \"all\" if logging all the trials, \"better\" if only keeping the better trials\n",
|
||
" \"use_ray\": False, # If parallel tuning, set \"use_ray\" to {\"local_dir\": \"data/output/\"}\n",
|
||
" \"n_concurrent_trials\": 1, # How many trials to run at the same time, n_concurrent_trials * gpu_per_trial must not exceed the total number of GPUs\n",
|
||
" \"keep_search_state\": True, # keeping the search state\n",
|
||
" \"fp16\": False # whether to use fp16, this option is True by default. \n",
|
||
"}"
|
||
]
|
||
},
|
||
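{
"cell_type": "markdown",
"metadata": {},
"source": [
"Before launching the search, a quick sanity check can catch a misconfiguration early. The cell below is a minimal sketch added for illustration (it was not part of the original run); it assumes `torch` is still available from the GPU-check cell above and simply verifies the constraint `gpu_per_trial * n_concurrent_trials <= number of GPUs`."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Illustrative sanity check (sketch): verify the GPU constraint described above.\n",
"n_gpus = torch.cuda.device_count()  # torch was imported in an earlier cell\n",
"assert (\n",
"    automl_settings[\"gpu_per_trial\"] * automl_settings[\"n_concurrent_trials\"] <= n_gpus\n",
"), \"gpu_per_trial * n_concurrent_trials must not exceed the number of available GPUs\"\n"
]
},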
{
|
||
"cell_type": "code",
|
||
"execution_count": 23,
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/"
|
||
},
|
||
"id": "EXjF65hOTjhG",
|
||
"outputId": "e706d9e0-c890-41a0-cf9e-fb308d7c9533"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stderr",
|
||
"text": [
|
||
"/usr/local/lib/python3.7/dist-packages/pandas/core/frame.py:3641: SettingWithCopyWarning: \n",
|
||
"A value is trying to be set on a copy of a slice from a DataFrame.\n",
|
||
"Try using .loc[row_indexer,col_indexer] = value instead\n",
|
||
"\n",
|
||
"See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\n",
|
||
" self[k1] = value[k2]\n",
|
||
"[flaml.automl: 08-21 02:50:27] {2565} INFO - task = seq-classification\n",
|
||
"INFO:flaml.automl:task = seq-classification\n",
|
||
"[flaml.automl: 08-21 02:50:27] {2567} INFO - Data split method: stratified\n",
|
||
"INFO:flaml.automl:Data split method: stratified\n",
|
||
"[flaml.automl: 08-21 02:50:27] {2570} INFO - Evaluation method: holdout\n",
|
||
"INFO:flaml.automl:Evaluation method: holdout\n",
|
||
"[flaml.automl: 08-21 02:50:27] {2689} INFO - Minimizing error metric: 1-accuracy\n",
|
||
"INFO:flaml.automl:Minimizing error metric: 1-accuracy\n",
|
||
"[flaml.automl: 08-21 02:50:27] {2831} INFO - List of ML learners in AutoML Run: ['transformer']\n",
|
||
"INFO:flaml.automl:List of ML learners in AutoML Run: ['transformer']\n",
|
||
"[flaml.automl: 08-21 02:50:27] {3133} INFO - iteration 0, current learner transformer\n",
|
||
"INFO:flaml.automl:iteration 0, current learner transformer\n",
|
||
"/usr/local/lib/python3.7/dist-packages/transformers/optimization.py:309: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning\n",
|
||
" FutureWarning,\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stdout",
|
||
"text": [
|
||
"{'loss': 0.5665, 'learning_rate': 4.6751863684771026e-06, 'epoch': 1.6}\n",
|
||
"{'eval_loss': 0.42372775077819824, 'eval_automl_metric': 0.1754587155963303, 'eval_runtime': 10.818, 'eval_samples_per_second': 80.606, 'eval_steps_per_second': 80.606, 'epoch': 2.0}\n",
|
||
"{'eval_loss': 0.4013938903808594, 'eval_automl_metric': 0.16399082568807344, 'eval_runtime': 10.8291, 'eval_samples_per_second': 80.524, 'eval_steps_per_second': 80.524, 'epoch': 3.0}\n",
|
||
"{'train_runtime': 81.4429, 'train_samples_per_second': 368.356, 'train_steps_per_second': 11.53, 'train_loss': 0.4875296855759951, 'epoch': 3.0}\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stderr",
|
||
"text": [
|
||
"***** Running Prediction *****\n",
|
||
" Num examples = 872\n",
|
||
" Batch size = 1\n",
|
||
"Didn't find file data/output/train_2022-08-21_02-50-27/train_036eed6c_15_s=9223372036854775807,e=1e-05,s=-1,s=3,e=32,d=20_2022-08-21_02-50-27/checkpoint-939/added_tokens.json. We won't load it.\n",
|
||
"loading file data/output/train_2022-08-21_02-50-27/train_036eed6c_15_s=9223372036854775807,e=1e-05,s=-1,s=3,e=32,d=20_2022-08-21_02-50-27/checkpoint-939/vocab.txt\n",
|
||
"loading file data/output/train_2022-08-21_02-50-27/train_036eed6c_15_s=9223372036854775807,e=1e-05,s=-1,s=3,e=32,d=20_2022-08-21_02-50-27/checkpoint-939/tokenizer.json\n",
|
||
"loading file None\n",
|
||
"loading file data/output/train_2022-08-21_02-50-27/train_036eed6c_15_s=9223372036854775807,e=1e-05,s=-1,s=3,e=32,d=20_2022-08-21_02-50-27/checkpoint-939/special_tokens_map.json\n",
|
||
"loading file data/output/train_2022-08-21_02-50-27/train_036eed6c_15_s=9223372036854775807,e=1e-05,s=-1,s=3,e=32,d=20_2022-08-21_02-50-27/checkpoint-939/tokenizer_config.json\n",
|
||
"[flaml.automl: 08-21 02:52:09] {3267} INFO - Estimated sufficient time budget=6862714s. Estimated necessary time budget=6863s.\n",
|
||
"INFO:flaml.automl:Estimated sufficient time budget=6862714s. Estimated necessary time budget=6863s.\n",
|
||
"[flaml.automl: 08-21 02:52:09] {3319} INFO - at 102.1s,\testimator transformer's best error=0.1640,\tbest estimator transformer's best error=0.1640\n",
|
||
"INFO:flaml.automl: at 102.1s,\testimator transformer's best error=0.1640,\tbest estimator transformer's best error=0.1640\n",
|
||
"[flaml.automl: 08-21 02:52:09] {3133} INFO - iteration 1, current learner transformer\n",
|
||
"INFO:flaml.automl:iteration 1, current learner transformer\n",
|
||
"/usr/local/lib/python3.7/dist-packages/transformers/optimization.py:309: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning\n",
|
||
" FutureWarning,\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stdout",
|
||
"text": [
|
||
"{'eval_loss': 0.4843567907810211, 'eval_automl_metric': 0.18233944954128445, 'eval_runtime': 10.5457, 'eval_samples_per_second': 82.688, 'eval_steps_per_second': 82.688, 'epoch': 2.0}\n",
|
||
"{'eval_loss': 0.4618026912212372, 'eval_automl_metric': 0.17889908256880738, 'eval_runtime': 10.645, 'eval_samples_per_second': 81.916, 'eval_steps_per_second': 81.916, 'epoch': 3.0}\n",
|
||
"{'train_runtime': 65.5885, 'train_samples_per_second': 457.397, 'train_steps_per_second': 7.181, 'train_loss': 0.5575582905180135, 'epoch': 3.0}\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stderr",
|
||
"text": [
|
||
"***** Running Prediction *****\n",
|
||
" Num examples = 872\n",
|
||
" Batch size = 1\n",
|
||
"Didn't find file data/output/train_2022-08-21_02-52-09/train_402e7e16_16_s=9223372036854775807,e=9.7119e-06,s=-1,s=3,e=64,d=14_2022-08-21_02-52-09/checkpoint-471/added_tokens.json. We won't load it.\n",
|
||
"loading file data/output/train_2022-08-21_02-52-09/train_402e7e16_16_s=9223372036854775807,e=9.7119e-06,s=-1,s=3,e=64,d=14_2022-08-21_02-52-09/checkpoint-471/vocab.txt\n",
|
||
"loading file data/output/train_2022-08-21_02-52-09/train_402e7e16_16_s=9223372036854775807,e=9.7119e-06,s=-1,s=3,e=64,d=14_2022-08-21_02-52-09/checkpoint-471/tokenizer.json\n",
|
||
"loading file None\n",
|
||
"loading file data/output/train_2022-08-21_02-52-09/train_402e7e16_16_s=9223372036854775807,e=9.7119e-06,s=-1,s=3,e=64,d=14_2022-08-21_02-52-09/checkpoint-471/special_tokens_map.json\n",
|
||
"loading file data/output/train_2022-08-21_02-52-09/train_402e7e16_16_s=9223372036854775807,e=9.7119e-06,s=-1,s=3,e=64,d=14_2022-08-21_02-52-09/checkpoint-471/tokenizer_config.json\n",
|
||
"[flaml.automl: 08-21 02:53:35] {3319} INFO - at 188.2s,\testimator transformer's best error=0.1640,\tbest estimator transformer's best error=0.1640\n",
|
||
"INFO:flaml.automl: at 188.2s,\testimator transformer's best error=0.1640,\tbest estimator transformer's best error=0.1640\n",
|
||
"[flaml.automl: 08-21 02:53:36] {3133} INFO - iteration 2, current learner transformer\n",
|
||
"INFO:flaml.automl:iteration 2, current learner transformer\n",
|
||
"/usr/local/lib/python3.7/dist-packages/transformers/optimization.py:309: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning\n",
|
||
" FutureWarning,\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stdout",
|
||
"text": [
|
||
"{'loss': 0.5778, 'learning_rate': 7.550901222797876e-06, 'epoch': 0.8}\n",
|
||
"{'loss': 0.3836, 'learning_rate': 4.805118959962285e-06, 'epoch': 1.6}\n",
|
||
"{'eval_loss': 0.3749224543571472, 'eval_automl_metric': 0.15596330275229353, 'eval_runtime': 10.5464, 'eval_samples_per_second': 82.682, 'eval_steps_per_second': 82.682, 'epoch': 2.0}\n",
|
||
"{'loss': 0.3399, 'learning_rate': 2.0593366971266936e-06, 'epoch': 2.4}\n",
|
||
"{'eval_loss': 0.37013810873031616, 'eval_automl_metric': 0.1513761467889908, 'eval_runtime': 10.6222, 'eval_samples_per_second': 82.092, 'eval_steps_per_second': 82.092, 'epoch': 3.0}\n",
|
||
"{'train_runtime': 126.516, 'train_samples_per_second': 237.124, 'train_steps_per_second': 14.82, 'train_loss': 0.40950755208333334, 'epoch': 3.0}\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stderr",
|
||
"text": [
|
||
"***** Running Prediction *****\n",
|
||
" Num examples = 872\n",
|
||
" Batch size = 1\n",
|
||
"Didn't find file data/output/train_2022-08-21_02-53-36/train_7382565c_17_s=9223372036854775807,e=1.0297e-05,s=-1,s=3,e=16,d=26_2022-08-21_02-53-36/checkpoint-1875/added_tokens.json. We won't load it.\n",
|
||
"loading file data/output/train_2022-08-21_02-53-36/train_7382565c_17_s=9223372036854775807,e=1.0297e-05,s=-1,s=3,e=16,d=26_2022-08-21_02-53-36/checkpoint-1875/vocab.txt\n",
|
||
"loading file data/output/train_2022-08-21_02-53-36/train_7382565c_17_s=9223372036854775807,e=1.0297e-05,s=-1,s=3,e=16,d=26_2022-08-21_02-53-36/checkpoint-1875/tokenizer.json\n",
|
||
"loading file None\n",
|
||
"loading file data/output/train_2022-08-21_02-53-36/train_7382565c_17_s=9223372036854775807,e=1.0297e-05,s=-1,s=3,e=16,d=26_2022-08-21_02-53-36/checkpoint-1875/special_tokens_map.json\n",
|
||
"loading file data/output/train_2022-08-21_02-53-36/train_7382565c_17_s=9223372036854775807,e=1.0297e-05,s=-1,s=3,e=16,d=26_2022-08-21_02-53-36/checkpoint-1875/tokenizer_config.json\n",
|
||
"[flaml.automl: 08-21 02:56:03] {729} WARNING - checkpoint data/output/train_2022-08-21_02-50-27/train_036eed6c_15_s=9223372036854775807,e=1e-05,s=-1,s=3,e=32,d=20_2022-08-21_02-50-27/checkpoint-939 not found\n",
|
||
"WARNING:flaml.automl:checkpoint data/output/train_2022-08-21_02-50-27/train_036eed6c_15_s=9223372036854775807,e=1e-05,s=-1,s=3,e=32,d=20_2022-08-21_02-50-27/checkpoint-939 not found\n",
|
||
"[flaml.automl: 08-21 02:56:03] {3319} INFO - at 335.5s,\testimator transformer's best error=0.1514,\tbest estimator transformer's best error=0.1514\n",
|
||
"INFO:flaml.automl: at 335.5s,\testimator transformer's best error=0.1514,\tbest estimator transformer's best error=0.1514\n",
|
||
"[flaml.automl: 08-21 02:56:03] {3133} INFO - iteration 3, current learner transformer\n",
|
||
"INFO:flaml.automl:iteration 3, current learner transformer\n",
|
||
"/usr/local/lib/python3.7/dist-packages/transformers/optimization.py:309: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning\n",
|
||
" FutureWarning,\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stdout",
|
||
"text": [
|
||
"{'loss': 0.5362, 'learning_rate': 8.879996750213199e-06, 'epoch': 0.8}\n",
|
||
"{'eval_loss': 0.3863365948200226, 'eval_automl_metric': 0.1594036697247706, 'eval_runtime': 10.6255, 'eval_samples_per_second': 82.067, 'eval_steps_per_second': 82.067, 'epoch': 1.0}\n",
|
||
"{'loss': 0.3654, 'learning_rate': 2.959998916737733e-06, 'epoch': 1.6}\n",
|
||
"{'eval_loss': 0.375693142414093, 'eval_automl_metric': 0.15596330275229353, 'eval_runtime': 10.6464, 'eval_samples_per_second': 81.906, 'eval_steps_per_second': 81.906, 'epoch': 2.0}\n",
|
||
"{'train_runtime': 91.5445, 'train_samples_per_second': 218.473, 'train_steps_per_second': 13.655, 'train_loss': 0.42628193359375, 'epoch': 2.0}\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stderr",
|
||
"text": [
|
||
"***** Running Prediction *****\n",
|
||
" Num examples = 872\n",
|
||
" Batch size = 1\n",
|
||
"Didn't find file data/output/train_2022-08-21_02-56-03/train_cb570bac_18_s=9223372036854775807,e=1.48e-05,s=-1,s=2,e=16,d=25_2022-08-21_02-56-03/checkpoint-1250/added_tokens.json. We won't load it.\n",
|
||
"loading file data/output/train_2022-08-21_02-56-03/train_cb570bac_18_s=9223372036854775807,e=1.48e-05,s=-1,s=2,e=16,d=25_2022-08-21_02-56-03/checkpoint-1250/vocab.txt\n",
|
||
"loading file data/output/train_2022-08-21_02-56-03/train_cb570bac_18_s=9223372036854775807,e=1.48e-05,s=-1,s=2,e=16,d=25_2022-08-21_02-56-03/checkpoint-1250/tokenizer.json\n",
|
||
"loading file None\n",
|
||
"loading file data/output/train_2022-08-21_02-56-03/train_cb570bac_18_s=9223372036854775807,e=1.48e-05,s=-1,s=2,e=16,d=25_2022-08-21_02-56-03/checkpoint-1250/special_tokens_map.json\n",
|
||
"loading file data/output/train_2022-08-21_02-56-03/train_cb570bac_18_s=9223372036854775807,e=1.48e-05,s=-1,s=2,e=16,d=25_2022-08-21_02-56-03/checkpoint-1250/tokenizer_config.json\n",
|
||
"[flaml.automl: 08-21 02:57:55] {3319} INFO - at 447.3s,\testimator transformer's best error=0.1514,\tbest estimator transformer's best error=0.1514\n",
|
||
"INFO:flaml.automl: at 447.3s,\testimator transformer's best error=0.1514,\tbest estimator transformer's best error=0.1514\n",
|
||
"[flaml.automl: 08-21 02:57:55] {3133} INFO - iteration 4, current learner transformer\n",
|
||
"INFO:flaml.automl:iteration 4, current learner transformer\n",
|
||
"/usr/local/lib/python3.7/dist-packages/transformers/optimization.py:309: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning\n",
|
||
" FutureWarning,\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stdout",
|
||
"text": [
|
||
"{'loss': 0.6402, 'learning_rate': 5.730904302906456e-06, 'epoch': 0.8}\n",
|
||
"{'loss': 0.4537, 'learning_rate': 4.298178227179842e-06, 'epoch': 1.6}\n",
|
||
"{'loss': 0.3716, 'learning_rate': 2.865452151453228e-06, 'epoch': 2.4}\n",
|
||
"{'eval_loss': 0.4031089246273041, 'eval_automl_metric': 0.16284403669724767, 'eval_runtime': 10.6207, 'eval_samples_per_second': 82.104, 'eval_steps_per_second': 82.104, 'epoch': 2.88}\n",
|
||
"{'eval_loss': 0.4031089246273041, 'eval_automl_metric': 0.16284403669724767, 'eval_runtime': 10.663, 'eval_samples_per_second': 81.778, 'eval_steps_per_second': 81.778, 'epoch': 2.88}\n",
|
||
"{'train_runtime': 122.6118, 'train_samples_per_second': 326.233, 'train_steps_per_second': 20.39, 'train_loss': 0.46601707301930595, 'epoch': 2.88}\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stderr",
|
||
"text": [
|
||
"***** Running Prediction *****\n",
|
||
" Num examples = 872\n",
|
||
" Batch size = 1\n",
|
||
"Didn't find file data/output/train_2022-08-21_02-57-55/train_0df42e0e_19_s=9223372036854775807,e=7.1636e-06,s=-1,s=4,e=16,d=27_2022-08-21_02-57-55/checkpoint-1803/added_tokens.json. We won't load it.\n",
|
||
"loading file data/output/train_2022-08-21_02-57-55/train_0df42e0e_19_s=9223372036854775807,e=7.1636e-06,s=-1,s=4,e=16,d=27_2022-08-21_02-57-55/checkpoint-1803/vocab.txt\n",
|
||
"loading file data/output/train_2022-08-21_02-57-55/train_0df42e0e_19_s=9223372036854775807,e=7.1636e-06,s=-1,s=4,e=16,d=27_2022-08-21_02-57-55/checkpoint-1803/tokenizer.json\n",
|
||
"loading file None\n",
|
||
"loading file data/output/train_2022-08-21_02-57-55/train_0df42e0e_19_s=9223372036854775807,e=7.1636e-06,s=-1,s=4,e=16,d=27_2022-08-21_02-57-55/checkpoint-1803/special_tokens_map.json\n",
|
||
"loading file data/output/train_2022-08-21_02-57-55/train_0df42e0e_19_s=9223372036854775807,e=7.1636e-06,s=-1,s=4,e=16,d=27_2022-08-21_02-57-55/checkpoint-1803/tokenizer_config.json\n",
|
||
"[flaml.automl: 08-21 03:00:18] {3319} INFO - at 590.4s,\testimator transformer's best error=0.1514,\tbest estimator transformer's best error=0.1514\n",
|
||
"INFO:flaml.automl: at 590.4s,\testimator transformer's best error=0.1514,\tbest estimator transformer's best error=0.1514\n",
|
||
"[flaml.automl: 08-21 03:00:18] {3133} INFO - iteration 5, current learner transformer\n",
|
||
"INFO:flaml.automl:iteration 5, current learner transformer\n",
|
||
"/usr/local/lib/python3.7/dist-packages/transformers/optimization.py:309: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning\n",
|
||
" FutureWarning,\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stdout",
|
||
"text": [
|
||
"{'loss': 0.5223, 'learning_rate': 1.3121346786922505e-05, 'epoch': 0.8}\n",
|
||
"{'loss': 0.333, 'learning_rate': 8.349947955314322e-06, 'epoch': 1.6}\n",
|
||
"{'eval_loss': 0.37441486120224, 'eval_automl_metric': 0.16169724770642202, 'eval_runtime': 10.5419, 'eval_samples_per_second': 82.717, 'eval_steps_per_second': 82.717, 'epoch': 2.0}\n",
|
||
"{'eval_loss': 0.3761043846607208, 'eval_automl_metric': 0.15481651376146788, 'eval_runtime': 10.512, 'eval_samples_per_second': 82.953, 'eval_steps_per_second': 82.953, 'epoch': 2.23}\n",
|
||
"{'eval_loss': 0.3761043846607208, 'eval_automl_metric': 0.15481651376146788, 'eval_runtime': 10.5934, 'eval_samples_per_second': 82.316, 'eval_steps_per_second': 82.316, 'epoch': 2.23}\n",
|
||
"{'train_runtime': 111.7637, 'train_samples_per_second': 268.423, 'train_steps_per_second': 16.776, 'train_loss': 0.39087216824957316, 'epoch': 2.23}\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stderr",
|
||
"text": [
|
||
"***** Running Prediction *****\n",
|
||
" Num examples = 872\n",
|
||
" Batch size = 1\n",
|
||
"Didn't find file data/output/train_2022-08-21_03-00-18/train_634289e6_20_s=9223372036854775807,e=1.7893e-05,s=-1,s=3,e=16,d=32_2022-08-21_03-00-18/checkpoint-1391/added_tokens.json. We won't load it.\n",
|
||
"loading file data/output/train_2022-08-21_03-00-18/train_634289e6_20_s=9223372036854775807,e=1.7893e-05,s=-1,s=3,e=16,d=32_2022-08-21_03-00-18/checkpoint-1391/vocab.txt\n",
|
||
"loading file data/output/train_2022-08-21_03-00-18/train_634289e6_20_s=9223372036854775807,e=1.7893e-05,s=-1,s=3,e=16,d=32_2022-08-21_03-00-18/checkpoint-1391/tokenizer.json\n",
|
||
"loading file None\n",
|
||
"loading file data/output/train_2022-08-21_03-00-18/train_634289e6_20_s=9223372036854775807,e=1.7893e-05,s=-1,s=3,e=16,d=32_2022-08-21_03-00-18/checkpoint-1391/special_tokens_map.json\n",
|
||
"loading file data/output/train_2022-08-21_03-00-18/train_634289e6_20_s=9223372036854775807,e=1.7893e-05,s=-1,s=3,e=16,d=32_2022-08-21_03-00-18/checkpoint-1391/tokenizer_config.json\n",
|
||
"[flaml.automl: 08-21 03:02:30] {3319} INFO - at 722.7s,\testimator transformer's best error=0.1514,\tbest estimator transformer's best error=0.1514\n",
|
||
"INFO:flaml.automl: at 722.7s,\testimator transformer's best error=0.1514,\tbest estimator transformer's best error=0.1514\n",
|
||
"[flaml.automl: 08-21 03:02:30] {3133} INFO - iteration 6, current learner transformer\n",
|
||
"INFO:flaml.automl:iteration 6, current learner transformer\n",
|
||
"/usr/local/lib/python3.7/dist-packages/transformers/optimization.py:309: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning\n",
|
||
" FutureWarning,\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stdout",
|
||
"text": [
|
||
"{'loss': 0.6593, 'learning_rate': 4.3452939856201385e-06, 'epoch': 0.8}\n",
|
||
"{'loss': 0.5039, 'learning_rate': 2.76518708175827e-06, 'epoch': 1.6}\n",
|
||
"{'eval_loss': 0.4441715180873871, 'eval_automl_metric': 0.18463302752293576, 'eval_runtime': 10.6465, 'eval_samples_per_second': 81.905, 'eval_steps_per_second': 81.905, 'epoch': 2.0}\n",
|
||
"{'eval_loss': 0.444117933511734, 'eval_automl_metric': 0.18463302752293576, 'eval_runtime': 10.6581, 'eval_samples_per_second': 81.815, 'eval_steps_per_second': 81.815, 'epoch': 2.0}\n",
|
||
"{'eval_loss': 0.444117933511734, 'eval_automl_metric': 0.18463302752293576, 'eval_runtime': 10.659, 'eval_samples_per_second': 81.809, 'eval_steps_per_second': 81.809, 'epoch': 2.0}\n",
|
||
"{'train_runtime': 103.8522, 'train_samples_per_second': 288.872, 'train_steps_per_second': 18.055, 'train_loss': 0.5531763736959651, 'epoch': 2.0}\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stderr",
|
||
"text": [
|
||
"***** Running Prediction *****\n",
|
||
" Num examples = 872\n",
|
||
" Batch size = 1\n",
|
||
"Didn't find file data/output/train_2022-08-21_03-02-30/train_b2206e84_21_s=9223372036854775807,e=5.9254e-06,s=-1,s=3,e=16,d=20_2022-08-21_03-02-30/checkpoint-1250/added_tokens.json. We won't load it.\n",
|
||
"loading file data/output/train_2022-08-21_03-02-30/train_b2206e84_21_s=9223372036854775807,e=5.9254e-06,s=-1,s=3,e=16,d=20_2022-08-21_03-02-30/checkpoint-1250/vocab.txt\n",
|
||
"loading file data/output/train_2022-08-21_03-02-30/train_b2206e84_21_s=9223372036854775807,e=5.9254e-06,s=-1,s=3,e=16,d=20_2022-08-21_03-02-30/checkpoint-1250/tokenizer.json\n",
|
||
"loading file None\n",
|
||
"loading file data/output/train_2022-08-21_03-02-30/train_b2206e84_21_s=9223372036854775807,e=5.9254e-06,s=-1,s=3,e=16,d=20_2022-08-21_03-02-30/checkpoint-1250/special_tokens_map.json\n",
|
||
"loading file data/output/train_2022-08-21_03-02-30/train_b2206e84_21_s=9223372036854775807,e=5.9254e-06,s=-1,s=3,e=16,d=20_2022-08-21_03-02-30/checkpoint-1250/tokenizer_config.json\n",
|
||
"[flaml.automl: 08-21 03:04:34] {3319} INFO - at 847.0s,\testimator transformer's best error=0.1514,\tbest estimator transformer's best error=0.1514\n",
|
||
"INFO:flaml.automl: at 847.0s,\testimator transformer's best error=0.1514,\tbest estimator transformer's best error=0.1514\n",
|
||
"[flaml.automl: 08-21 03:04:34] {3133} INFO - iteration 7, current learner transformer\n",
|
||
"INFO:flaml.automl:iteration 7, current learner transformer\n",
|
||
"/usr/local/lib/python3.7/dist-packages/transformers/optimization.py:309: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning\n",
|
||
" FutureWarning,\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stdout",
|
||
"text": [
|
||
"{'loss': 0.4949, 'learning_rate': 1.624682269684853e-05, 'epoch': 0.8}\n",
|
||
"{'loss': 0.3234, 'learning_rate': 1.0338887170721792e-05, 'epoch': 1.6}\n",
|
||
"{'eval_loss': 0.34439605474472046, 'eval_automl_metric': 0.13188073394495414, 'eval_runtime': 10.6017, 'eval_samples_per_second': 82.251, 'eval_steps_per_second': 82.251, 'epoch': 2.0}\n",
|
||
"{'eval_loss': 0.3457942605018616, 'eval_automl_metric': 0.13188073394495414, 'eval_runtime': 10.539, 'eval_samples_per_second': 82.74, 'eval_steps_per_second': 82.74, 'epoch': 2.0}\n",
|
||
"{'eval_loss': 0.3457942605018616, 'eval_automl_metric': 0.13188073394495414, 'eval_runtime': 10.661, 'eval_samples_per_second': 81.794, 'eval_steps_per_second': 81.794, 'epoch': 2.0}\n",
|
||
"{'train_runtime': 102.8292, 'train_samples_per_second': 291.746, 'train_steps_per_second': 18.234, 'train_loss': 0.39010055993291304, 'epoch': 2.0}\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stderr",
|
||
"text": [
|
||
"***** Running Prediction *****\n",
|
||
" Num examples = 872\n",
|
||
" Batch size = 1\n",
|
||
"Didn't find file data/output/train_2022-08-21_03-04-34/train_fc3698e0_22_s=9223372036854775807,e=2.2155e-05,s=-1,s=3,e=16,d=24_2022-08-21_03-04-34/checkpoint-1250/added_tokens.json. We won't load it.\n",
|
||
"loading file data/output/train_2022-08-21_03-04-34/train_fc3698e0_22_s=9223372036854775807,e=2.2155e-05,s=-1,s=3,e=16,d=24_2022-08-21_03-04-34/checkpoint-1250/vocab.txt\n",
|
||
"loading file data/output/train_2022-08-21_03-04-34/train_fc3698e0_22_s=9223372036854775807,e=2.2155e-05,s=-1,s=3,e=16,d=24_2022-08-21_03-04-34/checkpoint-1250/tokenizer.json\n",
|
||
"loading file None\n",
|
||
"loading file data/output/train_2022-08-21_03-04-34/train_fc3698e0_22_s=9223372036854775807,e=2.2155e-05,s=-1,s=3,e=16,d=24_2022-08-21_03-04-34/checkpoint-1250/special_tokens_map.json\n",
|
||
"loading file data/output/train_2022-08-21_03-04-34/train_fc3698e0_22_s=9223372036854775807,e=2.2155e-05,s=-1,s=3,e=16,d=24_2022-08-21_03-04-34/checkpoint-1250/tokenizer_config.json\n",
|
||
"[flaml.automl: 08-21 03:06:38] {729} WARNING - checkpoint data/output/train_2022-08-21_02-53-36/train_7382565c_17_s=9223372036854775807,e=1.0297e-05,s=-1,s=3,e=16,d=26_2022-08-21_02-53-36/checkpoint-1875 not found\n",
|
||
"WARNING:flaml.automl:checkpoint data/output/train_2022-08-21_02-53-36/train_7382565c_17_s=9223372036854775807,e=1.0297e-05,s=-1,s=3,e=16,d=26_2022-08-21_02-53-36/checkpoint-1875 not found\n",
|
||
"[flaml.automl: 08-21 03:06:38] {3319} INFO - at 971.1s,\testimator transformer's best error=0.1319,\tbest estimator transformer's best error=0.1319\n",
|
||
"INFO:flaml.automl: at 971.1s,\testimator transformer's best error=0.1319,\tbest estimator transformer's best error=0.1319\n",
|
||
"[flaml.automl: 08-21 03:06:38] {3133} INFO - iteration 8, current learner transformer\n",
|
||
"INFO:flaml.automl:iteration 8, current learner transformer\n",
|
||
"/usr/local/lib/python3.7/dist-packages/transformers/optimization.py:309: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning\n",
|
||
" FutureWarning,\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stdout",
|
||
"text": [
|
||
"{'loss': 0.5516, 'learning_rate': 2.1517120038185796e-05, 'epoch': 0.4}\n",
|
||
"{'loss': 0.3741, 'learning_rate': 1.8206793878464904e-05, 'epoch': 0.8}\n",
|
||
"{'eval_loss': 0.4729900658130646, 'eval_automl_metric': 0.1674311926605505, 'eval_runtime': 10.6305, 'eval_samples_per_second': 82.028, 'eval_steps_per_second': 82.028, 'epoch': 0.92}\n",
|
||
"{'eval_loss': 0.4729900658130646, 'eval_automl_metric': 0.1674311926605505, 'eval_runtime': 10.6789, 'eval_samples_per_second': 81.656, 'eval_steps_per_second': 81.656, 'epoch': 0.92}\n",
|
||
"{'train_runtime': 83.7502, 'train_samples_per_second': 358.208, 'train_steps_per_second': 44.776, 'train_loss': 0.450164735005164, 'epoch': 0.92}\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stderr",
|
||
"text": [
|
||
"***** Running Prediction *****\n",
|
||
" Num examples = 872\n",
|
||
" Batch size = 1\n",
|
||
"Didn't find file data/output/train_2022-08-21_03-06-38/train_462df83a_23_s=9223372036854775807,e=2.4827e-05,s=-1,s=3,e=8,d=24_2022-08-21_03-06-38/checkpoint-1146/added_tokens.json. We won't load it.\n",
|
||
"loading file data/output/train_2022-08-21_03-06-38/train_462df83a_23_s=9223372036854775807,e=2.4827e-05,s=-1,s=3,e=8,d=24_2022-08-21_03-06-38/checkpoint-1146/vocab.txt\n",
|
||
"loading file data/output/train_2022-08-21_03-06-38/train_462df83a_23_s=9223372036854775807,e=2.4827e-05,s=-1,s=3,e=8,d=24_2022-08-21_03-06-38/checkpoint-1146/tokenizer.json\n",
|
||
"loading file None\n",
|
||
"loading file data/output/train_2022-08-21_03-06-38/train_462df83a_23_s=9223372036854775807,e=2.4827e-05,s=-1,s=3,e=8,d=24_2022-08-21_03-06-38/checkpoint-1146/special_tokens_map.json\n",
|
||
"loading file data/output/train_2022-08-21_03-06-38/train_462df83a_23_s=9223372036854775807,e=2.4827e-05,s=-1,s=3,e=8,d=24_2022-08-21_03-06-38/checkpoint-1146/tokenizer_config.json\n",
|
||
"[flaml.automl: 08-21 03:08:23] {3319} INFO - at 1076.1s,\testimator transformer's best error=0.1319,\tbest estimator transformer's best error=0.1319\n",
|
||
"INFO:flaml.automl: at 1076.1s,\testimator transformer's best error=0.1319,\tbest estimator transformer's best error=0.1319\n",
|
||
"[flaml.automl: 08-21 03:08:23] {3133} INFO - iteration 9, current learner transformer\n",
|
||
"INFO:flaml.automl:iteration 9, current learner transformer\n",
|
||
"/usr/local/lib/python3.7/dist-packages/transformers/optimization.py:309: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning\n",
|
||
" FutureWarning,\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stdout",
|
||
"text": [
|
||
"{'loss': 0.4588, 'learning_rate': 9.24274365870653e-06, 'epoch': 1.6}\n",
|
||
"{'eval_loss': 0.35305526852607727, 'eval_automl_metric': 0.14220183486238536, 'eval_runtime': 10.7298, 'eval_samples_per_second': 81.269, 'eval_steps_per_second': 81.269, 'epoch': 2.0}\n",
|
||
"{'eval_loss': 0.3390190303325653, 'eval_automl_metric': 0.13876146788990829, 'eval_runtime': 10.6766, 'eval_samples_per_second': 81.674, 'eval_steps_per_second': 81.674, 'epoch': 2.22}\n",
|
||
"{'eval_loss': 0.3390190303325653, 'eval_automl_metric': 0.13876146788990829, 'eval_runtime': 10.7321, 'eval_samples_per_second': 81.252, 'eval_steps_per_second': 81.252, 'epoch': 2.22}\n",
|
||
"{'train_runtime': 76.0261, 'train_samples_per_second': 394.602, 'train_steps_per_second': 12.351, 'train_loss': 0.42033653918879177, 'epoch': 2.22}\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stderr",
|
||
"text": [
|
||
"***** Running Prediction *****\n",
|
||
" Num examples = 872\n",
|
||
" Batch size = 1\n",
|
||
"Didn't find file data/output/train_2022-08-21_03-08-23/train_84bb5b06_24_s=9223372036854775807,e=1.977e-05,s=-1,s=3,e=32,d=24_2022-08-21_03-08-23/checkpoint-694/added_tokens.json. We won't load it.\n",
|
||
"loading file data/output/train_2022-08-21_03-08-23/train_84bb5b06_24_s=9223372036854775807,e=1.977e-05,s=-1,s=3,e=32,d=24_2022-08-21_03-08-23/checkpoint-694/vocab.txt\n",
|
||
"loading file data/output/train_2022-08-21_03-08-23/train_84bb5b06_24_s=9223372036854775807,e=1.977e-05,s=-1,s=3,e=32,d=24_2022-08-21_03-08-23/checkpoint-694/tokenizer.json\n",
|
||
"loading file None\n",
|
||
"loading file data/output/train_2022-08-21_03-08-23/train_84bb5b06_24_s=9223372036854775807,e=1.977e-05,s=-1,s=3,e=32,d=24_2022-08-21_03-08-23/checkpoint-694/special_tokens_map.json\n",
|
||
"loading file data/output/train_2022-08-21_03-08-23/train_84bb5b06_24_s=9223372036854775807,e=1.977e-05,s=-1,s=3,e=32,d=24_2022-08-21_03-08-23/checkpoint-694/tokenizer_config.json\n",
|
||
"[flaml.automl: 08-21 03:10:00] {3319} INFO - at 1172.6s,\testimator transformer's best error=0.1319,\tbest estimator transformer's best error=0.1319\n",
|
||
"INFO:flaml.automl: at 1172.6s,\testimator transformer's best error=0.1319,\tbest estimator transformer's best error=0.1319\n",
|
||
"[flaml.automl: 08-21 03:10:00] {3133} INFO - iteration 10, current learner transformer\n",
|
||
"INFO:flaml.automl:iteration 10, current learner transformer\n",
|
||
"/usr/local/lib/python3.7/dist-packages/transformers/optimization.py:309: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning\n",
|
||
" FutureWarning,\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stdout",
|
||
"text": [
|
||
"{'loss': 0.5476, 'learning_rate': 1.0987792912546241e-05, 'epoch': 0.8}\n",
|
||
"{'eval_loss': 0.41232776641845703, 'eval_automl_metric': 0.1594036697247706, 'eval_runtime': 10.5607, 'eval_samples_per_second': 82.57, 'eval_steps_per_second': 82.57, 'epoch': 1.35}\n",
|
||
"{'eval_loss': 0.41232776641845703, 'eval_automl_metric': 0.1594036697247706, 'eval_runtime': 10.5165, 'eval_samples_per_second': 82.918, 'eval_steps_per_second': 82.918, 'epoch': 1.35}\n",
|
||
"{'train_runtime': 68.5081, 'train_samples_per_second': 437.905, 'train_steps_per_second': 27.369, 'train_loss': 0.47965226870796485, 'epoch': 1.35}\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stderr",
|
||
"text": [
|
||
"***** Running Prediction *****\n",
|
||
" Num examples = 872\n",
|
||
" Batch size = 1\n",
|
||
"Didn't find file data/output/train_2022-08-21_03-10-00/train_be42d944_25_s=9223372036854775807,e=1.4983e-05,s=-1,s=3,e=16,d=18_2022-08-21_03-10-00/checkpoint-841/added_tokens.json. We won't load it.\n",
|
||
"loading file data/output/train_2022-08-21_03-10-00/train_be42d944_25_s=9223372036854775807,e=1.4983e-05,s=-1,s=3,e=16,d=18_2022-08-21_03-10-00/checkpoint-841/vocab.txt\n",
|
||
"loading file data/output/train_2022-08-21_03-10-00/train_be42d944_25_s=9223372036854775807,e=1.4983e-05,s=-1,s=3,e=16,d=18_2022-08-21_03-10-00/checkpoint-841/tokenizer.json\n",
|
||
"loading file None\n",
|
||
"loading file data/output/train_2022-08-21_03-10-00/train_be42d944_25_s=9223372036854775807,e=1.4983e-05,s=-1,s=3,e=16,d=18_2022-08-21_03-10-00/checkpoint-841/special_tokens_map.json\n",
|
||
"loading file data/output/train_2022-08-21_03-10-00/train_be42d944_25_s=9223372036854775807,e=1.4983e-05,s=-1,s=3,e=16,d=18_2022-08-21_03-10-00/checkpoint-841/tokenizer_config.json\n",
|
||
"[flaml.automl: 08-21 03:11:29] {3319} INFO - at 1262.0s,\testimator transformer's best error=0.1319,\tbest estimator transformer's best error=0.1319\n",
|
||
"INFO:flaml.automl: at 1262.0s,\testimator transformer's best error=0.1319,\tbest estimator transformer's best error=0.1319\n",
|
||
"[flaml.automl: 08-21 03:11:29] {3133} INFO - iteration 11, current learner transformer\n",
|
||
"INFO:flaml.automl:iteration 11, current learner transformer\n",
|
||
"/usr/local/lib/python3.7/dist-packages/transformers/optimization.py:309: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning\n",
|
||
" FutureWarning,\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stdout",
|
||
"text": [
|
||
"{'loss': 0.4679, 'learning_rate': 2.402295436797273e-05, 'epoch': 0.8}\n",
|
||
"{'eval_loss': 0.3937930166721344, 'eval_automl_metric': 0.1513761467889908, 'eval_runtime': 10.5403, 'eval_samples_per_second': 82.73, 'eval_steps_per_second': 82.73, 'epoch': 1.14}\n",
|
||
"{'eval_loss': 0.3937930166721344, 'eval_automl_metric': 0.1513761467889908, 'eval_runtime': 10.5525, 'eval_samples_per_second': 82.634, 'eval_steps_per_second': 82.634, 'epoch': 1.14}\n",
|
||
"{'train_runtime': 61.8987, 'train_samples_per_second': 484.663, 'train_steps_per_second': 30.291, 'train_loss': 0.4275143780285799, 'epoch': 1.14}\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stderr",
|
||
"text": [
|
||
"***** Running Prediction *****\n",
|
||
" Num examples = 872\n",
|
||
" Batch size = 1\n",
|
||
"Didn't find file data/output/train_2022-08-21_03-11-29/train_f38a7ed6_26_s=9223372036854775807,e=3.2759e-05,s=-1,s=3,e=16,d=30_2022-08-21_03-11-29/checkpoint-711/added_tokens.json. We won't load it.\n",
|
||
"loading file data/output/train_2022-08-21_03-11-29/train_f38a7ed6_26_s=9223372036854775807,e=3.2759e-05,s=-1,s=3,e=16,d=30_2022-08-21_03-11-29/checkpoint-711/vocab.txt\n",
|
||
"loading file data/output/train_2022-08-21_03-11-29/train_f38a7ed6_26_s=9223372036854775807,e=3.2759e-05,s=-1,s=3,e=16,d=30_2022-08-21_03-11-29/checkpoint-711/tokenizer.json\n",
|
||
"loading file None\n",
|
||
"loading file data/output/train_2022-08-21_03-11-29/train_f38a7ed6_26_s=9223372036854775807,e=3.2759e-05,s=-1,s=3,e=16,d=30_2022-08-21_03-11-29/checkpoint-711/special_tokens_map.json\n",
|
||
"loading file data/output/train_2022-08-21_03-11-29/train_f38a7ed6_26_s=9223372036854775807,e=3.2759e-05,s=-1,s=3,e=16,d=30_2022-08-21_03-11-29/checkpoint-711/tokenizer_config.json\n",
|
||
"[flaml.automl: 08-21 03:12:52] {3319} INFO - at 1344.3s,\testimator transformer's best error=0.1319,\tbest estimator transformer's best error=0.1319\n",
|
||
"INFO:flaml.automl: at 1344.3s,\testimator transformer's best error=0.1319,\tbest estimator transformer's best error=0.1319\n",
|
||
"[flaml.automl: 08-21 03:12:52] {3133} INFO - iteration 12, current learner transformer\n",
|
||
"INFO:flaml.automl:iteration 12, current learner transformer\n",
|
||
"/usr/local/lib/python3.7/dist-packages/transformers/optimization.py:309: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning\n",
|
||
" FutureWarning,\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stdout",
|
||
"text": [
|
||
"{'loss': 0.4939, 'learning_rate': 2.1277689409714175e-05, 'epoch': 0.12}\n",
|
||
"{'loss': 0.3558, 'learning_rate': 2.040062059645308e-05, 'epoch': 0.24}\n",
|
||
"{'loss': 0.3081, 'learning_rate': 1.9523551783191982e-05, 'epoch': 0.36}\n",
|
||
"{'loss': 0.2893, 'learning_rate': 1.8646482969930888e-05, 'epoch': 0.48}\n",
|
||
"{'loss': 0.2667, 'learning_rate': 1.776941415666979e-05, 'epoch': 0.59}\n",
|
||
"{'loss': 0.2576, 'learning_rate': 1.6892345343408696e-05, 'epoch': 0.71}\n",
|
||
"{'loss': 0.2435, 'learning_rate': 1.60152765301476e-05, 'epoch': 0.83}\n",
|
||
"{'loss': 0.2409, 'learning_rate': 1.5138207716886507e-05, 'epoch': 0.95}\n",
|
||
"{'loss': 0.2148, 'learning_rate': 1.4261138903625411e-05, 'epoch': 1.07}\n",
|
||
"{'loss': 0.2032, 'learning_rate': 1.3384070090364317e-05, 'epoch': 1.19}\n",
|
||
"{'loss': 0.1991, 'learning_rate': 1.2507001277103219e-05, 'epoch': 1.31}\n",
|
||
"{'loss': 0.2109, 'learning_rate': 1.1629932463842124e-05, 'epoch': 1.43}\n",
|
||
"{'loss': 0.1921, 'learning_rate': 1.0752863650581028e-05, 'epoch': 1.54}\n",
|
||
"{'loss': 0.1924, 'learning_rate': 9.875794837319934e-06, 'epoch': 1.66}\n",
|
||
"{'loss': 0.1903, 'learning_rate': 8.99872602405884e-06, 'epoch': 1.78}\n",
|
||
"{'loss': 0.1865, 'learning_rate': 8.121657210797743e-06, 'epoch': 1.9}\n",
|
||
"{'eval_loss': 0.317385196685791, 'eval_automl_metric': 0.08944954128440363, 'eval_runtime': 10.4635, 'eval_samples_per_second': 83.338, 'eval_steps_per_second': 83.338, 'epoch': 1.93}\n",
|
||
"{'eval_loss': 0.317385196685791, 'eval_automl_metric': 0.08944954128440363, 'eval_runtime': 10.5911, 'eval_samples_per_second': 82.333, 'eval_steps_per_second': 82.333, 'epoch': 1.93}\n",
|
||
"{'train_runtime': 477.591, 'train_samples_per_second': 423.054, 'train_steps_per_second': 26.445, 'train_loss': 0.2517916716532359, 'epoch': 1.93}\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stderr",
|
||
"text": [
|
||
"***** Running Prediction *****\n",
|
||
" Num examples = 872\n",
|
||
" Batch size = 1\n",
|
||
"Didn't find file data/output/train_2022-08-21_03-12-52/train_249deb52_27_s=9223372036854775807,e=2.2155e-05,s=-1,s=3,e=16,d=24_2022-08-21_03-12-52/checkpoint-8112/added_tokens.json. We won't load it.\n",
|
||
"loading file data/output/train_2022-08-21_03-12-52/train_249deb52_27_s=9223372036854775807,e=2.2155e-05,s=-1,s=3,e=16,d=24_2022-08-21_03-12-52/checkpoint-8112/vocab.txt\n",
|
||
"loading file data/output/train_2022-08-21_03-12-52/train_249deb52_27_s=9223372036854775807,e=2.2155e-05,s=-1,s=3,e=16,d=24_2022-08-21_03-12-52/checkpoint-8112/tokenizer.json\n",
|
||
"loading file None\n",
|
||
"loading file data/output/train_2022-08-21_03-12-52/train_249deb52_27_s=9223372036854775807,e=2.2155e-05,s=-1,s=3,e=16,d=24_2022-08-21_03-12-52/checkpoint-8112/special_tokens_map.json\n",
|
||
"loading file data/output/train_2022-08-21_03-12-52/train_249deb52_27_s=9223372036854775807,e=2.2155e-05,s=-1,s=3,e=16,d=24_2022-08-21_03-12-52/checkpoint-8112/tokenizer_config.json\n",
|
||
"[flaml.automl: 08-21 03:21:35] {729} WARNING - checkpoint data/output/train_2022-08-21_03-04-34/train_fc3698e0_22_s=9223372036854775807,e=2.2155e-05,s=-1,s=3,e=16,d=24_2022-08-21_03-04-34/checkpoint-1250 not found\n",
|
||
"WARNING:flaml.automl:checkpoint data/output/train_2022-08-21_03-04-34/train_fc3698e0_22_s=9223372036854775807,e=2.2155e-05,s=-1,s=3,e=16,d=24_2022-08-21_03-04-34/checkpoint-1250 not found\n",
|
||
"[flaml.automl: 08-21 03:21:35] {3319} INFO - at 1867.4s,\testimator transformer's best error=0.0894,\tbest estimator transformer's best error=0.0894\n",
|
||
"INFO:flaml.automl: at 1867.4s,\testimator transformer's best error=0.0894,\tbest estimator transformer's best error=0.0894\n",
|
||
"[flaml.automl: 08-21 03:21:35] {3434} INFO - selected model: None\n",
|
||
"INFO:flaml.automl:selected model: None\n",
|
||
"[flaml.automl: 08-21 03:21:35] {2862} INFO - fit succeeded\n",
|
||
"INFO:flaml.automl:fit succeeded\n",
|
||
"[flaml.automl: 08-21 03:21:35] {2864} INFO - Time taken to find the best model: 1867.4163627624512\n",
|
||
"INFO:flaml.automl:Time taken to find the best model: 1867.4163627624512\n",
|
||
"[flaml.automl: 08-21 03:21:35] {2878} WARNING - Time taken to find the best model is 104% of the provided time budget and not all estimators' hyperparameter search converged. Consider increasing the time budget.\n",
|
||
"WARNING:flaml.automl:Time taken to find the best model is 104% of the provided time budget and not all estimators' hyperparameter search converged. Consider increasing the time budget.\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"'''The main flaml automl API'''\n",
|
||
"automl.fit(X_train=X_train, y_train=y_train, X_val=X_val, y_val=y_val, **automl_settings)"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"metadata": {
|
||
"id": "Ehn1SDb5xAH9"
|
||
},
|
||
"source": [
|
||
"The run searched for 9 trials. We can print the best trial's loss, which is 1-the accuracy. The accuracy we got is 91.0% which is close to 91.2% reported by [the Electra model github](https://github.com/google-research/electra). "
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 25,
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/"
|
||
},
|
||
"id": "qbTAqBsnTjhG",
|
||
"outputId": "08c88e0e-0fd8-4b76-8275-e8893c4fe0b1"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stdout",
|
||
"text": [
|
||
"The best loss by FLAML: 0.9105504587155964\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"print(\"The best loss by FLAML: {}\".format(1-automl.best_loss))"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"metadata": {
|
||
"id": "wcO2th5M6AIu"
|
||
},
|
||
"source": [
|
||
"If you have more GPUs on your server, you can use flaml.tune with the ray tune option, which will often give you a better score. For example, with the 4 NVIDIA V100 GPUs, the accuracy was 92.2%. For that experiment, you can open this notebook on your GPU server and set \"use_ray\" to {\"local_dir\": \"data/output/\"} and n_concurrent_trials to more than 1. "
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"metadata": {
|
||
"id": "QFP5JNdPTjhG"
|
||
},
|
||
"source": [
|
||
"### Best model and metric"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"metadata": {
|
||
"id": "mY07pTY_xlIV"
|
||
},
|
||
"source": [
|
||
"Next, we can print the best hyperparameter and the best score:"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 26,
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/"
|
||
},
|
||
"id": "sbnhP3WrTjhG",
|
||
"outputId": "e7be276a-d30f-4dde-acea-d7b00107a161"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stdout",
|
||
"text": [
|
||
"Best hyperparmeter config: {'learning_rate': 2.215475822297527e-05, 'num_train_epochs': 3, 'per_device_train_batch_size': 16, 'seed': 24, 'global_max_steps': 8112, 'FLAML_sample_size': 67349}\n",
|
||
"Best accuracy on validation data: 0.9106\n",
|
||
"Training duration of best run: 523.1 s\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"'''retrieve best config and best learner'''\n",
|
||
"print('Best hyperparmeter config:', automl.best_config)\n",
|
||
"print('Best accuracy on validation data: {0:.4g}'.format(1-automl.best_loss))\n",
|
||
"print('Training duration of best run: {0:.4g} s'.format(automl.best_config_train_time))"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"source": [
|
||
"Save and load the model:"
|
||
],
|
||
"metadata": {
|
||
"id": "MqIpmxl0dKWu"
|
||
}
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"source": [
|
||
"import pickle\n",
|
||
"automl.pickle(\"automl.pkl\")\n",
|
||
"\n",
|
||
"with open(\"automl.pkl\", \"rb\") as f:\n",
|
||
" automl = pickle.load(f)"
|
||
],
|
||
"metadata": {
|
||
"id": "gfUNXfcNTBA2"
|
||
},
|
||
"execution_count": 27,
|
||
"outputs": []
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"metadata": {
|
||
"id": "6mdBURdexxJS"
|
||
},
|
||
"source": [
|
||
"Run the prediction:\n",
|
||
"\n",
|
||
"\n"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 28,
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/"
|
||
},
|
||
"id": "kRl7pnEKTjhH",
|
||
"outputId": "19b17fb3-cf01-472c-958d-1511be54379d",
|
||
"scrolled": true
|
||
},
|
||
"outputs": [
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stderr",
|
||
"text": [
|
||
"***** Running Prediction *****\n",
|
||
" Num examples = 872\n",
|
||
" Batch size = 1\n",
|
||
"Didn't find file data/output/train_2022-08-21_03-12-52/train_249deb52_27_s=9223372036854775807,e=2.2155e-05,s=-1,s=3,e=16,d=24_2022-08-21_03-12-52/checkpoint-8112/added_tokens.json. We won't load it.\n",
|
||
"loading file data/output/train_2022-08-21_03-12-52/train_249deb52_27_s=9223372036854775807,e=2.2155e-05,s=-1,s=3,e=16,d=24_2022-08-21_03-12-52/checkpoint-8112/vocab.txt\n",
|
||
"loading file data/output/train_2022-08-21_03-12-52/train_249deb52_27_s=9223372036854775807,e=2.2155e-05,s=-1,s=3,e=16,d=24_2022-08-21_03-12-52/checkpoint-8112/tokenizer.json\n",
|
||
"loading file None\n",
|
||
"loading file data/output/train_2022-08-21_03-12-52/train_249deb52_27_s=9223372036854775807,e=2.2155e-05,s=-1,s=3,e=16,d=24_2022-08-21_03-12-52/checkpoint-8112/special_tokens_map.json\n",
|
||
"loading file data/output/train_2022-08-21_03-12-52/train_249deb52_27_s=9223372036854775807,e=2.2155e-05,s=-1,s=3,e=16,d=24_2022-08-21_03-12-52/checkpoint-8112/tokenizer_config.json\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stdout",
|
||
"text": [
|
||
"Predicted labels [1 1 1 1 0 1 0 0 1 0 1 0 0 0 0 1 1 1 0 0 0 0 0 1 1 0 0 1 0 0 1 0 1 0 0 0 1\n",
|
||
" 0 1 1 1 1 1 1 0 0 0 1 1 0 0 1 1 1 0 1 0 0 0 0 1 0 1 1 1 0 1 1 1 0 0 1 1 1\n",
|
||
" 0 1 0 1 1 0 1 0 0 1 1 1 0 1 1 1 1 1 1 0 1 1 1 1 0 0 1 0 1 1 1 0 1 0 0 1 0\n",
|
||
" 0 1 0 1 1 1 1 0 0 1 1 1 0 1 1 0 0 1 1 0 0 1 0 0 1 0 0 1 0 0 0 1 1 0 0 1 0\n",
|
||
" 0 1 1 1 1 0 1 0 1 0 1 1 0 0 0 0 1 0 0 0 1 1 1 1 1 0 1 1 1 0 0 1 0 0 0 1 0\n",
|
||
" 1 1 1 0 0 0 1 1 1 1 1 1 0 1 0 1 1 0 0 1 1 1 1 0 0 1 0 0 1 0 0 1 0 1 1 1 0\n",
|
||
" 1 1 1 1 0 1 1 0 1 1 0 0 1 1 1 0 0 1 1 0 0 1 1 1 1 0 0 1 1 0 1 1 0 0 0 0 0\n",
|
||
" 1 0 1 0 1 0 0 0 0 0 0 1 0 0 1 1 1 1 1 0 1 1 0 0 1 0 0 1 1 1 1 1 0 1 1 1 1\n",
|
||
" 0 1 0 1 1 1 1 1 1 0 1 1 1 1 1 1 0 1 1 0 1 1 0 0 1 0 0 1 0 1 1 1 0 0 0 1 1\n",
|
||
" 1 1 0 1 0 0 1 0 1 0 0 1 1 0 0 0 0 0 1 1 1 1 0 0 0 0 1 1 0 1 0 0 0 1 1 0 1\n",
|
||
" 0 1 1 0 0 0 0 0 0 0 0 1 0 1 0 1 1 1 0 1 0 0 0 1 1 1 1 1 1 0 1 1 1 0 1 0 1\n",
|
||
" 0 0 0 1 1 0 0 1 1 1 1 1 0 0 1 1 0 1 1 1 1 1 0 1 0 0 1 0 1 0 1 1 1 1 0 1 1\n",
|
||
" 0 1 1 1 0 1 1 1 0 1 0 1 0 1 1 1 1 0 0 0 0 0 1 1 1 0 1 0 1 1 0 1 0 1 1 1 1\n",
|
||
" 1 1 1 1 1 0 0 1 0 0 0 1 0 1 1 1 0 1 1 0 0 0 0 1 1 1 1 1 1 1 1 0 1 1 0 0 1\n",
|
||
" 1 1 1 0 0 1 1 1 0 0 1 0 1 0 1 1 1 1 0 1 0 0 1 1 0 0 1 0 1 0 0 0 0 0 0 1 0\n",
|
||
" 0 0 1 1 0 0 0 0 1 0 1 0 1 0 1 1 1 0 1 0 0 0 1 0 0 1 0 0 0 0 0 0 0 1 1 0 0\n",
|
||
" 0 1 1 0 0 0 0 0 1 0 1 1 1 1 0 1 1 1 0 1 1 1 1 0 1 1 1 0 0 1 1 1 1 1 1 0 1\n",
|
||
" 0 1 0 0 1 1 0 1 0 1 0 1 1 0 0 1 0 1 0 1 0 0 0 0 0 0 1 1 0 1 0 1 0 0 1 1 1\n",
|
||
" 1 1 0 1 0 1 0 0 1 1 0 1 1 1 1 0 0 1 1 0 1 0 0 1 0 1 1 0 0 1 1 1 1 0 0 0 0\n",
|
||
" 0 1 0 0 0 1 0 0 0 0 0 1 0 1 1 1 0 1 1 0 1 0 1 1 0 1 1 0 0 0 1 0 0 1 1 1 1\n",
|
||
" 1 1 1 0 1 0 1 1 0 0 0 0 0 1 1 1 0 0 1 0 0 0 1 1 0 0 1 1 1 1 0 1 1 1 0 0 0\n",
|
||
" 1 1 0 1 0 1 1 1 1 0 0 1 0 0 1 1 1 1 0 1 0 0 1 0 0 0 1 0 1 1 1 1 1 1 0 1 1\n",
|
||
" 0 1 1 1 0 0 1 1 0 1 0 1 1 1 0 1 1 1 0 0 1 1 0 1 0 0 0 1 0 0 0 0 1 1 1 1 0\n",
|
||
" 0 0 1 0 1 1 1 1 1 1 1 1 0 0 0 1 0 1 0 1 1]\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"'''compute predictions of testing dataset''' \n",
|
||
"y_pred = automl.predict(X_val, **{\"per_device_eval_batch_size\": 1})\n",
|
||
"print('Predicted labels', y_pred)"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"metadata": {
|
||
"id": "QThcVssKTjhH"
|
||
},
|
||
"source": [
|
||
"### Log history"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"metadata": {
|
||
"id": "OEFqWAuLyYIQ"
|
||
},
|
||
"source": [
|
||
"You can also save and plot the history:"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 29,
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/"
|
||
},
|
||
"id": "58wpj4vPTjhH",
|
||
"outputId": "bbd9850e-dc0e-416b-d9eb-342bb4a3a052"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stdout",
|
||
"text": [
|
||
"{'Current Learner': 'transformer', 'Current Sample': 10000, 'Current Hyper-parameters': {'learning_rate': 9.999999999999999e-06, 'num_train_epochs': 3, 'per_device_train_batch_size': 32, 'seed': 20, 'global_max_steps': 939, 'FLAML_sample_size': 10000}, 'Best Learner': 'transformer', 'Best Hyper-parameters': {'learning_rate': 9.999999999999999e-06, 'num_train_epochs': 3, 'per_device_train_batch_size': 32, 'seed': 20, 'global_max_steps': 939, 'FLAML_sample_size': 10000}}\n",
|
||
"{'Current Learner': 'transformer', 'Current Sample': 10000, 'Current Hyper-parameters': {'learning_rate': 9.711865003865157e-06, 'num_train_epochs': 3, 'per_device_train_batch_size': 64, 'seed': 14, 'global_max_steps': 471, 'FLAML_sample_size': 10000}, 'Best Learner': 'transformer', 'Best Hyper-parameters': {'learning_rate': 9.999999999999999e-06, 'num_train_epochs': 3, 'per_device_train_batch_size': 32, 'seed': 20, 'global_max_steps': 939, 'FLAML_sample_size': 10000}}\n",
|
||
"{'Current Learner': 'transformer', 'Current Sample': 10000, 'Current Hyper-parameters': {'learning_rate': 1.0296683485633468e-05, 'num_train_epochs': 3, 'per_device_train_batch_size': 16, 'seed': 26, 'global_max_steps': 1875, 'FLAML_sample_size': 10000}, 'Best Learner': 'transformer', 'Best Hyper-parameters': {'learning_rate': 1.0296683485633468e-05, 'num_train_epochs': 3, 'per_device_train_batch_size': 16, 'seed': 26, 'global_max_steps': 1875, 'FLAML_sample_size': 10000}}\n",
|
||
"{'Current Learner': 'transformer', 'Current Sample': 10000, 'Current Hyper-parameters': {'learning_rate': 1.4799994583688665e-05, 'num_train_epochs': 2, 'per_device_train_batch_size': 16, 'seed': 25, 'global_max_steps': 1250, 'FLAML_sample_size': 10000}, 'Best Learner': 'transformer', 'Best Hyper-parameters': {'learning_rate': 1.0296683485633468e-05, 'num_train_epochs': 3, 'per_device_train_batch_size': 16, 'seed': 26, 'global_max_steps': 1875, 'FLAML_sample_size': 10000}}\n",
|
||
"{'Current Learner': 'transformer', 'Current Sample': 10000, 'Current Hyper-parameters': {'learning_rate': 7.163630378633069e-06, 'num_train_epochs': 4, 'per_device_train_batch_size': 16, 'seed': 27, 'global_max_steps': 1803, 'FLAML_sample_size': 10000}, 'Best Learner': 'transformer', 'Best Hyper-parameters': {'learning_rate': 1.0296683485633468e-05, 'num_train_epochs': 3, 'per_device_train_batch_size': 16, 'seed': 26, 'global_max_steps': 1875, 'FLAML_sample_size': 10000}}\n",
|
||
"{'Current Learner': 'transformer', 'Current Sample': 10000, 'Current Hyper-parameters': {'learning_rate': 1.789274561853069e-05, 'num_train_epochs': 3, 'per_device_train_batch_size': 16, 'seed': 32, 'global_max_steps': 1391, 'FLAML_sample_size': 10000}, 'Best Learner': 'transformer', 'Best Hyper-parameters': {'learning_rate': 1.0296683485633468e-05, 'num_train_epochs': 3, 'per_device_train_batch_size': 16, 'seed': 26, 'global_max_steps': 1875, 'FLAML_sample_size': 10000}}\n",
|
||
"{'Current Learner': 'transformer', 'Current Sample': 10000, 'Current Hyper-parameters': {'learning_rate': 5.925400889482007e-06, 'num_train_epochs': 3, 'per_device_train_batch_size': 16, 'seed': 20, 'global_max_steps': 1250, 'FLAML_sample_size': 10000}, 'Best Learner': 'transformer', 'Best Hyper-parameters': {'learning_rate': 1.0296683485633468e-05, 'num_train_epochs': 3, 'per_device_train_batch_size': 16, 'seed': 26, 'global_max_steps': 1875, 'FLAML_sample_size': 10000}}\n",
|
||
"{'Current Learner': 'transformer', 'Current Sample': 10000, 'Current Hyper-parameters': {'learning_rate': 2.215475822297527e-05, 'num_train_epochs': 3, 'per_device_train_batch_size': 16, 'seed': 24, 'global_max_steps': 1250, 'FLAML_sample_size': 10000}, 'Best Learner': 'transformer', 'Best Hyper-parameters': {'learning_rate': 2.215475822297527e-05, 'num_train_epochs': 3, 'per_device_train_batch_size': 16, 'seed': 24, 'global_max_steps': 1250, 'FLAML_sample_size': 10000}}\n",
|
||
"{'Current Learner': 'transformer', 'Current Sample': 10000, 'Current Hyper-parameters': {'learning_rate': 2.4827446197906688e-05, 'num_train_epochs': 3, 'per_device_train_batch_size': 8, 'seed': 24, 'global_max_steps': 1146, 'FLAML_sample_size': 10000}, 'Best Learner': 'transformer', 'Best Hyper-parameters': {'learning_rate': 2.215475822297527e-05, 'num_train_epochs': 3, 'per_device_train_batch_size': 16, 'seed': 24, 'global_max_steps': 1250, 'FLAML_sample_size': 10000}}\n",
|
||
"{'Current Learner': 'transformer', 'Current Sample': 10000, 'Current Hyper-parameters': {'learning_rate': 1.9769786550171826e-05, 'num_train_epochs': 3, 'per_device_train_batch_size': 32, 'seed': 24, 'global_max_steps': 694, 'FLAML_sample_size': 10000}, 'Best Learner': 'transformer', 'Best Hyper-parameters': {'learning_rate': 2.215475822297527e-05, 'num_train_epochs': 3, 'per_device_train_batch_size': 16, 'seed': 24, 'global_max_steps': 1250, 'FLAML_sample_size': 10000}}\n",
|
||
"{'Current Learner': 'transformer', 'Current Sample': 10000, 'Current Hyper-parameters': {'learning_rate': 1.4983353971653967e-05, 'num_train_epochs': 3, 'per_device_train_batch_size': 16, 'seed': 18, 'global_max_steps': 841, 'FLAML_sample_size': 10000}, 'Best Learner': 'transformer', 'Best Hyper-parameters': {'learning_rate': 2.215475822297527e-05, 'num_train_epochs': 3, 'per_device_train_batch_size': 16, 'seed': 24, 'global_max_steps': 1250, 'FLAML_sample_size': 10000}}\n",
|
||
"{'Current Learner': 'transformer', 'Current Sample': 10000, 'Current Hyper-parameters': {'learning_rate': 3.2758574138144633e-05, 'num_train_epochs': 3, 'per_device_train_batch_size': 16, 'seed': 30, 'global_max_steps': 711, 'FLAML_sample_size': 10000}, 'Best Learner': 'transformer', 'Best Hyper-parameters': {'learning_rate': 2.215475822297527e-05, 'num_train_epochs': 3, 'per_device_train_batch_size': 16, 'seed': 24, 'global_max_steps': 1250, 'FLAML_sample_size': 10000}}\n",
|
||
"{'Current Learner': 'transformer', 'Current Sample': 67349, 'Current Hyper-parameters': {'learning_rate': 2.215475822297527e-05, 'num_train_epochs': 3, 'per_device_train_batch_size': 16, 'seed': 24, 'global_max_steps': 8112, 'FLAML_sample_size': 67349}, 'Best Learner': 'transformer', 'Best Hyper-parameters': {'learning_rate': 2.215475822297527e-05, 'num_train_epochs': 3, 'per_device_train_batch_size': 16, 'seed': 24, 'global_max_steps': 8112, 'FLAML_sample_size': 67349}}\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"from flaml.data import get_output_from_log\n",
|
||
"time_history, best_valid_loss_history, valid_loss_history, config_history, metric_history = \\\n",
|
||
" get_output_from_log(filename=automl_settings['log_file_name'], time_budget=3000)\n",
|
||
"for config in config_history:\n",
|
||
" print(config)"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": 30,
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/",
|
||
"height": 312
|
||
},
|
||
"id": "dtWSrLsdTjhH",
|
||
"outputId": "dfe4f9c5-f9b7-4a7d-d519-4d4a24647aa4"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"output_type": "stream",
|
||
"name": "stdout",
|
||
"text": [
|
||
"13\n"
|
||
]
|
||
},
|
||
{
|
||
"output_type": "display_data",
|
||
"data": {
|
||
"text/plain": [
|
||
"<Figure size 432x288 with 1 Axes>"
|
||
],
|
||
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAYgAAAEWCAYAAAB8LwAVAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+WH4yJAAAgAElEQVR4nO3de5hdVX3/8feHEMiAhACJNCSExAIpQSzBCGJVEC+J+QlERAvWimhB28pPRUNJQUyx1EuqffSR4g8sRVCuKYSokUgRsYUgCQQSEgwGjJAJQhAiF0dy+/7+2OuEncOeM3smc24zn9fznGf2XvtyvmfPzPnuvdbaaysiMDMzq7ZTswMwM7PW5ARhZmaFnCDMzKyQE4SZmRVygjAzs0JOEGZmVsgJwqwPJL1F0qpmx2FWT04Q1nYkrZH0jmbGEBH/ExET67V/SVMl/VzS85LWS7pD0gn1ej+zIk4QZgUkDWnie58M3ABcCYwF9gUuAI7vw74kyf/n1if+w7EBQ9JOks6V9Iik30m6XtLeueU3SPqtpN+ns/NDc8uukHSJpAWSXgTelq5UPidpWdrmOknD0vrHSlqb277bddPycyQ9IWmdpL+RFJIOLPgMAr4OfDEivhMRv4+IrRFxR0SckdaZLel7uW3Gp/3tnOZ/JukiSXcCfwBmSlpS9T6fkTQ/Te8q6V8lPSbpSUnfltSxg78OGwCcIGwgOQuYARwD7Ac8C1ycW/5j4CDg1cB9wPertv8gcBGwB/C/qewDwDRgAvA64CM13r9wXUnTgLOBdwAHAsfW2MdEYH9gbo11yvhr4Eyyz/JtYKKkg3LLPwhcnaa/DBwMHJ7iG0N2xWKDnBOEDSSfAM6LiLUR8RIwGzi5cmYdEZdHxPO5ZX8uac/c9jdHxJ3pjP2PqeybEbEuIp4BfkD2Jdqd7tb9APCfEbEiIv6Q3rs7+6SfT5T90N24Ir3f5oj4PXAzcCpAShR/BsxPVyxnAp+JiGci4nngX4BTdvD9bQBwgrCB5ADgJkkbJG0AHgK2APtKGiLpy6n66TlgTdpmZG77xwv2+dvc9B+AV9V4/+7W3a9q30XvU/G79HN0jXXKqH6Pq0kJguzqYV5KVqOA3YB7c8ftllRug5wThA0kjwPvjogRudewiOgk+1I8kayaZ09gfNpGue3rNbTxE2SNzRX711h3FdnneF+NdV4k+1Kv+JOCdao/y63AKEmHkyWKSvXS00AXcGjumO0ZEbUSoQ0SThDWroZKGpZ77UxW136RpAMAJI2SdGJafw/gJbIz9N3IqlEa5XrgdEmHSNoN+Hx3K0Y2/v7ZwOclnS5peGp8f7OkS9Nq9wNvlTQuVZHN6imAiNhE1jNqDrA3WcIgIrYClwH/JunVAJLGSJra509rA4YThLWrBWRnvpXXbOAbwHzgJ5KeB+4GjkrrXwn8BugEVqZlDRERPwa+CdwOrM6990vdrD8X+Evgo8A64Engn8naEYiIW4HrgGXAvcAPS4ZyNdkV1A0RsTlX/g+VuFL123+TNZbbICc/MMissSQdAjwI7Fr1RW3WUnwFYdYAkt6b7jfYC/gK8AMnB2t1ThBmjfFx4CngEbKeVX/b3HDMeuYqJjMzK+QrCDMzK7RzswPoLyNHjozx48c3Owwzs7Zy7733Ph0RhTdGDpgEMX78eJYsWdLzimZmto2k33S3zFVMZmZWyAnCzMwKOUGYmVkhJwgzMyvkBGFmZoUGTC8mM7PBZt7STuYsXMW6DV3sN6KDmVMnMmPymH7bvxOEmVkbmre0k1k3Lqdr0xYAOjd0MevG5QD9liRcxWRm1obmLFy1LTlUdG3awpyFq/rtPZwgzMza0LoNXb0q7wsnCDOzNrTfiI5elfeFE4SZWRuaOXUiHUOHbFfWMXQIM6f238MA3UhtZtaGKg3R58xdxsYtWxnjXkxmZlYxY/IYrrnnMQCu+/jR/b5/VzGZmVkhJwgzMyvkBGFmZoWcIMzMrJAThJmZFXKCMDOzQk4QZmZWyAnCzMwKOUGYmVkhJwgzMyvkBGFmZoWcIMzMrJAThJmZFXKCMDOzQnVNEJKmSVolabWkcwuWHyDpNknLJP1M0tjcstMk/Sq9TqtnnGZm9kp1SxCShgAXA+8GJgGnSppUtdq/AldGxOuAC4EvpW33Br4AHAUcCXxB0l71itXMzF6pnlcQRwKrI+LRiNgIXAucWLXOJOCnafr23PKpwK0R8UxEPAvcCkyrY6xmZlalngliDPB4bn5tKst7ADgpTb8X2EPSPiW3RdKZkpZIWrJ+/fp+C9zMzJrfSP054BhJS4FjgE5gS9mNI+LSiJgSEVNGjRpVrxjNzAalej6TuhPYPzc/NpVtExHrSFcQkl4FvC8iNkjqBI6t2vZndYzVzMyq1PMKYjFwkKQJknYBTgHm51eQNFJSJYZZwOVpeiHwLkl7pcbpd6UyMzNrkLoliIjYDHyS7Iv9IeD6iFgh6UJJJ6TVjgVWSXoY2Be4KG37DPBFsiSzGLgwlZmZWYPUs4qJiFgALKgquyA3PReY2822l/PyFYWZmTVYsxupzcysRTlBmJlZIScIMzMr5ARhZmaFnCDMzKyQE4SZmRVygjAzs0JOEGZmVsgJwszMCjlBmJlZIScIMzMr5ARhZmaFnCDMzKyQE4SZmRVygjAzs0JOEGZmVsgJwszMCjlBmJlZIScIMzMr5ARhZmaFnCDMzKyQE4SZmRVygjAzs0I9JghJ+zQiEDMzay1lriDulnSDpOmSVPeIzMysJZRJEAcDlwJ/DfxK0r9IOri+YZmZWbP1mCAic2tEnAqcAZwG3CPpDklH1z1CMzNrip17WiG1QXyI7AriSeAsYD5wOHADMKGeAZqZWXP0mCCARcBVwIyIWJsrXyLp2/UJy8zMmq1MgpgYEVG0ICK+0s/xmJlZiyjTSP0TSSMqM5L2krSwjjGZmVkLKJMgRkXEhspMRDwLvLp+IZmZWSsokyC2SBpXmZF0AFBY5WRmZgNHmQRxHvC/kq6S9D3g58CsMjuXNE3SKkmrJZ1bsHycpNslLZW0TNL0VD5U0nclLZf0kKRS72dmZv2nx0bqiLhF0hHAG1PRpyPi6Z62kzQEuBh4J7AWWCxpfkSszK12PnB9RFwiaRKwABgPvB/YNSIOk7QbsFLSNRGxphefzczMdkDZwfq2AE8BzwGTJL21xDZHAqsj4tGI2AhcC5xYtU4Aw9P0nsC6XPnuknYGOoCN6b3NzKxBytwo9zfAp4CxwP1kVxKLgON62HQM8Hhufi1wVNU6s8l6SZ0F7A68I5XPJUsmTwC7AZ+JiGcKYjsTOBNg3Lhx1YvNzGwHlLmC+BTwBuA3EfE2YDKwofYmpZ0KXBERY4HpwFWSdiK7+tgC7Ed2p/ZnJb2meuOIuDQipkTElFGjRvVTSGZmBuUSxB8j4o8AknaNiF8CE0ts1wnsn5sfm8ryPgZcDxARi4BhwEjgg8AtEbEpIp4C7gSmlHhPMzPrJ2USxNp0o9w84FZJNwO/KbHdY
uAgSRMk7QKcQjaGU95jwNsBJB1CliDWp/LjUvnuZNVavyzxnmZm1k/K9GJ6b5qcLel2ssbkW0pst1nSJ4GFwBDg8ohYIelCYElEzAc+C1wm6TNkDdMfiYiQdDHwn5JWAAL+MyKW9eUDmplZ39RMEKmr6oqI+DOAiLijNzuPiAVkXVfzZRfkplcCf1Gw3QtkXV3NzKxJalYxRcQWYFX+TmozMxscyozmuhewQtI9wIuVwog4oW5RmZlZ05VJEJ+vexRmZtZyyjRS96rdwczMBoYyd1I/z8ujt+4CDAVejIjh3W9lZmbtrswVxB6VaUkiGwLjjd1vYWZmA0HZwfoAiMw8YGqd4jEzsxZRporppNzsTmRDXvyxbhGZmVlLKNOL6fjc9GZgDa8cttvMzAaYMm0QpzciEDMzay09tkGkR3+OyM3vJeny+oZlZmbNVqaR+nURse35DxHxLNkzIczMbAArkyB2krRXZUbS3pRruzAzszZW5ov+a8AiSTek+fcDF9UvJDMzawVlGqmvlLSEl59BfVIaptvMzAawMvdBvJHsmRDfSvPDJR0VEb+oe3RmbWje0k7mLFzFug1d7Deig5lTJzJj8phmh2XWa2XaIC4BXsjNv5DKzKzKvKWdzLpxOZ0bugigc0MXs25czryl1Y9jN2t9ZdogFBGVwfqIiK2S3EhtVmDOwlV0bdqyXVnXpi2cM3cZ19zzWJOisoFs5RPPMWl0fcZOLXMF8aik/ytpaHp9Cni0LtGYtbl1G7oKyzdu2drgSGywmDR6OCceXp8qzDJXAp8AvgmcTzbs923AGXWJxqzN7Teig86CJDFmRAfXffzoJkRk1nc9XkFExFMRcUpEvDoi9gU+Bhxb98jM2tDMqRPpGDpku7KOoUOYOXVikyIy67tSw31LGiJpuqSrgF8Df1nfsMza04zJY/jSSYexy5DsX2vMiA6+dNJh7sVkbalmFZOkY4APAtOBe4C/AF4TEX9oQGxmbWnG5DHbGqRdrWTtrNsEIWkt8BhZl9bPRcTzkn7t5GBmNjjUqmKaC+xHVp10vKTdefnZ1GZmNsB1myAi4tPABLKxmI4FVgGjJH1A0qsaE56ZmTVLzUbq9Azq2yPiTLJkcSrZ0+TWNCA2MzNrotJ3REfEJuCHwA8lddQvJDMzawWlurlWi4ji20XNzGzA6FOCMDOzgc+D7pkNYB563HZEmedBHAzMBA7Irx8Rx3W7kZk1XWXo8crospWhxwEnCSulTBXTDcB9ZIP1zcy9eiRpmqRVklZLOrdg+ThJt0taKmmZpOm5Za+TtEjSCknLJQ0r95HMDLofenzOwlVNisjaTZkqps0R0esHBEkaAlwMvBNYCyyWNL/qcaXnA9dHxCWSJgELgPHpeRPfA/46Ih6QtA+wqbcxmA1m3Q093l25WbUyVxA/kPR3kkZL2rvyKrHdkcDqiHg0IjYC15LdQ5EXQOVJF3sC69L0u4BlEfEAQET8LiK2YGal7TeiuDd6d+Vm1cokiNPIqpTuAu5NryUlthsDPJ6bX5vK8mYDH0rjPi0AzkrlBwMhaaGk+ySdU/QGks6UtETSkvXr15cIyWzw8NDjtqN6rGKKiAl1fP9TgSsi4muSjgaukvTaFNebgTcAfwBuk3RvRNxWFdulwKUAU6ZM8ThRZjmVhmj3YrK+KtOLaSjwt8BbU9HPgP+X7qyupRPYPzc/NpXlfQyYBhARi1JD9Eiyq42fR8TTKYYFwBFkT7Mzs5JmTB7jhGB9VqaK6RLg9cC/p9frU1lPFgMHSZogaRfgFGB+1TqPAW8HkHQIMAxYDywEDpO0W2qwPgZYiZmZNUyZXkxviIg/z83/VNIDPW0UEZslfZLsy34IcHlErJB0IbAkIuYDnwUuk/QZsgbrj0REAM9K+jpZkglgQUT8qHcfzczMdkSZBLFF0p9GxCMAkl4DlOpRFBELyBqf82UX5KZXkj2lrmjb75F1dTUzsyYokyBmArdLehQQ2R3Vp9c1KjMza7oyvZhuk3QQUOkbtyoiXqpvWGZm1my1nkl9XET8VNJJVYsOlERE3Fjn2MzMrIlqXUEcA/wUOL5gWQBOEGZmA1i3CSIivpAmL4yIX+eXSarnzXNmZtYCytwH8V8FZXP7OxAzM2sttdog/gw4FNizqh1iONkNbWZmNoDVaoOYCLwHGMH27RDPA2fUMygzM2u+Wm0QNwM3Szo6IhY1MCYzM2sBZW6UWyrp78mqm7ZVLUXER+sWlZmZNV2ZRuqrgD8BpgJ3kI3K+nw9gzIzs+YrkyAOjIjPAy9GxHeB/wMcVd+wzMys2cokiMpzHzakh/nsCby6fiGZmVkrKNMGcamkvYDPkz3P4VXABbU3MTOzdldmsL7vpMk7gNfUNxwzM2sVtW6UO7vWhhHx9f4Px6rNW9rpZwpb2/Hf7cBQ6wpij/RzIvAGXn5c6PHAPfUMyjLzlnYy68bldG3Kns/UuaGLWTcuB/A/m7Us/90OHLVulPsnAEk/B46IiOfT/GzAj/9sgDkLV237J6vo2rSFc+Yu45p7HmtSVFbGyieeY9Lo4c0Ooym6+7uds3CVE0SbKdOLaV9gY25+YyqzOlu3oauwfOOWrQ2OxHpr0ujhnHj44Pwy7O7vtrtya11lejFdCdwj6aY0PwO4om4R2Tb7jeigs+CfasyIDq77+NFNiMisZ9393e43oqMJ0diO6PEKIiIuInsG9bPpdXpEfKnegRnMnDqRjqFDtivrGDqEmVMndrOFWfP573bgqNWLaXhEPCdpb2BNelWW7R0Rz9Q/vMGtUl97ztxlbNyylTHuDWJtoPL36V5M7a9WFdPVZMN930v2iNEKpXnfE9EAMyaP2dYgPdirldx1sn3MmDzGv5sBoFYvpvekn368qDWdu06aNV6tKqYjam0YEff1fzhWT+18Bu6uk2aNV6uK6Ws1lgVwXD/HYnXU7mfg7jpp1ni1qpje1shArL7a/QzcXSfNGq/MjXJIeq2kD0j6cOVV78Csf7X7Gbi7Tpo1Xo83ykn6AnAsMAlYALwb+F+yG+isTbT7Gbi7Tpo1Xpk7qU8G/hxYGhGnS9oX+F59w7L+NnPqxO3aIKD9zsDdddKsscokiK6I2Cpps6ThwFPA/nWOy/qZz8DNrLfKJIglkkYAl5HdNPcCsKiuUVld+AzczHqj1n0QFwNXR8TfpaJvS7oFGB4RyxoSnZk1VTvfO2M7rlYvpoeBf5W0RtJXJU2OiDW9SQ6SpklaJWm1pHMLlo+TdLukpZKWSZpesPwFSZ8r/5HMrD9U7p3p3NBF8PK9M/OWdjY7NGuQbhNERHwjIo4GjgF+B1wu6ZeSviDp4J52LGkIcDFZr6dJwKmSJlWtdj5wfURMBk4B/r1q+deBH5f+NGbWb2rdO2ODQ5nhvn8TEV9JX+Knkj0P4qES+z4SWB0Rj0bERuBa4MTq3QOVx27tCayrLJA0A/g1sKLEe5lZP2v3e2dsx/WYICTtLOl4Sd8nO5tfBZxUYt9jgMdz82tTWd5s4EOS1pLdY3FWes9XAf8A/FMPsZ0paYmkJevX
ry8RkpmV1d09Mu1y74ztuG4ThKR3Srqc7Iv9DLLnUP9pRJwSETf30/ufClwREWOB6cBVknYiSxz/FhEv1No4Ii6NiCkRMWXUqFH9FJKZge9et9rdXGeRPRPisxHxbB/23cn290uMTWV5HwOmAUTEIknDgJHAUcDJkr4KjAC2SvpjRHyrD3GYWR/43hmrNVjfjo7Wuhg4SNIEssRwCvDBqnUeA94OXCHpEGAYsD4i3lJZQdJs4AUnh/bgbpEDi++dGdzK3CjXJxGxWdIngYXAEODyiFgh6UJgSUTMBz4LXCbpM2QN1h+JiOh+r9bK2n1IcTPbXt0SBEBELCBrfM6XXZCbXgn8RQ/7mF2X4BKf8fafdh9S3My2V9cE0ep8xtu/3C3SbGAZ1AmiuzPec+Yu45p7HmtSVK+08onnmDR6eM8rNlm7DyluZtsr9cCggaq7M9uNW7Y2OJLaJo0ezomHt/4VjbtFmg0sg/oKorsz3jEjOrju40c3IaL25m6RZgPLoE4QA+EhOq3G3SLNBo5BnSB8xmtm1r1BnSDAZ7zWutwF25pt0CcIs1bkLtjWCgZ1LyazVuVnMVgrcIIwa0G+6dBagROEWQvysxisFThBmLUg33RorcCN1GYtyF2wrRU4QZi1KHfBtmZzFZOZmRVygjAzs0JOEGZmVsgJwszMCjlBmJlZIScIMzMr5ARhZmaFnCDMzKyQE4SZmRVygjAzs0JOEGZmVsgJwszMCjlBmJlZIScIMzMr5ARhZmaFnCDMzKyQHxjUR/OWdvppX2Y2oDlB9MG8pZ3MunE5XZu2ANC5oYtZNy4HcJIwswGjrlVMkqZJWiVptaRzC5aPk3S7pKWSlkmansrfKeleScvTz+PqGWdvzVm4altyqOjatIU5C1c1KSIzs/5XtysISUOAi4F3AmuBxZLmR8TK3GrnA9dHxCWSJgELgPHA08DxEbFO0muBhUDLnJqv29DVq3Izs3ZUzyuII4HVEfFoRGwErgVOrFongOFpek9gHUBELI2Idal8BdAhadc6xtor+43o6FW5mVk7qmeCGAM8nptfyyuvAmYDH5K0luzq4ayC/bwPuC8iXqpeIOlMSUskLVm/fn3/RF3CzKkT6Rg6ZLuyjqFDmDl1YsNiMDOrt2Z3cz0VuCIixgLTgaskbYtJ0qHAV4CPF20cEZdGxJSImDJq1KiGBAxZQ/SXTjqMMSM6EDBmRAdfOukwN1Cb2YBSz15MncD+ufmxqSzvY8A0gIhYJGkYMBJ4StJY4CbgwxHxSB3j7JMZk8c4IZjZgFbPK4jFwEGSJkjaBTgFmF+1zmPA2wEkHQIMA9ZLGgH8CDg3Iu6sY4xmZtaNuiWIiNgMfJKsB9JDZL2VVki6UNIJabXPAmdIegC4BvhIRETa7kDgAkn3p9er6xWrmZm9krLv4/Y3ZcqUWLJkSbPDMDNrK5LujYgpRcua3UhtZmYtykNt2KDmMbXMuucEYYOWx9Qyq81VTDZoeUwts9qcIGzQ8phaZrU5Qdig5TG1zGpzgrBBy2NqmdXmRmobtCoN0e7FZFbMCcIGNY+pZdY9VzGZmVkhJwgzMyvkBGFmZoWcIMzMrJAThJmZFRoww31LWg/8ptlx9GAk8HSzgyjJsdaHY62fdoq3lWI9ICIKn9k8YBJEO5C0pLtx11uNY60Px1o/7RRvu8TqKiYzMyvkBGFmZoWcIBrr0mYH0AuOtT4ca/20U7xtEavbIMzMrJCvIMzMrJAThJmZFXKC6CeS9pd0u6SVklZI+lQqny2pU9L96TU9t80sSaslrZI0tQkxr5G0PMW1JJXtLelWSb9KP/dK5ZL0zRTvMklHNDDOibnjd7+k5yR9ulWOraTLJT0l6cFcWa+Po6TT0vq/knRaA2OdI+mXKZ6bJI1I5eMldeWO77dz27w+/e2sTp9HDYq1179zSdNS2WpJ5/Z3nDVivS4X5xpJ96fyph7XXokIv/rhBYwGjkjTewAPA5OA2cDnCtafBDwA7ApMAB4BhjQ45jXAyKqyrwLnpulzga+k6enAjwEBbwR+0aTjPAT4LXBAqxxb4K3AEcCDfT2OwN7Ao+nnXml6rwbF+i5g5zT9lVys4/PrVe3nnhS/0ud5d4Ni7dXvPL0eAV4D7JLWmdSIWKuWfw24oBWOa29evoLoJxHxRETcl6afBx4Caj1o4ETg2oh4KSJ+DawGjqx/pD06Efhumv4uMCNXfmVk7gZGSBrdhPjeDjwSEbXumm/osY2InwPPFMTQm+M4Fbg1Ip6JiGeBW4FpjYg1In4SEZvT7N3A2Fr7SPEOj4i7I/tWu5KXP19dY62hu9/5kcDqiHg0IjYC16Z1GxZrugr4AHBNrX006rj2hhNEHUgaD0wGfpGKPpku3y+vVDWQJY/Hc5utpXZCqYcAfiLpXklnprJ9I+KJNP1bYN803QrxApzC9v9orXpse3scWyFmgI+SnblWTJC0VNIdkt6SysaQxVfR6Fh78ztvheP6FuDJiPhVrqwVj+srOEH0M0mvAv4L+HREPAdcAvwpcDjwBNmlZqt4c0QcAbwb+HtJb80vTGcxLdMPWtIuwAnADamolY/tNq12HLsj6TxgM/D9VPQEMC4iJgNnA1dLGt6s+JK2+J1XOZXtT2pa8bgWcoLoR5KGkiWH70fEjQAR8WREbImIrcBlvFzV0Qnsn9t8bCprmIjoTD+fAm5KsT1ZqTpKP59Kqzc9XrJEdl9EPAmtfWzp/XFsasySPgK8B/irlNBI1TW/S9P3ktXlH5ziyldDNSzWPvzOm31cdwZOAq6rlLXice2OE0Q/SfWM/wE8FBFfz5Xn6+nfC1R6OcwHTpG0q6QJwEFkDVSNind3SXtUpskaKh9McVV60JwG3JyL98OpF84bgd/nqlAaZbszsVY9trkYenMcFwLvkrRXqjZ5VyqrO0nTgHOAEyLiD7nyUZKGpOnXkB3HR1O8z0l6Y/q7/3Du89U71t7+zhcDB0makK5AT0nrNso7gF9GxLaqo1Y8rt1qZgv5QHoBbyarRlgG3J9e04GrgOWpfD4wOrfNeWRnD6tocG8Fsl4dD6TXCuC8VL4PcBvwK+C/gb1TuYCLU7zLgSkNjnd34HfAnrmylji2ZEnrCWATWb3xx/pyHMnq/1en1+kNjHU1WT195e/222nd96W/jfuB+4Djc/uZQvbl/AjwLdKoDA2Itde/8/R/+HBadl6jjmsqvwL4RNW6TT2uvXl5qA0zMyvkKiYzMyvkBGFmZoWcIMzMrJAThJmZFXKCMDOzQk4Q1hYk/ZukT+fmF0r6Tm7+a5LOrrH9FZJOTtM/k/SKB8ZLGirpy8pGU71P0iJJ707L1kga2Ye4t71vN8svTiN6rqwa4fNkSQuURlbtT5JGS/phjeW7SPp5usnLBjEnCGsXdwJvApC0EzASODS3/E3AXTv4Hl8kG5X3tZENQTKDbGTeuomIv4+Iw8n66j8SEYen19yImB4RG+rwtmeT3YXcXUwbye7h+Ms6vLe1EScIaxd3AUen6UPJbiZ6Pt15vCtwCHCfpAskLZb0oKRLy46
nL2k34AzgrIh4CbYN63B9wbpnp/0/WHVV8+E0iNwDkq4q2O6L6YpiSMmY1kgaqez5Ab9M2z4s6fuS3iHpznS1c2Raf3dlA9jdkwaC627U0vcBt6RtDk3r359iPyitMw/4qzJx2sDlS0hrCxGxTtJmSePIrhYWkY10eTTwe2B5RGyU9K2IuBAgfUm/B/hBibc4EHgssgEWuyXp9cDpwFFkd0X/QtIdwEbgfOBNEfG0pL2rtptDdjVyevTt7tQDgfeT3W29GPgg2d37JwD/SHa1cx7w04j4aKqaukfSf0fEi7k4JgDPVpIg8AngGxHx/TQURSV5PQi8oQ9x2gDiKwhrJ3eRJYdKgliUm78zrfM2Sb+QtBw4ju2rofrDm4GbIuLFiHgBuJFsOOfjgBsi4mmAiMg/G+DzZEOEfKKPyQHg1xGxPLJB6lYAt6V9LSd7AA1k4zedq+zJZT8DhgHjqs322KYAAAHASURBVPYzGlifm18E/KOkfwAOiIiuFP8WYKPSeF02ODlBWDuptEMcRnaGezfZFcSbgLskDQP+HTg5Ig4jq2cfVnLfq4Fxqs+wy4uB11dfVfTSS7nprbn5rbxcEyDgfbl2jHER8VDVfrrIHZOIuJrsKqQLWCDpuNy6uwJ/3IGYrc05QVg7uYusyuiZyIZ8fgYYQZYk7uLlL76nlT2Xo9veQ9UiG8X0P4BvpKqWyqib769a9X+AGZJ2UzYK7ntT2U+B90vaJ22bTwa3AF8GflTnM/KFwFmVdhdJkwvWeZiXrzgqo4k+GhHfJBs59HWpfB/g6YjYVMd4rcU5QVg7WU7We+nuqrLfR8TTqcfPZWRXFwvJztx743yy6peVyh4+/0NguzaJyB4rewXZUNK/AL4TEUsjYgVwEXCHpAeAr1dtd0OKbb6kjl7GVdYXgaHAMkkr0vx2UnvEI5IOTEUfAB5M1VKvJXvMJcDbgB/VKU5rEx7N1WyQkfRe4PURcX6NdW4Ezo2IhxsXmbUa92IyG2Qi4qZKVViRVMU2z8nBfAVhZmaF3AZhZmaFnCDMzKyQE4SZmRVygjAzs0JOEGZmVuj/A37tMhoK6ZryAAAAAElFTkSuQmCC\n"
|
||
},
|
||
"metadata": {
|
||
"needs_background": "light"
|
||
}
|
||
}
|
||
],
|
||
"source": [
|
||
"import matplotlib.pyplot as plt\n",
|
||
"import numpy as np\n",
|
||
"\n",
|
||
"plt.title('Learning Curve')\n",
|
||
"plt.xlabel('Wall Clock Time (s)')\n",
|
||
"plt.ylabel('Validation Accuracy')\n",
|
||
"print(len(valid_loss_history))\n",
|
||
"plt.scatter(time_history, 1 - np.array(valid_loss_history))\n",
|
||
"plt.step(time_history, 1 - np.array(best_valid_loss_history), where='post')\n",
|
||
"plt.show()"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"metadata": {
|
||
"id": "xudzM73mTjhI"
|
||
},
|
||
"source": [
|
||
"## 3. Model selection"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"metadata": {
|
||
"id": "A3gC3u_E4cO1"
|
||
},
|
||
"source": [
|
||
"Given a dataset, which language model should you use for the fine tuning? It appears this is a simple question: just choose the best model according to the benchmarks such as [GLUE](https://gluebenchmark.com/leaderboard). However, we will see that under the resource constraints, the model selection is non trivial. \n",
|
||
"\n",
|
||
"In this example, we will tune the [spooky-author-identification](https://www.kaggle.com/competitions/spooky-author-identification/data?select=train.zip) dataset from kaggle. You can download the dataset from the website and upload it to Colab. We run FLAML for 30 mins using bert."
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": null,
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/"
|
||
},
|
||
"id": "HjvdojhfTjhI",
|
||
"outputId": "c8848ff9-1ce3-4632-84aa-0a8199a7fce9"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"name": "stderr",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"[flaml.automl: 08-19 14:50:41] {2540} INFO - task = seq-classification\n",
|
||
"INFO:flaml.automl:task = seq-classification\n",
|
||
"[flaml.automl: 08-19 14:50:41] {2542} INFO - Data split method: stratified\n",
|
||
"INFO:flaml.automl:Data split method: stratified\n",
|
||
"[flaml.automl: 08-19 14:50:41] {2545} INFO - Evaluation method: holdout\n",
|
||
"INFO:flaml.automl:Evaluation method: holdout\n",
|
||
"[flaml.automl: 08-19 14:50:41] {2664} INFO - Minimizing error metric: 1-accuracy\n",
|
||
"INFO:flaml.automl:Minimizing error metric: 1-accuracy\n",
|
||
"[flaml.automl: 08-19 14:50:41] {2806} INFO - List of ML learners in AutoML Run: ['transformer']\n",
|
||
"INFO:flaml.automl:List of ML learners in AutoML Run: ['transformer']\n",
|
||
"[flaml.automl: 08-19 14:50:41] {3108} INFO - iteration 0, current learner transformer\n",
|
||
"INFO:flaml.automl:iteration 0, current learner transformer\n",
|
||
"/usr/local/lib/python3.7/dist-packages/transformers/optimization.py:309: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning\n",
|
||
" FutureWarning,\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"{'eval_loss': 0.8782516717910767, 'eval_automl_metric': 0.3650663942798774, 'eval_runtime': 63.6528, 'eval_samples_per_second': 76.902, 'eval_steps_per_second': 76.902, 'epoch': 0.3}\n",
|
||
"{'train_runtime': 136.3771, 'train_samples_per_second': 32.302, 'train_steps_per_second': 1.012, 'train_loss': 0.9700310748556386, 'epoch': 0.3}\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stderr",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"The following columns in the test set don't have a corresponding argument in `BertForSequenceClassification.forward` and have been ignored: __index_level_0__. If __index_level_0__ are not expected by `BertForSequenceClassification.forward`, you can safely ignore this message.\n",
|
||
"***** Running Prediction *****\n",
|
||
" Num examples = 4895\n",
|
||
" Batch size = 1\n",
|
||
"Didn't find file data/output/train_2022-08-19_14-50-41/train_4ba0c0a8_18_s=9223372036854775807,e=1e-05,s=-1,s=0.3,e=32,d=20_2022-08-19_14-50-41/checkpoint-138/added_tokens.json. We won't load it.\n",
|
||
"loading file data/output/train_2022-08-19_14-50-41/train_4ba0c0a8_18_s=9223372036854775807,e=1e-05,s=-1,s=0.3,e=32,d=20_2022-08-19_14-50-41/checkpoint-138/vocab.txt\n",
|
||
"loading file data/output/train_2022-08-19_14-50-41/train_4ba0c0a8_18_s=9223372036854775807,e=1e-05,s=-1,s=0.3,e=32,d=20_2022-08-19_14-50-41/checkpoint-138/tokenizer.json\n",
|
||
"loading file None\n",
|
||
"loading file data/output/train_2022-08-19_14-50-41/train_4ba0c0a8_18_s=9223372036854775807,e=1e-05,s=-1,s=0.3,e=32,d=20_2022-08-19_14-50-41/checkpoint-138/special_tokens_map.json\n",
|
||
"loading file data/output/train_2022-08-19_14-50-41/train_4ba0c0a8_18_s=9223372036854775807,e=1e-05,s=-1,s=0.3,e=32,d=20_2022-08-19_14-50-41/checkpoint-138/tokenizer_config.json\n",
|
||
"[flaml.automl: 08-19 14:54:42] {3242} INFO - Estimated sufficient time budget=2413758s. Estimated necessary time budget=2414s.\n",
|
||
"INFO:flaml.automl:Estimated sufficient time budget=2413758s. Estimated necessary time budget=2414s.\n",
|
||
"[flaml.automl: 08-19 14:54:42] {3294} INFO - at 241.5s,\testimator transformer's best error=0.3651,\tbest estimator transformer's best error=0.3651\n",
|
||
"INFO:flaml.automl: at 241.5s,\testimator transformer's best error=0.3651,\tbest estimator transformer's best error=0.3651\n",
|
||
"[flaml.automl: 08-19 14:54:42] {3108} INFO - iteration 1, current learner transformer\n",
|
||
"INFO:flaml.automl:iteration 1, current learner transformer\n",
|
||
"/usr/local/lib/python3.7/dist-packages/transformers/optimization.py:309: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning\n",
|
||
" FutureWarning,\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"{'eval_loss': 0.9422562122344971, 'eval_automl_metric': 0.4482124616956078, 'eval_runtime': 64.093, 'eval_samples_per_second': 76.373, 'eval_steps_per_second': 76.373, 'epoch': 0.3}\n",
|
||
"{'train_runtime': 142.1563, 'train_samples_per_second': 30.988, 'train_steps_per_second': 0.485, 'train_loss': 1.0089939504429914, 'epoch': 0.3}\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stderr",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"The following columns in the test set don't have a corresponding argument in `BertForSequenceClassification.forward` and have been ignored: __index_level_0__. If __index_level_0__ are not expected by `BertForSequenceClassification.forward`, you can safely ignore this message.\n",
|
||
"***** Running Prediction *****\n",
|
||
" Num examples = 4895\n",
|
||
" Batch size = 1\n",
|
||
"Didn't find file data/output/train_2022-08-19_14-54-42/train_db826be0_19_s=9223372036854775807,e=9.7119e-06,s=-1,s=0.3,e=64,d=14_2022-08-19_14-54-42/checkpoint-69/added_tokens.json. We won't load it.\n",
|
||
"loading file data/output/train_2022-08-19_14-54-42/train_db826be0_19_s=9223372036854775807,e=9.7119e-06,s=-1,s=0.3,e=64,d=14_2022-08-19_14-54-42/checkpoint-69/vocab.txt\n",
|
||
"loading file data/output/train_2022-08-19_14-54-42/train_db826be0_19_s=9223372036854775807,e=9.7119e-06,s=-1,s=0.3,e=64,d=14_2022-08-19_14-54-42/checkpoint-69/tokenizer.json\n",
|
||
"loading file None\n",
|
||
"loading file data/output/train_2022-08-19_14-54-42/train_db826be0_19_s=9223372036854775807,e=9.7119e-06,s=-1,s=0.3,e=64,d=14_2022-08-19_14-54-42/checkpoint-69/special_tokens_map.json\n",
|
||
"loading file data/output/train_2022-08-19_14-54-42/train_db826be0_19_s=9223372036854775807,e=9.7119e-06,s=-1,s=0.3,e=64,d=14_2022-08-19_14-54-42/checkpoint-69/tokenizer_config.json\n",
|
||
"[flaml.automl: 08-19 14:58:51] {3294} INFO - at 490.1s,\testimator transformer's best error=0.3651,\tbest estimator transformer's best error=0.3651\n",
|
||
"INFO:flaml.automl: at 490.1s,\testimator transformer's best error=0.3651,\tbest estimator transformer's best error=0.3651\n",
|
||
"[flaml.automl: 08-19 14:58:51] {3108} INFO - iteration 2, current learner transformer\n",
|
||
"INFO:flaml.automl:iteration 2, current learner transformer\n",
|
||
"/usr/local/lib/python3.7/dist-packages/transformers/optimization.py:309: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning\n",
|
||
" FutureWarning,\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"{'eval_loss': 0.764643669128418, 'eval_automl_metric': 0.30684371807967314, 'eval_runtime': 64.3046, 'eval_samples_per_second': 76.122, 'eval_steps_per_second': 76.122, 'epoch': 0.3}\n",
|
||
"{'train_runtime': 139.6474, 'train_samples_per_second': 31.545, 'train_steps_per_second': 1.976, 'train_loss': 0.9045784438865773, 'epoch': 0.3}\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stderr",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"The following columns in the test set don't have a corresponding argument in `BertForSequenceClassification.forward` and have been ignored: __index_level_0__. If __index_level_0__ are not expected by `BertForSequenceClassification.forward`, you can safely ignore this message.\n",
|
||
"***** Running Prediction *****\n",
|
||
" Num examples = 4895\n",
|
||
" Batch size = 1\n",
|
||
"Didn't find file data/output/train_2022-08-19_14-58-51/train_6fc6d930_20_s=9223372036854775807,e=1.0297e-05,s=-1,s=0.3,e=16,d=26_2022-08-19_14-58-51/checkpoint-276/added_tokens.json. We won't load it.\n",
|
||
"loading file data/output/train_2022-08-19_14-58-51/train_6fc6d930_20_s=9223372036854775807,e=1.0297e-05,s=-1,s=0.3,e=16,d=26_2022-08-19_14-58-51/checkpoint-276/vocab.txt\n",
|
||
"loading file data/output/train_2022-08-19_14-58-51/train_6fc6d930_20_s=9223372036854775807,e=1.0297e-05,s=-1,s=0.3,e=16,d=26_2022-08-19_14-58-51/checkpoint-276/tokenizer.json\n",
|
||
"loading file None\n",
|
||
"loading file data/output/train_2022-08-19_14-58-51/train_6fc6d930_20_s=9223372036854775807,e=1.0297e-05,s=-1,s=0.3,e=16,d=26_2022-08-19_14-58-51/checkpoint-276/special_tokens_map.json\n",
|
||
"loading file data/output/train_2022-08-19_14-58-51/train_6fc6d930_20_s=9223372036854775807,e=1.0297e-05,s=-1,s=0.3,e=16,d=26_2022-08-19_14-58-51/checkpoint-276/tokenizer_config.json\n",
|
||
"[flaml.automl: 08-19 15:02:56] {729} WARNING - checkpoint data/output/train_2022-08-19_14-50-41/train_4ba0c0a8_18_s=9223372036854775807,e=1e-05,s=-1,s=0.3,e=32,d=20_2022-08-19_14-50-41/checkpoint-138 not found\n",
|
||
"WARNING:flaml.automl:checkpoint data/output/train_2022-08-19_14-50-41/train_4ba0c0a8_18_s=9223372036854775807,e=1e-05,s=-1,s=0.3,e=32,d=20_2022-08-19_14-50-41/checkpoint-138 not found\n",
|
||
"[flaml.automl: 08-19 15:02:56] {3294} INFO - at 735.2s,\testimator transformer's best error=0.3068,\tbest estimator transformer's best error=0.3068\n",
|
||
"INFO:flaml.automl: at 735.2s,\testimator transformer's best error=0.3068,\tbest estimator transformer's best error=0.3068\n",
|
||
"[flaml.automl: 08-19 15:02:56] {3108} INFO - iteration 3, current learner transformer\n",
|
||
"INFO:flaml.automl:iteration 3, current learner transformer\n",
|
||
"/usr/local/lib/python3.7/dist-packages/transformers/optimization.py:309: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning\n",
|
||
" FutureWarning,\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"{'eval_loss': 0.6895061731338501, 'eval_automl_metric': 0.26414708886619, 'eval_runtime': 64.1612, 'eval_samples_per_second': 76.292, 'eval_steps_per_second': 76.292, 'epoch': 0.3}\n",
|
||
"{'train_runtime': 137.9967, 'train_samples_per_second': 31.923, 'train_steps_per_second': 2.0, 'train_loss': 0.8616765340169271, 'epoch': 0.3}\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stderr",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"The following columns in the test set don't have a corresponding argument in `BertForSequenceClassification.forward` and have been ignored: __index_level_0__. If __index_level_0__ are not expected by `BertForSequenceClassification.forward`, you can safely ignore this message.\n",
|
||
"***** Running Prediction *****\n",
|
||
" Num examples = 4895\n",
|
||
" Batch size = 1\n",
|
||
"Didn't find file data/output/train_2022-08-19_15-02-56/train_01e528ee_21_s=9223372036854775807,e=1.48e-05,s=-1,s=0.3,e=16,d=25_2022-08-19_15-02-56/checkpoint-276/added_tokens.json. We won't load it.\n",
|
||
"loading file data/output/train_2022-08-19_15-02-56/train_01e528ee_21_s=9223372036854775807,e=1.48e-05,s=-1,s=0.3,e=16,d=25_2022-08-19_15-02-56/checkpoint-276/vocab.txt\n",
|
||
"loading file data/output/train_2022-08-19_15-02-56/train_01e528ee_21_s=9223372036854775807,e=1.48e-05,s=-1,s=0.3,e=16,d=25_2022-08-19_15-02-56/checkpoint-276/tokenizer.json\n",
|
||
"loading file None\n",
|
||
"loading file data/output/train_2022-08-19_15-02-56/train_01e528ee_21_s=9223372036854775807,e=1.48e-05,s=-1,s=0.3,e=16,d=25_2022-08-19_15-02-56/checkpoint-276/special_tokens_map.json\n",
|
||
"loading file data/output/train_2022-08-19_15-02-56/train_01e528ee_21_s=9223372036854775807,e=1.48e-05,s=-1,s=0.3,e=16,d=25_2022-08-19_15-02-56/checkpoint-276/tokenizer_config.json\n",
|
||
"[flaml.automl: 08-19 15:07:00] {729} WARNING - checkpoint data/output/train_2022-08-19_14-58-51/train_6fc6d930_20_s=9223372036854775807,e=1.0297e-05,s=-1,s=0.3,e=16,d=26_2022-08-19_14-58-51/checkpoint-276 not found\n",
|
||
"WARNING:flaml.automl:checkpoint data/output/train_2022-08-19_14-58-51/train_6fc6d930_20_s=9223372036854775807,e=1.0297e-05,s=-1,s=0.3,e=16,d=26_2022-08-19_14-58-51/checkpoint-276 not found\n",
|
||
"[flaml.automl: 08-19 15:07:00] {3294} INFO - at 979.1s,\testimator transformer's best error=0.2641,\tbest estimator transformer's best error=0.2641\n",
|
||
"INFO:flaml.automl: at 979.1s,\testimator transformer's best error=0.2641,\tbest estimator transformer's best error=0.2641\n",
|
||
"[flaml.automl: 08-19 15:07:00] {3108} INFO - iteration 4, current learner transformer\n",
|
||
"INFO:flaml.automl:iteration 4, current learner transformer\n",
|
||
"/usr/local/lib/python3.7/dist-packages/transformers/optimization.py:309: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning\n",
|
||
" FutureWarning,\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"{'loss': 0.7586, 'learning_rate': 4.688468079515019e-06, 'epoch': 0.54}\n",
|
||
"{'eval_loss': 0.4876616895198822, 'eval_automl_metric': 0.18447395301327885, 'eval_runtime': 64.0236, 'eval_samples_per_second': 76.456, 'eval_steps_per_second': 76.456, 'epoch': 1.0}\n",
|
||
"{'train_runtime': 312.4704, 'train_samples_per_second': 46.993, 'train_steps_per_second': 2.938, 'train_loss': 0.6536963469062755, 'epoch': 1.0}\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stderr",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"The following columns in the test set don't have a corresponding argument in `BertForSequenceClassification.forward` and have been ignored: __index_level_0__. If __index_level_0__ are not expected by `BertForSequenceClassification.forward`, you can safely ignore this message.\n",
|
||
"***** Running Prediction *****\n",
|
||
" Num examples = 4895\n",
|
||
" Batch size = 1\n",
|
||
"Didn't find file data/output/train_2022-08-19_15-07-00/train_933f400e_22_s=9223372036854775807,e=1.0297e-05,s=-1,s=1,e=16,d=26_2022-08-19_15-07-00/checkpoint-918/added_tokens.json. We won't load it.\n",
|
||
"loading file data/output/train_2022-08-19_15-07-00/train_933f400e_22_s=9223372036854775807,e=1.0297e-05,s=-1,s=1,e=16,d=26_2022-08-19_15-07-00/checkpoint-918/vocab.txt\n",
|
||
"loading file data/output/train_2022-08-19_15-07-00/train_933f400e_22_s=9223372036854775807,e=1.0297e-05,s=-1,s=1,e=16,d=26_2022-08-19_15-07-00/checkpoint-918/tokenizer.json\n",
|
||
"loading file None\n",
|
||
"loading file data/output/train_2022-08-19_15-07-00/train_933f400e_22_s=9223372036854775807,e=1.0297e-05,s=-1,s=1,e=16,d=26_2022-08-19_15-07-00/checkpoint-918/special_tokens_map.json\n",
|
||
"loading file data/output/train_2022-08-19_15-07-00/train_933f400e_22_s=9223372036854775807,e=1.0297e-05,s=-1,s=1,e=16,d=26_2022-08-19_15-07-00/checkpoint-918/tokenizer_config.json\n",
|
||
"[flaml.automl: 08-19 15:13:57] {729} WARNING - checkpoint data/output/train_2022-08-19_15-02-56/train_01e528ee_21_s=9223372036854775807,e=1.48e-05,s=-1,s=0.3,e=16,d=25_2022-08-19_15-02-56/checkpoint-276 not found\n",
|
||
"WARNING:flaml.automl:checkpoint data/output/train_2022-08-19_15-02-56/train_01e528ee_21_s=9223372036854775807,e=1.48e-05,s=-1,s=0.3,e=16,d=25_2022-08-19_15-02-56/checkpoint-276 not found\n",
|
||
"[flaml.automl: 08-19 15:13:57] {3294} INFO - at 1396.3s,\testimator transformer's best error=0.1845,\tbest estimator transformer's best error=0.1845\n",
|
||
"INFO:flaml.automl: at 1396.3s,\testimator transformer's best error=0.1845,\tbest estimator transformer's best error=0.1845\n",
|
||
"[flaml.automl: 08-19 15:13:57] {3409} INFO - selected model: None\n",
|
||
"INFO:flaml.automl:selected model: None\n",
|
||
"[flaml.automl: 08-19 15:13:57] {2837} INFO - fit succeeded\n",
|
||
"INFO:flaml.automl:fit succeeded\n",
|
||
"[flaml.automl: 08-19 15:13:57] {2839} INFO - Time taken to find the best model: 1396.3099913597107\n",
|
||
"INFO:flaml.automl:Time taken to find the best model: 1396.3099913597107\n",
|
||
"[flaml.automl: 08-19 15:13:57] {2853} WARNING - Time taken to find the best model is 78% of the provided time budget and not all estimators' hyperparameter search converged. Consider increasing the time budget.\n",
|
||
"WARNING:flaml.automl:Time taken to find the best model is 78% of the provided time budget and not all estimators' hyperparameter search converged. Consider increasing the time budget.\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"import flaml\n",
|
||
"from flaml import AutoML\n",
|
||
"import pandas as pd\n",
|
||
"from sklearn.model_selection import train_test_split\n",
|
||
"\n",
|
||
"df = pd.read_csv('spooky-author-identification.csv')\n",
|
||
"X, y = df.drop('author', axis=1), df['author']\n",
|
||
"\n",
|
||
"X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=123)\n",
|
||
"automl_model = AutoML()\n",
|
||
"\n",
|
||
"automl_settings = {\n",
|
||
" \"time_budget\": 1800, \n",
|
||
" \"task\": \"seq-classification\", \n",
|
||
" \"fit_kwargs_by_estimator\": {\n",
|
||
" \"transformer\": {\n",
|
||
" \"output_dir\": \"data/output/\", \n",
|
||
" \"model_path\": \"bert-base-uncased\", \n",
|
||
" }\n",
|
||
" },\n",
|
||
" \"metric\": \"accuracy\",\n",
|
||
" \"gpu_per_trial\": 1, \n",
|
||
" \"log_file_name\": \"spooky_bert.log\", \n",
|
||
" \"log_type\": \"all\", \n",
|
||
" \"use_ray\": False, # set whether to use Ray\n",
|
||
" \"n_concurrent_trials\": 1,\n",
|
||
" \"keep_search_state\": True, # keeping the search state\n",
|
||
"}\n",
|
||
"\n",
|
||
"from flaml import tune\n",
|
||
"custom_hp = {\n",
|
||
" \"transformer\": {\n",
|
||
" \"num_train_epochs\": {\n",
|
||
" \"domain\": tune.choice([0.3, 1, 2, 3, 4, 5]),\n",
|
||
" \"init_value\": 0.3, \n",
|
||
" \"low_cost_init_value\": 0.3,\n",
|
||
" },\n",
|
||
" }\n",
|
||
"}\n",
|
||
"\n",
|
||
"automl_model.fit(X_train=X_train, y_train=y_train,X_val=X_val, y_val=y_val, custom_hp=custom_hp, **automl_settings)"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"metadata": {
|
||
"id": "9jZiKSU75jjl"
|
||
},
|
||
"source": [
|
||
"The job ran for 23m and searched for 4 trials. This time is shorter than our budget 30m because FLAML early stops the last trial which will run for too long. If you want to run for longer time, set a larger time budget. "
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": null,
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/"
|
||
},
|
||
"id": "xpA-rzYzTjhI",
|
||
"outputId": "bacf6804-5ae5-4cea-ee01-be4f35f5c90f"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"the best loss for spooky author identification: 0.18447395301327885\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"print(\"the best loss for spooky author identification: {}\".format(automl_model.best_loss))"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"metadata": {
|
||
"id": "TzDjaBTA6ZaD"
|
||
},
|
||
"source": [
|
||
"Next, we set the model to roberta and run again. RoBERTa outperforms BERT by 15% on the [SuperGLUE](https://super.gluebenchmark.com/) benchmark, as well as [GLUE](https://gluebenchmark.com/), [SQuAD](https://rajpurkar.github.io/SQuAD-explorer/), [RACE](https://www.cs.cmu.edu/~glai1/data/race/), etc. Does this mean we should always use RoBERTa and never use BERT? To answer this question, we run the same experiment again with RoBERTa:"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": null,
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/"
|
||
},
|
||
"id": "6MTZCJz1TjhJ",
|
||
"outputId": "8adde438-ec14-44d2-f174-c549deb44729"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"name": "stderr",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"[flaml.automl: 08-19 15:21:09] {2540} INFO - task = seq-classification\n",
|
||
"INFO:flaml.automl:task = seq-classification\n",
|
||
"[flaml.automl: 08-19 15:21:09] {2542} INFO - Data split method: stratified\n",
|
||
"INFO:flaml.automl:Data split method: stratified\n",
|
||
"[flaml.automl: 08-19 15:21:09] {2545} INFO - Evaluation method: holdout\n",
|
||
"INFO:flaml.automl:Evaluation method: holdout\n",
|
||
"[flaml.automl: 08-19 15:21:09] {2664} INFO - Minimizing error metric: 1-accuracy\n",
|
||
"INFO:flaml.automl:Minimizing error metric: 1-accuracy\n",
|
||
"[flaml.automl: 08-19 15:21:09] {2806} INFO - List of ML learners in AutoML Run: ['transformer']\n",
|
||
"INFO:flaml.automl:List of ML learners in AutoML Run: ['transformer']\n",
|
||
"[flaml.automl: 08-19 15:21:09] {3108} INFO - iteration 0, current learner transformer\n",
|
||
"INFO:flaml.automl:iteration 0, current learner transformer\n",
|
||
"/usr/local/lib/python3.7/dist-packages/transformers/optimization.py:309: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning\n",
|
||
" FutureWarning,\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"{'eval_loss': 0.7796056866645813, 'eval_automl_metric': 0.3170582226762002, 'eval_runtime': 65.2086, 'eval_samples_per_second': 75.067, 'eval_steps_per_second': 75.067, 'epoch': 0.3}\n",
|
||
"{'train_runtime': 139.2884, 'train_samples_per_second': 31.626, 'train_steps_per_second': 0.991, 'train_loss': 0.9700887928838315, 'epoch': 0.3}\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stderr",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"The following columns in the test set don't have a corresponding argument in `RobertaForSequenceClassification.forward` and have been ignored: __index_level_0__. If __index_level_0__ are not expected by `RobertaForSequenceClassification.forward`, you can safely ignore this message.\n",
|
||
"***** Running Prediction *****\n",
|
||
" Num examples = 4895\n",
|
||
" Batch size = 1\n",
|
||
"Didn't find file data/output/train_2022-08-19_15-21-09/train_8d9e8af4_24_s=9223372036854775807,e=1e-05,s=-1,s=0.3,e=32,d=20_2022-08-19_15-21-09/checkpoint-138/added_tokens.json. We won't load it.\n",
|
||
"loading file data/output/train_2022-08-19_15-21-09/train_8d9e8af4_24_s=9223372036854775807,e=1e-05,s=-1,s=0.3,e=32,d=20_2022-08-19_15-21-09/checkpoint-138/vocab.json\n",
|
||
"loading file data/output/train_2022-08-19_15-21-09/train_8d9e8af4_24_s=9223372036854775807,e=1e-05,s=-1,s=0.3,e=32,d=20_2022-08-19_15-21-09/checkpoint-138/merges.txt\n",
|
||
"loading file data/output/train_2022-08-19_15-21-09/train_8d9e8af4_24_s=9223372036854775807,e=1e-05,s=-1,s=0.3,e=32,d=20_2022-08-19_15-21-09/checkpoint-138/tokenizer.json\n",
|
||
"loading file None\n",
|
||
"loading file data/output/train_2022-08-19_15-21-09/train_8d9e8af4_24_s=9223372036854775807,e=1e-05,s=-1,s=0.3,e=32,d=20_2022-08-19_15-21-09/checkpoint-138/special_tokens_map.json\n",
|
||
"loading file data/output/train_2022-08-19_15-21-09/train_8d9e8af4_24_s=9223372036854775807,e=1e-05,s=-1,s=0.3,e=32,d=20_2022-08-19_15-21-09/checkpoint-138/tokenizer_config.json\n",
|
||
"[flaml.automl: 08-19 15:25:18] {3242} INFO - Estimated sufficient time budget=2487757s. Estimated necessary time budget=2488s.\n",
|
||
"INFO:flaml.automl:Estimated sufficient time budget=2487757s. Estimated necessary time budget=2488s.\n",
|
||
"[flaml.automl: 08-19 15:25:18] {3294} INFO - at 249.0s,\testimator transformer's best error=0.3171,\tbest estimator transformer's best error=0.3171\n",
|
||
"INFO:flaml.automl: at 249.0s,\testimator transformer's best error=0.3171,\tbest estimator transformer's best error=0.3171\n",
|
||
"[flaml.automl: 08-19 15:25:18] {3108} INFO - iteration 1, current learner transformer\n",
|
||
"INFO:flaml.automl:iteration 1, current learner transformer\n",
|
||
"/usr/local/lib/python3.7/dist-packages/transformers/optimization.py:309: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning\n",
|
||
" FutureWarning,\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"{'eval_loss': 1.0274657011032104, 'eval_automl_metric': 0.5805924412665986, 'eval_runtime': 65.4388, 'eval_samples_per_second': 74.803, 'eval_steps_per_second': 74.803, 'epoch': 0.3}\n",
|
||
"{'train_runtime': 141.3162, 'train_samples_per_second': 31.173, 'train_steps_per_second': 0.488, 'train_loss': 1.0675905752873076, 'epoch': 0.3}\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stderr",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"The following columns in the test set don't have a corresponding argument in `RobertaForSequenceClassification.forward` and have been ignored: __index_level_0__. If __index_level_0__ are not expected by `RobertaForSequenceClassification.forward`, you can safely ignore this message.\n",
|
||
"***** Running Prediction *****\n",
|
||
" Num examples = 4895\n",
|
||
" Batch size = 1\n",
|
||
"Didn't find file data/output/train_2022-08-19_15-25-18/train_21eb5d04_25_s=9223372036854775807,e=9.7119e-06,s=-1,s=0.3,e=64,d=14_2022-08-19_15-25-18/checkpoint-69/added_tokens.json. We won't load it.\n",
|
||
"loading file data/output/train_2022-08-19_15-25-18/train_21eb5d04_25_s=9223372036854775807,e=9.7119e-06,s=-1,s=0.3,e=64,d=14_2022-08-19_15-25-18/checkpoint-69/vocab.json\n",
|
||
"loading file data/output/train_2022-08-19_15-25-18/train_21eb5d04_25_s=9223372036854775807,e=9.7119e-06,s=-1,s=0.3,e=64,d=14_2022-08-19_15-25-18/checkpoint-69/merges.txt\n",
|
||
"loading file data/output/train_2022-08-19_15-25-18/train_21eb5d04_25_s=9223372036854775807,e=9.7119e-06,s=-1,s=0.3,e=64,d=14_2022-08-19_15-25-18/checkpoint-69/tokenizer.json\n",
|
||
"loading file None\n",
|
||
"loading file data/output/train_2022-08-19_15-25-18/train_21eb5d04_25_s=9223372036854775807,e=9.7119e-06,s=-1,s=0.3,e=64,d=14_2022-08-19_15-25-18/checkpoint-69/special_tokens_map.json\n",
|
||
"loading file data/output/train_2022-08-19_15-25-18/train_21eb5d04_25_s=9223372036854775807,e=9.7119e-06,s=-1,s=0.3,e=64,d=14_2022-08-19_15-25-18/checkpoint-69/tokenizer_config.json\n",
|
||
"[flaml.automl: 08-19 15:29:30] {3294} INFO - at 500.5s,\testimator transformer's best error=0.3171,\tbest estimator transformer's best error=0.3171\n",
|
||
"INFO:flaml.automl: at 500.5s,\testimator transformer's best error=0.3171,\tbest estimator transformer's best error=0.3171\n",
|
||
"[flaml.automl: 08-19 15:29:30] {3108} INFO - iteration 2, current learner transformer\n",
|
||
"INFO:flaml.automl:iteration 2, current learner transformer\n",
|
||
"/usr/local/lib/python3.7/dist-packages/transformers/optimization.py:309: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning\n",
|
||
" FutureWarning,\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"{'eval_loss': 0.6534864902496338, 'eval_automl_metric': 0.27436159346271705, 'eval_runtime': 65.6703, 'eval_samples_per_second': 74.539, 'eval_steps_per_second': 74.539, 'epoch': 0.3}\n",
|
||
"{'train_runtime': 142.8116, 'train_samples_per_second': 30.846, 'train_steps_per_second': 1.933, 'train_loss': 0.8491502291914346, 'epoch': 0.3}\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stderr",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"The following columns in the test set don't have a corresponding argument in `RobertaForSequenceClassification.forward` and have been ignored: __index_level_0__. If __index_level_0__ are not expected by `RobertaForSequenceClassification.forward`, you can safely ignore this message.\n",
|
||
"***** Running Prediction *****\n",
|
||
" Num examples = 4895\n",
|
||
" Batch size = 1\n",
|
||
"Didn't find file data/output/train_2022-08-19_15-29-30/train_b7eb3874_26_s=9223372036854775807,e=1.0297e-05,s=-1,s=0.3,e=16,d=26_2022-08-19_15-29-30/checkpoint-276/added_tokens.json. We won't load it.\n",
|
||
"loading file data/output/train_2022-08-19_15-29-30/train_b7eb3874_26_s=9223372036854775807,e=1.0297e-05,s=-1,s=0.3,e=16,d=26_2022-08-19_15-29-30/checkpoint-276/vocab.json\n",
|
||
"loading file data/output/train_2022-08-19_15-29-30/train_b7eb3874_26_s=9223372036854775807,e=1.0297e-05,s=-1,s=0.3,e=16,d=26_2022-08-19_15-29-30/checkpoint-276/merges.txt\n",
|
||
"loading file data/output/train_2022-08-19_15-29-30/train_b7eb3874_26_s=9223372036854775807,e=1.0297e-05,s=-1,s=0.3,e=16,d=26_2022-08-19_15-29-30/checkpoint-276/tokenizer.json\n",
|
||
"loading file None\n",
|
||
"loading file data/output/train_2022-08-19_15-29-30/train_b7eb3874_26_s=9223372036854775807,e=1.0297e-05,s=-1,s=0.3,e=16,d=26_2022-08-19_15-29-30/checkpoint-276/special_tokens_map.json\n",
|
||
"loading file data/output/train_2022-08-19_15-29-30/train_b7eb3874_26_s=9223372036854775807,e=1.0297e-05,s=-1,s=0.3,e=16,d=26_2022-08-19_15-29-30/checkpoint-276/tokenizer_config.json\n",
|
||
"[flaml.automl: 08-19 15:33:42] {729} WARNING - checkpoint data/output/train_2022-08-19_15-21-09/train_8d9e8af4_24_s=9223372036854775807,e=1e-05,s=-1,s=0.3,e=32,d=20_2022-08-19_15-21-09/checkpoint-138 not found\n",
|
||
"WARNING:flaml.automl:checkpoint data/output/train_2022-08-19_15-21-09/train_8d9e8af4_24_s=9223372036854775807,e=1e-05,s=-1,s=0.3,e=32,d=20_2022-08-19_15-21-09/checkpoint-138 not found\n",
|
||
"[flaml.automl: 08-19 15:33:42] {3294} INFO - at 753.0s,\testimator transformer's best error=0.2744,\tbest estimator transformer's best error=0.2744\n",
|
||
"INFO:flaml.automl: at 753.0s,\testimator transformer's best error=0.2744,\tbest estimator transformer's best error=0.2744\n",
|
||
"[flaml.automl: 08-19 15:33:42] {3108} INFO - iteration 3, current learner transformer\n",
|
||
"INFO:flaml.automl:iteration 3, current learner transformer\n",
|
||
"/usr/local/lib/python3.7/dist-packages/transformers/optimization.py:309: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning\n",
|
||
" FutureWarning,\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"{'eval_loss': 0.586589515209198, 'eval_automl_metric': 0.2402451481103166, 'eval_runtime': 65.9616, 'eval_samples_per_second': 74.21, 'eval_steps_per_second': 74.21, 'epoch': 0.3}\n",
|
||
"{'train_runtime': 140.9957, 'train_samples_per_second': 31.243, 'train_steps_per_second': 1.958, 'train_loss': 0.7886253025220789, 'epoch': 0.3}\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stderr",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"The following columns in the test set don't have a corresponding argument in `RobertaForSequenceClassification.forward` and have been ignored: __index_level_0__. If __index_level_0__ are not expected by `RobertaForSequenceClassification.forward`, you can safely ignore this message.\n",
|
||
"***** Running Prediction *****\n",
|
||
" Num examples = 4895\n",
|
||
" Batch size = 1\n",
|
||
"Didn't find file data/output/train_2022-08-19_15-33-42/train_4e5c88d0_27_s=9223372036854775807,e=1.48e-05,s=-1,s=0.3,e=16,d=25_2022-08-19_15-33-42/checkpoint-276/added_tokens.json. We won't load it.\n",
|
||
"loading file data/output/train_2022-08-19_15-33-42/train_4e5c88d0_27_s=9223372036854775807,e=1.48e-05,s=-1,s=0.3,e=16,d=25_2022-08-19_15-33-42/checkpoint-276/vocab.json\n",
|
||
"loading file data/output/train_2022-08-19_15-33-42/train_4e5c88d0_27_s=9223372036854775807,e=1.48e-05,s=-1,s=0.3,e=16,d=25_2022-08-19_15-33-42/checkpoint-276/merges.txt\n",
|
||
"loading file data/output/train_2022-08-19_15-33-42/train_4e5c88d0_27_s=9223372036854775807,e=1.48e-05,s=-1,s=0.3,e=16,d=25_2022-08-19_15-33-42/checkpoint-276/tokenizer.json\n",
|
||
"loading file None\n",
|
||
"loading file data/output/train_2022-08-19_15-33-42/train_4e5c88d0_27_s=9223372036854775807,e=1.48e-05,s=-1,s=0.3,e=16,d=25_2022-08-19_15-33-42/checkpoint-276/special_tokens_map.json\n",
|
||
"loading file data/output/train_2022-08-19_15-33-42/train_4e5c88d0_27_s=9223372036854775807,e=1.48e-05,s=-1,s=0.3,e=16,d=25_2022-08-19_15-33-42/checkpoint-276/tokenizer_config.json\n",
|
||
"[flaml.automl: 08-19 15:37:54] {729} WARNING - checkpoint data/output/train_2022-08-19_15-29-30/train_b7eb3874_26_s=9223372036854775807,e=1.0297e-05,s=-1,s=0.3,e=16,d=26_2022-08-19_15-29-30/checkpoint-276 not found\n",
|
||
"WARNING:flaml.automl:checkpoint data/output/train_2022-08-19_15-29-30/train_b7eb3874_26_s=9223372036854775807,e=1.0297e-05,s=-1,s=0.3,e=16,d=26_2022-08-19_15-29-30/checkpoint-276 not found\n",
|
||
"[flaml.automl: 08-19 15:37:54] {3294} INFO - at 1004.8s,\testimator transformer's best error=0.2402,\tbest estimator transformer's best error=0.2402\n",
|
||
"INFO:flaml.automl: at 1004.8s,\testimator transformer's best error=0.2402,\tbest estimator transformer's best error=0.2402\n",
|
||
"[flaml.automl: 08-19 15:37:54] {3108} INFO - iteration 4, current learner transformer\n",
|
||
"INFO:flaml.automl:iteration 4, current learner transformer\n",
|
||
"/usr/local/lib/python3.7/dist-packages/transformers/optimization.py:309: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning\n",
|
||
" FutureWarning,\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"{'loss': 0.7223, 'learning_rate': 4.688468079515019e-06, 'epoch': 0.54}\n",
|
||
"{'eval_loss': 0.4903346300125122, 'eval_automl_metric': 0.1953013278855975, 'eval_runtime': 65.2412, 'eval_samples_per_second': 75.029, 'eval_steps_per_second': 75.029, 'epoch': 1.0}\n",
|
||
"{'train_runtime': 310.9644, 'train_samples_per_second': 47.221, 'train_steps_per_second': 2.952, 'train_loss': 0.624375353711363, 'epoch': 1.0}\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stderr",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"The following columns in the test set don't have a corresponding argument in `RobertaForSequenceClassification.forward` and have been ignored: __index_level_0__. If __index_level_0__ are not expected by `RobertaForSequenceClassification.forward`, you can safely ignore this message.\n",
|
||
"***** Running Prediction *****\n",
|
||
" Num examples = 4895\n",
|
||
" Batch size = 1\n",
|
||
"Didn't find file data/output/train_2022-08-19_15-37-54/train_e47787a2_28_s=9223372036854775807,e=1.0297e-05,s=-1,s=1,e=16,d=26_2022-08-19_15-37-54/checkpoint-918/added_tokens.json. We won't load it.\n",
|
||
"loading file data/output/train_2022-08-19_15-37-54/train_e47787a2_28_s=9223372036854775807,e=1.0297e-05,s=-1,s=1,e=16,d=26_2022-08-19_15-37-54/checkpoint-918/vocab.json\n",
|
||
"loading file data/output/train_2022-08-19_15-37-54/train_e47787a2_28_s=9223372036854775807,e=1.0297e-05,s=-1,s=1,e=16,d=26_2022-08-19_15-37-54/checkpoint-918/merges.txt\n",
|
||
"loading file data/output/train_2022-08-19_15-37-54/train_e47787a2_28_s=9223372036854775807,e=1.0297e-05,s=-1,s=1,e=16,d=26_2022-08-19_15-37-54/checkpoint-918/tokenizer.json\n",
|
||
"loading file None\n",
|
||
"loading file data/output/train_2022-08-19_15-37-54/train_e47787a2_28_s=9223372036854775807,e=1.0297e-05,s=-1,s=1,e=16,d=26_2022-08-19_15-37-54/checkpoint-918/special_tokens_map.json\n",
|
||
"loading file data/output/train_2022-08-19_15-37-54/train_e47787a2_28_s=9223372036854775807,e=1.0297e-05,s=-1,s=1,e=16,d=26_2022-08-19_15-37-54/checkpoint-918/tokenizer_config.json\n",
|
||
"[flaml.automl: 08-19 15:44:56] {729} WARNING - checkpoint data/output/train_2022-08-19_15-33-42/train_4e5c88d0_27_s=9223372036854775807,e=1.48e-05,s=-1,s=0.3,e=16,d=25_2022-08-19_15-33-42/checkpoint-276 not found\n",
|
||
"WARNING:flaml.automl:checkpoint data/output/train_2022-08-19_15-33-42/train_4e5c88d0_27_s=9223372036854775807,e=1.48e-05,s=-1,s=0.3,e=16,d=25_2022-08-19_15-33-42/checkpoint-276 not found\n",
|
||
"[flaml.automl: 08-19 15:44:56] {3294} INFO - at 1426.8s,\testimator transformer's best error=0.1953,\tbest estimator transformer's best error=0.1953\n",
|
||
"INFO:flaml.automl: at 1426.8s,\testimator transformer's best error=0.1953,\tbest estimator transformer's best error=0.1953\n",
|
||
"[flaml.automl: 08-19 15:44:56] {3409} INFO - selected model: None\n",
|
||
"INFO:flaml.automl:selected model: None\n",
|
||
"[flaml.automl: 08-19 15:44:56] {2837} INFO - fit succeeded\n",
|
||
"INFO:flaml.automl:fit succeeded\n",
|
||
"[flaml.automl: 08-19 15:44:56] {2839} INFO - Time taken to find the best model: 1426.8331220149994\n",
|
||
"INFO:flaml.automl:Time taken to find the best model: 1426.8331220149994\n",
|
||
"[flaml.automl: 08-19 15:44:56] {2853} WARNING - Time taken to find the best model is 79% of the provided time budget and not all estimators' hyperparameter search converged. Consider increasing the time budget.\n",
|
||
"WARNING:flaml.automl:Time taken to find the best model is 79% of the provided time budget and not all estimators' hyperparameter search converged. Consider increasing the time budget.\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"automl_settings[\"fit_kwargs_by_estimator\"][\"transformer\"][\"model_path\"] = \"roberta-base\"\n",
|
||
"automl_settings[\"log_file_name\"] = \"spooky_roberta.log\"\n",
|
||
"automl_model.fit(X_train=X_train, y_train=y_train,X_val=X_val, y_val=y_val, custom_hp=custom_hp, **automl_settings)"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"metadata": {
|
||
"id": "MknjX6ij7lpX"
|
||
},
|
||
"source": [
|
||
"We plot the performance of BERT and RoBERTa w.r.t. the wall clock time. We find that although RoBERTa frequently outperforms BERT on benchmark datasets, its performance on the spooky-author-identification dataset is worse than BERT using the same time budget. Therefore, model selection is a non trivial problem. "
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": null,
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/",
|
||
"height": 300
|
||
},
|
||
"id": "IHqpFgG3TjhJ",
|
||
"outputId": "dcd3f094-1689-4ebf-d796-55c127ba048c"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"5\n",
|
||
"5\n"
|
||
]
|
||
},
|
||
{
|
||
"data": {
|
||
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAXoAAAD4CAYAAADiry33AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+WH4yJAAAeXUlEQVR4nO3df3RV5Z3v8fenKUrGi4BCFUkE7EJGRQU9MnjpDzsOQlsVnVoLY1vtL8bbYltnFlVu77IO/TFVutpO12LGOpbqmtUKXC7F1DqLWluvq61WQlWUaDSiown2NoJgrUEgfO8feyccDifkBE5ycnY+r7XOyt7PfvbJ87DDJzt77/M8igjMzCy73lbpBpiZWf9y0JuZZZyD3sws4xz0ZmYZ56A3M8u4t1e6AYXGjBkTEydOrHQzzMyqysaNG1+NiLHFtg26oJ84cSKNjY2VboaZWVWR9F89bfOlGzOzjHPQm5llnIPezCzjBt01+mL27NlDa2sru3btqnRTBqXhw4dTV1fHsGHDKt0UMxuEqiLoW1tbGTFiBBMnTkRSpZszqEQE27Zto7W1lUmTJlW6OWY2CFVF0O/atcsh3wNJHH/88bS3t1e6KWbWg3WPtbFsfTNbd3Rw0qhaFs+ZwmXTxw/Y96+KoAcc8ofgfxuzwWvdY20sWfskHXs6AWjb0cGStU8CDFjY+2asmVk/Wra+uTvku3Ts6WTZ+uYBa4ODvkQvvvgiU6dOPez9161bR1NTUxlbZGbVYOuOjj6V9wcH/QDYu3evg95siDppVG2fyvuDg74P9u7dy1VXXcVpp53GFVdcwZtvvsnGjRt573vfy7nnnsucOXN45ZVXALjgggv44he/SC6X45ZbbqGhoYHFixczbdo0nn/++Qr3xMwGyuI5U6gdVnNAWe2wGhbPmTJgbaiam7Fd/umnm2na+npZ3/P0k47lK5ec0Wu95uZmfvCDHzBr1iw++clPsnz5cn7yk59wzz33MHbsWFatWsWXv/xlVqxYAcDu3bu7x+157rnnuPjii7niiivK2nYzG9y6brh+ac0mdnfuY3yxp242rYYHlsLOVhhZBxfeBGddWbY2VF3QV1J9fT2zZs0C4KMf/Sjf+MY3eOqpp5g9ezYAnZ2djBs3rrv+Rz7ykYq008wGl8umj+fuR18CYNXfn3/gxk2r4aefhz3pNfudLyfrULawLynoJc0F/gWoAe6IiG8WbD8ZuAsYlda5MSLuS7ctAT4FdAKfj4j1R9LgUs68+0vhY4wjRozgjDPO4OGHHy5a/5hjjhmIZplZNXtg6f6Q77KnIykvU9D3eo1eUg2wHHg/cDqwQNLpBdX+F7A6IqYD84F/Tfc9PV0/A5gL/Gv6flXppZde6g71H//4x8ycOZP29vbusj179rB58+ai+44YMYI//elPA9ZWM6sSO1v7Vn4YSrkZOwNoiYgtEbEbWAnMK6gTwLHp8khga7o8D1gZEW9FxAtAS/p+VWnKlCksX76c0047jddee43rrruONWvWcMMNN3D22Wczbdo0fvvb3xbdd/78+Sxbtozp06f7ZqyZ7Teyrm/lh6GUSzfjgZfz1luBvyqoczPwc0nXAccAf5O37yMF+x70UTBJC4GFACeffHIp7R5wEydO5JlnnjmofNq0aTz00EMHlT/44IMHrM+aNcuPV5rZwS686cBr9ADDapPyMinX45ULgDsjog74APAfkkp+74i4PSJyEZEbO7boTFhmZtl01pVwyfeg5uhkfWR9sj7AT920AfV563VpWb5PkVyDJyIeljQcGFPivmZmQ9tZV8LGu5LlT/ys7G9fyln3BmCypEmSjiK5udpQUOcl4EIASacBw4H2tN58SUdLmgRMBh4tV+PNzKx3vZ7RR8ReSYuA9SSPTq6IiM2SlgKNEdEA/CPw75KuJ7kxe01EBLBZ0mqgCdgLfC4iOot/JzMz6w8lPUefPhN/X0HZTXnLTcCsHvb9OvD1I2ijmZkdAY91Y2aWcQ76MnrwwQe5+OKLj+g97rzzTrZu3dp7RTOzEjnoD0NEsG/fvrK/b2dnp4PezMrOQV+iF198kSlTpvDxj3+cqVOn8qlPfYqpU6dy5plnsmrVqu56r7/+Oh/84AeZMmUK1157bfcvhJ///Oecf/75nHPOOXz4wx/mjTfeAJIPYt1www2cc8453H333TQ2NnLVVVcxbdo0Ojo6WLp0Keeddx5Tp05l4cKFJPe4zcxKV32jV/7njfCHJ8v7nieeCe//Zq/VnnvuOe666y7a2tq47bbbeOKJJ3j11Vc577zzeM973gPAo48+SlNTExMmTGDu3LmsXbuWCy64gK997Wv84he/4JhjjuGWW27h29/+NjfdlNzPPv744/n9738PwB133MG3vvUtcrkcAIsWLequ97GPfYx7772XSy65pLz9N7NMq76gr6AJEyYwc+ZMrr/+ehYsWEBNTQ0nnHAC733ve9mwYQPHHnssM2bM4JRTTgFgwYIF/PrXv2b48OE0NTV1D3G8e/duzj9//1ClhxrO+Fe/+hW33norb775Jtu3b+eMM85w0JtZn1Rf0Jdw5t1fShl2uHAoY0lEBLNnz+buu+/u0/vu2rWLz372szQ2NlJfX8/NN9/Mrl27+t5wMxvSfI3+MLz73e9m1apVdHZ20t7ezkMPPcSMGcmgnI8++igvvPAC+/btY9WqVbzrXe9i5syZ/OY3v6GlpQWAP//5zzz77LNF3zt/OOOuUB8zZgxvvPEGa9asGYDemVnWVN8Z/SBw+eWX8/DDD3P22WcjiVtvvZUTTzyRZ555hvPOO49FixbR0tLC+973Pi6//HLe9ra3ceedd7JgwQLeeustAL72ta9x6qmnHvTe11xzDddeey21tbU8/PDDfOYzn2Hq1KmceOKJnHfeeQPdVTPLAA22pzhyuVx0zbPa5emnn+a0006rUIuqg/+NzAa3j3w/maDooKkEu/zwg8nXwxzUTNLGiMgV2+ZLN2ZmGeegN8uiTavhO1Ph5lHJ102rK90iq6CquUYfEQc90WKJwXb5zSps0+oDZyza+XKyDmWdzMKqR1UE/fDhw9m2bRvHH3+8w75ARLBt2zaGDx9e6abYYPHA0gOnpYNk/Z5F+ye3sAF307adycIPRxav8Icnkw9v9oOqCPq6ujpaW1tpb2+vdFMGpeHDh1NXV76JhK3K7WwtXt751sC2w/rmxDPhzCv65a2rIuiHDRvGpEmTKt0Ms+owsi65XHNQeX2/TFNnpVna9dTNJ3p46qYflXQzVtJcSc2SWiTdWGT7dyQ9nr6elbQjb1tn3rbCKQjNrNwuvAmG1R5YNqw2Kbchqdczekk1wHJgNtAKbJDUkM4qBUBEXJ9X/zpget5bdETEtPI12cwOqeuG6z2Lkss1I+uTkPeN2CGrlEs3M4CWiNgCIGklMI9kHthiFgBfKU/zzOywnHXl/huvvlwz5JVy6WY8kH/BrzUtO4ikCcAk4Jd5xcMlNUp6RNJlPey3MK3T6BuuZmblVe4PT
M0H1kREZ17ZhPRjuX8HfFfSOwt3iojbIyIXEbmxY8eWuUlmZkNbKUHfBtTnrdelZcXMBw4Yizci2tKvW4AHOfD6vZmZ9bNSgn4DMFnSJElHkYT5QU/PSPpLYDTwcF7ZaElHp8tjgFn0fG3fzMz6Qa83YyNir6RFwHqgBlgREZslLQUaI6Ir9OcDK+PAz+OfBnxf0j6SXyrfzH9ax8wO37rH2li2vpmtOzo4aVQti+dM4bLpRW+f2RBX0gemIuI+4L6CspsK1m8ust9vgf75TK/ZELbusTaWrH2Sjj3J7bC2HR0sWZvMpeywt0JV8clYMzvQsvXN3SHfpWNPJ19as4m7H30J2D+2StcnMq2yml55ndPHHVuR7+1his2q0NYdHUXLd3fuG+CWWKlOH3cs86ZV5q8tn9GbVaGTRtXSViTsx4+q3T+DUTpKYiXGVrHBxWf0lj1DYNKNxXOmUDus5oCy2mE1LJ4zpUItssHMZ/SWLUNk0o2uG65fWrOJ3Z37GO+nbuwQHPSWLUNo0o3LgMnHJDdczzhhJDxO8urSjxNZWHXxpRvLFk+6sV8/TmRh1cVn9JYtQ2zSjUpOZmHVw2f0li2edMPsIA56y5azroRLvgc1RyfrI+uT9QzdiDXrK1+6sezxpBtmB/AZvZlZxjnozcwyzkFvZpZxDnozs4xz0JuZZVxJQS9prqRmSS2Sbiyy/TuSHk9fz0rakbftaknPpa+ry9l4MzPrXa+PV0qqAZYDs4FWYIOkhvwpASPi+rz615FOAC7pOOArQA4IYGO672tl7YWZmfWolDP6GUBLRGyJiN3ASmDeIeovAO5Ol+cA90fE9jTc7wfmHkmDzcysb0r5wNR4IH/wkFbgr4pVlDQBmAT88hD7HjSOqqSFwEKAk08+uYQmmSU8QbZZ78r9ydj5wJqI6Oy1Zp6IuB24HSCXy0WZ22QZdcgJsivZMLNBppSgbwPq89br0rJi5gOfK9j3goJ9Hyy9eWY9O9QE2V3jtGd9YuxKTjht1aOUa/QbgMmSJkk6iiTMGworSfpLYDSQ/z9rPXCRpNGSRgMXpWVmR8wTZFd2wmmrHr2e0UfEXkmLSAK6BlgREZslLQUaI6Ir9OcDKyMi8vbdLumrJL8sAJZGxPbydsGGqkNNkH3GCZ4Y26xLSdfoI+I+4L6CspsK1m/uYd8VwIrDbF/fbVqdTCe3szWZhOLCmzxEbUYtnjPlgGv0kDdB9uOH2NFsiMnWMMVDZGJoSxxygmwHvVm3bAX9EJoY2hI9TpDtibHNumVrrBtPDG1dPDG2WbdsndEPsYmhLeEJss0OLVtn9J4Y2szsINkK+q6JoUfWA/LE0GZmZO3SDSSh7mA3M+uWrTN6MzM7iIPezCzjHPRmZhnnoDczyzgHvZlZxjnozcwyzkFvZpZxDnozs4xz0JuZZVxJQS9prqRmSS2SbuyhzpWSmiRtlvTjvPJOSY+nr4OmIDQzs/7V6xAIkmqA5cBsoBXYIKkhIpry6kwGlgCzIuI1Se/Ie4uOiJhW5nabmVmJSjmjnwG0RMSWiNgNrATmFdT5DLA8Il4DiIg/lreZZmZ2uEoJ+vFA/iDvrWlZvlOBUyX9RtIjkubmbRsuqTEtv6zYN5C0MK3T2N7e3qcOmJnZoZVr9Mq3A5OBC4A64CFJZ0bEDmBCRLRJOgX4paQnI+L5/J0j4nbgdoBcLhdlapOZmVHaGX0bUJ+3XpeW5WsFGiJiT0S8ADxLEvxERFv6dQvwIDD9CNtsZmZ9UErQbwAmS5ok6ShgPlD49Mw6krN5JI0huZSzRdJoSUfnlc8CmjAzswHT66WbiNgraRGwHqgBVkTEZklLgcaIaEi3XSSpCegEFkfENkn/Hfi+pH0kv1S+mf+0Tjmte6yNZeub2bqjg5NG1bJ4zhQum154K8HMbOgp6Rp9RNwH3FdQdlPecgD/kL7y6/wWOPPIm3lo6x5rY8naJ+nY0wlA244Olqx9EsBhb2ZDXiamEly2vrk75Lt07OnkS2s2cfejL1WoVTZQml55ndPHHVvpZpgNWpkYAmHrjo6i5bs79w1wS6wSTh93LPOm+S83s55k4oz+pFG1tBUJ+/Gjaln19+dXoEVmZoNHJs7oF8+ZQu2wmgPKaofVsHjOlAq1yMxs8MjEGX3XDVc/dWNmdrBMBD0kYe9gNzM7WCYu3ZiZWc8c9GZmGeegNzPLOAe9mVnGOejNzDLOQW9mlnEOejOzjHPQm5llnIPezCzjHPRmZhlXUtBLmiupWVKLpBt7qHOlpCZJmyX9OK/8aknPpa+ry9VwMzMrTa9j3UiqAZYDs0kmAd8gqSF/SkBJk4ElwKyIeE3SO9Ly44CvADkggI3pvq+VvytmZlZMKWf0M4CWiNgSEbuBlcC8gjqfAZZ3BXhE/DEtnwPcHxHb0233A3PL03QzMytFKUE/Hng5b701Lct3KnCqpN9IekTS3D7si6SFkholNba3t5feejMz61W5bsa+HZgMXAAsAP5d0qhSd46I2yMiFxG5sWPHlqlJZmYGpQV9G1Cft16XluVrBRoiYk9EvAA8SxL8pexrZmb9qJSg3wBMljRJ0lHAfKChoM46krN5JI0huZSzBVgPXCRptKTRwEVpmZmZDZBen7qJiL2SFpEEdA2wIiI2S1oKNEZEA/sDvQnoBBZHxDYASV8l+WUBsDQitvdHR8zMrDhFRKXbcIBcLheNjY2VboaZWVWRtDEicsW2+ZOxZmYZ56A3M8s4B72ZWcY56M3MMs5Bb2aWcQ56M7OMc9CbmWWcg97MLOMc9GZmGeegNzPLOAe9mVnGOejNzDLOQW9mlnEOejOzjHPQm5llXK8Tj9jgt+6xNpatb2brjg5OGlXL4jlTuGz6QXOwm9kQVdIZvaS5kpoltUi6scj2ayS1S3o8fX06b1tnXnnhFIR2hNY91saStU/StqODANp2dLBk7ZOse8xT85pZotczekk1wHJgNskk4BskNUREU0HVVRGxqMhbdETEtCNvqhWzbH0zHXs6Dyjr2NPJsvXNPqs3M6C0M/oZQEtEbImI3cBKYF7/NstKtXVHR5/KzWzoKSXoxwMv5623pmWFPiRpk6Q1kurzyodLapT0iKTLin0DSQvTOo3t7e2lt944aVRtn8rNbOgp11M3PwUmRsRZwP3AXXnbJqQT1v4d8F1J7yzcOSJuj4hcROTGjh1bpiYNDYvnTKF2WM0BZbXDalg8Z0qFWmRmg00pQd8G5J+h16Vl3SJiW0S8la7eAZybt60t/boFeBCYfgTttQKXTR/PP//tmYwfVYuA8aNq+ee/PdPX582sWymPV24AJkuaRBLw80nOzrtJGhcRr6SrlwJPp+WjgTcj4i1JY4BZwK3larwlLps+3sFuZj3qNegjYq+kRcB6oAZYERGbJS0FGiOiAfi8pEuBvcB24Jp099OA70vaR/LXwzeLPK1jZmb9SBFR6TYcIJfLRWNjY6WbYWZWVSRtTO+HHsRDIJiZZZyD3sws4xz0ZmYZ56A3M8s4B72ZWcY56M3MMs5Bb2aWcQ56M7OMc9Cb
mWWcg97MLOMc9GZmGeegNzPLOAe9mVnGOejNzDLOQW9mlnEOejOzjCsp6CXNldQsqUXSjUW2XyOpXdLj6evTeduulvRc+rq6nI03M7Pe9TqVoKQaYDkwG2gFNkhqKDIl4KqIWFSw73HAV4AcEMDGdN/XytJ6MzPrVSln9DOAlojYEhG7gZXAvBLffw5wf0RsT8P9fmDu4TXVzMwORylBPx54OW+9NS0r9CFJmyStkVTfl30lLZTUKKmxvb29xKabmVkpynUz9qfAxIg4i+Ss/a6+7BwRt0dELiJyY8eOLVOTzMwMSgv6NqA+b70uLesWEdsi4q109Q7g3FL3NTOz/lVK0G8AJkuaJOkoYD7QkF9B0ri81UuBp9Pl9cBFkkZLGg1clJaZmdkA6fWpm4jYK2kRSUDXACsiYrOkpUBjRDQAn5d0KbAX2A5ck+67XdJXSX5ZACyNiO390A8zM+uBIqLSbThALpeLxsbGSjfDzKyqSNoYEbli2/zJWDOzjHPQm5llnIPezCzjHPRmZhnnoDczyzgHvZlZxjnozcwyzkFvZpZxDnozs4xz0JuZZZyD3sws4xz0ZmYZ56A3M8s4B72ZWcY56M3MMq6koJc0V1KzpBZJNx6i3ockhaRcuj5RUoekx9PXbeVquJmZlabXGaYk1QDLgdlAK7BBUkNENBXUGwF8AfhdwVs8HxHTytReMzPro1LO6GcALRGxJSJ2AyuBeUXqfRW4BdhVxvaZmdkRKiXoxwMv5623pmXdJJ0D1EfEz4rsP0nSY5L+r6R3H35TzczscPR66aY3kt4GfJt0QvACrwAnR8Q2SecC6ySdERGvF7zHQmAhwMknn3ykTTIzszylnNG3AfV563VpWZcRwFTgQUkvAjOBBkm5iHgrIrYBRMRG4Hng1MJvEBG3R0QuInJjx449vJ6YmVlRpQT9BmCypEmSjgLmAw1dGyNiZ0SMiYiJETEReAS4NCIaJY1Nb+Yi6RRgMrCl7L0wM7Me9XrpJiL2SloErAdqgBURsVnSUqAxIhoOsft7gKWS9gD7gGsjYns5Gm5mZqVRRFS6DQfI5XLR2NhY6WaYmVUVSRsjIldsmz8Za2aWcQ56M7OMc9CbmWWcg97MLOMc9GZmGeegNzPLOAe9mVnGOeizZtNq+M5UuHlU8nXT6kq3yMwq7IgHNbNBZNNq+OnnYU9Hsr7z5WQd4KwrK9cuM6son9FnyQNL94d8lz0dSbmZDVkO+izZ2dq3cjMbEhz0WTKyrm/lZjYkOOiz5MKbYFjtgWXDapNyMxuyHPRZctaVcMn3YGQ9oOTrJd/zjVizIc5P3WTNWVc62M3sAD6jNzPLuJKCXtJcSc2SWiTdeIh6H5IUknJ5ZUvS/ZolzSlHo83MrHS9XrpJ53xdDswGWoENkhoioqmg3gjgC8Dv8spOJ5lj9gzgJOAXkk6NiM7ydcHMzA6llDP6GUBLRGyJiN3ASmBekXpfBW4BduWVzQNWRsRbEfEC0JK+n5mZDZBSgn488HLeemta1k3SOUB9RPysr/uamVn/OuKnbiS9Dfg2cM0RvMdCYGG6+oak5oIqY4BXD/f9B5ks9QWy1R/3ZfDKUn/6qy8TetpQStC3AfV563VpWZcRwFTgQUkAJwINki4tYV8AIuJ24PaeGiCpsafZzatNlvoC2eqP+zJ4Zak/lehLKZduNgCTJU2SdBTJzdWGro0RsTMixkTExIiYCDwCXBoRjWm9+ZKOljQJmAw8WvZemJlZj3o9o4+IvZIWAeuBGmBFRGyWtBRojIiGQ+y7WdJqoAnYC3zOT9yYmQ2skq7RR8R9wH0FZUUHUImICwrWvw58/TDb16XHyzpVKEt9gWz1x30ZvLLUnwHviyJioL+nmZkNIA+BYGaWcQ56M7OMq3jQS6qX9CtJTZI2S/pCWn6cpPslPZd+HZ2WS9L30vFzNqUf1hpUJNVIekzSven6JEm/S9u8Kn16ifRppFVp+e8kTaxku4uRNErSGknPSHpa0vnVemwkXZ/+jD0l6W5Jw6vp2EhaIemPkp7KK+vzsZB0dVr/OUlXD6K+LEt/zjZJ+omkUXnbio6ZpRLH4epvxfqTt+0flYwBNiZdH/hjExEVfQHjgHPS5RHAs8DpwK3AjWn5jcAt6fIHgP8EBMwEflfpPhTp0z8APwbuTddXA/PT5duA/5Eufxa4LV2eD6yqdNuL9OUu4NPp8lHAqGo8NiSfyH4BqM07JtdU07EB3gOcAzyVV9anYwEcB2xJv45Ol0cPkr5cBLw9Xb4lry+nA08ARwOTgOdJngCsSZdPSX82nwBOHyzHJi2vJ3li8b+AMZU6NhX9we3hH+wekgHUmoFxadk4oDld/j6wIK9+d73B8CL5UNgDwF8D96YH89W8H+DzgfXp8nrg/HT57Wk9VboPeX0ZmYajCsqr7tiwfziO49J/63uBOdV2bICJBeHYp2MBLAC+n1d+QL1K9qVg2+XAj9LlJcCSvG3r02PVfbyK1RsM/QHWAGcDL7I/6Af82FT80k2+9M/j6SQjYJ4QEa+km/4AnJAuD/bxc74LfAnYl64fD+yIiL3pen57u/uSbt+Z1h8sJgHtwA/TS1F3SDqGKjw2EdEGfAt4CXiF5N96I9V7bLr09VgM2mNU4JMkZ71QpX2RNA9oi4gnCjYNeH8GTdBL+m/A/wG+GBGv52+L5NfboH8OVNLFwB8jYmOl21Imbyf5c/TfImI68GeSywPdqujYjCYZTXUSyZDZxwBzK9qoMquWY9EbSV8m+YDljyrdlsMl6S+A/wkMigmbB0XQSxpGEvI/ioi1afH/kzQu3T4O+GNaXtL4ORUyC7hU0oskwzn/NfAvwChJXR9Oy29vd1/S7SOBbQPZ4F60Aq0R0TXHwBqS4K/GY/M3wAsR0R4Re4C1JMerWo9Nl74ei8F8jJB0DXAxcFX6iwuqsy/vJDmpeCLNgzrg95JOpAL9qXjQSxLwA+DpiPh23qYGoOuu89Uk1+67yj+e3rmeCezM+9O1oiJiSUTURTLmz3zglxFxFfAr4Iq0WmFfuvp4RVp/0JyRRcQfgJclTUmLLiQZzqLqjg3JJZuZkv4i/Znr6ktVHps8fT0W64GLJI1O/8q5KC2rOElzSS57XhoRb+Zt6mnMrEOOw1VJEfFkRLwj9o8B1kry0MkfqMSxqdSNi7wbDu8i+XNzE/B4+voAyfXQB4DngF8Ax6X1RTLj1fPAk0Cu0n3ooV8XsP+pm1NIfjBbgP8NHJ2WD0/XW9Ltp1S63UX6MQ1oTI/POpKnAary2AD/BDwDPAX8B8lTHFVzbIC7Se4v7CEJjk8dzrEguf7dkr4+MYj60kJyjborB27Lq//ltC/NwPvzyj9A8qTe88CXB9OxKdj+Ivtvxg74sfEQCGZmGVfxSzdmZta/HPRmZhnnoDczyzgHvZlZxjnozcwyzkFvZpZxDnozs4z7/xQNBu15VctMAAAAAElFTkSuQmCC\n",
|
||
"text/plain": [
|
||
"<Figure size 432x288 with 1 Axes>"
|
||
]
|
||
},
|
||
"metadata": {
|
||
"needs_background": "light"
|
||
},
|
||
"output_type": "display_data"
|
||
}
|
||
],
|
||
"source": [
|
||
"from flaml.data import get_output_from_log\n",
|
||
"import matplotlib.pyplot as plt\n",
|
||
"import numpy as np\n",
|
||
"\n",
|
||
"for each_file_name in ['bert', 'roberta']:\n",
|
||
" time_history, best_valid_loss_history, valid_loss_history, config_history, metric_history = \\\n",
|
||
" get_output_from_log(filename='spooky_' + each_file_name + '.log', time_budget=3000)\n",
|
||
" print(len(valid_loss_history))\n",
|
||
" plt.scatter(time_history, 1 - np.array(valid_loss_history))\n",
|
||
" plt.step(time_history, 1 - np.array(best_valid_loss_history), where='post')\n",
|
||
"\n",
|
||
"plt.legend(['bert', 'roberta'])\n",
|
||
"plt.show()"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"metadata": {
|
||
"id": "lT7IwNCoTjhJ"
|
||
},
|
||
"source": [
|
||
"## 4. Other Tasks"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"metadata": {
|
||
"id": "Fzkr77iATjhJ"
|
||
},
|
||
"source": [
|
||
"Besides sequence classification, FLAML currently also supports four other tasks (more tasks are to be supported, which can be found on [FLAML's documentation website] (https://microsoft.github.io/FLAML/docs/Examples/AutoML-NLP)):\n",
|
||
"\n",
|
||
"- sequence regression: predicting a float number from the input sequence, e.g., predicting the rating of a hotel review based on the text content;\n",
|
||
"- token classification: predicting the label of each token in a sequence, e.g., named entity recognition;\n",
|
||
"- multiple choice: predicting the best second half of a sentence that comes next to the first part of a sentence based on common sensen reasoning. An example is seen below;\n",
|
||
"- (abstractive) summarization: generating the textual summarization of an input paragraph;\n",
|
||
"\n",
|
||
"Here we look into two tasks: multiple choice classification and text summarization. These tasks require significant computational resources, therefore instead of Colab, we run them using 4 NVIDIA V100 GPUs and Ray Tune on our server."
|
||
]
|
||
},
|
||
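{
"cell_type": "markdown",
"metadata": {},
"source": [
"The next cell is a minimal, untested sketch (not part of the original experiments) of how the `task` field in `automl_settings` would change for the other tasks listed above. The task names are taken from FLAML's documentation and should be verified there, and `X_train`/`y_train` are assumed to already be in the format each task expects."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Hedged sketch: mainly the \"task\" string (and the data format) changes between FLAML NLP tasks.\n",
"# Task names follow https://microsoft.github.io/FLAML/docs/Examples/AutoML-NLP; verify before use.\n",
"from flaml import AutoML\n",
"\n",
"automl = AutoML()\n",
"automl_settings = {\n",
"    \"time_budget\": 1800,\n",
"    \"task\": \"seq-regression\",  # or \"token-classification\", \"multichoice-classification\", \"summarization\"\n",
"    \"fit_kwargs_by_estimator\": {\"transformer\": {\"output_dir\": \"data/output/\"}},\n",
"    \"gpu_per_trial\": 1,  # set to 0 if no GPU is available\n",
"    \"log_file_name\": \"other_task.log\",\n",
"}\n",
"# automl.fit(X_train=X_train, y_train=y_train, X_val=X_val, y_val=y_val, **automl_settings)"
]
},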
{
|
||
"cell_type": "markdown",
|
||
"metadata": {
|
||
"id": "Y4VgUR5TTjhJ"
|
||
},
|
||
"source": [
|
||
"### 4.1 Multiple Choice Example"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "markdown",
|
||
"metadata": {
|
||
"id": "OO8GqaH3TjhJ"
|
||
},
|
||
"source": [
|
||
"Multiple choice is a task of predicting the best second half of a sentence that follows the first half based on common sense reasoning. An example of multiple-choice classification problem is:\n",
|
||
"\n",
|
||
"On stage, a woman takes a seat at the piano. She\n",
|
||
"a) sits on a bench as her sister plays with the doll.\n",
|
||
"b) smiles with someone as the music plays.\n",
|
||
"c) is in the crowd, watching the dancers.\n",
|
||
"d) *nervously sets her fingers on the keys*."
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": null,
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/",
|
||
"height": 382,
|
||
"referenced_widgets": [
|
||
"3588f07c45694ec4a484afaaa9e9c599",
|
||
"998b0ca5b37b47b88ea47327462c76fa",
|
||
"38bb77cefa2e4c17b8e9c419125d6c45",
|
||
"ded6921a6b8140b3bcd59d0e7bbd7900",
|
||
"ad994aff0bf94c2ea4ac9aa8d5c067e3",
|
||
"6be493d86857493190ee47a08c04ff40",
|
||
"46e340fe82414a58b283c78cdf953773",
|
||
"760feb714cc54846a52fc399703891d7",
|
||
"a21f2bb483e44c749ec41a2b1784ee4b",
|
||
"9cf2c0d8439a4f5a86b4769a27babb94",
|
||
"007a43463f5e4da3983f59dfeb793e64",
|
||
"2e00672f9d1f46cea3e5db651bca19a3",
|
||
"cc378c1990634f7da6ffba019fe38c59",
|
||
"642323b1bafa4d0fbeca1adff2426c02",
|
||
"a902290681e942cbae40024baaa2e9b7",
|
||
"3a014eef1b7d44698572bc5cada4cb8c",
|
||
"c1792bbceb854dc5880003f64e5623cb",
|
||
"b167b817426e4832b73a7a37b72115c1",
|
||
"5ea642008bf74641a021e17b7e3fd6e7",
|
||
"2d975d14c3f0434583e73ae97f580951",
|
||
"26f3abaf861a4c63986ff0691294d70c",
|
||
"1a701b0fa5a34fb9b64fb92b5c8e4306",
|
||
"7dc52faf4b3b4643b7d7019f1722c1d8",
|
||
"44cf4e612b5f482d8bf224413c1bc852",
|
||
"f4ec7bc190af4c9dbe6a5fc05fad4540",
|
||
"9dd6ab0e0cb940bebe25cba5492b2486",
|
||
"e556c049fbb24669a49b26c7f107e6a5",
|
||
"805c722b7dec4fc59ef64f8704e29424",
|
||
"0c3a0eb88b16493e9d0f62e3d5abf195",
|
||
"bfbcfceca0444337ac6c4033a7734fc1",
|
||
"5ad8132df42340c58f1375b1e52eb5bc",
|
||
"aa1c4b91b583440a8e6f79dd06cfd200",
|
||
"b37bd95afbe44cd196fc5ab2d52bccd0",
|
||
"b649c32bcc8446cf91c53604fc1dcaa6",
|
||
"b70e2d813be3473f90ca76e293251c0b",
|
||
"b66e17d6fd094f44bc10eade34fc5261",
|
||
"9509ad27a9b54e3e80f796d224f3e189",
|
||
"920c8dd736a4454f9469fb3fa0a9af90",
|
||
"8f5311cebd554f5ba645b8d33b0722a3",
|
||
"aad092fcb29d4045a342288aa9d6a329",
|
||
"811d914b52904fa0913adc9daa33695c",
|
||
"98f179b9be5044c79bc867f5261e2b47",
|
||
"18091d361aa44881a3db5d1951882082",
|
||
"2a1aa694683d4df9b509f5ce4d6d53b0",
|
||
"f74dfe0a3de64c3ea051e14fba9a04e4",
|
||
"37d4912ed8ee4c0c9f0a9187bad156fd",
|
||
"6284429508d849bd8259460913efc250",
|
||
"e1f77bef878c4b0bbfac867c5a9eea98",
|
||
"20624397998c4e188b419c6267affb65",
|
||
"3b6eaa3d64924ec581c412b04b9196fa",
|
||
"94a2d9480adc426abb4ade344ca8dd2f",
|
||
"d4830572efa244968881c31932ec5dff",
|
||
"21f848683b2648c08a7476658d382177",
|
||
"5a392cc22ee84433bac08ef8a6a3e0d4",
|
||
"98bcaff9e28547e3b1f9b0640d598f99",
|
||
"3b684b9f50ce48ff92b075d62619368b",
|
||
"a3e42e8e532c4628abaa6e154d667ed2",
|
||
"49290455306b47aaaf8153bed5e49742",
|
||
"d8468fc2f0b94b2b8dab75336a0d29a3",
|
||
"b09d990f98f0419e84f5939d3b48d381",
|
||
"4b92faf53c2b4f7986066af8026ffc3a",
|
||
"8497fe93c0d148a49f9a0a0c56961f36",
|
||
"d226d72577ea4cc299ed78c2fa99a486",
|
||
"82e0dbe1328a4c54807932984e0c4efb",
|
||
"eb395373be244e6e8815087d5d32a801",
|
||
"1ffd6e8c1f834dc48d66116b6089f7a2",
|
||
"155d7e95c2504507b83b12dc60f1edc1",
|
||
"feaf1f712b8e499db2558dc0fdd4261e",
|
||
"51392c27affa4fd3b4184cde01b7029d",
|
||
"5dd6914461ea456e9dc96ccf8c391c6e",
|
||
"069feb62f1ec4392b04ee1d80aa4b445",
|
||
"62bdec145ccb48faa4fe5f51d2879732",
|
||
"c96613989db447b5acfb35cfef553145",
|
||
"db589666b507409f9647930b1222b0a9",
|
||
"1fed6cdd71c4453b976e8651b7b34cae",
|
||
"f307d9be05d24940b57d9edc82be8976",
|
||
"20abb46ea9c948b8ba85a921aee8af6d",
|
||
"ca550fa3fe1147bd8285c2b7cadde206",
|
||
"263e99ca21124b79a26e1078b187273d",
|
||
"894de256daec49329f6404326eddaa39",
|
||
"7feeefa264da4af89ddc8ddf331b4f9f",
|
||
"4b2b38b8064040849b10c63b9f2ed8fd",
|
||
"a91dc7ddd7f641d9b60b59bbbde7bae6",
|
||
"96c6874b9bd045a5bc67596b2ab04df2",
|
||
"57aca5124cc14ed69da5a0b24a2c1052",
|
||
"8f2c2b10e21e42569ef5396e42c65e30",
|
||
"be21476b15a14c0084712a9d5aedc22f",
|
||
"362f58e6d05f4d0c865cbe6a956d677b",
|
||
"c274c717ac7e4fa2888e0d101c3fe1fb",
|
||
"9cbec0a1fe3247ed8a46290df56756fa",
|
||
"35151380719b41349a2113b0b893bd6f",
|
||
"1fb6335656b1444abe05aa94a7d13825",
|
||
"a6431dfb76084a838d63849fa362de35",
|
||
"26885bbae6c646e7bcc4a0459620c37a",
|
||
"a56b058d910244588c1454b02c8cda8c",
|
||
"efca4a3072e94170a1b851f9dec6164d",
|
||
"791101442898470f8524ebee4cb9459d",
|
||
"6a24dd061b2d40d4baca4036059fc94c",
|
||
"9fdc731eb58247cbafef9286b49c66f9",
|
||
"4fa9926221cb4d29bc0cc0c3d0bf93f3",
|
||
"4e956bd06d3a45eca66b192990416a62",
|
||
"e319f91c3ba841f99a9a1ca1c7b551f2",
|
||
"da31e023dddb4c25a035258c0e4ed0d7",
|
||
"574c7a42dad940379a96b9f0968d3be1",
|
||
"ab2c48f34b7f43c5a5fb37c80a7d47f3",
|
||
"834bdf06646a4d009648b6bc270c7624",
|
||
"2e0b939889d84c07a90d36a57065aac4",
|
||
"929cdd7c2f8e4902aac96a9a3afa5866",
|
||
"87639f90c2ab47db986419c03e165d7a",
|
||
"a18b9c22460940cfa53b657849b034bf"
|
||
]
|
||
},
|
||
"id": "hQ5fX0N3TjhJ",
|
||
"outputId": "e1701a84-daad-4e70-82ac-c2c14d718793"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"name": "stderr",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"No config specified, defaulting to: swag/regular\n",
|
||
"Reusing dataset swag (/home/xliu127/.cache/huggingface/datasets/swag/regular/0.0.0/9640de08cdba6a1469ed3834fcab4b8ad8e38caf5d1ba5e7436d8b1fd067ad4c)\n",
|
||
"No config specified, defaulting to: swag/regular\n",
|
||
"Reusing dataset swag (/home/xliu127/.cache/huggingface/datasets/swag/regular/0.0.0/9640de08cdba6a1469ed3834fcab4b8ad8e38caf5d1ba5e7436d8b1fd067ad4c)\n",
|
||
"No config specified, defaulting to: swag/regular\n",
|
||
"Reusing dataset swag (/home/xliu127/.cache/huggingface/datasets/swag/regular/0.0.0/9640de08cdba6a1469ed3834fcab4b8ad8e38caf5d1ba5e7436d8b1fd067ad4c)\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"73546\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"from datasets import load_dataset\n",
|
||
"\n",
|
||
"train_dataset = load_dataset(\"swag\", split=\"train\").to_pandas()\n",
|
||
"dev_dataset = load_dataset(\"swag\", split=\"validation\").to_pandas()\n",
|
||
"test_dataset = load_dataset(\"swag\", split=\"test\").to_pandas()\n",
|
||
"\n",
|
||
"custom_sent_keys = [\n",
|
||
" \"sent1\",\n",
|
||
" \"sent2\",\n",
|
||
" \"ending0\",\n",
|
||
" \"ending1\",\n",
|
||
" \"ending2\",\n",
|
||
" \"ending3\",\n",
|
||
" \"gold-source\",\n",
|
||
" \"video-id\",\n",
|
||
" \"startphrase\",\n",
|
||
" \"fold-ind\",\n",
|
||
" ] # specify the column names of the input sentences\n",
|
||
"label_key = \"label\" # specify the column name of the label\n",
|
||
"\n",
|
||
"X_train, y_train = train_dataset[custom_sent_keys], train_dataset[label_key]\n",
|
||
"X_val, y_val = dev_dataset[custom_sent_keys], dev_dataset[label_key]\n",
|
||
"X_test = test_dataset[custom_sent_keys]\n",
|
||
"\n",
|
||
"print(len(X_train))"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": null,
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/",
|
||
"height": 53
|
||
},
|
||
"id": "19m2ZpRGTjhJ",
|
||
"outputId": "65a82458-dfd0-4e90-d0ac-fdbd231822f1"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"data": {
|
||
"application/vnd.google.colaboratory.intrinsic+json": {
|
||
"type": "string"
|
||
},
|
||
"text/plain": [
|
||
"'Members of the procession walk down the street holding small horn brass instruments.'"
|
||
]
|
||
},
|
||
"execution_count": 29,
|
||
"metadata": {},
|
||
"output_type": "execute_result"
|
||
}
|
||
],
|
||
"source": [
|
||
"train_dataset.iloc[0][\"sent1\"]"
|
||
]
|
||
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": null,
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/"
|
||
},
|
||
"id": "uvNeyzFsTjhJ",
|
||
"outputId": "b423df4f-a056-4abd-cece-ac653ea639e2"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"== Status ==<br>Current time: 2022-08-20 09:44:29 (running for 00:30:25.10)<br>Memory usage on this node: 23.7/376.6 GiB<br>Using FIFO scheduling algorithm.<br>Resources requested: 0/4 CPUs, 0/4 GPUs, 0.0/252.27 GiB heap, 0.0/112.11 GiB objects (0.0/1.0 accelerator_type:V100)<br>Current best trial: a6161fe9 with val_loss=0.2717684694591622 and parameters={'learning_rate': 1.0471607729914847e-05, 'num_train_epochs': 4, 'per_device_train_batch_size': 16, 'seed': 21, 'global_max_steps': 9223372036854775807, 'learner': 'transformer', 'FLAML_sample_size': 10000}<br>Result logdir: /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-14-03<br>Number of trials: 12/1000000 (12 TERMINATED)<br><br>"
|
||
],
|
||
"text/plain": [
|
||
"<IPython.core.display.HTML object>"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"name": "stderr",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"\u001b[2m\u001b[36m(train pid=9840)\u001b[0m The following columns in the test set don't have a corresponding argument in `BertForMultipleChoice.forward` and have been ignored: sent1, ending3, video-id, startphrase, fold-ind, ending0, ending1, ending2, gold-source, sent2. If sent1, ending3, video-id, startphrase, fold-ind, ending0, ending1, ending2, gold-source, sent2 are not expected by `BertForMultipleChoice.forward`, you can safely ignore this message.\n",
|
||
"\u001b[2m\u001b[36m(train pid=9840)\u001b[0m ***** Running Prediction *****\n",
|
||
"\u001b[2m\u001b[36m(train pid=9840)\u001b[0m Num examples = 20006\n",
|
||
"\u001b[2m\u001b[36m(train pid=9840)\u001b[0m Batch size = 1\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"\u001b[2m\u001b[36m(train pid=10183)\u001b[0m {'eval_loss': 0.6692697405815125, 'eval_automl_metric': 0.25667299810056987, 'eval_runtime': 158.3379, 'eval_samples_per_second': 126.35, 'eval_steps_per_second': 126.35, 'epoch': 0.93}\n",
|
||
"\u001b[2m\u001b[36m(train pid=10015)\u001b[0m {'eval_loss': 0.8068006634712219, 'eval_automl_metric': 0.301259622113366, 'eval_runtime': 166.133, 'eval_samples_per_second': 120.422, 'eval_steps_per_second': 120.422, 'epoch': 1.02}\n",
|
||
"\u001b[2m\u001b[36m(train pid=10183)\u001b[0m {'train_runtime': 444.8991, 'train_samples_per_second': 89.908, 'train_steps_per_second': 11.239, 'train_loss': 0.9226323432562794, 'epoch': 0.93}\n",
|
||
"\u001b[2m\u001b[36m(train pid=10015)\u001b[0m {'train_runtime': 462.8118, 'train_samples_per_second': 108.035, 'train_steps_per_second': 6.752, 'train_loss': 1.0871613750307578, 'epoch': 1.02}\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stderr",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"\u001b[2m\u001b[36m(train pid=10183)\u001b[0m The following columns in the test set don't have a corresponding argument in `BertForMultipleChoice.forward` and have been ignored: ending2, ending1, sent1, gold-source, ending0, fold-ind, video-id, ending3, sent2, startphrase. If ending2, ending1, sent1, gold-source, ending0, fold-ind, video-id, ending3, sent2, startphrase are not expected by `BertForMultipleChoice.forward`, you can safely ignore this message.\n",
|
||
"\u001b[2m\u001b[36m(train pid=10183)\u001b[0m ***** Running Prediction *****\n",
|
||
"\u001b[2m\u001b[36m(train pid=10183)\u001b[0m Num examples = 20006\n",
|
||
"\u001b[2m\u001b[36m(train pid=10183)\u001b[0m Batch size = 1\n",
|
||
"\u001b[2m\u001b[36m(train pid=10015)\u001b[0m The following columns in the test set don't have a corresponding argument in `BertForMultipleChoice.forward` and have been ignored: ending1, ending0, ending3, gold-source, video-id, fold-ind, startphrase, ending2, sent1, sent2. If ending1, ending0, ending3, gold-source, video-id, fold-ind, startphrase, ending2, sent1, sent2 are not expected by `BertForMultipleChoice.forward`, you can safely ignore this message.\n",
|
||
"\u001b[2m\u001b[36m(train pid=10015)\u001b[0m ***** Running Prediction *****\n",
|
||
"\u001b[2m\u001b[36m(train pid=10015)\u001b[0m Num examples = 20006\n",
|
||
"\u001b[2m\u001b[36m(train pid=10015)\u001b[0m Batch size = 1\n",
|
||
"\u001b[2m\u001b[36m(train pid=9672)\u001b[0m Didn't find file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-14-03/train_259ba87c_9_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train_e_2022-08-20_09-35-49/checkpoint-630/added_tokens.json. We won't load it.\n",
|
||
"\u001b[2m\u001b[36m(train pid=9672)\u001b[0m loading file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-14-03/train_259ba87c_9_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train_e_2022-08-20_09-35-49/checkpoint-630/vocab.txt\n",
|
||
"\u001b[2m\u001b[36m(train pid=9672)\u001b[0m loading file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-14-03/train_259ba87c_9_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train_e_2022-08-20_09-35-49/checkpoint-630/tokenizer.json\n",
|
||
"\u001b[2m\u001b[36m(train pid=9672)\u001b[0m loading file None\n",
|
||
"\u001b[2m\u001b[36m(train pid=9672)\u001b[0m loading file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-14-03/train_259ba87c_9_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train_e_2022-08-20_09-35-49/checkpoint-630/special_tokens_map.json\n",
|
||
"\u001b[2m\u001b[36m(train pid=9672)\u001b[0m loading file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-14-03/train_259ba87c_9_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train_e_2022-08-20_09-35-49/checkpoint-630/tokenizer_config.json\n",
|
||
"\u001b[2m\u001b[36m(train pid=9840)\u001b[0m Didn't find file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-14-03/train_31f5f7b3_10_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train__2022-08-20_09-36-11/checkpoint-612/added_tokens.json. We won't load it.\n",
|
||
"\u001b[2m\u001b[36m(train pid=9840)\u001b[0m loading file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-14-03/train_31f5f7b3_10_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train__2022-08-20_09-36-11/checkpoint-612/vocab.txt\n",
|
||
"\u001b[2m\u001b[36m(train pid=9840)\u001b[0m loading file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-14-03/train_31f5f7b3_10_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train__2022-08-20_09-36-11/checkpoint-612/tokenizer.json\n",
|
||
"\u001b[2m\u001b[36m(train pid=9840)\u001b[0m loading file None\n",
|
||
"\u001b[2m\u001b[36m(train pid=9840)\u001b[0m loading file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-14-03/train_31f5f7b3_10_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train__2022-08-20_09-36-11/checkpoint-612/special_tokens_map.json\n",
|
||
"\u001b[2m\u001b[36m(train pid=9840)\u001b[0m loading file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-14-03/train_31f5f7b3_10_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train__2022-08-20_09-36-11/checkpoint-612/tokenizer_config.json\n",
|
||
"\u001b[2m\u001b[36m(train pid=10183)\u001b[0m Didn't find file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-14-03/train_46c248b3_12_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train__2022-08-20_09-36-46/checkpoint-1167/added_tokens.json. We won't load it.\n",
|
||
"\u001b[2m\u001b[36m(train pid=10183)\u001b[0m loading file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-14-03/train_46c248b3_12_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train__2022-08-20_09-36-46/checkpoint-1167/vocab.txt\n",
|
||
"\u001b[2m\u001b[36m(train pid=10183)\u001b[0m loading file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-14-03/train_46c248b3_12_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train__2022-08-20_09-36-46/checkpoint-1167/tokenizer.json\n",
|
||
"\u001b[2m\u001b[36m(train pid=10183)\u001b[0m loading file None\n",
|
||
"\u001b[2m\u001b[36m(train pid=10183)\u001b[0m loading file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-14-03/train_46c248b3_12_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train__2022-08-20_09-36-46/checkpoint-1167/special_tokens_map.json\n",
|
||
"\u001b[2m\u001b[36m(train pid=10183)\u001b[0m loading file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-14-03/train_46c248b3_12_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train__2022-08-20_09-36-46/checkpoint-1167/tokenizer_config.json\n",
|
||
"\u001b[2m\u001b[36m(train pid=10015)\u001b[0m Didn't find file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-14-03/train_3f1c7a06_11_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train__2022-08-20_09-36-32/checkpoint-635/added_tokens.json. We won't load it.\n",
|
||
"\u001b[2m\u001b[36m(train pid=10015)\u001b[0m loading file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-14-03/train_3f1c7a06_11_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train__2022-08-20_09-36-32/checkpoint-635/vocab.txt\n",
|
||
"\u001b[2m\u001b[36m(train pid=10015)\u001b[0m loading file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-14-03/train_3f1c7a06_11_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train__2022-08-20_09-36-32/checkpoint-635/tokenizer.json\n",
|
||
"\u001b[2m\u001b[36m(train pid=10015)\u001b[0m loading file None\n",
|
||
"\u001b[2m\u001b[36m(train pid=10015)\u001b[0m loading file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-14-03/train_3f1c7a06_11_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train__2022-08-20_09-36-32/checkpoint-635/special_tokens_map.json\n",
|
||
"\u001b[2m\u001b[36m(train pid=10015)\u001b[0m loading file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-14-03/train_3f1c7a06_11_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train__2022-08-20_09-36-32/checkpoint-635/tokenizer_config.json\n",
|
||
"2022-08-20 09:47:40,258\tINFO tune.py:747 -- Total run time: 2017.12 seconds (1805.20 seconds for the tuning loop).\n",
|
||
"[flaml.automl: 08-20 09:47:52] {3322} INFO - selected model: None\n",
|
||
"/data/installation/anaconda3/envs/tmp/lib/python3.8/site-packages/transformers/optimization.py:306: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning\n",
|
||
" warnings.warn(\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"{'loss': 1.0426, 'learning_rate': 1.0186867471868435e-05, 'epoch': 0.11}\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stderr",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"[flaml.automl: 08-20 09:51:13] {3465} INFO - retrain transformer for 200.9s\n",
|
||
"[flaml.automl: 08-20 09:51:13] {3472} INFO - retrained model: None\n",
|
||
"[flaml.automl: 08-20 09:51:13] {2749} INFO - fit succeeded\n",
|
||
"[flaml.automl: 08-20 09:51:13] {2750} INFO - Time taken to find the best model: 1323.6405737400055\n",
|
||
"[flaml.automl: 08-20 09:51:13] {2761} WARNING - Time taken to find the best model is 74% of the provided time budget and not all estimators' hyperparameter search converged. Consider increasing the time budget.\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"{'train_runtime': 128.202, 'train_samples_per_second': 2294.692, 'train_steps_per_second': 143.43, 'train_loss': 1.0108715354543423, 'epoch': 0.14}\n"
|
||
]
|
||
}
|
||
],
|
||
"source": [
|
||
"''' import AutoML class from flaml package '''\n",
|
||
"from flaml import AutoML\n",
|
||
"automl = AutoML()\n",
|
||
"\n",
|
||
"import ray\n",
|
||
"\n",
|
||
"if ray.is_initialized() == False:\n",
|
||
" ray.init(num_gpus=4, num_cpus=4)\n",
|
||
"\n",
|
||
"automl_settings = {\n",
|
||
" \"time_budget\": 1800, # setting the time budget\n",
|
||
" \"task\": \"multichoice-classification\", # setting the task as multiplechoice-classification\n",
|
||
" \"fit_kwargs_by_estimator\": { # if model_path is not set, the default model is facebook/muppet-roberta-base: https://huggingface.co/facebook/muppet-roberta-base\n",
|
||
" \"transformer\": {\n",
|
||
" \"output_dir\": \"data/output/\", # setting the output directory\n",
|
||
" \"model_path\": \"bert-base-uncased\", # the batch size for validation (inference)\n",
|
||
" }\n",
|
||
" },\n",
|
||
" \"gpu_per_trial\": 1, # set to 0 if no GPU is available\n",
|
||
" \"log_file_name\": \"seqclass.log\", # set the file to save the log for HPO\n",
|
||
" \"log_type\": \"all\", # the log type for trials: \"all\" if logging all the trials, \"better\" if only keeping the better trials\n",
|
||
" \"use_ray\": {\"local_dir\": \"data/output/\"}, # set whether to use Ray\n",
|
||
" \"n_concurrent_trials\": 4\n",
|
||
"}\n",
|
||
"\n",
|
||
"'''The main flaml automl API'''\n",
|
||
"automl.fit(X_train=X_train, y_train=y_train, X_val=X_val, y_val=y_val, **automl_settings)"
|
||
]
|
||
},
|
||
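{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a usage sketch (not run as part of the original experiments), the fitted `automl` object can make predictions on the held-out examples; `X_test` is the SWAG test split loaded earlier, and the exact output format should be double-checked against FLAML's documentation."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Minimal usage sketch (assumes the fit above has completed):\n",
"# for multichoice-classification, predict returns the index of the chosen ending for each row.\n",
"y_pred = automl.predict(X_test)\n",
"print(y_pred[:5])"
]
},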
{
|
||
"cell_type": "code",
|
||
"execution_count": null,
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/",
|
||
"height": 350
|
||
},
|
||
"id": "kh7ZJsIKTjhJ",
|
||
"outputId": "36fd683c-0792-4c26-ecc3-9e981b791b39"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"{'Current Learner': 'transformer', 'Current Sample': 10000, 'Current Hyper-parameters': {'learning_rate': 5.316409886511772e-06, 'num_train_epochs': 1, 'per_device_train_batch_size': 64, 'seed': 26, 'global_max_steps': 152, 'learner': 'transformer', 'FLAML_sample_size': 10000}, 'Best Learner': 'transformer', 'Best Hyper-parameters': {'learning_rate': 5.316409886511772e-06, 'num_train_epochs': 1, 'per_device_train_batch_size': 64, 'seed': 26, 'global_max_steps': 152, 'learner': 'transformer', 'FLAML_sample_size': 10000}}\n",
|
||
"{'Current Learner': 'transformer', 'Current Sample': 10000, 'Current Hyper-parameters': {'learning_rate': 9.999999999999999e-06, 'num_train_epochs': 3, 'per_device_train_batch_size': 32, 'seed': 20, 'global_max_steps': 322, 'learner': 'transformer', 'FLAML_sample_size': 10000}, 'Best Learner': 'transformer', 'Best Hyper-parameters': {'learning_rate': 9.999999999999999e-06, 'num_train_epochs': 3, 'per_device_train_batch_size': 32, 'seed': 20, 'global_max_steps': 322, 'learner': 'transformer', 'FLAML_sample_size': 10000}}\n",
|
||
"{'Current Learner': 'transformer', 'Current Sample': 10000, 'Current Hyper-parameters': {'learning_rate': 1.5662610420278344e-06, 'num_train_epochs': 1, 'per_device_train_batch_size': 64, 'seed': 6, 'global_max_steps': 151, 'learner': 'transformer', 'FLAML_sample_size': 10000}, 'Best Learner': 'transformer', 'Best Hyper-parameters': {'learning_rate': 9.999999999999999e-06, 'num_train_epochs': 3, 'per_device_train_batch_size': 32, 'seed': 20, 'global_max_steps': 322, 'learner': 'transformer', 'FLAML_sample_size': 10000}}\n",
|
||
"{'Current Learner': 'transformer', 'Current Sample': 10000, 'Current Hyper-parameters': {'learning_rate': 5.461587558683657e-06, 'num_train_epochs': 1, 'per_device_train_batch_size': 64, 'seed': 25, 'global_max_steps': 157, 'learner': 'transformer', 'FLAML_sample_size': 10000}, 'Best Learner': 'transformer', 'Best Hyper-parameters': {'learning_rate': 9.999999999999999e-06, 'num_train_epochs': 3, 'per_device_train_batch_size': 32, 'seed': 20, 'global_max_steps': 322, 'learner': 'transformer', 'FLAML_sample_size': 10000}}\n",
|
||
"{'Current Learner': 'transformer', 'Current Sample': 10000, 'Current Hyper-parameters': {'learning_rate': 5.163225512301641e-06, 'num_train_epochs': 1, 'per_device_train_batch_size': 64, 'seed': 20, 'global_max_steps': 152, 'learner': 'transformer', 'FLAML_sample_size': 10000}, 'Best Learner': 'transformer', 'Best Hyper-parameters': {'learning_rate': 9.999999999999999e-06, 'num_train_epochs': 3, 'per_device_train_batch_size': 32, 'seed': 20, 'global_max_steps': 322, 'learner': 'transformer', 'FLAML_sample_size': 10000}}\n",
|
||
"{'Current Learner': 'transformer', 'Current Sample': 10000, 'Current Hyper-parameters': {'learning_rate': 1.0471607729914847e-05, 'num_train_epochs': 4, 'per_device_train_batch_size': 16, 'seed': 21, 'global_max_steps': 629, 'learner': 'transformer', 'FLAML_sample_size': 10000}, 'Best Learner': 'transformer', 'Best Hyper-parameters': {'learning_rate': 1.0471607729914847e-05, 'num_train_epochs': 4, 'per_device_train_batch_size': 16, 'seed': 21, 'global_max_steps': 629, 'learner': 'transformer', 'FLAML_sample_size': 10000}}\n",
|
||
"{'Current Learner': 'transformer', 'Current Sample': 10000, 'Current Hyper-parameters': {'learning_rate': 9.54963197430746e-06, 'num_train_epochs': 2, 'per_device_train_batch_size': 64, 'seed': 19, 'global_max_steps': 154, 'learner': 'transformer', 'FLAML_sample_size': 10000}, 'Best Learner': 'transformer', 'Best Hyper-parameters': {'learning_rate': 1.0471607729914847e-05, 'num_train_epochs': 4, 'per_device_train_batch_size': 16, 'seed': 21, 'global_max_steps': 629, 'learner': 'transformer', 'FLAML_sample_size': 10000}}\n",
|
||
"{'Current Learner': 'transformer', 'Current Sample': 10000, 'Current Hyper-parameters': {'learning_rate': 7.798079645313044e-06, 'num_train_epochs': 2, 'per_device_train_batch_size': 64, 'seed': 15, 'global_max_steps': 155, 'learner': 'transformer', 'FLAML_sample_size': 10000}, 'Best Learner': 'transformer', 'Best Hyper-parameters': {'learning_rate': 1.0471607729914847e-05, 'num_train_epochs': 4, 'per_device_train_batch_size': 16, 'seed': 21, 'global_max_steps': 629, 'learner': 'transformer', 'FLAML_sample_size': 10000}}\n",
|
||
"8\n"
|
||
]
|
||
},
|
||
{
|
||
"data": {
|
||
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAY4AAAEWCAYAAABxMXBSAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjQuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/MnkTPAAAACXBIWXMAAAsTAAALEwEAmpwYAAAlcElEQVR4nO3de5xdVX338c/XIYHBGiaBwSbDJfEhBEFsAiNeqBdQSPRRMiJi6EXACqUWbeXVSFIqUngo2LTl0aepPGARUW6SQogaHGlBtFwzmEguGggBIZMo4RJAGMiFX//Y64Sdk5nJ3smcmTMz3/frdV6z99prr/M7k8vvrLX2XlsRgZmZWVFvGOgAzMxscHHiMDOzUpw4zMysFCcOMzMrxYnDzMxKceIwM7NSnDjM+pCk90paOdBxmNWSE4cNGZIel/ShgYwhIn4WEZNq1b6kqZJ+KulFSesl3SXphFq9n1l3nDjMSpDUMIDvfRJwE3ANsB/wZuB84GM70ZYk+d+/7RT/xbEhT9IbJM2S9KikZyR9T9KY3PGbJP1G0vPp2/xhuWNXS/qGpIWSXgKOST2bv5H0UDrnRkl7pPofkLQmd36PddPxL0laJ2mtpM9KCkkHdfMZBPwLcFFEfDMino+I1yLirog4I9W5QNJ3c+eMT+3tlvZ/IuliSXcDLwMzJXVUvc8XJS1I27tL+idJT0j6raTLJTXu4h+HDQFOHDYcfB5oA94PjAOeA+bmjt8GTAT2BX4OXFt1/h8BFwNvAv47lZ0MTAMmAG8HTuvl/butK2kacA7wIeAg4AO9tDEJ2B+Y10udIv4UOJPss1wOTJI0MXf8j4Dr0valwMHA5BRfC1kPx4Y5Jw4bDs4CzouINRHxKnABcFLlm3hEXBURL+aO/YGkvXLn3xoRd6dv+K+ksq9HxNqIeBb4Ptl/rj3pqe7JwLciYnlEvJzeuyd7p5/rin3kHl2d3m9zRDwP3AqcApASyCHAgtTDORP4YkQ8GxEvAv8AzNjF97chwInDhoMDgVskbZC0AfglsAV4s6QGSZemYawXgMfTOfvkzn+ymzZ/k9t+Gfi9Xt6/p7rjqtru7n0qnkk/x/ZSp4jq97iOlDjIehvzUxJrBvYEHsz93n6Uym2Yc+Kw4eBJ4MMR0ZR77RERnWT/WU4nGy7aCxifzlHu/FotIb2ObJK7Yv9e6q4k+xyf6KXOS2T/2Vf8fjd1qj/L7UCzpMlkCaQyTPU00AUclvud7RURvSVIGyacOGyoGSFpj9xrN7Kx/IslHQggqVnS9FT/TcCrZN/o9yQbjukv3wNOl/RWSXsCX+6pYmTPPzgH+LKk0yWNSpP+fyjpilRtCfA+SQekobbZOwogIjaRXak1BxhDlkiIiNeAK4HLJO0LIKlF0tSd/bA2dDhx2FCzkOybcuV1AfA1YAHwY0kvAvcB70z1rwF+DXQCK9KxfhERtwFfB+4EVuXe+9Ue6s8DPgV8BlgL/Bb4P2TzFETE7cCNwEPAg8APCoZyHVmP66aI2JwrP7cSVxrG+0+ySXob5uQHOZnVB0lvBZYBu1f9B25WV9zjMBtAkj6e7pcYDXwV+L6ThtU7Jw6zgfXnwFPAo2RXev3FwIZjtmMeqjIzs1Lc4zAzs1J2G+gA+sM+++wT48ePH+gwzMwGlQcffPDpiNjups9hkTjGjx9PR0fHjiuamdlWkn7dXbmHqszMrBQnDjMzK8WJw8zMSnHiMDOzUpw4zMyslGFxVZWZ2WA0f3Enc9pXsnZDF+OaGpk5dRJtU1oGOiwnDjOzejR/cSezb15K16YtAHRu6GL2zUsBBjx5OHGYmdWhOe0rtyaNiq5NW/jSvIe4/oEnCrVx6LhRfOVjh/V5bJ7jMDOrQ2s3dHVbvnHLa/0cyfZq2uOQNI3sIToNwDcj4tKq45cBx6TdPYF9I6IpPcbyG8AoshVDL46IG9M5VwPvB55P550WEUtq+TnMzPrbuKZGOrtJHi1Njdz45+8egIheV7Meh6QGYC7wYeBQ4BRJh+brRMQXI2JyREwG/h9wczr0MvDpiDgMmAb8X0lNuVNnVs5z0jCzoWjm1Ek0jmjYpqxxRAMzpw78QxhrOVR1FLAqIlZHxEbgBmB6L/VPAa4HiIiHI+KRtL2W7HkF2y20ZWY2VLVNaeGSEw9nZEP233RLUyOXnHj4gE+MQ20TRwvwZG5/TSrbjqQDgQnAHd0cOwoYSfagm4qLJT0k6TJJu/fQ5pmSOiR1rF+/fmc/g5nZgGmb0sKUA5p454Qx3D3r2LpIGlA/k+MzgHkRsc0lBJLGAt8BTo+IyozQbOAQ4B3AGODc7hqMiCsiojUiWpub3VkxM+srtUwcncD+uf39Ull3ZpCGqSokjQJ+CJwXEfdVyiNiXWReBb5FNiRmZmb9pJaJYxEwUdIESSPJksOC6kqSDgFGA/fmykYCtwDXRMS8qvpj008BbcCyWn0AMzPbXs0ux42IzZLOBtrJLse9KiKWS7oQ6IiIShKZAdwQ2z78/GTgfcDekk5LZZXLbq+V1AwIWAKcVavPYGZm26vpfRwRsRBYWFV2ftX+Bd2c913guz20eWwfhmhmZiV5yZGS6nXRMTOz/uLEUUI9LzpmZtZfnDhK6ItFx8zMylix7gUOHTtqoMPYRr3cxzEo1POiY2Y2NB06dhTTJ9fXiIZ7HCXU86JjZmb9xT2OEup50TEzs/7iHkcJlQnwL817iI1bXqPFV1WZ2TDkxFFS25SWrRPhHp4ys+HIQ1VmZlaKE4eZmZXixGFmZqU4cZiZWSlOHGZmVooTh5mZleLEYWZmpThxmJlZKU4cZmZWihOHmZmV4sRhZmalOHGYmVkpNU0ckqZJWilplaRZ3Ry/TNKS9HpY0obcsVMlPZJep+bKj5S0NLX5dUmq5WcwM7Nt1Wx1XEkNwFzgOGANsEjSgohYUakTEV/M1f88MCVtjwG+ArQCATyYzn0O+AZwBnA/sBCYBtxWq89hZmbbqmWP4yhgVUSsjoiNwA3A9F7qnwJcn7anArdHxLMpWdwOTJM0FhgVEfdFRADXAG01+wRmZradWiaOFuDJ3P6aVLYdSQcCE4A7dnBuS9ou0uaZkjokdaxfv36nPoCZmW2vXibHZwDzImJLXzUYEVdERGtEtDY3N/dVs2Zmw14tE0cnsH9uf79U1p0ZvD5M1du5nWm7SJtmZlYDtUwci4CJkiZIGkmWHBZUV5J0CDAauDdX3A4cL2m0pNHA8UB7RKwDXpD0rnQ11aeBW2v4GczMrErNrqqKiM2SziZLAg3AVRGxXNKFQEdEVJLIDOCGNNldOfdZSReRJR+ACyPi2bT9OeBqoJHsaipfUWVm1o9qljgAImIh2SWz+bLzq/Yv6OHcq4CruinvAN7Wd1GamVkZ9TI5bmZmg4QTh5mZleLEYWZmpThxmJlZKU4cZmZWihOHmZmV4sRhZmalOHGYmVkpNb0B0MzM+sb8xZ3MaV/J2g1djGtqZObUSbRN6XZx8Jpz4jAzq3PzF3cy++aldG3KFhDv3NDF7JuXAgxI8
vBQlZlZnZvTvnJr0qjo2rSFOe0rByQeJw4zszq3dkNXqfJa22HikLR3fwRiZmbdG9fUWKq81or0OO6TdJOkj6RnYJiZWT+aOXUSjSMatilrHNHAzKmTBiSeIonjYOAK4E+BRyT9g6SDaxuWmZlVtE1p4ZITD6elqREBLU2NXHLi4fV7VVV6wNLtwO2SjgG+C3xO0i+AWRFxb68NmJnZLmub0jJgiaLaDhNHmuP4E7Iex2+Bz5M9AnYycBMwoYbxmZlZnSlyH8e9wHeAtohYkyvvkHR5bcIyM7N6VSRxTMo/DzwvIr7ax/GYmVmdKzI5/mNJTZUdSaMltdcuJDMzq2dFEkdzRGyo7ETEc8C+RRqXNE3SSkmrJM3qoc7JklZIWi7pulR2jKQludcrktrSsaslPZY7NrlILGZm1jeKDFVtkXRARDwBIOlAoNuhqzxJDcBc4DhgDbBI0oKIWJGrMxGYDRwdEc9J2hcgIu4km3xH0hhgFfDjXPMzI2JegdjNzKyPFUkc5wH/LekuQMB7gTMLnHcUsCoiVgNIugGYDqzI1TkDmJt6MUTEU920cxJwW0S8XOA9zcysxnY4VBURPwKOAG4EbgCOjIgicxwtwJO5/TWpLO9g4GBJd0u6T9K0btqZAVxfVXaxpIckXSZp9wKxmJlZHym6yOEW4CngBeBQSe/ro/ffDZgIfAA4BbiyaiJ+LHA4kE9Us4FDgHcAY4Bzu2tY0pmSOiR1rF+/vo/CNTOzIoscfhb4Kdl/3n+ffl5QoO1OYP/c/n6pLG8NsCAiNkXEY8DDZImk4mTglojYVCmIiHWReRX4FtmQ2HYi4oqIaI2I1ubm5gLhmplZEUV6HH9F9u3+1xFxDDAF2FDgvEXAREkTJI0kG3JaUFVnPllvA0n7kA1drc4dP4WqYarUCyEtuNgGLCsQi5mZ9ZEik+OvRMQrkpC0e0T8StIOl2SMiM2SzibroTQAV0XEckkXAh0RsSAdO17SCrLhsJkR8QyApPFkPZa7qpq+VlIz2UT9EuCsQp/UzMz6RJHEsSbNO8wnW+jwOeDXRRqPiIXAwqqy83PbAZyTXtXnPs72k+lExLFF3tvMzGqjyOq4H0+bF0i6E9gL+FFNozIzs7rVa+JIN/Etj4hDACKietjIzMyGmV4nxyNiC7BS0gH9FI+ZmdW5InMco4Hlkh4AXqoURsQJNYvKzMzqVpHE8eWaR2FmZoNGkclxz2uYmdlWRR4d+yKvr4Y7EhgBvBQRo2oZmJmZ1aciPY43VbbT3drTgXfVMigzM6tfRRc5BLIb9iJiPjC1NuGYmVm9KzJUdWJu9w1AK/BKzSIyM7O6VuSqqo/ltjcDj5MNV5mZ2TBUZI7j9P4IxMzMBociz+P4dtXDlUZLuqqmUZmZ2U6bv7iToy+9gwmzfsjRl97B/MXVj0LaNUWGqt4eERsqOxHxnKQpfRqFmZn1ifmLO5l981K6Nm0BoHNDF7NvXgpA25TtFhzfKUWuqnqDpNGVHUljKJZwzMysn81pX7k1aVR0bdrCnPaVffYeRRLAPwP3Srop7X8SuLjPIjAzsz6zdkNXqfKdUWRy/BpJHUDlAUonRsSKPovAzMz6zLimRjq7SRLjmhr77D2KTI6/C3gyIv41Iv6V7ImA7+yzCMzMrM/MnDqJxhEN25Q1jmhg5tQdPvG7sCJzHN8Afpfb/10qMzOzOtM2pYVLTjyclqZGBLQ0NXLJiYf32cQ4FJvjUHo2OAAR8ZokT46bmdWptiktfZooqhXpcayW9AVJI9Lrr4DVNYvIzMzqWpHEcRbwHqATWAO8EzijSOOSpklaKWmVpFk91DlZ0gpJyyVdlyvfImlJei3IlU+QdH9q80ZJI4vEYmZmfaPIVVVPATMq+5IagY8CN/V4UlavAZgLHEeWcBZJWpC/IkvSRGA2cHS6sXDfXBNdETG5m6a/ClwWETdIuhz4MzznYmbWbwotqy6pQdJHJH0HeAz4VIHTjgJWRcTqiNgI3MD2iyOeAcyNiOdga5LqLQ6RXRY8LxV9G2gr8hnMzKxv9Jo4JL1f0v8nWxH3z8h6D2+JiJMKtN0CPJnbX5PK8g4GDpZ0t6T7JE3LHdtDUkcqb0tlewMbImJzL21WYj8znd+xfv36AuGamVkRPQ5VSVoDPEE2DPQ3EfGipMci4uU+fv+JwAeA/YCfSjo8rY11YER0SnoLcIekpcDzRRuOiCuAKwBaW1tjB9XNzKyg3noc84BxZMNSH5P0Rl5/9ngRncD+uf39UlneGmBBRGyKiMeAh8kSCRHRmX6uBn4CTAGeAZpylwN316aZmdVQj4kjIv4amEC2VtUHgJVAc7oK6vcKtL0ImJiughpJNsG+oKrO/NQ2kvYhG7panZZu3z1XfjSwIt1PcidQGSo7Fbi1QCxmZtZHep3jSM8YvzMiziRLIqeQTXA/vqOG0zzE2UA78EvgexGxXNKFkk5I1dqBZyStIEsIMyPiGeCtQIekX6TyS3NXY50LnCNpFdmcx7+X+sRmZrZLCt8BHhGbgB8AP0iX5BY5ZyGwsKrs/Nx2AOekV77OPcDhPbS5muyKLTMzGwCFLsetFhF9tz6vmZkNKjuVOMzMbPhy4jAzs1J2OMch6WBgJnBgvn5EHNvjSWZmNmQVmRy/CbgcuBLYsoO6Zma2A/MXdzKnfSVrN3QxrqmRmVMn1XQZ9L5WJHFsjggvImhm1gfmL+5k9s1L6dqUfQ/v3NDF7JuXAgya5FFkjuP7kj4naaykMZVXzSMzMxuC5rSv3Jo0Kro2bWFO+8oBiqi8Ij2OU9PPmbmyAN7S9+GYmQ1tazd0fzdDT+X1qMjzOCb0RyBmZsPBuKZGOrtJEuOaCt1XXRd2OFSVHhf7BUnz0utsSSP6Izgzs6Fm5tRJNI5o2KascUQDM6dOGqCIyisyVPUNYATwb2n/T1PZZ2sVlJnZUFWZAB/qV1W9IyL+ILd/R1p80MzMdkLblJZBlSiqFbmqaouk/1XZSQ9W8v0cZmbDVJEex0zgTkmrAZHdQX56TaMyM7O6VeSqqv+SNBGozNysjIhXaxuWmZnVq96eOX5sRNwh6cSqQwdJIiJurnFsZmZWh3rrcbwfuAP4WDfHAnDiMDMbhnpMHBHxlbR5YUQ8lj8myTcFmpkNU0Umx/8DOKKqbB5wZN+HMzwM9pUxzWx4622O4xDgMGCvqnmOUcAetQ5sqBoKK2Oa2fDW230ck4CPAk1k8xyV1xHAGUUalzRN0kpJqyTN6qHOyZJWSFou6bpUNlnSvansIUmfytW/WtJjkpak1+QisdSLobAyppkNb73NcdwK3Crp3RFxb9mGJTUAc4HjgDXAIkkLImJFrs5EYDZwdEQ8J2nfdOhl4NMR8YikccCDktojYkM6PjMi5pWNqR4MhZUxzWx4KzLHsVjSX5INW20dooqIz+zgvKOAVRGxGkDSDcB0YEWuzhnA3Ih4LrX5VPr5cO591kp6CmgGNhSIt64NhZUxzWx4K7LkyHeA3wemAncB+wEvFjivBXgyt78mleUdDBws6W5J90maVt2IpKOA
kcCjueKL0xDWZZJ27+7NJZ0pqUNSx/r16wuE2z+GwsqYZja8FUkcB0XEl4GXIuLbwP8G3tlH778bMBH4AHAKcKWkpspBSWPJEtfpEfFaKp4NHAK8AxgDnNtdwxFxRUS0RkRrc3NzH4W769qmtHDJiYfT0tSIgJamRi458XBPjJvZoFFkqGpT+rlB0tuA3wD79lK/ohPYP7e/XyrLWwPcHxGbgMckPUyWSBZJGgX8EDgvIu6rnBAR69Lmq5K+BfxNgVjqymBfGdPMhrciPY4rJI0GvgwsIJuj+McC5y0CJkqaIGkkMCOdnzefrLeBpH3Ihq5Wp/q3ANdUT4KnXgiSBLQBywrEYmZmfaTIIoffTJt3UeI54xGxWdLZQDvQAFwVEcslXQh0RMSCdOx4SSvIlmqfGRHPSPoT4H3A3pJOS02eFhFLgGslNZOt1LsEOKtoTGX4Jj0zs+71dgPgOb2dGBH/sqPGI2IhsLCq7PzcdgDnpFe+zneB7/bQ5rE7et9d5Zv0zMx61luP403p5ySyiejKMNPHgAdqGdRA6+kmvS/Ne4jrH3iCFete4NCxowYoOjOzgdXbDYB/DyDpp8AREfFi2r+AbNJ6yOrpZryNW7ILuw4dO4rpk93zMLPhqchVVW8GNub2N6ayIaunm/Ramhq58c/fPQARmZnVjyJXVV0DPCDpgtTbuB+4upZBDTTfpGdm1rMiV1VdLOk24L2p6PSIWFzbsAZWZQLcV1WZmW2vt6uqRkXEC5LGAI+nV+XYmIh4tvbhDRzfpGdm1r3eehzXkS2r/iDZo2IrlPYL39NhZmZDR29XVX00/fRjYs3MbKvehqqqHxe7jYj4ed+HY2Zm9a63oap/7uVYADW/g9vMzOpPb0NVx/RnIGZmNjgUuQGQtJz6oWz7BMBrahWUmZnVrx0mDklfIVv6/FCyBQs/DPw32Y2BZmY2zBS5c/wk4IPAbyLidOAPgL1qGpWZmdWtIomjKz22dXN6Kt9TbPtkPzMzG0aKzHF0pOeAX0l2M+DvgHtrGZSZmdWv3u7jmAtcFxGfS0WXS/oRMCoiHuqX6MzMrO701uN4GPin9Izv7wHXD/XFDc3MbMd6nOOIiK9FxLuB9wPPAFdJ+pWkr0g6uN8iNDOzurLDyfGI+HVEfDUipgCnAG3AL2sdmJmZ1acdJg5Ju0n6mKRrgduAlcCJRRqXNE3SSkmrJM3qoc7JklZIWi7pulz5qZIeSa9Tc+VHSlqa2vy6JBWJxczM+kZvk+PHkfUwPgI8ANwAnBkRLxVpWFIDMBc4DlgDLJK0ICJW5OpMBGYDR0fEc5L2TeVjgK8ArWTrYj2Yzn0O+AZwBtmTCBcC08gSmpmZ9YPeehyzgXuAt0bECRFxXdGkkRwFrIqI1RGxkSzxTK+qcwYwNyUEIuKpVD4VuD0ink3HbgempYn6URFxX0QE2d3rbSViMjOzXdTbIoe7uvptC/Bkbn8N8M6qOgcDSLobaAAuiIgf9XBuS3qt6aZ8O5LOBM4EOOCAA3b6Q5iZ2baK3DleS7sBE8nWwjoFuDLdbLjLIuKKiGiNiNbm5ua+aNLMzKht4uhk26VJ9ktleWuABRGxKSIeI7t3ZGIv53am7d7aNDOzGqpl4lgETJQ0QdJIYAawoKrOfLLeBpL2IRu6Wg20A8dLGi1pNHA80B4R64AXJL0rXU31aeDWGn4GMzOrUuh5HDsjIjZLOpssCTQAV0XEckkXAh0RsYDXE8QKYAswMyKeAZB0EVnyAbgwIp5N258DrgYaya6m8hVVZmb9SNnFSUNba2trdHR0DHQYZmaDiqQHI6K1unygJ8fNzGyQceIwM7NSnDjMzKwUJw4zMyvFicPMzEpx4jAzs1KcOMzMrBQnDjMzK8WJw8zMSnHiMDOzUpw4zMysFCcOMzMrxYnDzMxKceIwM7NSnDjMzKwUJw4zMyvFicPMzEpx4jAzs1KcOMzMrBQnDjMzK6WmiUPSNEkrJa2SNKub46dJWi9pSXp9NpUfkytbIukVSW3p2NWSHssdm1zLz2BmZtvarVYNS2oA5gLHAWuARZIWRMSKqqo3RsTZ+YKIuBOYnNoZA6wCfpyrMjMi5tUqdjMz61ktexxHAasiYnVEbARuAKbvRDsnAbdFxMt9Gp2Zme2UWiaOFuDJ3P6aVFbtE5IekjRP0v7dHJ8BXF9VdnE65zJJu/dRvGZmVsBAT45/HxgfEW8Hbge+nT8oaSxwONCeK54NHAK8AxgDnNtdw5LOlNQhqWP9+vW1iN3MbFiqZeLoBPI9iP1S2VYR8UxEvJp2vwkcWdXGycAtEbEpd866yLwKfItsSGw7EXFFRLRGRGtzc/MufhQzM6uoZeJYBEyUNEHSSLIhpwX5CqlHUXEC8MuqNk6hapiqco4kAW3Asr4N28zMelOzq6oiYrOks8mGmRqAqyJiuaQLgY6IWAB8QdIJwGbgWeC0yvmSxpP1WO6qavpaSc2AgCXAWbX6DGZmtj1FxEDHUHOtra3R0dEx0GGYmQ0qkh6MiNbq8oGeHDczs0HGicPMzEpx4jAzs1KcOMzMrBQnDjMzK8WJw8zMSnHiMDOzUpw4zMysFCcOMzMrxYnDzMxKqdlaVcPB/MWdzGlfydoNXYxramTm1Em0TenukSNmZkOHE8dOmr+4k9k3L6Vr0xYAOjd0MfvmpQBOHmY2pHmoaifNaV+5NWlUdG3awpz2lQMUkZlZ/3Di2ElrN3SVKjczGyqcOHbSuKbGUuVmZkOFE8dOmjl1Eo0jGrYpaxzRwMypkwYoIjOz/uHJ8Z1UmQD3VVVmNtw4ceyCtiktThRmNux4qMrMzEpx4jAzs1KcOMzMrBQnDjMzK8WJw8zMSlFEDHQMNSdpPfDrgY6jgH2Apwc6iAIcZ99ynH1vsMRa73EeGBHN1YXDInEMFpI6IqJ1oOPYEcfZtxxn3xsssQ6WOKt5qMrMzEpx4jAzs1KcOOrLFQMdQEGOs285zr43WGIdLHFuw3McZmZWinscZmZWihOHmZmV4sTRTyRNkrQk93pB0l9LGiPpdkmPpJ+jU31J+rqkVZIeknREP8b6RUnLJS2TdL2kPSRNkHR/iudGSSNT3d3T/qp0fHw/xvlXKcblkv46ldXF71PSVZKekrQsV1Y6NkmnpvqPSDq1n+L8ZPqdviaptar+7BTnSklTc+XTUtkqSbP6Kc45kn6Vfme3SGqq0zgvSjEukfRjSeNS+YD9ue+yiPCrn19AA/Ab4EDgH4FZqXwW8NW0/RHgNkDAu4D7+ym2FuAxoDHtfw84Lf2ckcouB/4ibX8OuDxtzwBu7Kc43wYsA/YkezzAfwIH1cvvE3gfcASwLFdWKjZgDLA6/Rydtkf3Q5xvBSYBPwFac+WHAr8AdgcmAI+mv8sNafstwMhU59B+iPN4YLe0/dXc77Pe4hyV2/5C7t/LgP257+rLPY6B8UHg0Yj4NTAd+HYq/zbQlranA9dE5j6gSdLYfopvN6BR0m5k/zGvA44F5vUQZyX+ecAHJakfYnwr2T+0lyNiM3AXcCJ18vuMiJ8Cz1YVl41
tKnB7RDwbEc8BtwPTah1nRPwyIlZ2U306cENEvBoRjwGrgKPSa1VErI6IjcANqW6t4/xx+rMHuA/Yr07jfCG3+0agckXSgP257yonjoExA7g+bb85Ital7d8Ab07bLcCTuXPWpLKaiohO4J+AJ8gSxvPAg8CG3D/SfCxb40zHnwf2rnWcZL2N90raW9KeZN/e9qfOfp9VysZWDzHn1XOcnyH79k4v8QxYnJIulvQk8MfA+fUaZ1FOHP0szQ2cANxUfSyyfuqAXh+dxt2nk3Xxx5F9Q6qrbzuQfSsmG574MfAjYAmwparOgP8+e1LPsQ02ks4DNgPXDnQsPYmI8yJif7IYzx7oeHaVE0f/+zDw84j4bdr/bWXIJP18KpV3kn2DrtgvldXah4DHImJ9RGwCbgaOJutGVx41nI9la5zp+F7AM/0QJxHx7xFxZES8D3gOeJj6+33mlY2tHmLOq7s4JZ0GfBT445SM6SWeevh9Xgt8Im3Xc5y9cuLof6fw+jAVwAKgctXEqcCtufJPpysv3gU8nxvmqKUngHdJ2jPNVXwQWAHcCZzUQ5yV+E8C7sj9A64pSfumnweQzW9cR/39PvPKxtYOHC9pdOoJHp/KBsoCYIayK+kmABOBB4BFwERlV96NJBuKXVDrYCRNA74EnBARL9dxnBNzu9OBX+XiHAx/7tsb6Nn54fQiG/Z5BtgrV7Y38F/AI2RXBo1J5QLmkl0FspTc1S39EOffk/3lXgZ8h+zqlLeQ/eNbRTbMtnuqu0faX5WOv6Uf4/wZWVL7BfDBevp9kn05WAdsIhuj/rOdiY1s7H5Vep3eT3F+PG2/CvwWaM/VPy/FuRL4cK78I2Q9vkeB8/opzlVkcwFL0uvyOo3zP9K/pYeA7wMtA/3nvqsvLzliZmaleKjKzMxKceIwM7NSnDjMzKwUJw4zMyvFicPMzEpx4rBBT9JlSqvjpv12Sd/M7f+zpHN6Of9qSSel7Z9UrwibykdIujStVvpzSfdK+nA69rikfXYi7q3v28PxuWlF1RWSuvT6ysonSVqYXw22r0gaK+kHvRwfKemnuZtBbRhy4rCh4G7gPQCS3gDsAxyWO/4e4J5dfI+LgLHA2yLiCLIFCt+0i232KiL+MiImk9178GhETE6veRHxkYjYUIO3PQe4speYNpLdi/KpGry3DRJOHDYU3AO8O20fRnaz1YvpztvdyVbS/bmk8yUtUvYMjyuKruKbFlE8A/h8RLwKEBG/jYjvdVP3nNT+sqpe0KfTMxd+Iek73Zx3UeqBNBSM6XFJ+0gar+yZFFdLeljStZI+JOnu1Ds6KtV/o7JnRTwgabGknlaF/QTZ2l9IOizVX5Jir9wBPZ9ssT4bptzdtEEvItZK2pyWHnkPcC/ZaqLvJlutd2lEbJT0rxFxIUD6z/ujZHfy7shBwBOx7fLY25F0JHA68E6yu4Lvl3QXsBH4O+A9EfG0pDFV580h672cHjt3R+5BwCfJ7jZeBPwR8Idki2n+LVnv6Dyy5WA+k4a4HpD0nxHxUi6OCcBzleQInAV8LSKuTUt0VJLaMuAdOxGnDRHucdhQcQ9Z0qgkjntz+3enOscoe0rhUrLnixzWXUO74A+BWyLipYj4HdkCke9N73VTRDwNEBH55zV8mWwJmrN2MmlAtijl0oh4DVgO/FdqaykwPtU5HpglaQnZA5r2AA6oamcssD63fy/wt5LOBQ6MiK4U/xZgo6SaDtVZ/XLisKGiMs9xONk34vvIehzvAe6RtAfwb8BJEXE42Tj+HgXbXgUcIGlUn0ed9RCOrO6FlPRqbvu13P5rvD6qIOATuXmSAyJbmj6vi9zvJCKuI+u1dAELJR2bq7s78MouxGyDmBOHDRX3kA09PRsRW9K3+iay5HEPr/+H+LSk3+P1lX53KLKVV/8d+Jpef9Z6s6RPVlX9GdCmbGXhN5ItFvgz4A7gk5L2Tufmk8SPgEuBH9b4G3w78PnKvI6kKd3UeZjXeyhIeguwOiK+TraS79tT+d7A05Etu2/DkBOHDRVLya6muq+q7PmIeDpdgXQlWW+kneybfhl/RzaMs0LSMuAHwDZzHhHxc+BqslWC7we+GRGLI2I5cDFwl6RfAP9Sdd5NKbYFkhpLxlXURcAI4CFJy9P+NtJ8x6OSDkpFJwPL0vDW24BrUvkxwA9rFKcNAl4d18y2kvRx4MiI+Lte6twMzIqIh/svMqsnvqrKzLaKiFsqQ2rdSUN18500hjf3OMzMrBTPcZiZWSlOHGZmVooTh5mZleLEYWZmpThxmJlZKf8DKMTcjGm+IEYAAAAASUVORK5CYII=\n",
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"from flaml.data import get_output_from_log\n",
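"# get_output_from_log parses the HPO log written by automl.fit and returns the time, validation loss, and config history of the trials\n",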
"time_history, best_valid_loss_history, valid_loss_history, config_history, metric_history = \\\n",
" get_output_from_log(filename=automl_settings['log_file_name'], time_budget=3000)\n",
"for config in config_history:\n",
" print(config)\n",
"\n",
"import matplotlib.pyplot as plt\n",
"import numpy as np\n",
"plt.title('Learning Curve')\n",
"plt.xlabel('Wall Clock Time (s)')\n",
"plt.ylabel('Validation Accuracy')\n",
"print(len(valid_loss_history))\n",
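"# the logged validation loss here is 1 - accuracy, so plotting 1 - loss shows the validation accuracy\n",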
"plt.scatter(time_history, 1 - np.array(valid_loss_history))\n",
"plt.step(time_history, 1 - np.array(best_valid_loss_history), where='post')\n",
"plt.show()"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "664qCdihTjhJ"
},
"source": [
"### 4.2 Text Summarization Example"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "kmB4kaF_TjhJ"
},
"source": [
"The text summarization task condenses a long text into a short, usually single-sentence, summary. For example:\n",
"\n",
"- Document: Army explosives experts were called out to deal with a suspect package at the offices on the Newtownards Road on Friday night. Roads were sealed off and traffic diverted as a controlled explosion was carried out. The premises, used by East Belfast MP Naomi Long, have been targeted a number of times. Most recently, petrol bomb attacks were carried out on the offices on consecutive nights in April and May. The attacks began following a Belfast City Council vote in December 2012 restricting the flying of the union flag at the City Hall. Condemning the latest hoax, Alliance MLA Chris Lyttle said: \"It is a serious incident for the local area, it causes serious disruption, it puts people's lives at risk, it can prevent emergency services reaching the area. \"Ultimately we need people with information to share that with the police in order for them to do their job and bring these people to justice.\n",
"\n",
"- Summary: A suspicious package left outside an Alliance Party office in east Belfast has been declared a hoax.\n",
"\n",
"In this example, we use FLAML to perform *abstractive summarization* using the t5-small language model, i.e., the summary is generated word-by-word. "
]
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": null,
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/"
|
||
},
|
||
"id": "amlQnvcxTjhK",
|
||
"outputId": "5382a8f5-8c7a-4884-a8bf-8151c7f27624"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"name": "stderr",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"Using custom data configuration default\n",
|
||
"Reusing dataset xsum (/home/xliu127/.cache/huggingface/datasets/xsum/default/1.2.0/32c23220eadddb1149b16ed2e9430a05293768cfffbdfd151058697d4c11f934)\n",
|
||
"Using custom data configuration default\n",
|
||
"Reusing dataset xsum (/home/xliu127/.cache/huggingface/datasets/xsum/default/1.2.0/32c23220eadddb1149b16ed2e9430a05293768cfffbdfd151058697d4c11f934)\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"204045\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stderr",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"Using custom data configuration default\n",
|
||
"Reusing dataset xsum (/home/xliu127/.cache/huggingface/datasets/xsum/default/1.2.0/32c23220eadddb1149b16ed2e9430a05293768cfffbdfd151058697d4c11f934)\n"
|
||
]
|
||
}
|
||
],
"source": [
"from datasets import load_dataset\n",
"\n",
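"# load the XSum dataset from Hugging Face and convert each split to a pandas DataFrame, the input format used by automl.fit below\n",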
"train_dataset = load_dataset(\"xsum\", split=\"train\").to_pandas()\n",
"print(len(train_dataset))\n",
"valid_dataset = load_dataset(\"xsum\", split=\"validation\").to_pandas()\n",
"test_dataset = load_dataset(\"xsum\", split=\"test\").to_pandas()\n",
"\n",
"custom_sent_keys = [\"document\"] # specify the column names of the input sentences\n",
"label_key = \"summary\" # specify the column name of the label \n",
"\n",
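"# FLAML takes the text column(s) as X and the reference summaries as y\n",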
"X_train, y_train = train_dataset[custom_sent_keys], train_dataset[label_key]\n",
"X_val, y_val = valid_dataset[custom_sent_keys], valid_dataset[label_key]\n",
"X_test = test_dataset[custom_sent_keys]"
]
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": null,
|
||
"metadata": {
|
||
"colab": {
|
||
"base_uri": "https://localhost:8080/",
|
||
"height": 626
|
||
},
|
||
"id": "aYq8XAtxTjhK",
|
||
"outputId": "267fffbb-e5a5-4f45-b8d1-4f718b298e01"
|
||
},
|
||
"outputs": [
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"== Status ==<br>Current time: 2022-08-20 10:53:00 (running for 01:00:10.16)<br>Memory usage on this node: 24.9/376.6 GiB<br>Using FIFO scheduling algorithm.<br>Resources requested: 0/4 CPUs, 0/4 GPUs, 0.0/252.27 GiB heap, 0.0/112.11 GiB objects (0.0/1.0 accelerator_type:V100)<br>Current best trial: 888b71b8 with val_loss=0.8562685839953479 and parameters={'learning_rate': 4.747405262702932e-05, 'num_train_epochs': 0.1, 'per_device_train_batch_size': 64, 'seed': 19, 'global_max_steps': 9223372036854775807, 'learner': 'transformer', 'FLAML_sample_size': 10000}<br>Result logdir: /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-52-50<br>Number of trials: 8/1000000 (8 TERMINATED)<br><br>"
|
||
],
|
||
"text/plain": [
|
||
"<IPython.core.display.HTML object>"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"name": "stderr",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"\u001b[2m\u001b[36m(train pid=12776)\u001b[0m [nltk_data] Downloading package punkt to /home/xliu127/nltk_data...\n",
|
||
"\u001b[2m\u001b[36m(train pid=12776)\u001b[0m [nltk_data] Package punkt is already up-to-date!\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"\u001b[2m\u001b[36m(train pid=12776)\u001b[0m {'eval_loss': 3.629159688949585, 'eval_automl_metric': 0.8566706912648477, 'eval_runtime': 1292.7731, 'eval_samples_per_second': 8.766, 'eval_steps_per_second': 8.766, 'epoch': 0.1}\n",
|
||
"\u001b[2m\u001b[36m(train pid=12776)\u001b[0m {'train_runtime': 1299.5697, 'train_samples_per_second': 0.769, 'train_steps_per_second': 0.012, 'train_loss': 3.952885627746582, 'epoch': 0.1}\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stderr",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"\u001b[2m\u001b[36m(train pid=12943)\u001b[0m [nltk_data] Downloading package punkt to /home/xliu127/nltk_data...\n",
|
||
"\u001b[2m\u001b[36m(train pid=12943)\u001b[0m [nltk_data] Package punkt is already up-to-date!\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"\u001b[2m\u001b[36m(train pid=12943)\u001b[0m {'eval_loss': 3.3510377407073975, 'eval_automl_metric': 0.8490142159489932, 'eval_runtime': 1295.8577, 'eval_samples_per_second': 8.745, 'eval_steps_per_second': 8.745, 'epoch': 0.1}\n",
|
||
"\u001b[2m\u001b[36m(train pid=12943)\u001b[0m {'train_runtime': 1303.1308, 'train_samples_per_second': 0.767, 'train_steps_per_second': 0.025, 'train_loss': 3.790097713470459, 'epoch': 0.1}\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stderr",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"\u001b[2m\u001b[36m(train pid=12776)\u001b[0m ***** Running Prediction *****\n",
|
||
"\u001b[2m\u001b[36m(train pid=12776)\u001b[0m Num examples = 11332\n",
|
||
"\u001b[2m\u001b[36m(train pid=12776)\u001b[0m Batch size = 1\n",
|
||
"\u001b[2m\u001b[36m(train pid=13121)\u001b[0m [nltk_data] Downloading package punkt to /home/xliu127/nltk_data...\n",
|
||
"\u001b[2m\u001b[36m(train pid=13121)\u001b[0m [nltk_data] Package punkt is already up-to-date!\n",
|
||
"\u001b[2m\u001b[36m(train pid=12943)\u001b[0m ***** Running Prediction *****\n",
|
||
"\u001b[2m\u001b[36m(train pid=12943)\u001b[0m Num examples = 11332\n",
|
||
"\u001b[2m\u001b[36m(train pid=12943)\u001b[0m Batch size = 1\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"\u001b[2m\u001b[36m(train pid=13121)\u001b[0m {'eval_loss': 3.418060064315796, 'eval_automl_metric': 0.8528926893980235, 'eval_runtime': 1291.0528, 'eval_samples_per_second': 8.777, 'eval_steps_per_second': 8.777, 'epoch': 0.1}\n",
|
||
"\u001b[2m\u001b[36m(train pid=13121)\u001b[0m {'train_runtime': 1297.7666, 'train_samples_per_second': 0.771, 'train_steps_per_second': 0.012, 'train_loss': 3.8431835174560547, 'epoch': 0.1}\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stderr",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"\u001b[2m\u001b[36m(train pid=13121)\u001b[0m ***** Running Prediction *****\n",
|
||
"\u001b[2m\u001b[36m(train pid=13121)\u001b[0m Num examples = 11332\n",
|
||
"\u001b[2m\u001b[36m(train pid=13121)\u001b[0m Batch size = 1\n",
|
||
"\u001b[2m\u001b[36m(train pid=13312)\u001b[0m [nltk_data] Downloading package punkt to /home/xliu127/nltk_data...\n",
|
||
"\u001b[2m\u001b[36m(train pid=13312)\u001b[0m [nltk_data] Package punkt is already up-to-date!\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"\u001b[2m\u001b[36m(train pid=13312)\u001b[0m {'eval_loss': 3.036466598510742, 'eval_automl_metric': 0.8168502922908516, 'eval_runtime': 1287.5045, 'eval_samples_per_second': 8.802, 'eval_steps_per_second': 8.802, 'epoch': 1.0}\n",
|
||
"\u001b[2m\u001b[36m(train pid=13312)\u001b[0m {'train_runtime': 1341.8008, 'train_samples_per_second': 7.453, 'train_steps_per_second': 0.117, 'train_loss': 3.455144991540605, 'epoch': 1.0}\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stderr",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"\u001b[2m\u001b[36m(train pid=13312)\u001b[0m ***** Running Prediction *****\n",
|
||
"\u001b[2m\u001b[36m(train pid=13312)\u001b[0m Num examples = 11332\n",
|
||
"\u001b[2m\u001b[36m(train pid=13312)\u001b[0m Batch size = 1\n",
|
||
"\u001b[2m\u001b[36m(train pid=12776)\u001b[0m Didn't find file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-52-50/train_bba8754c_5_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train_e_2022-08-20_10-37-19/checkpoint-16/spiece.model. We won't load it.\n",
|
||
"\u001b[2m\u001b[36m(train pid=12776)\u001b[0m Didn't find file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-52-50/train_bba8754c_5_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train_e_2022-08-20_10-37-19/checkpoint-16/added_tokens.json. We won't load it.\n",
|
||
"\u001b[2m\u001b[36m(train pid=12776)\u001b[0m loading file None\n",
|
||
"\u001b[2m\u001b[36m(train pid=12776)\u001b[0m loading file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-52-50/train_bba8754c_5_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train_e_2022-08-20_10-37-19/checkpoint-16/tokenizer.json\n",
|
||
"\u001b[2m\u001b[36m(train pid=12776)\u001b[0m loading file None\n",
|
||
"\u001b[2m\u001b[36m(train pid=12776)\u001b[0m loading file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-52-50/train_bba8754c_5_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train_e_2022-08-20_10-37-19/checkpoint-16/special_tokens_map.json\n",
|
||
"\u001b[2m\u001b[36m(train pid=12776)\u001b[0m loading file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-52-50/train_bba8754c_5_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train_e_2022-08-20_10-37-19/checkpoint-16/tokenizer_config.json\n",
|
||
"\u001b[2m\u001b[36m(train pid=12776)\u001b[0m [nltk_data] Downloading package punkt to /home/xliu127/nltk_data...\n",
|
||
"\u001b[2m\u001b[36m(train pid=12776)\u001b[0m [nltk_data] Package punkt is already up-to-date!\n",
|
||
"\u001b[2m\u001b[36m(train pid=12943)\u001b[0m Didn't find file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-52-50/train_c5039194_6_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train_e_2022-08-20_10-37-35/checkpoint-32/spiece.model. We won't load it.\n",
|
||
"\u001b[2m\u001b[36m(train pid=12943)\u001b[0m Didn't find file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-52-50/train_c5039194_6_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train_e_2022-08-20_10-37-35/checkpoint-32/added_tokens.json. We won't load it.\n",
|
||
"\u001b[2m\u001b[36m(train pid=12943)\u001b[0m loading file None\n",
|
||
"\u001b[2m\u001b[36m(train pid=12943)\u001b[0m loading file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-52-50/train_c5039194_6_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train_e_2022-08-20_10-37-35/checkpoint-32/tokenizer.json\n",
|
||
"\u001b[2m\u001b[36m(train pid=12943)\u001b[0m loading file None\n",
|
||
"\u001b[2m\u001b[36m(train pid=12943)\u001b[0m loading file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-52-50/train_c5039194_6_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train_e_2022-08-20_10-37-35/checkpoint-32/special_tokens_map.json\n",
|
||
"\u001b[2m\u001b[36m(train pid=12943)\u001b[0m loading file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-52-50/train_c5039194_6_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train_e_2022-08-20_10-37-35/checkpoint-32/tokenizer_config.json\n",
|
||
"\u001b[2m\u001b[36m(train pid=12943)\u001b[0m [nltk_data] Downloading package punkt to /home/xliu127/nltk_data...\n",
|
||
"\u001b[2m\u001b[36m(train pid=12943)\u001b[0m [nltk_data] Package punkt is already up-to-date!\n",
|
||
"\u001b[2m\u001b[36m(train pid=13121)\u001b[0m Didn't find file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-52-50/train_de0b5a76_7_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0001,num_train_e_2022-08-20_10-38-15/checkpoint-16/spiece.model. We won't load it.\n",
|
||
"\u001b[2m\u001b[36m(train pid=13121)\u001b[0m Didn't find file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-52-50/train_de0b5a76_7_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0001,num_train_e_2022-08-20_10-38-15/checkpoint-16/added_tokens.json. We won't load it.\n",
|
||
"\u001b[2m\u001b[36m(train pid=13121)\u001b[0m loading file None\n",
|
||
"\u001b[2m\u001b[36m(train pid=13121)\u001b[0m loading file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-52-50/train_de0b5a76_7_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0001,num_train_e_2022-08-20_10-38-15/checkpoint-16/tokenizer.json\n",
|
||
"\u001b[2m\u001b[36m(train pid=13121)\u001b[0m loading file None\n",
|
||
"\u001b[2m\u001b[36m(train pid=13121)\u001b[0m loading file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-52-50/train_de0b5a76_7_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0001,num_train_e_2022-08-20_10-38-15/checkpoint-16/special_tokens_map.json\n",
|
||
"\u001b[2m\u001b[36m(train pid=13121)\u001b[0m loading file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-52-50/train_de0b5a76_7_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0001,num_train_e_2022-08-20_10-38-15/checkpoint-16/tokenizer_config.json\n",
|
||
"\u001b[2m\u001b[36m(train pid=13121)\u001b[0m [nltk_data] Downloading package punkt to /home/xliu127/nltk_data...\n",
|
||
"\u001b[2m\u001b[36m(train pid=13121)\u001b[0m [nltk_data] Package punkt is already up-to-date!\n",
|
||
"\u001b[2m\u001b[36m(train pid=13312)\u001b[0m Didn't find file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-52-50/train_f69bd7dc_8_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train_e_2022-08-20_10-38-56/checkpoint-157/spiece.model. We won't load it.\n",
|
||
"\u001b[2m\u001b[36m(train pid=13312)\u001b[0m Didn't find file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-52-50/train_f69bd7dc_8_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train_e_2022-08-20_10-38-56/checkpoint-157/added_tokens.json. We won't load it.\n",
|
||
"\u001b[2m\u001b[36m(train pid=13312)\u001b[0m loading file None\n",
|
||
"\u001b[2m\u001b[36m(train pid=13312)\u001b[0m loading file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-52-50/train_f69bd7dc_8_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train_e_2022-08-20_10-38-56/checkpoint-157/tokenizer.json\n",
|
||
"\u001b[2m\u001b[36m(train pid=13312)\u001b[0m loading file None\n",
|
||
"\u001b[2m\u001b[36m(train pid=13312)\u001b[0m loading file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-52-50/train_f69bd7dc_8_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train_e_2022-08-20_10-38-56/checkpoint-157/special_tokens_map.json\n",
|
||
"\u001b[2m\u001b[36m(train pid=13312)\u001b[0m loading file /data/xliu127/projects/hyperopt/FLAML/notebook/data/output/train_2022-08-20_09-52-50/train_f69bd7dc_8_FLAML_sample_size=10000,global_max_steps=9223372036854775807,learner=transformer,learning_rate=0.0000,num_train_e_2022-08-20_10-38-56/checkpoint-157/tokenizer_config.json\n",
|
||
"\u001b[2m\u001b[36m(train pid=13312)\u001b[0m [nltk_data] Downloading package punkt to /home/xliu127/nltk_data...\n",
|
||
"\u001b[2m\u001b[36m(train pid=13312)\u001b[0m [nltk_data] Package punkt is already up-to-date!\n",
|
||
"2022-08-20 11:25:23,543\tINFO tune.py:747 -- Total run time: 5553.09 seconds (3602.98 seconds for the tuning loop).\n",
|
||
"[flaml.automl: 08-20 11:25:27] {3322} INFO - selected model: None\n",
|
||
"/data/installation/anaconda3/envs/tmp/lib/python3.8/site-packages/transformers/optimization.py:306: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning\n",
|
||
" warnings.warn(\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stderr",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"[flaml.automl: 08-20 11:37:29] {3465} INFO - retrain transformer for 721.8s\n",
|
||
"[flaml.automl: 08-20 11:37:29] {3472} INFO - retrained model: None\n",
|
||
"[flaml.automl: 08-20 11:37:29] {2749} INFO - fit succeeded\n",
|
||
"[flaml.automl: 08-20 11:37:29] {2750} INFO - Time taken to find the best model: 2666.945666074753\n",
|
||
"[flaml.automl: 08-20 11:37:29] {2761} WARNING - Time taken to find the best model is 74% of the provided time budget and not all estimators' hyperparameter search converged. Consider increasing the time budget.\n"
|
||
]
|
||
},
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"{'train_runtime': 9.5593, 'train_samples_per_second': 2134.522, 'train_steps_per_second': 33.371, 'train_loss': 3.9266421794891357, 'epoch': 0.01}\n"
|
||
]
|
||
}
|
||
],
"source": [
"''' import AutoML class from flaml package '''\n",
"from flaml import AutoML\n",
"automl = AutoML()\n",
"\n",
"import ray\n",
"\n",
"if not ray.is_initialized():\n",
" ray.init(num_gpus=4, num_cpus=4)\n",
"\n",
"automl_settings = {\n",
" \"time_budget\": 3600, # setting the time budget\n",
" \"task\": \"summarization\", # setting the task as summarization\n",
" \"fit_kwargs_by_estimator\": { # if model_path is not set, the default model is t5-small: https://huggingface.co/t5-small\n",
" \"transformer\": {\n",
" \"output_dir\": \"data/output/\", # setting the output directory\n",
" \"model_path\": \"t5-small\",\n",
" \"pad_to_max_length\": True,\n",
" }\n",
" },\n",
" \"gpu_per_trial\": 1, # set to 0 if no GPU is available\n",
" \"log_file_name\": \"seqclass.log\", # set the file to save the log for HPO\n",
" \"log_type\": \"all\", # the log type for trials: \"all\" if logging all the trials, \"better\" if only keeping the better trials\n",
" \"use_ray\": {\"local_dir\": \"data/output/\"}, # set whether to use Ray\n",
" \"metric\": \"rouge1\",\n",
" \"sample\": True, # whether to subsample the training data during search; set to False if the time budget is sufficient (e.g., longer than one trial's running time)\n",
" \"n_concurrent_trials\": 4, \n",
"}\n",
"\n",
"from flaml import tune\n",
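"# custom_hp overrides the default search space of the transformer learner; here num_train_epochs is restricted to a small set of low-cost values\n",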
"custom_hp = {\n",
" \"transformer\": {\n",
" \"num_train_epochs\": {\n",
" \"domain\": tune.choice([0.1, 1, 2, 3, 4, 5]),\n",
" \"init_value\": 0.1, \n",
" \"low_cost_init_value\": 0.1,\n",
" },\n",
" }\n",
"}\n",
"\n",
"\n",
"'''The main flaml automl API'''\n",
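"# after fit() completes, automl.predict(X_test) can be used to generate summaries with the best model found\n",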
"automl.fit(X_train=X_train, y_train=y_train, X_val=X_val, y_val=y_val, custom_hp=custom_hp, **automl_settings)"
]
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": null,
|
||
"metadata": {
|
||
"id": "xPy67MBFTjhK",
|
||
"outputId": "fe0ca67e-b129-4889-ee03-972620bc8421",
|
||
"scrolled": true
|
||
},
|
||
"outputs": [
|
||
{
|
||
"name": "stdout",
|
||
"output_type": "stream",
|
||
"text": [
|
||
"{'Current Learner': 'transformer', 'Current Sample': 10000, 'Current Hyper-parameters': {'learning_rate': 4.747405262702932e-05, 'num_train_epochs': 0.1, 'per_device_train_batch_size': 64, 'seed': 19, 'global_max_steps': 16, 'learner': 'transformer', 'FLAML_sample_size': 10000}, 'Best Learner': 'transformer', 'Best Hyper-parameters': {'learning_rate': 4.747405262702932e-05, 'num_train_epochs': 0.1, 'per_device_train_batch_size': 64, 'seed': 19, 'global_max_steps': 16, 'learner': 'transformer', 'FLAML_sample_size': 10000}}\n",
|
||
"{'Current Learner': 'transformer', 'Current Sample': 10000, 'Current Hyper-parameters': {'learning_rate': 1.5662610420278344e-06, 'num_train_epochs': 0.1, 'per_device_train_batch_size': 64, 'seed': 6, 'global_max_steps': 16, 'learner': 'transformer', 'FLAML_sample_size': 10000}, 'Best Learner': 'transformer', 'Best Hyper-parameters': {'learning_rate': 4.747405262702932e-05, 'num_train_epochs': 0.1, 'per_device_train_batch_size': 64, 'seed': 19, 'global_max_steps': 16, 'learner': 'transformer', 'FLAML_sample_size': 10000}}\n",
|
||
"{'Current Learner': 'transformer', 'Current Sample': 10000, 'Current Hyper-parameters': {'learning_rate': 5.316409886511772e-06, 'num_train_epochs': 1, 'per_device_train_batch_size': 64, 'seed': 26, 'global_max_steps': 157, 'learner': 'transformer', 'FLAML_sample_size': 10000}, 'Best Learner': 'transformer', 'Best Hyper-parameters': {'learning_rate': 4.747405262702932e-05, 'num_train_epochs': 0.1, 'per_device_train_batch_size': 64, 'seed': 19, 'global_max_steps': 16, 'learner': 'transformer', 'FLAML_sample_size': 10000}}\n",
|
||
"{'Current Learner': 'transformer', 'Current Sample': 10000, 'Current Hyper-parameters': {'learning_rate': 9.999999999999999e-06, 'num_train_epochs': 0.1, 'per_device_train_batch_size': 32, 'seed': 20, 'global_max_steps': 32, 'learner': 'transformer', 'FLAML_sample_size': 10000}, 'Best Learner': 'transformer', 'Best Hyper-parameters': {'learning_rate': 4.747405262702932e-05, 'num_train_epochs': 0.1, 'per_device_train_batch_size': 64, 'seed': 19, 'global_max_steps': 16, 'learner': 'transformer', 'FLAML_sample_size': 10000}}\n",
|
||
"4\n"
|
||
]
|
||
},
|
||
{
|
||
"data": {
|
||
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAY4AAAEWCAYAAABxMXBSAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjQuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/MnkTPAAAACXBIWXMAAAsTAAALEwEAmpwYAAAewklEQVR4nO3de5hcVZ3u8e9LbjSXEJSWITcSJUYTL4QUOFHgiKMnQUeIEJCLR6PPiJqDz1EfMiaiDoKeQTEqDIwSGWUcRQwMIioSvIKHm+mEkNAw0RAj6Q6jAYwQaMntd/7Yq6FSqerU7vROVXe/n+epJ7XXXnvXWqnufmvtvWttRQRmZmb12q/RDTAzs/7FwWFmZrk4OMzMLBcHh5mZ5eLgMDOzXBwcZmaWi4PDrA9JOkHSmka3w6xIDg4bMCStl/TmRrYhIn4dEZOL2r+kmZLulPS0pE2S7pB0SlGvZ1aNg8MsB0lDGvjac4AbgG8BY4HDgU8Db+/FviTJv//WK/7BsQFP0n6SFkh6RNITkpZIelHZ+hsk/bekv6RP81PL1l0r6auSbpX0DHBSGtlcIGlV2uZ7kvZP9d8oqaNs+5p10/p/lPSYpI2S/kFSSDqqSh8EfAm4JCKuiYi/RMTOiLgjIt6f6lwk6dtl20xI+xualn8l6XOS7gKeBeZLaqt4nY9KuiU9HyHpi5IelfRHSV+T1LKXb4cNAA4OGww+DMwG/gcwGvgzcFXZ+p8Ak4CXACuA71Rsfw7wOeBg4P+lsjOBWcBE4DXA3B5ev2pdSbOAjwFvBo4C3tjDPiYD44Abe6hTj/8FnEfWl68BkyVNKlt/DnBden4p8HLg6NS+MWQjHBvkHBw2GHwQuDAiOiLiOeAiYE73J/GI+EZEPF227rWSDinb/gcRcVf6hP/XVHZFRGyMiCeBH5L9ca2lVt0zgW9GRHtEPJteu5YXp38fq6/LNV2bXm97RPwF+AFwNkAKkFcAt6QRznnARyPiyYh4Gvi/wFl7+fo2ADg4bDA4Evi+pM2SNgMPAzuAwyUNkXRpOoz1FLA+bXNY2fYbquzzv8uePwsc1MPr16o7umLf1V6n2xPp3yN6qFOPyte4jhQcZKONm1OItQIHAMvL/t9uS+U2yDk4bDDYAJwcEaPKHvtHRCfZH8tTyQ4XHQJMSNuobPuippB+jOwkd7dxPdRdQ9aP03uo8wzZH/tuf1OlTmVffgq0SjqaLEC6D1M9DnQBU8v+zw6JiJ4C0gYJB4cNNMMk7V/2GEp2LP9zko4EkNQq6dRU/2DgObJP9AeQHY7ZV5YA75X0SkkHAJ+qVTGy+x98DPiUpPdKGplO+h8vaXGqthI4UdL4dKht4Z4aEBHbyK7Uugx4EVmQEBE7ga8DX5b0EgBJYyTN7G1nbeBwcNhAcyvZJ+Xux0XA5cAtwO2SngbuBV6X6n8L+APQCTyU1u0TEfET4Argl8Dastd+rkb9G4F3Au8DNgJ/BD5Ldp6CiPgp8D1gFbAc+FGdTbmObMR1Q0RsLyv/eHe70mG8n5GdpLdBTr6Rk1lzkPRK4EFgRMUfcLOm4hGHWQNJekf6vsShwOeBHzo0rNkVGhySZklaI2mtpAVV1p8oaYWk7elbsZXrR0rqkHRllXW3SHqwqLab7SMfAP4EPEJ2pdeHGtscsz0bWtSO09QMVwFvATqAZZJuiYiHyqo9SvZlqAtq7OYS4M4q+z4N2NKnDTZrgIiY1eg2mOVV5IjjOGBtRKyLiK3A9WSXPT4vItZHxCpgZ+XGkqaTzcVze0X5QWRXl3y2qIabmVlthY04yKYnKP+yUQcvXMnSozT52iLgXWRXe5S7JK17tt6GHHbYYTFhwoR6q5uZGbB8+fLHI2K3L30WGRx7Yx5wa0R0ZDMfZNKXlF4WER+VNKGnHUg6j2zKBMaPH09bW1tP1c3MrIKkP1QrLzI4Otn1m7BjU1k9ZgAnSJpHNj3DcElbyK63L0laT9b2l0j6VUS8sXIHEbEYWAxQKpV8zbGZWR8pMjiWAZMkTSQLjLPIpnfYo4g4t/u5pLlAKSK6r8r6aiqfAPyoWmiYmVlxCjs5nq5FPx9YSjap3JKIaJd0sdIdyyQdm+5dcAZwtaT2otpjZmZ9Y1B8c7xUKoXPcZiZ5SNpeUSUKsv9zXEzM8ulWa+qarib7+/ksqVr2Li5i9GjWpg/czKzp41pdLPMzBrOwVHFzfd3svCm1XRt2wFA5+YuFt60GsDhYf2GP/xYUXyOo4o3XPoLOjd37VY+fMh+TBs/qg9bZlaMx7c8x+8ff4adZb/e+wkmHnYghx00onENs31qyuiR/NPbp/Z6e5/jyGFjldAA2Lpjt5lRzJrShie7dgkNgJ2RlZvtLR+qqmL0qJaqI44xo1r43gdmNKBFZvlMXPDjquXbduz0z7DtNY84qpg/czItw4bsUtYybAjzZ/rmZ9Y/jB7VkqvcLA8HRxWzp43hn097NWNGtSCykcY/n/Zqn1i0fsMffqxIPlRVw+xpYxwU1m91/+z6qiorgoPDbIDyhx8rig9VmZlZLg4OMzPLxcFhZma5ODjMzCwXB4eZmeXi4DAzs1wcHGZmlouDw8zMcnFwmJlZLg4OMzPLxcFhZma5ODjMzCwXB4eZmeXi4DAzs1wcHGZmlouDw8zMcnFwmJlZLoUGh6RZktZIWitpQZX1J0paIWm7pDlV1o+U1CHpyrKy2yQ9IKld0tckDanczszMilNYcKQ/6FcBJwNTgLMlTamo9igwF7iuxm4uAe6sKDszIl4LvApoBc7oqzabmdmeFTniOA5YGxHrImIrcD1wanmFiFgfEauAnZUbS5oOHA7cXrHNU+npUGA4EAW03czMaigyOMYAG8qWO1LZHknaD1gEXFBj/VLgT8DTwI016pwnqU1S26ZNm/K028zMetCsJ8fnAbdGREe1lRExEzgCGAG8qUadxRFRiohSa2trcS01Mxtkhha4705gXNny2FRWjxnACZLmAQcBwyVtiYjnT7BHxF8l/YDs8NdP+6jNZma2B0UGxzJgkqSJZIFxFnBOPRtGxLndzyXNBUoRsUDSQcDBEfGYpKHA24Bf93nLzcyspsIOVUXEduB8YCnwMLAkItolXSzpFABJx0rqILsy6mpJ7XvY7YHALZJWASvJznN8rag+mJnZ7hQx8C9KKpVK0dbW1uhmmJn1K5KWR0SpsrxZT46bmVmTcnCYmVkuDg4zM8vFwWFmZrk4OMzMLBcHh5mZ5eLgMDOzXBwcZmaWi4PDzMxycXCYmVkuDg4zM8vFwWFmZrk4OMzMLBcHh5mZ5eLgMDOzXBwcZmaWi4PDzMxycXCYmVkuDg4zM8vFwWFmZrk4OMzMLBcHh5mZ5eLgMDOzXBwcZmaWi4PDzMxycXCYmVkuDg4zM8ul0OCQNEvSGklrJS2osv5ESSskbZc0p8r6kZI6JF2Zlg+Q9GNJ/yWpXdKlRbbfzMx2V1hwSBoCXAWcDEwBzpY0paLao8Bc4Loau7kEuLOi7IsR8QpgGvAGSSf3WaPNzGyPihxxHAesj
Yh1EbEVuB44tbxCRKyPiFXAzsqNJU0HDgduL6v/bET8Mj3fCqwAxhbXBTMzq1RkcIwBNpQtd6SyPZK0H7AIuKCHOqOAtwM/r7H+PEltkto2bdpUb5vNzGwPmvXk+Dzg1ojoqLZS0lDgu8AVEbGuWp2IWBwRpYgotba2FthUM7PBZWiB++4ExpUtj01l9ZgBnCBpHnAQMFzSlojoPsG+GPhdRHylrxprZmb1KTI4lgGTJE0kC4yzgHPq2TAizu1+LmkuUOoODUmfBQ4B/qGvG2xmZntW2KGqiNgOnA8sBR4GlkREu6SLJZ0CIOlYSR3AGcDVktp72qekscCFZFdprZC0UpIDxMxsH1JENLoNhSuVStHW1tboZpiZ9SuSlkdEqbK8WU+Om5lZk3JwmJlZLg4OMzPLxcFhZma5ODjMzCwXB4eZmeXi4DAzs1wcHGZmlouDw8zMcnFwmJlZLg4OMzPLxcFhZma5ODjMzCyXXgWHpJ/0dUPMzKx/qHkjJ0nH1FoFHF1Ia8zMrOn1dAfAZcAdZEFRaVQhrTEzs6bXU3A8DHwgIn5XuULShuKaZGZmzayncxwX9bD+w33fFDMz6w9qjjgi4sYe1t1cSGvMzKzp+XJcMzPLxcFhZma5ODjMzCyXPQaHpAMkfUrS19PyJEl/X3zTzMysGdUz4vgm8BwwIy13Ap8trEVmZtbU6gmOl0XEF4BtABHxLNW/FGhmZoNAPcGxVVILEACSXkY2AjEzs0Gop2+Od/sn4DZgnKTvAG8A5hbZKDMza157HHFExE+B08jC4rtAKSJ+Vc/OJc2StEbSWkkLqqw/UdIKSdslzamyfqSkDklXlpV9TtIGSVvqaYOZmfWteq6qOgY4EngM2AiMl/QyST2OViQNAa4CTgamAGdLmlJR7VGyQLquxm4uAe6sKPshcNye2m1mZsWo51DVvwLHAKvIToq/CmgHDpH0oYi4vcZ2xwFrI2IdgKTrgVOBh7orRMT6tG5n5caSpgOHkx0mK5Vtc29aX0fTzcysr9VzcnwjMC0iShExHZgGrAPeAnyhh+3GAOWz6Haksj2StB+wCLignvo19nGepDZJbZs2bertbszMrEI9wfHyiGjvXoiIh4BXdI8kCjIPuDUiOnq7g4hYnMKu1Nra2odNMzMb3Oo5VNUu6avA9Wn5ncBDkkaQvttRQycwrmx5bCqrxwzgBEnzgIOA4ZK2RMRuJ9jNzGzfqic45pKNAD6Slu8iO4S0DTiph+2WAZMkTSQLjLOAc+ppVESc2/1c0lyyK7kcGmZmTaCey3G7ImJRRLwjPb4YEc9GxM6IqHlJbERsB84HlpLdTXBJRLRLuljSKQCSjpXUAZwBXC2pvdb+ukn6QtrmgHSp7kX1ddXMzPqCIqLnCtLvSd8aLxcRLy2qUX2tVCpFW1tbo5thZtavSFoeEaXK8noOVZVvtD/Z6OBFfdUwMzPrX+o5VPVE2aMzIr4CvK34ppmZWTPa44gjfXO8235kI5B6RipmZjYA1RMAi8qebwfWA2cW0hozM2t6ewyOiOjpklszMxtk6pnk8BBJX+qevkPSIkmH7IvGmZlZ86lnypFvAE+THZ46E3iK7HayZmY2CNVzjuNlEXF62fJnJK0sqD1mZtbk6hlxdEk6vntB0huAruKaZGZmzayeEceHgH9P5zUEPAm8p9BWmZlZ06rnqqqVwGsljUxFz5BNWLiqwHaZmVmTqnmoKt3ve6GkKyW9hewE+buBtfh7HGZmg1ZPI47/AP4M3AO8H7iQ7FDVO9IoxMzMBqGeguOlEfFqAEnXAI8B4yPir/ukZWZm1pR6Co7n7+4XETskdTg0zMya3833d3LZ0jVs3NzF6FEtzJ85mdnTxvTZ/nsKjtdKeio9F9CSlgVERIysvamZmTXCzfd3svCm1XRt2wFA5+YuFt60GqDPwqPmyfGIGBIRI9Pj4IgYWvbcoWFm1oQuW7rm+dDo1rVtB5ctXdNnr1HPFwDNzKyf2Li5+veza5X3hoPDzGwAGT2qJVd5bzg4zMwGkPkzJ9MybMguZS3DhjB/5uQ+ew3fyc/MbADpPgHeqKuqzMysH5o9bUyfBkUlH6oyM7NcHBxmZpaLg8PMzHJxcJiZWS4ODjMzy6XQ4JA0S9IaSWslLaiy/kRJKyRtlzSnyvqRkjokXVlWNl3S6rTPKySpyD6YmdmuCgsOSUOAq4CTgSnA2ZKmVFR7FJgLXFdjN5cAd1aUfZXs/iCT0mNWHzXZzMzqUOSI4zhgbUSsi4itwPXAqeUVImJ9RKwCdlZuLGk6cDhwe1nZEcDIiLg3IgL4FjC7uC6YmVmlIoNjDLChbLkjle2RpP2ARcAFVfbZUc8+JZ0nqU1S26ZNm+putJmZ9axZT47PA26NiI491qwhIhZHRCkiSq2trX3YNDOzwa3IKUc6gXFly2NTWT1mACdImgccBAyXtAW4PO2nN/s0M7M+UGRwLAMmSZpI9sf9LOCcejaMiHO7n0uaC5QiYkFafkrS3wL3Ae8G/qWP221mZj0o7FBVRGwHzgeWAg8DSyKiXdLFkk4BkHSspA7gDOBqSe117HoecA2wFngE+EkhHTAzs6qUXZw0sJVKpWhra2t0M8zM+hVJyyOiVFnerCfHzcysSTk4zMwsFweHmZnl4uAwM7NcHBxmZpaLg8PMzHJxcJiZWS4ODjMzy8XBYWZmuTg4zMwsFweHmZnl4uAwM7NcHBxmZpaLg8PMzHJxcJiZWS4ODjMzy8XBYWZmuTg4zMwsFweHmZnl4uAwM7NcHBxmZpaLg8PMzHJxcJiZWS4ODjMzy8XBYWZmuTg4zMwsFweHmZnlUmhwSJolaY2ktZIWVFl/oqQVkrZLmlNWfmQqXympXdIHy9a9U9KqVP75IttvZma7Kyw4JA0BrgJOBqYAZ0uaUlHtUWAucF1F+WPAjIg4GngdsEDSaEkvBi4D/i4ipgJ/I+nviuqDmZntrsgRx3HA2ohYFxFbgeuBU8srRMT6iFgF7Kwo3xoRz6XFEWXtfCnwu4jYlJZ/BpxeVAfMzGx3RQbHGGBD2XJHKquLpHGSVqV9fD4iNgJrgcmSJkgaCswGxtXY/jxJbZLaNm3aVK2KmZn1QtOeHI+IDRHxGuAo4D2SDo+IPwMfAr4H/BpYD+yosf3iiChFRKm1tXVfNdvMbMArMjg62XU0MDaV5ZJGGg8CJ6TlH0bE6yJiBrAG+G0ftNXMzOpUZHAsAyZJmihpOHAWcEs9G0oaK6klPT8UOJ4sJJD0krLyecA1BbTdzMxqKCw4ImI7cD6wFHgYWBIR7ZIulnQKgKRjJXUAZwBXS2pPm78SuE/SA8AdwBcjYnVad7mkh4C7gEsjwiMOM7N9SBHR6DYUrlQqRVtbW6ObYWbWr0haHhGlyvKmPTluZmbNycFhZma5ODjMzCwXB4eZmeXi4DAzs1wcHGZmlouDw8zMcnFwmJlZLg4OMzPLxcFhZma5ODjMzCwXB4eZmeXi4DAzs1wcHGZmlouDw8zMcnFwmJlZLg4OMzPLxcFhZma5ODjMzCyXoY1ugBXj
5vs7uWzpGjZu7mL0qBbmz5zM7GljGt0sMxsAHBwD0M33d7LwptV0bdsBQOfmLhbetBrA4WFme82Hqgagy5aueT40unVt28FlS9c0qEVmNpA4OAagjZu7cpWbmeXh4BiARo9qyVVuZpaHg2MAmj9zMi3DhuxS1jJsCPNnTm5Qi8xsIPHJ8QGo+wS4r6oysyIUGhySZgGXA0OAayLi0or1JwJfAV4DnBURN6byI4Hvk42IhgH/EhFfS+vOBj4BBLAReFdEPF5kP/qj2dPGOCjMrBCFHaqSNAS4CjgZmAKcLWlKRbVHgbnAdRXljwEzIuJo4HXAAkmjJQ0lC6KTIuI1wCrg/KL6YGZmuyvyHMdxwNqIWBcRW4HrgVPLK0TE+ohYBeysKN8aEc+lxRFl7VR6HChJwEiyUYeZme0jRQbHGGBD2XJHKquLpHGSVqV9fD4iNkbENuBDwGqywJgC/FuN7c+T1CapbdOmTb3tg5mZVWjaq6oiYkM6HHUU8B5Jh0saRhYc04DRZIeqFtbYfnFElCKi1Nraus/abWY20BUZHJ3AuLLlsaksl4jYCDwInAAcncoeiYgAlgCv3+uWmplZ3Yq8qmoZMEnSRLLAOAs4p54NJY0FnoiILkmHAscDXwaeAKZIao2ITcBbgIf3tL/ly5c/LukPvexHf3AYMFivLHPfB6fB3HfYd/0/slphYcEREdslnQ8sJbsc9xsR0S7pYqAtIm6RdCzZZbeHAm+X9JmImAq8ElgkKchOhn8xIlYDSPoMcKekbcAfyK7K2lNbBvSxKkltEVFqdDsawX133wejRvdf2REf688a/UPUSO67+z4YNbr/TXty3MzMmpODY2BY3OgGNJD7PjgN5r5Dg/vvQ1VmZpaLRxxmZpaLg8PMzHJxcDShNN3KLyU9JKld0v8pW/dhSf+Vyr+QyoZJ+ndJqyU9LGlhWf1ZktZIWitpQSP6k0etvkv6nqSV6bFe0sqybRam/q2RNLOsfED3XdJbJC1P7/tySW8q29f0VL5W0hVpbrem1pv3Pq0fL2mLpAvKygb0e5/WvUbSPan+akn7p/Li3/uI8KPJHsARwDHp+cHAb8nm5ToJ+BkwIq17Sfr3HOD69PwAYD0wgez7M48ALwWGAw8AUxrdv970vaLOIuDT6fmU1K8RwMTU3yGDpO/TgNHp+auAzrJ6vwH+lux7UD8BTm50//q6/2VlNwI3ABek5cHw3g8lm3LptWn5xcCQffXee8TRhCLisYhYkZ4/Tfbt+DFk83RdGmnm4Ij4U/cmZDMGDwVagK3AU9QxQ3Gz6aHvAKRPT2cC301Fp5KF5nMR8XtgLVm/B3zfI+L+yKbkAWgHWiSNkHQEMDIi7o3sL8m3gNn7rie904v3Hkmzgd+T9b/bgH/vgf8JrIqIB9I2T0TEjn313js4mpykCWSfLO8DXg6cIOk+SXekb95D9onrGbL7mDxK9k37J9nLGYobraLv3U4A/hgRv0vLtfo4GPpe7nRgRfpQMYasv936Vd+hvv5LOgj4OPCZis0Hw3v/ciAkLZW0QtI/pvJ98t771rFNLP1i/CfwkYh4Ko0oXkQ2DD0WWCLppWSfsHaQzRh8KPBrST9rULP7RGXfy1adTdknzoEob98lTQU+T/YptN/L0f+LgC9HxJZ+cAqnLjn6PpRsDr9jgWeBn0taDvxlX7TTwdGklE0h/5/AdyLiplTcAdyUhqC/kbSTbLKzc4DbIrtfyZ8k3QWUyD517fUMxftajb6TgvM0YHpZ9Z5mYR7ofe+eEPT7wLsj4pFU3EnW3279ou+Qu/+vA+You0hkFLBT0l+B5Qz8974DuDPSbbMl3QocA3ybffHeN/qkkB9VT5SJ7NjkVyrKPwhcnJ6/nCwYRDZc/2YqPxB4iOw+7kOBdWQnjbtPEk5tdP960/e0bhZwR0XZVHY9Ob6O7OToYOj7qNSv06rUrzxB+tZG96+v+1+x/iJeODk+GN77Q4EVZBfDDCW7aOZt++q9b/h/mB9Vf4iOJzvhvQpYmR5vTb8E3ya7P8kK4E2p/kFkV5W0p9CYX7avt5JdofEIcGGj+9bbvqd11wIfrLLNhal/ayi7gmSg9x34JNm5rZVlj+4r7Urp5+QR4ErSLBHN/OjNe1+27fPBMRje+1T+rvQ7/yDwhbLywt97TzliZma5+KoqMzPLxcFhZma5ODjMzCwXB4eZmeXi4DAzs1wcHNbvSfqypI+ULS+VdE3Z8iJJH+th+2slzUnPfyVpt3s5K5uB+FJJv0tTPNwj6eS0br2kw3rR7udft8b6q9KsqA9J6iqbJXWOpFsljcr7mnW06QhJP+ph/XBJd6Yvpdkg5eCwgeAu4PUAkvYj+zb91LL1rwfu3svXuIRsBtNXRcQxZBPHHbyX++xRRPzviDia7DsJj0TE0elxY0S8NSI2F/CyHwO+3kObtgI/B95ZwGtbP+HgsIHgbmBGej6V7MtPT0s6VNII4JXACkmflrRM0oOSFtd7nwJJBwDvBz4cL8xM/MeIWFKl7sfS/h+sGAW9W9IqSQ9I+o8q212SRiBD6mzTekmHSZqg7P4s10r6raTvSHqzpLvS6Oi4VP9ASd+Q9BtJ90uqNVvs6cBtaZupqf7K1PZJqc7NwLn1tNMGJg83rd+LiI2StksaTza6uIdsRtAZZJO+rY6IrZKujIiLAdIf778HfljHSxwFPBq7Tjq3G0nTgfeSzaEk4D5Jd5BNc/9J4PUR8bikF1VsdxnZ6OW90btv5B4FnAG8D1hGNnfZ8cApwCfIRkcXAr+IiPelQ1y/kfSziHimrB0TgT93hyPZFDeXR8R3JA0nm8oFsmDunpnZBiGPOGyguJssNLqD456y5btSnZOUTUm/GngTux7O6gvHA9+PiGciYgtwE9l02G8Cbog0IV1kU953+xRwSER8sJehAfD7iFgdETvJpqD4edrXarIbekE2c+4CZXeQ+xWwPzC+Yj9HAJvKlu8BPiHp48CREdGV2r8D2Cqp0EN11rwcHDZQdJ/neDXZJ+J7yUYcrwfuVnZbzX8F5kTEq8mO4+9f577XAuMljezzVmcjhOmVo5Ccnit7vrNseScvHFUQcHrZeZLxEfFwxX66KPs/iYjryEYtXcCtKrs1Ldmkkn/dizZbP+bgsIHibrJDT09GxI70qX4UWXjczQt/EB9P9zyoeTVTpYh4Fvg34PJ0yAZJrZLOqKj6a2C2pAMkHQi8I5X9AjhD0ovTtuUhcRtwKfDjgj/BLwU+3H1eR9K0KnV+ywsjFJTd62VdRFwB/IBsxmVSPx6PbBp/G4QcHDZQrCa7mureirK/RMTj6Qqkr5ONRpaSfdLP45Nkh3EekvQg8COy2/M+L7Jbf15LNq31fcA1kd3etR34HHCHpAeAL1Vsd0Nq2y2SWnK2q16XAMOAVZLa0/Iu0vmORyQdlYrOBB5Mh7deRTbtN8BJwI8Laqf1A54d18yeJ+kdwPSI+GQPdW4CFkTEb/ddy6yZ+KoqM3teRHy/+5BaNelQ3c0OjcHNIw4zM8vF5zjMzCwXB4eZmeXi4DAzs1w
cHGZmlouDw8zMcvn/bO0s6LCmZj4AAAAASUVORK5CYII=\n",
|
||
"text/plain": [
|
||
"<Figure size 432x288 with 1 Axes>"
|
||
]
|
||
},
|
||
"metadata": {
|
||
"needs_background": "light"
|
||
},
|
||
"output_type": "display_data"
|
||
}
|
||
],
"source": [
"\n",
"from flaml.data import get_output_from_log\n",
"time_history, best_valid_loss_history, valid_loss_history, config_history, metric_history = \\\n",
" get_output_from_log(filename=automl_settings['log_file_name'], time_budget=3000)\n",
"for config in config_history:\n",
" print(config)\n",
"\n",
"import matplotlib.pyplot as plt\n",
"import numpy as np\n",
"plt.title('Learning Curve')\n",
"plt.xlabel('Wall Clock Time (s)')\n",
"plt.ylabel('Rouge 1')\n",
"print(len(valid_loss_history))\n",
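"# for the summarization task the logged validation loss is 1 - ROUGE-1, so plotting 1 - loss shows the ROUGE-1 score\n",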
"plt.scatter(time_history, 1 - np.array(valid_loss_history))\n",
"plt.step(time_history, 1 - np.array(best_valid_loss_history), where='post')\n",
"plt.show()"
]
},
|
||
{
|
||
"cell_type": "code",
|
||
"execution_count": null,
|
||
"metadata": {
|
||
"id": "AzID7DyALObP"
|
||
},
|
||
"outputs": [],
|
||
"source": [
|
||
""
|
||
]
|
||
}
|
||
],
|
||
"metadata": {
|
||
"accelerator": "GPU",
|
||
"colab": {
|
||
"collapsed_sections": [],
|
||
"name": "Copy of automl_nlp.ipynb",
|
||
"provenance": [],
|
||
"include_colab_link": true
|
||
},
|
||
"gpuClass": "standard",
|
||
"interpreter": {
|
||
"hash": "e9d36fc5b7c3dd4177ff1b60184dd696c0acc18150a44682abca4d769811bd46"
|
||
},
|
||
"kernelspec": {
|
||
"display_name": "Python 3 (ipykernel)",
|
||
"language": "python",
|
||
"name": "python3"
|
||
},
|
||
"language_info": {
|
||
"codemirror_mode": {
|
||
"name": "ipython",
|
||
"version": 3
|
||
},
|
||
"file_extension": ".py",
|
||
"mimetype": "text/x-python",
|
||
"name": "python",
|
||
"nbconvert_exporter": "python",
|
||
"pygments_lexer": "ipython3",
|
||
"version": "3.8.0"
|
||
},
|
||
"widgets": {
|
||
"application/vnd.jupyter.widget-state+json": {
|
||
"007a43463f5e4da3983f59dfeb793e64": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "DescriptionStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "DescriptionStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"069feb62f1ec4392b04ee1d80aa4b445": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"0c3a0eb88b16493e9d0f62e3d5abf195": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "DescriptionStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "DescriptionStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"155d7e95c2504507b83b12dc60f1edc1": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HBoxModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HBoxModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HBoxView",
|
||
"box_style": "",
|
||
"children": [
|
||
"IPY_MODEL_feaf1f712b8e499db2558dc0fdd4261e",
|
||
"IPY_MODEL_51392c27affa4fd3b4184cde01b7029d",
|
||
"IPY_MODEL_5dd6914461ea456e9dc96ccf8c391c6e"
|
||
],
|
||
"layout": "IPY_MODEL_069feb62f1ec4392b04ee1d80aa4b445"
|
||
}
|
||
},
|
||
"18091d361aa44881a3db5d1951882082": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"1a701b0fa5a34fb9b64fb92b5c8e4306": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "DescriptionStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "DescriptionStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"1fb6335656b1444abe05aa94a7d13825": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HTMLModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HTMLModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HTMLView",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_6a24dd061b2d40d4baca4036059fc94c",
|
||
"placeholder": "",
|
||
"style": "IPY_MODEL_9fdc731eb58247cbafef9286b49c66f9",
|
||
"value": " 19566/20006 [00:02<00:00, 9081.23 examples/s]"
|
||
}
|
||
},
|
||
"1fed6cdd71c4453b976e8651b7b34cae": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "ProgressStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "ProgressStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"bar_color": null,
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"1ffd6e8c1f834dc48d66116b6089f7a2": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "DescriptionStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "DescriptionStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"20624397998c4e188b419c6267affb65": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"20abb46ea9c948b8ba85a921aee8af6d": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "DescriptionStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "DescriptionStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"21f848683b2648c08a7476658d382177": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "ProgressStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "ProgressStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"bar_color": null,
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"263e99ca21124b79a26e1078b187273d": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HTMLModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HTMLModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HTMLView",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_a91dc7ddd7f641d9b60b59bbbde7bae6",
|
||
"placeholder": "",
|
||
"style": "IPY_MODEL_96c6874b9bd045a5bc67596b2ab04df2",
|
||
"value": "Generating train split: 100%"
|
||
}
|
||
},
|
||
"26885bbae6c646e7bcc4a0459620c37a": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"26f3abaf861a4c63986ff0691294d70c": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"2a1aa694683d4df9b509f5ce4d6d53b0": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "DescriptionStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "DescriptionStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"2d975d14c3f0434583e73ae97f580951": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "ProgressStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "ProgressStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"bar_color": null,
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"2e00672f9d1f46cea3e5db651bca19a3": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HBoxModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HBoxModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HBoxView",
|
||
"box_style": "",
|
||
"children": [
|
||
"IPY_MODEL_cc378c1990634f7da6ffba019fe38c59",
|
||
"IPY_MODEL_642323b1bafa4d0fbeca1adff2426c02",
|
||
"IPY_MODEL_a902290681e942cbae40024baaa2e9b7"
|
||
],
|
||
"layout": "IPY_MODEL_3a014eef1b7d44698572bc5cada4cb8c"
|
||
}
|
||
},
|
||
"2e0b939889d84c07a90d36a57065aac4": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"35151380719b41349a2113b0b893bd6f": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "FloatProgressModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "FloatProgressModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "ProgressView",
|
||
"bar_style": "",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_efca4a3072e94170a1b851f9dec6164d",
|
||
"max": 20006,
|
||
"min": 0,
|
||
"orientation": "horizontal",
|
||
"style": "IPY_MODEL_791101442898470f8524ebee4cb9459d",
|
||
"value": 20006
|
||
}
|
||
},
|
||
"3588f07c45694ec4a484afaaa9e9c599": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HBoxModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HBoxModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HBoxView",
|
||
"box_style": "",
|
||
"children": [
|
||
"IPY_MODEL_998b0ca5b37b47b88ea47327462c76fa",
|
||
"IPY_MODEL_38bb77cefa2e4c17b8e9c419125d6c45",
|
||
"IPY_MODEL_ded6921a6b8140b3bcd59d0e7bbd7900"
|
||
],
|
||
"layout": "IPY_MODEL_ad994aff0bf94c2ea4ac9aa8d5c067e3"
|
||
}
|
||
},
|
||
"362f58e6d05f4d0c865cbe6a956d677b": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "DescriptionStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "DescriptionStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"37d4912ed8ee4c0c9f0a9187bad156fd": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HTMLModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HTMLModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HTMLView",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_3b6eaa3d64924ec581c412b04b9196fa",
|
||
"placeholder": "",
|
||
"style": "IPY_MODEL_94a2d9480adc426abb4ade344ca8dd2f",
|
||
"value": "Downloading data: "
|
||
}
|
||
},
|
||
"38bb77cefa2e4c17b8e9c419125d6c45": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "FloatProgressModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "FloatProgressModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "ProgressView",
|
||
"bar_style": "success",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_760feb714cc54846a52fc399703891d7",
|
||
"max": 2348,
|
||
"min": 0,
|
||
"orientation": "horizontal",
|
||
"style": "IPY_MODEL_a21f2bb483e44c749ec41a2b1784ee4b",
|
||
"value": 2348
|
||
}
|
||
},
|
||
"3a014eef1b7d44698572bc5cada4cb8c": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"3b684b9f50ce48ff92b075d62619368b": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HBoxModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HBoxModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HBoxView",
|
||
"box_style": "",
|
||
"children": [
|
||
"IPY_MODEL_a3e42e8e532c4628abaa6e154d667ed2",
|
||
"IPY_MODEL_49290455306b47aaaf8153bed5e49742",
|
||
"IPY_MODEL_d8468fc2f0b94b2b8dab75336a0d29a3"
|
||
],
|
||
"layout": "IPY_MODEL_b09d990f98f0419e84f5939d3b48d381"
|
||
}
|
||
},
|
||
"3b6eaa3d64924ec581c412b04b9196fa": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"44cf4e612b5f482d8bf224413c1bc852": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HTMLModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HTMLModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HTMLView",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_805c722b7dec4fc59ef64f8704e29424",
|
||
"placeholder": "",
|
||
"style": "IPY_MODEL_0c3a0eb88b16493e9d0f62e3d5abf195",
|
||
"value": "Downloading data files: 100%"
|
||
}
|
||
},
|
||
"46e340fe82414a58b283c78cdf953773": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "DescriptionStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "DescriptionStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"49290455306b47aaaf8153bed5e49742": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "FloatProgressModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "FloatProgressModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "ProgressView",
|
||
"bar_style": "success",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_d226d72577ea4cc299ed78c2fa99a486",
|
||
"max": 2214653,
|
||
"min": 0,
|
||
"orientation": "horizontal",
|
||
"style": "IPY_MODEL_82e0dbe1328a4c54807932984e0c4efb",
|
||
"value": 2214653
|
||
}
|
||
},
|
||
"4b2b38b8064040849b10c63b9f2ed8fd": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"4b92faf53c2b4f7986066af8026ffc3a": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"4e956bd06d3a45eca66b192990416a62": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HTMLModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HTMLModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HTMLView",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_ab2c48f34b7f43c5a5fb37c80a7d47f3",
|
||
"placeholder": "",
|
||
"style": "IPY_MODEL_834bdf06646a4d009648b6bc270c7624",
|
||
"value": "Generating test split: 98%"
|
||
}
|
||
},
|
||
"4fa9926221cb4d29bc0cc0c3d0bf93f3": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HBoxModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HBoxModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HBoxView",
|
||
"box_style": "",
|
||
"children": [
|
||
"IPY_MODEL_4e956bd06d3a45eca66b192990416a62",
|
||
"IPY_MODEL_e319f91c3ba841f99a9a1ca1c7b551f2",
|
||
"IPY_MODEL_da31e023dddb4c25a035258c0e4ed0d7"
|
||
],
|
||
"layout": "IPY_MODEL_574c7a42dad940379a96b9f0968d3be1"
|
||
}
|
||
},
|
||
"51392c27affa4fd3b4184cde01b7029d": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "FloatProgressModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "FloatProgressModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "ProgressView",
|
||
"bar_style": "success",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_db589666b507409f9647930b1222b0a9",
|
||
"max": 3,
|
||
"min": 0,
|
||
"orientation": "horizontal",
|
||
"style": "IPY_MODEL_1fed6cdd71c4453b976e8651b7b34cae",
|
||
"value": 3
|
||
}
|
||
},
|
||
"574c7a42dad940379a96b9f0968d3be1": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"57aca5124cc14ed69da5a0b24a2c1052": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"5a392cc22ee84433bac08ef8a6a3e0d4": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"5ad8132df42340c58f1375b1e52eb5bc": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "ProgressStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "ProgressStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"bar_color": null,
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"5dd6914461ea456e9dc96ccf8c391c6e": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HTMLModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HTMLModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HTMLView",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_f307d9be05d24940b57d9edc82be8976",
|
||
"placeholder": "",
|
||
"style": "IPY_MODEL_20abb46ea9c948b8ba85a921aee8af6d",
|
||
"value": " 3/3 [00:00<00:00, 85.61it/s]"
|
||
}
|
||
},
|
||
"5ea642008bf74641a021e17b7e3fd6e7": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"6284429508d849bd8259460913efc250": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "FloatProgressModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "FloatProgressModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "ProgressView",
|
||
"bar_style": "success",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_d4830572efa244968881c31932ec5dff",
|
||
"max": 2238601,
|
||
"min": 0,
|
||
"orientation": "horizontal",
|
||
"style": "IPY_MODEL_21f848683b2648c08a7476658d382177",
|
||
"value": 2238601
|
||
}
|
||
},
|
||
"62bdec145ccb48faa4fe5f51d2879732": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"642323b1bafa4d0fbeca1adff2426c02": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "FloatProgressModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "FloatProgressModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "ProgressView",
|
||
"bar_style": "success",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_5ea642008bf74641a021e17b7e3fd6e7",
|
||
"max": 1775,
|
||
"min": 0,
|
||
"orientation": "horizontal",
|
||
"style": "IPY_MODEL_2d975d14c3f0434583e73ae97f580951",
|
||
"value": 1775
|
||
}
|
||
},
|
||
"6a24dd061b2d40d4baca4036059fc94c": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"6be493d86857493190ee47a08c04ff40": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"760feb714cc54846a52fc399703891d7": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"791101442898470f8524ebee4cb9459d": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "ProgressStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "ProgressStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"bar_color": null,
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"7dc52faf4b3b4643b7d7019f1722c1d8": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HBoxModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HBoxModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HBoxView",
|
||
"box_style": "",
|
||
"children": [
|
||
"IPY_MODEL_44cf4e612b5f482d8bf224413c1bc852",
|
||
"IPY_MODEL_f4ec7bc190af4c9dbe6a5fc05fad4540",
|
||
"IPY_MODEL_9dd6ab0e0cb940bebe25cba5492b2486"
|
||
],
|
||
"layout": "IPY_MODEL_e556c049fbb24669a49b26c7f107e6a5"
|
||
}
|
||
},
|
||
"7feeefa264da4af89ddc8ddf331b4f9f": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HTMLModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HTMLModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HTMLView",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_be21476b15a14c0084712a9d5aedc22f",
|
||
"placeholder": "",
|
||
"style": "IPY_MODEL_362f58e6d05f4d0c865cbe6a956d677b",
|
||
"value": " 73335/73546 [00:15<00:00, 4205.15 examples/s]"
|
||
}
|
||
},
|
||
"805c722b7dec4fc59ef64f8704e29424": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"811d914b52904fa0913adc9daa33695c": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"82e0dbe1328a4c54807932984e0c4efb": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "ProgressStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "ProgressStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"bar_color": null,
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"834bdf06646a4d009648b6bc270c7624": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "DescriptionStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "DescriptionStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"8497fe93c0d148a49f9a0a0c56961f36": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "DescriptionStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "DescriptionStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"87639f90c2ab47db986419c03e165d7a": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"894de256daec49329f6404326eddaa39": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "FloatProgressModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "FloatProgressModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "ProgressView",
|
||
"bar_style": "",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_57aca5124cc14ed69da5a0b24a2c1052",
|
||
"max": 73546,
|
||
"min": 0,
|
||
"orientation": "horizontal",
|
||
"style": "IPY_MODEL_8f2c2b10e21e42569ef5396e42c65e30",
|
||
"value": 73546
|
||
}
|
||
},
|
||
"8f2c2b10e21e42569ef5396e42c65e30": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "ProgressStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "ProgressStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"bar_color": null,
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"8f5311cebd554f5ba645b8d33b0722a3": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"920c8dd736a4454f9469fb3fa0a9af90": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"929cdd7c2f8e4902aac96a9a3afa5866": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "ProgressStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "ProgressStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"bar_color": null,
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"94a2d9480adc426abb4ade344ca8dd2f": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "DescriptionStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "DescriptionStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"9509ad27a9b54e3e80f796d224f3e189": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HTMLModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HTMLModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HTMLView",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_18091d361aa44881a3db5d1951882082",
|
||
"placeholder": "",
|
||
"style": "IPY_MODEL_2a1aa694683d4df9b509f5ce4d6d53b0",
|
||
"value": " 28.2M/? [00:00<00:00, 56.9MB/s]"
|
||
}
|
||
},
|
||
"96c6874b9bd045a5bc67596b2ab04df2": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "DescriptionStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "DescriptionStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"98bcaff9e28547e3b1f9b0640d598f99": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "DescriptionStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "DescriptionStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"98f179b9be5044c79bc867f5261e2b47": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "ProgressStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "ProgressStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"bar_color": null,
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"998b0ca5b37b47b88ea47327462c76fa": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HTMLModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HTMLModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HTMLView",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_6be493d86857493190ee47a08c04ff40",
|
||
"placeholder": "",
|
||
"style": "IPY_MODEL_46e340fe82414a58b283c78cdf953773",
|
||
"value": "Downloading builder script: "
|
||
}
|
||
},
|
||
"9cbec0a1fe3247ed8a46290df56756fa": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HTMLModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HTMLModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HTMLView",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_26885bbae6c646e7bcc4a0459620c37a",
|
||
"placeholder": "",
|
||
"style": "IPY_MODEL_a56b058d910244588c1454b02c8cda8c",
|
||
"value": "Generating validation split: 98%"
|
||
}
|
||
},
|
||
"9cf2c0d8439a4f5a86b4769a27babb94": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"9dd6ab0e0cb940bebe25cba5492b2486": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HTMLModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HTMLModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HTMLView",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_aa1c4b91b583440a8e6f79dd06cfd200",
|
||
"placeholder": "",
|
||
"style": "IPY_MODEL_b37bd95afbe44cd196fc5ab2d52bccd0",
|
||
"value": " 3/3 [00:13<00:00, 4.35s/it]"
|
||
}
|
||
},
|
||
"9fdc731eb58247cbafef9286b49c66f9": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "DescriptionStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "DescriptionStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"a18b9c22460940cfa53b657849b034bf": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "DescriptionStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "DescriptionStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"a21f2bb483e44c749ec41a2b1784ee4b": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "ProgressStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "ProgressStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"bar_color": null,
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"a3e42e8e532c4628abaa6e154d667ed2": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HTMLModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HTMLModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HTMLView",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_4b92faf53c2b4f7986066af8026ffc3a",
|
||
"placeholder": "",
|
||
"style": "IPY_MODEL_8497fe93c0d148a49f9a0a0c56961f36",
|
||
"value": "Downloading data: "
|
||
}
|
||
},
|
||
"a56b058d910244588c1454b02c8cda8c": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "DescriptionStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "DescriptionStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"a6431dfb76084a838d63849fa362de35": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"a902290681e942cbae40024baaa2e9b7": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HTMLModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HTMLModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HTMLView",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_26f3abaf861a4c63986ff0691294d70c",
|
||
"placeholder": "",
|
||
"style": "IPY_MODEL_1a701b0fa5a34fb9b64fb92b5c8e4306",
|
||
"value": " 7.10k/? [00:00<00:00, 221kB/s]"
|
||
}
|
||
},
|
||
"a91dc7ddd7f641d9b60b59bbbde7bae6": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"aa1c4b91b583440a8e6f79dd06cfd200": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"aad092fcb29d4045a342288aa9d6a329": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "DescriptionStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "DescriptionStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"ab2c48f34b7f43c5a5fb37c80a7d47f3": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"ad994aff0bf94c2ea4ac9aa8d5c067e3": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"b09d990f98f0419e84f5939d3b48d381": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"b167b817426e4832b73a7a37b72115c1": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "DescriptionStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "DescriptionStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"b37bd95afbe44cd196fc5ab2d52bccd0": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "DescriptionStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "DescriptionStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"b649c32bcc8446cf91c53604fc1dcaa6": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HBoxModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HBoxModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HBoxView",
|
||
"box_style": "",
|
||
"children": [
|
||
"IPY_MODEL_b70e2d813be3473f90ca76e293251c0b",
|
||
"IPY_MODEL_b66e17d6fd094f44bc10eade34fc5261",
|
||
"IPY_MODEL_9509ad27a9b54e3e80f796d224f3e189"
|
||
],
|
||
"layout": "IPY_MODEL_920c8dd736a4454f9469fb3fa0a9af90"
|
||
}
|
||
},
|
||
"b66e17d6fd094f44bc10eade34fc5261": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "FloatProgressModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "FloatProgressModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "ProgressView",
|
||
"bar_style": "success",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_811d914b52904fa0913adc9daa33695c",
|
||
"max": 6710578,
|
||
"min": 0,
|
||
"orientation": "horizontal",
|
||
"style": "IPY_MODEL_98f179b9be5044c79bc867f5261e2b47",
|
||
"value": 6710578
|
||
}
|
||
},
|
||
"b70e2d813be3473f90ca76e293251c0b": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HTMLModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HTMLModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HTMLView",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_8f5311cebd554f5ba645b8d33b0722a3",
|
||
"placeholder": "",
|
||
"style": "IPY_MODEL_aad092fcb29d4045a342288aa9d6a329",
|
||
"value": "Downloading data: "
|
||
}
|
||
},
|
||
"be21476b15a14c0084712a9d5aedc22f": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"bfbcfceca0444337ac6c4033a7734fc1": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"c1792bbceb854dc5880003f64e5623cb": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"c274c717ac7e4fa2888e0d101c3fe1fb": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HBoxModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HBoxModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HBoxView",
|
||
"box_style": "",
|
||
"children": [
|
||
"IPY_MODEL_9cbec0a1fe3247ed8a46290df56756fa",
|
||
"IPY_MODEL_35151380719b41349a2113b0b893bd6f",
|
||
"IPY_MODEL_1fb6335656b1444abe05aa94a7d13825"
|
||
],
|
||
"layout": "IPY_MODEL_a6431dfb76084a838d63849fa362de35"
|
||
}
|
||
},
|
||
"c96613989db447b5acfb35cfef553145": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "DescriptionStyleModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "DescriptionStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "StyleView",
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"ca550fa3fe1147bd8285c2b7cadde206": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HBoxModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HBoxModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HBoxView",
|
||
"box_style": "",
|
||
"children": [
|
||
"IPY_MODEL_263e99ca21124b79a26e1078b187273d",
|
||
"IPY_MODEL_894de256daec49329f6404326eddaa39",
|
||
"IPY_MODEL_7feeefa264da4af89ddc8ddf331b4f9f"
|
||
],
|
||
"layout": "IPY_MODEL_4b2b38b8064040849b10c63b9f2ed8fd"
|
||
}
|
||
},
|
||
"cc378c1990634f7da6ffba019fe38c59": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HTMLModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HTMLModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HTMLView",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_c1792bbceb854dc5880003f64e5623cb",
|
||
"placeholder": "",
|
||
"style": "IPY_MODEL_b167b817426e4832b73a7a37b72115c1",
|
||
"value": "Downloading metadata: "
|
||
}
|
||
},
|
||
"d226d72577ea4cc299ed78c2fa99a486": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"d4830572efa244968881c31932ec5dff": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"d8468fc2f0b94b2b8dab75336a0d29a3": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HTMLModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HTMLModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HTMLView",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_eb395373be244e6e8815087d5d32a801",
|
||
"placeholder": "",
|
||
"style": "IPY_MODEL_1ffd6e8c1f834dc48d66116b6089f7a2",
|
||
"value": " 7.82M/? [00:00<00:00, 38.6MB/s]"
|
||
}
|
||
},
|
||
"da31e023dddb4c25a035258c0e4ed0d7": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HTMLModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HTMLModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HTMLView",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_87639f90c2ab47db986419c03e165d7a",
|
||
"placeholder": "",
|
||
"style": "IPY_MODEL_a18b9c22460940cfa53b657849b034bf",
|
||
"value": " 19617/20005 [00:02<00:00, 9178.75 examples/s]"
|
||
}
|
||
},
|
||
"db589666b507409f9647930b1222b0a9": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"ded6921a6b8140b3bcd59d0e7bbd7900": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HTMLModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HTMLModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HTMLView",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_9cf2c0d8439a4f5a86b4769a27babb94",
|
||
"placeholder": "",
|
||
"style": "IPY_MODEL_007a43463f5e4da3983f59dfeb793e64",
|
||
"value": " 7.97k/? [00:00<00:00, 244kB/s]"
|
||
}
|
||
},
|
||
"e1f77bef878c4b0bbfac867c5a9eea98": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HTMLModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HTMLModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HTMLView",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_5a392cc22ee84433bac08ef8a6a3e0d4",
|
||
"placeholder": "",
|
||
"style": "IPY_MODEL_98bcaff9e28547e3b1f9b0640d598f99",
|
||
"value": " 7.89M/? [00:00<00:00, 41.2MB/s]"
|
||
}
|
||
},
|
||
"e319f91c3ba841f99a9a1ca1c7b551f2": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "FloatProgressModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "FloatProgressModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "ProgressView",
|
||
"bar_style": "",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_2e0b939889d84c07a90d36a57065aac4",
|
||
"max": 20005,
|
||
"min": 0,
|
||
"orientation": "horizontal",
|
||
"style": "IPY_MODEL_929cdd7c2f8e4902aac96a9a3afa5866",
|
||
"value": 20005
|
||
}
|
||
},
|
||
"e556c049fbb24669a49b26c7f107e6a5": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"eb395373be244e6e8815087d5d32a801": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"efca4a3072e94170a1b851f9dec6164d": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"f307d9be05d24940b57d9edc82be8976": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_module_version": "1.2.0",
|
||
"model_name": "LayoutModel",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "1.2.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "1.2.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"overflow_x": null,
|
||
"overflow_y": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"f4ec7bc190af4c9dbe6a5fc05fad4540": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "FloatProgressModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "FloatProgressModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "ProgressView",
|
||
"bar_style": "success",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_bfbcfceca0444337ac6c4033a7734fc1",
|
||
"max": 3,
|
||
"min": 0,
|
||
"orientation": "horizontal",
|
||
"style": "IPY_MODEL_5ad8132df42340c58f1375b1e52eb5bc",
|
||
"value": 3
|
||
}
|
||
},
|
||
"f74dfe0a3de64c3ea051e14fba9a04e4": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HBoxModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HBoxModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HBoxView",
|
||
"box_style": "",
|
||
"children": [
|
||
"IPY_MODEL_37d4912ed8ee4c0c9f0a9187bad156fd",
|
||
"IPY_MODEL_6284429508d849bd8259460913efc250",
|
||
"IPY_MODEL_e1f77bef878c4b0bbfac867c5a9eea98"
|
||
],
|
||
"layout": "IPY_MODEL_20624397998c4e188b419c6267affb65"
|
||
}
|
||
},
|
||
"feaf1f712b8e499db2558dc0fdd4261e": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_module_version": "1.5.0",
|
||
"model_name": "HTMLModel",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "1.5.0",
|
||
"_model_name": "HTMLModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "1.5.0",
|
||
"_view_name": "HTMLView",
|
||
"description": "",
|
||
"description_tooltip": null,
|
||
"layout": "IPY_MODEL_62bdec145ccb48faa4fe5f51d2879732",
|
||
"placeholder": "",
|
||
"style": "IPY_MODEL_c96613989db447b5acfb35cfef553145",
|
||
"value": "Extracting data files: 100%"
|
||
}
|
||
},
|
||
"16fba9eb9e4542bc9d34eca00d71cc14": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_name": "HBoxModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "HBoxModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "HBoxView",
|
||
"box_style": "",
|
||
"children": [
|
||
"IPY_MODEL_c81f11a99b9d4d1d95533f24bea1d5ac",
|
||
"IPY_MODEL_e4ce5cf6ea174583a14675a75d31992d",
|
||
"IPY_MODEL_0c8473019e434db0ae34d58b69605a69"
|
||
],
|
||
"layout": "IPY_MODEL_814d3f2b7212461ca51f8635b5106783",
|
||
"tabbable": null,
|
||
"tooltip": null
|
||
}
|
||
},
|
||
"c81f11a99b9d4d1d95533f24bea1d5ac": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_name": "HTMLModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "HTMLModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "HTMLView",
|
||
"description": "",
|
||
"description_allow_html": false,
|
||
"layout": "IPY_MODEL_129488cadbb9477ca593ae106ee8e9f7",
|
||
"placeholder": "",
|
||
"style": "IPY_MODEL_c7aaa1fbd10942649c90044c0c901d99",
|
||
"tabbable": null,
|
||
"tooltip": null,
|
||
"value": "Downloading data: 100%"
|
||
}
|
||
},
|
||
"e4ce5cf6ea174583a14675a75d31992d": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_name": "FloatProgressModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "FloatProgressModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "ProgressView",
|
||
"bar_style": "success",
|
||
"description": "",
|
||
"description_allow_html": false,
|
||
"layout": "IPY_MODEL_6fe78cfd377b4c10a626588f46a569cf",
|
||
"max": 7439277,
|
||
"min": 0,
|
||
"orientation": "horizontal",
|
||
"style": "IPY_MODEL_2aa0f33b8d3a4bc7bedf3b66e06b62f0",
|
||
"tabbable": null,
|
||
"tooltip": null,
|
||
"value": 7439277
|
||
}
|
||
},
|
||
"0c8473019e434db0ae34d58b69605a69": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_name": "HTMLModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "HTMLModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "HTMLView",
|
||
"description": "",
|
||
"description_allow_html": false,
|
||
"layout": "IPY_MODEL_1794a34790b647b3a8a845c55e5e0744",
|
||
"placeholder": "",
|
||
"style": "IPY_MODEL_13d813116e1846a3a0e42a5e8423f80e",
|
||
"tabbable": null,
|
||
"tooltip": null,
|
||
"value": " 7.44M/7.44M [00:00<00:00, 13.9MB/s]"
|
||
}
|
||
},
|
||
"814d3f2b7212461ca51f8635b5106783": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_name": "LayoutModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border_bottom": null,
|
||
"border_left": null,
|
||
"border_right": null,
|
||
"border_top": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"129488cadbb9477ca593ae106ee8e9f7": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_name": "LayoutModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border_bottom": null,
|
||
"border_left": null,
|
||
"border_right": null,
|
||
"border_top": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"c7aaa1fbd10942649c90044c0c901d99": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_name": "HTMLStyleModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "HTMLStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "StyleView",
|
||
"background": null,
|
||
"description_width": "",
|
||
"font_size": null,
|
||
"text_color": null
|
||
}
|
||
},
|
||
"6fe78cfd377b4c10a626588f46a569cf": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_name": "LayoutModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border_bottom": null,
|
||
"border_left": null,
|
||
"border_right": null,
|
||
"border_top": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"2aa0f33b8d3a4bc7bedf3b66e06b62f0": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_name": "ProgressStyleModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "ProgressStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "StyleView",
|
||
"bar_color": null,
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"1794a34790b647b3a8a845c55e5e0744": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_name": "LayoutModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border_bottom": null,
|
||
"border_left": null,
|
||
"border_right": null,
|
||
"border_top": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"13d813116e1846a3a0e42a5e8423f80e": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_name": "HTMLStyleModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "HTMLStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "StyleView",
|
||
"background": null,
|
||
"description_width": "",
|
||
"font_size": null,
|
||
"text_color": null
|
||
}
|
||
},
|
||
"2aa02213244048fead33ea157c17837b": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_name": "HBoxModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "HBoxModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "HBoxView",
|
||
"box_style": "",
|
||
"children": [
|
||
"IPY_MODEL_dfee126a02ef4934a0f654a101dadc1b",
|
||
"IPY_MODEL_8e58d795d528405f8d1c48bfc2afe399",
|
||
"IPY_MODEL_e23212ae504e493a85e4b2524f0217e1"
|
||
],
|
||
"layout": "IPY_MODEL_b84af540de4741ceb206456d2f05fa4b",
|
||
"tabbable": null,
|
||
"tooltip": null
|
||
}
|
||
},
|
||
"dfee126a02ef4934a0f654a101dadc1b": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_name": "HTMLModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "HTMLModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "HTMLView",
|
||
"description": "",
|
||
"description_allow_html": false,
|
||
"layout": "IPY_MODEL_9ba620aa4e51456c9b3b2469c2c887c3",
|
||
"placeholder": "",
|
||
"style": "IPY_MODEL_cc8f1bc7322542828205777903530f1e",
|
||
"tabbable": null,
|
||
"tooltip": null,
|
||
"value": "Generating train split: 98%"
|
||
}
|
||
},
|
||
"8e58d795d528405f8d1c48bfc2afe399": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_name": "FloatProgressModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "FloatProgressModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "ProgressView",
|
||
"bar_style": "",
|
||
"description": "",
|
||
"description_allow_html": false,
|
||
"layout": "IPY_MODEL_c5f0ba4cda014a63a99ecc989f72f731",
|
||
"max": 67349,
|
||
"min": 0,
|
||
"orientation": "horizontal",
|
||
"style": "IPY_MODEL_44803ba97fd54ceab2afe0555e21dfe8",
|
||
"tabbable": null,
|
||
"tooltip": null,
|
||
"value": 67349
|
||
}
|
||
},
|
||
"e23212ae504e493a85e4b2524f0217e1": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_name": "HTMLModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "HTMLModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "HTMLView",
|
||
"description": "",
|
||
"description_allow_html": false,
|
||
"layout": "IPY_MODEL_f4ca1b7b5868446da945574f4db4373b",
|
||
"placeholder": "",
|
||
"style": "IPY_MODEL_f7070757d4784c8099ef7dd9bd280ed3",
|
||
"tabbable": null,
|
||
"tooltip": null,
|
||
"value": " 65772/67349 [00:03<00:00, 18334.90 examples/s]"
|
||
}
|
||
},
|
||
"b84af540de4741ceb206456d2f05fa4b": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_name": "LayoutModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border_bottom": null,
|
||
"border_left": null,
|
||
"border_right": null,
|
||
"border_top": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"9ba620aa4e51456c9b3b2469c2c887c3": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_name": "LayoutModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border_bottom": null,
|
||
"border_left": null,
|
||
"border_right": null,
|
||
"border_top": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"cc8f1bc7322542828205777903530f1e": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_name": "HTMLStyleModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "HTMLStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "StyleView",
|
||
"background": null,
|
||
"description_width": "",
|
||
"font_size": null,
|
||
"text_color": null
|
||
}
|
||
},
|
||
"c5f0ba4cda014a63a99ecc989f72f731": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_name": "LayoutModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border_bottom": null,
|
||
"border_left": null,
|
||
"border_right": null,
|
||
"border_top": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"44803ba97fd54ceab2afe0555e21dfe8": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_name": "ProgressStyleModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "ProgressStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "StyleView",
|
||
"bar_color": null,
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"f4ca1b7b5868446da945574f4db4373b": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_name": "LayoutModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border_bottom": null,
|
||
"border_left": null,
|
||
"border_right": null,
|
||
"border_top": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"f7070757d4784c8099ef7dd9bd280ed3": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_name": "HTMLStyleModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "HTMLStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "StyleView",
|
||
"background": null,
|
||
"description_width": "",
|
||
"font_size": null,
|
||
"text_color": null
|
||
}
|
||
},
|
||
"6a8be136bafe40eea3430dda4063e6db": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_name": "HBoxModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "HBoxModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "HBoxView",
|
||
"box_style": "",
|
||
"children": [
|
||
"IPY_MODEL_f28c1b9dee064db99c389beb98306f86",
|
||
"IPY_MODEL_b2a5550cd6474de7a46eab6a973305e0",
|
||
"IPY_MODEL_a4e9b7b28055406c9569e585296850c6"
|
||
],
|
||
"layout": "IPY_MODEL_ac534b02efb34100b53999031767e8a3",
|
||
"tabbable": null,
|
||
"tooltip": null
|
||
}
|
||
},
|
||
"f28c1b9dee064db99c389beb98306f86": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_name": "HTMLModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "HTMLModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "HTMLView",
|
||
"description": "",
|
||
"description_allow_html": false,
|
||
"layout": "IPY_MODEL_5f9f744825e44fdbae9c126837e40efe",
|
||
"placeholder": "",
|
||
"style": "IPY_MODEL_c3f7a2bb90b44a21a43939f78914f9b8",
|
||
"tabbable": null,
|
||
"tooltip": null,
|
||
"value": "Generating validation split: 92%"
|
||
}
|
||
},
|
||
"b2a5550cd6474de7a46eab6a973305e0": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_name": "FloatProgressModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "FloatProgressModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "ProgressView",
|
||
"bar_style": "",
|
||
"description": "",
|
||
"description_allow_html": false,
|
||
"layout": "IPY_MODEL_4ec28fbc9433413f8355e0c976839a94",
|
||
"max": 872,
|
||
"min": 0,
|
||
"orientation": "horizontal",
|
||
"style": "IPY_MODEL_fab424da92b541cfac6b3bb05ee4e17b",
|
||
"tabbable": null,
|
||
"tooltip": null,
|
||
"value": 872
|
||
}
|
||
},
|
||
"a4e9b7b28055406c9569e585296850c6": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_name": "HTMLModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "HTMLModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "HTMLView",
|
||
"description": "",
|
||
"description_allow_html": false,
|
||
"layout": "IPY_MODEL_8cfa8c0a28f549c19649ec9b390aa528",
|
||
"placeholder": "",
|
||
"style": "IPY_MODEL_470b17af8c8442e49757dc4e385d16f0",
|
||
"tabbable": null,
|
||
"tooltip": null,
|
||
"value": " 798/872 [00:00<00:00, 7979.17 examples/s]"
|
||
}
|
||
},
|
||
"ac534b02efb34100b53999031767e8a3": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_name": "LayoutModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border_bottom": null,
|
||
"border_left": null,
|
||
"border_right": null,
|
||
"border_top": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"5f9f744825e44fdbae9c126837e40efe": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_name": "LayoutModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border_bottom": null,
|
||
"border_left": null,
|
||
"border_right": null,
|
||
"border_top": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"c3f7a2bb90b44a21a43939f78914f9b8": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_name": "HTMLStyleModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "HTMLStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "StyleView",
|
||
"background": null,
|
||
"description_width": "",
|
||
"font_size": null,
|
||
"text_color": null
|
||
}
|
||
},
|
||
"4ec28fbc9433413f8355e0c976839a94": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_name": "LayoutModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border_bottom": null,
|
||
"border_left": null,
|
||
"border_right": null,
|
||
"border_top": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"fab424da92b541cfac6b3bb05ee4e17b": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_name": "ProgressStyleModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "ProgressStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "StyleView",
|
||
"bar_color": null,
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"8cfa8c0a28f549c19649ec9b390aa528": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_name": "LayoutModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border_bottom": null,
|
||
"border_left": null,
|
||
"border_right": null,
|
||
"border_top": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"470b17af8c8442e49757dc4e385d16f0": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_name": "HTMLStyleModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "HTMLStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "StyleView",
|
||
"background": null,
|
||
"description_width": "",
|
||
"font_size": null,
|
||
"text_color": null
|
||
}
|
||
},
|
||
"2c31d2eb9ae44ffbb0f02ad1b1e7937a": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_name": "HBoxModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "HBoxModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "HBoxView",
|
||
"box_style": "",
|
||
"children": [
|
||
"IPY_MODEL_cd5ec1bd9cf54dbfbfd577a096a9e588",
|
||
"IPY_MODEL_d1ba6870db484696879f0d6e5d3a9d70",
|
||
"IPY_MODEL_e7405147ca374cc4998ca947be069652"
|
||
],
|
||
"layout": "IPY_MODEL_fb72bbb82aec4184a8e0a510177433cf",
|
||
"tabbable": null,
|
||
"tooltip": null
|
||
}
|
||
},
|
||
"cd5ec1bd9cf54dbfbfd577a096a9e588": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_name": "HTMLModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "HTMLModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "HTMLView",
|
||
"description": "",
|
||
"description_allow_html": false,
|
||
"layout": "IPY_MODEL_f37c66b863854588b7a8891720372dc6",
|
||
"placeholder": "",
|
||
"style": "IPY_MODEL_f48c7243c49a43acac4d1ba3a6fe674f",
|
||
"tabbable": null,
|
||
"tooltip": null,
|
||
"value": "Generating test split: 50%"
|
||
}
|
||
},
|
||
"d1ba6870db484696879f0d6e5d3a9d70": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_name": "FloatProgressModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "FloatProgressModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "ProgressView",
|
||
"bar_style": "",
|
||
"description": "",
|
||
"description_allow_html": false,
|
||
"layout": "IPY_MODEL_95ad5a87c66f405599a710b8a5fa0a9d",
|
||
"max": 1821,
|
||
"min": 0,
|
||
"orientation": "horizontal",
|
||
"style": "IPY_MODEL_de103ee4780843db8502ebe64f5d2b28",
|
||
"tabbable": null,
|
||
"tooltip": null,
|
||
"value": 1821
|
||
}
|
||
},
|
||
"e7405147ca374cc4998ca947be069652": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_name": "HTMLModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_dom_classes": [],
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "HTMLModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/controls",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "HTMLView",
|
||
"description": "",
|
||
"description_allow_html": false,
|
||
"layout": "IPY_MODEL_428db79c8cd74257ad09539518a21835",
|
||
"placeholder": "",
|
||
"style": "IPY_MODEL_3c95b6d54b294fe2a958056c463ce541",
|
||
"tabbable": null,
|
||
"tooltip": null,
|
||
"value": " 915/1821 [00:00<00:00, 9145.67 examples/s]"
|
||
}
|
||
},
|
||
"fb72bbb82aec4184a8e0a510177433cf": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_name": "LayoutModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border_bottom": null,
|
||
"border_left": null,
|
||
"border_right": null,
|
||
"border_top": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"f37c66b863854588b7a8891720372dc6": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_name": "LayoutModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border_bottom": null,
|
||
"border_left": null,
|
||
"border_right": null,
|
||
"border_top": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"f48c7243c49a43acac4d1ba3a6fe674f": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_name": "HTMLStyleModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "HTMLStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "StyleView",
|
||
"background": null,
|
||
"description_width": "",
|
||
"font_size": null,
|
||
"text_color": null
|
||
}
|
||
},
|
||
"95ad5a87c66f405599a710b8a5fa0a9d": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_name": "LayoutModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border_bottom": null,
|
||
"border_left": null,
|
||
"border_right": null,
|
||
"border_top": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"de103ee4780843db8502ebe64f5d2b28": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_name": "ProgressStyleModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "ProgressStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "StyleView",
|
||
"bar_color": null,
|
||
"description_width": ""
|
||
}
|
||
},
|
||
"428db79c8cd74257ad09539518a21835": {
|
||
"model_module": "@jupyter-widgets/base",
|
||
"model_name": "LayoutModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/base",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "LayoutModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "LayoutView",
|
||
"align_content": null,
|
||
"align_items": null,
|
||
"align_self": null,
|
||
"border_bottom": null,
|
||
"border_left": null,
|
||
"border_right": null,
|
||
"border_top": null,
|
||
"bottom": null,
|
||
"display": null,
|
||
"flex": null,
|
||
"flex_flow": null,
|
||
"grid_area": null,
|
||
"grid_auto_columns": null,
|
||
"grid_auto_flow": null,
|
||
"grid_auto_rows": null,
|
||
"grid_column": null,
|
||
"grid_gap": null,
|
||
"grid_row": null,
|
||
"grid_template_areas": null,
|
||
"grid_template_columns": null,
|
||
"grid_template_rows": null,
|
||
"height": null,
|
||
"justify_content": null,
|
||
"justify_items": null,
|
||
"left": null,
|
||
"margin": null,
|
||
"max_height": null,
|
||
"max_width": null,
|
||
"min_height": null,
|
||
"min_width": null,
|
||
"object_fit": null,
|
||
"object_position": null,
|
||
"order": null,
|
||
"overflow": null,
|
||
"padding": null,
|
||
"right": null,
|
||
"top": null,
|
||
"visibility": null,
|
||
"width": null
|
||
}
|
||
},
|
||
"3c95b6d54b294fe2a958056c463ce541": {
|
||
"model_module": "@jupyter-widgets/controls",
|
||
"model_name": "HTMLStyleModel",
|
||
"model_module_version": "2.0.0",
|
||
"state": {
|
||
"_model_module": "@jupyter-widgets/controls",
|
||
"_model_module_version": "2.0.0",
|
||
"_model_name": "HTMLStyleModel",
|
||
"_view_count": null,
|
||
"_view_module": "@jupyter-widgets/base",
|
||
"_view_module_version": "2.0.0",
|
||
"_view_name": "StyleView",
|
||
"background": null,
|
||
"description_width": "",
|
||
"font_size": null,
|
||
"text_color": null
|
||
}
|
||
}
|
||
}
|
||
}
|
||
},
|
||
"nbformat": 4,
|
||
"nbformat_minor": 0
|
||
}
|