{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Time Series Forecasting with FLAML Library"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 1. Introduction\n",
    "\n",
    "FLAML is a Python library (https://github.com/microsoft/FLAML) designed to automatically produce accurate machine learning models with low computational cost. It is fast and economical. The simple and lightweight design makes it easy to use and extend, such as adding new learners. FLAML can\n",
    "\n",
    " - serve as an economical AutoML engine,\n",
    " - be used as a fast hyperparameter tuning tool, or\n",
    " - be embedded in self-tuning software that requires low latency & resource in repetitive tuning tasks.\n",
    "\n",
    "In this notebook, we demonstrate how to use the FLAML library for time series forecasting tasks: univariate time series forecasting (only time), multivariate time series forecasting (with exogenous variables), and forecasting discrete values.\n",
    "\n",
    "FLAML requires Python>=3.7. To run this notebook example, please install flaml with the notebook and forecast option:\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "%pip install flaml[notebook,ts_forecast]\n",
    "# avoid versions 1.0.2 to 1.0.5 for this notebook due to a bug in arima and sarimax's init config"
   ]
  },
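  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Before diving into the forecasting example, here is a minimal, self-contained sketch of the basic FLAML workflow: a single `fit()` call followed by `predict()`. The synthetic data, the column names (`ds`, `y`), and the 15-second budget below are made up for illustration and are not part of the original example; the rest of this notebook applies the same pattern to the CO2 dataset."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Illustrative sketch only (not part of the original run): tiny synthetic monthly series.\n",
    "import pandas as pd\n",
    "from flaml import AutoML\n",
    "\n",
    "toy = pd.DataFrame({\n",
    "    'ds': pd.date_range('2000-01-01', periods=60, freq='MS'),  # timestamp column comes first\n",
    "    'y': [i + (i % 12) for i in range(60)],  # trend plus a simple seasonal component\n",
    "})\n",
    "toy_automl = AutoML()\n",
    "# fit on the first 48 months, ask for a 12-month horizon, with a very small time budget\n",
    "toy_automl.fit(dataframe=toy[:48], label='y', task='ts_forecast', period=12, time_budget=15)\n",
    "print(toy_automl.predict(toy[48:][['ds']]))  # forecasts for the last 12 timestamps"
   ]
  },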
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 2. Forecast Problem\n",
    "\n",
    "### Load data and preprocess\n",
    "\n",
    "Import co2 data from statsmodels. The dataset is from “Atmospheric CO2 from Continuous Air Samples at Mauna Loa Observatory, Hawaii, U.S.A.,” which collected CO2 samples from March 1958 to December 2001. The task is to predict monthly CO2 samples given only timestamps."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {},
   "outputs": [],
   "source": [
    "import statsmodels.api as sm\n",
    "data = sm.datasets.co2.load_pandas().data\n",
    "# data is given in weeks, but the task is to predict monthly, so use monthly averages instead\n",
    "data = data['co2'].resample('MS').mean()\n",
    "data = data.bfill().ffill()  # makes sure there are no missing values\n",
    "data = data.to_frame().reset_index()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {},
   "outputs": [],
   "source": [
    "# split the data into a train dataframe and X_test and y_test dataframes, where the number of samples for test is equal to\n",
    "# the number of periods the user wants to predict\n",
    "num_samples = data.shape[0]\n",
    "time_horizon = 12\n",
    "split_idx = num_samples - time_horizon\n",
    "train_df = data[:split_idx]  # train_df is a dataframe with two columns: timestamp and label\n",
    "X_test = data[split_idx:]['index'].to_frame()  # X_test is a dataframe with dates for prediction\n",
    "y_test = data[split_idx:]['co2']  # y_test is a series of the values corresponding to the dates for prediction"
   ]
  },
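  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a quick check (added for illustration, not part of the original run), the split above should leave exactly `time_horizon` rows for testing, covering the final 12 months of the series:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Illustrative sanity check of the train/test split defined above.\n",
    "print('train:', train_df.shape, ' test:', X_test.shape, y_test.shape)\n",
    "print('test period:', X_test['index'].min(), 'to', X_test['index'].max())"
   ]
  },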
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "<Figure size 432x288 with 1 Axes>"
      ]
     },
     "metadata": {
      "needs_background": "light"
     },
     "output_type": "display_data"
    }
   ],
   "source": [
    "train_df\n",
    "\n",
    "import matplotlib.pyplot as plt\n",
    "\n",
    "plt.plot(train_df['index'], train_df['co2'])\n",
    "plt.xlabel('Date')\n",
    "plt.ylabel('CO2 Levels')\n",
    "plt.show()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Run FLAML\n",
    "\n",
    "In the FLAML automl run configuration, users can specify the task type, time budget, error metric, learner list, whether to subsample, resampling strategy type, and so on. All these arguments have default values which will be used if users do not provide them."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {},
   "outputs": [],
   "source": [
    "''' import AutoML class from flaml package '''\n",
    "from flaml import AutoML\n",
    "automl = AutoML()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {},
   "outputs": [],
   "source": [
    "settings = {\n",
    "    \"time_budget\": 240,  # total running time in seconds\n",
    "    \"metric\": 'mape',  # primary metric for validation: 'mape' is generally used for forecast tasks\n",
    "    \"task\": 'ts_forecast',  # task type\n",
    "    \"log_file_name\": 'CO2_forecast.log',  # flaml log file\n",
    "    \"eval_method\": \"holdout\",  # validation method can be chosen from ['auto', 'holdout', 'cv']\n",
    "    \"seed\": 7654321,  # random seed\n",
    "}"
   ]
  },
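  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "These settings are passed to `automl.fit()` together with the training dataframe, the label column, and the forecast horizon (`period`). Schematically, the call in the next cell has roughly the following form (shown here only as a non-executed sketch; the exact call and its logged output follow):\n",
    "\n",
    "```python\n",
    "# schematic only; the executed call is in the next cell\n",
    "automl.fit(dataframe=train_df,    # timestamp column + 'co2' label column\n",
    "           label='co2',           # name of the column to forecast\n",
    "           period=time_horizon,   # forecast 12 months ahead\n",
    "           **settings)            # the configuration dict defined above\n",
    "```"
   ]
  },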
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 7,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "name": "stderr",
 | ||
|      "output_type": "stream",
 | ||
|      "text": [
 | ||
|       "[flaml.automl: 07-28 21:10:44] {2478} INFO - task = ts_forecast\n",
 | ||
|       "[flaml.automl: 07-28 21:10:44] {2480} INFO - Data split method: time\n",
 | ||
|       "[flaml.automl: 07-28 21:10:44] {2483} INFO - Evaluation method: holdout\n",
 | ||
|       "[flaml.automl: 07-28 21:10:44] {2552} INFO - Minimizing error metric: mape\n",
 | ||
|       "[flaml.automl: 07-28 21:10:44] {2694} INFO - List of ML learners in AutoML Run: ['lgbm', 'rf', 'xgboost', 'extra_tree', 'xgb_limitdepth', 'prophet', 'arima', 'sarimax']\n",
 | ||
|       "[flaml.automl: 07-28 21:10:44] {2986} INFO - iteration 0, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:10:44] {3114} INFO - Estimated sufficient time budget=2005s. Estimated necessary time budget=2s.\n",
 | ||
|       "[flaml.automl: 07-28 21:10:44] {3161} INFO -  at 0.7s,\testimator lgbm's best error=0.0621,\tbest estimator lgbm's best error=0.0621\n",
 | ||
|       "[flaml.automl: 07-28 21:10:44] {2986} INFO - iteration 1, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:10:44] {3161} INFO -  at 0.7s,\testimator lgbm's best error=0.0621,\tbest estimator lgbm's best error=0.0621\n",
 | ||
|       "[flaml.automl: 07-28 21:10:44] {2986} INFO - iteration 2, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:10:44] {3161} INFO -  at 0.8s,\testimator lgbm's best error=0.0277,\tbest estimator lgbm's best error=0.0277\n",
 | ||
|       "[flaml.automl: 07-28 21:10:44] {2986} INFO - iteration 3, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:10:44] {3161} INFO -  at 0.8s,\testimator lgbm's best error=0.0277,\tbest estimator lgbm's best error=0.0277\n",
 | ||
|       "[flaml.automl: 07-28 21:10:44] {2986} INFO - iteration 4, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:10:44] {3161} INFO -  at 0.9s,\testimator lgbm's best error=0.0175,\tbest estimator lgbm's best error=0.0175\n",
 | ||
|       "[flaml.automl: 07-28 21:10:44] {2986} INFO - iteration 5, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:10:44] {3161} INFO -  at 0.9s,\testimator lgbm's best error=0.0055,\tbest estimator lgbm's best error=0.0055\n",
 | ||
|       "[flaml.automl: 07-28 21:10:44] {2986} INFO - iteration 6, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:10:44] {3161} INFO -  at 1.0s,\testimator lgbm's best error=0.0055,\tbest estimator lgbm's best error=0.0055\n",
 | ||
|       "[flaml.automl: 07-28 21:10:44] {2986} INFO - iteration 7, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:10:45] {3161} INFO -  at 1.0s,\testimator lgbm's best error=0.0055,\tbest estimator lgbm's best error=0.0055\n",
 | ||
|       "[flaml.automl: 07-28 21:10:45] {2986} INFO - iteration 8, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:10:45] {3161} INFO -  at 1.0s,\testimator lgbm's best error=0.0031,\tbest estimator lgbm's best error=0.0031\n",
 | ||
|       "[flaml.automl: 07-28 21:10:45] {2986} INFO - iteration 9, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:10:45] {3161} INFO -  at 1.1s,\testimator lgbm's best error=0.0031,\tbest estimator lgbm's best error=0.0031\n",
 | ||
|       "[flaml.automl: 07-28 21:10:45] {2986} INFO - iteration 10, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:10:45] {3161} INFO -  at 1.1s,\testimator lgbm's best error=0.0027,\tbest estimator lgbm's best error=0.0027\n",
 | ||
|       "[flaml.automl: 07-28 21:10:45] {2986} INFO - iteration 11, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:10:45] {3161} INFO -  at 1.2s,\testimator lgbm's best error=0.0022,\tbest estimator lgbm's best error=0.0022\n",
 | ||
|       "[flaml.automl: 07-28 21:10:45] {2986} INFO - iteration 12, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:10:45] {3161} INFO -  at 1.2s,\testimator lgbm's best error=0.0022,\tbest estimator lgbm's best error=0.0022\n",
 | ||
|       "[flaml.automl: 07-28 21:10:45] {2986} INFO - iteration 13, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:10:45] {3161} INFO -  at 1.3s,\testimator lgbm's best error=0.0022,\tbest estimator lgbm's best error=0.0022\n",
 | ||
|       "[flaml.automl: 07-28 21:10:45] {2986} INFO - iteration 14, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:10:45] {3161} INFO -  at 1.3s,\testimator lgbm's best error=0.0022,\tbest estimator lgbm's best error=0.0022\n",
 | ||
|       "[flaml.automl: 07-28 21:10:45] {2986} INFO - iteration 15, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:10:45] {3161} INFO -  at 1.4s,\testimator lgbm's best error=0.0022,\tbest estimator lgbm's best error=0.0022\n",
 | ||
|       "[flaml.automl: 07-28 21:10:45] {2986} INFO - iteration 16, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:10:45] {3161} INFO -  at 1.6s,\testimator rf's best error=0.0217,\tbest estimator lgbm's best error=0.0022\n",
 | ||
|       "[flaml.automl: 07-28 21:10:45] {2986} INFO - iteration 17, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:10:46] {3161} INFO -  at 2.0s,\testimator xgboost's best error=0.6738,\tbest estimator lgbm's best error=0.0022\n",
 | ||
|       "[flaml.automl: 07-28 21:10:46] {2986} INFO - iteration 18, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:10:46] {3161} INFO -  at 2.1s,\testimator extra_tree's best error=0.0197,\tbest estimator lgbm's best error=0.0022\n",
 | ||
|       "[flaml.automl: 07-28 21:10:46] {2986} INFO - iteration 19, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:10:46] {3161} INFO -  at 2.2s,\testimator extra_tree's best error=0.0177,\tbest estimator lgbm's best error=0.0022\n",
 | ||
|       "[flaml.automl: 07-28 21:10:46] {2986} INFO - iteration 20, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:10:46] {3161} INFO -  at 2.2s,\testimator xgb_limitdepth's best error=0.0447,\tbest estimator lgbm's best error=0.0022\n",
 | ||
|       "[flaml.automl: 07-28 21:10:46] {2986} INFO - iteration 21, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:10:46] {3161} INFO -  at 2.2s,\testimator xgb_limitdepth's best error=0.0447,\tbest estimator lgbm's best error=0.0022\n",
 | ||
|       "[flaml.automl: 07-28 21:10:46] {2986} INFO - iteration 22, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:10:46] {3161} INFO -  at 2.3s,\testimator xgb_limitdepth's best error=0.0029,\tbest estimator lgbm's best error=0.0022\n",
 | ||
|       "[flaml.automl: 07-28 21:10:46] {2986} INFO - iteration 23, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:10:46] {3161} INFO -  at 2.4s,\testimator lgbm's best error=0.0022,\tbest estimator lgbm's best error=0.0022\n",
 | ||
|       "[flaml.automl: 07-28 21:10:46] {2986} INFO - iteration 24, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:10:46] {3161} INFO -  at 2.4s,\testimator rf's best error=0.0217,\tbest estimator lgbm's best error=0.0022\n",
 | ||
|       "[flaml.automl: 07-28 21:10:46] {2986} INFO - iteration 25, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:10:46] {3161} INFO -  at 2.5s,\testimator xgb_limitdepth's best error=0.0029,\tbest estimator lgbm's best error=0.0022\n",
 | ||
|       "[flaml.automl: 07-28 21:10:46] {2986} INFO - iteration 26, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:10:46] {3161} INFO -  at 2.6s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator xgb_limitdepth's best error=0.0019\n",
 | ||
|       "[flaml.automl: 07-28 21:10:46] {2986} INFO - iteration 27, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:10:46] {3161} INFO -  at 2.7s,\testimator rf's best error=0.0216,\tbest estimator xgb_limitdepth's best error=0.0019\n",
 | ||
|       "[flaml.automl: 07-28 21:10:46] {2986} INFO - iteration 28, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:10:46] {3161} INFO -  at 2.8s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator xgb_limitdepth's best error=0.0019\n",
 | ||
|       "[flaml.automl: 07-28 21:10:46] {2986} INFO - iteration 29, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:10:46] {3161} INFO -  at 2.9s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator xgb_limitdepth's best error=0.0019\n",
 | ||
|       "[flaml.automl: 07-28 21:10:46] {2986} INFO - iteration 30, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:10:46] {3161} INFO -  at 2.9s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator xgb_limitdepth's best error=0.0019\n",
 | ||
|       "[flaml.automl: 07-28 21:10:46] {2986} INFO - iteration 31, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:10:47] {3161} INFO -  at 3.0s,\testimator lgbm's best error=0.0022,\tbest estimator xgb_limitdepth's best error=0.0019\n",
 | ||
|       "[flaml.automl: 07-28 21:10:47] {2986} INFO - iteration 32, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:10:47] {3161} INFO -  at 3.1s,\testimator lgbm's best error=0.0022,\tbest estimator xgb_limitdepth's best error=0.0019\n",
 | ||
|       "[flaml.automl: 07-28 21:10:47] {2986} INFO - iteration 33, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:10:47] {3161} INFO -  at 3.2s,\testimator lgbm's best error=0.0022,\tbest estimator xgb_limitdepth's best error=0.0019\n",
 | ||
|       "[flaml.automl: 07-28 21:10:47] {2986} INFO - iteration 34, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:10:47] {3161} INFO -  at 3.3s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator xgb_limitdepth's best error=0.0019\n",
 | ||
|       "[flaml.automl: 07-28 21:10:47] {2986} INFO - iteration 35, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:11:07] {3161} INFO -  at 23.3s,\testimator prophet's best error=0.0008,\tbest estimator prophet's best error=0.0008\n",
 | ||
|       "[flaml.automl: 07-28 21:11:07] {2986} INFO - iteration 36, current learner arima\n",
 | ||
|       "[flaml.automl: 07-28 21:11:08] {3161} INFO -  at 24.2s,\testimator arima's best error=0.0047,\tbest estimator prophet's best error=0.0008\n",
 | ||
|       "[flaml.automl: 07-28 21:11:08] {2986} INFO - iteration 37, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:11:09] {3161} INFO -  at 25.3s,\testimator sarimax's best error=0.0047,\tbest estimator prophet's best error=0.0008\n",
 | ||
|       "[flaml.automl: 07-28 21:11:09] {2986} INFO - iteration 38, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:11:09] {3161} INFO -  at 25.4s,\testimator xgboost's best error=0.6738,\tbest estimator prophet's best error=0.0008\n",
 | ||
|       "[flaml.automl: 07-28 21:11:09] {2986} INFO - iteration 39, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:11:10] {3161} INFO -  at 26.4s,\testimator extra_tree's best error=0.0177,\tbest estimator prophet's best error=0.0008\n",
 | ||
|       "[flaml.automl: 07-28 21:11:10] {2986} INFO - iteration 40, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:11:10] {3161} INFO -  at 26.6s,\testimator sarimax's best error=0.0047,\tbest estimator prophet's best error=0.0008\n",
 | ||
|       "[flaml.automl: 07-28 21:11:10] {2986} INFO - iteration 41, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:11:10] {3161} INFO -  at 26.7s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator prophet's best error=0.0008\n",
 | ||
|       "[flaml.automl: 07-28 21:11:10] {2986} INFO - iteration 42, current learner arima\n",
 | ||
|       "[flaml.automl: 07-28 21:11:10] {3161} INFO -  at 26.9s,\testimator arima's best error=0.0047,\tbest estimator prophet's best error=0.0008\n",
 | ||
|       "[flaml.automl: 07-28 21:11:10] {2986} INFO - iteration 43, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:11:10] {3161} INFO -  at 26.9s,\testimator xgboost's best error=0.1712,\tbest estimator prophet's best error=0.0008\n",
 | ||
|       "[flaml.automl: 07-28 21:11:10] {2986} INFO - iteration 44, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:11:11] {3161} INFO -  at 27.0s,\testimator xgboost's best error=0.0257,\tbest estimator prophet's best error=0.0008\n",
 | ||
|       "[flaml.automl: 07-28 21:11:11] {2986} INFO - iteration 45, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:11:11] {3161} INFO -  at 27.0s,\testimator xgboost's best error=0.0257,\tbest estimator prophet's best error=0.0008\n",
 | ||
|       "[flaml.automl: 07-28 21:11:11] {2986} INFO - iteration 46, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:11:11] {3161} INFO -  at 27.1s,\testimator xgboost's best error=0.0242,\tbest estimator prophet's best error=0.0008\n",
 | ||
|       "[flaml.automl: 07-28 21:11:11] {2986} INFO - iteration 47, current learner arima\n",
 | ||
|       "[flaml.automl: 07-28 21:11:11] {3161} INFO -  at 28.0s,\testimator arima's best error=0.0047,\tbest estimator prophet's best error=0.0008\n",
 | ||
|       "[flaml.automl: 07-28 21:11:11] {2986} INFO - iteration 48, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:11:12] {3161} INFO -  at 28.9s,\testimator sarimax's best error=0.0047,\tbest estimator prophet's best error=0.0008\n",
 | ||
|       "[flaml.automl: 07-28 21:11:12] {2986} INFO - iteration 49, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:11:17] {3161} INFO -  at 33.3s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:17] {2986} INFO - iteration 50, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:11:17] {3161} INFO -  at 33.3s,\testimator xgboost's best error=0.0242,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:17] {2986} INFO - iteration 51, current learner arima\n",
 | ||
|       "[flaml.automl: 07-28 21:11:17] {3161} INFO -  at 33.5s,\testimator arima's best error=0.0044,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:17] {2986} INFO - iteration 52, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:11:17] {3161} INFO -  at 33.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:17] {2986} INFO - iteration 53, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:11:17] {3161} INFO -  at 33.6s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:17] {2986} INFO - iteration 54, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:11:18] {3161} INFO -  at 34.4s,\testimator sarimax's best error=0.0047,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:18] {2986} INFO - iteration 55, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:11:18] {3161} INFO -  at 34.5s,\testimator xgboost's best error=0.0242,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:18] {2986} INFO - iteration 56, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:11:18] {3161} INFO -  at 34.5s,\testimator xgboost's best error=0.0191,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:18] {2986} INFO - iteration 57, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:11:18] {3161} INFO -  at 34.6s,\testimator xgboost's best error=0.0191,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:18] {2986} INFO - iteration 58, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:11:18] {3161} INFO -  at 34.6s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:18] {2986} INFO - iteration 59, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:11:18] {3161} INFO -  at 34.6s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:18] {2986} INFO - iteration 60, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:11:18] {3161} INFO -  at 34.7s,\testimator xgboost's best error=0.0103,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:18] {2986} INFO - iteration 61, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:11:18] {3161} INFO -  at 34.7s,\testimator xgboost's best error=0.0081,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:18] {2986} INFO - iteration 62, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:11:18] {3161} INFO -  at 34.8s,\testimator xgboost's best error=0.0081,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:18] {2986} INFO - iteration 63, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:11:18] {3161} INFO -  at 34.8s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:18] {2986} INFO - iteration 64, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:11:18] {3161} INFO -  at 34.8s,\testimator xgboost's best error=0.0081,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:18] {2986} INFO - iteration 65, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:11:18] {3161} INFO -  at 34.9s,\testimator xgboost's best error=0.0041,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:18] {2986} INFO - iteration 66, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:11:18] {3161} INFO -  at 35.0s,\testimator xgboost's best error=0.0041,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:19] {2986} INFO - iteration 67, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:11:19] {3161} INFO -  at 35.1s,\testimator xgboost's best error=0.0029,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:19] {2986} INFO - iteration 68, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:11:19] {3161} INFO -  at 35.2s,\testimator xgboost's best error=0.0029,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:19] {2986} INFO - iteration 69, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:11:19] {3161} INFO -  at 35.3s,\testimator xgboost's best error=0.0028,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:19] {2986} INFO - iteration 70, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:11:19] {3161} INFO -  at 35.3s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:19] {2986} INFO - iteration 71, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:11:19] {3161} INFO -  at 35.3s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:19] {2986} INFO - iteration 72, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:11:19] {3161} INFO -  at 35.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:19] {2986} INFO - iteration 73, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:11:19] {3161} INFO -  at 35.4s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:19] {2986} INFO - iteration 74, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:11:19] {3161} INFO -  at 35.4s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:19] {2986} INFO - iteration 75, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:11:19] {3161} INFO -  at 35.5s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:19] {2986} INFO - iteration 76, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:11:19] {3161} INFO -  at 35.6s,\testimator sarimax's best error=0.0047,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:19] {2986} INFO - iteration 77, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:11:24] {3161} INFO -  at 40.9s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:24] {2986} INFO - iteration 78, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:11:25] {3161} INFO -  at 41.3s,\testimator sarimax's best error=0.0041,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:25] {2986} INFO - iteration 79, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:11:25] {3161} INFO -  at 41.3s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:25] {2986} INFO - iteration 80, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:11:25] {3161} INFO -  at 41.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:25] {2986} INFO - iteration 81, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:11:30] {3161} INFO -  at 46.9s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:30] {2986} INFO - iteration 82, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:11:31] {3161} INFO -  at 47.1s,\testimator xgboost's best error=0.0027,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:31] {2986} INFO - iteration 83, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:11:31] {3161} INFO -  at 47.1s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:31] {2986} INFO - iteration 84, current learner arima\n",
 | ||
|       "[flaml.automl: 07-28 21:11:31] {3161} INFO -  at 47.6s,\testimator arima's best error=0.0044,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:31] {2986} INFO - iteration 85, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:11:31] {3161} INFO -  at 47.7s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:31] {2986} INFO - iteration 86, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:11:35] {3161} INFO -  at 51.8s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:35] {2986} INFO - iteration 87, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:11:35] {3161} INFO -  at 51.8s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:35] {2986} INFO - iteration 88, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:11:38] {3161} INFO -  at 54.9s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:38] {2986} INFO - iteration 89, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:11:38] {3161} INFO -  at 55.0s,\testimator extra_tree's best error=0.0177,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:38] {2986} INFO - iteration 90, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:11:39] {3161} INFO -  at 55.0s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:39] {2986} INFO - iteration 91, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:11:39] {3161} INFO -  at 55.0s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:39] {2986} INFO - iteration 92, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:11:39] {3161} INFO -  at 55.1s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:39] {2986} INFO - iteration 93, current learner arima\n",
 | ||
|       "[flaml.automl: 07-28 21:11:39] {3161} INFO -  at 55.3s,\testimator arima's best error=0.0043,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:39] {2986} INFO - iteration 94, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:11:39] {3161} INFO -  at 55.3s,\testimator xgboost's best error=0.0027,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:39] {2986} INFO - iteration 95, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:11:39] {3161} INFO -  at 55.5s,\testimator sarimax's best error=0.0040,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:39] {2986} INFO - iteration 96, current learner arima\n",
 | ||
|       "[flaml.automl: 07-28 21:11:40] {3161} INFO -  at 56.3s,\testimator arima's best error=0.0033,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:40] {2986} INFO - iteration 97, current learner arima\n",
 | ||
|       "[flaml.automl: 07-28 21:11:41] {3161} INFO -  at 57.4s,\testimator arima's best error=0.0033,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:41] {2986} INFO - iteration 98, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:11:41] {3161} INFO -  at 57.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:41] {2986} INFO - iteration 99, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:11:41] {3161} INFO -  at 57.8s,\testimator sarimax's best error=0.0038,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:41] {2986} INFO - iteration 100, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:11:41] {3161} INFO -  at 57.8s,\testimator extra_tree's best error=0.0089,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:41] {2986} INFO - iteration 101, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:11:41] {3161} INFO -  at 57.8s,\testimator extra_tree's best error=0.0089,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:41] {2986} INFO - iteration 102, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:11:41] {3161} INFO -  at 57.9s,\testimator extra_tree's best error=0.0089,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:41] {2986} INFO - iteration 103, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:11:42] {3161} INFO -  at 58.0s,\testimator xgboost's best error=0.0026,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:42] {2986} INFO - iteration 104, current learner arima\n",
 | ||
|       "[flaml.automl: 07-28 21:11:42] {3161} INFO -  at 58.3s,\testimator arima's best error=0.0033,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:42] {2986} INFO - iteration 105, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:11:42] {3161} INFO -  at 58.4s,\testimator extra_tree's best error=0.0089,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:42] {2986} INFO - iteration 106, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:11:42] {3161} INFO -  at 58.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:42] {2986} INFO - iteration 107, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:11:42] {3161} INFO -  at 58.5s,\testimator extra_tree's best error=0.0089,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:42] {2986} INFO - iteration 108, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:11:42] {3161} INFO -  at 58.6s,\testimator xgboost's best error=0.0026,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:42] {2986} INFO - iteration 109, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:11:42] {3161} INFO -  at 58.6s,\testimator extra_tree's best error=0.0089,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:42] {2986} INFO - iteration 110, current learner arima\n",
 | ||
|       "[flaml.automl: 07-28 21:11:43] {3161} INFO -  at 59.4s,\testimator arima's best error=0.0033,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:43] {2986} INFO - iteration 111, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:11:43] {3161} INFO -  at 59.4s,\testimator extra_tree's best error=0.0089,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:43] {2986} INFO - iteration 112, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:11:47] {3161} INFO -  at 63.3s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:47] {2986} INFO - iteration 113, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:11:47] {3161} INFO -  at 63.4s,\testimator extra_tree's best error=0.0074,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:47] {2986} INFO - iteration 114, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:11:47] {3161} INFO -  at 63.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:47] {2986} INFO - iteration 115, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:11:48] {3161} INFO -  at 64.6s,\testimator sarimax's best error=0.0038,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:48] {2986} INFO - iteration 116, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:11:48] {3161} INFO -  at 64.6s,\testimator extra_tree's best error=0.0074,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:48] {2986} INFO - iteration 117, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:11:48] {3161} INFO -  at 64.8s,\testimator sarimax's best error=0.0038,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:48] {2986} INFO - iteration 118, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:11:48] {3161} INFO -  at 64.8s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:48] {2986} INFO - iteration 119, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:11:48] {3161} INFO -  at 64.8s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:48] {2986} INFO - iteration 120, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:11:52] {3161} INFO -  at 68.2s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:52] {2986} INFO - iteration 121, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:11:52] {3161} INFO -  at 68.2s,\testimator extra_tree's best error=0.0074,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:52] {2986} INFO - iteration 122, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:11:52] {3161} INFO -  at 68.2s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:52] {2986} INFO - iteration 123, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:11:52] {3161} INFO -  at 68.3s,\testimator extra_tree's best error=0.0074,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:52] {2986} INFO - iteration 124, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:11:52] {3161} INFO -  at 68.4s,\testimator extra_tree's best error=0.0074,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:52] {2986} INFO - iteration 125, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:11:55] {3161} INFO -  at 71.3s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:55] {2986} INFO - iteration 126, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:11:55] {3161} INFO -  at 71.3s,\testimator extra_tree's best error=0.0055,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:55] {2986} INFO - iteration 127, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:11:55] {3161} INFO -  at 71.4s,\testimator xgboost's best error=0.0026,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:55] {2986} INFO - iteration 128, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:11:55] {3161} INFO -  at 71.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:55] {2986} INFO - iteration 129, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:11:55] {3161} INFO -  at 71.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:55] {2986} INFO - iteration 130, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:11:55] {3161} INFO -  at 71.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:55] {2986} INFO - iteration 131, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:11:55] {3161} INFO -  at 71.5s,\testimator extra_tree's best error=0.0055,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:55] {2986} INFO - iteration 132, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:11:55] {3161} INFO -  at 71.6s,\testimator extra_tree's best error=0.0055,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:55] {2986} INFO - iteration 133, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:11:55] {3161} INFO -  at 71.6s,\testimator extra_tree's best error=0.0055,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:55] {2986} INFO - iteration 134, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:11:55] {3161} INFO -  at 71.6s,\testimator extra_tree's best error=0.0051,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:55] {2986} INFO - iteration 135, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:11:55] {3161} INFO -  at 71.7s,\testimator extra_tree's best error=0.0051,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:55] {2986} INFO - iteration 136, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:11:55] {3161} INFO -  at 71.7s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:55] {2986} INFO - iteration 137, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:11:55] {3161} INFO -  at 71.9s,\testimator xgboost's best error=0.0026,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:55] {2986} INFO - iteration 138, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:11:55] {3161} INFO -  at 71.9s,\testimator extra_tree's best error=0.0051,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:55] {2986} INFO - iteration 139, current learner arima\n",
 | ||
|       "[flaml.automl: 07-28 21:11:56] {3161} INFO -  at 72.8s,\testimator arima's best error=0.0033,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:56] {2986} INFO - iteration 140, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:11:56] {3161} INFO -  at 72.8s,\testimator extra_tree's best error=0.0051,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:56] {2986} INFO - iteration 141, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:11:56] {3161} INFO -  at 72.9s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:56] {2986} INFO - iteration 142, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:11:56] {3161} INFO -  at 72.9s,\testimator extra_tree's best error=0.0051,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:56] {2986} INFO - iteration 143, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:11:56] {3161} INFO -  at 72.9s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:56] {2986} INFO - iteration 144, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:11:57] {3161} INFO -  at 73.0s,\testimator extra_tree's best error=0.0051,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:57] {2986} INFO - iteration 145, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:11:57] {3161} INFO -  at 73.1s,\testimator xgboost's best error=0.0026,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:11:57] {2986} INFO - iteration 146, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:12:00] {3161} INFO -  at 76.1s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:00] {2986} INFO - iteration 147, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:00] {3161} INFO -  at 76.2s,\testimator extra_tree's best error=0.0037,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:00] {2986} INFO - iteration 148, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:00] {3161} INFO -  at 76.2s,\testimator extra_tree's best error=0.0037,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:00] {2986} INFO - iteration 149, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:12:00] {3161} INFO -  at 76.4s,\testimator xgboost's best error=0.0026,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:00] {2986} INFO - iteration 150, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:00] {3161} INFO -  at 76.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:00] {2986} INFO - iteration 151, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:00] {3161} INFO -  at 76.5s,\testimator rf's best error=0.0150,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:00] {2986} INFO - iteration 152, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:00] {3161} INFO -  at 76.6s,\testimator rf's best error=0.0150,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:00] {2986} INFO - iteration 153, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:00] {3161} INFO -  at 76.7s,\testimator rf's best error=0.0096,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:00] {2986} INFO - iteration 154, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:00] {3161} INFO -  at 76.7s,\testimator rf's best error=0.0096,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:00] {2986} INFO - iteration 155, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:00] {3161} INFO -  at 76.8s,\testimator extra_tree's best error=0.0037,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:00] {2986} INFO - iteration 156, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:00] {3161} INFO -  at 76.8s,\testimator rf's best error=0.0042,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:00] {2986} INFO - iteration 157, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:00] {3161} INFO -  at 76.8s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:00] {2986} INFO - iteration 158, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:00] {3161} INFO -  at 76.9s,\testimator rf's best error=0.0042,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:00] {2986} INFO - iteration 159, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:00] {3161} INFO -  at 76.9s,\testimator extra_tree's best error=0.0037,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:00] {2986} INFO - iteration 160, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {3161} INFO -  at 77.0s,\testimator rf's best error=0.0042,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 161, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {3161} INFO -  at 77.0s,\testimator rf's best error=0.0036,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 162, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {3161} INFO -  at 77.1s,\testimator rf's best error=0.0036,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 163, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {3161} INFO -  at 77.1s,\testimator extra_tree's best error=0.0030,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 164, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {3161} INFO -  at 77.2s,\testimator rf's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 165, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {3161} INFO -  at 77.2s,\testimator rf's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 166, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {3161} INFO -  at 77.3s,\testimator extra_tree's best error=0.0027,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 167, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {3161} INFO -  at 77.4s,\testimator extra_tree's best error=0.0027,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 168, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {3161} INFO -  at 77.4s,\testimator rf's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 169, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {3161} INFO -  at 77.4s,\testimator rf's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 170, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {3161} INFO -  at 77.5s,\testimator rf's best error=0.0021,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 171, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {3161} INFO -  at 77.5s,\testimator extra_tree's best error=0.0027,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 172, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {3161} INFO -  at 77.6s,\testimator rf's best error=0.0021,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 173, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {3161} INFO -  at 77.6s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 174, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {3161} INFO -  at 77.6s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 175, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {3161} INFO -  at 77.7s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 176, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {3161} INFO -  at 77.7s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 177, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {3161} INFO -  at 77.8s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 178, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {3161} INFO -  at 77.8s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 179, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {3161} INFO -  at 81.2s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 180, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {3161} INFO -  at 81.3s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 181, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {3161} INFO -  at 81.3s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 182, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {3161} INFO -  at 81.4s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 183, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {3161} INFO -  at 81.4s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 184, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {3161} INFO -  at 81.4s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 185, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {3161} INFO -  at 81.5s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 186, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {3161} INFO -  at 81.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 187, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {3161} INFO -  at 81.6s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 188, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {3161} INFO -  at 81.6s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 189, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {3161} INFO -  at 81.6s,\testimator rf's best error=0.0021,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 190, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {3161} INFO -  at 81.7s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 191, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {3161} INFO -  at 81.7s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 192, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {3161} INFO -  at 81.7s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 193, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {3161} INFO -  at 81.8s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 194, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {3161} INFO -  at 81.8s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 195, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {3161} INFO -  at 81.9s,\testimator xgboost's best error=0.0026,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 196, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {3161} INFO -  at 81.9s,\testimator rf's best error=0.0021,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 197, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:12:09] {3161} INFO -  at 85.5s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:09] {2986} INFO - iteration 198, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:09] {3161} INFO -  at 85.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:09] {2986} INFO - iteration 199, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:09] {3161} INFO -  at 85.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:09] {2986} INFO - iteration 200, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:09] {3161} INFO -  at 85.6s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:09] {2986} INFO - iteration 201, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:09] {3161} INFO -  at 85.6s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:09] {2986} INFO - iteration 202, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:12:12] {3161} INFO -  at 88.9s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:12] {2986} INFO - iteration 203, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:12:16] {3161} INFO -  at 92.8s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:16] {2986} INFO - iteration 204, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:16] {3161} INFO -  at 92.8s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:16] {2986} INFO - iteration 205, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:16] {3161} INFO -  at 92.9s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:16] {2986} INFO - iteration 206, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:12:17] {3161} INFO -  at 93.0s,\testimator xgboost's best error=0.0026,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:17] {2986} INFO - iteration 207, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:12:20] {3161} INFO -  at 96.0s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:20] {2986} INFO - iteration 208, current learner arima\n",
 | ||
|       "[flaml.automl: 07-28 21:12:20] {3161} INFO -  at 96.3s,\testimator arima's best error=0.0033,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:20] {2986} INFO - iteration 209, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:20] {3161} INFO -  at 96.4s,\testimator rf's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:20] {2986} INFO - iteration 210, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:12:26] {3161} INFO -  at 102.7s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:26] {2986} INFO - iteration 211, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:26] {3161} INFO -  at 102.8s,\testimator rf's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:26] {2986} INFO - iteration 212, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:26] {3161} INFO -  at 102.9s,\testimator rf's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:26] {2986} INFO - iteration 213, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:26] {3161} INFO -  at 103.0s,\testimator rf's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:26] {2986} INFO - iteration 214, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:27] {3161} INFO -  at 103.0s,\testimator rf's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:27] {2986} INFO - iteration 215, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:27] {3161} INFO -  at 103.1s,\testimator rf's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:27] {2986} INFO - iteration 216, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:12:31] {3161} INFO -  at 107.5s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:31] {2986} INFO - iteration 217, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:12:35] {3161} INFO -  at 111.4s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:35] {2986} INFO - iteration 218, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:35] {3161} INFO -  at 111.5s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:35] {2986} INFO - iteration 219, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:12:35] {3161} INFO -  at 111.7s,\testimator sarimax's best error=0.0037,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:35] {2986} INFO - iteration 220, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:35] {3161} INFO -  at 111.7s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:35] {2986} INFO - iteration 221, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:35] {3161} INFO -  at 111.8s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:35] {2986} INFO - iteration 222, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:35] {3161} INFO -  at 111.8s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:35] {2986} INFO - iteration 223, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:35] {3161} INFO -  at 111.9s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:35] {2986} INFO - iteration 224, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:35] {3161} INFO -  at 111.9s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:35] {2986} INFO - iteration 225, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:12:39] {3161} INFO -  at 115.3s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:39] {2986} INFO - iteration 226, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:39] {3161} INFO -  at 115.4s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:39] {2986} INFO - iteration 227, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:39] {3161} INFO -  at 115.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:39] {2986} INFO - iteration 228, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:39] {3161} INFO -  at 115.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:39] {2986} INFO - iteration 229, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:39] {3161} INFO -  at 115.5s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:39] {2986} INFO - iteration 230, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:39] {3161} INFO -  at 115.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:39] {2986} INFO - iteration 231, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:39] {3161} INFO -  at 115.6s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:39] {2986} INFO - iteration 232, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:39] {3161} INFO -  at 115.6s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:39] {2986} INFO - iteration 233, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:12:42] {3161} INFO -  at 118.6s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:42] {2986} INFO - iteration 234, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:42] {3161} INFO -  at 118.6s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:42] {2986} INFO - iteration 235, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:12:42] {3161} INFO -  at 118.7s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:42] {2986} INFO - iteration 236, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:12:42] {3161} INFO -  at 118.7s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:42] {2986} INFO - iteration 237, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:12:42] {3161} INFO -  at 118.7s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:42] {2986} INFO - iteration 238, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:12:42] {3161} INFO -  at 118.8s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:42] {2986} INFO - iteration 239, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:12:42] {3161} INFO -  at 118.8s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:42] {2986} INFO - iteration 240, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:12:42] {3161} INFO -  at 118.8s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:42] {2986} INFO - iteration 241, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:42] {3161} INFO -  at 118.9s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:42] {2986} INFO - iteration 242, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:12:42] {3161} INFO -  at 119.0s,\testimator xgboost's best error=0.0026,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:42] {2986} INFO - iteration 243, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:12:43] {3161} INFO -  at 119.0s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:43] {2986} INFO - iteration 244, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:43] {3161} INFO -  at 119.0s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:43] {2986} INFO - iteration 245, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:43] {3161} INFO -  at 119.1s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:43] {2986} INFO - iteration 246, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:43] {3161} INFO -  at 119.2s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:43] {2986} INFO - iteration 247, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:12:43] {3161} INFO -  at 119.2s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:43] {2986} INFO - iteration 248, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:43] {3161} INFO -  at 119.3s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:43] {2986} INFO - iteration 249, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:12:46] {3161} INFO -  at 122.4s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:46] {2986} INFO - iteration 250, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:46] {3161} INFO -  at 122.4s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:46] {2986} INFO - iteration 251, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:46] {3161} INFO -  at 122.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:46] {2986} INFO - iteration 252, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:46] {3161} INFO -  at 122.5s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:46] {2986} INFO - iteration 253, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:12:46] {3161} INFO -  at 122.7s,\testimator xgboost's best error=0.0025,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:46] {2986} INFO - iteration 254, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:46] {3161} INFO -  at 122.7s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:46] {2986} INFO - iteration 255, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:46] {3161} INFO -  at 122.8s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:46] {2986} INFO - iteration 256, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:12:46] {3161} INFO -  at 123.0s,\testimator xgboost's best error=0.0025,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:46] {2986} INFO - iteration 257, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:12:50] {3161} INFO -  at 126.2s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:50] {2986} INFO - iteration 258, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:50] {3161} INFO -  at 126.2s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:50] {2986} INFO - iteration 259, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:12:50] {3161} INFO -  at 126.4s,\testimator sarimax's best error=0.0037,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:50] {2986} INFO - iteration 260, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:50] {3161} INFO -  at 126.4s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:50] {2986} INFO - iteration 261, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:12:50] {3161} INFO -  at 126.9s,\testimator sarimax's best error=0.0037,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:50] {2986} INFO - iteration 262, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:50] {3161} INFO -  at 126.9s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:50] {2986} INFO - iteration 263, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:12:50] {3161} INFO -  at 127.0s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:50] {2986} INFO - iteration 264, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:51] {3161} INFO -  at 127.0s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:51] {2986} INFO - iteration 265, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:51] {3161} INFO -  at 127.1s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:51] {2986} INFO - iteration 266, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:51] {3161} INFO -  at 127.1s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:51] {2986} INFO - iteration 267, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:12:51] {3161} INFO -  at 127.2s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:51] {2986} INFO - iteration 268, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:51] {3161} INFO -  at 127.2s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:51] {2986} INFO - iteration 269, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:12:54] {3161} INFO -  at 130.5s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:54] {2986} INFO - iteration 270, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:54] {3161} INFO -  at 130.6s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:54] {2986} INFO - iteration 271, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:54] {3161} INFO -  at 130.6s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:54] {2986} INFO - iteration 272, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {3161} INFO -  at 134.1s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 273, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {3161} INFO -  at 134.1s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 274, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {3161} INFO -  at 134.2s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 275, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {3161} INFO -  at 134.3s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 276, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {3161} INFO -  at 134.3s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 277, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {3161} INFO -  at 134.3s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 278, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {3161} INFO -  at 134.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 279, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {3161} INFO -  at 134.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 280, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {3161} INFO -  at 134.5s,\testimator xgboost's best error=0.0025,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 281, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {3161} INFO -  at 134.6s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 282, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {3161} INFO -  at 134.6s,\testimator sarimax's best error=0.0037,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 283, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {3161} INFO -  at 134.7s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 284, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {3161} INFO -  at 134.7s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 285, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {3161} INFO -  at 134.8s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 286, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {3161} INFO -  at 134.8s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 287, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {3161} INFO -  at 134.8s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 288, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:13:02] {3161} INFO -  at 138.2s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:02] {2986} INFO - iteration 289, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:13:05] {3161} INFO -  at 141.9s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:05] {2986} INFO - iteration 290, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:13:05] {3161} INFO -  at 142.0s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:05] {2986} INFO - iteration 291, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:06] {3161} INFO -  at 142.0s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:06] {2986} INFO - iteration 292, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:06] {3161} INFO -  at 142.1s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:06] {2986} INFO - iteration 293, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:13:08] {3161} INFO -  at 144.9s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:08] {2986} INFO - iteration 294, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {3161} INFO -  at 145.0s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 295, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {3161} INFO -  at 145.0s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 296, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {3161} INFO -  at 145.2s,\testimator xgboost's best error=0.0025,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 297, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {3161} INFO -  at 145.2s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 298, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {3161} INFO -  at 145.2s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 299, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {3161} INFO -  at 145.3s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 300, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {3161} INFO -  at 145.4s,\testimator xgboost's best error=0.0025,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 301, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {3161} INFO -  at 145.5s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 302, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {3161} INFO -  at 145.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 303, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {3161} INFO -  at 145.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 304, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {3161} INFO -  at 145.6s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 305, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {3161} INFO -  at 145.6s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 306, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {3161} INFO -  at 145.8s,\testimator sarimax's best error=0.0037,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 307, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {3161} INFO -  at 145.8s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 308, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {3161} INFO -  at 145.8s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 309, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:13:13] {3161} INFO -  at 149.0s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:13] {2986} INFO - iteration 310, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:13:13] {3161} INFO -  at 149.0s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:13] {2986} INFO - iteration 311, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:13:13] {3161} INFO -  at 149.1s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:13] {2986} INFO - iteration 312, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:13] {3161} INFO -  at 149.2s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:13] {2986} INFO - iteration 313, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:13] {3161} INFO -  at 149.2s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:13] {2986} INFO - iteration 314, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:13] {3161} INFO -  at 149.3s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:13] {2986} INFO - iteration 315, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:13] {3161} INFO -  at 149.3s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:13] {2986} INFO - iteration 316, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:13:13] {3161} INFO -  at 149.4s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:13] {2986} INFO - iteration 317, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:13] {3161} INFO -  at 149.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:13] {2986} INFO - iteration 318, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:13:16] {3161} INFO -  at 152.3s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:16] {2986} INFO - iteration 319, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:13:19] {3161} INFO -  at 155.3s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:19] {2986} INFO - iteration 320, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:19] {3161} INFO -  at 155.3s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:19] {2986} INFO - iteration 321, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:19] {3161} INFO -  at 155.3s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:19] {2986} INFO - iteration 322, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:19] {3161} INFO -  at 155.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:19] {2986} INFO - iteration 323, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:19] {3161} INFO -  at 155.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:19] {2986} INFO - iteration 324, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:13:19] {3161} INFO -  at 155.5s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:19] {2986} INFO - iteration 325, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:19] {3161} INFO -  at 155.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:19] {2986} INFO - iteration 326, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:19] {3161} INFO -  at 155.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:19] {2986} INFO - iteration 327, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:13:23] {3161} INFO -  at 159.2s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:23] {2986} INFO - iteration 328, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:23] {3161} INFO -  at 159.2s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:23] {2986} INFO - iteration 329, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:13:23] {3161} INFO -  at 159.4s,\testimator xgboost's best error=0.0025,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:23] {2986} INFO - iteration 330, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:23] {3161} INFO -  at 159.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:23] {2986} INFO - iteration 331, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:23] {3161} INFO -  at 159.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:23] {2986} INFO - iteration 332, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:13:23] {3161} INFO -  at 159.5s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:23] {2986} INFO - iteration 333, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:23] {3161} INFO -  at 159.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:23] {2986} INFO - iteration 334, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:23] {3161} INFO -  at 159.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:23] {2986} INFO - iteration 335, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:13:23] {3161} INFO -  at 159.6s,\testimator xgboost's best error=0.0025,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:23] {2986} INFO - iteration 336, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:13:23] {3161} INFO -  at 159.7s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:23] {2986} INFO - iteration 337, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:13:23] {3161} INFO -  at 159.8s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:23] {2986} INFO - iteration 338, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:23] {3161} INFO -  at 159.8s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:23] {2986} INFO - iteration 339, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:13:23] {3161} INFO -  at 160.0s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:23] {2986} INFO - iteration 340, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:23] {3161} INFO -  at 160.0s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:23] {2986} INFO - iteration 341, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:13:24] {3161} INFO -  at 160.1s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:24] {2986} INFO - iteration 342, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:13:24] {3161} INFO -  at 160.1s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:24] {2986} INFO - iteration 343, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:24] {3161} INFO -  at 160.1s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:24] {2986} INFO - iteration 344, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:24] {3161} INFO -  at 160.2s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:24] {2986} INFO - iteration 345, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:13:27] {3161} INFO -  at 163.1s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:27] {2986} INFO - iteration 346, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:13:29] {3161} INFO -  at 165.9s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:29] {2986} INFO - iteration 347, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:13:30] {3161} INFO -  at 166.0s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:30] {2986} INFO - iteration 348, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:13:33] {3161} INFO -  at 169.4s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:33] {2986} INFO - iteration 349, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:13:33] {3161} INFO -  at 169.7s,\testimator sarimax's best error=0.0031,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:33] {2986} INFO - iteration 350, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:13:34] {3161} INFO -  at 170.4s,\testimator sarimax's best error=0.0031,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:34] {2986} INFO - iteration 351, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:13:34] {3161} INFO -  at 170.6s,\testimator sarimax's best error=0.0031,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:34] {2986} INFO - iteration 352, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:34] {3161} INFO -  at 170.7s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:34] {2986} INFO - iteration 353, current learner arima\n",
 | ||
|       "[flaml.automl: 07-28 21:13:35] {3161} INFO -  at 171.3s,\testimator arima's best error=0.0033,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:35] {2986} INFO - iteration 354, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:13:35] {3161} INFO -  at 171.7s,\testimator sarimax's best error=0.0031,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:35] {2986} INFO - iteration 355, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:13:35] {3161} INFO -  at 171.8s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:35] {2986} INFO - iteration 356, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:13:35] {3161} INFO -  at 171.8s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:35] {2986} INFO - iteration 357, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:35] {3161} INFO -  at 171.8s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:35] {2986} INFO - iteration 358, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:13:35] {3161} INFO -  at 171.9s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:35] {2986} INFO - iteration 359, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:13:36] {3161} INFO -  at 172.1s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:36] {2986} INFO - iteration 360, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:36] {3161} INFO -  at 172.1s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:36] {2986} INFO - iteration 361, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:36] {3161} INFO -  at 172.1s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:36] {2986} INFO - iteration 362, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:13:36] {3161} INFO -  at 172.2s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:36] {2986} INFO - iteration 363, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:36] {3161} INFO -  at 172.2s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:36] {2986} INFO - iteration 364, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:13:36] {3161} INFO -  at 172.3s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:36] {2986} INFO - iteration 365, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:13:36] {3161} INFO -  at 172.3s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:36] {2986} INFO - iteration 366, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:13:36] {3161} INFO -  at 172.3s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:36] {2986} INFO - iteration 367, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:13:37] {3161} INFO -  at 173.2s,\testimator sarimax's best error=0.0021,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:37] {2986} INFO - iteration 368, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:13:37] {3161} INFO -  at 173.6s,\testimator sarimax's best error=0.0021,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:37] {2986} INFO - iteration 369, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:13:37] {3161} INFO -  at 173.6s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:37] {2986} INFO - iteration 370, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:13:39] {3161} INFO -  at 175.5s,\testimator sarimax's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:39] {2986} INFO - iteration 371, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:13:39] {3161} INFO -  at 175.6s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:39] {2986} INFO - iteration 372, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:13:41] {3161} INFO -  at 177.5s,\testimator sarimax's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:41] {2986} INFO - iteration 373, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:13:41] {3161} INFO -  at 177.6s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:41] {2986} INFO - iteration 374, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:13:43] {3161} INFO -  at 179.2s,\testimator sarimax's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:43] {2986} INFO - iteration 375, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:13:43] {3161} INFO -  at 179.4s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:43] {2986} INFO - iteration 376, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:13:43] {3161} INFO -  at 179.5s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:43] {2986} INFO - iteration 377, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:43] {3161} INFO -  at 179.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:43] {2986} INFO - iteration 378, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:43] {3161} INFO -  at 179.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:43] {2986} INFO - iteration 379, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:43] {3161} INFO -  at 179.6s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:43] {2986} INFO - iteration 380, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:43] {3161} INFO -  at 179.6s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:43] {2986} INFO - iteration 381, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:13:43] {3161} INFO -  at 179.7s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:43] {2986} INFO - iteration 382, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:43] {3161} INFO -  at 179.7s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:43] {2986} INFO - iteration 383, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:13:43] {3161} INFO -  at 179.7s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:43] {2986} INFO - iteration 384, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:13:45] {3161} INFO -  at 181.5s,\testimator sarimax's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:45] {2986} INFO - iteration 385, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:13:48] {3161} INFO -  at 184.8s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:48] {2986} INFO - iteration 386, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:13:52] {3161} INFO -  at 188.3s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:52] {2986} INFO - iteration 387, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:13:52] {3161} INFO -  at 188.4s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:52] {2986} INFO - iteration 388, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:13:52] {3161} INFO -  at 188.5s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:52] {2986} INFO - iteration 389, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:13:52] {3161} INFO -  at 188.5s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:52] {2986} INFO - iteration 390, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:13:53] {3161} INFO -  at 189.5s,\testimator sarimax's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:53] {2986} INFO - iteration 391, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:13:53] {3161} INFO -  at 189.6s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:53] {2986} INFO - iteration 392, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:13:53] {3161} INFO -  at 189.7s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:53] {2986} INFO - iteration 393, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:13:53] {3161} INFO -  at 189.7s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:53] {2986} INFO - iteration 394, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:13:57] {3161} INFO -  at 193.0s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:57] {2986} INFO - iteration 395, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:13:57] {3161} INFO -  at 193.1s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:57] {2986} INFO - iteration 396, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:13:57] {3161} INFO -  at 193.2s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:57] {2986} INFO - iteration 397, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:13:57] {3161} INFO -  at 193.2s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:57] {2986} INFO - iteration 398, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:13:57] {3161} INFO -  at 193.3s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:57] {2986} INFO - iteration 399, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:13:57] {3161} INFO -  at 193.3s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:57] {2986} INFO - iteration 400, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:13:57] {3161} INFO -  at 193.4s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:57] {2986} INFO - iteration 401, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:13:57] {3161} INFO -  at 193.4s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:57] {2986} INFO - iteration 402, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:13:57] {3161} INFO -  at 193.5s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:57] {2986} INFO - iteration 403, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:13:57] {3161} INFO -  at 193.5s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:57] {2986} INFO - iteration 404, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:13:57] {3161} INFO -  at 193.6s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:57] {2986} INFO - iteration 405, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:13:57] {3161} INFO -  at 193.6s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:57] {2986} INFO - iteration 406, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:13:57] {3161} INFO -  at 193.7s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:57] {2986} INFO - iteration 407, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:13:57] {3161} INFO -  at 193.7s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:13:57] {2986} INFO - iteration 408, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:14:01] {3161} INFO -  at 197.0s,\testimator sarimax's best error=0.0012,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:01] {2986} INFO - iteration 409, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:14:02] {3161} INFO -  at 198.8s,\testimator sarimax's best error=0.0012,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:02] {2986} INFO - iteration 410, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:14:02] {3161} INFO -  at 198.9s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:02] {2986} INFO - iteration 411, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:14:05] {3161} INFO -  at 201.0s,\testimator sarimax's best error=0.0012,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:05] {2986} INFO - iteration 412, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:14:05] {3161} INFO -  at 201.1s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:05] {2986} INFO - iteration 413, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:14:06] {3161} INFO -  at 202.3s,\testimator sarimax's best error=0.0012,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:06] {2986} INFO - iteration 414, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:14:06] {3161} INFO -  at 202.3s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:06] {2986} INFO - iteration 415, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:14:06] {3161} INFO -  at 202.4s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:06] {2986} INFO - iteration 416, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:14:06] {3161} INFO -  at 202.4s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:06] {2986} INFO - iteration 417, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:14:06] {3161} INFO -  at 202.5s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:06] {2986} INFO - iteration 418, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:14:06] {3161} INFO -  at 202.6s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:06] {2986} INFO - iteration 419, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:14:09] {3161} INFO -  at 205.7s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:09] {2986} INFO - iteration 420, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:14:11] {3161} INFO -  at 207.4s,\testimator sarimax's best error=0.0012,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:11] {2986} INFO - iteration 421, current learner arima\n",
 | ||
|       "[flaml.automl: 07-28 21:14:11] {3161} INFO -  at 207.6s,\testimator arima's best error=0.0033,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:11] {2986} INFO - iteration 422, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:14:12] {3161} INFO -  at 208.6s,\testimator sarimax's best error=0.0010,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:12] {2986} INFO - iteration 423, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:14:12] {3161} INFO -  at 208.6s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:12] {2986} INFO - iteration 424, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:14:13] {3161} INFO -  at 209.9s,\testimator sarimax's best error=0.0007,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:13] {2986} INFO - iteration 425, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:14:15] {3161} INFO -  at 211.3s,\testimator sarimax's best error=0.0007,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:15] {2986} INFO - iteration 426, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:14:15] {3161} INFO -  at 211.8s,\testimator sarimax's best error=0.0007,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:15] {2986} INFO - iteration 427, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:14:18] {3161} INFO -  at 214.2s,\testimator sarimax's best error=0.0007,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:18] {2986} INFO - iteration 428, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:14:18] {3161} INFO -  at 214.2s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:18] {2986} INFO - iteration 429, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:14:18] {3161} INFO -  at 214.4s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:18] {2986} INFO - iteration 430, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:14:18] {3161} INFO -  at 214.4s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:18] {2986} INFO - iteration 431, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:14:18] {3161} INFO -  at 214.5s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:18] {2986} INFO - iteration 432, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:14:20] {3161} INFO -  at 216.7s,\testimator sarimax's best error=0.0007,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:20] {2986} INFO - iteration 433, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:14:20] {3161} INFO -  at 216.8s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:20] {2986} INFO - iteration 434, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:14:20] {3161} INFO -  at 216.9s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:20] {2986} INFO - iteration 435, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:14:21] {3161} INFO -  at 217.4s,\testimator sarimax's best error=0.0007,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:21] {2986} INFO - iteration 436, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:14:24] {3161} INFO -  at 220.1s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:24] {2986} INFO - iteration 437, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:14:24] {3161} INFO -  at 220.6s,\testimator sarimax's best error=0.0007,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:24] {2986} INFO - iteration 438, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:14:26] {3161} INFO -  at 223.0s,\testimator sarimax's best error=0.0007,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:26] {2986} INFO - iteration 439, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:14:30] {3161} INFO -  at 226.0s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:30] {2986} INFO - iteration 440, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:14:33] {3161} INFO -  at 229.2s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:33] {2986} INFO - iteration 441, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:14:36] {3161} INFO -  at 232.6s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:36] {2986} INFO - iteration 442, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:14:38] {3161} INFO -  at 234.4s,\testimator sarimax's best error=0.0007,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:38] {2986} INFO - iteration 443, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:14:38] {3161} INFO -  at 234.4s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:38] {2986} INFO - iteration 444, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:14:39] {3161} INFO -  at 235.1s,\testimator sarimax's best error=0.0007,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:39] {2986} INFO - iteration 445, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:14:39] {3161} INFO -  at 235.1s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
 | ||
|       "[flaml.automl: 07-28 21:14:39] {2986} INFO - iteration 446, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:14:41] {3161} INFO -  at 237.4s,\testimator sarimax's best error=0.0004,\tbest estimator sarimax's best error=0.0004\n",
 | ||
|       "[flaml.automl: 07-28 21:14:41] {2986} INFO - iteration 447, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:14:41] {3161} INFO -  at 237.5s,\testimator xgboost's best error=0.0024,\tbest estimator sarimax's best error=0.0004\n",
 | ||
|       "[flaml.automl: 07-28 21:14:41] {2986} INFO - iteration 448, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:14:43] {3161} INFO -  at 239.7s,\testimator sarimax's best error=0.0004,\tbest estimator sarimax's best error=0.0004\n",
 | ||
|       "[flaml.automl: 07-28 21:14:43] {2986} INFO - iteration 449, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:14:43] {3161} INFO -  at 239.7s,\testimator lgbm's best error=0.0022,\tbest estimator sarimax's best error=0.0004\n",
 | ||
|       "[flaml.automl: 07-28 21:14:43] {2986} INFO - iteration 450, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:14:43] {3161} INFO -  at 239.8s,\testimator extra_tree's best error=0.0016,\tbest estimator sarimax's best error=0.0004\n",
 | ||
|       "[flaml.automl: 07-28 21:14:43] {2986} INFO - iteration 451, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:14:43] {3161} INFO -  at 239.8s,\testimator lgbm's best error=0.0022,\tbest estimator sarimax's best error=0.0004\n",
 | ||
|       "[flaml.automl: 07-28 21:14:43] {2986} INFO - iteration 452, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:14:43] {3161} INFO -  at 239.8s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator sarimax's best error=0.0004\n",
 | ||
|       "[flaml.automl: 07-28 21:14:43] {2986} INFO - iteration 453, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:14:43] {3161} INFO -  at 239.9s,\testimator extra_tree's best error=0.0016,\tbest estimator sarimax's best error=0.0004\n",
 | ||
|       "[flaml.automl: 07-28 21:14:43] {2986} INFO - iteration 454, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:14:43] {3161} INFO -  at 239.9s,\testimator extra_tree's best error=0.0016,\tbest estimator sarimax's best error=0.0004\n",
 | ||
|       "[flaml.automl: 07-28 21:14:43] {2986} INFO - iteration 455, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:14:43] {3161} INFO -  at 239.9s,\testimator extra_tree's best error=0.0016,\tbest estimator sarimax's best error=0.0004\n",
 | ||
|       "[flaml.automl: 07-28 21:14:43] {2986} INFO - iteration 456, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:14:44] {3161} INFO -  at 240.0s,\testimator rf's best error=0.0018,\tbest estimator sarimax's best error=0.0004\n",
 | ||
|       "[flaml.automl: 07-28 21:14:44] {3425} INFO - retrain sarimax for 0.7s\n",
 | ||
|       "[flaml.automl: 07-28 21:14:44] {3432} INFO - retrained model: <statsmodels.tsa.statespace.sarimax.SARIMAXResultsWrapper object at 0x000001E2D9979400>\n",
 | ||
|       "[flaml.automl: 07-28 21:14:44] {2725} INFO - fit succeeded\n",
 | ||
|       "[flaml.automl: 07-28 21:14:44] {2726} INFO - Time taken to find the best model: 237.36335611343384\n",
 | ||
|       "[flaml.automl: 07-28 21:14:44] {2737} WARNING - Time taken to find the best model is 99% of the provided time budget and not all estimators' hyperparameter search converged. Consider increasing the time budget.\n"
 | ||
|      ]
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "'''The main flaml automl API'''\n",
 | ||
|     "automl.fit(dataframe=train_df,  # training data\n",
 | ||
|     "           label='co2',  # label column\n",
 | ||
|     "           period=time_horizon,  # key word argument 'period' must be included for forecast task)\n",
 | ||
|     "           **settings)"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "markdown",
 | ||
|    "metadata": {},
 | ||
|    "source": [
 | ||
|     "### Best model and metric"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 8,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "name": "stdout",
 | ||
|      "output_type": "stream",
 | ||
|      "text": [
 | ||
|       "Best ML leaner: sarimax\n",
 | ||
|       "Best hyperparmeter config: {'p': 8, 'd': 0, 'q': 8, 'P': 6, 'D': 3, 'Q': 1, 's': 6}\n",
 | ||
|       "Best mape on validation data: 0.00043466573064228554\n",
 | ||
|       "Training duration of best run: 0.7340686321258545s\n"
 | ||
|      ]
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "''' retrieve best config and best learner'''\n",
 | ||
|     "print('Best ML leaner:', automl.best_estimator)\n",
 | ||
|     "print('Best hyperparmeter config:', automl.best_config)\n",
 | ||
|     "print(f'Best mape on validation data: {automl.best_loss}')\n",
 | ||
|     "print(f'Training duration of best run: {automl.best_config_train_time}s')"
 | ||
|    ]
 | ||
|   },
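  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Beyond the single best configuration, it can be informative to see what each searched learner converged to. The cell below is a small sketch; it assumes the installed FLAML version exposes the `best_config_per_estimator` property."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "''' best configuration found for each searched estimator (sketch) '''\n",
    "# `best_config_per_estimator` is assumed to be available in this FLAML version;\n",
    "# the value is None for learners that never produced a valid trial\n",
    "for name, config in automl.best_config_per_estimator.items():\n",
    "    print(name, ':', config)"
   ]
  },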
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 9,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "data": {
 | ||
|       "text/plain": [
 | ||
|        "<statsmodels.tsa.statespace.sarimax.SARIMAXResultsWrapper at 0x1e2d9979400>"
 | ||
|       ]
 | ||
|      },
 | ||
|      "execution_count": 9,
 | ||
|      "metadata": {},
 | ||
|      "output_type": "execute_result"
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "automl.model.estimator"
 | ||
|    ]
 | ||
|   },
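  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Since the winning learner in this run wraps a statsmodels results object, its native `summary()` report can be inspected directly. This sketch only applies when the best learner is arima or sarimax; the `hasattr` guard keeps it harmless for other learners."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "''' inspect the underlying statsmodels results when the best learner is arima/sarimax (sketch) '''\n",
    "best_est = automl.model.estimator\n",
    "if hasattr(best_est, 'summary'):\n",
    "    print(best_est.summary())\n",
    "else:\n",
    "    print(type(best_est))"
   ]
  },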
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 10,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [],
 | ||
|    "source": [
 | ||
|     "''' pickle and save the automl object '''\n",
 | ||
|     "import pickle\n",
 | ||
|     "with open('automl.pkl', 'wb') as f:\n",
 | ||
|     "    pickle.dump(automl, f, pickle.HIGHEST_PROTOCOL)"
 | ||
|    ]
 | ||
|   },
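  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The pickled object can be reloaded later and reused for prediction without refitting. The next cell is a minimal sketch: it assumes `automl.pkl` was just written by the cell above and that `X_test` from the earlier data-preparation cells is still in scope."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "''' reload the pickled automl object and reuse it for prediction (sketch) '''\n",
    "import pickle\n",
    "\n",
    "with open('automl.pkl', 'rb') as f:\n",
    "    loaded_automl = pickle.load(f)\n",
    "\n",
    "# the reloaded object exposes the same predict API as the in-memory one\n",
    "print(loaded_automl.predict(X_test))"
   ]
  },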
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 11,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "name": "stdout",
 | ||
|      "output_type": "stream",
 | ||
|      "text": [
 | ||
|       "Predicted labels\n",
 | ||
|       "2001-01-01    370.568362\n",
 | ||
|       "2001-02-01    371.297747\n",
 | ||
|       "2001-03-01    372.087653\n",
 | ||
|       "2001-04-01    373.040897\n",
 | ||
|       "2001-05-01    373.638221\n",
 | ||
|       "2001-06-01    373.202665\n",
 | ||
|       "2001-07-01    371.621574\n",
 | ||
|       "2001-08-01    369.611740\n",
 | ||
|       "2001-09-01    368.307775\n",
 | ||
|       "2001-10-01    368.360786\n",
 | ||
|       "2001-11-01    369.476460\n",
 | ||
|       "2001-12-01    370.849193\n",
 | ||
|       "Freq: MS, Name: predicted_mean, dtype: float64\n",
 | ||
|       "True labels\n",
 | ||
|       "514    370.175\n",
 | ||
|       "515    371.325\n",
 | ||
|       "516    372.060\n",
 | ||
|       "517    372.775\n",
 | ||
|       "518    373.800\n",
 | ||
|       "519    373.060\n",
 | ||
|       "520    371.300\n",
 | ||
|       "521    369.425\n",
 | ||
|       "522    367.880\n",
 | ||
|       "523    368.050\n",
 | ||
|       "524    369.375\n",
 | ||
|       "525    371.020\n",
 | ||
|       "Name: co2, dtype: float64\n"
 | ||
|      ]
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "''' compute predictions of testing dataset '''\n",
 | ||
|     "flaml_y_pred = automl.predict(X_test)\n",
 | ||
|     "print(f\"Predicted labels\\n{flaml_y_pred}\")\n",
 | ||
|     "print(f\"True labels\\n{y_test}\")"
 | ||
|    ]
 | ||
|   },
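  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "To eyeball the forecast against the held-out values, the predictions and true labels can be aligned in a single table. This is a convenience sketch; it assumes the predictions come back as a pandas Series indexed by the forecast timestamps, as in the output above, and that `y_test` covers the same 12-month horizon."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "''' align predictions and true labels for a quick side-by-side look (sketch) '''\n",
    "import pandas as pd\n",
    "\n",
    "comparison = pd.DataFrame(\n",
    "    {'y_true': y_test.values, 'y_pred': flaml_y_pred.values},\n",
    "    index=flaml_y_pred.index,  # monthly timestamps of the forecast horizon\n",
    ")\n",
    "comparison['abs_error'] = (comparison['y_true'] - comparison['y_pred']).abs()\n",
    "print(comparison)"
   ]
  },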
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 12,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "name": "stdout",
 | ||
|      "output_type": "stream",
 | ||
|      "text": [
 | ||
|       "mape = 0.0005710586398294955\n"
 | ||
|      ]
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "''' compute different metric values on testing dataset'''\n",
 | ||
|     "from flaml.ml import sklearn_metric_loss_score\n",
 | ||
|     "print('mape', '=', sklearn_metric_loss_score('mape', y_true=y_test, y_predict=flaml_y_pred))"
 | ||
|    ]
 | ||
|   },
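  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "`sklearn_metric_loss_score` accepts metric names other than `'mape'`. The cell below is a sketch that reports a couple of additional error measures on the same test split; it assumes `'rmse'` and `'mae'` are among the metric names supported by the installed FLAML version."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "''' report additional error metrics on the same test split (sketch) '''\n",
    "from flaml.ml import sklearn_metric_loss_score\n",
    "\n",
    "for metric in ('rmse', 'mae'):\n",
    "    # sklearn_metric_loss_score returns each metric as a loss (lower is better)\n",
    "    print(metric, '=', sklearn_metric_loss_score(metric, y_predict=flaml_y_pred, y_true=y_test))"
   ]
  },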
 | ||
|   {
 | ||
|    "cell_type": "markdown",
 | ||
|    "metadata": {},
 | ||
|    "source": [
 | ||
|     "### Log history"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 13,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "name": "stdout",
 | ||
|      "output_type": "stream",
 | ||
|      "text": [
 | ||
|       "{'Current Learner': 'lgbm', 'Current Sample': 502, 'Current Hyper-parameters': {'n_estimators': 4, 'num_leaves': 4, 'min_child_samples': 20, 'learning_rate': 0.09999999999999995, 'log_max_bin': 8, 'colsample_bytree': 1.0, 'reg_alpha': 0.0009765625, 'reg_lambda': 1.0, 'optimize_for_horizon': False, 'lags': 3}, 'Best Learner': 'lgbm', 'Best Hyper-parameters': {'n_estimators': 4, 'num_leaves': 4, 'min_child_samples': 20, 'learning_rate': 0.09999999999999995, 'log_max_bin': 8, 'colsample_bytree': 1.0, 'reg_alpha': 0.0009765625, 'reg_lambda': 1.0, 'optimize_for_horizon': False, 'lags': 3}}\n",
 | ||
|       "{'Current Learner': 'lgbm', 'Current Sample': 502, 'Current Hyper-parameters': {'n_estimators': 8, 'num_leaves': 4, 'min_child_samples': 19, 'learning_rate': 0.18686130359903158, 'log_max_bin': 9, 'colsample_bytree': 0.9311834484407709, 'reg_alpha': 0.0013872402855481538, 'reg_lambda': 0.43503398494225104, 'optimize_for_horizon': False, 'lags': 1}, 'Best Learner': 'lgbm', 'Best Hyper-parameters': {'n_estimators': 8, 'num_leaves': 4, 'min_child_samples': 19, 'learning_rate': 0.18686130359903158, 'log_max_bin': 9, 'colsample_bytree': 0.9311834484407709, 'reg_alpha': 0.0013872402855481538, 'reg_lambda': 0.43503398494225104, 'optimize_for_horizon': False, 'lags': 1}}\n",
 | ||
|       "{'Current Learner': 'lgbm', 'Current Sample': 502, 'Current Hyper-parameters': {'n_estimators': 9, 'num_leaves': 4, 'min_child_samples': 14, 'learning_rate': 0.23100120527451992, 'log_max_bin': 8, 'colsample_bytree': 1.0, 'reg_alpha': 0.0009765625, 'reg_lambda': 0.028424597762235913, 'optimize_for_horizon': False, 'lags': 1}, 'Best Learner': 'lgbm', 'Best Hyper-parameters': {'n_estimators': 9, 'num_leaves': 4, 'min_child_samples': 14, 'learning_rate': 0.23100120527451992, 'log_max_bin': 8, 'colsample_bytree': 1.0, 'reg_alpha': 0.0009765625, 'reg_lambda': 0.028424597762235913, 'optimize_for_horizon': False, 'lags': 1}}\n",
 | ||
|       "{'Current Learner': 'lgbm', 'Current Sample': 502, 'Current Hyper-parameters': {'n_estimators': 9, 'num_leaves': 9, 'min_child_samples': 9, 'learning_rate': 0.2917244979615619, 'log_max_bin': 7, 'colsample_bytree': 1.0, 'reg_alpha': 0.0009765625, 'reg_lambda': 0.006048554644106909, 'optimize_for_horizon': False, 'lags': 4}, 'Best Learner': 'lgbm', 'Best Hyper-parameters': {'n_estimators': 9, 'num_leaves': 9, 'min_child_samples': 9, 'learning_rate': 0.2917244979615619, 'log_max_bin': 7, 'colsample_bytree': 1.0, 'reg_alpha': 0.0009765625, 'reg_lambda': 0.006048554644106909, 'optimize_for_horizon': False, 'lags': 4}}\n",
 | ||
|       "{'Current Learner': 'lgbm', 'Current Sample': 502, 'Current Hyper-parameters': {'n_estimators': 4, 'num_leaves': 8, 'min_child_samples': 11, 'learning_rate': 0.8116893577982964, 'log_max_bin': 8, 'colsample_bytree': 0.97502360023323, 'reg_alpha': 0.0012398377555843262, 'reg_lambda': 0.02776044509327881, 'optimize_for_horizon': False, 'lags': 4}, 'Best Learner': 'lgbm', 'Best Hyper-parameters': {'n_estimators': 4, 'num_leaves': 8, 'min_child_samples': 11, 'learning_rate': 0.8116893577982964, 'log_max_bin': 8, 'colsample_bytree': 0.97502360023323, 'reg_alpha': 0.0012398377555843262, 'reg_lambda': 0.02776044509327881, 'optimize_for_horizon': False, 'lags': 4}}\n",
 | ||
|       "{'Current Learner': 'lgbm', 'Current Sample': 502, 'Current Hyper-parameters': {'n_estimators': 5, 'num_leaves': 16, 'min_child_samples': 7, 'learning_rate': 1.0, 'log_max_bin': 9, 'colsample_bytree': 0.9289697965752838, 'reg_alpha': 0.01291354098023607, 'reg_lambda': 0.012402833825431305, 'optimize_for_horizon': False, 'lags': 5}, 'Best Learner': 'lgbm', 'Best Hyper-parameters': {'n_estimators': 5, 'num_leaves': 16, 'min_child_samples': 7, 'learning_rate': 1.0, 'log_max_bin': 9, 'colsample_bytree': 0.9289697965752838, 'reg_alpha': 0.01291354098023607, 'reg_lambda': 0.012402833825431305, 'optimize_for_horizon': False, 'lags': 5}}\n",
 | ||
|       "{'Current Learner': 'lgbm', 'Current Sample': 502, 'Current Hyper-parameters': {'n_estimators': 10, 'num_leaves': 13, 'min_child_samples': 8, 'learning_rate': 1.0, 'log_max_bin': 9, 'colsample_bytree': 0.915047969012756, 'reg_alpha': 0.1456985407754094, 'reg_lambda': 0.010186415963233664, 'optimize_for_horizon': False, 'lags': 9}, 'Best Learner': 'lgbm', 'Best Hyper-parameters': {'n_estimators': 10, 'num_leaves': 13, 'min_child_samples': 8, 'learning_rate': 1.0, 'log_max_bin': 9, 'colsample_bytree': 0.915047969012756, 'reg_alpha': 0.1456985407754094, 'reg_lambda': 0.010186415963233664, 'optimize_for_horizon': False, 'lags': 9}}\n",
 | ||
|       "{'Current Learner': 'xgb', 'Current Sample': 502, 'Current Hyper-parameters': {'n_estimators': 17, 'max_depth': 6, 'min_child_weight': 1.1257301179325647, 'learning_rate': 0.3420575416463879, 'subsample': 1.0, 'colsample_bylevel': 0.8634518942394397, 'colsample_bytree': 0.8183410599521093, 'reg_alpha': 0.0031517221935712125, 'reg_lambda': 0.36563645650488746, 'optimize_for_horizon': False, 'lags': 1}, 'Best Learner': 'xgb', 'Best Hyper-parameters': {'n_estimators': 17, 'max_depth': 6, 'min_child_weight': 1.1257301179325647, 'learning_rate': 0.3420575416463879, 'subsample': 1.0, 'colsample_bylevel': 0.8634518942394397, 'colsample_bytree': 0.8183410599521093, 'reg_alpha': 0.0031517221935712125, 'reg_lambda': 0.36563645650488746, 'optimize_for_horizon': False, 'lags': 1}}\n",
 | ||
|       "{'Current Learner': 'prophet', 'Current Sample': 502, 'Current Hyper-parameters': {'changepoint_prior_scale': 0.05, 'seasonality_prior_scale': 10.0, 'holidays_prior_scale': 10.0, 'seasonality_mode': 'multiplicative'}, 'Best Learner': 'prophet', 'Best Hyper-parameters': {'changepoint_prior_scale': 0.05, 'seasonality_prior_scale': 10.0, 'holidays_prior_scale': 10.0, 'seasonality_mode': 'multiplicative'}}\n",
 | ||
|       "{'Current Learner': 'prophet', 'Current Sample': 502, 'Current Hyper-parameters': {'changepoint_prior_scale': 0.02574943279263944, 'seasonality_prior_scale': 10.0, 'holidays_prior_scale': 10.0, 'seasonality_mode': 'additive'}, 'Best Learner': 'prophet', 'Best Hyper-parameters': {'changepoint_prior_scale': 0.02574943279263944, 'seasonality_prior_scale': 10.0, 'holidays_prior_scale': 10.0, 'seasonality_mode': 'additive'}}\n",
 | ||
|       "{'Current Learner': 'prophet', 'Current Sample': 502, 'Current Hyper-parameters': {'changepoint_prior_scale': 0.029044518309983725, 'seasonality_prior_scale': 10.0, 'holidays_prior_scale': 8.831739687246309, 'seasonality_mode': 'additive'}, 'Best Learner': 'prophet', 'Best Hyper-parameters': {'changepoint_prior_scale': 0.029044518309983725, 'seasonality_prior_scale': 10.0, 'holidays_prior_scale': 8.831739687246309, 'seasonality_mode': 'additive'}}\n"
 | ||
|      ]
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "from flaml.data import get_output_from_log\n",
 | ||
|     "time_history, best_valid_loss_history, valid_loss_history, config_history, train_loss_history = \\\n",
 | ||
|     "    get_output_from_log(filename=settings['log_file_name'], time_budget=180)\n",
 | ||
|     "\n",
 | ||
|     "for config in config_history:\n",
 | ||
|     "    print(config)"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 14,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "data": {
 | ||
|       "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYgAAAEWCAYAAAB8LwAVAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjAsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8GearUAAAgAElEQVR4nO3de5xdVX338c+XMYGAwAQJFCZAUGMkAhKdBhFvoDaAKCFSBR4vjXJrhVJtY4HWW30osalWfKTmiZQqlpsgidEnEikoqQgkgxNyI2ljQJgJhaEYgjCSZPJ7/thr4OSwZ2YnzJ4zc873/XrNa85ee529f5sh53fWWnuvpYjAzMys2m61DsDMzIYnJwgzM8vlBGFmZrmcIMzMLJcThJmZ5XKCMDOzXE4QZrtA0tslrat1HGZlcoKwEUfSw5LeU8sYIuI/ImJSWceXNE3SEknPSOqSdJekD5R1PrM8ThBmOSQ11fDcZwA3A9cC44EDgc8D79+FY0mS/53bLvH/OFY3JO0m6RJJv5b0P5K+L2m/iv03S/pvSU+nb+dvqNj3HUnfkrRI0rPACaml8leSVqT33CRpj1T/XZI6Kt7fZ920/7OSHpO0UdI5kkLSa3OuQcDXgC9HxNUR8XREbI+IuyLi3FTni5L+reI9E9LxXpG2fy7pckl3A88Bl0lqqzrPpyUtTK93l/SPkh6R9LikuZLGvMw/h9UBJwirJ38OTAfeCRwM/Ba4qmL/T4CJwAHAr4Drqt5/NnA5sDfwi1T2IeAk4HDgaOBP+jl/bl1JJwGfAd4DvDbF15dJwCHALf3UKeKjwHlk1/J/gEmSJlbsPxu4Pr3+CvA64JgUXwtZi8UanBOE1ZPzgb+JiI6IeB74InBG7zfriLgmIp6p2PdGSftWvP+HEXF3+sb++1T2jYjYGBFPAT8i+xDtS191PwT8a0SsjojngC/1c4xXpd+PFb7qfN9J59sWEU8DPwTOAkiJ4vXAwtRiORf4dEQ8FRHPAH8PnPkyz291wAnC6slhwHxJmyRtAh4EeoADJTVJmp26nzYDD6f37F/x/kdzjvnfFa+fA17Zz/n7qntw1bHzztPrf9Lvg/qpU0T1Oa4nJQiy1sOClKzGAXsC91f8d7stlVuDc4KwevIocHJENFf87BERnWQfiqeRdfPsC0xI71HF+8ua2vgxssHmXof0U3cd2XV8sJ86z5J9qPf6g5w61dfyU2B/SceQJYre7qUngW7gDRX/zfaNiP4SoTUIJwgbqUZJ2qPi5xXAXOBySYcBSBon6bRUf2/gebJv6HuSdaMMle8DMyUdIWlP+unfj2z+/c8An5M0U9I+afD9bZLmpWrLgXdIOjR1kV06UAARsY1sXGMOsB9weyrfDnwb+CdJBwBIapE0bZev1uqGE4SNVIvIvvn2/nwRuBJYCPxU0jPAvcCxqf61wG+ATmBN2jckIuInwDeAnwHrgXvSruf7qH8L8GHgE8BG4HHgf5ONIxARtwM3ASuA+4EfFwzlerIW1M0pYfT66xTXvan77d/JBsutwckLBpkNLUlHAKuA3as+qM2GFbcgzIaApNMljZY0luy20h85Odhw5wRhNjTOB7qAX5PdWfWntQ3HbGDuYjIzs1xuQZiZWa5X1DqAwbT//vvHhAkTah2GmdmIcf/99z8ZEbkPRtZVgpgwYQJtbW0DVzQzMwAk/aavfe5iMjOzXE4QZmaWywnCzMxyOUGYmVmu0hKEpGskPSFpVR/7JekbktanVbjeVLHvJEnr0r5LyorRzMz6VuZdTN8Bvkk2SVqek8lW95pINqHat4Bj01rAVwHvBTqAZZIWRsSaEmM1qzsL2juZs3gdGzd1c3DzGGZNm8T0KS21DssGUdl/49ISREQskTShnyqnAdem6Y3vldQs6SCyefrXR8QGAEk3prpOEDZo6v3Dc0F7J5feupLurT0AdG7q5tJbVwLU1XU2sqH4G9fyOYgWdlz1qiOV5ZUfSx8knUe29i6HHnro4EdpdacRPjznLF73wvX16t7aw2dvWcENSx+pUVQ2mNof2cSWnu07lHVv7WHO4nV1kSCUUxb9lOeKiHnAPIDW1lZPLGUDaoQPz85N3bnl1R8oNnL19bfc2MffflfUMkF0sOPSi+PJFkcZ3Ue52aDo6x9QPX14jm7aLfd6WprHcNP5x9UgIhtsx8++M/eLwMHNYwbtHLVMEAuBC9MYw7HA0xHxmKQuYKKkw8lW/zqTbD3hmqv3futGcXDzmNx/WPX04VndjQYwZlQTs6Z5obh6MWvapNL/xqUlCEk3AO8iWyi9A/gCMAogIuaSLRl5CtlSh88BM9O+bZIuBBYDTcA1EbG6rDgr9ZcAGqHfulEMxT+sWuv9f9JfaOrXUPyN62o9iNbW1tjVyfr+dsFKrrv3kR0GO8aMauKKGUcxfUpLn8250U27MeXQ5l2M2Grlyd89z4auZwmyloM/PK1RSbo/Ilrz9tXVbK67akF750uSA+w4cOlBv/qy/yt3Z/9X7s5px7Rw9rG++80sjxMEWROtr3ZUbwLwoJ+ZNRrPxUTftwTCiwngH844mjGjmnbYV2/91mZmldyCAJokevoYi+lNAB70M7NG4wQBfSYH2PEOpelTWpwQzKxhuIuJrBtpZ8rNzBqBEwRZN5LHF8zMduQuJl7sRvrsLSvY0rPd98WbmeEWBPDiE9RberYzumk3JwczM9yCeMkUGlt6tnsKDTMz3ILoc+rnOYvX1SgiM7PhoeETRF9TPw/mnOpmZiNRwyeIvuZOH8w51c3MRqKGTxC+xdXMLF/DD1L7Flczs3wNnyAgSxK9axF7ZlYzs0zDdzGZmVk+JwgzM8vlBGFmZrmcIMzMLJcThJmZ5XKCMDOzXE4QZmaWq9QEIekkSeskrZd0Sc7+sZLmS1ohaamkIyv2XSxplaTVkv6izDjNzOylSksQkpqAq4CTgcnAWZImV1W7DFgeEUcDHwOuTO89EjgXmAq8EThV0sSyYjUzs5cqswUxFVgfERsiYgtwI3BaVZ3JwB0AEbEWmCDpQOAI4N6IeC4itgF3AaeXGKuZmVUpM0G0AI9WbHekskoPADMAJE0FDgPGA6uAd0h6laQ9gVOAQ/JOIuk8SW2S2rq6ugb5EszMGleZCUI5ZVG1PRsYK2k5cBHQDmyLiAeBrwC3A7eRJZJteSeJiHkR0RoRrePGjRu04M3MGl2Zk/V1sOO3/vHAxsoKEbEZmAkgScBD6YeI+BfgX9K+v0/HMzOzIVJmC2IZMFHS4ZJGA2cCCysrSGpO+wDOAZakpIGkA9LvQ8m6oW4oMVYzM6tSWgsiIrZJuhBYDDQB10TEakkXpP1zyQajr5XUA6wBPllxiB9IehWwFfhURPy2rFjNzOylSl0PIiIWAYuqyuZWvL4HyL19NSLeXmZsZmbWPz9JbWZmuZwgzMwslxOEmZnlcoIwM7NcThBmZpbLCcLMzHI5QZiZWS4nCDMzy+UEYWZmuZwgzMwslxOEmZnlcoIwM7NcT
hBmZpbLCcLMzHINmCAk7TcUgZiZ2fBSpAVxn6SbJZ2SlgWtOwvaO2l/ZBP3PfQUx8++kwXtnbUOycys5ookiNcB84CPAusl/b2k15Ub1tBZ0N7JpbeuZEvPdgA6N3Vz6a0rnSTMrOENmCAic3tEnEW2bvTHgaWS7pJ0XOkRlmzO4nV0b+3Zoax7aw9zFq+rUURmZsPDgEuOpnWhP0LWgngcuAhYCBwD3AwcXmaAZdu4qXunys3MGkWRNanvAb4HTI+IjoryNklz+3jPiHFw8xg6c5LBwc1jahCNmdnwUWQMYlJEfLkqOQAQEV8pIaYhNWvaJMaMatqhbMyoJmZNm1SjiMzMhociCeKnkpp7NySNlbS4xJiG1PQpLVwx4yhGN2X/KVqax3DFjKOYPqWlxpGZmdVWkS6mcRGxqXcjIn4r6YASYxpy06e0cMPSRwC46fwRP+5uZjYoirQgeiQd2rsh6TAgihxc0kmS1klaL+mSnP1jJc2XtELSUklHVuz7tKTVklZJukHSHkXOaWZmg6NIgvgb4BeSvifpe8AS4NKB3iSpCbgKOBmYDJwlaXJVtcuA5RFxNPAx4Mr03hbgz4HWiDgSaALOLHZJZmY2GAbsYoqI2yS9CXgLIODTEfFkgWNPBdZHxAYASTcCpwFrKupMBq5I51kraYKkAytiGyNpK7AnsLHgNZmZ2SAoOllfD/AE8DQwWdI7CrynBXi0YrsjlVV6AJgBIGkqcBgwPiI6gX8EHgEeA56OiJ/mnUTSeZLaJLV1dXUVvBwzMxtIkcn6ziHrVloMfCn9/mKBY+fN21Q9djEbGCtpOdkDeO3ANkljyVobhwMHA3tJ+kjeSSJiXkS0RkTruHHjCoRlZmZFFGlBXAz8IfCbiDgBmAIU+areARxSsT2eqm6iiNgcETMj4hiyMYhxwEPAe4CHIqIrIrYCtwJvLXBOMzMbJEUSxO8j4vcAknaPiLVAkafIlgETJR0uaTTZIPPCygqSmtM+yOZ5WhIRm8m6lt4iac80g+y7gQeLXZKZmQ2GIs9BdKQH5RYAt0v6LQUGjCNim6QLybqkmoBrImK1pAvS/rnAEcC1knrIBq8/mfbdJ+kW4FfANrKup3k7fXVmZrbLitzFdHp6+UVJPwP2BW4rcvCIWAQsqiqbW/H6HmBiH+/9AvCFIucxM7PB12+CkLQbsCI9i0BE3DUkUZmZWc31OwYREduBByqfpDYzs8ZQZAziIGC1pKXAs72FEfGB0qIyM7OaK5IgvlR6FGZmNuwUGaT2uIOZWQMqsuToM7z4BPRoYBTwbETsU2ZgZmZWW0VaEHtXbkuaTjYRn5mZ1bGik/W9ICIWACeWEIuZmQ0jRbqYZlRs7ga0UnDBIDMzG7mK3MX0/orX24CHyWZaNTOzOlZkDGLmUARiZmbDS5H1IL6bJuvr3R4r6ZpywzIzs1orMkh9dERs6t2IiN+SrQlhZmZ1rEiC2C2t8AaApP0oNnZhZmYjWJEP+q8Cv0zrMwTwIeDyUqMyM7OaKzJIfa2kNrJnHwTMiIg1pUdmZmY1VeQ5iLcAqyPim2l7b0nHRsR9pUc3BBa0dzJn8To6N3Uzumk3FrR3Mn1KS63DMjOruSJjEN8Cflex/WwqG/EWtHdy6a0r6dzUDcCWnu1ceutKFrR31jgyM7PaK5IgFBEvPDmdFhGqi0HqOYvX0b21Z4ey7q09zFm8rkYRmZkNH0USxAZJfy5pVPq5GNhQdmBDYWNqORQtNzNrJEUSxAXAW4FOoAM4Fji3zKCGysHNY3aq3MyskQyYICLiiYg4MyIOiIgDgU8C7yo9siEwa9okxoxq2qFszKgmZk2bVKOIzMyGj0LTfUtqknSypGuBh4APlxvW0Jg+pYUrZhzF6KbsP0NL8xiumHGU72IyM2OAwWZJ7wDOBt4HLAWOB14dEc8VObikk4ArgSbg6oiYXbV/LHAN8Brg98AnImKVpEnATRVVXw18PiK+XuiqdsL0KS3csPQRAG46/7jBPryZ2YjVZ4KQ1AE8QnZL66yIeEbSQzuRHJqAq4D3ko1dLJO0sOohu8uA5RFxuqTXp/rvjoh1wDEVx+kE5u/85ZmZ2a7qr4vpB0ALWXfS+yXtxc4tFDQVWB8RGyJiC3AjL11HYjJwB0BErAUmSDqwqs67gV9HxG924txmZvYy9ZkgIuJiYALwNeAE4D+BcZI+JOmVBY7dAjxasd2Ryio9AMwAkDQVOAwYX1XnTOCGvk4i6TxJbZLaurq6CoRlZmZF9DtIHZk7I+JcsmRxNjCdbFW5gSjvkFXbs4GxkpYDFwHtZKvWZQeQRgMfAG7uJ8Z5EdEaEa3jxo0rEJaZmRVR+InoiNgK/Aj4kaQiDwp0AIdUbI8HNlYdczMwE0CSyO6QeqiiysnAryLi8aJxmpnZ4Ch0m2u1iCjyqPEyYKKkw1NL4ExgYWUFSc1pH8A5wJKUNHqdRT/dS2ZmVp7S5lSKiG2SLgQWk93mek1ErJZ0Qdo/FzgCuFZSD7CG7CE8ACTtSXYH1PllxWhmZn0rddK9iFgELKoqm1vx+h5gYh/vfQ54VZnxmZlZ34qsB/E6YBbZHUYv1I+IE0uMy8zMaqxIC+JmYC7wbaBngLpmZlYniiSIbRFRFwsEmZlZcUXuYvqRpD+TdJCk/Xp/So/MzMxqqkgL4uPp96yKsiCbQM/MzOrUgAkiIg4fikDMzGx4KXIX0yjgT4F3pKKfA/83PVltZmZ1qkgX07eAUcA/p+2PprJzygrKzMxqr0iC+MOIeGPF9p2SHigrIDMzGx6K3MXUI+k1vRuSXo2fhzAzq3tFWhCzgJ9J2kA2hfdhpBlYzcysfhW5i+kOSROBSWQJYm1EPF96ZGZmVlP9rUl9YkTcKWlG1a7XSCIibi05NjMzq6H+WhDvBO4E3p+zLwAnCDOzOtZngoiIL6SXfxcRlau8IckPz5mZ1bkidzH9IKfslsEOxMzMhpf+xiBeD7wB2LdqHGIfYI+yAzMzs9rqbwxiEnAq0MyO4xDPAOeWGZSZmdVef2MQPwR+KOm4tDSomZk1kCIPyrVL+hRZd9MLXUsR8YnSojIzs5orMkj9PeAPgGnAXcB4sm4mMzOrY0USxGsj4nPAsxHxXeB9wFHlhmVmZrVWJEH0rvuwSdKRwL7AhNIiMjOzYaFIgpgnaSzwOWAhsAb4hyIHl3SSpHWS1ku6JGf/WEnzJa2QtDQloN59zZJukbRW0oOSjit4TWZmNgiKTNZ3dXp5FzuxDrWkJuAq4L1AB7BM0sKIWFNR7TJgeUScnp67uAp4d9p3JXBbRJwhaTSwZ9Fzm5nZy9ffg3Kf6e+NEfG1AY49FVgfERvS8W4ETiNrgfSaDFyRjrdW0gRJBwLdZEuc/knatwXYMsD5zMxsEPXXxbR3+mklW5O6Jf1cQPbBPpAW4NGK7Y5UVukBYAaApKlka02MJ2updAH/Kqld0tWS9so7iaTzJLVJauvq6ioQlpmZFdFngoiIL0XEl4D9gTdFxF9GxF8Cbyb7EB+I8g5btT0bGCtpOXAR0A5sI2vZvAn4
VkRMAZ4FXjKGkeKcFxGtEdE6bty4AmGZmVkRRR6UO5Qdu3e2UOwupg7gkIrt8cDGygoRsZm0Op0kAQ+lnz2Bjoi4L1W9hT4ShJmZlaNIgvgesFTSfLIWwOnAtQXetwyYmKYG7wTOBM6urCCpGXgujTGcAyxJSWOzpEclTYqIdWQD12swM7MhU+Qupssl/QR4eyqaGRHtBd63TdKFwGKgCbgmIlZLuiDtnwscAVwrqYcsAXyy4hAXAdelO5g24HWwzcyGVH93Me0TEZsl7Qc8nH569+0XEU8NdPCIWAQsqiqbW/H6HmBiH+9dTjZAbmZmNdBfC+J6sum+72fHwWWl7cLPRJiZ2cjT33Tfp6bfXl7UzKwB9dfF9Kb+3hgRvxr8cMzMbLjor4vpq/3sC+DEQY7FzMyGkf66mE4YykDMzGx4KfIcBGmW1cnsuKJckWchzMxshBowQUj6AvAusgSxCDgZ+AXFHpYzM7MRqsh6EGeQPcn83xExE3gjsHupUZmZWc0VSRDdEbEd2CZpH+AJ/AyEmVndKzIG0ZbmTPo22UNzvwOWlhqVmZnVXH/PQXwTuD4i/iwVzZV0G7BPRKwYkujMzKxm+mtB/BfwVUkHATcBN6T5kczMrAH0t2DQlRFxHPBO4Cmy1d0elPR5Sa8bsgjNzKwmBhykjojfRMRX0spuZ5OtB/Fg6ZGZmVlNDZggJI2S9H5J1wE/Af4T+GDpkZmZWU31N0j9XuAs4H1kdy3dCJwXEc8OUWxmZlZD/Q1SX0a2JsRfFVkcyMzM6osn6zMzs1xFnqQ2M7MG5ARhZma5nCDMzCyXE4SZmeVygjAzs1ylJghJJ0laJ2m9pEty9o+VNF/SCklL08p1vfselrRS0nJJbWXGaWZmL1VoydFdIakJuAp4L9ABLJO0MCLWVFS7DFgeEadLen2q/+6K/SdExJNlxWhmZn0rswUxFVgfERsiYgvZk9inVdWZDNwBEBFrgQmSDiwxJjMzK6jMBNECPFqx3ZHKKj0AzACQNBU4DBif9gXwU0n3Szqvr5NIOk9Sm6S2rq6uQQvezKzRlZkglFMWVduzgbGSlgMXAe3AtrTv+Ih4E3Ay8ClJ78g7SUTMi4jWiGgdN27cIIVuZmaljUGQtRgOqdgeD2ysrBARm4GZAJIEPJR+iIiN6fcTkuaTdVktKTFeMzOrUGYLYhkwUdLhkkYDZwILKytIak77AM4BlkTEZkl7Sdo71dkL+CNgVYmxmplZldJaEBGxTdKFwGKgCbgmIlZLuiDtnwscAVwrqQdYA3wyvf1AYH7WqOAVZGtj31ZWrGZm9lJldjEREYuARVVlcyte3wNMzHnfBuCNZcZmZmb985PUZmaWywnCzMxyOUGYmVkuJwgzM8vlBGFmZrmcIMzMLJcThJmZ5XKCMDOzXE4QZmaWywnCzMxyOUGYmVkuJwgzM8vlBGFmZrmcIMzMLJcThJmZ5XKCMDOzXE4QZmaWywnCzMxyOUGYmVkuJwgzM8vlBGFmZrmcIMzMLJcThJmZ5So1QUg6SdI6SeslXZKzf6yk+ZJWSFoq6ciq/U2S2iX9uMw4zczspUpLEJKagKuAk4HJwFmSJldVuwxYHhFHAx8DrqzafzHwYFkxmplZ38psQUwF1kfEhojYAtwInFZVZzJwB0BErAUmSDoQQNJ44H3A1SXGaGZmfSgzQbQAj1Zsd6SySg8AMwAkTQUOA8anfV8HPgts7+8kks6T1CapraurazDiNjMzyk0QyimLqu3ZwFhJy4GLgHZgm6RTgSci4v6BThIR8yKiNSJax40b97KDNjOzzCtKPHYHcEjF9nhgY2WFiNgMzASQJOCh9HMm8AFJpwB7APtI+reI+EiJ8ZqZWYUyWxDLgImSDpc0muxDf2FlBUnNaR/AOcCSiNgcEZdGxPiImJDed6eTg5nZ0CotQUTENuBCYDHZnUjfj4jVki6QdEGqdgSwWtJasrudLi4rnr4saO+k/ZFN3PfQUxw/+04WtHcOdQhmZsOSIqqHBUau1tbWaGtrK1x/QXsnl966ku6tPS+UjRnVxBUzjmL6lOrxdDOz+iPp/ohozdvX0E9Sz1m8bofkANC9tYc5i9fVKCIzs+GjoRPExk3dO1VuZtZIGjpBHNw8ZqfKzcwaSUMniFnTJjFmVNMOZWNGNTFr2qQaRWRmNnyU+RzEsNc7ED1n8To2burm4OYxzJo2yQPUZmY0eIKALEk4IZiZvVRDdzGZmVnfnCDMzCyXE4SZmeVygjAzs1xOEGZmlquu5mKS1AX8pmD1/YEnSwyn1nx9I5uvb2QbSdd3WETkLqZTVwliZ0hq62uCqnrg6xvZfH0jW71cn7uYzMwslxOEmZnlauQEMa/WAZTM1zey+fpGtrq4voYdgzAzs/41cgvCzMz64QRhZma5Gi5BSDpJ0jpJ6yVdUut4BoOkayQ9IWlVRdl+km6X9F/p99haxrirJB0i6WeSHpS0WtLFqbwurg9A0h6Slkp6IF3jl1J5PV1jk6R2ST9O23VzbQCSHpa0UtJySW2pbMRfY0MlCElNwFXAycBk4CxJk2sb1aD4DnBSVdklwB0RMRG4I22PRNuAv4yII4C3AJ9Kf7N6uT6A54ETI+KNwDHASZLeQn1d48XAgxXb9XRtvU6IiGMqnn8Y8dfYUAkCmAqsj4gNEbEFuBE4rcYxvWwRsQR4qqr4NOC76fV3gelDGtQgiYjHIuJX6fUzZB8yLdTJ9QFE5ndpc1T6CerkGiWNB94HXF1RXBfXNoARf42NliBagEcrtjtSWT06MCIeg+xDFjigxvG8bJImAFOA+6iz60tdMMuBJ4DbI6KervHrwGeB7RVl9XJtvQL4qaT7JZ2Xykb8NTbainLKKfN9viOApFcCPwD+IiI2S3l/ypErInqAYyQ1A/MlHVnrmAaDpFOBJyLifknvqnU8JTo+IjZKOgC4XdLaWgc0GBqtBdEBHFKxPR7YWKNYyva4pIMA0u8nahzPLpM0iiw5XBcRt6biurm+ShGxCfg52ZhSPVzj8cAHJD1M1qV7oqR/oz6u7QURsTH9fgKYT9adPeKvsdESxDJgoqTDJY0GzgQW1jimsiwEPp5efxz4YQ1j2WXKmgr/AjwYEV+r2FUX1wcgaVxqOSBpDPAeYC11cI0RcWlEjI+ICWT/3u6MiI9QB9fWS9JekvbufQ38EbCKOrjGhnuSWtIpZH2iTcA1EXF5jUN62STdALyLbIrhx4EvAAuA7wOHAo8AfxwR1QPZw56ktwH/AazkxT7sy8jGIUb89QFIOppsELOJ7Evb9yPi7yS9ijq5RoDUxfRXEXFqPV2bpFeTtRog67a/PiIur4drbLgEYWZmxTRaF5OZmRXkBGFmZrmcIMzMLJcThJmZ5XKCMDOzXE4QNmJI+idJf1GxvVjS1RXbX5X0mX7e/x1JZ6TXP5f0kkXlJY2SNDvNwLkqzbJ6ctr3sKT9dyHuF87bx/6r0iygayR1p9fLJZ0haVHvMxKDSdJBvTOr9rF/tKQlkhpttgWr4ARhI8kvgbcCSNqN7LmPN1Tsfyt
w98s8x5eBg4AjI+JI4P3A3i/zmP2KiE9FxDHAKcCv04ygx0TELRFxSnq6erB9Bvh2PzFtIZuB9MMlnNtGCCcIG0nuJiUIssSwCnhG0lhJuwNHAO2SPi9pWWoBzFPBiZsk7QmcC1wUEc8DRMTjEfH9nLqfScdfVdWq+ZikFWlth+/lvO/LqUVR6N9eb6tF0gRJayVdnc55naT3SLo7tXampvp7KVsfZJmy9Rf6mq34g8Bt6T1vSC2l5Sn2ianOAuB/FYnT6pObjzZipMnQtkk6lCxR3EM2G+9xwNPAiojYIumbEfF3AOlD+lTgRwVO8VrgkYjY3F8lSW8GZgLHkk0AeZ+ku4AtwN+QTdz2pKT9qt73D8C+wMzYtSdUXwv8MXAe2bQxZwNvAz5A9nT59HT+OyPiE6lraqmkf4+IZyviOBz4bW8SBC4AroyI69IUNE2pfBXwh7sQp9UJtyBspOltRfQmiBJxjigAAAIXSURBVHsqtn+Z6pwg6T5JK4ET2bEbajC8DZgfEc+mdRxuBd6eznVLRDwJUDWtwueA5og4fxeTA8BDEbEyIrYDq8kWowmyaUgmpDp/BFyibOrwnwN7kE31UOkgoKti+x7gMkl/DRwWEd0p/h5gS+88Q9Z4nCBspOkdhziK7BvuvWQtiLcCd0vaA/hn4IyIOIqsn32PgsdeDxxa4AOxry4r0ff08cuAN1e3KnbS8xWvt1dsb+fF3gABH6wYxzg0IipXcgPopuK/SURcT9YK6QYWSzqxou7uwO9fRsw2gjlB2EhzN1mX0VMR0ZO+pTeTJYl7ePGD70lla0j0efdQtYh4jmzm2G+krpbeu30+UlV1CTBd0p5p9s7TySYUvAP4UJqkjapkcBswG/h/JX8jXwxc1DvuImlKTp3/5MUWR+9kcxsi4htkM5AencpfBXRFxNYS47VhzAnCRpqVZHcv3VtV9nREPJnu+Pl2KltA9s19Z/wtWffLGkmr0jEqu2NIS6B+B1hKNqvs1RHRHhGrgcuBuyQ9AHyt6n03p9gWpmm9y/BlsiVLV6T4v1xdIY1H/FrSa1PRh4FVqVvq9cC1qfwEYFFJcdoI4NlczRqQpNOBN0fE3/ZT51bg0ohYN3SR2XDiu5jMGlBEzO/tCsuTutgWODk0NrcgzMwsl8cgzMwslxOEmZnlcoIwM7NcThBmZpbLCcLMzHL9f2YCmplXeUu2AAAAAElFTkSuQmCC",
 | ||
|       "text/plain": [
 | ||
|        "<Figure size 432x288 with 1 Axes>"
 | ||
|       ]
 | ||
|      },
 | ||
|      "metadata": {
 | ||
|       "needs_background": "light"
 | ||
|      },
 | ||
|      "output_type": "display_data"
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "import matplotlib.pyplot as plt\n",
 | ||
|     "import numpy as np\n",
 | ||
|     "\n",
 | ||
|     "plt.title('Learning Curve')\n",
 | ||
|     "plt.xlabel('Wall Clock Time (s)')\n",
 | ||
|     "plt.ylabel('Validation Accuracy')\n",
 | ||
|     "plt.scatter(time_history, 1 - np.array(valid_loss_history))\n",
 | ||
|     "plt.step(time_history, 1 - np.array(best_valid_loss_history), where='post')\n",
 | ||
|     "plt.show()"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "markdown",
 | ||
|    "metadata": {},
 | ||
|    "source": [
 | ||
|     "### Visualize"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 15,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "data": {
 | ||
|       "text/plain": [
 | ||
|        "<matplotlib.legend.Legend at 0x1e2d8e31fa0>"
 | ||
|       ]
 | ||
|      },
 | ||
|      "execution_count": 15,
 | ||
|      "metadata": {},
 | ||
|      "output_type": "execute_result"
 | ||
|     },
 | ||
|     {
 | ||
|      "data": {
 | ||
|       "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYUAAAEHCAYAAABBW1qbAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjAsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8GearUAAAgAElEQVR4nOzdd1hUR9vA4d8sbSkWUFAEpdgrWLCjsWtiiV0TY9eYqDG9vmm+6V9iisYYo1FfE9HYE0sSe2+oiA0VFQWxgAoISN35/jgrQlRE43KAnfu69go7c/bsQ9R99pyZeUZIKVEURVEUAIPeASiKoihFh0oKiqIoSg6VFBRFUZQcKikoiqIoOVRSUBRFUXKopKAoiqLksLXUiYUQRmAr4GB+nyVSyveFEIuAmubDygIJUsrAXK+rAhwDPpBSfpnfe5QvX176+vpaInxFUZQSa//+/fFSSve79VksKQDpQHspZbIQwg7YLoRYK6UceOsAIcRXQOI/Xvc1sLYgb+Dr60toaOgjC1hRFMUaCCHO3avPYklBaqviks1P7cyPnJVyQggBDADa52p7EjgDpFgqLkVRFOXeLDqmIISwEUKEAVeAdVLKPbm6g4HLUspT5mOdgTeADy0Zk6IoinJvFk0KUsps83iBN9BUCFEvV/dgICTX8w+Br6WUyeRDCDFWCBEqhAiNi4t79EEriqJYMUuOKeSQUiYIITYDXYEjQghboA/QONdhzYB+Qogv0AagTUKINCnltH+cayYwE6BJkyaqcJOiWFhmZiYxMTGkpaXpHYrygIxGI97e3tjZ2RX4NZacfeQOZJoTgiPQEfjc3N0RiJBSxtw6XkoZnOu1HwDJ/0wIiqIUvpiYGEqVKoWvry/aUKBSHEgpuXr1KjExMfj5+RX4dZa8feQJbBJChAP70MYUVpn7BpH31pGiKEVUWloa5cqVUwmhmBFCUK5cuQe+wrPk7KNwoOE9+obf57UfWCAkRVEekkoIxdPD/LmpFc1KsbEx4jJh0Ql6h6EoJZpKCkqxsOnEFUbNC6X/jJ2sOHhB73AUHSxfvhwhBBEREfc99ptvviE1NfWh32vu3LlMmDChwO3/hiXO+W+opKAUeVHxKUwKOUitiqVp7OPKi4vC+H5TJGrXQOsSEhJC69atWbhw4X2P/bdJwZqppKAUaSnpWYydH4rBIJj5TGPmjWxKr8BK/N9fJ3h7+WGysk16h6gUguTkZHbs2MHs2bPzJIXs7GxeffVV6tevT4MGDZg6dSrfffcdsbGxtGvXjnbt2gHg4uKS85olS5YwfPhwAP744w+aNWtGw4YN6dixI5cvXy5wTHFxcfTt25egoCCCgoLYsWMHJpMJX19fEhJu3+asVq0aly9fvuvxRVGhrFNQlIchpeS1JYeIvJLM/0Y2o7KbEwDfDAyksqsT0zZFcjExjWlPNcLFQf1VLgwf/nGUY7FJj/ScdSqV5v0edfM9ZsWKFXTt2pUaNWrg5ubGgQMHaNSoETNnzuTs2bMcPHgQW1tbrl27hpubG1OmTGHTpk2UL18+3/O2bt2a3bt3I4Rg1qxZfPHFF3z11VcFinvSpEm89NJLtG7dmvPnz9OlSxeOHz9Or169WL58OSNGjGDPnj34+vpSoUIFnnrqqbseX9Sof0lKkTVjyxnWHL7E24/XonX18pCZBjZ2CIMNr3apSaWyjry78ggDf9zFz8ODqFDaqHfIioWEhITw4osvAjBo0CBCQkJo1KgR69evZ9y4cdjaah9lbm5uD3TemJgYBg4cyMWLF8nIyHig+fzr16/n2LFjOc+TkpK4ceMGAwcOZPLkyYwYMYKFCxcycODAfI8valRSUIqkLSfj+OKvCHoEVGJMsD9Eroflz4GdIzz2JtQfwFPNquBZ1sj4Xw/QZ/pO5owIokaFUnqHXqLd7xu9JVy9epWNGzdy5MgRhBBkZ2cjhOCLL75ASlmgaZe5j8k9b3/ixIm8/PLL9OzZk82bN/PBBx8UOC6TycSuXbtwdHTM096iRQsiIyOJi4tjxYoV/Oc//8n3+KJGjSkoRc65qylMXHCAmhVK8fmTNRHr3oVf+oJTOXB0hRXPwfTmcHgJ7aqX57dnW5CRbaLvDzvZeTpe7/CVR2zJkiUMHTqUc+fOERUVRXR0NH5+fmzfvp3OnTszY8YMsrKyALh27RoApUqVyvMtvEKFChw/fhyTycTy5ctz2hMTE/Hy8gJg3rx5DxRX586dmTbtdtGFsLAwQEtAvXv35uWXX6Z27dqUK1cu3+OLGpUUlCIlNSOLZ+fvRwjBzz3L4zT/Cdg5FZqMhLGbYOxmGPgr2NjB0lEwozX1krax/LkWVCxtZNjPe1l+MOZ+b6MUIyEhIfTu3TtPW9++fVmwYAGjR4+mSpUqNGjQgICAABYsWADA2LFj6datW85A82effUb37t1p3749np6eOef54IMP6N+/P8HBwfcdf/in7777jtDQUBo0aECdOnWYMWNGTt/AgQP55Zdfcm4d3e/4okQU52l9TZo0kWqTnZJDSsmEkIOsPXyRNe0uUSv0fTAYoOdUqNMr78EmExxdBps/hauRUKkhKa3eZNT20uw+e51XO9dgfLtqaiXuI3D8+HFq166tdxjKQ7rbn58QYr+UssndjldXCkqR8ePWM2wKP8sanxBq7XwZKtSFcdvvTAigJYv6/eD5PfDkD5B6FefFA1lg8z6vVr/Ml3+fVFNWFeUhqKSgFAlbT8ax6q8/2VTqfWpeWgVtXofhq6FslfxfaGMLgU/BhP3Q/WsMiTFMiH6JLR5TOLFvA6PmhZKcnlU4v4SilAAqKSi6Ox+fwq4FH7HM/j3cHTIRw/6A9u9oH/gFZWuvjTu8cBC6foZP9nmWOXzA8KjXeHvafC4nqb0AFKUgVFJQdJV6/RIXZ/TkDeaS5dcOw3M7wS/4/i+8FzsjNH8OJoVBxw9pbYziuxsvcvybnkQd2/foAleUEkolBUU38swWMqe1JDAzjFON38Np6GJwLvdoTm7vDK1fxO7lw1xu/DJNTIep8lsn4uYOgfjIR/MeilICqaSgFL7sLNjwX/hfL+Iz7VnR5H9U7/EKWGKmkLE0FXq8T+KzoSy074fT2b+R04JgxfNwPerRv5+iFHMqKSiFK+E8zH0ctn3J4uy2TK0+iwHdH7f423p5evHESz/wUsV5zMrqSlb4EuTUxrDqJUhUpbiLOhsbGwIDA3MeUVFRbN68me7du9/zNQEBAQwePDhP2/Dhw3FycsqzsG3SpEkIIYiP1xY+5i6edzfp6el07NiRwMBAFi1a9C9+q0fnk08+eWTnUklBKTzHVsKM1pguHeVNMYnZbq/w8YDmhbaWoIyjHVPHdOZovddpmTqFXWV7Ig/Mh+8awto3IflKocShPDhHR0fCwsJyHr6+vvkef2v18tatW0lJScnTV61aNVauXAlopSc2bdqUs6q5IA4ePEhmZiZhYWF5FqflJzs7u8DnfxgqKSjFS0Yq/DEJfhuKya0qoxy/YY1sx
Y/PNMa5kKubOtja8PXAQAa0C+Kp2H68UnEOmfX6w96Z8G0ArHsfUq8VakzKo7dgwQKeeeYZOnfuzO+//56nb/DgwTnf8Ddv3kyrVq1yCurdz5UrVxgyZAhhYWEEBgZy+vRpNmzYQMOGDalfvz4jR44kPT0dAF9fXyZPnkzr1q1ZvHgxf//9Ny1atKBRo0b079+f5ORkAPbt20fLli0JCAigadOm3Lhxg6ioKIKDg2nUqBGNGjVi586dAFy8eJE2bdoQGBhIvXr12LZtG2+++SY3b94kMDCQp59++l//v1MF8RTLunwMloyEuOPIlpN4Jb47m8/GMWd4Q3zLO+sSkhCCV7vUxMvVkf+sOELEzUH8b/h4yod+DTu+hX2zocV4aPE8GMvoEmORtfZNuHT40Z6zYn3o9lm+h9z60APw8/PLU7/obhYtWsS6des4ceIE06ZNy3MbqXr16qxcuZLr168TEhLCkCFDWLt2bYFC9fDwYNasWXz55ZesWrWKtLQ0HnvsMTZs2ECNGjUYOnQoP/zwQ05FV6PRyPbt24mPj6dPnz6sX78eZ2dnPv/8c6ZMmcKbb77JwIEDWbRoEUFBQSQlJeHo6IiHhwfr1q3DaDRy6tQpBg8eTGhoKAsWLKBLly688847ZGdnk5qaSnBwMNOmTXtktZRUUlAsQ0oI/Rn+ehscSsOQZcyK9WN5+HFe61KTx2p66B0hg5tWoWIZIxN+PUDPBRnMGTGFmsEva6UztnwGe2ZAh/cgaJTeoVq9W7ePCmLfvn24u7vj4+ODt7c3I0eO5Pr167i6uuYc06dPHxYuXMiePXv48ccfHzquEydO4OfnR40aNQAYNmwY33//fU5SuHV7affu3Rw7doxWrVoBkJGRQYsWLThx4gSenp4EBQUBULp0aQBSUlKYMGECYWFh2NjYcPLkSQCCgoIYOXIkmZmZPPnkkzmJ8lFSSUF59FKvwR8vwPE/oGoH6D2D7RcNfLp2D93qVeT5x6rqHWGOdjU9WPRsC0bO3Ue/GTv5cUhjWg74H1w8BH+9A6tfAZ9W4FFL71CLhvt8oy8KQkJCiIiIyBl3SEpKYunSpYwePTrnmEGDBtGoUSOGDRuGwfDwd9HvVzvO2dk557hOnToREhKSpz88PPyuY2pff/01FSpU4NChQ5hMJoxGba+QNm3asHXrVlavXs0zzzzDa6+9xtChQx86/rtRYwrKo3VuF8wIhhNrodN/4eklRGe4MDHkANU8XPiyf0CRK1JXz6sMy8e3wrOMkWFzzFVWPQOg/zywd4HNj24QT7Esk8nE4sWLCQ8PJyoqiqioKFauXHnHh3GVKlX4+OOPef755//V+9WqVYuoqCgiI7W1L/Pnz6dt27Z3HNe8eXN27NiRc1xqaionT56kVq1axMbGsm+ftrDyxo0bZGVlkZiYiKenJwaDgfnz5+cMVJ87dw4PDw/GjBnDqFGjOHDgAAB2dnZkZmb+q9/lFpUUlEfDlA1bvtCmm9rYwai/odUL3MySPDt/P1kmyY/PNCn0geWC8irryOJxLWni48ZLiw4xdcMppJObNq5wbKV25aAUORs2bMDb2zvn8fnnn+Pl5ZVnNlGbNm04duwYFy9ezPPaZ599lqpV77xqTU1NzXPOKVOm3PP9jUYjc+bMoX///tSvXx+DwcC4cePuOM7d3Z25c+cyePBgGjRoQPPmzYmIiMDe3p5FixYxceJEAgIC6NSpE2lpaTz//PPMmzeP5s2bc/LkyZwrjs2bNxMYGEjDhg1ZunQpkyZNArRS4Q0aNHgkA82qdLby7yXFwrKxELUN6g+AJ74CY2mklLy4KIzfD8Xy87Ag2tXSfxzhfjKyTLyxNJzlBy8wKKgy/+1aGbupgVC5GTz9m97h6UKVzi7eHrR0tsW+tgkhjMBWwMH8PkuklO8LIRYBNc2HlQUSpJSBQoimwMxbLwc+kFLmP8VA0d+JtdpOaFkZWgnrgME5K5Nnbz/LyrBYXu1co1gkBAB7WwNTBgTg7erI1I2RJKRmMqPVJNjwIUTvhcpN9Q5RUSzKktfy6UB7KWWyEMIO2C6EWCulzFntIYT4Ckg0Pz0CNJFSZgkhPIFDQog/pJSq7nFRFbZASwgVG0C/OVC+Wk7Xzsh4Pl0bQde6FRnfrlo+Jyl6hBC80rkmtgYDX68/ycEWA2joPB02TIbhq/QOT1EsymJjClKTbH5qZ37k3KsS2mjjACDEfHxqrgRgzH2sUgRFroffJ4JfWxi1Lk9CiLmeyvgFB/Ar78yXA4rewHJBjWnjR1knO77fcQmCX9Fuj53ZondYuijOt5mt2cP8uVl0oFkIYSOECAOuAOuklHtydQcDl6WUp3Id30wIcRQ4DIy721WCEGKsECJUCBEaFxdnyfCVe4kNg0VDwb02DPxFK1dtdjMjO2dgeeYzjXEpogPLBeFkb8vQFr6sP36ZyMr9oLQXbPyvtgbDihiNRq5evaoSQzEjpeTq1as501kLqlAGmoUQZYHlwEQp5RFz2w9ApJTyq7scXxuYB7SRUt5zdxQ10KyD61EwqxPYOmhXCKVvb4IupeSlRWGsPBTL7GFNaF+rgn5xPiLXUjJo+dkGujeoxJd+B2DVi/DUb1Cji96hFZrMzExiYmJIS1MbFRU3RqMRb29v7Ozs8rTrMtCcm5QyQQixGegKHBFC2AJ9gMb3OP64ECIFqAeoT/2iIvUa/NIPsjO0e+u5EgLAzzuiWBEWy8udapSIhADg5mzPoKAq/LrnHK906Iun6zfa1UK1Tto+0VbAzs4OPz8/vcNQConF/lYLIdzNVwgIIRyBjkCEubsjECGljMl1vJ85WSCE8EGboRRlqfiUB5R5ExYM1EpfD14I7jXzdO88Hc8na47TuU4FJhSzgeX7GdXaD5OE2Ttj4LG3tNo/x3+//wsVpRiy5FcdT2CTECIc2Ic2pnBr6sYgzAPMubRGm3EUhnar6XkpZbwF41MKypQNS0dDzD7o+xP4tMjTHXM9lQkLDuJbzomvBgRgMBTPgeV7qezmRI8GnoTsPU9i1SehfE3Y9In2/0VRShhLzj4Kl1I2lFI2kFLWk1JOztU3XEo54x/Hz5dS1pVSBkopG0kpV1gqNuUBSAlrX4eIVdD1M6jTK093WmY2437ZT2aWiZlDm1DKaHePExVvz7atSkpGNvP3RkO7tyH+BIRb52I2pWSzjpuiysPb/jXsmwUtJ0LzvMv3pZS8vfwwRy4k8fXAQKq6579jVXFW27M0j9V0Z86OKNKqP6Gtzdj8KWQ/mnozilJUqKSg3NuhhdpK3nr9oOPkPF0ZWSZeWxLOsgMXeLFjdTrWKRkDy/kZ17YqV1MyWLz/ArR/FxLOwcH5eoelKI+USgrK3Z3eCCvHg28wPDk9z0ybaykZDJm9hyX7Y5jUoTqTOlTXMdDC08zPjcDKZZm57QxZ/h3Auyls+T/IVFM1lZJDJQXlThfDtcVp5WvCoF+1NQlmkVeS6T19B2HRCXw7KJCXOtUotiuWH5QQgnFtqxJ97SZrjl6GDu/CjVhtMyFFKSFUUlDySjgPv/YD
Y2l4enGe7Sh3RMbTZ/oOktOyCBnTjF6BBd/svKToXKcC/u7OzNh8GukbDH5tYPsUSE++/4sVpRhQSUG5LfUa/NJXux0yZCmUuf2hv2DPeYb+vJeKZYysGN+Kxj5uOgaqH4NBMK5NVY5dTGLbqXhtbCElDvY+/JaOilKUqKSgaDLTYOFTWhmLwQvAQ6u/nm2S/HfVMd5efpjW1cqz9LmWVHZz0jdWnfVqWIkKpR2YseW0Vkq7ehfY8S3cTNA7NEX511RSULRFWMvGwPld0PtH8G0NQHJ6FmP/F8rs7WcZ3tKX2cNK7jqEB+Fga8Oo1n7sPH2V8JgEaP8OpCXCru/1Dk1R/jWVFKydlPDnW1rZhi6fQL0+AFxIuEm/H3ay+WQck3vV5YOedbG1UX9dbhnctAqljLba1YJngLaob/d0SFGL8JXiTf0rt3Y7v9PuhzcfDy3GAxAWnUCvaTu4cP0mPw8PYmgLX31jLIJKGe14prkPa49c4mx8CrR7BzJTYcc3eoemKP+KSgrWLHwxrHsP6vaGzh8BsDr8IgN/3IXRzsDS51vStoa7zkEWXSNa+WFnY2Dm1jNagcAGA2HvT5B08f4vVpQiSiUFa3Vmi7aVpk9r6P0jUgimbTzF+AUHqOdVhpXjW1GjQim9oyzS3Es50K+xN0v3x3AlKQ3avgGmLNh2xxYhilJsqKRgjS4dgUVDoFw1GPQr6djy8m+H+PLvkzwZWIlfRzejnIvD/c+jMDbYnyyTiZ93RIGbHzR8BvbPhevn9A5NUR6KSgrWJiFaW5xm7wJDlnA125Gnf9rD8oMXeLlTDb4eGIjRzkbvKIsN3/LOdKvnya+7z5GUlgltXgNhgK1f6B2aojwUlRSsyc3rWkLISIEhSziVVoYnp+/g8IVEpg5uyAsdqltNyYpHaVzbqtxIz2LBnvPagr+gURAWAvGReoemKA9MJQVrkZkGC5+Gq6dh0K9sTfSgz/Sd3MwwsXBsc3oEVNI7wmKrvncZWlUrx8/bz5KelQ2tX9LqRW3+RO/QFOWBqaRgDUwmWDEOzu2A3jOYf9mHEXP34eXqyIrxLWlYxVXvCIu959pW48qNdJYfuAAuHtBsHBxZqo3fKEoxopKCNfj7P3B0OdkdJ/PB2dq8u+IIbWu4s+S5lni7WnfJikelVbVy1PMqzcytZ8g2SWj1AjiU0bbtVJRiRCWFkm7nNNj9PRmNxzLqZHPm7oxiVGs/fhraBBcHW72jKzFuldU+E5/CumOXwNEVWk6AE6vhwn69w1OUAlNJoSQ7shT+fofUqk/Q69TjbIu8ykdP1uPd7nWwMagB5UetWz1PfMo58cOWM0gpoflz4OgGGz/SOzRFKTCVFEqqs9tg+ThuVGhKx6iniUnKYN6Ipgxp7qN3ZCWWjUEwJtifQ9EJ7D5zDRxKaYPOpzdC1A69w1OUAlFJoSS6fAwWPs0NJ2/aXxiLrYMTy59vSevq5fWOrMTr19ib8i72WqE8gKDR4FJRu1qQUt/gFKUAVFIoaRIvIH/tR7K0o0vci/h6e7FifCuqeaiSFYXBaGfDiFZ+bDkZx9HYRLB3gjavwvmd2hWDohRxKimUJDcTMP3Sl/Tk6/S/8QrNGwXwy+hmuDnb6x2ZVRnSzAdnext+3HJGa2g0DMpUgY3/VVcLSpFnsaQghDAKIfYKIQ4JIY4KIT40ty8SQoSZH1FCiDBzeychxH4hxGHzf9tbKrYSKT2ZzPn9yI47xai0SXTv3Jmv+gfgYKtKVhS2Mk52PNWsCqvCY4m+lgq29vDYGxB7ECJW6x2eouTLklcK6UB7KWUAEAh0FUI0l1IOlFIGSikDgaXAMvPx8UAPKWV9YBgw34KxlSwZqaTP748hdj8vZ0/k6cHDGN+umipZoaNRrf2xMQh+2ma+WmgwSCtAuOljbTGhohRRFksKUpNsfmpnfuRcOwvtE2sAEGI+/qCUMtbcfRQwCiFUqc77yUwj7ZdB2MXs4i3TeIaMmMjj9T31jsrqVSxjpHdDL34LjeZqcjrY2MJjb8GVY3B02f1PoCg6seiYghDCxnx76AqwTkq5J1d3MHBZSnnqLi/tCxyUUqbf5ZxjhRChQojQuLg4ywReXGRlkPrr0xjPb+F9xjF49Cs08y+nd1SK2dg2VUnPMjFvZ5TWULcPeNTVVjlnZ+kam6Lci0WTgpQy23ybyBtoKoSol6t7MOarhNyEEHWBz4Fn73HOmVLKJlLKJu7uVrwrWHYWNxYMwylqPR+LMQwc85aqYVTEVPNwoVPtCszbdY6U9CwwGKD9O3DtNBy646++ohQJhTL7SEqZAGwGugIIIWyBPsCi3McJIbyB5cBQKeXpwoitWDJlkxAyilJn1vCVYQT9x71PPa8yekel3MW4x6qSeDOThfuitYaaj0OlRrDlc8i640JYUXRnydlH7kKIsuafHYGOQIS5uyMQIaWMyXV8WWA18JaUUi3/vBeTiashz1I2cgXTbYbQ+7mP1LaZRVijKq409XNj9rYzZGabQAho/x9IjIYD/9M7PEW5gyWvFDyBTUKIcGAf2pjCKnPfIO68dTQBqAa8m2vKqocF4yt+pOTyoomUO7WYn20H0mP8/+Hv7qJ3VMp9PNe2KrGJafweZp5HUbU9+LSCrf8HGan6Bqco/yBkMV5M06RJExkaGqp3GIVDSi4segmviDmE2Peh3fPTqVjWUe+olAKQUtLt222YpOTPSW0wGASc2wlzukGn/2plthWlEAkh9kspm9ytT61oLg6kJOq3N/CKmMMK+x50nPCDSgjFiBCCZ9v6c/JyMhsjrmiNPi2hagfY/jWkJekboKLkopJCMXBy8Xv4Hv+RtQ5dafvCbNxLG/UOSXlA3RtUwqus4+1CeaDNRLp5DfbM0C8wRfkHlRSKuCO/TabGse/YaOxIy0n/w9VFrecrjuxsDIwO9iP03HVCo65pjV6NoVZ32DkVUq/pG6CimKmkUIQd+O1T6h37ih2ObWk2aQFlnFRCKM4GBlXG1cku79VCu7ch/YaWGBSlCFBJoYja+duXNDr2GaGOrWj0wm84O6qEUNw52dsyrKUv649f4eTlG1pjhbpQr692Cyn5ir4BKgoqKRRJGxd9S/OjHxHu2Iz6Ly7F0VGNIZQUw1r44miXq6w2aDWRstK1QWdF0ZlKCkWIlJLVC6bR9tj7nHRuRO0XV+DgoGYZlSSuzvYMDKrMyrALxCbc1BrLV4PAwbBvNiTF5n8CRbGwB0oKQghXIUQDSwVjzaSULFswgy4n3iXKuQHVX/gdOwcnvcNSLGB0sB8SmL397O3GNq9BdoaWGBRFR/dNCkKIzUKI0kIIN+AQMEcIMcXyoVkPk0ny6/yf6HHyHWKda+M3cRU2RrVSuaTydnWiZ0AlQvaeJyE1Q2t09YWa3eDAPFUTSdFVQa4Uykgpk9AK2M2RUjZGq12kPALZJsns+XPof/ptrjpXo/LE1RgcS+sdlmJhz7b1JzUjm/m7zt1uDBoNKXFwbKV+gSlWryBJwVYI4Ym2Ic6q+x2sFFxWtonv585jyJk3SHL2oeL4tQhHVf7
aGtSqWJp2Nd2ZuzOKmxnZWqN/O213tr0z9Q1OsWoFSQqTgb+ASCnlPiGEP3C3jXGUB5CRZWLKnF8Zee4NbjpXwv35PxHOaoMcazKubVWupmSweL+5rLbBAEFjIGYfXDigb3CK1bpvUpBSLpZSNpBSPm9+fkZK2dfyoZVcaZnZfDJ7IeOiX8fk5I7buD/BxYo3DLJSTf3caFilLDO3niEr27xvc+BgsHOGfbP0DU6xWrb36hBCTCXXnsr/JKVUpR0fQmpGFpNnL+aNS69hcCqLy7NrobTaU9kaCSEY17Yqz87fz+rDF+kV6AXGMhAwEA7+Cp0/Aic3vcNUrEx+VwqhwP58HsoDupGWyVszl/LqpddxcHTGZcxqKFtZ77AUHXWqXYGq7s7M2HKGnDL2QWMgO11twqPo4p5XClLKebmfCyGcpZQplg+pZEpIzeCNn1Yw+drruBjtMY5eDW7+eoel6MxgEPgsJsgAACAASURBVDzbtiqvLwln66l42tZwhwp1wDdYW7PQciIYbPQOU7EiBVmn0EIIcQw4bn4eIISYbvHISpC0zGxemvkH7117C1cHMI78A8pX1zsspYh4MtCLiqWNzNicq1Be0zGQeB5O/qVfYIpVKsjso2+ALsBVACnlIaCNJYMqaaYv+ZPJ197Awz4D+xG/a98EFcXM3tbAqNZ+7DpzlUPRCVpjzSegVCXY95O+wSlWp0BlLqSU0f9oyrZALCXSrg0rGRkxBjf7LOyGrwTPAL1DUoqgwc2qUMrBlp+2mQvl2dhCk5FweiPEqxngSuEpSFKIFkK0BKQQwl4I8SrmW0lK/q7umEeTrSO4YeuG/XObwauR3iEpRZSLgy1PNavCmsMXib6WqjU2HgYGOzU9VSlUBUkK44DxgBcQAwSanyv3IiXZGz+h3LoX2C9qI0avw66cr95RKUXc8Fa+GIRgzo4orcHFA+r2hrAFkJ6sa2yK9ShIUhBSyqellBWklB5SyiFSyqsWj6y4ykqH5eOw2fo5i7PacO3JBXh7qnUIyv15lnGkR0AlFu07T+LNTK2x6RhIT4LwRfoGp1iNgiSFnUKIv4UQo4QQZS0eUXGWeg3m94HwhXyZ2Z+DjT7m8UAfvaNSipHRwX6kZGQTsve81uAdpI1D7f0J5D3XkirKI1OQMhfVgf8AdYEDQohVQoghFo+suLl2BmZ3Rsbs5R0xiXXlh/Jej7p6R6UUM3UrlaFVtXLM2XGWjCwTCAFNx0LccYjarnd4ihUo6OyjvVLKl4GmwDVg3n1eghDCKITYK4Q4JIQ4KoT40Ny+SAgRZn5ECSHCzO3lhBCbhBDJQohp/+J3KnzRe2FWR2RqPJNdP2FpVgumPdUQo51adKQ8uNHB/lxOSmdVuHkXtnp9wdFVVU9VCkVBFq+VFkIME0KsBXYCF9GSw/2kA+2llAFog9NdhRDNpZQDpZSBUspAYCmwzHx8GvAu8OrD/CK6Oboc5nYHYxl+rTuLOTGV+LBnXapXKKV3ZEox9VgNd6p7uDBzq7n0hZ0jNHwGIlZD4gW9w1NKuIJcKRxC+1CfLKWsIaV8Q0p539pHUnNryoSd+ZFzU1QIIdD2aAgxH58ipdyOlhyKPim1jdYXD4dKDTnYeTHv78ygZ0AlBjRR9YyUhyeEYEywPxGXbrAj0jynI2gUSBPsn6NvcEqR8N7KI3m3c32ECpIU/KWULwHhD3pyIYSN+fbQFWCdlHJPru5g4LKUsvitzMnOhD8mwfoPoF5fEvovZvzyc3iVdeTj3vXQ8p2iPLxeDStR3sWBmbcWs7n6Qo2usH+u2q7TykVeucH83eeIu2GZvwcFSQrNH7b2kZQy23ybyBtoKoSol6t7MOarhAchhBgrhAgVQoTGxcU96Mv/vbREWDBA20s3+BVkn594fcVJ4pLTmfZUQ0oZ7Qo/JqXEcbC1YXhLH7aejOPEpRtaY9MxartOhakbIzHa2jAm2M8i5y+U2kdSygRgM9AVQAhhi7bn8wNPvpZSzpRSNpFSNnF3L+SNaRKi4eeucHYr9JwGHd7jf7uj+fvYZd7oWosG3mrGrvLoPN3MB0c7m9ulL/zbgVtVbXqqYpUiryTz+6FYhrb0oZyLg0Xew2K1j4QQ7rfWNQghHIGOQIS5uyMQIaWMeYBY9RV7EGZ1gMQYeHoJNHqGo7GJfLz6OO1reTCqtWWytmK9XJ3t6d/Em5VhF7iSlKZt19l0DMTs1f4+KlZn2sZTGG1tGBtsubL7lqx95AlsEkKEA/vQxhRWmfsGcZdbR0KIKGAKMFwIESOEKBrlRCPWwJzHwcYBRv0NVduRkp7FxAUHcXW248v+AWocQbGIUa39yDJJ5u6M0hoCzNt17lX1kKzN6TjzVUILH8oZUiy2mPFhax89f78XSSnDpZQNzfs715NSTs7VN1xKOeMur/GVUrpJKV2klN5SymMF/1UsZPcMWPgUuNeC0evBozYA7648QtTVFL4d1BA3Z3udg1RKKp9yznSpU5Ff95wnJT0LHMtq23UeWaKtoFesxrSNkTjY2jCmtQ/M7w3LxljkfQqyojn+n7WPgLctEk1RYsqGNa/Dn29ArSdg+GooVQGApftjWHbgAhPbV6e5fzmdA1VKujFt/Em8mcniUPNd3KAxkJUGB+frG5hSaM7EJbMy7AJDmleh/MmFcDFMm41mAQUaU7iLAY80iqImPRkWPg17f4QWE2DA/8DeCdD+cN5deYSmfm5MbF9N50AVa9DYx5VGVcoye8dZsk1S26TJp7VWUtuktjaxBtM2RmJva+DZIFfYMFnbrrVeX4u818MmhZJ7Az3pIsx9HE79BY9/CV0+ztkjNy0zmwkLDuJga+DbQYHY2jzs/z5FeTBj2/gTfe0mfx29pDU0HQMJ5+HU3/oGpljc2fgUVoRdYEgzH8rv+RzSkqDbF1pdLAu456eaEMLtHo9ylNSkcPkozOoI8ZEweKH2Dy+Xz9ZGcOxiEl/2D8CzjKNOQSrWqFOdiviUc7pd+qKWebtOVQ+pxJu68RT2tgaer5mkLV5sNs6iW/rm91V3PxBq/m/uRyiQYbGI9BK5HmZ3AZkNI9dCjS55uv8+eom5O6MY2cqPDrUr6BSkYq1sDIJRrf0Ii05g/7nrYGMHTUaYt+uM1Ds8xUKi4lNYGRbL000r47b5HXB2h8fetOh73jMpSCn9pJT+5v/+82G5SbJ6CJ0Dvw4AVx8YveGOfZQvJNzktSXh1PcqwxvdauoUpGLt+jX2poyj3e3FbI3Udp0l3dSNkdgaBJPK7YULodD5v2AsbdH3tO6b4iYTrHsPVr0IVdvByD+hjFeeQ7KyTUwKOUhWtompgxviYKvKYSv6cLK35ZnmPvx97DJn41O02XB1n4SwX9V2nSXQuavaWMKoxq6U3v4RVGkBDQZa/H2tNylk3oQlw2HHt9BkJAxeBA53lrv+dsMpQs9d55M+9fEt71z4cSpKLkNb+mBnMPDzrQqZTceq7TpLqGnmq4QJLIKb1+Hx/7PY4HJu1pkUkuNgXg849jt0/giemAI2tncctiMynmmbIhnQxJtegV
53OZGiFC6PUkaebFiJxfujuZ6SoW3XWbGB2q6zhDl3NYVlBy/wUv00nMLnQtBoqFi/UN7bOpPCjVht+8wB86DlxLtm3/jkdF5cFIZ/eWc+6Km21VSKjtHB/qRlmvhl97m823We26F3aMoj8v2mSGwMMDJxOji6Qbt3Cu2985uSWl8IsVsIES2EmCmEcM3Vt7dwwrMQzwCYFA51et2122SSvPLbIRJvZjLtqUY42d95FaEoeqlRoRSP1XRn3q4o0jKzoX4/tV1nCXL+aipLD1zg86rHsI/dC50+1MqbFJL8rhR+AD4A6gMnge1CiKrmvuK/aYCDyz27Zm0/w5aTcbzXvQ61PS070q8oD2NMsD/xyRmsDLtwe7vO46vUdp0lwPebIiljuEnPuBng1QQCnirU988vKbhIKf+UUiZIKb8EJgB/CiGak2tbzZLm4PnrfPHnCbrVq8jTzaroHY6i3FXLquWo41man7adxWSSarvOEiL6WipLD8TwfaW/sEmNhye+1EqmF6L83k0IIcrceiKl3AT0BeYDPpYOTA+JNzOZGHKQCqWNfNa3gSqHrRRZQgjGtPEj8koyW07Gmbfr7KK26yzmvt8USS0RTfO4JdrixEoNCz2G/JLC50Dt3A1SynCgA7DMkkHpQUrJ28sOczExjalPNaSMY/G/Q6aUbN0bVKJiaSMzt5oXs+Vs1/m7voEpDyX6WipL9kcz1TUEYSwN7d/VJY78VjQvkFLuBhBCuAghnM3t56WUlinkraOQvdGsPnyRVzvXpFEV1/u/QFF0ZmdjYEQrX3aducqRC4ng3968XacacC6Opm+OpIdhF37JB6HD++Dkpksc+d6sEkI8J4Q4D5xD24HtnBDivhvsFDcnLt3gwz+OEly9PM+2KVkVPJSSbXCzKrg42GqlLwwGbT57zF6IDdM7NOUBxFxPZU3oKT50DNFuGTUaqlss+U1J/Q/QA3hMSllOSukGtAO6mftKhJsZ2UxYcIBSRjumDAjEYFDjCErxUdpox8CgyqwKv0hswk0IfArsnGDfT3qHpjyA6ZtPM8FmOaUz47WS/Qb9yunkd6XwDNBHSnnmVoP55wGAfmnsEfvwj6NExiXzzcBA3Es56B2OojywEa18AZiz46w2n73BQDistussLi4k3ORA6G5G2qzRphZ7N9E1nnxvH0kp0+7SdhMwWSyiQvT7oVgW7ovm+ceq0rp6eb3DUZSH4u3qxOP1PQnZG01SWqY24Ky26yw2pm88xbs2cxEOLtDxA73DyTcpxAghOvyzUQjRHrhouZAKx/mrqby97DCNfVx5sWMNvcNRlH9lTLAfyelZLNobDRXqgk8rtV1nMXAh4SZJB5bQynAEQ4d3wVn/L6f5JYUXgB+FEHOFEBOFEBOEEPOAmWgL2YqtjCwTE0IOYBDw7aBA7NS2mkox18C7LM383Jiz4yyZ2aZc23Wu0zs0JR+zNhzmbZv5ZLjX06o1FwH5TUk9CtQDtgK+gL/553rmvmLrwPnrHL+YxBf9AvB2ddI7HEV5JMa28Sc2MY01hy9Cre5QylNNTy3CYhNu4hE2DU9xDfseX+k6uJzbPSu9CSGqARWklD//oz1YCBErpTxt8egspLl/OTa/1g6vsmqfZaXkaFfTA393Z37adoaeAZUQTUbCpo+17TrLV9M7POUffvtzE88bVpFSuz/OVZrrHU6O/O6bfAPcuEv7TXNfsaYSglLSGAyCMcH+HLmQxK4zV9V2nUXYxYRUGh37DJONEecnPtE7nDzySwq+5rIWeUgpQ9FuJ+VLCGEUQuwVQhwSQhwVQnxobl8khAgzP6KEEGG5XvOWECJSCHFCCNHlIX4fRbFqvRt6Uc7ZnlnbzmrbddbppbbrLII2rZxLG8MhbrZ+A1w89A4nj/ySgjGfvoJ8zU4H2kspA4BAoKsQormUcqCUMlBKGQgsxVxHSQhRBxgE1AW6AtOFEEXjJpuiFBNGOxuGtvBlY8QVIq/cuL1d5+Hf9A5NMbsUf402Z77iktEf17bj9Q7nDvklhX1CiDtqHAkhRgH773diqbn19cTO/MgpuS20EqQDgBBzUy9goZQyXUp5FogEmhbot1AUJceQ5lVwsDVoVwuVm6rtOouYiCWT8RbxGJ748q7bAOstv6TwIjBCCLFZCPGV+bEFGA1MKsjJhRA25ttDV4B1Uso9ubqDgctSylPm515AdK7+GHPbP885VggRKoQIjYuLK0gYimJVyrk40K+xN8sOXCAuOUObnnrlmNquswiIPx9Bi4u/cLBMJzzq37EMrEjIb0rqZSllS+BDIMr8+FBK2UJKeakgJ5dSZptvE3kDTYUQ9XJ1D+b2VQLA3YoO3fHVRko5U0rZRErZxN3dvSBhKIrVGdXaj0yTifm7oqBePzCWVdNTi4CrS14mExs8+n6hdyj3dN9VW1LKTVLKqebHxod5EyllArAZbawAIYQt0AdYlOuwGKByrufeQOzDvJ+iWDt/dxc61q7A/N3nuIkDNDJv15mk/knp5XrY79RM2sFmzxF4VSm61ZgttpRXCOEuhChr/tkR6AhEmLs7AhFSyphcL/kdGCSEcBBC+AHVgb2Wik9RSroxwf5cT81kyYEYaGLerjNUbdepi8w05Jo3OCW9aNDnLb2jyZcl6zt4ApuEEOHAPrQxhVXmvkHkvXV0awX1b8Ax4E9gvJRSFW5RlIcU5OtKQOWyzN52huyyvlC9s7aHs9qus9Alb5qCW0YsG31fpYpHmfu/QEcWSwpSynApZUMpZQMpZT0p5eRcfcOllDPu8pqPpZRVpZQ1pZRrLRWbolgDIQRjgv2IuprK+uOXtemparvOwnf9HA67vmZ1djO69hyodzT3pSrBKUoJ1rVuRbxdHflp6xmo2h7c/NUGPIUsbfWbZJoEB2q/ik85Z73DuS+VFBSlBLO1MTCylR+h565zICYRgsZA9B61XWdhiVyPMXIN32c/yTOdW+kdTYGopKAoJdyAoMqUNtoya9sZtV1nYcpKJ2v1a0TJisTVG4Nv+aJ/lQAqKShKiefiYMtTzXz488glzqfaQ4MBarvOwrDre2yvn+H9zGE817GO3tEUmEoKimIFhrf0xcYg+HnHWe0WUlYaHPxF77BKrsQY5Jb/Y50Molzg4/gVk6sEUElBUaxCxTJGegRU4rfQaBJK11DbdVraX++QlZ3F5IwhTGxfXe9oHohKCopiJcYE+5Oakc2ve85D0GhIOAcn/9Q7rJLnzGY4toLp2b0ICgwsVlcJoJKColiN2p6lCa5ennk7o0iv/ji4+sHf/4GMVL1DKzmyMmDN61x38OKHzCeY0L747XinkoKiWJExwf5cuZHO74fjoOd3cO0MbP5U77BKjj0zIP4Eb6c+TbdAP/zdXfSO6IGppKAoViS4enlqVSzFrG1nkb7B2padu6bBhQN6h1b8JV2ELZ9zqmwr/soKLJZXCaCSgqJYFSEEo4P9OXH5BltPxUOnyeBSAVZO0G59KA9v3bvI7EzGXx1Aj4BKVC2GVwmgkoKiWJ2eAZXwKOWgLWZzLAtPTIErR2HHt3qHV
nyd+BMOL2Znxac5leVe7GYc5aaSgqJYGXtbA8Nb+bLtVDzHYpOg1uNQtw9s/QKuRNz/BEpe18/B8mfJ8qjH+Oh29GhQiWoexfMqAVRSUBSr9HRTH5ztbfh2w0mtodsXYO8Mv09QaxceRFYGLB4O0sTsSh+SmGnDCx2K51jCLSopKIoVKuNkx7i2Vfnr6GVCo66Bizt0/Rxi9qltOx/Eunch9gBJXb7huwNZdG9QiWoepfSO6l9RSUFRrNSoYD88SjnwyZrjSCm1mkjVOsGGyXA9Su/wir6jK7QpqM2f592T/mRkm5jUofiOJdyikoKiWCkne1te7lSDA+cT+OvoJRACun8NwgB/TAIp9Q6x6Lp6Wpux5dWEzVXGszIslvHtqhXrsYRbVFJQFCvWr7E31T1c+PzPE2Rmm6BsZej4gVaqIexXnaMrojJvwuJhYGNLaq/ZvPP7Sap5uPDcY1X1juyRUElBUayYrY2BN7vV4mx8CiF7z2uNTUZBlZbw19tw45K+ARZFf74Jlw5D7x/5ck8qsYk3+bxvfRxsbfSO7JFQSUFRrFz7Wh4093fj2/WnuJGWCQYD9JwKmWmw5lW9wytaDi2C/XOh9UuEOTZjzs6zDGnmQ2MfN70je2RUUlAUKyeE4K1utbmaksHMrWe0xvLVoN1bcPwPOLZS3wCLiisRsOpF8GlFZtu3eXNpOBVKGXm9a029I3ukVFJQFIWAymXpEVCJn7ad4XJSmtbYYiJ4BsDqV9UubRkp8NtQbS1H39nM3H6eiEs3+O+T9ShltNM7ukdKJQVFUQB4rXNNsk2Sr9eZF7TZ2ELPaZB6VSuxba2khFUvQ/xJ6DuLM+ml+HbDKR6vX5FOdSroHd0jp5KCoigAVCnnxDPNffktNJqTl29ojZ4NoPWL2kykyA36BqiXA/+D8IXw2FuYfNvy1rLDGG0NfNCzrt6RWYTFkoIQwiiE2CuEOCSEOCqE+DBX30QhxAlz+xfmNnshxBwhxGHzax6zVGyKotzdxPbVcHaw5bO1uWogtXkdylWHP16E9GT9gtPDxXBY8xr4t4M2r/JbaDR7zl7j7cdr41HKqHd0FmHJK4V0oL2UMgAIBLoKIZoLIdoBvYAGUsq6wJfm48cASCnrA52Ar4QQ6kpGUQqRq7M949tVY2PEFXaejtca7YzQaxokRmurna1FWpK2HsHJDfr8xJXkTD5ec5zm/m4MDKqsd3QWY7EPXam59bXCzvyQwHPAZ1LKdPNxV8zH1AE25GpLAJpYKj5FUe5ueEtfKpUx8tnaCEwm86rmKs2h6RitLtL53foGWBikhN8nahVQ+/0MLu588MdR0rNMfNqnAUIIvSO0GIt+ExdC2AghwoArwDop5R6gBhAshNgjhNgihAgyH34I6CWEsBVC+AGNgZKbjhWliDLa2fBK55qExySy6vDF2x0d3ocy3tqHZWaafgEWhr0/wbEV0OE98GnJ30cvsebwJSZ1qI5feWe9o7MoiyYFKWW2lDIQ8AaaCiHqAbaAK9AceA34TWhp92cgBggFvgF2Aln/PKcQYqwQIlQIERoXF2fJ8BXFaj3Z0IvanqX5v78iSM8yl9J2cIEe32izcLb+n74BWlLMfm01d42u0PIFbqRl8t7Ko9SqWIqxbfz1js7iCuWevZQyAdgMdEX74F9mvr20FzAB5aWUWVLKl6SUgVLKXkBZ4NRdzjVTStlEStnE3d29MMJXFKtjYxC8/Xgtoq/dZP6uc7c7qnWEgKdgxzfaIGxJk3pN2x+hlCc8+QMYDHzx5wku30jjs74NsLMp+cOclpx95C6EKGv+2RHoCEQAK4D25vYagD0QL4RwEkI4m9s7AVlSymOWik9RlPwFV3cnuHp5pm6MJDE183ZHl4/B0VXbkCf7jov54stkghXPwY2L0H8uOLkRGnWN+bvPMaKlH4GVy+odYaGwZNrzBDYJIcKBfWhjCqvQbhP5CyGOAAuBYVJKCXgAB4QQx4E3gGcsGJuiKAXwZrdaJKVlMn1L5O1GJzd4/Eu4eAh2TdMvuEdt11Q4+aeW9Lwbk56VzZvLDuNV1pFXOtfQO7pCY2upE0spw4GGd2nPAIbcpT0KKFlFRBSlmKtbqQy9G3oxZ0cUQ1v44lXWUeuo0wtqdYfNn2r/LV+8t6Dk3C5Y/6H2ezUdC8D0TaeJvJLMnBFBODtY7KOyyCn5N8gURflXXumsfVf76u8TtxuFgCe+AhsH+OMF7dZLcZUSD0tGgKuPVh1WCE5dvsH0zZH0CqxEu5oeekdYqFRSUBQlX15lHRnZyo/lBy9wNDbxdkepitqtlnM7YP8c/QL8N0zZsGyMNsDcfx4Yy2AySd5cdhgXB1ve615H7wgLnUoKiqLc13OPVaWMo13e8hcADYeAX1tY9z4kxugT3L+x7Ss4vREe/0Kr8wT8uucc+89d5z9P1KGci4POARY+lRQURbmvMo52TGxfnW2n4tl6Mtf6ICGg53cgs7VKosVpX+czm2HTJ9BgIDQaBsDFxJt8/ucJgquXp08jL33j04lKCoqiFMiQ5lWo7ObIp2sjyDbl+vB39dVW/p76Cw4v1i2+B5J0EZaOBvea0P1rEAIpJe+uOEqWycTHT9Yv0aUs8qOSgqIoBeJga8NrXWpx/GISKw5eyNvZdCx4B8HaNyC5iFcayM6CpaO0jXP6z9M2zgHWHrnE+uOXeblTDaqUc9I5SP2opKAoSoF1r+9JgHcZvvr7BGmZ2bc7DDbahjwZyfDnG/oFWBCbzIPj3b8Bj1oAJKZqpSzqeZVmZCs/nQPUl0oKiqIUmMEgeLNbbWIT05izIypvp0ctaPMaHFkKEWt0ie++Tv4N26doYwgBA3OaP117nOupGXzWpwG2VlDKIj/W/dsrivLAWlQtR4daHkzfFMm1lIy8na1eBI+6sPplSEu8+wn0khANy8dChfrQ7fOc5l2nr7JwXzSjg/2o51VGxwCLBpUUFEV5YG90q0VKRhbTNkbm7bC1h15TIfkyrHtPn+DuJitDW6CWnQUD5oGdtjI7LTObt5cfpoqbEy92sJ5SFvlRSUFRlAdWo0IpBjSpzPzdUZy/mpq306sxtBgP++fC2a26xHeH9R9AzD4tYZWrmtP83YZTnI1P4dM+9XG0t9EvviJEJQVFUR7KS51qYGsw8H+5y1/c8tjb4OoHv78AGal39hemY7/D7u+h6bNQt/ft5tgkZm49Q7/G3rSqVl7HAIsWlRQURXkoFUobGRPsxx+HYjkUnZC3095JW9R2/Sxs/kSfAAGunYGV47Wrl84f5TRnmyRvLQunjKMd7zxeW7/4iiCVFBRFeWhj21alnLM9n6w5jvznama/NtB4OOz6Hi7sL/zgMtPgt2EgDNBvjjbeYTZ3ZxSHYhJ5v2ddXJ3t8zmJ9VFJQVGUh+biYMuLHauz5+w1NkZcufOATpPBpSKsnKgN9hamv96CS+HQ+0etAqpZ9LVUvvr7BO1qutOjgWfhxlQMWE+RcEVRLGJQ0yr8vCOKz9ZG0LaGe955/sYy0H0KhAzStvBs+/qDv4HJBOlJ2hTXtAS4
mXD757TEvM9z/xx/ElpNgppdc04lpeQ/K44A8FFv6y1lkR+VFBRF+VfsbAy80bUm4345wJL9MQxqWiXvATW7Qb1+sOULrRSGvUuuD/CE+3+4pyUB+RTaEwYt+RjLgmNZ7efSlaDOk3ckod8PxbLlZBzv96hze8MgJQ+VFBRF+de61K1IYx9Xpqw7Sc/ASjjZ/+OjpdvnWonq+U/e/QS2jtqH+a0P9VKe4FH7zg97Y9lcx5l/diilVWu9j2spGXz4xzECK5dlaAvff/9Ll1AqKSiK8q8JIXj78Vr0/WEXs7ad5YUO1fMe4FwexmyEmNC7f8DbWn7fgo9WHyPpZiaf9a2PjUHdNroXlRQURXkkGvu40bVuRX7ccprBTavgXuofH/RuftpDB1tPxrHswAUmtq9GrYqldYmhuFCzjxRFeWRe71qTtCwT3204pXcoOVIzsnhnxWH83Z0Z366a3uEUeSopKIryyPi7u/BU0yos2Hue03HJeocDwNfrThJ97Saf9q6P0U6Vsvj/9u48xqryjOP498fisAyLsigyLKIsso3LqJi4tFatkiqlaqI2pnaJ0bTW1tgosYmWlFQ0qUukqVi32kZj1aZpGyGujWgUwTIsKiiiMoqKC2sEZebpH+ed2wvOAJeZO3Pv5fdJbubwnvec8z453Pvc8773vGdPnBTMrF1ddfpoenTrwi3zWpj+ooMta9jIPQvWcNHxwzlh1IDObk5ZcFIws3Y1sLqKy089Z1snyAAACRZJREFUnHkrPmTxu591Wju+amzi2seWMrC6iuvOHtdp7Sg3Tgpm1u5+fPJhDO5Txax/tzD9RQe5Z8EaXlu3iZnTJtCvZ/dOaUM5KlpSkNRD0kJJ9ZJWSPpN3rorJa1M5Tensu6SHpC0TNLrkmYUq21mVly9DujG1WeM4dX3NjB/xYcddtympmD1+i08uriBW59cxZnjD+asiZ7KohDF/EnqduC0iNgiqTuwQNITQE9gGjA5IrZLGpzqXwBURcQkSb2A1yQ9FBHvFLGNZlYk5x9bwz0L1jB73kq+deTBdG/nx1xGBOs2bmNpwwbqGzaytGEDSxs2snnbDgAO6duDmdMmtusx9wdFSwqRXTM2//yge3oFcAVwU0RsT/WaZ9EKoLekbmSJ40tgU7HaZ2bF1a1rF2ZMHceP7l/Ewwvf45I23kX8+dYvqU8f/PVrs0TwyZbtAHTvKsYd0pdzaw+ltqY/k4f144hB1fv985b3RVFvXpPUFVgMHAHMiYiXJY0BTpY0C9gGXBMRrwCPkl1BrAN6Ab+MiK+NUkm6DLgMYPjw4buuNrMS8s2xg5ky6iBue+pNph9TQ3XV3n3kbN2+g2Xvb9zpKmDtZ18A2YwWhw+q5pQxA7MEUNOPI4f09c9N20lRk0JENAJHSeoP/F3SxHTMA4EpwHHAI5JGAccDjcChaf3zkp6KiLd32edcYC5AXV1d54xgmdlekcSMs49k2pwXmPuf1Vx95tiv1dm+o5E31m3eKQG89fEWmtK7e2j/ntQO68f3TxhBbU1/Jg7tS58eHjgulg6Z5iIiNkh6DjgLaAAeT91LCyU1AQOBi4F5EfEV8LGkF4A64O1WdmtmZaB2WH/OqT2Uu59fw4XHD2fL9h3Ur03dQA0beGPdZr5sbAJgQO8DmFzTj6mThlBb059JNf0YWF38eZHs/4qWFCQNAr5KCaEncDowm2yc4TTgudSVdADwCfAecJqkv5B1H00BbitW+8ys4/zqzLHMW76Ok2Y/k7sCqK7qxsShffnhSSNz3UBD+/f0Mw46WTGvFIYAD6RxhS7AIxHxL0kHAPdKWk42mPyDiAhJc4D7gOWAgPsiYmkR22dmHWT4gF7Mmj6JFe9vZHJNf2qH9WPUwGq6eLbSkqPOurGkPdTV1cWiRYs6uxlmZmVF0uKIqGtpnX+vZWZmOU4KZmaW46RgZmY5TgpmZpbjpGBmZjlOCmZmluOkYGZmOU4KZmaWU9Y3r0laD7zbhl0MJJtio5JUYkz5Kjm+So6tWaXHWC7xjYiIQS2tKOuk0FaSFrV2V1+5qsSY8lVyfJUcW7NKj7ES4nP3kZmZ5TgpmJlZzv6eFOZ2dgOKoBJjylfJ8VVybM0qPcayj2+/HlMwM7Od7e9XCmZmlqdskoKkYZKelfS6pBWSrkrlB0l6UtKb6e+BedvMkPSWpJWSvp1XPkvSWklb9nDMYyUtS/u4Q+mRUJJOkfSqpB2Szq+guC5P5UskLZA0vi2xlWB8l0pan+JbIuknFRTbrXlxrZK0oS2xlWiMIyQ9LWmppOck1ZRhbC3WUzt+prRZRJTFi+xJbsek5T7AKmA8cDNwXSq/DpidlscD9UAVcBiwGuia1k1J+9uyh2MuBE4kexLcE8DZqXwkMBn4M3B+BcXVN6/OuWTPzK6k83YpcGcl/p/cpc6VwL2VFiPwN7InNUL2SN8HyzC2FuvRjp8pbT7nnXnwNp7QfwBnACuBIXkneWVangHMyKs/Hzhxl320egLTvt7I+/dFwF271Lm/vU9gKcSVV/5EJZ032jkplFJsu9R7ETij0mIEVgA1aVnApnKKbW/qFeMzpdBX2XQf5ZM0EjgaeBk4OCLWAaS/g1O1ocDavM0aUtneGpq22dftC1YKcUn6qaTVZN+Wfl5YBLtXCvEB56Xuh0clDSsogN0okdiQNILsW+wzBex3r5RAjPXAeWl5OtBH0oAC9t2qDoqtLJRdUpBUDTwG/CIiNu2uagtlhfzUqq3bF6RU4oqIORFxOHAt8OsC9rv7g5ZGfP8ERkbEZOAp4IEC9tv6AUsjtmYXAo9GRGMB+93zgUsjxmuAUyX9FzgVeB/YUcC+Wz5gx8VWFsoqKUjqTnby/hoRj6fijyQNSeuHAB+n8gYg/5tgDfDBbvbdNW+gbmbaPn8ga7fbt0WJxvUw8N19iaeFNpREfBHxaURsT+V3A8e2LbLSiS3PhcBD+xpPK+0oiRgj4oOI+F5EHA1cn8o2llFs5aEz+64K7O8T2SDMbbuU38LOg0I3p+UJ7Dwo9DZpUGhP/Xp5618hGxhqHvCa2t79f6UUFzA6r845wKJKOm+kfuK0PB14qVJiS+vGAu+Q7j9qj1cpxUg22VyXtDwLmFluse2pHiUwptBpB96HE3gS2aXaUmBJek0FBgBPA2+mvwflbXM92S8EVpL3Kw2y/vIGoCn9vbGVY9YBy9M+7mx+swHHpe22Ap8CKyokrtvJBvOWAM8CEyrsvP0uxVef4htXKbGldTcCN1Xw++78dLxVwJ+AqjKMrcV6tONnSltfvqPZzMxyympMwczMistJwczMcpwUzMwsx0nBzMxynBTMzCzHScGsAJIa081IKyTVS7pa0m7fR5JGSrq4o9po1hZOCmaF+SIijoqICWSTp00FbtjDNiMBJwUrC75PwawAkrZERHXev0eR3YE7EBgBPAj0Tqt/FhEvSnoJOBJYQzbf0h3ATcA3yO6OnRMRd3VYEGa74aRgVoBdk0Iq+xwYB2wGmiJim6TRwEMRUSfpG8A1EfGdVP8yYHBE/FZSFfACcEF
ErOnQYMxa0K2zG2BWAZpnz+wO3CnpKKARGNNK/TOByXlP2OoHjCa7kjDrVE4KZm2Quo8ayWbSvAH4CKglG6/b1tpmwJURMb9DGmlWAA80m+0jSYOAP5I9zS3IvvGvi4gm4BKga6q6mexxj83mA1ekaZuRNEZSb8xKgK8UzArTU9ISsq6iHWQDy79P6/4APCbpArJZWLem8qXADkn1ZFMj3072i6RX00Pp19NOz64waysPNJuZWY67j8zMLMdJwczMcpwUzMwsx0nBzMxynBTMzCzHScHMzHKcFMzMLMdJwczMcv4HEXedtCkCYNYAAAAASUVORK5CYII=",
 | ||
|       "text/plain": [
 | ||
|        "<Figure size 432x288 with 1 Axes>"
 | ||
|       ]
 | ||
|      },
 | ||
|      "metadata": {
 | ||
|       "needs_background": "light"
 | ||
|      },
 | ||
|      "output_type": "display_data"
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "import matplotlib.pyplot as plt\n",
 | ||
|     "plt.plot(X_test, y_test, label='Actual level')\n",
 | ||
|     "plt.plot(X_test, flaml_y_pred, label='FLAML forecast')\n",
 | ||
|     "plt.xlabel('Date')\n",
 | ||
|     "plt.ylabel('CO2 Levels')\n",
 | ||
|     "plt.legend()"
 | ||
|    ]
 | ||
|   },
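|   {
|    "cell_type": "markdown",
|    "metadata": {},
|    "source": [
|     "The plot above compares the FLAML forecast to the held-out observations. As a quick, optional check, the sketch below (an addition to the original notebook) scores that forecast with scikit-learn's mean absolute percentage error, assuming `y_test` and `flaml_y_pred` from the cells above are still in scope."
|    ]
|   },
|   {
|    "cell_type": "code",
|    "execution_count": null,
|    "metadata": {},
|    "outputs": [],
|    "source": [
|     "# A minimal sketch (not part of the original run): score the univariate forecast.\n",
|     "# Assumes `y_test` (actual CO2 levels) and `flaml_y_pred` (FLAML predictions) exist.\n",
|     "from sklearn.metrics import mean_absolute_percentage_error\n",
|     "\n",
|     "print(\"mape of CO2 forecast:\", mean_absolute_percentage_error(y_test, flaml_y_pred))"
|    ]
|   },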
 | ||
|   {
 | ||
|    "cell_type": "markdown",
 | ||
|    "metadata": {},
 | ||
|    "source": [
 | ||
|     "## 3. Forecast Problems with Exogeneous Variables"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "markdown",
 | ||
|    "metadata": {},
 | ||
|    "source": [
 | ||
|     "### Load Data and Preprocess\n",
 | ||
|     "\n",
 | ||
|     "Load dataset on NYC energy consumption. The task is to predict the average hourly demand of enegry used in a day given information on time, temperature, and precipitation. Temperature and precipiation values are both continuous values. To demonstrate FLAML's ability to handle categorical values as well, create a column with categorical values, where 1 denotes daily tempurature is above monthly average and 0 is below."
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 16,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [],
 | ||
|    "source": [
 | ||
|     "''' multivariate time series forecasting dataset'''\n",
 | ||
|     "import pandas as pd\n",
 | ||
|     "# pd.set_option(\"display.max_rows\", None, \"display.max_columns\", None)\n",
 | ||
|     "multi_df = pd.read_csv(\n",
 | ||
|     "    \"https://raw.githubusercontent.com/srivatsan88/YouTubeLI/master/dataset/nyc_energy_consumption.csv\"\n",
 | ||
|     ")\n",
 | ||
|     "# preprocessing data\n",
 | ||
|     "multi_df[\"timeStamp\"] = pd.to_datetime(multi_df[\"timeStamp\"])\n",
 | ||
|     "multi_df = multi_df.set_index(\"timeStamp\")\n",
 | ||
|     "multi_df = multi_df.resample(\"D\").mean()\n",
 | ||
|     "multi_df[\"temp\"] = multi_df[\"temp\"].fillna(method=\"ffill\")\n",
 | ||
|     "multi_df[\"precip\"] = multi_df[\"precip\"].fillna(method=\"ffill\")\n",
 | ||
|     "multi_df = multi_df[:-2]  # last two rows are NaN for 'demand' column so remove them\n",
 | ||
|     "multi_df = multi_df.reset_index()"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 17,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [],
 | ||
|    "source": [
 | ||
|     "''' Use feature engineering to create a categorical value'''\n",
 | ||
|     "# Using temperature values create categorical values \n",
 | ||
|     "# where 1 denotes daily tempurature is above monthly average and 0 is below.\n",
 | ||
|     "\n",
 | ||
|     "def get_monthly_avg(data):\n",
 | ||
|     "    data[\"month\"] = data[\"timeStamp\"].dt.month\n",
 | ||
|     "    data = data[[\"month\", \"temp\"]].groupby(\"month\")\n",
 | ||
|     "    data = data.agg({\"temp\": \"mean\"})\n",
 | ||
|     "    return data\n",
 | ||
|     "\n",
 | ||
|     "monthly_avg = get_monthly_avg(multi_df).to_dict().get(\"temp\")\n",
 | ||
|     "\n",
 | ||
|     "def above_monthly_avg(date, temp):\n",
 | ||
|     "    month = date.month\n",
 | ||
|     "    if temp > monthly_avg.get(month):\n",
 | ||
|     "        return 1\n",
 | ||
|     "    else:\n",
 | ||
|     "        return 0\n",
 | ||
|     "\n",
 | ||
|     "multi_df[\"temp_above_monthly_avg\"] = multi_df.apply(\n",
 | ||
|     "    lambda x: above_monthly_avg(x[\"timeStamp\"], x[\"temp\"]), axis=1\n",
 | ||
|     ")\n",
 | ||
|     "\n",
 | ||
|     "del multi_df[\"temp\"], multi_df[\"month\"]  # remove temperature column to reduce redundancy"
 | ||
|    ]
 | ||
|   },
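|   {
|    "cell_type": "markdown",
|    "metadata": {},
|    "source": [
|     "The row-wise `apply` above is easy to read. As a side note (an addition, not part of the original notebook), the same flag can also be computed in a vectorized way with a groupby-transform; the toy dataframe below is hypothetical and only illustrates the pattern, since `temp` has already been dropped from `multi_df`."
|    ]
|   },
|   {
|    "cell_type": "code",
|    "execution_count": null,
|    "metadata": {},
|    "outputs": [],
|    "source": [
|     "# Illustrative sketch of a vectorized alternative on a small, made-up dataframe.\n",
|     "import pandas as pd\n",
|     "\n",
|     "toy = pd.DataFrame(\n",
|     "    {\n",
|     "        \"timeStamp\": pd.to_datetime([\"2012-01-01\", \"2012-01-15\", \"2012-02-01\"]),\n",
|     "        \"temp\": [40.0, 20.0, 35.0],\n",
|     "    }\n",
|     ")\n",
|     "# mean temperature of each row's month, aligned back to the rows\n",
|     "monthly_mean = toy.groupby(toy[\"timeStamp\"].dt.month)[\"temp\"].transform(\"mean\")\n",
|     "toy[\"temp_above_monthly_avg\"] = (toy[\"temp\"] > monthly_mean).astype(int)\n",
|     "toy"
|    ]
|   },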
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 18,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "data": {
 | ||
|       "text/html": [
 | ||
|        "<div>\n",
 | ||
|        "<style scoped>\n",
 | ||
|        "    .dataframe tbody tr th:only-of-type {\n",
 | ||
|        "        vertical-align: middle;\n",
 | ||
|        "    }\n",
 | ||
|        "\n",
 | ||
|        "    .dataframe tbody tr th {\n",
 | ||
|        "        vertical-align: top;\n",
 | ||
|        "    }\n",
 | ||
|        "\n",
 | ||
|        "    .dataframe thead th {\n",
 | ||
|        "        text-align: right;\n",
 | ||
|        "    }\n",
 | ||
|        "</style>\n",
 | ||
|        "<table border=\"1\" class=\"dataframe\">\n",
 | ||
|        "  <thead>\n",
 | ||
|        "    <tr style=\"text-align: right;\">\n",
 | ||
|        "      <th></th>\n",
 | ||
|        "      <th>timeStamp</th>\n",
 | ||
|        "      <th>demand</th>\n",
 | ||
|        "      <th>precip</th>\n",
 | ||
|        "      <th>temp_above_monthly_avg</th>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "  </thead>\n",
 | ||
|        "  <tbody>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>0</th>\n",
 | ||
|        "      <td>2012-01-01</td>\n",
 | ||
|        "      <td>4954.833333</td>\n",
 | ||
|        "      <td>0.002487</td>\n",
 | ||
|        "      <td>1</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>1</th>\n",
 | ||
|        "      <td>2012-01-02</td>\n",
 | ||
|        "      <td>5302.954167</td>\n",
 | ||
|        "      <td>0.000000</td>\n",
 | ||
|        "      <td>1</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>2</th>\n",
 | ||
|        "      <td>2012-01-03</td>\n",
 | ||
|        "      <td>6095.512500</td>\n",
 | ||
|        "      <td>0.000000</td>\n",
 | ||
|        "      <td>0</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>3</th>\n",
 | ||
|        "      <td>2012-01-04</td>\n",
 | ||
|        "      <td>6336.266667</td>\n",
 | ||
|        "      <td>0.000000</td>\n",
 | ||
|        "      <td>0</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>4</th>\n",
 | ||
|        "      <td>2012-01-05</td>\n",
 | ||
|        "      <td>6130.245833</td>\n",
 | ||
|        "      <td>0.000000</td>\n",
 | ||
|        "      <td>1</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>...</th>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>1864</th>\n",
 | ||
|        "      <td>2017-02-07</td>\n",
 | ||
|        "      <td>5861.319833</td>\n",
 | ||
|        "      <td>0.011938</td>\n",
 | ||
|        "      <td>1</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>1865</th>\n",
 | ||
|        "      <td>2017-02-08</td>\n",
 | ||
|        "      <td>5667.644708</td>\n",
 | ||
|        "      <td>0.001258</td>\n",
 | ||
|        "      <td>1</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>1866</th>\n",
 | ||
|        "      <td>2017-02-09</td>\n",
 | ||
|        "      <td>5947.661958</td>\n",
 | ||
|        "      <td>0.027029</td>\n",
 | ||
|        "      <td>0</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>1867</th>\n",
 | ||
|        "      <td>2017-02-10</td>\n",
 | ||
|        "      <td>6195.122500</td>\n",
 | ||
|        "      <td>0.000179</td>\n",
 | ||
|        "      <td>0</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>1868</th>\n",
 | ||
|        "      <td>2017-02-11</td>\n",
 | ||
|        "      <td>5461.026000</td>\n",
 | ||
|        "      <td>0.000492</td>\n",
 | ||
|        "      <td>1</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "  </tbody>\n",
 | ||
|        "</table>\n",
 | ||
|        "<p>1869 rows × 4 columns</p>\n",
 | ||
|        "</div>"
 | ||
|       ],
 | ||
|       "text/plain": [
 | ||
|        "      timeStamp       demand    precip  temp_above_monthly_avg\n",
 | ||
|        "0    2012-01-01  4954.833333  0.002487                       1\n",
 | ||
|        "1    2012-01-02  5302.954167  0.000000                       1\n",
 | ||
|        "2    2012-01-03  6095.512500  0.000000                       0\n",
 | ||
|        "3    2012-01-04  6336.266667  0.000000                       0\n",
 | ||
|        "4    2012-01-05  6130.245833  0.000000                       1\n",
 | ||
|        "...         ...          ...       ...                     ...\n",
 | ||
|        "1864 2017-02-07  5861.319833  0.011938                       1\n",
 | ||
|        "1865 2017-02-08  5667.644708  0.001258                       1\n",
 | ||
|        "1866 2017-02-09  5947.661958  0.027029                       0\n",
 | ||
|        "1867 2017-02-10  6195.122500  0.000179                       0\n",
 | ||
|        "1868 2017-02-11  5461.026000  0.000492                       1\n",
 | ||
|        "\n",
 | ||
|        "[1869 rows x 4 columns]"
 | ||
|       ]
 | ||
|      },
 | ||
|      "execution_count": 18,
 | ||
|      "metadata": {},
 | ||
|      "output_type": "execute_result"
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "# split data into train and test\n",
 | ||
|     "num_samples = multi_df.shape[0]\n",
 | ||
|     "multi_time_horizon = 180\n",
 | ||
|     "split_idx = num_samples - multi_time_horizon\n",
 | ||
|     "multi_train_df = multi_df[:split_idx]\n",
 | ||
|     "multi_test_df = multi_df[split_idx:]\n",
 | ||
|     "\n",
 | ||
|     "multi_X_test = multi_test_df[\n",
 | ||
|     "    [\"timeStamp\", \"precip\", \"temp_above_monthly_avg\"]\n",
 | ||
|     "]  # test dataframe must contain values for the regressors / multivariate variables\n",
 | ||
|     "multi_y_test = multi_test_df[\"demand\"]\n",
 | ||
|     "\n",
 | ||
|     "multi_train_df"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "markdown",
 | ||
|    "metadata": {},
 | ||
|    "source": [
 | ||
|     "### Run FLAML"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 19,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "name": "stderr",
 | ||
|      "output_type": "stream",
 | ||
|      "text": [
 | ||
|       "[flaml.automl: 07-28 21:14:47] {2478} INFO - task = ts_forecast\n",
 | ||
|       "[flaml.automl: 07-28 21:14:47] {2480} INFO - Data split method: time\n",
 | ||
|       "[flaml.automl: 07-28 21:14:47] {2483} INFO - Evaluation method: holdout\n",
 | ||
|       "[flaml.automl: 07-28 21:14:47] {2552} INFO - Minimizing error metric: mape\n",
 | ||
|       "[flaml.automl: 07-28 21:14:47] {2694} INFO - List of ML learners in AutoML Run: ['lgbm', 'rf', 'xgboost', 'extra_tree', 'xgb_limitdepth', 'prophet', 'arima', 'sarimax']\n",
 | ||
|       "[flaml.automl: 07-28 21:14:47] {2986} INFO - iteration 0, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:14:47] {3114} INFO - Estimated sufficient time budget=509s. Estimated necessary time budget=1s.\n",
 | ||
|       "[flaml.automl: 07-28 21:14:47] {3161} INFO -  at 0.1s,\testimator lgbm's best error=0.1103,\tbest estimator lgbm's best error=0.1103\n",
 | ||
|       "[flaml.automl: 07-28 21:14:47] {2986} INFO - iteration 1, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:14:47] {3161} INFO -  at 0.1s,\testimator lgbm's best error=0.1103,\tbest estimator lgbm's best error=0.1103\n",
 | ||
|       "[flaml.automl: 07-28 21:14:47] {2986} INFO - iteration 2, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:14:47] {3161} INFO -  at 0.2s,\testimator lgbm's best error=0.0983,\tbest estimator lgbm's best error=0.0983\n",
 | ||
|       "[flaml.automl: 07-28 21:14:47] {2986} INFO - iteration 3, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:14:47] {3161} INFO -  at 0.2s,\testimator rf's best error=0.0968,\tbest estimator rf's best error=0.0968\n",
 | ||
|       "[flaml.automl: 07-28 21:14:47] {2986} INFO - iteration 4, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:14:47] {3161} INFO -  at 0.2s,\testimator lgbm's best error=0.0983,\tbest estimator rf's best error=0.0968\n",
 | ||
|       "[flaml.automl: 07-28 21:14:47] {2986} INFO - iteration 5, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:14:47] {3161} INFO -  at 0.3s,\testimator lgbm's best error=0.0925,\tbest estimator lgbm's best error=0.0925\n",
 | ||
|       "[flaml.automl: 07-28 21:14:47] {2986} INFO - iteration 6, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:14:47] {3161} INFO -  at 0.3s,\testimator lgbm's best error=0.0925,\tbest estimator lgbm's best error=0.0925\n",
 | ||
|       "[flaml.automl: 07-28 21:14:47] {2986} INFO - iteration 7, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:14:47] {3161} INFO -  at 0.3s,\testimator lgbm's best error=0.0925,\tbest estimator lgbm's best error=0.0925\n",
 | ||
|       "[flaml.automl: 07-28 21:14:47] {2986} INFO - iteration 8, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:14:47] {3161} INFO -  at 0.4s,\testimator lgbm's best error=0.0861,\tbest estimator lgbm's best error=0.0861\n",
 | ||
|       "[flaml.automl: 07-28 21:14:47] {2986} INFO - iteration 9, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:14:47] {3161} INFO -  at 0.4s,\testimator rf's best error=0.0877,\tbest estimator lgbm's best error=0.0861\n",
 | ||
|       "[flaml.automl: 07-28 21:14:47] {2986} INFO - iteration 10, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {3161} INFO -  at 0.4s,\testimator rf's best error=0.0877,\tbest estimator lgbm's best error=0.0861\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {2986} INFO - iteration 11, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {3161} INFO -  at 0.5s,\testimator rf's best error=0.0877,\tbest estimator lgbm's best error=0.0861\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {2986} INFO - iteration 12, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {3161} INFO -  at 0.5s,\testimator xgboost's best error=0.6523,\tbest estimator lgbm's best error=0.0861\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {2986} INFO - iteration 13, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {3161} INFO -  at 0.6s,\testimator rf's best error=0.0836,\tbest estimator rf's best error=0.0836\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {2986} INFO - iteration 14, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {3161} INFO -  at 0.6s,\testimator xgboost's best error=0.6523,\tbest estimator rf's best error=0.0836\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {2986} INFO - iteration 15, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {3161} INFO -  at 0.6s,\testimator extra_tree's best error=0.1059,\tbest estimator rf's best error=0.0836\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {2986} INFO - iteration 16, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {3161} INFO -  at 0.7s,\testimator rf's best error=0.0743,\tbest estimator rf's best error=0.0743\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {2986} INFO - iteration 17, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {3161} INFO -  at 0.8s,\testimator extra_tree's best error=0.0962,\tbest estimator rf's best error=0.0743\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {2986} INFO - iteration 18, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {3161} INFO -  at 0.8s,\testimator xgb_limitdepth's best error=0.0820,\tbest estimator rf's best error=0.0743\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {2986} INFO - iteration 19, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {3161} INFO -  at 0.8s,\testimator lgbm's best error=0.0861,\tbest estimator rf's best error=0.0743\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {2986} INFO - iteration 20, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {3161} INFO -  at 0.9s,\testimator xgb_limitdepth's best error=0.0820,\tbest estimator rf's best error=0.0743\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {2986} INFO - iteration 21, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {3161} INFO -  at 0.9s,\testimator xgboost's best error=0.2637,\tbest estimator rf's best error=0.0743\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {2986} INFO - iteration 22, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {3161} INFO -  at 0.9s,\testimator xgboost's best error=0.0959,\tbest estimator rf's best error=0.0743\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {2986} INFO - iteration 23, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {3161} INFO -  at 1.0s,\testimator rf's best error=0.0743,\tbest estimator rf's best error=0.0743\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {2986} INFO - iteration 24, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {3161} INFO -  at 1.1s,\testimator rf's best error=0.0743,\tbest estimator rf's best error=0.0743\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {2986} INFO - iteration 25, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {3161} INFO -  at 1.1s,\testimator xgb_limitdepth's best error=0.0820,\tbest estimator rf's best error=0.0743\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {2986} INFO - iteration 26, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {3161} INFO -  at 1.2s,\testimator xgb_limitdepth's best error=0.0820,\tbest estimator rf's best error=0.0743\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {2986} INFO - iteration 27, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {3161} INFO -  at 1.2s,\testimator xgboost's best error=0.0959,\tbest estimator rf's best error=0.0743\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {2986} INFO - iteration 28, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {3161} INFO -  at 1.2s,\testimator xgboost's best error=0.0959,\tbest estimator rf's best error=0.0743\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {2986} INFO - iteration 29, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {3161} INFO -  at 1.2s,\testimator xgb_limitdepth's best error=0.0820,\tbest estimator rf's best error=0.0743\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {2986} INFO - iteration 30, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {3161} INFO -  at 1.2s,\testimator xgb_limitdepth's best error=0.0820,\tbest estimator rf's best error=0.0743\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {2986} INFO - iteration 31, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {3161} INFO -  at 1.3s,\testimator rf's best error=0.0743,\tbest estimator rf's best error=0.0743\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {2986} INFO - iteration 32, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {3161} INFO -  at 1.3s,\testimator lgbm's best error=0.0861,\tbest estimator rf's best error=0.0743\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {2986} INFO - iteration 33, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {3161} INFO -  at 1.4s,\testimator rf's best error=0.0743,\tbest estimator rf's best error=0.0743\n",
 | ||
|       "[flaml.automl: 07-28 21:14:48] {2986} INFO - iteration 34, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 07-28 21:14:49] {3161} INFO -  at 1.4s,\testimator xgb_limitdepth's best error=0.0791,\tbest estimator rf's best error=0.0743\n",
 | ||
|       "[flaml.automl: 07-28 21:14:49] {2986} INFO - iteration 35, current learner rf\n",
 | ||
|       "[flaml.automl: 07-28 21:14:49] {3161} INFO -  at 1.5s,\testimator rf's best error=0.0735,\tbest estimator rf's best error=0.0735\n",
 | ||
|       "[flaml.automl: 07-28 21:14:49] {2986} INFO - iteration 36, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:14:49] {3161} INFO -  at 1.6s,\testimator xgboost's best error=0.0834,\tbest estimator rf's best error=0.0735\n",
 | ||
|       "[flaml.automl: 07-28 21:14:49] {2986} INFO - iteration 37, current learner prophet\n",
 | ||
|       "[flaml.automl: 07-28 21:14:53] {3161} INFO -  at 6.0s,\testimator prophet's best error=0.0592,\tbest estimator prophet's best error=0.0592\n",
 | ||
|       "[flaml.automl: 07-28 21:14:53] {2986} INFO - iteration 38, current learner arima\n",
 | ||
|       "[flaml.automl: 07-28 21:14:54] {3161} INFO -  at 6.8s,\testimator arima's best error=0.6434,\tbest estimator prophet's best error=0.0592\n",
 | ||
|       "[flaml.automl: 07-28 21:14:54] {2986} INFO - iteration 39, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:14:55] {3161} INFO -  at 7.8s,\testimator sarimax's best error=0.6434,\tbest estimator prophet's best error=0.0592\n",
 | ||
|       "[flaml.automl: 07-28 21:14:55] {2986} INFO - iteration 40, current learner sarimax\n",
 | ||
|       "[flaml.automl: 07-28 21:14:57] {3161} INFO -  at 9.8s,\testimator sarimax's best error=0.5313,\tbest estimator prophet's best error=0.0592\n",
 | ||
|       "[flaml.automl: 07-28 21:14:57] {2986} INFO - iteration 41, current learner xgboost\n",
 | ||
|       "[flaml.automl: 07-28 21:14:57] {3161} INFO -  at 9.9s,\testimator xgboost's best error=0.0834,\tbest estimator prophet's best error=0.0592\n",
 | ||
|       "[flaml.automl: 07-28 21:14:57] {2986} INFO - iteration 42, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 07-28 21:14:57] {3161} INFO -  at 10.0s,\testimator extra_tree's best error=0.0962,\tbest estimator prophet's best error=0.0592\n",
 | ||
|       "[flaml.automl: 07-28 21:14:57] {2986} INFO - iteration 43, current learner lgbm\n",
 | ||
|       "[flaml.automl: 07-28 21:14:57] {3161} INFO -  at 10.1s,\testimator lgbm's best error=0.0861,\tbest estimator prophet's best error=0.0592\n",
 | ||
|       "[flaml.automl: 07-28 21:15:01] {3425} INFO - retrain prophet for 3.8s\n",
 | ||
|       "[flaml.automl: 07-28 21:15:01] {3432} INFO - retrained model: <prophet.forecaster.Prophet object at 0x000001E2D9A005E0>\n",
 | ||
|       "[flaml.automl: 07-28 21:15:01] {2725} INFO - fit succeeded\n",
 | ||
|       "[flaml.automl: 07-28 21:15:01] {2726} INFO - Time taken to find the best model: 5.99089241027832\n"
 | ||
|      ]
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "automl = AutoML()\n",
 | ||
|     "settings = {\n",
 | ||
|     "    \"time_budget\": 10,  # total running time in seconds\n",
 | ||
|     "    \"metric\": \"mape\",  # primary metric\n",
 | ||
|     "    \"task\": \"ts_forecast\",  # task type\n",
 | ||
|     "    \"log_file_name\": \"energy_forecast_categorical.log\",  # flaml log file\n",
 | ||
|     "    \"eval_method\": \"holdout\",\n",
 | ||
|     "    \"log_type\": \"all\",\n",
 | ||
|     "    \"label\": \"demand\",\n",
 | ||
|     "}\n",
 | ||
|     "'''The main flaml automl API'''\n",
 | ||
|     "try:\n",
 | ||
|     "    import prophet\n",
 | ||
|     "\n",
 | ||
|     "    automl.fit(dataframe=multi_train_df, **settings, period=multi_time_horizon)\n",
 | ||
|     "except ImportError:\n",
 | ||
|     "    print(\"not using prophet due to ImportError\")\n",
 | ||
|     "    automl.fit(\n",
 | ||
|     "        dataframe=multi_train_df,\n",
 | ||
|     "        **settings,\n",
 | ||
|     "        estimator_list=[\"arima\", \"sarimax\"],\n",
 | ||
|     "        period=multi_time_horizon,\n",
 | ||
|     "    )"
 | ||
|    ]
 | ||
|   },
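|   {
|    "cell_type": "markdown",
|    "metadata": {},
|    "source": [
|     "The learner found within the time budget can be inspected directly. A minimal sketch, assuming the `best_estimator`, `best_config` and `best_loss` attributes exposed by flaml's `AutoML` object (not shown above):"
|    ]
|   },
|   {
|    "cell_type": "code",
|    "execution_count": null,
|    "metadata": {},
|    "outputs": [],
|    "source": [
|     "''' retrieve best config and best learner (sketch) '''\n",
|     "print(\"Best ML learner:\", automl.best_estimator)\n",
|     "print(\"Best hyperparameter config:\", automl.best_config)\n",
|     "print(\"Best mape on validation data: {0:.4g}\".format(automl.best_loss))"
|    ]
|   },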
 | ||
|   {
 | ||
|    "cell_type": "markdown",
 | ||
|    "metadata": {},
 | ||
|    "source": [
 | ||
|     "### Prediction and Metrics"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 20,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "name": "stdout",
 | ||
|      "output_type": "stream",
 | ||
|      "text": [
 | ||
|       "Predicted labels 0      5352.985670\n",
 | ||
|       "1      6013.062371\n",
 | ||
|       "2      6106.856497\n",
 | ||
|       "3      6368.692993\n",
 | ||
|       "4      6081.394081\n",
 | ||
|       "          ...     \n",
 | ||
|       "175    6841.950842\n",
 | ||
|       "176    7584.557653\n",
 | ||
|       "177    7614.970448\n",
 | ||
|       "178    7729.474679\n",
 | ||
|       "179    7585.110004\n",
 | ||
|       "Name: yhat, Length: 180, dtype: float64\n",
 | ||
|       "True labels 1869    5486.409375\n",
 | ||
|       "1870    6015.156208\n",
 | ||
|       "1871    5972.218042\n",
 | ||
|       "1872    5838.364167\n",
 | ||
|       "1873    5961.476375\n",
 | ||
|       "           ...     \n",
 | ||
|       "2044    5702.361542\n",
 | ||
|       "2045    6398.154167\n",
 | ||
|       "2046    6471.626042\n",
 | ||
|       "2047    6811.112167\n",
 | ||
|       "2048    5582.297000\n",
 | ||
|       "Name: demand, Length: 180, dtype: float64\n"
 | ||
|      ]
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "''' compute predictions of testing dataset '''\n",
 | ||
|     "multi_y_pred = automl.predict(multi_X_test)\n",
 | ||
|     "print(\"Predicted labels\", multi_y_pred)\n",
 | ||
|     "print(\"True labels\", multi_y_test)"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 21,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "name": "stdout",
 | ||
|      "output_type": "stream",
 | ||
|      "text": [
 | ||
|       "mape = 0.08347031511602677\n"
 | ||
|      ]
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "''' compute different metric values on testing dataset '''\n",
 | ||
|     "from flaml.ml import sklearn_metric_loss_score\n",
 | ||
|     "print('mape', '=', sklearn_metric_loss_score('mape', y_true=multi_y_test, y_predict=multi_y_pred))"
 | ||
|    ]
 | ||
|   },
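|   {
|    "cell_type": "markdown",
|    "metadata": {},
|    "source": [
|     "Other regression metrics can be reported with the same helper. A minimal sketch, assuming `sklearn_metric_loss_score` also accepts the `'rmse'` and `'mae'` metric names:"
|    ]
|   },
|   {
|    "cell_type": "code",
|    "execution_count": null,
|    "metadata": {},
|    "outputs": [],
|    "source": [
|     "''' additional metric values on the testing dataset (sketch) '''\n",
|     "# 'rmse' and 'mae' are assumed to be supported metric names of sklearn_metric_loss_score\n",
|     "from flaml.ml import sklearn_metric_loss_score\n",
|     "print('rmse', '=', sklearn_metric_loss_score('rmse', y_true=multi_y_test, y_predict=multi_y_pred))\n",
|     "print('mae', '=', sklearn_metric_loss_score('mae', y_true=multi_y_test, y_predict=multi_y_pred))"
|    ]
|   },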
 | ||
|   {
 | ||
|    "cell_type": "markdown",
 | ||
|    "metadata": {},
 | ||
|    "source": [
 | ||
|     "### Visualize"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 22,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "data": {
 | ||
|       "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYsAAAEGCAYAAACUzrmNAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjAsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8GearUAAAgAElEQVR4nOy9eZxcVZn//z639up9yUZCNkhYkpCVTbZhkWVEVNQBRwUGEQSXcfg5I4zzFWccFBlUBEVFUHAhgCCCCsgOArLvkEBC0lk7S6+1r/f8/jj31r51pytdnZz361Wvqj51q+pUp3Of+zyfZxFSSjQajUajqYQx3hvQaDQaTeOjjYVGo9FoqqKNhUaj0Wiqoo2FRqPRaKqijYVGo9FoquIc7w3Ui+7ubjl79uzx3oZGo9FMKF5++eU+KeWkwvU91ljMnj2bl156aby3odFoNBMKIcSGUus6DKXRaDSaqmhjodFoNJqqaGOh0Wg0mqrssZpFKZLJJJs3byYWi433VjSjwOv1MmPGDFwu13hvRaPZ69irjMXmzZtpaWlh9uzZCCHGezuaESClpL+/n82bNzNnzpzx3o5Gs9exV4WhYrEYXV1d2lBMQIQQdHV1aa9Qoxkn9ipjAWhDMYHR/3Yazfix1xkLjUajGSuklNz54iZiyfR4b6XuaGMxDtxzzz0IIVi9enXVY6+99loikcioP+uWW27hS1/6Usn1SZMmsXTpUubNm8cpp5zCs88+O+rPGWtmz55NX1/feG9Do6nIm1uG+Y+73+DBt7aN91bqjjYW48DKlSs5+uijuf3226seu6vGohJnnXUWr776KmvWrOGyyy7jzDPPZNWqVXX5LI1mT2TrkNLQNg3U5/9oI6GNxW4mFArxzDPPcPPNN+cZi3Q6zde+9jUWLVrEIYccwvXXX891113H1q1bOf744zn++OMBaG5uzrzmrrvu4rzzzgPgT3/6E4cffjhLly7lpJNOYvv27SPa1/HHH8+FF17IjTfeCMD777/PqaeeyvLlyznmmGMyXtB5553HxRdfzPHHH8/cuXN58sknOf/88znooIMyewG4+OKLWbFiBQsWLOCKK67IrM+ePZsrrriCZcuWsWjRosz79vf3c/LJJ7N06VIuuugi9ARHzURg23AUgM2D0XHeSf3Zq1Jnc/nvP73NO1sDY/qeB+/TyhUfXlDxmD/+8Y+ceuqpzJ8/n87OTl555RWWLVvGjTfeyPr163n11VdxOp0MDAzQ2dnJD37wAx5//HG6u7srvu/RRx/Nc889hxCCm266iauvvprvf//7I9r/smXL+PnPfw7AhRdeyM9+9jPmzZvH888/zyWXXMJjjz0GwODgII899hj33XcfH/7wh3nmmWe46aabOPTQQ3nttddYsmQJV155JZ2dnaTTaU488UTeeOMNDjnkEAC6u7t55ZVXuOGGG7jmmmu46aab+O///m+OPvpovvnNb/KXv/wlY7Q0mkZmWyAOwJYhbSw0Y8zKlSv56le/CsDZZ5/NypUrWbZsGY888ghf+MIXcDrVP0lnZ+eI3nfz5s2cddZZ9Pb2kkgkRlWLYF/Nh0Ihnn32WT75yU9mnovH45nHH/7whxFCsGjRIqZMmcKiRYsAWLBgAT09PSxZsoQ777yTG2+8kVQqRW9vL++8807GWJx55pkALF++nD/84Q8APPXUU5nHH/rQh+jo6Bjx/jWa3U3Ws9jzw1B1NRZCiH8FPg8I4BdSymuFEJ3AHcBsoAf4JynloHX85cDngDTwFSnlX6315cAtgA+4H/hXuYtximoeQD3o7+/nscce46233kIIQTqdRgjB1VdfjZSyptTQ3GNyaw6+/OUvc+mll3LGGWfwxBNP8K1vfWvE+3v11Vc56KCDME2T9vZ2XnvttZLHeTweAAzDyDy2f06lUqxfv55rrrmGF198kY6ODs4777y8vdqvcTgcpFKpkt9No5kIbAuov+utQzFMU2IYe+7fcN00CyHEQpShOAxYDJwuhJgHXAY8KqWcBzxq/YwQ4mDgbGABcCpwgxDCYb3dT4ELgXnW7dR67bue3HXXXZxzzjls2LCBnp4eNm3axJw5c3j66ac5+eST+dnPfpY5eQ4MDADQ0tJCMBjMvMeUKVNYtWoVpmlyzz33ZNaHh4eZPn06ALfeeuuI9/bkk09y44038vnPf57W1lbmzJnD73//e0B5HK+//nrN7xUIBGhqaqKtrY3t27fzwAMPVH3Nsccey+9+9zsAHnjgAQYHB0f8HTSa3c22YWUsEmmTnaF4laMnNvUUuA8CnpNSRqSUKeBJ4GPARwD7bHYr8FHr8UeA26WUcSnlemAtcJgQYhrQKqX8u+VN/DrnNROKlStX8rGPfSxv7eMf/zi33XYbF1xwATNnzuSQQw5h8eLF3HbbbYDSDk477bSMwH3VVVdx+umnc8IJJzBt2rTM+3zrW9/ik5/8JMccc0xVfcPmjjvuYMmSJcyfP5/vfOc73H333Rx00EEA/O53v+Pmm29m8eLFLFiwgHvvvbfm77l48WKWLl3KggULOP/88znqqKOqvuaKK67gqaeeYtmyZTz00EPMnDmz5s/TaMYDKSXbAjH2n6ySTvZ0kVvUK+tECHEQcC9wJBBFeREvAZ+VUrbnHDcopewQQvwYZVx+a63fDDyAClVdJaU8yVo/Bvi6lPL0Ep95IcoDYebMmcs3bMif4bFq1arMyVAzMdH/hppGYSiSYMn/PMzZh+7L7S9u4kdnL+EjS6aP97Z2GSHEy1LKFYXrdfMspJSrgO8BDwMPAq8DqQovKRXskxXWS33mjVLKFVLKFZMmFU0F1Gg0mjHD1iuWzVLJGHt6RlRd6yyklDdLKZdJKY8FBoA1wHYrtIR1v8M6fDOwb87LZwBbrfUZJdY1Go1m3Oi19Ir9JjXR4Xft8WGouhoLIcRk634mcCawErgPONc65FxUqApr/WwhhEcIMQclZL8gpewFgkKII4RKlzkn5zUajaYB2JOKKGv9LtstYzG1zceMDj9btLHYJe4WQrwD/An4opUiexXwQSHEGuCD1s9IKd8G7gTeQYWtviiltLtzXQzchBK930dpGRqNpgEIx1Ms+Z+HeWz1yLoGNCLheIrl//sIj7xT/bv0DscQAia3eJje7tvjay3qWmchpTymxFo/cGKZ468Eriyx/hKwcMw3qNFodpntgRjD0STrdoY54cDx3s2usb4vzEA4wYYaej1tD8TobvbgchjM6PDxxHs7aq6Xmojo3lAajWaXGIomAfaINt22d5BMm1WP7R2OMbXVC8CkFg+xpEkkMfF/B+XQxmI343A4WLJkSebW09PDE088wemnF2UCZ1i8eDGf+tSn8tbOO+88/H5/XsHev/7rvyKEyLT2zm06WAq7Tbm9l3POOWcXvtnY0dPTk6kz0TQ+w5axiO4RxkLpDqkajMX2QIypbcpY+D0qSBNOVEr4nNhoY7Gb8fl8vPbaa5nb7NmzKx5vV2s/9dRThMPhvOf233//T
LGcaZo8/vjjmSruWjnrrLMye/n1r39d02uklJhm9f9Mo0Ubi4nFcMT2LOr3N7G7sFuNJ9LVRe4dwTiTW1Trmia3ajYRiU98g1kObSwanNtuu43PfvaznHzyydx33315z33qU5/ijjvuAOCJJ57gqKOOyjQi3BV+8IMfsHDhQhYuXMi1114LqBP4QQcdxCWXXMKyZcvYtGkT//d//8ehhx7KIYcckteG/Ne//nWmEv2zn/0sUL6F+pNPPpnxbJYuXUowGOSyyy7jb3/7G0uWLOGHP/zhLn8fTX3ZGz2LtCkZjCToanID0LQXeBZ7b9fZBy6DbW+O7XtOXQSnXVXxkGg0ypIlSwCYM2dOXn+nUtxxxx08/PDDvPvuu/z4xz/OC0fNmzePe++9l8HBQVauXMlnPvOZmvowFb7/008/Dagw1iGHHMKvfvUrnn/+eaSUHH744Rx33HF0dHTw7rvv8qtf/YobbriBhx56iDVr1vDCCy8gpeSMM87gqaeeoquriyuvvJJnnnmG7u7uTI+rci3Ur7nmGn7yk59w1FFHEQqF8Hq9XHXVVVxzzTX8+c9/HtF30YwPQ7ZnsQfE6zdZmkXKrOxZDEUSSAmdtrFwq1PpnqxZ7L3GYpyww1C18OKLLzJp0iRmzZrFjBkzOP/88xkcHMxr333mmWdy++238/zzz2dmUYyEs846ix//+MeZn3/0ox/xsY99jKampsz7/+1vf+OMM85g1qxZHHHEEQA89NBDPPTQQyxduhRQbc3XrFnD66+/zic+8YlMfyq71Xq5FupHHXUUl156KZ/+9Kc588wzmTEjt/5SMxGwPYtYamKfKKWUGc8ikarsWQyEEwB0NqswlN+jwlDhuPYs9jyqeACNwMqVK1m9enVG1wgEAtx9991ccMEFmWPOPvtsli1bxrnnnoth7HpUsVJBkm1A7OMuv/xyLrroorxjrrvuupKpg+VaqF922WV86EMf4v777+eII47gkUce2eXvoNm9DEXViTM6wa+qB8KJjGeQqqLJ9VvGoqvAswhrzUKzuzFNk9///ve88cYb9PT00NPTw7333svKlSvzjps5cyZXXnkll1xyyZh87rHHHssf//hHIpEI4XCYe+65h2OOKSqX4ZRTTuGXv/wloVAIgC1btrBjxw5OPPFE7rzzTvr7+4Fsq/VyLdTff/99Fi1axNe//nVWrFjB6tWri9qyaxqbwB6iWeS260hVEbj7Q5axaFbGwm8J3Fqz0NSdRx99NC8E88UvfpHp06fnZTcde+yxvPPOO/T29ua9tvDq3iYSieS956WXXsqll15acR/Lli3jvPPO47DDDgPgggsuYOnSpfT09OQdd/LJJ7Nq1SqOPPJIQKXp/va3v2XBggV84xvf4LjjjsPhcLB06VJuueWWTAv16dOnc8QRR7B+/XoArr32Wh5//HEcDgcHH3wwp512GoZh4HQ6Wbx4Meeddx7/9m//VuW3pxlPbM0iOsGzoTblVGAnqgjcA2E1u6KzQOCO7MFhqLq1KB9vVqxYIV966aW8Nd3eeuKj/w0bjw/+4EnW7Ahx4NQWHvzqseO9nVHzsyff56oHVtPd7OED+3Vx3aeWlj322kfe49pH1rDmytNwOQziqTQH/NeD/PspB/DF4/ffjbsee3Z7i3KNRrN3MLSHhKE2DURo97vo8LuqVnAPhBO0+Vy4HOoU6nYYOA1BRIehNBqNphgpZbbOYoIL3JsHo+zb4SdtSpLVNItwtsYC1Px4v9uhBe49iT017LY3oP/tGo9Y0sykmU703lDbAzGmtHpxOUT1bKhQPKNX2DR5nHt06uxeZSy8Xi/9/f36pDMBkVLS39+P1+sd761ocrC9ina/a8K3+xiwvAWnw6gpDFVoLPxuhy7K21OYMWMGmzdvZufOneO9Fc0o8Hq9umivwbBrLKa2elm9LUgqbeJ0TLxrUClV+47OZjeuAVE1DDUQTrB8VkfeWrPHqVNn9xRcLlemclij0ew6dhPBKZaxiKVMmiegsQjEUiTTkq4mNy6HQahCOMk0JYORZAnPwqkbCWo0Gk0p7Ewoe67DRNUtBq2K7A6/G6chKhblDUeTpE1JV5Mnb73J49ijPQttLDQazaixNYsp1lyHiZoR1Z/p9aQ8i0qaRb9VkGdXb9v43c49WrPQxkKj0YwaOww1rW3P8CzsMFRFY2G1+ijOhnJUDF9NdLSx0Gg0o2Y4msQQ0G11X52ohXkDOWEolTpbPgyV6ThbUrPQxkKj0WiKGIqqSma7kd5ETZ/NdJFtVqmzlTQL//v38wvX99n3vd9AuC+z3uRxEkmmMavMwpioaGOh0WhGzXA0RZvPhdeljMVE9SwGIwm8LgO/24nLIco3EjRNFq/+Iccar9P6xDfgp0fBxucANVpVyok/16Mc2lhoNJpRMxRJ0OZ347ONxQQVePtDCab70rD5JRaGnsWZipY+cN3jtMc28U0ugQufALcfbvkQ9DyN37Nnz7TQxkKj0YyaQCxFq9eJ16VOJfGJeFUtJQds/wt/SF4MN53Ip9d9ndPNx0of++LNBIx23mo7DvZZCp9/HFr3gQcvp8mlhn7tqc0EtbHQaDSjJpEy8boc+NwT2LN4fSUX9n+Pna594OzbiDmamS03Fx83vBnee4D73R9kckerWvO1w4lXwLY32H/b/QB7bEaUNhYajWbUJNMmLofIhqEmombR+zphfPxkzk/gwA8x5N2XfdlWfNy2t0Ca/Dm2hOkdvuz6gjNhn6XMf/taBOYeW2uhjYVGoxk1ylgYE1vgDvayXXbQ0ewHYNg3g5lsL85qSoYB2B53Mb3dn103DFj0SbyRXlqI7LGdZ7Wx0Gg0oyaZUsbC4zQQYmKmzpqBbWw32+lscgEQ9M9khthJMhnPPzCpRO8onnzPAsCnmgq2ibD2LDQajaaQRFrichgIIfA6HROygtsMbmMH7XRavZ5CTfviFCapgY35B9rGQrqZ3l5gLLztALQR1p6FRqPRFJIyTdwOlQXkczsmnsAtJUZoO9tlR6YiO9w0Sz01sC7/2IQKQ0XxMKPIs7CMhfYsNBqNphg7DAXgc01AzyI2jJGOsUO2Z4xFpMUyFv0FxsLyLNIOL5Oa8zvO5noWOhtKo9FoCkimJS6nOo14XMbEE7hD2wHyjEXa201YehCD6/OPTUZICDdT2/0Yhsh/zvIsOoyIrrPQaDSaXKSUJNImLuvEuSueRTJtcuGvX+KtLcNjucXqBFWK7E7a6bKMhdPpYKOcgqOEsYjjKdYrIONZdDsjuoJbo9FocrE7s+aGoUbrWWwbjvHQO9t5fv3AmO2vJizPYrvsoNWnsqFcDkGPnIJjuCf/2GSUsCxjLFw+MFx0OaLas9BoNJpc7JkPdhjK63KMOnXWjvPvds0j2Ks+39WFw/KQXA6DDXIqzuENYGb3k06ECZuu4rRZACHA106nESG8qwL3wHoY3LBr71EH6moshBD/JoR4WwjxlhBipRDCK4ToFEI8LIRYY9135Bx/uRBirRDiXSHEKTnry4UQb1rPXSeEEKU/UaPR7C6SqXzPwusa
fTZUMKaMxW7PpgpuJyG84G7JLLkcBhvlZAwzkQlTAcQjIVVjUcqzAPC20y7GIHX2jxfDL06AwZ5de58xpm7GQggxHfgKsEJKuRBwAGcDlwGPSinnAY9aPyOEONh6fgFwKnCDEMJhvd1PgQuBedbt1HrtW6PR1EbSVF5EbursaD2DUFxN3NvtaaehbQw7O2m2QlAATodgQFrGI5oNiyWjISKVjIWvnVbGIHV2sAcifXDbWRDbzRpOBeodhnICPiGEE/ADW4GPALdaz98KfNR6/BHgdillXEq5HlgLHCaEmAa0Sin/LqWUwK9zXqPRaMaJTBgqo1kYozYWGc9it4ehtjNgdNFitRcHcBkGwzSpH3JO1jIZJSbdtHhdhe+i8LbTQqj4dyClutVCOqV0lNnHwM7V8PrtI/k2daVuxkJKuQW4BtgI9ALDUsqHgClSyl7rmF5gsvWS6cCmnLfYbK1Ntx4XrhchhLhQCPGSEOKlnTt3juXX0Wg0BZQMQ+2isRgPzaKPDppyjYVDEJCWsYgOZdaNVJQoHtzOMqdNXzvNMpTvWZgm3HkO3HCk0iKqEdoO0oSFHwd3c22v2U3UMwzVgfIW5gD7AE1CiM9UekmJNVlhvXhRyhullCuklCsmTZo00i1rNA3Fc+v6ueWZxjlZFGJPk3M6sqmzozUWtsC92zWL0HZ2yHaac4yF02EwjNUoMJZvLCKVjIW3naZ0KP87PHsdrLoPBtfDzR+EbW9W3k9gq7pvnQ5tM2B4U+XjdyP1DEOdBKyXUu6UUiaBPwAfALZboSWs+x3W8ZuBfXNePwMVttpsPS5c12j2aG5/YSM/fnzteG+jLHYYyu3Iz4aStYZccgiNRxgqHoJEiG1mG83eMp5FThjKkY4Sle6KnoXXDBFLKP2FbW/Co/8DB38ELvobONyw8p8hUiE9OLBF3bfuA2377jXGYiNwhBDCb2UvnQisAu4DzrWOORe413p8H3C2EMIjhJiDErJfsEJVQSHEEdb7nJPzGo1mj6UvlCBd2Ca7gSjSLKwBSPHUyNNngzF1gq2rZ5FK5P9s1VhsSbfleRYuh0EQHxKRF4ZypGPE8OCp4FkYSBzJkPp50wsg03Dy/8Kk+fBPv1Gpuvd8QYWnSmGl8ipjMUMNXGoQ6qlZPA/cBbwCvGl91o3AVcAHhRBrgA9aPyOlfBu4E3gHeBD4opTS/su5GLgJJXq/DzxQr31rNI1CXyieKXxrRIrqLKz7Sif8/lC8ZNFaMF5nz2LtI/Dd6dD/fnYtOghAb8JfYCwEEoOkqyUbhpISZ7pKGMpq+eFOBdQsjHhArfu71f2M5XDKlbDmryo0VYrAFnB6VcvzthkQ6VcNDN9/DN74/ai//lhQ12woKeUVUsoDpZQLpZSftTKd+qWUJ0op51n3AznHXyml3E9KeYCU8oGc9Zes99hPSvklORo/V6OZYDS+Z2EL3NnUWah8wv/nXzzPNX99r2i9rmEoKVU4KJ3Ir12IBwEImN4CgVudFpOu1mwYKp3EkGmi0pMJuxWR00wwnjIhFgDhUNXdNodeAB1zlJZR6jQW2Aot01SRX/tMtTa8BZ64Cp68alRff6zQFdwaTQOSNiUD4YnhWdgnT4+zehhqZyjOhv5w0fqYFOUlo/CjJbD6/vz1d++H3tfV49y6BctYhPDR4s0XuAHizpZsGCoZUS/HXT4MldemPKXe39uqTvw2hgOO/CJseRk2Plf8HoFeNqc7+M973iTetI9aG+xR+kciUv13UEe0sdBoGpDBSAJT0uCeRb5mYZ9E46nyJ/xEyqQvnChaD41FGCq0Q2UdvXhTdi02DI/9L/i7sj9nNqO0hRDe/DCU1fYj7swJQ1nGIiG8lG0gkeNZRJNpFYbytBYft+TT4OuEZ68vfi6whTWxFm57fiMX3ac0FdY9rj4/UWxkdyfaWGg0DUh/SJ1Q06YcVXbR7iBh1VnYqbP2HO54hf5QiZRJfyhetD4mqbPWyZ91T0C4X4Vvfnka9L0H//h/6rlSnoX0lQxDxZ05YShrlkXS4S3/+TmeRTSRVmGoUsbC7YcFH4P1T+WvSwnBXnrNTuZPaebZnW5MHLDqz9YewrUX99UBZ7knhBDXU6aeAUBK+ZW67Eij0dCXc0JNmzJzQm4kisNQ6r5cYZ3d0tw2hLlksqGSaaSU5a/eK2Gd/JFpeP02eMXKPvr0XTD3H+APF5Y0FmF8eRXc9u865miGYL5nkapkLIo8CysMVYrmKZAIqopth/XZkX5IJ9gs2jl8Thcdfjc7t3cyZdga72qmlO7i9JR+zzpTybN4CXgZ8ALLgDXWbQmwZzZs12gahFxj0ai6RVEYymWHoUp7FnYRXzSZLsqICsZSmdD+aFJv1Qstz8JwwsPfhP41cNZvYb/jlW7gbctmKAHEg5iGiwSugjoLK6srNwyVsI1Fmb5QAO4mTOHMjlaND5f2LEDtBfL3Y9VYbEi20+x1cvyBk9mQ6sx/XaVQVJ29jrLGQkp5q5TyVlS9w/FSyuullNej6iWW1HVXGs1eTl/O1Xej6haFqbPVBO5Eznqud5FImcRTZmb40KhDUQnLszjgH1XLjOO/AXOPyz7vbSvSLJJOVXxXKgwVNVogFYNkLONZpCt5FkKQ9rTlexaeltLH2sYidz8BVWOxOd1Bi9fJ8QdMZqtUWktMWJ9bzlgMrIPvzoD1fyu/v12kFs1iHyD3GzdbaxqNpk5MDM8iP3XW66ochso1Frnfz9Yruq251pFqInc6BTvfUyfGnHkTZkwZi97l/58qgDv60vzXFRqLeJCEQxmL3DCUwxAIARGjWS3EhnPmb1fwLADT05avWZQLQ2WMRbboz/YstslOWjxO5k9pZtg9FYAXUvPUMckyGVFv/F5pNu/Ur165FmNxFfCqEOIWIcQtqCK779RtRxqNhr5gvmbRiIw0ddY2LpDvWdg1FpNb1dVzVc/iV6fCTw6FW09XxWr2+wTUiffRjRIOPgOMgtNbkbEIkTBUD6hczwKUdxHOGIuhzEnadFU2Fvg6aCNENJ4qnw1l7wXy9xNWzU/7aKPF60IIQcu0/QF4zjxIHWOL+IW8/Qd1v+7xyvvbBaoaCynlr4DDgXus25FWeEqj0dSJ/pz00lS51hDjzEhTZ/PCUOGsMQxasywmtyjPomLn2XQKtrwCs45WP4ez3aWTMRX/D8oyoaIiYxEgavgRAvxuR96hLkPkGIvhrLGo4lng76JbBIjHI0qQHkkYKhYg7fSTxpFJ5Z3zD+dxZfN/4ph5mDqmVK3F9ndUO/NJB0L/WhiqTz+pWlNnHcBOYBCYL4Q4ti670Wg0QHE2VCNiewp29pCnSupsIp01ArmajF2QN8kyFhVrLYJbVbbTfsdbb5qN4ZvRIHHpIpouc1rztBZpFlGhWn0UZl85HQZhkdOm3ApDSWdlY2G0TKFLBDAj1udUDUPlG6+USxkXW3Bfuv90vvG1r9PVaQndpcJQb/8BhAGnXa1+rpN3UTZ11kYI8T3gLOBtwP4rkMBTZV+k0Wh2ib5gHCFUgksq3ZjGwvYUXEZB6mwZzyJ
eRuC2w1CTLM2iYhjKnk09ZYH1psHMU2YsQAhveWNTQrMI05FXkGfjchgEbKk2NpQxSqarqfzeAEfLFDoJYNqV35628nuBImNhC+65FeUAhketm/FQ/hX+phfhpV+pYUlzjlWtQt5/HJadU3Gfo6GqsUBNpTtASllcSaPRaMYcKSV94QTdzR52BuMN7FmYOA2BYVU8Z8JQ5TyLMmEoW+Ce3GoJ3JWMxZCqOTC75gMG6WgQe26djIcIy0rGol1dmacS4HRDPETI8JYxFoKgyGlTbnkWhqtCNhRgNE/CECbesBUKKheGcjcrb6AgDBV3qNBXiyd/Gp/hUevJWIhMlcWaR2Dl2apD7WnfU+nBc/8B3vur6mpbqNnsIrW82zqgzBxBjUYz1gTjKRIpk6mW4Nuo2VApU2b0CgAhBG6nUVPqbG6YzS7Im9yivm9FzWJoAyBYFWklJL1s3p4zETMRIoyPWLkK8lsJGQ8AACAASURBVMLahniQgOnLq7GwcTkMQnlhqIiakudyFB2bR7MautYSsjygcmEowygOi8UDxBylPQuHW62nollPitV/UtXgFz0Jky0BfP6pMGNFfpbVGFGLZxEBXhNCPApk/oV1BbdGUx/sEM2UVi9vbhnGbNh2H2YmbdbG6yw/h9suyuvwu/LCUHZ78po0i6GN0DqdoYSgEy/pWDY7SCRC1cNQoE7Qvg5Ihgm4PCU9C6dDEDUd4PJnsqFiePCU6zhr06SmRLdFLWNRLhvK3k9BWCwiVKfZwuwsp095KOlYTp1FPKTan/s6smsLPqpudaAWY3GfddNoNLsB+6p7ihWWaVTNIpk2i2Y7eFyOqp7FtDYfO3JSg0OxFE5D0OFXAYyqmkX7TIKxJGHpzRbiAUYyRFj6iFc1FkOZFNShdJkwlGGQSpvWCX0IzDRRPJkq9bI0K2PREbOGFpULQ9n7KQhDhX0+PE6j6Pfq9nhJSYN0jkajiv6aK+9nDKlqLHSarEazexm00mbtIrVG1ixcBVfaHqdRNXV2n3Yvq7epAUGGIQjGUjR7nTXNw2BoA8w5lkA0RQgvLTl1B85kmBDTavAsAhlhfDDtKbqKB3A5hcr28rarMJThIIq7/CwLmyYVhupOWMaiXBjK3k9BGCro9dPiLY76e1xOIngw4zmeRSJU2XMZY2rJhpoHfBc4GNUnCgAp5dw67kuj2Wuxr8ztuHXj1lkUNzj0VNIs0lnPwpRw49/W8dDb22jzuWj2OHE7DAxRwbNIJdRwoPaZBGJJItJLW04qqTMVISS95V+fG4ay+kj1p8qEoQxD1ZH42tXxLp8afFRulkXmM9pJ4aArZbUXrxaGGlinHqeTkIwQkP4ivQJUdXwEL85cYxEPQOuMyvsZQ2oJQ/0KuAL4IXA88C9A47XA1Gj2EOyTrd+t/ns2qmeRKOFZeF2OsmEg+3tNa1fXnFc9sDrz3EHTWhFC4Hc7y3sGw5sACe2zCPalCOPDmcoKue50uDaBOzac8Sz6ky6mlRS4hQr/Nber4UP+LsLSXd1YGAYBRzud6X5wNalhR+Xwtmc9C2s/w6a3pLHwuRxEpIeWRIFmUSnMNcbUkg3lk1I+Cggp5QYp5beAE+q7LY1m78UWiJs86kTTqNlQyZRZFJap6FnYYag2VdjW2eTmc0fPAbK9mbwuR3ljYaXNKs0iRRgPrrTlWZgmHjNCCG/m99cfitM7HM2+PtdYWFpH0PSVDkM5LM9ixnLYuQo5uI6o9GRamlQi5LAE50ohKMh6LZDJ0BpI+0p6Ol6XgwheNdPCZjdrFrUYi5gQwgDWCCG+JIT4GDC5zvvSaPZa7JNtU4N7FoWps6D6Q1WrszhoWiuTWjxc/fFDuPy0AzlmXjeLZqgTuc9tECsXRhqyMow6ZlkCtw+PaRkD6ySaW2fx7T+/w0W/eTn7eneTmomd41mEKX1ydjoMkqaEAz4EgAhsVZpFNc8CCLusautqeoK3TekO6ZTSUYDBMmExZSw8iNwK7kRI1WvsJmoJQ30V8ANfAb6N8irOreemNJq9GVsg9je6Z5EuTp31uAwGSoxNtY8HmNHh48VvnJRZ/83nDs889pXzLMJ9sPYRdbJv2YdgbCdhvHilZSwsDUKFodTrdwTjbOjPObnaMy1yNItgwfxtG7dDqGyoyQdBx2wY7CGCp7rADUTdnRCmeogot+7D8ix2Jr0lBW6fy0Gv9GLYnkU6qdqnN5LALaV80XoYQukVGo2mjtgx96xm0ZgCt6qzKNAsavAsKl2d+9zO4grune/Cz4+DVFS1sXA4CcZV6qyPuGpTbs/TzvEsQvEUw9EksWQ6M/IVb2u+ZyG9+EoU2mUEbiGUd/HcT4jJGlJngZi7K/tZlchN5bU8ix1JNweWFbg9GKl+tWCn0DZSGEoIsUIIcY8Q4hUhxBv2bXdsTqPZG4mn0nicBk6rjcbEqrOokDqbNhGCzPcqhc9lFHsWax5ShuLCJ+GM6wHVfDBkJWeasWB2nrYlcEspM21EdubUdGQ8i0Q2DOVzlzAWtsANcOA/AtSWOgvEvcpYyApX/b/5ew//9+Q29UNsOONZ7Ei4S4ahPJZm4UhZnoWdMrwbBe5awlC/A/4deJNsI0GNRlMn4klTGQsrxNOomkUyLYtO/NUEbrfDKJ6vHdyuRqAaLpqcsDNSYCy2vgpt+8I+2QGdgWhSCb5AKBSgNZENQ4HSfcKWsdgRjLFvp5pbkQ1DZUeqekt4Fm6HkUn1Zd8jiE1eypub53BCDZpFyqdqLUx3C+Xk8Nc3D7O9T6rL9RxPJ2CWTp21s6GcKTvsZnkWu1GzqEXg3imlvE9Kud7KhtogpdxQ951pNHsp8ZQKm2Q8i4Y1FqWK8hxl233EU8WeCG/dDdcvh1s+BL88meNijxbXSWx9Nc9QgPIs4tbgonBgMOtZyOwApXDc0i4CJTyLeIiUU51ovSUynPI8C4eTdR+9j3vNozPNEiuR9inPwn7/UgSiSQbSVrvz2HAmKypIuV5VgihenOmYWrDnjTdSGAq4QghxkxDiU0KIM+1b3Xem0eylxJMmHpeBw+oa2qieRSJtZuZv23hdlYvy8k62j38H7jofJh8In74bgMmyL1+ziA6pwrVpxcbC16zCPOFQVrBOWy2+I8k04YTtWRQYi6gyLnY7cJ+7+DTodBh5xZB2aK2W1Flp9YdKuMqHiAKxJANpf/Y7xgNIw00cd0mBWwhBwvDikpZGk9EsGkjgRonaB6I6z+bOs/hDvTal0ezNxFMmHmfjexaptCxRZ1G5N1Tm+Cevhie/B0s+Ax++Fhwu8LbTJgP5nknv6+p+n6WZpVgyTSJt0tTSAVGIhYaR7iACcPlbYVi1TLH7L24PxLLvN3UxvPpb2PQcCctYlDIAboeR1yW3FnHeRjZPxZSCuKuNcuYiGEsRwDIWsWGIBUi5WyCSPw88l6TDp868iXC2J1aDhaEWSylXSCnPlVL+i3U7v+4702j2UmyB22HYmkVjSoUlU2edBmlTZtJkc0nYYaiNz8HjV8IhZyvB2mFdSfu7aDUD+QL31lfVfY6xsCfrtbW1Ax
ALB0jF1MnT16wyjHbmtEDP8ywO+Sdw+mBoY2b+dinNwmmIPCNt6xe1GAtHyyT+OfkNdu5XPgATiCUJ40XaMy3iAVKW8SoVhgJIOSzjkgjneBaNVcH9nBDi4LrvRKPRACp11utyZIxFo3oWJTULlz2Hu4KxeOEXaoLc6T/MH9Dj76I5PUQ0mUbabsHWV6F9Fvg7M4fZ8y86OlSldCISIBUNYEqBv0mFZfpyDESeZ+Frh0UfV3t02GGoYmPhchp5Bs/2LGrRLHwuB8+ZB6uuuGVQBk8gPVnBPWFpHKUEboCUPdI1GWlYzeJo1DyLd6202Td16qxGUz+KPYvGNBYl6ywyc7iLRe5E2mSyMQzv3AtLP60G9+Ti76IpNYyUOcZm66t5XgVkPYvuzm4AUrEgqagaqdplDVDqtwoDXQ6RnzoLsOJzAMSEOvl6SxgAl6G6ztpGKz6CMJS/SvdcKSWBqDJ4af8k1fMqFiBmTckrlToLkM54FqGcbKjGSp09te670Gg0GeIpk2aPcwLUWciSYSgo71mclngEzCSsKBHJburCl1LtOaKJNF7DVC0+Fn8q7zDbWHRankU6GsQUQSL46GpyA1nPYmanPz8MBTB9GSz5NGsDc3AaAmeJ2gl7LW2qzroZzaKGOgvbYJYbDxtOpLHtf2zKMlw9D0HzFKLGVICSAjeA6bKNRURpFk4fOGo5hY8NVb+5lSa7L3CC9ThSy+s0Gs3osCuOG92zKJc6C6VHoyZSJsfFH4fZx0D3vOI39HfhTQ4BUmUyRQbUelN33mF2GKrJyoYy40HMWJCw9NJpGQtbs5g7qZmBcCJPrAbgozfwYsc/lqzeBjLfK2kZ6pEI3LZnUS6F2N4/QHjKCogOQP8aosKf9/pCssYivNubCEJtFdxXAF8HLreWXMBv67kpjWZvRmVDGTiteH4jahZSyjKNBMt7FvG0SUd6ACaXkUD9XTjMBE3ECERTEOlT6wXGImCdbFv8HqJ4MifPEF66rIFR9tjWuZOULrEzFOf9naG894klTTxljYUy1EkruWAkqbO2BlLOswhEU9nHk5arB2aKqNFUucLdZc0ET4Z3e3tyqM1D+BhwBqo1FlLKrVA2I0yj0ewiqoI761k04gxu+4q78Eo7o1mUMBbJZBqvjKjur6Xwq2K2DhFkKJqASH/euo0dhmrxuogJPyIRQiTUSNVMGMryLPbrVlffP35sDSd+/0ne3pqdTKc8uNKnwMIQ4Eg0izafCiP1h+Iln8/1LIJNszPfL2I0la5wt5D27832LHZj2izUZiwSUqk8EkAIUeZfOh8hxAFCiNdybgEhxFeFEJ1CiIeFEGus+46c11wuhFhriemn5Kwvt4T1tUKI60S536ZGswcQS6mTWCNrFnamUDnNolQIRqbjODDLh0/8yoPoJMhwJKk6zeas2wQsY9HscZJw+HAmQzQNr6VHTqWr2TYWyrOYY3kWK1/YBMDaHVnvIpZMlw9DOe0wlPqedupsLdlQfreT6e0+1uwIlXw+kGMsEmkJ+6quuxHRVNEYiVxjsZtHqkJtxuJOIcTPgXYhxOeBR4BfVHuRlPJdKeUSKeUSYDlK67gHuAx4VEo5D3jU+hkrPfdsYAFKVL9BCGH/S/4UuBCYZ9206K7ZY4lb4RHDEAjRmHUWWWNRe+qs026CVy6Dx7rC7hRBBiPJCp5FkmaPE4chSDj8zEquw5UK8oacm9EsBsJxDAGzOvMzrrYMZYch5XWjLcAWsu0OuiMRuAHmT2nm3W3Bks/lhqESaTNjLELCX9EYGe4G1yyklNcAdwF3AwcA35RSXj/CzzkReN8SyD8C3Gqt3wp81Hr8EeB2KWVcSrkeWAscJoSYBrRKKf9ueTi/znmNRrNHIaXMpM5CcXFYo2BfaRdmEtkx/VKps86UNVuibBhK1VJ0UBiG6sw7LBhL0WrVIqSdTcySmwF4w5xLq8+F0xCYEpo8TrqaPfhcDk5bOJV2v4stg1ljEa0QhrJDScNWims8pQoQjQodc3OZP6WFdTvDaiZGAblhqETKhFlHARAQrRWNkeGxNYtIY4ahhBDtwBBwJ/BtKeXDo/ics4GV1uMpUspeAOvenro3HdiU85rN1tp063Hhukazx5FMS0yZjf07DNGQ2VAZzSI3DPXeQ8x++AI8JEp6Fu605VmUDUMpD2KyI6TCUJF+1cvJkZ9KGowlM+mlaStDKGV4eE/OwO9yZEJLtvdx18VHcs0nFzO93VfgWZhlPYsOy0MZjKhwVl6rkhqYP6WFRNqkJ3f4koUdRrPfl30Phc/8gdd8R1SZ9eFim+xADqy3wlANInALIdxCiFuAHuDnqNBTjxDil0IId60fYB17BvD7aoeWWJMV1kt91oVCiJeEEC/t3Lmz1i1qNA1DNuvG9iyMhvQskqmCMNTb98Dtn6K55yHmi80lNQu3PS+73BWxtw0MJ9NcYYZszaJArwDlWdhVzm6fituvd+6H0+nC6TAyGU72bO0F+7TR5FE6Qq5nUSkM1eFXxijPWNSgV9jMn6JO5Gu2F4ei8jUL6/e0/4nE0kbFz/C6DF4256l2KQ0WhvovVJrsvlLKZZb2MBNVyPf/RvAZpwGvSCm3Wz9vt0JLWPc7rPXNqHoOmxnAVmt9Ron1IqSUN1p9rFZMmjRpBFvUaBqDeEFbicb1LHKMRWAr3H0BdM4FYK7YWtKzcKWtE3U5YyEE+LuY7Axnw1AFegXkG4vuLhWieiY6M2Mc7C6yTQWV0NM7lGdhV2VXMhbtfnU9PBRRJ/aE1dyxVvaf3IwQ8G4pYxEt8CzsxyWGSeXidTl42TwAMbxxt49UhcrG4kzg81LKzLe1Hl+CSqetlU+RDUEB3Ed2hve5wL0562cLITxCiDkoIfsFK1QVFEIcYWVBnZPzGo1mjyJrLNSJSWkWjShwqxOuy2HAe38FMwUfvwkpDOYavUXGQkqJx7Q8i0pXxP4uunMF7qZizyKQE4byNalmgq+l52aK2ewwVGH31untPiKJdMYAxJJmyVYfAO2+fM8inkqPyLPwuR3M7PSzZntxRlQglsxoIoWdbSuFurwuBy+aB2QXGkizMKWURQE3KWWIMmGgQoQQfuCD5Lczvwr4oBBijfXcVdb7vo3SRd4BHgS+KKW0fdmLgZtQovf7wAO1fL5GM9Gwwzd2VlGjexZup1BjT9tmwtRDkO2z2E/0Fo1WTaRNmoTV0K+cwA3g76JDBLKaRYG4DUp0tk+29nu9KedkjIU3E4bK9wRmdKheULZuEUulSzYRBCXct3idWc+iylV/KeZNbinpWQRjKbqtFN94obGo4lmskjMx7YaCDRSGkkKIDqsuIu9GjeNVpZQRKWWXlHI4Z61fSnmilHKedT+Q89yVUsr9pJQHSCkfyFl/SUq50HruS1I2YJWSRjMG2KmaeZ5FA9dZuEnCuidg/skgBKJ7HnNFL7Fk/ikikTJpwjYWFYRZfydtMshQJF5Ss0ilTYYiyUyKLPudwOB+H2GdnIbfrTwJb4FmYTO9XYnhmy3dolIYCqDd72IoR7OopcYil/lTmunpCxe1G
glEk5lK80ROtlQ8beKuEOryugxSOIlNWaYWGkXgBtqAl8vcdAW3RlMHMgK37Vk4GtOzsE9yXTteUKmc81QNreiaxxzRSzyZzD8+ZdKEpVlUDEN105weJhkNqIaDBZrFoHWlbxffMfso2j59K13NvownkZsNlUuuZyGlrBiGAujwuzOfV3IkbBXmdDeRMiW9w9G89WAsmfEsRhSGsgxJpkXIbuw4CxW6zkopZ+/GfWg0GshckXsznkWDZkNZ3k5X7+Oq++mcY9QT3fvjEwm8kW2o+lqFHYYyhRPDUSGZ0t+FNxWgNT0ITni5z8Gk/ggzu5RXYGsIGc8CMAzB1Z9YhM9lexalBe52vwu/28HmwUgm/OMtE4ZSx7sznkV8hKmzAK1WqCyYkyoLKnW2zefCldPNFiCRU19TCjtk1j/1aKZyPbTvW/bYeqC7x2o0DUSRZ9GomoV1kmve+Rrsexi4rDh6l+om2xruyTveDkOlnH6V9VQOfxcGJnNFLwA/fn6QX/xtXeZpu0Fgpz/f4Jxw4BSO3E95IeU8CyFEJn3W1oa8FcI+HX5XxrMYaeosZIcYFRmLaJJWr6t4dGvVbCj1XF/nMviPdTApK3YPhBOc8P0nWNUbGNEeR4I2FhpNA1GYOtuo2VD2npyJQH7GktV6vCOyIe/4ZNo2FlVay00+EICTDDXXYkC25LXNGLCGGnU2l/dOvGWMBWTTZ+3BRJU0CxWGytUsak+dBWjxKM8iFM8ai3gqTTxl0upz4XYaeZpFtTBUXvv3AuF/VW+AdTvDZVuMjAXaWGg0DUSs4CRmiMb0LBJWGMpIhvOF1uYphPHRGduYd3w8pcJQaVcVYzHrKFKuFs5w/B2AAVpYvS2QqY0YKBGGKqScwA2wT7uP3uFYJtxn12SUot3vIhhLkUqbeS1YasWepR2K53SZzXTMdSpjMYJsKF+FORlbrQyvwiy0saSWdh/XCCEWVDtOo9HsOkWehaMxe0PZYSgjGcw3FkKwxTGDudE3YDDrXSRSJs1EMasZC4eL0KyTaLbSbAdkK4FYim3WHO0BKwzV4S9vLOyTarOn2BOY1OxhMJIgbF3tVw5DWYV50eSoUmftMFQoJwxlG4tWr2vExsI2gqWMRe+w+v2UKoYcK2r59quBG4UQzwshviCEaKvbbjSavZzCoryG1SzSJg7SGKlYUVbOU74TmJFYDz9aDG/cCagToV/EkdWMBWDOPw2AuHQSN5QWstoKrwyE47R6nUXdbnOxDUCzp3g8aVezGymzJ9dqqbOgqrhHkzprh8Fye0HZs7dbvE7cDoN4Ol+zqPy97PbvxQYh41mUeG6sqKXr7E1SyqNQldOzgTeEELcJIY6v2640mr2UeCYMlaNZNGidRTYVNt9YPNp6JpdMukUVzG1+CbCyoYgia6g69h10CnHpZIBWjp6n2vas7lXGoj+cyNQolH19pt1HsSGww1f2ybWaZgEwFEmMKnXW4zRwOUSeZmH3hVKahSNzcjdNSTIta/IsoqXCUBnPYhzDUADWXIkDrVsf8DpwqRDi9rrtTKPZC5konkUiLWkpYyw8LoOtsksJ31FVc2tnQ9ViLLzNbTwhl7FBTmHFrA6mtXl5d5vK8hmMJDJN/spRLhsKoKtJGZotGWNRuc5CfWZyxF1nQWVfNXuceWGo7QE1PW9yiydP4K5luFLFMFRGs6ifZ1G2zsJGCPEDVNfYR4HvSClfsJ76nhDi3brtTKPZC4kn0wiRnUDnNAwiqVSVV+1+kmmTZlG6yK7Z46SnLwztXZmZFAlL4BaVWn1YCCH4X/dXGQhF+cGUFg6Y2pIJQ/WHEszo8Fd8/cyuJvxuB5NbvEXP2cV8dvfZWsJQWwYjRJPpkoJ5NVq8rjzPYrulvUxp9eJxGCQsT8A2FpUMksMQuB1GkWchpcwRuMdXs3gLOERKeVGOobA5rA570mj2WuIpE6/TkZnD7DAEDRiFIpXOad9R4Fl0N3voDydU9XXE8izSJs3Eam5R4WtqJoyPeZObOXBqK+/vDJFMmwyEE5k52+U4dl43r37zg7SV8EDs1262Tq7lxqpC1ljc/+Y2pITD5xb3qapGs8eZN+yodzhKu9+F1+XIE7gzk/iqhLp8bgexRL6xCMRShK21UkOnxopaTOVrwIEFY6+HgQ25PZ80Gs2uE0umMwV5oDSLRhyrGk2maTNszyK/VXZnk5tgLEXa24Fjx2oAEok4HpEk5qnuWQC0+9y4nQYzO/0cOLWFZFry/s4Qg5FExRoLUJ5JuZqIdr8bIWrzLJo9TpyG4IWeAfxuBytmjcJYeJ15RXnbhuNMbVUej9tpMBQdobFwOYo8i605A53GNQwF3AAsA95ADSJaaD3uEkJ8QUr5UN12p9HsZcQLsm4cDSpwh+NpOl0qjbWwVbYd6om52mmyNAsZV1PyDG9tnsWsLj8SidNhcMgMlYD51Hs7SaZlUfX2SHAYgk6/m76Q0g4qaRZCCNqtYz+wX/eIBW6AVq8zk3kFKgw1tc0yFjkV3MkawlCgPItoQcZTbu+pUnrGWFHLt+8BllpDhZYDS1GhqZOAq+u2M41mLyReUCnsbNBGgtFEmg5H6TCULSKHjFY1/jMZQ8aV5mDUGIb69kcX8svzDgVUQ77uZjd/fVvNT6tUkFcLua+v5FlAdmLecQeMbphas8eZp1n0DsfyPIsRh6FcDqKJfA1r65D6d2jxOsddszjQmjUBgJTyHZTxWFfhNRqNZhSottm5noUxdsYiFYeH/l9GR9gVwokUnQ51dV5kLCzPIiCs8FR0ABLKs3DU6Fl4XY7MgCMhBCtmdfLKxkGgcquPWrD3J0Tl7CPIZkQdN2+UxiInDJVMm/SH40wpYSziI9AsSoWhnIZgRod/3I3Fe0KInwohjrNuN1hrHiBZ7cUajaZ2ijwLYwwruDc+B89eB+/eX/trUglY8wis+hOY2ZNUNJGm1TYWhWEo68p9AGs9MgBxNTHO6RtdW+0Vszuwp9jsShhK7U95Ph6ngajU1BDV1vyAKS2ZrrcjpdnjyqTO7gjGkZJsGKpE6mw1Y+F3O4gUCNy9wzGmtHrxux11rbOoRbM4FzVK9asozeJp4GsoQ6EL8zSaMaSwB9GY1lkMWe03dtaQ8W6a8OJN8PiVEBtSa5MOhI/fBFMXEU6kaBMxcPnBkX8asYvm+tK2sejHSCpjUatnUcihs7Pi8q6GoWzPoloICuBbH1lQNLxoJLR4nSSs3lLbLG0hV7OIF4ShPFU0C6/Lwc5gPG9t61CU6e0+nA5R1wruisbCKsb7k5TyJOD7JQ4pHjCr0WhGTSxp5qVzjmnX2cEedd/3XuXjpITb/gnWPgxzj4fDL4JkFP70VXj6h/CJXxJJpGk2SqfCqnYcgm0pK/MpOoBIqjCUGOUo0IP3ac1kAnXtYhjKNjaV0mZtWr2VCwCrkdsfatuwOsnbmoVnFJqFvyAMFUumWbMjxHHzJzEcTRa1Qx9LKu7MmoEd0f2gNJrdQ109C9tYVPMstr6qDMU/XA6fvQcOOA0WngnTl8HAegBlLIgUhaBAaQydTW62
xq2iuEg/RsK6rqyhgrsULofB0pnteJxGTSf5StieTy2exa5iV5EHc5oh5gncaRMp5QgF7qyx+M3fNzAQTvBPK/bF4zTGPQwVA94UQjwMhO1FKeVX6rYrjWYvJZ40i+osxkyzsLvADm2AZAxcxRXOALxzLxhOOOzC/EFFnXPg7XsAiMRTNLuiZYvsupo8bIpZJ+PIIEYqoh6P0lgAnHPkbBbs01pVZ6iGramMtDHgaLCNRSieYnsghttpZIr93A4DKSFlypo1C29OnUUgluQnT6zl2PmTOHK/Lu54ceO411n8xbppNJo6Y1dw2zgMg/RY1VkM9oC3XWkQ/Wth6sLiY6SEVffB7GOKBuzQMQeigxAdIpJM43NGiwrybLqa3WyPpNTzkX6IW1e8owxDAZy6cCqnLpw66tfbZMJQFUaqjhV2RlcwlqJ3OMa0Nm/G2NmGIZEys55FFc3C7856Fr97biNDkST/cYqamOfJaUxYD6oaCynlrUIIHzBTSrl39oJa/RdIJ2HBR8d7J5o9nKIK7rGaZxEPQaQPFn4c3rob+t4tbSy2vw0D6+ADJQIHHbPV/WAPkXganzcCnuklP66ryU1Pfxh8HcpYJFyYCAzX6LKKxpJuW+Ae4eS70ZDRLOIptltZSzYljUUNYaiUKUmmTXr6wkxp9bBwulIJPC6D2DgPP/owquXHg9bPS4QQ99VtR43Ik9+DJ75b/bhEGHrfUJkkGs0oKEydHTPNws6E2v8kdNgEfwAAIABJREFUQMDOMiL3qvtAGHDg6cXPdc4BINW/jkTaxGtGyoehmj1qXra/C6IDOFJh4sJXef72bqKzydYsdl8YKhhLsi2QLciDHGORNjNzLWqpswClGQXjyYznAiqsNq7zLIBvoRoGDgFIKV8D5tRtR42GlND/vnLhZZX/tA9eBj8/Bq5dCH+/ofrxGk0B8VRxb6gxyYayxe1JB0DHLOVZlGLLKzBlATSXKEKzPItUn6rHdafDFYyFm0giTdqrPIupiQ0MOrtLHru7afe5MMTuCkMpYzEYSbLNCkPZ2CGnXM/C46i8p9zRqsFYKq8Nu8ep6ixknc47tRiLVImGgXvPWTC4TbUsSMXU43KEdsDrt8N+J0DXfvDXy+H352YqV8eTB9/alpdBoWlMpJTKs8iJWxtCYEo1HGeXsI1FxxzoPgD61pQ+Lh5UoaNSeFrA343Zn2MsygjWtogcc7fD8BYOSb3FmtYjduUbjBmGIehq9uBzjbzl+Eix53A/v66fRNpk0YxsYqntRcRHEIbyW8YimkgTiqcyxgiUZ2Fagnk9qKlFuRDinwGHEGKeEOJ64Nm67KYR6V+bfTy4vvxxL94M6QScdjWccx988Nvwzn3w6P/Uf48V2DQQ4Qu/fZlfPlNh75qGIJpMIyX4c64WnYYK26RLXS2ufwp+tETdAzx9rWrnUYrBHiU2+zqUd7HzXXj/seLj4sGyojWgQlGDPbhJ4jCTFbOhACKOdgjvwEOSDZ1Hl3/f3cx3P7aIzx9b/wCJx+nA7TB4em0fAIflFBfa4caRahZghaFiBcbClTU+9aAWY/FlYAEQB1YCAVQ1995BnrHoKX1MMqaqXeefCt3zVFz2qK/AoZ+DF26EbW+Wfl0qof5z1pEdQZXb/fA72+v6OZpdJ2xlDOUO2XFYQ5CKdItVf4LfnKkuYN77q1p74w6VjFGKwQ3QPkv9bR52IXTtr17/ym/yj0sEK6e3dszBObyh7EhVG7twLmio50PSy0DX8vLvu5s56eApHDi1glEcQ5q9TiKJNLO6/EzO0Sw8OZpFIp3GYQgcRmVNx+dWfxvRZJpQQRjKrhup10yLWmZwR6SU35BSHmp1nv2GlDJW7XV7DP1rweFRot9Amavzra+qTJNl5+Svn/Bf6kruL18rrV/c9yW4ei7c84Xy772L7AyqNtKvbRrKGA5NYxK2upM2ufMruKFEaOHx76oLk+75sGOVuvDoe0+ltpZicL3SKgDa94XPP6pe+8Yd+cfFg5UHFHXMxhXeSqewLnKqeBaDqOefNhfR5B//TKjxwL76L5yHUZgNVcvYVtuzsMNQzZ58gRvG0bMQQswXQtwohHhICPGYfavLbhqR/vfVVVjbjPJhKLt3Tsu0/HVfB3zgy7DpOQjvzH+ubw28cacSE9+5D351mkpZHOvth7N9ZB5btWPM318zdoSt1tN5noWh/osW1VoENsPso2H6ctjxDvSvATOl/hYLBfHQTvX3Nm1xds3dpLS16FB2TUqVYlvJWHTOQUiT+WKz+rmKZ9FvqucfM5dk4vd7G/bV/2Fz8rWgQmNhj9KthG0sQvFUCc3C8izGMQz1e+BV4L+Af8+57R30r1X/qTpml7/6t0NJpWK9bfuq+9z/lKB67Di98Om74IJHVPvoWz8yJu2jc+kPKc9iWptXh6IaHDsM1VxNs0iEITasLk4mHwTBXuh5Rj0nTYgH8t947cOAhPmn5K/7OvI9kVQczGTlwrkOFedfbLyvfi4TsvK7HXhdBm+7FjJ8wCd5MH1Y3oltb8L+91wxu8CzsLOh0mkSaRN3DXUfdjaUPbypUOAG6tbyo9ZsqJ9KKV+QUr5s3+qym0YjnVLeRNf+6j9JOc0iZiWLeUsYC2+7dUyOsRjaqDKnlp8HTd0w5WD4xM0wvDErVo4R/aE4bT4XpyyYytNr++o6SUuza9iehd+dX2cB5KfPBnrVfes+MHmBevzWXdnnC0NR7z2oDMvUQ/LXC41FpYsem32WknC18nHH3yoeK4RQLT8STaz9wNUEaMqrCdibaPW56GpyM7c7f6RsrmdROCGxHLax2GF1ns1LnbUE7lidai1qMRZ/EkJcIoSYJoTotG912U2jMbRBufZd+6sskEhfaUE6XiF+67VS5XI9i00vgEzD0s9k16YtUfeBLeX3s/ZRePOu8s/bh+0IctvzGwHoCyfoanZz2JxO4imTNdt1o+BGxdYsSnoWuZpFcKu6b5mmLjQANj2ffT7PW0jA2sdg3snFBXG+dkiGlUcBStyGygK3y8u6KacwSVgXSBVCVl3NbvpDiUwn1NzvtTfxlRPm8cOzlhT1tCpMna1lbKvfCkPZbcrzi/LGWeBGzbP4d1S67MvW7aW67KbR6LdcbduzgNKhqHgADJcKKxXisz2LnFIV+3FTTpGSrwNcTTC8ufx+nrwa7v6cuq9QeLPyhU385z1vEk2k6Q/F6W7yMH+KOgG8t72+2Vea0WMbC3+eZmF5FrmaRa5n0TIte0HSNlPd5xqLjc8qIzD/1OIP9HXmH1/poieHtybnVHdXCFl1NbnpD8czxqJ1Lw1DLZrRxrHzi4scC4vyahK43baxUMkqzSXDUOPkWUgp55S4za3LbhqNLZZN7No/ry9OEXYGSalWBqXCUHZMOdeFF0KJ6JWMRWArOH1qIM1fv1HWYAyElU6xeTBCf0h5FrO6mnA5BO/t0MaiUcloFu4cz6JU6myuZyFENhQ1+yh1n2ss1j4KDjfMPa74A+3iuxEaix73gawxp1c9trPJw0CuZ7GXGoty5KfO1uZZqOl+ZcJ
Q4yVwCyH+I+fxJwue+05ddtNIpFMqB32/E6Cpy+qLI0prCrFAab0CSoehYgHVAtrlyz+2bXp5Y2GaSsg8/EI4/Avw3E9U6m0Jg2Ebi40DEfqtMJTLYTC3u5m1OgzVsITjKQxM/ANvqmZ/iXAmGyovdTbQqy407Kv6yQep+1kljMXwJmifqbKfCikyFtbfRpXOsOFkmt+K06FrnvKGy9Dd7KYvnCAYU9OX91bNohyZMFTSJFmjsRBC4HM52BFQxqK1ZFHe7g9DnZ3z+PKC50r4tMUIIdqFEHcJIVYLIVYJIY60NI+HhRBrrPuOnOMvF0KsFUK8K4Q4JWd9uRDiTeu568SuNrSvhXfvV1dwh35e/extgxX/oorvCg1Gpdx0p1uNnszzLMp4IpU8i0ifylRpnQGnXqVScl/9LWx7o+jQwYgyFuv7wgxGEpmc93lTmrVn0cDss+0x/u75Mq6bjoe7zofrl7PPNpWlnudZBLaoEJTNjBXq4mPOsern3AuTcB/4y/RkKutZVC5WiybSPOA+Gb78EhjlTyFdzW4SKZNtgdj/3955h8l51An6rc65e/KMNMrJSpacA+C4DkQDNnEBe5cDlni7C0u4u11YA3fssQsLR7glmXCEJRgwwcZgbLBxQrItB9mysjTS5NQ51/1R9XWaTqNgBdf7PPN0z9fVX3/VoX71ywhRnT9iqC4k2K4ZClT4rBUNVamteUo+i+feDCUa3K/3fyM+B9whpTwD2AQ8DXwYuEtKuQq4S/+PEGIdSkCtRwmjL+m2rgBfBt4OrNJ/bQmroyH9wFeQ4cHqcMOrPwGdy+Gn76yu+ZSJgrtJM0Grh0DV+AZhtokxlRFeS1SbHkLa9LBRK3t1zGJWuOy2oVmkLJdkXt0X5OBUimT2+LVeNBwh936G63d+kGkRhuu/Dm+6FdxBNjz6MaAmGio2XJ3Tc+br4D1/Vkl3rkC1ZpGYqPaNVTJHWFjm0eZmqES2UJUL0ghrk7J/MknA7TjqpkWnG3N8Fm02Y/K67CVNs1401InwWcgG9+v9PwchRAi4BPg6gJQyK6WcAa4DvqWHfQuwmkRcB/xASpmRUu4FdgHnCyEGgJCU8gGpyil+u+I5x4WJ4QN4Dt7Lne5rkKLiLXL54ap/VglRI0+WjzczQ4FycteaoeqNDw+q23oRUSVhoXeUVv7GzME5Qy3N4pH9ahGw2khaTu5dY8YUddLx7G846FnDOzyfho03wMorYfU1OHNqt1+oNUNVahY2u9rEwNxw2GQbwsLK7Wmz9Wkqm68K721Ep96k7JtIHHUv69MRh92GTZRDZ9sWFjoiSmlrJ0eexSYhRFQIEQPO1Pet/ze2ce7lwDhwixDiUSHE14QQfqBPSjkMoG979fiFQOXKN6SPLdT3a4/PQQjxdiHEFiHElvHx8XpD2mJyVF3GrUMh/vXOmlLOXavU7WzFpWaizXdjnkh1NFRDzUILi3qmqJJTUy8S3g71o56tFhbpXIGkrjA7MhPHRpHugGWGUtf4rPFbnHzEhjnsGMTjroiocwWwF9LYKJZ9FsUCxEfnVguw8EbKwqJYVIKgkRnKHVTmqyozlKjv36ggkSm0JSy6tWZxcDr5vA2bbYXVh7tdBzeU83ACLge2ilpSJ8zBLaW0SylDUsqglNKh71v/t7NNcABnA1+WUp6F6t/94Sbj6+mossnxetf8FV2/6tyenjr1+NskGVM7rZ7uHr54924OTiXJ5ovcdMvDfPFRtWtn5kD5CY0WfwtPuFqzaDQ+pGVgI81C2CGgZasQSruo0Sws53bQ7eCbzn/hZsctpdILSzp9uOw2djYKn73332DPHxrPo5b99x/zjPPnJVJCfJQJ0YHfXbEI60XbR7qsWcTHVI5OqJGwqNAs0jNqbCPNQojq8VapjxbmomQ2j8/VhhlKf+9yBfm8zd5uhctuK5mh3G36LKyCgbXRZZawOV6Jt8ezVdQQMCSltLKFfowSHqPatIS+HasYv6ji+YPAYX18sM7x40Ymphb26y48A4DfPzPGg3smuWfHOJ+++yAxW4iiJSykbF18zVujWTQyQ1nCop5mER2GYL8yOVhEFqms7wosYbFxYYjNtt1cYHumtMNz2G0s7/HXz7UoFuGeT8HWbzaeRyXT+1U9q++8EnKp9p5jqE96BvJpRmVHtS9Am4P8pMt5FrUaZi2Vi39ClcVuqFnUjm/1PdYks+1pFlava8AIiwa4HPZ5JeVBWbOofU/tNoHTLk5obagjQko5AhwUQqzRh64EtgO3oRL90Lc/1/dvA14vhHALIZahHNkPa1NVTAhxoY6CekvFc44L2YQSFqsWL2RZt5/fPzPGndtH8DrtvO+KlezPd5IY26cG51Iqy7uZz6Kug7vOj9LpAX/vHNPSw3uneOLp7RRrTQ9NNIsLBwRBkWKZGCbkLO80Fnf6GJ6t40BPTqp+HO0WM3zqVnU7vA1+9X7TFfBo0E21RgqRKht0SViICs2ilJDXhmaR1MLC39X4tauERQtzqkYJi9aLv8dpL5mfAsZnUReP00Yym5+XGcpKzKtn2nM77CckGupY8F7gu0KIx4HNwP8EPgVcJYTYCVyl/0dK+RTwQ5RAuQN4t5TSWuXeCXwN5fTeDdx+PC86n1JaQDDcxWVrenhgzyR3PDnKZWt6OH9ZF4dkN/ao3v23k8jkCasfYrFQoYk0EC51wmfv3TmONz1G1FljWossUkKoogSJ5dw+N6wiWxyiiJgsd0XrCriY1AKlCsv0NbUXpGTr/mn+9TcNWm+CygMYPA8u/RA89l2lZYxubzze0BgtLIYKYXwNzFClaKhSoENdt1158Zdy/ppFNt7SuQ3KDFVlLmuCZYoymkV9lnX72TUWn1fobNkMNVcAux22E1pI8IiRUj6mfQhnSilfKaWcllJOSimvlFKu0rdTFeM/KaVcIaVcI6W8veL4FinlBv3Ye+TxajJrvZ72L9i9Ya44o5dsvshEPMM16/vxu+0ckt24Eof0wm+FGzYJna0s+dFKE6kjLIamU/SLKUZkzQ6xTkSUpVms9UyWx409Xbrb4XMxncjO7dNrCYvMLCSn+MZ9e/nC3buIPvzduR3VJnaqhk4brofLPgIv/7zqvPaDNzZ+DwyN0cLiUC5UvVvUwsJPpqxZJMYB0Tx3ophTod1WWfwan8WjB6a5+Rfb1XfgCMxQiWyh7f7VlikqaBzcdVm3IMSzo7F5RUM1MkOBJSxOTc3ilESmo6RxgUMV4PO57DhsgsvP6CXgdnBIduPIJ9WPrJ3Y9MqSH/VKfVRSR1hMTo4TEGn25WqeE9G1gGarhYVNQDitdqB57KrfgabT7yJflERTNbkW0bIbqDixiwf2TAISz13/Ax74UvXYJ28FBKx7pXKGnnMjXPRuVaE3YyKt5k1cCYv92VC1ecdtmaFS5Wgoa0FvlAxXmTuR1BuGGsFy6yOH+Maf9vLU4aiqDzUPYZHXCWT+NsxQUM61MJpFfdYNhMgVJIWinHfobD0B7HHajbB4LrFnoyRtalfndti54ZxBXnXWQsJeJ363gyGpf3wzB5
SzGlrnWYCKiGqVJevrhFyyKjEvP612/dvjNSaCkmZRdnJPJbJEfC5sM/tJOTtIhlZUaRaWWWAqWWOKqojAGt63nalElh5mcGWmqp3zoJo59W+stpt365Diyja0hvaIjSDdQWYKLgJVZqiyg7ukWbRqe1opLBITSuN1uKqG7JlQAv2320fV+Gxct/ht0fgISOpIm3Yc3FBOCDWlPuqzfkF5HWjfZ6H9QHWEhcthO6FVZ5932HNxMrbyD/Lm6zbw6deoLmN+rVkASgNo12cBSrNoJVw8FSYrIFco4kioneejM97q5KxAnyoSV6FZTCezSvWf3oe3dwWhxWdWaRYdPi0sKjroAUqzCPSDsDG6T40/z3OofN2VZJPlRcmi6xgLi2IRnvm1almbr+NjOZ2IjVD09wFUaxaWz0JkKjSLePPaTVWaxURd5/aecVV94M7toxUm0pm2NItkxhIW7WkKlhnK5FnUZ1l3AI/OvJ5PuQ+oL4DdRrN4bnHn42Qd9X+QfpedwyVhcbC1WQnKAiA1o3wCzcZbP3YtLEZm0/QJZU7Yl4uwf7KizIjNRtY/wOjBsgN7Mp6l06eEBR1LVJG5mQMloWaZBaySICWih1WxxPAismM7Wdrl45rusfJ1V5JLzE3csgot1hMWQ1uVT2M+bP8p/OAN8JXL4LPrj3lTqHkzvE2FFh8PYiPkfEpY1PdZpClYDu7MPDWLGhNUIpNneDZNT9DN08NRJor6c0xOtdZaoFQqpn0HtzFDNcNuE6zpV2tBO82PoCIpr6HPwmgWzxnuQpy8s/6PxmG3kXSGydncyrGcbsNnUengbjH+wcPqx1hMKjvywekkHaiFfkqGlJ25gr25Tg7v28mhGZXrMJ3M0uWzKa2nY6nq8Q0w9gxQLsEwVRsRpYvTFTuX448f4OKV3awX+wGQ6Znq0Nhcam7FXKdXRWdN7GQOv/yvcOvb6s63LlLCA19UJSxu+IZ6/779Snjk282fU++1jxXbfgD3/K/jk4QYGybrVZFuvrpmqFQ5zyI7H81ico5ze++E2mz89QtUf5atY1ZI7iHVkrWFZnFwWn3PQt72zEqWGcqUJ2/MugElLI6Fz8I4uJ9Dsvkifpmk2ERTCLidTDv7VUJcO5U6PTWqPjQ0Q92vhcW+Q8rhfGg6RVCkkMJG1u5l+3C1sBgqdtInJvmPP6hGTVOJHMtcOnO3Y6lqpSlscMeHYeag0jqgOnxWSqVZhBYw5VrIIka4aHkXC9Jq8RWFbHXiXT1hAarvRz3NIh1VO3OrmVQrDj4Mh7bChe9SEVdv+z0sexH86gMw0cDMte0H8IVzy/6Z2Gh1scdG/Paj8KWL4eGv1i/gaGEFABxrn4zO3k65VWZ+VVKe3Ym0u/GLDEVZYYZyNduYWMJiSmsW1Wao3ePKX3HFGb0s7/bz8Ih+wPJ7tRAWP9pykLDXyUXLm+RuVLBpMMLqvgCreltHWT1fWbdgfsLC0zQayn5C26o+r5hJZQmKZNPF3+92MGnvVZpFJqpKkNub7JycXtVJL9U6GmrHjPpIntmn/BBD0ylCIgnuICt752oWw1kvIZL84M8HGY2mmU5mWWLTIZORJapHxg23KDPQVy7Dmx7D67QzXSksUtOQT0NoIXuLfUREgvM74nhj+xmWuptapd8im6jfx6BrFemRHXzvwf3Vxy1BYyXyteLBLyoBu1mH4rqD8Kr/UEmLt71X+TMqkRIe1BFbo0+p26/9BdzdRtuVZ36pBMCvPwA/unHuuTX5WZ0Md6yFhc7eTriUBlAbZSSdfp1nUeHgbqZZOL1qk7D77rpFBHePJ7AJWNLloy/kYSSrhb5VvbiJsJiIZ/jNUyO8+uyFpVj/Vizt9nPn311KT9Dd1vjnIxu0sGjXD+RrUO4DVOVZY4Z6jphJ5giRxOZtnDfhdzk46FgE48+oWPYW9f9VDZ5ItYO7zo+yWJQ8NaU+kv2WZjGTotuZRbjDbFgQ4slDs6UciWQ2z1jWQ0CkEcUcH/rJ4xSKkgVFvV20uvutfyW89U5lwrj9g3T6XVVmqFvv0RVZQgvYllQ7xt7ttyCQbLPrmpGVfosGmkU8uBSPTPHMzmerH8gl1e2TP23+PlnsvRfWvrzaLxLsh2v+p2oTuuPX1eOH/lzu6zG1FxKTSuuriAKrSy6lMtZf+Ldw7b/As3fAPfUFTGJC7bwL48fY1BUbBSDq1MKixhcgXQH8IlUObGgjYomz3gT77lX5PDU+iz3jcQY7fCq72uPgQD6itI+n9GfT5Nw/2TpEriB54/mL5zFBQys2L4rwf990Dpetaa+e3doFIdYvCLGmf+5n5XbYTtkM7lOOmVgCn8jg8EUajgm4HWy1b1a78V2/ayuRCU+kHDrrClTXeNIcnk0xllNVR+Mzk4xF0wxNJ+lxZsAd5MxFEaYSWYa03fjwTIooPgA+cvkC7tmhNIqe4qgyPVVm+fatg0s/CE/fxkucW6vMUIf2q93ymOjitpkljDoWIvROfaf/LDXI0iyKBShkwOmjUJRVyX27Cv0AeGP7yq8rpRIWvi4Ye6rkO2lKNqFCiGtZ+3J1O1OjuTz8FSWwfd0wtRusjPWasilzGN+h7PS9a+GCdyA3/yX88dNzzWVS4suobOjMyDwd9a2IKY1l1qGE9Jz8BZcfP5lqn0WrLOvNb1JFJwH81QvQ7vEEK3qUEA64Hcxkgc1/WXpP796b4ot319eefrx1iPOWdpSqFxuODUIIrt3QX6oa24qFES+/et+L6A165jzmdphoqOeM2KxyLDv9jYWF321ni1inwlaTk81zLCysYoKZ2YaayO7xBFmc5G0ewiLBPTvGGZpO0WFPgSfE5kF1TduG1MJ9cDrFrFQ//JvO7uB9V6wEICJSKly31jR28fugayU3ZH9WpVm4kkoTueOAnccnBD+94D9VJ7511zHhU+csaRZaS/jDvgTrP3oH332onOPxSEItTOHkvvJr5jNqQd74WmWKe/g/mr9PFcJoDq4gIKq1nEIenvqZagDUu1Yt9BNas5k5OLdmVSEPP3uXEhSW5tG7HoTgsc4XA7Bvb432kJzEiWoNesRmqEZFB3T29rRuGDmnqZDLX646m8+q+l0t2p4SGig37aoInS0WJXsn4izvUc8PuB3E03nVAVJz194k335g35xTJrN5do7FedGqI6/mbDj+eIwZ6rnDKk/uDXY0HON3O5jKOmHxRepAO5pFaKFaaNJR8s4AH7vtKeKZ6ixqqymR8EUYcGf4t9/uYHg2TUikwK3UTpfdxuNDKqx2aLqsWZCe4e+uWs3v338pfe5M/WuyO6F/Ix1ytkpY+NKj5KWN//OQOu+6Jf2qK+Brv10dhw8l/8Nvd0aREn79xHDpPA+Mu0hJF13pikq4lgmqYymc/RbV13xqD9z2Prijtltv+fx1hYXNpoRgbVHGYg66Vqjoqak9ZWGRT5XrI1nMHlC1rB7+iso/sbtLjYMe10P3HqjWSKR2bh8s9uCO7qv2a2STc68TlAZ572fgsxvhnzvgE33wp8/NFRp6LlOoDcSckFS3H7/QPotSc6I2v
m8XvENtZrpWlg4dnk2RzhVZoYWF3+0gkSmo+a+4AoDJnJvRaGZON8WdugfKaqNVnNQYzeI5JB1XP15fE2ERcDvUQr/ySnVAawq/f2aUm255mH0TdaJwlr5QmUVGHicqvXzz/n187d7qCq+7x+NEfE5s3giXLHLQ4XNRKEr8MgmeEC6HjXULQjx2UF3j0HSSpE3/eNOzCCFY3hNApGONa1V5IviL8SphEciNM0GY8aTakZw5WH6u3Ved92Et/oO9Xbz5wiVs2TdNKltASsljh2JMEsKbr/Rv6MXU6YUXvV/5b756JTzyLdXPvKII4pzx9ajtOlgZjda5XDl1h7aWH68p4V7yGe24XTnDe1aXNLDHJ1Ufh9GxkaqnRMeV8HiwuBZ7MaM6JYISfJ9eMTfDHeDX/wB3/bPKP3nh38OKy+G3/6Qc9JXo658puJRrq8ZxLFzBcp5Faa6ti/2x/DL4b4fLfitg34R6b5d2K0Ec9DjIFopqJ3rJP8CiC5QPA9UKtRKrrH09O7nh5MHtUP0xjkf5PCMsasgmlBnK1dQM5SCRycOKamFx19Nj3LNjnJd/4T7u31Wzo112qbqd3kdKlxL52r17GY9l+NGWg+wcjbF7LM6KngDCEyEskvzyvS/kO289Hz/JkqawaTDMk4dmKRQlQ9MpPEFt269dQBuZxrwRPIUoqVyeVLZAJl/AW0iQdqjxy7r9RHzl8hAlc1yqWrNwegO8cFU32UKRh/dNMRrNMB7LkBR+PIUE+UKxajwuv4rMOuevIDXF/p7LlUllzz3V12cJi0bd2uqVewf1/nStUPcPPljOKK9tO2stuNFDKtGvd13poS2j6gc2MzlW9ZSZkX0A3F/UOSuTu5SGcP//Udcbr+nKmE3C9tuUJnXjbXDlP8IbfqAcz49+pzoMORMFh5d4VuB3ze1TLdx+1c+iSrNoQ1iA0iQr2D+lNjFLu8o+C1Cd71hyMbz1TmaySljVbnieHY3hdthY3FlH4zOcNBzPPtxGWNSQT6hdovDTVjKaAAAgAElEQVQ0joYKuB0kswWKPetgYLOqk4TKih4Ie+jyu/j4r2oicbpXqXIaQMqmfnCJbJ4r/u0e/uHHj3PTLX9mx2iMlT2BUmc9h93Gi1Z2I9LlznqbFkVIZgvsHo8zNJ0iGNE26aq2rbONTWOeCHZZwEeGyUSG6USOIEm8gQguh41Ng9XzDnjcRKWPQqmMtVrMnR4/FyzrwmW3cd/O8ZIfxeYNEyTFbErb+Gs1hatuhr+6nTdH30VU+ph+7BfV15c9Us0iWO5DLYtlrW+mRrOo1GSKuZKwGI9l2B8X5HBQTM0wU1E7KzF5iKIUPOHYoA5M7IJ998GEdnZnqsOZ2XmnynLfcH35mBCqpDtUm8Z0iY1Epn7Zb+EK4LP6WVhFGtvRLOpwYDKJy2GjP6Qco5Z/JJ4um5ws0+i+Gs1ix2iclb0B7LZqYWY4uXjtuYv43d9f0nbpkPlghEUNBd3LopnTurQjyxXgHX+AC/8GUHHoS7v8vOWipTw9HC35IAC1WCy7RD1P+HHaBX95wWK8TjsfvHYNo9E0M8kcK3r95TBbUBFXxVzpes7UTu5H9k9zaDpFuFM7HKuERZN+GdoHESbBVCLLRDxDQKSwecPcctN5vP/qNVXDgx4Hs9JPPq6ERVEnurm9AbwuO+cs6eAPz45z+xPD2G0Cb7CDoEgyndTCorT46x2p08PB4GYOzOb4Q/FMbLvurPYBlHwWbWoW2qyUdwWhY1n5+OB56j2ojYiyhIUW3JawUMmOgqwzRIR4ydQHkJ85xCQhBhavJIlXaS5WXkflOS2eulVFIS15YfVxK4w1WUdYZPP1K7m6AxWahXqd6bybrfunuH/XBNtr8m6asW8ywaIOb6lvs/U9jmXUZyWlVBozdTSLkRhrjL/ipKc74GZlb7CqN/exwgiLGoS16DbpT+GvVN8rmExk6Qq4eOmZAwgBv3y8pvurFhYx6SXocfLx6zbw0H+7knddtpJ/uEYt0mf0h/SCqK+jJkN8ebef5T1+vnTPbibiGfo6O8DmmLuANtEsAMJCCYupRJYgSeyeEC9Y2c2iGjND0ONkFj8FXeYilVAC0ONTu9sXrurm2dE4P3vsMK85ZxCHL0yQZHlnnqsRFsBDe9W5phZeQbgwzTOPVtR9yulFSmsWKa1FlahpUVvUwuJn22Pg8pXDhbtX1e0kWNICNr1OhRdrrfCpw+qcrkAXYZGoEhb2+AhTti4Wd/nZzjLV+GnHr8umxUrNIhOHZ++EddfNjUazEuQSk9Xj3UFmkjmC9UpouAJ4RI5iIVfSLF7/rSe5/ssP8MavPcRLPn8vX/1je90N908mWdJVFsKBmu9xJl8sJf/tq6hBNpvKMRJNs9r4K57XGGFRg8i2LjlumQtqo5km4hm6A276Qh7OX9rJL7YdrnY0aWExI/0EPco+bdmo337Jcm5918W8cGW3jviJqh13ujrj22YTfPTl6zkwpR3Nnf5q4WI1ZGriswAIac1iMpEhKFI4fPWFo6VZWA2hkgl1PV6/Ov8N5wxy/dmDfO9tF/Cp68/E7g1XaxZaWOTsHv7ljmc4MJnkoT2TRHxOXv26mwAY33Zn+QVLPg4fqWyBN339IV72+fvIWjZYbaKzsAISHh3TgtsyRXWtVP0+5mgW+v285IPwjntLZdafOhxlUacXp7+DAXe6Slj40mMk3D0siHh5c/oDZN74c3jZZ5VJDao1iwMPqCgsKyekkiaaxZ7xOMu762hT2ndjyyZLr5Oz+7jlpvP4/tsu5KLlXXzxnl3EM3nimXyV+awSKSUHppIs6SoLbSsDOK41i0TF97lSWOzUzu3VfUdm/jKcHhhhUcMVS70UHN45zsFKyjuy8o8rky8QS+fp0iWZX75pAbvHEzxxqMI81LEEbvgGv/dcNadksxCCsxd3KPXRGwH0op+Zaxa7dHUPV69TVUoXdni1cNHj8mmVudvIDKV9MZZmMRlXmoU7UN+hb2kWlsaVSqrdrd+vFo6+kId/e+0mLl6hFkKXP0KQFNNWCXS9+P9hb5wv37Ob9//oMR7aO8V5SzsJdvSRxI0tVbHT1maugt3Le773CFv3T5PKFUrCEU9E5WHo81r90p+e0kJ58Fzo26gW2cii+j4Lm1M93r+hdHj74SjrB8Lg7aDHkeLpihpc4cIEOV8/A2EPKTwMdZwL5/51uZ9IpbCwCg1aj1Vi5TzU+CzyTj+HZ9OlZLkqtLBw5BMlB7fdE+LyM3q5aEUXH3rxGcwkc3zstqe46jN/4O3f3jr3HMBEPEsyW2BJheYYKG16lKC1NIxl3f6q8NkdJWFhNIvnM0ZY1LDYn8PexLkNlWaosrCwSn536xo4L904QIfPyV9/889sq9ilsuF6hvKR5iWbK/tfNChUePN1G3jHJcvZNBip3m23qoKrzVB9zjQHppJMxxJ4RK5h9JelWdgy6vzppLqeQLD++d2BDpyiQCyur0Mv/t97dBK3w8af901zYCrJBctUFFdcBLBlKgSqFgJ/3JfgrmfGuOGcQaBcAK+qkRSQ
S86Sk3aenshTLEq44h/hbXepMeFFSuDWOsTdQeVD0hyeSbF3IsHGwTB4IwRknBmtGUXjcTqJYgsNMBBWprHhmbQ1WX3OSjNUk/ffE1EmwyrNIkpMqvNa+Q9V6MgnWz5ZMkPZPOVxmxdFuHxNDz/eOsTwbLpUfbiWAzoSqtoMpTZEloPb0pSthjxW+OzjB2cJuB0sjDQIOjA8LzDCopZ0ExOOxtIK4nWEhaVZdPhd/PidF+N12XnT1x4ilS37N6LpXPPOYZUNkBos/v1hDx95yVpVqbLSjm8tVo0Enl5slwdz7BiJkYwqx7VooImEPE6i+HBq81w2pRasQLC+cLGETipWHWq75VCaD117BucvVULiQl21NGUL4MhVLLbabPWTJ6YYCHv4x5cqB3RJWFRW8EUFJMTxksoVGY6mVRkVhy5aV6ftrCUssvliyU9x6yMqb+IVmxaAtwNfPkomXySdKzByaJ962c7B0mJ5eFYvyA63Suqr1CyaVSEWund2jWYxnVfRSSt7GwsLkY1DNkYWJ15vtV/pn16+nr96wVJefdZCphuYoawci8V1zFDWpieRtYSF+u7sn0wQz+T51RPDXL2ub05Yr+H5hREWtWSijU04mpJmUZHlOqHNLlazF1A7xb/7i9XEMnlGouXy17F0vrlmUdWGtY22rZVmqFY9wd1hQLDEl+XZ0RgpHeXU6PyWZmEvZiCXJpdWO9RIA83CCjnOWufVDmuby8drzh3kM6/bxEdefEaphn/aEcJdR1j8fk+cG84ZJOxz0hdys3tM29BrNItiurwzr4o+A2WGgmont44U++b9e3np5+/jd9tH+dHWIS5c3qmc+54I7kIcG0Vi6TzJSdUt0N25kL6w+mxLmoX1vqWj1ee3OcsCqxZ/d7k3tnqjGM+6sNtE1a6/hDZD5dMxyMRJCe+c786ybj8fffl6VvQGSGYLpOu01dw/lcQmYLCjrB1Y1Utjmfqaxdb90/zs0UPEM3nedNGS+vMxPG8wwqKWNjQLf42tFyrMUIHqfsdWW8nKNqaxdK5u45IS9dqwNu2XUVECo9V4mw08IQZcGaaTOcYmdEJZA+ES8DiIohex9CyFTJKMdBL2zy1iVvm6uURZs8hj56LVAwQ9TgY7fLzj0hWl0L6sM4S3ULEz16G2KeniNeeoxX55d6DUN7pWsyATJa5LnuyuEBbDsymiLt0jvNJvoT/fXz2hsrTf+/1H2T+ZLL2W1Q8iRILZVI5cVFWFdUdUobfugJthrVmMxzJEix5krWZRY+aqwtdV1izyGShkGU47WdLpq9/PQJu6Cuk4ZOMk8BJqoJVaLXMtE1olByYTDIS9VcXqbDZRrg9FWcPoD3t4+aYFfO2+vfz773aybiDEWYsaJ6kanh8YYVFLG5pFoE4y00RcCYPuQPWOsraNqZSSeCbfvhmq3eZK6VkdCdW8uZI1vtuhFrzE7EzT8zvtNlJ2q6TIDIVMgjQuHI2SfvTrFiw/QTZJClfDfgYFVxi/rNAIcknSuLhgeXfJZLKi18/usbiKLKvsOgjYMjFieHHYRMlU9fDeKa741z9w7uceIyvcZCb3lc+fiZK2+dh2cIZXnbWQgpQE3A5evFHnXVh5KCJBNJ2jGFXZ3P4OJXiWdPlKzu+v3ruH/Ql7KSJLnb9FH2t/d9lnoT+roaStVNxvDtoMJTNxyMRJSE/DftadfvWdmtMFEaVZVEZClS7HbS+boTJWy1QHn77hTDYvijARz/CmC5cYE5QB0+uwljf+p4q/b4LXaccmah3cGTxOW6k/rkVtG9NktkBRtuhJbGkWlhmqVXMlT1iVzsilWpuhALwRwjZl1gmKZMvxOWcICup6ZDZJ2uahYQiAFjoypa6jmE2SlO7SrrcW6QkTkKo8iMNug1ySlHQrx71mRU+AaDrPRDxLj6faDGXPxYhJH2sHQuwai7N1/zQ33fIwCyIeNi/q4MCTXQQO76bfOlkmxiFULsZ7rljJNev7yeQL5cYzWrOIECeayuFNKGER6FJnuHJtL//7jh0cmknxm6dGuFz6Srke1vmbCnZfdznPQn9W++N2lYxZD6vsifZZRKWn4XcnUtIs5gqLQ9MpLl/TO+d4qc4ZZU054HLgcdr5+o3ncdtjh7j+nIVznmd4/mE0i1o6l1cVX6uHEKqOT62Du8vvnrMDsxzeVv+ImNZGmmoW7qDqR5CeVX+tmivV7fHdXLNw5WJEfE4CaGdtkwiwvLvCLJZLkhMNTFBQ1mgsh3g6Rkq6S+a4WoQ3QkikmE0oP0A+kyCBu6rHsxUltHs8Xm2iAxy5BAnhZ+1AkKeHo7z921voDbr5/tsu5APXrOaQ7MYZP1R+wUyMPVEbq3oDrOgJcO2Gfq7bXLEYViQtzqZy2FMTzEg/Ho+y9b94g9IwPve7Z9k/mSSGF5muiYZqpVlkZlW5cR3dNFv0qDIv9dDCQmQTFDNxYkVPw+9OyeRZR1jMpHJE/HOfF/A4S9/jsmZhL53vphcsa7vPguH0xgiLI6RUTFAzkcjO8VcAeJx2fC57SbOIpZU9uWkDeyHKfohmCXalF7EW0EqzVZMFyxNGpGZY3RdsS7OQ7vJu3pZPk7c3ERZaSNmzMaSU5NMJUrjpaCAs7LrJVHRWmWby6QRp6aoWFjpKaM94QkU7uUMlzcKVj5O1+0vaRzZf5Os3nUdvyEOn38Uh2Y0vWRYWMhNjd9TGNetLukY1Jc0iQTSdx5WeZEaUtZxl3X7O6A/ywy0qgiqGF5Gt9FlEKbgCfPyX27n+y/fzxbt3MRarcIhbPbGTk6XPKoavNMc5aDOUs5CimI4Rp5lmod6z6RqfRTpXIJsv1vV1BNz2KmHhcdoamxgNz2vMt+II8bvt1dFQscwcf4VFp9/FpPZpWJEnTc1QoHMnplvbwK2xUBYuTl/TpEKr9tTqvgBBS7NoookUrQUuMY69kKRgbxJv7wogEfhkgmS2QCGdIIWLzgZmKEdAhdImZpSwKGQSJHETqnh/BkIePE5bdfis1ixcxQRZR4CzFnfgtAv+/fWbS5qI22FnzN6HNzej8j3yGUQhQ1R6WdUoG9nKcBcJoqkcnuwkUXt1ufqXbFTaxYoeP3HpxZ4r+1yK6Sj3Hczw9fv2ksjk+fRvdvCSz93L/bsnmIxnSDr1uZITJWERl14WdTSo5mqzk7d58AslLBJybjSUhWXqm67xWUR1UcdwnXIilQ7ueCbf0B9iMBhhcYQoW29FNFQiQ1cdzQJUOG2tGSrUSlh0r4Lhx3Wdp1aaRUXPiTYc9FaL1zVasyg2C/UEHN4IGVwQH8FRSCMbVYQFsNnIOZQQmk5myz6LOiYQAI8WFslZZccvZJKkcFctbDabYGVvoJxV7dVJiLk0Tpmj4Axw/rJOnvjYNVy5tq/q/FG31iBmh8qLM966C2fpvQG6bEpYBHJTJJxzhYUQ8IbzFxPHizMfLzU1yiWjDCUc/OtrNnHH317Cb/72EkJeJ2/86kOc84nf8Z6f68isRFlYJPCUtIJ65J1+wiQQ2TgJGpuhnHYbQbdjTq5FVGuzoTpz9lf4LFT
lWyMsDPUxwuIIqTRDSSmVz6KBZtHld80xQzX1WYCqIzW5U/WUno8ZqlkRQQuvKplxzeowZ/XZEZ5Q41BPIOh1MkEEGRvFWcwgHc0zeQuuICGRZCaZQ2bV4t/IZ+ENKWGRtvIysglS0j1nMT93SSePHJhWNaIszUIvtgXdOc7jnGtbT/gWqDszB0sO5bj01l04AXC4wOmnx5FSyZOFGdKurqohK3sD/Pp9L+Kmi5dScAawy4IqswLYc3FieHnBSvWcNf1Bfv7uF/DfX7KWl24cYH9aaxDJydL15J0BnE1MP/GuTVxqfxx7XvlImm00In7nHM1itolmEaxycDeofGswYITFEVMpLKKpPPmiLDmza+msEhbqOS3V/aUvUrep6fYd3Kl2fRxqfK8zzYULnIgWwiXocTAmwxSiI3hJI1zNG+BId5AgSaYSWUQ+RRpXw2gof1iVWM/FVU0lmUuRwjVnMb9gWSfpXFHV2ir1M1eLrWzSZjQfUOVCmNlf5SNolKsAgLeDLnuSZCJJkARZT/ecIWsHQjjsNoou/V6no5DP4ihmiEsvEW95vkGPk7ddspzrNi9gSuprTUyUaj3ZWny+0dWvZkBMYaNIQjbWLAA6fa45PotoqrE2a32PrZBuY4YyNMIIiyMk6HGUbMHj2h/RKJegy+9iMpFFSlmhWbT4UfZvLOdbtBIW1uOWg7sdzaJqfPPzBz1ORooRitERvCKLzd0gzFNj1w2QDs+ksOVTZG3eurt+AH9Y7cCLSaVZ2PJzzVAA5+taUg/tnSzXwmoju90W6ieHQ5X8aMcMBeCN0GlLIBMqYbHgmyssSljvdSZWWvxTNh8e59yfVl/IwwwBpLBBYhwyMYrYcHmbv59y9bWlLPUE3qbBERGfa44ZqplmEfA4yBclmXyRRKZQtwGTwQBGWBwxC8JeRmMZ8oViKdqlmYM7my+SyBaIp/MIQWt132ZXfbuhtabgcCmndmq6TR9HRRZ0G+ODHgfjMoJIjOIlg6OFsHD6OwiJBAenkzgLqaZmK5sWXDJlCYs0Kemes/PvCrhZ2Rvg4b1TZTOUDlm1N3l/OgNehmUncuZgaXxMegl5m5Vb6SAsEhTiur2qv6fhULtXv3YmWhJeBWegbhJbf9iDxEbGGS45uJPCR7iB1mURDAa5o6C67MWb5FmA+q7Nx2dhVRKIpfPGZ2FoihEWR8hgh5dCUTI8m2ZoSkUUNYposez1k/EM0bRS9dvqZKX7X7Rc/EH1bxje1rz/tkVlYlsb44MeJ2MygjM7S4AUTk9zYWHzhOmwpzk4lcJZTFc1PpqD00sWR2khdxRS5GyeuqUvLljWyZZ90xTcYcglKerkNru3cY5Ih9/FULGb4nTZDJVz+JvnDnjChGSc/KwSFrbg3GQ2C4f12plY6fyywefVHXBjExC3R0oO7qRoXL7DIux18pOi+i6M07xiccTnZDpRbYaa1Wapeq9TWUHZmKEMzTDC4ggZ1IJhaDrFgakkdptgIFI//8CKkppMZFURwXZ/kMsvBwQE+1oOZfllcPAhSE3NI4lvpnm/bk3Q42AM9Ry7kLi8LZrgeEIESXJoKoaTHMLdRFgIQcIWxJFV5UqcxTTFBprIBcu7iGfyDOfV62dHtgPg9DcWFp1+F0OyR2kWpez25iXo8XcTKUwSKio/iis80HCoVWVXVuS4NPIB2W2CnqCbYccgjD0NmRhx6WnsbNe4HXYes2/g2syn2Grb0FTQdfpcxDP5crMolGbhddrrCuDKCspGszA0wwiLI2RRp1rQhqaTHJhKsjDibRjR0qnrQ03Fs6qIYKtIKIue1fA398HaV7Qeu/xS1as7n27fDGVpFm2aoSx8/hY+EXcIn0wyMaVLcriaayIpexBnLgr5DDaKpZaqtZy9WF3Do5yhzvvMLwAa9uIA5S/aJ/twJEaUkxuweVu8P4svwleIcqltGwCeSGNhbTWNyibLwqLZ+ftDHnbYVqgot+hhorJ5dJNF2OvkGbmYoKe5ySrin1vyYzaVa2h2s4RFNJ0jkS0YYWFoyHEVFkKIfUKIJ4QQjwkhtuhjnUKI3wohdurbjorxHxFC7BJC7BBCXFNx/Bx9nl1CiM+Lk6Cq2UDYixBKs9g/lWRxZ+Pdc5e/XB9KFRGcxw+yf0PzBDuLxReBXS8k7Ti4HV6Y2tNWqG1Im6FKT/e11izsFCjqgnnOFg7crCOAJx8r9+tuIFz6Qx5sAp4tLIDQIM7JZ9TLBZprFg8WVU8MnvkVeRz4vM2juVhxJRLBVbatJKSbUKhJP/aQel+SsXIor6OJWaw35OHRnCr3LYcfY7boae5s11hjWm00rOTHyoioaCrf8DUsZ/l4TAVpBIyD29CA50KzuFxKuVlKea7+/8PAXVLKVcBd+n+EEOuA1wPrgWuBLwkhrG/ul4G3A6v037XPwXU3xeWw0R/yMDSd4uBUsqqpTC1zzFDzERZtX5AfFl2g7rfyWdjssPQFsON2kIWW4yM+Z5Vm0WgxL6E1lT6U09rlaS5ccs4wvkJZWNgahOY67DZ6gm5GohlYeSUAaekk4Gv83nf6XWyTK8g5AjC1h4TwtTT7EOhhKrwet8gzIcNNF/NASO11solyKK+riVmsP+ThgZQK5xXFvHa2z0dYNP/udPjmVp6dTeUa+kUsTWJU91sxmoWhESfCDHUd8C19/1vAKyuO/0BKmZFS7gV2AecLIQaAkJTyASmlBL5d8ZwTymCHl2dGokwlsk01C5/LgcdpYyqRIZbOEWjXDDVfll2qbttxiC+/HGZ1NnELzWKww8c/ve4SJFqha5bBDaUkwV6hzFDuFppIwR3GLxOlxkr2JtFWfSEPo7EMrLoKUDkTTfMO/C4K2BmKqGiiRKuwWc3kgHIoTxBuml0dCQZISyfZxCzFtNIs3E3MYv1hD3tSfmRQ+UHisrWDG+YhLOqYoaLpXMM5R/Rxq3GUcXAbGnG8hYUE7hRCbBVCvF0f65NSDgPoWyvUZCFQ0dKMIX1sob5fe3wOQoi3CyG2CCG2jI+PH8Np1Geww8d2XYKimbAA1ddiMn4cNQuANdeqHs+dy1qPXXFF+X4rhy/wsrMWI6wQ0mbRTVDWLITSLDy+1hnoYZEgrvt2O5pEW/WFPIzOpmHZJRSFg1iTWkmgFj+X3caOgFJso23u5FNL1PszRbjpAtrhcxHDSz41QyY+TUEK/E18Or06FyfVvQFQwqud67HGtFrMreTHqTk+i/qv0RVws2FhiF89Pgy0EdJteN5yvIXFC6SUZwMvBt4thLikydh6fgjZ5Pjcg1J+RUp5rpTy3J6exrHxx4rBDq9VEqilsOj0u3h6JKb7bx+nH2T/RvjwARjY1Hps71oI6LpJrXwcFgHt6G2lWehoqyV25bPwBZprFsIbIUSCmG7E5GwSbdUXcjMaS4MnzGjnOYzR0fT9FELQ4XfymOMsQJUDb2cnLxaezbDsZNg+0LTxT6ffRVyqMuW5pOoHHmmQyQ9KswCYCq0HVIJg05wPTbs+i1Ll2QozVDTVWLMAeNVZgyR0j3hjhjI04rgKCynlYX07Bv
wUOB8Y1aYl9K3OfGIIWFTx9EHgsD4+WOf4Caeyn3EznwXA5kURnh6OkitIehok7x0TWvkTLISAFZer+618HBZWCG8rzaJvPdicXO54EoBAoLkw8oW7sQvJoSEVreRuIiz6Qx5mkjnSuQJ3rPkE78u+p+Xi3+l3syvfQzGyjEkZassMFfJ5eFnmk3zf9+am48JeJ3G8kImTT0WJ4Wt6/r6QEhaHvKsBnSB4DM1QHqedgbCHZ0eVWalYlMQy+aYRVy/fNICV9mPMUIZGHDdhIYTwCyGC1n3gauBJ4DbgRj3sRuDn+v5twOuFEG4hxDKUI/thbaqKCSEu1FFQb6l4zgnFyrXo8Dlb/uA//soN/Pm//wU/fMdF/OUFS56Ly2vNqqvVbaCNPI7Kca2EhcsPiy9kSVFZFYPB5mau3h513tFDe4HmPo5evdiORTOMFUNM2ztx1+tdXUGX38VkMsfYK/4fN+fe3PZOfpIwHn9zrchuE6RtfmzZGIVUlLhs7hOxhMUux2ryNg8HZc8xjYYC2DQYYduQ0tJimTxS1s/etugNenjhKqWJm3IfhkYcT82iD7hPCLENeBj4lZTyDuBTwFVCiJ3AVfp/pJRPAT8EtgN3AO+WUlo1wN8JfA3l9N4N3H4cr7ttLM2ilQnKoifo5vxlnXhdJ8kPcv2r4F0PQteK9sa3a4aCstZCa2HhDiu3VXFiFwA+X2NNpF8vtiPRdClnpVUktVXIccqzmBG62lqcrR18pI2x044eutL7EOkZZYZq4hAPeRx4nXb2pv18/0V3cmfx3HlFQ7WTk7FpUYT9k0mmE9lS/bJWr3HjRUvoD3lKwsxgqOW46ZxSyj3AHOO5lHISuLLBcz4JfLLO8S3AhmN9jUeLlWuxqE1hcdIhhPJdtEvncrA5yxngzVhxBdx1M0DrjO9FF1BEcEH+zyDAG2iS1KZt/qPRdNvBAp1+F1PxbLlGUhu7c4fdht9lb0uwbPNdzNXRe/BNPsqTch2LmzxHCEFfyM1INI3PHQBhayujv10zFMCmRUo4bxuaKdUrazXnK9f2zekFYjBUYjK4jwKXw8Ybz1/My85sXA7itGLT6+FdDzTt112ifxN4VaXYlpqIr5OJwBksFKrWUzMfR1+wLCymEtm2Fv6BsIdYJs/+SRWa285OHuCVZy3k8jMa14WyONB1MWlc2GWudUVbYEmXn11jcdVcqc06YfMxQ21cGEYI2HZwtmmXPINhPhhhcZR88lUbuXbD80RY2J2qg1872GyqXhWobPEWpBaVA+Wama1CXpWzMjSd4rukGV0AAAleSURBVJH906VddDM2Dqox9+5U0VntLpyffNVGrttcN0q7ijWL+rm7oJTopPA1L1IIrF8QYtdYnIl4pu1r2bAwzI0XLeHiFV0txwY9Tlb2BNg2NFNRcdY4rg1HhxEWhuPHi/4erv6EEhwtCG1QzvactBMKNDbrKTOOhzueHCGRLXDJqtYh0tZO+0+7lLBoRxuZD5sWRbi9cD6AyhRvwfoFYfJFyZZ9021fi8dp55+v20CkRTnzymvadnCmaS8Lg2E+GGFhOH70b4SL39vW0I41LyKFmxRu/C0CAPpCHkaiaRw2wcUrmzQm0lg77elkDiHas/vPhzMHI/y+eBZR6WPW1druv26B8smMRNPHbce/aVGEyUSWh/aqyrntmt4MhkYYYWE4OXC4edZzJinhaRndZEXsnLOko+28gDMHlVO+7V4i8yDsddLX08Mlmc/yx3DrSjRLOn0lgXi8dvxXre2j0+/i1kcOYRMQMJnZhqPECAvDSUP6ipvZcubNLcf1h1SEz6Vr2s/S36x9G8faBFU+fwczBAn6W4ee2myCtQOh43o9/WEPt9x0Hj6XnaDHecwFpOH5h9luGE4aLjj/Yjj/4pbj+sPKYd6Ov8Ji0yKlWRyvnfzmxRF+8shQ2+dfvyDElv3Tx9U8tGlRhO+89YJSFJjBcDQYYWE45Xj1WQvp9DtZv6DNMiXAGf0hXHbbcfMRnDVPYbR+wfHVdCzOWdLBOUs6Wg80GFpghIXhlKPD7+JVZw22HliBy2HjktU9bWfbz5c1/UEWdXo5o7+9ooyWk9uEtBpOFYSUdQu4nvKce+65csuWLSf6MgzPI6SULZ3zFoWi5LO/fZY3XLCYhZE2yqcYDM8RQoitFc3qSphtjcFwjJhPt1+7TfCBa9Ycx6sxGI4tJhrKYDAYDC0xwsJgMBgMLTHCwmAwGAwtMcLCYDAYDC0xwsJgMBgMLTHCwmAwGAwtMcLCYDAYDC0xwsJgMBgMLTltM7iFEOPA/hN9HfOgG5g40RdxjDmd5nQ6zQVOr/mcTnOxOJFzWiKlnFOl87QVFqcaQogt9VLsT2VOpzmdTnOB02s+p9NcLE7GORkzlMFgMBhaYoSFwWAwGFpihMXJw1dO9AUcB06nOZ1Oc4HTaz6n01wsTro5GZ+FwWAwGFpiNAuDwWAwtMQIC4PBYDC0xAiLI0QIsUgIcbcQ4mkhxFNCiP+qj3cKIX4rhNipbzv08S49Pi6E+ELFeYJCiMcq/iaEEP/e4DXPEUI8IYTYJYT4vNDddoQQf6OPPyaEuE8Ise5Unk/F4zcIIaQQYl4hhCfTXIQQNwkhxivO8V/mM5eTcU76sdcKIbbra/neqToXIcRnK57/rBBiZj5zOUnntFif+1EhxONCiJccyZzmIKU0f0fwBwwAZ+v7QeBZYB3wv4EP6+MfBv5F3/cDLwT+BvhCk/NuBS5p8NjDwEWAAG4HXqyPhyrGvAK441SeT8U1/BF4EDj3VJ0LcFOzc56ic1oFPAp06P97T9W51Ix5L/CN0+Dz+QrwTn1/HbDvaL9/UkqjWRwpUsphKeUj+n4MeBpYCFwHfEsP+xbwSj0mIaW8D0g3OqcQYhXQC9xb57EBlFB4QKpvwbcrzh2tGOoH5h21cDLNR/Nx1A+t4flPobkcNSfZnN4GfFFKOa1fa+wUnkslbwC+P5+5nKRzkkBI3w8Dh49kTrUYYXEMEEIsBc4CHgL6pJTDoL5AqA+7Xd4A/Kf+8GtZCAxV/D+kj1nX8G4hxG7UAvu++Vx/LSd6PkKIs4BFUspfzvviazjRc9Fcr80BPxZCLJrHa9blJJjTamC1EOJPQogHhRDXzm8GZU6CuVjXsQRYBvx+Hq9Zl5NgTh8D3iSEGAJ+jdKYjhojLI4SIUQA+AnwtzU7/CPh9TTe2Yg6x0pfIinlF6WUK4APAf/jSC/gRM9HCGEDPgu8/yhf+4TPRd/+AlgqpTwT+B3lXeYRcZLMyYEyRV2GWtC+JoSIzPfFT5K5VD7/x1LKwtFcxEkypzcA35RSDgIvAb6jf1dHhREWR4EQwon6YnxXSnmrPjyqVURLVWxLRRdCbAIcUsqt+n97hZPrZtTOYbDiKYPUVy9/wBGaQE6S+QSBDcA9Qoh9wIXAbWL+Tu6TYS5IKSellBl9/KvAOfOZx8k4J/3Yz6WUOSnlXmAHSnicinOxaLYwt8VJNKe3Aj8EkFI+AHhQhQmPCiMsjhAdefB14Gkp5WcqHroNuFHfvxH4eZunr
LKXSikLUsrN+u+ftAobE0JcqF/7Lda5tW3T4qXAzlN1PlLKWSllt5RyqZRyKcrB/Qop5ZZTbS76WgYqzvMKlC173pxMcwJ+Blyur6sbZZbac4rOBSHEGqADeKDdOZzkczoAXKmvay1KWIwf4dTKyGPgJX8+/qEiGSTwOPCY/nsJ0AXchVqw7wI6K56zD5gC4qidwbqKx/YAZ7R4zXOBJ4HdwBcoZ+B/DnhKX8PdwPpTeT41Y+5h/tFQJ81cgP+lP5tt+rNpep5TZE4C+AywHXgCeP2pOhf92MeAT51G68E64E/6O/cYcPXRzM36M+U+DAaDwdASY4YyGAwGQ0uMsDAYDAZDS4ywMBgMBkNLjLAwGAwGQ0uMsDAYDAZDS4ywMBiOAUKIgk6YekoIsU0I8fetsmaFEEuFEG98rq7RYDgajLAwGI4NKakSptYDV6Fi7D/a4jlLASMsDKcEJs/CYDgGCCHiUspAxf/LgT+jyiwsAb6DqggM8B4p5f1CiAeBtcBeVM2ozwOfQtVccqMqu/7HczYJg6EJRlgYDMeAWmGhj00DZwAxoCilTOvSLN+XUp4rhLgM+ICU8mV6/NtRvSE+IYRwo7JwXyNV/SWD4YTiONEXYDCcxliVQZ3AF4QQm4ECqpZSPa4GzhRC3KD/D6MK9BlhYTjhGGFhMBwHtBmqgKoy+lFgFNiE8hM2angjgPdKKX/znFykwTAPjIPbYDjGCCF6gP+LapcpURrCsJSyCLwZsOuhMVRJdovfAO/Upa4RQqwWQvgxGE4CjGZhMBwbvEKIx1AmpzzKoW2Vqv4S8BMhxGtQlWcT+vjjQF4IsQ34Jqp68FLgEV12epxj3J7VYDhSjIPbYDAYDC0xZiiDwWAwtMQIC4PBYDC0xAgLg8FgMLTECAuDwWAwtMQIC4PBYDC0xAgLg8FgMLTECAuDwWAwtOT/AwIcvnjm5HFRAAAAAElFTkSuQmCC",
 | ||
|       "text/plain": [
 | ||
|        "<Figure size 432x288 with 1 Axes>"
 | ||
|       ]
 | ||
|      },
 | ||
|      "metadata": {
 | ||
|       "needs_background": "light"
 | ||
|      },
 | ||
|      "output_type": "display_data"
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "import matplotlib.pyplot as plt\n",
 | ||
|     "plt.figure()\n",
 | ||
|     "plt.plot(multi_X_test[\"timeStamp\"], multi_y_test, label=\"Actual Demand\")\n",
 | ||
|     "plt.plot(multi_X_test[\"timeStamp\"], multi_y_pred, label=\"FLAML Forecast\")\n",
 | ||
|     "plt.xlabel(\"Date\")\n",
 | ||
|     "plt.ylabel(\"Energy Demand\")\n",
 | ||
|     "plt.legend()\n",
 | ||
|     "plt.show()"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "markdown",
 | ||
|    "metadata": {},
 | ||
|    "source": [
 | ||
|     "## 4. Forecasting Discrete Values"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "markdown",
 | ||
|    "metadata": {},
 | ||
|    "source": [
 | ||
|     "### Load Dataset and Preprocess\n",
 | ||
|     "\n",
 | ||
|     "Import [sales data](https://hcrystalball.readthedocs.io/en/v0.1.7/api/hcrystalball.utils.get_sales_data.html) from hcrystalball. The task is to predict, for each day of a thirty-day horizon, whether daily sales will be above mean sales."
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 2,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [],
 | ||
|    "source": [
 | ||
|     "from hcrystalball.utils import get_sales_data\n",
 | ||
|     "time_horizon = 30\n",
 | ||
|     "df = get_sales_data(n_dates=180, n_assortments=1, n_states=1, n_stores=1)\n",
 | ||
|     "df = df[[\"Sales\", \"Open\", \"Promo\", \"Promo2\"]]\n",
 | ||
|     "# feature engineering - create a discrete value column\n",
 | ||
|     "# 1 denotes above mean and 0 denotes below mean\n",
 | ||
|     "import numpy as np\n",
 | ||
|     "df[\"above_mean_sales\"] = np.where(df[\"Sales\"] > df[\"Sales\"].mean(), 1, 0)\n",
 | ||
|     "df.reset_index(inplace=True)\n",
 | ||
|     "# train-test split\n",
 | ||
|     "discrete_train_df = df[:-time_horizon]\n",
 | ||
|     "discrete_test_df = df[-time_horizon:]\n",
 | ||
|     "discrete_X_train, discrete_X_test = (\n",
 | ||
|     "    discrete_train_df[[\"Date\", \"Open\", \"Promo\", \"Promo2\"]],\n",
 | ||
|     "    discrete_test_df[[\"Date\", \"Open\", \"Promo\", \"Promo2\"]],\n",
 | ||
|     ")\n",
 | ||
|     "discrete_y_train, discrete_y_test = discrete_train_df[\"above_mean_sales\"], discrete_test_df[\"above_mean_sales\"]"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 3,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "data": {
 | ||
|       "text/html": [
 | ||
|        "<div>\n",
 | ||
|        "<style scoped>\n",
 | ||
|        "    .dataframe tbody tr th:only-of-type {\n",
 | ||
|        "        vertical-align: middle;\n",
 | ||
|        "    }\n",
 | ||
|        "\n",
 | ||
|        "    .dataframe tbody tr th {\n",
 | ||
|        "        vertical-align: top;\n",
 | ||
|        "    }\n",
 | ||
|        "\n",
 | ||
|        "    .dataframe thead th {\n",
 | ||
|        "        text-align: right;\n",
 | ||
|        "    }\n",
 | ||
|        "</style>\n",
 | ||
|        "<table border=\"1\" class=\"dataframe\">\n",
 | ||
|        "  <thead>\n",
 | ||
|        "    <tr style=\"text-align: right;\">\n",
 | ||
|        "      <th></th>\n",
 | ||
|        "      <th>Date</th>\n",
 | ||
|        "      <th>Sales</th>\n",
 | ||
|        "      <th>Open</th>\n",
 | ||
|        "      <th>Promo</th>\n",
 | ||
|        "      <th>Promo2</th>\n",
 | ||
|        "      <th>above_mean_sales</th>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "  </thead>\n",
 | ||
|        "  <tbody>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>0</th>\n",
 | ||
|        "      <td>2015-02-02</td>\n",
 | ||
|        "      <td>24894</td>\n",
 | ||
|        "      <td>True</td>\n",
 | ||
|        "      <td>True</td>\n",
 | ||
|        "      <td>False</td>\n",
 | ||
|        "      <td>1</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>1</th>\n",
 | ||
|        "      <td>2015-02-03</td>\n",
 | ||
|        "      <td>22139</td>\n",
 | ||
|        "      <td>True</td>\n",
 | ||
|        "      <td>True</td>\n",
 | ||
|        "      <td>False</td>\n",
 | ||
|        "      <td>1</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>2</th>\n",
 | ||
|        "      <td>2015-02-04</td>\n",
 | ||
|        "      <td>20452</td>\n",
 | ||
|        "      <td>True</td>\n",
 | ||
|        "      <td>True</td>\n",
 | ||
|        "      <td>False</td>\n",
 | ||
|        "      <td>1</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>3</th>\n",
 | ||
|        "      <td>2015-02-05</td>\n",
 | ||
|        "      <td>20977</td>\n",
 | ||
|        "      <td>True</td>\n",
 | ||
|        "      <td>True</td>\n",
 | ||
|        "      <td>False</td>\n",
 | ||
|        "      <td>1</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>4</th>\n",
 | ||
|        "      <td>2015-02-06</td>\n",
 | ||
|        "      <td>19151</td>\n",
 | ||
|        "      <td>True</td>\n",
 | ||
|        "      <td>True</td>\n",
 | ||
|        "      <td>False</td>\n",
 | ||
|        "      <td>1</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>...</th>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>145</th>\n",
 | ||
|        "      <td>2015-06-27</td>\n",
 | ||
|        "      <td>13108</td>\n",
 | ||
|        "      <td>True</td>\n",
 | ||
|        "      <td>False</td>\n",
 | ||
|        "      <td>False</td>\n",
 | ||
|        "      <td>0</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>146</th>\n",
 | ||
|        "      <td>2015-06-28</td>\n",
 | ||
|        "      <td>0</td>\n",
 | ||
|        "      <td>False</td>\n",
 | ||
|        "      <td>False</td>\n",
 | ||
|        "      <td>False</td>\n",
 | ||
|        "      <td>0</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>147</th>\n",
 | ||
|        "      <td>2015-06-29</td>\n",
 | ||
|        "      <td>28456</td>\n",
 | ||
|        "      <td>True</td>\n",
 | ||
|        "      <td>True</td>\n",
 | ||
|        "      <td>False</td>\n",
 | ||
|        "      <td>1</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>148</th>\n",
 | ||
|        "      <td>2015-06-30</td>\n",
 | ||
|        "      <td>27140</td>\n",
 | ||
|        "      <td>True</td>\n",
 | ||
|        "      <td>True</td>\n",
 | ||
|        "      <td>False</td>\n",
 | ||
|        "      <td>1</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>149</th>\n",
 | ||
|        "      <td>2015-07-01</td>\n",
 | ||
|        "      <td>24957</td>\n",
 | ||
|        "      <td>True</td>\n",
 | ||
|        "      <td>True</td>\n",
 | ||
|        "      <td>False</td>\n",
 | ||
|        "      <td>1</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "  </tbody>\n",
 | ||
|        "</table>\n",
 | ||
|        "<p>150 rows × 6 columns</p>\n",
 | ||
|        "</div>"
 | ||
|       ],
 | ||
|       "text/plain": [
 | ||
|        "          Date  Sales   Open  Promo  Promo2  above_mean_sales\n",
 | ||
|        "0   2015-02-02  24894   True   True   False                 1\n",
 | ||
|        "1   2015-02-03  22139   True   True   False                 1\n",
 | ||
|        "2   2015-02-04  20452   True   True   False                 1\n",
 | ||
|        "3   2015-02-05  20977   True   True   False                 1\n",
 | ||
|        "4   2015-02-06  19151   True   True   False                 1\n",
 | ||
|        "..         ...    ...    ...    ...     ...               ...\n",
 | ||
|        "145 2015-06-27  13108   True  False   False                 0\n",
 | ||
|        "146 2015-06-28      0  False  False   False                 0\n",
 | ||
|        "147 2015-06-29  28456   True   True   False                 1\n",
 | ||
|        "148 2015-06-30  27140   True   True   False                 1\n",
 | ||
|        "149 2015-07-01  24957   True   True   False                 1\n",
 | ||
|        "\n",
 | ||
|        "[150 rows x 6 columns]"
 | ||
|       ]
 | ||
|      },
 | ||
|      "execution_count": 3,
 | ||
|      "metadata": {},
 | ||
|      "output_type": "execute_result"
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "discrete_train_df"
 | ||
|    ]
 | ||
|   },
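 | ||
|   {
 | ||
|    "cell_type": "markdown",
 | ||
|    "metadata": {},
 | ||
|    "source": [
 | ||
|     "The split is purely time-based: the last `time_horizon` days are held out for testing. An illustrative check of the split sizes and the boundary dates (assuming the dataframes defined earlier):"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": null,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [],
 | ||
|    "source": [
 | ||
|     "# illustrative check: the training window should end right before the held-out horizon\n",
 | ||
|     "print(discrete_train_df.shape, discrete_test_df.shape)\n",
 | ||
|     "print(discrete_train_df[\"Date\"].max(), discrete_test_df[\"Date\"].min())"
 | ||
|    ]
 | ||
|   },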
 | ||
|   {
 | ||
|    "cell_type": "markdown",
 | ||
|    "metadata": {},
 | ||
|    "source": [
 | ||
|     "### Run FLAML"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 4,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [],
 | ||
|    "source": [
 | ||
|     "from flaml import AutoML\n",
 | ||
|     "automl = AutoML()"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 6,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [],
 | ||
|    "source": [
 | ||
|     "settings = {\n",
 | ||
|     "    \"time_budget\": 15,  # total running time in seconds\n",
 | ||
|     "    \"metric\": \"accuracy\",  # primary metric\n",
 | ||
|     "    \"task\": \"ts_forecast_classification\",  # task type\n",
 | ||
|     "    \"log_file_name\": \"sales_classification_forecast.log\",  # flaml log file\n",
 | ||
|     "    \"eval_method\": \"holdout\",\n",
 | ||
|     "}"
 | ||
|    ]
 | ||
|   },
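 | ||
|   {
 | ||
|    "cell_type": "markdown",
 | ||
|    "metadata": {},
 | ||
|    "source": [
 | ||
|     "These settings are passed to `AutoML.fit` together with the training data and the forecast horizon (`period`). A minimal sketch of the call shape, assuming the variables defined earlier; the notebook's own fit call and its search log follow."
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": null,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [],
 | ||
|    "source": [
 | ||
|     "# minimal sketch: pass the settings, training data, and forecast horizon to AutoML.fit\n",
 | ||
|     "automl.fit(\n",
 | ||
|     "    X_train=discrete_X_train,\n",
 | ||
|     "    y_train=discrete_y_train,\n",
 | ||
|     "    **settings,\n",
 | ||
|     "    period=time_horizon,\n",
 | ||
|     ")"
 | ||
|    ]
 | ||
|   },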
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 7,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "name": "stderr",
 | ||
|      "output_type": "stream",
 | ||
|      "text": [
 | ||
|       "[flaml.automl: 08-03 20:33:26] {2520} INFO - task = ts_forecast_classification\n",
 | ||
|       "[flaml.automl: 08-03 20:33:26] {2522} INFO - Data split method: time\n",
 | ||
|       "[flaml.automl: 08-03 20:33:26] {2525} INFO - Evaluation method: holdout\n",
 | ||
|       "[flaml.automl: 08-03 20:33:26] {2644} INFO - Minimizing error metric: 1-accuracy\n",
 | ||
|       "[flaml.automl: 08-03 20:33:27] {2786} INFO - List of ML learners in AutoML Run: ['lgbm', 'rf', 'xgboost', 'extra_tree', 'xgb_limitdepth']\n",
 | ||
|       "[flaml.automl: 08-03 20:33:27] {3088} INFO - iteration 0, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3221} INFO - Estimated sufficient time budget=11912s. Estimated necessary time budget=12s.\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3268} INFO -  at 2.2s,\testimator lgbm's best error=0.2667,\tbest estimator lgbm's best error=0.2667\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 1, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3268} INFO -  at 2.2s,\testimator lgbm's best error=0.2667,\tbest estimator lgbm's best error=0.2667\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 2, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3268} INFO -  at 2.2s,\testimator lgbm's best error=0.1333,\tbest estimator lgbm's best error=0.1333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 3, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3268} INFO -  at 2.3s,\testimator lgbm's best error=0.1333,\tbest estimator lgbm's best error=0.1333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 4, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3268} INFO -  at 2.3s,\testimator lgbm's best error=0.0667,\tbest estimator lgbm's best error=0.0667\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 5, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3268} INFO -  at 2.4s,\testimator lgbm's best error=0.0667,\tbest estimator lgbm's best error=0.0667\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 6, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3268} INFO -  at 2.5s,\testimator lgbm's best error=0.0667,\tbest estimator lgbm's best error=0.0667\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 7, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3268} INFO -  at 2.5s,\testimator lgbm's best error=0.0667,\tbest estimator lgbm's best error=0.0667\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 8, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3268} INFO -  at 2.5s,\testimator lgbm's best error=0.0667,\tbest estimator lgbm's best error=0.0667\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 9, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3268} INFO -  at 2.6s,\testimator lgbm's best error=0.0667,\tbest estimator lgbm's best error=0.0667\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 10, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3268} INFO -  at 2.6s,\testimator lgbm's best error=0.0667,\tbest estimator lgbm's best error=0.0667\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 11, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3268} INFO -  at 2.7s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 12, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3268} INFO -  at 2.7s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 13, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3268} INFO -  at 2.8s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 14, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3268} INFO -  at 2.8s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 15, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3268} INFO -  at 2.8s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 16, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3268} INFO -  at 2.9s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 17, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3268} INFO -  at 2.9s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 18, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3268} INFO -  at 3.0s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 19, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3268} INFO -  at 3.0s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 20, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3268} INFO -  at 3.0s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 21, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3268} INFO -  at 3.1s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 22, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3268} INFO -  at 3.1s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 23, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3268} INFO -  at 3.2s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 24, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3268} INFO -  at 3.2s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 25, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3268} INFO -  at 3.3s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 26, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3268} INFO -  at 3.3s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 27, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3268} INFO -  at 3.4s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 28, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3268} INFO -  at 3.4s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 29, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3268} INFO -  at 3.4s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 30, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3268} INFO -  at 3.5s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 31, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3268} INFO -  at 3.5s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 32, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3268} INFO -  at 3.5s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 33, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3268} INFO -  at 3.6s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 34, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3268} INFO -  at 3.6s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 35, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3268} INFO -  at 3.7s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 36, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3268} INFO -  at 3.7s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 37, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3268} INFO -  at 3.7s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 38, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3268} INFO -  at 3.8s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 39, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3268} INFO -  at 3.8s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 40, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3268} INFO -  at 3.9s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 41, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3268} INFO -  at 3.9s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 42, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3268} INFO -  at 3.9s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 43, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3268} INFO -  at 4.0s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 44, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3268} INFO -  at 4.0s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 45, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3268} INFO -  at 4.1s,\testimator rf's best error=0.1333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 46, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3268} INFO -  at 4.1s,\testimator rf's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 47, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3268} INFO -  at 4.2s,\testimator rf's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 48, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3268} INFO -  at 4.2s,\testimator rf's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 49, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3268} INFO -  at 4.2s,\testimator rf's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 50, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3268} INFO -  at 4.3s,\testimator rf's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 51, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3268} INFO -  at 4.3s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 52, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3268} INFO -  at 4.3s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 53, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3268} INFO -  at 4.4s,\testimator rf's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 54, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3268} INFO -  at 4.5s,\testimator rf's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 55, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3268} INFO -  at 4.5s,\testimator rf's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 56, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3268} INFO -  at 4.6s,\testimator rf's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 57, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3268} INFO -  at 4.6s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 58, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3268} INFO -  at 4.6s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 59, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3268} INFO -  at 4.7s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 60, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3268} INFO -  at 4.7s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 61, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3268} INFO -  at 4.8s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 62, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3268} INFO -  at 4.9s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 63, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3268} INFO -  at 4.9s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 64, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3268} INFO -  at 5.0s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 65, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3268} INFO -  at 5.0s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 66, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3268} INFO -  at 5.0s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 67, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3268} INFO -  at 5.1s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 68, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3268} INFO -  at 5.2s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 69, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3268} INFO -  at 5.2s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 70, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3268} INFO -  at 5.2s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 71, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3268} INFO -  at 5.3s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 72, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3268} INFO -  at 5.3s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 73, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3268} INFO -  at 5.4s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 74, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3268} INFO -  at 5.4s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 75, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3268} INFO -  at 5.5s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 76, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3268} INFO -  at 5.5s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 77, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3268} INFO -  at 5.5s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 78, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3268} INFO -  at 5.6s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 79, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3268} INFO -  at 5.7s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 80, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3268} INFO -  at 5.7s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 81, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3268} INFO -  at 5.8s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 82, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3268} INFO -  at 5.8s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 83, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3268} INFO -  at 5.9s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 84, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3268} INFO -  at 5.9s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 85, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3268} INFO -  at 6.0s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 86, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3268} INFO -  at 6.0s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 87, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:33] {3268} INFO -  at 6.1s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:33] {3088} INFO - iteration 88, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:33] {3268} INFO -  at 6.1s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:33] {3088} INFO - iteration 89, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:33] {3268} INFO -  at 6.2s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:33] {3088} INFO - iteration 90, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:33] {3268} INFO -  at 6.2s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:33] {3088} INFO - iteration 91, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:34] {3268} INFO -  at 7.8s,\testimator xgboost's best error=0.1333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:34] {3088} INFO - iteration 92, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:34] {3268} INFO -  at 7.9s,\testimator extra_tree's best error=0.1333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:34] {3088} INFO - iteration 93, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:34] {3268} INFO -  at 7.9s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:34] {3088} INFO - iteration 94, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:34] {3268} INFO -  at 8.0s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:34] {3088} INFO - iteration 95, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:34] {3268} INFO -  at 8.0s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 96, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3268} INFO -  at 8.1s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 97, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3268} INFO -  at 8.1s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 98, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3268} INFO -  at 8.2s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 99, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3268} INFO -  at 8.2s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 100, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3268} INFO -  at 8.3s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 101, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3268} INFO -  at 8.3s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 102, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3268} INFO -  at 8.4s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 103, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3268} INFO -  at 8.4s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 104, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3268} INFO -  at 8.5s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 105, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3268} INFO -  at 8.6s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 106, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3268} INFO -  at 8.6s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 107, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3268} INFO -  at 8.7s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 108, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3268} INFO -  at 8.7s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 109, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3268} INFO -  at 8.8s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 110, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3268} INFO -  at 8.8s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 111, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3268} INFO -  at 8.9s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 112, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3268} INFO -  at 9.0s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 113, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3268} INFO -  at 9.0s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 114, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3268} INFO -  at 9.1s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 115, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3268} INFO -  at 9.1s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 116, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3268} INFO -  at 9.2s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 117, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3268} INFO -  at 9.2s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 118, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3268} INFO -  at 9.3s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 119, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3268} INFO -  at 9.3s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 120, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3268} INFO -  at 9.4s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 121, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3268} INFO -  at 9.4s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 122, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3268} INFO -  at 9.5s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 123, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3268} INFO -  at 9.5s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 124, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3268} INFO -  at 9.6s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 125, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3268} INFO -  at 9.6s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 126, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3268} INFO -  at 9.7s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 127, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3268} INFO -  at 9.8s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 128, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3268} INFO -  at 9.8s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 129, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3268} INFO -  at 9.9s,\testimator xgboost's best error=0.1333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 130, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3268} INFO -  at 9.9s,\testimator xgboost's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 131, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3268} INFO -  at 10.0s,\testimator xgboost's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 132, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3268} INFO -  at 10.0s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 133, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3268} INFO -  at 10.1s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 134, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3268} INFO -  at 10.1s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 135, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3268} INFO -  at 10.2s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 136, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3268} INFO -  at 10.2s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 137, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3268} INFO -  at 10.3s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 138, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3268} INFO -  at 10.3s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 139, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3268} INFO -  at 10.4s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 140, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3268} INFO -  at 10.4s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 141, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3268} INFO -  at 10.4s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 142, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3268} INFO -  at 10.5s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 143, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3268} INFO -  at 10.6s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 144, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3268} INFO -  at 10.6s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 145, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3268} INFO -  at 10.7s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 146, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3268} INFO -  at 10.8s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 147, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3268} INFO -  at 10.9s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 148, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3268} INFO -  at 10.9s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 149, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3268} INFO -  at 11.0s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 150, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3268} INFO -  at 11.0s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 151, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3268} INFO -  at 11.1s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 152, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3268} INFO -  at 11.1s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 153, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3268} INFO -  at 11.2s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 154, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3268} INFO -  at 11.2s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 155, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3268} INFO -  at 11.3s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 156, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3268} INFO -  at 11.4s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 157, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3268} INFO -  at 11.4s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 158, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3268} INFO -  at 11.5s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 159, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3268} INFO -  at 11.5s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 160, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3268} INFO -  at 11.6s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 161, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3268} INFO -  at 11.6s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 162, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3268} INFO -  at 11.7s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 163, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3268} INFO -  at 11.7s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 164, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3268} INFO -  at 11.8s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 165, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3268} INFO -  at 11.8s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 166, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3268} INFO -  at 11.9s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 167, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3268} INFO -  at 12.0s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 168, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3268} INFO -  at 12.1s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 169, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3268} INFO -  at 12.1s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 170, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3268} INFO -  at 12.2s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 171, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3268} INFO -  at 12.2s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 172, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3268} INFO -  at 12.2s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 173, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3268} INFO -  at 12.3s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 174, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3268} INFO -  at 12.3s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 175, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3268} INFO -  at 12.4s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 176, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3268} INFO -  at 12.4s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 177, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3268} INFO -  at 12.5s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 178, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3268} INFO -  at 12.5s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 179, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3268} INFO -  at 12.6s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 180, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3268} INFO -  at 12.6s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 181, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3268} INFO -  at 12.7s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 182, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3268} INFO -  at 12.7s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 183, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3268} INFO -  at 12.7s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 184, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3268} INFO -  at 12.8s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 185, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3268} INFO -  at 12.9s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 186, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3268} INFO -  at 12.9s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 187, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3268} INFO -  at 13.0s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 188, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3268} INFO -  at 13.1s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 189, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3268} INFO -  at 13.1s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 190, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3268} INFO -  at 13.2s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 191, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3268} INFO -  at 13.2s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 192, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3268} INFO -  at 13.3s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 193, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3268} INFO -  at 13.3s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 194, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3268} INFO -  at 13.4s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 195, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3268} INFO -  at 13.4s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 196, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3268} INFO -  at 13.5s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 197, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3268} INFO -  at 13.5s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 198, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3268} INFO -  at 13.6s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 199, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3268} INFO -  at 13.6s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 200, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3268} INFO -  at 13.7s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 201, current learner rf\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3268} INFO -  at 13.8s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 202, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3268} INFO -  at 13.8s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 203, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3268} INFO -  at 13.9s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 204, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3268} INFO -  at 13.9s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 205, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3268} INFO -  at 14.0s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 206, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3268} INFO -  at 14.1s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 207, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3268} INFO -  at 14.1s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 208, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3268} INFO -  at 14.1s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 209, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3268} INFO -  at 14.2s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 210, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3268} INFO -  at 14.2s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 211, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3268} INFO -  at 14.3s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 212, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3268} INFO -  at 14.3s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 213, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3268} INFO -  at 14.4s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 214, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3268} INFO -  at 14.4s,\testimator xgb_limitdepth's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 215, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3268} INFO -  at 14.5s,\testimator xgb_limitdepth's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 216, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3268} INFO -  at 14.5s,\testimator xgb_limitdepth's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 217, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3268} INFO -  at 14.6s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 218, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3268} INFO -  at 14.6s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 219, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3268} INFO -  at 14.7s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 220, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3268} INFO -  at 14.7s,\testimator xgb_limitdepth's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 221, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3268} INFO -  at 14.7s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 222, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3268} INFO -  at 14.8s,\testimator xgb_limitdepth's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 223, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3268} INFO -  at 14.8s,\testimator xgb_limitdepth's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 224, current learner lgbm\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3268} INFO -  at 14.9s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 225, current learner xgboost\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3268} INFO -  at 14.9s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 226, current learner xgb_limitdepth\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3268} INFO -  at 14.9s,\testimator xgb_limitdepth's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 227, current learner extra_tree\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3268} INFO -  at 15.0s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3532} INFO - retrain lgbm for 0.0s\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {3539} INFO - retrained model: LGBMClassifier(boosting_type='gbdt', class_weight=None, colsample_bytree=1.0,\n",
 | ||
|       "               importance_type='split', learning_rate=0.7333523408279569,\n",
 | ||
|       "               max_bin=31, max_depth=-1, min_child_samples=8,\n",
 | ||
|       "               min_child_weight=0.001, min_split_gain=0.0, n_estimators=4,\n",
 | ||
|       "               n_jobs=-1, num_leaves=5, objective=None, random_state=None,\n",
 | ||
|       "               reg_alpha=0.0009765625, reg_lambda=7.593190995489472,\n",
 | ||
|       "               silent=True, subsample=1.0, subsample_for_bin=200000,\n",
 | ||
|       "               subsample_freq=0, verbose=-1)\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {2817} INFO - fit succeeded\n",
 | ||
|       "[flaml.automl: 08-03 20:33:41] {2818} INFO - Time taken to find the best model: 2.6732513904571533\n"
 | ||
|      ]
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "\"\"\"The main flaml automl API\"\"\"\n",
 | ||
|     "automl.fit(X_train=discrete_X_train,\n",
 | ||
|     "           y_train=discrete_y_train,\n",
 | ||
|     "           **settings,\n",
 | ||
|     "           period=time_horizon)"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "markdown",
 | ||
|    "metadata": {},
 | ||
|    "source": [
 | ||
|     "### Best Model and Metric"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 8,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "name": "stdout",
 | ||
|      "output_type": "stream",
 | ||
|      "text": [
 | ||
|       "Best ML leaner: lgbm\n",
 | ||
|       "Best hyperparmeter config: {'n_estimators': 4, 'num_leaves': 5, 'min_child_samples': 8, 'learning_rate': 0.7333523408279569, 'log_max_bin': 5, 'colsample_bytree': 1.0, 'reg_alpha': 0.0009765625, 'reg_lambda': 7.593190995489472, 'optimize_for_horizon': False, 'lags': 5}\n",
 | ||
|       "Best mape on validation data: 0.033333333333333326\n",
 | ||
|       "Training duration of best run: 0.017951011657714844s\n",
 | ||
|       "LGBMClassifier(boosting_type='gbdt', class_weight=None, colsample_bytree=1.0,\n",
 | ||
|       "               importance_type='split', learning_rate=0.7333523408279569,\n",
 | ||
|       "               max_bin=31, max_depth=-1, min_child_samples=8,\n",
 | ||
|       "               min_child_weight=0.001, min_split_gain=0.0, n_estimators=4,\n",
 | ||
|       "               n_jobs=-1, num_leaves=5, objective=None, random_state=None,\n",
 | ||
|       "               reg_alpha=0.0009765625, reg_lambda=7.593190995489472,\n",
 | ||
|       "               silent=True, subsample=1.0, subsample_for_bin=200000,\n",
 | ||
|       "               subsample_freq=0, verbose=-1)\n"
 | ||
|      ]
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "\"\"\" retrieve best config and best learner\"\"\"\n",
 | ||
|     "print(\"Best ML leaner:\", automl.best_estimator)\n",
 | ||
|     "print(\"Best hyperparmeter config:\", automl.best_config)\n",
 | ||
|     "print(f\"Best mape on validation data: {automl.best_loss}\")\n",
 | ||
|     "print(f\"Training duration of best run: {automl.best_config_train_time}s\")\n",
 | ||
|     "print(automl.model.estimator)"
 | ||
|    ]
 | ||
|   },
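 | ||
|   {
 | ||
|    "cell_type": "markdown",
 | ||
|    "metadata": {},
 | ||
|    "source": [
 | ||
|     "Optionally, the fitted `automl` object can be kept for later reuse. The cell below is a minimal sketch added for illustration (it was not part of the original run): it persists the object with `pickle` and reloads it. The file name `automl_discrete.pkl` is an arbitrary choice, and this assumes the underlying estimator is picklable, which is generally the case for the tree-based learners used here."
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": null,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [],
 | ||
|    "source": [
 | ||
|     "\"\"\"optional sketch: persist and reload the fitted AutoML object\"\"\"\n",
 | ||
|     "import pickle\n",
 | ||
|     "\n",
 | ||
|     "# any writable path works; the name below is arbitrary\n",
 | ||
|     "with open(\"automl_discrete.pkl\", \"wb\") as f:\n",
 | ||
|     "    pickle.dump(automl, f, pickle.HIGHEST_PROTOCOL)\n",
 | ||
|     "with open(\"automl_discrete.pkl\", \"rb\") as f:\n",
 | ||
|     "    automl_reloaded = pickle.load(f)\n",
 | ||
|     "print(automl_reloaded.best_estimator)"
 | ||
|    ]
 | ||
|   },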
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 9,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "name": "stdout",
 | ||
|      "output_type": "stream",
 | ||
|      "text": [
 | ||
|       "Predicted label [1 1 0 0 1 1 1 1 1 0 0 1 1 1 1 1 0 0 1 1 1 1 1 0 0 1 1 1 1 1]\n",
 | ||
|       "True label 150    1\n",
 | ||
|       "151    1\n",
 | ||
|       "152    0\n",
 | ||
|       "153    0\n",
 | ||
|       "154    1\n",
 | ||
|       "155    1\n",
 | ||
|       "156    1\n",
 | ||
|       "157    1\n",
 | ||
|       "158    1\n",
 | ||
|       "159    0\n",
 | ||
|       "160    0\n",
 | ||
|       "161    1\n",
 | ||
|       "162    1\n",
 | ||
|       "163    1\n",
 | ||
|       "164    1\n",
 | ||
|       "165    1\n",
 | ||
|       "166    0\n",
 | ||
|       "167    0\n",
 | ||
|       "168    1\n",
 | ||
|       "169    1\n",
 | ||
|       "170    1\n",
 | ||
|       "171    1\n",
 | ||
|       "172    1\n",
 | ||
|       "173    0\n",
 | ||
|       "174    0\n",
 | ||
|       "175    1\n",
 | ||
|       "176    1\n",
 | ||
|       "177    1\n",
 | ||
|       "178    1\n",
 | ||
|       "179    1\n",
 | ||
|       "Name: above_mean_sales, dtype: int32\n"
 | ||
|      ]
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "\"\"\" compute predictions of testing dataset \"\"\"\n",
 | ||
|     "discrete_y_pred = automl.predict(discrete_X_test)\n",
 | ||
|     "print(\"Predicted label\", discrete_y_pred)\n",
 | ||
|     "print(\"True label\", discrete_y_test)"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 10,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "name": "stdout",
 | ||
|      "output_type": "stream",
 | ||
|      "text": [
 | ||
|       "accuracy = 1.0\n"
 | ||
|      ]
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "from flaml.ml import sklearn_metric_loss_score\n",
 | ||
|     "print(\"accuracy\", \"=\", 1 - sklearn_metric_loss_score(\"accuracy\", discrete_y_test, discrete_y_pred))"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "markdown",
 | ||
|    "metadata": {},
 | ||
|    "source": [
 | ||
|     "## 5. Forecast Problems with Panel Datasets (Multiple Time Series)"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "markdown",
 | ||
|    "metadata": {},
 | ||
|    "source": [
 | ||
|     "### Load data and preprocess\n",
 | ||
|     "\n",
 | ||
|     "Import Stallion & Co.'s beverage sales data from pytorch-forecasting, orginally from Kaggle. The dataset contains about 21,000 monthly historic sales record as well as additional information about the sales price, the location of the agency, special days such as holidays, and volume sold in the entire industry. There are thousands of unique wholesaler-SKU/products combinations, each representing an individual time series. The task is to provide a six month forecast of demand at SKU level for each wholesaler."
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 2,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [],
 | ||
|    "source": [
 | ||
|     "def get_stalliion_data():\n",
 | ||
|     "    from pytorch_forecasting.data.examples import get_stallion_data\n",
 | ||
|     "\n",
 | ||
|     "    data = get_stallion_data()\n",
 | ||
|     "    # add time index\n",
 | ||
|     "    data[\"time_idx\"] = data[\"date\"].dt.year * 12 + data[\"date\"].dt.month\n",
 | ||
|     "    data[\"time_idx\"] -= data[\"time_idx\"].min()\n",
 | ||
|     "    # add additional features\n",
 | ||
|     "    data[\"month\"] = data.date.dt.month.astype(str).astype(\n",
 | ||
|     "        \"category\"\n",
 | ||
|     "    )  # categories have be strings\n",
 | ||
|     "    data[\"log_volume\"] = np.log(data.volume + 1e-8)\n",
 | ||
|     "    data[\"avg_volume_by_sku\"] = data.groupby(\n",
 | ||
|     "        [\"time_idx\", \"sku\"], observed=True\n",
 | ||
|     "    ).volume.transform(\"mean\")\n",
 | ||
|     "    data[\"avg_volume_by_agency\"] = data.groupby(\n",
 | ||
|     "        [\"time_idx\", \"agency\"], observed=True\n",
 | ||
|     "    ).volume.transform(\"mean\")\n",
 | ||
|     "    # we want to encode special days as one variable and thus need to first reverse one-hot encoding\n",
 | ||
|     "    special_days = [\n",
 | ||
|     "        \"easter_day\",\n",
 | ||
|     "        \"good_friday\",\n",
 | ||
|     "        \"new_year\",\n",
 | ||
|     "        \"christmas\",\n",
 | ||
|     "        \"labor_day\",\n",
 | ||
|     "        \"independence_day\",\n",
 | ||
|     "        \"revolution_day_memorial\",\n",
 | ||
|     "        \"regional_games\",\n",
 | ||
|     "        \"beer_capital\",\n",
 | ||
|     "        \"music_fest\",\n",
 | ||
|     "    ]\n",
 | ||
|     "    data[special_days] = (\n",
 | ||
|     "        data[special_days]\n",
 | ||
|     "        .apply(lambda x: x.map({0: \"-\", 1: x.name}))\n",
 | ||
|     "        .astype(\"category\")\n",
 | ||
|     "    )\n",
 | ||
|     "    return data, special_days"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 3,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [],
 | ||
|    "source": [
 | ||
|     "data, special_days = get_stalliion_data()\n",
 | ||
|     "time_horizon = 6  # predict six months\n",
 | ||
|     "# make time steps first column\n",
 | ||
|     "data[\"time_idx\"] = data[\"date\"].dt.year * 12 + data[\"date\"].dt.month\n",
 | ||
|     "data[\"time_idx\"] -= data[\"time_idx\"].min()\n",
 | ||
|     "training_cutoff = data[\"time_idx\"].max() - time_horizon\n",
 | ||
|     "ts_col = data.pop(\"date\")\n",
 | ||
|     "data.insert(0, \"date\", ts_col)\n",
 | ||
|     "# FLAML assumes input is not sorted, but we sort here for comparison purposes with y_test\n",
 | ||
|     "data = data.sort_values([\"agency\", \"sku\", \"date\"])\n",
 | ||
|     "X_train = data[lambda x: x.time_idx <= training_cutoff]\n",
 | ||
|     "X_test = data[lambda x: x.time_idx > training_cutoff]\n",
 | ||
|     "y_train = X_train.pop(\"volume\")\n",
 | ||
|     "y_test = X_test.pop(\"volume\")"
 | ||
|    ]
 | ||
|   },
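 | ||
|   {
 | ||
|    "cell_type": "markdown",
 | ||
|    "metadata": {},
 | ||
|    "source": [
 | ||
|     "Before fitting, it helps to confirm how many individual series the panel contains and how large the train/test split is. The quick check below is a sketch added for illustration; it assumes, as described above, that `agency` and `sku` together identify one series."
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": null,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [],
 | ||
|    "source": [
 | ||
|     "\"\"\"optional sketch: sanity-check the panel structure\"\"\"\n",
 | ||
|     "n_series = X_train.groupby([\"agency\", \"sku\"], observed=True).ngroups\n",
 | ||
|     "print(\"number of (agency, sku) series:\", n_series)\n",
 | ||
|     "print(\"train shape:\", X_train.shape, \"test shape:\", X_test.shape)\n",
 | ||
|     "print(\"forecast horizon (months):\", time_horizon)"
 | ||
|    ]
 | ||
|   },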
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 4,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "data": {
 | ||
|       "text/html": [
 | ||
|        "<div>\n",
 | ||
|        "<style scoped>\n",
 | ||
|        "    .dataframe tbody tr th:only-of-type {\n",
 | ||
|        "        vertical-align: middle;\n",
 | ||
|        "    }\n",
 | ||
|        "\n",
 | ||
|        "    .dataframe tbody tr th {\n",
 | ||
|        "        vertical-align: top;\n",
 | ||
|        "    }\n",
 | ||
|        "\n",
 | ||
|        "    .dataframe thead th {\n",
 | ||
|        "        text-align: right;\n",
 | ||
|        "    }\n",
 | ||
|        "</style>\n",
 | ||
|        "<table border=\"1\" class=\"dataframe\">\n",
 | ||
|        "  <thead>\n",
 | ||
|        "    <tr style=\"text-align: right;\">\n",
 | ||
|        "      <th></th>\n",
 | ||
|        "      <th>date</th>\n",
 | ||
|        "      <th>agency</th>\n",
 | ||
|        "      <th>sku</th>\n",
 | ||
|        "      <th>industry_volume</th>\n",
 | ||
|        "      <th>soda_volume</th>\n",
 | ||
|        "      <th>avg_max_temp</th>\n",
 | ||
|        "      <th>price_regular</th>\n",
 | ||
|        "      <th>price_actual</th>\n",
 | ||
|        "      <th>discount</th>\n",
 | ||
|        "      <th>avg_population_2017</th>\n",
 | ||
|        "      <th>...</th>\n",
 | ||
|        "      <th>football_gold_cup</th>\n",
 | ||
|        "      <th>beer_capital</th>\n",
 | ||
|        "      <th>music_fest</th>\n",
 | ||
|        "      <th>discount_in_percent</th>\n",
 | ||
|        "      <th>timeseries</th>\n",
 | ||
|        "      <th>time_idx</th>\n",
 | ||
|        "      <th>month</th>\n",
 | ||
|        "      <th>log_volume</th>\n",
 | ||
|        "      <th>avg_volume_by_sku</th>\n",
 | ||
|        "      <th>avg_volume_by_agency</th>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "  </thead>\n",
 | ||
|        "  <tbody>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>25</th>\n",
 | ||
|        "      <td>2013-01-01</td>\n",
 | ||
|        "      <td>Agency_01</td>\n",
 | ||
|        "      <td>SKU_01</td>\n",
 | ||
|        "      <td>492612703</td>\n",
 | ||
|        "      <td>718394219</td>\n",
 | ||
|        "      <td>17.072000</td>\n",
 | ||
|        "      <td>1141.500000</td>\n",
 | ||
|        "      <td>1033.432731</td>\n",
 | ||
|        "      <td>108.067269</td>\n",
 | ||
|        "      <td>153733</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>0</td>\n",
 | ||
|        "      <td>-</td>\n",
 | ||
|        "      <td>-</td>\n",
 | ||
|        "      <td>9.467128</td>\n",
 | ||
|        "      <td>249</td>\n",
 | ||
|        "      <td>0</td>\n",
 | ||
|        "      <td>1</td>\n",
 | ||
|        "      <td>4.390441</td>\n",
 | ||
|        "      <td>2613.377501</td>\n",
 | ||
|        "      <td>74.829600</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>7183</th>\n",
 | ||
|        "      <td>2013-02-01</td>\n",
 | ||
|        "      <td>Agency_01</td>\n",
 | ||
|        "      <td>SKU_01</td>\n",
 | ||
|        "      <td>431937346</td>\n",
 | ||
|        "      <td>753938444</td>\n",
 | ||
|        "      <td>19.984000</td>\n",
 | ||
|        "      <td>1141.500000</td>\n",
 | ||
|        "      <td>1065.417195</td>\n",
 | ||
|        "      <td>76.082805</td>\n",
 | ||
|        "      <td>153733</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>0</td>\n",
 | ||
|        "      <td>-</td>\n",
 | ||
|        "      <td>-</td>\n",
 | ||
|        "      <td>6.665160</td>\n",
 | ||
|        "      <td>249</td>\n",
 | ||
|        "      <td>1</td>\n",
 | ||
|        "      <td>2</td>\n",
 | ||
|        "      <td>4.585620</td>\n",
 | ||
|        "      <td>2916.978087</td>\n",
 | ||
|        "      <td>90.036700</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>8928</th>\n",
 | ||
|        "      <td>2013-03-01</td>\n",
 | ||
|        "      <td>Agency_01</td>\n",
 | ||
|        "      <td>SKU_01</td>\n",
 | ||
|        "      <td>509281531</td>\n",
 | ||
|        "      <td>892192092</td>\n",
 | ||
|        "      <td>24.600000</td>\n",
 | ||
|        "      <td>1179.345820</td>\n",
 | ||
|        "      <td>1101.133633</td>\n",
 | ||
|        "      <td>78.212187</td>\n",
 | ||
|        "      <td>153733</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>0</td>\n",
 | ||
|        "      <td>-</td>\n",
 | ||
|        "      <td>music_fest</td>\n",
 | ||
|        "      <td>6.631828</td>\n",
 | ||
|        "      <td>249</td>\n",
 | ||
|        "      <td>2</td>\n",
 | ||
|        "      <td>3</td>\n",
 | ||
|        "      <td>4.895628</td>\n",
 | ||
|        "      <td>3215.061952</td>\n",
 | ||
|        "      <td>130.487150</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>10588</th>\n",
 | ||
|        "      <td>2013-04-01</td>\n",
 | ||
|        "      <td>Agency_01</td>\n",
 | ||
|        "      <td>SKU_01</td>\n",
 | ||
|        "      <td>532390389</td>\n",
 | ||
|        "      <td>838099501</td>\n",
 | ||
|        "      <td>27.532000</td>\n",
 | ||
|        "      <td>1226.687500</td>\n",
 | ||
|        "      <td>1138.283357</td>\n",
 | ||
|        "      <td>88.404143</td>\n",
 | ||
|        "      <td>153733</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>0</td>\n",
 | ||
|        "      <td>-</td>\n",
 | ||
|        "      <td>-</td>\n",
 | ||
|        "      <td>7.206737</td>\n",
 | ||
|        "      <td>249</td>\n",
 | ||
|        "      <td>3</td>\n",
 | ||
|        "      <td>4</td>\n",
 | ||
|        "      <td>4.992553</td>\n",
 | ||
|        "      <td>3515.822697</td>\n",
 | ||
|        "      <td>130.246150</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>12260</th>\n",
 | ||
|        "      <td>2013-05-01</td>\n",
 | ||
|        "      <td>Agency_01</td>\n",
 | ||
|        "      <td>SKU_01</td>\n",
 | ||
|        "      <td>551755254</td>\n",
 | ||
|        "      <td>864420003</td>\n",
 | ||
|        "      <td>29.396000</td>\n",
 | ||
|        "      <td>1230.331104</td>\n",
 | ||
|        "      <td>1148.969634</td>\n",
 | ||
|        "      <td>81.361470</td>\n",
 | ||
|        "      <td>153733</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>0</td>\n",
 | ||
|        "      <td>-</td>\n",
 | ||
|        "      <td>-</td>\n",
 | ||
|        "      <td>6.612974</td>\n",
 | ||
|        "      <td>249</td>\n",
 | ||
|        "      <td>4</td>\n",
 | ||
|        "      <td>5</td>\n",
 | ||
|        "      <td>5.168254</td>\n",
 | ||
|        "      <td>3688.107793</td>\n",
 | ||
|        "      <td>159.051550</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>...</th>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>8403</th>\n",
 | ||
|        "      <td>2017-02-01</td>\n",
 | ||
|        "      <td>Agency_60</td>\n",
 | ||
|        "      <td>SKU_23</td>\n",
 | ||
|        "      <td>530252010</td>\n",
 | ||
|        "      <td>850913048</td>\n",
 | ||
|        "      <td>25.242657</td>\n",
 | ||
|        "      <td>4261.294565</td>\n",
 | ||
|        "      <td>4087.082609</td>\n",
 | ||
|        "      <td>174.211956</td>\n",
 | ||
|        "      <td>2180611</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>0</td>\n",
 | ||
|        "      <td>-</td>\n",
 | ||
|        "      <td>-</td>\n",
 | ||
|        "      <td>4.088240</td>\n",
 | ||
|        "      <td>190</td>\n",
 | ||
|        "      <td>49</td>\n",
 | ||
|        "      <td>2</td>\n",
 | ||
|        "      <td>0.924259</td>\n",
 | ||
|        "      <td>2.418750</td>\n",
 | ||
|        "      <td>2664.670179</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>10359</th>\n",
 | ||
|        "      <td>2017-03-01</td>\n",
 | ||
|        "      <td>Agency_60</td>\n",
 | ||
|        "      <td>SKU_23</td>\n",
 | ||
|        "      <td>613143990</td>\n",
 | ||
|        "      <td>886129111</td>\n",
 | ||
|        "      <td>25.374816</td>\n",
 | ||
|        "      <td>4259.769000</td>\n",
 | ||
|        "      <td>4126.776000</td>\n",
 | ||
|        "      <td>132.993000</td>\n",
 | ||
|        "      <td>2180611</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>0</td>\n",
 | ||
|        "      <td>-</td>\n",
 | ||
|        "      <td>music_fest</td>\n",
 | ||
|        "      <td>3.122071</td>\n",
 | ||
|        "      <td>190</td>\n",
 | ||
|        "      <td>50</td>\n",
 | ||
|        "      <td>3</td>\n",
 | ||
|        "      <td>0.536493</td>\n",
 | ||
|        "      <td>4.353750</td>\n",
 | ||
|        "      <td>2965.472829</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>12114</th>\n",
 | ||
|        "      <td>2017-04-01</td>\n",
 | ||
|        "      <td>Agency_60</td>\n",
 | ||
|        "      <td>SKU_23</td>\n",
 | ||
|        "      <td>589969396</td>\n",
 | ||
|        "      <td>940912941</td>\n",
 | ||
|        "      <td>27.109204</td>\n",
 | ||
|        "      <td>4261.896428</td>\n",
 | ||
|        "      <td>4115.753572</td>\n",
 | ||
|        "      <td>146.142856</td>\n",
 | ||
|        "      <td>2180611</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>0</td>\n",
 | ||
|        "      <td>-</td>\n",
 | ||
|        "      <td>-</td>\n",
 | ||
|        "      <td>3.429057</td>\n",
 | ||
|        "      <td>190</td>\n",
 | ||
|        "      <td>51</td>\n",
 | ||
|        "      <td>4</td>\n",
 | ||
|        "      <td>0.231112</td>\n",
 | ||
|        "      <td>2.396250</td>\n",
 | ||
|        "      <td>2861.802300</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>13884</th>\n",
 | ||
|        "      <td>2017-05-01</td>\n",
 | ||
|        "      <td>Agency_60</td>\n",
 | ||
|        "      <td>SKU_23</td>\n",
 | ||
|        "      <td>628759461</td>\n",
 | ||
|        "      <td>917412482</td>\n",
 | ||
|        "      <td>28.479272</td>\n",
 | ||
|        "      <td>0.000000</td>\n",
 | ||
|        "      <td>0.000000</td>\n",
 | ||
|        "      <td>0.000000</td>\n",
 | ||
|        "      <td>2180611</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>0</td>\n",
 | ||
|        "      <td>-</td>\n",
 | ||
|        "      <td>-</td>\n",
 | ||
|        "      <td>0.000000</td>\n",
 | ||
|        "      <td>190</td>\n",
 | ||
|        "      <td>52</td>\n",
 | ||
|        "      <td>5</td>\n",
 | ||
|        "      <td>-18.420681</td>\n",
 | ||
|        "      <td>2.182500</td>\n",
 | ||
|        "      <td>3489.190286</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "    <tr>\n",
 | ||
|        "      <th>15669</th>\n",
 | ||
|        "      <td>2017-06-01</td>\n",
 | ||
|        "      <td>Agency_60</td>\n",
 | ||
|        "      <td>SKU_23</td>\n",
 | ||
|        "      <td>636846973</td>\n",
 | ||
|        "      <td>928366256</td>\n",
 | ||
|        "      <td>29.609259</td>\n",
 | ||
|        "      <td>4256.675000</td>\n",
 | ||
|        "      <td>4246.018750</td>\n",
 | ||
|        "      <td>10.656250</td>\n",
 | ||
|        "      <td>2180611</td>\n",
 | ||
|        "      <td>...</td>\n",
 | ||
|        "      <td>0</td>\n",
 | ||
|        "      <td>-</td>\n",
 | ||
|        "      <td>-</td>\n",
 | ||
|        "      <td>0.250342</td>\n",
 | ||
|        "      <td>190</td>\n",
 | ||
|        "      <td>53</td>\n",
 | ||
|        "      <td>6</td>\n",
 | ||
|        "      <td>0.924259</td>\n",
 | ||
|        "      <td>2.362500</td>\n",
 | ||
|        "      <td>3423.810793</td>\n",
 | ||
|        "    </tr>\n",
 | ||
|        "  </tbody>\n",
 | ||
|        "</table>\n",
 | ||
|        "<p>18900 rows × 30 columns</p>\n",
 | ||
|        "</div>"
 | ||
|       ],
 | ||
|       "text/plain": [
 | ||
|        "            date     agency     sku  industry_volume  soda_volume  \\\n",
 | ||
|        "25    2013-01-01  Agency_01  SKU_01        492612703    718394219   \n",
 | ||
|        "7183  2013-02-01  Agency_01  SKU_01        431937346    753938444   \n",
 | ||
|        "8928  2013-03-01  Agency_01  SKU_01        509281531    892192092   \n",
 | ||
|        "10588 2013-04-01  Agency_01  SKU_01        532390389    838099501   \n",
 | ||
|        "12260 2013-05-01  Agency_01  SKU_01        551755254    864420003   \n",
 | ||
|        "...          ...        ...     ...              ...          ...   \n",
 | ||
|        "8403  2017-02-01  Agency_60  SKU_23        530252010    850913048   \n",
 | ||
|        "10359 2017-03-01  Agency_60  SKU_23        613143990    886129111   \n",
 | ||
|        "12114 2017-04-01  Agency_60  SKU_23        589969396    940912941   \n",
 | ||
|        "13884 2017-05-01  Agency_60  SKU_23        628759461    917412482   \n",
 | ||
|        "15669 2017-06-01  Agency_60  SKU_23        636846973    928366256   \n",
 | ||
|        "\n",
 | ||
|        "       avg_max_temp  price_regular  price_actual    discount  \\\n",
 | ||
|        "25        17.072000    1141.500000   1033.432731  108.067269   \n",
 | ||
|        "7183      19.984000    1141.500000   1065.417195   76.082805   \n",
 | ||
|        "8928      24.600000    1179.345820   1101.133633   78.212187   \n",
 | ||
|        "10588     27.532000    1226.687500   1138.283357   88.404143   \n",
 | ||
|        "12260     29.396000    1230.331104   1148.969634   81.361470   \n",
 | ||
|        "...             ...            ...           ...         ...   \n",
 | ||
|        "8403      25.242657    4261.294565   4087.082609  174.211956   \n",
 | ||
|        "10359     25.374816    4259.769000   4126.776000  132.993000   \n",
 | ||
|        "12114     27.109204    4261.896428   4115.753572  146.142856   \n",
 | ||
|        "13884     28.479272       0.000000      0.000000    0.000000   \n",
 | ||
|        "15669     29.609259    4256.675000   4246.018750   10.656250   \n",
 | ||
|        "\n",
 | ||
|        "       avg_population_2017  ...  football_gold_cup beer_capital  music_fest  \\\n",
 | ||
|        "25                  153733  ...                  0            -           -   \n",
 | ||
|        "7183                153733  ...                  0            -           -   \n",
 | ||
|        "8928                153733  ...                  0            -  music_fest   \n",
 | ||
|        "10588               153733  ...                  0            -           -   \n",
 | ||
|        "12260               153733  ...                  0            -           -   \n",
 | ||
|        "...                    ...  ...                ...          ...         ...   \n",
 | ||
|        "8403               2180611  ...                  0            -           -   \n",
 | ||
|        "10359              2180611  ...                  0            -  music_fest   \n",
 | ||
|        "12114              2180611  ...                  0            -           -   \n",
 | ||
|        "13884              2180611  ...                  0            -           -   \n",
 | ||
|        "15669              2180611  ...                  0            -           -   \n",
 | ||
|        "\n",
 | ||
|        "      discount_in_percent timeseries time_idx month log_volume  \\\n",
 | ||
|        "25               9.467128        249        0     1   4.390441   \n",
 | ||
|        "7183             6.665160        249        1     2   4.585620   \n",
 | ||
|        "8928             6.631828        249        2     3   4.895628   \n",
 | ||
|        "10588            7.206737        249        3     4   4.992553   \n",
 | ||
|        "12260            6.612974        249        4     5   5.168254   \n",
 | ||
|        "...                   ...        ...      ...   ...        ...   \n",
 | ||
|        "8403             4.088240        190       49     2   0.924259   \n",
 | ||
|        "10359            3.122071        190       50     3   0.536493   \n",
 | ||
|        "12114            3.429057        190       51     4   0.231112   \n",
 | ||
|        "13884            0.000000        190       52     5 -18.420681   \n",
 | ||
|        "15669            0.250342        190       53     6   0.924259   \n",
 | ||
|        "\n",
 | ||
|        "      avg_volume_by_sku  avg_volume_by_agency  \n",
 | ||
|        "25          2613.377501             74.829600  \n",
 | ||
|        "7183        2916.978087             90.036700  \n",
 | ||
|        "8928        3215.061952            130.487150  \n",
 | ||
|        "10588       3515.822697            130.246150  \n",
 | ||
|        "12260       3688.107793            159.051550  \n",
 | ||
|        "...                 ...                   ...  \n",
 | ||
|        "8403           2.418750           2664.670179  \n",
 | ||
|        "10359          4.353750           2965.472829  \n",
 | ||
|        "12114          2.396250           2861.802300  \n",
 | ||
|        "13884          2.182500           3489.190286  \n",
 | ||
|        "15669          2.362500           3423.810793  \n",
 | ||
|        "\n",
 | ||
|        "[18900 rows x 30 columns]"
 | ||
|       ]
 | ||
|      },
 | ||
|      "execution_count": 4,
 | ||
|      "metadata": {},
 | ||
|      "output_type": "execute_result"
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "X_train"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "markdown",
 | ||
|    "metadata": {},
 | ||
|    "source": [
 | ||
|     "### Run FLAML"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 5,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "name": "stderr",
 | ||
|      "output_type": "stream",
 | ||
|      "text": [
 | ||
|       "Missing timestamps detected. To avoid error with estimators, set estimator list to ['prophet']. \n",
 | ||
|       "[flaml.automl: 07-28 21:26:03] {2478} INFO - task = ts_forecast_panel\n",
 | ||
|       "[flaml.automl: 07-28 21:26:03] {2480} INFO - Data split method: time\n",
 | ||
|       "[flaml.automl: 07-28 21:26:03] {2483} INFO - Evaluation method: holdout\n",
 | ||
|       "[flaml.automl: 07-28 21:26:03] {2552} INFO - Minimizing error metric: mape\n",
 | ||
|       "[flaml.automl: 07-28 21:26:03] {2694} INFO - List of ML learners in AutoML Run: ['tft']\n",
 | ||
|       "[flaml.automl: 07-28 21:26:03] {2986} INFO - iteration 0, current learner tft\n",
 | ||
|       "GPU available: False, used: False\n",
 | ||
|       "TPU available: False, using: 0 TPU cores\n",
 | ||
|       "IPU available: False, using: 0 IPUs\n",
 | ||
|       "\n",
 | ||
|       "   | Name                               | Type                            | Params\n",
 | ||
|       "----------------------------------------------------------------------------------------\n",
 | ||
|       "0  | loss                               | QuantileLoss                    | 0     \n",
 | ||
|       "1  | logging_metrics                    | ModuleList                      | 0     \n",
 | ||
|       "2  | input_embeddings                   | MultiEmbedding                  | 1.3 K \n",
 | ||
|       "3  | prescalers                         | ModuleDict                      | 256   \n",
 | ||
|       "4  | static_variable_selection          | VariableSelectionNetwork        | 3.4 K \n",
 | ||
|       "5  | encoder_variable_selection         | VariableSelectionNetwork        | 8.0 K \n",
 | ||
|       "6  | decoder_variable_selection         | VariableSelectionNetwork        | 2.7 K \n",
 | ||
|       "7  | static_context_variable_selection  | GatedResidualNetwork            | 1.1 K \n",
 | ||
|       "8  | static_context_initial_hidden_lstm | GatedResidualNetwork            | 1.1 K \n",
 | ||
|       "9  | static_context_initial_cell_lstm   | GatedResidualNetwork            | 1.1 K \n",
 | ||
|       "10 | static_context_enrichment          | GatedResidualNetwork            | 1.1 K \n",
 | ||
|       "11 | lstm_encoder                       | LSTM                            | 4.4 K \n",
 | ||
|       "12 | lstm_decoder                       | LSTM                            | 4.4 K \n",
 | ||
|       "13 | post_lstm_gate_encoder             | GatedLinearUnit                 | 544   \n",
 | ||
|       "14 | post_lstm_add_norm_encoder         | AddNorm                         | 32    \n",
 | ||
|       "15 | static_enrichment                  | GatedResidualNetwork            | 1.4 K \n",
 | ||
|       "16 | multihead_attn                     | InterpretableMultiHeadAttention | 676   \n",
 | ||
|       "17 | post_attn_gate_norm                | GateAddNorm                     | 576   \n",
 | ||
|       "18 | pos_wise_ff                        | GatedResidualNetwork            | 1.1 K \n",
 | ||
|       "19 | pre_output_gate_norm               | GateAddNorm                     | 576   \n",
 | ||
|       "20 | output_layer                       | Linear                          | 119   \n",
 | ||
|       "----------------------------------------------------------------------------------------\n",
 | ||
|       "33.6 K    Trainable params\n",
 | ||
|       "0         Non-trainable params\n",
 | ||
|       "33.6 K    Total params\n",
 | ||
|       "0.135     Total estimated model params size (MB)\n"
 | ||
|      ]
 | ||
|     },
 | ||
|     {
 | ||
|      "name": "stdout",
 | ||
|      "output_type": "stream",
 | ||
|      "text": [
 | ||
|       "Epoch 19: 100%|██████████| 129/129 [00:56<00:00,  2.27it/s, loss=45.9, v_num=2, train_loss_step=43.00, val_loss=65.20, train_loss_epoch=46.50]\n"
 | ||
|      ]
 | ||
|     },
 | ||
|     {
 | ||
|      "name": "stderr",
 | ||
|      "output_type": "stream",
 | ||
|      "text": [
 | ||
|       "[flaml.automl: 07-28 21:46:46] {3114} INFO - Estimated sufficient time budget=12424212s. Estimated necessary time budget=12424s.\n",
 | ||
|       "[flaml.automl: 07-28 21:46:46] {3161} INFO -  at 1242.6s,\testimator tft's best error=1324290483134574.7500,\tbest estimator tft's best error=1324290483134574.7500\n",
 | ||
|       "GPU available: False, used: False\n",
 | ||
|       "TPU available: False, using: 0 TPU cores\n",
 | ||
|       "IPU available: False, using: 0 IPUs\n",
 | ||
|       "\n",
 | ||
|       "   | Name                               | Type                            | Params\n",
 | ||
|       "----------------------------------------------------------------------------------------\n",
 | ||
|       "0  | loss                               | QuantileLoss                    | 0     \n",
 | ||
|       "1  | logging_metrics                    | ModuleList                      | 0     \n",
 | ||
|       "2  | input_embeddings                   | MultiEmbedding                  | 1.3 K \n",
 | ||
|       "3  | prescalers                         | ModuleDict                      | 256   \n",
 | ||
|       "4  | static_variable_selection          | VariableSelectionNetwork        | 3.4 K \n",
 | ||
|       "5  | encoder_variable_selection         | VariableSelectionNetwork        | 8.0 K \n",
 | ||
|       "6  | decoder_variable_selection         | VariableSelectionNetwork        | 2.7 K \n",
 | ||
|       "7  | static_context_variable_selection  | GatedResidualNetwork            | 1.1 K \n",
 | ||
|       "8  | static_context_initial_hidden_lstm | GatedResidualNetwork            | 1.1 K \n",
 | ||
|       "9  | static_context_initial_cell_lstm   | GatedResidualNetwork            | 1.1 K \n",
 | ||
|       "10 | static_context_enrichment          | GatedResidualNetwork            | 1.1 K \n",
 | ||
|       "11 | lstm_encoder                       | LSTM                            | 4.4 K \n",
 | ||
|       "12 | lstm_decoder                       | LSTM                            | 4.4 K \n",
 | ||
|       "13 | post_lstm_gate_encoder             | GatedLinearUnit                 | 544   \n",
 | ||
|       "14 | post_lstm_add_norm_encoder         | AddNorm                         | 32    \n",
 | ||
|       "15 | static_enrichment                  | GatedResidualNetwork            | 1.4 K \n",
 | ||
|       "16 | multihead_attn                     | InterpretableMultiHeadAttention | 676   \n",
 | ||
|       "17 | post_attn_gate_norm                | GateAddNorm                     | 576   \n",
 | ||
|       "18 | pos_wise_ff                        | GatedResidualNetwork            | 1.1 K \n",
 | ||
|       "19 | pre_output_gate_norm               | GateAddNorm                     | 576   \n",
 | ||
|       "20 | output_layer                       | Linear                          | 119   \n",
 | ||
|       "----------------------------------------------------------------------------------------\n",
 | ||
|       "33.6 K    Trainable params\n",
 | ||
|       "0         Non-trainable params\n",
 | ||
|       "33.6 K    Total params\n",
 | ||
|       "0.135     Total estimated model params size (MB)\n"
 | ||
|      ]
 | ||
|     },
 | ||
|     {
 | ||
|      "name": "stdout",
 | ||
|      "output_type": "stream",
 | ||
|      "text": [
 | ||
|       "Epoch 19: 100%|██████████| 145/145 [01:03<00:00,  2.28it/s, loss=45.2, v_num=3, train_loss_step=46.30, val_loss=67.60, train_loss_epoch=48.10]\n"
 | ||
|      ]
 | ||
|     },
 | ||
|     {
 | ||
|      "name": "stderr",
 | ||
|      "output_type": "stream",
 | ||
|      "text": [
 | ||
|       "[flaml.automl: 07-28 22:08:05] {3425} INFO - retrain tft for 1279.6s\n",
 | ||
|       "[flaml.automl: 07-28 22:08:05] {3432} INFO - retrained model: TemporalFusionTransformer(\n",
 | ||
|       "  (loss): QuantileLoss()\n",
 | ||
|       "  (logging_metrics): ModuleList(\n",
 | ||
|       "    (0): SMAPE()\n",
 | ||
|       "    (1): MAE()\n",
 | ||
|       "    (2): RMSE()\n",
 | ||
|       "    (3): MAPE()\n",
 | ||
|       "  )\n",
 | ||
|       "  (input_embeddings): MultiEmbedding(\n",
 | ||
|       "    (embeddings): ModuleDict(\n",
 | ||
|       "      (agency): Embedding(58, 16)\n",
 | ||
|       "      (sku): Embedding(25, 10)\n",
 | ||
|       "      (special_days): TimeDistributedEmbeddingBag(11, 6, mode=sum)\n",
 | ||
|       "      (month): Embedding(12, 6)\n",
 | ||
|       "    )\n",
 | ||
|       "  )\n",
 | ||
|       "  (prescalers): ModuleDict(\n",
 | ||
|       "    (avg_population_2017): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "    (avg_yearly_household_income_2017): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "    (encoder_length): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "    (y_center): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "    (y_scale): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "    (time_idx): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "    (price_regular): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "    (discount_in_percent): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "    (relative_time_idx): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "    (y): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "    (log_volume): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "    (industry_volume): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "    (soda_volume): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "    (avg_max_temp): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "    (avg_volume_by_agency): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "    (avg_volume_by_sku): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "  )\n",
 | ||
|       "  (static_variable_selection): VariableSelectionNetwork(\n",
 | ||
|       "    (flattened_grn): GatedResidualNetwork(\n",
 | ||
|       "      (resample_norm): ResampleNorm(\n",
 | ||
|       "        (resample): TimeDistributedInterpolation()\n",
 | ||
|       "        (gate): Sigmoid()\n",
 | ||
|       "        (norm): LayerNorm((7,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "      )\n",
 | ||
|       "      (fc1): Linear(in_features=66, out_features=7, bias=True)\n",
 | ||
|       "      (elu): ELU(alpha=1.0)\n",
 | ||
|       "      (fc2): Linear(in_features=7, out_features=7, bias=True)\n",
 | ||
|       "      (gate_norm): GateAddNorm(\n",
 | ||
|       "        (glu): GatedLinearUnit(\n",
 | ||
|       "          (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "          (fc): Linear(in_features=7, out_features=14, bias=True)\n",
 | ||
|       "        )\n",
 | ||
|       "        (add_norm): AddNorm(\n",
 | ||
|       "          (norm): LayerNorm((7,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "        )\n",
 | ||
|       "      )\n",
 | ||
|       "    )\n",
 | ||
|       "    (single_variable_grns): ModuleDict(\n",
 | ||
|       "      (agency): ResampleNorm(\n",
 | ||
|       "        (gate): Sigmoid()\n",
 | ||
|       "        (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "      )\n",
 | ||
|       "      (sku): ResampleNorm(\n",
 | ||
|       "        (resample): TimeDistributedInterpolation()\n",
 | ||
|       "        (gate): Sigmoid()\n",
 | ||
|       "        (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "      )\n",
 | ||
|       "      (avg_population_2017): GatedResidualNetwork(\n",
 | ||
|       "        (resample_norm): ResampleNorm(\n",
 | ||
|       "          (resample): TimeDistributedInterpolation()\n",
 | ||
|       "          (gate): Sigmoid()\n",
 | ||
|       "          (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "        )\n",
 | ||
|       "        (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (elu): ELU(alpha=1.0)\n",
 | ||
|       "        (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (gate_norm): GateAddNorm(\n",
 | ||
|       "          (glu): GatedLinearUnit(\n",
 | ||
|       "            (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "            (fc): Linear(in_features=8, out_features=32, bias=True)\n",
 | ||
|       "          )\n",
 | ||
|       "          (add_norm): AddNorm(\n",
 | ||
|       "            (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "          )\n",
 | ||
|       "        )\n",
 | ||
|       "      )\n",
 | ||
|       "      (avg_yearly_household_income_2017): GatedResidualNetwork(\n",
 | ||
|       "        (resample_norm): ResampleNorm(\n",
 | ||
|       "          (resample): TimeDistributedInterpolation()\n",
 | ||
|       "          (gate): Sigmoid()\n",
 | ||
|       "          (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "        )\n",
 | ||
|       "        (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (elu): ELU(alpha=1.0)\n",
 | ||
|       "        (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (gate_norm): GateAddNorm(\n",
 | ||
|       "          (glu): GatedLinearUnit(\n",
 | ||
|       "            (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "            (fc): Linear(in_features=8, out_features=32, bias=True)\n",
 | ||
|       "          )\n",
 | ||
|       "          (add_norm): AddNorm(\n",
 | ||
|       "            (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "          )\n",
 | ||
|       "        )\n",
 | ||
|       "      )\n",
 | ||
|       "      (encoder_length): GatedResidualNetwork(\n",
 | ||
|       "        (resample_norm): ResampleNorm(\n",
 | ||
|       "          (resample): TimeDistributedInterpolation()\n",
 | ||
|       "          (gate): Sigmoid()\n",
 | ||
|       "          (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "        )\n",
 | ||
|       "        (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (elu): ELU(alpha=1.0)\n",
 | ||
|       "        (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (gate_norm): GateAddNorm(\n",
 | ||
|       "          (glu): GatedLinearUnit(\n",
 | ||
|       "            (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "            (fc): Linear(in_features=8, out_features=32, bias=True)\n",
 | ||
|       "          )\n",
 | ||
|       "          (add_norm): AddNorm(\n",
 | ||
|       "            (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "          )\n",
 | ||
|       "        )\n",
 | ||
|       "      )\n",
 | ||
|       "      (y_center): GatedResidualNetwork(\n",
 | ||
|       "        (resample_norm): ResampleNorm(\n",
 | ||
|       "          (resample): TimeDistributedInterpolation()\n",
 | ||
|       "          (gate): Sigmoid()\n",
 | ||
|       "          (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "        )\n",
 | ||
|       "        (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (elu): ELU(alpha=1.0)\n",
 | ||
|       "        (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (gate_norm): GateAddNorm(\n",
 | ||
|       "          (glu): GatedLinearUnit(\n",
 | ||
|       "            (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "            (fc): Linear(in_features=8, out_features=32, bias=True)\n",
 | ||
|       "          )\n",
 | ||
|       "          (add_norm): AddNorm(\n",
 | ||
|       "            (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "          )\n",
 | ||
|       "        )\n",
 | ||
|       "      )\n",
 | ||
|       "      (y_scale): GatedResidualNetwork(\n",
 | ||
|       "        (resample_norm): ResampleNorm(\n",
 | ||
|       "          (resample): TimeDistributedInterpolation()\n",
 | ||
|       "          (gate): Sigmoid()\n",
 | ||
|       "          (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "        )\n",
 | ||
|       "        (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (elu): ELU(alpha=1.0)\n",
 | ||
|       "        (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (gate_norm): GateAddNorm(\n",
 | ||
|       "          (glu): GatedLinearUnit(\n",
 | ||
|       "            (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "            (fc): Linear(in_features=8, out_features=32, bias=True)\n",
 | ||
|       "          )\n",
 | ||
|       "          (add_norm): AddNorm(\n",
 | ||
|       "            (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "          )\n",
 | ||
|       "        )\n",
 | ||
|       "      )\n",
 | ||
|       "    )\n",
 | ||
|       "    (prescalers): ModuleDict(\n",
 | ||
|       "      (avg_population_2017): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "      (avg_yearly_household_income_2017): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "      (encoder_length): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "      (y_center): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "      (y_scale): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "    )\n",
 | ||
|       "    (softmax): Softmax(dim=-1)\n",
 | ||
|       "  )\n",
 | ||
|       "  (encoder_variable_selection): VariableSelectionNetwork(\n",
 | ||
|       "    (flattened_grn): GatedResidualNetwork(\n",
 | ||
|       "      (resample_norm): ResampleNorm(\n",
 | ||
|       "        (resample): TimeDistributedInterpolation()\n",
 | ||
|       "        (gate): Sigmoid()\n",
 | ||
|       "        (norm): LayerNorm((13,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "      )\n",
 | ||
|       "      (fc1): Linear(in_features=100, out_features=13, bias=True)\n",
 | ||
|       "      (elu): ELU(alpha=1.0)\n",
 | ||
|       "      (context): Linear(in_features=16, out_features=13, bias=False)\n",
 | ||
|       "      (fc2): Linear(in_features=13, out_features=13, bias=True)\n",
 | ||
|       "      (gate_norm): GateAddNorm(\n",
 | ||
|       "        (glu): GatedLinearUnit(\n",
 | ||
|       "          (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "          (fc): Linear(in_features=13, out_features=26, bias=True)\n",
 | ||
|       "        )\n",
 | ||
|       "        (add_norm): AddNorm(\n",
 | ||
|       "          (norm): LayerNorm((13,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "        )\n",
 | ||
|       "      )\n",
 | ||
|       "    )\n",
 | ||
|       "    (single_variable_grns): ModuleDict(\n",
 | ||
|       "      (special_days): ResampleNorm(\n",
 | ||
|       "        (resample): TimeDistributedInterpolation()\n",
 | ||
|       "        (gate): Sigmoid()\n",
 | ||
|       "        (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "      )\n",
 | ||
|       "      (month): ResampleNorm(\n",
 | ||
|       "        (resample): TimeDistributedInterpolation()\n",
 | ||
|       "        (gate): Sigmoid()\n",
 | ||
|       "        (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "      )\n",
 | ||
|       "      (time_idx): GatedResidualNetwork(\n",
 | ||
|       "        (resample_norm): ResampleNorm(\n",
 | ||
|       "          (resample): TimeDistributedInterpolation()\n",
 | ||
|       "          (gate): Sigmoid()\n",
 | ||
|       "          (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "        )\n",
 | ||
|       "        (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (elu): ELU(alpha=1.0)\n",
 | ||
|       "        (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (gate_norm): GateAddNorm(\n",
 | ||
|       "          (glu): GatedLinearUnit(\n",
 | ||
|       "            (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "            (fc): Linear(in_features=8, out_features=32, bias=True)\n",
 | ||
|       "          )\n",
 | ||
|       "          (add_norm): AddNorm(\n",
 | ||
|       "            (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "          )\n",
 | ||
|       "        )\n",
 | ||
|       "      )\n",
 | ||
|       "      (price_regular): GatedResidualNetwork(\n",
 | ||
|       "        (resample_norm): ResampleNorm(\n",
 | ||
|       "          (resample): TimeDistributedInterpolation()\n",
 | ||
|       "          (gate): Sigmoid()\n",
 | ||
|       "          (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "        )\n",
 | ||
|       "        (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (elu): ELU(alpha=1.0)\n",
 | ||
|       "        (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (gate_norm): GateAddNorm(\n",
 | ||
|       "          (glu): GatedLinearUnit(\n",
 | ||
|       "            (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "            (fc): Linear(in_features=8, out_features=32, bias=True)\n",
 | ||
|       "          )\n",
 | ||
|       "          (add_norm): AddNorm(\n",
 | ||
|       "            (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "          )\n",
 | ||
|       "        )\n",
 | ||
|       "      )\n",
 | ||
|       "      (discount_in_percent): GatedResidualNetwork(\n",
 | ||
|       "        (resample_norm): ResampleNorm(\n",
 | ||
|       "          (resample): TimeDistributedInterpolation()\n",
 | ||
|       "          (gate): Sigmoid()\n",
 | ||
|       "          (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "        )\n",
 | ||
|       "        (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (elu): ELU(alpha=1.0)\n",
 | ||
|       "        (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (gate_norm): GateAddNorm(\n",
 | ||
|       "          (glu): GatedLinearUnit(\n",
 | ||
|       "            (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "            (fc): Linear(in_features=8, out_features=32, bias=True)\n",
 | ||
|       "          )\n",
 | ||
|       "          (add_norm): AddNorm(\n",
 | ||
|       "            (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "          )\n",
 | ||
|       "        )\n",
 | ||
|       "      )\n",
 | ||
|       "      (relative_time_idx): GatedResidualNetwork(\n",
 | ||
|       "        (resample_norm): ResampleNorm(\n",
 | ||
|       "          (resample): TimeDistributedInterpolation()\n",
 | ||
|       "          (gate): Sigmoid()\n",
 | ||
|       "          (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "        )\n",
 | ||
|       "        (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (elu): ELU(alpha=1.0)\n",
 | ||
|       "        (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (gate_norm): GateAddNorm(\n",
 | ||
|       "          (glu): GatedLinearUnit(\n",
 | ||
|       "            (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "            (fc): Linear(in_features=8, out_features=32, bias=True)\n",
 | ||
|       "          )\n",
 | ||
|       "          (add_norm): AddNorm(\n",
 | ||
|       "            (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "          )\n",
 | ||
|       "        )\n",
 | ||
|       "      )\n",
 | ||
|       "      (y): GatedResidualNetwork(\n",
 | ||
|       "        (resample_norm): ResampleNorm(\n",
 | ||
|       "          (resample): TimeDistributedInterpolation()\n",
 | ||
|       "          (gate): Sigmoid()\n",
 | ||
|       "          (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "        )\n",
 | ||
|       "        (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (elu): ELU(alpha=1.0)\n",
 | ||
|       "        (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (gate_norm): GateAddNorm(\n",
 | ||
|       "          (glu): GatedLinearUnit(\n",
 | ||
|       "            (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "            (fc): Linear(in_features=8, out_features=32, bias=True)\n",
 | ||
|       "          )\n",
 | ||
|       "          (add_norm): AddNorm(\n",
 | ||
|       "            (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "          )\n",
 | ||
|       "        )\n",
 | ||
|       "      )\n",
 | ||
|       "      (log_volume): GatedResidualNetwork(\n",
 | ||
|       "        (resample_norm): ResampleNorm(\n",
 | ||
|       "          (resample): TimeDistributedInterpolation()\n",
 | ||
|       "          (gate): Sigmoid()\n",
 | ||
|       "          (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "        )\n",
 | ||
|       "        (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (elu): ELU(alpha=1.0)\n",
 | ||
|       "        (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (gate_norm): GateAddNorm(\n",
 | ||
|       "          (glu): GatedLinearUnit(\n",
 | ||
|       "            (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "            (fc): Linear(in_features=8, out_features=32, bias=True)\n",
 | ||
|       "          )\n",
 | ||
|       "          (add_norm): AddNorm(\n",
 | ||
|       "            (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "          )\n",
 | ||
|       "        )\n",
 | ||
|       "      )\n",
 | ||
|       "      (industry_volume): GatedResidualNetwork(\n",
 | ||
|       "        (resample_norm): ResampleNorm(\n",
 | ||
|       "          (resample): TimeDistributedInterpolation()\n",
 | ||
|       "          (gate): Sigmoid()\n",
 | ||
|       "          (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "        )\n",
 | ||
|       "        (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (elu): ELU(alpha=1.0)\n",
 | ||
|       "        (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (gate_norm): GateAddNorm(\n",
 | ||
|       "          (glu): GatedLinearUnit(\n",
 | ||
|       "            (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "            (fc): Linear(in_features=8, out_features=32, bias=True)\n",
 | ||
|       "          )\n",
 | ||
|       "          (add_norm): AddNorm(\n",
 | ||
|       "            (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "          )\n",
 | ||
|       "        )\n",
 | ||
|       "      )\n",
 | ||
|       "      (soda_volume): GatedResidualNetwork(\n",
 | ||
|       "        (resample_norm): ResampleNorm(\n",
 | ||
|       "          (resample): TimeDistributedInterpolation()\n",
 | ||
|       "          (gate): Sigmoid()\n",
 | ||
|       "          (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "        )\n",
 | ||
|       "        (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (elu): ELU(alpha=1.0)\n",
 | ||
|       "        (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (gate_norm): GateAddNorm(\n",
 | ||
|       "          (glu): GatedLinearUnit(\n",
 | ||
|       "            (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "            (fc): Linear(in_features=8, out_features=32, bias=True)\n",
 | ||
|       "          )\n",
 | ||
|       "          (add_norm): AddNorm(\n",
 | ||
|       "            (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "          )\n",
 | ||
|       "        )\n",
 | ||
|       "      )\n",
 | ||
|       "      (avg_max_temp): GatedResidualNetwork(\n",
 | ||
|       "        (resample_norm): ResampleNorm(\n",
 | ||
|       "          (resample): TimeDistributedInterpolation()\n",
 | ||
|       "          (gate): Sigmoid()\n",
 | ||
|       "          (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "        )\n",
 | ||
|       "        (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (elu): ELU(alpha=1.0)\n",
 | ||
|       "        (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (gate_norm): GateAddNorm(\n",
 | ||
|       "          (glu): GatedLinearUnit(\n",
 | ||
|       "            (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "            (fc): Linear(in_features=8, out_features=32, bias=True)\n",
 | ||
|       "          )\n",
 | ||
|       "          (add_norm): AddNorm(\n",
 | ||
|       "            (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "          )\n",
 | ||
|       "        )\n",
 | ||
|       "      )\n",
 | ||
|       "      (avg_volume_by_agency): GatedResidualNetwork(\n",
 | ||
|       "        (resample_norm): ResampleNorm(\n",
 | ||
|       "          (resample): TimeDistributedInterpolation()\n",
 | ||
|       "          (gate): Sigmoid()\n",
 | ||
|       "          (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "        )\n",
 | ||
|       "        (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (elu): ELU(alpha=1.0)\n",
 | ||
|       "        (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (gate_norm): GateAddNorm(\n",
 | ||
|       "          (glu): GatedLinearUnit(\n",
 | ||
|       "            (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "            (fc): Linear(in_features=8, out_features=32, bias=True)\n",
 | ||
|       "          )\n",
 | ||
|       "          (add_norm): AddNorm(\n",
 | ||
|       "            (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "          )\n",
 | ||
|       "        )\n",
 | ||
|       "      )\n",
 | ||
|       "      (avg_volume_by_sku): GatedResidualNetwork(\n",
 | ||
|       "        (resample_norm): ResampleNorm(\n",
 | ||
|       "          (resample): TimeDistributedInterpolation()\n",
 | ||
|       "          (gate): Sigmoid()\n",
 | ||
|       "          (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "        )\n",
 | ||
|       "        (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (elu): ELU(alpha=1.0)\n",
 | ||
|       "        (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (gate_norm): GateAddNorm(\n",
 | ||
|       "          (glu): GatedLinearUnit(\n",
 | ||
|       "            (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "            (fc): Linear(in_features=8, out_features=32, bias=True)\n",
 | ||
|       "          )\n",
 | ||
|       "          (add_norm): AddNorm(\n",
 | ||
|       "            (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "          )\n",
 | ||
|       "        )\n",
 | ||
|       "      )\n",
 | ||
|       "    )\n",
 | ||
|       "    (prescalers): ModuleDict(\n",
 | ||
|       "      (time_idx): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "      (price_regular): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "      (discount_in_percent): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "      (relative_time_idx): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "      (y): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "      (log_volume): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "      (industry_volume): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "      (soda_volume): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "      (avg_max_temp): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "      (avg_volume_by_agency): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "      (avg_volume_by_sku): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "    )\n",
 | ||
|       "    (softmax): Softmax(dim=-1)\n",
 | ||
|       "  )\n",
 | ||
|       "  (decoder_variable_selection): VariableSelectionNetwork(\n",
 | ||
|       "    (flattened_grn): GatedResidualNetwork(\n",
 | ||
|       "      (resample_norm): ResampleNorm(\n",
 | ||
|       "        (resample): TimeDistributedInterpolation()\n",
 | ||
|       "        (gate): Sigmoid()\n",
 | ||
|       "        (norm): LayerNorm((6,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "      )\n",
 | ||
|       "      (fc1): Linear(in_features=44, out_features=6, bias=True)\n",
 | ||
|       "      (elu): ELU(alpha=1.0)\n",
 | ||
|       "      (context): Linear(in_features=16, out_features=6, bias=False)\n",
 | ||
|       "      (fc2): Linear(in_features=6, out_features=6, bias=True)\n",
 | ||
|       "      (gate_norm): GateAddNorm(\n",
 | ||
|       "        (glu): GatedLinearUnit(\n",
 | ||
|       "          (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "          (fc): Linear(in_features=6, out_features=12, bias=True)\n",
 | ||
|       "        )\n",
 | ||
|       "        (add_norm): AddNorm(\n",
 | ||
|       "          (norm): LayerNorm((6,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "        )\n",
 | ||
|       "      )\n",
 | ||
|       "    )\n",
 | ||
|       "    (single_variable_grns): ModuleDict(\n",
 | ||
|       "      (special_days): ResampleNorm(\n",
 | ||
|       "        (resample): TimeDistributedInterpolation()\n",
 | ||
|       "        (gate): Sigmoid()\n",
 | ||
|       "        (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "      )\n",
 | ||
|       "      (month): ResampleNorm(\n",
 | ||
|       "        (resample): TimeDistributedInterpolation()\n",
 | ||
|       "        (gate): Sigmoid()\n",
 | ||
|       "        (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "      )\n",
 | ||
|       "      (time_idx): GatedResidualNetwork(\n",
 | ||
|       "        (resample_norm): ResampleNorm(\n",
 | ||
|       "          (resample): TimeDistributedInterpolation()\n",
 | ||
|       "          (gate): Sigmoid()\n",
 | ||
|       "          (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "        )\n",
 | ||
|       "        (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (elu): ELU(alpha=1.0)\n",
 | ||
|       "        (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (gate_norm): GateAddNorm(\n",
 | ||
|       "          (glu): GatedLinearUnit(\n",
 | ||
|       "            (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "            (fc): Linear(in_features=8, out_features=32, bias=True)\n",
 | ||
|       "          )\n",
 | ||
|       "          (add_norm): AddNorm(\n",
 | ||
|       "            (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "          )\n",
 | ||
|       "        )\n",
 | ||
|       "      )\n",
 | ||
|       "      (price_regular): GatedResidualNetwork(\n",
 | ||
|       "        (resample_norm): ResampleNorm(\n",
 | ||
|       "          (resample): TimeDistributedInterpolation()\n",
 | ||
|       "          (gate): Sigmoid()\n",
 | ||
|       "          (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "        )\n",
 | ||
|       "        (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (elu): ELU(alpha=1.0)\n",
 | ||
|       "        (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (gate_norm): GateAddNorm(\n",
 | ||
|       "          (glu): GatedLinearUnit(\n",
 | ||
|       "            (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "            (fc): Linear(in_features=8, out_features=32, bias=True)\n",
 | ||
|       "          )\n",
 | ||
|       "          (add_norm): AddNorm(\n",
 | ||
|       "            (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "          )\n",
 | ||
|       "        )\n",
 | ||
|       "      )\n",
 | ||
|       "      (discount_in_percent): GatedResidualNetwork(\n",
 | ||
|       "        (resample_norm): ResampleNorm(\n",
 | ||
|       "          (resample): TimeDistributedInterpolation()\n",
 | ||
|       "          (gate): Sigmoid()\n",
 | ||
|       "          (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "        )\n",
 | ||
|       "        (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (elu): ELU(alpha=1.0)\n",
 | ||
|       "        (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (gate_norm): GateAddNorm(\n",
 | ||
|       "          (glu): GatedLinearUnit(\n",
 | ||
|       "            (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "            (fc): Linear(in_features=8, out_features=32, bias=True)\n",
 | ||
|       "          )\n",
 | ||
|       "          (add_norm): AddNorm(\n",
 | ||
|       "            (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "          )\n",
 | ||
|       "        )\n",
 | ||
|       "      )\n",
 | ||
|       "      (relative_time_idx): GatedResidualNetwork(\n",
 | ||
|       "        (resample_norm): ResampleNorm(\n",
 | ||
|       "          (resample): TimeDistributedInterpolation()\n",
 | ||
|       "          (gate): Sigmoid()\n",
 | ||
|       "          (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "        )\n",
 | ||
|       "        (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (elu): ELU(alpha=1.0)\n",
 | ||
|       "        (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
 | ||
|       "        (gate_norm): GateAddNorm(\n",
 | ||
|       "          (glu): GatedLinearUnit(\n",
 | ||
|       "            (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "            (fc): Linear(in_features=8, out_features=32, bias=True)\n",
 | ||
|       "          )\n",
 | ||
|       "          (add_norm): AddNorm(\n",
 | ||
|       "            (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "          )\n",
 | ||
|       "        )\n",
 | ||
|       "      )\n",
 | ||
|       "    )\n",
 | ||
|       "    (prescalers): ModuleDict(\n",
 | ||
|       "      (time_idx): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "      (price_regular): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "      (discount_in_percent): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "      (relative_time_idx): Linear(in_features=1, out_features=8, bias=True)\n",
 | ||
|       "    )\n",
 | ||
|       "    (softmax): Softmax(dim=-1)\n",
 | ||
|       "  )\n",
 | ||
|       "  (static_context_variable_selection): GatedResidualNetwork(\n",
 | ||
|       "    (fc1): Linear(in_features=16, out_features=16, bias=True)\n",
 | ||
|       "    (elu): ELU(alpha=1.0)\n",
 | ||
|       "    (fc2): Linear(in_features=16, out_features=16, bias=True)\n",
 | ||
|       "    (gate_norm): GateAddNorm(\n",
 | ||
|       "      (glu): GatedLinearUnit(\n",
 | ||
|       "        (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "        (fc): Linear(in_features=16, out_features=32, bias=True)\n",
 | ||
|       "      )\n",
 | ||
|       "      (add_norm): AddNorm(\n",
 | ||
|       "        (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "      )\n",
 | ||
|       "    )\n",
 | ||
|       "  )\n",
 | ||
|       "  (static_context_initial_hidden_lstm): GatedResidualNetwork(\n",
 | ||
|       "    (fc1): Linear(in_features=16, out_features=16, bias=True)\n",
 | ||
|       "    (elu): ELU(alpha=1.0)\n",
 | ||
|       "    (fc2): Linear(in_features=16, out_features=16, bias=True)\n",
 | ||
|       "    (gate_norm): GateAddNorm(\n",
 | ||
|       "      (glu): GatedLinearUnit(\n",
 | ||
|       "        (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "        (fc): Linear(in_features=16, out_features=32, bias=True)\n",
 | ||
|       "      )\n",
 | ||
|       "      (add_norm): AddNorm(\n",
 | ||
|       "        (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "      )\n",
 | ||
|       "    )\n",
 | ||
|       "  )\n",
 | ||
|       "  (static_context_initial_cell_lstm): GatedResidualNetwork(\n",
 | ||
|       "    (fc1): Linear(in_features=16, out_features=16, bias=True)\n",
 | ||
|       "    (elu): ELU(alpha=1.0)\n",
 | ||
|       "    (fc2): Linear(in_features=16, out_features=16, bias=True)\n",
 | ||
|       "    (gate_norm): GateAddNorm(\n",
 | ||
|       "      (glu): GatedLinearUnit(\n",
 | ||
|       "        (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "        (fc): Linear(in_features=16, out_features=32, bias=True)\n",
 | ||
|       "      )\n",
 | ||
|       "      (add_norm): AddNorm(\n",
 | ||
|       "        (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "      )\n",
 | ||
|       "    )\n",
 | ||
|       "  )\n",
 | ||
|       "  (static_context_enrichment): GatedResidualNetwork(\n",
 | ||
|       "    (fc1): Linear(in_features=16, out_features=16, bias=True)\n",
 | ||
|       "    (elu): ELU(alpha=1.0)\n",
 | ||
|       "    (fc2): Linear(in_features=16, out_features=16, bias=True)\n",
 | ||
|       "    (gate_norm): GateAddNorm(\n",
 | ||
|       "      (glu): GatedLinearUnit(\n",
 | ||
|       "        (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "        (fc): Linear(in_features=16, out_features=32, bias=True)\n",
 | ||
|       "      )\n",
 | ||
|       "      (add_norm): AddNorm(\n",
 | ||
|       "        (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "      )\n",
 | ||
|       "    )\n",
 | ||
|       "  )\n",
 | ||
|       "  (lstm_encoder): LSTM(16, 16, num_layers=2, batch_first=True, dropout=0.1)\n",
 | ||
|       "  (lstm_decoder): LSTM(16, 16, num_layers=2, batch_first=True, dropout=0.1)\n",
 | ||
|       "  (post_lstm_gate_encoder): GatedLinearUnit(\n",
 | ||
|       "    (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "    (fc): Linear(in_features=16, out_features=32, bias=True)\n",
 | ||
|       "  )\n",
 | ||
|       "  (post_lstm_gate_decoder): GatedLinearUnit(\n",
 | ||
|       "    (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "    (fc): Linear(in_features=16, out_features=32, bias=True)\n",
 | ||
|       "  )\n",
 | ||
|       "  (post_lstm_add_norm_encoder): AddNorm(\n",
 | ||
|       "    (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "  )\n",
 | ||
|       "  (post_lstm_add_norm_decoder): AddNorm(\n",
 | ||
|       "    (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "  )\n",
 | ||
|       "  (static_enrichment): GatedResidualNetwork(\n",
 | ||
|       "    (fc1): Linear(in_features=16, out_features=16, bias=True)\n",
 | ||
|       "    (elu): ELU(alpha=1.0)\n",
 | ||
|       "    (context): Linear(in_features=16, out_features=16, bias=False)\n",
 | ||
|       "    (fc2): Linear(in_features=16, out_features=16, bias=True)\n",
 | ||
|       "    (gate_norm): GateAddNorm(\n",
 | ||
|       "      (glu): GatedLinearUnit(\n",
 | ||
|       "        (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "        (fc): Linear(in_features=16, out_features=32, bias=True)\n",
 | ||
|       "      )\n",
 | ||
|       "      (add_norm): AddNorm(\n",
 | ||
|       "        (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "      )\n",
 | ||
|       "    )\n",
 | ||
|       "  )\n",
 | ||
|       "  (multihead_attn): InterpretableMultiHeadAttention(\n",
 | ||
|       "    (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "    (v_layer): Linear(in_features=16, out_features=4, bias=True)\n",
 | ||
|       "    (q_layers): ModuleList(\n",
 | ||
|       "      (0): Linear(in_features=16, out_features=4, bias=True)\n",
 | ||
|       "      (1): Linear(in_features=16, out_features=4, bias=True)\n",
 | ||
|       "      (2): Linear(in_features=16, out_features=4, bias=True)\n",
 | ||
|       "      (3): Linear(in_features=16, out_features=4, bias=True)\n",
 | ||
|       "    )\n",
 | ||
|       "    (k_layers): ModuleList(\n",
 | ||
|       "      (0): Linear(in_features=16, out_features=4, bias=True)\n",
 | ||
|       "      (1): Linear(in_features=16, out_features=4, bias=True)\n",
 | ||
|       "      (2): Linear(in_features=16, out_features=4, bias=True)\n",
 | ||
|       "      (3): Linear(in_features=16, out_features=4, bias=True)\n",
 | ||
|       "    )\n",
 | ||
|       "    (attention): ScaledDotProductAttention(\n",
 | ||
|       "      (softmax): Softmax(dim=2)\n",
 | ||
|       "    )\n",
 | ||
|       "    (w_h): Linear(in_features=4, out_features=16, bias=False)\n",
 | ||
|       "  )\n",
 | ||
|       "  (post_attn_gate_norm): GateAddNorm(\n",
 | ||
|       "    (glu): GatedLinearUnit(\n",
 | ||
|       "      (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "      (fc): Linear(in_features=16, out_features=32, bias=True)\n",
 | ||
|       "    )\n",
 | ||
|       "    (add_norm): AddNorm(\n",
 | ||
|       "      (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "    )\n",
 | ||
|       "  )\n",
 | ||
|       "  (pos_wise_ff): GatedResidualNetwork(\n",
 | ||
|       "    (fc1): Linear(in_features=16, out_features=16, bias=True)\n",
 | ||
|       "    (elu): ELU(alpha=1.0)\n",
 | ||
|       "    (fc2): Linear(in_features=16, out_features=16, bias=True)\n",
 | ||
|       "    (gate_norm): GateAddNorm(\n",
 | ||
|       "      (glu): GatedLinearUnit(\n",
 | ||
|       "        (dropout): Dropout(p=0.1, inplace=False)\n",
 | ||
|       "        (fc): Linear(in_features=16, out_features=32, bias=True)\n",
 | ||
|       "      )\n",
 | ||
|       "      (add_norm): AddNorm(\n",
 | ||
|       "        (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "      )\n",
 | ||
|       "    )\n",
 | ||
|       "  )\n",
 | ||
|       "  (pre_output_gate_norm): GateAddNorm(\n",
 | ||
|       "    (glu): GatedLinearUnit(\n",
 | ||
|       "      (fc): Linear(in_features=16, out_features=32, bias=True)\n",
 | ||
|       "    )\n",
 | ||
|       "    (add_norm): AddNorm(\n",
 | ||
|       "      (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
 | ||
|       "    )\n",
 | ||
|       "  )\n",
 | ||
|       "  (output_layer): Linear(in_features=16, out_features=7, bias=True)\n",
 | ||
|       ")\n",
 | ||
|       "[flaml.automl: 07-28 22:08:05] {2725} INFO - fit succeeded\n",
 | ||
|       "[flaml.automl: 07-28 22:08:05] {2726} INFO - Time taken to find the best model: 1242.6435902118683\n",
 | ||
|       "[flaml.automl: 07-28 22:08:05] {2737} WARNING - Time taken to find the best model is 414% of the provided time budget and not all estimators' hyperparameter search converged. Consider increasing the time budget.\n"
 | ||
|      ]
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "from flaml import AutoML\n",
 | ||
|     "automl = AutoML()\n",
 | ||
|     "settings = {\n",
 | ||
|     "    \"time_budget\": 300,  # total running time in seconds\n",
 | ||
|     "    \"metric\": \"mape\",  # primary metric\n",
 | ||
|     "    \"task\": \"ts_forecast_panel\",  # task type\n",
 | ||
|     "    \"log_file_name\": \"stallion_forecast.log\",  # flaml log file\n",
 | ||
|     "    \"eval_method\": \"holdout\",\n",
 | ||
|     "}\n",
 | ||
|     "fit_kwargs_by_estimator = {\n",
 | ||
|     "    \"tft\": {\n",
 | ||
|     "        \"max_encoder_length\": 24,\n",
 | ||
|     "        \"static_categoricals\": [\"agency\", \"sku\"],\n",
 | ||
|     "        \"static_reals\": [\"avg_population_2017\", \"avg_yearly_household_income_2017\"],\n",
 | ||
|     "        \"time_varying_known_categoricals\": [\"special_days\", \"month\"],\n",
 | ||
|     "        \"variable_groups\": {\n",
 | ||
|     "            \"special_days\": special_days\n",
 | ||
|     "        },  # group of categorical variables can be treated as one variable\n",
 | ||
|     "        \"time_varying_known_reals\": [\n",
 | ||
|     "            \"time_idx\",\n",
 | ||
|     "            \"price_regular\",\n",
 | ||
|     "            \"discount_in_percent\",\n",
 | ||
|     "        ],\n",
 | ||
|     "        \"time_varying_unknown_categoricals\": [],\n",
 | ||
|     "        \"time_varying_unknown_reals\": [\n",
 | ||
|     "            \"y\",  # always need a 'y' column for the target column\n",
 | ||
|     "            \"log_volume\",\n",
 | ||
|     "            \"industry_volume\",\n",
 | ||
|     "            \"soda_volume\",\n",
 | ||
|     "            \"avg_max_temp\",\n",
 | ||
|     "            \"avg_volume_by_agency\",\n",
 | ||
|     "            \"avg_volume_by_sku\",\n",
 | ||
|     "        ],\n",
 | ||
|     "        \"batch_size\": 128,\n",
 | ||
|     "        \"gpu_per_trial\": -1,\n",
 | ||
|     "    }\n",
 | ||
|     "}\n",
 | ||
|     "\"\"\"The main flaml automl API\"\"\"\n",
 | ||
|     "automl.fit(\n",
 | ||
|     "    X_train=X_train,\n",
 | ||
|     "    y_train=y_train,\n",
 | ||
|     "    **settings,\n",
 | ||
|     "    period=time_horizon,\n",
 | ||
|     "    group_ids=[\"agency\", \"sku\"],\n",
 | ||
|     "    fit_kwargs_by_estimator=fit_kwargs_by_estimator,\n",
 | ||
|     ")"
 | ||
|    ]
 | ||
|   },
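  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a quick check after `fit()`, the tuned result can be read back through FLAML's standard attributes (`best_estimator`, `best_config`, `best_loss`, `best_config_train_time`). The next cell is a minimal sketch that only inspects what the run above produced."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "\"\"\" retrieve the best learner and configuration found above \"\"\"\n",
    "print(\"Best ML learner:\", automl.best_estimator)\n",
    "print(\"Best hyperparameter config:\", automl.best_config)\n",
    "print(\"Best mape on validation data: {0:.4g}\".format(automl.best_loss))\n",
    "print(\"Training duration of best run: {0:.4g} s\".format(automl.best_config_train_time))"
   ]
  },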
 | ||
|   {
 | ||
|    "cell_type": "markdown",
 | ||
|    "metadata": {},
 | ||
|    "source": [
 | ||
|     "### Prediction and Metrics"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 9,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "name": "stdout",
 | ||
|      "output_type": "stream",
 | ||
|      "text": [
 | ||
|       "17156    59.292\n",
 | ||
|       "18946    66.420\n",
 | ||
|       "20680    95.904\n",
 | ||
|       "3189     52.812\n",
 | ||
|       "4954     37.908\n",
 | ||
|       "          ...  \n",
 | ||
|       "19207     1.980\n",
 | ||
|       "20996     1.260\n",
 | ||
|       "3499      0.990\n",
 | ||
|       "5248      0.090\n",
 | ||
|       "6793      2.250\n",
 | ||
|       "Name: volume, Length: 2100, dtype: float64\n",
 | ||
|       "Agency_01  SKU_01  2017-07-01  2017-07-01    77.331932\n",
 | ||
|       "                   2017-08-01  2017-08-01    71.502121\n",
 | ||
|       "                   2017-09-01  2017-09-01    88.353912\n",
 | ||
|       "                   2017-10-01  2017-10-01    60.969868\n",
 | ||
|       "                   2017-11-01  2017-11-01    60.205246\n",
 | ||
|       "                                               ...    \n",
 | ||
|       "Agency_60  SKU_23  2017-08-01  2017-08-01     1.713270\n",
 | ||
|       "                   2017-09-01  2017-09-01     1.513947\n",
 | ||
|       "                   2017-10-01  2017-10-01     0.993663\n",
 | ||
|       "                   2017-11-01  2017-11-01     1.144696\n",
 | ||
|       "                   2017-12-01  2017-12-01     1.989883\n",
 | ||
|       "Length: 2100, dtype: float32\n"
 | ||
|      ]
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "\"\"\" compute predictions of testing dataset \"\"\"\n",
 | ||
|     "y_pred = automl.predict(X_test)\n",
 | ||
|     "print(y_test)\n",
 | ||
|     "print(y_pred)"
 | ||
|    ]
 | ||
|   },
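  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "`predict` returns a series indexed by group and date, while `y_test` keeps its original integer index. Assuming the two come back in the same row order (the metric cells below rely on the same positional alignment), a quick side-by-side view can be built as sketched here."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "\"\"\" positional side-by-side view of ground truth and predictions (illustrative) \"\"\"\n",
    "import pandas as pd\n",
    "\n",
    "comparison = pd.DataFrame({\"y_test\": y_test.to_numpy(), \"y_pred\": y_pred.to_numpy()})\n",
    "print(comparison.head())"
   ]
  },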
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 10,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "name": "stdout",
 | ||
|      "output_type": "stream",
 | ||
|      "text": [
 | ||
|       "mape = 2743417592614313.0\n",
 | ||
|       "smape = 52.37\n"
 | ||
|      ]
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "\"\"\" compute different metric values on testing dataset\"\"\"\n",
 | ||
|     "from flaml.ml import sklearn_metric_loss_score\n",
 | ||
|     "print(\"mape\", \"=\", sklearn_metric_loss_score(\"mape\", y_pred, y_test))\n",
 | ||
|     "\n",
 | ||
|     "def smape(y_pred, y_test):\n",
 | ||
|     "    import numpy as np\n",
 | ||
|     "\n",
 | ||
|     "    y_test, y_pred = np.array(y_test), np.array(y_pred)\n",
 | ||
|     "    return round(\n",
 | ||
|     "        np.mean(\n",
 | ||
|     "            np.abs(y_pred - y_test) /\n",
 | ||
|     "            ((np.abs(y_pred) + np.abs(y_test)) / 2)\n",
 | ||
|     "        ) * 100, 2\n",
 | ||
|     "    )\n",
 | ||
|     "\n",
 | ||
|     "print(\"smape\", \"=\", smape(y_pred, y_test))"
 | ||
|    ]
 | ||
|   },
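  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A note on the two scores: `mape` is the mean absolute percentage error, $\\mathrm{MAPE} = \\frac{1}{n}\\sum_{i=1}^{n}\\left|\\frac{y_i - \\hat{y}_i}{y_i}\\right|$, while the helper above computes the symmetric variant $\\mathrm{SMAPE} = \\frac{100}{n}\\sum_{i=1}^{n}\\frac{|\\hat{y}_i - y_i|}{(|y_i| + |\\hat{y}_i|)/2}$. The extremely large MAPE printed above is almost certainly caused by zero (or near-zero) volumes in `y_test`, which blow up the $1/y_i$ term; that is why the more robust SMAPE is reported as well."
   ]
  },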
 | ||
|   {
 | ||
|    "cell_type": "markdown",
 | ||
|    "metadata": {},
 | ||
|    "source": [
 | ||
|     "## 6. Comparison with Alternatives (CO2 Dataset)"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "markdown",
 | ||
|    "metadata": {},
 | ||
|    "source": [
 | ||
|     "FLAML's MAPE"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 33,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "name": "stdout",
 | ||
|      "output_type": "stream",
 | ||
|      "text": [
 | ||
|       "flaml mape = 0.0005710586398294955\n"
 | ||
|      ]
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "from flaml.ml import sklearn_metric_loss_score\n",
 | ||
|     "print('flaml mape', '=', sklearn_metric_loss_score('mape', flaml_y_pred, y_test))"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "markdown",
 | ||
|    "metadata": {},
 | ||
|    "source": [
 | ||
|     "### Default Prophet"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 34,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [],
 | ||
|    "source": [
 | ||
|     "from prophet import Prophet\n",
 | ||
|     "prophet_model = Prophet()"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 35,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "data": {
 | ||
|       "text/plain": [
 | ||
|        "<prophet.forecaster.Prophet at 0x1e2d990d7c0>"
 | ||
|       ]
 | ||
|      },
 | ||
|      "execution_count": 35,
 | ||
|      "metadata": {},
 | ||
|      "output_type": "execute_result"
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "X_train_prophet = train_df.copy()\n",
 | ||
|     "X_train_prophet = X_train_prophet.rename(columns={'index': 'ds', 'co2': 'y'})\n",
 | ||
|     "prophet_model.fit(X_train_prophet)"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 36,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "name": "stdout",
 | ||
|      "output_type": "stream",
 | ||
|      "text": [
 | ||
|       "Predicted labels 0     370.450675\n",
 | ||
|       "1     371.177764\n",
 | ||
|       "2     372.229577\n",
 | ||
|       "3     373.419835\n",
 | ||
|       "4     373.914917\n",
 | ||
|       "5     373.406484\n",
 | ||
|       "6     372.053428\n",
 | ||
|       "7     370.149037\n",
 | ||
|       "8     368.566631\n",
 | ||
|       "9     368.646853\n",
 | ||
|       "10    369.863891\n",
 | ||
|       "11    371.135959\n",
 | ||
|       "Name: yhat, dtype: float64\n",
 | ||
|       "True labels 514    370.175\n",
 | ||
|       "515    371.325\n",
 | ||
|       "516    372.060\n",
 | ||
|       "517    372.775\n",
 | ||
|       "518    373.800\n",
 | ||
|       "519    373.060\n",
 | ||
|       "520    371.300\n",
 | ||
|       "521    369.425\n",
 | ||
|       "522    367.880\n",
 | ||
|       "523    368.050\n",
 | ||
|       "524    369.375\n",
 | ||
|       "525    371.020\n",
 | ||
|       "Name: co2, dtype: float64\n"
 | ||
|      ]
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "X_test_prophet = X_test.copy()\n",
 | ||
|     "X_test_prophet = X_test_prophet.rename(columns={'index': 'ds'})\n",
 | ||
|     "prophet_y_pred = prophet_model.predict(X_test_prophet)['yhat']\n",
 | ||
|     "print('Predicted labels', prophet_y_pred)\n",
 | ||
|     "print('True labels', y_test)"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "markdown",
 | ||
|    "metadata": {},
 | ||
|    "source": [
 | ||
|     "Default Prophet MAPE"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 37,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "name": "stdout",
 | ||
|      "output_type": "stream",
 | ||
|      "text": [
 | ||
|       "default prophet mape = 0.0011396920680673015\n"
 | ||
|      ]
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "from flaml.ml import sklearn_metric_loss_score\n",
 | ||
|     "print('default prophet mape', '=', sklearn_metric_loss_score('mape', prophet_y_pred, y_test))"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "markdown",
 | ||
|    "metadata": {},
 | ||
|    "source": [
 | ||
|     "### Auto ARIMA Models"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 38,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [],
 | ||
|    "source": [
 | ||
|     "from pmdarima.arima import auto_arima\n",
 | ||
|     "import pandas as pd\n",
 | ||
|     "import time\n",
 | ||
|     "\n",
 | ||
|     "X_train_arima = train_df.copy()\n",
 | ||
|     "X_train_arima.index = pd.to_datetime(X_train_arima['index'])\n",
 | ||
|     "X_train_arima = X_train_arima.drop('index', axis=1)\n",
 | ||
|     "X_train_arima = X_train_arima.rename(columns={'co2': 'y'})"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 39,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "name": "stdout",
 | ||
|      "output_type": "stream",
 | ||
|      "text": [
 | ||
|       " ARIMA(0,1,0)(0,0,0)[0] intercept   : AIC=1638.009, Time=0.02 sec\n",
 | ||
|       " ARIMA(0,1,1)(0,0,0)[0] intercept   : AIC=1344.207, Time=0.09 sec\n",
 | ||
|       " ARIMA(0,1,2)(0,0,0)[0] intercept   : AIC=1222.286, Time=0.14 sec\n",
 | ||
|       " ARIMA(0,1,3)(0,0,0)[0] intercept   : AIC=1174.928, Time=0.20 sec\n",
 | ||
|       " ARIMA(0,1,4)(0,0,0)[0] intercept   : AIC=1188.947, Time=0.43 sec\n",
 | ||
|       " ARIMA(0,1,5)(0,0,0)[0] intercept   : AIC=1091.452, Time=0.55 sec\n",
 | ||
|       " ARIMA(1,1,0)(0,0,0)[0] intercept   : AIC=1298.693, Time=0.08 sec\n",
 | ||
|       " ARIMA(1,1,1)(0,0,0)[0] intercept   : AIC=1240.963, Time=0.12 sec\n",
 | ||
|       " ARIMA(1,1,2)(0,0,0)[0] intercept   : AIC=1196.535, Time=0.19 sec\n",
 | ||
|       " ARIMA(1,1,3)(0,0,0)[0] intercept   : AIC=1176.484, Time=0.34 sec\n",
 | ||
|       " ARIMA(1,1,4)(0,0,0)[0] intercept   : AIC=inf, Time=1.18 sec\n",
 | ||
|       " ARIMA(2,1,0)(0,0,0)[0] intercept   : AIC=1180.404, Time=0.08 sec\n",
 | ||
|       " ARIMA(2,1,1)(0,0,0)[0] intercept   : AIC=990.719, Time=0.26 sec\n",
 | ||
|       " ARIMA(2,1,2)(0,0,0)[0] intercept   : AIC=988.094, Time=0.53 sec\n",
 | ||
|       " ARIMA(2,1,3)(0,0,0)[0] intercept   : AIC=1140.469, Time=0.53 sec\n",
 | ||
|       " ARIMA(3,1,0)(0,0,0)[0] intercept   : AIC=1126.139, Time=0.21 sec\n",
 | ||
|       " ARIMA(3,1,1)(0,0,0)[0] intercept   : AIC=989.496, Time=0.51 sec\n",
 | ||
|       " ARIMA(3,1,2)(0,0,0)[0] intercept   : AIC=991.558, Time=1.17 sec\n",
 | ||
|       " ARIMA(4,1,0)(0,0,0)[0] intercept   : AIC=1125.025, Time=0.19 sec\n",
 | ||
|       " ARIMA(4,1,1)(0,0,0)[0] intercept   : AIC=988.660, Time=0.98 sec\n",
 | ||
|       " ARIMA(5,1,0)(0,0,0)[0] intercept   : AIC=1113.673, Time=0.22 sec\n",
 | ||
|       "\n",
 | ||
|       "Best model:  ARIMA(2,1,2)(0,0,0)[0] intercept\n",
 | ||
|       "Total fit time: 8.039 seconds\n"
 | ||
|      ]
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "# use same search space as FLAML\n",
 | ||
|     "start_time = time.time()\n",
 | ||
|     "arima_model = auto_arima(X_train_arima,\n",
 | ||
|     "                            start_p=2, d=None, start_q=1, max_p=10, max_d=10, max_q=10,\n",
 | ||
|     "                            suppress_warnings=True, stepwise=False, seasonal=False,\n",
 | ||
|     "                            error_action='ignore', trace=True, n_fits=650)\n",
 | ||
|     "autoarima_y_pred = arima_model.predict(n_periods=12)\n",
 | ||
|     "arima_time = time.time() - start_time"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 40,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "name": "stdout",
 | ||
|      "output_type": "stream",
 | ||
|      "text": [
 | ||
|       " ARIMA(0,1,0)(0,0,0)[12] intercept   : AIC=1638.009, Time=0.02 sec\n",
 | ||
|       " ARIMA(0,1,0)(0,0,1)[12] intercept   : AIC=1238.943, Time=0.23 sec\n",
 | ||
|       " ARIMA(0,1,0)(0,0,2)[12] intercept   : AIC=1040.890, Time=0.53 sec\n",
 | ||
|       " ARIMA(0,1,0)(0,0,3)[12] intercept   : AIC=911.545, Time=1.76 sec\n",
 | ||
|       " ARIMA(0,1,0)(0,0,4)[12] intercept   : AIC=823.103, Time=3.18 sec\n",
 | ||
|       " ARIMA(0,1,0)(0,0,5)[12] intercept   : AIC=792.850, Time=5.99 sec\n",
 | ||
|       " ARIMA(0,1,0)(1,0,0)[12] intercept   : AIC=inf, Time=0.26 sec\n",
 | ||
|       " ARIMA(0,1,0)(1,0,1)[12] intercept   : AIC=inf, Time=1.37 sec\n",
 | ||
|       " ARIMA(0,1,0)(1,0,2)[12] intercept   : AIC=inf, Time=2.60 sec\n",
 | ||
|       " ARIMA(0,1,0)(1,0,3)[12] intercept   : AIC=447.302, Time=5.94 sec\n",
 | ||
|       " ARIMA(0,1,0)(1,0,4)[12] intercept   : AIC=inf, Time=11.23 sec\n",
 | ||
|       " ARIMA(0,1,0)(2,0,0)[12] intercept   : AIC=inf, Time=1.10 sec\n",
 | ||
|       " ARIMA(0,1,0)(2,0,1)[12] intercept   : AIC=inf, Time=2.37 sec\n",
 | ||
|       " ARIMA(0,1,0)(2,0,2)[12] intercept   : AIC=inf, Time=2.75 sec\n",
 | ||
|       " ARIMA(0,1,0)(2,0,3)[12] intercept   : AIC=427.135, Time=7.49 sec\n",
 | ||
|       " ARIMA(0,1,0)(3,0,0)[12] intercept   : AIC=inf, Time=3.56 sec\n",
 | ||
|       " ARIMA(0,1,0)(3,0,1)[12] intercept   : AIC=424.286, Time=6.44 sec\n",
 | ||
|       " ARIMA(0,1,0)(3,0,2)[12] intercept   : AIC=431.435, Time=6.86 sec\n",
 | ||
|       " ARIMA(0,1,0)(4,0,0)[12] intercept   : AIC=inf, Time=8.12 sec\n",
 | ||
|       " ARIMA(0,1,0)(4,0,1)[12] intercept   : AIC=430.321, Time=11.65 sec\n",
 | ||
|       " ARIMA(0,1,0)(5,0,0)[12] intercept   : AIC=inf, Time=17.56 sec\n",
 | ||
|       " ARIMA(0,1,1)(0,0,0)[12] intercept   : AIC=1344.207, Time=0.08 sec\n",
 | ||
|       " ARIMA(0,1,1)(0,0,1)[12] intercept   : AIC=1112.274, Time=0.37 sec\n",
 | ||
|       " ARIMA(0,1,1)(0,0,2)[12] intercept   : AIC=993.565, Time=0.76 sec\n",
 | ||
|       " ARIMA(0,1,1)(0,0,3)[12] intercept   : AIC=891.683, Time=3.11 sec\n",
 | ||
|       " ARIMA(0,1,1)(0,0,4)[12] intercept   : AIC=820.025, Time=5.52 sec\n",
 | ||
|       " ARIMA(0,1,1)(1,0,0)[12] intercept   : AIC=612.811, Time=0.60 sec\n",
 | ||
|       " ARIMA(0,1,1)(1,0,1)[12] intercept   : AIC=393.876, Time=1.61 sec\n",
 | ||
|       " ARIMA(0,1,1)(1,0,2)[12] intercept   : AIC=416.358, Time=3.64 sec\n",
 | ||
|       " ARIMA(0,1,1)(1,0,3)[12] intercept   : AIC=424.837, Time=8.45 sec\n",
 | ||
|       " ARIMA(0,1,1)(2,0,0)[12] intercept   : AIC=510.637, Time=1.63 sec\n",
 | ||
|       " ARIMA(0,1,1)(2,0,1)[12] intercept   : AIC=398.093, Time=3.18 sec\n",
 | ||
|       " ARIMA(0,1,1)(2,0,2)[12] intercept   : AIC=401.837, Time=4.14 sec\n",
 | ||
|       " ARIMA(0,1,1)(3,0,0)[12] intercept   : AIC=467.985, Time=8.25 sec\n",
 | ||
|       " ARIMA(0,1,1)(3,0,1)[12] intercept   : AIC=412.757, Time=10.34 sec\n",
 | ||
|       " ARIMA(0,1,1)(4,0,0)[12] intercept   : AIC=448.948, Time=7.42 sec\n",
 | ||
|       " ARIMA(0,1,2)(0,0,0)[12] intercept   : AIC=1222.286, Time=0.14 sec\n",
 | ||
|       " ARIMA(0,1,2)(0,0,1)[12] intercept   : AIC=1046.922, Time=0.32 sec\n",
 | ||
|       " ARIMA(0,1,2)(0,0,2)[12] intercept   : AIC=947.532, Time=0.92 sec\n",
 | ||
|       " ARIMA(0,1,2)(0,0,3)[12] intercept   : AIC=867.310, Time=2.67 sec\n",
 | ||
|       " ARIMA(0,1,2)(1,0,0)[12] intercept   : AIC=608.450, Time=0.65 sec\n",
 | ||
|       " ARIMA(0,1,2)(1,0,1)[12] intercept   : AIC=389.029, Time=1.72 sec\n",
 | ||
|       " ARIMA(0,1,2)(1,0,2)[12] intercept   : AIC=421.446, Time=3.85 sec\n",
 | ||
|       " ARIMA(0,1,2)(2,0,0)[12] intercept   : AIC=507.685, Time=2.02 sec\n",
 | ||
|       " ARIMA(0,1,2)(2,0,1)[12] intercept   : AIC=408.463, Time=3.61 sec\n",
 | ||
|       " ARIMA(0,1,2)(3,0,0)[12] intercept   : AIC=460.596, Time=5.28 sec\n",
 | ||
|       " ARIMA(0,1,3)(0,0,0)[12] intercept   : AIC=1174.928, Time=0.18 sec\n",
 | ||
|       " ARIMA(0,1,3)(0,0,1)[12] intercept   : AIC=1037.324, Time=0.56 sec\n",
 | ||
|       " ARIMA(0,1,3)(0,0,2)[12] intercept   : AIC=947.471, Time=1.46 sec\n",
 | ||
|       " ARIMA(0,1,3)(1,0,0)[12] intercept   : AIC=602.141, Time=0.82 sec\n",
 | ||
|       " ARIMA(0,1,3)(1,0,1)[12] intercept   : AIC=399.084, Time=2.40 sec\n",
 | ||
|       " ARIMA(0,1,3)(2,0,0)[12] intercept   : AIC=500.296, Time=2.60 sec\n",
 | ||
|       " ARIMA(0,1,4)(0,0,0)[12] intercept   : AIC=1188.947, Time=0.42 sec\n",
 | ||
|       " ARIMA(0,1,4)(0,0,1)[12] intercept   : AIC=999.240, Time=0.87 sec\n",
 | ||
|       " ARIMA(0,1,4)(1,0,0)[12] intercept   : AIC=604.133, Time=0.99 sec\n",
 | ||
|       " ARIMA(0,1,5)(0,0,0)[12] intercept   : AIC=1091.452, Time=0.53 sec\n",
 | ||
|       " ARIMA(1,1,0)(0,0,0)[12] intercept   : AIC=1298.693, Time=0.05 sec\n",
 | ||
|       " ARIMA(1,1,0)(0,0,1)[12] intercept   : AIC=1075.553, Time=0.25 sec\n",
 | ||
|       " ARIMA(1,1,0)(0,0,2)[12] intercept   : AIC=971.074, Time=0.69 sec\n",
 | ||
|       " ARIMA(1,1,0)(0,0,3)[12] intercept   : AIC=882.846, Time=2.63 sec\n",
 | ||
|       " ARIMA(1,1,0)(0,0,4)[12] intercept   : AIC=818.711, Time=4.91 sec\n",
 | ||
|       " ARIMA(1,1,0)(1,0,0)[12] intercept   : AIC=inf, Time=0.59 sec\n",
 | ||
|       " ARIMA(1,1,0)(1,0,1)[12] intercept   : AIC=414.969, Time=1.19 sec\n",
 | ||
|       " ARIMA(1,1,0)(1,0,2)[12] intercept   : AIC=402.836, Time=3.25 sec\n",
 | ||
|       " ARIMA(1,1,0)(1,0,3)[12] intercept   : AIC=429.921, Time=6.47 sec\n",
 | ||
|       " ARIMA(1,1,0)(2,0,0)[12] intercept   : AIC=inf, Time=1.76 sec\n",
 | ||
|       " ARIMA(1,1,0)(2,0,1)[12] intercept   : AIC=419.397, Time=2.89 sec\n",
 | ||
|       " ARIMA(1,1,0)(2,0,2)[12] intercept   : AIC=409.246, Time=4.10 sec\n",
 | ||
|       " ARIMA(1,1,0)(3,0,0)[12] intercept   : AIC=inf, Time=4.96 sec\n",
 | ||
|       " ARIMA(1,1,0)(3,0,1)[12] intercept   : AIC=419.507, Time=7.41 sec\n",
 | ||
|       " ARIMA(1,1,0)(4,0,0)[12] intercept   : AIC=inf, Time=11.83 sec\n",
 | ||
|       " ARIMA(1,1,1)(0,0,0)[12] intercept   : AIC=1240.963, Time=0.11 sec\n",
 | ||
|       " ARIMA(1,1,1)(0,0,1)[12] intercept   : AIC=1069.162, Time=0.45 sec\n",
 | ||
|       " ARIMA(1,1,1)(0,0,2)[12] intercept   : AIC=973.065, Time=1.21 sec\n",
 | ||
|       " ARIMA(1,1,1)(0,0,3)[12] intercept   : AIC=884.323, Time=4.46 sec\n",
 | ||
|       " ARIMA(1,1,1)(1,0,0)[12] intercept   : AIC=588.156, Time=1.52 sec\n",
 | ||
|       " ARIMA(1,1,1)(1,0,1)[12] intercept   : AIC=399.035, Time=1.88 sec\n",
 | ||
|       " ARIMA(1,1,1)(1,0,2)[12] intercept   : AIC=409.509, Time=4.49 sec\n",
 | ||
|       " ARIMA(1,1,1)(2,0,0)[12] intercept   : AIC=503.551, Time=1.88 sec\n",
 | ||
|       " ARIMA(1,1,1)(2,0,1)[12] intercept   : AIC=399.929, Time=3.30 sec\n",
 | ||
|       " ARIMA(1,1,1)(3,0,0)[12] intercept   : AIC=457.277, Time=7.70 sec\n",
 | ||
|       " ARIMA(1,1,2)(0,0,0)[12] intercept   : AIC=1196.535, Time=0.18 sec\n",
 | ||
|       " ARIMA(1,1,2)(0,0,1)[12] intercept   : AIC=1042.432, Time=0.50 sec\n",
 | ||
|       " ARIMA(1,1,2)(0,0,2)[12] intercept   : AIC=948.444, Time=1.55 sec\n",
 | ||
|       " ARIMA(1,1,2)(1,0,0)[12] intercept   : AIC=587.318, Time=1.60 sec\n",
 | ||
|       " ARIMA(1,1,2)(1,0,1)[12] intercept   : AIC=403.282, Time=1.93 sec\n",
 | ||
|       " ARIMA(1,1,2)(2,0,0)[12] intercept   : AIC=498.922, Time=3.90 sec\n",
 | ||
|       " ARIMA(1,1,3)(0,0,0)[12] intercept   : AIC=1176.484, Time=0.29 sec\n",
 | ||
|       " ARIMA(1,1,3)(0,0,1)[12] intercept   : AIC=1039.309, Time=0.94 sec\n",
 | ||
|       " ARIMA(1,1,3)(1,0,0)[12] intercept   : AIC=604.131, Time=1.21 sec\n",
 | ||
|       " ARIMA(1,1,4)(0,0,0)[12] intercept   : AIC=inf, Time=1.19 sec\n",
 | ||
|       " ARIMA(2,1,0)(0,0,0)[12] intercept   : AIC=1180.404, Time=0.09 sec\n",
 | ||
|       " ARIMA(2,1,0)(0,0,1)[12] intercept   : AIC=1058.115, Time=0.33 sec\n",
 | ||
|       " ARIMA(2,1,0)(0,0,2)[12] intercept   : AIC=973.051, Time=0.92 sec\n",
 | ||
|       " ARIMA(2,1,0)(0,0,3)[12] intercept   : AIC=883.377, Time=2.84 sec\n",
 | ||
|       " ARIMA(2,1,0)(1,0,0)[12] intercept   : AIC=inf, Time=0.60 sec\n",
 | ||
|       " ARIMA(2,1,0)(1,0,1)[12] intercept   : AIC=416.548, Time=1.59 sec\n",
 | ||
|       " ARIMA(2,1,0)(1,0,2)[12] intercept   : AIC=420.663, Time=3.27 sec\n",
 | ||
|       " ARIMA(2,1,0)(2,0,0)[12] intercept   : AIC=inf, Time=2.23 sec\n",
 | ||
|       " ARIMA(2,1,0)(2,0,1)[12] intercept   : AIC=402.478, Time=4.16 sec\n",
 | ||
|       " ARIMA(2,1,0)(3,0,0)[12] intercept   : AIC=inf, Time=6.51 sec\n",
 | ||
|       " ARIMA(2,1,1)(0,0,0)[12] intercept   : AIC=990.719, Time=0.26 sec\n",
 | ||
|       " ARIMA(2,1,1)(0,0,1)[12] intercept   : AIC=881.526, Time=1.10 sec\n",
 | ||
|       " ARIMA(2,1,1)(0,0,2)[12] intercept   : AIC=837.402, Time=3.23 sec\n",
 | ||
|       " ARIMA(2,1,1)(1,0,0)[12] intercept   : AIC=584.045, Time=2.20 sec\n",
 | ||
|       " ARIMA(2,1,1)(1,0,1)[12] intercept   : AIC=443.982, Time=2.03 sec\n",
 | ||
|       " ARIMA(2,1,1)(2,0,0)[12] intercept   : AIC=501.152, Time=2.59 sec\n",
 | ||
|       " ARIMA(2,1,2)(0,0,0)[12] intercept   : AIC=988.094, Time=0.50 sec\n",
 | ||
|       " ARIMA(2,1,2)(0,0,1)[12] intercept   : AIC=757.710, Time=2.77 sec\n",
 | ||
|       " ARIMA(2,1,2)(1,0,0)[12] intercept   : AIC=595.703, Time=3.85 sec\n",
 | ||
|       " ARIMA(2,1,3)(0,0,0)[12] intercept   : AIC=1140.469, Time=0.95 sec\n",
 | ||
|       " ARIMA(3,1,0)(0,0,0)[12] intercept   : AIC=1126.139, Time=0.39 sec\n",
 | ||
|       " ARIMA(3,1,0)(0,0,1)[12] intercept   : AIC=996.923, Time=0.66 sec\n",
 | ||
|       " ARIMA(3,1,0)(0,0,2)[12] intercept   : AIC=918.438, Time=1.53 sec\n",
 | ||
|       " ARIMA(3,1,0)(1,0,0)[12] intercept   : AIC=inf, Time=0.88 sec\n",
 | ||
|       " ARIMA(3,1,0)(1,0,1)[12] intercept   : AIC=406.495, Time=2.17 sec\n",
 | ||
|       " ARIMA(3,1,0)(2,0,0)[12] intercept   : AIC=inf, Time=3.32 sec\n",
 | ||
|       " ARIMA(3,1,1)(0,0,0)[12] intercept   : AIC=989.496, Time=0.51 sec\n",
 | ||
|       " ARIMA(3,1,1)(0,0,1)[12] intercept   : AIC=856.486, Time=1.64 sec\n",
 | ||
|       " ARIMA(3,1,1)(1,0,0)[12] intercept   : AIC=604.951, Time=0.94 sec\n",
 | ||
|       " ARIMA(3,1,2)(0,0,0)[12] intercept   : AIC=991.558, Time=1.11 sec\n",
 | ||
|       " ARIMA(4,1,0)(0,0,0)[12] intercept   : AIC=1125.025, Time=0.18 sec\n",
 | ||
|       " ARIMA(4,1,0)(0,0,1)[12] intercept   : AIC=987.621, Time=0.50 sec\n",
 | ||
|       " ARIMA(4,1,0)(1,0,0)[12] intercept   : AIC=inf, Time=1.05 sec\n",
 | ||
|       " ARIMA(4,1,1)(0,0,0)[12] intercept   : AIC=988.660, Time=1.00 sec\n",
 | ||
|       " ARIMA(5,1,0)(0,0,0)[12] intercept   : AIC=1113.673, Time=0.22 sec\n",
 | ||
|       "\n",
 | ||
|       "Best model:  ARIMA(0,1,2)(1,0,1)[12] intercept\n",
 | ||
|       "Total fit time: 343.809 seconds\n"
 | ||
|      ]
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "start_time = time.time()\n",
 | ||
|     "sarima_model = auto_arima(X_train_arima,\n",
 | ||
|     "                            start_p=2, d=None, start_q=1, max_p=10, max_d=10, max_q=10,\n",
 | ||
|     "                            start_P=2, D=None, start_Q=1, max_P=10, max_D=10, max_Q=10, m=12,\n",
 | ||
|     "                            suppress_warnings=True, stepwise=False, seasonal=True,\n",
 | ||
|     "                            error_action='ignore', trace=True, n_fits=50)\n",
 | ||
|     "sarima_time = time.time() - start_time\n",
 | ||
|     "autosarima_y_pred = sarima_model.predict(n_periods=12)"
 | ||
|    ]
 | ||
|   },
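  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The two searches above record their wall-clock time in `arima_time` and `sarima_time` but never display it; the short cell below simply prints both so they can be compared with the FLAML time budget used earlier."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "\"\"\" wall-clock search time of the two pmdarima runs \"\"\"\n",
    "print(\"auto arima fit time: {0:.1f} s\".format(arima_time))\n",
    "print(\"auto sarima fit time: {0:.1f} s\".format(sarima_time))"
   ]
  },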
 | ||
|   {
 | ||
|    "cell_type": "markdown",
 | ||
|    "metadata": {},
 | ||
|    "source": [
 | ||
|     "Auto ARIMA Models MAPE"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 41,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "name": "stdout",
 | ||
|      "output_type": "stream",
 | ||
|      "text": [
 | ||
|       "auto arima mape = 0.0032060326207122916\n",
 | ||
|       "auto sarima mape = 0.0007347495325972257\n"
 | ||
|      ]
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "from flaml.ml import sklearn_metric_loss_score\n",
 | ||
|     "print('auto arima mape', '=', sklearn_metric_loss_score('mape', y_test, autoarima_y_pred))\n",
 | ||
|     "print('auto sarima mape', '=', sklearn_metric_loss_score('mape', y_test, autosarima_y_pred))"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "markdown",
 | ||
|    "metadata": {},
 | ||
|    "source": [
 | ||
|     "### Compare All"
 | ||
|    ]
 | ||
|   },
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 42,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "name": "stdout",
 | ||
|      "output_type": "stream",
 | ||
|      "text": [
 | ||
|       "flaml mape = 0.0005706814258795216\n",
 | ||
|       "default prophet mape = 0.0011396920680673015\n",
 | ||
|       "auto arima mape = 0.0032060326207122916\n",
 | ||
|       "auto sarima mape = 0.0007347495325972257\n"
 | ||
|      ]
 | ||
|     }
 | ||
|    ],
 | ||
|    "source": [
 | ||
|     "from flaml.ml import sklearn_metric_loss_score\n",
 | ||
|     "print('flaml mape', '=', sklearn_metric_loss_score('mape', y_test, flaml_y_pred))\n",
 | ||
|     "print('default prophet mape', '=', sklearn_metric_loss_score('mape', prophet_y_pred, y_test))\n",
 | ||
|     "print('auto arima mape', '=', sklearn_metric_loss_score('mape', y_test, autoarima_y_pred))\n",
 | ||
|     "print('auto sarima mape', '=', sklearn_metric_loss_score('mape', y_test, autosarima_y_pred))"
 | ||
|    ]
 | ||
|   },
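  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Note that the calls above mix the argument order: `sklearn_metric_loss_score` takes the predicted values first (`y_predict`) and the ground truth second (`y_true`), which likely explains why the two `flaml mape` numbers above differ slightly. The sketch below recomputes all four scores with one consistent order and collects them in a single table."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "\"\"\" summary table of the four methods, using a consistent argument order \"\"\"\n",
    "import pandas as pd\n",
    "from flaml.ml import sklearn_metric_loss_score\n",
    "\n",
    "summary = pd.DataFrame(\n",
    "    {\n",
    "        \"mape\": [\n",
    "            sklearn_metric_loss_score(\"mape\", flaml_y_pred, y_test),\n",
    "            sklearn_metric_loss_score(\"mape\", prophet_y_pred, y_test),\n",
    "            sklearn_metric_loss_score(\"mape\", autoarima_y_pred, y_test),\n",
    "            sklearn_metric_loss_score(\"mape\", autosarima_y_pred, y_test),\n",
    "        ]\n",
    "    },\n",
    "    index=[\"flaml\", \"default prophet\", \"auto arima\", \"auto sarima\"],\n",
    ")\n",
    "print(summary)"
   ]
  },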
 | ||
|   {
 | ||
|    "cell_type": "code",
 | ||
|    "execution_count": 43,
 | ||
|    "metadata": {},
 | ||
|    "outputs": [
 | ||
|     {
 | ||
|      "data": {
 | ||
      "image/png": "[base64-encoded PNG omitted: line plot of CO2 Levels over Date comparing the actual test series with the FLAML, Prophet, AutoArima and AutoSarima forecasts]",
      "text/plain": [
       "<Figure size 432x288 with 1 Axes>"
      ]
     },
     "metadata": {
      "needs_background": "light"
     },
     "output_type": "display_data"
    }
   ],
   "source": [
    "import matplotlib.pyplot as plt\n",
    "\n",
    "plt.plot(X_test, y_test, label='Actual level')\n",
    "plt.plot(X_test, flaml_y_pred, label='FLAML forecast')\n",
    "plt.plot(X_test, prophet_y_pred, label='Prophet forecast')\n",
    "plt.plot(X_test, autoarima_y_pred, label='AutoArima forecast')\n",
    "plt.plot(X_test, autosarima_y_pred, label='AutoSarima forecast')\n",
    "plt.xlabel('Date')\n",
    "plt.ylabel('CO2 Levels')\n",
    "plt.legend()\n",
    "plt.show()"
   ]
  },
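  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The plot above gives a visual comparison of the forecasts. As a small numeric complement, the next cell is a minimal sketch that prints the mean absolute percentage error (MAPE) of each forecast against the held-out `y_test`. It assumes the series produced earlier in this notebook (`y_test`, `flaml_y_pred`, `prophet_y_pred`, `autoarima_y_pred`, `autosarima_y_pred`) are aligned and of equal length; the `mape` helper defined below is illustrative and not part of FLAML's API."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "\n",
    "def mape(y_true, y_pred):\n",
    "    # Mean absolute percentage error in percent; illustrative helper, not a FLAML API.\n",
    "    y_true = np.asarray(y_true, dtype=float).ravel()\n",
    "    y_pred = np.asarray(y_pred, dtype=float).ravel()\n",
    "    return np.mean(np.abs((y_true - y_pred) / y_true)) * 100\n",
    "\n",
    "# The prediction series come from the earlier cells; they are assumed to align with y_test.\n",
    "for name, pred in [('FLAML', flaml_y_pred), ('Prophet', prophet_y_pred),\n",
    "                   ('AutoArima', autoarima_y_pred), ('AutoSarima', autosarima_y_pred)]:\n",
    "    print(f'{name} MAPE: {mape(y_test, pred):.3f}%')"
   ]
  }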
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python ('pytorch_forecasting')",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.8.1"
  },
  "vscode": {
   "interpreter": {
    "hash": "25a19fbe0a9132dfb9279d48d161753c6352f8f9478c2e74383d340069b907c3"
   }
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}