autogen/notebook/flaml_forecast.ipynb


{
"cells": [
{
"cell_type": "markdown",
"source": [
"# Time Series Forecasting with FLAML Library"
],
"metadata": {}
},
{
"cell_type": "markdown",
"source": [
"## 1. Introduction\r\n",
"\r\n",
"FLAML is a Python library (https://github.com/microsoft/FLAML) designed to automatically produce accurate machine learning models with low computational cost. It is fast and cheap. The simple and lightweight design makes it easy to use and extend, such as adding new learners. FLAML can\r\n",
"\r\n",
" - serve as an economical AutoML engine,\r\n",
" - be used as a fast hyperparameter tuning tool, or\r\n",
" - be embedded in self-tuning software that requires low latency & resource in repetitive tuning tasks.\r\n",
" - In this notebook, we demonstrate how to use FLAML library to tune hyperparameters of XGBoost with a regression example.\r\n",
"\r\n",
"FLAML requires Python>=3.6. To run this notebook example, please install flaml with the notebook option:\r\n",
"\r\n",
"> pip install flaml[notebook]"
],
"metadata": {}
},
{
"cell_type": "code",
"execution_count": null,
"source": [
"!pip install flaml[notebook,forecast]"
],
"outputs": [],
"metadata": {}
},
{
"cell_type": "markdown",
"source": [
"## 2. Forecast Problem\r\n",
"\r\n",
"### Load data and preprocess\r\n",
"\r\n",
"Import co2 data from statsmodel. The dataset is from “Atmospheric CO2 from Continuous Air Samples at Mauna Loa Observatory, Hawaii, U.S.A.,” which collected CO2 samples from March 1958 to December 2001. The task is to predict monthly CO2 samples."
],
"metadata": {}
},
{
"cell_type": "code",
"execution_count": 64,
"source": [
"import statsmodels.api as sm\r\n",
"data = sm.datasets.co2.load_pandas()\r\n",
"data = data.data\r\n",
"# data is given in weeks, but the task is to predict monthly, so use monthly averages instead\r\n",
"data = data['co2'].resample('MS').mean()\r\n",
"data = data.fillna(data.bfill()) # makes sure there are no missing values\r\n",
"data = data.to_frame().reset_index()\r\n",
"# data = data.rename(columns={'index': 'ds', 'co2': 'y'})"
],
"outputs": [],
"metadata": {}
},
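{
"cell_type": "markdown",
"source": [
"A quick sanity check of the preprocessed data (a minimal sketch using standard pandas calls): the frame should contain one row per month with an 'index' timestamp column and a 'co2' value column, which is the layout the rest of the notebook relies on."
],
"metadata": {}
},
{
"cell_type": "code",
"execution_count": null,
"source": [
"# inspect the preprocessed data: expect one row per month with columns 'index' and 'co2'\r\n",
"print(data.shape)\r\n",
"data.head()"
],
"outputs": [],
"metadata": {}
},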
{
"cell_type": "code",
"execution_count": 65,
"source": [
"# split the data into a train dataframe and X_test and y_test dataframes, where the number of samples for test is equal to\r\n",
"# the number of periods the user wants to predict\r\n",
"num_samples = data.shape[0]\r\n",
"time_horizon = 12\r\n",
"split_idx = num_samples - time_horizon\r\n",
"X_train = data[:split_idx] # X_train is a dataframe with two columns: time and value\r\n",
"X_test = data[split_idx:]['index'] # X_test is a dataframe with dates for prediction\r\n",
"y_test = data[split_idx:]['co2'] # y_test is a series of the values corresponding to the dates for prediction"
],
"outputs": [],
"metadata": {}
},
{
"cell_type": "markdown",
"source": [
"### Run FLAML\r\n",
"\r\n",
"In the FLAML automl run configuration, users can specify the task type, time budget, error metric, learner list, whether to subsample, resampling strategy type, and so on. All these arguments have default values which will be used if users do not provide them."
],
"metadata": {}
},
{
"cell_type": "code",
"execution_count": 66,
"source": [
"''' import AutoML class from flaml package '''\r\n",
"from flaml import AutoML\r\n",
"automl = AutoML()"
],
"outputs": [],
"metadata": {}
},
{
"cell_type": "code",
"execution_count": 67,
"source": [
"settings = {\r\n",
" \"time_budget\": 180, # total running time in seconds\r\n",
" \"metric\": 'mape', # primary metric for validation: 'mape' is generally used for forecast tasks\r\n",
" \"task\": 'forecast', # task type\r\n",
" \"log_file_name\": 'CO2_forecast.log', # flaml log file\r\n",
" \"eval_method\": \"holdout\", # validation method can be chosen from ['auto', 'holdout', 'cv']\r\n",
" \"split_type\": 'time' # for foretask task, 'split_type' has to be 'time'\r\n",
"}"
],
"outputs": [],
"metadata": {}
},
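{
"cell_type": "markdown",
"source": [
"Optionally, the learner list mentioned above can be restricted. The cell below is a minimal sketch (not part of the recorded run) that assumes the 'estimator_list' argument accepts the forecast learner names used by FLAML ('fbprophet', 'arima', 'sarimax'); the fit call is left commented out."
],
"metadata": {}
},
{
"cell_type": "code",
"execution_count": null,
"source": [
"# optional: restrict the search to a subset of learners (sketch; assumes 'estimator_list' accepts these names)\r\n",
"restricted_settings = dict(settings)\r\n",
"restricted_settings['estimator_list'] = ['fbprophet', 'arima'] # drop sarimax for illustration\r\n",
"# automl.fit(dataframe=X_train, label=('index', 'co2'), **restricted_settings, period=time_horizon)"
],
"outputs": [],
"metadata": {}
},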
{
"cell_type": "code",
"execution_count": 69,
"source": [
"'''The main flaml automl API'''\r\n",
"automl.fit(dataframe=X_train, # training data\r\n",
" label=('index', 'co2'), # For 'forecast' task, label should be a tuple of strings for timestamp and value columns\r\n",
" **settings, \r\n",
" period=time_horizon) # key word argument 'period' must be included for forecast task)"
],
"outputs": [
{
"output_type": "stream",
"name": "stderr",
"text": [
"[flaml.automl: 08-31 21:19:53] {1209} INFO - Evaluation method: holdout\n",
"INFO:flaml.automl:Evaluation method: holdout\n",
"[flaml.automl: 08-31 21:19:53] {686} INFO - Using TimeSeriesSplit\n",
"INFO:flaml.automl:Using TimeSeriesSplit\n",
"[flaml.automl: 08-31 21:19:53] {1237} INFO - Minimizing error metric: mape\n",
"INFO:flaml.automl:Minimizing error metric: mape\n",
"[flaml.automl: 08-31 21:19:53] {1259} INFO - List of ML learners in AutoML Run: ['fbprophet', 'arima', 'sarimax']\n",
"INFO:flaml.automl:List of ML learners in AutoML Run: ['fbprophet', 'arima', 'sarimax']\n",
"[flaml.automl: 08-31 21:19:53] {1349} INFO - iteration 0, current learner fbprophet\n",
"INFO:flaml.automl:iteration 0, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:19:57] {1502} INFO - at 3.2s,\tbest fbprophet's error=0.0007,\tbest fbprophet's error=0.0007\n",
"INFO:flaml.automl: at 3.2s,\tbest fbprophet's error=0.0007,\tbest fbprophet's error=0.0007\n",
"[flaml.automl: 08-31 21:19:57] {1349} INFO - iteration 1, current learner fbprophet\n",
"INFO:flaml.automl:iteration 1, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:19:59] {1502} INFO - at 6.0s,\tbest fbprophet's error=0.0006,\tbest fbprophet's error=0.0006\n",
"INFO:flaml.automl: at 6.0s,\tbest fbprophet's error=0.0006,\tbest fbprophet's error=0.0006\n",
"[flaml.automl: 08-31 21:19:59] {1349} INFO - iteration 2, current learner fbprophet\n",
"INFO:flaml.automl:iteration 2, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:20:02] {1502} INFO - at 8.5s,\tbest fbprophet's error=0.0006,\tbest fbprophet's error=0.0006\n",
"INFO:flaml.automl: at 8.5s,\tbest fbprophet's error=0.0006,\tbest fbprophet's error=0.0006\n",
"[flaml.automl: 08-31 21:20:02] {1349} INFO - iteration 3, current learner fbprophet\n",
"INFO:flaml.automl:iteration 3, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:20:05] {1502} INFO - at 11.8s,\tbest fbprophet's error=0.0006,\tbest fbprophet's error=0.0006\n",
"INFO:flaml.automl: at 11.8s,\tbest fbprophet's error=0.0006,\tbest fbprophet's error=0.0006\n",
"[flaml.automl: 08-31 21:20:05] {1349} INFO - iteration 4, current learner arima\n",
"INFO:flaml.automl:iteration 4, current learner arima\n",
"[flaml.automl: 08-31 21:20:06] {1502} INFO - at 12.4s,\tbest arima's error=0.0120,\tbest fbprophet's error=0.0006\n",
"INFO:flaml.automl: at 12.4s,\tbest arima's error=0.0120,\tbest fbprophet's error=0.0006\n",
"[flaml.automl: 08-31 21:20:06] {1349} INFO - iteration 5, current learner arima\n",
"INFO:flaml.automl:iteration 5, current learner arima\n",
"[flaml.automl: 08-31 21:20:07] {1502} INFO - at 13.7s,\tbest arima's error=0.0046,\tbest fbprophet's error=0.0006\n",
"INFO:flaml.automl: at 13.7s,\tbest arima's error=0.0046,\tbest fbprophet's error=0.0006\n",
"[flaml.automl: 08-31 21:20:07] {1349} INFO - iteration 6, current learner fbprophet\n",
"INFO:flaml.automl:iteration 6, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:20:09] {1502} INFO - at 15.8s,\tbest fbprophet's error=0.0006,\tbest fbprophet's error=0.0006\n",
"INFO:flaml.automl: at 15.8s,\tbest fbprophet's error=0.0006,\tbest fbprophet's error=0.0006\n",
"[flaml.automl: 08-31 21:20:09] {1349} INFO - iteration 7, current learner fbprophet\n",
"INFO:flaml.automl:iteration 7, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:20:12] {1502} INFO - at 18.7s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 18.7s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:12] {1349} INFO - iteration 8, current learner arima\n",
"INFO:flaml.automl:iteration 8, current learner arima\n",
"[flaml.automl: 08-31 21:20:12] {1502} INFO - at 19.1s,\tbest arima's error=0.0046,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 19.1s,\tbest arima's error=0.0046,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:12] {1349} INFO - iteration 9, current learner arima\n",
"INFO:flaml.automl:iteration 9, current learner arima\n",
"[flaml.automl: 08-31 21:20:14] {1502} INFO - at 20.3s,\tbest arima's error=0.0022,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 20.3s,\tbest arima's error=0.0022,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:14] {1349} INFO - iteration 10, current learner arima\n",
"INFO:flaml.automl:iteration 10, current learner arima\n",
"[flaml.automl: 08-31 21:20:15] {1502} INFO - at 21.7s,\tbest arima's error=0.0022,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 21.7s,\tbest arima's error=0.0022,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:15] {1349} INFO - iteration 11, current learner sarimax\n",
"INFO:flaml.automl:iteration 11, current learner sarimax\n",
"[flaml.automl: 08-31 21:20:16] {1502} INFO - at 22.2s,\tbest sarimax's error=0.0120,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 22.2s,\tbest sarimax's error=0.0120,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:16] {1349} INFO - iteration 12, current learner sarimax\n",
"INFO:flaml.automl:iteration 12, current learner sarimax\n",
"[flaml.automl: 08-31 21:20:16] {1502} INFO - at 22.7s,\tbest sarimax's error=0.0055,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 22.7s,\tbest sarimax's error=0.0055,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:16] {1349} INFO - iteration 13, current learner sarimax\n",
"INFO:flaml.automl:iteration 13, current learner sarimax\n",
"[flaml.automl: 08-31 21:20:17] {1502} INFO - at 23.3s,\tbest sarimax's error=0.0055,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 23.3s,\tbest sarimax's error=0.0055,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:17] {1349} INFO - iteration 14, current learner sarimax\n",
"INFO:flaml.automl:iteration 14, current learner sarimax\n",
"[flaml.automl: 08-31 21:20:18] {1502} INFO - at 24.6s,\tbest sarimax's error=0.0055,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 24.6s,\tbest sarimax's error=0.0055,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:18] {1349} INFO - iteration 15, current learner arima\n",
"INFO:flaml.automl:iteration 15, current learner arima\n",
"[flaml.automl: 08-31 21:20:19] {1502} INFO - at 25.3s,\tbest arima's error=0.0022,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 25.3s,\tbest arima's error=0.0022,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:19] {1349} INFO - iteration 16, current learner sarimax\n",
"INFO:flaml.automl:iteration 16, current learner sarimax\n",
"[flaml.automl: 08-31 21:20:19] {1502} INFO - at 25.7s,\tbest sarimax's error=0.0031,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 25.7s,\tbest sarimax's error=0.0031,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:19] {1349} INFO - iteration 17, current learner fbprophet\n",
"INFO:flaml.automl:iteration 17, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:20:22] {1502} INFO - at 29.0s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 29.0s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:22] {1349} INFO - iteration 18, current learner sarimax\n",
"INFO:flaml.automl:iteration 18, current learner sarimax\n",
"[flaml.automl: 08-31 21:20:23] {1502} INFO - at 29.9s,\tbest sarimax's error=0.0031,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 29.9s,\tbest sarimax's error=0.0031,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:23] {1349} INFO - iteration 19, current learner arima\n",
"INFO:flaml.automl:iteration 19, current learner arima\n",
"[flaml.automl: 08-31 21:20:25] {1502} INFO - at 31.7s,\tbest arima's error=0.0022,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 31.7s,\tbest arima's error=0.0022,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:25] {1349} INFO - iteration 20, current learner arima\n",
"INFO:flaml.automl:iteration 20, current learner arima\n",
"[flaml.automl: 08-31 21:20:26] {1502} INFO - at 32.5s,\tbest arima's error=0.0022,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 32.5s,\tbest arima's error=0.0022,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:26] {1349} INFO - iteration 21, current learner fbprophet\n",
"INFO:flaml.automl:iteration 21, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:20:30] {1502} INFO - at 36.5s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 36.5s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:30] {1349} INFO - iteration 22, current learner fbprophet\n",
"INFO:flaml.automl:iteration 22, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:20:33] {1502} INFO - at 39.4s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 39.4s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:33] {1349} INFO - iteration 23, current learner sarimax\n",
"INFO:flaml.automl:iteration 23, current learner sarimax\n",
"[flaml.automl: 08-31 21:20:33] {1502} INFO - at 39.8s,\tbest sarimax's error=0.0031,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 39.8s,\tbest sarimax's error=0.0031,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:33] {1349} INFO - iteration 24, current learner sarimax\n",
"INFO:flaml.automl:iteration 24, current learner sarimax\n",
"[flaml.automl: 08-31 21:20:33] {1502} INFO - at 39.9s,\tbest sarimax's error=0.0031,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 39.9s,\tbest sarimax's error=0.0031,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:33] {1349} INFO - iteration 25, current learner arima\n",
"INFO:flaml.automl:iteration 25, current learner arima\n",
"[flaml.automl: 08-31 21:20:35] {1502} INFO - at 42.0s,\tbest arima's error=0.0021,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 42.0s,\tbest arima's error=0.0021,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:35] {1349} INFO - iteration 26, current learner sarimax\n",
"INFO:flaml.automl:iteration 26, current learner sarimax\n",
"[flaml.automl: 08-31 21:20:37] {1502} INFO - at 43.4s,\tbest sarimax's error=0.0022,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 43.4s,\tbest sarimax's error=0.0022,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:37] {1349} INFO - iteration 27, current learner sarimax\n",
"INFO:flaml.automl:iteration 27, current learner sarimax\n",
"[flaml.automl: 08-31 21:20:38] {1502} INFO - at 45.1s,\tbest sarimax's error=0.0022,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 45.1s,\tbest sarimax's error=0.0022,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:38] {1349} INFO - iteration 28, current learner sarimax\n",
"INFO:flaml.automl:iteration 28, current learner sarimax\n",
"[flaml.automl: 08-31 21:20:39] {1502} INFO - at 45.7s,\tbest sarimax's error=0.0022,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 45.7s,\tbest sarimax's error=0.0022,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:39] {1349} INFO - iteration 29, current learner fbprophet\n",
"INFO:flaml.automl:iteration 29, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:20:42] {1502} INFO - at 48.4s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 48.4s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:42] {1349} INFO - iteration 30, current learner fbprophet\n",
"INFO:flaml.automl:iteration 30, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:20:45] {1502} INFO - at 51.6s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 51.6s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:45] {1349} INFO - iteration 31, current learner fbprophet\n",
"INFO:flaml.automl:iteration 31, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:20:48] {1502} INFO - at 54.9s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 54.9s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:48] {1349} INFO - iteration 32, current learner sarimax\n",
"INFO:flaml.automl:iteration 32, current learner sarimax\n",
"[flaml.automl: 08-31 21:20:50] {1502} INFO - at 56.7s,\tbest sarimax's error=0.0019,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 56.7s,\tbest sarimax's error=0.0019,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:50] {1349} INFO - iteration 33, current learner fbprophet\n",
"INFO:flaml.automl:iteration 33, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:20:53] {1502} INFO - at 60.1s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 60.1s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:53] {1349} INFO - iteration 34, current learner fbprophet\n",
"INFO:flaml.automl:iteration 34, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:20:56] {1502} INFO - at 62.5s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 62.5s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:56] {1349} INFO - iteration 35, current learner sarimax\n",
"INFO:flaml.automl:iteration 35, current learner sarimax\n",
"[flaml.automl: 08-31 21:20:58] {1502} INFO - at 64.9s,\tbest sarimax's error=0.0019,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 64.9s,\tbest sarimax's error=0.0019,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:20:58] {1349} INFO - iteration 36, current learner fbprophet\n",
"INFO:flaml.automl:iteration 36, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:21:02] {1502} INFO - at 68.6s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 68.6s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:21:02] {1349} INFO - iteration 37, current learner sarimax\n",
"INFO:flaml.automl:iteration 37, current learner sarimax\n",
"[flaml.automl: 08-31 21:21:02] {1502} INFO - at 68.9s,\tbest sarimax's error=0.0019,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 68.9s,\tbest sarimax's error=0.0019,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:21:02] {1349} INFO - iteration 38, current learner fbprophet\n",
"INFO:flaml.automl:iteration 38, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:21:05] {1502} INFO - at 71.8s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 71.8s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:21:05] {1349} INFO - iteration 39, current learner sarimax\n",
"INFO:flaml.automl:iteration 39, current learner sarimax\n",
"[flaml.automl: 08-31 21:21:06] {1502} INFO - at 72.9s,\tbest sarimax's error=0.0019,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 72.9s,\tbest sarimax's error=0.0019,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:21:06] {1349} INFO - iteration 40, current learner sarimax\n",
"INFO:flaml.automl:iteration 40, current learner sarimax\n",
"[flaml.automl: 08-31 21:21:08] {1502} INFO - at 74.8s,\tbest sarimax's error=0.0019,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 74.8s,\tbest sarimax's error=0.0019,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:21:08] {1349} INFO - iteration 41, current learner fbprophet\n",
"INFO:flaml.automl:iteration 41, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:21:12] {1502} INFO - at 78.7s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 78.7s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:21:12] {1349} INFO - iteration 42, current learner arima\n",
"INFO:flaml.automl:iteration 42, current learner arima\n",
"[flaml.automl: 08-31 21:21:14] {1502} INFO - at 80.4s,\tbest arima's error=0.0018,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 80.4s,\tbest arima's error=0.0018,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:21:14] {1349} INFO - iteration 43, current learner sarimax\n",
"INFO:flaml.automl:iteration 43, current learner sarimax\n",
"[flaml.automl: 08-31 21:21:15] {1502} INFO - at 81.9s,\tbest sarimax's error=0.0019,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 81.9s,\tbest sarimax's error=0.0019,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:21:15] {1349} INFO - iteration 44, current learner arima\n",
"INFO:flaml.automl:iteration 44, current learner arima\n",
"[flaml.automl: 08-31 21:21:16] {1502} INFO - at 83.0s,\tbest arima's error=0.0018,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 83.0s,\tbest arima's error=0.0018,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:21:16] {1349} INFO - iteration 45, current learner fbprophet\n",
"INFO:flaml.automl:iteration 45, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:21:19] {1502} INFO - at 86.1s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 86.1s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:21:19] {1349} INFO - iteration 46, current learner arima\n",
"INFO:flaml.automl:iteration 46, current learner arima\n",
"[flaml.automl: 08-31 21:21:22] {1502} INFO - at 88.5s,\tbest arima's error=0.0018,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 88.5s,\tbest arima's error=0.0018,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:21:22] {1349} INFO - iteration 47, current learner fbprophet\n",
"INFO:flaml.automl:iteration 47, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:21:25] {1502} INFO - at 91.5s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 91.5s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:21:25] {1349} INFO - iteration 48, current learner fbprophet\n",
"INFO:flaml.automl:iteration 48, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:21:28] {1502} INFO - at 94.3s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 94.3s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:21:28] {1349} INFO - iteration 49, current learner sarimax\n",
"INFO:flaml.automl:iteration 49, current learner sarimax\n",
"[flaml.automl: 08-31 21:21:29] {1502} INFO - at 95.8s,\tbest sarimax's error=0.0019,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 95.8s,\tbest sarimax's error=0.0019,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:21:29] {1349} INFO - iteration 50, current learner fbprophet\n",
"INFO:flaml.automl:iteration 50, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:21:33] {1502} INFO - at 99.4s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 99.4s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:21:33] {1349} INFO - iteration 51, current learner fbprophet\n",
"INFO:flaml.automl:iteration 51, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:21:36] {1502} INFO - at 102.2s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 102.2s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:21:36] {1349} INFO - iteration 52, current learner fbprophet\n",
"INFO:flaml.automl:iteration 52, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:21:38] {1502} INFO - at 104.8s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 104.8s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:21:38] {1349} INFO - iteration 53, current learner fbprophet\n",
"INFO:flaml.automl:iteration 53, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:21:41] {1502} INFO - at 108.1s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 108.1s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:21:41] {1349} INFO - iteration 54, current learner arima\n",
"INFO:flaml.automl:iteration 54, current learner arima\n",
"[flaml.automl: 08-31 21:21:43] {1502} INFO - at 109.2s,\tbest arima's error=0.0018,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 109.2s,\tbest arima's error=0.0018,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:21:43] {1349} INFO - iteration 55, current learner fbprophet\n",
"INFO:flaml.automl:iteration 55, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:21:45] {1502} INFO - at 111.2s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 111.2s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:21:45] {1349} INFO - iteration 56, current learner fbprophet\n",
"INFO:flaml.automl:iteration 56, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:21:47] {1502} INFO - at 114.1s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 114.1s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:21:47] {1349} INFO - iteration 57, current learner fbprophet\n",
"INFO:flaml.automl:iteration 57, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:21:50] {1502} INFO - at 116.6s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 116.6s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:21:50] {1349} INFO - iteration 58, current learner fbprophet\n",
"INFO:flaml.automl:iteration 58, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:21:53] {1502} INFO - at 119.6s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 119.6s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:21:53] {1349} INFO - iteration 59, current learner arima\n",
"INFO:flaml.automl:iteration 59, current learner arima\n",
"[flaml.automl: 08-31 21:21:55] {1502} INFO - at 121.5s,\tbest arima's error=0.0016,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 121.5s,\tbest arima's error=0.0016,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:21:55] {1349} INFO - iteration 60, current learner fbprophet\n",
"INFO:flaml.automl:iteration 60, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:21:58] {1502} INFO - at 124.8s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 124.8s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:21:58] {1349} INFO - iteration 61, current learner arima\n",
"INFO:flaml.automl:iteration 61, current learner arima\n",
"[flaml.automl: 08-31 21:22:00] {1502} INFO - at 126.5s,\tbest arima's error=0.0016,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 126.5s,\tbest arima's error=0.0016,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:22:00] {1349} INFO - iteration 62, current learner arima\n",
"INFO:flaml.automl:iteration 62, current learner arima\n",
"[flaml.automl: 08-31 21:22:02] {1502} INFO - at 128.7s,\tbest arima's error=0.0014,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 128.7s,\tbest arima's error=0.0014,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:22:02] {1349} INFO - iteration 63, current learner fbprophet\n",
"INFO:flaml.automl:iteration 63, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:22:05] {1502} INFO - at 131.8s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 131.8s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:22:05] {1349} INFO - iteration 64, current learner arima\n",
"INFO:flaml.automl:iteration 64, current learner arima\n",
"[flaml.automl: 08-31 21:22:08] {1502} INFO - at 134.2s,\tbest arima's error=0.0014,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 134.2s,\tbest arima's error=0.0014,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:22:08] {1349} INFO - iteration 65, current learner arima\n",
"INFO:flaml.automl:iteration 65, current learner arima\n",
"[flaml.automl: 08-31 21:22:09] {1502} INFO - at 135.6s,\tbest arima's error=0.0014,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 135.6s,\tbest arima's error=0.0014,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:22:09] {1349} INFO - iteration 66, current learner fbprophet\n",
"INFO:flaml.automl:iteration 66, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:22:11] {1502} INFO - at 137.9s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 137.9s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:22:11] {1349} INFO - iteration 67, current learner fbprophet\n",
"INFO:flaml.automl:iteration 67, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:22:14] {1502} INFO - at 140.8s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 140.8s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:22:14] {1349} INFO - iteration 68, current learner arima\n",
"INFO:flaml.automl:iteration 68, current learner arima\n",
"[flaml.automl: 08-31 21:22:16] {1502} INFO - at 142.4s,\tbest arima's error=0.0014,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 142.4s,\tbest arima's error=0.0014,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:22:16] {1349} INFO - iteration 69, current learner arima\n",
"INFO:flaml.automl:iteration 69, current learner arima\n",
"[flaml.automl: 08-31 21:22:18] {1502} INFO - at 145.1s,\tbest arima's error=0.0014,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 145.1s,\tbest arima's error=0.0014,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:22:18] {1349} INFO - iteration 70, current learner fbprophet\n",
"INFO:flaml.automl:iteration 70, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:22:21] {1502} INFO - at 147.5s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 147.5s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:22:21] {1349} INFO - iteration 71, current learner sarimax\n",
"INFO:flaml.automl:iteration 71, current learner sarimax\n",
"[flaml.automl: 08-31 21:22:22] {1502} INFO - at 148.8s,\tbest sarimax's error=0.0019,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 148.8s,\tbest sarimax's error=0.0019,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:22:22] {1349} INFO - iteration 72, current learner arima\n",
"INFO:flaml.automl:iteration 72, current learner arima\n",
"[flaml.automl: 08-31 21:22:24] {1502} INFO - at 150.8s,\tbest arima's error=0.0014,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 150.8s,\tbest arima's error=0.0014,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:22:24] {1349} INFO - iteration 73, current learner sarimax\n",
"INFO:flaml.automl:iteration 73, current learner sarimax\n",
"[flaml.automl: 08-31 21:22:26] {1502} INFO - at 152.4s,\tbest sarimax's error=0.0010,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 152.4s,\tbest sarimax's error=0.0010,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:22:26] {1349} INFO - iteration 74, current learner fbprophet\n",
"INFO:flaml.automl:iteration 74, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:22:28] {1502} INFO - at 154.3s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 154.3s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:22:28] {1349} INFO - iteration 75, current learner fbprophet\n",
"INFO:flaml.automl:iteration 75, current learner fbprophet\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:22:30] {1502} INFO - at 156.8s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 156.8s,\tbest fbprophet's error=0.0005,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:22:30] {1349} INFO - iteration 76, current learner sarimax\n",
"INFO:flaml.automl:iteration 76, current learner sarimax\n",
"[flaml.automl: 08-31 21:22:32] {1502} INFO - at 158.7s,\tbest sarimax's error=0.0008,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 158.7s,\tbest sarimax's error=0.0008,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:22:32] {1349} INFO - iteration 77, current learner sarimax\n",
"INFO:flaml.automl:iteration 77, current learner sarimax\n",
"[flaml.automl: 08-31 21:22:34] {1502} INFO - at 160.4s,\tbest sarimax's error=0.0008,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 160.4s,\tbest sarimax's error=0.0008,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:22:34] {1349} INFO - iteration 78, current learner arima\n",
"INFO:flaml.automl:iteration 78, current learner arima\n",
"[flaml.automl: 08-31 21:22:37] {1502} INFO - at 163.7s,\tbest arima's error=0.0011,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 163.7s,\tbest arima's error=0.0011,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:22:37] {1349} INFO - iteration 79, current learner sarimax\n",
"INFO:flaml.automl:iteration 79, current learner sarimax\n",
"[flaml.automl: 08-31 21:22:40] {1502} INFO - at 166.2s,\tbest sarimax's error=0.0008,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 166.2s,\tbest sarimax's error=0.0008,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:22:40] {1349} INFO - iteration 80, current learner arima\n",
"INFO:flaml.automl:iteration 80, current learner arima\n",
"[flaml.automl: 08-31 21:22:42] {1502} INFO - at 169.0s,\tbest arima's error=0.0011,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 169.0s,\tbest arima's error=0.0011,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:22:42] {1349} INFO - iteration 81, current learner sarimax\n",
"INFO:flaml.automl:iteration 81, current learner sarimax\n",
"[flaml.automl: 08-31 21:22:44] {1502} INFO - at 170.3s,\tbest sarimax's error=0.0008,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 170.3s,\tbest sarimax's error=0.0008,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:22:44] {1349} INFO - iteration 82, current learner sarimax\n",
"INFO:flaml.automl:iteration 82, current learner sarimax\n",
"[flaml.automl: 08-31 21:22:47] {1502} INFO - at 173.8s,\tbest sarimax's error=0.0008,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 173.8s,\tbest sarimax's error=0.0008,\tbest fbprophet's error=0.0005\n",
"[flaml.automl: 08-31 21:22:47] {1349} INFO - iteration 83, current learner sarimax\n",
"INFO:flaml.automl:iteration 83, current learner sarimax\n",
"[flaml.automl: 08-31 21:22:48] {1502} INFO - at 174.8s,\tbest sarimax's error=0.0008,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 174.8s,\tbest sarimax's error=0.0008,\tbest fbprophet's error=0.0005\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:22:50] {1534} INFO - retrain sarimax for 2.2s\n",
"INFO:flaml.automl:retrain sarimax for 2.2s\n",
"[flaml.automl: 08-31 21:22:50] {1349} INFO - iteration 84, current learner sarimax\n",
"INFO:flaml.automl:iteration 84, current learner sarimax\n",
"[flaml.automl: 08-31 21:22:52] {1502} INFO - at 178.8s,\tbest sarimax's error=0.0008,\tbest fbprophet's error=0.0005\n",
"INFO:flaml.automl: at 178.8s,\tbest sarimax's error=0.0008,\tbest fbprophet's error=0.0005\n",
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
"[flaml.automl: 08-31 21:22:55] {1534} INFO - retrain sarimax for 2.5s\n",
"INFO:flaml.automl:retrain sarimax for 2.5s\n",
"[flaml.automl: 08-31 21:22:55] {1558} INFO - selected model: <prophet.forecaster.Prophet object at 0x000002C8485D46D0>\n",
"INFO:flaml.automl:selected model: <prophet.forecaster.Prophet object at 0x000002C8485D46D0>\n",
"[flaml.automl: 08-31 21:22:55] {1281} INFO - fit succeeded\n",
"INFO:flaml.automl:fit succeeded\n",
"[flaml.automl: 08-31 21:22:55] {1282} INFO - Time taken to find the best model: 71.81526470184326\n",
"INFO:flaml.automl:Time taken to find the best model: 71.81526470184326\n"
]
}
],
"metadata": {}
},
{
"cell_type": "markdown",
"source": [
"### Best model and metric"
],
"metadata": {}
},
{
"cell_type": "code",
"execution_count": 70,
"source": [
"''' retrieve best config and best learner'''\r\n",
"print('Best ML leaner:', automl.best_estimator)\r\n",
"print('Best hyperparmeter config:', automl.best_config)\r\n",
"print(f'Best mape on validation data: {automl.best_loss}')\r\n",
"print(f'Training duration of best run: {automl.best_config_train_time}s')"
],
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"Best ML leaner: fbprophet\n",
"Best hyperparmeter config: {'changepoint_prior_scale': 0.02876449933617924, 'seasonality_prior_scale': 1.80360430903146, 'holidays_prior_scale': 2.1243991057068654, 'seasonality_mode': 'additive'}\n",
"Best mape on validation data: 0.0004765336783587436\n",
"Training duration of best run: 2.890876531600952s\n"
]
}
],
"metadata": {}
},
{
"cell_type": "code",
"execution_count": 71,
"source": [
"print(automl.model.estimator)"
],
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"<prophet.forecaster.Prophet object at 0x000002C8485D46D0>\n"
]
}
],
"metadata": {}
},
{
"cell_type": "code",
"execution_count": 72,
"source": [
"''' pickle and save the automl object '''\r\n",
"import pickle\r\n",
"with open('automl.pkl', 'wb') as f:\r\n",
" pickle.dump(automl, f, pickle.HIGHEST_PROTOCOL)"
],
"outputs": [],
"metadata": {}
},
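{
"cell_type": "markdown",
"source": [
"The saved object can be reloaded later and reused for prediction. This is a minimal sketch with the standard pickle API; 'automl.pkl' is the file written in the previous cell, and the predict call is left commented out to keep the recorded run unchanged."
],
"metadata": {}
},
{
"cell_type": "code",
"execution_count": null,
"source": [
"''' reload the pickled automl object '''\r\n",
"import pickle\r\n",
"with open('automl.pkl', 'rb') as f:\r\n",
"    automl_loaded = pickle.load(f)\r\n",
"# the reloaded object exposes the same predict API as the in-memory one, e.g.:\r\n",
"# flaml_y_pred_loaded = automl_loaded.predict(X_test)"
],
"outputs": [],
"metadata": {}
},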
{
"cell_type": "code",
"execution_count": 75,
"source": [
"''' compute predictions of testing dataset '''\r\n",
"flaml_y_pred = automl.predict(X_test)\r\n",
"print('Predicted labels', flaml_y_pred)\r\n",
"print('True labels', y_test)"
],
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"Predicted labels 0 370.179589\n",
"1 370.897300\n",
"2 371.950006\n",
"3 373.135219\n",
"4 373.634979\n",
"5 373.104820\n",
"6 371.760649\n",
"7 369.848551\n",
"8 368.250457\n",
"9 368.318975\n",
"10 369.517605\n",
"11 370.783469\n",
"Name: yhat, dtype: float64\n",
"True labels 514 370.175\n",
"515 371.325\n",
"516 372.060\n",
"517 372.775\n",
"518 373.800\n",
"519 373.060\n",
"520 371.300\n",
"521 369.425\n",
"522 367.880\n",
"523 368.050\n",
"524 369.375\n",
"525 371.020\n",
"Name: co2, dtype: float64\n"
]
}
],
"metadata": {}
},
{
"cell_type": "code",
"execution_count": 76,
"source": [
"''' compute different metric values on testing dataset'''\r\n",
"from flaml.ml import sklearn_metric_loss_score\r\n",
"print('mape', '=', sklearn_metric_loss_score('mape', flaml_y_pred, y_test))"
],
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"mape = 0.0006780276756290267\n"
]
}
],
"metadata": {}
},
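{
"cell_type": "markdown",
"source": [
"Other regression metrics can be computed on the same predictions with the same helper. The sketch below assumes that sklearn_metric_loss_score also accepts the 'rmse' and 'mae' metric names."
],
"metadata": {}
},
{
"cell_type": "code",
"execution_count": null,
"source": [
"''' additional metrics on the testing dataset (assumes 'rmse' and 'mae' are supported metric names) '''\r\n",
"print('rmse', '=', sklearn_metric_loss_score('rmse', flaml_y_pred, y_test))\r\n",
"print('mae', '=', sklearn_metric_loss_score('mae', flaml_y_pred, y_test))"
],
"outputs": [],
"metadata": {}
},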
{
"cell_type": "markdown",
"source": [
"### Log history"
],
"metadata": {}
},
{
"cell_type": "code",
"execution_count": 77,
"source": [
"from flaml.data import get_output_from_log\r\n",
"time_history, best_valid_loss_history, valid_loss_history, config_history, train_loss_history = \\\r\n",
" get_output_from_log(filename=settings['log_file_name'], time_budget=180)\r\n",
"\r\n",
"for config in config_history:\r\n",
" print(config)"
],
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"{'Current Learner': 'fbprophet', 'Current Sample': 502, 'Current Hyper-parameters': {'changepoint_prior_scale': 0.010000000000000002, 'seasonality_prior_scale': 1.0, 'holidays_prior_scale': 1.0, 'seasonality_mode': 'multiplicative'}, 'Best Learner': 'fbprophet', 'Best Hyper-parameters': {'changepoint_prior_scale': 0.010000000000000002, 'seasonality_prior_scale': 1.0, 'holidays_prior_scale': 1.0, 'seasonality_mode': 'multiplicative'}}\n",
"{'Current Learner': 'fbprophet', 'Current Sample': 502, 'Current Hyper-parameters': {'changepoint_prior_scale': 0.0091602623296037, 'seasonality_prior_scale': 0.8823866403788657, 'holidays_prior_scale': 3.2294014074557995, 'seasonality_mode': 'additive'}, 'Best Learner': 'fbprophet', 'Best Hyper-parameters': {'changepoint_prior_scale': 0.0091602623296037, 'seasonality_prior_scale': 0.8823866403788657, 'holidays_prior_scale': 3.2294014074557995, 'seasonality_mode': 'additive'}}\n",
"{'Current Learner': 'fbprophet', 'Current Sample': 502, 'Current Hyper-parameters': {'changepoint_prior_scale': 0.010000000000000002, 'seasonality_prior_scale': 1.0, 'holidays_prior_scale': 0.999999999999999, 'seasonality_mode': 'additive'}, 'Best Learner': 'fbprophet', 'Best Hyper-parameters': {'changepoint_prior_scale': 0.010000000000000002, 'seasonality_prior_scale': 1.0, 'holidays_prior_scale': 0.999999999999999, 'seasonality_mode': 'additive'}}\n",
"{'Current Learner': 'fbprophet', 'Current Sample': 502, 'Current Hyper-parameters': {'changepoint_prior_scale': 0.05247335998097256, 'seasonality_prior_scale': 0.987707602743762, 'holidays_prior_scale': 0.5484274380225445, 'seasonality_mode': 'additive'}, 'Best Learner': 'fbprophet', 'Best Hyper-parameters': {'changepoint_prior_scale': 0.05247335998097256, 'seasonality_prior_scale': 0.987707602743762, 'holidays_prior_scale': 0.5484274380225445, 'seasonality_mode': 'additive'}}\n",
"{'Current Learner': 'fbprophet', 'Current Sample': 502, 'Current Hyper-parameters': {'changepoint_prior_scale': 0.02876449933617924, 'seasonality_prior_scale': 1.80360430903146, 'holidays_prior_scale': 2.1243991057068654, 'seasonality_mode': 'additive'}, 'Best Learner': 'fbprophet', 'Best Hyper-parameters': {'changepoint_prior_scale': 0.02876449933617924, 'seasonality_prior_scale': 1.80360430903146, 'holidays_prior_scale': 2.1243991057068654, 'seasonality_mode': 'additive'}}\n"
]
}
],
"metadata": {}
},
{
"cell_type": "code",
"execution_count": 78,
"source": [
"import matplotlib.pyplot as plt\r\n",
"import numpy as np\r\n",
"\r\n",
"plt.title('Learning Curve')\r\n",
"plt.xlabel('Wall Clock Time (s)')\r\n",
"plt.ylabel('Validation Accuracy')\r\n",
"plt.scatter(time_history, 1 - np.array(valid_loss_history))\r\n",
"plt.step(time_history, 1 - np.array(best_valid_loss_history), where='post')\r\n",
"plt.show()"
],
"outputs": [
{
"output_type": "display_data",
"data": {
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
}
}
],
"metadata": {}
},
{
"cell_type": "markdown",
"source": [
"## 3. Comparison with Alternatives"
],
"metadata": {}
},
{
"cell_type": "markdown",
"source": [
"FLAML's MAPE"
],
"metadata": {}
},
{
"cell_type": "code",
"execution_count": null,
"source": [
"from flaml.ml import sklearn_metric_loss_score\r\n",
"print('flaml mape', '=', sklearn_metric_loss_score('mape', flaml_y_pred, y_test))"
],
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"flaml mape = 0.0006780276756290267\n"
]
}
],
"metadata": {}
},
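  {
   "cell_type": "markdown",
   "source": [
    "As a sanity check, MAPE can also be computed directly from its definition, the mean of |y_true - y_pred| / |y_true|. The cell below is a minimal sketch (not part of FLAML's API) that assumes `flaml_y_pred` and `y_test` from the cells above are still in scope; its result should match `sklearn_metric_loss_score`."
   ],
   "metadata": {}
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "source": [
    "import numpy as np\r\n",
    "\r\n",
    "# manual MAPE of the FLAML forecast (sanity check against sklearn_metric_loss_score)\r\n",
    "y_true = np.asarray(y_test, dtype=float).ravel()\r\n",
    "y_hat = np.asarray(flaml_y_pred, dtype=float).ravel()\r\n",
    "manual_mape = np.mean(np.abs(y_true - y_hat) / np.abs(y_true))\r\n",
    "print('manual mape =', manual_mape)"
   ],
   "outputs": [],
   "metadata": {}
  },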
{
"cell_type": "markdown",
"source": [
"Default Prophet"
],
"metadata": {}
},
{
"cell_type": "code",
"execution_count": null,
"source": [
"from prophet import Prophet\r\n",
"prophet_model = Prophet()"
],
"outputs": [],
"metadata": {}
},
{
"cell_type": "code",
"execution_count": null,
"source": [
"X_train_prophet = X_train.copy()\r\n",
"X_train_prophet = X_train_prophet.rename(columns={'index': 'ds', 'co2': 'y'})\r\n",
"prophet_model.fit(X_train_prophet)"
],
"outputs": [
{
"output_type": "stream",
"name": "stderr",
"text": [
"INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.\n",
"INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n"
]
},
{
"output_type": "execute_result",
"data": {
"text/plain": [
"<prophet.forecaster.Prophet at 0x2c84853f9d0>"
]
},
"metadata": {},
"execution_count": 38
}
],
"metadata": {}
},
{
"cell_type": "code",
"execution_count": null,
"source": [
"X_test_prophet = X_test.copy()\r\n",
"X_test_prophet = X_test_prophet.rename(columns={'index': 'ds'})\r\n",
"prophet_y_pred = prophet_model.predict(X_test_prophet)['yhat']\r\n",
"print('Predicted labels', prophet_y_pred)\r\n",
"print('True labels', y_test)\r\n",
"from flaml.ml import sklearn_metric_loss_score\r\n",
"print('default prophet mape', '=', sklearn_metric_loss_score('mape', prophet_y_pred, y_test))"
],
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"Predicted labels 0 370.450675\n",
"1 371.177764\n",
"2 372.229577\n",
"3 373.419835\n",
"4 373.914917\n",
"5 373.406484\n",
"6 372.053428\n",
"7 370.149037\n",
"8 368.566631\n",
"9 368.646853\n",
"10 369.863891\n",
"11 371.135959\n",
"Name: yhat, dtype: float64\n",
"True labels co2\n",
"514 370.175\n",
"515 371.325\n",
"516 372.060\n",
"517 372.775\n",
"518 373.800\n",
"519 373.060\n",
"520 371.300\n",
"521 369.425\n",
"522 367.880\n",
"523 368.050\n",
"524 369.375\n",
"525 371.020\n",
"default prophet mape = 0.0011396920680673015\n"
]
}
],
"metadata": {}
},
{
"cell_type": "markdown",
"source": [
"Auto Arima"
],
"metadata": {}
},
{
"cell_type": "code",
"execution_count": null,
"source": [
"# !pip install pmdarima"
],
"outputs": [],
"metadata": {}
},
{
"cell_type": "code",
"execution_count": null,
"source": [
"from pmdarima.arima import auto_arima\r\n",
"import pandas as pd\r\n",
"import time"
],
"outputs": [],
"metadata": {}
},
{
"cell_type": "code",
"execution_count": null,
"source": [
"X_train_arima = X_train.copy()\r\n",
"X_train_arima.index = pd.to_datetime(X_train_arima['index'])\r\n",
"X_train_arima = X_train_arima.drop('index', axis=1)\r\n",
"X_train_arima = X_train_arima.rename(columns={'co2': 'y'})\r\n",
"# use same search space as FLAML\r\n",
"arima_model = auto_arima(X_train_arima, \r\n",
" start_p=2, d=None, start_q=2, max_p=10, max_d=2, max_q=10, \r\n",
" suppress_warnings=True, stepwise=False, seasonal=False, \r\n",
" error_action='ignore', trace=True, n_fits=500)"
],
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
" ARIMA(0,1,0)(0,0,0)[0] intercept : AIC=1638.009, Time=0.02 sec\n",
" ARIMA(0,1,1)(0,0,0)[0] intercept : AIC=1344.207, Time=0.10 sec\n",
" ARIMA(0,1,2)(0,0,0)[0] intercept : AIC=1222.286, Time=0.17 sec\n",
" ARIMA(0,1,3)(0,0,0)[0] intercept : AIC=1174.928, Time=0.22 sec\n",
" ARIMA(0,1,4)(0,0,0)[0] intercept : AIC=1188.947, Time=0.37 sec\n",
" ARIMA(0,1,5)(0,0,0)[0] intercept : AIC=1091.452, Time=0.46 sec\n",
" ARIMA(1,1,0)(0,0,0)[0] intercept : AIC=1298.693, Time=0.07 sec\n",
" ARIMA(1,1,1)(0,0,0)[0] intercept : AIC=1240.963, Time=0.15 sec\n",
" ARIMA(1,1,2)(0,0,0)[0] intercept : AIC=1196.535, Time=0.21 sec\n",
" ARIMA(1,1,3)(0,0,0)[0] intercept : AIC=1176.484, Time=0.31 sec\n",
" ARIMA(1,1,4)(0,0,0)[0] intercept : AIC=inf, Time=1.11 sec\n",
" ARIMA(2,1,0)(0,0,0)[0] intercept : AIC=1180.404, Time=0.11 sec\n",
" ARIMA(2,1,1)(0,0,0)[0] intercept : AIC=990.719, Time=0.32 sec\n",
" ARIMA(2,1,2)(0,0,0)[0] intercept : AIC=988.094, Time=0.48 sec\n",
" ARIMA(2,1,3)(0,0,0)[0] intercept : AIC=1140.469, Time=0.54 sec\n",
" ARIMA(3,1,0)(0,0,0)[0] intercept : AIC=1126.139, Time=0.29 sec\n",
" ARIMA(3,1,1)(0,0,0)[0] intercept : AIC=989.496, Time=0.71 sec\n",
" ARIMA(3,1,2)(0,0,0)[0] intercept : AIC=991.599, Time=0.89 sec\n",
" ARIMA(4,1,0)(0,0,0)[0] intercept : AIC=1125.025, Time=0.20 sec\n",
" ARIMA(4,1,1)(0,0,0)[0] intercept : AIC=988.660, Time=0.72 sec\n",
" ARIMA(5,1,0)(0,0,0)[0] intercept : AIC=1113.673, Time=0.20 sec\n",
"\n",
"Best model: ARIMA(2,1,2)(0,0,0)[0] intercept\n",
"Total fit time: 7.677 seconds\n"
]
}
],
"metadata": {}
},
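  {
   "cell_type": "markdown",
   "source": [
    "Assuming pmdarima's standard fitted-model API, the object returned by `auto_arima` exposes the selected order and a statsmodels-style summary, which is a quick way to confirm what the search picked:"
   ],
   "metadata": {}
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "source": [
    "# inspect the model chosen by auto_arima\r\n",
    "print('selected order:', arima_model.order)\r\n",
    "print(arima_model.summary())"
   ],
   "outputs": [],
   "metadata": {}
  },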
{
"cell_type": "code",
"execution_count": null,
"source": [
"autoarima_y_pred = arima_model.predict(n_periods = 12)\r\n",
"print('Predicted labels', y_pred)\r\n",
"print('True labels', y_test)\r\n",
"from flaml.ml import sklearn_metric_loss_score\r\n",
"print('auto arima', '=', sklearn_metric_loss_score('mape', autoarima_y_pred, y_test))"
],
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"Predicted labels [370.543233 371.28354891 372.2267332 373.49227877 373.88691133\n",
" 373.34103694 371.86609201 369.82045256 368.08845427 368.31840709\n",
" 369.67730838 371.05530796]\n",
"True labels co2\n",
"514 370.175\n",
"515 371.325\n",
"516 372.060\n",
"517 372.775\n",
"518 373.800\n",
"519 373.060\n",
"520 371.300\n",
"521 369.425\n",
"522 367.880\n",
"523 368.050\n",
"524 369.375\n",
"525 371.020\n",
"auto arima = 0.003201746906460404\n"
]
}
],
"metadata": {}
},
{
"cell_type": "markdown",
"source": [
"Auto SARIMA"
],
"metadata": {}
},
{
"cell_type": "code",
"execution_count": null,
"source": [
"# !pip install pmdarima"
],
"outputs": [],
"metadata": {}
},
{
"cell_type": "code",
"execution_count": null,
"source": [
"from pmdarima.arima import auto_arima\r\n",
"import pandas as pd\r\n",
"import time"
],
"outputs": [],
"metadata": {}
},
{
"cell_type": "code",
"execution_count": null,
"source": [
"X_train_arima = X_train.copy()\r\n",
"X_train_arima.index = pd.to_datetime(X_train_arima['index'])\r\n",
"X_train_arima = X_train_arima.drop('index', axis=1)\r\n",
"X_train_arima = X_train_arima.rename(columns={'co2': 'y'})\r\n",
"# use same search space as FLAML\r\n",
"arima_model = auto_arima(X_train_arima, \r\n",
" start_p=2, d=None, start_q=2, max_p=10, max_d=2, max_q=10,\r\n",
" start_P=1, D=None, start_Q=1, max_P=10, max_D=2, max_Q=10, m=12,\r\n",
" suppress_warnings=True, stepwise=False, seasonal=True, \r\n",
" error_action='ignore', trace=True, n_fits=50)"
],
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
" ARIMA(0,1,0)(0,0,0)[12] intercept : AIC=1638.009, Time=0.03 sec\n",
" ARIMA(0,1,0)(0,0,1)[12] intercept : AIC=1238.943, Time=0.24 sec\n",
" ARIMA(0,1,0)(0,0,2)[12] intercept : AIC=1040.890, Time=0.41 sec\n",
" ARIMA(0,1,0)(0,0,3)[12] intercept : AIC=911.545, Time=1.12 sec\n",
" ARIMA(0,1,0)(0,0,4)[12] intercept : AIC=823.103, Time=2.17 sec\n",
" ARIMA(0,1,0)(0,0,5)[12] intercept : AIC=792.850, Time=3.83 sec\n",
" ARIMA(0,1,0)(1,0,0)[12] intercept : AIC=inf, Time=0.29 sec\n",
" ARIMA(0,1,0)(1,0,1)[12] intercept : AIC=inf, Time=1.09 sec\n",
" ARIMA(0,1,0)(1,0,2)[12] intercept : AIC=inf, Time=1.95 sec\n",
" ARIMA(0,1,0)(1,0,3)[12] intercept : AIC=442.078, Time=4.98 sec\n",
" ARIMA(0,1,0)(1,0,4)[12] intercept : AIC=inf, Time=7.63 sec\n",
" ARIMA(0,1,0)(2,0,0)[12] intercept : AIC=inf, Time=0.82 sec\n",
" ARIMA(0,1,0)(2,0,1)[12] intercept : AIC=inf, Time=1.83 sec\n",
" ARIMA(0,1,0)(2,0,2)[12] intercept : AIC=inf, Time=2.20 sec\n",
" ARIMA(0,1,0)(2,0,3)[12] intercept : AIC=427.410, Time=6.58 sec\n",
" ARIMA(0,1,0)(3,0,0)[12] intercept : AIC=inf, Time=2.77 sec\n",
" ARIMA(0,1,0)(3,0,1)[12] intercept : AIC=438.942, Time=3.45 sec\n",
" ARIMA(0,1,0)(3,0,2)[12] intercept : AIC=431.438, Time=5.52 sec\n",
" ARIMA(0,1,0)(4,0,0)[12] intercept : AIC=inf, Time=5.85 sec\n",
" ARIMA(0,1,0)(4,0,1)[12] intercept : AIC=430.317, Time=8.43 sec\n",
" ARIMA(0,1,0)(5,0,0)[12] intercept : AIC=inf, Time=12.20 sec\n",
" ARIMA(0,1,1)(0,0,0)[12] intercept : AIC=1344.207, Time=0.10 sec\n",
" ARIMA(0,1,1)(0,0,1)[12] intercept : AIC=1112.274, Time=0.28 sec\n",
" ARIMA(0,1,1)(0,0,2)[12] intercept : AIC=993.565, Time=0.56 sec\n",
" ARIMA(0,1,1)(0,0,3)[12] intercept : AIC=891.683, Time=1.83 sec\n",
" ARIMA(0,1,1)(0,0,4)[12] intercept : AIC=820.025, Time=3.45 sec\n",
" ARIMA(0,1,1)(1,0,0)[12] intercept : AIC=612.811, Time=0.48 sec\n",
" ARIMA(0,1,1)(1,0,1)[12] intercept : AIC=392.523, Time=1.06 sec\n",
" ARIMA(0,1,1)(1,0,2)[12] intercept : AIC=424.761, Time=2.34 sec\n",
" ARIMA(0,1,1)(1,0,3)[12] intercept : AIC=423.152, Time=5.78 sec\n",
" ARIMA(0,1,1)(2,0,0)[12] intercept : AIC=510.637, Time=1.20 sec\n",
" ARIMA(0,1,1)(2,0,1)[12] intercept : AIC=412.849, Time=2.22 sec\n",
" ARIMA(0,1,1)(2,0,2)[12] intercept : AIC=396.908, Time=2.52 sec\n",
" ARIMA(0,1,1)(3,0,0)[12] intercept : AIC=467.985, Time=3.81 sec\n",
" ARIMA(0,1,1)(3,0,1)[12] intercept : AIC=405.933, Time=6.55 sec\n",
" ARIMA(0,1,1)(4,0,0)[12] intercept : AIC=448.948, Time=5.47 sec\n",
" ARIMA(0,1,2)(0,0,0)[12] intercept : AIC=1222.286, Time=0.16 sec\n",
" ARIMA(0,1,2)(0,0,1)[12] intercept : AIC=1046.922, Time=0.27 sec\n",
" ARIMA(0,1,2)(0,0,2)[12] intercept : AIC=947.532, Time=0.69 sec\n",
" ARIMA(0,1,2)(0,0,3)[12] intercept : AIC=867.310, Time=1.82 sec\n",
" ARIMA(0,1,2)(1,0,0)[12] intercept : AIC=608.450, Time=0.55 sec\n",
" ARIMA(0,1,2)(1,0,1)[12] intercept : AIC=402.050, Time=1.26 sec\n",
" ARIMA(0,1,2)(1,0,2)[12] intercept : AIC=422.338, Time=2.51 sec\n",
" ARIMA(0,1,2)(2,0,0)[12] intercept : AIC=507.685, Time=1.49 sec\n",
" ARIMA(0,1,2)(2,0,1)[12] intercept : AIC=408.472, Time=3.02 sec\n",
" ARIMA(0,1,2)(3,0,0)[12] intercept : AIC=460.596, Time=5.35 sec\n",
" ARIMA(0,1,3)(0,0,0)[12] intercept : AIC=1174.928, Time=0.18 sec\n",
" ARIMA(0,1,3)(0,0,1)[12] intercept : AIC=1037.324, Time=0.42 sec\n",
" ARIMA(0,1,3)(0,0,2)[12] intercept : AIC=947.471, Time=1.02 sec\n",
" ARIMA(0,1,3)(1,0,0)[12] intercept : AIC=602.141, Time=0.73 sec\n",
" ARIMA(0,1,3)(1,0,1)[12] intercept : AIC=399.087, Time=1.92 sec\n",
" ARIMA(0,1,3)(2,0,0)[12] intercept : AIC=500.296, Time=1.92 sec\n",
" ARIMA(0,1,4)(0,0,0)[12] intercept : AIC=1188.947, Time=0.36 sec\n",
" ARIMA(0,1,4)(0,0,1)[12] intercept : AIC=999.240, Time=0.71 sec\n",
" ARIMA(0,1,4)(1,0,0)[12] intercept : AIC=604.133, Time=0.88 sec\n",
" ARIMA(0,1,5)(0,0,0)[12] intercept : AIC=1091.452, Time=0.50 sec\n",
" ARIMA(1,1,0)(0,0,0)[12] intercept : AIC=1298.693, Time=0.08 sec\n",
" ARIMA(1,1,0)(0,0,1)[12] intercept : AIC=1075.553, Time=0.26 sec\n",
" ARIMA(1,1,0)(0,0,2)[12] intercept : AIC=971.074, Time=0.58 sec\n",
" ARIMA(1,1,0)(0,0,3)[12] intercept : AIC=882.846, Time=2.20 sec\n",
" ARIMA(1,1,0)(0,0,4)[12] intercept : AIC=818.711, Time=3.59 sec\n",
" ARIMA(1,1,0)(1,0,0)[12] intercept : AIC=inf, Time=0.50 sec\n",
" ARIMA(1,1,0)(1,0,1)[12] intercept : AIC=400.766, Time=1.07 sec\n",
" ARIMA(1,1,0)(1,0,2)[12] intercept : AIC=423.718, Time=2.76 sec\n",
" ARIMA(1,1,0)(1,0,3)[12] intercept : AIC=428.842, Time=5.94 sec\n",
" ARIMA(1,1,0)(2,0,0)[12] intercept : AIC=inf, Time=1.41 sec\n",
" ARIMA(1,1,0)(2,0,1)[12] intercept : AIC=416.666, Time=2.17 sec\n",
" ARIMA(1,1,0)(2,0,2)[12] intercept : AIC=409.065, Time=2.83 sec\n",
" ARIMA(1,1,0)(3,0,0)[12] intercept : AIC=inf, Time=3.85 sec\n",
" ARIMA(1,1,0)(3,0,1)[12] intercept : AIC=403.682, Time=6.69 sec\n",
" ARIMA(1,1,0)(4,0,0)[12] intercept : AIC=inf, Time=7.60 sec\n",
" ARIMA(1,1,1)(0,0,0)[12] intercept : AIC=1240.963, Time=0.14 sec\n",
" ARIMA(1,1,1)(0,0,1)[12] intercept : AIC=1069.162, Time=0.37 sec\n",
" ARIMA(1,1,1)(0,0,2)[12] intercept : AIC=973.065, Time=0.94 sec\n",
" ARIMA(1,1,1)(0,0,3)[12] intercept : AIC=884.323, Time=3.38 sec\n",
" ARIMA(1,1,1)(1,0,0)[12] intercept : AIC=588.156, Time=1.15 sec\n",
" ARIMA(1,1,1)(1,0,1)[12] intercept : AIC=399.033, Time=1.21 sec\n",
" ARIMA(1,1,1)(1,0,2)[12] intercept : AIC=409.596, Time=3.22 sec\n",
" ARIMA(1,1,1)(2,0,0)[12] intercept : AIC=503.551, Time=1.60 sec\n",
" ARIMA(1,1,1)(2,0,1)[12] intercept : AIC=402.095, Time=2.40 sec\n",
" ARIMA(1,1,1)(3,0,0)[12] intercept : AIC=457.277, Time=5.79 sec\n",
" ARIMA(1,1,2)(0,0,0)[12] intercept : AIC=1196.535, Time=0.25 sec\n",
" ARIMA(1,1,2)(0,0,1)[12] intercept : AIC=1042.432, Time=0.35 sec\n",
" ARIMA(1,1,2)(0,0,2)[12] intercept : AIC=948.444, Time=0.95 sec\n",
" ARIMA(1,1,2)(1,0,0)[12] intercept : AIC=583.862, Time=1.14 sec\n",
" ARIMA(1,1,2)(1,0,1)[12] intercept : AIC=403.010, Time=1.36 sec\n",
" ARIMA(1,1,2)(2,0,0)[12] intercept : AIC=502.719, Time=2.87 sec\n",
" ARIMA(1,1,3)(0,0,0)[12] intercept : AIC=1176.484, Time=0.30 sec\n",
" ARIMA(1,1,3)(0,0,1)[12] intercept : AIC=1039.309, Time=0.65 sec\n",
" ARIMA(1,1,3)(1,0,0)[12] intercept : AIC=604.131, Time=0.79 sec\n",
" ARIMA(1,1,4)(0,0,0)[12] intercept : AIC=inf, Time=1.05 sec\n",
" ARIMA(2,1,0)(0,0,0)[12] intercept : AIC=1180.404, Time=0.10 sec\n",
" ARIMA(2,1,0)(0,0,1)[12] intercept : AIC=1058.115, Time=0.25 sec\n",
" ARIMA(2,1,0)(0,0,2)[12] intercept : AIC=973.051, Time=0.74 sec\n",
" ARIMA(2,1,0)(0,0,3)[12] intercept : AIC=883.377, Time=1.90 sec\n",
" ARIMA(2,1,0)(1,0,0)[12] intercept : AIC=inf, Time=0.43 sec\n",
" ARIMA(2,1,0)(1,0,1)[12] intercept : AIC=416.799, Time=1.05 sec\n",
" ARIMA(2,1,0)(1,0,2)[12] intercept : AIC=400.863, Time=2.36 sec\n",
" ARIMA(2,1,0)(2,0,0)[12] intercept : AIC=inf, Time=1.49 sec\n",
" ARIMA(2,1,0)(2,0,1)[12] intercept : AIC=400.859, Time=2.41 sec\n",
" ARIMA(2,1,0)(3,0,0)[12] intercept : AIC=inf, Time=4.49 sec\n",
" ARIMA(2,1,1)(0,0,0)[12] intercept : AIC=990.719, Time=0.26 sec\n",
" ARIMA(2,1,1)(0,0,1)[12] intercept : AIC=881.526, Time=0.73 sec\n",
" ARIMA(2,1,1)(0,0,2)[12] intercept : AIC=837.402, Time=2.20 sec\n",
" ARIMA(2,1,1)(1,0,0)[12] intercept : AIC=584.097, Time=1.56 sec\n",
" ARIMA(2,1,1)(1,0,1)[12] intercept : AIC=443.589, Time=1.83 sec\n",
" ARIMA(2,1,1)(2,0,0)[12] intercept : AIC=494.535, Time=2.78 sec\n",
" ARIMA(2,1,2)(0,0,0)[12] intercept : AIC=988.094, Time=0.48 sec\n",
" ARIMA(2,1,2)(0,0,1)[12] intercept : AIC=757.307, Time=1.35 sec\n",
" ARIMA(2,1,2)(1,0,0)[12] intercept : AIC=594.527, Time=2.10 sec\n",
" ARIMA(2,1,3)(0,0,0)[12] intercept : AIC=1140.469, Time=0.53 sec\n",
" ARIMA(3,1,0)(0,0,0)[12] intercept : AIC=1126.139, Time=0.30 sec\n",
" ARIMA(3,1,0)(0,0,1)[12] intercept : AIC=996.923, Time=0.36 sec\n",
" ARIMA(3,1,0)(0,0,2)[12] intercept : AIC=918.438, Time=0.87 sec\n",
" ARIMA(3,1,0)(1,0,0)[12] intercept : AIC=inf, Time=0.62 sec\n",
" ARIMA(3,1,0)(1,0,1)[12] intercept : AIC=406.333, Time=1.24 sec\n",
" ARIMA(3,1,0)(2,0,0)[12] intercept : AIC=inf, Time=2.16 sec\n",
" ARIMA(3,1,1)(0,0,0)[12] intercept : AIC=989.496, Time=0.80 sec\n",
" ARIMA(3,1,1)(0,0,1)[12] intercept : AIC=856.486, Time=1.45 sec\n",
" ARIMA(3,1,1)(1,0,0)[12] intercept : AIC=604.951, Time=0.76 sec\n",
" ARIMA(3,1,2)(0,0,0)[12] intercept : AIC=991.599, Time=0.86 sec\n",
" ARIMA(4,1,0)(0,0,0)[12] intercept : AIC=1125.025, Time=0.20 sec\n",
" ARIMA(4,1,0)(0,0,1)[12] intercept : AIC=987.621, Time=0.41 sec\n",
" ARIMA(4,1,0)(1,0,0)[12] intercept : AIC=inf, Time=0.83 sec\n",
" ARIMA(4,1,1)(0,0,0)[12] intercept : AIC=988.660, Time=0.74 sec\n",
" ARIMA(5,1,0)(0,0,0)[12] intercept : AIC=1113.673, Time=0.23 sec\n",
"\n",
"Best model: ARIMA(0,1,1)(1,0,1)[12] intercept\n",
"Total fit time: 249.429 seconds\n"
]
}
],
"metadata": {}
},
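  {
   "cell_type": "markdown",
   "source": [
    "As before, the selected orders can be read off the fitted object (assuming pmdarima's standard attributes); with `seasonal=True` there is both a non-seasonal and a seasonal order:"
   ],
   "metadata": {}
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "source": [
    "# inspect the seasonal model chosen by auto_arima\r\n",
    "print('selected order:', arima_model.order)\r\n",
    "print('selected seasonal order:', arima_model.seasonal_order)"
   ],
   "outputs": [],
   "metadata": {}
  },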
{
"cell_type": "code",
"execution_count": null,
"source": [
"autosarima_y_pred = arima_model.predict(n_periods = 12)\r\n",
"print('Predicted labels', autosarima_y_pred)\r\n",
"print('True labels', y_test)\r\n",
"from flaml.ml import sklearn_metric_loss_score\r\n",
"print('auto sarima', '=', sklearn_metric_loss_score('mape', autosarima_y_pred, y_test))"
],
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"Predicted labels [370.543233 371.28354891 372.2267332 373.49227877 373.88691133\n",
" 373.34103694 371.86609201 369.82045256 368.08845427 368.31840709\n",
" 369.67730838 371.05530796]\n",
"True labels co2\n",
"514 370.175\n",
"515 371.325\n",
"516 372.060\n",
"517 372.775\n",
"518 373.800\n",
"519 373.060\n",
"520 371.300\n",
"521 369.425\n",
"522 367.880\n",
"523 368.050\n",
"524 369.375\n",
"525 371.020\n",
"auto sarima = 0.0007724244328789994\n"
]
}
],
"metadata": {}
},
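  {
   "cell_type": "markdown",
   "source": [
    "Before plotting, the test-set MAPE of all four methods can be collected in one table. This is a small convenience sketch; it assumes `flaml_y_pred`, `prophet_y_pred`, `autoarima_y_pred`, and `autosarima_y_pred` from the cells above are still defined."
   ],
   "metadata": {}
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "source": [
    "import pandas as pd\r\n",
    "from flaml.ml import sklearn_metric_loss_score\r\n",
    "\r\n",
    "# gather the test-set MAPE of each method, lowest (best) first\r\n",
    "scores = {\r\n",
    "    'FLAML': sklearn_metric_loss_score('mape', flaml_y_pred, y_test),\r\n",
    "    'Default Prophet': sklearn_metric_loss_score('mape', prophet_y_pred, y_test),\r\n",
    "    'Auto ARIMA': sklearn_metric_loss_score('mape', autoarima_y_pred, y_test),\r\n",
    "    'Auto SARIMA': sklearn_metric_loss_score('mape', autosarima_y_pred, y_test),\r\n",
    "}\r\n",
    "print(pd.Series(scores, name='mape').sort_values())"
   ],
   "outputs": [],
   "metadata": {}
  },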
{
"cell_type": "code",
"execution_count": null,
"source": [
"# !pip install matplotlib"
],
"outputs": [],
"metadata": {}
},
{
"cell_type": "code",
"execution_count": 79,
"source": [
"import matplotlib.pyplot as plt\r\n",
"plt.plot(X_test, y_test, label='Actual level')\r\n",
"plt.plot(X_test, flaml_y_pred, label='FLAML forecast')\r\n",
"plt.plot(X_test, autoarima_y_pred, label='Auto ARIMA forecast')\r\n",
"plt.plot(X_test, autosarima_y_pred, label='Auto SARIMA forecast')\r\n",
"plt.xlabel('Date')\r\n",
"plt.ylabel('CO2 Levels')\r\n",
"plt.legend()"
],
"outputs": [
{
"output_type": "execute_result",
"data": {
"text/plain": [
"<matplotlib.legend.Legend at 0x2c8518e4580>"
]
},
"metadata": {},
"execution_count": 79
},
{
"output_type": "display_data",
"data": {
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
],
"image/svg+xml": "<?xml version=\"1.0\" encoding=\"utf-8\" standalone=\"no\"?>\r\n<!DOCTYPE svg PUBLIC \"-//W3C//DTD SVG 1.1//EN\"\r\n \"http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd\">\r\n<svg height=\"262.19625pt\" version=\"1.1\" viewBox=\"0 0 388.965625 262.19625\" width=\"388.965625pt\" xmlns=\"http://www.w3.org/2000/svg\" xmlns:xlink=\"http://www.w3.org/1999/xlink\">\r\n <metadata>\r\n <rdf:RDF xmlns:cc=\"http://creativecommons.org/ns#\" xmlns:dc=\"http://purl.org/dc/elements/1.1/\" xmlns:rdf=\"http://www.w3.org/1999/02/22-rdf-syntax-ns#\">\r\n <cc:Work>\r\n <dc:type rdf:resource=\"http://purl.org/dc/dcmitype/StillImage\"/>\r\n <dc:date>2021-08-31T21:24:33.220845</dc:date>\r\n <dc:format>image/svg+xml</dc:format>\r\n <dc:creator>\r\n <cc:Agent>\r\n <dc:title>Matplotlib v3.4.2, https://matplotlib.org/</dc:title>\r\n </cc:Agent>\r\n </dc:creator>\r\n </cc:Work>\r\n </rdf:RDF>\r\n </metadata>\r\n <defs>\r\n <style type=\"text/css\">*{stroke-linecap:butt;stroke-linejoin:round;}</style>\r\n </defs>\r\n <g id=\"figure_1\">\r\n <g id=\"patch_1\">\r\n <path d=\"M 0 262.19625 \r\nL 388.965625 262.19625 \r\nL 388.965625 0 \r\nL 0 0 \r\nz\r\n\" style=\"fill:none;\"/>\r\n </g>\r\n <g id=\"axes_1\">\r\n <g id=\"patch_2\">\r\n <path d=\"M 46.965625 224.64 \r\nL 381.765625 224.64 \r\nL 381.765625 7.2 \r\nL 46.965625 7.2 \r\nz\r\n\" style=\"fill:#ffffff;\"/>\r\n </g>\r\n <g id=\"matplotlib.axis_1\">\r\n <g id=\"xtick_1\">\r\n <g id=\"line2d_1\">\r\n <defs>\r\n <path d=\"M 0 0 \r\nL 0 3.5 \r\n\" id=\"mc5d4f135dd\" style=\"stroke:#000000;stroke-width:0.8;\"/>\r\n </defs>\r\n <g>\r\n <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"62.183807\" xlink:href=\"#mc5d4f135dd\" y=\"224.64\"/>\r\n </g>\r\n </g>\r\n <g id=\"text_1\">\r\n <!-- 2001-01 -->\r\n <g transform=\"translate(41.292401 239.238437)scale(0.1 -0.1)\">\r\n <defs>\r\n <path d=\"M 1228 531 \r\nL 3431 531 \r\nL 3431 0 \r\nL 469 0 \r\nL 469 531 \r\nQ 828 903 1448 1529 \r\nQ 2069 2156 2228 2338 \r\nQ 2531 2678 2651 2914 \r\nQ 2772 3150 2772 3378 \r\nQ 2772 3750 2511 3984 \r\nQ 2250 4219 1831 4219 \r\nQ 1534 4219 1204 4116 \r\nQ 875 4013 500 3803 \r\nL 500 4441 \r\nQ 881 4594 1212 4672 \r\nQ 1544 4750 1819 4750 \r\nQ 2544 4750 2975 4387 \r\nQ 3406 4025 3406 3419 \r\nQ 3406 3131 3298 2873 \r\nQ 3191 2616 2906 2266 \r\nQ 2828 2175 2409 1742 \r\nQ 1991 1309 1228 531 \r\nz\r\n\" id=\"DejaVuSans-32\" transform=\"scale(0.015625)\"/>\r\n <path d=\"M 2034 4250 \r\nQ 1547 4250 1301 3770 \r\nQ 1056 3291 1056 2328 \r\nQ 1056 1369 1301 889 \r\nQ 1547 409 2034 409 \r\nQ 2525 409 2770 889 \r\nQ 3016 1369 3016 2328 \r\nQ 3016 3291 2770 3770 \r\nQ 2525 4250 2034 4250 \r\nz\r\nM 2034 4750 \r\nQ 2819 4750 3233 4129 \r\nQ 3647 3509 3647 2328 \r\nQ 3647 1150 3233 529 \r\nQ 2819 -91 2034 -91 \r\nQ 1250 -91 836 529 \r\nQ 422 1150 422 2328 \r\nQ 422 3509 836 4129 \r\nQ 1250 4750 2034 4750 \r\nz\r\n\" id=\"DejaVuSans-30\" transform=\"scale(0.015625)\"/>\r\n <path d=\"M 794 531 \r\nL 1825 531 \r\nL 1825 4091 \r\nL 703 3866 \r\nL 703 4441 \r\nL 1819 4666 \r\nL 2450 4666 \r\nL 2450 531 \r\nL 3481 531 \r\nL 3481 0 \r\nL 794 0 \r\nL 794 531 \r\nz\r\n\" id=\"DejaVuSans-31\" transform=\"scale(0.015625)\"/>\r\n <path d=\"M 313 2009 \r\nL 1997 2009 \r\nL 1997 1497 \r\nL 313 1497 \r\nL 313 2009 \r\nz\r\n\" id=\"DejaVuSans-2d\" transform=\"scale(0.015625)\"/>\r\n </defs>\r\n <use xlink:href=\"#DejaVuSans-32\"/>\r\n <use x=\"63.623047\" xlink:href=\"#DejaVuSans-30\"/>\r\n <use x=\"127.246094\" xlink:href=\"#DejaVuSans-30\"/>\r\n <use x=\"190.869141\" 
xlink:href=\"#DejaVuSans-31\"/>\r\n <use x=\"254.492188\" xlink:href=\"#DejaVuSans-2d\"/>\r\n <use x=\"290.576172\" xlink:href=\"#DejaVuSans-30\"/>\r\n <use x=\"354.199219\" xlink:href=\"#DejaVuSans-31\"/>\r\n </g>\r\n </g>\r\n </g>\r\n <g id=\"xtick_2\">\r\n <g id=\"line2d_2\">\r\n <g>\r\n <use style=\"stroke:#000000;stroke-width:0.8;\" x=
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAYUAAAEGCAYAAACKB4k+AAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjQuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8rg+JYAAAACXBIWXMAAAsTAAALEwEAmpwYAAB4zklEQVR4nO3dd1gUxxvA8e/Qm0hXrIi9gNhL7N3Ye++9x8RYUn4xmtgSNTH23lGjsUQTe++994KKKALSO9z8/riTgKKiAgc4n+e5R9id3X0X8N7bnZ13hJQSRVEURQEw0HcAiqIoSsahkoKiKIqSQCUFRVEUJYFKCoqiKEoClRQURVGUBEb6DuBjODg4SBcXF32HoSiKkqmcO3fOX0rpmNy6TJ0UXFxcOHv2rL7DUBRFyVSEEA/ftE7dPlIURVESpFlSEEKYCSFOCyEuCSGuCSF+1C1fL4S4qHt5CSEuvrJdPiFEmBBiVFrFpiiKoiQvLW8fRQN1pJRhQghj4KgQ4l8pZYeXDYQQ04HgV7abCfybhnEpiqIob5BmSUFq62eE6b411r0SamoIIQTQHqiTaFlL4D4QnlZxKUpmExsbi7e3N1FRUfoORclkzMzMyJMnD8bGxineJk07moUQhsA5oBAwR0p5KtHq6oCvlPKOrq0lMAaoD6hbR4qi4+3tTbZs2XBxcUH7WUpR3k1KSUBAAN7e3hQoUCDF26VpR7OUMl5K6QHkASoKIUolWt0J8Ez0/Y/ATCllGG8hhOgvhDgrhDjr5+eX6jErSkYTFRWFvb29SgjKexFCYG9v/95XmOnySKqUMkgIcRBoBFwVQhgBrYFyiZpVAtoKIaYBNoBGCBElpZz9yr4WAgsBypcvr0q8Kp8ElRCUD/EhfzdplhSEEI5ArC4hmAP1gKm61fWAm1JK75ftpZTVE207Hgh7NSEony5NdDShu/dgnDMHFhUq6DscRcmy0vL2kTNwQAhxGTgD7JFSbtet60jSW0eKkqz4sHAClizlXr36+Hz9NQ+7dcdnzFjiAgP1HdonZ/PmzQghuHnz5jvb/vbbb0RERHzwsZYvX87QoUNTvPxjpMU+M7M0SwpSystSyjJSSncpZSkp5YRE63pKKee/ZdvxUspf0yo2JeOLCwzE74/Z3K1bl+e//IJJoYLkXbwY+4EDCN6xg/ufNyH4779Rk0SlH09PT6pVq8a6deve2fZjk4KiP2pEs5KhxPr64jtlKnfr1sN/zhwsKpTHZcN68i9bxkaRizN1OlBg0yaM8+XF5+vRPO7XnxjvJ/oOO8sLCwvj2LFjLFmyJElSiI+PZ9SoUbi5ueHu7s4ff/zBrFmz8PHxoXbt2tSuXRsAKyurhG02btxIz549Afj777+pVKkSZcqUoV69evj6+qY4Jj8/P9q0aUOFChWoUKECx44dQ6PR4OLiQlBQUEK7QoUK4evrm2x75XWZuvaRknXEPHxIwOIlBG/ZgtRoyN60CfZ9+2JauDAAnqcfMf7v6wAMr1OIL9asIWjdevxmzOB+s2Y4DhuGXfduCKOs/Sf949/XuO4Tkqr7LJHLmh+alXxrmy1bttCoUSOKFCmCnZ0d58+fp2zZsixcuJAHDx5w4cIFjIyMePHiBXZ2dsyYMYMDBw7g4ODw1v1Wq1aNkydPIoRg8eLFTJs2jenTp6co7hEjRjBy5EiqVavGo0ePaNiwITdu3KBFixZs3ryZXr16cerUKVxcXMiRIwedO3dOtr2SVNb+H6RkeFG3bhGwYCEhO3cijIywadcWu969McmTJ6HNWa8X/G/rVWoUcSSntSmz9t/l0YsIpnbsSLa6dXg2YSLPp00jZMcOnCdOwKxECT2eUdbk6enJF198AUDHjh3x9PSkbNmy7N27l4EDB2KkS8Z2dnbvtV9vb286dOjA06dPiYmJea/n6ffu3cv169cTvg8JCSE0NJQOHTowYcIEevXqxbp16+jQocNb2ytJqaSg6EXE+QsELFhA2KFDGFhYYN+7F3Y9emDkmLSa79PgSAauPk9uG3P+6FgGa3Mj8ttb8suuW/gER7GwWznyzJ1D6K5dPPvpZx60a49djx44DhuKgbm5ns4u7bzrE31aCAgIYP/+/Vy9ehUhBPHx8QghmDZtGlLKFD32mLhN4ufmhw0bxpdffknz5s05ePAg48ePT3FcGo2GEydOYP7K77lKlSrcvXsXPz8/tmzZwnfffffW9kpSqk9BSTdSSsKOHOVh12487NyZyEuXcBwxnEIH9uM0atRrCSEqNp4Bq84RFRvPou7lyW5hjBCCIbUL8XtHDy4+CqL13OM8DIjAulEjCu7Yjk3rVrxYupT7zZoTpu4Zp4qNGzfSvXt3Hj58iJeXF48fP6ZAgQIcPXqUBg0aMH/+fOLi4gB48eIFANmyZUvyKTxHjhzcuHEDjUbD5s2bE5YHBweTO3duAFasWPFecTVo0IDZs/97av3ixYuANgG1atWKL7/8kuLFi2Nvb//W9kpSKikoaU7GxxOycxdebdryuF8/Yh4/Jsc34yi0fx8OgwZhmD3769tIybi/rnDZO5iZHTwonCMbhD6DKO399BYeuVnTrxKBETG0nneccw9fYJg9O84TJ5Jv5QqEkRGP+/TFZ8wY9fjqR/L09KRVq1ZJlrVp04a1a9fSt29f8uXLh7u7O6VLl2bt2rUA9O/fn8aNGyd0NE+ZMoWmTZtSp04dnJ2dE/Yzfvx42rVrR/Xq1d/Z//CqWbNmcfbsWdzd3SlRogTz5//3QGOHDh1YvXp1wq2jd7VX/iMy8yN95cuXl2qSnYxLxsQQ/Pd2AhYvJubBA0zy58e+fz+yN2uGMDF567aLj9znpx03+LJ+EYZXzw1HZ8Kx38AsOzT4GdzbgxA88A+n9/IzPAmKZHq70jQrnQvQDnbznz+fgEWLMcyWjRzjxmLdrFmmHBl848YNihcvru8wlEwqub8fIcQ5KWX55NqrKwUl1WkiI3mxchV3Gzbi6bffIszMyD1zBq7/7MCmTZt3JoTDt/2Y9M8NGpfKydBct2FuJTg8DYo1BZt8sLk/rGgG/nco4GDJX4OqUjpPdoZ5XmDOgbtIKTEwNcVpxAgK/LUJk3z58Bk9hsd9+xHj7f3WYyvKp04lBSXVxIeE4D9/Pnfr1MV30iSMc+ci78IFFPhrE9aNGyMMDd+5Dy//cIZ5XqCmQxh/yCkYrO8MxhbQYzu0WwZ99kCTGfD0MsyrCgcmYWuiYVWfSjQvnYtfdt1i7KYrxMZrADArUoT8a9eQ4/vviLx4kftNmxGwZClSdw9cUZSk1NNHykeL8/fnxYoVBK71RBMejmWN6jj0749F+WSvTt8oLDqOISuOMUhuYEDENsRjY2jwE1QaCIa6evAGhlChj/aqYfd3cGgqXN6AWZPp/N6xDvntLfhj/12eBEUyt2tZrM2MEYaG2HXpQra6dbWPr/7yC8E7tuM8cSLmJdP/aR5FychUn4LywTQxMTz/5VeCNmxAxsSQrVFDHPr1+6BxAhqNZM6C2bR4Not84jmUaqNNCNa53r7h/YOw/
Ut4cU+7TcNJbLgVyzd/XcHV0ZKlPSuQx9YiobmUktBdu3n280/EB7z47/FVC4s3H0PPVJ+C8jFUn4KSbvxnzSJw1SqsmzbB9Z8d5Jk588MGjgV68eCPZgzz/Q5rS0vovg3aLn13QgBwrQWDjkOtcXDjb5hdgfaanazsWY6nwVG0mnucy95BCc2FEFg3akjBHTuwaduWF8uWcb95C8KOqsdXFQVUUlA+UMT5CwQsXYZNu3bk+vlnTN9jJGqC2Cg4OJX4PyqS88Vp/nYaRPYvT4Frzffbj7EZ1BoLg09C7rLwzyiqHuzIjrbZMDUyoP2CE+y+9izJJobW1jhP+JH8q1YijI153LcvT0aPJk73nL2ifKpUUlDemyYiAp9xYzF2dsZpzJgP28mdPTC3MhycxO74sgyzX0T9fj8jjEw/PDD7gtBtC7RZAsHe5NvUhN3F/8XDyYgBq8+x5OiD16qqWlSoQIEtm3EYPIiQf3dy//MmBG3ZoqqvvsLQ0BAPD4+El5eXFwcPHqRp06Zv3KZ06dJ
},
"metadata": {
"needs_background": "light"
}
}
],
"metadata": {}
}
],
"metadata": {
"kernelspec": {
"name": "python3",
"display_name": "Python 3.8.10 64-bit ('python38': conda)"
},
"language_info": {
"name": "python",
"version": "3.8.10",
"mimetype": "text/x-python",
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"pygments_lexer": "ipython3",
"nbconvert_exporter": "python",
"file_extension": ".py"
},
"interpreter": {
"hash": "8b6c8c3ba4bafbc4530f534c605c8412f25bf61ef13254e4f377ccd42b838aa4"
}
},
"nbformat": 4,
"nbformat_minor": 2
}