---
license: cc
---

## About

This repository provides the weights needed to run load forecasting models trained on ComStock datasets. The companion dataset repository is available [here](https://huggingface.co/datasets/APPFL/Illinois_load_datasets). The model definitions live in the `models` directory, the corresponding trained weights in the `weights` directory, and the model keyword arguments (as a function of the chosen `lookback` and `lookahead`) can be imported from `model_kwargs.py`.

Note that `lookback` is denoted by `L` and `lookahead` by `T` in the `weights` directory. We provide weights for the following `(L,T)` pairs: `(512,4)`, `(512,48)`, and `(512,96)`, for both `HOM`ogeneous and `HET`erogeneous datasets.
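
For quick reference, the available configurations can be enumerated as plain Python values (the `HOM`/`HET` tags simply mirror the weight directory naming; nothing below depends on the repository's code):

```python
# Provided (lookback L, lookahead T) pairs and dataset types, as listed above.
LOOKBACK_LOOKAHEAD_PAIRS = [(512, 4), (512, 48), (512, 96)]
DATASET_TYPES = ["HOM", "HET"]  # homogeneous / heterogeneous

for L, T in LOOKBACK_LOOKAHEAD_PAIRS:
    for dataset_type in DATASET_TYPES:
        print(f"{dataset_type}: lookback L={L}, lookahead T={T}")
```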

## Packages

Executing the code requires only the `numpy` and `torch` (PyTorch) packages. They can be installed in your base Python environment or in a `conda` environment.
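
A quick sanity check that both packages are importable in the active environment:

```python
# Verify that the two required packages are available.
import numpy
import torch

print("numpy:", numpy.__version__)
print("torch:", torch.__version__)
```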

## Example

See `example.py` for how to instantiate the model definitions and load the trained weights into them.
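
As an illustration only, a typical PyTorch loading flow might look like the sketch below; the import path, the kwargs helper, and the checkpoint filename are assumptions, and `example.py` remains the authoritative reference.

```python
# Minimal loading sketch. The import path, kwargs helper, and checkpoint
# filename are illustrative assumptions; example.py shows the actual flow.
import torch

from models.lstm import LSTM               # assumed import path
from model_kwargs import get_model_kwargs  # assumed helper name

lookback, lookahead = 512, 48
model = LSTM(**get_model_kwargs("LSTM", lookback=lookback, lookahead=lookahead))

state_dict = torch.load("weights/HOM_L512_T48/LSTM.pt", map_location="cpu")  # hypothetical path
model.load_state_dict(state_dict)
model.eval()
```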

## Technical Details for Running the Models

The input layouts of the models are as follows (a shape sketch follows the list):

- The `forward()` functions of `LSTM`, `LSTNet`, and `PatchTST` take in two arguments: `forward(input, future_time_idx)`. They are laid out as follows:

  - `input` is a tensor of shape `(B,L,num_features)` where `B` is the batch size, `L` is the lookback duration, and `num_features` is 8 for our current application.
  - `future_time_idx` is a tensor of shape `(B,T,2)` where `T` is the lookahead and 2 is the number of time index features.
  - The time indices in `input` as well as in `future_time_idx` are normalized.
  - The custom `torch.utils.data.Dataset` class for the train, val, and test sets can be generated by executing the `get_data_and_generate_train_val_test_sets` function in the `custom_dataset.py` file in the [companion dataset](https://huggingface.co/datasets/APPFL/Illinois_load_datasets).
  - Non-time features are normalized. The mean and standard deviation of the [companion dataset](https://huggingface.co/datasets/APPFL/Illinois_load_datasets) can be inferred by executing `example_dataset.py` there and looking at `Case 1` and `Case 4`.
  - The output shape is `(B,1)` denoting the pointwise forecast `T` steps into the future.
- The `forward()` functions of `Transformer`, `Autoformer`, `Informer`, and `TimesNet` take in two arguments: `forward(input, future_time_idx)`. They are laid out as follows:

  - `input` is a tensor of shape `(B,L,num_features)` where `B` is the batch size, `L` is the lookback duration, and `num_features` is 8 for our current application.
  - `future_time_idx` is a tensor of shape `(B,T,2)` where `T` is the lookahead and 2 is the number of time index features.
  - The time indices in `input` as well as in `future_time_idx` are left un-normalized to allow for embedding.
  - The custom `torch.utils.data.Dataset` class for the train, val, and test sets can be generated by executing the `get_data_and_generate_train_val_test_sets` function in the `custom_dataset.py` file in the [companion dataset](https://huggingface.co/datasets/APPFL/Illinois_load_datasets).
  - Non-time features are normalized. The mean and standard deviation of the [companion dataset](https://huggingface.co/datasets/APPFL/Illinois_load_datasets) can be inferred by executing `example_dataset.py` there and looking at `Case 2` and `Case 5`.
  - The output shape is `(B,1)` denoting the pointwise forecast `T` steps into the future.
- The `forward()` function of `TimesFM` takes in one argument: `forward(input)`. It is laid out as follows:

  - `input` is a tensor of shape `(B,L)` where `B` is the batch size and `L` is the lookback duration. Since it is univariate, there is only one feature.
  - The sole feature is normalized. The mean and standard deviation of the [companion dataset](https://huggingface.co/datasets/APPFL/Illinois_load_datasets) can be inferred by executing `example_dataset.py` there and looking at `Case 3` and `Case 6`.
  - The custom `torch.utils.data.Dataset` class for the train, val, and test sets can be generated by executing the `get_data_and_generate_train_val_test_sets` function in the `custom_dataset_univariate.py` file in the [companion dataset](https://huggingface.co/datasets/APPFL/Illinois_load_datasets).
  - The output shape is `(B,T)` denoting the rolling horizon forecast `T` steps into the future.
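
For reference, the documented shapes and call signatures can be summarized in a short sketch; the model instances themselves are assumed to have been constructed and loaded as in `example.py`, so the forward calls are left commented out.

```python
# Shape sketch for the documented call signatures. Only the tensor shapes and
# expected output shapes come from the layout described above; the model
# instances are assumed to be built as in example.py.
import torch

B, L, T = 2, 512, 48   # batch size, lookback, lookahead
NUM_FEATURES = 8       # features per time step in `input`
NUM_TIME_IDX = 2       # time-index features in `future_time_idx`

# LSTM / LSTNet / PatchTST and Transformer / Autoformer / Informer / TimesNet:
# forward(input, future_time_idx) -> pointwise forecast of shape (B, 1)
inp = torch.randn(B, L, NUM_FEATURES)
future_time_idx = torch.randn(B, T, NUM_TIME_IDX)
# out = model(inp, future_time_idx)        # expected shape: (B, 1)

# TimesFM (univariate): forward(input) -> rolling-horizon forecast of shape (B, T)
inp_uni = torch.randn(B, L)
# out_uni = timesfm_model(inp_uni)         # expected shape: (B, T)
```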

## Credits

Some model definitions have been adapted from the code provided in the [TSLib Library](https://github.com/thuml/Time-Series-Library).