Model Card for Chronos T5 Small Fine-Tuned Model

Summary

This model is fine-tuned for time-series forecasting tasks and serves as a tool for both practical predictions and research into time-series modeling. It is based on the amazon/chronos-t5-small architecture and has been adapted using a dataset with 15 million rows of proprietary time-series data. Due to confidentiality restrictions, dataset details cannot be shared.

Fine-Tuning Dataset

The model was fine-tuned on a proprietary dataset containing 15 million rows of time-series data. While details about the dataset are confidential, the following general characteristics are provided:

  • The dataset consists of multi-dimensional time-series data.
  • Features include historical values, contextual attributes, and external covariates relevant to forecasting.
  • The data spans multiple domains, enabling generalization across a wide range of forecasting tasks.

This large-scale dataset is intended to expose the model to the complex patterns and temporal dependencies needed for accurate forecasting. A sketch of the general (non-proprietary) data layout follows.
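Because the actual data cannot be shown, the snippet below is a purely hypothetical sketch of how long-format, multi-dimensional time-series records are commonly grouped into per-series arrays before fine-tuning; the column names and values are illustrative assumptions, not a description of the proprietary dataset.

```python
import pandas as pd

# Purely illustrative data; the real proprietary dataset is confidential.
df = pd.DataFrame({
    "series_id": ["a", "a", "a", "b", "b", "b"],
    "timestamp": pd.to_datetime(["2024-01-01", "2024-01-02", "2024-01-03"] * 2),
    "value": [10.0, 11.5, 12.1, 3.2, 3.0, 3.4],
})

# Group long-format rows into one record per series, the layout typically
# expected by GluonTS-style training and evaluation tooling.
records = [
    {
        "start": grp["timestamp"].min(),
        "target": grp.sort_values("timestamp")["value"].tolist(),
    }
    for _, grp in df.groupby("series_id")
]
```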

Evaluation

Testing Data, Factors & Metrics

Testing Data

The model was evaluated using several publicly available time-series datasets (a loading sketch follows the list), including:

  • electricity_15min
  • monash_electricity_hourly
  • monash_electricity_weekly
  • monash_kdd_cup_2018
  • monash_pedestrian_counts
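These identifiers match benchmarks used in the Chronos evaluation suite. As a minimal sketch, and assuming the Hugging Face Hub mirror `autogluon/chronos_datasets` with config names matching the identifiers above (both are assumptions, not confirmed by this card), one benchmark could be loaded as follows:

```python
from datasets import load_dataset

# Assumed repo and config names; adjust if the actual benchmark source differs.
ds = load_dataset("autogluon/chronos_datasets", "monash_electricity_hourly", split="train")

# Each record typically carries a series id, a timestamp array, and a target array.
example = ds[0]
print(example.keys())
```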

Factors

Evaluation was conducted across datasets from several domains, including electricity usage at different aggregation levels (15-minute, hourly, and weekly), pedestrian counts, and the KDD Cup 2018 air-quality data.

Metrics

Two primary metrics were used for evaluation (a computation sketch follows the definitions):

  • MASE (Mean Absolute Scaled Error): a scale-free accuracy metric that compares forecast errors against a seasonal naive baseline computed on the historical series.
  • WQL (Weighted Quantile Loss): measures the quality of probabilistic forecasts by scoring predicted quantiles against the observed values.
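The exact evaluation code is not part of this card; the sketch below is a generic reference implementation of the two metrics under common definitions (seasonal-naive scaling for MASE, pinball loss summed over quantile levels and normalized by total absolute target for WQL). It may differ in detail from the pipeline that produced the numbers reported below.

```python
import numpy as np

def mase(y_true, y_pred, y_hist, season=1):
    """Mean Absolute Scaled Error: forecast MAE divided by the in-sample MAE
    of a seasonal-naive forecast on the historical series."""
    mae = np.mean(np.abs(y_true - y_pred))
    naive_mae = np.mean(np.abs(y_hist[season:] - y_hist[:-season]))
    return mae / naive_mae

def wql(y_true, quantile_preds, quantiles):
    """Weighted Quantile Loss: pinball losses averaged over quantile levels
    and normalized by the total absolute value of the targets."""
    total = 0.0
    for q, y_q in zip(quantiles, quantile_preds):
        diff = y_true - y_q
        total += np.sum(np.maximum(q * diff, (q - 1) * diff))
    return 2 * total / (len(quantiles) * np.sum(np.abs(y_true)))
```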

Results

Dataset                    | Model                   | MASE  | WQL
electricity_15min          | amazon/chronos-t5-small | 0.425 | 0.085
monash_electricity_hourly  | amazon/chronos-t5-small | 1.537 | 0.110
monash_electricity_weekly  | amazon/chronos-t5-small | 1.943 | 0.086
monash_kdd_cup_2018        | amazon/chronos-t5-small | 0.693 | 0.309
monash_pedestrian_counts   | amazon/chronos-t5-small | 0.308 | 0.247

Summary

The fine-tuned model performs best on high-frequency electricity data (electricity_15min: MASE 0.425, WQL 0.085) and on pedestrian counts (MASE 0.308), while errors grow on coarser, aggregated series such as hourly (MASE 1.537) and weekly (MASE 1.943) electricity. Performance therefore varies with the sampling frequency and aggregation level of the target series.

Technical Specifications

Model Architecture and Objective

The model is based on the amazon/chronos-t5-small architecture (a T5 encoder-decoder backbone with roughly 46 million parameters, stored in F32), fine-tuned specifically for time-series forecasting. Chronos models cast forecasting as sequence-to-sequence language modeling: input values are scaled and quantized into discrete tokens, and the decoder autoregressively samples future tokens that are mapped back to numeric values, yielding probabilistic, multi-horizon forecasts.
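A minimal inference sketch is shown below, assuming the fine-tuned checkpoint loads exactly like the base Chronos checkpoints via the chronos-forecasting package; the repo id is taken from this card's model listing, and the context values are toy data.

```python
import torch
from chronos import ChronosPipeline  # pip install chronos-forecasting

# Assumed to load the same way as the base amazon/chronos-t5-small checkpoint.
pipeline = ChronosPipeline.from_pretrained(
    "NIEXCHE/chronos-t5-small-fine-tuned-v1",
    device_map="cpu",
    torch_dtype=torch.float32,
)

context = torch.tensor([1.0, 1.2, 0.9, 1.4, 1.1, 1.3, 1.0, 1.5])  # toy history
forecast = pipeline.predict(context, prediction_length=12)  # [series, samples, horizon]

# Turn the sample paths into quantile forecasts.
low, median, high = torch.quantile(
    forecast[0].float(), torch.tensor([0.1, 0.5, 0.9]), dim=0
)
```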

Citation

If you use this model in your research or applications, please cite it as:

@misc{Fevzi2024ChronosT5SmallFineTuned,
  author = {Fevzi KILAS},
  title = {chronos-t5-small-fine-tuned-v1: A Fine-Tuned Chronos T5 Small Model for Time-Series Forecasting},
  year = {2024},
  howpublished = {https://huggingface.co/NIEXCHE/chronos-t5-small-fine-tuned-v1}
}

Contact:

NIEXCHE (Fevzi KILAS)
