---
license: mit
language:
- en
tags:
- tau
- hep
- fcc
- clic
- ee
- reconstruction
- identification
- decay_mode
- foundation_model
- omnijet_alpha
---
# Model Card for TauRecoID

These models, fine-tuned from the OmniJet-α foundation model, identify hadronically decaying tau leptons, reconstruct their kinematics, and classify their decay modes.
## Model Details

### Model Description
- Developed by: Joschka Birk, Anna Hallin, Gregor Kasieczka
- Model type: Transformer
- Framework: PyTorch
- Finetuned from model: OmniJet-α (https://doi.org/10.1088/2632-2153/ad66ad)
The OmniJet-α model published in the paper linked above was used as the base model for identifying hadronically decaying taus, reconstructing their kinematics and predicting their decay modes. The base model, initially trained on the JetClass dataset, was fine-tuned on the Fuτure dataset. The models included here cover 3 separate tasks:
- Tau-tagging (binary classification)
- Tau kinematic reconstruction (regression)
- Tau decay mode classification (multiclass-classification)
and 3 different training strategies:
- From scratch
- Fixed backbone (fine-tune only head)
- Fine-tuning (fine-tune both head and backbone)
This adds up to 9 different models (3 tasks × 3 training strategies); the sketch below illustrates how the training strategies differ.
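The following is a minimal PyTorch sketch of the three strategies, assuming a generic transformer backbone; `Backbone`, `build_model`, the checkpoint file name, and the output dimensions are hypothetical stand-ins rather than the actual ml-tau-en-reg classes.

```python
# Minimal sketch of the three training strategies; all names here
# (Backbone, build_model, output sizes, checkpoint path) are hypothetical
# stand-ins, not the actual ml-tau-en-reg code.
import torch
import torch.nn as nn

class Backbone(nn.Module):
    """Stand-in for the pretrained OmniJet-alpha transformer."""
    def __init__(self, dim: int = 256):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=8, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=4)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_particles, dim) -> pooled per-jet representation
        return self.encoder(x).mean(dim=1)

def build_model(task: str, strategy: str, dim: int = 256) -> nn.Module:
    backbone = Backbone(dim)
    if strategy in ("fixed_backbone", "fine_tuning"):
        # Load pretrained weights (hypothetical checkpoint file).
        backbone.load_state_dict(torch.load("omnijet_backbone.pt"))
    if strategy == "fixed_backbone":
        # Freeze the backbone so that only the task head is trained.
        for p in backbone.parameters():
            p.requires_grad = False
    # Output sizes are illustrative: 1 logit for binary tagging, 1 value for
    # kinematic regression, N classes for decay-mode classification.
    out_dim = {"tagging": 1, "kinematic_regression": 1, "decay_mode": 16}[task]
    head = nn.Linear(dim, out_dim)
    return nn.Sequential(backbone, head)
```

With a fixed backbone only the head receives gradient updates, whereas full fine-tuning updates both the head and the backbone parameters.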
## Model Sources
- Repository (base model): https://github.com/uhh-pd-ml/omnijet_alpha
- Repository (fine-tuned model): https://github.com/HEP-KBFI/ml-tau-en-reg
- Paper: https://doi.org/10.1088/2632-2153/ad66ad
## Uses

### Direct Use
The intended use of the models is to study the feasibility of foundation models for the purposes of reconstructing and identifying hadronically decaying tau leptons.
### Out-of-Scope Use
These models are not intended for physics measurements on real data: all trainings were done on CLIC detector simulation.
## Bias, Risks, and Limitations
The models have only been trained on simulated data and have not been validated against real data. Although the base model has been published in a peer-reviewed journal, the fine-tuned models have not been.
## How to Get Started with the Model
Use the code below to get started with the model.
```bash
# Clone the repository
git clone git@github.com:HEP-KBFI/ml-tau-en-reg.git --recursive
cd ml-tau-en-reg

# Get the models
git clone https://huggingface.co/LauritsT/TauRecoID models
```
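The downloaded checkpoints can then be loaded with PyTorch. The snippet below is a sketch only: the actual directory layout inside `models/` and the loading entry points are defined by the ml-tau-en-reg repository, and the path shown is a placeholder.

```python
import torch

# Placeholder path: the actual checkpoint layout inside models/ is defined
# by the ml-tau-en-reg repository.
checkpoint = torch.load("models/<task>/<training-strategy>/model.ckpt",
                        map_location="cpu")
print(checkpoint.keys())  # inspect what the checkpoint contains
```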
## Training Details

### Training Data
The data used to fine-tune the base model can be found in the Fuτure dataset.
### Training Hyperparameters
- No hyperparameter tuning has been done.
### Speeds, Sizes, Times
Training on 1M jets on an AMD MI250x GPU for 100 epochs takes approximately 8 hours (roughly 3,500 jets processed per second).
## Evaluation

### Testing Data, Factors & Metrics

#### Testing Data
Testing data can also be found in the same Zenodo entry as the rest of the data.
## Software

The software used to train and analyze the models is available at https://github.com/HEP-KBFI/ml-tau-en-reg.
## Citation

Birk, J., Hallin, A. and Kasieczka, G., "OmniJet-α: the first cross-task foundation model for particle physics", Mach. Learn.: Sci. Technol. 5 035031 (2024). https://doi.org/10.1088/2632-2153/ad66ad
## Model Card Authors
Laurits Tani (laurits.tani@cern.ch)
## Model Card Contact
Laurits Tani (laurits.tani@cern.ch)