starcoder-peft-airscript

This model is a fine-tuned version of bigcode/starcoderbase-1b on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7248

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0005
  • train_batch_size: 10
  • eval_batch_size: 10
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 20
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_steps: 30
  • training_steps: 1600
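
Below is a minimal sketch of a Trainer-based PEFT run that reproduces the hyperparameters listed above. The LoRA settings (r, lora_alpha, lora_dropout) and the training/evaluation datasets are not documented in this card and appear only as labeled placeholders; the TrainingArguments values come directly from the list.

```python
# Hedged sketch: reproduces the documented hyperparameters with Trainer + PEFT.
# The LoRA settings and the datasets are NOT recorded in this card; they are
# illustrative placeholders only.
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)
from peft import LoraConfig, get_peft_model

base = "bigcode/starcoderbase-1b"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base)

# Hypothetical LoRA config; when target_modules is omitted, PEFT falls back
# to its built-in defaults for the GPTBigCode architecture.
peft_config = LoraConfig(task_type="CAUSAL_LM", r=8, lora_alpha=16, lora_dropout=0.05)
model = get_peft_model(model, peft_config)

args = TrainingArguments(
    output_dir="starcoder-peft-airscript",
    learning_rate=5e-4,
    per_device_train_batch_size=10,
    per_device_eval_batch_size=10,
    seed=42,
    gradient_accumulation_steps=2,  # 10 x 2 = total train batch size of 20
    lr_scheduler_type="cosine",
    warmup_steps=30,
    max_steps=1600,
    eval_strategy="steps",
    eval_steps=100,  # matches the 100-step eval cadence in the results table
    # Trainer's default optimizer is AdamW with betas=(0.9, 0.999) and
    # epsilon=1e-8, matching the values listed above.
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,  # placeholder: training data is not documented
    eval_dataset=eval_dataset,    # placeholder: evaluation data is not documented
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```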

Training results

| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 1.2597        | 0.0625 | 100  | 1.1604          |
| 0.9591        | 0.125  | 200  | 0.9402          |
| 0.8109        | 0.1875 | 300  | 0.8431          |
| 0.7151        | 0.25   | 400  | 0.7917          |
| 0.6362        | 0.3125 | 500  | 0.7607          |
| 0.5759        | 0.375  | 600  | 0.7401          |
| 0.5284        | 0.4375 | 700  | 0.7334          |
| 0.4926        | 0.5    | 800  | 0.7252          |
| 0.4616        | 0.5625 | 900  | 0.7212          |
| 0.4369        | 0.625  | 1000 | 0.7236          |
| 0.4111        | 0.6875 | 1100 | 0.7255          |
| 0.3969        | 0.75   | 1200 | 0.7236          |
| 0.3855        | 0.8125 | 1300 | 0.7260          |
| 0.3822        | 0.875  | 1400 | 0.7262          |
| 0.3768        | 0.9375 | 1500 | 0.7256          |
| 0.3778        | 1.0    | 1600 | 0.7248          |

Framework versions

  • PEFT 0.13.2
  • Transformers 4.45.2
  • Pytorch 2.5.0
  • Datasets 3.0.1
  • Tokenizers 0.20.1
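
Usage

A minimal inference sketch: load the base model and attach this adapter with PEFT. The prompt is an invented placeholder, since the card does not document the expected prompting format for AIRScript.

```python
# Hedged sketch: attach the published adapter to the base model and generate.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = "bigcode/starcoderbase-1b"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base)
model = PeftModel.from_pretrained(model, "cy948/starcoder-peft-airscript")
model.eval()

prompt = "// placeholder prompt"  # the expected prompt format is not documented
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    out = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```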