---
library_name: transformers
license: apache-2.0
base_model: CocoRoF/KoModernBERT
tags:
- generated_from_trainer
model-index:
- name: KoModernBERT
  results: []
---

# KoModernBERT

This model is a fine-tuned version of [CocoRoF/KoModernBERT](https://huggingface.co/CocoRoF/KoModernBERT) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 2.3473

## Model description

More information needed
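Pending fuller documentation, here is a minimal usage sketch. It assumes the checkpoint exposes a masked-LM head (as ModernBERT-style models typically do) and that `transformers` and `torch` are installed; the helper name `predict_masked` is illustrative, not part of the repository.

```python
def predict_masked(text: str, model_id: str = "CocoRoF/KoModernBERT"):
    """Fill the tokenizer's mask token in `text` using the checkpoint.

    Assumes a masked-LM head on the checkpoint; downloads the model
    on first call, so this is deferred rather than run at import time.
    """
    from transformers import pipeline  # imported lazily on purpose

    fill = pipeline("fill-mask", model=model_id)
    # Use fill.tokenizer.mask_token to build `text` if you are unsure
    # which mask token the tokenizer expects.
    return fill(text)
```

Each returned candidate is a dict with `token_str` and `score`, following the standard `fill-mask` pipeline output.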

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 8
- gradient_accumulation_steps: 8
- total_train_batch_size: 512
- total_eval_batch_size: 64
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 1.0
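The total batch sizes above follow from the per-device settings, and the learning-rate schedule is a standard linear warmup then linear decay. A quick sketch (the helper name `linear_schedule_lr` is illustrative; it mirrors the behavior of a linear scheduler with the warmup ratio listed above):

```python
# Effective batch sizes: per-device batch x num_devices (x grad accumulation).
per_device_train = 8
num_devices = 8
grad_accum = 8
total_train = per_device_train * num_devices * grad_accum   # 512
total_eval = per_device_train * num_devices                 # 64 (no accumulation at eval)


def linear_schedule_lr(step, total_steps, base_lr=1e-05, warmup_ratio=0.1):
    """LR at `step`: linear warmup to base_lr over the first
    warmup_ratio fraction of steps, then linear decay to 0."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * (total_steps - step) / max(1, total_steps - warmup_steps)
```

For example, with 1000 total steps the rate ramps up over the first 100 steps, peaks at 1e-05, and reaches 0 at step 1000.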

### Training results

| Training Loss | Epoch  | Step  | Validation Loss |
|:-------------:|:------:|:-----:|:---------------:|
| 26.6178       | 0.0928 | 5000  | 3.3099          |
| 23.887        | 0.1856 | 10000 | 2.9665          |
| 22.3186       | 0.2784 | 15000 | 2.7910          |
| 21.6275       | 0.3711 | 20000 | 2.6757          |
| 20.7564       | 0.4639 | 25000 | 2.5967          |
| 20.0201       | 0.5567 | 30000 | 2.5263          |
| 19.7037       | 0.6495 | 35000 | 2.4709          |
| 19.2119       | 0.7423 | 40000 | 2.4196          |
| 18.7262       | 0.9279 | 50000 | 2.3473          |
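If the reported validation loss is a per-token cross-entropy in nats (the usual convention for masked-LM training, though the card does not state it), the final loss corresponds to a perplexity of roughly exp(2.3473):

```python
import math

final_val_loss = 2.3473  # last row of the table above
# Assuming per-token cross-entropy in nats, perplexity = exp(loss).
perplexity = math.exp(final_val_loss)  # ~10.46
```

Note this interpretation does not extend to the training-loss column, whose scale suggests a different reduction.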

### Framework versions

- Transformers 4.48.1
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0