---
base_model: roneneldan/TinyStories-33M
library_name: Distily
tags:
- generated_from_trainer
model-index:
- name: distily_bench_obj_cross_v2.10
  results: []
---

# distily_bench_obj_cross_v2.10

This student model is distilled from the teacher model [roneneldan/TinyStories-33M](https://huggingface.co/roneneldan/TinyStories-33M) on an unspecified dataset. The [Distily](https://github.com/lapp0/distily) library was used for this distillation.

It achieves the following results on the evaluation set:
- eval_enwikippl: 133.3661
- eval_frwikippl: 19650.0977
- eval_zhwikippl: 54146.3867
- eval_tinystoriesppl: 9.1470
- eval_loss: 1.2078
- eval_runtime: 12.9975
- eval_samples_per_second: 76.938
- eval_steps_per_second: 9.617

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (the logits distillation objective is sketched in code after this list):
- distillation_objective: DistillationObjective(logits_loss_component=LossComponent(label=logits, weight=1, loss_fn=kl, layer_mapper=None, projector=None), hs_loss_component=LossComponent(label=hs, weight=0, loss_fn=None, layer_mapper=None, projector=None), attn_loss_component=LossComponent(label=attn, weight=0, loss_fn=None, layer_mapper=None, projector=None))
- train_embeddings: True
- learning_rate: 4e-06
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1.0
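Per the `distillation_objective` above, the only active loss component is a KL divergence on the logits (weight 1); the hidden-state and attention components are disabled (weight 0). Below is a minimal PyTorch sketch of such a logits-only KL objective for illustration. It is not Distily's actual implementation; the function name and tensor shapes are assumptions.

```python
# Illustrative sketch of a logits-only KL distillation loss, matching the
# spirit of the `distillation_objective` above (logits weight=1, loss_fn=kl,
# hidden-state and attention components disabled). NOT Distily's actual code;
# the function name and tensor shapes here are assumptions.
import torch
import torch.nn.functional as F

def kl_logits_loss(student_logits: torch.Tensor,
                   teacher_logits: torch.Tensor) -> torch.Tensor:
    # Flatten (batch, seq, vocab) -> (batch*seq, vocab) so "batchmean"
    # averages the per-token KL over every position.
    vocab_size = student_logits.size(-1)
    student_logp = F.log_softmax(student_logits.reshape(-1, vocab_size), dim=-1)
    teacher_logp = F.log_softmax(teacher_logits.reshape(-1, vocab_size), dim=-1)
    # log_target=True: the target (teacher) distribution is already in log space.
    return F.kl_div(student_logp, teacher_logp,
                    log_target=True, reduction="batchmean")
```

With `train_embeddings: True`, the student's embedding matrices are updated along with the rest of its parameters while minimizing this loss.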
### Resource Usage

Peak GPU Memory: 6.6064 GB

### Eval-Phase Metrics

| step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | tinystoriesppl | zhwikippl |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| **teacher eval** | | 169.9865 | 47377.9414 | | | | | 3.9789 | 4998.1294 |
| 0 | 0 | 50480.5703 | 85684.4844 | 6.8305 | 13.0278 | 76.759 | 9.595 | 33932.0586 | 94692.1562 |
| 5000 | 0.0505 | 133.2834 | 19661.1758 | 1.2085 | 13.0193 | 76.809 | 9.601 | 9.1500 | 54349.0273 |
| 10000 | 0.1010 | 133.3712 | 19627.9590 | 1.2079 | 13.0358 | 76.712 | 9.589 | 9.1530 | 54117.5273 |
| 15000 | 0.1515 | 133.2834 | 19650.0977 | 1.2077 | 13.0514 | 76.62 | 9.578 | 9.1402 | 54088.6328 |
| 20000 | 0.2020 | 133.3299 | 19639.0254 | 1.2077 | 13.0057 | 76.889 | 9.611 | 9.1545 | 54204.1992 |
| 25000 | 0.2525 | 133.4539 | 19650.0977 | 1.2079 | 13.0301 | 76.745 | 9.593 | 9.1538 | 54349.0273 |
| 30000 | 0.3030 | 133.5160 | 19650.0977 | 1.2079 | 13.03 | 76.746 | 9.593 | 9.1542 | 54088.6328 |
| 35000 | 0.3535 | 133.2834 | 19627.9590 | 1.2078 | 13.0569 | 76.588 | 9.573 | 9.1451 | 54117.5273 |
| 40000 | 0.4040 | 133.3712 | 19627.9590 | 1.2078 | 12.9991 | 76.928 | 9.616 | 9.1523 | 54146.3867 |
| 45000 | 0.4545 | 133.3041 | 19650.0977 | 1.2077 | 12.9923 | 76.969 | 9.621 | 9.1477 | 54088.6328 |
| 50000 | 0.5051 | 133.2834 | 19650.0977 | 1.2078 | 13.1989 | 75.764 | 9.47 | 9.1470 | 54204.1992 |
| 55000 | 0.5556 | 133.4953 | 19650.0977 | 1.2078 | 13.1556 | 76.013 | 9.502 | 9.1485 | 54117.5273 |
| 60000 | 0.6061 | 133.4901 | 19661.1758 | 1.2077 | 13.206 | 75.723 | 9.465 | 9.1477 | 54117.5273 |
| 65000 | 0.6566 | 133.4488 | 19650.0977 | 1.2077 | 13.0052 | 76.892 | 9.612 | 9.1470 | 54117.5273 |
| 70000 | 0.7071 | 133.3661 | 19650.0977 | 1.2078 | 12.9996 | 76.925 | 9.616 | 9.1470 | 54117.5273 |
| 75000 | 0.7576 | 133.4074 | 19650.0977 | 1.2079 | 13.0082 | 76.874 | 9.609 | 9.1470 | 54117.5273 |
| 80000 | 0.8081 | 133.4488 | 19650.0977 | 1.2078 | 12.9816 | 77.032 | 9.629 | 9.1485 | 54117.5273 |
| 85000 | 0.8586 | 133.3661 | 19650.0977 | 1.2077 | 12.9875 | 76.997 | 9.625 | 9.1470 | 54117.5273 |
| 90000 | 0.9091 | 133.3661 | 19650.0977 | 1.2077 | 12.985 | 77.012 | 9.626 | 9.1462 | 54117.5273 |
| 95000 | 0.9596 | 133.4074 | 19650.0977 | 1.2078 | 13.0478 | 76.641 | 9.58 | 9.1470 | 54146.3867 |
| 99000 | 1.0 | 133.3661 | 19650.0977 | 1.2078 | 12.9975 | 76.938 | 9.617 | 9.1470 | 54146.3867 |

### Framework versions
- Distily 0.2.0
- Transformers 4.44.0
- Pytorch 2.3.0
- Datasets 2.21.0
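## Usage

Loading the distilled student for inference follows the standard Hugging Face Transformers pattern. A minimal sketch, assuming the checkpoint is available on the Hub or locally; the repository id below is a placeholder, not a confirmed location:

```python
# Minimal inference sketch using the standard Transformers API.
# "your-namespace/distily_bench_obj_cross_v2.10" is a PLACEHOLDER repo id;
# replace it with the actual Hub id or local path of this checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-namespace/distily_bench_obj_cross_v2.10"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=50, do_sample=True)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

`AutoModelForCausalLM` is assumed here because the teacher is a causal language model; check the checkpoint's `config.json` for the concrete architecture.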