---
base_model: OFA-Sys/chinese-clip-vit-base-patch16
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: aoi_clip_high_resolution_concate_fusin_crop_each_text
  results: []
---

[Visualize in Weights & Biases](https://wandb.ai/shark_meow_team/huggingface/runs/5lcdjsus)

# aoi_clip_high_resolution_concate_fusin_crop_each_text

This model is a fine-tuned version of [OFA-Sys/chinese-clip-vit-base-patch16](https://huggingface.co/OFA-Sys/chinese-clip-vit-base-patch16) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 3.4957
- Accuracy: 0.0648

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 20
- eval_batch_size: 20
- seed: 42
- gradient_accumulation_steps: 10
- total_train_batch_size: 200
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 60.0
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch   | Step  | Validation Loss | Accuracy |
|:-------------:|:-------:|:-----:|:---------------:|:--------:|
| 1.7286        | 5.9821  | 1602  | 3.0151          | 0.0654   |
| 1.6207        | 11.9642 | 3204  | 3.2376          | 0.0665   |
| 1.5399        | 17.9462 | 4806  | 3.2386          | 0.0685   |
| 1.4981        | 23.9283 | 6408  | 3.3545          | 0.0673   |
| 1.4774        | 29.9104 | 8010  | 3.3404          | 0.0677   |
| 1.4648        | 35.8925 | 9612  | 3.4236          | 0.0670   |
| 1.4549        | 41.8745 | 11214 | 3.4689          | 0.0664   |
| 1.4528        | 47.8566 | 12816 | 3.5205          | 0.0659   |
| 1.4538        | 53.8387 | 14418 | 3.4703          | 0.0655   |
| 1.4519        | 59.8208 | 16020 | 3.4957          | 0.0651   |

### Framework versions

- Transformers 4.42.3
- PyTorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
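
### Reproducing the training configuration

As a rough illustration of the hyperparameters listed above, the sketch below maps them onto 🤗 `TrainingArguments`. This is a hedged reconstruction, not the actual training script: the dataset, model class, data collator, and `output_dir` are unknown, and the effective batch size of 200 implies the per-device batch size of 20 with 10 gradient-accumulation steps on a single device.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the run's hyperparameters;
# the original training script is not published with this card.
training_args = TrainingArguments(
    output_dir="aoi_clip_high_resolution_concate_fusin_crop_each_text",
    learning_rate=1e-5,              # learning_rate: 1e-05
    per_device_train_batch_size=20,  # train_batch_size: 20
    per_device_eval_batch_size=20,   # eval_batch_size: 20
    seed=42,                         # seed: 42
    gradient_accumulation_steps=10,  # 20 * 10 = total_train_batch_size 200
    num_train_epochs=60.0,           # num_epochs: 60.0
    lr_scheduler_type="linear",      # lr_scheduler_type: linear
    adam_beta1=0.9,                  # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,               # epsilon=1e-08
    fp16=True,                       # mixed_precision_training: Native AMP
)
```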
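
### Example inference (assumptions apply)

Since the base checkpoint is Chinese-CLIP, a minimal image-text similarity sketch using the stock `ChineseCLIPModel`/`ChineseCLIPProcessor` classes is shown below. Note the caveats: the model name suggests a custom concatenation-fusion head over cropped text regions, so these fine-tuned weights may not load into the stock architecture; the repo id, image path, and captions here are placeholders.

```python
import torch
from PIL import Image
from transformers import ChineseCLIPModel, ChineseCLIPProcessor

# Assumes the checkpoint is architecture-compatible with Chinese-CLIP;
# if the custom fusion head breaks loading, fall back to the base model
# "OFA-Sys/chinese-clip-vit-base-patch16".
repo_id = "aoi_clip_high_resolution_concate_fusin_crop_each_text"  # placeholder repo id
model = ChineseCLIPModel.from_pretrained(repo_id)
processor = ChineseCLIPProcessor.from_pretrained(repo_id)

image = Image.open("example.jpg")   # any local image
texts = ["一只猫", "一只狗"]          # candidate captions ("a cat", "a dog")

inputs = processor(text=texts, images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    outputs = model(**inputs)

# Image-to-text similarity logits, softmaxed over the candidate captions
probs = outputs.logits_per_image.softmax(dim=1)
print(probs)
```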