finetuning models
#1
by
DazMashaly
Is there a demo for finetuning the quantized model? Is it even possible?
Great question! We currently don't have a guide for this, but I'll outline what you can do here:
- If you want to finetune the model, you can check out the original source at https://github.com/ultralytics/ultralytics and follow their guides to train or finetune it yourself (a rough training sketch is included after this list).
- With the new checkpoint, you can instantiate the AI Hub Models class as:
```python
from qai_hub_models.models.yolov8_det import Model

checkpoint = "/path/to/new/checkpoint.pt"
model = Model.from_pretrained(checkpoint)
```
Now you can use the compile, profile, and inference functions on that instance (a rough sketch is included below).
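
Going back to the first bullet, here is a minimal sketch of what finetuning with Ultralytics could look like. This is not an official guide; the model size, dataset YAML path, and training arguments are placeholders you would adapt to your setup.

```python
from ultralytics import YOLO

# Start from a pretrained YOLOv8 detection checkpoint (size variant is up to you).
model = YOLO("yolov8n.pt")

# Finetune on your own dataset; the dataset YAML, epochs, and image size here
# are placeholders -- see the Ultralytics docs for the full argument list.
model.train(data="path/to/your_dataset.yaml", epochs=50, imgsz=640)

# Ultralytics typically writes the best weights to runs/detect/train*/weights/best.pt;
# that .pt file is what you would pass to Model.from_pretrained above.
```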
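
And for the compile/profile/inference step, a rough sketch using the qai_hub client. The device name, the 640x640 input shape, and the "image" tensor name are assumptions you would adjust for your target; tracing details may also need tweaking for your checkpoint.

```python
import torch
import qai_hub as hub
from qai_hub_models.models.yolov8_det import Model

# Load the finetuned checkpoint into the AI Hub Models wrapper.
model = Model.from_pretrained("/path/to/new/checkpoint.pt")

# Trace the PyTorch module so it can be uploaded to AI Hub
# (the 1x3x640x640 shape is an assumption).
sample = torch.rand(1, 3, 640, 640)
traced = torch.jit.trace(model, sample)

device = hub.Device("Samsung Galaxy S23")  # placeholder device

# Compile for the target device, then profile and run inference on it.
compile_job = hub.submit_compile_job(
    model=traced,
    device=device,
    input_specs=dict(image=(1, 3, 640, 640)),
)
target_model = compile_job.get_target_model()

profile_job = hub.submit_profile_job(model=target_model, device=device)
inference_job = hub.submit_inference_job(
    model=target_model,
    device=device,
    inputs=dict(image=[sample.numpy()]),
)
```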
As for the quantized model, we currently do not offer YOLOv8-Detection in a quantized variant. Please stay tuned, as we hope to expand the set of models offered with quantization.