Add support for HF CLIP
Would it be possible to add support for the HF CLIP format, so that we can directly load the model using the Hugging Face transformers library without using open_clip? A possible example is https://huggingface.co/laion/CLIP-ViT-L-14-DataComp.XL-s13B-b90K/tree/main
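For reference, this is roughly what direct loading of the linked checkpoint with transformers looks like (a minimal sketch; the image path and text prompts are placeholders):

import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

# The LAION DataComp checkpoint linked above already ships weights in the HF CLIP
# format, so it loads without open_clip.
model_id = "laion/CLIP-ViT-L-14-DataComp.XL-s13B-b90K"
model = CLIPModel.from_pretrained(model_id)
processor = CLIPProcessor.from_pretrained(model_id)

image = Image.open("example.jpg")  # placeholder image path
texts = ["a photo of a cat", "a photo of a dog"]

inputs = processor(text=texts, images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    outputs = model(**inputs)
probs = outputs.logits_per_image.softmax(dim=-1)  # similarity over the prompts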
agree!
That would be of great help for fine-tuning.
any update?
any update?
Hi all @AbhiKand @ERICQIU @G-AshwinKumar @Jayantjivi @Passionsu,
I translated the open_clip- and timm-based checkpoint into an HF-friendly implementation.
You can now use it through:
from transformers import AutoModel, AutoProcessor

CLIP_DIR = "/path/to/BiomedCLIP"  # local directory with the converted checkpoint
model = AutoModel.from_pretrained(CLIP_DIR, trust_remote_code=True)
processor = AutoProcessor.from_pretrained(CLIP_DIR, trust_remote_code=True)
with the model and files in this PR:
https://huggingface.co/microsoft/BiomedCLIP-PubMedBERT_256-vit_base_patch16_224/discussions/20
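A quick usage sketch, assuming the converted checkpoint follows the standard transformers CLIP output convention (the image path and prompts below are placeholders, not from the PR):

import torch
from PIL import Image
from transformers import AutoModel, AutoProcessor

CLIP_DIR = "/path/to/BiomedCLIP"  # local copy of the converted checkpoint
model = AutoModel.from_pretrained(CLIP_DIR, trust_remote_code=True)
processor = AutoProcessor.from_pretrained(CLIP_DIR, trust_remote_code=True)

image = Image.open("chest_xray.png")  # placeholder image
texts = ["this is a chest X-ray", "this is a brain MRI", "this is a histopathology slide"]

inputs = processor(text=texts, images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    outputs = model(**inputs)
# assumes the conversion mirrors transformers' CLIPModel and exposes logits_per_image
probs = outputs.logits_per_image.softmax(dim=-1)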
@AbhiKand @ERICQIU @G-AshwinKumar @Jayantjivi @Yingshu you may find it here https://huggingface.co/chuhac/BiomedCLIP-vit-bert-hf
Thank you!!