Instructions to use timm/eva_giant_patch14_224.clip_ft_in1k with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- timm
How to use timm/eva_giant_patch14_224.clip_ft_in1k with timm:
```python
import timm

model = timm.create_model("hf_hub:timm/eva_giant_patch14_224.clip_ft_in1k", pretrained=True)
```
- Transformers
How to use timm/eva_giant_patch14_224.clip_ft_in1k with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("image-classification", model="timm/eva_giant_patch14_224.clip_ft_in1k")
pipe("https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/hub/parrots.png")

# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("timm/eva_giant_patch14_224.clip_ft_in1k", dtype="auto")
```
- Notebooks
- Google Colab
- Kaggle
total downloads of EVA-CLIP models
#2
by QuanSun - opened
Hi @rwightman ,
Would it be possible for us to obtain information on the total downloads of the EVA-CLIP models in timm?
Thanks,
Quan
Hmmm, to get the current sum of rolling monthly download counts:

```python
from huggingface_hub import HfApi, ModelFilter

api = HfApi()
filter = ModelFilter(library="timm", model_name="eva")
sum([x.downloads for x in api.list_models(filter=filter)])
```
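As a side note, `ModelFilter` has since been deprecated and removed in newer huggingface_hub releases; the same query can be sketched with keyword arguments passed to `list_models` directly (a sketch, assuming a recent huggingface_hub and network access; the helper name `total_eva_downloads` is made up for illustration):

```python
from huggingface_hub import HfApi


def total_eva_downloads() -> int:
    """Sum rolling 30-day download counts over timm models matching 'eva'.

    Sketch: assumes a huggingface_hub version where list_models accepts
    `library` and `search` keyword arguments instead of ModelFilter.
    """
    api = HfApi()
    models = api.list_models(library="timm", search="eva")
    return sum(m.downloads or 0 for m in models)
```

Note this sums rolling monthly counts per model, not all-time totals.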
You can get an overview in the GUI https://huggingface.co/models?library=timm&sort=downloads&search=eva
Total counts are at the bottom of the settings tab, e.g. https://huggingface.co/timm/eva_large_patch14_196.in22k_ft_in22k_in1k/settings
I'm not sure if there is a programmatic way of getting the accumulated counts? @julien-c
Thanks for your help!
QuanSun changed discussion status to closed