# mlx-community/FastVLM-0.5B-bf16

This model was converted to MLX format from [apple/FastVLM-0.5B](https://huggingface.co/apple/FastVLM-0.5B) using [mlx-vlm](https://github.com/Blaizzy/mlx-vlm) from this PR. Refer to the original model card for more details on the model.

## Use with mlx

```bash
pip install -U mlx-vlm
python -m mlx_vlm.generate \
  --model mlx-community/FastVLM-0.5B-bf16 \
  --max-tokens 100 \
  --temperature 0.0 \
  --prompt "Describe this image in detail." \
  --image https://huggingface.co/datasets/huggingface/documentation-images/resolve/0052a70beed5bf71b92610a43a52df6d286cd5f3/diffusers/rabbit.jpg
```
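The model can also be called from Python. Below is a minimal sketch using the high-level mlx-vlm API (`load`, `generate`, `apply_chat_template`) as shown in the mlx-vlm README; the library is still evolving, so exact signatures and keyword names may differ between versions, and the image filename here is a hypothetical local file.

```python
# Minimal sketch of the mlx-vlm Python API, assuming the
# load/generate interface from the mlx-vlm README; check the
# docs for your installed version before relying on it.
from mlx_vlm import load, generate
from mlx_vlm.prompt_utils import apply_chat_template
from mlx_vlm.utils import load_config

model_path = "mlx-community/FastVLM-0.5B-bf16"

# Load the converted weights and the matching processor.
model, processor = load(model_path)
config = load_config(model_path)

images = ["rabbit.jpg"]  # hypothetical local file; a URL works the same way as in the CLI example
prompt = "Describe this image in detail."

# Wrap the raw prompt in the model's chat template.
formatted_prompt = apply_chat_template(processor, config, prompt, num_images=len(images))

# Decoding options such as max tokens and temperature mirror the CLI
# flags above, but their keyword names may vary across mlx-vlm versions.
output = generate(model, processor, formatted_prompt, images, verbose=False)
print(output)
```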
Model size: 0.6B params · Tensor type: BF16 (Safetensors)