# mlx-community/Phi-3.5-vision-instruct-bf16

This model was converted to MLX format from [microsoft/Phi-3.5-vision-instruct](https://huggingface.co/microsoft/Phi-3.5-vision-instruct) using mlx-vlm version 0.0.13. Refer to the original model card for more details on the model.

## Use with mlx

```bash
pip install -U mlx-vlm
python -m mlx_vlm.generate --model mlx-community/Phi-3.5-vision-instruct-bf16 --max-tokens 100 --temp 0.0
```
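
mlx-vlm can also be used programmatically. The sketch below is a minimal example assuming the `load`, `generate`, `load_config`, and `apply_chat_template` helpers available in recent mlx-vlm releases; the exact argument order and keyword names of `generate` can differ between versions, and `example.jpg` is a placeholder image path.

```python
# Minimal sketch: captioning an image with the mlx-vlm Python API.
# Assumes a recent mlx-vlm release; check your installed version for the exact signatures.
from mlx_vlm import load, generate
from mlx_vlm.prompt_utils import apply_chat_template
from mlx_vlm.utils import load_config

model_path = "mlx-community/Phi-3.5-vision-instruct-bf16"
model, processor = load(model_path)   # downloads the weights from the Hub if not cached
config = load_config(model_path)

images = ["example.jpg"]              # placeholder image path
prompt = "Describe this image."

# Wrap the user prompt in the model's chat template, declaring one image slot.
formatted_prompt = apply_chat_template(processor, config, prompt, num_images=len(images))

output = generate(model, processor, formatted_prompt, images, max_tokens=100)
print(output)
```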