---
language:
  - en
license: apache-2.0
tags:
  - multimodal
  - vision
  - image-text-to-text
  - mlx
datasets:
  - HuggingFaceM4/OBELICS
  - laion/laion-coco
  - wikipedia
  - facebook/pmd
  - pixparse/idl-wds
  - pixparse/pdfa-eng-wds
  - wendlerc/RenderedText
  - HuggingFaceM4/the_cauldron
  - teknium/OpenHermes-2.5
  - GAIR/lima
  - databricks/databricks-dolly-15k
  - meta-math/MetaMathQA
  - TIGER-Lab/MathInstruct
  - microsoft/orca-math-word-problems-200k
  - camel-ai/math
  - AtlasUnified/atlas-math-sets
  - tiedong/goat
---

# mlx-community/idefics2-8b-4bit

This model was converted to MLX format from [HuggingFaceM4/idefics2-8b](https://huggingface.co/HuggingFaceM4/idefics2-8b) using mlx-vlm version 0.0.4. Refer to the [original model card](https://huggingface.co/HuggingFaceM4/idefics2-8b) for more details on the model.

## Use with mlx

```bash
pip install -U mlx-vlm
```

```bash
python -m mlx_vlm.generate --model mlx-community/idefics2-8b-4bit --max-tokens 100 --temp 0.0
```
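
For scripted use, mlx-vlm also exposes a Python API through its `load` and `generate` helpers. The snippet below is a minimal sketch, assuming those helpers behave as in recent mlx-vlm releases; the exact `generate()` argument names and order may differ in version 0.0.4, and the image URL is only a placeholder.

```python
from mlx_vlm import load, generate

# Load the 4-bit MLX weights and the matching processor from the Hub.
model, processor = load("mlx-community/idefics2-8b-4bit")

# Ask a question about an image (URL or local path).
# Note: the generate() signature varies between mlx-vlm versions;
# check `python -m mlx_vlm.generate --help` for the version you installed.
output = generate(
    model,
    processor,
    "http://images.cocodataset.org/val2017/000000039769.jpg",  # placeholder image
    "Describe this image.",
    max_tokens=100,
    temp=0.0,
)
print(output)
```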