
Diego Caumont

Diegg

AI & ML interests

None yet

Recent Activity

liked a model 17 days ago
mradermacher/BabyHercules-4x150M-GGUF
liked a model about 2 months ago
vikhyatk/moondream-next

Organizations

None yet

Diegg's activity

Reacted to mrfakename's post with 👀 8 months ago
Mistral AI recently released a new Mixtral model. It's another Mixture of Experts model with 8 experts, each with 22B parameters. It requires over 200GB of VRAM to run in float16, and over 70GB of VRAM to run in int4. However, individuals have been successful at finetuning it on Apple Silicon laptops using the MLX framework. It features a 64K context window, twice that of their previous models (32K).

The model was released via torrent, a distribution method Mistral has used for several recent releases. While the license has not been officially confirmed, a moderator on their Discord server suggested yesterday that it is Apache 2.0.

Sources:
https://twitter.com/_philschmid/status/1778051363554934874
https://twitter.com/reach_vb/status/1777946948617605384
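The VRAM figures quoted in the post can be sanity-checked with back-of-the-envelope arithmetic: weights alone need roughly `parameters × bits ÷ 8` bytes. The ~141B total parameter count used below is an assumption (the 8 experts share attention and other non-expert layers, so the total is well under a naive 8 × 22B = 176B), and the estimate ignores KV cache, activations, and framework overhead.

```python
def vram_gb(n_params: float, bits_per_param: int) -> float:
    """Rough VRAM estimate (in GB) for model weights alone.

    Ignores KV cache, activations, and framework overhead.
    """
    return n_params * bits_per_param / 8 / 1e9

# Assumed total parameter count for Mixtral 8x22B (~141B): experts
# share attention layers, so the total is less than 8 x 22B.
total_params = 141e9

print(f"float16: ~{vram_gb(total_params, 16):.0f} GB")  # consistent with "over 200GB"
print(f"int4:    ~{vram_gb(total_params, 4):.1f} GB")   # consistent with "over 70GB"
```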
liked a Space over 1 year ago