Dicta-LM 3.0: Advancing The Frontier of Hebrew Sovereign LLMs
Dicta-LM 3.0 is a powerful open-weight collection of LLMs, trained on extensive corpora of Hebrew and English text. The models are freely available for download and unrestricted use, and they set a new state of the art (SOTA) for Hebrew in their weight class, both as base models and as chat models.
This is the 1.7-billion-parameter instruct model, originally initialized from Qwen3-1.7B-Base.
This version of the model is dynamically quantized to FP8, targeting the NVIDIA Hopper and Blackwell architectures for faster inference with a lower memory footprint. The model can run in under 4 GB of VRAM.
For full details of this model, please read our release blog post or the technical report.
You can view and access the full collection of base/instruct unquantized/quantized versions of DictaLM 3.0 here.
Instruction format
In order to leverage instruction fine-tuning, your prompt should be rendered with the chat template specified for this model. Most libraries handle this automatically, so you can usually just let them do it.
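As a sketch of what that looks like when you do want to render the prompt yourself, the Hugging Face transformers tokenizer exposes apply_chat_template (this assumes the FP8 repo ships the same chat template as the base instruct model, and requires downloading the tokenizer):

```python
from transformers import AutoTokenizer

# The tokenizer carries the chat template for this model.
tokenizer = AutoTokenizer.from_pretrained("dicta-il/DictaLM-3.0-1.7B-Instruct-FP8")

messages = [
    {"role": "user", "content": "Hello, how are you?"},
]

# Render the conversation into the prompt string the model expects;
# add_generation_prompt appends the tokens that cue the assistant turn.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)
```

When serving with vLLM (below), this step happens server-side, so you only pass the messages list.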
Usage
vLLM
vllm serve dicta-il/DictaLM-3.0-1.7B-Instruct-FP8 --enable-auto-tool-choice --tool-call-parser hermes
And then you can access it via the openai library:
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",
    api_key="sk-no-key-required",
)

response = client.chat.completions.create(
    model="dicta-il/DictaLM-3.0-1.7B-Instruct-FP8",
    messages=[
        {"role": "user", "content": "Hello, how are you?"},
    ],
)

print(response.choices[0].message.content)
The model supports tool calling, enabling integration with external tools and APIs. For examples of how to use tool calling, see the vLLM documentation.
Citation
If you use this model, please cite:
@article{Shmidman2025DictaLM3,
title={{Dicta-LM 3.0: Advancing The Frontier of Hebrew Sovereign LLMs}},
author={Shaltiel Shmidman and Avi Shmidman and Amir DN Cohen and Moshe Koppel},
year={2025},
publisher={{DICTA / Jerusalem, Israel}},
note={https://www.dicta.org.il/publications/DictaLM_3_0___Techincal_Report.pdf}
}