IQRA-AI/gemma-3-4b-Quran-1epochs

Model Description

IQRA-AI/gemma-3-4b-Quran-1epochs is a fine-tuned version of Google's Gemma 3 4B model, trained for 1 epoch on a dataset of Quranic knowledge. It can provide information about the Quran, its chapters (surahs), verses (ayat), and related Islamic topics.

Use Cases

This model is designed to:

  • Answer questions about the Quran
  • Provide information about the revelation of Quranic chapters
  • Explain Quranic verses and their meanings
  • Share knowledge about Islamic principles derived from the Quran

Limitations

  • The model has been trained for only 1 epoch, so its coverage may be less comprehensive than that of more extensively trained models
  • As with any language model, outputs should be verified against authentic Islamic sources
  • The model may occasionally generate inaccurate information

Usage

Installation

pip install transformers accelerate torch
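
Gemma 3 support landed in recent transformers releases; if importing Gemma3ForConditionalGeneration fails, upgrading usually resolves it. The version pin below is an assumption (roughly the release that introduced Gemma 3):

pip install -U "transformers>=4.50.0" accelerate torch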

Inference Example

from transformers import AutoProcessor, Gemma3ForConditionalGeneration
import torch

model_id = "IQRA-AI/gemma-3-4b-Quran-1epochs"

# Load the fine-tuned model; device_map="auto" places it on the available GPU(s)/CPU
model = Gemma3ForConditionalGeneration.from_pretrained(
    model_id, device_map="auto"
).eval()

processor = AutoProcessor.from_pretrained(model_id)

# The user prompt is Indonesian for "Where was Surah Al Fatihah revealed?"
messages = [
    {
        "role": "system",
        "content": [{"type": "text", "text": ""}]
    },
    {
        "role": "user",
        "content": [
            {"type": "text", "text": "Surah Al Fatihah turun dimana?"}
        ]
    }
]

inputs = processor.apply_chat_template(
    messages, add_generation_prompt=True, tokenize=True,
    return_dict=True, return_tensors="pt"
).to(model.device, dtype=torch.bfloat16)

input_len = inputs["input_ids"].shape[-1]

with torch.inference_mode():
    generation = model.generate(**inputs, max_new_tokens=100, do_sample=False)
    generation = generation[0][input_len:]  # keep only the newly generated tokens

decoded = processor.decode(generation, skip_special_tokens=True)
print(decoded)
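
If you prefer a higher-level interface, the same checkpoint can typically be driven through the transformers pipeline API. The snippet below is a minimal sketch, assuming the "image-text-to-text" pipeline task used by Gemma 3 instruction-tuned checkpoints; it accepts text-only chat messages as well.

from transformers import pipeline
import torch

# Minimal sketch: load the fine-tuned checkpoint behind a pipeline
pipe = pipeline(
    "image-text-to-text",
    model="IQRA-AI/gemma-3-4b-Quran-1epochs",
    device_map="auto",
    torch_dtype=torch.bfloat16,
)

messages = [
    {"role": "user", "content": [{"type": "text", "text": "Surah Al Fatihah turun dimana?"}]}
]

# generated_text holds the full conversation; the last turn is the model's reply
output = pipe(text=messages, max_new_tokens=100)
print(output[0]["generated_text"][-1]["content"])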

Model Training
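
The checkpoint was produced by fine-tuning a Gemma 3 4B model for a single epoch on a Quranic knowledge dataset. The exact training recipe is not published in this card; the sketch below only illustrates how a comparable 1-epoch supervised fine-tune could be set up with TRL's SFTTrainer. The dataset id, base checkpoint, and all hyperparameters are placeholder assumptions, not the values used for this model.

# Illustrative only: a generic 1-epoch supervised fine-tune with TRL's SFTTrainer.
# The dataset id, base checkpoint, and hyperparameters are placeholder assumptions.
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

train_dataset = load_dataset("your-org/quran-qa", split="train")  # hypothetical dataset id

config = SFTConfig(
    output_dir="gemma-3-4b-quran-sft",
    num_train_epochs=1,              # matches the "1epochs" suffix of this model
    per_device_train_batch_size=1,   # assumption
    gradient_accumulation_steps=8,   # assumption
    learning_rate=2e-5,              # assumption
    bf16=True,
)

trainer = SFTTrainer(
    model="google/gemma-3-4b-it",    # assumed base checkpoint
    args=config,
    train_dataset=train_dataset,     # expects a chat-style "messages" column
)
trainer.train()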

License

This model is subject to the Gemma 3 license. Please ensure you follow all licensing requirements when using this model.

Citation

If you use this model in your research or applications, please cite:

@misc{iqra-ai-gemma-3-4b-quran,
  author       = {Ariel Fikru Avicenna and IQRA-AI},
  title        = {Gemma 3 4B Quran 1 Epoch},
  year         = {2025},
  publisher    = {Hugging Face},
  howpublished = {\url{https://huggingface.co/IQRA-AI/gemma-3-4b-Quran-1epochs}}
}

Contact

For questions, suggestions, or issues related to this model, please contact us through the Hugging Face model repository.
