# 🕌 Qwen Islamic Expert - Cultural Knowledge Specialist

## Model Description
Qwen Islamic Expert is a specialized LoRA (Low-Rank Adaptation) fine-tuned model based on Qwen/Qwen2.5-7B-Instruct, designed to provide accurate and comprehensive knowledge about Islamic culture, history, traditions, and civilization.
This model has been specifically trained to understand and respond to questions about:
- Islamic Jurisprudence (Fiqh) - Religious rulings and different schools of Islamic law
- Islamic History - Prophetic biography, Caliphs, Islamic empires and civilizations
- Islamic Ethics & Social Conduct - Moral principles derived from Islamic teachings
- Islamic Traditions & Customs - Cultural practices across different Muslim societies
- Islamic Philosophy & Sciences - Intellectual contributions to world civilization
- Islamic Arts & Architecture - Cultural expressions, literature, and architectural heritage
- Islamic Law & Social Systems - Legal frameworks and socio-economic principles
## 🎯 Performance

| Metric | Score | Comparison |
|---|---|---|
| Validation Accuracy | 72.3% | 🏆 Beats NileChat-3B baseline (69.5%) |
| Improvement over baseline | +2.8 points | ✅ Absolute accuracy gain over NileChat-3B |
| Cultural Understanding | High | ✅ Specialized Islamic knowledge |
| Multilingual Support | Arabic/English | ✅ Bilingual capabilities |
## 🚀 Quick Start

### Installation

```bash
pip install torch transformers peft accelerate bitsandbytes
```
### Basic Usage

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import PeftModel

# 4-bit quantization keeps the 7B base model within a single-GPU memory budget
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

# Load the base model with quantization
base_model = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen2.5-7B-Instruct",
    quantization_config=bnb_config,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Load the tokenizer
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2.5-7B-Instruct")

# Load the Islamic Expert LoRA adapter
model = PeftModel.from_pretrained(
    base_model,
    "rafiulbiswas/qwen-islamic-expert",
    torch_dtype=torch.bfloat16,
)

# Set to evaluation mode
model.eval()
```
### Example Usage

```python
def ask_islamic_question(question, options=None):
    # System prompt (Arabic): "You are a scholar specializing in Islamic culture and Islamic civilization."
    system_prompt = "أنت عالم متخصص في الثقافة الإسلامية والحضارة الإسلامية."
    if options:
        # Multiple-choice format; the closing line asks (in Arabic):
        # "Based on your deep knowledge of Islamic culture, what is the correct answer?"
        prompt = f'''<|im_start|>system
{system_prompt}<|im_end|>
<|im_start|>user
{question}
A. {options[0]}
B. {options[1]}
C. {options[2]}
D. {options[3]}
بناءً على معرفتك العميقة بالثقافة الإسلامية، ما هي الإجابة الصحيحة؟<|im_end|>
<|im_start|>assistant
'''
    else:
        # Open-ended question
        prompt = f'''<|im_start|>system
{system_prompt}<|im_end|>
<|im_start|>user
{question}<|im_end|>
<|im_start|>assistant
'''

    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    with torch.no_grad():
        outputs = model.generate(
            **inputs,
            max_new_tokens=150,
            temperature=0.7,
            do_sample=True,
            pad_token_id=tokenizer.eos_token_id,
        )
    # Decode only the newly generated tokens
    response = tokenizer.decode(outputs[0][inputs.input_ids.shape[1]:], skip_special_tokens=True)
    return response.strip()


# Example 1: Multiple-choice question ("What are the five pillars of Islam?")
question = "ما هي أركان الإسلام الخمسة؟"
options = [
    "الصلاة والزكاة والصوم والحج والشهادة",        # prayer, zakat, fasting, hajj, and the shahada
    "القرآن والسنة والإجماع والقياس والمصلحة",      # Qur'an, Sunnah, consensus, analogy, and public interest
    "الإيمان والإسلام والإحسان والقدر والبعث",      # faith, Islam, ihsan, divine decree, and resurrection
    "العدل والحرية والمساواة والشورى والكرامة",     # justice, freedom, equality, shura, and dignity
]
answer = ask_islamic_question(question, options)
print(f"الجواب: {answer}")  # "Answer: ..."

# Example 2: Open-ended question ("Explain the concept of justice in Islam")
question = "اشرح مفهوم العدالة في الإسلام"
answer = ask_islamic_question(question)
print(f"الجواب: {answer}")  # "Answer: ..."
```
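
As an alternative to hand-writing the ChatML markers, the prompt can be built with the tokenizer's built-in `apply_chat_template`. The following is a minimal sketch using the same Arabic system prompt as above, not the authors' exact inference code:

```python
# Minimal sketch: build the prompt with the tokenizer's chat template
# instead of writing <|im_start|> / <|im_end|> markers by hand.
messages = [
    # System prompt (Arabic): "You are a scholar specializing in Islamic culture and Islamic civilization."
    {"role": "system", "content": "أنت عالم متخصص في الثقافة الإسلامية والحضارة الإسلامية."},
    # "Explain the concept of justice in Islam"
    {"role": "user", "content": "اشرح مفهوم العدالة في الإسلام"},
]
prompt = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,  # appends the assistant turn header
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=150, temperature=0.7, do_sample=True)
print(tokenizer.decode(outputs[0][inputs.input_ids.shape[1]:], skip_special_tokens=True))
```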
## 📊 Training Details

### Dataset

- Primary Dataset: UBC-NLP/palmx_2025_subtask2_islamic (see the loading and scoring sketch after this list)
- Task Type: Multiple-choice question answering
- Domain: Islamic culture, history, traditions, and civilization
- Languages: Arabic and English
- Training Samples: ~750 examples
- Validation Samples: ~75 examples
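
For reference, the evaluation data can be loaded and scored along these lines. This is a minimal sketch: the split name and the `question`/`options`/`label` column names are assumptions rather than the published schema, and `ask_islamic_question` is the helper defined in Example Usage above.

```python
from datasets import load_dataset

# Minimal sketch: load the PalmX Islamic-culture subtask and score the model on it.
# Split and column names are assumptions; check the dataset card for the real schema.
dataset = load_dataset("UBC-NLP/palmx_2025_subtask2_islamic")
print(dataset)  # shows the available splits and their sizes

def first_letter(response):
    """Return the first A-D letter found in the model's response, if any."""
    for ch in response.strip().upper():
        if ch in "ABCD":
            return ch
    return None

val = dataset["validation"]  # assumed split name
correct = sum(
    first_letter(ask_islamic_question(ex["question"], ex["options"])) == ex["label"]
    for ex in val
)
print(f"Validation accuracy: {correct / len(val):.1%}")
```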
### Training Configuration

- Base Model: Qwen/Qwen2.5-7B-Instruct
- Method: LoRA (Low-Rank Adaptation); see the configuration sketch after this list
- LoRA Rank: 32
- LoRA Alpha: 64
- LoRA Dropout: 0.05
- Target Modules: q_proj, k_proj, v_proj, o_proj, gate_proj, up_proj, down_proj, embed_tokens, lm_head
- Training Epochs: 5
- Learning Rate: 1e-4
- Batch Size: 16 (effective, via gradient accumulation)
- Optimizer: AdamW
- Scheduler: Cosine with restarts
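
The hyperparameters above map onto a PEFT configuration roughly like the following. This is a minimal sketch: only the values listed in this card are taken from the training run; everything else is left at PEFT defaults.

```python
from peft import LoraConfig

# Minimal sketch of the LoRA setup described above.
lora_config = LoraConfig(
    r=32,               # LoRA rank
    lora_alpha=64,      # scaling factor
    lora_dropout=0.05,
    target_modules=[
        "q_proj", "k_proj", "v_proj", "o_proj",
        "gate_proj", "up_proj", "down_proj",
        "embed_tokens", "lm_head",
    ],
    task_type="CAUSAL_LM",
)
```

The trainable adapter is then attached with `peft.get_peft_model`, as sketched in the next section.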
### Training Infrastructure

- Hardware: GPU with 4-bit quantization (BitsAndBytesConfig); see the sketch after this list
- Precision: bfloat16
- Memory Optimization: Gradient checkpointing, 4-bit quantization
- Framework: HuggingFace Transformers + PEFT
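
Put together, a memory-efficient QLoRA-style setup along these lines could look as follows. This is a minimal sketch assuming standard Transformers + PEFT APIs and the `lora_config` from the previous section, not the authors' exact training script.

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import get_peft_model, prepare_model_for_kbit_training

# 4-bit quantization with bfloat16 compute, as described above.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

base = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen2.5-7B-Instruct",
    quantization_config=bnb_config,
    device_map="auto",
)

# Memory optimizations referenced above: gradient checkpointing + k-bit preparation.
base.gradient_checkpointing_enable()
base = prepare_model_for_kbit_training(base)

# Attach the LoRA adapter (lora_config from the Training Configuration sketch).
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()
```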
## 🎓 Model Capabilities

### Strengths
- ✅ Islamic Jurisprudence: Accurate knowledge of Fiqh principles and rulings
- ✅ Historical Knowledge: Comprehensive understanding of Islamic history
- ✅ Cultural Sensitivity: Respectful and accurate cultural representations
- ✅ Multilingual: Responds appropriately in both Arabic and English
- ✅ Contextual Understanding: Grasps nuanced cultural and religious concepts

### Example Topics Covered
- Worship & Rituals: Prayer, fasting, pilgrimage, charity
- Islamic Law: Halal/Haram, marriage, inheritance, business ethics
- History: Prophetic era, Rashidun Caliphate, Umayyad, Abbasid periods
- Philosophy: Islamic philosophy, theology, and intellectual traditions
- Culture: Art, architecture, literature, music, and social customs
- Geography: Islamic world, historical centers of learning
## ⚖️ Ethical Considerations & Limitations

### Responsible Use
- ✅ Educational Purpose: Designed for learning about Islamic culture and history
- ✅ Factual Information: Trained on scholarly and authentic sources
- ✅ Cultural Respect: Maintains respectful tone toward Islamic traditions
- ⚠️ Not Religious Authority: Should not replace consultation with qualified Islamic scholars
- ⚠️ Academic Context: Best used for educational and research purposes

### Limitations
- Scope: Specialized for Islamic culture; may not perform well on general tasks
- Training Data: Limited to specific dataset; may not cover all cultural nuances
- Language: Optimized for Arabic/English; performance in other languages not guaranteed
- Temporal Knowledge: Training data has cutoff date; recent events may not be covered
## 📚 Citation & References

If you use this model in your research or applications, please cite:

```bibtex
@misc{qwen-islamic-expert-2024,
  title={Qwen Islamic Expert: A Specialized Cultural Knowledge Model},
  author={Md. Rafiul Biswas and Kais Attia and Shimaa Ibrahim and Mabrouka Bessghaier and Firoj Alam and Wajdi Zaghouani},
  year={2024},
  howpublished={Hugging Face Model Hub},
  url={https://huggingface.co/rafiulbiswas/qwen-islamic-expert}
}
```
## 🔄 Version History

### Version 1.0 (Current)
- Release Date: July 25, 2025
- Checkpoint: checkpoint-100
- Performance: 72.3% validation accuracy
- Features: Initial release with Islamic cultural knowledge specialization
This model is designed to promote understanding and education about Islamic culture and civilization. Please use responsibly and consult qualified scholars for religious guidance.