---
language:
- ar
datasets:
- arabic QA
widget:
- text: "answer: 7 سنوات ونصف context: الثورة الجزائرية أو ثورة المليون شهيد، اندلعت في 1 نوفمبر 1954 ضد المستعمر الفرنسي ودامت 7 سنوات ونصف. استشهد فيها أكثر من مليون ونصف مليون جزائري"
---
# Arabic Question Generation Model

[AraT5-Base Model](https://huggingface.co/UBC-NLP/AraT5-base) fine-tuned on an Arabic question-answering dataset for **question generation**, by simply prepending the *answer* to the *context*.

## Model in Action 🚀
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model = AutoModelForSeq2SeqLM.from_pretrained("Mihakram/Arabic_Question_Generation")
tokenizer = AutoTokenizer.from_pretrained("Mihakram/Arabic_Question_Generation")

def get_question(context, answer):
    # Build the model input from the context and the answer, each tagged with its prefix.
    text = "context: " + context + " " + "answer: " + answer + " </s>"
    text_encoding = tokenizer(text, return_tensors="pt")

    model.eval()
    generated_ids = model.generate(
        input_ids=text_encoding["input_ids"],
        attention_mask=text_encoding["attention_mask"],
        max_length=64,
        num_beams=5,
        num_return_sequences=1,
    )

    # Decode the best beam, dropping special tokens such as </s>.
    return tokenizer.decode(
        generated_ids[0],
        skip_special_tokens=True,
        clean_up_tokenization_spaces=True,
    )

context = "الثورة الجزائرية أو ثورة المليون شهيد، اندلعت في 1 نوفمبر 1954 ضد المستعمر الفرنسي ودامت 7 سنوات ونصف. استشهد فيها أكثر من مليون ونصف مليون جزائري"
answer = "7 سنوات ونصف"

get_question(context, answer)

# output: question = "كم استمرت الثورة الجزائرية؟"  (How long did the Algerian revolution last?)
```
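For quick experiments, the same checkpoint can also be driven through the generic `text2text-generation` pipeline. The snippet below is a minimal sketch, not part of the original example: it assumes the same `context: ... answer: ...` input format used by `get_question` above.

```python
from transformers import pipeline

# Sketch: the text2text-generation pipeline bundles tokenization, generation, and decoding.
qg = pipeline("text2text-generation", model="Mihakram/Arabic_Question_Generation")

context = "الثورة الجزائرية أو ثورة المليون شهيد، اندلعت في 1 نوفمبر 1954 ضد المستعمر الفرنسي ودامت 7 سنوات ونصف."
answer = "7 سنوات ونصف"

# Assumes the same "context: ... answer: ..." format as get_question() above.
result = qg("context: " + context + " answer: " + answer, max_length=64, num_beams=5)
print(result[0]["generated_text"])
```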
## Details of AraT5

The **AraT5** model was presented in [AraT5: Text-to-Text Transformers for Arabic Language Generation](https://arxiv.org/abs/2109.12068) by *El Moatez Billah Nagoudi, AbdelRahim Elmadany, and Muhammad Abdul-Mageed*.
## Citation

If you want to cite this model, you can use the following:
```bibtex
@misc{Mihakram_Arabic_Question_Generation,
  title={Arabic Question Generation},
  author={Mihoubi, Ibrir},
  publisher={Hugging Face},
  journal={Hugging Face Hub},
  howpublished={\url{https://huggingface.co/Mihakram/Arabic_Question_Generation}},
  year={2022}
}
```
> Created by [Akram Mihoubi](https://www.linkedin.com/in/mihoubi-akram/)