---
language:
- ar
tags:
- llama
- text-generation
- instruct
- arabic
- math
- fine-tuned
datasets:
- Jr23xd23/Arabic_LLaMA_Math_Dataset
license: apache-2.0
base_model: meta-llama/Llama-3.2-3B-Instruct
pipeline_tag: text-generation
inference: true
---

# Math_Arabic_Llama-3.2-3B-Instruct

## Model Description

**Math_Arabic_Llama-3.2-3B-Instruct** is a fine-tuned version of the **Llama-3.2-3B-Instruct** model, tailored for solving mathematical problems in Arabic. It was trained on the **[Arabic LLaMA Math Dataset](https://github.com/jaberjaber23/Arabic-LLaMA-Math-Dataset)**, which covers a wide range of mathematical problems stated in natural Arabic. The model is well suited to educational applications, tutoring, and systems that require automatic math problem-solving in Arabic.

## Model Details

- **Model Type**: Transformer-based language model fine-tuned for text generation
- **Languages**: Arabic
- **Base Model**: [meta-llama/Llama-3.2-3B-Instruct](https://huggingface.co/meta-llama/Llama-3.2-3B-Instruct)
- **Dataset**: [Arabic LLaMA Math Dataset](https://github.com/jaberjaber23/Arabic-LLaMA-Math-Dataset)
- **Number of Parameters**: 3 billion
- **Fine-tuned by**: [Jr23xd23](https://huggingface.co/Jr23xd23)

## Training Data

The model was fine-tuned on the **Arabic LLaMA Math Dataset**, which consists of 12,496 examples covering various mathematical topics, such as:

- Basic Arithmetic
- Algebra
- Geometry
- Probability
- Combinatorics

Each example in the dataset includes:
- **Instruction**: The problem statement in Arabic
- **Solution**: The solution to the problem in Arabic
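To make the instruction/solution structure concrete, here is a minimal sketch of how one pair could be assembled into a single training string. The `### Instruction:` / `### Solution:` delimiters are illustrative assumptions, not necessarily the exact template used during fine-tuning:

```python
def format_example(instruction: str, solution: str) -> str:
    """Join an Arabic instruction/solution pair into one prompt string.

    The "### ..." section markers below are a common convention for
    instruction tuning; the actual template used for this model may differ.
    """
    return f"### Instruction:\n{instruction}\n\n### Solution:\n{solution}"

prompt = format_example(
    "ما هو مجموع الزوايا في مثلث؟",          # What is the sum of angles in a triangle?
    "مجموع الزوايا في أي مثلث هو 180 درجة.",  # The sum of angles in any triangle is 180 degrees.
)
print(prompt)
```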

## Intended Use

### Primary Use Cases

- Solving mathematical problems in Arabic
- Educational applications
- Tutoring systems for Arabic-speaking students
- Mathematical reasoning tasks in Arabic

### How to Use

You can use the model in Python with the Hugging Face `transformers` library:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Jr23xd23/Math_Arabic_Llama-3.2-3B-Instruct")
model = AutoModelForCausalLM.from_pretrained("Jr23xd23/Math_Arabic_Llama-3.2-3B-Instruct")

# Example: Solving a math problem in Arabic
input_text = "ما هو مجموع الزوايا في مثلث؟" # What is the sum of angles in a triangle?
inputs = tokenizer(input_text, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=100)  # cap generated tokens, not total length
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

## Limitations

- The model is not designed for non-mathematical language tasks.
- Performance may degrade when applied to highly complex mathematical problems beyond the scope of the training dataset.
- The model's outputs should be verified for critical applications.

## License

This model is licensed under the **Apache 2.0 License**.

## Citation

If you use this model in your research or projects, please cite it as follows:

```bibtex
@misc{Math_Arabic_Llama_3.2_3B_Instruct,
  title={Math_Arabic_Llama-3.2-3B-Instruct},
  author={Jr23xd23},
  year={2024},
  publisher={Hugging Face},
  url={https://huggingface.co/Jr23xd23/Math_Arabic_Llama-3.2-3B-Instruct},
}
```

## Acknowledgements

Special thanks to the creators of the **Arabic LLaMA Math Dataset** for providing a rich resource for fine-tuning the model.