---
base_model: unsloth/phi-3.5-mini-instruct-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- phi3
- trl
license: apache-2.0
language:
- en
metrics:
- bleu
- cer
- meteor
library_name: transformers
---
# Phi-3.5 Mini Finetuned Model

## 1. Introduction
This model is a finetuned version of Microsoft's Phi-3.5 Mini instruct model. It is designed to answer university course-related queries with detailed, accurate responses: course details, fee structures, duration options, and campus locations, along with links to the relevant course pages. The finetuning used a domain-specific dataset to improve precision and reliability.

---

## 2. Dataset Used for Finetuning
The Phi-3.5 Mini model was finetuned on a private dataset created through web scraping. The data was collected from the University of Westminster website and included:

- Course titles
- Campus details
- Duration options (full-time, part-time, distance learning)
- Fee structures (for UK and international students)
- Course descriptions
- Direct links to course pages

The dataset was cleaned and structured to enhance the model's ability to generate accurate and context-aware responses.
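For illustration, a single record in such a cleaned dataset might look like the sketch below. The field names and values are assumptions for illustration only, not the actual schema of the private dataset:

```python
# Hypothetical example of one cleaned dataset record
# (field names and values are illustrative, not the real schema).
record = {
    "course_title": "AI, Data and Communication MA",
    "campus": "Harrow",
    "duration_options": ["full-time", "part-time", "distance learning"],
    "fees": {"uk": "...", "international": "..."},
    "description": "Explores the intersection of AI, data and media ...",
    "url": "https://www.westminster.ac.uk/...",
}

print(sorted(record))
```

Structuring each scraped page into a flat record like this makes it straightforward to render question–answer training pairs for finetuning.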

---

## 3. How to Use This Model
To query the finetuned model, use the snippet below. It assumes `model` and `tokenizer` have already been loaded into memory:

```python
from transformers import TextStreamer

# Assumes `model` and `tokenizer` have already been loaded
# (for example with unsloth's FastLanguageModel.from_pretrained).
def chatml(question, model):
    messages = [{"role": "user", "content": question}]

    # Render the conversation with the model's chat template and
    # move the resulting token IDs to the GPU.
    inputs = tokenizer.apply_chat_template(
        messages,
        tokenize=True,
        add_generation_prompt=True,
        return_tensors="pt",
    ).to("cuda")

    # Stream the generated tokens to stdout as they are produced.
    text_streamer = TextStreamer(tokenizer, skip_special_tokens=True,
                                 skip_prompt=True)
    return model.generate(input_ids=inputs,
                          streamer=text_streamer,
                          max_new_tokens=512)

question = "Does the University of Westminster offer a course on AI, Data and Communication MA?"
answer = chatml(question, model)
```
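Under the hood, `tokenizer.apply_chat_template` renders the message list into Phi-3.5's chat format before tokenization. The sketch below mimics that rendering for a conversation with `add_generation_prompt=True`; the authoritative template ships with the tokenizer, so `render_phi35_prompt` is purely illustrative:

```python
# Illustrative re-implementation of what apply_chat_template produces
# for Phi-3.5 Mini (the real template is bundled with the tokenizer).
def render_phi35_prompt(messages):
    parts = [f"<|{m['role']}|>\n{m['content']}<|end|>\n" for m in messages]
    parts.append("<|assistant|>\n")  # add_generation_prompt=True
    return "".join(parts)

prompt = render_phi35_prompt(
    [{"role": "user", "content": "Does the University of Westminster offer an AI MA?"}]
)
print(prompt)
```

The trailing `<|assistant|>` marker is what cues the model to generate its reply rather than continue the user's turn.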

With this setup you can query the finetuned model and stream detailed, course-specific responses as they are generated.

---


# Uploaded model

- **Developed by:** roger33303
- **License:** apache-2.0
- **Finetuned from model:** unsloth/phi-3.5-mini-instruct-bnb-4bit