library_name: transformers
---
# ibleducation/ibl-fordham-7b
ibleducation/ibl-fordham-7b is a model finetuned on top of openchat/openchat_3.5.
It is finetuned to answer questions about Fordham University.
## Model Details
- **Developed by:** [IBL Education](https://ibl.ai)
- **Model type:** [Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1)
- **Base Model:** [OpenChat 3.5](https://huggingface.co/openchat/openchat_3.5)
- **Language:** English
- **Finetuned from weights:** [OpenChat 3.5](https://huggingface.co/openchat/openchat_3.5)
- **Finetuned on data:**
- [ibleducation/fordham-university](https://huggingface.co/datasets/ibleducation/fordham-university)
- **Model License:** Apache 2.0
- **Epochs:** 7
## How to Use the ibl-fordham-7b Model from Python Code (Hugging Face transformers)
### Install the necessary packages
Requires: [transformers](https://pypi.org/project/transformers/) 4.35.0 or later, and [accelerate](https://pypi.org/project/accelerate/) 0.23.0 or later.
```shell
pip install transformers==4.35.0
pip install accelerate==0.23.0
```
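If you want to confirm that the installed versions meet the requirements above, you can check them from Python (a quick sanity check; the version numbers are the ones listed in this card, not additional constraints):
```python
import transformers
import accelerate

# The example below was written against transformers 4.35.0 and accelerate 0.23.0.
print(transformers.__version__)
print(accelerate.__version__)
```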
### You can then try the following example code
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import transformers
import torch

model_id = "ibleducation/ibl-fordham-7b"

# Load the tokenizer and model; device_map="auto" lets accelerate place the
# weights on the available GPU(s) or fall back to CPU.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
)

# Build a text-generation pipeline around the loaded model and tokenizer.
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
)

prompt = "What programmes are offered at Fordham University?"

# The pipeline returns a list of dicts, one per generated sequence.
response = pipeline(prompt)
print(response[0]['generated_text'])
```
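By default the pipeline uses the model's default generation settings. If you need longer or more controlled answers, the standard `transformers` generation arguments (for example `max_new_tokens`, `do_sample`, `temperature`) can be passed directly to the pipeline call. The values below are illustrative, not settings tuned for this model; `pipeline` and `prompt` are the objects defined in the example above:
```python
# Illustrative generation settings; adjust to taste.
response = pipeline(
    prompt,
    max_new_tokens=256,   # cap the length of the generated answer
    do_sample=True,       # sample instead of greedy decoding
    temperature=0.7,      # sampling temperature
)
print(response[0]['generated_text'])
```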
**Important** - Use the prompt template below for ibl-fordham-7b:
```
{prompt}
```
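Since the template consists of the raw question only, applying it in code amounts to passing the question string directly to the pipeline. A minimal sketch, assuming you keep the template as a separate string and reuse the `pipeline` object from the example above:
```python
# The prompt template from this card: the question is inserted as-is.
PROMPT_TEMPLATE = "{prompt}"

question = "What programmes are offered at Fordham University?"
formatted_prompt = PROMPT_TEMPLATE.format(prompt=question)

response = pipeline(formatted_prompt)
print(response[0]['generated_text'])
```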