---
license: mit
language:
- en
library_name: transformers
pipeline_tag: text-generation
tags:
- freeai
- conversational
- meowgpt
- gpt
- free
- opensource
- splittic
- ai
- llama
- llama3
widget:
- text: <s> [|User|] Hello World </s>[|Assistant|]
datasets:
- Open-Orca/SlimOrca-Dedup
- jondurbin/airoboros-3.2
- microsoft/orca-math-word-problems-200k
- m-a-p/Code-Feedback
- MaziyarPanahi/WizardLM_evol_instruct_V2_196k
- mlabonne/orpo-dpo-mix-40k
---
# MeowGPT Readme

## Overview

MeowGPT, developed by CutyCat2000x, is a language model based on Llama 3 (checkpoint version ll3). It is designed to generate text in a conversational manner and can be used for a variety of natural language processing tasks.

## Usage

### Loading the Model

To use MeowGPT, load it with the `transformers` library in Python:

```python
from transformers import LlamaTokenizer, AutoModelForCausalLM

# Download the tokenizer and model weights from the Hugging Face Hub.
tokenizer = LlamaTokenizer.from_pretrained("cutycat2000x/MeowGPT-ll3")
model = AutoModelForCausalLM.from_pretrained("cutycat2000x/MeowGPT-ll3")
```
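
Once the model and tokenizer are loaded, text can be generated with the standard `transformers` generation API. The snippet below is a minimal sketch that reuses the prompt format from the widget example in the model card metadata; the sampling settings and token limit are illustrative choices, not recommendations from the model author.

```python
# Build a single-turn prompt in the format used by the widget example.
prompt = "<s> [|User|] Hello World </s>[|Assistant|]"

# The prompt string already contains the <s> and </s> special tokens,
# so we skip adding them a second time.
inputs = tokenizer(prompt, return_tensors="pt", add_special_tokens=False)

# Sampling settings here are illustrative, not tuned recommendations.
output_ids = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.7,
)

# Decode only the newly generated part of the sequence.
reply = tokenizer.decode(
    output_ids[0][inputs["input_ids"].shape[-1]:],
    skip_special_tokens=True,
)
print(reply)
```
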
### Example Prompt

Prompts follow the model's chat template, shown below in Jinja2 form:

```jinja
{{ bos_token }}{% if messages[0]['role'] == 'system' %}{% set loop_messages = messages[1:] %}{% set system_message = messages[0]['content'] %}{% else %}{% set loop_messages = messages %}{% set system_message = false %}{% endif %}{% for message in loop_messages %}{% if (message['role'] == 'user') != (loop.index0 % 2 == 0) %}{{ raise_exception('Conversation roles must alternate user/assistant/user/assistant/...') }}{% endif %}{% if loop.index0 == 0 and system_message != false %}{% set content = '<<SYS>>\\n' + system_message + '\\n<</SYS>>\\n\\n' + message['content'] %}{% else %}{% set content = message['content'] %}{% endif %}{% if message['role'] == 'user' %}{{ '[INST] ' + content.strip() + ' [/INST]' }}{% elif message['role'] == 'assistant' %}{{ ' ' + content.strip() + eos_token }}{% endif %}{% endfor %}
```

The `<s>` and `</s>` tokens are the beginning-of-sequence and end-of-sequence tokens.
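
If the template above is saved in the tokenizer configuration (which this card suggests but does not state explicitly), `tokenizer.apply_chat_template` can build a correctly formatted prompt from a list of messages. This minimal sketch assumes the `model` and `tokenizer` from the loading example are already in scope:

```python
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello World"},
]

# Render the conversation with the chat template and tokenize it in one step.
input_ids = tokenizer.apply_chat_template(messages, return_tensors="pt")

# Generate a continuation and print only the newly generated tokens.
output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```
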
## About the Model

- **Base Model**: Llama 3
- **Checkpoint Version**: ll3
- **Datasets Used**: Open-Orca/SlimOrca-Dedup, jondurbin/airoboros-3.2, microsoft/orca-math-word-problems-200k, m-a-p/Code-Feedback, MaziyarPanahi/WizardLM_evol_instruct_V2_196k, mlabonne/orpo-dpo-mix-40k

## Citation

If you use MeowGPT in your research or projects, please consider citing CutyCat2000x.

## Disclaimer

While MeowGPT is trained to generate text from given prompts, it may not always produce accurate or contextually appropriate responses. Review and validate generated content before using it in critical applications.

For more information or support, refer to the `transformers` library documentation or CutyCat2000x's resources.