---
license: apache-2.0
datasets:
- BAAI/COIG-PC
- ehartford/dolphin
- emozilla/booksum-summary-analysis_llama-8192
- OpenLeecher/GPT4-10k
- 0x70DA/stackoverflow-chat-data
- togethercomputer/Long-Data-Collections
---
## RWKV 7B World, focused on reading comprehension
This is an experimental model based on RWKV 7B World.

What makes this model special?
- The EOD token is removed, a special token is added, and the vocabulary is changed.
- It is intended for QA over large texts and for in-context learning with a knowledge-indexed database.
## Training details
The model is trained with the following new format:
```User: xxxx\n\nxxxxx\n\nAssistant: xxxxx\n\nUser: xxxx\n\nAssistant: \n\n```
So use `User` and `Assistant` as the prefix names. When running inference in RWKV Runner, the following format is sufficient:
`User: xxxx\n\nAssistant: xxxx\n\n`, which is the format used in the test cases below.
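The turn format above can be sketched as a small helper. This is an illustrative snippet, not code shipped with the model; the function name and example text are assumptions:

```python
def build_prompt(turns, final_user=None):
    """Join (user, assistant) turns into the User/Assistant format,
    ending with an open 'Assistant:' slot for the model to complete."""
    parts = []
    for user, assistant in turns:
        parts.append(f"User: {user}\n\n")
        parts.append(f"Assistant: {assistant}\n\n")
    if final_user is not None:
        parts.append(f"User: {final_user}\n\n")
        parts.append("Assistant:")  # generation fills in the answer here
    return "".join(parts)

# Example: one prior exchange, then a new question
prompt = build_prompt([("What is RWKV?", "An RNN-style language model.")],
                      final_user="Summarize the passage above.")
print(prompt)
```

Keeping the double newline between turns matters, since that is the separator the model saw during training.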
--------------------------------------------
To use this model with RWKV Runner, some extra setup is needed:
1. Copy the `back-python` folder to a new folder that sits next to `rwkv-runner.exe` (or whichever file you run).
2. Paste `rwkv_vocab_v20230424.txt` into the `rwkv_pip` folder to replace the vocabulary file.
3. Run `../py310/python main.py` in the new folder.
4. In RWKV Runner, set the API to `127.0.0.1:8000`, then go to `127.0.0.1:8000/docs` to switch the model to this one.

Try different temperature and top-p values; temperature 1.2 with top-p 0.5 may work well.
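Assuming the backend started by `main.py` exposes an OpenAI-style completions endpoint (check `127.0.0.1:8000/docs` for the actual routes and field names, which may differ), a query with the sampling settings above might look like:

```python
import json
import urllib.request

def make_payload(question, temperature=1.2, top_p=0.5, max_tokens=256):
    """Build a completion request using the User/Assistant prompt format."""
    return {
        "prompt": f"User: {question}\n\nAssistant:",
        "temperature": temperature,
        "top_p": top_p,
        "max_tokens": max_tokens,
        "stop": ["\n\nUser:"],  # stop before the model starts a new turn
    }

if __name__ == "__main__":
    payload = make_payload("What is the main idea of the passage?")
    # Hypothetical route; verify the real path at 127.0.0.1:8000/docs
    req = urllib.request.Request(
        "http://127.0.0.1:8000/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp))
```

The stop sequence keeps the model from continuing the dialogue past its own answer.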
![image/png](https://cdn-uploads.huggingface.co/production/uploads/6176b32847ee6431f632981e/K5k9xaIjqm96buZ5czzfE.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/6176b32847ee6431f632981e/NmnLnzJq9FYSTd8w-uX4g.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/6176b32847ee6431f632981e/yfWGk3n_G-5tDfQKNzkaV.png)
Temperature 1.2, top-p 0.6:
![image/png](https://cdn-uploads.huggingface.co/production/uploads/6176b32847ee6431f632981e/eb5_sEfPt8SHarLXJP1ig.png)