---
license: apache-2.0
datasets:
- mlabonne/CodeLlama-2-20k
language:
- zh
---

# Chinese-CodeLlama-7B-SFT-V1

We implemented SFT based on our [Chinese-CodeLlama-7B-PT](https://huggingface.co/frankminors123/Chinese-CodeLlama-7B-PT). The dataset comes from [CodeLlama-2-20k](https://huggingface.co/datasets/mlabonne/CodeLlama-2-20k), which we translated into Chinese using Google Translate.
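
The card describes the data preparation only briefly; as a minimal sketch (not the authors' actual pipeline), the translated SFT set could be built roughly as follows, assuming the dataset keeps its Alpaca-style `instruction`/`input` fields and using a hypothetical `translate_to_zh` placeholder for the Google Translate step:

```python
from datasets import load_dataset

def translate_to_zh(text: str) -> str:
    """Hypothetical stand-in for the Google Translate call; plug in any translation API."""
    raise NotImplementedError

# Load the original English instruction-tuning data.
dataset = load_dataset("mlabonne/CodeLlama-2-20k", split="train")

def translate_example(example):
    # Translate the natural-language fields; code-heavy fields are left as-is in this sketch.
    example["instruction"] = translate_to_zh(example["instruction"])
    if example.get("input"):
        example["input"] = translate_to_zh(example["input"])
    return example

dataset_zh = dataset.map(translate_example)
```
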
In addition, we designed an appropriate Chinese prompt template for coding tasks, and `memory efficient attention` was applied during the fine-tuning stage, which saved us a significant amount of GPU memory.
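
The card does not specify which memory-efficient attention implementation was used; one common option is PyTorch's built-in scaled-dot-product attention, which `transformers` can select at load time. The sketch below assumes that route and loads the pre-trained base for fine-tuning:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "frankminors123/Chinese-CodeLlama-7B-PT"  # pre-trained base this SFT starts from
tokenizer = AutoTokenizer.from_pretrained(base_id)

# attn_implementation="sdpa" routes attention through
# torch.nn.functional.scaled_dot_product_attention, which dispatches to a
# memory-efficient (or flash) kernel when available and reduces activation
# memory during fine-tuning.
model = AutoModelForCausalLM.from_pretrained(
    base_id,
    torch_dtype=torch.bfloat16,
    attn_implementation="sdpa",
)
```
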
The Chinese prompt template used is as follows:

```python
# English gloss of the template: "Below is an instruction that describes a task,
# paired with an input that provides further context. Write a response that
# fulfils the request as well as possible. ### Instruction / ### Input / ### Response"
PROMPT_TEMPLATE = (
    "下面是描述一项任务的指令,并且与一则输入配对用来提供更多的上下文。请给出尽可能满足请求的回答.\n"
    "### 指令:\n{instruction}\n### 输入:\n{input}\n### 回答:\n"
)
```
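
As a usage illustration that is not part of the original card: assuming this repository's model id is `frankminors123/Chinese-CodeLlama-7B-SFT-V1`, the template can be filled in and passed to the standard `transformers` generation API (with `PROMPT_TEMPLATE` defined as above):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "frankminors123/Chinese-CodeLlama-7B-SFT-V1"  # assumed repo id for this card
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# Fill the template; the instruction asks for a quicksort function in Python.
prompt = PROMPT_TEMPLATE.format(
    instruction="请用Python写一个快速排序函数。",
    input="",
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```
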
If you are interested in our work, please follow our future progress.