---
language:
- en
- zh
license: llama3
library_name: transformers
base_model: unsloth/llama-3-8b-bnb-4bit
datasets:
- erhwenkuo/alpaca-data-gpt4-chinese-zhtw
pipeline_tag: text-generation
tags:
- llama-3
prompt_template: >-
  {{ if .System }}<|start_header_id|>system<|end_header_id|> {{ .System
  }}<|eot_id|>{{ end }}{{ if .Prompt }}<|start_header_id|>user<|end_header_id|>
  {{ .Prompt }}<|eot_id|>{{ end }}<|start_header_id|>assistant<|end_header_id|>
  {{ .Response }}<|eot_id|>
---
# LLAMA 3 8B capable of outputting Traditional Chinese
## ✨ Recommended: use LMStudio for this model
I tried running it with Ollama, but the output became quite delulu, so for now I'm sticking with LMStudio :)

The performance isn't actually that great, but it can answer some basic questions. Sometimes it just acts really dumb though :(
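If you'd rather script it, here is a minimal sketch of loading the model with 🤗 transformers and building a prompt in the same Llama 3 format as the `prompt_template` above. The repo path, system message, and user prompt are placeholders, not values from this card.

```python
# Minimal sketch, not the card's official usage example.
# "YOUR_REPO_OR_LOCAL_PATH" is a placeholder for this repo's id or a local download.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "YOUR_REPO_OR_LOCAL_PATH"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Assemble the prompt the same way the template above does:
# optional system block, user block, then an open assistant block.
system = "你是一個樂於助人的助理。"   # "You are a helpful assistant." (example only)
prompt = "用繁體中文介紹台北。"       # "Introduce Taipei in Traditional Chinese." (example only)
text = (
    f"<|start_header_id|>system<|end_header_id|> {system}<|eot_id|>"
    f"<|start_header_id|>user<|end_header_id|> {prompt}<|eot_id|>"
    f"<|start_header_id|>assistant<|end_header_id|> "
)

inputs = tokenizer(text, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the special header/eot tokens.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```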
> LLAMA 3.1 can actually output Chinese pretty well, so this repo can be ignored.