---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/modelcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/model-cards
license: apache-2.0
language:
  - zh
widget:
  - text: >-
      A chat between a curious user and an artificial intelligence assistant.
      The assistant gives helpful, detailed, and polite answers to the user's
      questions. USER: 你好,請問你可以幫我寫一封推薦信嗎? ASSISTANT:
library_name: transformers
pipeline_tag: text-generation
extra_gated_heading: Acknowledge license to accept the repository.
extra_gated_prompt: Please contact the author for access.
extra_gated_button_content: Acknowledge license 同意以上內容
extra_gated_fields:
  Name: text
  Mail: text
  Organization: text
  Country: text
  Any utilization of the Taiwan LLM repository mandates the explicit acknowledgment and attribution to the original author: checkbox
  使用Taiwan LLM必須明確地承認和歸功於優必達株式會社 Ubitus 以及原始作者: checkbox
---

# Taiwan-LLM-7B-v2.0.1-chat-4bits-GPTQ 
- Model creator: [Yen-Ting Lin](https://huggingface.co/yentinglin)
- Original model: [Taiwan LLM based on LLaMa2-7b v2.0.1 chat](https://huggingface.co/yentinglin/Taiwan-LLM-7B-v2.0.1-chat)

## Description

This repo contains GPTQ-quantized model files for Yen-Ting Lin's [Taiwan LLM based on LLaMa2-7b v2.0.1 chat](https://huggingface.co/yentinglin/Taiwan-LLM-7B-v2.0.1-chat).
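A minimal usage sketch for loading a GPTQ model with `transformers` is shown below. The prompt template is taken from the widget example in this card's metadata; the repo id passed to `generate` is a placeholder for illustration, and running the model itself requires a CUDA GPU with the `auto-gptq`/`optimum` GPTQ backend installed.

```python
SYSTEM = (
    "A chat between a curious user and an artificial intelligence assistant. "
    "The assistant gives helpful, detailed, and polite answers to the user's "
    "questions."
)

def build_prompt(user_message: str) -> str:
    """Format a single-turn prompt using the chat template from the widget above."""
    return f"{SYSTEM} USER: {user_message} ASSISTANT:"

def generate(user_message: str, repo_id: str) -> str:
    """Generate a reply. `repo_id` should be this repository's Hub id (placeholder here)."""
    # Lazy import: model loading needs `transformers` plus a GPTQ backend and a GPU.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(repo_id, device_map="auto")
    inputs = tokenizer(build_prompt(user_message), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=256)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

`build_prompt` can be used on its own to reproduce the exact prompt string the widget sends, e.g. `build_prompt("你好,請問你可以幫我寫一封推薦信嗎?")`.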

# Original model card: Yen-Ting Lin's Language Models for Taiwan LLM based on LLaMa2-7b

# Taiwan LLM based on LLaMa2-7b

Continual pretraining on 20 billion tokens of Traditional Chinese text, followed by instruction fine-tuning on millions of conversations.

This version does NOT include Common Crawl data.

# 🌟 Check out the New [Taiwan-LLM Demo Chat-UI](http://www.twllm.com) 🌟

# Collaboration with Ubitus K.K. 💪💪💪

本項目與 Ubitus K.K. 合作進行。Ubitus 為本項目提供寶貴的技術支持和計算資源。

Taiwan LLM v2 is conducted in collaboration with [Ubitus K.K.](http://ubitus.net). Ubitus provides valuable technical support and compute resources for the project.