sh2orc committed
Commit 0444474 • 1 Parent(s): 8b72af1

Create README.md

Files changed (1): README.md (+84, -0)
README.md ADDED
@@ -0,0 +1,84 @@
---
tags:
- llama
- sh2orc

base_model:
- meta-llama/Meta-Llama-3.1-8B-Instruct
---

# Llama-3.1-Korean-8B-Instruct

Llama-3.1-Korean-8B-Instruct is fine-tuned from Meta-Llama-3.1:
* [meta-llama/Meta-Llama-3.1-8B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3.1-8B-Instruct)

- Dataset (a loading sketch follows this list):
  - [maywell/ko_wikidata_QA](https://huggingface.co/datasets/maywell/ko_wikidata_QA)
  - [lcw99/wikipedia-korean-20240501-1million-qna](https://huggingface.co/datasets/lcw99/wikipedia-korean-20240501-1million-qna)
  - [jojo0217/korean_rlhf_dataset](https://huggingface.co/datasets/jojo0217/korean_rlhf_dataset)

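For reference, the datasets above can be pulled with the 🤗 `datasets` library. The snippet below is a minimal sketch for inspecting them; the `train` split name is an assumption, not something stated on this card.

```python
# Minimal sketch: inspect the fine-tuning datasets listed above.
# Assumption: each repo exposes a default "train" split.
from datasets import load_dataset

dataset_ids = [
    "maywell/ko_wikidata_QA",
    "lcw99/wikipedia-korean-20240501-1million-qna",
    "jojo0217/korean_rlhf_dataset",
]

for dataset_id in dataset_ids:
    ds = load_dataset(dataset_id, split="train")
    # Print the size and column layout of each dataset
    print(dataset_id, len(ds), ds.column_names)
```
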
## 💻 Usage

Use with Transformers. Starting with `transformers >= 4.43.0`, you can run conversational inference using the Transformers pipeline abstraction or by leveraging the Auto classes with the `generate()` function.
Make sure to update your transformers installation via `pip install --upgrade transformers`.

```python
# Install dependencies first:
#   pip install -qU transformers accelerate
import torch
import transformers
from transformers import AutoTokenizer

model = "sh2orc/Llama-3.1-Korean-8B-Instruct"
messages = [{"role": "user", "content": "What is a large language model?"}]

# Build the prompt string from the chat messages using the model's chat template
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",
)

outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```
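
The text above also mentions the Auto classes with `generate()`. Below is a minimal sketch of that path, assuming a GPU is available and using `apply_chat_template` with `return_tensors="pt"`; treat it as illustrative rather than the card author's exact recipe.

```python
# Minimal sketch of the Auto classes + generate() path (assumption: GPU available).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sh2orc/Llama-3.1-Korean-8B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

messages = [{"role": "user", "content": "What is a large language model?"}]
# Tokenize directly with the chat template and append the generation prompt
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(
    input_ids, max_new_tokens=256, do_sample=True, temperature=0.7, top_p=0.95
)
# Decode only the newly generated tokens, skipping the prompt
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```
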

## Q&A

**Q. Tell me about Na Hye-seok.** <br>
A. Na Hye-seok (나혜석), born in 1896, was Korea's first female poet and writer. After being educated in Japan and returning home, she published a wide range of works and introduced a modern literary style. One of her representative works, '내 나이에 이별' ("Parting at My Age"), is famous for expressing the love and loneliness experienced by women.

Na Hye-seok also wrote distinctive, innovative poems and novels that dealt with social issues. Her works had a great impact on society at the time and have continued to inspire many people since.

She also took part in the movement for women's rights and equality, and because of these activities she came to be called Korea's first female nationalist.

In summary, Na Hye-seok is regarded as one of the most important writers and poets in the history of Korean literature. Thanks to her creativity and influence, her artistic achievements still hold value for us today.

**Q. Tell me about World War II.** <br>
A. World War II was a conflict that lasted from 1939 to 1945. Many countries in Europe, Asia, and North America took part, with Germany and Japan acting as the main adversaries.

Early in the war, Nazi Germany expanded rapidly, invading Poland, attacking the Soviet Union, and advancing into Western Europe. Britain and France were defeated by the German army and fell away from the Allies, but the United States soon entered the war and ultimately won.

Japan also waged war against China while seizing territory in Southeast Asia and the Pacific. However, Japan surrendered after the United States dropped atomic bombs.

Over the course of the war, millions of people died, and the economic damage is known to have been enormous.

**Q. What if I borrow money and don't pay it back?** <br>
A. If you borrow money and do not repay it, you should show patience or understanding toward the person you borrowed from in return. Depending on the situation, you may also offer additional compensation.

**Q. How to live a healthy life** <br>

The guidelines for living a healthy life are as follows:

1. Regular exercise: get at least 30 minutes of physical activity every day.
2. Balanced diet: eat a variety of foods that include fruits, vegetables, protein, and healthy fats.
3. Sufficient sleep: get 7-9 hours of sound sleep each night.
4. Stress management: use yoga, meditation, or other stress-relief techniques.
5. Get regular medical checkups.
6. Avoid smoking and alcohol consumption.
7. Live in a safe and clean environment.