---
tags:
- merge
- mergekit
- lazymergekit
- sh2orc
- bccard
base_model:
- sh2orc/Llama-3-Korean-8B
---

# Llama-3-Korean-12B

Llama-3-Korean-12B is a merge of the following models using LazyMergekit:

* [sh2orc/Llama-3-Korean-8B](https://huggingface.co/sh2orc/Llama-3-Korean-8B)

Dataset:

- [maywell/ko_wikidata_QA](https://huggingface.co/datasets/maywell/ko_wikidata_QA)
- [lcw99/wikipedia-korean-20240501-1million-qna](https://huggingface.co/datasets/lcw99/wikipedia-korean-20240501-1million-qna)
- [inswave/AISquare_Koalpaca_Orca_merged](https://huggingface.co/datasets/inswave/AISquare_Koalpaca_Orca_merged)
- [inswave/AISqaure_Intergrated_v1](https://huggingface.co/datasets/inswave/AISqaure_Intergrated_v1)

## πŸ’» Usage

```python
!pip install -qU transformers accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "sh2orc/Llama-3-Korean-12B"
messages = [{"role": "user", "content": "What is a large language model?"}]

tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",
)

outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```

## Q&A

**Q. Tell me about World War II**
A. World War II was a large-scale war fought from 1939 to 1945, involving many countries across Europe and Asia. Along with World War I, it was one of the worst wars of the 20th century. The war began when Germany invaded Poland, and a European front formed; on June 22, 1941, Germany invaded the Soviet Union, bringing it into the war against Germany. The war in Europe ended on May 8, 1945, when Germany surrendered to the Allies, and the war as a whole concluded on August 15, 1945, when the Empire of Japan surrendered to the Allies.

**Q. What should I think about first as I get older?**
A. λ‚˜μ΄κ°€ λ“€λ©΄ μ—¬λŸ¬ 가지 λ¬Έμ œκ°€ λ°œμƒν•  수 μžˆμ§€λ§Œ, κ°€μž₯ μ€‘μš”ν•œ 것은 건강을 μœ μ§€ν•˜λŠ” κ²ƒμž…λ‹ˆλ‹€. λ”°λΌμ„œ μš΄λ™κ³Ό κ· ν˜• 작힌 μ‹μŠ΅κ΄€μ„ μœ μ§€ν•˜λŠ” 것이 μ€‘μš”ν•©λ‹ˆλ‹€. λ˜ν•œ, μ Šμ„ λ•ŒλŠ” μ ŠμŒμ„ μžλž‘ν–ˆμ§€λ§Œ λ‚˜μ΄κ°€ λ“€λ©΄ μΉœκ΅¬μ™€ κ°€μ‘±κ³Όμ˜ 관계가 더 μ€‘μš”ν•΄μ§‘λ‹ˆλ‹€. μ΄λŸ¬ν•œ 이유둜, λ‚˜μ΄κ°€ λ“€λ©΄μ„œλ„ κ±΄κ°•ν•œ 삢을 μœ μ§€ν•˜κ³ , μ†Œμ€‘ν•œ μ‚¬λžŒλ“€κ³Ό ν•¨κ»˜ μ‹œκ°„μ„ λ³΄λ‚΄λŠ” 것이 μ’‹μŠ΅λ‹ˆλ‹€.