dahara1 committed
Commit 7b909e9 · 1 Parent(s): a36a1b1

Update README.md

Files changed (1): README.md +2 -2
README.md CHANGED
@@ -131,12 +131,12 @@ GPTQ is a technique (called quantization) that reduces model size.
 ただし、性能は少し落ちてしまいます。また、日本語と英語以外の言語への翻訳能力は著しく低下しているはずです。
 However, performance is slightly reduced. Also, the ability to translate into languages other than Japanese and English should be significantly reduced.
 
-[Sample Code For Free Colab webbigdata/ALMA-7B-Ja-V2-GPTQ-Ja-En](https://github.com/webbigdata-jp/ALMA/blob/master/ALMA_7B_Ja_V2_GPTQ_Ja_En_Free_Colab_sample.ipynb)
+[Sample Code For Free Colab webbigdata/ALMA-7B-Ja-V2-GPTQ-Ja-En](https://github.com/webbigdata-jp/python_sample/blob/master/ALMA_7B_Ja_V2_GPTQ_Ja_En_Free_Colab_sample.ipynb)
 
 ファイル全体を一度に翻訳したい場合は、以下のColabをお試しください。
 If you want to translate the entire file at once, try Colab below.
 
-[ALMA_7B_Ja_GPTQ_Ja_En_batch_translation_sample](https://github.com/webbigdata-jp/ALMA/blob/master/ALMA_7B_Ja_V2_GPTQ_Ja_En_batch_translation_sample.ipynb)
+[ALMA_7B_Ja_GPTQ_Ja_En_batch_translation_sample](https://github.com/webbigdata-jp/python_sample/blob/master/ALMA_7B_Ja_V2_GPTQ_Ja_En_batch_translation_sample.ipynb)
 
 
 **ALMA** (**A**dvanced **L**anguage **M**odel-based tr**A**nslator) is an LLM-based translation model, which adopts a new translation model paradigm: it begins with fine-tuning on monolingual data and is further optimized using high-quality parallel data. This two-step fine-tuning process ensures strong translation performance.
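Both linked notebooks drive the model with a fixed translation prompt. A minimal sketch of that prompt construction is below; the exact wording is an assumption based on the prompt format used in the ALMA samples, and `build_translation_prompt` is a hypothetical helper, not part of the repository — check the linked Colab notebooks for the string the model was actually trained with.

```python
def build_translation_prompt(text: str, src: str = "Japanese", tgt: str = "English") -> str:
    """Build an ALMA-style translation prompt.

    Assumption: the model expects a "Translate this from X to Y" header
    followed by the source sentence and an empty target slot, which the
    model then completes with the translation.
    """
    return f"Translate this from {src} to {tgt}:\n{src}: {text}\n{tgt}:"

# Example: prepare a Japanese-to-English prompt for generation.
prompt = build_translation_prompt("私はサッカーが好きです。")
print(prompt)
```

The generated prompt would then be tokenized and passed to the quantized model's `generate` call, as the sample notebooks do.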