jonabur committed
Commit a4690c9
1 Parent(s): a3641fb

update checkpoint list

Files changed (1)
README.md +10 -4
README.md CHANGED
@@ -16,12 +16,11 @@ language:
 
 # Viking 13B
 
-**NOTE:** We are aware of an incompatibility with HF transformers that impacts finetuning and are working to correct it.
-
-
 _**NOTE:** This is a **research checkpoint** of a model for which **training has not been completed.** It is being provided in its current state for research and testing purposes. **Care should be taken when using the outputs of the model.** Once pretraining has completed we intend to release additional instruction-tuned and chat-tuned varieties._
 
-Viking 13B is a 13B parameter decoder-only transformer pretrained on Finnish, English, Swedish, Danish, Norwegian, Icelandic and code. It is being trained on 2 trillion tokens (1 trillion as of this release). Viking 13B is a fully open source model and is made available under the Apache 2.0 License.
+Viking 13B is a 13B parameter decoder-only transformer pretrained on Finnish,
+English, Swedish, Danish, Norwegian, Icelandic and code. It is being trained
+on 2 trillion tokens (1.3 trillion as of this release). Viking 13B is a fully open source model and is made available under the Apache 2.0 License.
 
 Viking was created in a collaboration between the [TurkuNLP group](https://turkunlp.org/) of the University of Turku, [SiloGen](https://www.silo.ai/silogen) from [Silo AI](https://www.silo.ai/), and [High Performance Language Technologies](https://hplt-project.org/) (HPLT). Training was conducted on the [LUMI supercomputer](https://www.lumi-supercomputer.eu/), using compute resources generously provided by [CSC](https://csc.fi/) - IT Center for Science, Finland.
 
@@ -96,6 +95,13 @@ Training checkpoints are available as branches in the repository. Checkpoints w
 * [800B](https://huggingface.co/LumiOpen/Viking-13B/tree/800B)
 * [900B](https://huggingface.co/LumiOpen/Viking-13B/tree/900B)
 * [1000B](https://huggingface.co/LumiOpen/Viking-13B/tree/1000B)
+* [1100B](https://huggingface.co/LumiOpen/Viking-13B/tree/1100B)
+* [1200B](https://huggingface.co/LumiOpen/Viking-13B/tree/1200B)
+* [1300B](https://huggingface.co/LumiOpen/Viking-13B/tree/1300B)
+* [1400B](https://huggingface.co/LumiOpen/Viking-13B/tree/1400B)
+* [1500B](https://huggingface.co/LumiOpen/Viking-13B/tree/1500B)
+* [1600B](https://huggingface.co/LumiOpen/Viking-13B/tree/1600B)
+* [1700B](https://huggingface.co/LumiOpen/Viking-13B/tree/1700B)
 
 The transformers library allows you to load a checkpoint from a branch as follows:
 
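The snippet the README refers to falls outside the diff hunk, but branch checkpoints on the Hub are loaded with the `revision` argument of `from_pretrained`. A minimal sketch, using the model id and one of the branch names listed above:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Any checkpoint branch from the list above works here, e.g. the
# 1000B-token checkpoint.
branch = "1000B"

# `revision` selects the branch (or tag/commit) to download from the Hub.
tokenizer = AutoTokenizer.from_pretrained("LumiOpen/Viking-13B", revision=branch)
model = AutoModelForCausalLM.from_pretrained("LumiOpen/Viking-13B", revision=branch)
```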