Update README.md
README.md
CHANGED
@@ -4,8 +4,6 @@ license: apache-2.0
 
 
 
-
-
 APUS-xDAN-4.0-MOE
 Introduction
 APUS-xDAN-4.0-MOE is a transformer-based MoE decoder-only language model pretrained on a large amount of data.
@@ -20,5 +18,5 @@ The code of APUS-xDAN-4.0-MOE has been in the latest Hugging face transformers a
 
 
 
-
-
+License
+APUS-xDAN-4.0-MOE is licensed under the LLAMA 2 Community License, Copyright (c) Meta Platforms, Inc. All Rights Reserved.