Update README.md
README.md
CHANGED
@@ -14,9 +14,6 @@ datasets:
 This model was trained for a single epoch with 1.6B SEC data in Llama-pro style
 
 
-
-## Model Details
-
 ### Model Description
 
 We only trained the newly added blocks as in the Llama pro paper while keeping every other layer frozen.
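For context on the training setup described above (only the newly inserted blocks are updated, everything else stays frozen), here is a minimal sketch of Llama-Pro-style partial training. The base model name and the block indices are illustrative assumptions, not taken from this repository's training code.

```python
# Minimal sketch (assumptions: base checkpoint name and new-block positions are placeholders).
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")  # placeholder base model

new_block_indices = {8, 17, 26, 35}  # hypothetical positions of the blocks added via block expansion

# Freeze every parameter first, then unfreeze only the newly added decoder blocks.
for param in model.parameters():
    param.requires_grad = False
for idx, layer in enumerate(model.model.layers):
    if idx in new_block_indices:
        for param in layer.parameters():
            param.requires_grad = True

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"Trainable parameters: {trainable:,}")
```

Only the unfrozen blocks receive gradient updates during the single epoch over the SEC data; the original layers keep their pretrained weights.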