anchen1011 committed
Commit 2ff18ec • 1 Parent(s): bb8b382

Update README.md

README.md CHANGED
@@ -13,8 +13,6 @@ We follow an extremely simple format to organize and manage our models and data
 
 # Model
 
-### Format
-
 Repo should be named as `midreal/{model_function}_{train_method}_{train_technique}_{base_model}_{date}`
 
 Model card should include ([example](https://huggingface.co/midreal/passage_sft_qlora_llama3_70b_0704)):
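For reference, the example model repo linked above reads against the naming template like this (field boundaries inferred from the repo name):

```
# midreal/{model_function}_{train_method}_{train_technique}_{base_model}_{date}
#          passage          sft            qlora              llama3_70b   0704
# => https://huggingface.co/midreal/passage_sft_qlora_llama3_70b_0704
```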
@@ -45,27 +43,8 @@ Tracking of the training procedure
 Person in charge
 ```
 
-### Usage
-
-Models could be uploaded with:
-```
-upload
-```
-
-and downloaded with:
-```
-download
-```
-
-and updated with:
-```
-update
-```
-
 # Dataset
 
-### Format
-
 Repo should be named as `midreal/{model_function}_{train_method}_{status}_{date}`
 
 Dataset card should include ([example](https://huggingface.co/datasets/midreal/passage_sft_lmflow_0704)):
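The example dataset repo follows its template the same way, with `lmflow` filling the `{status}` field:

```
# midreal/{model_function}_{train_method}_{status}_{date}
#          passage          sft            lmflow   0704
# => https://huggingface.co/datasets/midreal/passage_sft_lmflow_0704
```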
@@ -80,9 +59,7 @@ The schema that the data elements should follow
 Person in charge
 ```
 
-Other information about the dataset
-
-### Status
+Other information about the dataset could also be commented on dataset cards.
 
 The production of dataset should somehow follow a pipeline from `raw_data` to `story_data` to `openai` or `lmflow` format.
 
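One way to picture that pipeline is as the same dataset moving through `{status}` values in the repo name; only the last name below is the linked example, the first two are hypothetical stage names built from the same template:

```
# midreal/passage_sft_raw_data_0704     (hypothetical: raw data)
# midreal/passage_sft_story_data_0704   (hypothetical: story data)
# midreal/passage_sft_lmflow_0704       (linked example: training-ready)
```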
@@ -94,19 +71,25 @@ The production of dataset should somehow follow a pipeline from `raw_data` to `story_data` to `openai` or `lmflow` format.
 
 `lmflow` refers to [LMFlow data format](https://optimalscale.github.io/LMFlow/examples/DATASETS.html#data-format).
 
-### Usage
+# Usage
+
+We suggest using the Huggingface CLI ([docs](https://huggingface.co/docs/huggingface_hub/main/en/guides/cli)).
 
-Datasets could be uploaded with:
+Once you have installed huggingface-cli and logged in, models/datasets could be uploaded with:
 ```
-upload
+huggingface-cli upload [midreal/repo_id] [local_path] ([path_in_repo]) (--repo-type=dataset)
 ```
 
-and downloaded with:
+e.g.
 ```
-download
+huggingface-cli upload midreal/model ./path/to/curated/data /data/train
+huggingface-cli upload midreal/dataset . . --repo-type=dataset
 ```
 
-and updated with:
+and downloaded with:
 ```
-update
-```
+huggingface-cli download midreal/model
+huggingface-cli download midreal/dataset --repo-type dataset
+```
+
+If the repo doesn’t exist yet, it will be created automatically.
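To make the `lmflow` target concrete: per the linked LMFlow docs, a training file is a JSON object with a `type` field and an `instances` list (e.g. `text2text` input/output pairs). A minimal sketch of writing one such file and uploading it into the example dataset repo; the file name and instance content are illustrative only:

```
# write a minimal LMFlow-style text2text file (content made up for illustration)
cat > train.json <<'EOF'
{
  "type": "text2text",
  "instances": [
    {"input": "Write the opening passage of a story set in a lighthouse.",
     "output": "The lamp had not been lit in thirty years, but tonight..."}
  ]
}
EOF

# upload it into the dataset repo from the card example
huggingface-cli upload midreal/passage_sft_lmflow_0704 train.json train.json --repo-type=dataset
```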
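The new `# Usage` section assumes the CLI is installed and authenticated. A minimal setup sketch following the linked CLI guide (token handling and cache locations are the defaults, nothing project-specific):

```
# install the CLI extra and log in once with a write-enabled access token
pip install -U "huggingface_hub[cli]"
huggingface-cli login

# after that, the README commands work as shown, e.g. pulling the card examples
huggingface-cli download midreal/passage_sft_qlora_llama3_70b_0704
huggingface-cli download midreal/passage_sft_lmflow_0704 --repo-type dataset
```

By default `huggingface-cli download` stores files in the local cache and prints the resulting path; pass `--local-dir` to place them in a specific directory instead.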