Marissa committed on
Commit
7c17293
1 Parent(s): 2355f98

Upload README.md

Files changed (1)
  1. README.md +8 -31
README.md CHANGED
@@ -1,14 +1,15 @@
  ---
- language: en
- license: mit
+ language:
+ language: en
+ license: mit
  ---
 
  # model-card-testing
 
- model-card-testing is a distilled language model that can be used for text generation. Users of this model card should also consider information about the design, training, and limitations of gpt2.
-
  ## Model Details
 
+ model-card-testing is a distilled language model. Users of this model card should also consider information about the design, training, and limitations of gpt2.
+
  - **Developed by:** author1, author2
  - **Model type:** testing type
  - **Language(s):** # not working right now
@@ -18,17 +19,10 @@ model-card-testing is a distilled language model that can be used for text gener
  - **Parent Model**: gpt2
  - **Sibling Models**: TO DO (could we do this automatically somehow?)
 
- ## How to Get Started with the Model
 
- Use the code below to get started with the model. model-card-testing can be used directly with a pipeline for text generation.
- Since the generation relies on some randomness, we set a seed for reproducibility:
- ```python
- >>> from transformers import pipeline, set_seed
- >>> generator = pipeline('text-generation', model='model-card-testing')
- >>> set_seed(42)
- >>> generator("Hello, I'm a language model,", max_length=20, num_return_sequences=5)
- ```
+ ## How to Get Started with the Model
 
+ Use the code below to get started with the model.
 
 
 
@@ -95,16 +89,6 @@ The impact of model compression techniques, such as knowledge distillation, on b
  NOTE: This code will need customization/fixing.
 
 
- ```python
- >>> from transformers import pipeline, set_seed
- >>> generator = pipeline('text-generation', model='model-card-testing')
- >>> set_seed(48)
- >>> generator("The White man worked as a", max_length=20, num_return_sequences=3)
-
- >>> set_seed(48)
- >>> generator("The Black man worked as a", max_length=20, num_return_sequences=3)
- ```
-
 
 
 
@@ -123,11 +107,6 @@ This model achieves the following results:
  NOTE: This will need customization.
 
 
- | Dataset | LAMBADA | LAMBADA | CBT-CN | CBT-NE | WikiText2 | PTB | enwiki8 | text8 | WikiText103 | 1BW |
- |:--------:|:-------:|:-------:|:------:|:------:|:---------:|:------:|:-------:|:------:|:-----------:|:-----:|
- | (metric) | (PPL) | (ACC) | (ACC) | (ACC) | (PPL) | (PPL) | (BPB) | (BPC) | (PPL) | (PPL) |
- | | | | | | | | | | | |
-
 
 
 
@@ -141,9 +120,7 @@ You can estimate carbon emissions using the [Machine Learning Impact calculator]
  - **Compute Region:**
  - **Carbon Emitted** *(Power consumption x Time x Carbon produced based on location of power grid)*:
 
- ## Add interpretability section?
-
- ### BibTeX Entry and Citation Info
+ ### Citation Information
 
  ```bibtex
  @inproceedings{...,
 
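For reference, the quick-start usage removed by this commit amounts to a single `transformers` text-generation pipeline call with a fixed seed (the removed text notes the seed is there because sampled generation is otherwise non-deterministic). A minimal, self-contained sketch is below; it assumes the `model-card-testing` identifier from the card resolves to a GPT-2-style text-generation checkpoint on the Hub, which may not hold for this test repository, in which case any text-generation model id such as `gpt2` can be substituted.

```python
# Minimal sketch of the quick-start snippet removed in this commit.
# Assumes 'model-card-testing' is a valid text-generation checkpoint on the Hub;
# substitute another model id (e.g. 'gpt2', the parent model) if it is not.
from transformers import pipeline, set_seed

generator = pipeline("text-generation", model="model-card-testing")

# Generation is sampled, so fix the seed for reproducible outputs.
set_seed(42)

outputs = generator(
    "Hello, I'm a language model,",
    max_length=20,
    num_return_sequences=5,
)
for out in outputs:
    print(out["generated_text"])
```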