Update readme
robot-bengali-2 committed
Commit d10efa1 • 1 Parent(s): a3aed67

README.md CHANGED
@@ -7,29 +7,31 @@ sdk: static
 pinned: false
 ---
 
+<div class="lg:col-span-3">
+  <p>
+    This organization is a part of the NeurIPS 2021 demonstration <a href="https://training-transformers-together.github.io/">"Training Transformers Together"</a>.
+  </p>
+  <p>
+    In this demo, we train a model similar to <a target="_blank" href="https://openai.com/blog/dall-e/">OpenAI DALL-E</a>,
+    a Transformer "language model" that generates images from text descriptions.
+    It is trained on <a target="_blank" href="https://laion.ai/laion-400-open-dataset/">LAION-400M</a>,
+    the world's largest openly available image-text-pair dataset with 400 million samples. Our model is based on
+    the <a target="_blank" href="https://github.com/lucidrains/DALLE-pytorch">dalle-pytorch</a> implementation
+    by <a target="_blank" href="https://github.com/lucidrains">Phil Wang</a>, with a few tweaks to make it communication-efficient.
+  </p>
+  <p>
+    See details about how to join and how it works on <a target="_blank" href="https://training-transformers-together.github.io/">our website</a>.
+  </p>
+  <p>
+    This organization gathers people participating in the collaborative training and provides links to the necessary resources:
+  </p>
+  <ul>
+    <li>Starter kits for <b>Google Colab</b> and <b>Kaggle</b> (an easy way to join the training)</li>
+    <li><a target="_blank" href="https://huggingface.co/spaces/training-transformers-together/Dashboard">Dashboard</a> (the current training state: loss, number of peers, etc.)</li>
+    <li><a target="_blank" href="https://huggingface.co/training-transformers-together/dalle-demo">Model</a> (the latest checkpoint)</li>
+    <li><a target="_blank" href="https://huggingface.co/datasets/laion/laion_100m_vqgan_f8">Dataset</a></li>
+  </ul>
+  <p>
+    Feel free to reach us on <a target="_blank" href="https://discord.gg/uGugx9zYvN">Discord</a> if you have any questions.
+  </p>
+</div>
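
As a rough illustration of the "communication-efficient" collaborative setup the README describes, the sketch below shows a volunteer peer joining a run with the hivemind library and the dalle-pytorch model. This is not the actual starter-kit code: the model configuration, run name, bootstrap peer address, batch sizes, and dummy dataloader are all made-up placeholders, assuming `dalle-pytorch` and `hivemind` are installed.

```python
# Illustrative only: a volunteer peer joining a collaborative DALL-E training run.
# All names, addresses, and hyperparameters below are placeholders, not starter-kit values.
import torch
import hivemind
from dalle_pytorch import DALLE, VQGanVAE

# DALL-E-style decoder: text tokens followed by VQGAN image tokens (tiny config for the sketch).
vae = VQGanVAE()  # pretrained VQGAN supplies the image-token codebook
model = DALLE(dim=256, vae=vae, num_text_tokens=10_000, text_seq_len=256, depth=2, heads=8)

# Join the run's distributed hash table through a bootstrap peer (placeholder multiaddress).
dht = hivemind.DHT(initial_peers=["/ip4/203.0.113.1/tcp/31337/p2p/PLACEHOLDER_PEER_ID"], start=True)

# hivemind.Optimizer accumulates gradients across volunteers until the global target
# batch size is reached, then performs one synchronized, averaged optimizer step.
opt = hivemind.Optimizer(
    dht=dht,
    run_id="dalle_collaborative_run",  # placeholder experiment name shared by all peers
    params=model.parameters(),
    optimizer=lambda params: torch.optim.Adam(params, lr=4e-4),
    target_batch_size=8192,   # virtual batch formed by all peers together
    batch_size_per_step=8,    # this peer's local batch per backward pass
    verbose=True,
)

def dummy_batches(steps=10, batch=8):
    """Stand-in for a LAION dataloader: random text tokens and random 256x256 images."""
    for _ in range(steps):
        yield torch.randint(0, 10_000, (batch, 256)), torch.randn(batch, 3, 256, 256)

for text, images in dummy_batches():
    loss = model(text, images, return_loss=True)
    loss.backward()
    opt.step()       # averages with other peers once the collective batch is large enough
    opt.zero_grad()
```

The split between `batch_size_per_step` and `target_batch_size` is what makes such a setup tolerant of many small, unreliable GPUs: each peer contributes whatever it can, and the shared optimizer step only fires once the collective batch is complete.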