robot-bengali-2 committed fe6ddb2 (1 parent: d79a8e5)
Update after the training run has finished

README.md CHANGED
@@ -13,29 +13,35 @@ pinned: false
  This organization is a part of the NeurIPS 2021 demonstration <u><a href="https://training-transformers-together.github.io/">"Training Transformers Together"</a></u>.
  </p>
  <p class="mb-2">
- In this demo, we
+ In this demo, we've trained a model similar to <u><a target="_blank" href="https://openai.com/blog/dall-e/">OpenAI DALL-E</a></u> –
  a Transformer "language model" that generates images from text descriptions.
- Training
- We
- the world's largest openly available image-text-pair dataset with 400 million samples. Our model
+ Training happened collaboratively – volunteers from all over the Internet contributed to the training using hardware available to them.
+ We used <u><a target="_blank" href="https://laion.ai/laion-400-open-dataset/">LAION-400M</a></u>,
+ the world's largest openly available image-text-pair dataset with 400 million samples. Our model was based on
  the <u><a target="_blank" href="https://github.com/lucidrains/DALLE-pytorch">dalle-pytorch</a></u> implementation
  by <u><a target="_blank" href="https://github.com/lucidrains">Phil Wang</a></u> with a few tweaks to make it communication-efficient.
  </p>
  <p class="mb-2">
- See details about how
+ See details about how it works on <u><a target="_blank" href="https://training-transformers-together.github.io/">our website</a></u>.
  </p>
  <p class="mb-2">
- This organization gathers people participating in the collaborative training and provides links to the
+ This organization gathers people participating in the collaborative training and provides links to the related materials:
  </p>
  <ul class="mb-2">
- <li>
- <li><u><a target="_blank" href="https://huggingface.co/spaces/training-transformers-together/Dashboard">Dashboard</a></u> (the current training state: loss, number of peers, etc.)</li>
- <li><u><a target="_blank" href="https://colab.research.google.com/drive/1Vkb-4nhEEH1a5vrKtpL4MTNiUTPdpPUl?usp=sharing">Colab notebook for running inference</a></u>
+ <li><u><a target="_blank" href="https://training-transformers-together.github.io/InferenceResults.html">Inference results</a></u></li>
  <li><u><a target="_blank" href="https://huggingface.co/training-transformers-together/dalle-demo-v1">Model weights</a></u> (the latest checkpoint)</li>
- <li>
+ <li><u><a target="_blank" href="https://colab.research.google.com/drive/1sXHqy5hKVEQyFX-H2Ai7KzLij-7M_xCB?usp=sharing">Colab notebook for running inference</a></u> (updated on Apr 5)</li>
  <li><u><a target="_blank" href="https://github.com/learning-at-home/dalle-hivemind">Code</a></u></li>
  <li><u><a target="_blank" href="https://huggingface.co/datasets/laion/laion_100m_vqgan_f8">Dataset</a></u></li>
  </ul>
+ <p class="mb-2">
+ The materials below were available during the training run itself:
+ </p>
+ <ul class="mb-2">
+ <li>Starter kits for <u><a target="_blank" href="https://colab.research.google.com/drive/1BqTWcfsvNQwQqqCRKMKp1_jvQ5L1BhCY?usp=sharing">Google Colab</a></u> and <u><a target="_blank" href="https://www.kaggle.com/yhn112/training-transformers-together/">Kaggle</a></u> (easy way to join the training)</li>
+ <li><u><a target="_blank" href="https://huggingface.co/spaces/training-transformers-together/Dashboard">Dashboard</a></u> (the current training state: loss, number of peers, etc.)</li>
+ <li>Weights &amp; Biases plots for <u><a target="_blank" href="https://wandb.ai/learning-at-home/dalle-hivemind/runs/3l7q56ht">aux peers</a></u> (aggregating the metrics) and actual <u><a target="_blank" href="https://wandb.ai/learning-at-home/dalle-hivemind-trainers">trainers</a></u> (contributing with their GPUs)</li>
+ </ul>
  <p class="mb-2">
  Feel free to reach us on <u><a target="_blank" href="https://discord.gg/uGugx9zYvN">Discord</a></u> if you have any questions
  </p>
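The updated README mentions training collaboratively with "a few tweaks to make it communication-efficient". As a rough, self-contained illustration of the underlying idea (this is NOT the dalle-hivemind code, and the function names here are hypothetical), peers can accumulate local gradients and only communicate and step once the group has jointly processed a target global batch size:

```python
# Illustrative sketch of collaborative gradient averaging. Assumptions:
# names like `collaborative_step` and `all_reduce_mean` are invented for
# this example; the real project uses the hivemind library, which adds
# peer discovery, fault tolerance, and compressed communication on top.

def all_reduce_mean(peer_grads):
    """Average per-parameter gradients across peers (a simulated all-reduce)."""
    n = len(peer_grads)
    return [sum(g[i] for g in peer_grads) / n for i in range(len(peer_grads[0]))]

def collaborative_step(params, peer_grads, peer_batch_sizes,
                       target_batch_size, lr=0.1):
    """Apply one SGD step only once the accumulated global batch is big enough.

    Returns (new_params, stepped): if the peers' combined batch is still below
    `target_batch_size`, no communication happens and params are unchanged.
    """
    if sum(peer_batch_sizes) < target_batch_size:
        return params, False  # keep accumulating locally; no network round yet
    avg_grads = all_reduce_mean(peer_grads)
    return [p - lr * g for p, g in zip(params, avg_grads)], True

# Two peers, each having processed 64 local samples toward a 128-sample goal:
params = [1.0, -2.0]
grads = [[0.2, 0.4], [0.4, 0.0]]  # each peer's local gradient estimate
params, stepped = collaborative_step(params, grads, [64, 64],
                                     target_batch_size=128)
print(stepped, params)
```

Deferring the update until a large global batch has accumulated is what keeps communication cheap: peers exchange gradients once per (large) global step instead of once per local minibatch.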