---
license: mit
---
# RDT-1B
RDT-1B is a 1B-parameter imitation-learning Diffusion Transformer pre-trained on 1M+ multi-robot episodes. Given a language instruction and three-view RGB image observations, RDT predicts the next 64 robot actions. RDT is compatible with almost all modern mobile manipulators: single-arm and dual-arm, joint-space and EEF control, position and velocity commands, and even platforms with a mobile chassis.
All code and model weights are licensed under the MIT License.
Please refer to our [project page](https://rdt-robotics.github.io/rdt-robotics/), [GitHub repository](https://github.com/GeneralEmbodiedSystem/RoboticsDiffusionTransformer), and [paper]() for more information.
## Model Details
- **Developed by:** the thu-ml team
- **License:** MIT
- **Pretrain dataset:** [More Information Needed]
- **Finetune dataset:** [More Information Needed]
- **Repository:** https://github.com/GeneralEmbodiedSystem/RoboticsDiffusionTransformer
- **Paper:** [More Information Needed]
- **Project Page:** https://rdt-robotics.github.io/rdt-robotics/
## Uses
RDT-1B supports pre-training and fine-tuning on custom datasets, as well as deployment and inference on real robots.
Please refer to [our repository](https://github.com/GeneralEmbodiedSystem/RoboticsDiffusionTransformer/blob/main/docs/pretrain.md) for all the above guides.
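To make the input/output interface described above concrete, here is a minimal sketch of the expected tensor shapes. All names, the image resolution, and the action dimension are illustrative assumptions, not the actual RDT API; only the three camera views and the 64-step action chunk come from this card. See the repository for the real inference code.

```python
import numpy as np

CAMERA_VIEWS = 3          # three RGB views, as stated in the card
IMG_H, IMG_W = 224, 224   # assumed input resolution (hypothetical)
ACTION_CHUNK = 64         # the model predicts the next 64 actions (from the card)
ACTION_DIM = 14           # e.g. a dual-arm joint-space action vector (hypothetical)

def predict_actions(instruction: str, images: np.ndarray) -> np.ndarray:
    """Placeholder for model inference: returns a (64, ACTION_DIM) action chunk.

    A real call would run the diffusion transformer conditioned on the
    instruction and images; here we only check shapes and return zeros.
    """
    assert images.shape == (CAMERA_VIEWS, IMG_H, IMG_W, 3), "expected 3 RGB views"
    return np.zeros((ACTION_CHUNK, ACTION_DIM), dtype=np.float32)

obs = np.zeros((CAMERA_VIEWS, IMG_H, IMG_W, 3), dtype=np.uint8)
actions = predict_actions("pick up the red cup", obs)
print(actions.shape)
```

In practice the predicted 64-step action chunk would be executed (fully or partially) on the robot before re-observing and re-planning.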
## Citation
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]