---
license: apache-2.0
---


# πŸ‘οΈ GLaMM-GranD-Pretrained

---
## πŸ“ Description
GLaMM-GranD-Pretrained is the GLaMM model pretrained on the GranD dataset,
a large-scale dataset generated with an automated annotation pipeline for detailed region-level understanding and segmentation masks.
GranD comprises 7.5M unique concepts anchored in a total of 810M regions, each with a segmentation mask.


## πŸ’» Download
To get started with GLaMM-GranD-Pretrained, follow these steps:
```
git lfs install
git clone https://huggingface.co/MBZUAI/GLaMM-GranD-Pretrained
```
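
If you prefer a programmatic download, the same snapshot can be fetched with the `huggingface_hub` Python package instead of `git lfs`. The sketch below is illustrative rather than part of the official instructions; the `local_dir` path is an arbitrary example.

```python
# Minimal sketch: download the model snapshot via huggingface_hub.
# Assumes `pip install huggingface_hub`; the local_dir below is an example path.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="MBZUAI/GLaMM-GranD-Pretrained",
    local_dir="GLaMM-GranD-Pretrained",  # example target directory
)
print(f"Model files downloaded to: {local_path}")
```

`snapshot_download` also resolves the LFS-tracked weight files, so the result mirrors the `git clone` above.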

## πŸ“š Additional Resources
- **Paper:** [arXiv](https://arxiv.org/abs/2311.03356)
- **GitHub Repository:** For the latest code and updates, see [GitHub - GLaMM](https://github.com/mbzuai-oryx/groundingLMM).
- **Project Page:** For a detailed overview and insights into the project, visit our [Project Page - GLaMM](https://mbzuai-oryx.github.io/groundingLMM/).

## πŸ“œ Citations and Acknowledgments

```bibtex
@article{hanoona2023GLaMM,
  title={GLaMM: Pixel Grounding Large Multimodal Model},
  author={Rasheed, Hanoona and Maaz, Muhammad and Shaji, Sahal and Shaker, Abdelrahman and Khan, Salman and Cholakkal, Hisham and Anwer, Rao M. and Xing, Eric and Yang, Ming-Hsuan and Khan, Fahad S.},
  journal={ArXiv 2311.03356},
  year={2023}
}
```