
⛳ NeRF-MAE Dataset

Download the preprocessed datasets here.

Extract pretraining and finetuning dataset under NeRF-MAE/datasets. The directory structure should look like this:

NeRF-MAE
├── pretrain
│   ├── features
│   └── nerfmae_split.npz
└── finetune
    └── front3d_rpn_data
        ├── features
        ├── aabb
        └── obb

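Once extracted, the pretraining split file can be inspected with NumPy. The key names used below (`train`, `val`) and the scene IDs are illustrative assumptions, not the guaranteed contents of `nerfmae_split.npz`, so this sketch first builds a tiny stand-in file and then shows the loading pattern you would apply to the real one:

```python
import numpy as np

# Build a tiny stand-in split file; the keys inside the real
# pretrain/nerfmae_split.npz may differ (assumed here: "train", "val").
np.savez(
    "nerfmae_split_example.npz",
    train=np.array(["scene_0001", "scene_0002"]),
    val=np.array(["scene_0003"]),
)

# Loading pattern for a split archive of this kind.
split = np.load("nerfmae_split_example.npz")
print(sorted(split.files))       # -> ['train', 'val'] (keys in the archive)
print(split["train"].tolist())   # -> ['scene_0001', 'scene_0002']
```

Printing `split.files` first is a safe way to discover the actual key names before indexing into the archive.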
For more details, dataloaders, and instructions on how to use this dataset, see our GitHub repo: https://github.com/zubair-irshad/NeRF-MAE

Coming Soon: multi-view rendered posed RGB images from FRONT3D, HM3D, and Hypersim, along with Instant-NGP trained checkpoints (3,200+ NeRF checkpoints and over 1.6M images).

Note: The datasets above are all you need to train and evaluate our method.

Please note that our dataset was generated following the instructions from NeRF-RPN and 3D-CLR. Please consider citing our work, NeRF-RPN, and 3D-CLR if you find this dataset useful in your research.

Please also note that our dataset uses Front3D, Habitat-Matterport3D, HyperSim, and ScanNet as base datasets, i.e. we train a NeRF per scene and extract radiance and density grids along with aligned NeRF-grid 3D annotations. Please read the terms of use for each base dataset if you want to utilize its posed multi-view images.
