---
license: apache-2.0
---
**Model**: [SatMAE](https://arxiv.org/abs/2207.08051)

**Variant**: vitlarge-fmow-pretrain-800
### Example Usage

```python
import torch
from huggingface_hub import hf_hub_download

# Fetch the model definition file from the Hub, then import it.
hf_hub_download("MVRL/satmae-vitlarge-fmow-pretrain-800", "model.py", local_dir=".")
from model import MaskedAutoencoderViT

# Load the pretrained weights.
model = MaskedAutoencoderViT.from_pretrained("MVRL/satmae-vitlarge-fmow-pretrain-800")

# Run the encoder on a dummy 224x224 RGB image;
# mask_ratio=0.0 keeps every patch visible (no masking).
print(model.forward_encoder(torch.randn(1, 3, 224, 224), mask_ratio=0.0)[0].shape)
```