---
license: cc-by-4.0
pretty_name: Ground-based 2d images assembled in Maireles-González et al.
tags:
- astronomy
- compression
- images
dataset_info:
  config_name: tiny
  features:
  - name: image
    dtype:
      image:
        mode: I;16
  - name: telescope
    dtype: string
  - name: image_id
    dtype: string
  splits:
  - name: train
    num_bytes: 307620692
    num_examples: 10
  - name: test
    num_bytes: 168984694
    num_examples: 5
  download_size: 238361934
  dataset_size: 476605386
---

# GBI-16-2D-Legacy Dataset

GBI-16-2D-Legacy is a Huggingface `dataset` wrapper around a compression dataset assembled by Maireles-González et al. (Publications of the Astronomical Society of the Pacific, 135:094502, 2023 September; doi: [https://doi.org/10.1088/1538-3873/acf6e0](https://doi.org/10.1088/1538-3873/acf6e0)). It contains 226 FITS images from 5 different ground-based telescope/camera combinations, with a varying amount of entropy per image.

# Usage

You first need to install the `datasets` and `astropy` packages:

```bash
pip install datasets astropy
```

There are two configurations: `tiny` and `full`, each with `train` and `test` splits. The `tiny` configuration has 10 2D images in the `train` split and 5 in the `test` split. The `full` configuration contains all the images in the `data/` directory.

## Use from Huggingface Directly

To use this data directly from Huggingface, you'll want to log in on the command line before starting python:

```bash
huggingface-cli login
```

or in python:

```python
import huggingface_hub
huggingface_hub.login(token=token)
```

Then in your python script:

```python
from datasets import load_dataset

dataset = load_dataset("AstroCompress/GBI-16-2D-Legacy", "tiny")
ds = dataset.with_format("np")
```

## Local Use

Alternatively, you can clone this repo and use it directly without connecting to hf:

```bash
git clone https://huggingface.co/datasets/AstroCompress/GBI-16-2D-Legacy
```

Then `cd GBI-16-2D-Legacy` and start python like:

```python
from datasets import load_dataset

dataset = load_dataset("./GBI-16-2D-Legacy", "tiny", data_dir="./data/")
ds = dataset.with_format("np")
```

Now you should be able to use the `ds` variable like:

```python
ds["test"][0]["image"].shape  # -> (height, width) of that 2D image; varies by telescope
```

Note of course that it will take a long time to download and convert the images into the local cache for the `full` dataset. Afterward, the usage should be quick as the files are memory-mapped from disk.
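If you only want to inspect a few examples from the `full` configuration without building the local cache first, the `datasets` streaming mode can help. This is a minimal sketch, assuming the `full` config exposes the same `image`, `telescope`, and `image_id` features listed in the header above:

```python
from datasets import load_dataset

# Stream examples instead of downloading and converting the whole dataset up front.
dataset = load_dataset("AstroCompress/GBI-16-2D-Legacy", "full", streaming=True)

# Inspect a few training examples without materializing the local cache.
for example in dataset["train"].take(3):
    print(example["image_id"], example["telescope"])
```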
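Each example also pairs the pixel array with its `telescope` and `image_id` string fields (as listed in the dataset header above). A small illustrative snippet, reusing the `ds` variable from the examples above:

```python
import numpy as np

# With ds = dataset.with_format("np"), the image is returned as a NumPy array.
example = ds["train"][0]
image = np.asarray(example["image"])

# telescope and image_id are per-image metadata strings describing the source.
print(example["telescope"], example["image_id"], image.shape, image.dtype)
```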