Samples from https://github.com/NVlabs/ffhq-dataset, encoded to float16 SDXL latents via the Ollin VAE.

Dataset created using this script.

The VAE encoder used NATTEN neighborhood attention with kernel size 17.

We didn't bother saving the mean & logvar: the variance is low enough that retaining them isn't worth the doubling of filesize. Instead, we sampled from the diagonal Gaussian distribution and saved the resulting latents.
The original image is kept alongside each latent.
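
For reference, the encoding step looks roughly like the following minimal sketch using the diffusers AutoencoderKL API. The checkpoint name madebyollin/sdxl-vae-fp16-fix is my assumption for what "Ollin VAE" refers to, and the NATTEN attention substitution from the actual script is omitted here:

```python
# Sketch of the encoding step. Assumption: "Ollin VAE" is
# madebyollin/sdxl-vae-fp16-fix. The NATTEN attention swap is omitted.
import torch
from diffusers import AutoencoderKL
from diffusers.image_processor import VaeImageProcessor
from PIL import Image

vae = AutoencoderKL.from_pretrained(
    'madebyollin/sdxl-vae-fp16-fix', torch_dtype=torch.float16
).to('cuda')
proc = VaeImageProcessor()

img = Image.open('00000.png')                             # 1024×1024 FFHQ sample
pixels = proc.preprocess(img).to('cuda', torch.float16)   # 1×3×1024×1024, in [-1, 1]

with torch.inference_mode():
    dist = vae.encode(pixels).latent_dist   # diagonal Gaussian posterior
    latent = dist.sample()[0]               # sample it; discard mean & logvar
torch.save(latent.half().cpu(), 'latent.pth')  # 4×128×128 float16
```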

Schema/usage:

```python
from io import BytesIO
from typing import Iterator, TypedDict

import torch
from PIL import Image
from webdataset import WebDataset

Sample = TypedDict('Sample', {
  '__key__': str,
  '__url__': str,
  'img.png': bytes,     # PIL image, serialized. 1024×1024px
  'latent.pth': bytes,  # FloatTensor, serialized. 4×128×128 float16 latents
})

it: Iterator[Sample] = WebDataset('{00000..00035}.tar')

for sample in it:
  img = Image.open(BytesIO(sample['img.png']))
  latent: torch.Tensor = torch.load(BytesIO(sample['latent.pth']), map_location='cpu')
```
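
A stored latent can be decoded back to pixels with the same VAE. A sketch under the same checkpoint assumption, continuing from the `latent` variable in the loop above; the per-channel stds listed below (~5-7) suggest the latents are stored unscaled (raw posterior samples), so they can go straight into `vae.decode`:

```python
# Sketch: decode a stored latent back to an image (same checkpoint assumption).
import torch
from diffusers import AutoencoderKL
from diffusers.image_processor import VaeImageProcessor

vae = AutoencoderKL.from_pretrained(
    'madebyollin/sdxl-vae-fp16-fix', torch_dtype=torch.float16
).to('cuda')
proc = VaeImageProcessor()

with torch.inference_mode():
    # latents appear to be unscaled, so no scaling_factor division first
    decoded = vae.decode(latent.unsqueeze(0).to('cuda', torch.float16)).sample
proc.postprocess(decoded, output_type='pil')[0].save('roundtrip.png')
```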
Per-channel statistics of the latents (4 channels):

```python
# avg/val.pt: per-channel mean
[-2.8982300758361816, -0.9609659910202026, 0.2416578084230423, -0.307400107383728]
# avg/sq.pt: per-channel mean of squares
[65.80902099609375, 32.772762298583984, 36.080204010009766, 25.072017669677734]

# std = (sq - val**2)**.5
[7.5768914222717285, 5.643518924713135, 6.001816749572754, 4.997751712799072]
# 1/std
[0.13198024034500122, 0.17719440162181854, 0.16661621630191803, 0.2000899761915207]
```
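
These statistics make it easy to normalize latents to roughly zero mean and unit variance per channel, e.g. before diffusion training. A minimal sketch using the values above (the helper names are mine):

```python
import torch

# Per-channel stats from above (avg/val.pt mean, and the derived std).
mean = torch.tensor([-2.8982300758361816, -0.9609659910202026, 0.2416578084230423, -0.307400107383728])
std = torch.tensor([7.5768914222717285, 5.643518924713135, 6.001816749572754, 4.997751712799072])

def normalize(latent: torch.Tensor) -> torch.Tensor:
    """Map a 4×128×128 latent to ~zero mean, unit variance per channel."""
    return (latent.float() - mean[:, None, None]) / std[:, None, None]

def denormalize(latent: torch.Tensor) -> torch.Tensor:
    """Invert normalize() before decoding with the VAE."""
    return latent * std[:, None, None] + mean[:, None, None]
```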

Flickr-Faces-HQ Dataset (FFHQ)

Flickr-Faces-HQ (FFHQ) is a high-quality image dataset of human faces, originally created as a benchmark for generative adversarial networks (GANs):

A Style-Based Generator Architecture for Generative Adversarial Networks
Tero Karras (NVIDIA), Samuli Laine (NVIDIA), Timo Aila (NVIDIA)
https://arxiv.org/abs/1812.04948

The dataset consists of 70,000 high-quality PNG images at 1024×1024 resolution and contains considerable variation in terms of age, ethnicity and image background. It also has good coverage of accessories such as eyeglasses, sunglasses, hats, etc. The images were crawled from Flickr, thus inheriting all the biases of that website, and automatically aligned and cropped using dlib. Only images under permissive licenses were collected. Various automatic filters were used to prune the set, and finally Amazon Mechanical Turk was used to remove the occasional statues, paintings, or photos of photos.

Please note that this dataset is not intended for, and should not be used for, development or improvement of facial recognition technologies. For business inquiries, please visit our website and submit the form: NVIDIA Research Licensing

Licenses

The individual images were published in Flickr by their respective authors under either Creative Commons BY 2.0, Creative Commons BY-NC 2.0, Public Domain Mark 1.0, Public Domain CC0 1.0, or U.S. Government Works license. All of these licenses allow free use, redistribution, and adaptation for non-commercial purposes. However, some of them require giving appropriate credit to the original author, as well as indicating any changes that were made to the images. The license and original author of each image are indicated in the metadata.

The dataset itself (including JSON metadata, download script, and documentation) is made available under Creative Commons BY-NC-SA 4.0 license by NVIDIA Corporation. You can use, redistribute, and adapt it for non-commercial purposes, as long as you (a) give appropriate credit by citing our paper, (b) indicate any changes that you've made, and (c) distribute any derivative works under the same license.
