---
dataset_info:
  features:
    - name: id
      dtype: string
    - name: url
      dtype: string
    - name: title
      dtype: string
    - name: text
      dtype: string
  splits:
    - name: train
      num_bytes: 76682393.95137328
      num_examples: 24325
  download_size: 80505648
  dataset_size: 76682393.95137328
configs:
  - config_name: default
    data_files:
      - split: train
        path: data/train-*
license: mit
task_categories:
  - text-generation
  - text2text-generation
  - text-classification
language:
  - en
size_categories:
  - 10K<n<100K
---
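
The train split can be loaded with the `datasets` library. A minimal sketch follows; the repository id used here is a placeholder, not the actual repo id.

```python
# Minimal loading sketch using the Hugging Face `datasets` library.
# "mattany/solar-system-wikipedia" is a hypothetical repo id; replace it
# with the id of this dataset repository.
from datasets import load_dataset

ds = load_dataset("mattany/solar-system-wikipedia", split="train")
print(ds)               # features: id, url, title, text; ~24k examples
print(ds[0]["title"])
```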

Generated by running a BFS over the Wikipedia category graph via the Wikipedia API, starting from the category "Solar System" with a maximum depth of 5. All child categories and pages encountered were collected. Finally, the dataset `"wikimedia/wikipedia", "20231101.en"` was used to retrieve the contents of the articles.
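
A minimal sketch of this collection pipeline, assuming the MediaWiki API's `categorymembers` endpoint and a title-based filter over `wikimedia/wikipedia`; rate limiting, error handling, and redirects are left out.

```python
# Sketch of the collection pipeline: depth-limited BFS over the Wikipedia
# category graph, then a title filter over the wikimedia/wikipedia dump.
from collections import deque

import requests
from datasets import load_dataset

API = "https://en.wikipedia.org/w/api.php"

def category_members(category):
    """Yield (namespace, title) pairs for all members of a category."""
    params = {
        "action": "query",
        "list": "categorymembers",
        "cmtitle": category,
        "cmtype": "page|subcat",
        "cmlimit": "500",
        "format": "json",
    }
    while True:
        data = requests.get(API, params=params).json()
        for m in data["query"]["categorymembers"]:
            yield m["ns"], m["title"]
        if "continue" not in data:
            break
        params.update(data["continue"])   # follow API continuation

# BFS over subcategories, limited to depth 5.
titles, seen = set(), {"Category:Solar System"}
queue = deque([("Category:Solar System", 0)])
while queue:
    cat, depth = queue.popleft()
    if depth > 5:
        continue
    for ns, title in category_members(cat):
        if ns == 14 and title not in seen:   # namespace 14 = subcategory
            seen.add(title)
            queue.append((title, depth + 1))
        elif ns == 0:                        # namespace 0 = article page
            titles.add(title)

# Keep only the collected articles' text from the Wikipedia dump.
wiki = load_dataset("wikimedia/wikipedia", "20231101.en", split="train")
subset = wiki.filter(lambda ex: ex["title"] in titles)
```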