---
dataset_info:
  features:
    - name: prompt
      dtype: string
    - name: chosen
      dtype: string
    - name: rejected
      dtype: string
  splits:
    - name: train
      num_bytes: 243412260
      num_examples: 128000
  download_size: 82603750
  dataset_size: 243412260
tags:
  - dpo
---

# Dataset Card for "stack-exchange-paired-128K"
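
The dataset exposes three string columns per example (prompt, chosen, rejected) in a single train split of 128,000 pairwise preference examples, the layout expected by DPO-style trainers. Below is a minimal loading sketch, assuming the repository id AlexHung29629/stack-exchange-paired-128K on the Hugging Face Hub:

```python
from datasets import load_dataset

# Load the single train split (128,000 prompt/chosen/rejected triples).
ds = load_dataset("AlexHung29629/stack-exchange-paired-128K", split="train")

print(ds)  # schema and example count

example = ds[0]
print(example["prompt"])    # question / instruction
print(example["chosen"])    # preferred response
print(example["rejected"])  # dispreferred response
```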

Token counts

llama2: 97,868,021
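
The figure above is the total llama2 token count for the dataset. A rough way to reproduce such a count is sketched below; it assumes the meta-llama/Llama-2-7b-hf tokenizer and that all three text columns are summed without special tokens, neither of which is stated on the card:

```python
from datasets import load_dataset
from transformers import AutoTokenizer

ds = load_dataset("AlexHung29629/stack-exchange-paired-128K", split="train")

# Assumption: the Llama-2 tokenizer from the Hub (gated; requires access approval).
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")

total = 0
for row in ds:
    # Assumption: sum tokens over all three text columns, no special tokens added.
    for column in ("prompt", "chosen", "rejected"):
        total += len(tokenizer(row[column], add_special_tokens=False)["input_ids"])

print(f"llama2 tokens: {total}")
```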

[More Information Needed]