RedPJ-ProX-0.7B

ArXiv | Models | Data | Code

RedPJ-ProX-0.7B is a tiny language model. It was trained for 25B tokens on RedPajama-V2-pro, RedPajama-V2 data refined with the ProX framework.
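
A minimal usage sketch, not from the original card: the repo's llama tag suggests the checkpoint loads with the standard transformers causal-LM classes, which is an assumption here.

from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumes the checkpoint is compatible with transformers' llama
# implementation; the model id matches the Hub repo name.
model_id = "gair-prox/RedPJ-ProX-0.7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Deep learning is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))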

Evaluations

ProX models are evaluated on 10 standard language-model benchmarks in a zero-shot setting.

       ARC-c  ARC-e  CSQA  HellaS  MMLU  OBQA  PiQA  SIQA  WinoG  SciQ  AVG
raw    26.1   44.3   29.7  39.1    27.3  29.2  66.9  39.0  52.0   67.4  42.1
ours   26.4   51.9   30.9  42.4    29.4  31.6  67.9  40.0  52.2   73.5  44.6
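
The card does not say which harness produced these numbers. As a hypothetical sketch, EleutherAI's lm-evaluation-harness can run zero-shot suites like this; the task identifiers below follow that harness's naming conventions and may differ by version.

import lm_eval

# Zero-shot evaluation sketch with lm-evaluation-harness (v0.4+ API).
# Only a subset of the card's benchmarks is listed; task names for the
# others (e.g. CSQA, SIQA, MMLU) vary across harness versions.
results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=gair-prox/RedPJ-ProX-0.7B",
    tasks=["arc_challenge", "arc_easy", "hellaswag",
           "openbookqa", "piqa", "winogrande", "sciq"],
    num_fewshot=0,
)
print(results["results"])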

Citation

@article{zhou2024programming,
  title={Programming Every Example: Lifting Pre-training Data Quality like Experts at Scale},
  author={Zhou, Fan and Wang, Zengzhi and Liu, Qian and Li, Junlong and Liu, Pengfei},
  journal={arXiv preprint arXiv:2409.17115},
  year={2024}
}

Model size: 759M parameters (F32 tensors, Safetensors format)
