Barcenas 14b phi-4 v2

Based on pankajmathur/orca_mini_phi-4 and trained with the dataset mlabonne/OpenThoughts-79k-filtered.

The goal of this new model is to work around the bugs of the first version by using a better base model and a much larger dataset of high-quality data covering math, science, code, and puzzles.

This new version is expected to perform noticeably better than the first and to achieve stronger benchmark results.
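A minimal loading sketch with `transformers` is shown below. It assumes the model uses the standard chat template inherited from its phi-4 base; the prompt and generation settings are illustrative assumptions, not values taken from this card.

```python
# Minimal usage sketch (assumption: standard transformers chat-template loading;
# generation settings are illustrative, not specified by the model card).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Danielbrdz/Barcenas-14b-phi-4-v2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # weights are published in FP16
    device_map="auto",
)

messages = [
    {"role": "user", "content": "Solve: if 3x + 7 = 22, what is x?"},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```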

Made with ❤️ in Guadalupe, Nuevo Leon, Mexico 🇲🇽

Model size: 14.7B parameters (Safetensors, FP16)

Model tree for Danielbrdz/Barcenas-14b-phi-4-v2

- Base model: microsoft/phi-4
- Finetuned from the base model: this model
- Quantizations: 2 models

Dataset used to train Danielbrdz/Barcenas-14b-phi-4-v2: mlabonne/OpenThoughts-79k-filtered