bling-phi-3-onnx

bling-phi-3-onnx is a fast, accurate fact-based question-answering model designed for retrieval-augmented generation (RAG) with complex business documents. It is quantized and packaged in ONNX int4 format for AI PCs, targeting Intel GPU, CPU, and NPU.

This model is among the most accurate in the BLING/DRAGON model series, which is especially notable given its relatively small size, and it is well suited to AI PCs and local inferencing.
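
For typical RAG question-answering, the model can be loaded and called through llmware's ModelCatalog. The snippet below is a minimal sketch, assuming the ModelCatalog().load_model and model.inference interfaces of the llmware Python library and a made-up context passage; check the catalog for the exact model name on your installation.

```python
from llmware.models import ModelCatalog

# Load the int4 ONNX model from the llmware model catalog
# (model name assumed to match this repository's catalog entry)
model = ModelCatalog().load_model("bling-phi-3-onnx")

# Hypothetical context passage, e.g. a chunk returned by a retrieval step
context = ("The master services agreement was signed on March 3, 2024 and "
           "carries a total contract value of $125,000 over 24 months.")

# Fact-based question answered strictly against the supplied context
response = model.inference("What is the total contract value?", add_context=context)

# The response is expected to be a dict with an "llm_response" field
print(response["llm_response"])
```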

Model Description

  • Developed by: llmware
  • Model type: phi-3
  • Parameters: 3.8 billion
  • Quantization: int4
  • Model Parent: llmware/bling-phi-3
  • Language(s) (NLP): English
  • License: Apache 2.0
  • Uses: Fact-based question-answering, RAG
  • RAG Benchmark Accuracy Score: 99.5
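
Because the weights are packaged in ONNX int4, the model can also be run outside the llmware pipeline with onnxruntime-genai. The sketch below is illustrative only: it assumes the generation loop of onnxruntime-genai around version 0.4, a local download of this repository, and the BLING-style "<human>: ... <bot>:" prompt wrapper used by the parent model.

```python
import onnxruntime_genai as og

# Point at a local copy of this repository (e.g. downloaded with huggingface-cli)
model = og.Model("./bling-phi-3-onnx")
tokenizer = og.Tokenizer(model)

# BLING-style prompt: context passage followed by the question
context = "The warranty period for the device is 24 months from the date of purchase."
question = "How long is the warranty period?"
prompt = f"<human>: {context}\n{question}\n<bot>:"

params = og.GeneratorParams(model)
params.set_search_options(max_length=256)
params.input_ids = tokenizer.encode(prompt)

generator = og.Generator(model, params)
while not generator.is_done():
    generator.compute_logits()
    generator.generate_next_token()

print(tokenizer.decode(generator.get_sequence(0)))
```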

Model Card Contact

llmware on github
llmware on hf
llmware website
