
Fusion-In-Decoder Base on Natural Questions

This model is based on the Fusion-In-Decoder (FiD) architecture and trained on the Natural Questions dataset.

Model Details

The model follows the Fusion-In-Decoder architecture and is initialized from the google/flan-t5-base checkpoint. For training, each query is paired with a collection of relevant passages obtained via text retrieval.

The passages were retrieved from a Wikipedia-based corpus.
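
Usage

The sketch below shows how FiD-style inference could look with this checkpoint, assuming it loads with the standard Transformers seq2seq classes. The `question: ... title: ... context: ...` prompt format and the manual fusion of per-passage encoder states follow the original Fusion-In-Decoder implementation; they are assumptions for illustration and are not prescribed by this card.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
from transformers.modeling_outputs import BaseModelOutput

model_id = "Intel/fid_flan_t5_base_nq"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)
model.eval()

question = "where is the Eiffel Tower located"
# Retrieved (title, text) passages; in practice these come from a retriever
# over a Wikipedia-based corpus.
passages = [
    ("Eiffel Tower", "The Eiffel Tower is a wrought-iron lattice tower on the Champ de Mars in Paris, France."),
    ("Paris", "Paris is the capital and most populous city of France."),
]

# FiD-style input: each passage is concatenated with the question and encoded independently.
inputs = [f"question: {question} title: {title} context: {text}" for title, text in passages]
enc = tokenizer(inputs, padding=True, truncation=True, max_length=256, return_tensors="pt")

with torch.no_grad():
    # Encode every (question, passage) pair separately.
    encoder_outputs = model.get_encoder()(
        input_ids=enc.input_ids, attention_mask=enc.attention_mask
    )
    # Fusion-in-Decoder: concatenate the per-passage encoder states into one long
    # sequence so the decoder can attend over all passages jointly.
    fused_states = encoder_outputs.last_hidden_state.reshape(1, -1, model.config.d_model)
    fused_mask = enc.attention_mask.reshape(1, -1)
    output_ids = model.generate(
        encoder_outputs=BaseModelOutput(last_hidden_state=fused_states),
        attention_mask=fused_mask,
        max_new_tokens=32,
    )

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```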

Evaluation

Model performance is reported in the Evaluation Results section of the model page.

Model size: 248M parameters
Tensor type: BF16 (Safetensors)
