
# Nemotron-4-340B-Instruct-hf

Converted Hugging Face checkpoint of nvidia/Nemotron-4-340B-Instruct. Specifically, it was produced from the v1.0 `.nemo` checkpoint on NGC and is stored as BF16 safetensors (~341B parameters).

You can deploy this model with `vllm>=0.5.4` (PR #6611):

```bash
vllm serve mgoin/Nemotron-4-340B-Instruct-hf --tensor-parallel-size 16
```
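
Once the server is running, it exposes vLLM's OpenAI-compatible API (port 8000 by default). A minimal sketch of querying it with the `openai` Python client; the prompt, sampling settings, and `api_key` placeholder are illustrative and not part of the original card:

```python
from openai import OpenAI

# Point the OpenAI client at the local vLLM server started above.
# vLLM listens on port 8000 by default; the API key can be any placeholder
# string unless the server was launched with --api-key.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="mgoin/Nemotron-4-340B-Instruct-hf",
    messages=[
        {"role": "user", "content": "Summarize what Nemotron-4-340B-Instruct is in two sentences."},
    ],
    max_tokens=128,
    temperature=0.7,
)
print(response.choices[0].message.content)
```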

## Evaluations

All evaluations below were run against the FP8-quantized checkpoint using `lm-eval==0.4.3` on 8x A100 GPUs.

```bash
lm_eval --model vllm \
  --model_args pretrained=/home/mgoin/code/Nemotron-4-340B-Instruct-hf-FP8,tensor_parallel_size=8,distributed_executor_backend="ray",gpu_memory_utilization=0.6,enforce_eager=True \
  --tasks mmlu --num_fewshot 0 --batch_size 4
```

vllm (pretrained=/home/mgoin/code/Nemotron-4-340B-Instruct-hf-FP8,tensor_parallel_size=8,distributed_executor_backend=ray,gpu_memory_utilization=0.6,enforce_eager=True), gen_kwargs: (None), limit: None, num_fewshot: 0, batch_size: 4
|                 Tasks                 |Version|Filter|n-shot|Metric|   |Value |   |Stderr|
|---------------------------------------|-------|------|-----:|------|---|-----:|---|-----:|
|mmlu                                   |N/A    |none  |     0|acc   |↑  |0.7905|±  |0.0033|
|  - abstract_algebra                   |      0|none  |     0|acc   |↑  |0.5100|±  |0.0502|
|  - anatomy                            |      0|none  |     0|acc   |↑  |0.8074|±  |0.0341|
|  - astronomy                          |      0|none  |     0|acc   |↑  |0.8816|±  |0.0263|
|  - business_ethics                    |      0|none  |     0|acc   |↑  |0.7300|±  |0.0446|
|  - clinical_knowledge                 |      0|none  |     0|acc   |↑  |0.8453|±  |0.0223|
|  - college_biology                    |      0|none  |     0|acc   |↑  |0.9236|±  |0.0222|
|  - college_chemistry                  |      0|none  |     0|acc   |↑  |0.5300|±  |0.0502|
|  - college_computer_science           |      0|none  |     0|acc   |↑  |0.7400|±  |0.0441|
|  - college_mathematics                |      0|none  |     0|acc   |↑  |0.4500|±  |0.0500|
|  - college_medicine                   |      0|none  |     0|acc   |↑  |0.7630|±  |0.0324|
|  - college_physics                    |      0|none  |     0|acc   |↑  |0.5490|±  |0.0495|
|  - computer_security                  |      0|none  |     0|acc   |↑  |0.8300|±  |0.0378|
|  - conceptual_physics                 |      0|none  |     0|acc   |↑  |0.8043|±  |0.0259|
|  - econometrics                       |      0|none  |     0|acc   |↑  |0.7105|±  |0.0427|
|  - electrical_engineering             |      0|none  |     0|acc   |↑  |0.7448|±  |0.0363|
|  - elementary_mathematics             |      0|none  |     0|acc   |↑  |0.6667|±  |0.0243|
|  - formal_logic                       |      0|none  |     0|acc   |↑  |0.5952|±  |0.0439|
|  - global_facts                       |      0|none  |     0|acc   |↑  |0.4900|±  |0.0502|
|  - high_school_biology                |      0|none  |     0|acc   |↑  |0.9097|±  |0.0163|
|  - high_school_chemistry              |      0|none  |     0|acc   |↑  |0.7143|±  |0.0318|
|  - high_school_computer_science       |      0|none  |     0|acc   |↑  |0.9100|±  |0.0288|
|  - high_school_european_history       |      0|none  |     0|acc   |↑  |0.8788|±  |0.0255|
|  - high_school_geography              |      0|none  |     0|acc   |↑  |0.9242|±  |0.0189|
|  - high_school_government_and_politics|      0|none  |     0|acc   |↑  |0.9845|±  |0.0089|
|  - high_school_macroeconomics         |      0|none  |     0|acc   |↑  |0.8333|±  |0.0189|
|  - high_school_mathematics            |      0|none  |     0|acc   |↑  |0.4630|±  |0.0304|
|  - high_school_microeconomics         |      0|none  |     0|acc   |↑  |0.8824|±  |0.0209|
|  - high_school_physics                |      0|none  |     0|acc   |↑  |0.6159|±  |0.0397|
|  - high_school_psychology             |      0|none  |     0|acc   |↑  |0.9394|±  |0.0102|
|  - high_school_statistics             |      0|none  |     0|acc   |↑  |0.7639|±  |0.0290|
|  - high_school_us_history             |      0|none  |     0|acc   |↑  |0.9412|±  |0.0165|
|  - high_school_world_history          |      0|none  |     0|acc   |↑  |0.9409|±  |0.0153|
|  - human_aging                        |      0|none  |     0|acc   |↑  |0.8072|±  |0.0265|
|  - human_sexuality                    |      0|none  |     0|acc   |↑  |0.8855|±  |0.0279|
| - humanities                          |N/A    |none  |     0|acc   |↑  |0.7594|±  |0.0060|
|  - international_law                  |      0|none  |     0|acc   |↑  |0.9091|±  |0.0262|
|  - jurisprudence                      |      0|none  |     0|acc   |↑  |0.8704|±  |0.0325|
|  - logical_fallacies                  |      0|none  |     0|acc   |↑  |0.8528|±  |0.0278|
|  - machine_learning                   |      0|none  |     0|acc   |↑  |0.6786|±  |0.0443|
|  - management                         |      0|none  |     0|acc   |↑  |0.8641|±  |0.0339|
|  - marketing                          |      0|none  |     0|acc   |↑  |0.9359|±  |0.0160|
|  - medical_genetics                   |      0|none  |     0|acc   |↑  |0.8400|±  |0.0368|
|  - miscellaneous                      |      0|none  |     0|acc   |↑  |0.9221|±  |0.0096|
|  - moral_disputes                     |      0|none  |     0|acc   |↑  |0.8382|±  |0.0198|
|  - moral_scenarios                    |      0|none  |     0|acc   |↑  |0.6168|±  |0.0163|
|  - nutrition                          |      0|none  |     0|acc   |↑  |0.8791|±  |0.0187|
| - other                               |N/A    |none  |     0|acc   |↑  |0.8214|±  |0.0065|
|  - philosophy                         |      0|none  |     0|acc   |↑  |0.8521|±  |0.0202|
|  - prehistory                         |      0|none  |     0|acc   |↑  |0.8796|±  |0.0181|
|  - professional_accounting            |      0|none  |     0|acc   |↑  |0.6383|±  |0.0287|
|  - professional_law                   |      0|none  |     0|acc   |↑  |0.6838|±  |0.0119|
|  - professional_medicine              |      0|none  |     0|acc   |↑  |0.8824|±  |0.0196|
|  - professional_psychology            |      0|none  |     0|acc   |↑  |0.8611|±  |0.0140|
|  - public_relations                   |      0|none  |     0|acc   |↑  |0.8000|±  |0.0383|
|  - security_studies                   |      0|none  |     0|acc   |↑  |0.8204|±  |0.0246|
| - social_sciences                     |N/A    |none  |     0|acc   |↑  |0.8811|±  |0.0057|
|  - sociology                          |      0|none  |     0|acc   |↑  |0.9055|±  |0.0207|
| - stem                                |N/A    |none  |     0|acc   |↑  |0.7180|±  |0.0076|
|  - us_foreign_policy                  |      0|none  |     0|acc   |↑  |0.9600|±  |0.0197|
|  - virology                           |      0|none  |     0|acc   |↑  |0.5482|±  |0.0387|
|  - world_religions                    |      0|none  |     0|acc   |↑  |0.9006|±  |0.0229|

|      Groups      |Version|Filter|n-shot|Metric|   |Value |   |Stderr|
|------------------|-------|------|-----:|------|---|-----:|---|-----:|
|mmlu              |N/A    |none  |     0|acc   |↑  |0.7905|±  |0.0033|
| - humanities     |N/A    |none  |     0|acc   |↑  |0.7594|±  |0.0060|
| - other          |N/A    |none  |     0|acc   |↑  |0.8214|±  |0.0065|
| - social_sciences|N/A    |none  |     0|acc   |↑  |0.8811|±  |0.0057|
| - stem           |N/A    |none  |     0|acc   |↑  |0.7180|±  |0.0076|
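
For a quick sanity check of the converted checkpoint outside of `lm-eval`, vLLM's offline Python API can be used with the same parallelism and memory settings as the evaluation command above. A minimal sketch; the prompt is illustrative, and the BF16 weights will likely need more GPUs (e.g. 16, as in the serve command) than the FP8 checkpoint evaluated here:

```python
from vllm import LLM, SamplingParams

# Mirror the evaluation settings: 8-way tensor parallelism, eager mode,
# and conservative GPU memory utilization. Swap in a local path if the
# weights (FP8 or BF16) are already downloaded.
llm = LLM(
    model="mgoin/Nemotron-4-340B-Instruct-hf",
    tensor_parallel_size=8,
    gpu_memory_utilization=0.6,
    enforce_eager=True,
)

params = SamplingParams(temperature=0.0, max_tokens=64)
outputs = llm.generate(["Question: What is tensor parallelism?\nAnswer:"], params)
print(outputs[0].outputs[0].text)
```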

The original paper's evals, for reference:

*(Image: evaluation results table from the Nemotron-4 340B technical report.)*
