Incomplete JSON object returned by the model inspector.

#1
by EarthlingX - opened

It causes LM Studio not to automatically set properties for the model ("QuantFactory/Meta-Llama-3.1-8B-Instruct-GGUF/Meta-Llama-3.1-8B-Instruct.Q6_K.gguf").
Not really a big issue after I figured that out, but it means the model does not load into GPU VRAM in LM Studio until n_gpu_layers is set manually.
Here are the details:
This is the object returned by the model inspector:

{
  "name": "Models",
  "arch": "llama",
  "rope": {}
}

and this is what is returned for another model ("QuantFactory/Llama3-8B-Instruct-Replete-Adapted-GGUF/Llama3-8B-Instruct-Replete-Adapted.Q6_K.gguf"):

{
  "name": "models",
  "arch": "llama",
  "quant": "Q6_K",
  "context_length": 8192,
  "embedding_length": 4096,
  "num_layers": 32,
  "rope": {
    "freq_base": 500000,
    "dimension_count": 128
  },
  "head_count": 32,
  "head_count_kv": 8,
  "parameters": "7B"
}

I haven't tried any other quantized versions of this model.
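For reference, here is a minimal sketch of how one could check whether the usual llama.cpp metadata keys are actually present in the GGUF file itself, independently of LM Studio's inspector, using the gguf Python package that ships with llama.cpp. The file path is an example and this is not LM Studio's actual lookup logic, just an assumption about which keys map to the missing properties above.

# Minimal sketch (assumptions noted above), using `pip install gguf`.
from gguf import GGUFReader, GGUFValueType

reader = GGUFReader("Meta-Llama-3.1-8B-Instruct.Q6_K.gguf")  # example path

def value_of(field):
    # Best-effort decode of a single-valued metadata field: the raw value
    # is stored in the last "part" of the field.
    part = field.parts[-1]
    if field.types and field.types[0] == GGUFValueType.STRING:
        return bytes(part).decode("utf-8", errors="replace")
    return part[0]

# Standard GGUF keys that roughly correspond to the properties missing
# from the inspector output above.
for key in ("general.name",
            "general.architecture",
            "llama.context_length",
            "llama.embedding_length",
            "llama.block_count",
            "llama.attention.head_count",
            "llama.attention.head_count_kv",
            "llama.rope.freq_base",
            "llama.rope.dimension_count"):
    field = reader.fields.get(key)
    print(key, "->", value_of(field) if field is not None else "MISSING")

If the keys print correctly here but the inspector still returns the truncated object, that would point at the inspector rather than the file.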

Solved with the LM Studio 0.2.28 update.

EarthlingX changed discussion status to closed
