Where is the VQ tokenizer?

#1
by WeiChow - opened

Hello! Great job!
I noticed the following in your repo's README:

> We have several existing VQs that you can use, LIBERO VQ and Bridge VQ. Download them into this vq/ folder before running VLA training or inference.

However, when I downloaded this model, I found that `model_state_dict.keys()` looks like the following:

```
dino_featurizer.blocks.2.mlp.fc2.bias
dino_featurizer.blocks.3.norm1.weight
dino_featurizer.blocks.3.norm1.bias
dino_featurizer.blocks.3.attn.qkv.weight
dino_featurizer.blocks.3.attn.qkv.bias
dino_featurizer.blocks.3.attn.proj.weight
dino_featurizer.blocks.3.attn.proj.bias
dino_featurizer.blocks.3.ls1.gamma
dino_featurizer.blocks.3.norm2.weight
dino_featurizer.blocks.3.norm2.bias
dino_featurizer.blocks.3.mlp.fc1.weight
dino_featurizer.blocks.3.mlp.fc1.bias
dino_featurizer.blocks.3.mlp.fc2.weight
dino_featurizer.blocks.3.mlp.fc2.bias
dino_featurizer.blocks.3.ls2.gamma
... (same pattern for the remaining dino_featurizer blocks, up to block 20) ...
siglip_featurizer.blocks.0.attn.proj.weight
siglip_featurizer.blocks.0.attn.proj.bias
siglip_featurizer.blocks.0.norm2.weight
siglip_featurizer.blocks.0.norm2.bias
siglip_featurizer.blocks.0.mlp.fc1.weight
siglip_featurizer.blocks.0.mlp.fc1.bias
siglip_featurizer.blocks.0.mlp.fc2.weight
siglip_featurizer.blocks.0.mlp.fc2.bias
... (same pattern for the remaining siglip_featurizer blocks, up to block 26) ...
siglip_featurizer.norm.weight
siglip_featurizer.norm.bias
siglip_featurizer.attn_pool.latent
siglip_featurizer.attn_pool.q.weight
siglip_featurizer.attn_pool.q.bias
siglip_featurizer.attn_pool.kv.weight
siglip_featurizer.attn_pool.kv.bias
siglip_featurizer.attn_pool.proj.weight
siglip_featurizer.attn_pool.proj.bias
siglip_featurizer.attn_pool.norm.weight
... (remaining attn_pool keys truncated) ...
llm.model.layers.4.self_attn.q_proj.bias
llm.model.layers.4.self_attn.k_proj.weight
llm.model.layers.4.self_attn.k_proj.bias
llm.model.layers.4.self_attn.v_proj.weight
llm.model.layers.4.self_attn.v_proj.bias
llm.model.layers.4.self_attn.o_proj.weight
llm.model.layers.4.mlp.gate_proj.weight
llm.model.layers.4.mlp.up_proj.weight
llm.model.layers.4.mlp.down_proj.weight
llm.model.layers.4.input_layernorm.weight
llm.model.layers.4.post_attention_layernorm.weight
... (same pattern for the remaining llm layers, up to layer 23) ...
llm.model.norm.weight
llm.lm_head.weight
projector.0.weight
projector.0.bias
projector.2.weight
projector.2.bias
projector.4.weight
projector.4.bias
```

So the checkpoint contains the DINO and SigLIP featurizers, the LLM, and the projector, but no VQ weights. Where is the VQ tokenizer?
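A quick way to see which components a checkpoint actually contains is to group the `state_dict` keys by their top-level prefix. Below is a minimal sketch; the `torch.load` path in the comment is hypothetical, and the sample keys are just a stand-in for the real dump above.

```python
from collections import Counter

def component_prefixes(keys):
    """Group state_dict keys by their top-level module prefix and count them."""
    return Counter(k.split(".")[0] for k in keys)

# With a real checkpoint you would do something like:
#   sd = torch.load("path/to/checkpoint.pt", map_location="cpu")["model_state_dict"]
#   print(component_prefixes(sd.keys()))
# Here we use a few representative keys from the dump above:
keys = [
    "dino_featurizer.blocks.3.attn.qkv.weight",
    "siglip_featurizer.blocks.0.mlp.fc1.weight",
    "llm.model.layers.4.self_attn.q_proj.weight",
    "projector.0.weight",
]
print(component_prefixes(keys))
```

If no `vq`-prefixed keys show up in the result, the file is the VLA checkpoint rather than the VQ tokenizer.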
Stanford ILIAD org

Hi, sorry, I provided the wrong link in the README! Here is the correct link to the pretrained VQ models: https://huggingface.co/Stanford-ILIAD/pretrain_vq

belkhale changed discussion status to closed