Salesforce / instructblip-vicuna-7b

Tags: Image-Text-to-Text · Transformers · PyTorch · Safetensors · English · instructblip · vision · image-captioning · Inference Endpoints
Paper: arXiv:2305.06500
License: other
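The tags above mark this checkpoint as an InstructBLIP model served through the Transformers library. As a minimal sketch, assuming a GPU with enough memory for the fp16 weights and using a placeholder image URL, prompt, and generation settings (none of which come from this repository listing), inference could look like the following:

```python
import torch
import requests
from PIL import Image
from transformers import InstructBlipProcessor, InstructBlipForConditionalGeneration

model_id = "Salesforce/instructblip-vicuna-7b"

# The processor bundles the LLaMA tokenizer, the Q-Former tokenizer and the image processor.
processor = InstructBlipProcessor.from_pretrained(model_id)
# Assumption: half precision to keep the roughly 16 GB of fp16 weights on a single GPU.
model = InstructBlipForConditionalGeneration.from_pretrained(model_id, torch_dtype=torch.float16)

device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)

# Placeholder inputs; swap in your own image and instruction.
url = "https://raw.githubusercontent.com/salesforce/LAVIS/main/docs/_static/Confusing-Pictures.jpg"
image = Image.open(requests.get(url, stream=True).raw).convert("RGB")
prompt = "What is unusual about this image?"

# Cast the floating-point inputs (pixel values) to fp16 to match the model weights.
inputs = processor(images=image, text=prompt, return_tensors="pt").to(device, torch.float16)

outputs = model.generate(**inputs, do_sample=False, max_new_tokens=100)
print(processor.batch_decode(outputs, skip_special_tokens=True)[0].strip())
```

The `qformer_tokenizer/` folder in the listing below is what gives the processor its second tokenizer: InstructBLIP feeds the instruction text to the Q-Former as well as to the language model.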
Files and versions (revision 92cb457): 4 contributors, 19 commits
Latest commit: "update the files" by RaushanTurganbay, 7 days ago (92cb457)
| File | Size | Last commit | Last modified |
|---|---|---|---|
| qformer_tokenizer/ | (directory) | update the files | 7 days ago |
| .gitattributes | 1.48 kB | initial commit | over 1 year ago |
| README.md | 2.14 kB | Update README.md | over 1 year ago |
| added_tokens.json | 41 Bytes | update the files | 7 days ago |
| config.json | 1.35 kB | update the files | 7 days ago |
| generation_config.json | 141 Bytes | update the files | 7 days ago |
| model-00001-of-00004.safetensors | 9.9 GB (LFS) | update the files | 7 days ago |
| model-00002-of-00004.safetensors | 9.96 GB (LFS) | update the files | 7 days ago |
| model-00003-of-00004.safetensors | 9.92 GB (LFS) | update the files | 7 days ago |
| model-00004-of-00004.safetensors | 1.88 GB (LFS) | update the files | 7 days ago |
| model.safetensors.index.json | 104 kB | update the files | 7 days ago |
| preprocessor_config.json | 439 Bytes | Upload processor | over 1 year ago |
| processor_config.json | 75 Bytes | update the files | 7 days ago |
| pytorch_model-00001-of-00004.bin | 9.9 GB (LFS) | Upload InstructBlipForConditionalGeneration | over 1 year ago |
| pytorch_model-00002-of-00004.bin | 9.96 GB (LFS) | Upload InstructBlipForConditionalGeneration | over 1 year ago |
| pytorch_model-00003-of-00004.bin | 9.92 GB (LFS) | Upload InstructBlipForConditionalGeneration | over 1 year ago |
| pytorch_model-00004-of-00004.bin | 1.87 GB (LFS, pickle) | Upload InstructBlipForConditionalGeneration | over 1 year ago |
| pytorch_model.bin.index.json | 107 kB | Upload InstructBlipForConditionalGeneration | over 1 year ago |
| special_tokens_map.json | 549 Bytes | update the files | 7 days ago |
| tokenizer.json | 3.62 MB | update the files | 7 days ago |
| tokenizer.model | 500 kB (LFS) | Upload processor | over 1 year ago |
| tokenizer_config.json | 1.31 kB | update the files | 7 days ago |

Pickle scan: pytorch_model-00004-of-00004.bin contains three detected pickle imports (collections.OrderedDict, torch._utils._rebuild_tensor_v2, torch.FloatStorage).