HugoLaurencon committed
Commit 20f670a
1 Parent(s): 2f09a11

Update README.md

Files changed (1):
  1. README.md +3 -3
README.md CHANGED
@@ -1,5 +1,5 @@
 ---
-license: mit
+license: apache-2.0
 datasets:
 - HuggingFaceM4/WebSight
 language:
@@ -100,6 +100,6 @@ print(generated_text)
 
 # License
 
-The model is built on top of two pre-trained models: [SigLIP](https://github.com/huggingface/transformers/pull/26522) and [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1). As such, users should comply with the licenses of these models.
+The model is built on top of two pre-trained models: [SigLIP](https://github.com/huggingface/transformers/pull/26522) and [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1), which are delivered under an Apache license 2.0. As such, users should comply with the licenses of these models.
 
-The two pre-trained models are connected to each other with newly initialized parameters that we train. These are not based on any of the two base frozen models forming the composite model. We release the additional weights we trained under an MIT license.
+The two pre-trained models are connected to each other with newly initialized parameters that we train. These are not based on any of the two base frozen models forming the composite model. We release the additional weights we trained under an Apache license 2.0.