---
license: llama2
language:
- en
---
# DAMA 65B
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** Tomasz Limisiewicz, David Mareček, Tomáš Musil
- **Language(s) (NLP):** English
- **Adapted from model:** LLaMA 65B
### Model Sources
<!-- Provide the basic links for the model. -->
- **Repository:** [github.com/tomlimi/DAMA](https://github.com/tomlimi/DAMA)
- **Paper:** [openreview.net/pdf?id=XIZEFyVGC9](https://openreview.net/pdf?id=XIZEFyVGC9)
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
The model is a version of LLaMA 65B adapted with the DAMA method to reduce gender bias. It can be used as a drop-in replacement for the original model in text generation and processing tasks.
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
The model mitigates the gender bias present in the original LLaMA 65B while largely preserving its performance on downstream tasks.
It is therefore better suited than the original model for generating and processing text in gender-sensitive domains.
## Citation
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
```bibtex
@inproceedings{
limisiewicz2024debiasing,
title={Debiasing Algorithm through Model Adaptation},
author={Tomasz Limisiewicz and David Mare{\v{c}}ek and Tom{\'a}{\v{s}} Musil},
booktitle={The Twelfth International Conference on Learning Representations},
year={2024},
url={https://openreview.net/forum?id=XIZEFyVGC9}
}
```