|
--- |
|
tags: |
|
- bert |
|
- adapter-transformers |
|
- adapterhub:sw/wiki |
|
language: |
|
- sw |
|
license: "apache-2.0" |
|
--- |
|
|
|
# Adapter `bert-base-multilingual-cased-sw-wiki_pfeiffer` for bert-base-multilingual-cased |
|
|
|
A Pfeiffer adapter trained with masked language modelling (MLM) on Swahili Wikipedia articles for 100k steps with a batch size of 64.
|
|
|
|
|
**This adapter was created for usage with the [Adapters](https://github.com/Adapter-Hub/adapters) library.** |
|
|
|
## Usage |
|
|
|
First, install `adapters`: |
|
|
|
```bash
|
pip install -U adapters |
|
``` |
|
|
|
Now, the adapter can be loaded and activated like this: |
|
|
|
```python |
|
from adapters import AutoAdapterModel

# Load the base model with adapter support
model = AutoAdapterModel.from_pretrained("bert-base-multilingual-cased")

# Download the adapter from the Hub and register it on the model
adapter_name = model.load_adapter("AdapterHub/bert-base-multilingual-cased-sw-wiki_pfeiffer")

# Activate the adapter for subsequent forward passes
model.set_active_adapters(adapter_name)
|
``` |
|
|
|
## Architecture & Training |
|
|
|
- Adapter architecture: pfeiffer |
|
- Prediction head: None |
|
- Dataset: [sw/wiki](https://adapterhub.ml/explore/sw/wiki/) |
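
The Pfeiffer architecture inserts a single bottleneck module after each Transformer layer's feed-forward block: hidden states are down-projected, passed through a nonlinearity, up-projected, and added back via a residual connection. A minimal numpy sketch of that forward pass is below; the weights, dimensions (mBERT's hidden size of 768, a hypothetical reduction factor of 16), and function name are illustrative, not the trained adapter's actual parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

hidden_size = 768      # mBERT hidden size
bottleneck = 48        # hidden_size / 16 (assumed reduction factor)

# Randomly initialised stand-ins for the trained adapter weights.
W_down = rng.standard_normal((hidden_size, bottleneck)) * 0.02
b_down = np.zeros(bottleneck)
W_up = rng.standard_normal((bottleneck, hidden_size)) * 0.02
b_up = np.zeros(hidden_size)

def pfeiffer_adapter(h):
    """Bottleneck adapter: down-project, ReLU, up-project, residual add."""
    z = np.maximum(h @ W_down + b_down, 0.0)  # down-projection + ReLU
    return h + (z @ W_up + b_up)              # up-projection + residual

h = rng.standard_normal((4, hidden_size))     # e.g. 4 token representations
out = pfeiffer_adapter(h)                     # same shape as the input
```

Because only `W_down`, `b_down`, `W_up`, and `b_up` are trained while the base model stays frozen, the adapter adds roughly `2 * hidden_size * bottleneck` parameters per layer.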
|
|
|
## Author Information |
|
|
|
- Author name(s): Jonas Pfeiffer |
|
- Author email: jonas@pfeiffer.ai |
|
- Author links: [Website](https://pfeiffer.ai), [GitHub](https://github.com/jopfeiff), [Twitter](https://twitter.com/PfeiffJo)
|
|
|
## Versions |
|
- `nd` **(main)** |
|
- `wd` |
|
|
|
## Citation |
|
|
|
```bibtex |
|
@article{pfeiffer20madx, |
|
title={{MAD-X}: An {A}dapter-based {F}ramework for {M}ulti-task {C}ross-lingual {T}ransfer}, |
|
author={Pfeiffer, Jonas and Vuli\'{c}, Ivan and Gurevych, Iryna and Ruder, Sebastian}, |
|
journal={arXiv preprint}, |
|
year={2020}, |
|
url={https://arxiv.org/pdf/2005.00052.pdf}, |
|
} |
|
|
|
``` |
|
|
|
*This adapter has been auto-imported from https://github.com/Adapter-Hub/Hub/blob/master/adapters/ukp/bert-base-multilingual-cased-sw-wiki_pfeiffer.yaml*. |