---
license: cc-by-nc-sa-4.0
---
![ime](https://user-images.githubusercontent.com/2136700/160290194-4f30a796-876a-4750-bb3b-b5b62c4676c5.png)
# Transformers4IME
Transformers4IME is a repository for exploring and adapting transformer-based models to input method engines (IMEs).
## PinyinGPT
PinyinGPT is a model from [Exploring and Adapting Chinese GPT to Pinyin Input Method](https://arxiv.org/abs/2203.00249), which appears in ACL 2022.
```bibtex
@article{tan2022exploring,
  title={Exploring and Adapting Chinese GPT to Pinyin Input Method},
  author={Tan, Minghuan and Dai, Yong and Tang, Duyu and Feng, Zhangyin and Huang, Guoping and Jiang, Jing and Li, Jiwei and Shi, Shuming},
  journal={arXiv preprint arXiv:2203.00249},
  year={2022}
}
```
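To make the task concrete, below is a minimal sketch of the ranking problem a pinyin IME solves: given the text the user has already committed and a typed pinyin sequence, rank candidate Chinese character strings by how well they continue the context. This is only an illustration, not the repo's API or the paper's method; the checkpoint `uer/gpt2-chinese-cluecorpussmall`, the `score` helper, and the example context and candidates are stand-ins chosen for this sketch, and plain GPT-2 likelihood ranking is used in place of PinyinGPT's pinyin-aware decoding.

```python
import torch
from transformers import BertTokenizer, GPT2LMHeadModel

# Stand-in checkpoint for illustration only; not a PinyinGPT model.
MODEL_NAME = "uer/gpt2-chinese-cluecorpussmall"

tokenizer = BertTokenizer.from_pretrained(MODEL_NAME)  # this checkpoint uses a BERT-style vocab
model = GPT2LMHeadModel.from_pretrained(MODEL_NAME)
model.eval()

context = "我们没有足够的"             # text the user has already committed
candidates = ["时间", "事件", "实践"]  # candidates sharing the typed pinyin "shi jian"

def score(candidate: str) -> float:
    """Negative LM loss (average token log-likelihood) of context + candidate."""
    ids = tokenizer(context + candidate, return_tensors="pt").input_ids
    with torch.no_grad():
        out = model(ids, labels=ids)  # shifted causal-LM loss over the sequence
    return -out.loss.item()

# The highest-scoring candidate is what the IME would surface first.
print(max(candidates, key=score))
```

PinyinGPT adapts GPT so that the pinyin sequence itself is taken into account rather than relying on context likelihood alone; the repositories below contain the released code.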
The code can be found at:
* [Gitee](https://gitee.com/visualjoyce/Transformers4IME)
* [GitHub](https://github.com/visualjoyce/Transformers4IME)