Update README.md
README.md
---
license: mit
---

This is the 25 MB compressed version of GraphCodeBERT that has been fine-tuned for the Clone Detection task using the [BigCloneBench](https://github.com/clonebench/BigCloneBench.git) dataset.
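
As a rough illustration, a checkpoint like this can typically be loaded with the Hugging Face `transformers` library as sketched below. This is only a minimal sketch: the model path is a placeholder, and it assumes the checkpoint loads as a standard sequence-classification model; please refer to our GitHub repository (linked below) for the exact preprocessing and inference code.

```
# Minimal loading sketch. Assumptions: the checkpoint is loadable as a standard
# transformers sequence-classification model, and the model path below is a
# placeholder; see https://github.com/soarsmu/Compressor.git for the exact pipeline.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_path = "path/to/this/model"  # placeholder: local path or Hub id of this checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForSequenceClassification.from_pretrained(model_path)
model.eval()

# Clone detection on BigCloneBench is a binary decision over a pair of code snippets.
code_a = "int add(int a, int b) { return a + b; }"
code_b = "int sum(int x, int y) { return x + y; }"

inputs = tokenizer(code_a, code_b, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Assumption: label index 1 corresponds to "clone".
prob_clone = torch.softmax(logits, dim=-1)[0, 1].item()
print(f"Probability that the two snippets are clones: {prob_clone:.3f}")
```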

The compression is based on our ASE 2022 paper, ["**Compressing Pre-trained Models of Code into 3 MB**"](https://arxiv.org/abs/2208.07120).

If you are interested in using this model, please check our **GitHub repository: https://github.com/soarsmu/Compressor.git**. If you use the model or any code from our repo in your paper, please kindly cite:

```
@inproceedings{shi2022compressing,
  author = {Shi, Jieke and Yang, Zhou and Xu, Bowen and Kang, Hong Jin and Lo, David},
  title = {Compressing Pre-Trained Models of Code into 3 MB},
  year = {2023},
  isbn = {9781450394758},
  publisher = {Association for Computing Machinery},
  address = {New York, NY, USA},
  url = {https://doi.org/10.1145/3551349.3556964},
  doi = {10.1145/3551349.3556964},
  booktitle = {Proceedings of the 37th IEEE/ACM International Conference on Automated Software Engineering},
  articleno = {24},
  numpages = {12},
  keywords = {Pre-Trained Models, Model Compression, Genetic Algorithm},
  location = {Rochester, MI, USA},
  series = {ASE '22}
}
```