# Model card for CoNN Last Letter
### Introduction
In the paper *Neural Comprehension: Language Models with Compiled Neural Networks*, we introduced the integration of Compiled Neural Networks (CoNN) into the framework of language models, enabling existing language models to perform symbolic operations with perfect accuracy, without the need for external tools.

In this model card, we introduce the Last Letter model, a Transformer-like compiled network that outputs the last letter of its input sequence.
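To make the task concrete, the following is a plain-Python sketch of the mapping the compiled network realizes; the `last_letter` helper is purely illustrative and is not part of the released model.

```
# Illustrative only: the symbolic function the Last Letter CoNN computes.
# The released model is a compiled Transformer, not this Python helper.
def last_letter(tokens):
    # every output position carries the final input token
    return [tokens[-1]] * len(tokens)

print(last_letter(['w', 'e', 'n', 'g', 's', 'y', 'x']))
# ['x', 'x', 'x', 'x', 'x', 'x', 'x']
```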
### Install
```
git clone https://github.com/WENGSYX/Neural-Comprehension
cd Neural-Comprehension
pip install .
```
To run Neural Comprehension, you need to install `PyTorch`, `Transformers`, `jax`, and `tracr`.
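As a quick sanity check, here is a minimal sketch (not part of the original instructions) that assumes the dependencies are importable under their usual module names:

```
# Minimal sketch: confirm the dependencies listed above are importable.
# Module names are assumed (torch, transformers, jax, tracr); versions are not pinned here.
import torch
import transformers
import jax
import tracr

print(torch.__version__, transformers.__version__, jax.__version__)
```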
### How to Use?
```
from NeuralComprehension.CoNN.modeling_conn import CoNNModel
from NeuralComprehension.tracr4torch import Tokenizer

# Load the compiled Last Letter model and build its tokenizer from the model config
model = CoNNModel.from_pretrained('WENGSYX/CoNN_Last_Letter')
tokenizer = Tokenizer(model.config.input_encoding_map, model.config.output_encoding_map, model.config.max_position_embeddings)

# Encode a space-separated character sequence, run the model, and decode the prediction
output = model(tokenizer('w e n g s y x').unsqueeze(0))
print(tokenizer.decode(output.argmax(2)))

>>> [['bos', 'x', 'x', 'x', 'x', 'x', 'x', 'x']]
```
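The decoded sequence repeats the predicted last letter at every position. A minimal follow-up sketch (not from the original card) that reads a single-character answer from the variables above:

```
# Minimal sketch: take the prediction at the final position as the answer.
# Reuses `tokenizer` and `output` from the snippet above.
decoded = tokenizer.decode(output.argmax(2))
answer = decoded[0][-1]
print(answer)  # expected: 'x', the last letter of 'w e n g s y x'
```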