tim1900 committed
Commit 8f7ecfd
1 Parent(s): b2cb74d

Update README.md

Files changed (1):
  1. README.md +3 -3
README.md CHANGED
@@ -11,7 +11,7 @@ pipeline_tag: token-classification
 
 ## Introduction
 
-BertChunker is a text chunker based on BERT with a classifier head to predict the start token of chunks (for use in RAG, etc). It was finetuned on [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2). The whole training lasted for 10 minutes on an Nvidia P40 GPU on a 50 MB synthesized dataset.
+BertChunker is a text chunker based on BERT with a classifier head to predict the start token of chunks (for use in RAG, etc). It was finetuned on [nreimers/MiniLM-L6-H384-uncased](https://huggingface.co/nreimers/MiniLM-L6-H384-uncased). The whole training lasted for 10 minutes on an Nvidia P40 GPU on a 50 MB synthesized dataset.
 
 This repo includes the model checkpoint, the BertChunker class definition file, and all other files needed.
 
@@ -25,7 +25,7 @@ from modeling_bertchunker import BertChunker
 
 # load bert tokenizer
 tokenizer = AutoTokenizer.from_pretrained(
-    "sentence-transformers/all-MiniLM-L6-v2",
+    "nreimers/MiniLM-L6-H384-uncased",
     padding_side="right",
     model_max_length=255,
     trust_remote_code=True,
@@ -33,7 +33,7 @@ tokenizer = AutoTokenizer.from_pretrained(
 
 # load MiniLM-L6-H384-uncased bert config
 config = AutoConfig.from_pretrained(
-    "nreimers/MiniLM-L6-H384-uncased",
+    "nreimers/MiniLM-L6-H384-uncased",
     trust_remote_code=True,
 )
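
For reference, the two code hunks above combine into roughly the following loading snippet once this commit is applied. This is a minimal sketch rather than the repo's verbatim usage example: the `BertChunker(config)` constructor call, the `model.safetensors` checkpoint filename, and the weight-loading step are assumptions based on the class and file names mentioned in the README, so check `modeling_bertchunker.py` and the repo's usage section for the actual API.

```python
from transformers import AutoConfig, AutoTokenizer
from safetensors.torch import load_file

from modeling_bertchunker import BertChunker  # class definition file shipped in this repo

# load bert tokenizer (as updated in the diff above)
tokenizer = AutoTokenizer.from_pretrained(
    "nreimers/MiniLM-L6-H384-uncased",
    padding_side="right",
    model_max_length=255,
    trust_remote_code=True,
)

# load MiniLM-L6-H384-uncased bert config (as updated in the diff above)
config = AutoConfig.from_pretrained(
    "nreimers/MiniLM-L6-H384-uncased",
    trust_remote_code=True,
)

# assumption: BertChunker wraps the BERT backbone plus the classifier head and
# takes the config in its constructor; the checkpoint filename below is also
# an assumption -- see modeling_bertchunker.py for the actual signature.
model = BertChunker(config)
state_dict = load_file("model.safetensors")
model.load_state_dict(state_dict)
```

From there, chunking a document would go through whatever inference helper BertChunker exposes, which this diff does not show.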