diff --git "a/README.md" "b/README.md" --- "a/README.md" +++ "b/README.md" @@ -6,132 +6,70 @@ tags: - sentence-similarity - feature-extraction - generated_from_trainer -- dataset_size:6131012 +- dataset_size:6661966 - loss:MultipleNegativesRankingLoss - loss:CachedMultipleNegativesRankingLoss - loss:SoftmaxLoss +- loss:AnglELoss +- loss:CoSENTLoss - loss:CosineSimilarityLoss -base_model: -- tasksource/ModernBERT-base-nli -- answerdotai/ModernBERT-base +base_model: answerdotai/ModernBERT-base widget: -- source_sentence: >- - Daniel went to the kitchen. Sandra went back to the kitchen. Daniel moved to - the garden. Sandra grabbed the apple. Sandra went back to the office. Sandra - dropped the apple. Sandra went to the garden. Sandra went back to the - bedroom. Sandra went back to the office. Mary went back to the office. - Daniel moved to the bathroom. Sandra grabbed the apple. Sandra travelled to - the garden. Sandra put down the apple there. Mary went back to the bathroom. - Daniel travelled to the garden. Mary took the milk. Sandra grabbed the - apple. Mary left the milk there. Sandra journeyed to the bedroom. John - travelled to the office. John went back to the garden. Sandra journeyed to - the garden. Mary grabbed the milk. Mary left the milk. Mary grabbed the - milk. Mary went to the hallway. John moved to the hallway. Mary picked up - the football. Sandra journeyed to the kitchen. Sandra left the apple. Mary - discarded the milk. John journeyed to the garden. Mary dropped the football. - Daniel moved to the bathroom. Daniel journeyed to the kitchen. Mary - travelled to the bathroom. Daniel went to the bedroom. Mary went to the - hallway. Sandra got the apple. Sandra went back to the hallway. Mary moved - to the kitchen. Sandra dropped the apple there. Sandra grabbed the milk. - Sandra journeyed to the bathroom. John went back to the kitchen. Sandra went - to the kitchen. Sandra travelled to the bathroom. Daniel went to the garden. - Daniel moved to the kitchen. Sandra dropped the milk. Sandra got the milk. - Sandra put down the milk. John journeyed to the garden. Sandra went back to - the hallway. Sandra picked up the apple. Sandra got the football. Sandra - moved to the garden. Daniel moved to the bathroom. Daniel travelled to the - garden. Sandra went back to the bathroom. Sandra discarded the football. +- source_sentence: Daniel went to the kitchen. Sandra went back to the kitchen. Daniel + moved to the garden. Sandra grabbed the apple. Sandra went back to the office. + Sandra dropped the apple. Sandra went to the garden. Sandra went back to the bedroom. + Sandra went back to the office. Mary went back to the office. Daniel moved to + the bathroom. Sandra grabbed the apple. Sandra travelled to the garden. Sandra + put down the apple there. Mary went back to the bathroom. Daniel travelled to + the garden. Mary took the milk. Sandra grabbed the apple. Mary left the milk there. + Sandra journeyed to the bedroom. John travelled to the office. John went back + to the garden. Sandra journeyed to the garden. Mary grabbed the milk. Mary left + the milk. Mary grabbed the milk. Mary went to the hallway. John moved to the hallway. + Mary picked up the football. Sandra journeyed to the kitchen. Sandra left the + apple. Mary discarded the milk. John journeyed to the garden. Mary dropped the + football. Daniel moved to the bathroom. Daniel journeyed to the kitchen. Mary + travelled to the bathroom. Daniel went to the bedroom. Mary went to the hallway. + Sandra got the apple. 
Sandra went back to the hallway. Mary moved to the kitchen. + Sandra dropped the apple there. Sandra grabbed the milk. Sandra journeyed to the + bathroom. John went back to the kitchen. Sandra went to the kitchen. Sandra travelled + to the bathroom. Daniel went to the garden. Daniel moved to the kitchen. Sandra + dropped the milk. Sandra got the milk. Sandra put down the milk. John journeyed + to the garden. Sandra went back to the hallway. Sandra picked up the apple. Sandra + got the football. Sandra moved to the garden. Daniel moved to the bathroom. Daniel + travelled to the garden. Sandra went back to the bathroom. Sandra discarded the + football. sentences: - In the adulthood stage, it can jump, walk, run - The chocolate is bigger than the container. - The football before the bathroom was in the garden. -- source_sentence: >- - Context: I am devasted. - - Speaker 1: I am very devastated these days. - - Speaker 2: That seems bad and I am sorry to hear that. What happened? - - Speaker 1: My father day 3 weeks ago.I still can't believe. - - Speaker 2: I am truly sorry to hear that. Please accept my apologies for - your loss. May he rest in peace +- source_sentence: Almost everywhere the series converges then . sentences: - - 'The main emotion of this example dialogue is: content' - - 'This text is about: genealogy' - - The intent of this example is to be offensive/disrespectful. -- source_sentence: in three distinguish’d parts, with three distinguish���d guides + - The series then converges almost everywhere . + - Scrivener dated the manuscript to the 12th century , C. R. Gregory to the 13th + century . Currently the manuscript is dated by the INTF to the 12th century . + - Both daughters died before he did , Tosca in 1976 and Janear in 1981 . +- source_sentence: how are you i'm doing good thank you you im not good having cough + and colg sentences: - - This example is paraphrase. - - This example is neutral. - - This example is negative. -- source_sentence: A boy is playing a piano. + - 'This example tweet expresses the emotion: happiness' + - This example utterance is about cooking recipies. + - This example text from a US presidential speech is about macroeconomics +- source_sentence: A man is doing pull-ups sentences: - - Nine killed in Syrian-linked clashes in Lebanon - - A man is singing and playing a guitar. - - My opinion is to wait until the child itself expresses a desire for this. -- source_sentence: Francis I of France was a king. + - The man is doing exercises in a gym + - A black and white dog with a large branch is running in the field + - There is no man drawing +- source_sentence: A chef is preparing some food sentences: - - >- - The Apple QuickTake -LRB- codenamed Venus , Mars , Neptune -RRB- is one of - the first consumer digital camera lines .. digital camera. digital camera. - It was launched in 1994 by Apple Computer and was marketed for three years - before being discontinued in 1997 .. Apple Computer. Apple Computer. Three - models of the product were built including the 100 and 150 , both built by - Kodak ; and the 200 , built by Fujifilm .. Kodak. Kodak. Fujifilm. Fujifilm. - The QuickTake cameras had a resolution of 640 x 480 pixels maximum -LRB- 0.3 - Mpx -RRB- .. resolution. Display resolution. The 200 model is only - officially compatible with the Apple Macintosh for direct connections , - while the 100 and 150 model are compatible with both the Apple Macintosh and - Microsoft Windows .. Apple Macintosh. Apple Macintosh. Microsoft Windows. 
- Microsoft Windows. Because the QuickTake 200 is almost identical to the Fuji - DS-7 or to Samsung 's Kenox SSC-350N , Fuji 's software for that camera can - be used to gain Windows compatibility for the QuickTake 200 .. Some other - software replacements also exist as well as using an external reader for the - removable media of the QuickTake 200 .. Time Magazine profiled QuickTake as - `` the first consumer digital camera '' and ranked it among its `` 100 - greatest and most influential gadgets from 1923 to the present '' list .. - digital camera. digital camera. Time Magazine. Time Magazine. While the - QuickTake was probably the first digicam to have wide success , technically - this is not true as the greyscale Dycam Model 1 -LRB- also marketed as the - Logitech FotoMan -RRB- was the first consumer digital camera to be sold in - the US in November 1990 .. digital camera. digital camera. greyscale. - greyscale. At least one other camera , the Fuji DS-X , was sold in Japan - even earlier , in late 1989 . - - >- - The ganglion cell layer -LRB- ganglionic layer -RRB- is a layer of the - retina that consists of retinal ganglion cells and displaced amacrine cells - .. retina. retina. In the macula lutea , the layer forms several strata .. - macula lutea. macula lutea. The cells are somewhat flask-shaped ; the - rounded internal surface of each resting on the stratum opticum , and - sending off an axon which is prolonged into it .. flask. Laboratory flask. - stratum opticum. stratum opticum. axon. axon. From the opposite end numerous - dendrites extend into the inner plexiform layer , where they branch and form - flattened arborizations at different levels .. inner plexiform layer. inner - plexiform layer. arborizations. arborizations. dendrites. dendrites. The - ganglion cells vary much in size , and the dendrites of the smaller ones as - a rule arborize in the inner plexiform layer as soon as they enter it ; - while those of the larger cells ramify close to the inner nuclear layer .. - inner plexiform layer. inner plexiform layer. dendrites. dendrites. inner - nuclear layer. inner nuclear layer - - >- - Coyote was a brand of racing chassis designed and built for the use of A. J. - Foyt 's race team in USAC Championship car racing including the Indianapolis - 500 .. A. J. Foyt. A. J. Foyt. USAC. United States Auto Club. Championship - car. American Championship car racing. Indianapolis 500. Indianapolis 500. - It was used from 1966 to 1983 with Foyt himself making 141 starts in the car - , winning 25 times .. George Snider had the second most starts with 24 .. - George Snider. George Snider. Jim McElreath has the only other win with a - Coyote chassis .. Jim McElreath. Jim McElreath. Foyt drove a Coyote to - victory in the Indy 500 in 1967 and 1977 .. With Foyt 's permission , fellow - Indy 500 champion Eddie Cheever 's Cheever Racing began using the Coyote - name for his new Daytona Prototype chassis , derived from the Fabcar chassis - design that he had purchased the rights to in 2007 .. Eddie Cheever. Eddie - Cheever. Cheever Racing. Cheever Racing. Daytona Prototype. 
Daytona - Prototype + - The man is lifting weights + - A chef is preparing a meal + - A dog is in a sandy area with the sand that is being stirred up into the air and + several plants are in the background datasets: - tomaarsen/natural-questions-hard-negatives - tomaarsen/gooaq-hard-negatives - bclavie/msmarco-500k-triplets +- sentence-transformers/all-nli - sentence-transformers/msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1 - sentence-transformers/gooaq - sentence-transformers/natural-questions @@ -147,22 +85,23 @@ pipeline_tag: sentence-similarity library_name: sentence-transformers --- -# SentenceTransformer based on tasksource/ModernBERT-base-nli +# SentenceTransformer based on answerdotai/ModernBERT-base -This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [tasksource/ModernBERT-base-nli](https://huggingface.co/tasksource/ModernBERT-base-nli) on the [tomaarsen/natural-questions-hard-negatives](https://huggingface.co/datasets/tomaarsen/natural-questions-hard-negatives), [tomaarsen/gooaq-hard-negatives](https://huggingface.co/datasets/tomaarsen/gooaq-hard-negatives), [bclavie/msmarco-500k-triplets](https://huggingface.co/datasets/bclavie/msmarco-500k-triplets), [sentence-transformers/msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1](https://huggingface.co/datasets/sentence-transformers/msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1), [sentence-transformers/gooaq](https://huggingface.co/datasets/sentence-transformers/gooaq), [sentence-transformers/natural-questions](https://huggingface.co/datasets/sentence-transformers/natural-questions), [merged-2l-nli](https://huggingface.co/datasets/tasksource/merged-2l-nli), [merged-3l-nli](https://huggingface.co/datasets/tasksource/merged-3l-nli), [zero-shot-label-nli](https://huggingface.co/datasets/tasksource/zero-shot-label-nli), [dataset_train_nli](https://huggingface.co/datasets/MoritzLaurer/dataset_train_nli), [paws/labeled_final](https://huggingface.co/datasets/paws), [glue/mrpc](https://huggingface.co/datasets/glue), [glue/qqp](https://huggingface.co/datasets/glue), [fever-evidence-related](https://huggingface.co/datasets/mwong/fever-evidence-related), [glue/stsb](https://huggingface.co/datasets/glue), sick/relatedness and [sts-companion](https://huggingface.co/datasets/tasksource/sts-companion) datasets. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. 
+This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [answerdotai/ModernBERT-base](https://huggingface.co/answerdotai/ModernBERT-base) on the [tomaarsen/natural-questions-hard-negatives](https://huggingface.co/datasets/tomaarsen/natural-questions-hard-negatives), [tomaarsen/gooaq-hard-negatives](https://huggingface.co/datasets/tomaarsen/gooaq-hard-negatives), [bclavie/msmarco-500k-triplets](https://huggingface.co/datasets/bclavie/msmarco-500k-triplets), [sentence-transformers/all-nli](https://huggingface.co/datasets/sentence-transformers/all-nli), [sentence-transformers/msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1](https://huggingface.co/datasets/sentence-transformers/msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1), [sentence-transformers/gooaq](https://huggingface.co/datasets/sentence-transformers/gooaq), [sentence-transformers/natural-questions](https://huggingface.co/datasets/sentence-transformers/natural-questions), [merged-2l-nli](https://huggingface.co/datasets/tasksource/merged-2l-nli), [merged-3l-nli](https://huggingface.co/datasets/tasksource/merged-3l-nli), [zero-shot-label-nli](https://huggingface.co/datasets/tasksource/zero-shot-label-nli), [dataset_train_nli](https://huggingface.co/datasets/MoritzLaurer/dataset_train_nli), [paws/labeled_final](https://huggingface.co/datasets/paws), [glue/mrpc](https://huggingface.co/datasets/glue), [glue/qqp](https://huggingface.co/datasets/glue), [fever-evidence-related](https://huggingface.co/datasets/mwong/fever-evidence-related), [glue/stsb_0](https://huggingface.co/datasets/glue), [glue/stsb_1](https://huggingface.co/datasets/glue), [glue/stsb_2](https://huggingface.co/datasets/glue), sick/relatedness_0, sick/relatedness_1, sick/relatedness_2, [sts-companion_0](https://huggingface.co/datasets/tasksource/sts-companion), [sts-companion_1](https://huggingface.co/datasets/tasksource/sts-companion) and [sts-companion_2](https://huggingface.co/datasets/tasksource/sts-companion) datasets. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. 
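For orientation, the sketch below shows how a multi-dataset, multi-loss fine-tuning run of this kind is typically wired together with the Sentence Transformers `SentenceTransformerTrainer` (v3+). It is a minimal illustration, not the exact training script: the two datasets, their slices, and the loss pairing are stand-in assumptions, while the hyperparameters mirror the values documented in the training sections below.

```python
from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import CoSENTLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("answerdotai/ModernBERT-base")

# Illustrative subset of the training mix; the full run uses many more datasets.
train_datasets = {
    "gooaq": load_dataset("sentence-transformers/gooaq", split="train[:10000]"),  # (question, answer) pairs
    "stsb": load_dataset("sentence-transformers/stsb", split="train"),            # (sentence1, sentence2, score)
}

# Keys must match `train_datasets`; each dataset is routed to its own loss.
losses = {
    "gooaq": MultipleNegativesRankingLoss(model, scale=20.0),  # in-batch negatives for the pair data
    "stsb": CoSENTLoss(model, scale=20.0),                     # regression-style loss for scored pairs
}

args = SentenceTransformerTrainingArguments(
    output_dir="modernbert-base-embed",  # hypothetical output directory
    num_train_epochs=2,
    per_device_train_batch_size=24,
    learning_rate=3.5e-5,
    weight_decay=1e-6,
    warmup_ratio=0.1,
    fp16=True,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_datasets,
    loss=losses,
)
trainer.train()
```

Other losses used for this model (for example `SoftmaxLoss` for the NLI-style datasets and `AnglELoss` for additional STS variants) are attached to the same dictionaries in the same way.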
## Model Details ### Model Description - **Model Type:** Sentence Transformer -- **Base model:** [tasksource/ModernBERT-base-nli](https://huggingface.co/tasksource/ModernBERT-base-nli) -- **Maximum Sequence Length:** 2048 tokens +- **Base model:** [answerdotai/ModernBERT-base](https://huggingface.co/answerdotai/ModernBERT-base) +- **Maximum Sequence Length:** 8192 tokens - **Output Dimensionality:** 768 dimensions - **Similarity Function:** Cosine Similarity - **Training Datasets:** - [tomaarsen/natural-questions-hard-negatives](https://huggingface.co/datasets/tomaarsen/natural-questions-hard-negatives) - [tomaarsen/gooaq-hard-negatives](https://huggingface.co/datasets/tomaarsen/gooaq-hard-negatives) - [bclavie/msmarco-500k-triplets](https://huggingface.co/datasets/bclavie/msmarco-500k-triplets) + - [sentence-transformers/all-nli](https://huggingface.co/datasets/sentence-transformers/all-nli) - [sentence-transformers/msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1](https://huggingface.co/datasets/sentence-transformers/msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1) - [sentence-transformers/gooaq](https://huggingface.co/datasets/sentence-transformers/gooaq) - [sentence-transformers/natural-questions](https://huggingface.co/datasets/sentence-transformers/natural-questions) @@ -174,9 +113,15 @@ This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [t - [glue/mrpc](https://huggingface.co/datasets/glue) - [glue/qqp](https://huggingface.co/datasets/glue) - [fever-evidence-related](https://huggingface.co/datasets/mwong/fever-evidence-related) - - [glue/stsb](https://huggingface.co/datasets/glue) - - sick/relatedness - - [sts-companion](https://huggingface.co/datasets/tasksource/sts-companion) + - [glue/stsb_0](https://huggingface.co/datasets/glue) + - [glue/stsb_1](https://huggingface.co/datasets/glue) + - [glue/stsb_2](https://huggingface.co/datasets/glue) + - sick/relatedness_0 + - sick/relatedness_1 + - sick/relatedness_2 + - [sts-companion_0](https://huggingface.co/datasets/tasksource/sts-companion) + - [sts-companion_1](https://huggingface.co/datasets/tasksource/sts-companion) + - [sts-companion_2](https://huggingface.co/datasets/tasksource/sts-companion) - **Language:** en @@ -190,7 +135,7 @@ This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [t ``` SentenceTransformer( - (0): Transformer({'max_seq_length': 2048, 'do_lower_case': False}) with Transformer model: ModernBertModel + (0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: ModernBertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) ) ``` @@ -213,9 +158,9 @@ from sentence_transformers import SentenceTransformer model = SentenceTransformer("tasksource/ModernBERT-base-embed") # Run inference sentences = [ - 'Francis I of France was a king.', - "Coyote was a brand of racing chassis designed and built for the use of A. J. Foyt 's race team in USAC Championship car racing including the Indianapolis 500 .. A. J. Foyt. A. J. Foyt. USAC. United States Auto Club. Championship car. American Championship car racing. Indianapolis 500. Indianapolis 500. It was used from 1966 to 1983 with Foyt himself making 141 starts in the car , winning 25 times .. George Snider had the second most starts with 24 .. 
George Snider. George Snider. Jim McElreath has the only other win with a Coyote chassis .. Jim McElreath. Jim McElreath. Foyt drove a Coyote to victory in the Indy 500 in 1967 and 1977 .. With Foyt 's permission , fellow Indy 500 champion Eddie Cheever 's Cheever Racing began using the Coyote name for his new Daytona Prototype chassis , derived from the Fabcar chassis design that he had purchased the rights to in 2007 .. Eddie Cheever. Eddie Cheever. Cheever Racing. Cheever Racing. Daytona Prototype. Daytona Prototype", - "The Apple QuickTake -LRB- codenamed Venus , Mars , Neptune -RRB- is one of the first consumer digital camera lines .. digital camera. digital camera. It was launched in 1994 by Apple Computer and was marketed for three years before being discontinued in 1997 .. Apple Computer. Apple Computer. Three models of the product were built including the 100 and 150 , both built by Kodak ; and the 200 , built by Fujifilm .. Kodak. Kodak. Fujifilm. Fujifilm. The QuickTake cameras had a resolution of 640 x 480 pixels maximum -LRB- 0.3 Mpx -RRB- .. resolution. Display resolution. The 200 model is only officially compatible with the Apple Macintosh for direct connections , while the 100 and 150 model are compatible with both the Apple Macintosh and Microsoft Windows .. Apple Macintosh. Apple Macintosh. Microsoft Windows. Microsoft Windows. Because the QuickTake 200 is almost identical to the Fuji DS-7 or to Samsung 's Kenox SSC-350N , Fuji 's software for that camera can be used to gain Windows compatibility for the QuickTake 200 .. Some other software replacements also exist as well as using an external reader for the removable media of the QuickTake 200 .. Time Magazine profiled QuickTake as `` the first consumer digital camera '' and ranked it among its `` 100 greatest and most influential gadgets from 1923 to the present '' list .. digital camera. digital camera. Time Magazine. Time Magazine. While the QuickTake was probably the first digicam to have wide success , technically this is not true as the greyscale Dycam Model 1 -LRB- also marketed as the Logitech FotoMan -RRB- was the first consumer digital camera to be sold in the US in November 1990 .. digital camera. digital camera. greyscale. greyscale. At least one other camera , the Fuji DS-X , was sold in Japan even earlier , in late 1989 .", + 'A chef is preparing some food', + 'A chef is preparing a meal', + 'A dog is in a sandy area with the sand that is being stirred up into the air and several plants are in the background', ] embeddings = model.encode(sentences) print(embeddings.shape) @@ -339,6 +284,30 @@ You can finetune this model on your own dataset. 
} ``` +#### sentence-transformers/all-nli + +* Dataset: [sentence-transformers/all-nli](https://huggingface.co/datasets/sentence-transformers/all-nli) at [d482672](https://huggingface.co/datasets/sentence-transformers/all-nli/tree/d482672c8e74ce18da116f430137434ba2e52fab) +* Size: 500,000 training samples +* Columns: anchor, positive, and negative +* Approximate statistics based on the first 1000 samples: + | | anchor | positive | negative | + |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| + | type | string | string | string | + | details | | | | +* Samples: + | anchor | positive | negative | + |:---------------------------------------------------------------------------|:-------------------------------------------------|:-----------------------------------------------------------| + | A person on a horse jumps over a broken down airplane. | A person is outdoors, on a horse. | A person is at a diner, ordering an omelette. | + | Children smiling and waving at camera | There are children present | The kids are frowning | + | A boy is jumping on skateboard in the middle of a red bridge. | The boy does a skateboarding trick. | The boy skates down the sidewalk. | +* Loss: [MultipleNegativesRankingLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: + ```json + { + "scale": 20.0, + "similarity_fct": "cos_sim" + } + ``` + #### sentence-transformers/msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1 * Dataset: [sentence-transformers/msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1](https://huggingface.co/datasets/sentence-transformers/msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1) at [84ed2d3](https://huggingface.co/datasets/sentence-transformers/msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1/tree/84ed2d35626f617d890bd493b4d6db69a741e0e2) @@ -438,7 +407,7 @@ You can finetune this model on your own dataset. | | sentence1 | sentence2 | label | |:--------|:-------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-------------------------------------------------------------------| | type | string | string | int | - | details | | | | + | details | | | | * Samples: | sentence1 | sentence2 | label | |:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------|:---------------| @@ -456,7 +425,7 @@ You can finetune this model on your own dataset. 
| | label | sentence1 | sentence2 | |:--------|:------------------------------------------------|:------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------| | type | int | string | string | - | details | | | | + | details | | | | * Samples: | label | sentence1 | sentence2 | |:---------------|:-----------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------| @@ -543,10 +512,10 @@ You can finetune this model on your own dataset. * Size: 403,218 training samples * Columns: sentence1, sentence2, and label * Approximate statistics based on the first 1000 samples: - | | sentence1 | sentence2 | label | - |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:------------------------------------------------| - | type | string | string | int | - | details | | | | + | | sentence1 | sentence2 | label | + |:--------|:----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------| + | type | string | string | int | + | details | | | | * Samples: | sentence1 | sentence2 | label | |:--------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------| @@ -555,9 +524,57 @@ You can finetune this model on your own dataset. | Brad Pitt directed 12 Years a Slave. | The Bronze Bauhinia Star -LRB- , BBS -RRB- is the lowest rank in Order of the Bauhinia Star in Hong Kong , created in 1997 to replace the British honours system of the Order of the British Empire after the transfer of sovereignty to People 's Republic of China and the establishment of the Hong Kong Special Administrative Region -LRB- HKSAR -RRB- .. Order of the Bauhinia Star. Order of the Bauhinia Star. British honours system. British honours system. Order of the British Empire. Order of the British Empire. Special Administrative Region. Special Administrative Region of the People's Republic of China. It is awarded to persons who have given outstanding service over a long period of time , but in a more limited field or way than that required for the Silver Bauhinia Star .. Silver Bauhinia Star. 
Silver Bauhinia Star | 1 | * Loss: [SoftmaxLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#softmaxloss) -#### glue/stsb +#### glue/stsb_0 + +* Dataset: [glue/stsb_0](https://huggingface.co/datasets/glue) at [bcdcba7](https://huggingface.co/datasets/glue/tree/bcdcba79d07bc864c1c254ccfcedcce55bcc9a8c) +* Size: 5,749 training samples +* Columns: sentence1, sentence2, and label +* Approximate statistics based on the first 1000 samples: + | | sentence1 | sentence2 | label | + |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------| + | type | string | string | float | + | details | | | | +* Samples: + | sentence1 | sentence2 | label | + |:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------| + | Snowden Hits Hurdles in Search for Asylum | Snowden's hits hurdles in search for asylum | 5.0 | + | Ukrainian protesters back in streets for anti-government rally | Ukraine protesters topple Lenin statue in Kiev | 2.5999999046325684 | + | "Biotech products, if anything, may be safer than conventional products because of all the testing," Fraley said, adding that 18 countries have adopted biotechnology. | "Biotech products, if anything, may be safer than conventional products because of all the testing," said Robert Fraley, Monsanto's executive vice president. 
| 3.200000047683716 | +* Loss: [AnglELoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#angleloss) with these parameters: + ```json + { + "scale": 20.0, + "similarity_fct": "pairwise_angle_sim" + } + ``` + +#### glue/stsb_1 + +* Dataset: [glue/stsb_1](https://huggingface.co/datasets/glue) at [bcdcba7](https://huggingface.co/datasets/glue/tree/bcdcba79d07bc864c1c254ccfcedcce55bcc9a8c) +* Size: 5,749 training samples +* Columns: sentence1, sentence2, and label +* Approximate statistics based on the first 1000 samples: + | | sentence1 | sentence2 | label | + |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------| + | type | string | string | float | + | details | | | | +* Samples: + | sentence1 | sentence2 | label | + |:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------| + | Snowden Hits Hurdles in Search for Asylum | Snowden's hits hurdles in search for asylum | 5.0 | + | Ukrainian protesters back in streets for anti-government rally | Ukraine protesters topple Lenin statue in Kiev | 2.5999999046325684 | + | "Biotech products, if anything, may be safer than conventional products because of all the testing," Fraley said, adding that 18 countries have adopted biotechnology. | "Biotech products, if anything, may be safer than conventional products because of all the testing," said Robert Fraley, Monsanto's executive vice president. | 3.200000047683716 | +* Loss: [CoSENTLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters: + ```json + { + "scale": 20.0, + "similarity_fct": "pairwise_cos_sim" + } + ``` + +#### glue/stsb_2 -* Dataset: [glue/stsb](https://huggingface.co/datasets/glue) at [bcdcba7](https://huggingface.co/datasets/glue/tree/bcdcba79d07bc864c1c254ccfcedcce55bcc9a8c) +* Dataset: [glue/stsb_2](https://huggingface.co/datasets/glue) at [bcdcba7](https://huggingface.co/datasets/glue/tree/bcdcba79d07bc864c1c254ccfcedcce55bcc9a8c) * Size: 5,749 training samples * Columns: sentence1, sentence2, and label * Approximate statistics based on the first 1000 samples: @@ -578,9 +595,57 @@ You can finetune this model on your own dataset. 
} ``` -#### sick/relatedness +#### sick/relatedness_0 -* Dataset: sick/relatedness +* Dataset: sick/relatedness_0 +* Size: 4,439 training samples +* Columns: sentence1, sentence2, and label +* Approximate statistics based on the first 1000 samples: + | | sentence1 | sentence2 | label | + |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------| + | type | string | string | float | + | details | | | | +* Samples: + | sentence1 | sentence2 | label | + |:-----------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------|:-------------------------------| + | The dark skinned male is standing on one hand in front of a yellow building | The dark skinned male is not standing on one hand in front of a yellow building | 4.0 | + | A man is singing and playing a guitar | A boy is skillfully playing a piano | 2.299999952316284 | + | A picture is being drawn by a man | The person is drawing | 4.099999904632568 | +* Loss: [AnglELoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#angleloss) with these parameters: + ```json + { + "scale": 20.0, + "similarity_fct": "pairwise_angle_sim" + } + ``` + +#### sick/relatedness_1 + +* Dataset: sick/relatedness_1 +* Size: 4,439 training samples +* Columns: sentence1, sentence2, and label +* Approximate statistics based on the first 1000 samples: + | | sentence1 | sentence2 | label | + |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------| + | type | string | string | float | + | details | | | | +* Samples: + | sentence1 | sentence2 | label | + |:-----------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------|:-------------------------------| + | The dark skinned male is standing on one hand in front of a yellow building | The dark skinned male is not standing on one hand in front of a yellow building | 4.0 | + | A man is singing and playing a guitar | A boy is skillfully playing a piano | 2.299999952316284 | + | A picture is being drawn by a man | The person is drawing | 4.099999904632568 | +* Loss: [CoSENTLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters: + ```json + { + "scale": 20.0, + "similarity_fct": "pairwise_cos_sim" + } + ``` + +#### sick/relatedness_2 + +* Dataset: sick/relatedness_2 * Size: 4,439 training samples * Columns: sentence1, sentence2, and label * Approximate statistics based on the first 1000 samples: @@ -601,9 +666,57 @@ You can finetune this model on your own dataset. 
} ``` -#### sts-companion +#### sts-companion_0 -* Dataset: [sts-companion](https://huggingface.co/datasets/tasksource/sts-companion) at [fd8beff](https://huggingface.co/datasets/tasksource/sts-companion/tree/fd8beffb788df5f6673bc688e6dcbe3690a3acc6) +* Dataset: [sts-companion_0](https://huggingface.co/datasets/tasksource/sts-companion) at [fd8beff](https://huggingface.co/datasets/tasksource/sts-companion/tree/fd8beffb788df5f6673bc688e6dcbe3690a3acc6) +* Size: 5,289 training samples +* Columns: label, sentence1, and sentence2 +* Approximate statistics based on the first 1000 samples: + | | label | sentence1 | sentence2 | + |:--------|:---------------------------------------------------------------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| + | type | float | string | string | + | details | | | | +* Samples: + | label | sentence1 | sentence2 | + |:-----------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| + | 4.6 | As a matter of urgency, therefore, the staff complement of the Interdepartmental Group attached to the Commission Secretariat should be strengthened at the earliest possible opportunity in order to ensure that all proposals for acts which are general in scope are accompanied, when considered by the College of Commissioners and on the basis of Article 299(2), by a simplified sheet outlining their potential impact. | Thus, it is urgent that the inter-service group staff should be strengthened very quickly at the heart of the General Secretariat of the Commission, so that all proposals to act of general scope can be accompanied, during their examination by the college on the basis of Article 299(2), a detailed impact statement. | + | 4.0 | Reiterating the calls made by the European Parliament in its resolution of 16 March 2000, what initiatives does the Presidency of the European Council propose to take with a view to playing a more active role so as to guarantee the full and complete application of the UN peace plan? | As requested by the European Parliament in its resolution of 16 March 2000, that these initiatives the presidency of the European Council is going to take to play a more active role in order to ensure the full implementation of the UN peace plan? | + | 3.2 | Let us, as a Europe of 15 Member States, organise ourselves in order to be able to welcome those countries who are knocking at the door into the fold under respectable conditions. | Let us organise itself to 15 in order to be able to welcome the right conditions for countries which are knocking on our door. 
| +* Loss: [AnglELoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#angleloss) with these parameters: + ```json + { + "scale": 20.0, + "similarity_fct": "pairwise_angle_sim" + } + ``` + +#### sts-companion_1 + +* Dataset: [sts-companion_1](https://huggingface.co/datasets/tasksource/sts-companion) at [fd8beff](https://huggingface.co/datasets/tasksource/sts-companion/tree/fd8beffb788df5f6673bc688e6dcbe3690a3acc6) +* Size: 5,289 training samples +* Columns: label, sentence1, and sentence2 +* Approximate statistics based on the first 1000 samples: + | | label | sentence1 | sentence2 | + |:--------|:---------------------------------------------------------------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| + | type | float | string | string | + | details | | | | +* Samples: + | label | sentence1 | sentence2 | + |:-----------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| + | 4.6 | As a matter of urgency, therefore, the staff complement of the Interdepartmental Group attached to the Commission Secretariat should be strengthened at the earliest possible opportunity in order to ensure that all proposals for acts which are general in scope are accompanied, when considered by the College of Commissioners and on the basis of Article 299(2), by a simplified sheet outlining their potential impact. | Thus, it is urgent that the inter-service group staff should be strengthened very quickly at the heart of the General Secretariat of the Commission, so that all proposals to act of general scope can be accompanied, during their examination by the college on the basis of Article 299(2), a detailed impact statement. | + | 4.0 | Reiterating the calls made by the European Parliament in its resolution of 16 March 2000, what initiatives does the Presidency of the European Council propose to take with a view to playing a more active role so as to guarantee the full and complete application of the UN peace plan? | As requested by the European Parliament in its resolution of 16 March 2000, that these initiatives the presidency of the European Council is going to take to play a more active role in order to ensure the full implementation of the UN peace plan? | + | 3.2 | Let us, as a Europe of 15 Member States, organise ourselves in order to be able to welcome those countries who are knocking at the door into the fold under respectable conditions. | Let us organise itself to 15 in order to be able to welcome the right conditions for countries which are knocking on our door. 
| +* Loss: [CoSENTLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters: + ```json + { + "scale": 20.0, + "similarity_fct": "pairwise_cos_sim" + } + ``` + +#### sts-companion_2 + +* Dataset: [sts-companion_2](https://huggingface.co/datasets/tasksource/sts-companion) at [fd8beff](https://huggingface.co/datasets/tasksource/sts-companion/tree/fd8beffb788df5f6673bc688e6dcbe3690a3acc6) * Size: 5,289 training samples * Columns: label, sentence1, and sentence2 * Approximate statistics based on the first 1000 samples: @@ -653,7 +766,7 @@ You can finetune this model on your own dataset. | | sentence1 | sentence2 | label | |:--------|:-------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------| | type | string | string | int | - | details | | | | + | details | | | | * Samples: | sentence1 | sentence2 | label | |:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------|:---------------| @@ -668,10 +781,10 @@ You can finetune this model on your own dataset. * Size: 14,419 evaluation samples * Columns: label, sentence1, and sentence2 * Approximate statistics based on the first 1000 samples: - | | label | sentence1 | sentence2 | - |:--------|:------------------------------------------------|:------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------| - | type | int | string | string | - | details | | | | + | | label | sentence1 | sentence2 | + |:--------|:------------------------------------------------|:-------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------| + | type | int | string | string | + | details | | | | * Samples: | label | sentence1 | sentence2 | |:---------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------| @@ -770,9 +883,57 @@ You can finetune this model on your own dataset. | Tilda Swinton is a vegan. | Michael Ronald Taylor -LRB- 1 June 1938 , Ealing , West London - 19 January 1969 -RRB- was a British jazz composer , pianist and co-songwriter for the band Cream .. Ealing. Ealing. London. London. British. United Kingdom. Cream. Cream ( band ). Mike Taylor was brought up by his grandparents in London and Kent , and joined the RAF for his national service .. London. London. 
Having rehearsed and written extensively throughout the early 1960s , he recorded two albums for the Lansdowne series produced by Denis Preston : Pendulum -LRB- 1966 -RRB- with drummer Jon Hiseman , bassist Tony Reeves and saxophonist Dave Tomlin -RRB- and Trio -LRB- 1967 -RRB- with Hiseman and bassists Jack Bruce and Ron Rubin .. Denis Preston. Denis Preston. Jon Hiseman. Jon Hiseman. Dave Tomlin. Dave Tomlin ( musician ). Jack Bruce. Jack Bruce. They were issued on UK Columbia .. Columbia. Columbia Graphophone Company. During his brief recording career , several of Taylor 's pieces were played and recorded by his ... | 1 | * Loss: [SoftmaxLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#softmaxloss) -#### glue/stsb +#### glue/stsb_0 + +* Dataset: [glue/stsb_0](https://huggingface.co/datasets/glue) at [bcdcba7](https://huggingface.co/datasets/glue/tree/bcdcba79d07bc864c1c254ccfcedcce55bcc9a8c) +* Size: 1,500 evaluation samples +* Columns: sentence1, sentence2, and label +* Approximate statistics based on the first 1000 samples: + | | sentence1 | sentence2 | label | + |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------| + | type | string | string | float | + | details | | | | +* Samples: + | sentence1 | sentence2 | label | + |:-------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------| + | The room used for defecation is almost always referred to by euphemism. | I'm English, and would probably use 'toilet' most of the time, and always in the context of a private home. | 1.600000023841858 | + | The two-year note US2YT=RR fell 5/32 in price, taking its yield to 1.23 percent from 1.16 percent late on Monday. | The benchmark 10-year note US10YT=RR lost 11/32 in price, taking its yield to 3.21 percent from 3.17 percent late on Monday. | 2.0 | + | I use Elinchrom Skyports, but if money is not an issue then go for PocketWizards. | Or just go with the ultra-cheap YongNuo RF-602, which give you a lot of bang for the buck. 
| 1.2000000476837158 | +* Loss: [AnglELoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#angleloss) with these parameters: + ```json + { + "scale": 20.0, + "similarity_fct": "pairwise_angle_sim" + } + ``` + +#### glue/stsb_1 -* Dataset: [glue/stsb](https://huggingface.co/datasets/glue) at [bcdcba7](https://huggingface.co/datasets/glue/tree/bcdcba79d07bc864c1c254ccfcedcce55bcc9a8c) +* Dataset: [glue/stsb_1](https://huggingface.co/datasets/glue) at [bcdcba7](https://huggingface.co/datasets/glue/tree/bcdcba79d07bc864c1c254ccfcedcce55bcc9a8c) +* Size: 1,500 evaluation samples +* Columns: sentence1, sentence2, and label +* Approximate statistics based on the first 1000 samples: + | | sentence1 | sentence2 | label | + |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------| + | type | string | string | float | + | details | | | | +* Samples: + | sentence1 | sentence2 | label | + |:-------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------| + | The room used for defecation is almost always referred to by euphemism. | I'm English, and would probably use 'toilet' most of the time, and always in the context of a private home. | 1.600000023841858 | + | The two-year note US2YT=RR fell 5/32 in price, taking its yield to 1.23 percent from 1.16 percent late on Monday. | The benchmark 10-year note US10YT=RR lost 11/32 in price, taking its yield to 3.21 percent from 3.17 percent late on Monday. | 2.0 | + | I use Elinchrom Skyports, but if money is not an issue then go for PocketWizards. | Or just go with the ultra-cheap YongNuo RF-602, which give you a lot of bang for the buck. | 1.2000000476837158 | +* Loss: [CoSENTLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters: + ```json + { + "scale": 20.0, + "similarity_fct": "pairwise_cos_sim" + } + ``` + +#### glue/stsb_2 + +* Dataset: [glue/stsb_2](https://huggingface.co/datasets/glue) at [bcdcba7](https://huggingface.co/datasets/glue/tree/bcdcba79d07bc864c1c254ccfcedcce55bcc9a8c) * Size: 1,500 evaluation samples * Columns: sentence1, sentence2, and label * Approximate statistics based on the first 1000 samples: @@ -793,9 +954,57 @@ You can finetune this model on your own dataset. 
} ``` -#### sick/relatedness +#### sick/relatedness_0 + +* Dataset: sick/relatedness_0 +* Size: 495 evaluation samples +* Columns: sentence1, sentence2, and label +* Approximate statistics based on the first 495 samples: + | | sentence1 | sentence2 | label | + |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------| + | type | string | string | float | + | details | | | | +* Samples: + | sentence1 | sentence2 | label | + |:-------------------------------------------------------------------------------|:--------------------------------------------------------------------------|:--------------------------------| + | The young boys are playing outdoors and the man is smiling nearby | There is no boy playing outdoors and there is no man smiling | 3.5999999046325684 | + | A person in a black jacket is doing tricks on a motorbike | A skilled person is riding a bicycle on one wheel | 3.4000000953674316 | + | Four children are doing backbends in the gym | Four girls are doing backbends and playing outdoors | 3.799999952316284 | +* Loss: [AnglELoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#angleloss) with these parameters: + ```json + { + "scale": 20.0, + "similarity_fct": "pairwise_angle_sim" + } + ``` + +#### sick/relatedness_1 + +* Dataset: sick/relatedness_1 +* Size: 495 evaluation samples +* Columns: sentence1, sentence2, and label +* Approximate statistics based on the first 495 samples: + | | sentence1 | sentence2 | label | + |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------| + | type | string | string | float | + | details | | | | +* Samples: + | sentence1 | sentence2 | label | + |:-------------------------------------------------------------------------------|:--------------------------------------------------------------------------|:--------------------------------| + | The young boys are playing outdoors and the man is smiling nearby | There is no boy playing outdoors and there is no man smiling | 3.5999999046325684 | + | A person in a black jacket is doing tricks on a motorbike | A skilled person is riding a bicycle on one wheel | 3.4000000953674316 | + | Four children are doing backbends in the gym | Four girls are doing backbends and playing outdoors | 3.799999952316284 | +* Loss: [CoSENTLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters: + ```json + { + "scale": 20.0, + "similarity_fct": "pairwise_cos_sim" + } + ``` + +#### sick/relatedness_2 -* Dataset: sick/relatedness +* Dataset: sick/relatedness_2 * Size: 495 evaluation samples * Columns: sentence1, sentence2, and label * Approximate statistics based on the first 495 samples: @@ -816,9 +1025,57 @@ You can finetune this model on your own dataset. 
} ``` -#### sts-companion +#### sts-companion_0 -* Dataset: [sts-companion](https://huggingface.co/datasets/tasksource/sts-companion) at [fd8beff](https://huggingface.co/datasets/tasksource/sts-companion/tree/fd8beffb788df5f6673bc688e6dcbe3690a3acc6) +* Dataset: [sts-companion_0](https://huggingface.co/datasets/tasksource/sts-companion) at [fd8beff](https://huggingface.co/datasets/tasksource/sts-companion/tree/fd8beffb788df5f6673bc688e6dcbe3690a3acc6) +* Size: 5,289 evaluation samples +* Columns: label, sentence1, and sentence2 +* Approximate statistics based on the first 1000 samples: + | | label | sentence1 | sentence2 | + |:--------|:---------------------------------------------------------------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| + | type | float | string | string | + | details | | | | +* Samples: + | label | sentence1 | sentence2 | + |:-----------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| + | 3.8 | After all, it is by no means certain that the proposed definition of equitable price is better than any other, because the various definitions that are currently in use in the Member States are all perfectly satisfactory. | In fact, it is not absolutely certain that the definition of price that is proposed is better than another, because the different currently in the Member States all fully. | + | 2.0 | rslw: no, why would i hate them? | why do you hate america so much? 
| + | 3.0 | Families of #Newtown Victims Look for Answers on #Gun Violence #NRA | Families of Newtown School Massacre Victims Organize Against Gun Violence | +* Loss: [AnglELoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#angleloss) with these parameters: + ```json + { + "scale": 20.0, + "similarity_fct": "pairwise_angle_sim" + } + ``` + +#### sts-companion_1 + +* Dataset: [sts-companion_1](https://huggingface.co/datasets/tasksource/sts-companion) at [fd8beff](https://huggingface.co/datasets/tasksource/sts-companion/tree/fd8beffb788df5f6673bc688e6dcbe3690a3acc6) +* Size: 5,289 evaluation samples +* Columns: label, sentence1, and sentence2 +* Approximate statistics based on the first 1000 samples: + | | label | sentence1 | sentence2 | + |:--------|:---------------------------------------------------------------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| + | type | float | string | string | + | details | | | | +* Samples: + | label | sentence1 | sentence2 | + |:-----------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| + | 3.8 | After all, it is by no means certain that the proposed definition of equitable price is better than any other, because the various definitions that are currently in use in the Member States are all perfectly satisfactory. | In fact, it is not absolutely certain that the definition of price that is proposed is better than another, because the different currently in the Member States all fully. | + | 2.0 | rslw: no, why would i hate them? | why do you hate america so much? | + | 3.0 | Families of #Newtown Victims Look for Answers on #Gun Violence #NRA | Families of Newtown School Massacre Victims Organize Against Gun Violence | +* Loss: [CoSENTLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters: + ```json + { + "scale": 20.0, + "similarity_fct": "pairwise_cos_sim" + } + ``` + +#### sts-companion_2 + +* Dataset: [sts-companion_2](https://huggingface.co/datasets/tasksource/sts-companion) at [fd8beff](https://huggingface.co/datasets/tasksource/sts-companion/tree/fd8beffb788df5f6673bc688e6dcbe3690a3acc6) * Size: 5,289 evaluation samples * Columns: label, sentence1, and sentence2 * Approximate statistics based on the first 1000 samples: @@ -845,7 +1102,7 @@ You can finetune this model on your own dataset. - `per_device_train_batch_size`: 24 - `learning_rate`: 3.5e-05 - `weight_decay`: 1e-06 -- `num_train_epochs`: 1 +- `num_train_epochs`: 2 - `warmup_ratio`: 0.1 - `fp16`: True @@ -869,7 +1126,7 @@ You can finetune this model on your own dataset. - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 -- `num_train_epochs`: 1 +- `num_train_epochs`: 2 - `max_steps`: -1 - `lr_scheduler_type`: linear - `lr_scheduler_kwargs`: {} @@ -975,367 +1232,812 @@ You can finetune this model on your own dataset. 
| Epoch | Step | Training Loss | |:------:|:------:|:-------------:| -| 0.0028 | 500 | 5.7412 | -| 0.0055 | 1000 | 2.2293 | -| 0.0083 | 1500 | 1.1572 | -| 0.0111 | 2000 | 0.9386 | -| 0.0138 | 2500 | 0.8352 | -| 0.0166 | 3000 | 0.7291 | -| 0.0194 | 3500 | 0.6555 | -| 0.0222 | 4000 | 0.6488 | -| 0.0249 | 4500 | 0.6267 | -| 0.0277 | 5000 | 0.5527 | -| 0.0305 | 5500 | 0.5985 | -| 0.0332 | 6000 | 0.5574 | -| 0.0360 | 6500 | 0.5642 | -| 0.0388 | 7000 | 0.5821 | -| 0.0415 | 7500 | 0.5289 | -| 0.0443 | 8000 | 0.5374 | -| 0.0471 | 8500 | 0.5187 | -| 0.0499 | 9000 | 0.5278 | -| 0.0526 | 9500 | 0.4983 | -| 0.0554 | 10000 | 0.4758 | -| 0.0582 | 10500 | 0.4939 | -| 0.0609 | 11000 | 0.4944 | -| 0.0637 | 11500 | 0.4967 | -| 0.0665 | 12000 | 0.4543 | -| 0.0692 | 12500 | 0.4649 | -| 0.0720 | 13000 | 0.4612 | -| 0.0748 | 13500 | 0.4612 | -| 0.0776 | 14000 | 0.4684 | -| 0.0803 | 14500 | 0.4904 | -| 0.0831 | 15000 | 0.4538 | -| 0.0859 | 15500 | 0.4388 | -| 0.0886 | 16000 | 0.4584 | -| 0.0914 | 16500 | 0.4728 | -| 0.0942 | 17000 | 0.4236 | -| 0.0969 | 17500 | 0.4328 | -| 0.0997 | 18000 | 0.4624 | -| 0.1025 | 18500 | 0.4732 | -| 0.1053 | 19000 | 0.4375 | -| 0.1080 | 19500 | 0.4495 | -| 0.1108 | 20000 | 0.4296 | -| 0.1136 | 20500 | 0.4211 | -| 0.1163 | 21000 | 0.4399 | -| 0.1191 | 21500 | 0.4353 | -| 0.1219 | 22000 | 0.4407 | -| 0.1246 | 22500 | 0.3892 | -| 0.1274 | 23000 | 0.4121 | -| 0.1302 | 23500 | 0.4253 | -| 0.1330 | 24000 | 0.4066 | -| 0.1357 | 24500 | 0.4168 | -| 0.1385 | 25000 | 0.3921 | -| 0.1413 | 25500 | 0.4008 | -| 0.1440 | 26000 | 0.4164 | -| 0.1468 | 26500 | 0.4047 | -| 0.1496 | 27000 | 0.4031 | -| 0.1523 | 27500 | 0.3955 | -| 0.1551 | 28000 | 0.3809 | -| 0.1579 | 28500 | 0.3992 | -| 0.1606 | 29000 | 0.3686 | -| 0.1634 | 29500 | 0.3851 | -| 0.1662 | 30000 | 0.3776 | -| 0.1690 | 30500 | 0.3919 | -| 0.1717 | 31000 | 0.4026 | -| 0.1745 | 31500 | 0.38 | -| 0.1773 | 32000 | 0.41 | -| 0.1800 | 32500 | 0.3731 | -| 0.1828 | 33000 | 0.3831 | -| 0.1856 | 33500 | 0.3727 | -| 0.1883 | 34000 | 0.3664 | -| 0.1911 | 34500 | 0.3882 | -| 0.1939 | 35000 | 0.3873 | -| 0.1967 | 35500 | 0.3529 | -| 0.1994 | 36000 | 0.3923 | -| 0.2022 | 36500 | 0.4051 | -| 0.2050 | 37000 | 0.4134 | -| 0.2077 | 37500 | 0.3478 | -| 0.2105 | 38000 | 0.3602 | -| 0.2133 | 38500 | 0.3547 | -| 0.2160 | 39000 | 0.3748 | -| 0.2188 | 39500 | 0.3537 | -| 0.2216 | 40000 | 0.38 | -| 0.2244 | 40500 | 0.3731 | -| 0.2271 | 41000 | 0.3537 | -| 0.2299 | 41500 | 0.3576 | -| 0.2327 | 42000 | 0.3626 | -| 0.2354 | 42500 | 0.3587 | -| 0.2382 | 43000 | 0.3488 | -| 0.2410 | 43500 | 0.3694 | -| 0.2437 | 44000 | 0.3508 | -| 0.2465 | 44500 | 0.3634 | -| 0.2493 | 45000 | 0.3608 | -| 0.2521 | 45500 | 0.4007 | -| 0.2548 | 46000 | 0.3559 | -| 0.2576 | 46500 | 0.3317 | -| 0.2604 | 47000 | 0.3518 | -| 0.2631 | 47500 | 0.3578 | -| 0.2659 | 48000 | 0.3375 | -| 0.2687 | 48500 | 0.3229 | -| 0.2714 | 49000 | 0.3319 | -| 0.2742 | 49500 | 0.3656 | -| 0.2770 | 50000 | 0.3598 | -| 0.2798 | 50500 | 0.3705 | -| 0.2825 | 51000 | 0.3431 | -| 0.2853 | 51500 | 0.3587 | -| 0.2881 | 52000 | 0.3361 | -| 0.2908 | 52500 | 0.3734 | -| 0.2936 | 53000 | 0.3361 | -| 0.2964 | 53500 | 0.3322 | -| 0.2991 | 54000 | 0.347 | -| 0.3019 | 54500 | 0.3617 | -| 0.3047 | 55000 | 0.3318 | -| 0.3074 | 55500 | 0.3401 | -| 0.3102 | 56000 | 0.328 | -| 0.3130 | 56500 | 0.3553 | -| 0.3158 | 57000 | 0.3669 | -| 0.3185 | 57500 | 0.4088 | -| 0.3213 | 58000 | 0.3636 | -| 0.3241 | 58500 | 0.3372 | -| 0.3268 | 59000 | 0.3494 | -| 0.3296 | 59500 | 0.3504 | -| 0.3324 | 60000 | 0.3389 | -| 0.3351 | 60500 | 0.3219 | -| 
0.3379 | 61000 | 0.3283 | -| 0.3407 | 61500 | 0.3202 | -| 0.3435 | 62000 | 0.3185 | -| 0.3462 | 62500 | 0.3449 | -| 0.3490 | 63000 | 0.3527 | -| 0.3518 | 63500 | 0.3349 | -| 0.3545 | 64000 | 0.3225 | -| 0.3573 | 64500 | 0.3269 | -| 0.3601 | 65000 | 0.3074 | -| 0.3628 | 65500 | 0.3513 | -| 0.3656 | 66000 | 0.3166 | -| 0.3684 | 66500 | 0.3472 | -| 0.3712 | 67000 | 0.3395 | -| 0.3739 | 67500 | 0.3437 | -| 0.3767 | 68000 | 0.3491 | -| 0.3795 | 68500 | 0.3181 | -| 0.3822 | 69000 | 0.3324 | -| 0.3850 | 69500 | 0.3335 | -| 0.3878 | 70000 | 0.3401 | -| 0.3905 | 70500 | 0.3433 | -| 0.3933 | 71000 | 0.3229 | -| 0.3961 | 71500 | 0.3264 | -| 0.3989 | 72000 | 0.3123 | -| 0.4016 | 72500 | 0.3207 | -| 0.4044 | 73000 | 0.3008 | -| 0.4072 | 73500 | 0.2998 | -| 0.4099 | 74000 | 0.2992 | -| 0.4127 | 74500 | 0.3134 | -| 0.4155 | 75000 | 0.3262 | -| 0.4182 | 75500 | 0.2988 | -| 0.4210 | 76000 | 0.2936 | -| 0.4238 | 76500 | 0.314 | -| 0.4266 | 77000 | 0.3083 | -| 0.4293 | 77500 | 0.3103 | -| 0.4321 | 78000 | 0.3303 | -| 0.4349 | 78500 | 0.3282 | -| 0.4376 | 79000 | 0.3415 | -| 0.4404 | 79500 | 0.3001 | -| 0.4432 | 80000 | 0.321 | -| 0.4459 | 80500 | 0.3219 | -| 0.4487 | 81000 | 0.3477 | -| 0.4515 | 81500 | 0.2871 | -| 0.4542 | 82000 | 0.2913 | -| 0.4570 | 82500 | 0.3121 | -| 0.4598 | 83000 | 0.3057 | -| 0.4626 | 83500 | 0.32 | -| 0.4653 | 84000 | 0.3086 | -| 0.4681 | 84500 | 0.3091 | -| 0.4709 | 85000 | 0.3243 | -| 0.4736 | 85500 | 0.3104 | -| 0.4764 | 86000 | 0.3124 | -| 0.4792 | 86500 | 0.3134 | -| 0.4819 | 87000 | 0.2967 | -| 0.4847 | 87500 | 0.3036 | -| 0.4875 | 88000 | 0.3079 | -| 0.4903 | 88500 | 0.2959 | -| 0.4930 | 89000 | 0.3332 | -| 0.4958 | 89500 | 0.3151 | -| 0.4986 | 90000 | 0.3233 | -| 0.5013 | 90500 | 0.3083 | -| 0.5041 | 91000 | 0.2913 | -| 0.5069 | 91500 | 0.31 | -| 0.5096 | 92000 | 0.2962 | -| 0.5124 | 92500 | 0.3254 | -| 0.5152 | 93000 | 0.312 | -| 0.5180 | 93500 | 0.3152 | -| 0.5207 | 94000 | 0.3208 | -| 0.5235 | 94500 | 0.3039 | -| 0.5263 | 95000 | 0.3187 | -| 0.5290 | 95500 | 0.3052 | -| 0.5318 | 96000 | 0.3114 | -| 0.5346 | 96500 | 0.315 | -| 0.5373 | 97000 | 0.2862 | -| 0.5401 | 97500 | 0.3104 | -| 0.5429 | 98000 | 0.3 | -| 0.5457 | 98500 | 0.3017 | -| 0.5484 | 99000 | 0.3189 | -| 0.5512 | 99500 | 0.2919 | -| 0.5540 | 100000 | 0.2913 | -| 0.5567 | 100500 | 0.2936 | -| 0.5595 | 101000 | 0.3044 | -| 0.5623 | 101500 | 0.3034 | -| 0.5650 | 102000 | 0.2999 | -| 0.5678 | 102500 | 0.2961 | -| 0.5706 | 103000 | 0.328 | -| 0.5734 | 103500 | 0.3061 | -| 0.5761 | 104000 | 0.295 | -| 0.5789 | 104500 | 0.2997 | -| 0.5817 | 105000 | 0.2981 | -| 0.5844 | 105500 | 0.2966 | -| 0.5872 | 106000 | 0.2798 | -| 0.5900 | 106500 | 0.3001 | -| 0.5927 | 107000 | 0.3018 | -| 0.5955 | 107500 | 0.3076 | -| 0.5983 | 108000 | 0.3093 | -| 0.6010 | 108500 | 0.3096 | -| 0.6038 | 109000 | 0.2914 | -| 0.6066 | 109500 | 0.2874 | -| 0.6094 | 110000 | 0.2777 | -| 0.6121 | 110500 | 0.2854 | -| 0.6149 | 111000 | 0.3279 | -| 0.6177 | 111500 | 0.2843 | -| 0.6204 | 112000 | 0.2956 | -| 0.6232 | 112500 | 0.3076 | -| 0.6260 | 113000 | 0.314 | -| 0.6287 | 113500 | 0.295 | -| 0.6315 | 114000 | 0.2914 | -| 0.6343 | 114500 | 0.3041 | -| 0.6371 | 115000 | 0.2871 | -| 0.6398 | 115500 | 0.3004 | -| 0.6426 | 116000 | 0.2954 | -| 0.6454 | 116500 | 0.2959 | -| 0.6481 | 117000 | 0.3214 | -| 0.6509 | 117500 | 0.2828 | -| 0.6537 | 118000 | 0.3005 | -| 0.6564 | 118500 | 0.2918 | -| 0.6592 | 119000 | 0.2988 | -| 0.6620 | 119500 | 0.2901 | -| 0.6648 | 120000 | 0.2796 | -| 0.6675 | 120500 | 0.2988 | -| 0.6703 | 121000 | 0.2969 | -| 0.6731 | 121500 | 
0.2892 | -| 0.6758 | 122000 | 0.2812 | -| 0.6786 | 122500 | 0.2992 | -| 0.6814 | 123000 | 0.2691 | -| 0.6841 | 123500 | 0.2966 | -| 0.6869 | 124000 | 0.2906 | -| 0.6897 | 124500 | 0.2807 | -| 0.6925 | 125000 | 0.2684 | -| 0.6952 | 125500 | 0.2771 | -| 0.6980 | 126000 | 0.2992 | -| 0.7008 | 126500 | 0.274 | -| 0.7035 | 127000 | 0.2846 | -| 0.7063 | 127500 | 0.2898 | -| 0.7091 | 128000 | 0.2795 | -| 0.7118 | 128500 | 0.2758 | -| 0.7146 | 129000 | 0.2883 | -| 0.7174 | 129500 | 0.2968 | -| 0.7201 | 130000 | 0.2756 | -| 0.7229 | 130500 | 0.3116 | -| 0.7257 | 131000 | 0.2923 | -| 0.7285 | 131500 | 0.2758 | -| 0.7312 | 132000 | 0.262 | -| 0.7340 | 132500 | 0.283 | -| 0.7368 | 133000 | 0.2937 | -| 0.7395 | 133500 | 0.2891 | -| 0.7423 | 134000 | 0.2743 | -| 0.7451 | 134500 | 0.3087 | -| 0.7478 | 135000 | 0.2855 | -| 0.7506 | 135500 | 0.2902 | -| 0.7534 | 136000 | 0.278 | -| 0.7562 | 136500 | 0.2607 | -| 0.7589 | 137000 | 0.2634 | -| 0.7617 | 137500 | 0.2807 | -| 0.7645 | 138000 | 0.294 | -| 0.7672 | 138500 | 0.2837 | -| 0.7700 | 139000 | 0.2521 | -| 0.7728 | 139500 | 0.2751 | -| 0.7755 | 140000 | 0.3012 | -| 0.7783 | 140500 | 0.2816 | -| 0.7811 | 141000 | 0.2756 | -| 0.7839 | 141500 | 0.2661 | -| 0.7866 | 142000 | 0.2585 | -| 0.7894 | 142500 | 0.2718 | -| 0.7922 | 143000 | 0.2724 | -| 0.7949 | 143500 | 0.2804 | -| 0.7977 | 144000 | 0.2582 | -| 0.8005 | 144500 | 0.2636 | -| 0.8032 | 145000 | 0.2536 | -| 0.8060 | 145500 | 0.2862 | -| 0.8088 | 146000 | 0.2842 | -| 0.8116 | 146500 | 0.2702 | -| 0.8143 | 147000 | 0.2727 | -| 0.8171 | 147500 | 0.2591 | -| 0.8199 | 148000 | 0.2709 | -| 0.8226 | 148500 | 0.2879 | -| 0.8254 | 149000 | 0.2669 | -| 0.8282 | 149500 | 0.2748 | -| 0.8309 | 150000 | 0.2689 | -| 0.8337 | 150500 | 0.2414 | -| 0.8365 | 151000 | 0.261 | -| 0.8393 | 151500 | 0.2967 | -| 0.8420 | 152000 | 0.2757 | -| 0.8448 | 152500 | 0.2667 | -| 0.8476 | 153000 | 0.252 | -| 0.8503 | 153500 | 0.2659 | -| 0.8531 | 154000 | 0.2799 | -| 0.8559 | 154500 | 0.2653 | -| 0.8586 | 155000 | 0.275 | -| 0.8614 | 155500 | 0.3067 | -| 0.8642 | 156000 | 0.2742 | -| 0.8669 | 156500 | 0.2616 | -| 0.8697 | 157000 | 0.2793 | -| 0.8725 | 157500 | 0.2721 | -| 0.8753 | 158000 | 0.2623 | -| 0.8780 | 158500 | 0.2801 | -| 0.8808 | 159000 | 0.2499 | -| 0.8836 | 159500 | 0.283 | -| 0.8863 | 160000 | 0.2641 | -| 0.8891 | 160500 | 0.2642 | -| 0.8919 | 161000 | 0.271 | -| 0.8946 | 161500 | 0.2624 | -| 0.8974 | 162000 | 0.2721 | -| 0.9002 | 162500 | 0.2698 | -| 0.9030 | 163000 | 0.2519 | -| 0.9057 | 163500 | 0.2771 | -| 0.9085 | 164000 | 0.2719 | -| 0.9113 | 164500 | 0.2747 | -| 0.9140 | 165000 | 0.28 | -| 0.9168 | 165500 | 0.2618 | -| 0.9196 | 166000 | 0.2755 | -| 0.9223 | 166500 | 0.3104 | -| 0.9251 | 167000 | 0.2671 | -| 0.9279 | 167500 | 0.2491 | -| 0.9307 | 168000 | 0.262 | -| 0.9334 | 168500 | 0.2514 | -| 0.9362 | 169000 | 0.2632 | -| 0.9390 | 169500 | 0.2834 | -| 0.9417 | 170000 | 0.2573 | -| 0.9445 | 170500 | 0.2662 | -| 0.9473 | 171000 | 0.2631 | -| 0.9500 | 171500 | 0.2507 | -| 0.9528 | 172000 | 0.2739 | -| 0.9556 | 172500 | 0.2567 | -| 0.9584 | 173000 | 0.2489 | -| 0.9611 | 173500 | 0.2607 | -| 0.9639 | 174000 | 0.2627 | -| 0.9667 | 174500 | 0.2715 | -| 0.9694 | 175000 | 0.2603 | -| 0.9722 | 175500 | 0.2533 | -| 0.9750 | 176000 | 0.261 | -| 0.9777 | 176500 | 0.2485 | -| 0.9805 | 177000 | 0.2719 | -| 0.9833 | 177500 | 0.2693 | -| 0.9861 | 178000 | 0.2825 | -| 0.9888 | 178500 | 0.2697 | -| 0.9916 | 179000 | 0.2601 | -| 0.9944 | 179500 | 0.2459 | -| 0.9971 | 180000 | 0.2674 | -| 0.9999 | 180500 | 0.2725 | +| 0.0025 | 500 | 
6.0463 | +| 0.0050 | 1000 | 2.5823 | +| 0.0074 | 1500 | 1.1895 | +| 0.0099 | 2000 | 0.9445 | +| 0.0124 | 2500 | 0.8209 | +| 0.0149 | 3000 | 0.7738 | +| 0.0174 | 3500 | 0.7587 | +| 0.0198 | 4000 | 0.7189 | +| 0.0223 | 4500 | 0.7077 | +| 0.0248 | 5000 | 0.6986 | +| 0.0273 | 5500 | 0.6977 | +| 0.0297 | 6000 | 0.6969 | +| 0.0322 | 6500 | 0.6646 | +| 0.0347 | 7000 | 0.6125 | +| 0.0372 | 7500 | 0.6107 | +| 0.0397 | 8000 | 0.6454 | +| 0.0421 | 8500 | 0.6437 | +| 0.0446 | 9000 | 0.6001 | +| 0.0471 | 9500 | 0.613 | +| 0.0496 | 10000 | 0.5964 | +| 0.0521 | 10500 | 0.6019 | +| 0.0545 | 11000 | 0.5807 | +| 0.0570 | 11500 | 0.5661 | +| 0.0595 | 12000 | 0.5615 | +| 0.0620 | 12500 | 0.5679 | +| 0.0645 | 13000 | 0.5783 | +| 0.0669 | 13500 | 0.5627 | +| 0.0694 | 14000 | 0.5501 | +| 0.0719 | 14500 | 0.538 | +| 0.0744 | 15000 | 0.5828 | +| 0.0769 | 15500 | 0.5524 | +| 0.0793 | 16000 | 0.5327 | +| 0.0818 | 16500 | 0.5356 | +| 0.0843 | 17000 | 0.4979 | +| 0.0868 | 17500 | 0.5223 | +| 0.0892 | 18000 | 0.4955 | +| 0.0917 | 18500 | 0.5079 | +| 0.0942 | 19000 | 0.506 | +| 0.0967 | 19500 | 0.4926 | +| 0.0992 | 20000 | 0.4845 | +| 0.1016 | 20500 | 0.5078 | +| 0.1041 | 21000 | 0.4937 | +| 0.1066 | 21500 | 0.4937 | +| 0.1091 | 22000 | 0.4971 | +| 0.1116 | 22500 | 0.4699 | +| 0.1140 | 23000 | 0.5022 | +| 0.1165 | 23500 | 0.5162 | +| 0.1190 | 24000 | 0.5221 | +| 0.1215 | 24500 | 0.5147 | +| 0.1240 | 25000 | 0.4719 | +| 0.1264 | 25500 | 0.489 | +| 0.1289 | 26000 | 0.5117 | +| 0.1314 | 26500 | 0.4643 | +| 0.1339 | 27000 | 0.469 | +| 0.1364 | 27500 | 0.5095 | +| 0.1388 | 28000 | 0.441 | +| 0.1413 | 28500 | 0.4765 | +| 0.1438 | 29000 | 0.4943 | +| 0.1463 | 29500 | 0.4797 | +| 0.1487 | 30000 | 0.4709 | +| 0.1512 | 30500 | 0.4429 | +| 0.1537 | 31000 | 0.429 | +| 0.1562 | 31500 | 0.4445 | +| 0.1587 | 32000 | 0.4982 | +| 0.1611 | 32500 | 0.4501 | +| 0.1636 | 33000 | 0.4812 | +| 0.1661 | 33500 | 0.4483 | +| 0.1686 | 34000 | 0.4613 | +| 0.1711 | 34500 | 0.4646 | +| 0.1735 | 35000 | 0.4737 | +| 0.1760 | 35500 | 0.4648 | +| 0.1785 | 36000 | 0.4004 | +| 0.1810 | 36500 | 0.4346 | +| 0.1835 | 37000 | 0.4536 | +| 0.1859 | 37500 | 0.4469 | +| 0.1884 | 38000 | 0.4381 | +| 0.1909 | 38500 | 0.4451 | +| 0.1934 | 39000 | 0.4202 | +| 0.1958 | 39500 | 0.4437 | +| 0.1983 | 40000 | 0.4188 | +| 0.2008 | 40500 | 0.4016 | +| 0.2033 | 41000 | 0.4258 | +| 0.2058 | 41500 | 0.4072 | +| 0.2082 | 42000 | 0.4248 | +| 0.2107 | 42500 | 0.4414 | +| 0.2132 | 43000 | 0.4317 | +| 0.2157 | 43500 | 0.445 | +| 0.2182 | 44000 | 0.4106 | +| 0.2206 | 44500 | 0.4343 | +| 0.2231 | 45000 | 0.4025 | +| 0.2256 | 45500 | 0.4235 | +| 0.2281 | 46000 | 0.4583 | +| 0.2306 | 46500 | 0.4001 | +| 0.2330 | 47000 | 0.4188 | +| 0.2355 | 47500 | 0.4073 | +| 0.2380 | 48000 | 0.4407 | +| 0.2405 | 48500 | 0.4214 | +| 0.2430 | 49000 | 0.4181 | +| 0.2454 | 49500 | 0.4153 | +| 0.2479 | 50000 | 0.4171 | +| 0.2504 | 50500 | 0.4174 | +| 0.2529 | 51000 | 0.3984 | +| 0.2553 | 51500 | 0.4045 | +| 0.2578 | 52000 | 0.403 | +| 0.2603 | 52500 | 0.4109 | +| 0.2628 | 53000 | 0.4445 | +| 0.2653 | 53500 | 0.4114 | +| 0.2677 | 54000 | 0.3777 | +| 0.2702 | 54500 | 0.3682 | +| 0.2727 | 55000 | 0.3973 | +| 0.2752 | 55500 | 0.3998 | +| 0.2777 | 56000 | 0.3988 | +| 0.2801 | 56500 | 0.3965 | +| 0.2826 | 57000 | 0.434 | +| 0.2851 | 57500 | 0.3958 | +| 0.2876 | 58000 | 0.417 | +| 0.2901 | 58500 | 0.3767 | +| 0.2925 | 59000 | 0.3901 | +| 0.2950 | 59500 | 0.398 | +| 0.2975 | 60000 | 0.3788 | +| 0.3000 | 60500 | 0.4102 | +| 0.3025 | 61000 | 0.3718 | +| 0.3049 | 61500 | 0.394 | +| 0.3074 | 62000 | 0.3836 | +| 0.3099 
| 62500 | 0.4169 | +| 0.3124 | 63000 | 0.4074 | +| 0.3148 | 63500 | 0.4379 | +| 0.3173 | 64000 | 0.3747 | +| 0.3198 | 64500 | 0.4141 | +| 0.3223 | 65000 | 0.3865 | +| 0.3248 | 65500 | 0.395 | +| 0.3272 | 66000 | 0.3571 | +| 0.3297 | 66500 | 0.3847 | +| 0.3322 | 67000 | 0.3778 | +| 0.3347 | 67500 | 0.4095 | +| 0.3372 | 68000 | 0.4036 | +| 0.3396 | 68500 | 0.3824 | +| 0.3421 | 69000 | 0.3811 | +| 0.3446 | 69500 | 0.368 | +| 0.3471 | 70000 | 0.4028 | +| 0.3496 | 70500 | 0.3978 | +| 0.3520 | 71000 | 0.3765 | +| 0.3545 | 71500 | 0.3735 | +| 0.3570 | 72000 | 0.3625 | +| 0.3595 | 72500 | 0.3696 | +| 0.3619 | 73000 | 0.3999 | +| 0.3644 | 73500 | 0.353 | +| 0.3669 | 74000 | 0.3902 | +| 0.3694 | 74500 | 0.3925 | +| 0.3719 | 75000 | 0.3382 | +| 0.3743 | 75500 | 0.3531 | +| 0.3768 | 76000 | 0.3618 | +| 0.3793 | 76500 | 0.3372 | +| 0.3818 | 77000 | 0.382 | +| 0.3843 | 77500 | 0.3866 | +| 0.3867 | 78000 | 0.3513 | +| 0.3892 | 78500 | 0.3727 | +| 0.3917 | 79000 | 0.3603 | +| 0.3942 | 79500 | 0.397 | +| 0.3967 | 80000 | 0.351 | +| 0.3991 | 80500 | 0.3675 | +| 0.4016 | 81000 | 0.3861 | +| 0.4041 | 81500 | 0.3423 | +| 0.4066 | 82000 | 0.3618 | +| 0.4091 | 82500 | 0.3784 | +| 0.4115 | 83000 | 0.3688 | +| 0.4140 | 83500 | 0.3343 | +| 0.4165 | 84000 | 0.3831 | +| 0.4190 | 84500 | 0.4134 | +| 0.4214 | 85000 | 0.3548 | +| 0.4239 | 85500 | 0.3422 | +| 0.4264 | 86000 | 0.3471 | +| 0.4289 | 86500 | 0.3506 | +| 0.4314 | 87000 | 0.3338 | +| 0.4338 | 87500 | 0.3283 | +| 0.4363 | 88000 | 0.3696 | +| 0.4388 | 88500 | 0.3476 | +| 0.4413 | 89000 | 0.3662 | +| 0.4438 | 89500 | 0.3607 | +| 0.4462 | 90000 | 0.3553 | +| 0.4487 | 90500 | 0.3637 | +| 0.4512 | 91000 | 0.388 | +| 0.4537 | 91500 | 0.348 | +| 0.4562 | 92000 | 0.3678 | +| 0.4586 | 92500 | 0.3961 | +| 0.4611 | 93000 | 0.3309 | +| 0.4636 | 93500 | 0.3639 | +| 0.4661 | 94000 | 0.3393 | +| 0.4686 | 94500 | 0.3861 | +| 0.4710 | 95000 | 0.3484 | +| 0.4735 | 95500 | 0.3511 | +| 0.4760 | 96000 | 0.3445 | +| 0.4785 | 96500 | 0.3486 | +| 0.4809 | 97000 | 0.3262 | +| 0.4834 | 97500 | 0.3342 | +| 0.4859 | 98000 | 0.3845 | +| 0.4884 | 98500 | 0.3481 | +| 0.4909 | 99000 | 0.3275 | +| 0.4933 | 99500 | 0.3567 | +| 0.4958 | 100000 | 0.3656 | +| 0.4983 | 100500 | 0.3299 | +| 0.5008 | 101000 | 0.3396 | +| 0.5033 | 101500 | 0.3497 | +| 0.5057 | 102000 | 0.3484 | +| 0.5082 | 102500 | 0.3684 | +| 0.5107 | 103000 | 0.318 | +| 0.5132 | 103500 | 0.2966 | +| 0.5157 | 104000 | 0.3452 | +| 0.5181 | 104500 | 0.3365 | +| 0.5206 | 105000 | 0.3352 | +| 0.5231 | 105500 | 0.3854 | +| 0.5256 | 106000 | 0.3712 | +| 0.5280 | 106500 | 0.334 | +| 0.5305 | 107000 | 0.3381 | +| 0.5330 | 107500 | 0.3289 | +| 0.5355 | 108000 | 0.3332 | +| 0.5380 | 108500 | 0.3441 | +| 0.5404 | 109000 | 0.3701 | +| 0.5429 | 109500 | 0.3268 | +| 0.5454 | 110000 | 0.3072 | +| 0.5479 | 110500 | 0.3348 | +| 0.5504 | 111000 | 0.3501 | +| 0.5528 | 111500 | 0.3179 | +| 0.5553 | 112000 | 0.3276 | +| 0.5578 | 112500 | 0.3958 | +| 0.5603 | 113000 | 0.3317 | +| 0.5628 | 113500 | 0.3564 | +| 0.5652 | 114000 | 0.3042 | +| 0.5677 | 114500 | 0.3482 | +| 0.5702 | 115000 | 0.3383 | +| 0.5727 | 115500 | 0.3557 | +| 0.5752 | 116000 | 0.3195 | +| 0.5776 | 116500 | 0.3265 | +| 0.5801 | 117000 | 0.3174 | +| 0.5826 | 117500 | 0.3392 | +| 0.5851 | 118000 | 0.3279 | +| 0.5875 | 118500 | 0.3254 | +| 0.5900 | 119000 | 0.3501 | +| 0.5925 | 119500 | 0.336 | +| 0.5950 | 120000 | 0.3899 | +| 0.5975 | 120500 | 0.3614 | +| 0.5999 | 121000 | 0.3473 | +| 0.6024 | 121500 | 0.3275 | +| 0.6049 | 122000 | 0.3213 | +| 0.6074 | 122500 | 0.303 | +| 0.6099 | 123000 | 
0.3258 | +| 0.6123 | 123500 | 0.3175 | +| 0.6148 | 124000 | 0.3418 | +| 0.6173 | 124500 | 0.3422 | +| 0.6198 | 125000 | 0.3212 | +| 0.6223 | 125500 | 0.3171 | +| 0.6247 | 126000 | 0.3428 | +| 0.6272 | 126500 | 0.3327 | +| 0.6297 | 127000 | 0.3126 | +| 0.6322 | 127500 | 0.3194 | +| 0.6346 | 128000 | 0.3341 | +| 0.6371 | 128500 | 0.3246 | +| 0.6396 | 129000 | 0.3154 | +| 0.6421 | 129500 | 0.3224 | +| 0.6446 | 130000 | 0.3422 | +| 0.6470 | 130500 | 0.2983 | +| 0.6495 | 131000 | 0.3257 | +| 0.6520 | 131500 | 0.301 | +| 0.6545 | 132000 | 0.3276 | +| 0.6570 | 132500 | 0.34 | +| 0.6594 | 133000 | 0.3348 | +| 0.6619 | 133500 | 0.3298 | +| 0.6644 | 134000 | 0.323 | +| 0.6669 | 134500 | 0.3099 | +| 0.6694 | 135000 | 0.3454 | +| 0.6718 | 135500 | 0.3088 | +| 0.6743 | 136000 | 0.3501 | +| 0.6768 | 136500 | 0.3238 | +| 0.6793 | 137000 | 0.3017 | +| 0.6818 | 137500 | 0.3071 | +| 0.6842 | 138000 | 0.3165 | +| 0.6867 | 138500 | 0.2963 | +| 0.6892 | 139000 | 0.3186 | +| 0.6917 | 139500 | 0.3292 | +| 0.6941 | 140000 | 0.3108 | +| 0.6966 | 140500 | 0.3156 | +| 0.6991 | 141000 | 0.3188 | +| 0.7016 | 141500 | 0.2935 | +| 0.7041 | 142000 | 0.319 | +| 0.7065 | 142500 | 0.3123 | +| 0.7090 | 143000 | 0.302 | +| 0.7115 | 143500 | 0.3254 | +| 0.7140 | 144000 | 0.3018 | +| 0.7165 | 144500 | 0.3272 | +| 0.7189 | 145000 | 0.3258 | +| 0.7214 | 145500 | 0.3557 | +| 0.7239 | 146000 | 0.2816 | +| 0.7264 | 146500 | 0.3372 | +| 0.7289 | 147000 | 0.3406 | +| 0.7313 | 147500 | 0.3564 | +| 0.7338 | 148000 | 0.3341 | +| 0.7363 | 148500 | 0.3068 | +| 0.7388 | 149000 | 0.3565 | +| 0.7413 | 149500 | 0.3161 | +| 0.7437 | 150000 | 0.3187 | +| 0.7462 | 150500 | 0.3356 | +| 0.7487 | 151000 | 0.3103 | +| 0.7512 | 151500 | 0.3316 | +| 0.7536 | 152000 | 0.2906 | +| 0.7561 | 152500 | 0.3262 | +| 0.7586 | 153000 | 0.3039 | +| 0.7611 | 153500 | 0.301 | +| 0.7636 | 154000 | 0.3108 | +| 0.7660 | 154500 | 0.2937 | +| 0.7685 | 155000 | 0.2802 | +| 0.7710 | 155500 | 0.2926 | +| 0.7735 | 156000 | 0.3112 | +| 0.7760 | 156500 | 0.309 | +| 0.7784 | 157000 | 0.3059 | +| 0.7809 | 157500 | 0.313 | +| 0.7834 | 158000 | 0.3024 | +| 0.7859 | 158500 | 0.3122 | +| 0.7884 | 159000 | 0.2937 | +| 0.7908 | 159500 | 0.3102 | +| 0.7933 | 160000 | 0.3206 | +| 0.7958 | 160500 | 0.2895 | +| 0.7983 | 161000 | 0.3207 | +| 0.8007 | 161500 | 0.3099 | +| 0.8032 | 162000 | 0.2979 | +| 0.8057 | 162500 | 0.3607 | +| 0.8082 | 163000 | 0.3325 | +| 0.8107 | 163500 | 0.3117 | +| 0.8131 | 164000 | 0.3027 | +| 0.8156 | 164500 | 0.3347 | +| 0.8181 | 165000 | 0.3034 | +| 0.8206 | 165500 | 0.2918 | +| 0.8231 | 166000 | 0.315 | +| 0.8255 | 166500 | 0.2943 | +| 0.8280 | 167000 | 0.3407 | +| 0.8305 | 167500 | 0.312 | +| 0.8330 | 168000 | 0.2758 | +| 0.8355 | 168500 | 0.3487 | +| 0.8379 | 169000 | 0.3216 | +| 0.8404 | 169500 | 0.3087 | +| 0.8429 | 170000 | 0.2963 | +| 0.8454 | 170500 | 0.2879 | +| 0.8479 | 171000 | 0.3588 | +| 0.8503 | 171500 | 0.3507 | +| 0.8528 | 172000 | 0.3208 | +| 0.8553 | 172500 | 0.3181 | +| 0.8578 | 173000 | 0.2946 | +| 0.8602 | 173500 | 0.2846 | +| 0.8627 | 174000 | 0.3069 | +| 0.8652 | 174500 | 0.3134 | +| 0.8677 | 175000 | 0.3164 | +| 0.8702 | 175500 | 0.3191 | +| 0.8726 | 176000 | 0.2892 | +| 0.8751 | 176500 | 0.3081 | +| 0.8776 | 177000 | 0.2622 | +| 0.8801 | 177500 | 0.298 | +| 0.8826 | 178000 | 0.337 | +| 0.8850 | 178500 | 0.2701 | +| 0.8875 | 179000 | 0.2966 | +| 0.8900 | 179500 | 0.2894 | +| 0.8925 | 180000 | 0.3133 | +| 0.8950 | 180500 | 0.3172 | +| 0.8974 | 181000 | 0.2937 | +| 0.8999 | 181500 | 0.2804 | +| 0.9024 | 182000 | 0.3296 | +| 0.9049 | 182500 
| 0.2831 | +| 0.9074 | 183000 | 0.2719 | +| 0.9098 | 183500 | 0.3014 | +| 0.9123 | 184000 | 0.2939 | +| 0.9148 | 184500 | 0.2835 | +| 0.9173 | 185000 | 0.3625 | +| 0.9197 | 185500 | 0.3056 | +| 0.9222 | 186000 | 0.3241 | +| 0.9247 | 186500 | 0.2916 | +| 0.9272 | 187000 | 0.2913 | +| 0.9297 | 187500 | 0.2813 | +| 0.9321 | 188000 | 0.2967 | +| 0.9346 | 188500 | 0.3152 | +| 0.9371 | 189000 | 0.2752 | +| 0.9396 | 189500 | 0.2855 | +| 0.9421 | 190000 | 0.3114 | +| 0.9445 | 190500 | 0.3117 | +| 0.9470 | 191000 | 0.305 | +| 0.9495 | 191500 | 0.316 | +| 0.9520 | 192000 | 0.2817 | +| 0.9545 | 192500 | 0.2777 | +| 0.9569 | 193000 | 0.2823 | +| 0.9594 | 193500 | 0.3473 | +| 0.9619 | 194000 | 0.3045 | +| 0.9644 | 194500 | 0.2951 | +| 0.9668 | 195000 | 0.3043 | +| 0.9693 | 195500 | 0.2739 | +| 0.9718 | 196000 | 0.2671 | +| 0.9743 | 196500 | 0.2876 | +| 0.9768 | 197000 | 0.267 | +| 0.9792 | 197500 | 0.3052 | +| 0.9817 | 198000 | 0.2789 | +| 0.9842 | 198500 | 0.2794 | +| 0.9867 | 199000 | 0.2907 | +| 0.9892 | 199500 | 0.2758 | +| 0.9916 | 200000 | 0.3191 | +| 0.9941 | 200500 | 0.2741 | +| 0.9966 | 201000 | 0.269 | +| 0.9991 | 201500 | 0.2939 | +| 1.0016 | 202000 | 0.2716 | +| 1.0040 | 202500 | 0.3019 | +| 1.0065 | 203000 | 0.3044 | +| 1.0090 | 203500 | 0.2788 | +| 1.0115 | 204000 | 0.2759 | +| 1.0140 | 204500 | 0.2746 | +| 1.0164 | 205000 | 0.2908 | +| 1.0189 | 205500 | 0.27 | +| 1.0214 | 206000 | 0.2686 | +| 1.0239 | 206500 | 0.2816 | +| 1.0263 | 207000 | 0.2916 | +| 1.0288 | 207500 | 0.2948 | +| 1.0313 | 208000 | 0.2814 | +| 1.0338 | 208500 | 0.2454 | +| 1.0363 | 209000 | 0.2638 | +| 1.0387 | 209500 | 0.2887 | +| 1.0412 | 210000 | 0.3043 | +| 1.0437 | 210500 | 0.2737 | +| 1.0462 | 211000 | 0.2693 | +| 1.0487 | 211500 | 0.2825 | +| 1.0511 | 212000 | 0.284 | +| 1.0536 | 212500 | 0.2693 | +| 1.0561 | 213000 | 0.2721 | +| 1.0586 | 213500 | 0.2677 | +| 1.0611 | 214000 | 0.267 | +| 1.0635 | 214500 | 0.2752 | +| 1.0660 | 215000 | 0.3046 | +| 1.0685 | 215500 | 0.2788 | +| 1.0710 | 216000 | 0.2612 | +| 1.0735 | 216500 | 0.2984 | +| 1.0759 | 217000 | 0.2838 | +| 1.0784 | 217500 | 0.2752 | +| 1.0809 | 218000 | 0.2592 | +| 1.0834 | 218500 | 0.2728 | +| 1.0858 | 219000 | 0.2643 | +| 1.0883 | 219500 | 0.2636 | +| 1.0908 | 220000 | 0.2581 | +| 1.0933 | 220500 | 0.2652 | +| 1.0958 | 221000 | 0.2637 | +| 1.0982 | 221500 | 0.2734 | +| 1.1007 | 222000 | 0.2703 | +| 1.1032 | 222500 | 0.2537 | +| 1.1057 | 223000 | 0.2765 | +| 1.1082 | 223500 | 0.2744 | +| 1.1106 | 224000 | 0.2525 | +| 1.1131 | 224500 | 0.2798 | +| 1.1156 | 225000 | 0.2749 | +| 1.1181 | 225500 | 0.2886 | +| 1.1206 | 226000 | 0.2889 | +| 1.1230 | 226500 | 0.2756 | +| 1.1255 | 227000 | 0.2694 | +| 1.1280 | 227500 | 0.2712 | +| 1.1305 | 228000 | 0.2701 | +| 1.1329 | 228500 | 0.2433 | +| 1.1354 | 229000 | 0.3027 | +| 1.1379 | 229500 | 0.2572 | +| 1.1404 | 230000 | 0.2682 | +| 1.1429 | 230500 | 0.2794 | +| 1.1453 | 231000 | 0.2521 | +| 1.1478 | 231500 | 0.271 | +| 1.1503 | 232000 | 0.2418 | +| 1.1528 | 232500 | 0.2426 | +| 1.1553 | 233000 | 0.2404 | +| 1.1577 | 233500 | 0.2991 | +| 1.1602 | 234000 | 0.2571 | +| 1.1627 | 234500 | 0.2737 | +| 1.1652 | 235000 | 0.2513 | +| 1.1677 | 235500 | 0.2901 | +| 1.1701 | 236000 | 0.2489 | +| 1.1726 | 236500 | 0.2548 | +| 1.1751 | 237000 | 0.2895 | +| 1.1776 | 237500 | 0.2195 | +| 1.1801 | 238000 | 0.2362 | +| 1.1825 | 238500 | 0.2522 | +| 1.1850 | 239000 | 0.2532 | +| 1.1875 | 239500 | 0.2468 | +| 1.1900 | 240000 | 0.2506 | +| 1.1924 | 240500 | 0.2422 | +| 1.1949 | 241000 | 0.2325 | +| 1.1974 | 241500 | 0.2487 | +| 1.1999 | 
242000 | 0.2315 | +| 1.2024 | 242500 | 0.2195 | +| 1.2048 | 243000 | 0.234 | +| 1.2073 | 243500 | 0.2313 | +| 1.2098 | 244000 | 0.253 | +| 1.2123 | 244500 | 0.2621 | +| 1.2148 | 245000 | 0.2433 | +| 1.2172 | 245500 | 0.2455 | +| 1.2197 | 246000 | 0.2485 | +| 1.2222 | 246500 | 0.2192 | +| 1.2247 | 247000 | 0.2423 | +| 1.2272 | 247500 | 0.2565 | +| 1.2296 | 248000 | 0.227 | +| 1.2321 | 248500 | 0.2255 | +| 1.2346 | 249000 | 0.2428 | +| 1.2371 | 249500 | 0.2506 | +| 1.2396 | 250000 | 0.2525 | +| 1.2420 | 250500 | 0.2195 | +| 1.2445 | 251000 | 0.2585 | +| 1.2470 | 251500 | 0.23 | +| 1.2495 | 252000 | 0.2146 | +| 1.2519 | 252500 | 0.2564 | +| 1.2544 | 253000 | 0.2335 | +| 1.2569 | 253500 | 0.2149 | +| 1.2594 | 254000 | 0.2751 | +| 1.2619 | 254500 | 0.2714 | +| 1.2643 | 255000 | 0.2386 | +| 1.2668 | 255500 | 0.2123 | +| 1.2693 | 256000 | 0.1983 | +| 1.2718 | 256500 | 0.2266 | +| 1.2743 | 257000 | 0.2416 | +| 1.2767 | 257500 | 0.2202 | +| 1.2792 | 258000 | 0.2175 | +| 1.2817 | 258500 | 0.2696 | +| 1.2842 | 259000 | 0.2454 | +| 1.2867 | 259500 | 0.2413 | +| 1.2891 | 260000 | 0.2117 | +| 1.2916 | 260500 | 0.2249 | +| 1.2941 | 261000 | 0.2516 | +| 1.2966 | 261500 | 0.226 | +| 1.2990 | 262000 | 0.2175 | +| 1.3015 | 262500 | 0.2212 | +| 1.3040 | 263000 | 0.2286 | +| 1.3065 | 263500 | 0.2197 | +| 1.3090 | 264000 | 0.2446 | +| 1.3114 | 264500 | 0.2474 | +| 1.3139 | 265000 | 0.25 | +| 1.3164 | 265500 | 0.2342 | +| 1.3189 | 266000 | 0.2382 | +| 1.3214 | 266500 | 0.2228 | +| 1.3238 | 267000 | 0.2408 | +| 1.3263 | 267500 | 0.2122 | +| 1.3288 | 268000 | 0.2069 | +| 1.3313 | 268500 | 0.2278 | +| 1.3338 | 269000 | 0.23 | +| 1.3362 | 269500 | 0.2458 | +| 1.3387 | 270000 | 0.2375 | +| 1.3412 | 270500 | 0.2324 | +| 1.3437 | 271000 | 0.1933 | +| 1.3462 | 271500 | 0.2282 | +| 1.3486 | 272000 | 0.2308 | +| 1.3511 | 272500 | 0.2405 | +| 1.3536 | 273000 | 0.2097 | +| 1.3561 | 273500 | 0.2146 | +| 1.3585 | 274000 | 0.2025 | +| 1.3610 | 274500 | 0.2444 | +| 1.3635 | 275000 | 0.2063 | +| 1.3660 | 275500 | 0.2165 | +| 1.3685 | 276000 | 0.2347 | +| 1.3709 | 276500 | 0.2188 | +| 1.3734 | 277000 | 0.2005 | +| 1.3759 | 277500 | 0.2168 | +| 1.3784 | 278000 | 0.1846 | +| 1.3809 | 278500 | 0.2299 | +| 1.3833 | 279000 | 0.2108 | +| 1.3858 | 279500 | 0.2209 | +| 1.3883 | 280000 | 0.1987 | +| 1.3908 | 280500 | 0.2218 | +| 1.3933 | 281000 | 0.2078 | +| 1.3957 | 281500 | 0.2268 | +| 1.3982 | 282000 | 0.2208 | +| 1.4007 | 282500 | 0.2114 | +| 1.4032 | 283000 | 0.2111 | +| 1.4057 | 283500 | 0.2091 | +| 1.4081 | 284000 | 0.2301 | +| 1.4106 | 284500 | 0.231 | +| 1.4131 | 285000 | 0.1773 | +| 1.4156 | 285500 | 0.2026 | +| 1.4180 | 286000 | 0.2642 | +| 1.4205 | 286500 | 0.2203 | +| 1.4230 | 287000 | 0.1972 | +| 1.4255 | 287500 | 0.2095 | +| 1.4280 | 288000 | 0.1908 | +| 1.4304 | 288500 | 0.1959 | +| 1.4329 | 289000 | 0.1783 | +| 1.4354 | 289500 | 0.215 | +| 1.4379 | 290000 | 0.2032 | +| 1.4404 | 290500 | 0.195 | +| 1.4428 | 291000 | 0.2339 | +| 1.4453 | 291500 | 0.2118 | +| 1.4478 | 292000 | 0.2089 | +| 1.4503 | 292500 | 0.2201 | +| 1.4528 | 293000 | 0.1976 | +| 1.4552 | 293500 | 0.2068 | +| 1.4577 | 294000 | 0.2256 | +| 1.4602 | 294500 | 0.2233 | +| 1.4627 | 295000 | 0.2022 | +| 1.4651 | 295500 | 0.1961 | +| 1.4676 | 296000 | 0.2252 | +| 1.4701 | 296500 | 0.2185 | +| 1.4726 | 297000 | 0.1927 | +| 1.4751 | 297500 | 0.1983 | +| 1.4775 | 298000 | 0.1956 | +| 1.4800 | 298500 | 0.1851 | +| 1.4825 | 299000 | 0.2053 | +| 1.4850 | 299500 | 0.2106 | +| 1.4875 | 300000 | 0.2221 | +| 1.4899 | 300500 | 0.1912 | +| 1.4924 | 301000 | 0.2068 | +| 1.4949 
| 301500 | 0.1929 | +| 1.4974 | 302000 | 0.21 | +| 1.4999 | 302500 | 0.2102 | +| 1.5023 | 303000 | 0.1769 | +| 1.5048 | 303500 | 0.2144 | +| 1.5073 | 304000 | 0.2213 | +| 1.5098 | 304500 | 0.1909 | +| 1.5123 | 305000 | 0.1661 | +| 1.5147 | 305500 | 0.1867 | +| 1.5172 | 306000 | 0.1859 | +| 1.5197 | 306500 | 0.1901 | +| 1.5222 | 307000 | 0.2428 | +| 1.5246 | 307500 | 0.1973 | +| 1.5271 | 308000 | 0.2198 | +| 1.5296 | 308500 | 0.1884 | +| 1.5321 | 309000 | 0.182 | +| 1.5346 | 309500 | 0.1879 | +| 1.5370 | 310000 | 0.1844 | +| 1.5395 | 310500 | 0.2378 | +| 1.5420 | 311000 | 0.18 | +| 1.5445 | 311500 | 0.1745 | +| 1.5470 | 312000 | 0.1723 | +| 1.5494 | 312500 | 0.2071 | +| 1.5519 | 313000 | 0.1799 | +| 1.5544 | 313500 | 0.175 | +| 1.5569 | 314000 | 0.2341 | +| 1.5594 | 314500 | 0.1852 | +| 1.5618 | 315000 | 0.202 | +| 1.5643 | 315500 | 0.1827 | +| 1.5668 | 316000 | 0.2029 | +| 1.5693 | 316500 | 0.1777 | +| 1.5718 | 317000 | 0.2193 | +| 1.5742 | 317500 | 0.1966 | +| 1.5767 | 318000 | 0.1811 | +| 1.5792 | 318500 | 0.1716 | +| 1.5817 | 319000 | 0.2036 | +| 1.5841 | 319500 | 0.1719 | +| 1.5866 | 320000 | 0.1992 | +| 1.5891 | 320500 | 0.1983 | +| 1.5916 | 321000 | 0.2162 | +| 1.5941 | 321500 | 0.2094 | +| 1.5965 | 322000 | 0.2195 | +| 1.5990 | 322500 | 0.1907 | +| 1.6015 | 323000 | 0.2261 | +| 1.6040 | 323500 | 0.1834 | +| 1.6065 | 324000 | 0.1719 | +| 1.6089 | 324500 | 0.1719 | +| 1.6114 | 325000 | 0.1938 | +| 1.6139 | 325500 | 0.1957 | +| 1.6164 | 326000 | 0.1951 | +| 1.6189 | 326500 | 0.1836 | +| 1.6213 | 327000 | 0.1802 | +| 1.6238 | 327500 | 0.1797 | +| 1.6263 | 328000 | 0.1898 | +| 1.6288 | 328500 | 0.2018 | +| 1.6312 | 329000 | 0.1729 | +| 1.6337 | 329500 | 0.2015 | +| 1.6362 | 330000 | 0.1822 | +| 1.6387 | 330500 | 0.1749 | +| 1.6412 | 331000 | 0.1829 | +| 1.6436 | 331500 | 0.2003 | +| 1.6461 | 332000 | 0.1714 | +| 1.6486 | 332500 | 0.1718 | +| 1.6511 | 333000 | 0.1697 | +| 1.6536 | 333500 | 0.1836 | +| 1.6560 | 334000 | 0.1953 | +| 1.6585 | 334500 | 0.1859 | +| 1.6610 | 335000 | 0.1862 | +| 1.6635 | 335500 | 0.1733 | +| 1.6660 | 336000 | 0.1961 | +| 1.6684 | 336500 | 0.1735 | +| 1.6709 | 337000 | 0.1917 | +| 1.6734 | 337500 | 0.2077 | +| 1.6759 | 338000 | 0.171 | +| 1.6784 | 338500 | 0.1741 | +| 1.6808 | 339000 | 0.1719 | +| 1.6833 | 339500 | 0.1672 | +| 1.6858 | 340000 | 0.173 | +| 1.6883 | 340500 | 0.1684 | +| 1.6907 | 341000 | 0.1848 | +| 1.6932 | 341500 | 0.19 | +| 1.6957 | 342000 | 0.1764 | +| 1.6982 | 342500 | 0.1631 | +| 1.7007 | 343000 | 0.1709 | +| 1.7031 | 343500 | 0.1941 | +| 1.7056 | 344000 | 0.1738 | +| 1.7081 | 344500 | 0.1678 | +| 1.7106 | 345000 | 0.1685 | +| 1.7131 | 345500 | 0.1794 | +| 1.7155 | 346000 | 0.1709 | +| 1.7180 | 346500 | 0.1807 | +| 1.7205 | 347000 | 0.2089 | +| 1.7230 | 347500 | 0.1677 | +| 1.7255 | 348000 | 0.1571 | +| 1.7279 | 348500 | 0.2283 | +| 1.7304 | 349000 | 0.183 | +| 1.7329 | 349500 | 0.2039 | +| 1.7354 | 350000 | 0.1896 | +| 1.7378 | 350500 | 0.1921 | +| 1.7403 | 351000 | 0.1983 | +| 1.7428 | 351500 | 0.1738 | +| 1.7453 | 352000 | 0.1871 | +| 1.7478 | 352500 | 0.1936 | +| 1.7502 | 353000 | 0.1726 | +| 1.7527 | 353500 | 0.1822 | +| 1.7552 | 354000 | 0.1687 | +| 1.7577 | 354500 | 0.1733 | +| 1.7602 | 355000 | 0.1721 | +| 1.7626 | 355500 | 0.1838 | +| 1.7651 | 356000 | 0.1503 | +| 1.7676 | 356500 | 0.166 | +| 1.7701 | 357000 | 0.1544 | +| 1.7726 | 357500 | 0.165 | +| 1.7750 | 358000 | 0.1785 | +| 1.7775 | 358500 | 0.1729 | +| 1.7800 | 359000 | 0.1735 | +| 1.7825 | 359500 | 0.1582 | +| 1.7850 | 360000 | 0.1932 | +| 1.7874 | 360500 | 0.1554 | +| 
1.7899 | 361000 | 0.1804 | +| 1.7924 | 361500 | 0.1833 | +| 1.7949 | 362000 | 0.1557 | +| 1.7973 | 362500 | 0.1733 | +| 1.7998 | 363000 | 0.1937 | +| 1.8023 | 363500 | 0.1543 | +| 1.8048 | 364000 | 0.2162 | +| 1.8073 | 364500 | 0.1977 | +| 1.8097 | 365000 | 0.1783 | +| 1.8122 | 365500 | 0.1758 | +| 1.8147 | 366000 | 0.2004 | +| 1.8172 | 366500 | 0.1752 | +| 1.8197 | 367000 | 0.1815 | +| 1.8221 | 367500 | 0.1643 | +| 1.8246 | 368000 | 0.1749 | +| 1.8271 | 368500 | 0.1772 | +| 1.8296 | 369000 | 0.1959 | +| 1.8321 | 369500 | 0.1621 | +| 1.8345 | 370000 | 0.2145 | +| 1.8370 | 370500 | 0.1797 | +| 1.8395 | 371000 | 0.174 | +| 1.8420 | 371500 | 0.187 | +| 1.8445 | 372000 | 0.1556 | +| 1.8469 | 372500 | 0.2023 | +| 1.8494 | 373000 | 0.1968 | +| 1.8519 | 373500 | 0.2218 | +| 1.8544 | 374000 | 0.1656 | +| 1.8568 | 374500 | 0.1893 | +| 1.8593 | 375000 | 0.1589 | +| 1.8618 | 375500 | 0.1722 | +| 1.8643 | 376000 | 0.1609 | +| 1.8668 | 376500 | 0.1949 | +| 1.8692 | 377000 | 0.1801 | +| 1.8717 | 377500 | 0.1618 | +| 1.8742 | 378000 | 0.1683 | +| 1.8767 | 378500 | 0.1532 | +| 1.8792 | 379000 | 0.1563 | +| 1.8816 | 379500 | 0.1942 | +| 1.8841 | 380000 | 0.1634 | +| 1.8866 | 380500 | 0.1547 | +| 1.8891 | 381000 | 0.1615 | +| 1.8916 | 381500 | 0.1938 | +| 1.8940 | 382000 | 0.1685 | +| 1.8965 | 382500 | 0.1862 | +| 1.8990 | 383000 | 0.1514 | +| 1.9015 | 383500 | 0.1666 | +| 1.9039 | 384000 | 0.1861 | +| 1.9064 | 384500 | 0.1447 | +| 1.9089 | 385000 | 0.1844 | +| 1.9114 | 385500 | 0.1504 | +| 1.9139 | 386000 | 0.1772 | +| 1.9163 | 386500 | 0.2152 | +| 1.9188 | 387000 | 0.1768 | +| 1.9213 | 387500 | 0.208 | +| 1.9238 | 388000 | 0.1718 | +| 1.9263 | 388500 | 0.1614 | +| 1.9287 | 389000 | 0.1635 | +| 1.9312 | 389500 | 0.1671 | +| 1.9337 | 390000 | 0.1981 | +| 1.9362 | 390500 | 0.1622 | +| 1.9387 | 391000 | 0.1519 | +| 1.9411 | 391500 | 0.1795 | +| 1.9436 | 392000 | 0.1912 | +| 1.9461 | 392500 | 0.1726 | +| 1.9486 | 393000 | 0.1878 | +| 1.9511 | 393500 | 0.1642 | +| 1.9535 | 394000 | 0.1626 | +| 1.9560 | 394500 | 0.1614 | +| 1.9585 | 395000 | 0.2133 | +| 1.9610 | 395500 | 0.1761 | +| 1.9634 | 396000 | 0.1756 | +| 1.9659 | 396500 | 0.1823 | +| 1.9684 | 397000 | 0.1555 | +| 1.9709 | 397500 | 0.1556 | +| 1.9734 | 398000 | 0.1652 | +| 1.9758 | 398500 | 0.1525 | +| 1.9783 | 399000 | 0.1869 | +| 1.9808 | 399500 | 0.1486 | +| 1.9833 | 400000 | 0.1702 | +| 1.9858 | 400500 | 0.1525 | +| 1.9882 | 401000 | 0.167 | +| 1.9907 | 401500 | 0.1929 | +| 1.9932 | 402000 | 0.1478 | +| 1.9957 | 402500 | 0.182 | +| 1.9982 | 403000 | 0.1598 | @@ -1389,6 +2091,29 @@ You can finetune this model on your own dataset. } ``` +#### AnglELoss +```bibtex +@misc{li2023angleoptimized, + title={AnglE-optimized Text Embeddings}, + author={Xianming Li and Jing Li}, + year={2023}, + eprint={2309.12871}, + archivePrefix={arXiv}, + primaryClass={cs.CL} +} +``` + +#### CoSENTLoss +```bibtex +@online{kexuefm-8847, + title={CoSENT: A more efficient sentence vector scheme than Sentence-BERT}, + author={Su Jianlin}, + year={2022}, + month={Jan}, + url={https://kexue.fm/archives/8847}, +} +``` +
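For readers unfamiliar with the two losses cited above: CoSENTLoss ranks sentence pairs so that pairs with higher gold scores receive higher cosine similarity, and AnglELoss keeps the same ranking objective but replaces the cosine term with a pairwise angle similarity (`pairwise_angle_sim` in the configs above). A rough, illustrative re-implementation of the CoSENT objective (not the library code) looks like this, using the same `scale` of 20.0 as configured above:

```python
import torch


def cosent_loss(pair_sims: torch.Tensor, labels: torch.Tensor, scale: float = 20.0) -> torch.Tensor:
    """Illustrative CoSENT objective.

    pair_sims: similarity (e.g. cosine) of each (sentence1, sentence2) pair, shape (batch,)
    labels:    gold similarity scores for the same pairs, shape (batch,)
    """
    # Scaled difference between every pair of pairs: entry (i, j) = scale * (s_i - s_j).
    diffs = scale * (pair_sims[:, None] - pair_sims[None, :])
    # Keep only (i, j) where pair i is labelled *less* similar than pair j;
    # those entries should be strongly negative when the ranking is correct.
    violations = diffs[labels[:, None] < labels[None, :]]
    # log(1 + sum(exp(.))) via logsumexp with an extra zero logit for the "1 +" term.
    zero = torch.zeros(1, device=pair_sims.device, dtype=pair_sims.dtype)
    return torch.logsumexp(torch.cat([zero, violations]), dim=0)


# Tiny usage example with made-up numbers.
sims = torch.tensor([0.9, 0.2, 0.6])    # model similarities for three pairs
gold = torch.tensor([4.5, 1.0, 3.0])    # gold scores for the same pairs
print(cosent_loss(sims, gold))          # small, since the ranking already matches the gold order
```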