| Column | Dtype | Min | Max |
|:--------------|:----------------------|-----:|------:|
| modelId | string (length) | 4 | 122 |
| author | string (length) | 2 | 42 |
| last_modified | unknown | | |
| downloads | int64 | 0 | 392M |
| likes | int64 | 0 | 6.56k |
| library_name | string (368 classes) | | |
| tags | sequence (length) | 1 | 4.05k |
| pipeline_tag | string (51 classes) | | |
| createdAt | unknown | | |
| card | string (length) | 1 | 1M |
kik41/lora-length-long-llama-3-8b-v2
kik41
"2024-11-12T23:02:07Z"
0
0
transformers
[ "transformers", "safetensors", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
"2024-11-12T22:57:20Z"
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
kik41/lora-formality-formal-llama-3-8b-v2
kik41
"2024-11-12T22:57:54Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T22:57:54Z"
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
dbl-blnd/parce-lunar
dbl-blnd
"2024-11-12T22:58:02Z"
0
0
null
[ "license:cc0-1.0", "region:us" ]
null
"2024-11-12T22:58:02Z"
--- license: cc0-1.0 ---
tttx/problem0_model_more_aug_30
tttx
"2024-11-12T23:52:45Z"
0
0
peft
[ "peft", "safetensors", "llama", "alignment-handbook", "trl", "sft", "generated_from_trainer", "dataset:tttx/problem0_data_more_aug", "base_model:barc0/Llama-3.1-ARC-Potpourri-Transduction-8B", "base_model:adapter:barc0/Llama-3.1-ARC-Potpourri-Transduction-8B", "license:llama3.1", "region:us" ]
null
"2024-11-12T22:58:28Z"
--- base_model: barc0/Llama-3.1-ARC-Potpourri-Transduction-8B datasets: - tttx/problem0_data_more_aug library_name: peft license: llama3.1 tags: - alignment-handbook - trl - sft - generated_from_trainer model-index: - name: problem0_model_more_aug_30 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # problem0_model_more_aug_30 This model is a fine-tuned version of [barc0/Llama-3.1-ARC-Potpourri-Transduction-8B](https://huggingface.co/barc0/Llama-3.1-ARC-Potpourri-Transduction-8B) on the tttx/problem0_data_more_aug dataset. It achieves the following results on the evaluation set: - Loss: 0.1777 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - distributed_type: multi-GPU - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: cosine - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | 0.0385 | 1.0 | 78 | 0.1729 | | 0.0076 | 2.0 | 156 | 0.1777 | ### Framework versions - PEFT 0.13.2 - Transformers 4.47.0.dev0 - Pytorch 2.4.0+cu121 - Datasets 3.1.0 - Tokenizers 0.20.3
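The card above lists the base model and the adapter's training setup but no loading code. Below is a minimal sketch, not taken from the card, of attaching this PEFT adapter to its base model; the repo ids come from the card's metadata, and the dtype, device placement, and prompt are illustrative assumptions.

```py
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "barc0/Llama-3.1-ARC-Potpourri-Transduction-8B"
adapter_id = "tttx/problem0_model_more_aug_30"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)
model = PeftModel.from_pretrained(base, adapter_id)  # attach the LoRA adapter

inputs = tokenizer("your prompt here", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```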
barchetta/alto-130959
barchetta
"2024-11-12T23:05:55Z"
0
0
null
[ "safetensors", "llama", "region:us" ]
null
"2024-11-12T22:59:26Z"
Entry not found
scv15130/kisoq
scv15130
"2024-11-12T22:59:49Z"
0
0
null
[ "license:openrail", "region:us" ]
null
"2024-11-12T22:59:49Z"
--- license: openrail ---
advokat/FSFT_1.0-DD
advokat
"2024-11-12T23:06:42Z"
0
0
diffusers
[ "diffusers", "safetensors", "arxiv:1910.09700", "endpoints_compatible", "diffusers:FluxPipeline", "region:us" ]
text-to-image
"2024-11-12T23:00:25Z"
--- library_name: diffusers --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🧨 diffusers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
Kiefels/gillianaflux
Kiefels
"2024-11-12T23:00:51Z"
0
0
diffusers
[ "diffusers", "safetensors", "text-to-image", "flux", "lora", "template:sd-lora", "fluxgym", "base_model:black-forest-labs/FLUX.1-dev", "base_model:adapter:black-forest-labs/FLUX.1-dev", "license:other", "region:us" ]
text-to-image
"2024-11-12T23:00:39Z"
--- tags: - text-to-image - flux - lora - diffusers - template:sd-lora - fluxgym widget: - output: url: sample/gillianaflux_002200_00_20241112225828.png text: Gillian Anderson, X-Files, Ginger, Actress base_model: black-forest-labs/FLUX.1-dev instance_prompt: Gillian Anderson, X-Files, Ginger, Actress license: other license_name: flux-1-dev-non-commercial-license license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md --- # GillianAFlux A Flux LoRA trained on a local computer with [Fluxgym](https://github.com/cocktailpeanut/fluxgym) <Gallery /> ## Trigger words You should use `Gillian Anderson, X-Files, Ginger, Actress` to trigger the image generation. ## Download model and use it with ComfyUI, AUTOMATIC1111, SD.Next, Invoke AI, Forge, etc. Weights for this model are available in Safetensors format.
ihughes15234/llama_3_1_8bi_tictactoe_dpo1epoch_v3
ihughes15234
"2024-11-12T23:07:44Z"
0
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "text-generation-inference", "unsloth", "trl", "conversational", "en", "base_model:ihughes15234/llama_3_1_8bi_tictactoe1200_10epoch", "base_model:finetune:ihughes15234/llama_3_1_8bi_tictactoe1200_10epoch", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-generation
"2024-11-12T23:01:21Z"
--- base_model: ihughes15234/llama_3_1_8bi_tictactoe1200_10epoch tags: - text-generation-inference - transformers - unsloth - llama - trl license: apache-2.0 language: - en --- # Uploaded model - **Developed by:** ihughes15234 - **License:** apache-2.0 - **Finetuned from model :** ihughes15234/llama_3_1_8bi_tictactoe1200_10epoch This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
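The Unsloth card above does not ship inference code. The following is a hypothetical usage sketch assuming only the standard transformers API; since the repo is tagged "conversational", it assumes the tokenizer carries a chat template, and the example message is purely illustrative.

```py
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ihughes15234/llama_3_1_8bi_tictactoe_dpo1epoch_v3"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Format the input with the tokenizer's chat template (assumed present).
messages = [{"role": "user", "content": "Let's play tic-tac-toe. You go first as X."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(input_ids, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```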
bb1070/thinkvalley_sidetable_white_bg_b2_4e4_3k
bb1070
"2024-11-12T23:03:16Z"
0
0
diffusers
[ "diffusers", "flux", "lora", "replicate", "text-to-image", "en", "base_model:black-forest-labs/FLUX.1-dev", "base_model:adapter:black-forest-labs/FLUX.1-dev", "license:other", "region:us" ]
text-to-image
"2024-11-12T23:03:13Z"
--- license: other license_name: flux-1-dev-non-commercial-license license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md language: - en tags: - flux - diffusers - lora - replicate base_model: "black-forest-labs/FLUX.1-dev" pipeline_tag: text-to-image # widget: # - text: >- # prompt # output: # url: https://... instance_prompt: TOK --- # Thinkvalley_Sidetable_White_Bg_B2_4E4_3K <Gallery /> Trained on Replicate using: https://replicate.com/ostris/flux-dev-lora-trainer/train ## Trigger words You should use `TOK` to trigger the image generation. ## Use it with the [🧨 diffusers library](https://github.com/huggingface/diffusers) ```py from diffusers import AutoPipelineForText2Image import torch pipeline = AutoPipelineForText2Image.from_pretrained('black-forest-labs/FLUX.1-dev', torch_dtype=torch.float16).to('cuda') pipeline.load_lora_weights('bb1070/thinkvalley_sidetable_white_bg_b2_4e4_3k', weight_name='lora.safetensors') image = pipeline('your prompt').images[0] ``` For more details, including weighting, merging and fusing LoRAs, check the [documentation on loading LoRAs in diffusers](https://huggingface.co/docs/diffusers/main/en/using-diffusers/loading_adapters)
JonOlds64/accor
JonOlds64
"2024-11-12T23:06:29Z"
0
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "text-generation-inference", "unsloth", "trl", "sft", "conversational", "en", "base_model:meta-llama/Llama-3.1-8B-Instruct", "base_model:finetune:meta-llama/Llama-3.1-8B-Instruct", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-generation
"2024-11-12T23:03:46Z"
--- base_model: meta-llama/Llama-3.1-8B-Instruct tags: - text-generation-inference - transformers - unsloth - llama - trl - sft license: apache-2.0 language: - en --- # Uploaded model - **Developed by:** JonOlds64 - **License:** apache-2.0 - **Finetuned from model :** meta-llama/Llama-3.1-8B-Instruct This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
tistak/HiqSy9aUmqLWZ6ZVjD7P
tistak
"2024-11-12T23:06:54Z"
0
0
null
[ "safetensors", "llama", "region:us" ]
null
"2024-11-12T23:04:03Z"
Entry not found
dj17292n/opt-6.7b-lora
dj17292n
"2024-11-12T23:04:55Z"
0
0
transformers
[ "transformers", "safetensors", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
"2024-11-12T23:04:51Z"
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
Dreamslol/Qwen2.5SvelteFineTune
Dreamslol
"2024-11-12T23:05:22Z"
0
0
transformers
[ "transformers", "text-generation-inference", "unsloth", "qwen2", "gguf", "en", "base_model:unsloth/Qwen2.5-Coder-14B-Instruct-bnb-4bit", "base_model:finetune:unsloth/Qwen2.5-Coder-14B-Instruct-bnb-4bit", "license:apache-2.0", "endpoints_compatible", "region:us" ]
null
"2024-11-12T23:05:21Z"
--- base_model: unsloth/Qwen2.5-Coder-14B-Instruct-bnb-4bit tags: - text-generation-inference - transformers - unsloth - qwen2 - gguf license: apache-2.0 language: - en --- # Uploaded model - **Developed by:** Dreamslol - **License:** apache-2.0 - **Finetuned from model :** unsloth/Qwen2.5-Coder-14B-Instruct-bnb-4bit This qwen2 model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
featherless-ai-quants/NousResearch-Meta-Llama-3-70B-GGUF
featherless-ai-quants
"2024-11-12T23:05:43Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:05:43Z"
Entry not found
mradermacher/Crimson_Dawn-v0.2-GGUF
mradermacher
"2024-11-12T23:29:22Z"
0
1
transformers
[ "transformers", "gguf", "en", "fr", "de", "es", "it", "pt", "ru", "zh", "ja", "dataset:Epiculous/SynthRP-Gens-v1.1-Filtered-n-Cleaned", "dataset:anthracite-org/stheno-filtered-v1.1", "dataset:PJMixers/hieunguyenminh_roleplay-deduped-ShareGPT", "dataset:Gryphe/Sonnet3.5-Charcard-Roleplay", "dataset:Epiculous/Synthstruct-Gens-v1.1-Filtered-n-Cleaned", "dataset:anthracite-org/kalo-opus-instruct-22k-no-refusal", "dataset:anthracite-org/nopm_claude_writing_fixed", "dataset:anthracite-org/kalo_opus_misc_240827", "base_model:Epiculous/Crimson_Dawn-v0.2", "base_model:quantized:Epiculous/Crimson_Dawn-v0.2", "license:apache-2.0", "endpoints_compatible", "region:us" ]
null
"2024-11-12T23:05:49Z"
--- base_model: Epiculous/Crimson_Dawn-v0.2 datasets: - Epiculous/SynthRP-Gens-v1.1-Filtered-n-Cleaned - anthracite-org/stheno-filtered-v1.1 - PJMixers/hieunguyenminh_roleplay-deduped-ShareGPT - Gryphe/Sonnet3.5-Charcard-Roleplay - Epiculous/Synthstruct-Gens-v1.1-Filtered-n-Cleaned - anthracite-org/kalo-opus-instruct-22k-no-refusal - anthracite-org/nopm_claude_writing_fixed - anthracite-org/kalo_opus_misc_240827 language: - en - fr - de - es - it - pt - ru - zh - ja library_name: transformers license: apache-2.0 quantized_by: mradermacher --- ## About <!-- ### quantize_version: 2 --> <!-- ### output_tensor_quantised: 1 --> <!-- ### convert_type: hf --> <!-- ### vocab_type: --> <!-- ### tags: --> static quants of https://huggingface.co/Epiculous/Crimson_Dawn-v0.2 <!-- provided-files --> weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion. ## Usage If you are unsure how to use GGUF files, refer to one of [TheBloke's READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for more details, including on how to concatenate multi-part files. ## Provided Quants (sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants) | Link | Type | Size/GB | Notes | |:-----|:-----|--------:|:------| | [GGUF](https://huggingface.co/mradermacher/Crimson_Dawn-v0.2-GGUF/resolve/main/Crimson_Dawn-v0.2.Q2_K.gguf) | Q2_K | 4.9 | | | [GGUF](https://huggingface.co/mradermacher/Crimson_Dawn-v0.2-GGUF/resolve/main/Crimson_Dawn-v0.2.Q3_K_S.gguf) | Q3_K_S | 5.6 | | | [GGUF](https://huggingface.co/mradermacher/Crimson_Dawn-v0.2-GGUF/resolve/main/Crimson_Dawn-v0.2.Q3_K_M.gguf) | Q3_K_M | 6.2 | lower quality | | [GGUF](https://huggingface.co/mradermacher/Crimson_Dawn-v0.2-GGUF/resolve/main/Crimson_Dawn-v0.2.Q3_K_L.gguf) | Q3_K_L | 6.7 | | | [GGUF](https://huggingface.co/mradermacher/Crimson_Dawn-v0.2-GGUF/resolve/main/Crimson_Dawn-v0.2.IQ4_XS.gguf) | IQ4_XS | 6.9 | | | [GGUF](https://huggingface.co/mradermacher/Crimson_Dawn-v0.2-GGUF/resolve/main/Crimson_Dawn-v0.2.Q4_0_4_4.gguf) | Q4_0_4_4 | 7.2 | fast on arm, low quality | | [GGUF](https://huggingface.co/mradermacher/Crimson_Dawn-v0.2-GGUF/resolve/main/Crimson_Dawn-v0.2.Q4_K_S.gguf) | Q4_K_S | 7.2 | fast, recommended | | [GGUF](https://huggingface.co/mradermacher/Crimson_Dawn-v0.2-GGUF/resolve/main/Crimson_Dawn-v0.2.Q4_K_M.gguf) | Q4_K_M | 7.6 | fast, recommended | | [GGUF](https://huggingface.co/mradermacher/Crimson_Dawn-v0.2-GGUF/resolve/main/Crimson_Dawn-v0.2.Q5_K_S.gguf) | Q5_K_S | 8.6 | | | [GGUF](https://huggingface.co/mradermacher/Crimson_Dawn-v0.2-GGUF/resolve/main/Crimson_Dawn-v0.2.Q5_K_M.gguf) | Q5_K_M | 8.8 | | | [GGUF](https://huggingface.co/mradermacher/Crimson_Dawn-v0.2-GGUF/resolve/main/Crimson_Dawn-v0.2.Q6_K.gguf) | Q6_K | 10.2 | very good quality | | [GGUF](https://huggingface.co/mradermacher/Crimson_Dawn-v0.2-GGUF/resolve/main/Crimson_Dawn-v0.2.Q8_0.gguf) | Q8_0 | 13.1 | fast, best quality | Here is a handy graph by ikawrakow comparing some lower-quality quant types (lower is better): ![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png) And here are Artefact2's thoughts on the matter: https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9 ## FAQ / Model Request See https://huggingface.co/mradermacher/model_requests for some answers to questions you might have and/or if you want some other 
model quantized. ## Thanks I thank my company, [nethype GmbH](https://www.nethype.de/), for letting me use its servers and providing upgrades to my workstation to enable this work in my free time. <!-- end -->
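For readers wanting a concrete starting point with the quants listed above, here is a small sketch using llama-cpp-python. It is not from the card, which defers to TheBloke's READMEs; the chosen file name, context size, and prompt are assumptions, and the quant file must already be downloaded locally.

```py
from llama_cpp import Llama  # pip install llama-cpp-python

# Point model_path at a locally downloaded quant from the table above.
llm = Llama(model_path="Crimson_Dawn-v0.2.Q4_K_M.gguf", n_ctx=4096)
result = llm("Write one sentence of scene-setting.", max_tokens=128)
print(result["choices"][0]["text"])
```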
tensorblock/Llama-medx_v3.1-GGUF
tensorblock
"2024-11-12T23:33:44Z"
0
0
transformers
[ "transformers", "gguf", "TensorBlock", "GGUF", "base_model:skumar9/Llama-medx_v3.1", "base_model:quantized:skumar9/Llama-medx_v3.1", "license:apache-2.0", "endpoints_compatible", "region:us" ]
null
"2024-11-12T23:06:14Z"
--- library_name: transformers license: apache-2.0 base_model: skumar9/Llama-medx_v3.1 tags: - TensorBlock - GGUF --- <div style="width: auto; margin-left: auto; margin-right: auto"> <img src="https://i.imgur.com/jC7kdl8.jpeg" alt="TensorBlock" style="width: 100%; min-width: 400px; display: block; margin: auto;"> </div> <div style="display: flex; justify-content: space-between; width: 100%;"> <div style="display: flex; flex-direction: column; align-items: flex-start;"> <p style="margin-top: 0.5em; margin-bottom: 0em;"> Feedback and support: TensorBlock's <a href="https://x.com/tensorblock_aoi">Twitter/X</a>, <a href="https://t.me/TensorBlock">Telegram Group</a> and <a href="https://x.com/tensorblock_aoi">Discord server</a> </p> </div> </div> ## skumar9/Llama-medx_v3.1 - GGUF This repo contains GGUF format model files for [skumar9/Llama-medx_v3.1](https://huggingface.co/skumar9/Llama-medx_v3.1). The files were quantized using machines provided by [TensorBlock](https://tensorblock.co/), and they are compatible with llama.cpp as of [commit b4011](https://github.com/ggerganov/llama.cpp/commit/a6744e43e80f4be6398fc7733a01642c846dce1d). ## Prompt template ``` <|im_start|>system {system_prompt}<|im_end|> <|im_start|>user {prompt}<|im_end|> <|im_start|>assistant ``` ## Model file specification | Filename | Quant type | File Size | Description | | -------- | ---------- | --------- | ----------- | | [Llama-medx_v3.1-Q2_K.gguf](https://huggingface.co/tensorblock/Llama-medx_v3.1-GGUF/tree/main/Llama-medx_v3.1-Q2_K.gguf) | Q2_K | 2.961 GB | smallest, significant quality loss - not recommended for most purposes | | [Llama-medx_v3.1-Q3_K_S.gguf](https://huggingface.co/tensorblock/Llama-medx_v3.1-GGUF/tree/main/Llama-medx_v3.1-Q3_K_S.gguf) | Q3_K_S | 3.413 GB | very small, high quality loss | | [Llama-medx_v3.1-Q3_K_M.gguf](https://huggingface.co/tensorblock/Llama-medx_v3.1-GGUF/tree/main/Llama-medx_v3.1-Q3_K_M.gguf) | Q3_K_M | 3.743 GB | very small, high quality loss | | [Llama-medx_v3.1-Q3_K_L.gguf](https://huggingface.co/tensorblock/Llama-medx_v3.1-GGUF/tree/main/Llama-medx_v3.1-Q3_K_L.gguf) | Q3_K_L | 4.025 GB | small, substantial quality loss | | [Llama-medx_v3.1-Q4_0.gguf](https://huggingface.co/tensorblock/Llama-medx_v3.1-GGUF/tree/main/Llama-medx_v3.1-Q4_0.gguf) | Q4_0 | 4.341 GB | legacy; small, very high quality loss - prefer using Q3_K_M | | [Llama-medx_v3.1-Q4_K_S.gguf](https://huggingface.co/tensorblock/Llama-medx_v3.1-GGUF/tree/main/Llama-medx_v3.1-Q4_K_S.gguf) | Q4_K_S | 4.370 GB | small, greater quality loss | | [Llama-medx_v3.1-Q4_K_M.gguf](https://huggingface.co/tensorblock/Llama-medx_v3.1-GGUF/tree/main/Llama-medx_v3.1-Q4_K_M.gguf) | Q4_K_M | 4.583 GB | medium, balanced quality - recommended | | [Llama-medx_v3.1-Q5_0.gguf](https://huggingface.co/tensorblock/Llama-medx_v3.1-GGUF/tree/main/Llama-medx_v3.1-Q5_0.gguf) | Q5_0 | 5.215 GB | legacy; medium, balanced quality - prefer using Q4_K_M | | [Llama-medx_v3.1-Q5_K_S.gguf](https://huggingface.co/tensorblock/Llama-medx_v3.1-GGUF/tree/main/Llama-medx_v3.1-Q5_K_S.gguf) | Q5_K_S | 5.215 GB | large, low quality loss - recommended | | [Llama-medx_v3.1-Q5_K_M.gguf](https://huggingface.co/tensorblock/Llama-medx_v3.1-GGUF/tree/main/Llama-medx_v3.1-Q5_K_M.gguf) | Q5_K_M | 5.339 GB | large, very low quality loss - recommended | | [Llama-medx_v3.1-Q6_K.gguf](https://huggingface.co/tensorblock/Llama-medx_v3.1-GGUF/tree/main/Llama-medx_v3.1-Q6_K.gguf) | Q6_K | 6.143 GB | very large, extremely low quality loss | | 
[Llama-medx_v3.1-Q8_0.gguf](https://huggingface.co/tensorblock/Llama-medx_v3.1-GGUF/tree/main/Llama-medx_v3.1-Q8_0.gguf) | Q8_0 | 7.954 GB | very large, extremely low quality loss - not recommended | ## Downloading instructions ### Command line First, install the Hugging Face Hub CLI ```shell pip install -U "huggingface_hub[cli]" ``` Then, download an individual model file to a local directory ```shell huggingface-cli download tensorblock/Llama-medx_v3.1-GGUF --include "Llama-medx_v3.1-Q2_K.gguf" --local-dir MY_LOCAL_DIR ``` If you want to download multiple model files with a pattern (e.g., `*Q4_K*gguf`), you can try: ```shell huggingface-cli download tensorblock/Llama-medx_v3.1-GGUF --local-dir MY_LOCAL_DIR --local-dir-use-symlinks False --include='*Q4_K*gguf' ```
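As a complement to the prompt template shown in the card above, this is a plain-string sketch of filling it in; the system and user text are purely illustrative.

```py
# The card's ChatML-style template, reproduced as a plain format string.
PROMPT_TEMPLATE = (
    "<|im_start|>system\n{system_prompt}<|im_end|>\n"
    "<|im_start|>user\n{prompt}<|im_end|>\n"
    "<|im_start|>assistant\n"
)

full_prompt = PROMPT_TEMPLATE.format(
    system_prompt="You are a careful medical assistant.",  # illustrative
    prompt="List common contraindications of ibuprofen.",  # illustrative
)
print(full_prompt)
```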
mradermacher/Ice0.32-10.11-RP-i1-GGUF
mradermacher
"2024-11-13T01:14:48Z"
0
0
null
[ "gguf", "region:us" ]
null
"2024-11-12T23:06:22Z"
<!-- ### quantize_version: 2 --> <!-- ### output_tensor_quantised: 1 --> <!-- ### convert_type: hf --> <!-- ### vocab_type: --> <!-- ### tags: nicoboss --> weighted/imatrix quants of https://huggingface.co/icefog72/Ice0.32-10.11-RP
tensorblock/lawma-8b-GGUF
tensorblock
"2024-11-13T00:02:46Z"
0
0
null
[ "gguf", "legal", "TensorBlock", "GGUF", "en", "dataset:ricdomolm/lawma-all-tasks", "base_model:ricdomolm/lawma-8b", "base_model:quantized:ricdomolm/lawma-8b", "license:mit", "region:us" ]
null
"2024-11-12T23:06:43Z"
--- datasets: - ricdomolm/lawma-all-tasks language: - en license: mit tags: - legal - TensorBlock - GGUF base_model: ricdomolm/lawma-8b --- <div style="width: auto; margin-left: auto; margin-right: auto"> <img src="https://i.imgur.com/jC7kdl8.jpeg" alt="TensorBlock" style="width: 100%; min-width: 400px; display: block; margin: auto;"> </div> <div style="display: flex; justify-content: space-between; width: 100%;"> <div style="display: flex; flex-direction: column; align-items: flex-start;"> <p style="margin-top: 0.5em; margin-bottom: 0em;"> Feedback and support: TensorBlock's <a href="https://x.com/tensorblock_aoi">Twitter/X</a>, <a href="https://t.me/TensorBlock">Telegram Group</a> and <a href="https://x.com/tensorblock_aoi">Discord server</a> </p> </div> </div> ## ricdomolm/lawma-8b - GGUF This repo contains GGUF format model files for [ricdomolm/lawma-8b](https://huggingface.co/ricdomolm/lawma-8b). The files were quantized using machines provided by [TensorBlock](https://tensorblock.co/), and they are compatible with llama.cpp as of [commit b4011](https://github.com/ggerganov/llama.cpp/commit/a6744e43e80f4be6398fc7733a01642c846dce1d). ## Prompt template ``` <|begin_of_text|><|start_header_id|>system<|end_header_id|> Cutting Knowledge Date: December 2023 Today Date: 26 Jul 2024 {system_prompt}<|eot_id|><|start_header_id|>user<|end_header_id|> {prompt}<|eot_id|><|start_header_id|>assistant<|end_header_id|> ``` ## Model file specification | Filename | Quant type | File Size | Description | | -------- | ---------- | --------- | ----------- | | [lawma-8b-Q2_K.gguf](https://huggingface.co/tensorblock/lawma-8b-GGUF/tree/main/lawma-8b-Q2_K.gguf) | Q2_K | 2.961 GB | smallest, significant quality loss - not recommended for most purposes | | [lawma-8b-Q3_K_S.gguf](https://huggingface.co/tensorblock/lawma-8b-GGUF/tree/main/lawma-8b-Q3_K_S.gguf) | Q3_K_S | 3.413 GB | very small, high quality loss | | [lawma-8b-Q3_K_M.gguf](https://huggingface.co/tensorblock/lawma-8b-GGUF/tree/main/lawma-8b-Q3_K_M.gguf) | Q3_K_M | 3.743 GB | very small, high quality loss | | [lawma-8b-Q3_K_L.gguf](https://huggingface.co/tensorblock/lawma-8b-GGUF/tree/main/lawma-8b-Q3_K_L.gguf) | Q3_K_L | 4.025 GB | small, substantial quality loss | | [lawma-8b-Q4_0.gguf](https://huggingface.co/tensorblock/lawma-8b-GGUF/tree/main/lawma-8b-Q4_0.gguf) | Q4_0 | 4.341 GB | legacy; small, very high quality loss - prefer using Q3_K_M | | [lawma-8b-Q4_K_S.gguf](https://huggingface.co/tensorblock/lawma-8b-GGUF/tree/main/lawma-8b-Q4_K_S.gguf) | Q4_K_S | 4.370 GB | small, greater quality loss | | [lawma-8b-Q4_K_M.gguf](https://huggingface.co/tensorblock/lawma-8b-GGUF/tree/main/lawma-8b-Q4_K_M.gguf) | Q4_K_M | 4.583 GB | medium, balanced quality - recommended | | [lawma-8b-Q5_0.gguf](https://huggingface.co/tensorblock/lawma-8b-GGUF/tree/main/lawma-8b-Q5_0.gguf) | Q5_0 | 5.215 GB | legacy; medium, balanced quality - prefer using Q4_K_M | | [lawma-8b-Q5_K_S.gguf](https://huggingface.co/tensorblock/lawma-8b-GGUF/tree/main/lawma-8b-Q5_K_S.gguf) | Q5_K_S | 5.215 GB | large, low quality loss - recommended | | [lawma-8b-Q5_K_M.gguf](https://huggingface.co/tensorblock/lawma-8b-GGUF/tree/main/lawma-8b-Q5_K_M.gguf) | Q5_K_M | 5.339 GB | large, very low quality loss - recommended | | [lawma-8b-Q6_K.gguf](https://huggingface.co/tensorblock/lawma-8b-GGUF/tree/main/lawma-8b-Q6_K.gguf) | Q6_K | 6.143 GB | very large, extremely low quality loss | | [lawma-8b-Q8_0.gguf](https://huggingface.co/tensorblock/lawma-8b-GGUF/tree/main/lawma-8b-Q8_0.gguf) | Q8_0 | 7.954 
GB | very large, extremely low quality loss - not recommended | ## Downloading instructions ### Command line First, install the Hugging Face Hub CLI ```shell pip install -U "huggingface_hub[cli]" ``` Then, download an individual model file to a local directory ```shell huggingface-cli download tensorblock/lawma-8b-GGUF --include "lawma-8b-Q2_K.gguf" --local-dir MY_LOCAL_DIR ``` If you want to download multiple model files with a pattern (e.g., `*Q4_K*gguf`), you can try: ```shell huggingface-cli download tensorblock/lawma-8b-GGUF --local-dir MY_LOCAL_DIR --local-dir-use-symlinks False --include='*Q4_K*gguf' ```
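The card's download instructions use the CLI; an equivalent sketch with the `huggingface_hub` Python API follows, where the chosen filename is just one of the quants listed in the table above.

```py
from huggingface_hub import hf_hub_download

# Download a single quant file; returns the local path.
path = hf_hub_download(
    repo_id="tensorblock/lawma-8b-GGUF",
    filename="lawma-8b-Q4_K_M.gguf",  # any filename from the table above
    local_dir="MY_LOCAL_DIR",
)
print(path)
```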
glif-loradex-trainer/dham_dham_roadside
glif-loradex-trainer
"2024-11-12T23:08:00Z"
0
0
diffusers
[ "diffusers", "text-to-image", "template:sd-lora", "base_model:black-forest-labs/FLUX.1-dev", "base_model:finetune:black-forest-labs/FLUX.1-dev", "license:other", "region:us", "flux", "lora", "base_model:adapter:black-forest-labs/FLUX.1-dev" ]
text-to-image
"2024-11-12T23:07:22Z"
--- tags: - diffusers - text-to-image - template:sd-lora - base_model:black-forest-labs/FLUX.1-dev - base_model:finetune:black-forest-labs/FLUX.1-dev - license:other - region:us - flux - lora widget: - output: url: samples/1731452738668__000003000_0.jpg text: TOK robot - output: url: samples/1731452760864__000003000_1.jpg text: TOK gengar - output: url: samples/1731452783084__000003000_2.jpg text: TOK hair salon - output: url: samples/1731452805298__000003000_3.jpg text: TOK monstertruck show - output: url: samples/1731452827518__000003000_4.jpg text: TOK eiffel tower base_model: black-forest-labs/FLUX.1-dev trigger: TOK instance_prompt: TOK license: other license_name: flux-1-dev-non-commercial-license license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md --- # dham_roadside Model trained with [AI Toolkit by Ostris](https://github.com/ostris/ai-toolkit) under the [Glif Loradex program](https://huggingface.co/glif-loradex-trainer) by [Glif](https://glif.app) user `dham`. <Gallery /> ## Trigger words You should use `TOK` to trigger the image generation. ## Download model Weights for this model are available in Safetensors format. [Download](/glif-loradex-trainer/dham_dham_roadside/tree/main) them in the Files & versions tab. ## License This model is licensed under the [flux-1-dev-non-commercial-license](https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md).
eurecom-ds/scoresdeve-ema-celeba-64-unconditional
eurecom-ds
"2024-11-12T23:16:57Z"
0
0
diffusers
[ "diffusers", "safetensors", "region:us" ]
null
"2024-11-12T23:07:41Z"
Entry not found
tttx/problem0_model_aug_30
tttx
"2024-11-12T23:53:18Z"
0
0
peft
[ "peft", "safetensors", "llama", "alignment-handbook", "trl", "sft", "generated_from_trainer", "dataset:tttx/problem0_data", "base_model:barc0/Llama-3.1-ARC-Potpourri-Transduction-8B", "base_model:adapter:barc0/Llama-3.1-ARC-Potpourri-Transduction-8B", "license:llama3.1", "region:us" ]
null
"2024-11-12T23:08:02Z"
--- base_model: barc0/Llama-3.1-ARC-Potpourri-Transduction-8B datasets: - tttx/problem0_data library_name: peft license: llama3.1 tags: - alignment-handbook - trl - sft - generated_from_trainer model-index: - name: problem0_model_aug_30 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # problem0_model_aug_30 This model is a fine-tuned version of [barc0/Llama-3.1-ARC-Potpourri-Transduction-8B](https://huggingface.co/barc0/Llama-3.1-ARC-Potpourri-Transduction-8B) on the tttx/problem0_data dataset. It achieves the following results on the evaluation set: - Loss: 0.2431 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - distributed_type: multi-GPU - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: cosine - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | 0.0836 | 1.0 | 75 | 0.1690 | | 0.0112 | 2.0 | 150 | 0.2431 | ### Framework versions - PEFT 0.13.2 - Transformers 4.47.0.dev0 - Pytorch 2.4.0+cu121 - Datasets 3.1.0 - Tokenizers 0.20.3
OhaymakingO/4-12112149-02Haymak
OhaymakingO
"2024-11-12T23:11:51Z"
0
0
null
[ "safetensors", "llama", "region:us" ]
null
"2024-11-12T23:08:10Z"
Entry not found
nitic-nlp-team/webnavix-llama-merged
nitic-nlp-team
"2024-11-13T01:11:53Z"
0
0
null
[ "safetensors", "llama", "text-generation", "en", "dataset:McGill-NLP/WebLINX-full", "base_model:nitic-nlp-team/webnavix-llama-ai-tools", "base_model:finetune:nitic-nlp-team/webnavix-llama-ai-tools", "license:apache-2.0", "region:us" ]
text-generation
"2024-11-12T23:08:29Z"
--- license: apache-2.0 datasets: - McGill-NLP/WebLINX-full language: - en base_model: - nitic-nlp-team/webnavix-llama-ai-tools - nitic-nlp-team/webnavix-llama-social-interaction - nitic-nlp-team/webnavix-llama-summarizing - nitic-nlp-team/webnavix-llama-information-lookup - nitic-nlp-team/webnavix-llama-composing - nitic-nlp-team/webnavix-llama-booking - nitic-nlp-team/webnavix-llama-shopping - nitic-nlp-team/webnavix-llama-task-management - nitic-nlp-team/webnavix-llama-shared pipeline_tag: text-generation ---
MayBashendy/Arabic_FineTuningAraBERT_AugV4_k10_task3_organization_fold1
MayBashendy
"2024-11-12T23:48:22Z"
0
0
transformers
[ "transformers", "safetensors", "bert", "text-classification", "generated_from_trainer", "base_model:aubmindlab/bert-base-arabertv02", "base_model:finetune:aubmindlab/bert-base-arabertv02", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
"2024-11-12T23:08:40Z"
--- library_name: transformers base_model: aubmindlab/bert-base-arabertv02 tags: - generated_from_trainer model-index: - name: Arabic_FineTuningAraBERT_AugV4_k10_task3_organization_fold1 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # Arabic_FineTuningAraBERT_AugV4_k10_task3_organization_fold1 This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.8539 - Qwk: -0.0233 - Mse: 0.8539 - Rmse: 0.9241 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse | |:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:| | No log | 0.0043 | 2 | 4.4234 | 0.0716 | 4.4234 | 2.1032 | | No log | 0.0087 | 4 | 1.3773 | 0.0363 | 1.3773 | 1.1736 | | No log | 0.0130 | 6 | 0.6226 | 0.1681 | 0.6226 | 0.7890 | | No log | 0.0174 | 8 | 0.6099 | 0.4211 | 0.6099 | 0.7810 | | No log | 0.0217 | 10 | 1.0508 | -0.0421 | 1.0508 | 1.0251 | | No log | 0.0260 | 12 | 1.2297 | 0.0 | 1.2297 | 1.1089 | | No log | 0.0304 | 14 | 1.0768 | 0.0 | 1.0768 | 1.0377 | | No log | 0.0347 | 16 | 0.9256 | 0.0 | 0.9256 | 0.9621 | | No log | 0.0390 | 18 | 0.9259 | 0.0 | 0.9259 | 0.9622 | | No log | 0.0434 | 20 | 0.5360 | 0.4211 | 0.5360 | 0.7321 | | No log | 0.0477 | 22 | 0.5159 | 0.4211 | 0.5159 | 0.7182 | | No log | 0.0521 | 24 | 0.9098 | 0.0 | 0.9098 | 0.9538 | | No log | 0.0564 | 26 | 1.2076 | 0.0 | 1.2076 | 1.0989 | | No log | 0.0607 | 28 | 1.0547 | 0.0 | 1.0547 | 1.0270 | | No log | 0.0651 | 30 | 1.0774 | 0.0 | 1.0774 | 1.0380 | | No log | 0.0694 | 32 | 1.2423 | 0.0 | 1.2423 | 1.1146 | | No log | 0.0738 | 34 | 1.5652 | -0.4667 | 1.5652 | 1.2511 | | No log | 0.0781 | 36 | 1.1867 | 0.0 | 1.1867 | 1.0893 | | No log | 0.0824 | 38 | 0.6791 | 0.0 | 0.6791 | 0.8241 | | No log | 0.0868 | 40 | 0.6129 | 0.2326 | 0.6129 | 0.7829 | | No log | 0.0911 | 42 | 0.6713 | 0.0 | 0.6713 | 0.8194 | | No log | 0.0954 | 44 | 0.7903 | 0.0 | 0.7903 | 0.8890 | | No log | 0.0998 | 46 | 1.1152 | 0.0 | 1.1152 | 1.0560 | | No log | 0.1041 | 48 | 1.1671 | 0.0 | 1.1671 | 1.0803 | | No log | 0.1085 | 50 | 0.7346 | -0.0233 | 0.7346 | 0.8571 | | No log | 0.1128 | 52 | 0.6166 | -0.0233 | 0.6166 | 0.7852 | | No log | 0.1171 | 54 | 0.6611 | -0.0233 | 0.6611 | 0.8131 | | No log | 0.1215 | 56 | 0.8505 | 0.0 | 0.8505 | 0.9222 | | No log | 0.1258 | 58 | 1.3081 | 0.0 | 1.3081 | 1.1437 | | No log | 0.1302 | 60 | 1.5708 | 0.2414 | 1.5708 | 1.2533 | | No log | 0.1345 | 62 | 1.3264 | 0.0 | 1.3264 | 1.1517 | | No log | 0.1388 | 64 | 1.1819 | 0.0 | 1.1819 | 1.0872 | | No log | 0.1432 | 66 | 1.2094 | 0.0 | 1.2094 | 1.0997 | | No log | 0.1475 | 68 | 1.2633 | 0.0 | 1.2633 | 1.1240 | | No log | 0.1518 | 70 | 1.3436 | 0.0 | 1.3436 | 1.1591 | | No log | 0.1562 | 72 | 1.0899 | 0.0 | 1.0899 | 1.0440 | | No log | 0.1605 | 74 | 1.0411 | 0.0 | 1.0411 | 1.0203 | | No log | 0.1649 | 76 | 1.0505 | 
0.0 | 1.0505 | 1.0250 |
| No log | 0.1692 | 78 | 0.9258 | 0.0 | 0.9258 | 0.9622 |
| No log | 0.1735 | 80 | 1.0697 | 0.0 | 1.0697 | 1.0343 |
| No log | 0.1779 | 82 | 0.9280 | 0.4211 | 0.9280 | 0.9633 |
| No log | 0.1822 | 84 | 0.8834 | 0.5926 | 0.8834 | 0.9399 |
| No log | 0.1866 | 86 | 1.4874 | 0.0 | 1.4874 | 1.2196 |
| No log | 0.1909 | 88 | 1.1199 | 0.0 | 1.1199 | 1.0582 |
| No log | 0.1952 | 90 | 1.0479 | 0.0 | 1.0479 | 1.0237 |
| No log | 0.1996 | 92 | 1.2919 | 0.0 | 1.2919 | 1.1366 |
| No log | 0.2039 | 94 | 1.5278 | 0.0 | 1.5278 | 1.2360 |
| No log | 0.2082 | 96 | 1.7297 | -0.3276 | 1.7297 | 1.3152 |
| No log | 0.2126 | 98 | 1.4582 | 0.0 | 1.4582 | 1.2076 |
| No log | 0.2169 | 100 | 1.2088 | 0.0 | 1.2088 | 1.0995 |
| No log | 0.2213 | 102 | 1.1461 | 0.0 | 1.1461 | 1.0706 |
| No log | 0.2256 | 104 | 0.9322 | 0.0 | 0.9322 | 0.9655 |
| No log | 0.2299 | 106 | 1.0467 | 0.0 | 1.0467 | 1.0231 |
| No log | 0.2343 | 108 | 1.6695 | 0.3803 | 1.6695 | 1.2921 |
| No log | 0.2386 | 110 | 1.8289 | 0.0833 | 1.8289 | 1.3524 |
| No log | 0.2430 | 112 | 1.4800 | -0.3883 | 1.4800 | 1.2166 |
| No log | 0.2473 | 114 | 0.9636 | 0.0 | 0.9636 | 0.9816 |
| No log | 0.2516 | 116 | 1.1585 | 0.0 | 1.1585 | 1.0763 |
| No log | 0.2560 | 118 | 1.4178 | -0.3883 | 1.4178 | 1.1907 |
| No log | 0.2603 | 120 | 1.7692 | 0.0774 | 1.7692 | 1.3301 |
| No log | 0.2646 | 122 | 1.6876 | 0.0774 | 1.6876 | 1.2991 |
| No log | 0.2690 | 124 | 1.7683 | 0.0774 | 1.7683 | 1.3298 |
| No log | 0.2733 | 126 | 1.9478 | 0.0774 | 1.9478 | 1.3956 |
| No log | 0.2777 | 128 | 2.1714 | -0.0206 | 2.1714 | 1.4736 |
| No log | 0.2820 | 130 | 1.7846 | 0.0704 | 1.7846 | 1.3359 |
| No log | 0.2863 | 132 | 1.5874 | -0.1085 | 1.5874 | 1.2599 |
| No log | 0.2907 | 134 | 1.7491 | -0.1085 | 1.7491 | 1.3226 |
| No log | 0.2950 | 136 | 1.9704 | -0.0331 | 1.9704 | 1.4037 |
| No log | 0.2993 | 138 | 1.9853 | -0.0331 | 1.9853 | 1.4090 |
| No log | 0.3037 | 140 | 1.5625 | -0.3883 | 1.5625 | 1.2500 |
| No log | 0.3080 | 142 | 1.0280 | 0.0 | 1.0280 | 1.0139 |
| No log | 0.3124 | 144 | 1.0065 | 0.0 | 1.0065 | 1.0032 |
| No log | 0.3167 | 146 | 1.3350 | 0.0222 | 1.3350 | 1.1554 |
| No log | 0.3210 | 148 | 1.8351 | -0.0845 | 1.8351 | 1.3547 |
| No log | 0.3254 | 150 | 2.0992 | -0.0206 | 2.0992 | 1.4489 |
| No log | 0.3297 | 152 | 2.0106 | 0.0884 | 2.0106 | 1.4180 |
| No log | 0.3341 | 154 | 1.4389 | -0.3883 | 1.4389 | 1.1996 |
| No log | 0.3384 | 156 | 1.3249 | 0.0222 | 1.3249 | 1.1510 |
| No log | 0.3427 | 158 | 1.3651 | -0.3883 | 1.3651 | 1.1684 |
| No log | 0.3471 | 160 | 1.9597 | 0.0884 | 1.9597 | 1.3999 |
| No log | 0.3514 | 162 | 2.2204 | -0.0206 | 2.2204 | 1.4901 |
| No log | 0.3557 | 164 | 2.0051 | -0.3484 | 2.0051 | 1.4160 |
| No log | 0.3601 | 166 | 1.6200 | -0.4667 | 1.6200 | 1.2728 |
| No log | 0.3644 | 168 | 1.4351 | 0.0 | 1.4351 | 1.1980 |
| No log | 0.3688 | 170 | 1.3089 | 0.0 | 1.3089 | 1.1441 |
| No log | 0.3731 | 172 | 1.4994 | -0.4667 | 1.4994 | 1.2245 |
| No log | 0.3774 | 174 | 1.8123 | -0.2791 | 1.8123 | 1.3462 |
| No log | 0.3818 | 176 | 1.6965 | -0.3276 | 1.6965 | 1.3025 |
| No log | 0.3861 | 178 | 1.5872 | -0.3276 | 1.5872 | 1.2598 |
| No log | 0.3905 | 180 | 1.5810 | -0.1085 | 1.5810 | 1.2574 |
| No log | 0.3948 | 182 | 1.5772 | -0.1085 | 1.5772 | 1.2559 |
| No log | 0.3991 | 184 | 2.0910 | -0.0206 | 2.0910 | 1.4460 |
| No log | 0.4035 | 186 | 2.5121 | -0.0097 | 2.5121 | 1.5850 |
| No log | 0.4078 | 188 | 2.1942 | -0.0097 | 2.1942 | 1.4813 |
| No log | 0.4121 | 190 | 1.5090 | 0.2667 | 1.5090 | 1.2284 |
| No log | 0.4165 | 192 | 0.9127 | 0.0 | 0.9127 | 0.9554 |
| No log | 0.4208 | 194 | 0.7683 | 0.0 | 0.7683 | 0.8765 |
| No log | 0.4252 | 196 | 0.8166 | 0.0 | 0.8166 | 0.9037 |
| No log | 0.4295 | 198 | 1.0163 | 0.0 | 1.0163 | 1.0081 |
| No log | 0.4338 | 200 | 1.2260 | 0.0 | 1.2260 | 1.1072 |
| No log | 0.4382 | 202 | 1.2730 | 0.0 | 1.2730 | 1.1283 |
| No log | 0.4425 | 204 | 1.4769 | 0.2524 | 1.4769 | 1.2153 |
| No log | 0.4469 | 206 | 1.4511 | 0.0222 | 1.4511 | 1.2046 |
| No log | 0.4512 | 208 | 1.4500 | -0.3883 | 1.4500 | 1.2041 |
| No log | 0.4555 | 210 | 1.6449 | -0.2791 | 1.6449 | 1.2826 |
| No log | 0.4599 | 212 | 1.8602 | -0.0845 | 1.8602 | 1.3639 |
| No log | 0.4642 | 214 | 2.1845 | -0.0331 | 2.1845 | 1.4780 |
| No log | 0.4685 | 216 | 2.6248 | -0.0206 | 2.6248 | 1.6201 |
| No log | 0.4729 | 218 | 2.4775 | -0.0206 | 2.4775 | 1.5740 |
| No log | 0.4772 | 220 | 1.7767 | -0.1085 | 1.7767 | 1.3329 |
| No log | 0.4816 | 222 | 1.5185 | -0.4667 | 1.5185 | 1.2323 |
| No log | 0.4859 | 224 | 1.4856 | -0.4667 | 1.4856 | 1.2188 |
| No log | 0.4902 | 226 | 1.7511 | -0.0845 | 1.7511 | 1.3233 |
| No log | 0.4946 | 228 | 2.2073 | -0.0206 | 2.2073 | 1.4857 |
| No log | 0.4989 | 230 | 2.1623 | -0.0206 | 2.1623 | 1.4705 |
| No log | 0.5033 | 232 | 1.7745 | 0.0774 | 1.7745 | 1.3321 |
| No log | 0.5076 | 234 | 1.8624 | 0.0884 | 1.8624 | 1.3647 |
| No log | 0.5119 | 236 | 2.0133 | 0.0884 | 2.0133 | 1.4189 |
| No log | 0.5163 | 238 | 1.8023 | 0.0833 | 1.8023 | 1.3425 |
| No log | 0.5206 | 240 | 1.3704 | 0.0 | 1.3704 | 1.1706 |
| No log | 0.5249 | 242 | 1.3839 | 0.0 | 1.3839 | 1.1764 |
| No log | 0.5293 | 244 | 1.6512 | -0.1379 | 1.6512 | 1.2850 |
| No log | 0.5336 | 246 | 1.6268 | -0.4667 | 1.6268 | 1.2754 |
| No log | 0.5380 | 248 | 1.3589 | 0.0 | 1.3589 | 1.1657 |
| No log | 0.5423 | 250 | 1.2664 | 0.0 | 1.2664 | 1.1254 |
| No log | 0.5466 | 252 | 1.2838 | 0.0 | 1.2838 | 1.1331 |
| No log | 0.5510 | 254 | 1.5199 | -0.3883 | 1.5199 | 1.2329 |
| No log | 0.5553 | 256 | 1.8575 | 0.0774 | 1.8575 | 1.3629 |
| No log | 0.5597 | 258 | 1.8986 | 0.0774 | 1.8986 | 1.3779 |
| No log | 0.5640 | 260 | 2.0602 | 0.0884 | 2.0602 | 1.4353 |
| No log | 0.5683 | 262 | 1.8245 | 0.0774 | 1.8245 | 1.3508 |
| No log | 0.5727 | 264 | 2.1433 | -0.0206 | 2.1433 | 1.4640 |
| No log | 0.5770 | 266 | 2.0090 | 0.0884 | 2.0090 | 1.4174 |
| No log | 0.5813 | 268 | 1.6078 | -0.1379 | 1.6078 | 1.2680 |
| No log | 0.5857 | 270 | 1.1653 | 0.0 | 1.1653 | 1.0795 |
| No log | 0.5900 | 272 | 0.9898 | 0.0 | 0.9898 | 0.9949 |
| No log | 0.5944 | 274 | 1.2673 | 0.0 | 1.2673 | 1.1257 |
| No log | 0.5987 | 276 | 1.7040 | -0.1379 | 1.7040 | 1.3054 |
| No log | 0.6030 | 278 | 1.7996 | 0.0774 | 1.7996 | 1.3415 |
| No log | 0.6074 | 280 | 1.3944 | 0.2524 | 1.3944 | 1.1809 |
| No log | 0.6117 | 282 | 0.9260 | -0.0233 | 0.9260 | 0.9623 |
| No log | 0.6161 | 284 | 1.0102 | -0.0233 | 1.0102 | 1.0051 |
| No log | 0.6204 | 286 | 1.5825 | -0.1085 | 1.5825 | 1.2580 |
| No log | 0.6247 | 288 | 2.1399 | -0.0097 | 2.1399 | 1.4628 |
| No log | 0.6291 | 290 | 1.9827 | -0.1547 | 1.9827 | 1.4081 |
| No log | 0.6334 | 292 | 1.4652 | -0.4667 | 1.4652 | 1.2104 |
| No log | 0.6377 | 294 | 1.3204 | 0.0 | 1.3204 | 1.1491 |
| No log | 0.6421 | 296 | 1.4004 | 0.0 | 1.4004 | 1.1834 |
| No log | 0.6464 | 298 | 1.2925 | 0.0 | 1.2925 | 1.1369 |
| No log | 0.6508 | 300 | 1.3384 | 0.0 | 1.3384 | 1.1569 |
| No log | 0.6551 | 302 | 1.6259 | -0.1085 | 1.6259 | 1.2751 |
| No log | 0.6594 | 304 | 2.0655 | 0.0884 | 2.0655 | 1.4372 |
| No log | 0.6638 | 306 | 2.0902 | 0.0884 | 2.0902 | 1.4458 |
| No log | 0.6681 | 308 | 2.0809 | 0.0884 | 2.0809 | 1.4425 |
| No log | 0.6725 | 310 | 2.0531 | 0.0833 | 2.0531 | 1.4329 |
| No log | 0.6768 | 312 | 1.6806 | 0.0704 | 1.6806 | 1.2964 |
| No log | 0.6811 | 314 | 1.4962 | -0.1085 | 1.4962 | 1.2232 |
| No log | 0.6855 | 316 | 1.7977 | 0.0774 | 1.7977 | 1.3408 |
| No log | 0.6898 | 318 | 2.0285 | 0.0833 | 2.0285 | 1.4242 |
| No log | 0.6941 | 320 | 1.7995 | 0.0704 | 1.7995 | 1.3414 |
| No log | 0.6985 | 322 | 1.4052 | 0.0 | 1.4052 | 1.1854 |
| No log | 0.7028 | 324 | 1.2522 | 0.0 | 1.2522 | 1.1190 |
| No log | 0.7072 | 326 | 1.3075 | 0.0 | 1.3075 | 1.1435 |
| No log | 0.7115 | 328 | 1.1769 | 0.0 | 1.1769 | 1.0849 |
| No log | 0.7158 | 330 | 1.0567 | 0.0 | 1.0567 | 1.0280 |
| No log | 0.7202 | 332 | 1.0304 | 0.0 | 1.0304 | 1.0151 |
| No log | 0.7245 | 334 | 1.1211 | 0.0 | 1.1211 | 1.0588 |
| No log | 0.7289 | 336 | 1.4516 | -0.1379 | 1.4516 | 1.2048 |
| No log | 0.7332 | 338 | 1.6132 | 0.0774 | 1.6132 | 1.2701 |
| No log | 0.7375 | 340 | 1.4857 | -0.1379 | 1.4857 | 1.2189 |
| No log | 0.7419 | 342 | 1.9309 | 0.0833 | 1.9309 | 1.3896 |
| No log | 0.7462 | 344 | 2.3171 | -0.0097 | 2.3171 | 1.5222 |
| No log | 0.7505 | 346 | 2.0576 | -0.0097 | 2.0576 | 1.4344 |
| No log | 0.7549 | 348 | 1.4952 | 0.2524 | 1.4952 | 1.2228 |
| No log | 0.7592 | 350 | 1.1502 | 0.0 | 1.1502 | 1.0725 |
| No log | 0.7636 | 352 | 1.0051 | -0.0233 | 1.0051 | 1.0025 |
| No log | 0.7679 | 354 | 1.1469 | -0.0233 | 1.1469 | 1.0709 |
| No log | 0.7722 | 356 | 1.6551 | -0.1085 | 1.6551 | 1.2865 |
| No log | 0.7766 | 358 | 2.0374 | 0.0884 | 2.0374 | 1.4274 |
| No log | 0.7809 | 360 | 1.8607 | 0.0833 | 1.8607 | 1.3641 |
| No log | 0.7852 | 362 | 1.2728 | 0.0 | 1.2728 | 1.1282 |
| No log | 0.7896 | 364 | 0.9629 | -0.0233 | 0.9629 | 0.9813 |
| No log | 0.7939 | 366 | 1.0698 | -0.0233 | 1.0698 | 1.0343 |
| No log | 0.7983 | 368 | 1.5381 | -0.3276 | 1.5381 | 1.2402 |
| No log | 0.8026 | 370 | 1.8293 | 0.0774 | 1.8293 | 1.3525 |
| No log | 0.8069 | 372 | 1.6964 | 0.0774 | 1.6964 | 1.3025 |
| No log | 0.8113 | 374 | 1.4341 | -0.4444 | 1.4341 | 1.1975 |
| No log | 0.8156 | 376 | 1.4935 | -0.1379 | 1.4935 | 1.2221 |
| No log | 0.8200 | 378 | 1.4437 | -0.1748 | 1.4437 | 1.2016 |
| No log | 0.8243 | 380 | 1.3388 | -0.0233 | 1.3388 | 1.1571 |
| No log | 0.8286 | 382 | 1.3320 | -0.0233 | 1.3320 | 1.1541 |
| No log | 0.8330 | 384 | 1.4488 | -0.1748 | 1.4488 | 1.2037 |
| No log | 0.8373 | 386 | 1.7693 | 0.0833 | 1.7693 | 1.3301 |
| No log | 0.8416 | 388 | 2.0481 | -0.0206 | 2.0481 | 1.4311 |
| No log | 0.8460 | 390 | 1.9244 | 0.0884 | 1.9244 | 1.3872 |
| No log | 0.8503 | 392 | 1.4968 | -0.1748 | 1.4968 | 1.2234 |
| No log | 0.8547 | 394 | 1.0836 | -0.0233 | 1.0836 | 1.0410 |
| No log | 0.8590 | 396 | 0.9872 | -0.0233 | 0.9872 | 0.9936 |
| No log | 0.8633 | 398 | 1.1126 | 0.0 | 1.1126 | 1.0548 |
| No log | 0.8677 | 400 | 1.3267 | 0.0 | 1.3267 | 1.1518 |
| No log | 0.8720 | 402 | 1.5216 | 0.2667 | 1.5216 | 1.2335 |
| No log | 0.8764 | 404 | 1.7081 | -0.0845 | 1.7081 | 1.3069 |
| No log | 0.8807 | 406 | 1.6433 | -0.1085 | 1.6433 | 1.2819 |
| No log | 0.8850 | 408 | 1.4251 | -0.1786 | 1.4251 | 1.1938 |
| No log | 0.8894 | 410 | 1.5602 | -0.1159 | 1.5602 | 1.2491 |
| No log | 0.8937 | 412 | 1.7243 | -0.0645 | 1.7243 | 1.3131 |
| No log | 0.8980 | 414 | 1.6887 | -0.1085 | 1.6887 | 1.2995 |
| No log | 0.9024 | 416 | 1.5076 | 0.0 | 1.5076 | 1.2278 |
| No log | 0.9067 | 418 | 1.1594 | 0.0 | 1.1594 | 1.0768 |
| No log | 0.9111 | 420 | 1.0769 | 0.0 | 1.0769 | 1.0377 |
| No log | 0.9154 | 422 | 1.1745 | 0.0 | 1.1745 | 1.0838 |
| No log | 0.9197 | 424 | 1.2911 | 0.0 | 1.2911 | 1.1362 |
| No log | 0.9241 | 426 | 1.4073 | 0.0 | 1.4073 | 1.1863 |
| No log | 0.9284 | 428 | 1.3976 | 0.0 | 1.3976 | 1.1822 |
| No log | 0.9328 | 430 | 1.5538 | -0.4667 | 1.5538 | 1.2465 |
| No log | 0.9371 | 432 | 1.4362 | -0.4667 | 1.4362 | 1.1984 |
| No log | 0.9414 | 434 | 1.3939 | 0.0 | 1.3939 | 1.1806 |
| No log | 0.9458 | 436 | 1.4437 | 0.0 | 1.4437 | 1.2015 |
| No log | 0.9501 | 438 | 1.3292 | 0.0 | 1.3292 | 1.1529 |
| No log | 0.9544 | 440 | 1.1086 | 0.0 | 1.1086 | 1.0529 |
| No log | 0.9588 | 442 | 1.1104 | -0.0233 | 1.1104 | 1.0538 |
| No log | 0.9631 | 444 | 1.3013 | 0.2667 | 1.3013 | 1.1408 |
| No log | 0.9675 | 446 | 1.4906 | 0.2414 | 1.4906 | 1.2209 |
| No log | 0.9718 | 448 | 1.4099 | 0.2667 | 1.4099 | 1.1874 |
| No log | 0.9761 | 450 | 1.3728 | 0.0 | 1.3728 | 1.1717 |
| No log | 0.9805 | 452 | 1.2687 | 0.0 | 1.2687 | 1.1264 |
| No log | 0.9848 | 454 | 1.4675 | 0.2524 | 1.4675 | 1.2114 |
| No log | 0.9892 | 456 | 1.4358 | 0.2524 | 1.4358 | 1.1982 |
| No log | 0.9935 | 458 | 1.1124 | -0.0233 | 1.1124 | 1.0547 |
| No log | 0.9978 | 460 | 1.0625 | -0.0233 | 1.0625 | 1.0308 |
| No log | 1.0022 | 462 | 1.4112 | -0.1440 | 1.4112 | 1.1879 |
| No log | 1.0065 | 464 | 1.9673 | 0.0884 | 1.9673 | 1.4026 |
| No log | 1.0108 | 466 | 1.9562 | 0.0884 | 1.9562 | 1.3986 |
| No log | 1.0152 | 468 | 1.5438 | -0.1085 | 1.5438 | 1.2425 |
| No log | 1.0195 | 470 | 1.4215 | 0.2667 | 1.4215 | 1.1923 |
| No log | 1.0239 | 472 | 1.3726 | 0.0 | 1.3726 | 1.1716 |
| No log | 1.0282 | 474 | 1.2539 | 0.0 | 1.2539 | 1.1198 |
| No log | 1.0325 | 476 | 1.2641 | 0.0 | 1.2641 | 1.1243 |
| No log | 1.0369 | 478 | 1.2229 | 0.0 | 1.2229 | 1.1059 |
| No log | 1.0412 | 480 | 1.3420 | 0.0 | 1.3420 | 1.1585 |
| No log | 1.0456 | 482 | 1.4815 | 0.2414 | 1.4815 | 1.2172 |
| No log | 1.0499 | 484 | 1.2227 | -0.0233 | 1.2227 | 1.1058 |
| No log | 1.0542 | 486 | 1.1879 | -0.0233 | 1.1879 | 1.0899 |
| No log | 1.0586 | 488 | 1.3236 | 0.0 | 1.3236 | 1.1505 |
| No log | 1.0629 | 490 | 1.5454 | 0.0222 | 1.5454 | 1.2431 |
| No log | 1.0672 | 492 | 1.4918 | 0.0 | 1.4918 | 1.2214 |
| No log | 1.0716 | 494 | 1.1545 | 0.0 | 1.1545 | 1.0745 |
| No log | 1.0759 | 496 | 0.7991 | 0.1895 | 0.7991 | 0.8939 |
| No log | 1.0803 | 498 | 0.7245 | 0.1895 | 0.7245 | 0.8512 |
| 0.3463 | 1.0846 | 500 | 0.8309 | -0.0233 | 0.8309 | 0.9116 |
| 0.3463 | 1.0889 | 502 | 1.1768 | 0.0 | 1.1768 | 1.0848 |
| 0.3463 | 1.0933 | 504 | 1.6191 | 0.0222 | 1.6191 | 1.2724 |
| 0.3463 | 1.0976 | 506 | 1.6710 | -0.3276 | 1.6710 | 1.2927 |
| 0.3463 | 1.1020 | 508 | 1.3873 | 0.0 | 1.3873 | 1.1778 |
| 0.3463 | 1.1063 | 510 | 1.1005 | 0.0 | 1.1005 | 1.0490 |
| 0.3463 | 1.1106 | 512 | 1.1698 | 0.0 | 1.1698 | 1.0816 |
| 0.3463 | 1.1150 | 514 | 1.2342 | 0.0 | 1.2342 | 1.1109 |
| 0.3463 | 1.1193 | 516 | 1.2707 | 0.0 | 1.2707 | 1.1272 |
| 0.3463 | 1.1236 | 518 | 1.4036 | 0.0 | 1.4036 | 1.1848 |
| 0.3463 | 1.1280 | 520 | 1.5334 | -0.3883 | 1.5334 | 1.2383 |
| 0.3463 | 1.1323 | 522 | 1.4008 | 0.0 | 1.4008 | 1.1835 |
| 0.3463 | 1.1367 | 524 | 1.2868 | 0.0 | 1.2868 | 1.1344 |
| 0.3463 | 1.1410 | 526 | 1.0735 | 0.0 | 1.0735 | 1.0361 |
| 0.3463 | 1.1453 | 528 | 1.0508 | 0.0 | 1.0508 | 1.0251 |
| 0.3463 | 1.1497 | 530 | 1.2354 | 0.0 | 1.2354 | 1.1115 |
| 0.3463 | 1.1540 | 532 | 1.1591 | -0.0233 | 1.1591 | 1.0766 |
| 0.3463 | 1.1584 | 534 | 1.1645 | -0.0233 | 1.1645 | 1.0791 |
| 0.3463 | 1.1627 | 536 | 1.3183 | 0.0 | 1.3183 | 1.1482 |
| 0.3463 | 1.1670 | 538 | 1.6206 | -0.1085 | 1.6206 | 1.2730 |
| 0.3463 | 1.1714 | 540 | 1.6710 | -0.0845 | 1.6710 | 1.2927 |
| 0.3463 | 1.1757 | 542 | 1.4234 | 0.0 | 1.4234 | 1.1931 |
| 0.3463 | 1.1800 | 544 | 1.1868 | 0.0 | 1.1868 | 1.0894 |
| 0.3463 | 1.1844 | 546 | 0.9796 | -0.0233 | 0.9796 | 0.9897 |
| 0.3463 | 1.1887 | 548 | 0.9470 | -0.0233 | 0.9470 | 0.9732 |
| 0.3463 | 1.1931 | 550 | 1.0902 | 0.0 | 1.0902 | 1.0441 |
| 0.3463 | 1.1974 | 552 | 1.2104 | 0.0 | 1.2104 | 1.1002 |
| 0.3463 | 1.2017 | 554 | 1.4058 | 0.0 | 1.4058 | 1.1857 |
| 0.3463 | 1.2061 | 556 | 1.4588 | -0.4667 | 1.4588 | 1.2078 |
| 0.3463 | 1.2104 | 558 | 1.3757 | -0.0233 | 1.3757 | 1.1729 |
| 0.3463 | 1.2148 | 560 | 1.2870 | -0.0233 | 1.2870 | 1.1345 |
| 0.3463 | 1.2191 | 562 | 1.5279 | -0.1379 | 1.5279 | 1.2361 |
| 0.3463 | 1.2234 | 564 | 1.7192 | -0.0645 | 1.7192 | 1.3112 |
| 0.3463 | 1.2278 | 566 | 1.6736 | -0.0845 | 1.6736 | 1.2937 |
| 0.3463 | 1.2321 | 568 | 1.3126 | 0.0 | 1.3126 | 1.1457 |
| 0.3463 | 1.2364 | 570 | 1.1570 | 0.0 | 1.1570 | 1.0756 |
| 0.3463 | 1.2408 | 572 | 1.1468 | 0.0 | 1.1468 | 1.0709 |
| 0.3463 | 1.2451 | 574 | 1.4021 | 0.2667 | 1.4021 | 1.1841 |
| 0.3463 | 1.2495 | 576 | 1.4577 | -0.1748 | 1.4577 | 1.2073 |
| 0.3463 | 1.2538 | 578 | 1.2193 | 0.2222 | 1.2193 | 1.1042 |
| 0.3463 | 1.2581 | 580 | 1.1722 | 0.2667 | 1.1722 | 1.0827 |
| 0.3463 | 1.2625 | 582 | 1.4227 | 0.2667 | 1.4227 | 1.1928 |
| 0.3463 | 1.2668 | 584 | 1.4544 | 0.2667 | 1.4544 | 1.2060 |
| 0.3463 | 1.2711 | 586 | 1.2029 | 0.2667 | 1.2029 | 1.0968 |
| 0.3463 | 1.2755 | 588 | 1.1581 | 0.0 | 1.1581 | 1.0761 |
| 0.3463 | 1.2798 | 590 | 1.2813 | 0.2667 | 1.2813 | 1.1319 |
| 0.3463 | 1.2842 | 592 | 1.2462 | 0.2667 | 1.2462 | 1.1163 |
| 0.3463 | 1.2885 | 594 | 1.4218 | 0.2667 | 1.4218 | 1.1924 |
| 0.3463 | 1.2928 | 596 | 1.6536 | 0.0704 | 1.6536 | 1.2859 |
| 0.3463 | 1.2972 | 598 | 1.6292 | 0.2414 | 1.6292 | 1.2764 |
| 0.3463 | 1.3015 | 600 | 1.3909 | 0.2667 | 1.3909 | 1.1794 |
| 0.3463 | 1.3059 | 602 | 1.3099 | 0.0 | 1.3099 | 1.1445 |
| 0.3463 | 1.3102 | 604 | 1.1187 | 0.0 | 1.1187 | 1.0577 |
| 0.3463 | 1.3145 | 606 | 1.0964 | 0.0 | 1.0964 | 1.0471 |
| 0.3463 | 1.3189 | 608 | 1.1636 | 0.0 | 1.1636 | 1.0787 |
| 0.3463 | 1.3232 | 610 | 1.2400 | 0.0 | 1.2400 | 1.1135 |
| 0.3463 | 1.3275 | 612 | 1.3683 | 0.2667 | 1.3683 | 1.1697 |
| 0.3463 | 1.3319 | 614 | 1.3660 | 0.2667 | 1.3660 | 1.1688 |
| 0.3463 | 1.3362 | 616 | 1.6641 | -0.1085 | 1.6641 | 1.2900 |
| 0.3463 | 1.3406 | 618 | 1.6704 | -0.1085 | 1.6704 | 1.2925 |
| 0.3463 | 1.3449 | 620 | 1.5779 | -0.1085 | 1.5779 | 1.2561 |
| 0.3463 | 1.3492 | 622 | 1.1583 | 0.0 | 1.1583 | 1.0763 |
| 0.3463 | 1.3536 | 624 | 1.0025 | -0.0233 | 1.0025 | 1.0012 |
| 0.3463 | 1.3579 | 626 | 1.1802 | 0.0 | 1.1802 | 1.0864 |
| 0.3463 | 1.3623 | 628 | 1.3927 | 0.0 | 1.3927 | 1.1801 |
| 0.3463 | 1.3666 | 630 | 1.2981 | 0.0 | 1.2981 | 1.1393 |
| 0.3463 | 1.3709 | 632 | 1.1072 | 0.0 | 1.1072 | 1.0522 |
| 0.3463 | 1.3753 | 634 | 0.9739 | -0.0233 | 0.9739 | 0.9868 |
| 0.3463 | 1.3796 | 636 | 1.0549 | 0.0 | 1.0549 | 1.0271 |
| 0.3463 | 1.3839 | 638 | 1.2672 | 0.0 | 1.2672 | 1.1257 |
| 0.3463 | 1.3883 | 640 | 1.2444 | 0.0 | 1.2444 | 1.1155 |
| 0.3463 | 1.3926 | 642 | 0.9734 | -0.0233 | 0.9734 | 0.9866 |
| 0.3463 | 1.3970 | 644 | 0.8677 | 0.1895 | 0.8677 | 0.9315 |
| 0.3463 | 1.4013 | 646 | 0.9745 | -0.0233 | 0.9745 | 0.9872 |
| 0.3463 | 1.4056 | 648 | 1.3388 | 0.0 | 1.3388 | 1.1571 |
| 0.3463 | 1.4100 | 650 | 1.6478 | 0.2524 | 1.6478 | 1.2837 |
| 0.3463 | 1.4143 | 652 | 1.6388 | 0.2667 | 1.6388 | 1.2802 |
| 0.3463 | 1.4187 | 654 | 1.4562 | 0.0 | 1.4562 | 1.2067 |
| 0.3463 | 1.4230 | 656 | 1.2053 | 0.0 | 1.2053 | 1.0979 |
| 0.3463 | 1.4273 | 658 | 0.9941 | 0.0 | 0.9941 | 0.9970 |
| 0.3463 | 1.4317 | 660 | 0.8480 | -0.0233 | 0.8480 | 0.9209 |
| 0.3463 | 1.4360 | 662 | 0.8432 | -0.0233 | 0.8432 | 0.9182 |
| 0.3463 | 1.4403 | 664 | 0.9911 | 0.0 | 0.9911 | 0.9955 |
| 0.3463 | 1.4447 | 666 | 1.3010 | 0.0 | 1.3010 | 1.1406 |
| 0.3463 | 1.4490 | 668 | 1.2951 | 0.0 | 1.2951 | 1.1380 |
| 0.3463 | 1.4534 | 670 | 1.0364 | -0.0233 | 1.0364 | 1.0180 |
| 0.3463 | 1.4577 | 672 | 1.0176 | -0.0233 | 1.0176 | 1.0088 |
| 0.3463 | 1.4620 | 674 | 1.1974 | -0.0233 | 1.1974 | 1.0943 |
| 0.3463 | 1.4664 | 676 | 1.4640 | -0.1379 | 1.4640 | 1.2100 |
| 0.3463 | 1.4707 | 678 | 1.6976 | -0.0845 | 1.6976 | 1.3029 |
| 0.3463 | 1.4751 | 680 | 1.6271 | 0.0774 | 1.6271 | 1.2756 |
| 0.3463 | 1.4794 | 682 | 1.2338 | -0.0233 | 1.2338 | 1.1107 |
| 0.3463 | 1.4837 | 684 | 0.9712 | -0.0233 | 0.9712 | 0.9855 |
| 0.3463 | 1.4881 | 686 | 0.9968 | -0.0233 | 0.9968 | 0.9984 |
| 0.3463 | 1.4924 | 688 | 1.1519 | 0.0 | 1.1519 | 1.0733 |
| 0.3463 | 1.4967 | 690 | 1.2989 | 0.0 | 1.2989 | 1.1397 |
| 0.3463 | 1.5011 | 692 | 1.1985 | 0.0 | 1.1985 | 1.0948 |
| 0.3463 | 1.5054 | 694 | 1.0473 | 0.0 | 1.0473 | 1.0234 |
| 0.3463 | 1.5098 | 696 | 1.0715 | 0.0 | 1.0715 | 1.0351 |
| 0.3463 | 1.5141 | 698 | 1.1690 | 0.0 | 1.1690 | 1.0812 |
| 0.3463 | 1.5184 | 700 | 1.1487 | 0.0 | 1.1487 | 1.0718 |
| 0.3463 | 1.5228 | 702 | 1.0480 | 0.0 | 1.0480 | 1.0237 |
| 0.3463 | 1.5271 | 704 | 1.0045 | -0.0233 | 1.0045 | 1.0022 |
| 0.3463 | 1.5315 | 706 | 1.1208 | -0.0233 | 1.1208 | 1.0587 |
| 0.3463 | 1.5358 | 708 | 1.3223 | 0.0 | 1.3223 | 1.1499 |
| 0.3463 | 1.5401 | 710 | 1.3759 | 0.0 | 1.3759 | 1.1730 |
| 0.3463 | 1.5445 | 712 | 1.2231 | 0.0 | 1.2231 | 1.1060 |
| 0.3463 | 1.5488 | 714 | 1.1786 | -0.0233 | 1.1786 | 1.0856 |
| 0.3463 | 1.5531 | 716 | 1.0765 | -0.0233 | 1.0765 | 1.0375 |
| 0.3463 | 1.5575 | 718 | 1.0925 | -0.0233 | 1.0925 | 1.0452 |
| 0.3463 | 1.5618 | 720 | 1.2725 | 0.0 | 1.2725 | 1.1281 |
| 0.3463 | 1.5662 | 722 | 1.5095 | 0.2667 | 1.5095 | 1.2286 |
| 0.3463 | 1.5705 | 724 | 1.4552 | 0.2667 | 1.4552 | 1.2063 |
| 0.3463 | 1.5748 | 726 | 1.2642 | 0.0 | 1.2642 | 1.1244 |
| 0.3463 | 1.5792 | 728 | 1.2485 | 0.0 | 1.2485 | 1.1173 |
| 0.3463 | 1.5835 | 730 | 1.4452 | 0.2667 | 1.4452 | 1.2021 |
| 0.3463 | 1.5879 | 732 | 1.6962 | -0.1085 | 1.6962 | 1.3024 |
| 0.3463 | 1.5922 | 734 | 1.5540 | -0.1379 | 1.5540 | 1.2466 |
| 0.3463 | 1.5965 | 736 | 1.1297 | 0.0 | 1.1297 | 1.0629 |
| 0.3463 | 1.6009 | 738 | 0.9710 | -0.0233 | 0.9710 | 0.9854 |
| 0.3463 | 1.6052 | 740 | 1.0593 | -0.0233 | 1.0593 | 1.0292 |
| 0.3463 | 1.6095 | 742 | 1.1776 | 0.0 | 1.1776 | 1.0852 |
| 0.3463 | 1.6139 | 744 | 1.4458 | 0.2524 | 1.4458 | 1.2024 |
| 0.3463 | 1.6182 | 746 | 1.5119 | 0.2524 | 1.5119 | 1.2296 |
| 0.3463 | 1.6226 | 748 | 1.3224 | 0.0 | 1.3224 | 1.1500 |
| 0.3463 | 1.6269 | 750 | 1.2498 | 0.0 | 1.2498 | 1.1179 |
| 0.3463 | 1.6312 | 752 | 1.0874 | 0.0 | 1.0874 | 1.0428 |
| 0.3463 | 1.6356 | 754 | 1.0872 | 0.0 | 1.0872 | 1.0427 |
| 0.3463 | 1.6399 | 756 | 1.0450 | 0.0 | 1.0450 | 1.0223 |
| 0.3463 | 1.6443 | 758 | 1.0152 | 0.0 | 1.0152 | 1.0076 |
| 0.3463 | 1.6486 | 760 | 1.2104 | 0.0 | 1.2104 | 1.1002 |
| 0.3463 | 1.6529 | 762 | 1.4131 | 0.0222 | 1.4131 | 1.1887 |
| 0.3463 | 1.6573 | 764 | 1.3726 | 0.0 | 1.3726 | 1.1716 |
| 0.3463 | 1.6616 | 766 | 1.4181 | 0.0 | 1.4181 | 1.1908 |
| 0.3463 | 1.6659 | 768 | 1.3026 | 0.0 | 1.3026 | 1.1413 |
| 0.3463 | 1.6703 | 770 | 1.2815 | 0.0 | 1.2815 | 1.1320 |
| 0.3463 | 1.6746 | 772 | 1.1397 | 0.0 | 1.1397 | 1.0675 |
| 0.3463 | 1.6790 | 774 | 1.0563 | 0.0 | 1.0563 | 1.0278 |
| 0.3463 | 1.6833 | 776 | 0.9394 | -0.0233 | 0.9394 | 0.9692 |
| 0.3463 | 1.6876 | 778 | 1.0102 | 0.0 | 1.0102 | 1.0051 |
| 0.3463 | 1.6920 | 780 | 1.3065 | 0.0 | 1.3065 | 1.1430 |
| 0.3463 | 1.6963 | 782 | 1.5302 | 0.0 | 1.5302 | 1.2370 |
| 0.3463 | 1.7007 | 784 | 1.5108 | 0.0 | 1.5108 | 1.2292 |
| 0.3463 | 1.7050 | 786 | 1.2800 | 0.0 | 1.2800 | 1.1314 |
| 0.3463 | 1.7093 | 788 | 1.0152 | 0.0 | 1.0152 | 1.0076 |
| 0.3463 | 1.7137 | 790 | 0.9354 | -0.0233 | 0.9354 | 0.9672 |
| 0.3463 | 1.7180 | 792 | 0.9962 | 0.0 | 0.9962 | 0.9981 |
| 0.3463 | 1.7223 | 794 | 1.0126 | 0.0 | 1.0126 | 1.0063 |
| 0.3463 | 1.7267 | 796 | 1.0402 | 0.0 | 1.0402 | 1.0199 |
| 0.3463 | 1.7310 | 798 | 1.2211 | 0.0 | 1.2211 | 1.1050 |
| 0.3463 | 1.7354 | 800 | 1.4048 | 0.0 | 1.4048 | 1.1853 |
| 0.3463 | 1.7397 | 802 | 1.2934 | 0.0 | 1.2934 | 1.1373 |
| 0.3463 | 1.7440 | 804 | 0.9672 | -0.0233 | 0.9672 | 0.9835 |
| 0.3463 | 1.7484 | 806 | 0.8149 | -0.0421 | 0.8149 | 0.9027 |
| 0.3463 | 1.7527 | 808 | 0.8751 | -0.0233 | 0.8751 | 0.9354 |
| 0.3463 | 1.7570 | 810 | 1.1778 | 0.0 | 1.1778 | 1.0852 |
| 0.3463 | 1.7614 | 812 | 1.3544 | 0.0 | 1.3544 | 1.1638 |
| 0.3463 | 1.7657 | 814 | 1.2063 | 0.0 | 1.2063 | 1.0983 |
| 0.3463 | 1.7701 | 816 | 0.9585 | 0.0 | 0.9585 | 0.9791 |
| 0.3463 | 1.7744 | 818 | 0.9279 | 0.0 | 0.9279 | 0.9633 |
| 0.3463 | 1.7787 | 820 | 0.9084 | 0.0 | 0.9084 | 0.9531 |
| 0.3463 | 1.7831 | 822 | 1.0741 | 0.0 | 1.0741 | 1.0364 |
| 0.3463 | 1.7874 | 824 | 1.2224 | 0.0 | 1.2224 | 1.1056 |
| 0.3463 | 1.7918 | 826 | 1.4121 | -0.1748 | 1.4121 | 1.1883 |
| 0.3463 | 1.7961 | 828 | 1.3432 | -0.1748 | 1.3432 | 1.1590 |
| 0.3463 | 1.8004 | 830 | 1.1561 | -0.0233 | 1.1561 | 1.0752 |
| 0.3463 | 1.8048 | 832 | 1.1993 | 0.0 | 1.1993 | 1.0951 |
| 0.3463 | 1.8091 | 834 | 1.2415 | 0.0 | 1.2415 | 1.1142 |
| 0.3463 | 1.8134 | 836 | 1.3206 | 0.0 | 1.3206 | 1.1492 |
| 0.3463 | 1.8178 | 838 | 1.4137 | 0.0 | 1.4137 | 1.1890 |
| 0.3463 | 1.8221 | 840 | 1.2750 | 0.0 | 1.2750 | 1.1292 |
| 0.3463 | 1.8265 | 842 | 1.0160 | 0.0 | 1.0160 | 1.0080 |
| 0.3463 | 1.8308 | 844 | 0.8670 | -0.0421 | 0.8670 | 0.9311 |
| 0.3463 | 1.8351 | 846 | 0.9231 | -0.0421 | 0.9231 | 0.9608 |
| 0.3463 | 1.8395 | 848 | 1.2399 | 0.0 | 1.2399 | 1.1135 |
| 0.3463 | 1.8438 | 850 | 1.4550 | -0.1748 | 1.4550 | 1.2062 |
| 0.3463 | 1.8482 | 852 | 1.5777 | -0.1379 | 1.5777 | 1.2561 |
| 0.3463 | 1.8525 | 854 | 1.5185 | 0.2667 | 1.5185 | 1.2323 |
| 0.3463 | 1.8568 | 856 | 1.3958 | 0.0 | 1.3958 | 1.1814 |
| 0.3463 | 1.8612 | 858 | 1.1570 | 0.0 | 1.1570 | 1.0757 |
| 0.3463 | 1.8655 | 860 | 0.9342 | -0.0233 | 0.9342 | 0.9666 |
| 0.3463 | 1.8698 | 862 | 0.7772 | 0.1239 | 0.7772 | 0.8816 |
| 0.3463 | 1.8742 | 864 | 0.7914 | 0.1239 | 0.7914 | 0.8896 |
| 0.3463 | 1.8785 | 866 | 0.9179 | -0.0233 | 0.9179 | 0.9581 |
| 0.3463 | 1.8829 | 868 | 1.2861 | 0.0 | 1.2861 | 1.1341 |
| 0.3463 | 1.8872 | 870 | 1.4886 | 0.0 | 1.4886 | 1.2201 |
| 0.3463 | 1.8915 | 872 | 1.4057 | 0.0 | 1.4057 | 1.1856 |
| 0.3463 | 1.8959 | 874 | 1.1160 | 0.0 | 1.1160 | 1.0564 |
| 0.3463 | 1.9002 | 876 | 0.8410 | -0.0577 | 0.8410 | 0.9171 |
| 0.3463 | 1.9046 | 878 | 0.7966 | -0.0820 | 0.7966 | 0.8925 |
| 0.3463 | 1.9089 | 880 | 0.8508 | -0.0577 | 0.8508 | 0.9224 |
| 0.3463 | 1.9132 | 882 | 0.9597 | -0.0421 | 0.9597 | 0.9797 |
| 0.3463 | 1.9176 | 884 | 1.2128 | 0.0 | 1.2128 | 1.1013 |
| 0.3463 | 1.9219 | 886 | 1.2280 | 0.0 | 1.2280 | 1.1081 |
| 0.3463 | 1.9262 | 888 | 1.2573 | 0.0 | 1.2573 | 1.1213 |
| 0.3463 | 1.9306 | 890 | 1.2172 | 0.0 | 1.2172 | 1.1033 |
| 0.3463 | 1.9349 | 892 | 1.1830 | 0.0 | 1.1830 | 1.0876 |
| 0.3463 | 1.9393 | 894 | 1.2522 | 0.0 | 1.2522 | 1.1190 |
| 0.3463 | 1.9436 | 896 | 1.2324 | 0.0 | 1.2324 | 1.1101 |
| 0.3463 | 1.9479 | 898 | 1.0497 | 0.0 | 1.0497 | 1.0245 |
| 0.3463 | 1.9523 | 900 | 0.8537 | -0.0233 | 0.8537 | 0.9240 |
| 0.3463 | 1.9566 | 902 | 0.8609 | -0.0233 | 0.8609 | 0.9279 |
| 0.3463 | 1.9610 | 904 | 1.0522 | 0.0 | 1.0522 | 1.0258 |
| 0.3463 | 1.9653 | 906 | 1.3958 | 0.2667 | 1.3958 | 1.1814 |
| 0.3463 | 1.9696 | 908 | 1.5277 | -0.1379 | 1.5277 | 1.2360 |
| 0.3463 | 1.9740 | 910 | 1.3366 | 0.0 | 1.3366 | 1.1561 |
| 0.3463 | 1.9783 | 912 | 0.9455 | 0.0 | 0.9455 | 0.9723 |
| 0.3463 | 1.9826 | 914 | 0.8080 | -0.0577 | 0.8080 | 0.8989 |
| 0.3463 | 1.9870 | 916 | 0.8843 | -0.0233 | 0.8843 | 0.9404 |
| 0.3463 | 1.9913 | 918 | 1.1738 | 0.0 | 1.1738 | 1.0834 |
| 0.3463 | 1.9957 | 920 | 1.2958 | 0.0 | 1.2958 | 1.1383 |
| 0.3463 | 2.0 | 922 | 1.2135 | 0.0 | 1.2135 | 1.1016 |
| 0.3463 | 2.0043 | 924 | 1.0429 | 0.0 | 1.0429 | 1.0212 |
| 0.3463 | 2.0087 | 926 | 1.0784 | 0.0 | 1.0784 | 1.0384 |
| 0.3463 | 2.0130 | 928 | 1.1758 | 0.0 | 1.1758 | 1.0843 |
| 0.3463 | 2.0174 | 930 | 1.3403 | 0.0 | 1.3403 | 1.1577 |
| 0.3463 | 2.0217 | 932 | 1.3185 | 0.0 | 1.3185 | 1.1483 |
| 0.3463 | 2.0260 | 934 | 1.1297 | 0.0 | 1.1297 | 1.0629 |
| 0.3463 | 2.0304 | 936 | 0.9506 | 0.0 | 0.9506 | 0.9750 |
| 0.3463 | 2.0347 | 938 | 0.9603 | 0.0 | 0.9603 | 0.9799 |
| 0.3463 | 2.0390 | 940 | 1.0258 | 0.0 | 1.0258 | 1.0128 |
| 0.3463 | 2.0434 | 942 | 1.0212 | 0.0 | 1.0212 | 1.0106 |
| 0.3463 | 2.0477 | 944 | 1.1506 | 0.0 | 1.1506 | 1.0727 |
| 0.3463 | 2.0521 | 946 | 1.2475 | 0.0 | 1.2475 | 1.1169 |
| 0.3463 | 2.0564 | 948 | 1.4185 | 0.0 | 1.4185 | 1.1910 |
| 0.3463 | 2.0607 | 950 | 1.3732 | 0.0 | 1.3732 | 1.1718 |
| 0.3463 | 2.0651 | 952 | 1.2374 | 0.0 | 1.2374 | 1.1124 |
| 0.3463 | 2.0694 | 954 | 1.1162 | 0.0 | 1.1162 | 1.0565 |
| 0.3463 | 2.0738 | 956 | 1.0666 | 0.0 | 1.0666 | 1.0328 |
| 0.3463 | 2.0781 | 958 | 1.1446 | 0.0 | 1.1446 | 1.0698 |
| 0.3463 | 2.0824 | 960 | 1.0816 | 0.0 | 1.0816 | 1.0400 |
| 0.3463 | 2.0868 | 962 | 1.0951 | 0.0 | 1.0951 | 1.0464 |
| 0.3463 | 2.0911 | 964 | 1.2428 | 0.0 | 1.2428 | 1.1148 |
| 0.3463 | 2.0954 | 966 | 1.2578 | 0.0 | 1.2578 | 1.1215 |
| 0.3463 | 2.0998 | 968 | 1.1836 | 0.0 | 1.1836 | 1.0879 |
| 0.3463 | 2.1041 | 970 | 1.0334 | 0.0 | 1.0334 | 1.0166 |
| 0.3463 | 2.1085 | 972 | 1.0127 | 0.0 | 1.0127 | 1.0063 |
| 0.3463 | 2.1128 | 974 | 1.0483 | 0.0 | 1.0483 | 1.0239 |
| 0.3463 | 2.1171 | 976 | 1.1557 | 0.0 | 1.1557 | 1.0750 |
| 0.3463 | 2.1215 | 978 | 1.2049 | 0.0 | 1.2049 | 1.0977 |
| 0.3463 | 2.1258 | 980 | 1.3164 | 0.0 | 1.3164 | 1.1474 |
| 0.3463 | 2.1302 | 982 | 1.4028 | 0.2667 | 1.4028 | 1.1844 |
| 0.3463 | 2.1345 | 984 | 1.2709 | 0.0 | 1.2709 | 1.1274 |
| 0.3463 | 2.1388 | 986 | 1.1077 | 0.0 | 1.1077 | 1.0525 |
| 0.3463 | 2.1432 | 988 | 1.1841 | 0.0 | 1.1841 | 1.0882 |
| 0.3463 | 2.1475 | 990 | 1.3437 | 0.0 | 1.3437 | 1.1592 |
| 0.3463 | 2.1518 | 992 | 1.2643 | 0.0 | 1.2643 | 1.1244 |
| 0.3463 | 2.1562 | 994 | 0.9924 | 0.0 | 0.9924 | 0.9962 |
| 0.3463 | 2.1605 | 996 | 0.8436 | -0.0233 | 0.8436 | 0.9185 |
| 0.3463 | 2.1649 | 998 | 0.8934 | -0.0233 | 0.8934 | 0.9452 |
| 0.1306 | 2.1692 | 1000 | 1.0233 | 0.0 | 1.0233 | 1.0116 |
| 0.1306 | 2.1735 | 1002 | 1.2235 | 0.0 | 1.2235 | 1.1061 |
| 0.1306 | 2.1779 | 1004 | 1.2885 | 0.0 | 1.2885 | 1.1351 |
| 0.1306 | 2.1822 | 1006 | 1.3028 | 0.0 | 1.3028 | 1.1414 |
| 0.1306 | 2.1866 | 1008 | 1.1286 | 0.0 | 1.1286 | 1.0624 |
| 0.1306 | 2.1909 | 1010 | 0.8653 | -0.0233 | 0.8653 | 0.9302 |
| 0.1306 | 2.1952 | 1012 | 0.7905 | -0.2737 | 0.7905 | 0.8891 |
| 0.1306 | 2.1996 | 1014 | 0.8282 | -0.0233 | 0.8282 | 0.9101 |
| 0.1306 | 2.2039 | 1016 | 0.9851 | -0.0233 | 0.9851 | 0.9925 |
| 0.1306 | 2.2082 | 1018 | 1.2354 | 0.0 | 1.2354 | 1.1115 |
| 0.1306 | 2.2126 | 1020 | 1.3506 | 0.0 | 1.3506 | 1.1622 |
| 0.1306 | 2.2169 | 1022 | 1.2714 | 0.0 | 1.2714 | 1.1276 |
| 0.1306 | 2.2213 | 1024 | 1.0514 | -0.0233 | 1.0514 | 1.0254 |
| 0.1306 | 2.2256 | 1026 | 0.9020 | -0.2737 | 0.9020 | 0.9497 |
| 0.1306 | 2.2299 | 1028 | 0.8919 | -0.2623 | 0.8919 | 0.9444 |
| 0.1306 | 2.2343 | 1030 | 0.9642 | -0.2737 | 0.9642 | 0.9819 |
| 0.1306 | 2.2386 | 1032 | 1.1833 | -0.0233 | 1.1833 | 1.0878 |
| 0.1306 | 2.2430 | 1034 | 1.4572 | 0.0 | 1.4572 | 1.2071 |
| 0.1306 | 2.2473 | 1036 | 1.4909 | 0.0 | 1.4909 | 1.2210 |
| 0.1306 | 2.2516 | 1038 | 1.3251 | 0.0 | 1.3251 | 1.1511 |
| 0.1306 | 2.2560 | 1040 | 1.0489 | 0.0 | 1.0489 | 1.0242 |
| 0.1306 | 2.2603 | 1042 | 0.8534 | -0.0233 | 0.8534 | 0.9238 |
| 0.1306 | 2.2646 | 1044 | 0.8250 | -0.0233 | 0.8250 | 0.9083 |
| 0.1306 | 2.2690 | 1046 | 0.8860 | 0.0 | 0.8860 | 0.9413 |
| 0.1306 | 2.2733 | 1048 | 1.0238 | 0.0 | 1.0238 | 1.0118 |
| 0.1306 | 2.2777 | 1050 | 1.2366 | 0.0 | 1.2366 | 1.1120 |
| 0.1306 | 2.2820 | 1052 | 1.2547 | 0.0 | 1.2547 | 1.1201 |
| 0.1306 | 2.2863 | 1054 | 1.0583 | 0.0 | 1.0583 | 1.0287 |
| 0.1306 | 2.2907 | 1056 | 0.8672 | -0.0233 | 0.8672 | 0.9313 |
| 0.1306 | 2.2950 | 1058 | 0.8486 | -0.0233 | 0.8486 | 0.9212 |
| 0.1306 | 2.2993 | 1060 | 0.9334 | -0.0233 | 0.9334 | 0.9661 |
| 0.1306 | 2.3037 | 1062 | 1.1466 | 0.0 | 1.1466 | 1.0708 |
| 0.1306 | 2.3080 | 1064 | 1.2350 | 0.0 | 1.2350 | 1.1113 |
| 0.1306 | 2.3124 | 1066 | 1.1184 | 0.0 | 1.1184 | 1.0575 |
| 0.1306 | 2.3167 | 1068 | 0.9979 | 0.0 | 0.9979 | 0.9990 |
| 0.1306 | 2.3210 | 1070 | 0.9569 | 0.0 | 0.9569 | 0.9782 |
| 0.1306 | 2.3254 | 1072 | 0.9250 | 0.0 | 0.9250 | 0.9618 |
| 0.1306 | 2.3297 | 1074 | 0.9713 | 0.0 | 0.9713 | 0.9856 |
| 0.1306 | 2.3341 | 1076 | 1.1116 | 0.0 | 1.1116 | 1.0543 |
| 0.1306 | 2.3384 | 1078 | 1.1316 | 0.0 | 1.1316 | 1.0638 |
| 0.1306 | 2.3427 | 1080 | 1.0846 | 0.0 | 1.0846 | 1.0414 |
| 0.1306 | 2.3471 | 1082 | 0.9318 | 0.0 | 0.9318 | 0.9653 |
| 0.1306 | 2.3514 | 1084 | 0.8935 | 0.0 | 0.8935 | 0.9452 |
| 0.1306 | 2.3557 | 1086 | 0.9701 | 0.0 | 0.9701 | 0.9849 |
| 0.1306 | 2.3601 | 1088 | 0.9732 | 0.0 | 0.9732 | 0.9865 |
| 0.1306 | 2.3644 | 1090 | 0.9383 | 0.0 | 0.9383 | 0.9687 |
| 0.1306 | 2.3688 | 1092 | 0.9560 | 0.0 | 0.9560 | 0.9777 |
| 0.1306 | 2.3731 | 1094 | 0.9971 | 0.0 | 0.9971 | 0.9985 |
| 0.1306 | 2.3774 | 1096 | 1.0870 | 0.0 | 1.0870 | 1.0426 |
| 0.1306 | 2.3818 | 1098 | 1.1477 | 0.0 | 1.1477 | 1.0713 |
| 0.1306 | 2.3861 | 1100 | 1.1722 | 0.0 | 1.1722 | 1.0827 |
| 0.1306 | 2.3905 | 1102 | 1.2203 | 0.0 | 1.2203 | 1.1047 |
| 0.1306 | 2.3948 | 1104 | 1.2030 | 0.0 | 1.2030 | 1.0968 |
| 0.1306 | 2.3991 | 1106 | 1.0271 | 0.0 | 1.0271 | 1.0134 |
| 0.1306 | 2.4035 | 1108 | 0.9735 | 0.0 | 0.9735 | 0.9866 |
| 0.1306 | 2.4078 | 1110 | 1.0046 | 0.0 | 1.0046 | 1.0023 |
| 0.1306 | 2.4121 | 1112 | 1.0564 | 0.0 | 1.0564 | 1.0278 |
| 0.1306 | 2.4165 | 1114 | 1.1710 | 0.0 | 1.1710 | 1.0821 |
| 0.1306 | 2.4208 | 1116 | 1.1363 | 0.0 | 1.1363 | 1.0660 |
| 0.1306 | 2.4252 | 1118 | 1.2836 | 0.0 | 1.2836 | 1.1330 |
| 0.1306 | 2.4295 | 1120 | 1.4126 | 0.0 | 1.4126 | 1.1885 |
| 0.1306 | 2.4338 | 1122 | 1.3012 | 0.0 | 1.3012 | 1.1407 |
| 0.1306 | 2.4382 | 1124 | 1.0811 | 0.0 | 1.0811 | 1.0398 |
| 0.1306 | 2.4425 | 1126 | 0.9472 | 0.0 | 0.9472 | 0.9732 |
| 0.1306 | 2.4469 | 1128 | 0.8119 | -0.0233 | 0.8119 | 0.9011 |
| 0.1306 | 2.4512 | 1130 | 0.8337 | -0.0233 | 0.8337 | 0.9131 |
| 0.1306 | 2.4555 | 1132 | 0.9683 | 0.0 | 0.9683 | 0.9840 |
| 0.1306 | 2.4599 | 1134 | 1.2525 | 0.0 | 1.2525 | 1.1191 |
| 0.1306 | 2.4642 | 1136 | 1.3690 | 0.0 | 1.3690 | 1.1701 |
| 0.1306 | 2.4685 | 1138 | 1.1959 | 0.0 | 1.1959 | 1.0936 |
| 0.1306 | 2.4729 | 1140 | 1.0923 | -0.0233 | 1.0923 | 1.0451 |
| 0.1306 | 2.4772 | 1142 | 1.0746 | -0.0233 | 1.0746 | 1.0367 |
| 0.1306 | 2.4816 | 1144 | 1.3237 | 0.2667 | 1.3237 | 1.1505 |
| 0.1306 | 2.4859 | 1146 | 1.5557 | 0.2667 | 1.5557 | 1.2473 |
| 0.1306 | 2.4902 | 1148 | 1.4860 | 0.2667 | 1.4860 | 1.2190 |
| 0.1306 | 2.4946 | 1150 | 1.2218 | 0.0 | 1.2218 | 1.1054 |
| 0.1306 | 2.4989 | 1152 | 0.9742 | 0.0 | 0.9742 | 0.9870 |
| 0.1306 | 2.5033 | 1154 | 0.9664 | 0.0 | 0.9664 | 0.9830 |
| 0.1306 | 2.5076 | 1156 | 1.0690 | 0.0 | 1.0690 | 1.0339 |
| 0.1306 | 2.5119 | 1158 | 1.1999 | 0.0 | 1.1999 | 1.0954 |
| 0.1306 | 2.5163 | 1160 | 1.2935 | 0.0 | 1.2935 | 1.1373 |
| 0.1306 | 2.5206 | 1162 | 1.2897 | 0.0 | 1.2897 | 1.1357 |
| 0.1306 | 2.5249 | 1164 | 1.1852 | 0.0 | 1.1852 | 1.0887 |
| 0.1306 | 2.5293 | 1166 | 1.0429 | 0.0 | 1.0429 | 1.0212 |
| 0.1306 | 2.5336 | 1168 | 1.0609 | 0.0 | 1.0609 | 1.0300 |
| 0.1306 | 2.5380 | 1170 | 1.1475 | 0.0 | 1.1475 | 1.0712 |
| 0.1306 | 2.5423 | 1172 | 1.1081 | 0.0 | 1.1081 | 1.0527 |
| 0.1306 | 2.5466 | 1174 | 0.9920 | 0.0 | 0.9920 | 0.9960 |
| 0.1306 | 2.5510 | 1176 | 0.9866 | 0.0 | 0.9866 | 0.9933 |
| 0.1306 | 2.5553 | 1178 | 1.0670 | 0.0 | 1.0670 | 1.0330 |
| 0.1306 | 2.5597 | 1180 | 1.0934 | 0.0 | 1.0934 | 1.0457 |
| 0.1306 | 2.5640 | 1182 | 1.0882 | 0.0 | 1.0882 | 1.0432 |
| 0.1306 | 2.5683 | 1184 | 1.0705 | 0.0 | 1.0705 | 1.0347 |
| 0.1306 | 2.5727 | 1186 | 1.1366 | 0.0 | 1.1366 | 1.0661 |
| 0.1306 | 2.5770 | 1188 | 1.1317 | 0.0 | 1.1317 | 1.0638 |
| 0.1306 | 2.5813 | 1190 | 1.1415 | 0.0 | 1.1415 | 1.0684 |
| 0.1306 | 2.5857 | 1192 | 0.9514 | 0.0 | 0.9514 | 0.9754 |
| 0.1306 | 2.5900 | 1194 | 0.7689 | -0.0233 | 0.7689 | 0.8769 |
| 0.1306 | 2.5944 | 1196 | 0.7423 | 0.1895 | 0.7423 | 0.8616 |
| 0.1306 | 2.5987 | 1198 | 0.7799 | -0.0233 | 0.7799 | 0.8831 |
| 0.1306 | 2.6030 | 1200 | 0.8877 | 0.0 | 0.8877 | 0.9422 |
| 0.1306 | 2.6074 | 1202 | 1.0710 | 0.0 | 1.0710 | 1.0349 |
| 0.1306 | 2.6117 | 1204 | 1.1546 | 0.0 | 1.1546 | 1.0745 |
| 0.1306 | 2.6161 | 1206 | 1.2473 | 0.0 | 1.2473 | 1.1168 |
| 0.1306 | 2.6204 | 1208 | 1.1694 | 0.0 | 1.1694 | 1.0814 |
| 0.1306 | 2.6247 | 1210 | 1.0520 | 0.0 | 1.0520 | 1.0256 |
| 0.1306 | 2.6291 | 1212 | 0.9072 | 0.0 | 0.9072 | 0.9525 |
| 0.1306 | 2.6334 | 1214 | 0.9309 | 0.0 | 0.9309 | 0.9648 |
| 0.1306 | 2.6377 | 1216 | 0.9914 | 0.0 | 0.9914 | 0.9957 |
| 0.1306 | 2.6421 | 1218 | 1.0865 | 0.0 | 1.0865 | 1.0423 |
| 0.1306 | 2.6464 | 1220 | 1.0860 | 0.0 | 1.0860 | 1.0421 |
| 0.1306 | 2.6508 | 1222 | 1.1743 | 0.0 | 1.1743 | 1.0836 |
| 0.1306 | 2.6551 | 1224 | 1.2391 | 0.0 | 1.2391 | 1.1132 |
| 0.1306 | 2.6594 | 1226 | 1.1792 | 0.0 | 1.1792 | 1.0859 |
| 0.1306 | 2.6638 | 1228 | 0.9781 | -0.0233 | 0.9781 | 0.9890 |
| 0.1306 | 2.6681 | 1230 | 0.9280 | -0.0233 | 0.9280 | 0.9633 |
| 0.1306 | 2.6725 | 1232 | 1.0330 | -0.0233 | 1.0330 | 1.0163 |
| 0.1306 | 2.6768 | 1234 | 1.2795 | 0.0 | 1.2795 | 1.1312 |
| 0.1306 | 2.6811 | 1236 | 1.2725 | 0.0 | 1.2725 | 1.1281 |
| 0.1306 | 2.6855 | 1238 | 1.2273 | 0.0 | 1.2273 | 1.1078 |
| 0.1306 | 2.6898 | 1240 | 1.0315 | 0.0 | 1.0315 | 1.0156 |
| 0.1306 | 2.6941 | 1242 | 0.8745 | -0.0233 | 0.8745 | 0.9352 |
| 0.1306 | 2.6985 | 1244 | 0.8246 | -0.0233 | 0.8246 | 0.9081 |
| 0.1306 | 2.7028 | 1246 | 0.9045 | -0.0233 | 0.9045 | 0.9511 |
| 0.1306 | 2.7072 | 1248 | 1.0694 | 0.0 | 1.0694 | 1.0341 |
| 0.1306 | 2.7115 | 1250 | 1.0767 | 0.0 | 1.0767 | 1.0376 |
| 0.1306 | 2.7158 | 1252 | 0.9580 | -0.0233 | 0.9580 | 0.9788 |
| 0.1306 | 2.7202 | 1254 | 0.9511 | -0.0233 | 0.9511 | 0.9752 |
| 0.1306 | 2.7245 | 1256 | 1.0497 | -0.0233 | 1.0497 | 1.0245 |
| 0.1306 | 2.7289 | 1258 | 1.2046 | 0.0 | 1.2046 | 1.0976 |
| 0.1306 | 2.7332 | 1260 | 1.3150 | 0.0 | 1.3150 | 1.1467 |
| 0.1306 | 2.7375 | 1262 | 1.1791 | 0.0 | 1.1791 | 1.0859 |
| 0.1306 | 2.7419 | 1264 | 0.9457 | 0.0 | 0.9457 | 0.9725 |
| 0.1306 | 2.7462 | 1266 | 0.7766 | -0.0233 | 0.7766 | 0.8812 |
| 0.1306 | 2.7505 | 1268 | 0.7511 | -0.0233 | 0.7511 | 0.8667 |
| 0.1306 | 2.7549 | 1270 | 0.7859 | -0.0233 | 0.7859 | 0.8865 |
| 0.1306 | 2.7592 | 1272 | 0.9537 | 0.0 | 0.9537 | 0.9766 |
| 0.1306 | 2.7636 | 1274 | 1.0744 | 0.0 | 1.0744 | 1.0365 |
| 0.1306 | 2.7679 | 1276 | 1.0681 | 0.0 | 1.0681 | 1.0335 |
| 0.1306 | 2.7722 | 1278 | 0.9740 | -0.0233 | 0.9740 | 0.9869 |
| 0.1306 | 2.7766 | 1280 | 0.8751 | -0.0233 | 0.8751 | 0.9355 |
| 0.1306 | 2.7809 | 1282 | 0.8897 | -0.0233 | 0.8897 | 0.9432 |
| 0.1306 | 2.7852 | 1284 | 1.0193 | -0.0233 | 1.0193 | 1.0096 |
| 0.1306 | 2.7896 | 1286 | 1.2791 | 0.0 | 1.2791 | 1.1310 |
| 0.1306 | 2.7939 | 1288 | 1.4198 | 0.0 | 1.4198 | 1.1916 |
| 0.1306 | 2.7983 | 1290 | 1.3517 | 0.0 | 1.3517 | 1.1626 |
| 0.1306 | 2.8026 | 1292 | 1.1352 | 0.0 | 1.1352 | 1.0655 |
| 0.1306 | 2.8069 | 1294 | 0.8703 | -0.0233 | 0.8703 | 0.9329 |
| 0.1306 | 2.8113 | 1296 | 0.7805 | -0.0233 | 0.7805 | 0.8835 |
| 0.1306 | 2.8156 | 1298 | 0.7822 | -0.0233 | 0.7822 | 0.8844 |
| 0.1306 | 2.8200 | 1300 | 0.8686 | 0.0 | 0.8686 | 0.9320 |
| 0.1306 | 2.8243 | 1302 | 1.0173 | 0.0 | 1.0173 | 1.0086 |
| 0.1306 | 2.8286 | 1304 | 1.0579 | 0.0 | 1.0579 | 1.0286 |
| 0.1306 | 2.8330 | 1306 | 1.0981 | 0.0 | 1.0981 | 1.0479 |
| 0.1306 | 2.8373 | 1308 | 1.0510 | 0.0 | 1.0510 | 1.0252 |
| 0.1306 | 2.8416 | 1310 | 0.9544 | 0.0 | 0.9544 | 0.9770 |
| 0.1306 | 2.8460 | 1312 | 0.9164 | -0.0233 | 0.9164 | 0.9573 |
| 0.1306 | 2.8503 | 1314 | 0.8434 | -0.0233 | 0.8434 | 0.9184 |
| 0.1306 | 2.8547 | 1316 | 0.8862 | -0.0233 | 0.8862 | 0.9414 |
| 0.1306 | 2.8590 | 1318 | 1.0266 | -0.0233 | 1.0266 | 1.0132 |
| 0.1306 | 2.8633 | 1320 | 1.1569 | 0.0 | 1.1569 | 1.0756 |
| 0.1306 | 2.8677 | 1322 | 1.0886 | 0.0 | 1.0886 | 1.0433 |
| 0.1306 | 2.8720 | 1324 | 0.9140 | -0.0233 | 0.9140 | 0.9560 |
| 0.1306 | 2.8764 | 1326 | 0.8129 | -0.0233 | 0.8129 | 0.9016 |
| 0.1306 | 2.8807 | 1328 | 0.7766 | -0.0233 | 0.7766 | 0.8813 |
| 0.1306 | 2.8850 | 1330 | 0.8017 | -0.0233 | 0.8017 | 0.8954 |
| 0.1306 | 2.8894 | 1332 | 0.9187 | -0.0233 | 0.9187 | 0.9585 |
| 0.1306 | 2.8937 | 1334 | 0.9960 | -0.0233 | 0.9960 | 0.9980 |
| 0.1306 | 2.8980 | 1336 | 0.9571 | -0.0233 | 0.9571 | 0.9783 |
| 0.1306 | 2.9024 | 1338 | 0.9239 | -0.0233 | 0.9239 | 0.9612 |
| 0.1306 | 2.9067 | 1340 | 0.9856 | -0.0233 | 0.9856 | 0.9928 |
| 0.1306 | 2.9111 | 1342 | 0.9803 | -0.0233 | 0.9803 | 0.9901 |
| 0.1306 | 2.9154 | 1344 | 0.9788 | -0.0233 | 0.9788 | 0.9893 |
| 0.1306 | 2.9197 | 1346 | 0.9874 | 0.0 | 0.9874 | 0.9937 |
| 0.1306 | 2.9241 | 1348 | 0.9856 | 0.0 | 0.9856 | 0.9927 |
| 0.1306 | 2.9284 | 1350 | 1.0173 | 0.0 | 1.0173 | 1.0086 |
| 0.1306 | 2.9328 | 1352 | 1.0504 | 0.0 | 1.0504 | 1.0249 |
| 0.1306 | 2.9371 | 1354 | 1.0713 | -0.0233 | 1.0713 | 1.0350 |
| 0.1306 | 2.9414 | 1356 | 1.1869 | -0.0233 | 1.1869 | 1.0895 |
| 0.1306 | 2.9458 | 1358 | 1.2763 | 0.0 | 1.2763 | 1.1297 |
| 0.1306 | 2.9501 | 1360 | 1.2696 | 0.0 | 1.2696 | 1.1267 |
| 0.1306 | 2.9544 | 1362 | 1.1455 | -0.0233 | 1.1455 | 1.0703 |
| 0.1306 | 2.9588 | 1364 | 1.1795 | 0.0 | 1.1795 | 1.0861 |
| 0.1306 | 2.9631 | 1366 | 1.1812 | 0.0 | 1.1812 | 1.0868 |
| 0.1306 | 2.9675 | 1368 | 1.1760 | 0.0 | 1.1760 | 1.0844 |
| 0.1306 | 2.9718 | 1370 | 1.1066 | 0.0 | 1.1066 | 1.0519 |
| 0.1306 | 2.9761 | 1372 | 0.9421 | -0.0233 | 0.9421 | 0.9706 |
| 0.1306 | 2.9805 | 1374 | 0.8886 | -0.0233 | 0.8886 | 0.9426 |
| 0.1306 | 2.9848 | 1376 | 0.9419 | -0.0233 | 0.9419 | 0.9705 |
| 0.1306 | 2.9892 | 1378 | 1.1086 | 0.0 | 1.1086 | 1.0529 |
| 0.1306 | 2.9935 | 1380 | 1.3271 | 0.0 | 1.3271 | 1.1520 |
| 0.1306 | 2.9978 | 1382 | 1.3659 | 0.0 | 1.3659 | 1.1687 |
| 0.1306 | 3.0022 | 1384 | 1.3459 | 0.0 | 1.3459 | 1.1601 |
| 0.1306 | 3.0065 | 1386 | 1.1917 | 0.0 | 1.1917 | 1.0917 |
| 0.1306 | 3.0108 | 1388 | 0.9705 | 0.0 | 0.9705 | 0.9852 |
| 0.1306 | 3.0152 | 1390 | 0.8794 | -0.0233 | 0.8794 | 0.9378 |
| 0.1306 | 3.0195 | 1392 | 0.8996 | -0.0233 | 0.8996 | 0.9485 |
| 0.1306 | 3.0239 | 1394 | 0.9712 | 0.0 | 0.9712 | 0.9855 |
| 0.1306 | 3.0282 | 1396 | 1.0929 | 0.0 | 1.0929 | 1.0454 |
| 0.1306 | 3.0325 | 1398 | 1.1057 | 0.0 | 1.1057 | 1.0515 |
| 0.1306 | 3.0369 | 1400 | 1.0698 | 0.0 | 1.0698 | 1.0343 |
| 0.1306 | 3.0412 | 1402 | 0.9851 | -0.0233 | 0.9851 | 0.9925 |
| 0.1306 | 3.0456 | 1404 | 0.8863 | -0.0233 | 0.8863 | 0.9414 |
| 0.1306 | 3.0499 | 1406 | 0.8972 | -0.0233 | 0.8972 | 0.9472 |
| 0.1306 | 3.0542 | 1408 | 1.0058 | 0.0 | 1.0058 | 1.0029 |
| 0.1306 | 3.0586 | 1410 | 1.1797 | 0.0 | 1.1797 | 1.0862 |
| 0.1306 | 3.0629 | 1412 | 1.1826 | 0.0 | 1.1826 | 1.0875 |
| 0.1306 | 3.0672 | 1414 | 1.0585 | 0.0 | 1.0585 | 1.0288 |
| 0.1306 | 3.0716 | 1416 | 0.8673 | -0.0233 | 0.8673 | 0.9313 |
| 0.1306 | 3.0759 | 1418 | 0.7248 | -0.0233 | 0.7248 | 0.8514 |
| 0.1306 | 3.0803 | 1420 | 0.7048 | 0.1895 | 0.7048 | 0.8395 |
| 0.1306 | 3.0846 | 1422 | 0.7617 | -0.0233 | 0.7617 | 0.8728 |
| 0.1306 | 3.0889 | 1424 | 0.9412 | 0.0 | 0.9412 | 0.9701 |
| 0.1306 | 3.0933 | 1426 | 1.2815 | 0.0 | 1.2815 | 1.1320 |
| 0.1306 | 3.0976 | 1428 | 1.4973 | 0.2667 | 1.4973 | 1.2236 |
| 0.1306 | 3.1020 | 1430 | 1.5011 | 0.2667 | 1.5011 | 1.2252 |
| 0.1306 | 3.1063 | 1432 | 1.3457 | 0.0 | 1.3457 | 1.1600 |
| 0.1306 | 3.1106 | 1434 | 1.1041 | 0.0 | 1.1041 | 1.0508 |
| 0.1306 | 3.1150 | 1436 | 0.9073 | -0.0233 | 0.9073 | 0.9525 |
| 0.1306 | 3.1193 | 1438 | 0.8479 | -0.0233 | 0.8479 | 0.9208 |
| 0.1306 | 3.1236 | 1440 | 0.9117 | -0.0233 | 0.9117 | 0.9548 |
| 0.1306 | 3.1280 | 1442 | 1.0769 | -0.0233 | 1.0769 | 1.0378 |
| 0.1306 | 3.1323 | 1444 | 1.1342 | -0.0233 | 1.1342 | 1.0650 |
| 0.1306 | 3.1367 | 1446 | 1.1479 | -0.0233 | 1.1479 | 1.0714 |
| 0.1306 | 3.1410 | 1448 | 1.1853 | 0.0 | 1.1853 | 1.0887 |
| 0.1306 | 3.1453 | 1450 | 1.1277 | 0.0 | 1.1277 | 1.0619 |
| 0.1306 | 3.1497 | 1452 | 1.0211 | -0.0233 | 1.0211 | 1.0105 |
| 0.1306 | 3.1540 | 1454 | 0.9671 | -0.0233 | 0.9671 | 0.9834 |
| 0.1306 | 3.1584 | 1456 | 0.9148 | -0.0233 | 0.9148 | 0.9565 |
| 0.1306 | 3.1627 | 1458 | 0.9341 | -0.0233 | 0.9341 | 0.9665 |
| 0.1306 | 3.1670 | 1460 | 0.9936 | 0.0 | 0.9936 | 0.9968 |
| 0.1306 | 3.1714 | 1462 | 0.9468 | -0.0233 | 0.9468 | 0.9731 |
| 0.1306 | 3.1757 | 1464 | 0.9219 | -0.0233 | 0.9219 | 0.9602 |
| 0.1306 | 3.1800 | 1466 | 0.8924 | -0.0233 | 0.8924 | 0.9447 |
| 0.1306 | 3.1844 | 1468 | 0.8827 | -0.0233 | 0.8827 | 0.9395 |
| 0.1306 | 3.1887 | 1470 | 0.9165 | -0.0233 | 0.9165 | 0.9573 |
| 0.1306 | 3.1931 | 1472 | 0.9440 | 0.0 | 0.9440 | 0.9716 |
| 0.1306 | 3.1974 | 1474 | 0.9802 | 0.0 | 0.9802 | 0.9900 |
| 0.1306 | 3.2017 | 1476 | 0.9876 | 0.0 | 0.9876 | 0.9938 |
| 0.1306 | 3.2061 | 1478 | 1.0662 | 0.0 | 1.0662 | 1.0326 |
| 0.1306 | 3.2104 | 1480 | 1.1522 | 0.0 | 1.1522 | 1.0734 |
| 0.1306 | 3.2148 | 1482 | 1.0929 | 0.0 | 1.0929 | 1.0454 |
| 0.1306 | 3.2191 | 1484 | 0.9637 | 0.0 | 0.9637 | 0.9817 |
| 0.1306 | 3.2234 | 1486 | 0.8539 | 0.0 | 0.8539 | 0.9241 |
| 0.1306 | 3.2278 | 1488 | 0.8507 | 0.0 | 0.8507 | 0.9223 |
| 0.1306 | 3.2321 | 1490 | 0.9251 | 0.0 | 0.9251 | 0.9618 |
| 0.1306 | 3.2364 | 1492 | 1.0191 | 0.0 | 1.0191 | 1.0095 |
| 0.1306 | 3.2408 | 1494 | 0.9489 | -0.0233 | 0.9489 | 0.9741 |
| 0.1306 | 3.2451 | 1496 | 0.8643 | -0.0233 | 0.8643 | 0.9297 |
| 0.1306 | 3.2495 | 1498 | 0.9134 | -0.0233 | 0.9134 | 0.9557 |
| 0.0993 | 3.2538 | 1500 | 0.9943 | -0.0233 | 0.9943 | 0.9972 |
| 0.0993 | 3.2581 | 1502 | 1.1440 | 0.0 | 1.1440 | 1.0696 |
| 0.0993 | 3.2625 | 1504 | 1.3012 | 0.0 | 1.3012 | 1.1407 |
| 0.0993 | 3.2668 | 1506 | 1.2376 | 0.0 | 1.2376 | 1.1125 |
| 0.0993 | 3.2711 | 1508 | 1.0097 | -0.0233 | 1.0097 | 1.0048 |
| 0.0993 | 3.2755 | 1510 | 0.9146 | -0.0233 | 0.9146 | 0.9563 |
| 0.0993 | 3.2798 | 1512 | 0.8572 | -0.0233 | 0.8572 | 0.9259 |
| 0.0993 | 3.2842 | 1514 | 0.9143 | 0.0 | 0.9143 | 0.9562 |
| 0.0993 | 3.2885 | 1516 | 1.0474 | 0.0 | 1.0474 | 1.0234 |
| 0.0993 | 3.2928 | 1518 | 1.1901 | 0.0 | 1.1901 | 1.0909 |
| 0.0993 | 3.2972 | 1520 | 1.1668 | 0.0 | 1.1668 | 1.0802 |
| 0.0993 | 3.3015 | 1522 | 1.0390 | 0.0 | 1.0390 | 1.0193 |
| 0.0993 | 3.3059 | 1524 | 0.8418 | -0.0233 | 0.8418 | 0.9175 |
| 0.0993 | 3.3102 | 1526 | 0.7281 | -0.0577 | 0.7281 | 0.8533 |
| 0.0993 | 3.3145 | 1528 | 0.7145 | 0.0984 | 0.7145 | 0.8453 |
| 0.0993 | 3.3189 | 1530 | 0.7545 | -0.0421 | 0.7545 | 0.8686 |
| 0.0993 | 3.3232 | 1532 | 0.9153 | -0.0233 | 0.9153 | 0.9567 |
| 0.0993 | 3.3275 | 1534 | 1.1627 | 0.0 | 1.1627 | 1.0783 |
| 0.0993 | 3.3319 | 1536 | 1.3666 | 0.0 | 1.3666 | 1.1690 |
| 0.0993 | 3.3362 | 1538 | 1.3574 | 0.0 | 1.3574 | 1.1651 |
| 0.0993 | 3.3406 | 1540 | 1.1789 | 0.0 | 1.1789 | 1.0858 |
| 0.0993 | 3.3449 | 1542 | 0.9492 | 0.0 | 0.9492 | 0.9743 |
| 0.0993 | 3.3492 | 1544 | 0.8708 | -0.0233 | 0.8708 | 0.9331 |
| 0.0993 | 3.3536 | 1546 | 0.8038 | -0.0233 | 0.8038 | 0.8965 |
| 0.0993 | 3.3579 | 1548 | 0.8421 | -0.0233 | 0.8421 | 0.9177 |
| 0.0993 | 3.3623 | 1550 | 0.9608 | 0.0 | 0.9608 | 0.9802 |
| 0.0993 | 3.3666 | 1552 | 1.1048 | 0.0 | 1.1048 | 1.0511 |
| 0.0993 | 3.3709 | 1554 | 1.1699 | 0.0 | 1.1699 | 1.0816 |
| 0.0993 | 3.3753 | 1556 | 1.0905 | 0.0 | 1.0905 | 1.0443 |
| 0.0993 | 3.3796 | 1558 | 0.8930 | -0.0233 | 0.8930 | 0.9450 |
| 0.0993 | 3.3839 | 1560 | 0.8213 | -0.0233 | 0.8213 | 0.9062 |
| 0.0993 | 3.3883 | 1562 | 0.8319 | -0.0233 | 0.8319 | 0.9121 |
| 0.0993 | 3.3926 | 1564 | 0.8730 | -0.0233 | 0.8730 | 0.9344 |
| 0.0993 | 3.3970 | 1566 | 0.9212 | -0.0233 | 0.9212 | 0.9598 |
| 0.0993 | 3.4013 | 1568 | 1.0562 | 0.0 | 1.0562 | 1.0277 |
| 0.0993 | 3.4056 | 1570 | 1.0913 | 0.0 | 1.0913 | 1.0447 |
| 0.0993 | 3.4100 | 1572 | 1.0507 | 0.0 | 1.0507 | 1.0251 |
| 0.0993 | 3.4143 | 1574 | 0.9344 | 0.0 | 0.9344 | 0.9666 |
| 0.0993 | 3.4187 | 1576 | 0.8777 | -0.0233 | 0.8777 | 0.9369 |
| 0.0993 | 3.4230 | 1578 | 0.9253 | 0.0 | 0.9253 | 0.9619 |
| 0.0993 | 3.4273 | 1580 | 1.0156 | 0.0 | 1.0156 | 1.0077 |
| 0.0993 | 3.4317 | 1582 | 1.0418 | 0.0 | 1.0418 | 1.0207 |
| 0.0993 | 3.4360 | 1584 | 1.0963 | 0.0 | 1.0963 | 1.0471 |
| 0.0993 | 3.4403 | 1586 | 1.0834 | 0.0 | 1.0834 | 1.0409 |
| 0.0993 | 3.4447 | 1588 | 0.9537 | 0.0 | 0.9537 | 0.9766 |
| 0.0993 | 3.4490 | 1590 | 0.8329 | -0.0233 | 0.8329 | 0.9126 |
| 0.0993 | 3.4534 | 1592 | 0.8335 | -0.0233 | 0.8335 | 0.9129 |
| 0.0993 | 3.4577 | 1594 | 0.9628 | 0.0 | 0.9628 | 0.9812 |
| 0.0993 | 3.4620 | 1596 | 1.0430 | 0.0 | 1.0430 | 1.0213 |
| 0.0993 | 3.4664 | 1598 | 0.9938 | 0.0 | 0.9938 | 0.9969 |
| 0.0993 | 3.4707 | 1600 | 0.9402 | 0.0 | 0.9402 | 0.9697 |
| 0.0993 | 3.4751 | 1602 | 0.9139 | 0.0 | 0.9139 | 0.9560 |
| 0.0993 | 3.4794 | 1604 | 0.8959 | 0.0 | 0.8959 | 0.9465 |
| 0.0993 | 3.4837 | 1606 | 0.8724 | -0.0233 | 0.8724 | 0.9340 |
| 0.0993 | 3.4881 | 1608 | 0.8300 | -0.0233 | 0.8300 | 0.9110 |
| 0.0993 | 3.4924 | 1610 | 0.8330 | -0.0233 | 0.8330 | 0.9127 |
| 0.0993 | 3.4967 | 1612 | 0.9072 | -0.0233 | 0.9072 | 0.9525 |
| 0.0993 | 3.5011 | 1614 | 1.0004 | 0.0 | 1.0004 | 1.0002 |
| 0.0993 | 3.5054 | 1616 | 1.1151 | 0.0 | 1.1151 | 1.0560 |
| 0.0993 | 3.5098 | 1618 | 1.1338 | 0.0 | 1.1338 | 1.0648 |
| 0.0993 | 3.5141 | 1620 | 1.1754 | 0.0 | 1.1754 | 1.0841 |
| 0.0993 | 3.5184 | 1622 | 1.1300 | 0.0 | 1.1300 | 1.0630 |
| 0.0993 | 3.5228 | 1624 | 0.9544 | -0.0233 | 0.9544 | 0.9769 |
| 0.0993 | 3.5271 | 1626 | 0.8098 | -0.0233 | 0.8098 | 0.8999 |
| 0.0993 | 3.5315 | 1628 | 0.8119 | -0.0233 | 0.8119 | 0.9011 |
| 0.0993 | 3.5358 | 1630 | 0.9648 | -0.0233 | 0.9648 | 0.9822 |
| 0.0993 | 3.5401 | 1632 | 1.0519 | 0.0 | 1.0519 | 1.0256 |
| 0.0993 | 3.5445 | 1634 | 0.9811 | 0.0 | 0.9811 | 0.9905 |
| 0.0993 | 3.5488 | 1636 | 0.9265 | 0.0 | 0.9265 | 0.9626 |
| 0.0993 | 3.5531 | 1638 | 0.8150 | -0.0233 | 0.8150 | 0.9028 |
| 0.0993 | 3.5575 | 1640 | 0.7477 | 0.1895 | 0.7477 | 0.8647 |
| 0.0993 | 3.5618 | 1642 | 0.7845 | -0.0233 | 0.7845 | 0.8857 |
| 0.0993 | 3.5662 | 1644 | 0.9056 | -0.0233 | 0.9056 | 0.9516 |
| 0.0993 | 3.5705 | 1646 | 1.0518 | 0.0 | 1.0518 | 1.0256 |
| 0.0993 | 3.5748 | 1648 | 1.0574 | 0.0 | 1.0574 | 1.0283 |
| 0.0993 | 3.5792 | 1650 | 0.8667 | -0.0233 | 0.8667 | 0.9309 |
| 0.0993 | 3.5835 | 1652 | 0.8076 | -0.0233 | 0.8076 | 0.8987 |
| 0.0993 | 3.5879 | 1654 | 0.8949 | -0.0233 | 0.8949 | 0.9460 |
| 0.0993 | 3.5922 | 1656 | 0.9316 | -0.0233 | 0.9316 | 0.9652 |
| 0.0993 | 3.5965 | 1658 | 0.8101 | -0.0233 | 0.8101 | 0.9000 |
| 0.0993 | 3.6009 | 1660 | 0.7363 | -0.0233 | 0.7363 | 0.8581 |
| 0.0993 | 3.6052 | 1662 | 0.7350 | -0.0233 | 0.7350 | 0.8573 |
| 0.0993 | 3.6095 | 1664 | 0.7985 | -0.0233 | 0.7985 | 0.8936 |
| 0.0993 | 3.6139 | 1666 | 0.7867 | -0.0233 | 0.7867 | 0.8869 |
| 0.0993 | 3.6182 | 1668 | 0.8846 | 0.0 | 0.8846 | 0.9405 |
| 0.0993 | 3.6226 | 1670 | 1.0513 | 0.0 | 1.0513 | 1.0253 |
| 0.0993 | 3.6269 | 1672 | 1.0176 | 0.0 | 1.0176 | 1.0088 |
| 0.0993 | 3.6312 | 1674 | 1.0360 | 0.0 | 1.0360 | 1.0179 |
| 0.0993 | 3.6356 | 1676 | 0.9762 | 0.0 | 0.9762 | 0.9880 |
| 0.0993 | 3.6399 | 1678 | 0.8417 | -0.0233 | 0.8417 | 0.9175 |
| 0.0993 | 3.6443 | 1680 | 0.8385 | -0.0233 | 0.8385 | 0.9157 |
| 0.0993 | 3.6486 | 1682 | 0.8816 | 0.0 | 0.8816 | 0.9389 |
| 0.0993 | 3.6529 | 1684 | 0.8185 | -0.0233 | 0.8185 | 0.9047 |
| 0.0993 | 3.6573 | 1686 | 0.7525 | -0.0233 | 0.7525 | 0.8675 |
| 0.0993 | 3.6616 | 1688 | 0.7501 | -0.0233 | 0.7501 | 0.8661 |
| 0.0993 | 3.6659 | 1690 | 0.8344 | -0.0233 | 0.8344 | 0.9134 |
| 0.0993 | 3.6703 | 1692 | 0.9903 | -0.0233 | 0.9903 | 0.9952 |
| 0.0993 | 3.6746 | 1694 | 1.1347 | 0.0 | 1.1347 | 1.0652 |
| 0.0993 | 3.6790 | 1696 | 1.0946 | 0.0 | 1.0946 | 1.0462 |
| 0.0993 | 3.6833 | 1698 | 0.9832 | -0.0233 | 0.9832 | 0.9916 |
| 0.0993 | 3.6876 | 1700 | 0.9725 | -0.0233 | 0.9725 | 0.9862 |
| 0.0993 | 3.6920 | 1702 | 0.9176 | -0.0233 | 0.9176 | 0.9579 |
| 0.0993 | 3.6963 | 1704 | 0.9299 | -0.0233 | 0.9299 | 0.9643 |
| 0.0993 | 3.7007 | 1706 | 0.9426 | -0.0233 | 0.9426 | 0.9709 |
| 0.0993 | 3.7050 | 1708 | 0.9670 | -0.0233 | 0.9670 | 0.9834 |
| 0.0993 | 3.7093 | 1710 | 0.9610 | -0.0233 | 0.9610 | 0.9803 |
| 0.0993 | 3.7137 | 1712 | 0.9405 | -0.0233 | 0.9405 | 0.9698 |
| 0.0993 | 3.7180 | 1714 | 0.8641 | -0.0233 | 0.8641 | 0.9296 |
| 0.0993 | 3.7223 | 1716 | 0.8258 | -0.0233 | 0.8258 | 0.9088 |
| 0.0993 | 3.7267 | 1718 | 0.7956 | -0.0421 | 0.7956 | 0.8920 |
| 0.0993 | 3.7310 | 1720 | 0.8159 | -0.0421 | 0.8159 | 0.9033 |
| 0.0993 | 3.7354 | 1722 | 0.8985 | -0.0233 | 0.8985 | 0.9479 |
| 0.0993 | 3.7397 | 1724 | 1.0270 | 0.0 | 1.0270 | 1.0134 |
| 0.0993 | 3.7440 | 1726 | 1.1510 | 0.0 | 1.1510 | 1.0728 |
| 0.0993 | 3.7484 | 1728 | 1.1022 | 0.0 | 1.1022 | 1.0499 |
| 0.0993 | 3.7527 | 1730 | 0.9362 | 0.0 | 0.9362 | 0.9676 |
| 0.0993 | 3.7570 | 1732 | 0.8122 | -0.0233 | 0.8122 | 0.9012 |
| 0.0993 | 3.7614 | 1734 | 0.8183 | -0.0233 | 0.8183 | 0.9046 |
| 0.0993 | 3.7657 | 1736 | 0.9202 | 0.0 | 0.9202 | 0.9593 |
| 0.0993 | 3.7701 | 1738 | 1.0794 | 0.0 | 1.0794 | 1.0390 |
| 0.0993 | 3.7744 | 1740 | 1.1010 | 0.0 | 1.1010 | 1.0493 |
| 0.0993 | 3.7787 | 1742 | 1.0353 | 0.0 | 1.0353 | 1.0175 |
| 0.0993 | 3.7831 | 1744 | 0.8841 | -0.0233 | 0.8841 | 0.9402 |
| 0.0993 | 3.7874 | 1746 | 0.8225 | -0.2692 | 0.8225 | 0.9069 |
| 0.0993 | 3.7918 | 1748 | 0.8576 | -0.0421 | 0.8576 | 0.9260 |
| 0.0993 | 3.7961 | 1750 | 0.8998 | -0.0421 | 0.8998 | 0.9486 |
| 0.0993 | 3.8004 | 1752 | 1.0288 | -0.0233 | 1.0288 | 1.0143 |
| 0.0993 | 3.8048 | 1754 | 1.1176 | -0.0233 | 1.1176 | 1.0572 |
| 0.0993 | 3.8091 | 1756 | 1.1594 | -0.0233 | 1.1594 | 1.0767 |
| 0.0993 | 3.8134 | 1758 | 1.1107 | -0.0233 | 1.1107 | 1.0539 |
| 0.0993 | 3.8178 | 1760 | 1.0877 | -0.0233 | 1.0877 | 1.0429 |
| 0.0993 | 3.8221 | 1762 | 0.9792 | -0.0233 | 0.9792 | 0.9896 |
| 0.0993 | 3.8265 | 1764 | 0.8314 | -0.0233 | 0.8314 | 0.9118 |
| 0.0993 | 3.8308 | 1766 | 0.7941 | -0.0577 | 0.7941 | 0.8911 |
| 0.0993 | 3.8351 | 1768 | 0.8262 | -0.0421 | 0.8262 | 0.9089 |
| 0.0993 | 3.8395 | 1770 | 0.9422 | -0.0233 | 0.9422 | 0.9707 |
| 0.0993 | 3.8438 | 1772 | 1.0577 | -0.0233 | 1.0577 | 1.0285 |
| 0.0993 | 3.8482 | 1774 | 1.0626 | -0.0233 | 1.0626 | 1.0308 |
| 0.0993 | 3.8525 | 1776 | 0.9945 | -0.0233 | 0.9945 | 0.9972 |
| 0.0993 | 3.8568 | 1778 | 0.9207 | -0.0577 | 0.9207 | 0.9595 |
| 0.0993 | 3.8612 | 1780 | 0.9294 | -0.0577 | 0.9294 | 0.9641 |
| 0.0993 | 3.8655 | 1782 | 0.9315 | -0.0421 | 0.9315 | 0.9651 |
| 0.0993 | 3.8698 | 1784 | 0.9183 | -0.0233 | 0.9183 | 0.9583 |
| 0.0993 | 3.8742 | 1786 | 0.9635 | -0.0233 | 0.9635 | 0.9816 |
| 0.0993 | 3.8785 | 1788 | 0.9288 | -0.0233 | 0.9288 | 0.9637 |
| 0.0993 | 3.8829 | 1790 | 0.9268 | -0.0233 | 0.9268 | 0.9627 |
| 0.0993 | 3.8872 | 1792 | 0.9675 | 0.0 | 0.9675 | 0.9836 |
| 0.0993 | 3.8915 | 1794 | 0.9835 | 0.0 | 0.9835 | 0.9917 |
| 0.0993 | 3.8959 | 1796 | 0.9017 | 0.0 | 0.9017 | 0.9496 |
| 0.0993 | 3.9002 | 1798 | 0.8505 | 0.0 | 0.8505 | 0.9222 |
| 0.0993 | 3.9046 | 1800 | 0.8082 | -0.0233 | 0.8082 | 0.8990 |
| 0.0993 | 3.9089 | 1802 | 0.7819 | -0.0233 | 0.7819 | 0.8843 |
| 0.0993 | 3.9132 | 1804 | 0.8018 | -0.0233 | 0.8018 | 0.8954 |
| 0.0993 | 3.9176 | 1806 | 0.9306 | -0.0233 | 0.9306 | 0.9647 |
| 0.0993 | 3.9219 | 1808 | 1.1343 | 0.0 | 1.1343 | 1.0650 |
| 0.0993 | 3.9262 | 1810 | 1.2071 | 0.0 | 1.2071 | 1.0987 |
| 0.0993 | 3.9306 | 1812 | 1.1366 | -0.0233 | 1.1366 | 1.0661 |
| 0.0993 | 3.9349 | 1814 | 0.9945 | -0.0233 | 0.9945 | 0.9972 |
| 0.0993 | 3.9393 | 1816 | 0.8675 | -0.0233 | 0.8675 | 0.9314 |
| 0.0993 | 3.9436 | 1818 | 0.8554 | -0.0233 | 0.8554 | 0.9249 |
| 0.0993 | 3.9479 | 1820 | 0.9044 | -0.0233 | 0.9044 | 0.9510 |
| 0.0993 | 3.9523 | 1822 | 1.0021 | -0.0233 | 1.0021 | 1.0011 |
| 0.0993 | 3.9566 | 1824 | 1.1601 | 0.0 | 1.1601 | 1.0771 |
| 0.0993 | 3.9610 | 1826 | 1.1829 | 0.0 | 1.1829 | 1.0876 |
| 0.0993 | 3.9653 | 1828 | 1.0727 | 0.0 | 1.0727 | 1.0357 |
| 0.0993 | 3.9696 | 1830 | 0.9337 | -0.0233 | 0.9337 | 0.9663 |
| 0.0993 | 3.9740 | 1832 | 0.8766 | -0.0233 | 0.8766 | 0.9363 |
| 0.0993 | 3.9783 | 1834 | 0.9349 | -0.0233 | 0.9349 | 0.9669 |
| 0.0993 | 3.9826 | 1836 | 0.9972 | 0.0 | 0.9972 | 0.9986 |
| 0.0993 | 3.9870 | 1838 | 1.0582 | 0.0 | 1.0582 | 1.0287 |
| 0.0993 | 3.9913 | 1840 | 1.0314 | 0.0 | 1.0314 | 1.0156 |
| 0.0993 | 3.9957 | 1842 | 0.8750 | -0.0233 | 0.8750 | 0.9354 |
| 0.0993 | 4.0 | 1844 | 0.7546 | -0.0421 | 0.7546 | 0.8687 |
| 0.0993 | 4.0043 | 1846 | 0.7520 | -0.0421 | 0.7520 | 0.8672 |
| 0.0993 | 4.0087 | 1848 | 0.8139 | -0.0233 | 0.8139 | 0.9022 |
| 0.0993 | 4.0130 | 1850 | 0.9909 | -0.0233 | 0.9909 | 0.9954 |
| 0.0993 | 4.0174 | 1852 | 1.1466 | -0.0233 | 1.1466 | 1.0708 |
| 0.0993 | 4.0217 | 1854 | 1.0997 | -0.0233 | 1.0997 | 1.0487 |
| 0.0993 | 4.0260 | 1856 | 1.0485 | -0.0233 | 1.0485 | 1.0240 |
| 0.0993 | 4.0304 | 1858 | 0.9479 | -0.0233 | 0.9479 | 0.9736 |
| 0.0993 | 4.0347 | 1860 | 0.9525 | -0.0233 | 0.9525 | 0.9760 |
| 0.0993 | 4.0390 | 1862 | 1.0092 | -0.0233 | 1.0092 | 1.0046 |
| 0.0993 | 4.0434 | 1864 | 1.0090 | -0.0233 | 1.0090 | 1.0045 |
| 0.0993 | 4.0477 | 1866 | 0.9796 | -0.0233 | 0.9796 | 0.9897 |
| 0.0993 | 4.0521 | 1868 | 0.9697 | -0.0233 | 0.9697 | 0.9847 |
| 0.0993 | 4.0564 | 1870 | 1.0478 | -0.0233 | 1.0478 | 1.0236 |
| 0.0993 | 4.0607 | 1872 | 1.0634 | 0.0 | 1.0634 | 1.0312 |
| 0.0993 | 4.0651 | 1874 | 1.0139 | -0.0233 | 1.0139 | 1.0069 |
| 0.0993 | 4.0694 | 1876 | 0.8380 | -0.0233 | 0.8380 | 0.9154 |
| 0.0993 | 4.0738 | 1878 | 0.7344 | -0.0421 | 0.7344 | 0.8570 |
| 0.0993 | 4.0781 | 1880 | 0.7492 | -0.0233 | 0.7492 | 0.8655 |
| 0.0993 | 4.0824 | 1882 | 0.8795 | -0.0233 | 0.8795 | 0.9378 |
| 0.0993 | 4.0868 | 1884 | 0.9632 | 0.0 | 0.9632 | 0.9814 |
| 0.0993 | 4.0911 | 1886 | 1.0206 | 0.0 | 1.0206 | 1.0102 |
| 0.0993 | 4.0954 | 1888 | 0.9782 | 0.0 | 0.9782 | 0.9890 |
| 0.0993 | 4.0998 | 1890 | 0.8598 | 0.0 | 0.8598 | 0.9273 |
| 0.0993 | 4.1041 | 1892 | 0.7986 | -0.0233 | 0.7986 | 0.8937 |
| 0.0993 | 4.1085 | 1894 | 0.7940 | -0.0233 | 0.7940 | 0.8911 |
| 0.0993 | 4.1128 | 1896 | 0.8366 | 0.0 | 0.8366 | 0.9146 |
| 0.0993 | 4.1171 | 1898 | 0.8157 | -0.0233 | 0.8157 | 0.9031 |
| 0.0993 | 4.1215 | 1900 | 0.8091 | -0.0233 | 0.8091 | 0.8995 |
| 0.0993 | 4.1258 | 1902 | 0.8022 | -0.0233 | 0.8022 | 0.8956 |
| 0.0993 | 4.1302 | 1904 | 0.8532 | -0.0233 | 0.8532 | 0.9237 |
| 0.0993 | 4.1345 | 1906 | 0.8378 | -0.0233 | 0.8378 | 0.9153 |
| 0.0993 | 4.1388 | 1908 | 0.8804 | 0.0 | 0.8804 | 0.9383 |
| 0.0993 | 4.1432 | 1910 | 0.9456 | 0.0 | 0.9456 | 0.9724 |
| 0.0993 | 4.1475 | 1912 | 0.8716 | 0.0 | 0.8716 | 0.9336 |
| 0.0993 | 4.1518 | 1914 | 0.8769 | 0.0 | 0.8769 | 0.9365 |
| 0.0993 | 4.1562 | 1916 | 0.8826 | 0.0 | 0.8826 | 0.9395 |
| 0.0993 | 4.1605 | 1918 | 0.8533 | -0.0233 | 0.8533 | 0.9238 |
| 0.0993 | 4.1649 | 1920 | 0.8599 | -0.0233 | 0.8599 | 0.9273 |
| 0.0993 | 4.1692 | 1922 | 0.8149 | -0.0233 | 0.8149 | 0.9027 |
| 0.0993 | 4.1735 | 1924 | 0.8321 | -0.0233 | 0.8321 | 0.9122 |
| 0.0993 | 4.1779 | 1926 | 0.9637 | -0.0233 | 0.9637 | 0.9817 |
| 0.0993 | 4.1822 | 1928 | 1.0789 | 0.0 | 1.0789 | 1.0387 |
| 0.0993 | 4.1866 | 1930 | 1.0689 | 0.0 | 1.0689 | 1.0339 |
| 0.0993 | 4.1909 | 1932 | 1.0033 | 0.0 | 1.0033 | 1.0017 |
| 0.0993 | 4.1952 | 1934 | 0.9041 | -0.0233 | 0.9041 | 0.9509 |
| 0.0993 | 4.1996 | 1936 | 0.8179 | -0.0233 | 0.8179 | 0.9044 |
| 0.0993 | 4.2039 | 1938 | 0.8119 | -0.0233 | 0.8119 | 0.9011 |
| 0.0993 | 4.2082 | 1940 | 0.8642 | -0.0233 | 0.8642 | 0.9296 |
| 0.0993 | 4.2126 | 1942 | 0.9171 | -0.0233 | 0.9171 | 0.9576 |
| 0.0993 | 4.2169 | 1944 | 1.0048 | 0.0 | 1.0048 | 1.0024 |
| 0.0993 | 4.2213 | 1946 | 1.0001 | 0.0 | 1.0001 | 1.0000 |
| 0.0993 | 4.2256 | 1948 | 0.8911 | -0.0233 | 0.8911 | 0.9440 |
| 0.0993 | 4.2299 | 1950 | 0.8845 | -0.0233 | 0.8845 | 0.9405 |
| 0.0993 | 4.2343 | 1952 | 0.9772 | 0.0 | 0.9772 | 0.9885 |
| 0.0993 | 4.2386 | 1954 | 1.0762 | 0.0 | 1.0762 | 1.0374 |
| 0.0993 | 4.2430 | 1956 | 1.0758 | 0.0 | 1.0758 | 1.0372 |
| 0.0993 | 4.2473 | 1958 | 0.9721 | 0.0 | 0.9721 | 0.9860 |
| 0.0993 | 4.2516 | 1960 | 0.8383 | -0.0233 | 0.8383 | 0.9156 |
| 0.0993 | 4.2560 | 1962 | 0.7627 | -0.0233 | 0.7627 | 0.8733 |
| 0.0993 | 4.2603 | 1964 | 0.7451 | -0.0233 | 0.7451 | 0.8632 |
| 0.0993 | 4.2646 | 1966 | 0.7407 | -0.0233 | 0.7407 | 0.8606 |
| 0.0993 | 4.2690 | 1968 | 0.7923 | -0.0233 | 0.7923 | 0.8901 |
| 0.0993 | 4.2733 | 1970 | 0.8804 | 0.0 | 0.8804 | 0.9383 |
| 0.0993 | 4.2777 | 1972 | 0.9751 | 0.0 | 0.9751 | 0.9875 |
| 0.0993 | 4.2820 | 1974 | 0.9500 | 0.0 | 0.9500 | 0.9747 |
| 0.0993 | 4.2863 | 1976 | 0.8802 | 0.0 | 0.8802 | 0.9382 |
| 0.0993 | 4.2907 | 1978 | 0.8562 | 0.0 | 0.8562 | 0.9253 |
| 0.0993 | 4.2950 | 1980 | 0.8462 | 0.0 | 0.8462 | 0.9199 |
| 0.0993 | 4.2993 | 1982 | 0.8064 | -0.0233 | 0.8064 | 0.8980 |
| 0.0993 | 4.3037 | 1984 | 0.8117 | -0.0233 | 0.8117 | 0.9009 |
| 0.0993 | 4.3080 | 1986 | 0.9201 | -0.0233 | 0.9201 | 0.9592 |
| 0.0993 | 4.3124 | 1988 | 1.0124 | 0.0 | 1.0124 | 1.0062 |
| 0.0993 | 4.3167 | 1990 | 1.0284 | 0.0 | 1.0284 | 1.0141 |
| 0.0993 | 4.3210 | 1992 | 0.9125 | -0.0233 | 0.9125 | 0.9553 |
| 0.0993 | 4.3254 | 1994 | 0.7754 | -0.0233 | 0.7754 | 0.8806 |
| 0.0993 | 4.3297 | 1996 | 0.7098 | 0.1239 | 0.7098 | 0.8425 |
| 0.0993 | 4.3341 | 1998 | 0.6989 | 0.0984 | 0.6989 | 0.8360 |
| 0.076 | 4.3384 | 2000 | 0.7376 | -0.0421 | 0.7376 | 0.8588 |
| 0.076 | 4.3427 | 2002 | 0.7961 | -0.0233 | 0.7961 | 0.8923 |
| 0.076 | 4.3471 | 2004 | 0.8327 | -0.0233 | 0.8327 | 0.9125 |
| 0.076 | 4.3514 | 2006 | 0.8351 | -0.0233 | 0.8351 | 0.9138 |
| 0.076 | 4.3557 | 2008 | 0.8073 | -0.0233 | 0.8073 | 0.8985 |
| 0.076 | 4.3601 | 2010 | 0.8272 | -0.0233 | 0.8272 | 0.9095 |
| 0.076 | 4.3644 | 2012 | 0.8175 | -0.0233 | 0.8175 | 0.9042 |
| 0.076 | 4.3688 | 2014 | 0.8260 | -0.0233 | 0.8260 | 0.9089 |
| 0.076 | 4.3731 | 2016 | 0.8645 | -0.0233 | 0.8645 | 0.9298 |
| 0.076 | 4.3774 | 2018 | 0.9603 | -0.0233 | 0.9603 | 0.9799 |
| 0.076 | 4.3818 | 2020 | 1.0526 | -0.0233 | 1.0525 | 1.0259 |
| 0.076 | 4.3861 | 2022 | 1.0298 | -0.0233 | 1.0298 | 1.0148 |
| 0.076 | 4.3905 | 2024 | 0.9933 | -0.0233 | 0.9933 | 0.9966 |
| 0.076 | 4.3948 | 2026 | 0.8912 | -0.0233 | 0.8912 | 0.9440 |
| 0.076 | 4.3991 | 2028 | 0.8503 | -0.0233 | 0.8503 | 0.9221 |
| 0.076 | 4.4035 | 2030 | 0.8158 | -0.0233 | 0.8158 | 0.9032 |
| 0.076 | 4.4078 | 2032 | 0.8048 | -0.0233 | 0.8048 | 0.8971 |
| 0.076 | 4.4121 | 2034 | 0.8346 | -0.0233 | 0.8346 | 0.9135 |
| 0.076 | 4.4165 | 2036 | 0.8400 | -0.0233 | 0.8400 | 0.9165 |
| 0.076 | 4.4208 | 2038 | 0.8988 | -0.0233 | 0.8988 | 0.9481 |
| 0.076 | 4.4252 | 2040 | 0.9141 | -0.0233 | 0.9141 | 0.9561 |
| 0.076 | 4.4295 | 2042 | 0.9398 | -0.0233 | 0.9398 | 0.9694 |
| 0.076 | 4.4338 | 2044 | 0.8907 | -0.0233 | 0.8907 | 0.9437 |
| 0.076 | 4.4382 | 2046 | 0.8642 | -0.0233 | 0.8642 | 0.9296 |
| 0.076 | 4.4425 | 2048 | 0.8522 | -0.0233 | 0.8522 | 0.9232 |
| 0.076 | 4.4469 | 2050 | 0.8540 | -0.0233 | 0.8540 | 0.9241 |
| 0.076 | 4.4512 | 2052 | 0.9151 | -0.0233 | 0.9151 | 0.9566 |
| 0.076 | 4.4555 | 2054 | 0.9287 | -0.0233 | 0.9287 | 0.9637 |
| 0.076 | 4.4599 | 2056 | 0.9031 | -0.0233 | 0.9031 | 0.9503 |
| 0.076 | 4.4642 | 2058 | 0.8746 | -0.0233 | 0.8746 | 0.9352 |
| 0.076 | 4.4685 | 2060 | 0.8771 | -0.0233 | 0.8771 | 0.9365 |
| 0.076 | 4.4729 | 2062 | 0.8924 | -0.0233 | 0.8924 | 0.9447 |
| 0.076 | 4.4772 | 2064 | 0.9615 | -0.0233 | 0.9615 | 0.9806 |
| 0.076 | 4.4816 | 2066 | 1.1345 | 0.0 | 1.1345 | 1.0651 |
| 0.076 | 4.4859 | 2068 | 1.2003 | 0.0 | 1.2003 | 1.0956 |
| 0.076 | 4.4902 | 2070 | 1.1232 | 0.0 | 1.1232 | 1.0598 |
| 0.076 | 4.4946 | 2072 | 1.0089 | -0.0233 | 1.0089 | 1.0044 |
| 0.076 | 4.4989 | 2074 | 0.9167 | -0.0233 | 0.9167 | 0.9575 |
| 0.076 | 4.5033 | 2076 | 0.8329 | -0.0421 | 0.8329 | 0.9126 |
| 0.076 | 4.5076 | 2078 | 0.8481 | -0.0233 | 0.8481 | 0.9209 |
| 0.076 | 4.5119 | 2080 | 0.8975 | -0.0233 | 0.8975 | 0.9474 |
| 0.076 | 4.5163 | 2082 | 0.9111 | -0.0233 | 0.9111 | 0.9545 |
| 0.076 | 4.5206 | 2084 | 0.9713 | 0.0 | 0.9713 | 0.9856 |
| 0.076 | 4.5249 | 2086 | 0.9711 | 0.0 | 0.9711 | 0.9854 |
| 0.076 | 4.5293 | 2088 | 0.9927 | -0.0233 | 0.9927 | 0.9963 |
| 0.076 | 4.5336 | 2090 | 0.9595 | -0.0233 | 0.9595 | 0.9795 |
| 0.076 | 4.5380 | 2092 | 0.9472 | -0.0233 | 0.9472 | 0.9732 |
| 0.076 | 4.5423 | 2094 | 0.9728 | -0.0233 | 0.9728 | 0.9863 |
| 0.076 | 4.5466 | 2096 | 1.0387 | -0.0233 | 1.0387 | 1.0192 |
| 0.076 | 4.5510 | 2098 | 1.0218 | -0.0233 | 1.0218 | 1.0109 |
| 0.076 | 4.5553 | 2100 | 1.0427 | 0.0 | 1.0427 | 1.0211 |
| 0.076 | 4.5597 | 2102 | 1.0686 | 0.0 | 1.0686 | 1.0338 |
| 0.076 | 4.5640 | 2104 | 1.0577 | 0.0 | 1.0577 | 1.0284 |
| 0.076 | 4.5683 | 2106 | 0.9681 | 0.0 | 0.9681 | 0.9839 |
| 0.076 | 4.5727 | 2108 | 0.8680 | -0.0233 | 0.8680 | 0.9317 |
| 0.076 | 4.5770 | 2110 | 0.8462 | -0.0233 | 0.8462 | 0.9199 |
| 0.076 | 4.5813 | 2112 | 0.8452 | -0.0233 | 0.8452 | 0.9193 |
| 0.076 | 4.5857 | 2114 | 0.8824 | -0.0233 | 0.8824 | 0.9394 |
| 0.076 | 4.5900 | 2116 | 0.9392 | 0.0 | 0.9392 | 0.9691 |
| 0.076 | 4.5944 | 2118 | 1.0023 | 0.0 | 1.0023 | 1.0012 |
| 0.076 | 4.5987 | 2120 | 1.0239 | 0.0 | 1.0239 | 1.0119 |
| 0.076 | 4.6030 | 2122 | 1.0632 | 0.0 | 1.0632 | 1.0311 |
| 0.076 | 4.6074 | 2124 | 1.0273 | 0.0 | 1.0273 | 1.0136 |
| 0.076 | 4.6117 | 2126 | 0.9172 | -0.0233 | 0.9172 | 0.9577 |
| 0.076 | 4.6161 | 2128 | 0.8392 | -0.0233 | 0.8392 | 0.9161 |
| 0.076 | 4.6204 | 2130 | 0.8499 | -0.0233 | 0.8499 | 0.9219 |
| 0.076 | 4.6247 | 2132 | 0.9531 | -0.0233 | 0.9531 | 0.9763 |
| 0.076 | 4.6291 | 2134 | 1.0395 | 0.0 | 1.0395 | 1.0196 |
| 0.076 | 4.6334 | 2136 | 1.0145 | 0.0 | 1.0145 | 1.0072 |
| 0.076 | 4.6377 | 2138 | 0.9309 | -0.0233 | 0.9309 | 0.9648 |
| 0.076 | 4.6421 | 2140 | 0.8859 | -0.0233 | 0.8859 | 0.9412 |
| 0.076 | 4.6464 | 2142 | 0.8825 | -0.0233 | 0.8825 | 0.9394 |
| 0.076 | 4.6508 | 2144 | 0.9252 | -0.0233 | 0.9252 | 0.9619 |
| 0.076 | 4.6551 | 2146 | 0.9581 | -0.0233 | 0.9581 | 0.9788 |
| 0.076 | 4.6594 | 2148 | 0.9452 | -0.0233 | 0.9452 | 0.9722 |
| 0.076 | 4.6638 | 2150 | 0.9164 | -0.0233 | 0.9164 | 0.9573 |
| 0.076 | 4.6681 | 2152 | 0.8656 | -0.0233 | 0.8656 | 0.9304 |
| 0.076 | 4.6725 | 2154 | 0.7674 | -0.0421 | 0.7674 | 0.8760 |
| 0.076 | 4.6768 | 2156 | 0.7437 | -0.0421 | 0.7437 | 0.8624 |
| 0.8624 | | 0.076 | 4.6811 | 2158 | 0.7667 | -0.0421 | 0.7667 | 0.8756 | | 0.076 | 4.6855 | 2160 | 0.8434 | -0.0233 | 0.8434 | 0.9184 | | 0.076 | 4.6898 | 2162 | 0.9940 | 0.0 | 0.9940 | 0.9970 | | 0.076 | 4.6941 | 2164 | 1.0846 | 0.0 | 1.0846 | 1.0414 | | 0.076 | 4.6985 | 2166 | 1.0394 | 0.0 | 1.0394 | 1.0195 | | 0.076 | 4.7028 | 2168 | 0.9821 | 0.0 | 0.9821 | 0.9910 | | 0.076 | 4.7072 | 2170 | 0.8956 | -0.0233 | 0.8956 | 0.9464 | | 0.076 | 4.7115 | 2172 | 0.7669 | -0.0421 | 0.7669 | 0.8757 | | 0.076 | 4.7158 | 2174 | 0.7299 | 0.1239 | 0.7299 | 0.8543 | | 0.076 | 4.7202 | 2176 | 0.7411 | -0.0577 | 0.7411 | 0.8609 | | 0.076 | 4.7245 | 2178 | 0.8093 | -0.0233 | 0.8093 | 0.8996 | | 0.076 | 4.7289 | 2180 | 0.9246 | -0.0233 | 0.9246 | 0.9616 | | 0.076 | 4.7332 | 2182 | 1.0093 | -0.0233 | 1.0093 | 1.0046 | | 0.076 | 4.7375 | 2184 | 1.0830 | -0.0233 | 1.0830 | 1.0407 | | 0.076 | 4.7419 | 2186 | 1.0494 | -0.0233 | 1.0494 | 1.0244 | | 0.076 | 4.7462 | 2188 | 0.9409 | -0.0233 | 0.9409 | 0.9700 | | 0.076 | 4.7505 | 2190 | 0.7952 | 0.1239 | 0.7952 | 0.8918 | | 0.076 | 4.7549 | 2192 | 0.7278 | 0.1239 | 0.7278 | 0.8531 | | 0.076 | 4.7592 | 2194 | 0.7209 | 0.1239 | 0.7209 | 0.8491 | | 0.076 | 4.7636 | 2196 | 0.7566 | 0.1239 | 0.7566 | 0.8699 | | 0.076 | 4.7679 | 2198 | 0.8931 | -0.0233 | 0.8931 | 0.9451 | | 0.076 | 4.7722 | 2200 | 1.0671 | 0.0 | 1.0671 | 1.0330 | | 0.076 | 4.7766 | 2202 | 1.1438 | 0.0 | 1.1438 | 1.0695 | | 0.076 | 4.7809 | 2204 | 1.0783 | 0.0 | 1.0783 | 1.0384 | | 0.076 | 4.7852 | 2206 | 0.9353 | 0.0 | 0.9353 | 0.9671 | | 0.076 | 4.7896 | 2208 | 0.8294 | -0.0233 | 0.8294 | 0.9107 | | 0.076 | 4.7939 | 2210 | 0.7666 | -0.0233 | 0.7666 | 0.8756 | | 0.076 | 4.7983 | 2212 | 0.7072 | 0.1239 | 0.7072 | 0.8409 | | 0.076 | 4.8026 | 2214 | 0.6983 | 0.1239 | 0.6983 | 0.8356 | | 0.076 | 4.8069 | 2216 | 0.7433 | -0.0577 | 0.7433 | 0.8622 | | 0.076 | 4.8113 | 2218 | 0.8118 | -0.0233 | 0.8118 | 0.9010 | | 0.076 | 4.8156 | 2220 | 0.8372 | -0.0233 | 0.8372 | 0.9150 | | 0.076 | 4.8200 | 2222 | 0.8735 | -0.0233 | 0.8735 | 0.9346 | | 0.076 | 4.8243 | 2224 | 0.9228 | -0.0233 | 0.9228 | 0.9606 | | 0.076 | 4.8286 | 2226 | 0.9137 | -0.0233 | 0.9137 | 0.9559 | | 0.076 | 4.8330 | 2228 | 0.8914 | -0.0233 | 0.8914 | 0.9441 | | 0.076 | 4.8373 | 2230 | 0.8680 | -0.0233 | 0.8680 | 0.9317 | | 0.076 | 4.8416 | 2232 | 0.8165 | -0.0577 | 0.8165 | 0.9036 | | 0.076 | 4.8460 | 2234 | 0.7699 | -0.0820 | 0.7699 | 0.8774 | | 0.076 | 4.8503 | 2236 | 0.7779 | -0.2655 | 0.7779 | 0.8820 | | 0.076 | 4.8547 | 2238 | 0.7972 | -0.0577 | 0.7972 | 0.8928 | | 0.076 | 4.8590 | 2240 | 0.8876 | -0.0233 | 0.8876 | 0.9421 | | 0.076 | 4.8633 | 2242 | 0.9497 | -0.0233 | 0.9497 | 0.9745 | | 0.076 | 4.8677 | 2244 | 0.9781 | -0.0233 | 0.9781 | 0.9890 | | 0.076 | 4.8720 | 2246 | 0.9991 | -0.0233 | 0.9991 | 0.9995 | | 0.076 | 4.8764 | 2248 | 1.0591 | 0.0 | 1.0591 | 1.0291 | | 0.076 | 4.8807 | 2250 | 1.0259 | 0.0 | 1.0259 | 1.0128 | | 0.076 | 4.8850 | 2252 | 0.9494 | -0.0233 | 0.9494 | 0.9744 | | 0.076 | 4.8894 | 2254 | 0.8438 | -0.0233 | 0.8438 | 0.9186 | | 0.076 | 4.8937 | 2256 | 0.7680 | 0.1239 | 0.7680 | 0.8764 | | 0.076 | 4.8980 | 2258 | 0.7570 | 0.1239 | 0.7570 | 0.8700 | | 0.076 | 4.9024 | 2260 | 0.8034 | -0.0233 | 0.8034 | 0.8963 | | 0.076 | 4.9067 | 2262 | 0.9002 | -0.0233 | 0.9002 | 0.9488 | | 0.076 | 4.9111 | 2264 | 1.0480 | 0.0 | 1.0480 | 1.0237 | | 0.076 | 4.9154 | 2266 | 1.1098 | 0.0 | 1.1098 | 1.0535 | | 0.076 | 4.9197 | 2268 | 1.0584 | 0.0 | 1.0584 | 1.0288 | | 0.076 | 4.9241 | 2270 | 0.9101 | -0.0233 | 0.9101 | 0.9540 | | 0.076 | 
| 0.076 | 4.9284 | 2272 | 0.8073 | -0.0233 | 0.8073 | 0.8985 |
| 0.076 | 4.9328 | 2274 | 0.7932 | -0.0233 | 0.7932 | 0.8906 |
| 0.076 | 4.9371 | 2276 | 0.7732 | -0.0233 | 0.7732 | 0.8793 |
| 0.076 | 4.9414 | 2278 | 0.7798 | -0.0233 | 0.7798 | 0.8830 |
| 0.076 | 4.9458 | 2280 | 0.7943 | -0.0233 | 0.7943 | 0.8912 |
| 0.076 | 4.9501 | 2282 | 0.8569 | -0.0233 | 0.8569 | 0.9257 |
| 0.076 | 4.9544 | 2284 | 0.9882 | -0.0233 | 0.9882 | 0.9941 |
| 0.076 | 4.9588 | 2286 | 1.1951 | 0.0 | 1.1951 | 1.0932 |
| 0.076 | 4.9631 | 2288 | 1.2756 | 0.0 | 1.2756 | 1.1294 |
| 0.076 | 4.9675 | 2290 | 1.2094 | 0.0 | 1.2094 | 1.0997 |
| 0.076 | 4.9718 | 2292 | 1.0512 | 0.0 | 1.0512 | 1.0253 |
| 0.076 | 4.9761 | 2294 | 0.9377 | -0.0233 | 0.9377 | 0.9683 |
| 0.076 | 4.9805 | 2296 | 0.8538 | -0.0233 | 0.8538 | 0.9240 |
| 0.076 | 4.9848 | 2298 | 0.8393 | -0.0577 | 0.8393 | 0.9161 |
| 0.076 | 4.9892 | 2300 | 0.9055 | -0.0233 | 0.9055 | 0.9516 |
| 0.076 | 4.9935 | 2302 | 0.9885 | -0.0233 | 0.9885 | 0.9942 |
| 0.076 | 4.9978 | 2304 | 1.0848 | 0.0 | 1.0848 | 1.0415 |
| 0.076 | 5.0022 | 2306 | 1.0729 | 0.0 | 1.0729 | 1.0358 |
| 0.076 | 5.0065 | 2308 | 0.9748 | 0.0 | 0.9748 | 0.9873 |
| 0.076 | 5.0108 | 2310 | 0.9116 | -0.0233 | 0.9116 | 0.9548 |
| 0.076 | 5.0152 | 2312 | 0.8524 | -0.0233 | 0.8524 | 0.9233 |
| 0.076 | 5.0195 | 2314 | 0.8409 | -0.0233 | 0.8409 | 0.9170 |
| 0.076 | 5.0239 | 2316 | 0.8527 | -0.0233 | 0.8527 | 0.9234 |
| 0.076 | 5.0282 | 2318 | 0.9112 | -0.0233 | 0.9112 | 0.9545 |
| 0.076 | 5.0325 | 2320 | 0.9999 | -0.0233 | 0.9999 | 0.9999 |
| 0.076 | 5.0369 | 2322 | 1.0612 | 0.0 | 1.0612 | 1.0301 |
| 0.076 | 5.0412 | 2324 | 1.0833 | 0.0 | 1.0833 | 1.0408 |
| 0.076 | 5.0456 | 2326 | 1.0615 | 0.0 | 1.0615 | 1.0303 |
| 0.076 | 5.0499 | 2328 | 1.0010 | 0.0 | 1.0010 | 1.0005 |
| 0.076 | 5.0542 | 2330 | 0.9053 | -0.0233 | 0.9053 | 0.9515 |
| 0.076 | 5.0586 | 2332 | 0.8814 | -0.0233 | 0.8814 | 0.9388 |
| 0.076 | 5.0629 | 2334 | 0.9028 | -0.0233 | 0.9028 | 0.9501 |
| 0.076 | 5.0672 | 2336 | 0.9149 | -0.0233 | 0.9149 | 0.9565 |
| 0.076 | 5.0716 | 2338 | 0.8684 | -0.0233 | 0.8684 | 0.9319 |
| 0.076 | 5.0759 | 2340 | 0.8633 | -0.0233 | 0.8633 | 0.9291 |
| 0.076 | 5.0803 | 2342 | 0.9351 | -0.0233 | 0.9351 | 0.9670 |
| 0.076 | 5.0846 | 2344 | 1.0714 | 0.0 | 1.0714 | 1.0351 |
| 0.076 | 5.0889 | 2346 | 1.1228 | 0.0 | 1.1228 | 1.0596 |
| 0.076 | 5.0933 | 2348 | 1.0534 | 0.0 | 1.0534 | 1.0263 |
| 0.076 | 5.0976 | 2350 | 0.9336 | -0.0233 | 0.9336 | 0.9662 |
| 0.076 | 5.1020 | 2352 | 0.8975 | -0.0233 | 0.8975 | 0.9474 |
| 0.076 | 5.1063 | 2354 | 0.9483 | -0.0233 | 0.9483 | 0.9738 |
| 0.076 | 5.1106 | 2356 | 1.0165 | -0.0233 | 1.0165 | 1.0082 |
| 0.076 | 5.1150 | 2358 | 1.0847 | -0.0233 | 1.0847 | 1.0415 |
| 0.076 | 5.1193 | 2360 | 1.1484 | 0.0 | 1.1484 | 1.0716 |
| 0.076 | 5.1236 | 2362 | 1.1283 | 0.0 | 1.1283 | 1.0622 |
| 0.076 | 5.1280 | 2364 | 1.0834 | -0.0233 | 1.0834 | 1.0409 |
| 0.076 | 5.1323 | 2366 | 1.0722 | -0.0233 | 1.0722 | 1.0355 |
| 0.076 | 5.1367 | 2368 | 0.9984 | -0.0233 | 0.9984 | 0.9992 |
| 0.076 | 5.1410 | 2370 | 0.8914 | -0.0233 | 0.8914 | 0.9441 |
| 0.076 | 5.1453 | 2372 | 0.8572 | -0.0233 | 0.8572 | 0.9258 |
| 0.076 | 5.1497 | 2374 | 0.8848 | -0.0233 | 0.8848 | 0.9406 |
| 0.076 | 5.1540 | 2376 | 0.9465 | -0.0233 | 0.9465 | 0.9729 |
| 0.076 | 5.1584 | 2378 | 0.9649 | -0.0233 | 0.9649 | 0.9823 |
| 0.076 | 5.1627 | 2380 | 1.0173 | 0.0 | 1.0173 | 1.0086 |
| 0.076 | 5.1670 | 2382 | 1.0326 | 0.0 | 1.0326 | 1.0162 |
| 0.076 | 5.1714 | 2384 | 0.9723 | 0.0 | 0.9723 | 0.9861 |
| 0.076 | 5.1757 | 2386 | 0.8966 | -0.0233 | 0.8966 | 0.9469 |
| 0.076 | 5.1800 | 2388 | 0.8875 | -0.0233 | 0.8875 | 0.9421 |
| 0.076 | 5.1844 | 2390 | 0.9551 | -0.0233 | 0.9551 | 0.9773 |
| 0.076 | 5.1887 | 2392 | 1.0146 | -0.0233 | 1.0146 | 1.0073 |
| 0.076 | 5.1931 | 2394 | 1.0136 | -0.0233 | 1.0136 | 1.0068 |
| 0.076 | 5.1974 | 2396 | 1.0167 | -0.0233 | 1.0167 | 1.0083 |
| 0.076 | 5.2017 | 2398 | 0.9855 | -0.0233 | 0.9855 | 0.9927 |
| 0.076 | 5.2061 | 2400 | 1.0402 | -0.0233 | 1.0402 | 1.0199 |
| 0.076 | 5.2104 | 2402 | 1.1433 | -0.0233 | 1.1433 | 1.0692 |
| 0.076 | 5.2148 | 2404 | 1.1381 | -0.0233 | 1.1381 | 1.0668 |
| 0.076 | 5.2191 | 2406 | 1.0492 | -0.0233 | 1.0492 | 1.0243 |
| 0.076 | 5.2234 | 2408 | 1.0370 | -0.0233 | 1.0370 | 1.0184 |
| 0.076 | 5.2278 | 2410 | 1.0847 | -0.0233 | 1.0847 | 1.0415 |
| 0.076 | 5.2321 | 2412 | 1.1718 | -0.0233 | 1.1718 | 1.0825 |
| 0.076 | 5.2364 | 2414 | 1.1566 | 0.0 | 1.1566 | 1.0755 |
| 0.076 | 5.2408 | 2416 | 1.0976 | 0.0 | 1.0976 | 1.0477 |
| 0.076 | 5.2451 | 2418 | 1.0371 | 0.0 | 1.0371 | 1.0184 |
| 0.076 | 5.2495 | 2420 | 0.9760 | 0.0 | 0.9760 | 0.9879 |
| 0.076 | 5.2538 | 2422 | 1.0177 | 0.0 | 1.0177 | 1.0088 |
| 0.076 | 5.2581 | 2424 | 1.1049 | 0.0 | 1.1049 | 1.0511 |
| 0.076 | 5.2625 | 2426 | 1.1929 | 0.0 | 1.1929 | 1.0922 |
| 0.076 | 5.2668 | 2428 | 1.1740 | 0.0 | 1.1740 | 1.0835 |
| 0.076 | 5.2711 | 2430 | 1.0644 | 0.0 | 1.0644 | 1.0317 |
| 0.076 | 5.2755 | 2432 | 0.9197 | 0.0 | 0.9197 | 0.9590 |
| 0.076 | 5.2798 | 2434 | 0.8698 | -0.0233 | 0.8698 | 0.9326 |
| 0.076 | 5.2842 | 2436 | 0.9035 | 0.0 | 0.9035 | 0.9505 |
| 0.076 | 5.2885 | 2438 | 1.0299 | 0.0 | 1.0299 | 1.0148 |
| 0.076 | 5.2928 | 2440 | 1.0777 | 0.0 | 1.0777 | 1.0381 |
| 0.076 | 5.2972 | 2442 | 1.0325 | 0.0 | 1.0325 | 1.0161 |
| 0.076 | 5.3015 | 2444 | 0.9300 | -0.0233 | 0.9300 | 0.9644 |
| 0.076 | 5.3059 | 2446 | 0.8802 | -0.0233 | 0.8802 | 0.9382 |
| 0.076 | 5.3102 | 2448 | 0.9103 | -0.0233 | 0.9103 | 0.9541 |
| 0.076 | 5.3145 | 2450 | 1.0273 | 0.0 | 1.0273 | 1.0135 |
| 0.076 | 5.3189 | 2452 | 1.0826 | 0.0 | 1.0826 | 1.0405 |
| 0.076 | 5.3232 | 2454 | 1.0522 | 0.0 | 1.0522 | 1.0258 |
| 0.076 | 5.3275 | 2456 | 0.9531 | 0.0 | 0.9531 | 0.9763 |
| 0.076 | 5.3319 | 2458 | 0.8840 | -0.0233 | 0.8840 | 0.9402 |
| 0.076 | 5.3362 | 2460 | 0.8564 | -0.0233 | 0.8564 | 0.9254 |
| 0.076 | 5.3406 | 2462 | 0.9021 | -0.0233 | 0.9021 | 0.9498 |
| 0.076 | 5.3449 | 2464 | 1.0297 | 0.0 | 1.0297 | 1.0147 |
| 0.076 | 5.3492 | 2466 | 1.1413 | 0.0 | 1.1413 | 1.0683 |
| 0.076 | 5.3536 | 2468 | 1.1828 | 0.0 | 1.1828 | 1.0876 |
| 0.076 | 5.3579 | 2470 | 1.1377 | 0.0 | 1.1377 | 1.0666 |
| 0.076 | 5.3623 | 2472 | 1.0092 | -0.0233 | 1.0092 | 1.0046 |
| 0.076 | 5.3666 | 2474 | 0.9352 | -0.0233 | 0.9352 | 0.9671 |
| 0.076 | 5.3709 | 2476 | 0.8872 | -0.0233 | 0.8872 | 0.9419 |
| 0.076 | 5.3753 | 2478 | 0.8955 | -0.0233 | 0.8955 | 0.9463 |
| 0.076 | 5.3796 | 2480 | 0.9086 | -0.0233 | 0.9086 | 0.9532 |
| 0.076 | 5.3839 | 2482 | 0.9156 | -0.0233 | 0.9156 | 0.9568 |
| 0.076 | 5.3883 | 2484 | 0.9838 | -0.0233 | 0.9838 | 0.9919 |
| 0.076 | 5.3926 | 2486 | 1.0762 | 0.0 | 1.0762 | 1.0374 |
| 0.076 | 5.3970 | 2488 | 1.0835 | 0.0 | 1.0835 | 1.0409 |
| 0.076 | 5.4013 | 2490 | 1.0441 | -0.0233 | 1.0441 | 1.0218 |
| 0.076 | 5.4056 | 2492 | 0.9357 | -0.0233 | 0.9357 | 0.9673 |
| 0.076 | 5.4100 | 2494 | 0.8379 | -0.0233 | 0.8379 | 0.9154 |
| 0.076 | 5.4143 | 2496 | 0.8298 | -0.0233 | 0.8298 | 0.9110 |
| 0.076 | 5.4187 | 2498 | 0.8952 | -0.0233 | 0.8952 | 0.9462 |
| 0.0627 | 5.4230 | 2500 | 0.9858 | -0.0233 | 0.9858 | 0.9929 |
| 0.0627 | 5.4273 | 2502 | 1.0352 | -0.0233 | 1.0352 | 1.0174 |
| 0.0627 | 5.4317 | 2504 | 1.0226 | -0.0233 | 1.0226 | 1.0113 |
| 0.0627 | 5.4360 | 2506 | 1.0816 | 0.0 | 1.0816 | 1.0400 |
| 0.0627 | 5.4403 | 2508 | 1.1229 | 0.0 | 1.1229 | 1.0597 |
| 0.0627 | 5.4447 | 2510 | 1.1089 | -0.0233 | 1.1089 | 1.0530 |
| 0.0627 | 5.4490 | 2512 | 1.0354 | -0.0233 | 1.0354 | 1.0175 |
| 0.0627 | 5.4534 | 2514 | 0.9696 | -0.0233 | 0.9696 | 0.9847 |
| 0.0627 | 5.4577 | 2516 | 0.9280 | -0.0233 | 0.9280 | 0.9633 |
| 0.0627 | 5.4620 | 2518 | 0.8932 | -0.0233 | 0.8932 | 0.9451 |
| 0.0627 | 5.4664 | 2520 | 0.9485 | -0.0233 | 0.9485 | 0.9739 |
| 0.0627 | 5.4707 | 2522 | 1.0089 | -0.0233 | 1.0089 | 1.0044 |
| 0.0627 | 5.4751 | 2524 | 1.0418 | -0.0233 | 1.0418 | 1.0207 |
| 0.0627 | 5.4794 | 2526 | 0.9743 | -0.0233 | 0.9743 | 0.9870 |
| 0.0627 | 5.4837 | 2528 | 0.9286 | -0.0233 | 0.9286 | 0.9636 |
| 0.0627 | 5.4881 | 2530 | 0.9457 | -0.0233 | 0.9457 | 0.9725 |
| 0.0627 | 5.4924 | 2532 | 0.9244 | -0.0233 | 0.9244 | 0.9614 |
| 0.0627 | 5.4967 | 2534 | 0.9067 | -0.0233 | 0.9067 | 0.9522 |
| 0.0627 | 5.5011 | 2536 | 0.9188 | -0.0233 | 0.9188 | 0.9585 |
| 0.0627 | 5.5054 | 2538 | 0.9401 | -0.0233 | 0.9401 | 0.9696 |
| 0.0627 | 5.5098 | 2540 | 0.9178 | -0.0233 | 0.9178 | 0.9580 |
| 0.0627 | 5.5141 | 2542 | 0.8953 | -0.0233 | 0.8953 | 0.9462 |
| 0.0627 | 5.5184 | 2544 | 0.8681 | -0.0233 | 0.8681 | 0.9317 |
| 0.0627 | 5.5228 | 2546 | 0.8698 | -0.0233 | 0.8698 | 0.9326 |
| 0.0627 | 5.5271 | 2548 | 0.9512 | -0.0233 | 0.9512 | 0.9753 |
| 0.0627 | 5.5315 | 2550 | 1.0354 | -0.0233 | 1.0354 | 1.0175 |
| 0.0627 | 5.5358 | 2552 | 1.1385 | -0.0233 | 1.1385 | 1.0670 |
| 0.0627 | 5.5401 | 2554 | 1.1078 | -0.0233 | 1.1078 | 1.0525 |
| 0.0627 | 5.5445 | 2556 | 1.0263 | -0.0233 | 1.0263 | 1.0130 |
| 0.0627 | 5.5488 | 2558 | 0.9404 | -0.0233 | 0.9404 | 0.9698 |
| 0.0627 | 5.5531 | 2560 | 0.9343 | -0.0233 | 0.9343 | 0.9666 |
| 0.0627 | 5.5575 | 2562 | 0.9935 | -0.0233 | 0.9935 | 0.9967 |
| 0.0627 | 5.5618 | 2564 | 1.0205 | -0.0233 | 1.0205 | 1.0102 |
| 0.0627 | 5.5662 | 2566 | 1.0868 | 0.0 | 1.0868 | 1.0425 |
| 0.0627 | 5.5705 | 2568 | 1.0958 | 0.0 | 1.0958 | 1.0468 |
| 0.0627 | 5.5748 | 2570 | 1.0400 | -0.0233 | 1.0400 | 1.0198 |
| 0.0627 | 5.5792 | 2572 | 0.9643 | -0.0233 | 0.9643 | 0.9820 |
| 0.0627 | 5.5835 | 2574 | 0.9297 | -0.0233 | 0.9297 | 0.9642 |
| 0.0627 | 5.5879 | 2576 | 0.9166 | -0.0233 | 0.9166 | 0.9574 |
| 0.0627 | 5.5922 | 2578 | 0.9412 | -0.0233 | 0.9412 | 0.9702 |
| 0.0627 | 5.5965 | 2580 | 0.9727 | -0.0233 | 0.9727 | 0.9863 |
| 0.0627 | 5.6009 | 2582 | 1.0126 | -0.0233 | 1.0126 | 1.0063 |
| 0.0627 | 5.6052 | 2584 | 1.0552 | -0.0233 | 1.0552 | 1.0272 |
| 0.0627 | 5.6095 | 2586 | 1.0396 | -0.0233 | 1.0396 | 1.0196 |
| 0.0627 | 5.6139 | 2588 | 0.9849 | -0.0233 | 0.9849 | 0.9924 |
| 0.0627 | 5.6182 | 2590 | 1.0118 | -0.0233 | 1.0118 | 1.0059 |
| 0.0627 | 5.6226 | 2592 | 1.0313 | -0.0233 | 1.0313 | 1.0155 |
| 0.0627 | 5.6269 | 2594 | 1.0806 | -0.0233 | 1.0806 | 1.0395 |
| 0.0627 | 5.6312 | 2596 | 1.0432 | -0.0233 | 1.0432 | 1.0214 |
| 0.0627 | 5.6356 | 2598 | 1.0082 | -0.0233 | 1.0082 | 1.0041 |
| 0.0627 | 5.6399 | 2600 | 1.0609 | -0.0233 | 1.0609 | 1.0300 |
| 0.0627 | 5.6443 | 2602 | 1.0497 | -0.0233 | 1.0497 | 1.0245 |
| 0.0627 | 5.6486 | 2604 | 1.0628 | -0.0233 | 1.0628 | 1.0309 |
| 0.0627 | 5.6529 | 2606 | 1.0031 | -0.0233 | 1.0031 | 1.0016 |
| 0.0627 | 5.6573 | 2608 | 0.9572 | -0.0233 | 0.9572 | 0.9784 |
| 0.0627 | 5.6616 | 2610 | 0.9144 | -0.0233 | 0.9144 | 0.9562 |
| 0.0627 | 5.6659 | 2612 | 0.9344 | -0.0233 | 0.9344 | 0.9666 |
| 0.0627 | 5.6703 | 2614 | 0.9194 | -0.0233 | 0.9194 | 0.9589 |
| 0.0627 | 5.6746 | 2616 | 0.9271 | -0.0233 | 0.9271 | 0.9629 |
| 0.0627 | 5.6790 | 2618 | 0.9958 | -0.0233 | 0.9958 | 0.9979 |
| 0.0627 | 5.6833 | 2620 | 1.1152 | 0.0 | 1.1152 | 1.0560 |
| 0.0627 | 5.6876 | 2622 | 1.1685 | 0.0 | 1.1685 | 1.0810 |
| 0.0627 | 5.6920 | 2624 | 1.0941 | 0.0 | 1.0941 | 1.0460 |
| 0.0627 | 5.6963 | 2626 | 0.9468 | -0.0233 | 0.9468 | 0.9730 |
| 0.0627 | 5.7007 | 2628 | 0.8571 | -0.0233 | 0.8571 | 0.9258 |
| 0.0627 | 5.7050 | 2630 | 0.8351 | -0.0233 | 0.8351 | 0.9138 |
| 0.0627 | 5.7093 | 2632 | 0.8976 | -0.0233 | 0.8976 | 0.9474 |
| 0.0627 | 5.7137 | 2634 | 1.0252 | 0.0 | 1.0252 | 1.0125 |
| 0.0627 | 5.7180 | 2636 | 1.0930 | 0.0 | 1.0930 | 1.0455 |
| 0.0627 | 5.7223 | 2638 | 1.0665 | 0.0 | 1.0665 | 1.0327 |
| 0.0627 | 5.7267 | 2640 | 1.0045 | 0.0 | 1.0045 | 1.0022 |
| 0.0627 | 5.7310 | 2642 | 0.8747 | -0.0233 | 0.8747 | 0.9352 |
| 0.0627 | 5.7354 | 2644 | 0.7662 | -0.0421 | 0.7662 | 0.8753 |
| 0.0627 | 5.7397 | 2646 | 0.7524 | -0.0577 | 0.7524 | 0.8674 |
| 0.0627 | 5.7440 | 2648 | 0.7919 | -0.0421 | 0.7919 | 0.8899 |
| 0.0627 | 5.7484 | 2650 | 0.8803 | -0.0233 | 0.8803 | 0.9382 |
| 0.0627 | 5.7527 | 2652 | 0.9743 | -0.0233 | 0.9743 | 0.9871 |
| 0.0627 | 5.7570 | 2654 | 0.9921 | -0.0233 | 0.9921 | 0.9960 |
| 0.0627 | 5.7614 | 2656 | 1.0665 | 0.0 | 1.0665 | 1.0327 |
| 0.0627 | 5.7657 | 2658 | 1.0425 | -0.0233 | 1.0425 | 1.0210 |
| 0.0627 | 5.7701 | 2660 | 0.9920 | -0.0233 | 0.9920 | 0.9960 |
| 0.0627 | 5.7744 | 2662 | 0.9030 | -0.0233 | 0.9030 | 0.9503 |
| 0.0627 | 5.7787 | 2664 | 0.8601 | -0.0233 | 0.8601 | 0.9274 |
| 0.0627 | 5.7831 | 2666 | 0.9140 | -0.0233 | 0.9140 | 0.9560 |
| 0.0627 | 5.7874 | 2668 | 1.0589 | 0.0 | 1.0589 | 1.0290 |
| 0.0627 | 5.7918 | 2670 | 1.1207 | 0.0 | 1.1207 | 1.0586 |
| 0.0627 | 5.7961 | 2672 | 1.0648 | 0.0 | 1.0648 | 1.0319 |
| 0.0627 | 5.8004 | 2674 | 0.9630 | -0.0233 | 0.9630 | 0.9813 |
| 0.0627 | 5.8048 | 2676 | 0.8795 | -0.0233 | 0.8795 | 0.9378 |
| 0.0627 | 5.8091 | 2678 | 0.8799 | -0.0233 | 0.8799 | 0.9380 |
| 0.0627 | 5.8134 | 2680 | 0.9597 | -0.0233 | 0.9597 | 0.9796 |
| 0.0627 | 5.8178 | 2682 | 0.9677 | -0.0233 | 0.9677 | 0.9837 |
| 0.0627 | 5.8221 | 2684 | 0.9487 | -0.0233 | 0.9487 | 0.9740 |
| 0.0627 | 5.8265 | 2686 | 0.9021 | -0.0233 | 0.9021 | 0.9498 |
| 0.0627 | 5.8308 | 2688 | 0.8543 | -0.0233 | 0.8543 | 0.9243 |
| 0.0627 | 5.8351 | 2690 | 0.8223 | -0.0233 | 0.8223 | 0.9068 |
| 0.0627 | 5.8395 | 2692 | 0.7760 | -0.0577 | 0.7760 | 0.8809 |
| 0.0627 | 5.8438 | 2694 | 0.7982 | -0.0233 | 0.7982 | 0.8934 |
| 0.0627 | 5.8482 | 2696 | 0.8266 | -0.0233 | 0.8266 | 0.9092 |
| 0.0627 | 5.8525 | 2698 | 0.8985 | -0.0233 | 0.8985 | 0.9479 |
| 0.0627 | 5.8568 | 2700 | 0.9246 | -0.0233 | 0.9246 | 0.9616 |
| 0.0627 | 5.8612 | 2702 | 0.9025 | -0.0233 | 0.9025 | 0.9500 |
| 0.0627 | 5.8655 | 2704 | 0.8279 | -0.0233 | 0.8279 | 0.9099 |
| 0.0627 | 5.8698 | 2706 | 0.8074 | -0.0233 | 0.8074 | 0.8986 |
| 0.0627 | 5.8742 | 2708 | 0.8279 | -0.0233 | 0.8279 | 0.9099 |
| 0.0627 | 5.8785 | 2710 | 0.9014 | -0.0233 | 0.9014 | 0.9494 |
| 0.0627 | 5.8829 | 2712 | 1.0500 | -0.0233 | 1.0500 | 1.0247 |
| 0.0627 | 5.8872 | 2714 | 1.1180 | -0.0233 | 1.1180 | 1.0574 |
| 0.0627 | 5.8915 | 2716 | 1.0756 | -0.0233 | 1.0756 | 1.0371 |
| 0.0627 | 5.8959 | 2718 | 0.9634 | -0.0233 | 0.9634 | 0.9815 |
| 0.0627 | 5.9002 | 2720 | 0.9007 | -0.0233 | 0.9007 | 0.9490 |
| 0.0627 | 5.9046 | 2722 | 0.8668 | -0.0233 | 0.8668 | 0.9310 |
| 0.0627 | 5.9089 | 2724 | 0.8509 | -0.0233 | 0.8509 | 0.9225 |
| 0.0627 | 5.9132 | 2726 | 0.8542 | -0.0233 | 0.8542 | 0.9242 |
| 0.0627 | 5.9176 | 2728 | 0.8924 | -0.0233 | 0.8924 | 0.9446 |
| 0.0627 | 5.9219 | 2730 | 0.8751 | -0.0233 | 0.8751 | 0.9355 |
| 0.0627 | 5.9262 | 2732 | 0.8922 | -0.0233 | 0.8922 | 0.9446 |
| 0.0627 | 5.9306 | 2734 | 0.9017 | -0.0233 | 0.9017 | 0.9496 |
| 0.0627 | 5.9349 | 2736 | 0.8855 | -0.0233 | 0.8855 | 0.9410 |
| 0.0627 | 5.9393 | 2738 | 0.8948 | -0.0233 | 0.8948 | 0.9459 |
| 0.0627 | 5.9436 | 2740 | 0.9040 | -0.0233 | 0.9040 | 0.9508 |
| 0.0627 | 5.9479 | 2742 | 0.8801 | -0.0233 | 0.8801 | 0.9381 |
| 0.0627 | 5.9523 | 2744 | 0.9002 | -0.0233 | 0.9002 | 0.9488 |
| 0.0627 | 5.9566 | 2746 | 0.9045 | -0.0233 | 0.9045 | 0.9510 |
| 0.0627 | 5.9610 | 2748 | 0.9177 | -0.0233 | 0.9177 | 0.9580 |
| 0.0627 | 5.9653 | 2750 | 0.9536 | -0.0233 | 0.9536 | 0.9765 |
| 0.0627 | 5.9696 | 2752 | 0.9002 | -0.0233 | 0.9002 | 0.9488 |
| 0.0627 | 5.9740 | 2754 | 0.8257 | -0.0233 | 0.8257 | 0.9087 |
| 0.0627 | 5.9783 | 2756 | 0.8139 | -0.0233 | 0.8139 | 0.9021 |
| 0.0627 | 5.9826 | 2758 | 0.8424 | -0.0233 | 0.8424 | 0.9178 |
| 0.0627 | 5.9870 | 2760 | 0.9099 | -0.0233 | 0.9099 | 0.9539 |
| 0.0627 | 5.9913 | 2762 | 0.9651 | -0.0233 | 0.9651 | 0.9824 |
| 0.0627 | 5.9957 | 2764 | 0.9806 | -0.0233 | 0.9806 | 0.9903 |
| 0.0627 | 6.0 | 2766 | 1.0027 | -0.0233 | 1.0027 | 1.0014 |
| 0.0627 | 6.0043 | 2768 | 0.9692 | -0.0233 | 0.9692 | 0.9845 |
| 0.0627 | 6.0087 | 2770 | 0.9590 | -0.0233 | 0.9590 | 0.9793 |
| 0.0627 | 6.0130 | 2772 | 0.9104 | -0.0233 | 0.9104 | 0.9542 |
| 0.0627 | 6.0174 | 2774 | 0.8559 | -0.0233 | 0.8559 | 0.9252 |
| 0.0627 | 6.0217 | 2776 | 0.8503 | -0.0233 | 0.8503 | 0.9221 |
| 0.0627 | 6.0260 | 2778 | 0.8846 | -0.0233 | 0.8846 | 0.9405 |
| 0.0627 | 6.0304 | 2780 | 0.9220 | -0.0233 | 0.9220 | 0.9602 |
| 0.0627 | 6.0347 | 2782 | 0.9314 | -0.0233 | 0.9314 | 0.9651 |
| 0.0627 | 6.0390 | 2784 | 0.9210 | -0.0233 | 0.9210 | 0.9597 |
| 0.0627 | 6.0434 | 2786 | 0.8591 | -0.0233 | 0.8591 | 0.9269 |
| 0.0627 | 6.0477 | 2788 | 0.8349 | -0.0233 | 0.8349 | 0.9137 |
| 0.0627 | 6.0521 | 2790 | 0.8624 | -0.0233 | 0.8624 | 0.9286 |
| 0.0627 | 6.0564 | 2792 | 0.9094 | -0.0233 | 0.9094 | 0.9536 |
| 0.0627 | 6.0607 | 2794 | 0.9060 | -0.0233 | 0.9060 | 0.9519 |
| 0.0627 | 6.0651 | 2796 | 0.8776 | -0.0233 | 0.8776 | 0.9368 |
| 0.0627 | 6.0694 | 2798 | 0.8230 | -0.0233 | 0.8230 | 0.9072 |
| 0.0627 | 6.0738 | 2800 | 0.7724 | -0.0421 | 0.7724 | 0.8789 |
| 0.0627 | 6.0781 | 2802 | 0.7636 | -0.0421 | 0.7636 | 0.8738 |
| 0.0627 | 6.0824 | 2804 | 0.7988 | -0.0233 | 0.7988 | 0.8938 |
| 0.0627 | 6.0868 | 2806 | 0.8409 | -0.0233 | 0.8409 | 0.9170 |
| 0.0627 | 6.0911 | 2808 | 0.8478 | -0.0233 | 0.8478 | 0.9208 |
| 0.0627 | 6.0954 | 2810 | 0.8191 | -0.0233 | 0.8191 | 0.9050 |
| 0.0627 | 6.0998 | 2812 | 0.7627 | -0.0233 | 0.7627 | 0.8733 |
| 0.0627 | 6.1041 | 2814 | 0.7101 | 0.1239 | 0.7101 | 0.8427 |
| 0.0627 | 6.1085 | 2816 | 0.7019 | 0.1239 | 0.7019 | 0.8378 |
| 0.0627 | 6.1128 | 2818 | 0.7237 | 0.1239 | 0.7237 | 0.8507 |
| 0.0627 | 6.1171 | 2820 | 0.7574 | -0.0421 | 0.7574 | 0.8703 |
| 0.0627 | 6.1215 | 2822 | 0.7962 | -0.0233 | 0.7962 | 0.8923 |
| 0.0627 | 6.1258 | 2824 | 0.8278 | -0.0233 | 0.8278 | 0.9098 |
| 0.0627 | 6.1302 | 2826 | 0.8523 | -0.0233 | 0.8523 | 0.9232 |
| 0.0627 | 6.1345 | 2828 | 0.8434 | -0.0233 | 0.8434 | 0.9184 |
| 0.0627 | 6.1388 | 2830 | 0.8041 | -0.0233 | 0.8041 | 0.8967 |
| 0.0627 | 6.1432 | 2832 | 0.8077 | -0.0233 | 0.8077 | 0.8987 |
| 0.0627 | 6.1475 | 2834 | 0.8631 | -0.0233 | 0.8631 | 0.9290 |
| 0.0627 | 6.1518 | 2836 | 0.9777 | -0.0233 | 0.9777 | 0.9888 |
| 0.0627 |
6.1562 | 2838 | 1.0586 | 0.0 | 1.0586 | 1.0289 | | 0.0627 | 6.1605 | 2840 | 1.0797 | 0.0 | 1.0797 | 1.0391 | | 0.0627 | 6.1649 | 2842 | 1.0232 | 0.0 | 1.0232 | 1.0115 | | 0.0627 | 6.1692 | 2844 | 0.9113 | -0.0233 | 0.9113 | 0.9546 | | 0.0627 | 6.1735 | 2846 | 0.7942 | -0.0233 | 0.7942 | 0.8912 | | 0.0627 | 6.1779 | 2848 | 0.7512 | -0.0233 | 0.7512 | 0.8667 | | 0.0627 | 6.1822 | 2850 | 0.7479 | -0.0233 | 0.7479 | 0.8648 | | 0.0627 | 6.1866 | 2852 | 0.7795 | -0.0233 | 0.7795 | 0.8829 | | 0.0627 | 6.1909 | 2854 | 0.8574 | -0.0233 | 0.8574 | 0.9259 | | 0.0627 | 6.1952 | 2856 | 0.9449 | -0.0233 | 0.9449 | 0.9721 | | 0.0627 | 6.1996 | 2858 | 0.9912 | 0.0 | 0.9912 | 0.9956 | | 0.0627 | 6.2039 | 2860 | 0.9635 | 0.0 | 0.9635 | 0.9816 | | 0.0627 | 6.2082 | 2862 | 0.8768 | -0.0233 | 0.8768 | 0.9364 | | 0.0627 | 6.2126 | 2864 | 0.8144 | -0.0233 | 0.8144 | 0.9024 | | 0.0627 | 6.2169 | 2866 | 0.7265 | -0.0233 | 0.7265 | 0.8523 | | 0.0627 | 6.2213 | 2868 | 0.6665 | 0.1538 | 0.6665 | 0.8164 | | 0.0627 | 6.2256 | 2870 | 0.6557 | 0.1239 | 0.6557 | 0.8097 | | 0.0627 | 6.2299 | 2872 | 0.6679 | 0.1538 | 0.6679 | 0.8173 | | 0.0627 | 6.2343 | 2874 | 0.7153 | -0.0233 | 0.7153 | 0.8457 | | 0.0627 | 6.2386 | 2876 | 0.8169 | -0.0233 | 0.8169 | 0.9038 | | 0.0627 | 6.2430 | 2878 | 0.9051 | 0.0 | 0.9051 | 0.9514 | | 0.0627 | 6.2473 | 2880 | 0.9306 | 0.0 | 0.9306 | 0.9647 | | 0.0627 | 6.2516 | 2882 | 0.9017 | 0.0 | 0.9017 | 0.9496 | | 0.0627 | 6.2560 | 2884 | 0.8313 | -0.0233 | 0.8313 | 0.9117 | | 0.0627 | 6.2603 | 2886 | 0.7380 | -0.0233 | 0.7380 | 0.8590 | | 0.0627 | 6.2646 | 2888 | 0.6913 | 0.1895 | 0.6913 | 0.8314 | | 0.0627 | 6.2690 | 2890 | 0.6907 | 0.1895 | 0.6907 | 0.8311 | | 0.0627 | 6.2733 | 2892 | 0.7237 | -0.0233 | 0.7237 | 0.8507 | | 0.0627 | 6.2777 | 2894 | 0.7703 | -0.0233 | 0.7703 | 0.8777 | | 0.0627 | 6.2820 | 2896 | 0.8514 | -0.0233 | 0.8514 | 0.9227 | | 0.0627 | 6.2863 | 2898 | 0.9534 | -0.0233 | 0.9534 | 0.9764 | | 0.0627 | 6.2907 | 2900 | 0.9661 | -0.0233 | 0.9661 | 0.9829 | | 0.0627 | 6.2950 | 2902 | 0.9098 | -0.0233 | 0.9098 | 0.9538 | | 0.0627 | 6.2993 | 2904 | 0.8460 | -0.0233 | 0.8460 | 0.9198 | | 0.0627 | 6.3037 | 2906 | 0.8076 | -0.0233 | 0.8076 | 0.8987 | | 0.0627 | 6.3080 | 2908 | 0.7486 | -0.0233 | 0.7486 | 0.8652 | | 0.0627 | 6.3124 | 2910 | 0.7410 | -0.0233 | 0.7410 | 0.8608 | | 0.0627 | 6.3167 | 2912 | 0.7798 | -0.0233 | 0.7798 | 0.8831 | | 0.0627 | 6.3210 | 2914 | 0.7804 | -0.0233 | 0.7804 | 0.8834 | | 0.0627 | 6.3254 | 2916 | 0.7721 | -0.0233 | 0.7721 | 0.8787 | | 0.0627 | 6.3297 | 2918 | 0.7973 | -0.0233 | 0.7973 | 0.8929 | | 0.0627 | 6.3341 | 2920 | 0.8210 | -0.0233 | 0.8210 | 0.9061 | | 0.0627 | 6.3384 | 2922 | 0.7963 | -0.0233 | 0.7963 | 0.8924 | | 0.0627 | 6.3427 | 2924 | 0.7651 | -0.0233 | 0.7651 | 0.8747 | | 0.0627 | 6.3471 | 2926 | 0.7641 | -0.0233 | 0.7641 | 0.8741 | | 0.0627 | 6.3514 | 2928 | 0.7385 | -0.0233 | 0.7385 | 0.8594 | | 0.0627 | 6.3557 | 2930 | 0.7047 | -0.0233 | 0.7047 | 0.8395 | | 0.0627 | 6.3601 | 2932 | 0.7056 | 0.1895 | 0.7056 | 0.8400 | | 0.0627 | 6.3644 | 2934 | 0.7344 | -0.0233 | 0.7344 | 0.8570 | | 0.0627 | 6.3688 | 2936 | 0.7650 | -0.0233 | 0.7650 | 0.8746 | | 0.0627 | 6.3731 | 2938 | 0.8213 | -0.0233 | 0.8213 | 0.9063 | | 0.0627 | 6.3774 | 2940 | 0.8635 | -0.0233 | 0.8635 | 0.9293 | | 0.0627 | 6.3818 | 2942 | 0.8619 | -0.0233 | 0.8619 | 0.9284 | | 0.0627 | 6.3861 | 2944 | 0.8076 | -0.0233 | 0.8076 | 0.8986 | | 0.0627 | 6.3905 | 2946 | 0.7386 | -0.0233 | 0.7386 | 0.8594 | | 0.0627 | 6.3948 | 2948 | 0.7332 | -0.0233 | 0.7332 | 0.8563 | | 0.0627 | 6.3991 | 
2950 | 0.7775 | -0.0233 | 0.7775 | 0.8817 | | 0.0627 | 6.4035 | 2952 | 0.7931 | -0.0233 | 0.7931 | 0.8905 | | 0.0627 | 6.4078 | 2954 | 0.8123 | -0.0233 | 0.8123 | 0.9013 | | 0.0627 | 6.4121 | 2956 | 0.8225 | -0.0233 | 0.8225 | 0.9069 | | 0.0627 | 6.4165 | 2958 | 0.8574 | -0.0233 | 0.8574 | 0.9259 | | 0.0627 | 6.4208 | 2960 | 0.9091 | -0.0233 | 0.9091 | 0.9535 | | 0.0627 | 6.4252 | 2962 | 0.9064 | -0.0233 | 0.9064 | 0.9520 | | 0.0627 | 6.4295 | 2964 | 0.8648 | -0.0233 | 0.8648 | 0.9300 | | 0.0627 | 6.4338 | 2966 | 0.8158 | -0.0233 | 0.8158 | 0.9032 | | 0.0627 | 6.4382 | 2968 | 0.8071 | -0.0233 | 0.8071 | 0.8984 | | 0.0627 | 6.4425 | 2970 | 0.7689 | -0.0233 | 0.7689 | 0.8769 | | 0.0627 | 6.4469 | 2972 | 0.7598 | -0.0233 | 0.7598 | 0.8716 | | 0.0627 | 6.4512 | 2974 | 0.7840 | -0.0233 | 0.7840 | 0.8854 | | 0.0627 | 6.4555 | 2976 | 0.8456 | -0.0233 | 0.8456 | 0.9195 | | 0.0627 | 6.4599 | 2978 | 0.9475 | -0.0233 | 0.9475 | 0.9734 | | 0.0627 | 6.4642 | 2980 | 0.9798 | 0.0 | 0.9798 | 0.9898 | | 0.0627 | 6.4685 | 2982 | 0.9389 | -0.0233 | 0.9389 | 0.9689 | | 0.0627 | 6.4729 | 2984 | 0.8697 | -0.0233 | 0.8697 | 0.9326 | | 0.0627 | 6.4772 | 2986 | 0.8702 | -0.0233 | 0.8702 | 0.9328 | | 0.0627 | 6.4816 | 2988 | 0.8430 | -0.0233 | 0.8430 | 0.9181 | | 0.0627 | 6.4859 | 2990 | 0.7998 | -0.0233 | 0.7998 | 0.8943 | | 0.0627 | 6.4902 | 2992 | 0.8064 | -0.0233 | 0.8064 | 0.8980 | | 0.0627 | 6.4946 | 2994 | 0.8548 | -0.0233 | 0.8548 | 0.9245 | | 0.0627 | 6.4989 | 2996 | 0.8944 | -0.0233 | 0.8944 | 0.9457 | | 0.0627 | 6.5033 | 2998 | 0.9647 | -0.0233 | 0.9647 | 0.9822 | | 0.0548 | 6.5076 | 3000 | 1.0033 | 0.0 | 1.0033 | 1.0016 | | 0.0548 | 6.5119 | 3002 | 1.0084 | 0.0 | 1.0084 | 1.0042 | | 0.0548 | 6.5163 | 3004 | 0.9390 | -0.0233 | 0.9390 | 0.9690 | | 0.0548 | 6.5206 | 3006 | 0.9302 | -0.0233 | 0.9302 | 0.9645 | | 0.0548 | 6.5249 | 3008 | 0.9635 | -0.0233 | 0.9635 | 0.9816 | | 0.0548 | 6.5293 | 3010 | 0.9960 | 0.0 | 0.9960 | 0.9980 | | 0.0548 | 6.5336 | 3012 | 0.9887 | 0.0 | 0.9887 | 0.9943 | | 0.0548 | 6.5380 | 3014 | 0.9652 | 0.0 | 0.9652 | 0.9824 | | 0.0548 | 6.5423 | 3016 | 0.9398 | 0.0 | 0.9398 | 0.9694 | | 0.0548 | 6.5466 | 3018 | 0.9121 | -0.0233 | 0.9121 | 0.9550 | | 0.0548 | 6.5510 | 3020 | 0.8976 | -0.0233 | 0.8976 | 0.9474 | | 0.0548 | 6.5553 | 3022 | 0.9213 | -0.0233 | 0.9213 | 0.9598 | | 0.0548 | 6.5597 | 3024 | 1.0134 | 0.0 | 1.0134 | 1.0067 | | 0.0548 | 6.5640 | 3026 | 1.0402 | 0.0 | 1.0402 | 1.0199 | | 0.0548 | 6.5683 | 3028 | 0.9804 | -0.0233 | 0.9804 | 0.9901 | | 0.0548 | 6.5727 | 3030 | 0.8851 | -0.0233 | 0.8851 | 0.9408 | | 0.0548 | 6.5770 | 3032 | 0.8672 | -0.0233 | 0.8672 | 0.9312 | | 0.0548 | 6.5813 | 3034 | 0.9256 | -0.0233 | 0.9256 | 0.9621 | | 0.0548 | 6.5857 | 3036 | 1.0095 | -0.0233 | 1.0095 | 1.0048 | | 0.0548 | 6.5900 | 3038 | 1.0861 | 0.0 | 1.0861 | 1.0421 | | 0.0548 | 6.5944 | 3040 | 1.1060 | 0.0 | 1.1060 | 1.0517 | | 0.0548 | 6.5987 | 3042 | 1.0269 | -0.0233 | 1.0269 | 1.0134 | | 0.0548 | 6.6030 | 3044 | 0.9237 | -0.0233 | 0.9237 | 0.9611 | | 0.0548 | 6.6074 | 3046 | 0.9131 | -0.0233 | 0.9131 | 0.9555 | | 0.0548 | 6.6117 | 3048 | 0.9755 | -0.0233 | 0.9755 | 0.9877 | | 0.0548 | 6.6161 | 3050 | 1.0512 | 0.0 | 1.0512 | 1.0253 | | 0.0548 | 6.6204 | 3052 | 1.0526 | 0.0 | 1.0526 | 1.0260 | | 0.0548 | 6.6247 | 3054 | 0.9668 | -0.0233 | 0.9668 | 0.9832 | | 0.0548 | 6.6291 | 3056 | 0.8756 | -0.0233 | 0.8756 | 0.9357 | | 0.0548 | 6.6334 | 3058 | 0.8652 | -0.0233 | 0.8652 | 0.9301 | | 0.0548 | 6.6377 | 3060 | 0.9179 | -0.0233 | 0.9179 | 0.9581 | | 0.0548 | 6.6421 | 3062 | 0.9856 | 
-0.0233 | 0.9856 | 0.9928 | | 0.0548 | 6.6464 | 3064 | 1.0450 | 0.0 | 1.0450 | 1.0223 | | 0.0548 | 6.6508 | 3066 | 1.1183 | 0.0 | 1.1183 | 1.0575 | | 0.0548 | 6.6551 | 3068 | 1.0838 | 0.0 | 1.0838 | 1.0411 | | 0.0548 | 6.6594 | 3070 | 1.0004 | -0.0233 | 1.0004 | 1.0002 | | 0.0548 | 6.6638 | 3072 | 0.9462 | -0.0233 | 0.9462 | 0.9727 | | 0.0548 | 6.6681 | 3074 | 0.8985 | -0.0233 | 0.8985 | 0.9479 | | 0.0548 | 6.6725 | 3076 | 0.8988 | -0.0233 | 0.8988 | 0.9481 | | 0.0548 | 6.6768 | 3078 | 0.9243 | -0.0233 | 0.9243 | 0.9614 | | 0.0548 | 6.6811 | 3080 | 0.9864 | -0.0233 | 0.9864 | 0.9932 | | 0.0548 | 6.6855 | 3082 | 1.0362 | -0.0233 | 1.0362 | 1.0179 | | 0.0548 | 6.6898 | 3084 | 1.0945 | 0.0 | 1.0945 | 1.0462 | | 0.0548 | 6.6941 | 3086 | 1.0712 | 0.0 | 1.0712 | 1.0350 | | 0.0548 | 6.6985 | 3088 | 1.0124 | -0.0233 | 1.0124 | 1.0062 | | 0.0548 | 6.7028 | 3090 | 0.9707 | -0.0233 | 0.9707 | 0.9852 | | 0.0548 | 6.7072 | 3092 | 0.9711 | -0.0233 | 0.9711 | 0.9854 | | 0.0548 | 6.7115 | 3094 | 0.9368 | -0.0233 | 0.9368 | 0.9679 | | 0.0548 | 6.7158 | 3096 | 0.9401 | -0.0233 | 0.9401 | 0.9696 | | 0.0548 | 6.7202 | 3098 | 0.9379 | -0.0233 | 0.9379 | 0.9684 | | 0.0548 | 6.7245 | 3100 | 0.9083 | -0.0233 | 0.9083 | 0.9531 | | 0.0548 | 6.7289 | 3102 | 0.9071 | -0.0233 | 0.9071 | 0.9524 | | 0.0548 | 6.7332 | 3104 | 0.9497 | -0.0233 | 0.9497 | 0.9745 | | 0.0548 | 6.7375 | 3106 | 0.9807 | -0.0233 | 0.9807 | 0.9903 | | 0.0548 | 6.7419 | 3108 | 1.0321 | -0.0233 | 1.0321 | 1.0159 | | 0.0548 | 6.7462 | 3110 | 1.0372 | -0.0233 | 1.0372 | 1.0184 | | 0.0548 | 6.7505 | 3112 | 1.0559 | -0.0233 | 1.0559 | 1.0276 | | 0.0548 | 6.7549 | 3114 | 1.0075 | -0.0233 | 1.0075 | 1.0038 | | 0.0548 | 6.7592 | 3116 | 0.9359 | -0.0233 | 0.9359 | 0.9674 | | 0.0548 | 6.7636 | 3118 | 0.8753 | -0.0233 | 0.8753 | 0.9356 | | 0.0548 | 6.7679 | 3120 | 0.8745 | -0.0233 | 0.8745 | 0.9351 | | 0.0548 | 6.7722 | 3122 | 0.9128 | -0.0233 | 0.9128 | 0.9554 | | 0.0548 | 6.7766 | 3124 | 0.9636 | -0.0233 | 0.9636 | 0.9816 | | 0.0548 | 6.7809 | 3126 | 0.9668 | -0.0233 | 0.9668 | 0.9833 | | 0.0548 | 6.7852 | 3128 | 0.9162 | -0.0233 | 0.9162 | 0.9572 | | 0.0548 | 6.7896 | 3130 | 0.8824 | -0.0233 | 0.8824 | 0.9394 | | 0.0548 | 6.7939 | 3132 | 0.8889 | -0.0233 | 0.8889 | 0.9428 | | 0.0548 | 6.7983 | 3134 | 0.9466 | -0.0233 | 0.9466 | 0.9729 | | 0.0548 | 6.8026 | 3136 | 0.9746 | -0.0233 | 0.9746 | 0.9872 | | 0.0548 | 6.8069 | 3138 | 0.9889 | -0.0233 | 0.9889 | 0.9944 | | 0.0548 | 6.8113 | 3140 | 0.9781 | -0.0233 | 0.9781 | 0.9890 | | 0.0548 | 6.8156 | 3142 | 0.9281 | -0.0233 | 0.9281 | 0.9634 | | 0.0548 | 6.8200 | 3144 | 0.9032 | -0.0233 | 0.9032 | 0.9504 | | 0.0548 | 6.8243 | 3146 | 0.9221 | -0.0233 | 0.9221 | 0.9603 | | 0.0548 | 6.8286 | 3148 | 0.9804 | -0.0233 | 0.9804 | 0.9902 | | 0.0548 | 6.8330 | 3150 | 1.0034 | -0.0233 | 1.0034 | 1.0017 | | 0.0548 | 6.8373 | 3152 | 0.9767 | -0.0233 | 0.9767 | 0.9883 | | 0.0548 | 6.8416 | 3154 | 0.9037 | -0.0233 | 0.9037 | 0.9506 | | 0.0548 | 6.8460 | 3156 | 0.8461 | -0.0233 | 0.8461 | 0.9198 | | 0.0548 | 6.8503 | 3158 | 0.8067 | -0.0233 | 0.8067 | 0.8982 | | 0.0548 | 6.8547 | 3160 | 0.7990 | -0.0233 | 0.7990 | 0.8939 | | 0.0548 | 6.8590 | 3162 | 0.7993 | -0.0233 | 0.7993 | 0.8940 | | 0.0548 | 6.8633 | 3164 | 0.8188 | -0.0233 | 0.8188 | 0.9049 | | 0.0548 | 6.8677 | 3166 | 0.8431 | -0.0233 | 0.8431 | 0.9182 | | 0.0548 | 6.8720 | 3168 | 0.8840 | -0.0233 | 0.8840 | 0.9402 | | 0.0548 | 6.8764 | 3170 | 0.9569 | -0.0233 | 0.9569 | 0.9782 | | 0.0548 | 6.8807 | 3172 | 0.9812 | -0.0233 | 0.9812 | 0.9905 | | 0.0548 | 6.8850 | 3174 | 
0.9461 | -0.0233 | 0.9461 | 0.9727 | | 0.0548 | 6.8894 | 3176 | 0.8794 | -0.0233 | 0.8794 | 0.9377 | | 0.0548 | 6.8937 | 3178 | 0.8205 | -0.0233 | 0.8205 | 0.9058 | | 0.0548 | 6.8980 | 3180 | 0.7997 | -0.0233 | 0.7997 | 0.8943 | | 0.0548 | 6.9024 | 3182 | 0.7984 | -0.0233 | 0.7984 | 0.8935 | | 0.0548 | 6.9067 | 3184 | 0.8279 | -0.0233 | 0.8279 | 0.9099 | | 0.0548 | 6.9111 | 3186 | 0.8915 | -0.0233 | 0.8915 | 0.9442 | | 0.0548 | 6.9154 | 3188 | 1.0067 | 0.0 | 1.0067 | 1.0033 | | 0.0548 | 6.9197 | 3190 | 1.0836 | 0.0 | 1.0836 | 1.0409 | | 0.0548 | 6.9241 | 3192 | 1.0888 | 0.0 | 1.0888 | 1.0435 | | 0.0548 | 6.9284 | 3194 | 1.0355 | 0.0 | 1.0355 | 1.0176 | | 0.0548 | 6.9328 | 3196 | 0.9640 | -0.0233 | 0.9640 | 0.9818 | | 0.0548 | 6.9371 | 3198 | 0.9068 | -0.0233 | 0.9068 | 0.9523 | | 0.0548 | 6.9414 | 3200 | 0.8430 | -0.0233 | 0.8430 | 0.9181 | | 0.0548 | 6.9458 | 3202 | 0.8172 | -0.0233 | 0.8172 | 0.9040 | | 0.0548 | 6.9501 | 3204 | 0.8351 | -0.0233 | 0.8351 | 0.9138 | | 0.0548 | 6.9544 | 3206 | 0.8951 | -0.0233 | 0.8951 | 0.9461 | | 0.0548 | 6.9588 | 3208 | 0.9716 | -0.0233 | 0.9716 | 0.9857 | | 0.0548 | 6.9631 | 3210 | 0.9876 | -0.0233 | 0.9876 | 0.9938 | | 0.0548 | 6.9675 | 3212 | 0.9582 | -0.0233 | 0.9582 | 0.9789 | | 0.0548 | 6.9718 | 3214 | 0.9188 | -0.0233 | 0.9188 | 0.9586 | | 0.0548 | 6.9761 | 3216 | 0.8808 | -0.0233 | 0.8808 | 0.9385 | | 0.0548 | 6.9805 | 3218 | 0.8761 | -0.0233 | 0.8761 | 0.9360 | | 0.0548 | 6.9848 | 3220 | 0.9169 | -0.0233 | 0.9169 | 0.9576 | | 0.0548 | 6.9892 | 3222 | 0.9806 | -0.0233 | 0.9806 | 0.9903 | | 0.0548 | 6.9935 | 3224 | 1.0057 | 0.0 | 1.0057 | 1.0029 | | 0.0548 | 6.9978 | 3226 | 0.9933 | -0.0233 | 0.9933 | 0.9966 | | 0.0548 | 7.0022 | 3228 | 0.9636 | -0.0233 | 0.9636 | 0.9816 | | 0.0548 | 7.0065 | 3230 | 0.9215 | -0.0233 | 0.9215 | 0.9600 | | 0.0548 | 7.0108 | 3232 | 0.8746 | -0.0233 | 0.8746 | 0.9352 | | 0.0548 | 7.0152 | 3234 | 0.8474 | -0.0233 | 0.8474 | 0.9205 | | 0.0548 | 7.0195 | 3236 | 0.8385 | -0.0233 | 0.8385 | 0.9157 | | 0.0548 | 7.0239 | 3238 | 0.8751 | -0.0233 | 0.8751 | 0.9355 | | 0.0548 | 7.0282 | 3240 | 0.9411 | -0.0233 | 0.9411 | 0.9701 | | 0.0548 | 7.0325 | 3242 | 0.9819 | -0.0233 | 0.9819 | 0.9909 | | 0.0548 | 7.0369 | 3244 | 0.9703 | -0.0233 | 0.9703 | 0.9850 | | 0.0548 | 7.0412 | 3246 | 0.9155 | -0.0233 | 0.9155 | 0.9568 | | 0.0548 | 7.0456 | 3248 | 0.8958 | -0.0233 | 0.8958 | 0.9465 | | 0.0548 | 7.0499 | 3250 | 0.9220 | -0.0233 | 0.9220 | 0.9602 | | 0.0548 | 7.0542 | 3252 | 0.9346 | -0.0233 | 0.9346 | 0.9668 | | 0.0548 | 7.0586 | 3254 | 0.9754 | -0.0233 | 0.9754 | 0.9876 | | 0.0548 | 7.0629 | 3256 | 1.0440 | 0.0 | 1.0440 | 1.0218 | | 0.0548 | 7.0672 | 3258 | 1.0342 | -0.0233 | 1.0342 | 1.0170 | | 0.0548 | 7.0716 | 3260 | 0.9644 | -0.0233 | 0.9644 | 0.9821 | | 0.0548 | 7.0759 | 3262 | 0.9200 | -0.0233 | 0.9200 | 0.9592 | | 0.0548 | 7.0803 | 3264 | 0.8935 | -0.0233 | 0.8935 | 0.9452 | | 0.0548 | 7.0846 | 3266 | 0.8398 | -0.0233 | 0.8398 | 0.9164 | | 0.0548 | 7.0889 | 3268 | 0.8058 | -0.0421 | 0.8058 | 0.8977 | | 0.0548 | 7.0933 | 3270 | 0.8026 | -0.0421 | 0.8026 | 0.8959 | | 0.0548 | 7.0976 | 3272 | 0.8257 | -0.0233 | 0.8257 | 0.9087 | | 0.0548 | 7.1020 | 3274 | 0.8727 | -0.0233 | 0.8727 | 0.9342 | | 0.0548 | 7.1063 | 3276 | 0.9461 | -0.0233 | 0.9461 | 0.9727 | | 0.0548 | 7.1106 | 3278 | 0.9726 | -0.0233 | 0.9726 | 0.9862 | | 0.0548 | 7.1150 | 3280 | 0.9591 | -0.0233 | 0.9591 | 0.9793 | | 0.0548 | 7.1193 | 3282 | 0.8908 | -0.0233 | 0.8908 | 0.9438 | | 0.0548 | 7.1236 | 3284 | 0.8536 | -0.0233 | 0.8536 | 0.9239 | | 0.0548 | 7.1280 | 
3286 | 0.8184 | -0.0233 | 0.8184 | 0.9047 | | 0.0548 | 7.1323 | 3288 | 0.8377 | -0.0233 | 0.8377 | 0.9153 | | 0.0548 | 7.1367 | 3290 | 0.8744 | -0.0233 | 0.8744 | 0.9351 | | 0.0548 | 7.1410 | 3292 | 0.8728 | -0.0233 | 0.8728 | 0.9343 | | 0.0548 | 7.1453 | 3294 | 0.8671 | -0.0233 | 0.8671 | 0.9312 | | 0.0548 | 7.1497 | 3296 | 0.8961 | -0.0233 | 0.8961 | 0.9466 | | 0.0548 | 7.1540 | 3298 | 0.9333 | -0.0233 | 0.9333 | 0.9661 | | 0.0548 | 7.1584 | 3300 | 0.9452 | -0.0233 | 0.9452 | 0.9722 | | 0.0548 | 7.1627 | 3302 | 0.9332 | -0.0233 | 0.9332 | 0.9660 | | 0.0548 | 7.1670 | 3304 | 0.9017 | -0.0233 | 0.9017 | 0.9496 | | 0.0548 | 7.1714 | 3306 | 0.8564 | -0.0233 | 0.8564 | 0.9254 | | 0.0548 | 7.1757 | 3308 | 0.8110 | -0.0233 | 0.8110 | 0.9005 | | 0.0548 | 7.1800 | 3310 | 0.7940 | -0.0233 | 0.7940 | 0.8911 | | 0.0548 | 7.1844 | 3312 | 0.8159 | -0.0233 | 0.8159 | 0.9032 | | 0.0548 | 7.1887 | 3314 | 0.8738 | -0.0233 | 0.8738 | 0.9348 | | 0.0548 | 7.1931 | 3316 | 0.9289 | -0.0233 | 0.9289 | 0.9638 | | 0.0548 | 7.1974 | 3318 | 0.9802 | 0.0 | 0.9802 | 0.9900 | | 0.0548 | 7.2017 | 3320 | 0.9955 | 0.0 | 0.9955 | 0.9977 | | 0.0548 | 7.2061 | 3322 | 0.9724 | 0.0 | 0.9724 | 0.9861 | | 0.0548 | 7.2104 | 3324 | 0.9147 | -0.0233 | 0.9147 | 0.9564 | | 0.0548 | 7.2148 | 3326 | 0.8522 | -0.0233 | 0.8522 | 0.9232 | | 0.0548 | 7.2191 | 3328 | 0.8539 | -0.0233 | 0.8539 | 0.9241 | | 0.0548 | 7.2234 | 3330 | 0.8745 | -0.0233 | 0.8745 | 0.9352 | | 0.0548 | 7.2278 | 3332 | 0.8891 | -0.0233 | 0.8891 | 0.9429 | | 0.0548 | 7.2321 | 3334 | 0.9607 | -0.0233 | 0.9607 | 0.9801 | | 0.0548 | 7.2364 | 3336 | 1.0423 | 0.0 | 1.0423 | 1.0209 | | 0.0548 | 7.2408 | 3338 | 1.0536 | 0.0 | 1.0536 | 1.0264 | | 0.0548 | 7.2451 | 3340 | 1.0110 | 0.0 | 1.0110 | 1.0055 | | 0.0548 | 7.2495 | 3342 | 0.9320 | -0.0233 | 0.9320 | 0.9654 | | 0.0548 | 7.2538 | 3344 | 0.8411 | -0.0233 | 0.8411 | 0.9171 | | 0.0548 | 7.2581 | 3346 | 0.7699 | -0.0233 | 0.7699 | 0.8774 | | 0.0548 | 7.2625 | 3348 | 0.7576 | -0.0233 | 0.7576 | 0.8704 | | 0.0548 | 7.2668 | 3350 | 0.7822 | -0.0233 | 0.7822 | 0.8844 | | 0.0548 | 7.2711 | 3352 | 0.8369 | -0.0233 | 0.8369 | 0.9148 | | 0.0548 | 7.2755 | 3354 | 0.8666 | -0.0233 | 0.8666 | 0.9309 | | 0.0548 | 7.2798 | 3356 | 0.8921 | -0.0233 | 0.8921 | 0.9445 | | 0.0548 | 7.2842 | 3358 | 0.9173 | -0.0233 | 0.9173 | 0.9577 | | 0.0548 | 7.2885 | 3360 | 0.9385 | -0.0233 | 0.9385 | 0.9688 | | 0.0548 | 7.2928 | 3362 | 0.9335 | -0.0233 | 0.9335 | 0.9662 | | 0.0548 | 7.2972 | 3364 | 0.8807 | -0.0233 | 0.8807 | 0.9385 | | 0.0548 | 7.3015 | 3366 | 0.8204 | -0.0233 | 0.8204 | 0.9058 | | 0.0548 | 7.3059 | 3368 | 0.7701 | -0.0233 | 0.7701 | 0.8776 | | 0.0548 | 7.3102 | 3370 | 0.7627 | 0.1895 | 0.7627 | 0.8733 | | 0.0548 | 7.3145 | 3372 | 0.7794 | -0.0233 | 0.7794 | 0.8828 | | 0.0548 | 7.3189 | 3374 | 0.8169 | -0.0233 | 0.8169 | 0.9038 | | 0.0548 | 7.3232 | 3376 | 0.8306 | -0.0233 | 0.8306 | 0.9114 | | 0.0548 | 7.3275 | 3378 | 0.8441 | -0.0233 | 0.8441 | 0.9187 | | 0.0548 | 7.3319 | 3380 | 0.8763 | -0.0233 | 0.8763 | 0.9361 | | 0.0548 | 7.3362 | 3382 | 0.9154 | -0.0233 | 0.9154 | 0.9568 | | 0.0548 | 7.3406 | 3384 | 0.9680 | 0.0 | 0.9680 | 0.9839 | | 0.0548 | 7.3449 | 3386 | 0.9869 | 0.0 | 0.9869 | 0.9934 | | 0.0548 | 7.3492 | 3388 | 0.9741 | 0.0 | 0.9741 | 0.9869 | | 0.0548 | 7.3536 | 3390 | 0.9144 | -0.0233 | 0.9144 | 0.9562 | | 0.0548 | 7.3579 | 3392 | 0.8533 | -0.0233 | 0.8533 | 0.9238 | | 0.0548 | 7.3623 | 3394 | 0.8414 | -0.0233 | 0.8414 | 0.9173 | | 0.0548 | 7.3666 | 3396 | 0.8514 | -0.0233 | 0.8514 | 0.9227 | | 0.0548 | 7.3709 | 3398 | 
0.8735 | -0.0233 | 0.8735 | 0.9346 | | 0.0548 | 7.3753 | 3400 | 0.9308 | -0.0233 | 0.9308 | 0.9648 | | 0.0548 | 7.3796 | 3402 | 0.9784 | 0.0 | 0.9784 | 0.9891 | | 0.0548 | 7.3839 | 3404 | 0.9808 | 0.0 | 0.9808 | 0.9903 | | 0.0548 | 7.3883 | 3406 | 0.9314 | -0.0233 | 0.9314 | 0.9651 | | 0.0548 | 7.3926 | 3408 | 0.8972 | -0.0233 | 0.8972 | 0.9472 | | 0.0548 | 7.3970 | 3410 | 0.8580 | -0.0233 | 0.8580 | 0.9263 | | 0.0548 | 7.4013 | 3412 | 0.8548 | -0.0233 | 0.8548 | 0.9246 | | 0.0548 | 7.4056 | 3414 | 0.8830 | -0.0233 | 0.8830 | 0.9397 | | 0.0548 | 7.4100 | 3416 | 0.9215 | -0.0233 | 0.9215 | 0.9600 | | 0.0548 | 7.4143 | 3418 | 0.9498 | 0.0 | 0.9498 | 0.9746 | | 0.0548 | 7.4187 | 3420 | 0.9493 | -0.0233 | 0.9493 | 0.9743 | | 0.0548 | 7.4230 | 3422 | 0.9439 | -0.0233 | 0.9439 | 0.9715 | | 0.0548 | 7.4273 | 3424 | 0.9344 | -0.0233 | 0.9344 | 0.9666 | | 0.0548 | 7.4317 | 3426 | 0.9328 | -0.0233 | 0.9328 | 0.9658 | | 0.0548 | 7.4360 | 3428 | 0.9206 | -0.0233 | 0.9206 | 0.9595 | | 0.0548 | 7.4403 | 3430 | 0.8903 | -0.0233 | 0.8903 | 0.9435 | | 0.0548 | 7.4447 | 3432 | 0.8707 | -0.0233 | 0.8707 | 0.9331 | | 0.0548 | 7.4490 | 3434 | 0.8601 | -0.0233 | 0.8601 | 0.9274 | | 0.0548 | 7.4534 | 3436 | 0.8639 | -0.0233 | 0.8639 | 0.9294 | | 0.0548 | 7.4577 | 3438 | 0.8856 | -0.0233 | 0.8856 | 0.9411 | | 0.0548 | 7.4620 | 3440 | 0.9250 | -0.0233 | 0.9250 | 0.9618 | | 0.0548 | 7.4664 | 3442 | 0.9639 | -0.0233 | 0.9639 | 0.9818 | | 0.0548 | 7.4707 | 3444 | 0.9417 | -0.0233 | 0.9417 | 0.9704 | | 0.0548 | 7.4751 | 3446 | 0.8900 | -0.0233 | 0.8900 | 0.9434 | | 0.0548 | 7.4794 | 3448 | 0.8402 | -0.0233 | 0.8402 | 0.9166 | | 0.0548 | 7.4837 | 3450 | 0.8428 | -0.0233 | 0.8428 | 0.9180 | | 0.0548 | 7.4881 | 3452 | 0.8600 | -0.0233 | 0.8600 | 0.9274 | | 0.0548 | 7.4924 | 3454 | 0.8915 | -0.0233 | 0.8915 | 0.9442 | | 0.0548 | 7.4967 | 3456 | 0.9507 | -0.0233 | 0.9507 | 0.9750 | | 0.0548 | 7.5011 | 3458 | 0.9872 | -0.0233 | 0.9872 | 0.9936 | | 0.0548 | 7.5054 | 3460 | 1.0178 | 0.0 | 1.0178 | 1.0089 | | 0.0548 | 7.5098 | 3462 | 1.0202 | 0.0 | 1.0202 | 1.0100 | | 0.0548 | 7.5141 | 3464 | 0.9971 | 0.0 | 0.9971 | 0.9986 | | 0.0548 | 7.5184 | 3466 | 0.9711 | -0.0233 | 0.9711 | 0.9855 | | 0.0548 | 7.5228 | 3468 | 0.9289 | -0.0233 | 0.9289 | 0.9638 | | 0.0548 | 7.5271 | 3470 | 0.8780 | -0.0233 | 0.8780 | 0.9370 | | 0.0548 | 7.5315 | 3472 | 0.8578 | -0.0233 | 0.8578 | 0.9262 | | 0.0548 | 7.5358 | 3474 | 0.8455 | -0.0233 | 0.8455 | 0.9195 | | 0.0548 | 7.5401 | 3476 | 0.8647 | -0.0233 | 0.8647 | 0.9299 | | 0.0548 | 7.5445 | 3478 | 0.9185 | -0.0233 | 0.9185 | 0.9584 | | 0.0548 | 7.5488 | 3480 | 0.9538 | -0.0233 | 0.9538 | 0.9766 | | 0.0548 | 7.5531 | 3482 | 0.9713 | -0.0233 | 0.9713 | 0.9856 | | 0.0548 | 7.5575 | 3484 | 0.9660 | -0.0233 | 0.9660 | 0.9829 | | 0.0548 | 7.5618 | 3486 | 0.9431 | -0.0233 | 0.9431 | 0.9711 | | 0.0548 | 7.5662 | 3488 | 0.9073 | -0.0233 | 0.9073 | 0.9525 | | 0.0548 | 7.5705 | 3490 | 0.8925 | -0.0233 | 0.8925 | 0.9447 | | 0.0548 | 7.5748 | 3492 | 0.9276 | -0.0233 | 0.9276 | 0.9631 | | 0.0548 | 7.5792 | 3494 | 0.9490 | -0.0233 | 0.9490 | 0.9742 | | 0.0548 | 7.5835 | 3496 | 0.9663 | -0.0233 | 0.9663 | 0.9830 | | 0.0548 | 7.5879 | 3498 | 0.9943 | -0.0233 | 0.9943 | 0.9971 | | 0.0481 | 7.5922 | 3500 | 1.0081 | -0.0233 | 1.0081 | 1.0040 | | 0.0481 | 7.5965 | 3502 | 0.9862 | -0.0233 | 0.9862 | 0.9931 | | 0.0481 | 7.6009 | 3504 | 0.9383 | -0.0233 | 0.9383 | 0.9687 | | 0.0481 | 7.6052 | 3506 | 0.9239 | -0.0233 | 0.9239 | 0.9612 | | 0.0481 | 7.6095 | 3508 | 0.8982 | -0.0233 | 0.8982 | 0.9478 | | 0.0481 | 7.6139 | 
3510 | 0.8663 | -0.0233 | 0.8663 | 0.9308 | | 0.0481 | 7.6182 | 3512 | 0.8770 | -0.0233 | 0.8770 | 0.9365 | | 0.0481 | 7.6226 | 3514 | 0.9172 | -0.0233 | 0.9172 | 0.9577 | | 0.0481 | 7.6269 | 3516 | 0.9332 | 0.0 | 0.9332 | 0.9660 | | 0.0481 | 7.6312 | 3518 | 0.9293 | 0.0 | 0.9293 | 0.9640 | | 0.0481 | 7.6356 | 3520 | 0.9414 | 0.0 | 0.9414 | 0.9703 | | 0.0481 | 7.6399 | 3522 | 0.9380 | 0.0 | 0.9380 | 0.9685 | | 0.0481 | 7.6443 | 3524 | 0.9223 | 0.0 | 0.9223 | 0.9604 | | 0.0481 | 7.6486 | 3526 | 0.8904 | -0.0233 | 0.8904 | 0.9436 | | 0.0481 | 7.6529 | 3528 | 0.9038 | -0.0233 | 0.9038 | 0.9507 | | 0.0481 | 7.6573 | 3530 | 0.9263 | 0.0 | 0.9263 | 0.9624 | | 0.0481 | 7.6616 | 3532 | 0.9639 | 0.0 | 0.9639 | 0.9818 | | 0.0481 | 7.6659 | 3534 | 0.9743 | 0.0 | 0.9743 | 0.9871 | | 0.0481 | 7.6703 | 3536 | 0.9618 | 0.0 | 0.9618 | 0.9807 | | 0.0481 | 7.6746 | 3538 | 0.9530 | 0.0 | 0.9530 | 0.9762 | | 0.0481 | 7.6790 | 3540 | 0.9408 | 0.0 | 0.9408 | 0.9699 | | 0.0481 | 7.6833 | 3542 | 0.9494 | 0.0 | 0.9494 | 0.9744 | | 0.0481 | 7.6876 | 3544 | 0.9805 | 0.0 | 0.9805 | 0.9902 | | 0.0481 | 7.6920 | 3546 | 0.9611 | 0.0 | 0.9611 | 0.9803 | | 0.0481 | 7.6963 | 3548 | 0.9061 | -0.0233 | 0.9061 | 0.9519 | | 0.0481 | 7.7007 | 3550 | 0.8516 | -0.0233 | 0.8516 | 0.9228 | | 0.0481 | 7.7050 | 3552 | 0.8327 | -0.0233 | 0.8327 | 0.9125 | | 0.0481 | 7.7093 | 3554 | 0.8489 | -0.0233 | 0.8489 | 0.9213 | | 0.0481 | 7.7137 | 3556 | 0.8927 | -0.0233 | 0.8927 | 0.9448 | | 0.0481 | 7.7180 | 3558 | 0.9283 | 0.0 | 0.9283 | 0.9635 | | 0.0481 | 7.7223 | 3560 | 0.9532 | 0.0 | 0.9532 | 0.9763 | | 0.0481 | 7.7267 | 3562 | 0.9403 | 0.0 | 0.9403 | 0.9697 | | 0.0481 | 7.7310 | 3564 | 0.8979 | -0.0233 | 0.8979 | 0.9476 | | 0.0481 | 7.7354 | 3566 | 0.8523 | -0.0233 | 0.8523 | 0.9232 | | 0.0481 | 7.7397 | 3568 | 0.8322 | -0.0233 | 0.8322 | 0.9123 | | 0.0481 | 7.7440 | 3570 | 0.8247 | -0.0233 | 0.8247 | 0.9081 | | 0.0481 | 7.7484 | 3572 | 0.8495 | -0.0233 | 0.8495 | 0.9217 | | 0.0481 | 7.7527 | 3574 | 0.8984 | -0.0233 | 0.8984 | 0.9479 | | 0.0481 | 7.7570 | 3576 | 0.9251 | -0.0233 | 0.9251 | 0.9618 | | 0.0481 | 7.7614 | 3578 | 0.9090 | -0.0233 | 0.9090 | 0.9534 | | 0.0481 | 7.7657 | 3580 | 0.8826 | -0.0233 | 0.8826 | 0.9394 | | 0.0481 | 7.7701 | 3582 | 0.8418 | -0.0233 | 0.8418 | 0.9175 | | 0.0481 | 7.7744 | 3584 | 0.8288 | -0.0233 | 0.8288 | 0.9104 | | 0.0481 | 7.7787 | 3586 | 0.8445 | -0.0233 | 0.8445 | 0.9189 | | 0.0481 | 7.7831 | 3588 | 0.8491 | -0.0233 | 0.8491 | 0.9215 | | 0.0481 | 7.7874 | 3590 | 0.8818 | -0.0233 | 0.8818 | 0.9390 | | 0.0481 | 7.7918 | 3592 | 0.8991 | -0.0233 | 0.8991 | 0.9482 | | 0.0481 | 7.7961 | 3594 | 0.9289 | -0.0233 | 0.9289 | 0.9638 | | 0.0481 | 7.8004 | 3596 | 0.9398 | -0.0233 | 0.9398 | 0.9695 | | 0.0481 | 7.8048 | 3598 | 0.9239 | -0.0233 | 0.9239 | 0.9612 | | 0.0481 | 7.8091 | 3600 | 0.9309 | -0.0233 | 0.9309 | 0.9648 | | 0.0481 | 7.8134 | 3602 | 0.9298 | -0.0233 | 0.9298 | 0.9643 | | 0.0481 | 7.8178 | 3604 | 0.9166 | -0.0233 | 0.9166 | 0.9574 | | 0.0481 | 7.8221 | 3606 | 0.9051 | -0.0233 | 0.9051 | 0.9514 | | 0.0481 | 7.8265 | 3608 | 0.8645 | -0.0233 | 0.8645 | 0.9298 | | 0.0481 | 7.8308 | 3610 | 0.8538 | -0.0233 | 0.8538 | 0.9240 | | 0.0481 | 7.8351 | 3612 | 0.8775 | -0.0233 | 0.8775 | 0.9368 | | 0.0481 | 7.8395 | 3614 | 0.9354 | -0.0233 | 0.9354 | 0.9671 | | 0.0481 | 7.8438 | 3616 | 0.9807 | -0.0233 | 0.9807 | 0.9903 | | 0.0481 | 7.8482 | 3618 | 0.9826 | -0.0233 | 0.9826 | 0.9913 | | 0.0481 | 7.8525 | 3620 | 0.9812 | -0.0233 | 0.9812 | 0.9906 | | 0.0481 | 7.8568 | 3622 | 0.9689 | -0.0233 | 0.9689 | 
0.9843 | | 0.0481 | 7.8612 | 3624 | 0.9419 | -0.0233 | 0.9419 | 0.9705 | | 0.0481 | 7.8655 | 3626 | 0.8904 | -0.0233 | 0.8904 | 0.9436 | | 0.0481 | 7.8698 | 3628 | 0.8620 | -0.0233 | 0.8620 | 0.9284 | | 0.0481 | 7.8742 | 3630 | 0.8485 | -0.0233 | 0.8485 | 0.9211 | | 0.0481 | 7.8785 | 3632 | 0.8427 | -0.0233 | 0.8427 | 0.9180 | | 0.0481 | 7.8829 | 3634 | 0.8545 | -0.0233 | 0.8545 | 0.9244 | | 0.0481 | 7.8872 | 3636 | 0.8648 | -0.0233 | 0.8648 | 0.9300 | | 0.0481 | 7.8915 | 3638 | 0.8790 | -0.0233 | 0.8790 | 0.9376 | | 0.0481 | 7.8959 | 3640 | 0.8879 | -0.0233 | 0.8879 | 0.9423 | | 0.0481 | 7.9002 | 3642 | 0.8914 | -0.0233 | 0.8914 | 0.9441 | | 0.0481 | 7.9046 | 3644 | 0.8940 | -0.0233 | 0.8940 | 0.9455 | | 0.0481 | 7.9089 | 3646 | 0.9078 | -0.0233 | 0.9078 | 0.9528 | | 0.0481 | 7.9132 | 3648 | 0.9010 | -0.0233 | 0.9010 | 0.9492 | | 0.0481 | 7.9176 | 3650 | 0.8822 | -0.0233 | 0.8822 | 0.9392 | | 0.0481 | 7.9219 | 3652 | 0.8742 | -0.0233 | 0.8742 | 0.9350 | | 0.0481 | 7.9262 | 3654 | 0.8646 | -0.0233 | 0.8646 | 0.9298 | | 0.0481 | 7.9306 | 3656 | 0.8526 | -0.0233 | 0.8526 | 0.9233 | | 0.0481 | 7.9349 | 3658 | 0.8229 | -0.0233 | 0.8229 | 0.9071 | | 0.0481 | 7.9393 | 3660 | 0.8058 | -0.0233 | 0.8058 | 0.8977 | | 0.0481 | 7.9436 | 3662 | 0.8010 | -0.0233 | 0.8010 | 0.8950 | | 0.0481 | 7.9479 | 3664 | 0.8072 | -0.0233 | 0.8072 | 0.8984 | | 0.0481 | 7.9523 | 3666 | 0.7898 | -0.0233 | 0.7898 | 0.8887 | | 0.0481 | 7.9566 | 3668 | 0.7840 | -0.0233 | 0.7840 | 0.8854 | | 0.0481 | 7.9610 | 3670 | 0.7690 | -0.0233 | 0.7690 | 0.8769 | | 0.0481 | 7.9653 | 3672 | 0.7702 | -0.0233 | 0.7702 | 0.8776 | | 0.0481 | 7.9696 | 3674 | 0.7923 | -0.0233 | 0.7923 | 0.8901 | | 0.0481 | 7.9740 | 3676 | 0.8103 | -0.0233 | 0.8103 | 0.9002 | | 0.0481 | 7.9783 | 3678 | 0.8523 | -0.0233 | 0.8523 | 0.9232 | | 0.0481 | 7.9826 | 3680 | 0.8734 | -0.0233 | 0.8734 | 0.9345 | | 0.0481 | 7.9870 | 3682 | 0.8856 | -0.0233 | 0.8856 | 0.9411 | | 0.0481 | 7.9913 | 3684 | 0.8727 | -0.0233 | 0.8727 | 0.9342 | | 0.0481 | 7.9957 | 3686 | 0.8617 | -0.0233 | 0.8617 | 0.9283 | | 0.0481 | 8.0 | 3688 | 0.8246 | -0.0233 | 0.8246 | 0.9081 | | 0.0481 | 8.0043 | 3690 | 0.8105 | -0.0233 | 0.8105 | 0.9003 | | 0.0481 | 8.0087 | 3692 | 0.8232 | -0.0233 | 0.8232 | 0.9073 | | 0.0481 | 8.0130 | 3694 | 0.8541 | -0.0233 | 0.8541 | 0.9242 | | 0.0481 | 8.0174 | 3696 | 0.8559 | -0.0233 | 0.8559 | 0.9251 | | 0.0481 | 8.0217 | 3698 | 0.8633 | -0.0233 | 0.8633 | 0.9292 | | 0.0481 | 8.0260 | 3700 | 0.8456 | -0.0233 | 0.8456 | 0.9195 | | 0.0481 | 8.0304 | 3702 | 0.8431 | -0.0233 | 0.8431 | 0.9182 | | 0.0481 | 8.0347 | 3704 | 0.8409 | -0.0233 | 0.8409 | 0.9170 | | 0.0481 | 8.0390 | 3706 | 0.8518 | -0.0233 | 0.8518 | 0.9229 | | 0.0481 | 8.0434 | 3708 | 0.8648 | -0.0233 | 0.8648 | 0.9300 | | 0.0481 | 8.0477 | 3710 | 0.9001 | -0.0233 | 0.9001 | 0.9487 | | 0.0481 | 8.0521 | 3712 | 0.9524 | -0.0233 | 0.9524 | 0.9759 | | 0.0481 | 8.0564 | 3714 | 0.9912 | 0.0 | 0.9912 | 0.9956 | | 0.0481 | 8.0607 | 3716 | 1.0040 | 0.0 | 1.0040 | 1.0020 | | 0.0481 | 8.0651 | 3718 | 0.9699 | -0.0233 | 0.9699 | 0.9848 | | 0.0481 | 8.0694 | 3720 | 0.9338 | -0.0233 | 0.9338 | 0.9664 | | 0.0481 | 8.0738 | 3722 | 0.8869 | -0.0233 | 0.8869 | 0.9417 | | 0.0481 | 8.0781 | 3724 | 0.8244 | -0.0233 | 0.8244 | 0.9080 | | 0.0481 | 8.0824 | 3726 | 0.7874 | -0.0233 | 0.7874 | 0.8874 | | 0.0481 | 8.0868 | 3728 | 0.7770 | -0.0233 | 0.7770 | 0.8815 | | 0.0481 | 8.0911 | 3730 | 0.7907 | -0.0233 | 0.7907 | 0.8892 | | 0.0481 | 8.0954 | 3732 | 0.8331 | -0.0233 | 0.8331 | 0.9128 | | 0.0481 | 8.0998 | 3734 | 0.8927 | 
-0.0233 | 0.8927 | 0.9448 | | 0.0481 | 8.1041 | 3736 | 0.9515 | -0.0233 | 0.9515 | 0.9755 | | 0.0481 | 8.1085 | 3738 | 0.9672 | -0.0233 | 0.9672 | 0.9835 | | 0.0481 | 8.1128 | 3740 | 0.9423 | -0.0233 | 0.9423 | 0.9707 | | 0.0481 | 8.1171 | 3742 | 0.9013 | -0.0233 | 0.9013 | 0.9494 | | 0.0481 | 8.1215 | 3744 | 0.8609 | -0.0233 | 0.8609 | 0.9279 | | 0.0481 | 8.1258 | 3746 | 0.8209 | -0.0233 | 0.8209 | 0.9061 | | 0.0481 | 8.1302 | 3748 | 0.7905 | -0.0233 | 0.7905 | 0.8891 | | 0.0481 | 8.1345 | 3750 | 0.7698 | -0.0233 | 0.7698 | 0.8774 | | 0.0481 | 8.1388 | 3752 | 0.7647 | -0.0233 | 0.7647 | 0.8745 | | 0.0481 | 8.1432 | 3754 | 0.7501 | -0.0233 | 0.7501 | 0.8661 | | 0.0481 | 8.1475 | 3756 | 0.7515 | -0.0233 | 0.7515 | 0.8669 | | 0.0481 | 8.1518 | 3758 | 0.7631 | -0.0233 | 0.7631 | 0.8735 | | 0.0481 | 8.1562 | 3760 | 0.7979 | -0.0233 | 0.7979 | 0.8932 | | 0.0481 | 8.1605 | 3762 | 0.8491 | -0.0233 | 0.8491 | 0.9214 | | 0.0481 | 8.1649 | 3764 | 0.8995 | -0.0233 | 0.8995 | 0.9484 | | 0.0481 | 8.1692 | 3766 | 0.9290 | -0.0233 | 0.9290 | 0.9639 | | 0.0481 | 8.1735 | 3768 | 0.9258 | -0.0233 | 0.9258 | 0.9622 | | 0.0481 | 8.1779 | 3770 | 0.8882 | -0.0233 | 0.8882 | 0.9425 | | 0.0481 | 8.1822 | 3772 | 0.8352 | -0.0233 | 0.8352 | 0.9139 | | 0.0481 | 8.1866 | 3774 | 0.8209 | -0.0233 | 0.8209 | 0.9060 | | 0.0481 | 8.1909 | 3776 | 0.8304 | -0.0233 | 0.8304 | 0.9112 | | 0.0481 | 8.1952 | 3778 | 0.8477 | -0.0233 | 0.8477 | 0.9207 | | 0.0481 | 8.1996 | 3780 | 0.8430 | -0.0233 | 0.8430 | 0.9181 | | 0.0481 | 8.2039 | 3782 | 0.8338 | -0.0233 | 0.8338 | 0.9131 | | 0.0481 | 8.2082 | 3784 | 0.8196 | -0.0233 | 0.8196 | 0.9053 | | 0.0481 | 8.2126 | 3786 | 0.8120 | -0.0233 | 0.8120 | 0.9011 | | 0.0481 | 8.2169 | 3788 | 0.8050 | -0.0233 | 0.8050 | 0.8972 | | 0.0481 | 8.2213 | 3790 | 0.8060 | -0.0233 | 0.8060 | 0.8978 | | 0.0481 | 8.2256 | 3792 | 0.8129 | -0.0233 | 0.8129 | 0.9016 | | 0.0481 | 8.2299 | 3794 | 0.8091 | -0.0233 | 0.8091 | 0.8995 | | 0.0481 | 8.2343 | 3796 | 0.8135 | -0.0233 | 0.8135 | 0.9020 | | 0.0481 | 8.2386 | 3798 | 0.8191 | -0.0233 | 0.8191 | 0.9050 | | 0.0481 | 8.2430 | 3800 | 0.8345 | -0.0233 | 0.8345 | 0.9135 | | 0.0481 | 8.2473 | 3802 | 0.8590 | -0.0233 | 0.8590 | 0.9268 | | 0.0481 | 8.2516 | 3804 | 0.8769 | -0.0233 | 0.8769 | 0.9364 | | 0.0481 | 8.2560 | 3806 | 0.8941 | -0.0233 | 0.8941 | 0.9456 | | 0.0481 | 8.2603 | 3808 | 0.8842 | -0.0233 | 0.8842 | 0.9403 | | 0.0481 | 8.2646 | 3810 | 0.8620 | -0.0233 | 0.8620 | 0.9285 | | 0.0481 | 8.2690 | 3812 | 0.8592 | -0.0233 | 0.8592 | 0.9269 | | 0.0481 | 8.2733 | 3814 | 0.8645 | -0.0233 | 0.8645 | 0.9298 | | 0.0481 | 8.2777 | 3816 | 0.8927 | -0.0233 | 0.8926 | 0.9448 | | 0.0481 | 8.2820 | 3818 | 0.9427 | -0.0233 | 0.9427 | 0.9709 | | 0.0481 | 8.2863 | 3820 | 0.9604 | -0.0233 | 0.9604 | 0.9800 | | 0.0481 | 8.2907 | 3822 | 0.9431 | -0.0233 | 0.9431 | 0.9711 | | 0.0481 | 8.2950 | 3824 | 0.9115 | -0.0233 | 0.9115 | 0.9547 | | 0.0481 | 8.2993 | 3826 | 0.8625 | -0.0233 | 0.8625 | 0.9287 | | 0.0481 | 8.3037 | 3828 | 0.8026 | -0.0233 | 0.8026 | 0.8959 | | 0.0481 | 8.3080 | 3830 | 0.7614 | -0.0233 | 0.7614 | 0.8726 | | 0.0481 | 8.3124 | 3832 | 0.7426 | 0.1895 | 0.7426 | 0.8618 | | 0.0481 | 8.3167 | 3834 | 0.7452 | -0.0233 | 0.7452 | 0.8632 | | 0.0481 | 8.3210 | 3836 | 0.7614 | -0.0233 | 0.7614 | 0.8726 | | 0.0481 | 8.3254 | 3838 | 0.7952 | -0.0233 | 0.7952 | 0.8917 | | 0.0481 | 8.3297 | 3840 | 0.8475 | -0.0233 | 0.8475 | 0.9206 | | 0.0481 | 8.3341 | 3842 | 0.9024 | -0.0233 | 0.9024 | 0.9499 | | 0.0481 | 8.3384 | 3844 | 0.9486 | -0.0233 | 0.9486 | 0.9740 | | 
0.0481 | 8.3427 | 3846 | 0.9694 | -0.0233 | 0.9694 | 0.9846 | | 0.0481 | 8.3471 | 3848 | 0.9641 | -0.0233 | 0.9641 | 0.9819 | | 0.0481 | 8.3514 | 3850 | 0.9317 | -0.0233 | 0.9317 | 0.9653 | | 0.0481 | 8.3557 | 3852 | 0.8922 | -0.0233 | 0.8922 | 0.9446 | | 0.0481 | 8.3601 | 3854 | 0.8523 | -0.0233 | 0.8523 | 0.9232 | | 0.0481 | 8.3644 | 3856 | 0.8353 | -0.0233 | 0.8353 | 0.9140 | | 0.0481 | 8.3688 | 3858 | 0.8159 | -0.0233 | 0.8159 | 0.9033 | | 0.0481 | 8.3731 | 3860 | 0.8009 | -0.0233 | 0.8009 | 0.8949 | | 0.0481 | 8.3774 | 3862 | 0.8083 | -0.0233 | 0.8083 | 0.8991 | | 0.0481 | 8.3818 | 3864 | 0.8293 | -0.0233 | 0.8293 | 0.9107 | | 0.0481 | 8.3861 | 3866 | 0.8452 | -0.0233 | 0.8452 | 0.9194 | | 0.0481 | 8.3905 | 3868 | 0.8745 | -0.0233 | 0.8745 | 0.9351 | | 0.0481 | 8.3948 | 3870 | 0.8989 | -0.0233 | 0.8989 | 0.9481 | | 0.0481 | 8.3991 | 3872 | 0.8971 | -0.0233 | 0.8971 | 0.9472 | | 0.0481 | 8.4035 | 3874 | 0.8703 | -0.0233 | 0.8703 | 0.9329 | | 0.0481 | 8.4078 | 3876 | 0.8363 | -0.0233 | 0.8363 | 0.9145 | | 0.0481 | 8.4121 | 3878 | 0.8156 | -0.0233 | 0.8156 | 0.9031 | | 0.0481 | 8.4165 | 3880 | 0.8162 | -0.0233 | 0.8162 | 0.9035 | | 0.0481 | 8.4208 | 3882 | 0.8108 | -0.0233 | 0.8108 | 0.9004 | | 0.0481 | 8.4252 | 3884 | 0.8226 | -0.0233 | 0.8226 | 0.9070 | | 0.0481 | 8.4295 | 3886 | 0.8459 | -0.0233 | 0.8459 | 0.9197 | | 0.0481 | 8.4338 | 3888 | 0.8647 | -0.0233 | 0.8647 | 0.9299 | | 0.0481 | 8.4382 | 3890 | 0.8850 | -0.0233 | 0.8850 | 0.9408 | | 0.0481 | 8.4425 | 3892 | 0.9014 | -0.0233 | 0.9014 | 0.9494 | | 0.0481 | 8.4469 | 3894 | 0.9054 | -0.0233 | 0.9054 | 0.9515 | | 0.0481 | 8.4512 | 3896 | 0.9275 | -0.0233 | 0.9275 | 0.9631 | | 0.0481 | 8.4555 | 3898 | 0.9254 | -0.0233 | 0.9254 | 0.9620 | | 0.0481 | 8.4599 | 3900 | 0.8937 | -0.0233 | 0.8937 | 0.9454 | | 0.0481 | 8.4642 | 3902 | 0.8508 | -0.0233 | 0.8508 | 0.9224 | | 0.0481 | 8.4685 | 3904 | 0.8184 | -0.0233 | 0.8184 | 0.9047 | | 0.0481 | 8.4729 | 3906 | 0.7908 | -0.0233 | 0.7908 | 0.8893 | | 0.0481 | 8.4772 | 3908 | 0.7763 | -0.0233 | 0.7763 | 0.8811 | | 0.0481 | 8.4816 | 3910 | 0.7710 | -0.0233 | 0.7710 | 0.8781 | | 0.0481 | 8.4859 | 3912 | 0.7779 | -0.0233 | 0.7779 | 0.8820 | | 0.0481 | 8.4902 | 3914 | 0.7991 | -0.0233 | 0.7991 | 0.8939 | | 0.0481 | 8.4946 | 3916 | 0.8383 | -0.0233 | 0.8383 | 0.9156 | | 0.0481 | 8.4989 | 3918 | 0.8792 | -0.0233 | 0.8792 | 0.9376 | | 0.0481 | 8.5033 | 3920 | 0.8878 | -0.0233 | 0.8878 | 0.9422 | | 0.0481 | 8.5076 | 3922 | 0.9037 | -0.0233 | 0.9037 | 0.9506 | | 0.0481 | 8.5119 | 3924 | 0.9184 | -0.0233 | 0.9184 | 0.9583 | | 0.0481 | 8.5163 | 3926 | 0.9140 | -0.0233 | 0.9140 | 0.9561 | | 0.0481 | 8.5206 | 3928 | 0.9297 | -0.0233 | 0.9297 | 0.9642 | | 0.0481 | 8.5249 | 3930 | 0.9351 | 0.0 | 0.9351 | 0.9670 | | 0.0481 | 8.5293 | 3932 | 0.9251 | 0.0 | 0.9251 | 0.9618 | | 0.0481 | 8.5336 | 3934 | 0.8944 | -0.0233 | 0.8944 | 0.9458 | | 0.0481 | 8.5380 | 3936 | 0.8671 | -0.0233 | 0.8671 | 0.9312 | | 0.0481 | 8.5423 | 3938 | 0.8399 | -0.0233 | 0.8399 | 0.9165 | | 0.0481 | 8.5466 | 3940 | 0.8359 | -0.0233 | 0.8359 | 0.9143 | | 0.0481 | 8.5510 | 3942 | 0.8328 | -0.0233 | 0.8328 | 0.9126 | | 0.0481 | 8.5553 | 3944 | 0.8400 | -0.0233 | 0.8400 | 0.9165 | | 0.0481 | 8.5597 | 3946 | 0.8426 | -0.0233 | 0.8426 | 0.9179 | | 0.0481 | 8.5640 | 3948 | 0.8388 | -0.0233 | 0.8388 | 0.9159 | | 0.0481 | 8.5683 | 3950 | 0.8234 | -0.0233 | 0.8234 | 0.9074 | | 0.0481 | 8.5727 | 3952 | 0.8281 | -0.0233 | 0.8281 | 0.9100 | | 0.0481 | 8.5770 | 3954 | 0.8200 | -0.0233 | 0.8200 | 0.9056 | | 0.0481 | 8.5813 | 3956 | 0.8259 | -0.0233 
| 0.8259 | 0.9088 | | 0.0481 | 8.5857 | 3958 | 0.8293 | -0.0233 | 0.8293 | 0.9107 | | 0.0481 | 8.5900 | 3960 | 0.8080 | -0.0233 | 0.8080 | 0.8989 | | 0.0481 | 8.5944 | 3962 | 0.7914 | -0.0233 | 0.7914 | 0.8896 | | 0.0481 | 8.5987 | 3964 | 0.7845 | -0.0233 | 0.7845 | 0.8857 | | 0.0481 | 8.6030 | 3966 | 0.7921 | -0.0233 | 0.7921 | 0.8900 | | 0.0481 | 8.6074 | 3968 | 0.7948 | -0.0233 | 0.7948 | 0.8915 | | 0.0481 | 8.6117 | 3970 | 0.7801 | -0.0233 | 0.7801 | 0.8832 | | 0.0481 | 8.6161 | 3972 | 0.7591 | -0.0233 | 0.7591 | 0.8712 | | 0.0481 | 8.6204 | 3974 | 0.7568 | -0.0233 | 0.7568 | 0.8699 | | 0.0481 | 8.6247 | 3976 | 0.7558 | -0.0233 | 0.7558 | 0.8694 | | 0.0481 | 8.6291 | 3978 | 0.7775 | -0.0233 | 0.7775 | 0.8817 | | 0.0481 | 8.6334 | 3980 | 0.7980 | -0.0233 | 0.7980 | 0.8933 | | 0.0481 | 8.6377 | 3982 | 0.8105 | -0.0233 | 0.8105 | 0.9003 | | 0.0481 | 8.6421 | 3984 | 0.8077 | -0.0233 | 0.8077 | 0.8987 | | 0.0481 | 8.6464 | 3986 | 0.8110 | -0.0233 | 0.8110 | 0.9006 | | 0.0481 | 8.6508 | 3988 | 0.8234 | -0.0233 | 0.8234 | 0.9074 | | 0.0481 | 8.6551 | 3990 | 0.8458 | -0.0233 | 0.8458 | 0.9197 | | 0.0481 | 8.6594 | 3992 | 0.8499 | -0.0233 | 0.8499 | 0.9219 | | 0.0481 | 8.6638 | 3994 | 0.8581 | -0.0233 | 0.8581 | 0.9263 | | 0.0481 | 8.6681 | 3996 | 0.8633 | -0.0233 | 0.8633 | 0.9291 | | 0.0481 | 8.6725 | 3998 | 0.8401 | -0.0233 | 0.8401 | 0.9166 | | 0.0424 | 8.6768 | 4000 | 0.8201 | -0.0233 | 0.8201 | 0.9056 | | 0.0424 | 8.6811 | 4002 | 0.7855 | -0.0233 | 0.7855 | 0.8863 | | 0.0424 | 8.6855 | 4004 | 0.7448 | -0.0233 | 0.7448 | 0.8630 | | 0.0424 | 8.6898 | 4006 | 0.7267 | -0.0233 | 0.7267 | 0.8525 | | 0.0424 | 8.6941 | 4008 | 0.7338 | -0.0233 | 0.7338 | 0.8566 | | 0.0424 | 8.6985 | 4010 | 0.7600 | -0.0233 | 0.7600 | 0.8718 | | 0.0424 | 8.7028 | 4012 | 0.8006 | -0.0233 | 0.8006 | 0.8948 | | 0.0424 | 8.7072 | 4014 | 0.8518 | -0.0233 | 0.8518 | 0.9229 | | 0.0424 | 8.7115 | 4016 | 0.8928 | 0.0 | 0.8928 | 0.9449 | | 0.0424 | 8.7158 | 4018 | 0.9211 | 0.0 | 0.9211 | 0.9597 | | 0.0424 | 8.7202 | 4020 | 0.9400 | 0.0 | 0.9400 | 0.9695 | | 0.0424 | 8.7245 | 4022 | 0.9473 | 0.0 | 0.9473 | 0.9733 | | 0.0424 | 8.7289 | 4024 | 0.9571 | 0.0 | 0.9571 | 0.9783 | | 0.0424 | 8.7332 | 4026 | 0.9462 | 0.0 | 0.9462 | 0.9727 | | 0.0424 | 8.7375 | 4028 | 0.9137 | -0.0233 | 0.9137 | 0.9559 | | 0.0424 | 8.7419 | 4030 | 0.8902 | -0.0233 | 0.8902 | 0.9435 | | 0.0424 | 8.7462 | 4032 | 0.8724 | -0.0233 | 0.8724 | 0.9340 | | 0.0424 | 8.7505 | 4034 | 0.8543 | -0.0233 | 0.8543 | 0.9243 | | 0.0424 | 8.7549 | 4036 | 0.8493 | -0.0233 | 0.8493 | 0.9216 | | 0.0424 | 8.7592 | 4038 | 0.8459 | -0.0233 | 0.8459 | 0.9197 | | 0.0424 | 8.7636 | 4040 | 0.8542 | -0.0233 | 0.8542 | 0.9243 | | 0.0424 | 8.7679 | 4042 | 0.8471 | -0.0233 | 0.8471 | 0.9204 | | 0.0424 | 8.7722 | 4044 | 0.8404 | -0.0233 | 0.8404 | 0.9167 | | 0.0424 | 8.7766 | 4046 | 0.8483 | -0.0233 | 0.8483 | 0.9210 | | 0.0424 | 8.7809 | 4048 | 0.8433 | -0.0233 | 0.8433 | 0.9183 | | 0.0424 | 8.7852 | 4050 | 0.8328 | -0.0233 | 0.8328 | 0.9126 | | 0.0424 | 8.7896 | 4052 | 0.8362 | -0.0233 | 0.8362 | 0.9145 | | 0.0424 | 8.7939 | 4054 | 0.8496 | -0.0233 | 0.8496 | 0.9217 | | 0.0424 | 8.7983 | 4056 | 0.8526 | -0.0233 | 0.8526 | 0.9234 | | 0.0424 | 8.8026 | 4058 | 0.8501 | -0.0233 | 0.8501 | 0.9220 | | 0.0424 | 8.8069 | 4060 | 0.8361 | -0.0233 | 0.8361 | 0.9144 | | 0.0424 | 8.8113 | 4062 | 0.8329 | -0.0233 | 0.8329 | 0.9126 | | 0.0424 | 8.8156 | 4064 | 0.8420 | -0.0233 | 0.8420 | 0.9176 | | 0.0424 | 8.8200 | 4066 | 0.8638 | -0.0233 | 0.8638 | 0.9294 | | 0.0424 | 8.8243 | 4068 | 0.8902 | 
-0.0233 | 0.8902 | 0.9435 | | 0.0424 | 8.8286 | 4070 | 0.9085 | -0.0233 | 0.9085 | 0.9531 | | 0.0424 | 8.8330 | 4072 | 0.9126 | -0.0233 | 0.9126 | 0.9553 | | 0.0424 | 8.8373 | 4074 | 0.9008 | -0.0233 | 0.9008 | 0.9491 | | 0.0424 | 8.8416 | 4076 | 0.8761 | -0.0233 | 0.8761 | 0.9360 | | 0.0424 | 8.8460 | 4078 | 0.8716 | -0.0233 | 0.8716 | 0.9336 | | 0.0424 | 8.8503 | 4080 | 0.8736 | -0.0233 | 0.8736 | 0.9347 | | 0.0424 | 8.8547 | 4082 | 0.8682 | -0.0233 | 0.8682 | 0.9318 | | 0.0424 | 8.8590 | 4084 | 0.8526 | -0.0233 | 0.8526 | 0.9234 | | 0.0424 | 8.8633 | 4086 | 0.8478 | -0.0233 | 0.8478 | 0.9207 | | 0.0424 | 8.8677 | 4088 | 0.8539 | -0.0233 | 0.8539 | 0.9241 | | 0.0424 | 8.8720 | 4090 | 0.8625 | -0.0233 | 0.8625 | 0.9287 | | 0.0424 | 8.8764 | 4092 | 0.8622 | -0.0233 | 0.8622 | 0.9285 | | 0.0424 | 8.8807 | 4094 | 0.8493 | -0.0233 | 0.8493 | 0.9216 | | 0.0424 | 8.8850 | 4096 | 0.8497 | -0.0233 | 0.8497 | 0.9218 | | 0.0424 | 8.8894 | 4098 | 0.8593 | -0.0233 | 0.8593 | 0.9270 | | 0.0424 | 8.8937 | 4100 | 0.8774 | -0.0233 | 0.8774 | 0.9367 | | 0.0424 | 8.8980 | 4102 | 0.9081 | -0.0233 | 0.9081 | 0.9530 | | 0.0424 | 8.9024 | 4104 | 0.9403 | 0.0 | 0.9403 | 0.9697 | | 0.0424 | 8.9067 | 4106 | 0.9468 | 0.0 | 0.9468 | 0.9730 | | 0.0424 | 8.9111 | 4108 | 0.9370 | 0.0 | 0.9370 | 0.9680 | | 0.0424 | 8.9154 | 4110 | 0.9364 | 0.0 | 0.9364 | 0.9677 | | 0.0424 | 8.9197 | 4112 | 0.9468 | 0.0 | 0.9468 | 0.9730 | | 0.0424 | 8.9241 | 4114 | 0.9472 | 0.0 | 0.9472 | 0.9733 | | 0.0424 | 8.9284 | 4116 | 0.9436 | 0.0 | 0.9436 | 0.9714 | | 0.0424 | 8.9328 | 4118 | 0.9310 | 0.0 | 0.9310 | 0.9649 | | 0.0424 | 8.9371 | 4120 | 0.9019 | -0.0233 | 0.9019 | 0.9497 | | 0.0424 | 8.9414 | 4122 | 0.8740 | -0.0233 | 0.8740 | 0.9349 | | 0.0424 | 8.9458 | 4124 | 0.8336 | -0.0233 | 0.8336 | 0.9130 | | 0.0424 | 8.9501 | 4126 | 0.8042 | -0.0233 | 0.8042 | 0.8968 | | 0.0424 | 8.9544 | 4128 | 0.7937 | -0.0233 | 0.7937 | 0.8909 | | 0.0424 | 8.9588 | 4130 | 0.7890 | -0.0233 | 0.7890 | 0.8883 | | 0.0424 | 8.9631 | 4132 | 0.7959 | -0.0233 | 0.7959 | 0.8921 | | 0.0424 | 8.9675 | 4134 | 0.8165 | -0.0233 | 0.8165 | 0.9036 | | 0.0424 | 8.9718 | 4136 | 0.8448 | -0.0233 | 0.8448 | 0.9191 | | 0.0424 | 8.9761 | 4138 | 0.8828 | -0.0233 | 0.8828 | 0.9396 | | 0.0424 | 8.9805 | 4140 | 0.8996 | -0.0233 | 0.8996 | 0.9485 | | 0.0424 | 8.9848 | 4142 | 0.9139 | 0.0 | 0.9139 | 0.9560 | | 0.0424 | 8.9892 | 4144 | 0.9033 | 0.0 | 0.9033 | 0.9504 | | 0.0424 | 8.9935 | 4146 | 0.8784 | -0.0233 | 0.8784 | 0.9372 | | 0.0424 | 8.9978 | 4148 | 0.8474 | -0.0233 | 0.8474 | 0.9205 | | 0.0424 | 9.0022 | 4150 | 0.8302 | -0.0233 | 0.8302 | 0.9111 | | 0.0424 | 9.0065 | 4152 | 0.8235 | -0.0233 | 0.8235 | 0.9074 | | 0.0424 | 9.0108 | 4154 | 0.8091 | -0.0233 | 0.8091 | 0.8995 | | 0.0424 | 9.0152 | 4156 | 0.8104 | -0.0233 | 0.8104 | 0.9002 | | 0.0424 | 9.0195 | 4158 | 0.8143 | -0.0233 | 0.8143 | 0.9024 | | 0.0424 | 9.0239 | 4160 | 0.8171 | -0.0233 | 0.8171 | 0.9039 | | 0.0424 | 9.0282 | 4162 | 0.8269 | -0.0233 | 0.8269 | 0.9094 | | 0.0424 | 9.0325 | 4164 | 0.8520 | -0.0233 | 0.8520 | 0.9230 | | 0.0424 | 9.0369 | 4166 | 0.8725 | -0.0233 | 0.8725 | 0.9341 | | 0.0424 | 9.0412 | 4168 | 0.8969 | -0.0233 | 0.8969 | 0.9470 | | 0.0424 | 9.0456 | 4170 | 0.9139 | -0.0233 | 0.9139 | 0.9560 | | 0.0424 | 9.0499 | 4172 | 0.9118 | 0.0 | 0.9118 | 0.9549 | | 0.0424 | 9.0542 | 4174 | 0.8955 | -0.0233 | 0.8955 | 0.9463 | | 0.0424 | 9.0586 | 4176 | 0.8793 | -0.0233 | 0.8793 | 0.9377 | | 0.0424 | 9.0629 | 4178 | 0.8627 | -0.0233 | 0.8627 | 0.9288 | | 0.0424 | 9.0672 | 4180 | 0.8501 | -0.0233 | 
0.8501 | 0.9220 | | 0.0424 | 9.0716 | 4182 | 0.8483 | -0.0233 | 0.8483 | 0.9210 | | 0.0424 | 9.0759 | 4184 | 0.8408 | -0.0233 | 0.8408 | 0.9169 | | 0.0424 | 9.0803 | 4186 | 0.8412 | -0.0233 | 0.8412 | 0.9172 | | 0.0424 | 9.0846 | 4188 | 0.8421 | -0.0233 | 0.8421 | 0.9176 | | 0.0424 | 9.0889 | 4190 | 0.8378 | -0.0233 | 0.8378 | 0.9153 | | 0.0424 | 9.0933 | 4192 | 0.8361 | -0.0233 | 0.8361 | 0.9144 | | 0.0424 | 9.0976 | 4194 | 0.8374 | -0.0233 | 0.8374 | 0.9151 | | 0.0424 | 9.1020 | 4196 | 0.8369 | -0.0233 | 0.8369 | 0.9148 | | 0.0424 | 9.1063 | 4198 | 0.8242 | -0.0233 | 0.8242 | 0.9079 | | 0.0424 | 9.1106 | 4200 | 0.8192 | -0.0233 | 0.8192 | 0.9051 | | 0.0424 | 9.1150 | 4202 | 0.8152 | -0.0233 | 0.8152 | 0.9029 | | 0.0424 | 9.1193 | 4204 | 0.8134 | -0.0233 | 0.8134 | 0.9019 | | 0.0424 | 9.1236 | 4206 | 0.8146 | -0.0233 | 0.8146 | 0.9026 | | 0.0424 | 9.1280 | 4208 | 0.8186 | -0.0233 | 0.8186 | 0.9048 | | 0.0424 | 9.1323 | 4210 | 0.8344 | -0.0233 | 0.8344 | 0.9135 | | 0.0424 | 9.1367 | 4212 | 0.8629 | -0.0233 | 0.8629 | 0.9289 | | 0.0424 | 9.1410 | 4214 | 0.8909 | -0.0233 | 0.8909 | 0.9439 | | 0.0424 | 9.1453 | 4216 | 0.9161 | -0.0233 | 0.9161 | 0.9571 | | 0.0424 | 9.1497 | 4218 | 0.9264 | -0.0233 | 0.9264 | 0.9625 | | 0.0424 | 9.1540 | 4220 | 0.9301 | -0.0233 | 0.9301 | 0.9644 | | 0.0424 | 9.1584 | 4222 | 0.9199 | -0.0233 | 0.9199 | 0.9591 | | 0.0424 | 9.1627 | 4224 | 0.9042 | -0.0233 | 0.9042 | 0.9509 | | 0.0424 | 9.1670 | 4226 | 0.8939 | -0.0233 | 0.8939 | 0.9455 | | 0.0424 | 9.1714 | 4228 | 0.8770 | -0.0233 | 0.8770 | 0.9365 | | 0.0424 | 9.1757 | 4230 | 0.8621 | -0.0233 | 0.8621 | 0.9285 | | 0.0424 | 9.1800 | 4232 | 0.8434 | -0.0233 | 0.8434 | 0.9183 | | 0.0424 | 9.1844 | 4234 | 0.8287 | -0.0233 | 0.8287 | 0.9104 | | 0.0424 | 9.1887 | 4236 | 0.8252 | -0.0233 | 0.8252 | 0.9084 | | 0.0424 | 9.1931 | 4238 | 0.8240 | -0.0233 | 0.8240 | 0.9078 | | 0.0424 | 9.1974 | 4240 | 0.8345 | -0.0233 | 0.8345 | 0.9135 | | 0.0424 | 9.2017 | 4242 | 0.8504 | -0.0233 | 0.8504 | 0.9222 | | 0.0424 | 9.2061 | 4244 | 0.8746 | -0.0233 | 0.8746 | 0.9352 | | 0.0424 | 9.2104 | 4246 | 0.9007 | -0.0233 | 0.9007 | 0.9491 | | 0.0424 | 9.2148 | 4248 | 0.9172 | -0.0233 | 0.9172 | 0.9577 | | 0.0424 | 9.2191 | 4250 | 0.9224 | -0.0233 | 0.9224 | 0.9604 | | 0.0424 | 9.2234 | 4252 | 0.9149 | -0.0233 | 0.9149 | 0.9565 | | 0.0424 | 9.2278 | 4254 | 0.8963 | -0.0233 | 0.8963 | 0.9467 | | 0.0424 | 9.2321 | 4256 | 0.8703 | -0.0233 | 0.8703 | 0.9329 | | 0.0424 | 9.2364 | 4258 | 0.8501 | -0.0233 | 0.8501 | 0.9220 | | 0.0424 | 9.2408 | 4260 | 0.8237 | -0.0233 | 0.8237 | 0.9076 | | 0.0424 | 9.2451 | 4262 | 0.7982 | -0.0233 | 0.7982 | 0.8934 | | 0.0424 | 9.2495 | 4264 | 0.7864 | -0.0233 | 0.7864 | 0.8868 | | 0.0424 | 9.2538 | 4266 | 0.7868 | -0.0233 | 0.7868 | 0.8870 | | 0.0424 | 9.2581 | 4268 | 0.7922 | -0.0233 | 0.7922 | 0.8900 | | 0.0424 | 9.2625 | 4270 | 0.8072 | -0.0233 | 0.8072 | 0.8984 | | 0.0424 | 9.2668 | 4272 | 0.8290 | -0.0233 | 0.8290 | 0.9105 | | 0.0424 | 9.2711 | 4274 | 0.8541 | -0.0233 | 0.8541 | 0.9242 | | 0.0424 | 9.2755 | 4276 | 0.8788 | -0.0233 | 0.8788 | 0.9374 | | 0.0424 | 9.2798 | 4278 | 0.9017 | -0.0233 | 0.9017 | 0.9496 | | 0.0424 | 9.2842 | 4280 | 0.9283 | -0.0233 | 0.9283 | 0.9635 | | 0.0424 | 9.2885 | 4282 | 0.9434 | -0.0233 | 0.9434 | 0.9713 | | 0.0424 | 9.2928 | 4284 | 0.9421 | -0.0233 | 0.9421 | 0.9706 | | 0.0424 | 9.2972 | 4286 | 0.9243 | -0.0233 | 0.9243 | 0.9614 | | 0.0424 | 9.3015 | 4288 | 0.8970 | -0.0233 | 0.8970 | 0.9471 | | 0.0424 | 9.3059 | 4290 | 0.8826 | -0.0233 | 0.8826 | 0.9395 | | 0.0424 | 
9.3102 | 4292 | 0.8728 | -0.0233 | 0.8728 | 0.9342 | | 0.0424 | 9.3145 | 4294 | 0.8694 | -0.0233 | 0.8694 | 0.9324 | | 0.0424 | 9.3189 | 4296 | 0.8686 | -0.0233 | 0.8686 | 0.9320 | | 0.0424 | 9.3232 | 4298 | 0.8656 | -0.0233 | 0.8656 | 0.9304 | | 0.0424 | 9.3275 | 4300 | 0.8575 | -0.0233 | 0.8575 | 0.9260 | | 0.0424 | 9.3319 | 4302 | 0.8548 | -0.0233 | 0.8548 | 0.9245 | | 0.0424 | 9.3362 | 4304 | 0.8618 | -0.0233 | 0.8618 | 0.9283 | | 0.0424 | 9.3406 | 4306 | 0.8699 | -0.0233 | 0.8699 | 0.9327 | | 0.0424 | 9.3449 | 4308 | 0.8840 | -0.0233 | 0.8840 | 0.9402 | | 0.0424 | 9.3492 | 4310 | 0.9057 | -0.0233 | 0.9057 | 0.9517 | | 0.0424 | 9.3536 | 4312 | 0.9240 | -0.0233 | 0.9240 | 0.9612 | | 0.0424 | 9.3579 | 4314 | 0.9257 | -0.0233 | 0.9257 | 0.9621 | | 0.0424 | 9.3623 | 4316 | 0.9304 | -0.0233 | 0.9304 | 0.9645 | | 0.0424 | 9.3666 | 4318 | 0.9243 | -0.0233 | 0.9243 | 0.9614 | | 0.0424 | 9.3709 | 4320 | 0.9069 | -0.0233 | 0.9069 | 0.9523 | | 0.0424 | 9.3753 | 4322 | 0.8934 | -0.0233 | 0.8934 | 0.9452 | | 0.0424 | 9.3796 | 4324 | 0.8815 | -0.0233 | 0.8815 | 0.9389 | | 0.0424 | 9.3839 | 4326 | 0.8832 | -0.0233 | 0.8832 | 0.9398 | | 0.0424 | 9.3883 | 4328 | 0.8888 | -0.0233 | 0.8888 | 0.9428 | | 0.0424 | 9.3926 | 4330 | 0.8906 | -0.0233 | 0.8906 | 0.9437 | | 0.0424 | 9.3970 | 4332 | 0.8798 | -0.0233 | 0.8798 | 0.9380 | | 0.0424 | 9.4013 | 4334 | 0.8604 | -0.0233 | 0.8604 | 0.9276 | | 0.0424 | 9.4056 | 4336 | 0.8407 | -0.0233 | 0.8407 | 0.9169 | | 0.0424 | 9.4100 | 4338 | 0.8310 | -0.0233 | 0.8310 | 0.9116 | | 0.0424 | 9.4143 | 4340 | 0.8213 | -0.0233 | 0.8213 | 0.9063 | | 0.0424 | 9.4187 | 4342 | 0.8181 | -0.0233 | 0.8181 | 0.9045 | | 0.0424 | 9.4230 | 4344 | 0.8224 | -0.0233 | 0.8224 | 0.9068 | | 0.0424 | 9.4273 | 4346 | 0.8318 | -0.0233 | 0.8318 | 0.9120 | | 0.0424 | 9.4317 | 4348 | 0.8442 | -0.0233 | 0.8442 | 0.9188 | | 0.0424 | 9.4360 | 4350 | 0.8485 | -0.0233 | 0.8485 | 0.9211 | | 0.0424 | 9.4403 | 4352 | 0.8600 | -0.0233 | 0.8600 | 0.9274 | | 0.0424 | 9.4447 | 4354 | 0.8774 | -0.0233 | 0.8774 | 0.9367 | | 0.0424 | 9.4490 | 4356 | 0.8916 | -0.0233 | 0.8916 | 0.9442 | | 0.0424 | 9.4534 | 4358 | 0.8981 | -0.0233 | 0.8981 | 0.9477 | | 0.0424 | 9.4577 | 4360 | 0.8966 | -0.0233 | 0.8966 | 0.9469 | | 0.0424 | 9.4620 | 4362 | 0.8934 | -0.0233 | 0.8934 | 0.9452 | | 0.0424 | 9.4664 | 4364 | 0.8941 | -0.0233 | 0.8941 | 0.9456 | | 0.0424 | 9.4707 | 4366 | 0.8946 | -0.0233 | 0.8946 | 0.9458 | | 0.0424 | 9.4751 | 4368 | 0.8861 | -0.0233 | 0.8861 | 0.9413 | | 0.0424 | 9.4794 | 4370 | 0.8744 | -0.0233 | 0.8744 | 0.9351 | | 0.0424 | 9.4837 | 4372 | 0.8695 | -0.0233 | 0.8695 | 0.9325 | | 0.0424 | 9.4881 | 4374 | 0.8691 | -0.0233 | 0.8691 | 0.9323 | | 0.0424 | 9.4924 | 4376 | 0.8634 | -0.0233 | 0.8634 | 0.9292 | | 0.0424 | 9.4967 | 4378 | 0.8511 | -0.0233 | 0.8511 | 0.9226 | | 0.0424 | 9.5011 | 4380 | 0.8425 | -0.0233 | 0.8425 | 0.9179 | | 0.0424 | 9.5054 | 4382 | 0.8319 | -0.0233 | 0.8319 | 0.9121 | | 0.0424 | 9.5098 | 4384 | 0.8311 | -0.0233 | 0.8311 | 0.9116 | | 0.0424 | 9.5141 | 4386 | 0.8336 | -0.0233 | 0.8336 | 0.9130 | | 0.0424 | 9.5184 | 4388 | 0.8434 | -0.0233 | 0.8434 | 0.9184 | | 0.0424 | 9.5228 | 4390 | 0.8474 | -0.0233 | 0.8474 | 0.9206 | | 0.0424 | 9.5271 | 4392 | 0.8546 | -0.0233 | 0.8546 | 0.9244 | | 0.0424 | 9.5315 | 4394 | 0.8653 | -0.0233 | 0.8653 | 0.9302 | | 0.0424 | 9.5358 | 4396 | 0.8746 | -0.0233 | 0.8746 | 0.9352 | | 0.0424 | 9.5401 | 4398 | 0.8788 | -0.0233 | 0.8788 | 0.9374 | | 0.0424 | 9.5445 | 4400 | 0.8752 | -0.0233 | 0.8752 | 0.9355 | | 0.0424 | 9.5488 | 4402 | 0.8671 | -0.0233 | 
0.8671 | 0.9312 | | 0.0424 | 9.5531 | 4404 | 0.8606 | -0.0233 | 0.8606 | 0.9277 | | 0.0424 | 9.5575 | 4406 | 0.8544 | -0.0233 | 0.8544 | 0.9243 | | 0.0424 | 9.5618 | 4408 | 0.8517 | -0.0233 | 0.8517 | 0.9229 | | 0.0424 | 9.5662 | 4410 | 0.8485 | -0.0233 | 0.8485 | 0.9211 | | 0.0424 | 9.5705 | 4412 | 0.8473 | -0.0233 | 0.8473 | 0.9205 | | 0.0424 | 9.5748 | 4414 | 0.8512 | -0.0233 | 0.8512 | 0.9226 | | 0.0424 | 9.5792 | 4416 | 0.8563 | -0.0233 | 0.8563 | 0.9254 | | 0.0424 | 9.5835 | 4418 | 0.8588 | -0.0233 | 0.8588 | 0.9267 | | 0.0424 | 9.5879 | 4420 | 0.8635 | -0.0233 | 0.8635 | 0.9293 | | 0.0424 | 9.5922 | 4422 | 0.8724 | -0.0233 | 0.8724 | 0.9340 | | 0.0424 | 9.5965 | 4424 | 0.8760 | -0.0233 | 0.8760 | 0.9360 | | 0.0424 | 9.6009 | 4426 | 0.8790 | -0.0233 | 0.8790 | 0.9376 | | 0.0424 | 9.6052 | 4428 | 0.8784 | -0.0233 | 0.8784 | 0.9372 | | 0.0424 | 9.6095 | 4430 | 0.8727 | -0.0233 | 0.8727 | 0.9342 | | 0.0424 | 9.6139 | 4432 | 0.8656 | -0.0233 | 0.8656 | 0.9304 | | 0.0424 | 9.6182 | 4434 | 0.8645 | -0.0233 | 0.8645 | 0.9298 | | 0.0424 | 9.6226 | 4436 | 0.8667 | -0.0233 | 0.8667 | 0.9310 | | 0.0424 | 9.6269 | 4438 | 0.8703 | -0.0233 | 0.8703 | 0.9329 | | 0.0424 | 9.6312 | 4440 | 0.8682 | -0.0233 | 0.8682 | 0.9318 | | 0.0424 | 9.6356 | 4442 | 0.8636 | -0.0233 | 0.8636 | 0.9293 | | 0.0424 | 9.6399 | 4444 | 0.8634 | -0.0233 | 0.8634 | 0.9292 | | 0.0424 | 9.6443 | 4446 | 0.8633 | -0.0233 | 0.8633 | 0.9291 | | 0.0424 | 9.6486 | 4448 | 0.8663 | -0.0233 | 0.8663 | 0.9308 | | 0.0424 | 9.6529 | 4450 | 0.8663 | -0.0233 | 0.8663 | 0.9308 | | 0.0424 | 9.6573 | 4452 | 0.8673 | -0.0233 | 0.8673 | 0.9313 | | 0.0424 | 9.6616 | 4454 | 0.8649 | -0.0233 | 0.8649 | 0.9300 | | 0.0424 | 9.6659 | 4456 | 0.8672 | -0.0233 | 0.8672 | 0.9312 | | 0.0424 | 9.6703 | 4458 | 0.8711 | -0.0233 | 0.8711 | 0.9333 | | 0.0424 | 9.6746 | 4460 | 0.8765 | -0.0233 | 0.8765 | 0.9362 | | 0.0424 | 9.6790 | 4462 | 0.8798 | -0.0233 | 0.8798 | 0.9380 | | 0.0424 | 9.6833 | 4464 | 0.8834 | -0.0233 | 0.8834 | 0.9399 | | 0.0424 | 9.6876 | 4466 | 0.8828 | -0.0233 | 0.8828 | 0.9396 | | 0.0424 | 9.6920 | 4468 | 0.8779 | -0.0233 | 0.8779 | 0.9369 | | 0.0424 | 9.6963 | 4470 | 0.8720 | -0.0233 | 0.8720 | 0.9338 | | 0.0424 | 9.7007 | 4472 | 0.8707 | -0.0233 | 0.8707 | 0.9331 | | 0.0424 | 9.7050 | 4474 | 0.8678 | -0.0233 | 0.8678 | 0.9316 | | 0.0424 | 9.7093 | 4476 | 0.8685 | -0.0233 | 0.8685 | 0.9319 | | 0.0424 | 9.7137 | 4478 | 0.8674 | -0.0233 | 0.8674 | 0.9313 | | 0.0424 | 9.7180 | 4480 | 0.8659 | -0.0233 | 0.8659 | 0.9305 | | 0.0424 | 9.7223 | 4482 | 0.8604 | -0.0233 | 0.8604 | 0.9276 | | 0.0424 | 9.7267 | 4484 | 0.8534 | -0.0233 | 0.8534 | 0.9238 | | 0.0424 | 9.7310 | 4486 | 0.8442 | -0.0233 | 0.8442 | 0.9188 | | 0.0424 | 9.7354 | 4488 | 0.8373 | -0.0233 | 0.8373 | 0.9150 | | 0.0424 | 9.7397 | 4490 | 0.8343 | -0.0233 | 0.8343 | 0.9134 | | 0.0424 | 9.7440 | 4492 | 0.8298 | -0.0233 | 0.8298 | 0.9110 | | 0.0424 | 9.7484 | 4494 | 0.8252 | -0.0233 | 0.8252 | 0.9084 | | 0.0424 | 9.7527 | 4496 | 0.8226 | -0.0233 | 0.8226 | 0.9070 | | 0.0424 | 9.7570 | 4498 | 0.8206 | -0.0233 | 0.8206 | 0.9059 | | 0.0382 | 9.7614 | 4500 | 0.8209 | -0.0233 | 0.8209 | 0.9060 | | 0.0382 | 9.7657 | 4502 | 0.8216 | -0.0233 | 0.8216 | 0.9064 | | 0.0382 | 9.7701 | 4504 | 0.8250 | -0.0233 | 0.8250 | 0.9083 | | 0.0382 | 9.7744 | 4506 | 0.8263 | -0.0233 | 0.8263 | 0.9090 | | 0.0382 | 9.7787 | 4508 | 0.8249 | -0.0233 | 0.8249 | 0.9082 | | 0.0382 | 9.7831 | 4510 | 0.8256 | -0.0233 | 0.8256 | 0.9086 | | 0.0382 | 9.7874 | 4512 | 0.8283 | -0.0233 | 0.8283 | 0.9101 | | 0.0382 | 
9.7918 | 4514 | 0.8297 | -0.0233 | 0.8297 | 0.9109 | | 0.0382 | 9.7961 | 4516 | 0.8285 | -0.0233 | 0.8285 | 0.9102 | | 0.0382 | 9.8004 | 4518 | 0.8264 | -0.0233 | 0.8264 | 0.9091 | | 0.0382 | 9.8048 | 4520 | 0.8230 | -0.0233 | 0.8230 | 0.9072 | | 0.0382 | 9.8091 | 4522 | 0.8199 | -0.0233 | 0.8199 | 0.9055 | | 0.0382 | 9.8134 | 4524 | 0.8206 | -0.0233 | 0.8206 | 0.9059 | | 0.0382 | 9.8178 | 4526 | 0.8216 | -0.0233 | 0.8216 | 0.9064 | | 0.0382 | 9.8221 | 4528 | 0.8259 | -0.0233 | 0.8259 | 0.9088 | | 0.0382 | 9.8265 | 4530 | 0.8316 | -0.0233 | 0.8316 | 0.9119 | | 0.0382 | 9.8308 | 4532 | 0.8373 | -0.0233 | 0.8373 | 0.9150 | | 0.0382 | 9.8351 | 4534 | 0.8434 | -0.0233 | 0.8434 | 0.9184 | | 0.0382 | 9.8395 | 4536 | 0.8505 | -0.0233 | 0.8505 | 0.9222 | | 0.0382 | 9.8438 | 4538 | 0.8585 | -0.0233 | 0.8585 | 0.9266 | | 0.0382 | 9.8482 | 4540 | 0.8660 | -0.0233 | 0.8660 | 0.9306 | | 0.0382 | 9.8525 | 4542 | 0.8699 | -0.0233 | 0.8699 | 0.9327 | | 0.0382 | 9.8568 | 4544 | 0.8714 | -0.0233 | 0.8714 | 0.9335 | | 0.0382 | 9.8612 | 4546 | 0.8719 | -0.0233 | 0.8719 | 0.9337 | | 0.0382 | 9.8655 | 4548 | 0.8730 | -0.0233 | 0.8730 | 0.9343 | | 0.0382 | 9.8698 | 4550 | 0.8739 | -0.0233 | 0.8739 | 0.9348 | | 0.0382 | 9.8742 | 4552 | 0.8726 | -0.0233 | 0.8726 | 0.9341 | | 0.0382 | 9.8785 | 4554 | 0.8697 | -0.0233 | 0.8697 | 0.9326 | | 0.0382 | 9.8829 | 4556 | 0.8652 | -0.0233 | 0.8652 | 0.9301 | | 0.0382 | 9.8872 | 4558 | 0.8611 | -0.0233 | 0.8611 | 0.9280 | | 0.0382 | 9.8915 | 4560 | 0.8581 | -0.0233 | 0.8581 | 0.9263 | | 0.0382 | 9.8959 | 4562 | 0.8558 | -0.0233 | 0.8558 | 0.9251 | | 0.0382 | 9.9002 | 4564 | 0.8541 | -0.0233 | 0.8541 | 0.9242 | | 0.0382 | 9.9046 | 4566 | 0.8522 | -0.0233 | 0.8522 | 0.9231 | | 0.0382 | 9.9089 | 4568 | 0.8505 | -0.0233 | 0.8505 | 0.9222 | | 0.0382 | 9.9132 | 4570 | 0.8487 | -0.0233 | 0.8487 | 0.9212 | | 0.0382 | 9.9176 | 4572 | 0.8484 | -0.0233 | 0.8484 | 0.9211 | | 0.0382 | 9.9219 | 4574 | 0.8488 | -0.0233 | 0.8488 | 0.9213 | | 0.0382 | 9.9262 | 4576 | 0.8489 | -0.0233 | 0.8489 | 0.9214 | | 0.0382 | 9.9306 | 4578 | 0.8494 | -0.0233 | 0.8494 | 0.9216 | | 0.0382 | 9.9349 | 4580 | 0.8492 | -0.0233 | 0.8492 | 0.9215 | | 0.0382 | 9.9393 | 4582 | 0.8497 | -0.0233 | 0.8497 | 0.9218 | | 0.0382 | 9.9436 | 4584 | 0.8506 | -0.0233 | 0.8506 | 0.9223 | | 0.0382 | 9.9479 | 4586 | 0.8517 | -0.0233 | 0.8517 | 0.9228 | | 0.0382 | 9.9523 | 4588 | 0.8521 | -0.0233 | 0.8521 | 0.9231 | | 0.0382 | 9.9566 | 4590 | 0.8523 | -0.0233 | 0.8523 | 0.9232 | | 0.0382 | 9.9610 | 4592 | 0.8527 | -0.0233 | 0.8527 | 0.9234 | | 0.0382 | 9.9653 | 4594 | 0.8536 | -0.0233 | 0.8536 | 0.9239 | | 0.0382 | 9.9696 | 4596 | 0.8542 | -0.0233 | 0.8542 | 0.9242 | | 0.0382 | 9.9740 | 4598 | 0.8543 | -0.0233 | 0.8543 | 0.9243 | | 0.0382 | 9.9783 | 4600 | 0.8543 | -0.0233 | 0.8543 | 0.9243 | | 0.0382 | 9.9826 | 4602 | 0.8541 | -0.0233 | 0.8541 | 0.9242 | | 0.0382 | 9.9870 | 4604 | 0.8541 | -0.0233 | 0.8541 | 0.9242 | | 0.0382 | 9.9913 | 4606 | 0.8540 | -0.0233 | 0.8540 | 0.9241 | | 0.0382 | 9.9957 | 4608 | 0.8539 | -0.0233 | 0.8539 | 0.9241 | | 0.0382 | 10.0 | 4610 | 0.8539 | -0.0233 | 0.8539 | 0.9241 | ### Framework versions - Transformers 4.44.2 - Pytorch 2.4.0+cu118 - Datasets 2.21.0 - Tokenizers 0.19.1
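The table above logs a validation row every 2 optimization steps, while the training-loss column refreshes on a coarser 500-step cadence (0.0481 up to step ~4000, 0.0424 through ~4500, 0.0382 to the final step 4610 at epoch 10.0). A minimal sketch of a 🤗 `TrainingArguments` configuration that would reproduce this logging pattern, assuming the run used the standard `Trainer`; only the cadence is inferred from the log, and every other value here is an illustrative placeholder:

```python
# Hedged sketch: an evaluation/logging cadence matching the table above.
# Only the step schedule is inferred from the log; other values are placeholders.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="out",            # placeholder
    num_train_epochs=10,         # the log ends at epoch 10.0 (step 4610)
    eval_strategy="steps",       # evaluate on a fixed step schedule
    eval_steps=2,                # one validation row every 2 steps, as in the table
    logging_steps=500,           # training loss column updates at steps 4000, 4500, ...
)
print(args.eval_steps, args.logging_steps)
```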
bionot/15-2
bionot
"2024-11-12T23:08:58Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:08:58Z"
Entry not found
barchetta/pino-131009
barchetta
"2024-11-12T23:15:03Z"
0
0
null
[ "safetensors", "llama", "region:us" ]
null
"2024-11-12T23:09:07Z"
Entry not found
tttx/problem12_model_more_aug_30
tttx
"2024-11-12T23:56:34Z"
0
0
peft
[ "peft", "safetensors", "llama", "alignment-handbook", "trl", "sft", "generated_from_trainer", "dataset:tttx/problem12_data_more_aug", "base_model:barc0/Llama-3.1-ARC-Potpourri-Transduction-8B", "base_model:adapter:barc0/Llama-3.1-ARC-Potpourri-Transduction-8B", "license:llama3.1", "region:us" ]
null
"2024-11-12T23:09:09Z"
--- base_model: barc0/Llama-3.1-ARC-Potpourri-Transduction-8B datasets: - tttx/problem12_data_more_aug library_name: peft license: llama3.1 tags: - alignment-handbook - trl - sft - generated_from_trainer model-index: - name: problem12_model_more_aug_30 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # problem12_model_more_aug_30 This model is a fine-tuned version of [barc0/Llama-3.1-ARC-Potpourri-Transduction-8B](https://huggingface.co/barc0/Llama-3.1-ARC-Potpourri-Transduction-8B) on the tttx/problem12_data_more_aug dataset. It achieves the following results on the evaluation set: - Loss: nan ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - distributed_type: multi-GPU - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: cosine - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | 0.0 | 1.0 | 2 | nan | | 0.002 | 2.0 | 4 | nan | ### Framework versions - PEFT 0.13.2 - Transformers 4.47.0.dev0 - Pytorch 2.4.0+cu121 - Datasets 3.1.0 - Tokenizers 0.20.3
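Since this repo is a PEFT adapter over `barc0/Llama-3.1-ARC-Potpourri-Transduction-8B` rather than a full checkpoint, inference would follow the usual `peft` loading pattern. A minimal sketch, assuming the adapter is saved in the standard PEFT layout; this is not an official snippet from the authors, and the prompt is a placeholder:

```python
# Hedged sketch: load the base model, then attach this repo as a PEFT adapter.
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "barc0/Llama-3.1-ARC-Potpourri-Transduction-8B"
adapter_id = "tttx/problem12_model_more_aug_30"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.bfloat16)
model = PeftModel.from_pretrained(base, adapter_id)  # applies the adapter weights

inputs = tokenizer("<task-specific ARC prompt here>", return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```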
glif-loradex-trainer/dham_dham_osteology2
glif-loradex-trainer
"2024-11-12T23:11:12Z"
0
0
diffusers
[ "diffusers", "text-to-image", "template:sd-lora", "base_model:black-forest-labs/FLUX.1-dev", "base_model:finetune:black-forest-labs/FLUX.1-dev", "license:other", "region:us", "flux", "lora", "base_model:adapter:black-forest-labs/FLUX.1-dev" ]
text-to-image
"2024-11-12T23:10:25Z"
--- tags: - diffusers - text-to-image - template:sd-lora - base_model:black-forest-labs/FLUX.1-dev - base_model:finetune:black-forest-labs/FLUX.1-dev - license:other - region:us - flux - lora widget: - output: url: samples/1731452894040__000003000_0.jpg text: a robot in the style of TOK - output: url: samples/1731452917113__000003000_1.jpg text: pacman in the style of TOK - output: url: samples/1731452940187__000003000_2.jpg text: a retro mac in the style of TOK - output: url: samples/1731452963264__000003000_3.jpg text: pikachu in the style of TOK - output: url: samples/1731452986366__000003000_4.jpg text: a salamander in the style of TOK - output: url: samples/1731453009478__000003000_5.jpg text: a ghost in the style of TOK base_model: black-forest-labs/FLUX.1-dev trigger: TOK instance_prompt: TOK license: other license_name: flux-1-dev-non-commercial-license license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md --- # dham_osteology2 Model trained with [AI Toolkit by Ostris](https://github.com/ostris/ai-toolkit) under the [Glif Loradex program](https://huggingface.co/glif-loradex-trainer) by [Glif](https://glif.app) user `dham`. <Gallery /> ## Trigger words You should use `TOK` to trigger the image generation. ## Download model Weights for this model are available in Safetensors format. [Download](/glif-loradex-trainer/dham_dham_osteology2/tree/main) them in the Files & versions tab. ## License This model is licensed under the [flux-1-dev-non-commercial-license](https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md).
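The card names the trigger word `TOK` but includes no loading snippet. With `diffusers`, FLUX LoRAs like this one are typically attached via `load_lora_weights`; a minimal sketch, assuming the repo stores its weights in a diffusers-compatible LoRA layout (not verified against this repository):

```python
# Hedged sketch: apply this LoRA to FLUX.1-dev and prompt with the trigger word.
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
).to("cuda")
pipe.load_lora_weights("glif-loradex-trainer/dham_dham_osteology2")

image = pipe("a ghost in the style of TOK", num_inference_steps=28).images[0]
image.save("ghost_tok.png")
```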
kenken6696/Llama-3.2-3B_fix_head
kenken6696
"2024-11-12T23:14:16Z"
0
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
"2024-11-12T23:11:13Z"
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
OhaymakingO/3-12112149-02Haymak
OhaymakingO
"2024-11-12T23:15:52Z"
0
0
null
[ "safetensors", "llama", "region:us" ]
null
"2024-11-12T23:11:52Z"
Entry not found
touhidulislam/BERTweet_retrain_2020_14
touhidulislam
"2024-11-12T23:14:01Z"
0
0
transformers
[ "transformers", "safetensors", "roberta", "fill-mask", "generated_from_trainer", "base_model:vinai/bertweet-base", "base_model:finetune:vinai/bertweet-base", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
fill-mask
"2024-11-12T23:13:38Z"
--- library_name: transformers license: mit base_model: vinai/bertweet-base tags: - generated_from_trainer model-index: - name: BERTweet_retrain_2020_14 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # BERTweet_retrain_2020_14 This model is a fine-tuned version of [vinai/bertweet-base](https://huggingface.co/vinai/bertweet-base) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 2.6032 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | 3.0595 | 1.0 | 3005 | 2.6784 | | 2.5428 | 2.0 | 6010 | 2.6193 | | 2.5419 | 3.0 | 9015 | 2.5905 | ### Framework versions - Transformers 4.45.1 - Pytorch 2.1.0+cu121 - Datasets 3.0.1 - Tokenizers 0.20.0
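As a retrained `vinai/bertweet-base` checkpoint, this model should drop into the standard fill-mask pipeline (BERTweet uses the RoBERTa-style `<mask>` token). A minimal sketch, assuming the saved tokenizer carries over BERTweet's tweet normalization; this is not an official usage example:

```python
# Hedged sketch: masked-token prediction with the retrained BERTweet model.
from transformers import pipeline

fill = pipeline("fill-mask", model="touhidulislam/BERTweet_retrain_2020_14")
for pred in fill("The weather today is <mask> !"):
    print(f"{pred['token_str']:>12}  {pred['score']:.3f}")
```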
seyviour/paligemma_vqav2_2
seyviour
"2024-11-13T01:05:21Z"
0
0
peft
[ "peft", "tensorboard", "safetensors", "generated_from_trainer", "base_model:google/paligemma-3b-pt-224", "base_model:adapter:google/paligemma-3b-pt-224", "license:gemma", "region:us" ]
null
"2024-11-12T23:15:10Z"
--- library_name: peft license: gemma base_model: google/paligemma-3b-pt-224 tags: - generated_from_trainer model-index: - name: paligemma_vqav2_2 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # paligemma_vqav2_2 This model is a fine-tuned version of [google/paligemma-3b-pt-224](https://huggingface.co/google/paligemma-3b-pt-224) on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 4e-05 - train_batch_size: 3 - eval_batch_size: 3 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 12 - optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 2 - num_epochs: 1 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:------:|:----:|:---------------:| | No log | 0.9888 | 66 | 1.0517 | ### Framework versions - PEFT 0.13.2 - Transformers 4.46.1 - Pytorch 2.4.0.post301 - Datasets 2.14.4 - Tokenizers 0.20.1
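Loading this adapter for VQA-style prompting would pair the `google/paligemma-3b-pt-224` base with `peft`. A minimal sketch; the image path and the `answer ...` prompt prefix are assumptions (that prefix is common in PaliGemma VQA fine-tunes), not details confirmed by the card:

```python
# Hedged sketch: base PaliGemma + this PEFT adapter for visual question answering.
import torch
from PIL import Image
from peft import PeftModel
from transformers import AutoProcessor, PaliGemmaForConditionalGeneration

base_id = "google/paligemma-3b-pt-224"
processor = AutoProcessor.from_pretrained(base_id)
base = PaliGemmaForConditionalGeneration.from_pretrained(base_id, torch_dtype=torch.bfloat16)
model = PeftModel.from_pretrained(base, "seyviour/paligemma_vqav2_2")

image = Image.open("example.jpg")  # placeholder image path
inputs = processor(text="answer What is in the picture?", images=image, return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=20)
print(processor.decode(out[0], skip_special_tokens=True))
```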
t2m-mu/musicgene-small-musiccaps
t2m-mu
"2024-11-13T00:40:12Z"
0
0
peft
[ "peft", "safetensors", "text-to-audio", "musics-gen", "generated_from_trainer", "base_model:facebook/musicgen-small", "base_model:adapter:facebook/musicgen-small", "license:cc-by-nc-4.0", "region:us" ]
text-to-audio
"2024-11-12T23:15:11Z"
--- library_name: peft license: cc-by-nc-4.0 base_model: facebook/musicgen-small tags: - text-to-audio - musics-gen - generated_from_trainer model-index: - name: musicgene-small-musiccaps results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # musicgene-small-musiccaps This model is a fine-tuned version of [facebook/musicgen-small](https://huggingface.co/facebook/musicgen-small) on the ylacombe/tiny-punk dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 2 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 8 - total_train_batch_size: 16 - optimizer: Use adamw_torch with betas=(0.9,0.99) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: linear - num_epochs: 1 - mixed_precision_training: Native AMP ### Training results ### Framework versions - PEFT 0.13.2 - Transformers 4.46.2 - Pytorch 2.5.0a0+872d972e41.nv24.08 - Datasets 3.1.0 - Tokenizers 0.20.3
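For inference, the adapter would sit on top of `facebook/musicgen-small` via `peft`. A minimal sketch, assuming the adapter loads with the standard PEFT API and that generation settings carry over from the base model; not an official snippet, and the text prompt is a placeholder:

```python
# Hedged sketch: text-to-audio with base MusicGen plus this adapter.
import scipy.io.wavfile
from peft import PeftModel
from transformers import AutoProcessor, MusicgenForConditionalGeneration

base_id = "facebook/musicgen-small"
processor = AutoProcessor.from_pretrained(base_id)
base = MusicgenForConditionalGeneration.from_pretrained(base_id)
model = PeftModel.from_pretrained(base, "t2m-mu/musicgene-small-musiccaps")

inputs = processor(text=["lo-fi punk with driving drums"], return_tensors="pt")
audio = model.generate(**inputs, max_new_tokens=256)  # ~5 s at MusicGen's 50 Hz frame rate
rate = base.config.audio_encoder.sampling_rate        # 32 kHz for musicgen-small
scipy.io.wavfile.write("sample.wav", rate=rate, data=audio[0, 0].numpy())
```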
shropsdarcey84/novilunar
shropsdarcey84
"2024-11-12T23:15:12Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:15:12Z"
Entry not found
shropsdarcey84/esotericism
shropsdarcey84
"2024-11-12T23:15:22Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:15:22Z"
Entry not found
featherless-ai-quants/ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-GGUF
featherless-ai-quants
"2024-11-12T23:27:23Z"
0
0
null
[ "gguf", "text-generation", "base_model:ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta", "base_model:quantized:ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta", "region:us" ]
text-generation
"2024-11-12T23:15:23Z"
--- base_model: ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta pipeline_tag: text-generation quantized_by: featherless-ai-quants --- # ArianAskari/SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta GGUF Quantizations 🚀 ![Featherless AI Quants](./featherless-quants.png) *Optimized GGUF quantization files for enhanced model performance* > Powered by [Featherless AI](https://featherless.ai) - run any model you'd like for a simple small fee. --- ## Available Quantizations 📊 | Quantization Type | File | Size | |-------------------|------|------| | IQ4_XS | [ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-IQ4_XS.gguf](https://huggingface.co/featherless-ai-quants/ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-GGUF/blob/main/ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-IQ4_XS.gguf) | 3761.66 MB | | Q2_K | [ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-Q2_K.gguf](https://huggingface.co/featherless-ai-quants/ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-GGUF/blob/main/ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-Q2_K.gguf) | 2593.27 MB | | Q3_K_L | [ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-Q3_K_L.gguf](https://huggingface.co/featherless-ai-quants/ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-GGUF/blob/main/ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-Q3_K_L.gguf) | 3644.97 MB | | Q3_K_M | [ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-Q3_K_M.gguf](https://huggingface.co/featherless-ai-quants/ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-GGUF/blob/main/ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-Q3_K_M.gguf) | 3355.97 MB | | Q3_K_S | [ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-Q3_K_S.gguf](https://huggingface.co/featherless-ai-quants/ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-GGUF/blob/main/ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-Q3_K_S.gguf) | 3017.97 MB | | Q4_K_M | [ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-Q4_K_M.gguf](https://huggingface.co/featherless-ai-quants/ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-GGUF/blob/main/ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-Q4_K_M.gguf) | 4166.07 MB | | Q4_K_S | [ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-Q4_K_S.gguf](https://huggingface.co/featherless-ai-quants/ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-GGUF/blob/main/ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-Q4_K_S.gguf) | 3948.57 MB | | Q5_K_M | [ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-Q5_K_M.gguf](https://huggingface.co/featherless-ai-quants/ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-GGUF/blob/main/ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-Q5_K_M.gguf) | 4893.69 MB | | Q5_K_S | [ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-Q5_K_S.gguf](https://huggingface.co/featherless-ai-quants/ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-GGUF/blob/main/ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-Q5_K_S.gguf) | 4766.19 MB | | Q6_K | 
[ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-Q6_K.gguf](https://huggingface.co/featherless-ai-quants/ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-GGUF/blob/main/ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-Q6_K.gguf) | 5666.80 MB | | Q8_0 | [ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-Q8_0.gguf](https://huggingface.co/featherless-ai-quants/ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-GGUF/blob/main/ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-Q8_0.gguf) | 7339.34 MB | --- ## ⚡ Powered by [Featherless AI](https://featherless.ai) ### Key Features - 🔥 **Instant Hosting** - Deploy any Llama model on HuggingFace instantly - 🛠️ **Zero Infrastructure** - No server setup or maintenance required - 📚 **Vast Compatibility** - Support for 2400+ models and counting - 💎 **Affordable Pricing** - Starting at just $10/month --- **Links:** [Get Started](https://featherless.ai) | [Documentation](https://featherless.ai/docs) | [Models](https://featherless.ai/models)
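The table links each quant for browser download only; the same files can be fetched programmatically. A minimal sketch with `huggingface_hub` (repo ID and filename copied from the table above; the downstream llama.cpp invocation is left as a comment because it depends on your runtime):

```python
# Hedged sketch: fetch one GGUF quant from this repo, then point a
# llama.cpp-compatible runtime at the returned local path.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="featherless-ai-quants/ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-GGUF",
    filename="ArianAskari-SOLID-SFT-DPO-MixQV3-SOLIDRejected-SFTChosen-Zephyr-7b-beta-Q4_K_M.gguf",
)
print(path)  # e.g. pass to llama.cpp:  llama-cli -m <path> -p "..."
```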
tttx/problem12_model_aug_30
tttx
"2024-11-12T23:55:53Z"
0
0
peft
[ "peft", "safetensors", "llama", "alignment-handbook", "trl", "sft", "generated_from_trainer", "dataset:tttx/problem12_data", "base_model:barc0/Llama-3.1-ARC-Potpourri-Transduction-8B", "base_model:adapter:barc0/Llama-3.1-ARC-Potpourri-Transduction-8B", "license:llama3.1", "region:us" ]
null
"2024-11-12T23:15:31Z"
--- base_model: barc0/Llama-3.1-ARC-Potpourri-Transduction-8B datasets: - tttx/problem12_data library_name: peft license: llama3.1 tags: - alignment-handbook - trl - sft - generated_from_trainer model-index: - name: problem12_model_aug_30 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # problem12_model_aug_30 This model is a fine-tuned version of [barc0/Llama-3.1-ARC-Potpourri-Transduction-8B](https://huggingface.co/barc0/Llama-3.1-ARC-Potpourri-Transduction-8B) on the tttx/problem12_data dataset. It achieves the following results on the evaluation set: - Loss: nan ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - distributed_type: multi-GPU - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: cosine - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | 0.0 | 1.0 | 60 | nan | | 0.0 | 2.0 | 120 | nan | ### Framework versions - PEFT 0.13.2 - Transformers 4.47.0.dev0 - Pytorch 2.4.0+cu121 - Datasets 3.1.0 - Tokenizers 0.20.3
shropsdarcey84/lazurites
shropsdarcey84
"2024-11-12T23:15:40Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:15:39Z"
Entry not found
shropsdarcey84/suberitidae
shropsdarcey84
"2024-11-12T23:15:51Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:15:51Z"
Entry not found
barchetta/voto-131015
barchetta
"2024-11-12T23:21:54Z"
0
0
null
[ "safetensors", "llama", "region:us" ]
null
"2024-11-12T23:15:53Z"
Entry not found
barchetta/arte-131015
barchetta
"2024-11-12T23:27:55Z"
0
0
null
[ "safetensors", "llama", "region:us" ]
null
"2024-11-12T23:15:54Z"
Entry not found
OhaymakingO/1-12112045-02Haymak
OhaymakingO
"2024-11-12T23:19:57Z"
0
0
null
[ "safetensors", "llama", "region:us" ]
null
"2024-11-12T23:15:54Z"
Entry not found
shropsdarcey84/myriapodan
shropsdarcey84
"2024-11-12T23:16:03Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:16:03Z"
Entry not found
shropsdarcey84/cumaceous
shropsdarcey84
"2024-11-12T23:16:15Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:16:15Z"
Entry not found
shropsdarcey84/trouvre
shropsdarcey84
"2024-11-12T23:16:27Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:16:27Z"
Entry not found
shropsdarcey84/deligated
shropsdarcey84
"2024-11-12T23:16:39Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:16:38Z"
Entry not found
dexserbia/test49-W7
dexserbia
"2024-11-12T23:18:10Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:16:45Z"
Entry not found
shropsdarcey84/aphizog
shropsdarcey84
"2024-11-12T23:16:51Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:16:51Z"
Entry not found
shropsdarcey84/advocatess
shropsdarcey84
"2024-11-12T23:17:03Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:17:03Z"
Entry not found
nitic-nlp-team/webnavix-llama-base
nitic-nlp-team
"2024-11-13T00:08:04Z"
0
0
null
[ "safetensors", "llama", "license:apache-2.0", "region:us" ]
null
"2024-11-12T23:17:06Z"
--- license: apache-2.0 ---
shropsdarcey84/aristides
shropsdarcey84
"2024-11-12T23:17:15Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:17:14Z"
Entry not found
shropsdarcey84/normed
shropsdarcey84
"2024-11-12T23:17:27Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:17:27Z"
Entry not found
shropsdarcey84/victorians
shropsdarcey84
"2024-11-12T23:17:39Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:17:39Z"
Entry not found
shropsdarcey84/proving
shropsdarcey84
"2024-11-12T23:17:51Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:17:51Z"
Entry not found
nitic-nlp-team/webnavix-llama-ai-tools
nitic-nlp-team
"2024-11-13T00:13:27Z"
0
0
null
[ "safetensors", "llama", "license:apache-2.0", "region:us" ]
null
"2024-11-12T23:17:53Z"
--- license: apache-2.0 ---
shropsdarcey84/reconstitution
shropsdarcey84
"2024-11-12T23:18:03Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:18:02Z"
Entry not found
shropsdarcey84/imperialin
shropsdarcey84
"2024-11-12T23:18:16Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:18:15Z"
Entry not found
shropsdarcey84/enterocoelous
shropsdarcey84
"2024-11-12T23:18:27Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:18:27Z"
Entry not found
nitic-nlp-team/webnavix-llama-social-interaction
nitic-nlp-team
"2024-11-13T00:18:59Z"
0
0
null
[ "safetensors", "llama", "license:apache-2.0", "region:us" ]
null
"2024-11-12T23:18:38Z"
--- license: apache-2.0 ---
jaydapichon68/surculus
jaydapichon68
"2024-11-12T23:18:39Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:18:39Z"
Entry not found
featherless-ai-quants/Knobi3-EvoMerge1-GGUF
featherless-ai-quants
"2024-11-12T23:29:21Z"
0
0
null
[ "gguf", "text-generation", "base_model:Knobi3/EvoMerge1", "base_model:quantized:Knobi3/EvoMerge1", "region:us" ]
text-generation
"2024-11-12T23:18:47Z"
--- base_model: Knobi3/EvoMerge1 pipeline_tag: text-generation quantized_by: featherless-ai-quants --- # Knobi3/EvoMerge1 GGUF Quantizations 🚀 ![Featherless AI Quants](./featherless-quants.png) *Optimized GGUF quantization files for enhanced model performance* > Powered by [Featherless AI](https://featherless.ai) - run any model you'd like for a simple small fee. --- ## Available Quantizations 📊 | Quantization Type | File | Size | |-------------------|------|------| | IQ4_XS | [Knobi3-EvoMerge1-IQ4_XS.gguf](https://huggingface.co/featherless-ai-quants/Knobi3-EvoMerge1-GGUF/blob/main/Knobi3-EvoMerge1-IQ4_XS.gguf) | 3761.66 MB | | Q2_K | [Knobi3-EvoMerge1-Q2_K.gguf](https://huggingface.co/featherless-ai-quants/Knobi3-EvoMerge1-GGUF/blob/main/Knobi3-EvoMerge1-Q2_K.gguf) | 2593.27 MB | | Q3_K_L | [Knobi3-EvoMerge1-Q3_K_L.gguf](https://huggingface.co/featherless-ai-quants/Knobi3-EvoMerge1-GGUF/blob/main/Knobi3-EvoMerge1-Q3_K_L.gguf) | 3644.97 MB | | Q3_K_M | [Knobi3-EvoMerge1-Q3_K_M.gguf](https://huggingface.co/featherless-ai-quants/Knobi3-EvoMerge1-GGUF/blob/main/Knobi3-EvoMerge1-Q3_K_M.gguf) | 3355.97 MB | | Q3_K_S | [Knobi3-EvoMerge1-Q3_K_S.gguf](https://huggingface.co/featherless-ai-quants/Knobi3-EvoMerge1-GGUF/blob/main/Knobi3-EvoMerge1-Q3_K_S.gguf) | 3017.97 MB | | Q4_K_M | [Knobi3-EvoMerge1-Q4_K_M.gguf](https://huggingface.co/featherless-ai-quants/Knobi3-EvoMerge1-GGUF/blob/main/Knobi3-EvoMerge1-Q4_K_M.gguf) | 4166.07 MB | | Q4_K_S | [Knobi3-EvoMerge1-Q4_K_S.gguf](https://huggingface.co/featherless-ai-quants/Knobi3-EvoMerge1-GGUF/blob/main/Knobi3-EvoMerge1-Q4_K_S.gguf) | 3948.57 MB | | Q5_K_M | [Knobi3-EvoMerge1-Q5_K_M.gguf](https://huggingface.co/featherless-ai-quants/Knobi3-EvoMerge1-GGUF/blob/main/Knobi3-EvoMerge1-Q5_K_M.gguf) | 4893.69 MB | | Q5_K_S | [Knobi3-EvoMerge1-Q5_K_S.gguf](https://huggingface.co/featherless-ai-quants/Knobi3-EvoMerge1-GGUF/blob/main/Knobi3-EvoMerge1-Q5_K_S.gguf) | 4766.19 MB | | Q6_K | [Knobi3-EvoMerge1-Q6_K.gguf](https://huggingface.co/featherless-ai-quants/Knobi3-EvoMerge1-GGUF/blob/main/Knobi3-EvoMerge1-Q6_K.gguf) | 5666.80 MB | | Q8_0 | [Knobi3-EvoMerge1-Q8_0.gguf](https://huggingface.co/featherless-ai-quants/Knobi3-EvoMerge1-GGUF/blob/main/Knobi3-EvoMerge1-Q8_0.gguf) | 7339.34 MB | --- ## ⚡ Powered by [Featherless AI](https://featherless.ai) ### Key Features - 🔥 **Instant Hosting** - Deploy any Llama model on HuggingFace instantly - 🛠️ **Zero Infrastructure** - No server setup or maintenance required - 📚 **Vast Compatibility** - Support for 2400+ models and counting - 💎 **Affordable Pricing** - Starting at just $10/month --- **Links:** [Get Started](https://featherless.ai) | [Documentation](https://featherless.ai/docs) | [Models](https://featherless.ai/models)
jaydapichon68/riflebird
jaydapichon68
"2024-11-12T23:18:51Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:18:51Z"
Entry not found
jaydapichon68/chuffs
jaydapichon68
"2024-11-12T23:19:03Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:19:03Z"
Entry not found
nitic-nlp-team/webnavix-llama-summarizing
nitic-nlp-team
"2024-11-13T00:24:21Z"
0
0
null
[ "safetensors", "llama", "license:apache-2.0", "region:us" ]
null
"2024-11-12T23:19:09Z"
--- license: apache-2.0 ---
jaydapichon68/stupex
jaydapichon68
"2024-11-12T23:19:15Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:19:14Z"
Entry not found
jaydapichon68/ferulae
jaydapichon68
"2024-11-12T23:19:27Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:19:27Z"
Entry not found
nitic-nlp-team/webnavix-llama-information-lookup
nitic-nlp-team
"2024-11-13T00:28:32Z"
0
0
null
[ "safetensors", "llama", "license:apache-2.0", "region:us" ]
null
"2024-11-12T23:19:38Z"
--- license: apache-2.0 ---
jaydapichon68/lionization
jaydapichon68
"2024-11-12T23:19:39Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:19:39Z"
Entry not found
jaydapichon68/respectably
jaydapichon68
"2024-11-12T23:19:51Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:19:51Z"
Entry not found
nitic-nlp-team/webnavix-llama-composing
nitic-nlp-team
"2024-11-13T00:32:35Z"
0
0
null
[ "safetensors", "llama", "license:apache-2.0", "region:us" ]
null
"2024-11-12T23:20:00Z"
--- license: apache-2.0 ---
jaydapichon68/kea
jaydapichon68
"2024-11-12T23:20:03Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:20:03Z"
Entry not found
jaydapichon68/foreboard
jaydapichon68
"2024-11-12T23:20:15Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:20:15Z"
Entry not found
tensorblock/KoR-Orca-Platypus-13B-GGUF
tensorblock
"2024-11-13T00:06:39Z"
0
0
transformers
[ "transformers", "gguf", "TensorBlock", "GGUF", "text-generation", "ko", "dataset:kyujinpy/OpenOrca-KO", "dataset:kyujinpy/KOpen-platypus", "base_model:kyujinpy/KoR-Orca-Platypus-13B", "base_model:quantized:kyujinpy/KoR-Orca-Platypus-13B", "license:cc-by-nc-sa-4.0", "endpoints_compatible", "region:us" ]
text-generation
"2024-11-12T23:20:15Z"
--- language: - ko datasets: - kyujinpy/OpenOrca-KO - kyujinpy/KOpen-platypus library_name: transformers pipeline_tag: text-generation license: cc-by-nc-sa-4.0 base_model: kyujinpy/KoR-Orca-Platypus-13B tags: - TensorBlock - GGUF --- <div style="width: auto; margin-left: auto; margin-right: auto"> <img src="https://i.imgur.com/jC7kdl8.jpeg" alt="TensorBlock" style="width: 100%; min-width: 400px; display: block; margin: auto;"> </div> <div style="display: flex; justify-content: space-between; width: 100%;"> <div style="display: flex; flex-direction: column; align-items: flex-start;"> <p style="margin-top: 0.5em; margin-bottom: 0em;"> Feedback and support: TensorBlock's <a href="https://x.com/tensorblock_aoi">Twitter/X</a>, <a href="https://t.me/TensorBlock">Telegram Group</a> and <a href="https://x.com/tensorblock_aoi">Discord server</a> </p> </div> </div> ## kyujinpy/KoR-Orca-Platypus-13B - GGUF This repo contains GGUF format model files for [kyujinpy/KoR-Orca-Platypus-13B](https://huggingface.co/kyujinpy/KoR-Orca-Platypus-13B). The files were quantized using machines provided by [TensorBlock](https://tensorblock.co/), and they are compatible with llama.cpp as of [commit b4011](https://github.com/ggerganov/llama.cpp/commit/a6744e43e80f4be6398fc7733a01642c846dce1d). ## Prompt template ``` ``` ## Model file specification | Filename | Quant type | File Size | Description | | -------- | ---------- | --------- | ----------- | | [KoR-Orca-Platypus-13B-Q2_K.gguf](https://huggingface.co/tensorblock/KoR-Orca-Platypus-13B-GGUF/tree/main/KoR-Orca-Platypus-13B-Q2_K.gguf) | Q2_K | 4.521 GB | smallest, significant quality loss - not recommended for most purposes | | [KoR-Orca-Platypus-13B-Q3_K_S.gguf](https://huggingface.co/tensorblock/KoR-Orca-Platypus-13B-GGUF/tree/main/KoR-Orca-Platypus-13B-Q3_K_S.gguf) | Q3_K_S | 5.270 GB | very small, high quality loss | | [KoR-Orca-Platypus-13B-Q3_K_M.gguf](https://huggingface.co/tensorblock/KoR-Orca-Platypus-13B-GGUF/tree/main/KoR-Orca-Platypus-13B-Q3_K_M.gguf) | Q3_K_M | 5.903 GB | very small, high quality loss | | [KoR-Orca-Platypus-13B-Q3_K_L.gguf](https://huggingface.co/tensorblock/KoR-Orca-Platypus-13B-GGUF/tree/main/KoR-Orca-Platypus-13B-Q3_K_L.gguf) | Q3_K_L | 6.454 GB | small, substantial quality loss | | [KoR-Orca-Platypus-13B-Q4_0.gguf](https://huggingface.co/tensorblock/KoR-Orca-Platypus-13B-GGUF/tree/main/KoR-Orca-Platypus-13B-Q4_0.gguf) | Q4_0 | 6.860 GB | legacy; small, very high quality loss - prefer using Q3_K_M | | [KoR-Orca-Platypus-13B-Q4_K_S.gguf](https://huggingface.co/tensorblock/KoR-Orca-Platypus-13B-GGUF/tree/main/KoR-Orca-Platypus-13B-Q4_K_S.gguf) | Q4_K_S | 6.913 GB | small, greater quality loss | | [KoR-Orca-Platypus-13B-Q4_K_M.gguf](https://huggingface.co/tensorblock/KoR-Orca-Platypus-13B-GGUF/tree/main/KoR-Orca-Platypus-13B-Q4_K_M.gguf) | Q4_K_M | 7.326 GB | medium, balanced quality - recommended | | [KoR-Orca-Platypus-13B-Q5_0.gguf](https://huggingface.co/tensorblock/KoR-Orca-Platypus-13B-GGUF/tree/main/KoR-Orca-Platypus-13B-Q5_0.gguf) | Q5_0 | 8.356 GB | legacy; medium, balanced quality - prefer using Q4_K_M | | [KoR-Orca-Platypus-13B-Q5_K_S.gguf](https://huggingface.co/tensorblock/KoR-Orca-Platypus-13B-GGUF/tree/main/KoR-Orca-Platypus-13B-Q5_K_S.gguf) | Q5_K_S | 8.356 GB | large, low quality loss - recommended | | [KoR-Orca-Platypus-13B-Q5_K_M.gguf](https://huggingface.co/tensorblock/KoR-Orca-Platypus-13B-GGUF/tree/main/KoR-Orca-Platypus-13B-Q5_K_M.gguf) | Q5_K_M | 8.596 GB | large, very low quality loss - recommended | | 
[KoR-Orca-Platypus-13B-Q6_K.gguf](https://huggingface.co/tensorblock/KoR-Orca-Platypus-13B-GGUF/tree/main/KoR-Orca-Platypus-13B-Q6_K.gguf) | Q6_K | 9.946 GB | very large, extremely low quality loss | | [KoR-Orca-Platypus-13B-Q8_0.gguf](https://huggingface.co/tensorblock/KoR-Orca-Platypus-13B-GGUF/tree/main/KoR-Orca-Platypus-13B-Q8_0.gguf) | Q8_0 | 12.881 GB | very large, extremely low quality loss - not recommended | ## Downloading instructions ### Command line First, install the Hugging Face Hub client ```shell pip install -U "huggingface_hub[cli]" ``` Then, download an individual model file to a local directory ```shell huggingface-cli download tensorblock/KoR-Orca-Platypus-13B-GGUF --include "KoR-Orca-Platypus-13B-Q2_K.gguf" --local-dir MY_LOCAL_DIR ``` If you want to download multiple model files with a pattern (e.g., `*Q4_K*gguf`), you can try: ```shell huggingface-cli download tensorblock/KoR-Orca-Platypus-13B-GGUF --local-dir MY_LOCAL_DIR --local-dir-use-symlinks False --include='*Q4_K*gguf' ```
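After downloading with the CLI commands above, the GGUF file can also be run from Python. A minimal sketch using the third-party `llama-cpp-python` bindings, which are one common way to load llama.cpp-compatible quants; the bindings are an assumption here, not something the card prescribes, and since the card's prompt template section is empty, a bare prompt is used:

```python
# Hedged sketch: run the downloaded Q2_K quant with llama-cpp-python.
from llama_cpp import Llama

llm = Llama(model_path="MY_LOCAL_DIR/KoR-Orca-Platypus-13B-Q2_K.gguf")
out = llm("Hello, ", max_tokens=64)  # bare prompt; no template given upstream
print(out["choices"][0]["text"])
```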
nitic-nlp-team/webnavix-llama-booking
nitic-nlp-team
"2024-11-13T00:36:53Z"
0
0
null
[ "safetensors", "llama", "license:apache-2.0", "region:us" ]
null
"2024-11-12T23:20:23Z"
--- license: apache-2.0 ---
jaydapichon68/nonaccumulating
jaydapichon68
"2024-11-12T23:20:27Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:20:27Z"
Entry not found
vinningrev201/unsepulchred
vinningrev201
"2024-11-12T23:20:39Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:20:38Z"
Entry not found
nitic-nlp-team/webnavix-llama-shopping
nitic-nlp-team
"2024-11-13T00:40:37Z"
0
0
null
[ "safetensors", "llama", "license:apache-2.0", "region:us" ]
null
"2024-11-12T23:20:46Z"
--- license: apache-2.0 ---
vinningrev201/perite
vinningrev201
"2024-11-12T23:20:51Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:20:51Z"
Entry not found
baxtos/1-12112225-BTvmk
baxtos
"2024-11-12T23:24:59Z"
0
0
null
[ "safetensors", "llama", "region:us" ]
null
"2024-11-12T23:20:55Z"
Entry not found
vinningrev201/surbased
vinningrev201
"2024-11-12T23:21:03Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:21:03Z"
Entry not found
nitic-nlp-team/webnavix-llama-task-management
nitic-nlp-team
"2024-11-13T00:44:37Z"
0
0
null
[ "safetensors", "llama", "license:apache-2.0", "region:us" ]
null
"2024-11-12T23:21:09Z"
--- license: apache-2.0 ---
vinningrev201/dialectologer
vinningrev201
"2024-11-12T23:21:15Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:21:14Z"
Entry not found
gdshaji/gd-sn11-mistralai36k-v1
gdshaji
"2024-11-12T23:26:16Z"
0
0
transformers
[ "transformers", "safetensors", "mistral", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
"2024-11-12T23:21:16Z"
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
vinningrev201/brighteyes
vinningrev201
"2024-11-12T23:21:27Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:21:27Z"
Entry not found
nitic-nlp-team/webnavix-llama-shared
nitic-nlp-team
"2024-11-13T00:52:25Z"
0
0
null
[ "safetensors", "llama", "license:apache-2.0", "region:us" ]
null
"2024-11-12T23:21:38Z"
--- license: apache-2.0 ---
vinningrev201/patty
vinningrev201
"2024-11-12T23:21:39Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:21:39Z"
Entry not found
vinningrev201/antikenotoxin
vinningrev201
"2024-11-12T23:21:51Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:21:51Z"
Entry not found
nitic-nlp-team/webnavix-llama
nitic-nlp-team
"2024-11-12T23:21:52Z"
0
0
null
[ "license:apache-2.0", "region:us" ]
null
"2024-11-12T23:21:52Z"
--- license: apache-2.0 ---
vinningrev201/cymbalom
vinningrev201
"2024-11-12T23:22:03Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:22:03Z"
Entry not found
vinningrev201/bondsman
vinningrev201
"2024-11-12T23:22:15Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:22:15Z"
Entry not found
pypert/tonier
pypert
"2024-11-12T23:22:27Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:22:27Z"
Entry not found
pypert/wagoness
pypert
"2024-11-12T23:22:39Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:22:39Z"
Entry not found
aicmpt/SN21_DEC_204266
aicmpt
"2024-11-12T23:30:18Z"
0
0
null
[ "any-to-any", "omega", "omegalabs", "bittensor", "agi", "license:mit", "region:us" ]
any-to-any
"2024-11-12T23:22:47Z"
--- license: mit tags: - any-to-any - omega - omegalabs - bittensor - agi --- This is an Any-to-Any model checkpoint for the OMEGA Labs x Bittensor Any-to-Any subnet. Check out the [git repo](https://github.com/omegalabsinc/omegalabs-anytoany-bittensor) and find OMEGA on X: [@omegalabsai](https://x.com/omegalabsai).
pypert/sphragistic
pypert
"2024-11-12T23:22:51Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:22:51Z"
Entry not found
Trickshotblaster/wuerstchen-anime-model-full
Trickshotblaster
"2024-11-12T23:23:02Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:23:02Z"
Entry not found
pypert/smitch
pypert
"2024-11-12T23:23:03Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:23:03Z"
Entry not found
pypert/transcriptively
pypert
"2024-11-12T23:23:15Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:23:15Z"
Entry not found
pypert/dishevelling
pypert
"2024-11-12T23:23:28Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:23:27Z"
Entry not found
pypert/quassative
pypert
"2024-11-12T23:23:40Z"
0
0
null
[ "region:us" ]
null
"2024-11-12T23:23:39Z"
Entry not found