modelId | author | last_modified | downloads | likes | library_name | tags | pipeline_tag | createdAt | card |
---|---|---|---|---|---|---|---|---|---|
vinningrev201/digging | vinningrev201 | "2024-11-12T22:08:39Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-11-12T22:08:39Z" | Entry not found |
vinningrev201/cambia | vinningrev201 | "2024-11-12T22:08:51Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-11-12T22:08:51Z" | Entry not found |
vinningrev201/horror | vinningrev201 | "2024-11-12T22:09:04Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-11-12T22:09:03Z" | Entry not found |
vinningrev201/bloops | vinningrev201 | "2024-11-12T22:09:15Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-11-12T22:09:14Z" | Entry not found |
vinningrev201/paroccipital | vinningrev201 | "2024-11-12T22:09:27Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-11-12T22:09:27Z" | Entry not found |
vinningrev201/aglow | vinningrev201 | "2024-11-12T22:09:39Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-11-12T22:09:39Z" | Entry not found |
vinningrev201/countrifiedness | vinningrev201 | "2024-11-12T22:09:51Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-11-12T22:09:51Z" | Entry not found |
vinningrev201/overdesirousness | vinningrev201 | "2024-11-12T22:10:03Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-11-12T22:10:02Z" | Entry not found |
outlookAi/Kgd8J3emBU | outlookAi | "2024-11-12T22:42:32Z" | 0 | 0 | diffusers | [
"diffusers",
"flux",
"lora",
"replicate",
"text-to-image",
"en",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"license:other",
"region:us"
] | text-to-image | "2024-11-12T22:10:05Z" | ---
license: other
license_name: flux-1-dev-non-commercial-license
license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md
language:
- en
tags:
- flux
- diffusers
- lora
- replicate
base_model: "black-forest-labs/FLUX.1-dev"
pipeline_tag: text-to-image
# widget:
# - text: >-
# prompt
# output:
# url: https://...
instance_prompt: Nam
---
# Kgd8J3emBU
<Gallery />
Trained on Replicate using:
https://replicate.com/ostris/flux-dev-lora-trainer/train
## Trigger words
You should use `Nam` to trigger the image generation.
## Use it with the [🧨 diffusers library](https://github.com/huggingface/diffusers)
```py
from diffusers import AutoPipelineForText2Image
import torch

# Load the FLUX.1-dev base pipeline in half precision on the GPU
pipeline = AutoPipelineForText2Image.from_pretrained('black-forest-labs/FLUX.1-dev', torch_dtype=torch.float16).to('cuda')
# Attach this LoRA adapter to the pipeline
pipeline.load_lora_weights('outlookAi/Kgd8J3emBU', weight_name='lora.safetensors')
# Include the trigger word `Nam` in your prompt (see "Trigger words" above)
image = pipeline('your prompt').images[0]
```
For more details, including weighting, merging and fusing LoRAs, check the [documentation on loading LoRAs in diffusers](https://huggingface.co/docs/diffusers/main/en/using-diffusers/loading_adapters)
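For example, a minimal sketch (not from the original card) of fusing the loaded LoRA into the base weights; the `lora_scale` value and prompt are illustrative, not recommendations from this card:
```py
# Hedged sketch: fold the LoRA into the base weights at a chosen scale,
# so subsequent calls no longer need the adapter loaded separately.
pipeline.fuse_lora(lora_scale=0.8)  # 0.8 is an illustrative scale
image = pipeline('Nam, portrait photo, soft light').images[0]
```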
|
vinningrev201/pasticheur | vinningrev201 | "2024-11-12T22:10:15Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-11-12T22:10:15Z" | Entry not found |
pypert/metalbumin | pypert | "2024-11-12T22:10:27Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-11-12T22:10:27Z" | Entry not found |
jaspionjader/Darkens-8B-Q5_K_M-GGUF | jaspionjader | "2024-11-12T22:11:05Z" | 0 | 0 | null | [
"gguf",
"chat",
"llama-cpp",
"gguf-my-repo",
"en",
"dataset:anthracite-org/c2_logs_16k_llama_v1.1",
"dataset:anthracite-org/kalo-opus-instruct-22k-no-refusal",
"dataset:Epiculous/SynthRP-Gens-v1.1-Filtered-n-Cleaned",
"dataset:lodrick-the-lafted/kalo-opus-instruct-3k-filtered",
"dataset:anthracite-org/nopm_claude_writing_fixed",
"dataset:Epiculous/Synthstruct-Gens-v1.1-Filtered-n-Cleaned",
"dataset:anthracite-org/kalo_opus_misc_240827",
"dataset:anthracite-org/kalo_misc_part2",
"base_model:Delta-Vector/Darkens-8B",
"base_model:quantized:Delta-Vector/Darkens-8B",
"license:agpl-3.0",
"model-index",
"region:us"
] | null | "2024-11-12T22:10:34Z" | ---
language:
- en
license: agpl-3.0
tags:
- chat
- llama-cpp
- gguf-my-repo
base_model: Delta-Vector/Darkens-8B
datasets:
- anthracite-org/c2_logs_16k_llama_v1.1
- anthracite-org/kalo-opus-instruct-22k-no-refusal
- Epiculous/SynthRP-Gens-v1.1-Filtered-n-Cleaned
- lodrick-the-lafted/kalo-opus-instruct-3k-filtered
- anthracite-org/nopm_claude_writing_fixed
- Epiculous/Synthstruct-Gens-v1.1-Filtered-n-Cleaned
- anthracite-org/kalo_opus_misc_240827
- anthracite-org/kalo_misc_part2
pipeline_tag: text-generation
# Darkens-8B was itself built on nvidia/Mistral-NeMo-Minitron-8B-Base
model-index:
- name: Darkens-8B
results:
- task:
type: text-generation
name: Text Generation
dataset:
name: IFEval (0-Shot)
type: HuggingFaceH4/ifeval
args:
num_few_shot: 0
metrics:
- type: inst_level_strict_acc and prompt_level_strict_acc
value: 25.48
name: strict accuracy
source:
url: https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard?query=Delta-Vector/Darkens-8B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: BBH (3-Shot)
type: BBH
args:
num_few_shot: 3
metrics:
- type: acc_norm
value: 32.88
name: normalized accuracy
source:
url: https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard?query=Delta-Vector/Darkens-8B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: MATH Lvl 5 (4-Shot)
type: hendrycks/competition_math
args:
num_few_shot: 4
metrics:
- type: exact_match
value: 5.06
name: exact match
source:
url: https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard?query=Delta-Vector/Darkens-8B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: GPQA (0-shot)
type: Idavidrein/gpqa
args:
num_few_shot: 0
metrics:
- type: acc_norm
value: 9.96
name: acc_norm
source:
url: https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard?query=Delta-Vector/Darkens-8B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: MuSR (0-shot)
type: TAUR-Lab/MuSR
args:
num_few_shot: 0
metrics:
- type: acc_norm
value: 9.02
name: acc_norm
source:
url: https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard?query=Delta-Vector/Darkens-8B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: MMLU-PRO (5-shot)
type: TIGER-Lab/MMLU-Pro
config: main
split: test
args:
num_few_shot: 5
metrics:
- type: acc
value: 30.4
name: accuracy
source:
url: https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard?query=Delta-Vector/Darkens-8B
name: Open LLM Leaderboard
---
# jaspionjader/Darkens-8B-Q5_K_M-GGUF
This model was converted to GGUF format from [`Delta-Vector/Darkens-8B`](https://huggingface.co/Delta-Vector/Darkens-8B) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/Delta-Vector/Darkens-8B) for more details on the model.
## Use with llama.cpp
Install llama.cpp through brew (works on Mac and Linux)
```bash
brew install llama.cpp
```
Invoke the llama.cpp server or the CLI.
### CLI:
```bash
llama-cli --hf-repo jaspionjader/Darkens-8B-Q5_K_M-GGUF --hf-file darkens-8b-q5_k_m-imat.gguf -p "The meaning to life and the universe is"
```
### Server:
```bash
llama-server --hf-repo jaspionjader/Darkens-8B-Q5_K_M-GGUF --hf-file darkens-8b-q5_k_m-imat.gguf -c 2048
```
Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the llama.cpp repo.
Step 1: Clone llama.cpp from GitHub.
```
git clone https://github.com/ggerganov/llama.cpp
```
Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag along with other hardware-specific flags (e.g., `LLAMA_CUDA=1` for NVIDIA GPUs on Linux).
```
cd llama.cpp && LLAMA_CURL=1 make
```
Step 3: Run inference through the main binary.
```
./llama-cli --hf-repo jaspionjader/Darkens-8B-Q5_K_M-GGUF --hf-file darkens-8b-q5_k_m-imat.gguf -p "The meaning to life and the universe is"
```
or
```
./llama-server --hf-repo jaspionjader/Darkens-8B-Q5_K_M-GGUF --hf-file darkens-8b-q5_k_m-imat.gguf -c 2048
```
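Alternatively, a minimal Python sketch via the llama-cpp-python bindings (an assumption on top of this card: requires `pip install llama-cpp-python`; the context size mirrors the server example above):
```python
from llama_cpp import Llama

# Hedged sketch: fetch the GGUF from the Hub and run a short completion
llm = Llama.from_pretrained(
    repo_id="jaspionjader/Darkens-8B-Q5_K_M-GGUF",
    filename="darkens-8b-q5_k_m-imat.gguf",
    n_ctx=2048,
)
out = llm("The meaning to life and the universe is", max_tokens=32)
print(out["choices"][0]["text"])
```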
|
pypert/furriery | pypert | "2024-11-12T22:10:40Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-11-12T22:10:39Z" | Entry not found |
arisridwan/reactor_b2 | arisridwan | "2024-11-12T22:14:22Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | "2024-11-12T22:10:46Z" | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
pypert/pleurostigma | pypert | "2024-11-12T22:10:51Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-11-12T22:10:51Z" | Entry not found |
pypert/structuralist | pypert | "2024-11-12T22:11:03Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-11-12T22:11:03Z" | Entry not found |
Babaevskiy/main | Babaevskiy | "2024-11-12T22:11:04Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-11-12T22:11:04Z" | Entry not found |
RichardErkhov/netcat420_-_MFANNv0.17.12-gguf | RichardErkhov | "2024-11-13T01:29:20Z" | 0 | 0 | null | [
"gguf",
"region:us"
] | null | "2024-11-12T22:11:11Z" | Quantization made by Richard Erkhov.
[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)
MFANNv0.17.12 - GGUF
- Model creator: https://huggingface.co/netcat420/
- Original model: https://huggingface.co/netcat420/MFANNv0.17.12/
| Name | Quant method | Size |
| ---- | ---- | ---- |
| [MFANNv0.17.12.Q2_K.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q2_K.gguf) | Q2_K | 0.75GB |
| [MFANNv0.17.12.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q3_K_S.gguf) | Q3_K_S | 3.41GB |
| [MFANNv0.17.12.Q3_K.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q3_K.gguf) | Q3_K | 3.74GB |
| [MFANNv0.17.12.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q3_K_M.gguf) | Q3_K_M | 3.74GB |
| [MFANNv0.17.12.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q3_K_L.gguf) | Q3_K_L | 4.03GB |
| [MFANNv0.17.12.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.IQ4_XS.gguf) | IQ4_XS | 4.18GB |
| [MFANNv0.17.12.Q4_0.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q4_0.gguf) | Q4_0 | 4.34GB |
| [MFANNv0.17.12.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.IQ4_NL.gguf) | IQ4_NL | 4.38GB |
| [MFANNv0.17.12.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q4_K_S.gguf) | Q4_K_S | 4.37GB |
| [MFANNv0.17.12.Q4_K.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q4_K.gguf) | Q4_K | 4.58GB |
| [MFANNv0.17.12.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q4_K_M.gguf) | Q4_K_M | 4.58GB |
| [MFANNv0.17.12.Q4_1.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q4_1.gguf) | Q4_1 | 4.78GB |
| [MFANNv0.17.12.Q5_0.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q5_0.gguf) | Q5_0 | 5.21GB |
| [MFANNv0.17.12.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q5_K_S.gguf) | Q5_K_S | 5.21GB |
| [MFANNv0.17.12.Q5_K.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q5_K.gguf) | Q5_K | 5.34GB |
| [MFANNv0.17.12.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q5_K_M.gguf) | Q5_K_M | 5.34GB |
| [MFANNv0.17.12.Q5_1.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q5_1.gguf) | Q5_1 | 5.65GB |
| [MFANNv0.17.12.Q6_K.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q6_K.gguf) | Q6_K | 6.14GB |
| [MFANNv0.17.12.Q8_0.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q8_0.gguf) | Q8_0 | 7.95GB |
Original model description:
---
base_model:
- netcat420/MFANNv0.17
- netcat420/MFANNv0.17.11
- MaziyarPanahi/Llama-3-8B-Instruct-v0.4
library_name: transformers
tags:
- mergekit
- merge
---
# MFANNv0.17.12
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [MaziyarPanahi/Llama-3-8B-Instruct-v0.4](https://huggingface.co/MaziyarPanahi/Llama-3-8B-Instruct-v0.4) as the base.
### Models Merged
The following models were included in the merge:
* [netcat420/MFANNv0.17](https://huggingface.co/netcat420/MFANNv0.17)
* [netcat420/MFANNv0.17.11](https://huggingface.co/netcat420/MFANNv0.17.11)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
- model: netcat420/MFANNv0.17.11
parameters:
density: [1, 0.7, 0.1] # density gradient
weight: 1.0
- model: netcat420/MFANNv0.17
parameters:
density: [1, 0.7, 0.1] # density gradient
weight: 1.0
merge_method: ties
base_model: MaziyarPanahi/Llama-3-8B-Instruct-v0.4
parameters:
normalize: true
int8_mask: true
dtype: float16
```
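As a hedged sketch (not part of the original card), the same YAML can be run through mergekit's Python API, assuming the `MergeConfiguration`/`run_merge` interface shown in the mergekit README; the config path and output directory are hypothetical:
```python
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the TIES configuration shown above (hypothetical file name)
with open("merge-config.yaml", "r", encoding="utf-8") as fp:
    config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Run the merge on GPU and write the merged model to a local directory
run_merge(config, out_path="./merged-model", options=MergeOptions(cuda=True))
```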
|
pypert/flytime | pypert | "2024-11-12T22:11:15Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-11-12T22:11:15Z" | Entry not found |
pypert/sequa | pypert | "2024-11-12T22:11:27Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-11-12T22:11:27Z" | Entry not found |
pypert/novelless | pypert | "2024-11-12T22:11:39Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-11-12T22:11:39Z" | Entry not found |
pypert/pleurodynic | pypert | "2024-11-12T22:11:51Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-11-12T22:11:51Z" | Entry not found |
pypert/littering | pypert | "2024-11-12T22:12:03Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-11-12T22:12:03Z" | Entry not found |
pypert/immortalizing | pypert | "2024-11-12T22:12:16Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-11-12T22:12:15Z" | Entry not found |
muhammadrisma/trashnet | muhammadrisma | "2024-11-12T22:12:27Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-11-12T22:12:27Z" | Entry not found |
tensorblock/Hercules-2.5-Mistral-7B-GGUF | tensorblock | "2024-11-12T23:02:51Z" | 0 | 0 | transformers | [
"transformers",
"gguf",
"not-for-all-audiences",
"chemistry",
"math",
"code",
"physics",
"TensorBlock",
"GGUF",
"dataset:Locutusque/hercules-v2.0",
"dataset:Locutusque/hercules-v2.5",
"base_model:Locutusque/Hercules-2.5-Mistral-7B",
"base_model:quantized:Locutusque/Hercules-2.5-Mistral-7B",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | null | "2024-11-12T22:13:42Z" | ---
license: apache-2.0
library_name: transformers
tags:
- not-for-all-audiences
- chemistry
- math
- code
- physics
- TensorBlock
- GGUF
base_model: Locutusque/Hercules-2.5-Mistral-7B
datasets:
- Locutusque/hercules-v2.0
- Locutusque/hercules-v2.5
model-index:
- name: Hercules-2.5-Mistral-7B
results:
- task:
type: text-generation
name: Text Generation
dataset:
name: AI2 Reasoning Challenge (25-Shot)
type: ai2_arc
config: ARC-Challenge
split: test
args:
num_few_shot: 25
metrics:
- type: acc_norm
value: 62.03
name: normalized accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Locutusque/Hercules-2.5-Mistral-7B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: HellaSwag (10-Shot)
type: hellaswag
split: validation
args:
num_few_shot: 10
metrics:
- type: acc_norm
value: 83.79
name: normalized accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Locutusque/Hercules-2.5-Mistral-7B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: MMLU (5-Shot)
type: cais/mmlu
config: all
split: test
args:
num_few_shot: 5
metrics:
- type: acc
value: 63.49
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Locutusque/Hercules-2.5-Mistral-7B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: TruthfulQA (0-shot)
type: truthful_qa
config: multiple_choice
split: validation
args:
num_few_shot: 0
metrics:
- type: mc2
value: 43.44
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Locutusque/Hercules-2.5-Mistral-7B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: Winogrande (5-shot)
type: winogrande
config: winogrande_xl
split: validation
args:
num_few_shot: 5
metrics:
- type: acc
value: 79.72
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Locutusque/Hercules-2.5-Mistral-7B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: GSM8k (5-shot)
type: gsm8k
config: main
split: test
args:
num_few_shot: 5
metrics:
- type: acc
value: 49.05
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Locutusque/Hercules-2.5-Mistral-7B
name: Open LLM Leaderboard
---
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/jC7kdl8.jpeg" alt="TensorBlock" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;">
Feedback and support: TensorBlock's <a href="https://x.com/tensorblock_aoi">Twitter/X</a>, <a href="https://t.me/TensorBlock">Telegram Group</a> and <a href="https://x.com/tensorblock_aoi">Discord server</a>
</p>
</div>
</div>
## Locutusque/Hercules-2.5-Mistral-7B - GGUF
This repo contains GGUF format model files for [Locutusque/Hercules-2.5-Mistral-7B](https://huggingface.co/Locutusque/Hercules-2.5-Mistral-7B).
The files were quantized using machines provided by [TensorBlock](https://tensorblock.co/), and they are compatible with llama.cpp as of [commit b4011](https://github.com/ggerganov/llama.cpp/commit/a6744e43e80f4be6398fc7733a01642c846dce1d).
## Prompt template
No prompt template is specified for this model.
## Model file specification
| Filename | Quant type | File Size | Description |
| -------- | ---------- | --------- | ----------- |
| [Hercules-2.5-Mistral-7B-Q2_K.gguf](https://huggingface.co/tensorblock/Hercules-2.5-Mistral-7B-GGUF/tree/main/Hercules-2.5-Mistral-7B-Q2_K.gguf) | Q2_K | 2.532 GB | smallest, significant quality loss - not recommended for most purposes |
| [Hercules-2.5-Mistral-7B-Q3_K_S.gguf](https://huggingface.co/tensorblock/Hercules-2.5-Mistral-7B-GGUF/tree/main/Hercules-2.5-Mistral-7B-Q3_K_S.gguf) | Q3_K_S | 2.947 GB | very small, high quality loss |
| [Hercules-2.5-Mistral-7B-Q3_K_M.gguf](https://huggingface.co/tensorblock/Hercules-2.5-Mistral-7B-GGUF/tree/main/Hercules-2.5-Mistral-7B-Q3_K_M.gguf) | Q3_K_M | 3.277 GB | very small, high quality loss |
| [Hercules-2.5-Mistral-7B-Q3_K_L.gguf](https://huggingface.co/tensorblock/Hercules-2.5-Mistral-7B-GGUF/tree/main/Hercules-2.5-Mistral-7B-Q3_K_L.gguf) | Q3_K_L | 3.560 GB | small, substantial quality loss |
| [Hercules-2.5-Mistral-7B-Q4_0.gguf](https://huggingface.co/tensorblock/Hercules-2.5-Mistral-7B-GGUF/tree/main/Hercules-2.5-Mistral-7B-Q4_0.gguf) | Q4_0 | 3.827 GB | legacy; small, very high quality loss - prefer using Q3_K_M |
| [Hercules-2.5-Mistral-7B-Q4_K_S.gguf](https://huggingface.co/tensorblock/Hercules-2.5-Mistral-7B-GGUF/tree/main/Hercules-2.5-Mistral-7B-Q4_K_S.gguf) | Q4_K_S | 3.856 GB | small, greater quality loss |
| [Hercules-2.5-Mistral-7B-Q4_K_M.gguf](https://huggingface.co/tensorblock/Hercules-2.5-Mistral-7B-GGUF/tree/main/Hercules-2.5-Mistral-7B-Q4_K_M.gguf) | Q4_K_M | 4.068 GB | medium, balanced quality - recommended |
| [Hercules-2.5-Mistral-7B-Q5_0.gguf](https://huggingface.co/tensorblock/Hercules-2.5-Mistral-7B-GGUF/tree/main/Hercules-2.5-Mistral-7B-Q5_0.gguf) | Q5_0 | 4.654 GB | legacy; medium, balanced quality - prefer using Q4_K_M |
| [Hercules-2.5-Mistral-7B-Q5_K_S.gguf](https://huggingface.co/tensorblock/Hercules-2.5-Mistral-7B-GGUF/tree/main/Hercules-2.5-Mistral-7B-Q5_K_S.gguf) | Q5_K_S | 4.654 GB | large, low quality loss - recommended |
| [Hercules-2.5-Mistral-7B-Q5_K_M.gguf](https://huggingface.co/tensorblock/Hercules-2.5-Mistral-7B-GGUF/tree/main/Hercules-2.5-Mistral-7B-Q5_K_M.gguf) | Q5_K_M | 4.779 GB | large, very low quality loss - recommended |
| [Hercules-2.5-Mistral-7B-Q6_K.gguf](https://huggingface.co/tensorblock/Hercules-2.5-Mistral-7B-GGUF/tree/main/Hercules-2.5-Mistral-7B-Q6_K.gguf) | Q6_K | 5.534 GB | very large, extremely low quality loss |
| [Hercules-2.5-Mistral-7B-Q8_0.gguf](https://huggingface.co/tensorblock/Hercules-2.5-Mistral-7B-GGUF/tree/main/Hercules-2.5-Mistral-7B-Q8_0.gguf) | Q8_0 | 7.167 GB | very large, extremely low quality loss - not recommended |
## Downloading instruction
### Command line
First, install the Hugging Face Hub CLI:
```shell
pip install -U "huggingface_hub[cli]"
```
Then, download an individual model file to a local directory:
```shell
huggingface-cli download tensorblock/Hercules-2.5-Mistral-7B-GGUF --include "Hercules-2.5-Mistral-7B-Q2_K.gguf" --local-dir MY_LOCAL_DIR
```
To download multiple model files matching a pattern (e.g., `*Q4_K*gguf`), you can try:
```shell
huggingface-cli download tensorblock/Hercules-2.5-Mistral-7B-GGUF --local-dir MY_LOCAL_DIR --local-dir-use-symlinks False --include='*Q4_K*gguf'
```
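Equivalently, a minimal Python sketch using the `huggingface_hub` download API (repo, file, and directory names mirror the commands above):
```python
from huggingface_hub import hf_hub_download

# Fetch one quantized file into a local directory and print its path
path = hf_hub_download(
    repo_id="tensorblock/Hercules-2.5-Mistral-7B-GGUF",
    filename="Hercules-2.5-Mistral-7B-Q2_K.gguf",
    local_dir="MY_LOCAL_DIR",
)
print(path)
```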
|
sanchezmari/results | sanchezmari | "2024-11-12T23:00:58Z" | 0 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-11-12T22:15:01Z" | ---
library_name: transformers
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: results
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# results
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1100
- Accuracy: 0.9808

| Class | Precision | Recall | F1-score | Support |
|:------------:|:---------:|:------:|:--------:|:-------:|
| 0 | 0.9500 | 0.9268 | 0.9383 | 41 |
| 1 | 0.9451 | 0.9697 | 0.9573 | 231 |
| 2 | 0.9873 | 0.9879 | 0.9876 | 1648 |
| 3 | 0.9780 | 0.9622 | 0.9700 | 370 |
| Macro avg | 0.9651 | 0.9616 | 0.9633 | 2290 |
| Weighted avg | 0.9809 | 0.9808 | 0.9808 | 2290 |
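A minimal inference sketch (not part of the auto-generated card; the meaning of labels 0-3 is undocumented here, so outputs are raw label ids):
```python
from transformers import pipeline

# Hedged sketch: run the fine-tuned DistilBERT classifier
clf = pipeline("text-classification", model="sanchezmari/results")
print(clf("Example input sentence"))  # e.g. [{'label': 'LABEL_2', 'score': 0.99}]
```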
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | 0 | 1 | 2 | 3 | Accuracy | Macro avg | Weighted avg |
|:-------------:|:-----:|:----:|:---------------:|:---------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------------:|:--------:|:------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|
| 0.2357 | 1.0 | 1145 | 0.2453 | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 0.0} | {'precision': 0.9367088607594937, 'recall': 0.8951612903225806, 'f1-score': 0.9154639175257732, 'support': 248.0} | {'precision': 0.9842328684050939, 'recall': 0.9603550295857988, 'f1-score': 0.97214734950584, 'support': 1690.0} | {'precision': 0.9340659340659341, 'recall': 0.9659090909090909, 'f1-score': 0.9497206703910615, 'support': 352.0} | 0.9541 | {'precision': 0.7137519158076304, 'recall': 0.7053563527043676, 'f1-score': 0.7093329843556686, 'support': 2290.0} | {'precision': 0.9713749143511669, 'recall': 0.9541484716157205, 'f1-score': 0.9625614620912293, 'support': 2290.0} |
| 0.1043 | 2.0 | 2290 | 0.1176 | {'precision': 0.925, 'recall': 0.9736842105263158, 'f1-score': 0.9487179487179487, 'support': 38.0} | {'precision': 0.9409282700421941, 'recall': 0.9780701754385965, 'f1-score': 0.9591397849462365, 'support': 228.0} | {'precision': 0.9939357186173439, 'recall': 0.9761762954139369, 'f1-score': 0.9849759615384616, 'support': 1679.0} | {'precision': 0.9285714285714286, 'recall': 0.9797101449275363, 'f1-score': 0.9534555712270804, 'support': 345.0} | 0.9769 | {'precision': 0.9471088543077417, 'recall': 0.9769102065765964, 'f1-score': 0.9615723166074318, 'support': 2290.0} | {'precision': 0.9776667510852766, 'recall': 0.9768558951965065, 'f1-score': 0.9770532596137309, 'support': 2290.0} |
| 0.0506 | 3.0 | 3435 | 0.1100 | {'precision': 0.95, 'recall': 0.926829268292683, 'f1-score': 0.9382716049382716, 'support': 41.0} | {'precision': 0.9451476793248945, 'recall': 0.9696969696969697, 'f1-score': 0.9572649572649573, 'support': 231.0} | {'precision': 0.9872650090964221, 'recall': 0.9878640776699029, 'f1-score': 0.9875644525326054, 'support': 1648.0} | {'precision': 0.978021978021978, 'recall': 0.9621621621621622, 'f1-score': 0.9700272479564033, 'support': 370.0} | 0.9808 | {'precision': 0.9651086666108237, 'recall': 0.9616381194554294, 'f1-score': 0.9632820656730594, 'support': 2290.0} | {'precision': 0.9808558868048411, 'recall': 0.9807860262008734, 'f1-score': 0.9807919827285054, 'support': 2290.0} |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.0+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
touhidulislam/BERTweet_retrain_2020_11 | touhidulislam | "2024-11-12T22:17:11Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"fill-mask",
"generated_from_trainer",
"base_model:vinai/bertweet-base",
"base_model:finetune:vinai/bertweet-base",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | "2024-11-12T22:16:44Z" | ---
library_name: transformers
license: mit
base_model: vinai/bertweet-base
tags:
- generated_from_trainer
model-index:
- name: BERTweet_retrain_2020_11
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# BERTweet_retrain_2020_11
This model is a fine-tuned version of [vinai/bertweet-base](https://huggingface.co/vinai/bertweet-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.5494
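A minimal fill-mask sketch (not part of the auto-generated card; BERTweet's tokenizer uses `<mask>` as its mask token):
```python
from transformers import pipeline

# Hedged sketch: masked-token prediction with the retrained checkpoint
fill = pipeline("fill-mask", model="touhidulislam/BERTweet_retrain_2020_11")
print(fill("The weather today is <mask>"))
```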
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 2.6682 | 1.0 | 3066 | 2.6283 |
| 2.4263 | 2.0 | 6132 | 2.5696 |
| 2.7536 | 3.0 | 9198 | 2.5387 |
### Framework versions
- Transformers 4.45.1
- Pytorch 2.1.0+cu121
- Datasets 3.0.1
- Tokenizers 0.20.0
|
Pearush/phimoe_15_head16 | Pearush | "2024-11-12T22:23:49Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"phimoe",
"text-generation",
"conversational",
"custom_code",
"arxiv:1910.09700",
"autotrain_compatible",
"region:us"
] | text-generation | "2024-11-12T22:17:03Z" | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
mpetrascom/vernos-lora | mpetrascom | "2024-11-12T23:18:09Z" | 0 | 0 | null | [
"license:other",
"region:us"
] | null | "2024-11-12T22:19:27Z" | ---
license: other
license_name: flux-1-dev-non-commercial-license
license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md
--- |
barchetta/sana-130919 | barchetta | "2024-11-12T22:19:31Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-11-12T22:19:31Z" | Entry not found |
barchetta/paga-130919 | barchetta | "2024-11-12T22:19:32Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-11-12T22:19:32Z" | Entry not found |
featherless-ai-quants/WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-GGUF | featherless-ai-quants | "2024-11-12T22:32:57Z" | 0 | 0 | null | [
"gguf",
"text-generation",
"base_model:WhiteRabbitNeo/Llama-3-WhiteRabbitNeo-8B-v2.0",
"base_model:quantized:WhiteRabbitNeo/Llama-3-WhiteRabbitNeo-8B-v2.0",
"region:us"
] | text-generation | "2024-11-12T22:21:05Z" | ---
base_model: WhiteRabbitNeo/Llama-3-WhiteRabbitNeo-8B-v2.0
pipeline_tag: text-generation
quantized_by: featherless-ai-quants
---
# WhiteRabbitNeo/Llama-3-WhiteRabbitNeo-8B-v2.0 GGUF Quantizations 🚀
![Featherless AI Quants](./featherless-quants.png)
*Optimized GGUF quantization files for enhanced model performance*
> Powered by [Featherless AI](https://featherless.ai) - run any model you'd like for a simple, small fee.
---
## Available Quantizations 📊
| Quantization Type | File | Size |
|-------------------|------|------|
| IQ4_XS | [WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-IQ4_XS.gguf](https://huggingface.co/featherless-ai-quants/WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-GGUF/blob/main/WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-IQ4_XS.gguf) | 4276.62 MB |
| Q2_K | [WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-Q2_K.gguf](https://huggingface.co/featherless-ai-quants/WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-GGUF/blob/main/WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-Q2_K.gguf) | 3031.86 MB |
| Q3_K_L | [WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-Q3_K_L.gguf](https://huggingface.co/featherless-ai-quants/WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-GGUF/blob/main/WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-Q3_K_L.gguf) | 4121.74 MB |
| Q3_K_M | [WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-Q3_K_M.gguf](https://huggingface.co/featherless-ai-quants/WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-GGUF/blob/main/WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-Q3_K_M.gguf) | 3832.74 MB |
| Q3_K_S | [WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-Q3_K_S.gguf](https://huggingface.co/featherless-ai-quants/WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-GGUF/blob/main/WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-Q3_K_S.gguf) | 3494.74 MB |
| Q4_K_M | [WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-Q4_K_M.gguf](https://huggingface.co/featherless-ai-quants/WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-GGUF/blob/main/WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-Q4_K_M.gguf) | 4692.78 MB |
| Q4_K_S | [WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-Q4_K_S.gguf](https://huggingface.co/featherless-ai-quants/WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-GGUF/blob/main/WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-Q4_K_S.gguf) | 4475.28 MB |
| Q5_K_M | [WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-Q5_K_M.gguf](https://huggingface.co/featherless-ai-quants/WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-GGUF/blob/main/WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-Q5_K_M.gguf) | 5467.40 MB |
| Q5_K_S | [WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-Q5_K_S.gguf](https://huggingface.co/featherless-ai-quants/WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-GGUF/blob/main/WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-Q5_K_S.gguf) | 5339.90 MB |
| Q6_K | [WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-Q6_K.gguf](https://huggingface.co/featherless-ai-quants/WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-GGUF/blob/main/WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-Q6_K.gguf) | 6290.44 MB |
| Q8_0 | [WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-Q8_0.gguf](https://huggingface.co/featherless-ai-quants/WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-GGUF/blob/main/WhiteRabbitNeo-Llama-3-WhiteRabbitNeo-8B-v2.0-Q8_0.gguf) | 8145.11 MB |
---
## ⚡ Powered by [Featherless AI](https://featherless.ai)
### Key Features
- 🔥 **Instant Hosting** - Deploy any Llama model on HuggingFace instantly
- 🛠️ **Zero Infrastructure** - No server setup or maintenance required
- 📚 **Vast Compatibility** - Support for 2400+ models and counting
- 💎 **Affordable Pricing** - Starting at just $10/month
---
**Links:**
[Get Started](https://featherless.ai) | [Documentation](https://featherless.ai/docs) | [Models](https://featherless.ai/models) |
jacobhoffmann/TestGen_v2.1-codegemma-7b-lr2e-05_epochs2 | jacobhoffmann | "2024-11-12T22:44:20Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"gemma",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | "2024-11-12T22:21:19Z" | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
alaabh/fine-tuned-visionllama | alaabh | "2024-11-13T00:10:30Z" | 0 | 0 | null | [
"tensorboard",
"safetensors",
"region:us"
] | null | "2024-11-12T22:22:32Z" | Entry not found |
hana403/llama3.2 | hana403 | "2024-11-12T22:23:48Z" | 0 | 0 | null | [
"license:llama3.2",
"region:us"
] | null | "2024-11-12T22:23:48Z" | ---
license: llama3.2
---
|
basajaun-scotland/lore-nexus-bilstm-pytorch | basajaun-scotland | "2024-11-12T22:25:50Z" | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | null | "2024-11-12T22:24:20Z" | ---
license: apache-2.0
---
|
MayBashendy/Arabic_FineTuningAraBERT_AugV4_k10_task3_organization_fold0 | MayBashendy | "2024-11-12T23:08:34Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:aubmindlab/bert-base-arabertv02",
"base_model:finetune:aubmindlab/bert-base-arabertv02",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-11-12T22:24:58Z" | ---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: Arabic_FineTuningAraBERT_AugV4_k10_task3_organization_fold0
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Arabic_FineTuningAraBERT_AugV4_k10_task3_organization_fold0
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8626
- Qwk: 0.0435
- Mse: 0.8626
- Rmse: 0.9288
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0040 | 2 | 4.2802 | -0.0292 | 4.2802 | 2.0689 |
| No log | 0.0079 | 4 | 2.0482 | 0.0739 | 2.0482 | 1.4312 |
| No log | 0.0119 | 6 | 1.0235 | 0.0 | 1.0235 | 1.0117 |
| No log | 0.0158 | 8 | 0.8738 | -0.0565 | 0.8738 | 0.9348 |
| No log | 0.0198 | 10 | 1.0395 | 0.0 | 1.0395 | 1.0196 |
| No log | 0.0237 | 12 | 1.2593 | 0.0 | 1.2593 | 1.1222 |
| No log | 0.0277 | 14 | 0.7697 | 0.1951 | 0.7697 | 0.8773 |
| No log | 0.0316 | 16 | 1.0590 | 0.0 | 1.0590 | 1.0291 |
| No log | 0.0356 | 18 | 1.3325 | 0.0 | 1.3325 | 1.1543 |
| No log | 0.0395 | 20 | 1.2788 | 0.0 | 1.2788 | 1.1308 |
| No log | 0.0435 | 22 | 0.8405 | -0.1786 | 0.8405 | 0.9168 |
| No log | 0.0474 | 24 | 0.8136 | 0.1895 | 0.8136 | 0.9020 |
| No log | 0.0514 | 26 | 0.8021 | 0.0435 | 0.8021 | 0.8956 |
| No log | 0.0553 | 28 | 0.7797 | 0.0 | 0.7797 | 0.8830 |
| No log | 0.0593 | 30 | 0.8089 | 0.0 | 0.8089 | 0.8994 |
| No log | 0.0632 | 32 | 0.8827 | 0.0 | 0.8827 | 0.9395 |
| No log | 0.0672 | 34 | 1.0640 | 0.0 | 1.0640 | 1.0315 |
| No log | 0.0711 | 36 | 1.3025 | 0.0 | 1.3025 | 1.1413 |
| No log | 0.0751 | 38 | 1.5841 | -0.0694 | 1.5841 | 1.2586 |
| No log | 0.0791 | 40 | 1.3969 | 0.1852 | 1.3969 | 1.1819 |
| No log | 0.0830 | 42 | 0.9510 | 0.0 | 0.9510 | 0.9752 |
| No log | 0.0870 | 44 | 0.8677 | -0.1786 | 0.8677 | 0.9315 |
| No log | 0.0909 | 46 | 0.8995 | 0.0 | 0.8995 | 0.9484 |
| No log | 0.0949 | 48 | 0.9293 | 0.0 | 0.9293 | 0.9640 |
| No log | 0.0988 | 50 | 1.0741 | 0.0 | 1.0741 | 1.0364 |
| No log | 0.1028 | 52 | 1.1936 | 0.0 | 1.1936 | 1.0925 |
| No log | 0.1067 | 54 | 1.1907 | 0.0 | 1.1907 | 1.0912 |
| No log | 0.1107 | 56 | 1.0443 | 0.0 | 1.0443 | 1.0219 |
| No log | 0.1146 | 58 | 0.9167 | 0.0 | 0.9167 | 0.9574 |
| No log | 0.1186 | 60 | 1.0631 | 0.0 | 1.0631 | 1.0311 |
| No log | 0.1225 | 62 | 1.0956 | 0.0 | 1.0956 | 1.0467 |
| No log | 0.1265 | 64 | 1.1041 | 0.0 | 1.1041 | 1.0508 |
| No log | 0.1304 | 66 | 1.1161 | 0.0 | 1.1161 | 1.0564 |
| No log | 0.1344 | 68 | 1.1300 | 0.0 | 1.1300 | 1.0630 |
| No log | 0.1383 | 70 | 0.9726 | 0.0 | 0.9726 | 0.9862 |
| No log | 0.1423 | 72 | 0.9930 | 0.0 | 0.9930 | 0.9965 |
| No log | 0.1462 | 74 | 0.9749 | 0.0 | 0.9749 | 0.9874 |
| No log | 0.1502 | 76 | 0.8815 | 0.0 | 0.8815 | 0.9389 |
| No log | 0.1542 | 78 | 0.9904 | 0.0 | 0.9904 | 0.9952 |
| No log | 0.1581 | 80 | 1.2202 | 0.0 | 1.2202 | 1.1046 |
| No log | 0.1621 | 82 | 1.3042 | 0.0 | 1.3042 | 1.1420 |
| No log | 0.1660 | 84 | 1.2316 | 0.0 | 1.2316 | 1.1098 |
| No log | 0.1700 | 86 | 1.1803 | 0.0 | 1.1803 | 1.0864 |
| No log | 0.1739 | 88 | 1.3865 | 0.0 | 1.3865 | 1.1775 |
| No log | 0.1779 | 90 | 1.3602 | 0.0 | 1.3602 | 1.1663 |
| No log | 0.1818 | 92 | 1.1551 | 0.0 | 1.1551 | 1.0748 |
| No log | 0.1858 | 94 | 1.3647 | 0.1538 | 1.3647 | 1.1682 |
| No log | 0.1897 | 96 | 1.7856 | -0.0864 | 1.7856 | 1.3363 |
| No log | 0.1937 | 98 | 1.3898 | 0.2667 | 1.3898 | 1.1789 |
| No log | 0.1976 | 100 | 0.8815 | 0.1852 | 0.8815 | 0.9389 |
| No log | 0.2016 | 102 | 0.9288 | 0.1852 | 0.9288 | 0.9637 |
| No log | 0.2055 | 104 | 1.2562 | 0.1852 | 1.2562 | 1.1208 |
| No log | 0.2095 | 106 | 1.4752 | 0.2361 | 1.4752 | 1.2146 |
| No log | 0.2134 | 108 | 1.4477 | 0.2361 | 1.4477 | 1.2032 |
| No log | 0.2174 | 110 | 1.4142 | -0.0593 | 1.4142 | 1.1892 |
| No log | 0.2213 | 112 | 1.8238 | -0.0864 | 1.8238 | 1.3505 |
| No log | 0.2253 | 114 | 1.7026 | -0.3453 | 1.7026 | 1.3048 |
| No log | 0.2292 | 116 | 1.6296 | -0.1846 | 1.6296 | 1.2766 |
| No log | 0.2332 | 118 | 1.6182 | -0.375 | 1.6182 | 1.2721 |
| No log | 0.2372 | 120 | 1.6905 | -0.1871 | 1.6905 | 1.3002 |
| No log | 0.2411 | 122 | 1.6637 | 0.0 | 1.6637 | 1.2898 |
| No log | 0.2451 | 124 | 1.4443 | -0.3636 | 1.4443 | 1.2018 |
| No log | 0.2490 | 126 | 0.9699 | 0.2080 | 0.9699 | 0.9848 |
| No log | 0.2530 | 128 | 1.2595 | -0.1786 | 1.2595 | 1.1223 |
| No log | 0.2569 | 130 | 1.4641 | -0.1786 | 1.4641 | 1.2100 |
| No log | 0.2609 | 132 | 1.5370 | -0.1786 | 1.5370 | 1.2398 |
| No log | 0.2648 | 134 | 1.4124 | 0.1852 | 1.4124 | 1.1885 |
| No log | 0.2688 | 136 | 0.8327 | 0.0 | 0.8327 | 0.9125 |
| No log | 0.2727 | 138 | 0.8600 | 0.1852 | 0.8600 | 0.9274 |
| No log | 0.2767 | 140 | 1.1872 | 0.1852 | 1.1872 | 1.0896 |
| No log | 0.2806 | 142 | 1.3260 | 0.4967 | 1.3260 | 1.1515 |
| No log | 0.2846 | 144 | 1.3561 | 0.3210 | 1.3561 | 1.1645 |
| No log | 0.2885 | 146 | 1.5194 | 0.3210 | 1.5194 | 1.2326 |
| No log | 0.2925 | 148 | 1.5752 | 0.3210 | 1.5752 | 1.2551 |
| No log | 0.2964 | 150 | 1.0256 | 0.4460 | 1.0256 | 1.0127 |
| No log | 0.3004 | 152 | 1.4857 | 0.3210 | 1.4857 | 1.2189 |
| No log | 0.3043 | 154 | 1.8480 | 0.1444 | 1.8480 | 1.3594 |
| No log | 0.3083 | 156 | 1.6143 | 0.3210 | 1.6143 | 1.2706 |
| No log | 0.3123 | 158 | 1.3699 | 0.3529 | 1.3699 | 1.1704 |
| No log | 0.3162 | 160 | 0.8600 | 0.1852 | 0.8600 | 0.9274 |
| No log | 0.3202 | 162 | 0.8302 | 0.1852 | 0.8302 | 0.9111 |
| No log | 0.3241 | 164 | 0.9734 | 0.1852 | 0.9734 | 0.9866 |
| No log | 0.3281 | 166 | 1.3533 | 0.3889 | 1.3533 | 1.1633 |
| No log | 0.3320 | 168 | 1.5251 | 0.1852 | 1.5251 | 1.2349 |
| No log | 0.3360 | 170 | 1.6766 | 0.1444 | 1.6766 | 1.2949 |
| No log | 0.3399 | 172 | 1.4212 | 0.2092 | 1.4212 | 1.1922 |
| No log | 0.3439 | 174 | 0.9009 | 0.3433 | 0.9009 | 0.9492 |
| No log | 0.3478 | 176 | 1.1051 | 0.1538 | 1.1051 | 1.0512 |
| No log | 0.3518 | 178 | 1.5737 | 0.0351 | 1.5737 | 1.2545 |
| No log | 0.3557 | 180 | 1.5278 | 0.2092 | 1.5278 | 1.2360 |
| No log | 0.3597 | 182 | 1.4058 | 0.1538 | 1.4058 | 1.1856 |
| No log | 0.3636 | 184 | 1.0409 | 0.0 | 1.0409 | 1.0203 |
| No log | 0.3676 | 186 | 0.9657 | 0.0 | 0.9657 | 0.9827 |
| No log | 0.3715 | 188 | 1.4050 | 0.2361 | 1.4050 | 1.1853 |
| No log | 0.3755 | 190 | 1.5375 | 0.2092 | 1.5375 | 1.2400 |
| No log | 0.3794 | 192 | 1.3940 | 0.2361 | 1.3940 | 1.1807 |
| No log | 0.3834 | 194 | 1.4361 | 0.2361 | 1.4361 | 1.1984 |
| No log | 0.3874 | 196 | 1.6500 | 0.0351 | 1.6500 | 1.2845 |
| No log | 0.3913 | 198 | 1.5256 | 0.0654 | 1.5256 | 1.2352 |
| No log | 0.3953 | 200 | 1.3200 | -0.2222 | 1.3200 | 1.1489 |
| No log | 0.3992 | 202 | 1.4733 | -0.0593 | 1.4733 | 1.2138 |
| No log | 0.4032 | 204 | 1.4137 | -0.0185 | 1.4137 | 1.1890 |
| No log | 0.4071 | 206 | 1.3015 | 0.0 | 1.3015 | 1.1408 |
| No log | 0.4111 | 208 | 1.1236 | 0.0 | 1.1236 | 1.0600 |
| No log | 0.4150 | 210 | 1.1318 | 0.0 | 1.1318 | 1.0639 |
| No log | 0.4190 | 212 | 1.2936 | 0.0 | 1.2936 | 1.1374 |
| No log | 0.4229 | 214 | 1.2660 | 0.0 | 1.2660 | 1.1252 |
| No log | 0.4269 | 216 | 1.3581 | 0.1037 | 1.3581 | 1.1654 |
| No log | 0.4308 | 218 | 1.2107 | 0.4296 | 1.2107 | 1.1003 |
| No log | 0.4348 | 220 | 1.2481 | 0.4296 | 1.2481 | 1.1172 |
| No log | 0.4387 | 222 | 1.7515 | 0.1444 | 1.7515 | 1.3235 |
| No log | 0.4427 | 224 | 1.7653 | 0.1444 | 1.7653 | 1.3286 |
| No log | 0.4466 | 226 | 1.5692 | 0.2924 | 1.5692 | 1.2527 |
| No log | 0.4506 | 228 | 1.3740 | 0.2361 | 1.3740 | 1.1722 |
| No log | 0.4545 | 230 | 1.0235 | 0.1852 | 1.0235 | 1.0117 |
| No log | 0.4585 | 232 | 0.7628 | 0.2080 | 0.7628 | 0.8734 |
| No log | 0.4625 | 234 | 0.8587 | 0.0179 | 0.8587 | 0.9267 |
| No log | 0.4664 | 236 | 1.2147 | 0.0 | 1.2147 | 1.1021 |
| No log | 0.4704 | 238 | 1.4172 | 0.1852 | 1.4172 | 1.1905 |
| No log | 0.4743 | 240 | 1.3853 | 0.1852 | 1.3853 | 1.1770 |
| No log | 0.4783 | 242 | 1.0478 | 0.0 | 1.0478 | 1.0236 |
| No log | 0.4822 | 244 | 0.8788 | 0.0 | 0.8788 | 0.9374 |
| No log | 0.4862 | 246 | 1.0720 | 0.0 | 1.0720 | 1.0354 |
| No log | 0.4901 | 248 | 1.5954 | 0.2092 | 1.5954 | 1.2631 |
| No log | 0.4941 | 250 | 1.7843 | -0.1058 | 1.7843 | 1.3358 |
| No log | 0.4980 | 252 | 1.6398 | 0.2092 | 1.6398 | 1.2805 |
| No log | 0.5020 | 254 | 1.2959 | -0.0342 | 1.2959 | 1.1384 |
| No log | 0.5059 | 256 | 0.9389 | 0.0 | 0.9389 | 0.9690 |
| No log | 0.5099 | 258 | 0.9804 | 0.1852 | 0.9804 | 0.9902 |
| No log | 0.5138 | 260 | 1.3974 | 0.2361 | 1.3974 | 1.1821 |
| No log | 0.5178 | 262 | 1.6278 | 0.2092 | 1.6278 | 1.2759 |
| No log | 0.5217 | 264 | 1.5528 | 0.2092 | 1.5528 | 1.2461 |
| No log | 0.5257 | 266 | 1.2196 | 0.0 | 1.2196 | 1.1044 |
| No log | 0.5296 | 268 | 0.9325 | 0.0 | 0.9325 | 0.9657 |
| No log | 0.5336 | 270 | 0.8137 | 0.1818 | 0.8137 | 0.9020 |
| No log | 0.5375 | 272 | 0.9458 | 0.1852 | 0.9458 | 0.9725 |
| No log | 0.5415 | 274 | 1.5879 | 0.3210 | 1.5879 | 1.2601 |
| No log | 0.5455 | 276 | 1.9290 | 0.0351 | 1.9290 | 1.3889 |
| No log | 0.5494 | 278 | 1.6930 | 0.1852 | 1.6930 | 1.3012 |
| No log | 0.5534 | 280 | 1.1043 | 0.0 | 1.1043 | 1.0509 |
| No log | 0.5573 | 282 | 0.6994 | 0.2080 | 0.6994 | 0.8363 |
| No log | 0.5613 | 284 | 0.6770 | 0.2080 | 0.6770 | 0.8228 |
| No log | 0.5652 | 286 | 0.7880 | 0.0 | 0.7880 | 0.8877 |
| No log | 0.5692 | 288 | 1.0286 | 0.0 | 1.0286 | 1.0142 |
| No log | 0.5731 | 290 | 1.3562 | 0.5299 | 1.3562 | 1.1646 |
| No log | 0.5771 | 292 | 1.5731 | 0.2092 | 1.5731 | 1.2542 |
| No log | 0.5810 | 294 | 1.4687 | 0.2092 | 1.4687 | 1.2119 |
| No log | 0.5850 | 296 | 1.0747 | 0.0 | 1.0747 | 1.0367 |
| No log | 0.5889 | 298 | 0.9315 | 0.0179 | 0.9315 | 0.9651 |
| No log | 0.5929 | 300 | 1.1330 | 0.3889 | 1.1330 | 1.0644 |
| No log | 0.5968 | 302 | 1.4628 | 0.2092 | 1.4628 | 1.2095 |
| No log | 0.6008 | 304 | 1.3674 | 0.2092 | 1.3674 | 1.1693 |
| No log | 0.6047 | 306 | 1.2166 | 0.3889 | 1.2166 | 1.1030 |
| No log | 0.6087 | 308 | 1.3698 | 0.5299 | 1.3698 | 1.1704 |
| No log | 0.6126 | 310 | 1.3316 | 0.5299 | 1.3316 | 1.1540 |
| No log | 0.6166 | 312 | 1.0944 | 0.0 | 1.0944 | 1.0462 |
| No log | 0.6206 | 314 | 1.1250 | 0.0 | 1.1250 | 1.0607 |
| No log | 0.6245 | 316 | 1.3706 | 0.5299 | 1.3706 | 1.1707 |
| No log | 0.6285 | 318 | 1.4740 | 0.1852 | 1.4740 | 1.2141 |
| No log | 0.6324 | 320 | 1.3607 | 0.0 | 1.3607 | 1.1665 |
| No log | 0.6364 | 322 | 1.1497 | 0.0 | 1.1497 | 1.0723 |
| No log | 0.6403 | 324 | 0.9665 | 0.0 | 0.9665 | 0.9831 |
| No log | 0.6443 | 326 | 0.8118 | 0.0179 | 0.8118 | 0.9010 |
| No log | 0.6482 | 328 | 0.8182 | 0.0179 | 0.8182 | 0.9046 |
| No log | 0.6522 | 330 | 1.0330 | 0.0 | 1.0330 | 1.0163 |
| No log | 0.6561 | 332 | 1.3534 | 0.0 | 1.3534 | 1.1633 |
| No log | 0.6601 | 334 | 1.4915 | 0.1538 | 1.4915 | 1.2213 |
| No log | 0.6640 | 336 | 1.4611 | -0.0476 | 1.4611 | 1.2088 |
| No log | 0.6680 | 338 | 1.1233 | 0.0 | 1.1233 | 1.0598 |
| No log | 0.6719 | 340 | 1.0641 | 0.0 | 1.0641 | 1.0316 |
| No log | 0.6759 | 342 | 1.4322 | 0.1037 | 1.4322 | 1.1967 |
| No log | 0.6798 | 344 | 1.5207 | -0.0694 | 1.5207 | 1.2332 |
| No log | 0.6838 | 346 | 1.2638 | 0.3419 | 1.2638 | 1.1242 |
| No log | 0.6877 | 348 | 1.0451 | 0.1852 | 1.0451 | 1.0223 |
| No log | 0.6917 | 350 | 1.2169 | 0.3419 | 1.2169 | 1.1031 |
| No log | 0.6957 | 352 | 1.5123 | 0.0654 | 1.5123 | 1.2298 |
| No log | 0.6996 | 354 | 1.3486 | 0.2361 | 1.3486 | 1.1613 |
| No log | 0.7036 | 356 | 0.9366 | 0.1852 | 0.9366 | 0.9678 |
| No log | 0.7075 | 358 | 0.8338 | 0.1818 | 0.8338 | 0.9131 |
| No log | 0.7115 | 360 | 0.9788 | 0.1852 | 0.9788 | 0.9894 |
| No log | 0.7154 | 362 | 1.2812 | 0.1852 | 1.2812 | 1.1319 |
| No log | 0.7194 | 364 | 1.2627 | 0.1538 | 1.2627 | 1.1237 |
| No log | 0.7233 | 366 | 1.2634 | 0.1538 | 1.2634 | 1.1240 |
| No log | 0.7273 | 368 | 1.3743 | 0.2667 | 1.3743 | 1.1723 |
| No log | 0.7312 | 370 | 1.4926 | 0.2361 | 1.4926 | 1.2217 |
| No log | 0.7352 | 372 | 1.2307 | 0.3016 | 1.2307 | 1.1094 |
| No log | 0.7391 | 374 | 1.0784 | 0.1852 | 1.0784 | 1.0385 |
| No log | 0.7431 | 376 | 0.9127 | 0.0 | 0.9127 | 0.9553 |
| No log | 0.7470 | 378 | 0.8475 | 0.0 | 0.8475 | 0.9206 |
| No log | 0.7510 | 380 | 0.8890 | 0.0 | 0.8890 | 0.9429 |
| No log | 0.7549 | 382 | 0.8890 | 0.0 | 0.8890 | 0.9429 |
| No log | 0.7589 | 384 | 0.9790 | 0.1852 | 0.9790 | 0.9895 |
| No log | 0.7628 | 386 | 1.1095 | 0.1852 | 1.1095 | 1.0534 |
| No log | 0.7668 | 388 | 1.3147 | 0.2361 | 1.3147 | 1.1466 |
| No log | 0.7708 | 390 | 1.3267 | 0.2361 | 1.3267 | 1.1518 |
| No log | 0.7747 | 392 | 1.2169 | 0.1852 | 1.2169 | 1.1031 |
| No log | 0.7787 | 394 | 1.0129 | 0.1852 | 1.0129 | 1.0064 |
| No log | 0.7826 | 396 | 1.0760 | 0.1852 | 1.0760 | 1.0373 |
| No log | 0.7866 | 398 | 1.0517 | 0.1852 | 1.0517 | 1.0255 |
| No log | 0.7905 | 400 | 1.0012 | 0.1818 | 1.0012 | 1.0006 |
| No log | 0.7945 | 402 | 0.9760 | 0.1818 | 0.9760 | 0.9879 |
| No log | 0.7984 | 404 | 1.1017 | 0.1818 | 1.1017 | 1.0496 |
| No log | 0.8024 | 406 | 1.1873 | 0.1852 | 1.1873 | 1.0896 |
| No log | 0.8063 | 408 | 1.1775 | 0.1852 | 1.1775 | 1.0851 |
| No log | 0.8103 | 410 | 1.0361 | 0.1852 | 1.0361 | 1.0179 |
| No log | 0.8142 | 412 | 0.8733 | 0.0 | 0.8733 | 0.9345 |
| No log | 0.8182 | 414 | 0.7765 | 0.0 | 0.7765 | 0.8812 |
| No log | 0.8221 | 416 | 0.7366 | 0.0 | 0.7366 | 0.8582 |
| No log | 0.8261 | 418 | 0.7582 | 0.0 | 0.7582 | 0.8707 |
| No log | 0.8300 | 420 | 0.9276 | 0.0 | 0.9276 | 0.9631 |
| No log | 0.8340 | 422 | 1.2123 | 0.1538 | 1.2123 | 1.1010 |
| No log | 0.8379 | 424 | 1.4351 | 0.3210 | 1.4351 | 1.1980 |
| No log | 0.8419 | 426 | 1.7012 | 0.1637 | 1.7012 | 1.3043 |
| No log | 0.8458 | 428 | 1.5860 | 0.3210 | 1.5860 | 1.2594 |
| No log | 0.8498 | 430 | 1.1893 | 0.1818 | 1.1893 | 1.0905 |
| No log | 0.8538 | 432 | 1.0719 | 0.1818 | 1.0719 | 1.0353 |
| No log | 0.8577 | 434 | 1.1328 | 0.0 | 1.1328 | 1.0643 |
| No log | 0.8617 | 436 | 1.0958 | 0.0 | 1.0958 | 1.0468 |
| No log | 0.8656 | 438 | 0.9005 | 0.0179 | 0.9005 | 0.9489 |
| No log | 0.8696 | 440 | 0.8662 | 0.2080 | 0.8662 | 0.9307 |
| No log | 0.8735 | 442 | 0.8663 | 0.3433 | 0.8663 | 0.9308 |
| No log | 0.8775 | 444 | 1.0731 | 0.0179 | 1.0731 | 1.0359 |
| No log | 0.8814 | 446 | 1.3662 | -0.0342 | 1.3662 | 1.1688 |
| No log | 0.8854 | 448 | 1.4324 | -0.0342 | 1.4324 | 1.1968 |
| No log | 0.8893 | 450 | 1.2425 | -0.0342 | 1.2425 | 1.1147 |
| No log | 0.8933 | 452 | 0.9134 | 0.1818 | 0.9134 | 0.9557 |
| No log | 0.8972 | 454 | 0.8134 | 0.3433 | 0.8134 | 0.9019 |
| No log | 0.9012 | 456 | 0.8867 | 0.1818 | 0.8867 | 0.9416 |
| No log | 0.9051 | 458 | 1.1552 | 0.1852 | 1.1552 | 1.0748 |
| No log | 0.9091 | 460 | 1.4576 | 0.2361 | 1.4576 | 1.2073 |
| No log | 0.9130 | 462 | 1.4187 | 0.2361 | 1.4187 | 1.1911 |
| No log | 0.9170 | 464 | 1.1739 | 0.0 | 1.1739 | 1.0835 |
| No log | 0.9209 | 466 | 0.9426 | 0.0 | 0.9426 | 0.9709 |
| No log | 0.9249 | 468 | 0.8584 | 0.0179 | 0.8584 | 0.9265 |
| No log | 0.9289 | 470 | 0.8269 | 0.2080 | 0.8269 | 0.9094 |
| No log | 0.9328 | 472 | 0.9312 | 0.1818 | 0.9312 | 0.9650 |
| No log | 0.9368 | 474 | 1.2948 | 0.1270 | 1.2948 | 1.1379 |
| No log | 0.9407 | 476 | 1.5133 | -0.0694 | 1.5133 | 1.2302 |
| No log | 0.9447 | 478 | 1.4801 | 0.1037 | 1.4801 | 1.2166 |
| No log | 0.9486 | 480 | 1.4006 | -0.0476 | 1.4006 | 1.1835 |
| No log | 0.9526 | 482 | 1.1904 | -0.0342 | 1.1904 | 1.0911 |
| No log | 0.9565 | 484 | 1.1715 | 0.1295 | 1.1715 | 1.0824 |
| No log | 0.9605 | 486 | 1.2315 | 0.1270 | 1.2315 | 1.1097 |
| No log | 0.9644 | 488 | 1.2832 | -0.0342 | 1.2832 | 1.1328 |
| No log | 0.9684 | 490 | 1.1263 | 0.0 | 1.1263 | 1.0613 |
| No log | 0.9723 | 492 | 0.9926 | 0.0 | 0.9926 | 0.9963 |
| No log | 0.9763 | 494 | 0.9776 | 0.0 | 0.9776 | 0.9887 |
| No log | 0.9802 | 496 | 1.1807 | 0.0 | 1.1807 | 1.0866 |
| No log | 0.9842 | 498 | 1.4766 | 0.1270 | 1.4766 | 1.2151 |
| 0.3759 | 0.9881 | 500 | 1.4595 | 0.1270 | 1.4595 | 1.2081 |
| 0.3759 | 0.9921 | 502 | 1.1584 | 0.0 | 1.1584 | 1.0763 |
| 0.3759 | 0.9960 | 504 | 0.9592 | 0.0179 | 0.9592 | 0.9794 |
| 0.3759 | 1.0 | 506 | 0.9582 | 0.0179 | 0.9582 | 0.9789 |
| 0.3759 | 1.0040 | 508 | 1.1339 | 0.0 | 1.1339 | 1.0648 |
| 0.3759 | 1.0079 | 510 | 1.2858 | -0.0185 | 1.2858 | 1.1339 |
| 0.3759 | 1.0119 | 512 | 1.2878 | -0.0185 | 1.2878 | 1.1348 |
| 0.3759 | 1.0158 | 514 | 1.1120 | 0.0 | 1.1120 | 1.0545 |
| 0.3759 | 1.0198 | 516 | 0.9536 | 0.0179 | 0.9536 | 0.9765 |
| 0.3759 | 1.0237 | 518 | 0.9473 | 0.0179 | 0.9473 | 0.9733 |
| 0.3759 | 1.0277 | 520 | 0.9840 | 0.0179 | 0.9840 | 0.9920 |
| 0.3759 | 1.0316 | 522 | 1.1057 | 0.1852 | 1.1057 | 1.0515 |
| 0.3759 | 1.0356 | 524 | 1.3132 | -0.0342 | 1.3132 | 1.1459 |
| 0.3759 | 1.0395 | 526 | 1.2250 | 0.0 | 1.2250 | 1.1068 |
| 0.3759 | 1.0435 | 528 | 0.9602 | 0.0179 | 0.9602 | 0.9799 |
| 0.3759 | 1.0474 | 530 | 0.8766 | 0.0179 | 0.8766 | 0.9363 |
| 0.3759 | 1.0514 | 532 | 0.8652 | 0.2080 | 0.8652 | 0.9302 |
| 0.3759 | 1.0553 | 534 | 1.0658 | 0.0179 | 1.0658 | 1.0324 |
| 0.3759 | 1.0593 | 536 | 1.3512 | 0.1270 | 1.3512 | 1.1624 |
| 0.3759 | 1.0632 | 538 | 1.3303 | 0.1270 | 1.3303 | 1.1534 |
| 0.3759 | 1.0672 | 540 | 1.0744 | 0.0 | 1.0744 | 1.0365 |
| 0.3759 | 1.0711 | 542 | 0.9326 | 0.0179 | 0.9326 | 0.9657 |
| 0.3759 | 1.0751 | 544 | 0.8838 | 0.0179 | 0.8838 | 0.9401 |
| 0.3759 | 1.0791 | 546 | 0.8531 | 0.0179 | 0.8531 | 0.9236 |
| 0.3759 | 1.0830 | 548 | 0.9123 | 0.3433 | 0.9123 | 0.9551 |
| 0.3759 | 1.0870 | 550 | 1.1046 | 0.1538 | 1.1046 | 1.0510 |
| 0.3759 | 1.0909 | 552 | 1.0536 | 0.1538 | 1.0536 | 1.0265 |
| 0.3759 | 1.0949 | 554 | 1.0256 | 0.1818 | 1.0256 | 1.0127 |
| 0.3759 | 1.0988 | 556 | 1.0128 | 0.1818 | 1.0128 | 1.0064 |
| 0.3759 | 1.1028 | 558 | 0.8871 | 0.1818 | 0.8871 | 0.9419 |
| 0.3759 | 1.1067 | 560 | 0.9577 | 0.1818 | 0.9577 | 0.9786 |
| 0.3759 | 1.1107 | 562 | 1.0439 | 0.1818 | 1.0439 | 1.0217 |
| 0.3759 | 1.1146 | 564 | 1.2388 | 0.1818 | 1.2388 | 1.1130 |
| 0.3759 | 1.1186 | 566 | 1.5782 | -0.0784 | 1.5782 | 1.2563 |
| 0.3759 | 1.1225 | 568 | 1.5943 | 0.0833 | 1.5943 | 1.2627 |
| 0.3759 | 1.1265 | 570 | 1.4334 | -0.0185 | 1.4334 | 1.1973 |
| 0.3759 | 1.1304 | 572 | 1.1520 | 0.0179 | 1.1520 | 1.0733 |
| 0.3759 | 1.1344 | 574 | 0.9102 | 0.0435 | 0.9102 | 0.9540 |
| 0.3759 | 1.1383 | 576 | 0.8936 | 0.1720 | 0.8936 | 0.9453 |
| 0.3759 | 1.1423 | 578 | 0.8765 | 0.0462 | 0.8765 | 0.9362 |
| 0.3759 | 1.1462 | 580 | 0.9332 | 0.2080 | 0.9332 | 0.9660 |
| 0.3759 | 1.1502 | 582 | 1.1485 | 0.0179 | 1.1485 | 1.0717 |
| 0.3759 | 1.1542 | 584 | 1.2930 | 0.0 | 1.2930 | 1.1371 |
| 0.3759 | 1.1581 | 586 | 1.2790 | 0.0 | 1.2790 | 1.1309 |
| 0.3759 | 1.1621 | 588 | 1.1439 | 0.1818 | 1.1439 | 1.0695 |
| 0.3759 | 1.1660 | 590 | 1.0000 | 0.1818 | 1.0000 | 1.0000 |
| 0.3759 | 1.1700 | 592 | 1.0194 | 0.1818 | 1.0194 | 1.0097 |
| 0.3759 | 1.1739 | 594 | 1.1276 | 0.1818 | 1.1276 | 1.0619 |
| 0.3759 | 1.1779 | 596 | 1.0378 | 0.1818 | 1.0378 | 1.0187 |
| 0.3759 | 1.1818 | 598 | 0.9335 | 0.1818 | 0.9335 | 0.9662 |
| 0.3759 | 1.1858 | 600 | 1.0362 | 0.1818 | 1.0362 | 1.0179 |
| 0.3759 | 1.1897 | 602 | 1.3464 | 0.1037 | 1.3464 | 1.1604 |
| 0.3759 | 1.1937 | 604 | 1.5025 | 0.3210 | 1.5025 | 1.2257 |
| 0.3759 | 1.1976 | 606 | 1.3773 | 0.1037 | 1.3773 | 1.1736 |
| 0.3759 | 1.2016 | 608 | 1.0260 | 0.1818 | 1.0260 | 1.0129 |
| 0.3759 | 1.2055 | 610 | 0.8456 | 0.2080 | 0.8456 | 0.9196 |
| 0.3759 | 1.2095 | 612 | 0.8203 | 0.2080 | 0.8203 | 0.9057 |
| 0.3759 | 1.2134 | 614 | 0.9837 | 0.1818 | 0.9837 | 0.9918 |
| 0.3759 | 1.2174 | 616 | 1.1494 | 0.1852 | 1.1494 | 1.0721 |
| 0.3759 | 1.2213 | 618 | 1.4307 | 0.2667 | 1.4307 | 1.1961 |
| 0.3759 | 1.2253 | 620 | 1.4585 | 0.2667 | 1.4585 | 1.2077 |
| 0.3759 | 1.2292 | 622 | 1.2744 | 0.1852 | 1.2744 | 1.1289 |
| 0.3759 | 1.2332 | 624 | 1.0209 | 0.0 | 1.0209 | 1.0104 |
| 0.3759 | 1.2372 | 626 | 0.8068 | 0.0179 | 0.8068 | 0.8982 |
| 0.3759 | 1.2411 | 628 | 0.7095 | 0.2080 | 0.7095 | 0.8423 |
| 0.3759 | 1.2451 | 630 | 0.7020 | 0.2080 | 0.7020 | 0.8379 |
| 0.3759 | 1.2490 | 632 | 0.7694 | 0.2080 | 0.7694 | 0.8772 |
| 0.3759 | 1.2530 | 634 | 1.0271 | 0.1852 | 1.0271 | 1.0135 |
| 0.3759 | 1.2569 | 636 | 1.2128 | 0.3419 | 1.2128 | 1.1012 |
| 0.3759 | 1.2609 | 638 | 1.1683 | 0.1852 | 1.1683 | 1.0809 |
| 0.3759 | 1.2648 | 640 | 0.9651 | 0.1852 | 0.9651 | 0.9824 |
| 0.3759 | 1.2688 | 642 | 0.7907 | 0.3433 | 0.7907 | 0.8892 |
| 0.3759 | 1.2727 | 644 | 0.7506 | 0.3433 | 0.7506 | 0.8664 |
| 0.3759 | 1.2767 | 646 | 0.8322 | 0.3433 | 0.8322 | 0.9122 |
| 0.3759 | 1.2806 | 648 | 0.9304 | 0.3433 | 0.9304 | 0.9646 |
| 0.3759 | 1.2846 | 650 | 1.1202 | 0.1852 | 1.1202 | 1.0584 |
| 0.3759 | 1.2885 | 652 | 1.4205 | 0.2667 | 1.4205 | 1.1918 |
| 0.3759 | 1.2925 | 654 | 1.4332 | 0.1270 | 1.4332 | 1.1972 |
| 0.3759 | 1.2964 | 656 | 1.2597 | 0.0 | 1.2597 | 1.1224 |
| 0.3759 | 1.3004 | 658 | 1.0412 | 0.0 | 1.0412 | 1.0204 |
| 0.3759 | 1.3043 | 660 | 0.9543 | 0.0179 | 0.9543 | 0.9769 |
| 0.3759 | 1.3083 | 662 | 0.9232 | 0.0179 | 0.9232 | 0.9608 |
| 0.3759 | 1.3123 | 664 | 0.9570 | 0.0179 | 0.9570 | 0.9783 |
| 0.3759 | 1.3162 | 666 | 1.1065 | 0.1852 | 1.1065 | 1.0519 |
| 0.3759 | 1.3202 | 668 | 1.3429 | 0.1270 | 1.3429 | 1.1588 |
| 0.3759 | 1.3241 | 670 | 1.3056 | 0.1852 | 1.3056 | 1.1426 |
| 0.3759 | 1.3281 | 672 | 1.0980 | 0.1852 | 1.0980 | 1.0478 |
| 0.3759 | 1.3320 | 674 | 1.0220 | 0.3433 | 1.0220 | 1.0109 |
| 0.3759 | 1.3360 | 676 | 1.0783 | 0.1818 | 1.0783 | 1.0384 |
| 0.3759 | 1.3399 | 678 | 1.1172 | 0.1852 | 1.1172 | 1.0570 |
| 0.3759 | 1.3439 | 680 | 1.1084 | 0.1852 | 1.1084 | 1.0528 |
| 0.3759 | 1.3478 | 682 | 0.9647 | 0.0 | 0.9647 | 0.9822 |
| 0.3759 | 1.3518 | 684 | 0.9431 | 0.0 | 0.9431 | 0.9711 |
| 0.3759 | 1.3557 | 686 | 1.0427 | 0.0 | 1.0427 | 1.0211 |
| 0.3759 | 1.3597 | 688 | 1.2804 | 0.0 | 1.2804 | 1.1316 |
| 0.3759 | 1.3636 | 690 | 1.4202 | 0.0 | 1.4202 | 1.1917 |
| 0.3759 | 1.3676 | 692 | 1.3986 | -0.0185 | 1.3986 | 1.1826 |
| 0.3759 | 1.3715 | 694 | 1.1927 | 0.0 | 1.1927 | 1.0921 |
| 0.3759 | 1.3755 | 696 | 0.9401 | 0.2080 | 0.9401 | 0.9696 |
| 0.3759 | 1.3794 | 698 | 0.8635 | 0.0435 | 0.8635 | 0.9292 |
| 0.3759 | 1.3834 | 700 | 0.8525 | 0.0435 | 0.8525 | 0.9233 |
| 0.3759 | 1.3874 | 702 | 0.9007 | 0.2080 | 0.9007 | 0.9490 |
| 0.3759 | 1.3913 | 704 | 1.1006 | 0.0 | 1.1006 | 1.0491 |
| 0.3759 | 1.3953 | 706 | 1.2114 | 0.0 | 1.2114 | 1.1006 |
| 0.3759 | 1.3992 | 708 | 1.1766 | 0.0 | 1.1766 | 1.0847 |
| 0.3759 | 1.4032 | 710 | 1.1082 | 0.0 | 1.1082 | 1.0527 |
| 0.3759 | 1.4071 | 712 | 1.0535 | 0.0 | 1.0535 | 1.0264 |
| 0.3759 | 1.4111 | 714 | 1.0927 | 0.1852 | 1.0927 | 1.0453 |
| 0.3759 | 1.4150 | 716 | 1.1016 | 0.1852 | 1.1016 | 1.0496 |
| 0.3759 | 1.4190 | 718 | 1.2982 | 0.1270 | 1.2982 | 1.1394 |
| 0.3759 | 1.4229 | 720 | 1.3498 | -0.0342 | 1.3498 | 1.1618 |
| 0.3759 | 1.4269 | 722 | 1.2808 | -0.0185 | 1.2808 | 1.1317 |
| 0.3759 | 1.4308 | 724 | 1.1326 | 0.0 | 1.1326 | 1.0642 |
| 0.3759 | 1.4348 | 726 | 1.0287 | 0.0 | 1.0287 | 1.0143 |
| 0.3759 | 1.4387 | 728 | 0.8999 | 0.0 | 0.8999 | 0.9486 |
| 0.3759 | 1.4427 | 730 | 0.8144 | 0.2080 | 0.8144 | 0.9024 |
| 0.3759 | 1.4466 | 732 | 0.8328 | 0.2080 | 0.8328 | 0.9126 |
| 0.3759 | 1.4506 | 734 | 0.8600 | 0.2080 | 0.8600 | 0.9274 |
| 0.3759 | 1.4545 | 736 | 1.0368 | 0.0 | 1.0368 | 1.0183 |
| 0.3759 | 1.4585 | 738 | 1.0787 | 0.0 | 1.0787 | 1.0386 |
| 0.3759 | 1.4625 | 740 | 0.9201 | 0.2080 | 0.9201 | 0.9592 |
| 0.3759 | 1.4664 | 742 | 0.8923 | 0.2080 | 0.8923 | 0.9446 |
| 0.3759 | 1.4704 | 744 | 1.0731 | 0.0179 | 1.0731 | 1.0359 |
| 0.3759 | 1.4743 | 746 | 1.2482 | 0.1538 | 1.2482 | 1.1172 |
| 0.3759 | 1.4783 | 748 | 1.1575 | -0.0185 | 1.1575 | 1.0759 |
| 0.3759 | 1.4822 | 750 | 0.9972 | 0.0 | 0.9972 | 0.9986 |
| 0.3759 | 1.4862 | 752 | 0.7862 | 0.2080 | 0.7862 | 0.8867 |
| 0.3759 | 1.4901 | 754 | 0.7062 | 0.2080 | 0.7062 | 0.8404 |
| 0.3759 | 1.4941 | 756 | 0.7333 | 0.2080 | 0.7333 | 0.8563 |
| 0.3759 | 1.4980 | 758 | 0.9261 | 0.0179 | 0.9261 | 0.9623 |
| 0.3759 | 1.5020 | 760 | 1.2834 | 0.1538 | 1.2834 | 1.1329 |
| 0.3759 | 1.5059 | 762 | 1.4158 | 0.2361 | 1.4158 | 1.1899 |
| 0.3759 | 1.5099 | 764 | 1.3178 | 0.1538 | 1.3178 | 1.1479 |
| 0.3759 | 1.5138 | 766 | 1.1547 | 0.0 | 1.1547 | 1.0746 |
| 0.3759 | 1.5178 | 768 | 0.9545 | 0.0 | 0.9545 | 0.9770 |
| 0.3759 | 1.5217 | 770 | 0.7543 | 0.2080 | 0.7543 | 0.8685 |
| 0.3759 | 1.5257 | 772 | 0.7337 | 0.2080 | 0.7337 | 0.8566 |
| 0.3759 | 1.5296 | 774 | 0.8316 | 0.2080 | 0.8316 | 0.9119 |
| 0.3759 | 1.5336 | 776 | 0.9785 | 0.0 | 0.9785 | 0.9892 |
| 0.3759 | 1.5375 | 778 | 1.0594 | 0.1852 | 1.0594 | 1.0293 |
| 0.3759 | 1.5415 | 780 | 0.9597 | 0.1852 | 0.9597 | 0.9797 |
| 0.3759 | 1.5455 | 782 | 0.8176 | 0.2080 | 0.8176 | 0.9042 |
| 0.3759 | 1.5494 | 784 | 0.8259 | 0.3433 | 0.8259 | 0.9088 |
| 0.3759 | 1.5534 | 786 | 0.8317 | 0.3433 | 0.8317 | 0.9120 |
| 0.3759 | 1.5573 | 788 | 0.9349 | 0.1852 | 0.9349 | 0.9669 |
| 0.3759 | 1.5613 | 790 | 1.0288 | 0.1852 | 1.0288 | 1.0143 |
| 0.3759 | 1.5652 | 792 | 1.0239 | 0.0 | 1.0239 | 1.0119 |
| 0.3759 | 1.5692 | 794 | 0.8909 | 0.0179 | 0.8909 | 0.9439 |
| 0.3759 | 1.5731 | 796 | 0.7766 | 0.2080 | 0.7766 | 0.8813 |
| 0.3759 | 1.5771 | 798 | 0.7626 | 0.2080 | 0.7626 | 0.8733 |
| 0.3759 | 1.5810 | 800 | 0.8511 | 0.2080 | 0.8511 | 0.9226 |
| 0.3759 | 1.5850 | 802 | 1.1420 | -0.0185 | 1.1420 | 1.0686 |
| 0.3759 | 1.5889 | 804 | 1.3004 | 0.5417 | 1.3004 | 1.1403 |
| 0.3759 | 1.5929 | 806 | 1.2081 | -0.0185 | 1.2081 | 1.0992 |
| 0.3759 | 1.5968 | 808 | 0.9750 | 0.0 | 0.9750 | 0.9874 |
| 0.3759 | 1.6008 | 810 | 0.7667 | 0.2080 | 0.7667 | 0.8756 |
| 0.3759 | 1.6047 | 812 | 0.7087 | 0.2080 | 0.7087 | 0.8419 |
| 0.3759 | 1.6087 | 814 | 0.7275 | 0.2080 | 0.7275 | 0.8529 |
| 0.3759 | 1.6126 | 816 | 0.7947 | 0.2080 | 0.7947 | 0.8915 |
| 0.3759 | 1.6166 | 818 | 0.8657 | 0.0179 | 0.8657 | 0.9304 |
| 0.3759 | 1.6206 | 820 | 0.8404 | 0.2080 | 0.8404 | 0.9168 |
| 0.3759 | 1.6245 | 822 | 0.9541 | 0.0179 | 0.9541 | 0.9768 |
| 0.3759 | 1.6285 | 824 | 1.1945 | 0.0 | 1.1945 | 1.0929 |
| 0.3759 | 1.6324 | 826 | 1.2098 | 0.0 | 1.2098 | 1.0999 |
| 0.3759 | 1.6364 | 828 | 1.0193 | 0.0 | 1.0193 | 1.0096 |
| 0.3759 | 1.6403 | 830 | 0.7861 | 0.2080 | 0.7861 | 0.8866 |
| 0.3759 | 1.6443 | 832 | 0.7589 | 0.2080 | 0.7589 | 0.8711 |
| 0.3759 | 1.6482 | 834 | 0.8742 | 0.0179 | 0.8742 | 0.9350 |
| 0.3759 | 1.6522 | 836 | 1.0786 | 0.0 | 1.0786 | 1.0385 |
| 0.3759 | 1.6561 | 838 | 1.0617 | 0.0 | 1.0617 | 1.0304 |
| 0.3759 | 1.6601 | 840 | 0.9144 | 0.0 | 0.9144 | 0.9562 |
| 0.3759 | 1.6640 | 842 | 0.8216 | 0.0 | 0.8216 | 0.9064 |
| 0.3759 | 1.6680 | 844 | 0.8583 | 0.0 | 0.8583 | 0.9264 |
| 0.3759 | 1.6719 | 846 | 0.8470 | 0.0179 | 0.8470 | 0.9203 |
| 0.3759 | 1.6759 | 848 | 0.8797 | 0.0179 | 0.8797 | 0.9379 |
| 0.3759 | 1.6798 | 850 | 1.0556 | 0.1852 | 1.0556 | 1.0274 |
| 0.3759 | 1.6838 | 852 | 1.1196 | 0.1852 | 1.1196 | 1.0581 |
| 0.3759 | 1.6877 | 854 | 1.0249 | 0.1852 | 1.0249 | 1.0124 |
| 0.3759 | 1.6917 | 856 | 0.8528 | 0.0179 | 0.8528 | 0.9235 |
| 0.3759 | 1.6957 | 858 | 0.7050 | 0.2080 | 0.7050 | 0.8396 |
| 0.3759 | 1.6996 | 860 | 0.6923 | 0.2080 | 0.6923 | 0.8321 |
| 0.3759 | 1.7036 | 862 | 0.7959 | 0.2080 | 0.7959 | 0.8921 |
| 0.3759 | 1.7075 | 864 | 1.1058 | 0.2667 | 1.1058 | 1.0516 |
| 0.3759 | 1.7115 | 866 | 1.2833 | 0.2361 | 1.2833 | 1.1328 |
| 0.3759 | 1.7154 | 868 | 1.1963 | 0.1270 | 1.1963 | 1.0937 |
| 0.3759 | 1.7194 | 870 | 0.9795 | 0.0 | 0.9795 | 0.9897 |
| 0.3759 | 1.7233 | 872 | 0.7454 | 0.2080 | 0.7454 | 0.8634 |
| 0.3759 | 1.7273 | 874 | 0.6745 | 0.2080 | 0.6745 | 0.8213 |
| 0.3759 | 1.7312 | 876 | 0.6857 | 0.2080 | 0.6857 | 0.8281 |
| 0.3759 | 1.7352 | 878 | 0.7811 | 0.2080 | 0.7811 | 0.8838 |
| 0.3759 | 1.7391 | 880 | 1.0141 | 0.0 | 1.0141 | 1.0070 |
| 0.3759 | 1.7431 | 882 | 1.1593 | 0.1538 | 1.1593 | 1.0767 |
| 0.3759 | 1.7470 | 884 | 1.1002 | 0.1538 | 1.1002 | 1.0489 |
| 0.3759 | 1.7510 | 886 | 0.9284 | 0.1852 | 0.9284 | 0.9635 |
| 0.3759 | 1.7549 | 888 | 0.7571 | 0.3433 | 0.7571 | 0.8701 |
| 0.3759 | 1.7589 | 890 | 0.7060 | 0.2080 | 0.7060 | 0.8403 |
| 0.3759 | 1.7628 | 892 | 0.7396 | 0.3433 | 0.7396 | 0.8600 |
| 0.3759 | 1.7668 | 894 | 0.8770 | 0.1852 | 0.8770 | 0.9365 |
| 0.3759 | 1.7708 | 896 | 0.9436 | 0.1852 | 0.9436 | 0.9714 |
| 0.3759 | 1.7747 | 898 | 0.8677 | 0.1852 | 0.8677 | 0.9315 |
| 0.3759 | 1.7787 | 900 | 0.7704 | 0.2080 | 0.7704 | 0.8777 |
| 0.3759 | 1.7826 | 902 | 0.6947 | 0.2080 | 0.6947 | 0.8335 |
| 0.3759 | 1.7866 | 904 | 0.6992 | 0.2080 | 0.6992 | 0.8362 |
| 0.3759 | 1.7905 | 906 | 0.7929 | 0.2143 | 0.7929 | 0.8905 |
| 0.3759 | 1.7945 | 908 | 0.9510 | 0.0 | 0.9510 | 0.9752 |
| 0.3759 | 1.7984 | 910 | 0.9529 | 0.0 | 0.9529 | 0.9762 |
| 0.3759 | 1.8024 | 912 | 0.8482 | 0.0 | 0.8482 | 0.9210 |
| 0.3759 | 1.8063 | 914 | 0.6754 | 0.2143 | 0.6754 | 0.8218 |
| 0.3759 | 1.8103 | 916 | 0.6205 | 0.2080 | 0.6205 | 0.7877 |
| 0.3759 | 1.8142 | 918 | 0.6038 | 0.2080 | 0.6038 | 0.7771 |
| 0.3759 | 1.8182 | 920 | 0.6407 | 0.2080 | 0.6407 | 0.8004 |
| 0.3759 | 1.8221 | 922 | 0.6872 | 0.2080 | 0.6872 | 0.8290 |
| 0.3759 | 1.8261 | 924 | 0.8638 | 0.3636 | 0.8638 | 0.9294 |
| 0.3759 | 1.8300 | 926 | 1.0390 | 0.1852 | 1.0390 | 1.0193 |
| 0.3759 | 1.8340 | 928 | 0.9957 | 0.1852 | 0.9957 | 0.9979 |
| 0.3759 | 1.8379 | 930 | 0.8796 | 0.2143 | 0.8796 | 0.9378 |
| 0.3759 | 1.8419 | 932 | 0.8895 | 0.2143 | 0.8895 | 0.9431 |
| 0.3759 | 1.8458 | 934 | 0.8625 | 0.2143 | 0.8625 | 0.9287 |
| 0.3759 | 1.8498 | 936 | 0.8712 | 0.2143 | 0.8712 | 0.9334 |
| 0.3759 | 1.8538 | 938 | 0.9308 | 0.0 | 0.9308 | 0.9648 |
| 0.3759 | 1.8577 | 940 | 0.9800 | 0.0 | 0.9800 | 0.9900 |
| 0.3759 | 1.8617 | 942 | 0.8910 | 0.0 | 0.8910 | 0.9439 |
| 0.3759 | 1.8656 | 944 | 0.8072 | 0.2143 | 0.8072 | 0.8985 |
| 0.3759 | 1.8696 | 946 | 0.8299 | 0.0 | 0.8299 | 0.9110 |
| 0.3759 | 1.8735 | 948 | 0.8440 | 0.0 | 0.8440 | 0.9187 |
| 0.3759 | 1.8775 | 950 | 0.7445 | 0.2143 | 0.7445 | 0.8629 |
| 0.3759 | 1.8814 | 952 | 0.6978 | 0.2080 | 0.6978 | 0.8354 |
| 0.3759 | 1.8854 | 954 | 0.6864 | 0.2080 | 0.6864 | 0.8285 |
| 0.3759 | 1.8893 | 956 | 0.7846 | 0.2080 | 0.7846 | 0.8858 |
| 0.3759 | 1.8933 | 958 | 1.0516 | 0.0 | 1.0516 | 1.0255 |
| 0.3759 | 1.8972 | 960 | 1.1653 | 0.1852 | 1.1653 | 1.0795 |
| 0.3759 | 1.9012 | 962 | 1.1233 | 0.0 | 1.1233 | 1.0598 |
| 0.3759 | 1.9051 | 964 | 1.0464 | 0.0 | 1.0464 | 1.0229 |
| 0.3759 | 1.9091 | 966 | 0.8916 | 0.0 | 0.8916 | 0.9443 |
| 0.3759 | 1.9130 | 968 | 0.7429 | 0.2143 | 0.7429 | 0.8619 |
| 0.3759 | 1.9170 | 970 | 0.6870 | 0.2080 | 0.6870 | 0.8289 |
| 0.3759 | 1.9209 | 972 | 0.6818 | 0.2080 | 0.6818 | 0.8257 |
| 0.3759 | 1.9249 | 974 | 0.7524 | 0.2080 | 0.7524 | 0.8674 |
| 0.3759 | 1.9289 | 976 | 0.8795 | 0.0 | 0.8795 | 0.9378 |
| 0.3759 | 1.9328 | 978 | 0.9003 | 0.1852 | 0.9003 | 0.9489 |
| 0.3759 | 1.9368 | 980 | 0.8642 | 0.3433 | 0.8642 | 0.9296 |
| 0.3759 | 1.9407 | 982 | 0.9160 | 0.3636 | 0.9160 | 0.9571 |
| 0.3759 | 1.9447 | 984 | 1.0310 | 0.1852 | 1.0310 | 1.0154 |
| 0.3759 | 1.9486 | 986 | 1.0894 | 0.1852 | 1.0894 | 1.0437 |
| 0.3759 | 1.9526 | 988 | 0.9531 | 0.0 | 0.9531 | 0.9763 |
| 0.3759 | 1.9565 | 990 | 0.7629 | 0.2080 | 0.7629 | 0.8734 |
| 0.3759 | 1.9605 | 992 | 0.6815 | 0.2080 | 0.6815 | 0.8256 |
| 0.3759 | 1.9644 | 994 | 0.6797 | 0.2080 | 0.6797 | 0.8244 |
| 0.3759 | 1.9684 | 996 | 0.7323 | 0.2080 | 0.7323 | 0.8557 |
| 0.3759 | 1.9723 | 998 | 0.8996 | 0.0 | 0.8996 | 0.9485 |
| 0.1431 | 1.9763 | 1000 | 1.0377 | 0.1852 | 1.0377 | 1.0187 |
| 0.1431 | 1.9802 | 1002 | 1.0502 | 0.1852 | 1.0502 | 1.0248 |
| 0.1431 | 1.9842 | 1004 | 0.9511 | 0.0 | 0.9511 | 0.9753 |
| 0.1431 | 1.9881 | 1006 | 0.7954 | 0.2080 | 0.7954 | 0.8919 |
| 0.1431 | 1.9921 | 1008 | 0.7112 | 0.2080 | 0.7112 | 0.8433 |
| 0.1431 | 1.9960 | 1010 | 0.7126 | 0.2080 | 0.7126 | 0.8441 |
| 0.1431 | 2.0 | 1012 | 0.8021 | 0.2080 | 0.8021 | 0.8956 |
| 0.1431 | 2.0040 | 1014 | 1.0211 | 0.0 | 1.0211 | 1.0105 |
| 0.1431 | 2.0079 | 1016 | 1.1296 | 0.0 | 1.1296 | 1.0628 |
| 0.1431 | 2.0119 | 1018 | 1.0709 | 0.0 | 1.0709 | 1.0348 |
| 0.1431 | 2.0158 | 1020 | 0.9288 | 0.0 | 0.9288 | 0.9638 |
| 0.1431 | 2.0198 | 1022 | 0.7595 | 0.2143 | 0.7595 | 0.8715 |
| 0.1431 | 2.0237 | 1024 | 0.6759 | 0.2080 | 0.6759 | 0.8221 |
| 0.1431 | 2.0277 | 1026 | 0.6596 | 0.2080 | 0.6596 | 0.8121 |
| 0.1431 | 2.0316 | 1028 | 0.6867 | 0.2080 | 0.6867 | 0.8287 |
| 0.1431 | 2.0356 | 1030 | 0.8237 | 0.2143 | 0.8237 | 0.9076 |
| 0.1431 | 2.0395 | 1032 | 0.9732 | 0.1852 | 0.9732 | 0.9865 |
| 0.1431 | 2.0435 | 1034 | 0.9663 | 0.1852 | 0.9663 | 0.9830 |
| 0.1431 | 2.0474 | 1036 | 0.8395 | 0.2143 | 0.8395 | 0.9163 |
| 0.1431 | 2.0514 | 1038 | 0.7004 | 0.2080 | 0.7004 | 0.8369 |
| 0.1431 | 2.0553 | 1040 | 0.6937 | 0.2080 | 0.6937 | 0.8329 |
| 0.1431 | 2.0593 | 1042 | 0.7949 | 0.2143 | 0.7949 | 0.8916 |
| 0.1431 | 2.0632 | 1044 | 0.9643 | 0.0 | 0.9643 | 0.9820 |
| 0.1431 | 2.0672 | 1046 | 1.0626 | 0.0 | 1.0626 | 1.0308 |
| 0.1431 | 2.0711 | 1048 | 1.0810 | 0.0 | 1.0810 | 1.0397 |
| 0.1431 | 2.0751 | 1050 | 0.9832 | 0.0 | 0.9832 | 0.9915 |
| 0.1431 | 2.0791 | 1052 | 0.8235 | 0.0 | 0.8235 | 0.9074 |
| 0.1431 | 2.0830 | 1054 | 0.7182 | 0.2080 | 0.7182 | 0.8475 |
| 0.1431 | 2.0870 | 1056 | 0.6926 | 0.2080 | 0.6926 | 0.8322 |
| 0.1431 | 2.0909 | 1058 | 0.7420 | 0.2143 | 0.7420 | 0.8614 |
| 0.1431 | 2.0949 | 1060 | 0.8452 | 0.0 | 0.8452 | 0.9194 |
| 0.1431 | 2.0988 | 1062 | 0.9433 | 0.0 | 0.9433 | 0.9712 |
| 0.1431 | 2.1028 | 1064 | 0.9629 | 0.0 | 0.9629 | 0.9813 |
| 0.1431 | 2.1067 | 1066 | 0.8692 | 0.2143 | 0.8692 | 0.9323 |
| 0.1431 | 2.1107 | 1068 | 0.8141 | 0.2143 | 0.8141 | 0.9023 |
| 0.1431 | 2.1146 | 1070 | 0.8129 | 0.2143 | 0.8129 | 0.9016 |
| 0.1431 | 2.1186 | 1072 | 0.8513 | 0.2143 | 0.8513 | 0.9226 |
| 0.1431 | 2.1225 | 1074 | 0.8516 | 0.2143 | 0.8516 | 0.9228 |
| 0.1431 | 2.1265 | 1076 | 0.8078 | 0.2143 | 0.8078 | 0.8988 |
| 0.1431 | 2.1304 | 1078 | 0.7193 | 0.2080 | 0.7193 | 0.8481 |
| 0.1431 | 2.1344 | 1080 | 0.7020 | 0.2080 | 0.7020 | 0.8378 |
| 0.1431 | 2.1383 | 1082 | 0.7071 | 0.2080 | 0.7071 | 0.8409 |
| 0.1431 | 2.1423 | 1084 | 0.7838 | 0.2143 | 0.7838 | 0.8853 |
| 0.1431 | 2.1462 | 1086 | 0.9447 | 0.3636 | 0.9447 | 0.9720 |
| 0.1431 | 2.1502 | 1088 | 0.9843 | 0.1852 | 0.9843 | 0.9921 |
| 0.1431 | 2.1542 | 1090 | 0.9230 | 0.2143 | 0.9230 | 0.9607 |
| 0.1431 | 2.1581 | 1092 | 0.8283 | 0.2143 | 0.8283 | 0.9101 |
| 0.1431 | 2.1621 | 1094 | 0.7826 | 0.2080 | 0.7826 | 0.8847 |
| 0.1431 | 2.1660 | 1096 | 0.7677 | 0.2080 | 0.7677 | 0.8762 |
| 0.1431 | 2.1700 | 1098 | 0.7778 | 0.2080 | 0.7778 | 0.8819 |
| 0.1431 | 2.1739 | 1100 | 0.8060 | 0.2143 | 0.8060 | 0.8978 |
| 0.1431 | 2.1779 | 1102 | 0.8383 | 0.2143 | 0.8383 | 0.9156 |
| 0.1431 | 2.1818 | 1104 | 0.7853 | 0.2080 | 0.7853 | 0.8862 |
| 0.1431 | 2.1858 | 1106 | 0.7912 | 0.2080 | 0.7912 | 0.8895 |
| 0.1431 | 2.1897 | 1108 | 0.8941 | 0.2080 | 0.8941 | 0.9456 |
| 0.1431 | 2.1937 | 1110 | 1.0988 | 0.1852 | 1.0988 | 1.0482 |
| 0.1431 | 2.1976 | 1112 | 1.1571 | 0.1270 | 1.1571 | 1.0757 |
| 0.1431 | 2.2016 | 1114 | 1.0757 | 0.0 | 1.0757 | 1.0372 |
| 0.1431 | 2.2055 | 1116 | 0.9042 | 0.2080 | 0.9042 | 0.9509 |
| 0.1431 | 2.2095 | 1118 | 0.7861 | 0.2029 | 0.7861 | 0.8866 |
| 0.1431 | 2.2134 | 1120 | 0.7801 | 0.2029 | 0.7801 | 0.8833 |
| 0.1431 | 2.2174 | 1122 | 0.8128 | 0.2029 | 0.8128 | 0.9016 |
| 0.1431 | 2.2213 | 1124 | 0.9897 | 0.2143 | 0.9897 | 0.9949 |
| 0.1431 | 2.2253 | 1126 | 1.1139 | 0.0 | 1.1139 | 1.0554 |
| 0.1431 | 2.2292 | 1128 | 1.0724 | 0.0 | 1.0724 | 1.0355 |
| 0.1431 | 2.2332 | 1130 | 0.8922 | 0.2143 | 0.8922 | 0.9446 |
| 0.1431 | 2.2372 | 1132 | 0.7937 | 0.2080 | 0.7937 | 0.8909 |
| 0.1431 | 2.2411 | 1134 | 0.7739 | 0.2080 | 0.7739 | 0.8797 |
| 0.1431 | 2.2451 | 1136 | 0.7940 | 0.2080 | 0.7940 | 0.8911 |
| 0.1431 | 2.2490 | 1138 | 0.8347 | 0.2080 | 0.8347 | 0.9136 |
| 0.1431 | 2.2530 | 1140 | 0.8345 | 0.2080 | 0.8345 | 0.9135 |
| 0.1431 | 2.2569 | 1142 | 0.8203 | 0.2080 | 0.8203 | 0.9057 |
| 0.1431 | 2.2609 | 1144 | 0.8193 | 0.2080 | 0.8193 | 0.9051 |
| 0.1431 | 2.2648 | 1146 | 0.8108 | 0.2080 | 0.8108 | 0.9005 |
| 0.1431 | 2.2688 | 1148 | 0.7874 | 0.2080 | 0.7874 | 0.8874 |
| 0.1431 | 2.2727 | 1150 | 0.8199 | 0.2080 | 0.8199 | 0.9055 |
| 0.1431 | 2.2767 | 1152 | 0.8755 | 0.2080 | 0.8755 | 0.9357 |
| 0.1431 | 2.2806 | 1154 | 0.9623 | 0.0 | 0.9623 | 0.9809 |
| 0.1431 | 2.2846 | 1156 | 0.9215 | 0.2080 | 0.9215 | 0.9599 |
| 0.1431 | 2.2885 | 1158 | 0.8276 | 0.2080 | 0.8276 | 0.9097 |
| 0.1431 | 2.2925 | 1160 | 0.7947 | 0.2080 | 0.7947 | 0.8915 |
| 0.1431 | 2.2964 | 1162 | 0.8536 | 0.2080 | 0.8536 | 0.9239 |
| 0.1431 | 2.3004 | 1164 | 0.9875 | 0.0 | 0.9875 | 0.9937 |
| 0.1431 | 2.3043 | 1166 | 1.1104 | 0.0 | 1.1104 | 1.0538 |
| 0.1431 | 2.3083 | 1168 | 1.0560 | 0.0 | 1.0560 | 1.0276 |
| 0.1431 | 2.3123 | 1170 | 0.9256 | 0.0 | 0.9256 | 0.9621 |
| 0.1431 | 2.3162 | 1172 | 0.7860 | 0.0 | 0.7860 | 0.8866 |
| 0.1431 | 2.3202 | 1174 | 0.7206 | 0.2080 | 0.7206 | 0.8489 |
| 0.1431 | 2.3241 | 1176 | 0.7003 | 0.2080 | 0.7003 | 0.8368 |
| 0.1431 | 2.3281 | 1178 | 0.7280 | 0.2080 | 0.7280 | 0.8532 |
| 0.1431 | 2.3320 | 1180 | 0.8419 | 0.2080 | 0.8419 | 0.9176 |
| 0.1431 | 2.3360 | 1182 | 1.0839 | 0.0 | 1.0839 | 1.0411 |
| 0.1431 | 2.3399 | 1184 | 1.2253 | 0.0 | 1.2253 | 1.1070 |
| 0.1431 | 2.3439 | 1186 | 1.1514 | -0.0185 | 1.1514 | 1.0730 |
| 0.1431 | 2.3478 | 1188 | 0.9142 | 0.3433 | 0.9142 | 0.9561 |
| 0.1431 | 2.3518 | 1190 | 0.7410 | 0.4500 | 0.7410 | 0.8608 |
| 0.1431 | 2.3557 | 1192 | 0.7155 | 0.0610 | 0.7155 | 0.8459 |
| 0.1431 | 2.3597 | 1194 | 0.6997 | 0.4500 | 0.6997 | 0.8365 |
| 0.1431 | 2.3636 | 1196 | 0.7261 | 0.3265 | 0.7261 | 0.8521 |
| 0.1431 | 2.3676 | 1198 | 0.8926 | 0.3433 | 0.8926 | 0.9448 |
| 0.1431 | 2.3715 | 1200 | 1.0917 | 0.1538 | 1.0917 | 1.0449 |
| 0.1431 | 2.3755 | 1202 | 1.1665 | 0.3016 | 1.1665 | 1.0801 |
| 0.1431 | 2.3794 | 1204 | 1.0930 | 0.1538 | 1.0930 | 1.0455 |
| 0.1431 | 2.3834 | 1206 | 0.9275 | 0.1852 | 0.9275 | 0.9631 |
| 0.1431 | 2.3874 | 1208 | 0.7680 | 0.2080 | 0.7680 | 0.8764 |
| 0.1431 | 2.3913 | 1210 | 0.7281 | 0.2080 | 0.7281 | 0.8533 |
| 0.1431 | 2.3953 | 1212 | 0.7552 | 0.2080 | 0.7552 | 0.8690 |
| 0.1431 | 2.3992 | 1214 | 0.8249 | 0.2080 | 0.8249 | 0.9083 |
| 0.1431 | 2.4032 | 1216 | 0.8963 | 0.0 | 0.8963 | 0.9467 |
| 0.1431 | 2.4071 | 1218 | 0.9249 | 0.0 | 0.9249 | 0.9617 |
| 0.1431 | 2.4111 | 1220 | 0.8665 | 0.2080 | 0.8665 | 0.9309 |
| 0.1431 | 2.4150 | 1222 | 0.7711 | 0.2080 | 0.7711 | 0.8781 |
| 0.1431 | 2.4190 | 1224 | 0.7754 | 0.2080 | 0.7754 | 0.8805 |
| 0.1431 | 2.4229 | 1226 | 0.8605 | 0.3433 | 0.8605 | 0.9276 |
| 0.1431 | 2.4269 | 1228 | 0.8896 | 0.3433 | 0.8896 | 0.9432 |
| 0.1431 | 2.4308 | 1230 | 0.9617 | 0.1852 | 0.9617 | 0.9807 |
| 0.1431 | 2.4348 | 1232 | 0.9684 | 0.0 | 0.9684 | 0.9841 |
| 0.1431 | 2.4387 | 1234 | 0.9066 | 0.0 | 0.9066 | 0.9521 |
| 0.1431 | 2.4427 | 1236 | 0.8137 | 0.2143 | 0.8137 | 0.9021 |
| 0.1431 | 2.4466 | 1238 | 0.7321 | 0.2080 | 0.7321 | 0.8556 |
| 0.1431 | 2.4506 | 1240 | 0.6796 | 0.2029 | 0.6796 | 0.8244 |
| 0.1431 | 2.4545 | 1242 | 0.7023 | 0.2029 | 0.7023 | 0.8380 |
| 0.1431 | 2.4585 | 1244 | 0.7982 | 0.2080 | 0.7982 | 0.8934 |
| 0.1431 | 2.4625 | 1246 | 0.9044 | 0.2080 | 0.9044 | 0.9510 |
| 0.1431 | 2.4664 | 1248 | 0.9828 | 0.0 | 0.9828 | 0.9914 |
| 0.1431 | 2.4704 | 1250 | 1.0616 | 0.0 | 1.0616 | 1.0304 |
| 0.1431 | 2.4743 | 1252 | 0.9856 | 0.0 | 0.9856 | 0.9928 |
| 0.1431 | 2.4783 | 1254 | 0.8701 | 0.2080 | 0.8701 | 0.9328 |
| 0.1431 | 2.4822 | 1256 | 0.7631 | 0.2029 | 0.7631 | 0.8736 |
| 0.1431 | 2.4862 | 1258 | 0.7787 | 0.2080 | 0.7787 | 0.8824 |
| 0.1431 | 2.4901 | 1260 | 0.8764 | 0.2080 | 0.8764 | 0.9362 |
| 0.1431 | 2.4941 | 1262 | 1.0385 | 0.1852 | 1.0385 | 1.0191 |
| 0.1431 | 2.4980 | 1264 | 1.0881 | 0.1852 | 1.0881 | 1.0431 |
| 0.1431 | 2.5020 | 1266 | 0.9765 | 0.0 | 0.9765 | 0.9882 |
| 0.1431 | 2.5059 | 1268 | 0.8868 | 0.2080 | 0.8868 | 0.9417 |
| 0.1431 | 2.5099 | 1270 | 0.7447 | 0.2080 | 0.7447 | 0.8629 |
| 0.1431 | 2.5138 | 1272 | 0.7071 | 0.2029 | 0.7071 | 0.8409 |
| 0.1431 | 2.5178 | 1274 | 0.7142 | 0.2029 | 0.7142 | 0.8451 |
| 0.1431 | 2.5217 | 1276 | 0.7692 | 0.2080 | 0.7692 | 0.8770 |
| 0.1431 | 2.5257 | 1278 | 0.9381 | 0.3433 | 0.9381 | 0.9685 |
| 0.1431 | 2.5296 | 1280 | 1.1714 | 0.1538 | 1.1714 | 1.0823 |
| 0.1431 | 2.5336 | 1282 | 1.2308 | 0.3016 | 1.2308 | 1.1094 |
| 0.1431 | 2.5375 | 1284 | 1.1517 | 0.1852 | 1.1517 | 1.0732 |
| 0.1431 | 2.5415 | 1286 | 0.9702 | 0.2080 | 0.9702 | 0.9850 |
| 0.1431 | 2.5455 | 1288 | 0.8341 | 0.2080 | 0.8341 | 0.9133 |
| 0.1431 | 2.5494 | 1290 | 0.7712 | 0.2080 | 0.7712 | 0.8782 |
| 0.1431 | 2.5534 | 1292 | 0.7784 | 0.2080 | 0.7784 | 0.8823 |
| 0.1431 | 2.5573 | 1294 | 0.8436 | 0.2080 | 0.8436 | 0.9185 |
| 0.1431 | 2.5613 | 1296 | 0.9148 | 0.0179 | 0.9148 | 0.9564 |
| 0.1431 | 2.5652 | 1298 | 0.9518 | 0.0 | 0.9518 | 0.9756 |
| 0.1431 | 2.5692 | 1300 | 0.9413 | 0.0 | 0.9413 | 0.9702 |
| 0.1431 | 2.5731 | 1302 | 0.9226 | 0.0 | 0.9226 | 0.9605 |
| 0.1431 | 2.5771 | 1304 | 0.8644 | 0.2080 | 0.8644 | 0.9297 |
| 0.1431 | 2.5810 | 1306 | 0.8373 | 0.2080 | 0.8373 | 0.9150 |
| 0.1431 | 2.5850 | 1308 | 0.8768 | 0.2080 | 0.8768 | 0.9364 |
| 0.1431 | 2.5889 | 1310 | 0.8732 | 0.2080 | 0.8732 | 0.9345 |
| 0.1431 | 2.5929 | 1312 | 0.9062 | 0.0 | 0.9062 | 0.9519 |
| 0.1431 | 2.5968 | 1314 | 0.9167 | 0.0 | 0.9167 | 0.9574 |
| 0.1431 | 2.6008 | 1316 | 0.8748 | 0.2080 | 0.8748 | 0.9353 |
| 0.1431 | 2.6047 | 1318 | 0.8761 | 0.2080 | 0.8761 | 0.9360 |
| 0.1431 | 2.6087 | 1320 | 0.9072 | 0.2080 | 0.9072 | 0.9525 |
| 0.1431 | 2.6126 | 1322 | 0.9536 | 0.0179 | 0.9536 | 0.9765 |
| 0.1431 | 2.6166 | 1324 | 0.9328 | 0.0 | 0.9328 | 0.9658 |
| 0.1431 | 2.6206 | 1326 | 0.8821 | 0.2080 | 0.8821 | 0.9392 |
| 0.1431 | 2.6245 | 1328 | 0.8106 | 0.2080 | 0.8106 | 0.9003 |
| 0.1431 | 2.6285 | 1330 | 0.8007 | 0.2080 | 0.8007 | 0.8948 |
| 0.1431 | 2.6324 | 1332 | 0.7742 | 0.2080 | 0.7742 | 0.8799 |
| 0.1431 | 2.6364 | 1334 | 0.7682 | 0.2080 | 0.7682 | 0.8765 |
| 0.1431 | 2.6403 | 1336 | 0.8204 | 0.2080 | 0.8204 | 0.9058 |
| 0.1431 | 2.6443 | 1338 | 0.8646 | 0.2080 | 0.8646 | 0.9299 |
| 0.1431 | 2.6482 | 1340 | 0.8037 | 0.2080 | 0.8037 | 0.8965 |
| 0.1431 | 2.6522 | 1342 | 0.7561 | 0.2029 | 0.7561 | 0.8696 |
| 0.1431 | 2.6561 | 1344 | 0.7921 | 0.2029 | 0.7921 | 0.8900 |
| 0.1431 | 2.6601 | 1346 | 0.8648 | 0.2080 | 0.8648 | 0.9300 |
| 0.1431 | 2.6640 | 1348 | 0.8735 | 0.2080 | 0.8735 | 0.9346 |
| 0.1431 | 2.6680 | 1350 | 0.9732 | 0.0 | 0.9732 | 0.9865 |
| 0.1431 | 2.6719 | 1352 | 0.9817 | 0.0 | 0.9817 | 0.9908 |
| 0.1431 | 2.6759 | 1354 | 0.9339 | 0.0 | 0.9339 | 0.9664 |
| 0.1431 | 2.6798 | 1356 | 0.8783 | 0.0 | 0.8783 | 0.9372 |
| 0.1431 | 2.6838 | 1358 | 0.7738 | 0.2080 | 0.7738 | 0.8796 |
| 0.1431 | 2.6877 | 1360 | 0.7291 | 0.2080 | 0.7291 | 0.8539 |
| 0.1431 | 2.6917 | 1362 | 0.7505 | 0.2080 | 0.7505 | 0.8663 |
| 0.1431 | 2.6957 | 1364 | 0.7861 | 0.2080 | 0.7861 | 0.8866 |
| 0.1431 | 2.6996 | 1366 | 0.9169 | 0.2080 | 0.9169 | 0.9576 |
| 0.1431 | 2.7036 | 1368 | 0.9793 | 0.0179 | 0.9793 | 0.9896 |
| 0.1431 | 2.7075 | 1370 | 0.9713 | 0.0179 | 0.9713 | 0.9855 |
| 0.1431 | 2.7115 | 1372 | 0.8726 | 0.2080 | 0.8726 | 0.9341 |
| 0.1431 | 2.7154 | 1374 | 0.8669 | 0.2080 | 0.8669 | 0.9311 |
| 0.1431 | 2.7194 | 1376 | 0.9305 | 0.0 | 0.9305 | 0.9646 |
| 0.1431 | 2.7233 | 1378 | 0.9212 | 0.0 | 0.9212 | 0.9598 |
| 0.1431 | 2.7273 | 1380 | 0.8454 | 0.0 | 0.8454 | 0.9195 |
| 0.1431 | 2.7312 | 1382 | 0.7923 | 0.2080 | 0.7923 | 0.8901 |
| 0.1431 | 2.7352 | 1384 | 0.7450 | 0.2080 | 0.7450 | 0.8632 |
| 0.1431 | 2.7391 | 1386 | 0.7439 | 0.2080 | 0.7439 | 0.8625 |
| 0.1431 | 2.7431 | 1388 | 0.7882 | 0.2080 | 0.7882 | 0.8878 |
| 0.1431 | 2.7470 | 1390 | 0.8931 | 0.2080 | 0.8931 | 0.9450 |
| 0.1431 | 2.7510 | 1392 | 0.9865 | 0.0179 | 0.9865 | 0.9932 |
| 0.1431 | 2.7549 | 1394 | 0.9770 | 0.0179 | 0.9770 | 0.9884 |
| 0.1431 | 2.7589 | 1396 | 0.8704 | 0.2080 | 0.8704 | 0.9329 |
| 0.1431 | 2.7628 | 1398 | 0.8612 | 0.2080 | 0.8612 | 0.9280 |
| 0.1431 | 2.7668 | 1400 | 0.8302 | 0.2080 | 0.8302 | 0.9112 |
| 0.1431 | 2.7708 | 1402 | 0.8192 | 0.2080 | 0.8192 | 0.9051 |
| 0.1431 | 2.7747 | 1404 | 0.8012 | 0.2080 | 0.8012 | 0.8951 |
| 0.1431 | 2.7787 | 1406 | 0.8410 | 0.2080 | 0.8410 | 0.9171 |
| 0.1431 | 2.7826 | 1408 | 0.9554 | 0.2080 | 0.9554 | 0.9774 |
| 0.1431 | 2.7866 | 1410 | 1.0862 | 0.0 | 1.0862 | 1.0422 |
| 0.1431 | 2.7905 | 1412 | 1.0941 | 0.0 | 1.0941 | 1.0460 |
| 0.1431 | 2.7945 | 1414 | 0.9648 | 0.0 | 0.9648 | 0.9822 |
| 0.1431 | 2.7984 | 1416 | 0.8378 | 0.2080 | 0.8378 | 0.9153 |
| 0.1431 | 2.8024 | 1418 | 0.7909 | 0.2080 | 0.7909 | 0.8893 |
| 0.1431 | 2.8063 | 1420 | 0.8099 | 0.2080 | 0.8099 | 0.9000 |
| 0.1431 | 2.8103 | 1422 | 0.9049 | 0.2080 | 0.9049 | 0.9513 |
| 0.1431 | 2.8142 | 1424 | 0.9466 | 0.0179 | 0.9466 | 0.9730 |
| 0.1431 | 2.8182 | 1426 | 0.9059 | 0.0179 | 0.9059 | 0.9518 |
| 0.1431 | 2.8221 | 1428 | 0.8181 | 0.2080 | 0.8181 | 0.9045 |
| 0.1431 | 2.8261 | 1430 | 0.7865 | 0.2080 | 0.7865 | 0.8869 |
| 0.1431 | 2.8300 | 1432 | 0.8095 | 0.2080 | 0.8095 | 0.8997 |
| 0.1431 | 2.8340 | 1434 | 0.8373 | 0.2080 | 0.8373 | 0.9151 |
| 0.1431 | 2.8379 | 1436 | 0.9082 | 0.2080 | 0.9082 | 0.9530 |
| 0.1431 | 2.8419 | 1438 | 0.9925 | 0.0179 | 0.9925 | 0.9963 |
| 0.1431 | 2.8458 | 1440 | 0.9721 | 0.0179 | 0.9721 | 0.9859 |
| 0.1431 | 2.8498 | 1442 | 0.8752 | 0.2080 | 0.8752 | 0.9355 |
| 0.1431 | 2.8538 | 1444 | 0.8028 | 0.2080 | 0.8028 | 0.8960 |
| 0.1431 | 2.8577 | 1446 | 0.8103 | 0.2080 | 0.8103 | 0.9002 |
| 0.1431 | 2.8617 | 1448 | 0.8686 | 0.2080 | 0.8686 | 0.9320 |
| 0.1431 | 2.8656 | 1450 | 0.9909 | 0.0179 | 0.9909 | 0.9954 |
| 0.1431 | 2.8696 | 1452 | 1.0226 | 0.0179 | 1.0226 | 1.0112 |
| 0.1431 | 2.8735 | 1454 | 0.9463 | 0.2080 | 0.9463 | 0.9728 |
| 0.1431 | 2.8775 | 1456 | 0.8470 | 0.2080 | 0.8470 | 0.9203 |
| 0.1431 | 2.8814 | 1458 | 0.8196 | 0.2080 | 0.8196 | 0.9053 |
| 0.1431 | 2.8854 | 1460 | 0.8344 | 0.2080 | 0.8344 | 0.9135 |
| 0.1431 | 2.8893 | 1462 | 0.9029 | 0.2080 | 0.9029 | 0.9502 |
| 0.1431 | 2.8933 | 1464 | 0.9165 | 0.2080 | 0.9165 | 0.9573 |
| 0.1431 | 2.8972 | 1466 | 0.8622 | 0.2080 | 0.8622 | 0.9286 |
| 0.1431 | 2.9012 | 1468 | 0.8284 | 0.2080 | 0.8284 | 0.9102 |
| 0.1431 | 2.9051 | 1470 | 0.8083 | 0.2080 | 0.8083 | 0.8991 |
| 0.1431 | 2.9091 | 1472 | 0.7894 | 0.2080 | 0.7894 | 0.8885 |
| 0.1431 | 2.9130 | 1474 | 0.8157 | 0.2080 | 0.8157 | 0.9032 |
| 0.1431 | 2.9170 | 1476 | 0.8231 | 0.2080 | 0.8231 | 0.9072 |
| 0.1431 | 2.9209 | 1478 | 0.8576 | 0.2080 | 0.8576 | 0.9260 |
| 0.1431 | 2.9249 | 1480 | 0.8750 | 0.2080 | 0.8750 | 0.9354 |
| 0.1431 | 2.9289 | 1482 | 0.8539 | 0.2080 | 0.8539 | 0.9240 |
| 0.1431 | 2.9328 | 1484 | 0.8549 | 0.2080 | 0.8549 | 0.9246 |
| 0.1431 | 2.9368 | 1486 | 0.8564 | 0.2080 | 0.8564 | 0.9254 |
| 0.1431 | 2.9407 | 1488 | 0.8630 | 0.2080 | 0.8630 | 0.9290 |
| 0.1431 | 2.9447 | 1490 | 0.8976 | 0.2080 | 0.8976 | 0.9474 |
| 0.1431 | 2.9486 | 1492 | 0.9062 | 0.2080 | 0.9062 | 0.9519 |
| 0.1431 | 2.9526 | 1494 | 0.9357 | 0.2080 | 0.9357 | 0.9673 |
| 0.1431 | 2.9565 | 1496 | 0.8990 | 0.2080 | 0.8990 | 0.9481 |
| 0.1431 | 2.9605 | 1498 | 0.8118 | 0.2080 | 0.8118 | 0.9010 |
| 0.1075 | 2.9644 | 1500 | 0.7769 | 0.2080 | 0.7769 | 0.8814 |
| 0.1075 | 2.9684 | 1502 | 0.7979 | 0.2080 | 0.7979 | 0.8932 |
| 0.1075 | 2.9723 | 1504 | 0.8502 | 0.2080 | 0.8502 | 0.9221 |
| 0.1075 | 2.9763 | 1506 | 0.8594 | 0.2080 | 0.8594 | 0.9270 |
| 0.1075 | 2.9802 | 1508 | 0.8640 | 0.2080 | 0.8640 | 0.9295 |
| 0.1075 | 2.9842 | 1510 | 0.8696 | 0.2080 | 0.8696 | 0.9325 |
| 0.1075 | 2.9881 | 1512 | 0.8464 | 0.2080 | 0.8464 | 0.9200 |
| 0.1075 | 2.9921 | 1514 | 0.8343 | 0.2080 | 0.8343 | 0.9134 |
| 0.1075 | 2.9960 | 1516 | 0.8576 | 0.2080 | 0.8576 | 0.9261 |
| 0.1075 | 3.0 | 1518 | 0.9113 | 0.2080 | 0.9113 | 0.9546 |
| 0.1075 | 3.0040 | 1520 | 0.9859 | 0.2080 | 0.9859 | 0.9929 |
| 0.1075 | 3.0079 | 1522 | 0.9530 | 0.2080 | 0.9530 | 0.9762 |
| 0.1075 | 3.0119 | 1524 | 0.8713 | 0.2080 | 0.8713 | 0.9334 |
| 0.1075 | 3.0158 | 1526 | 0.8405 | 0.2080 | 0.8405 | 0.9168 |
| 0.1075 | 3.0198 | 1528 | 0.8258 | 0.2080 | 0.8258 | 0.9087 |
| 0.1075 | 3.0237 | 1530 | 0.8545 | 0.2080 | 0.8545 | 0.9244 |
| 0.1075 | 3.0277 | 1532 | 0.9580 | 0.2080 | 0.9580 | 0.9787 |
| 0.1075 | 3.0316 | 1534 | 1.0160 | 0.0179 | 1.0160 | 1.0080 |
| 0.1075 | 3.0356 | 1536 | 0.9663 | 0.2080 | 0.9663 | 0.9830 |
| 0.1075 | 3.0395 | 1538 | 0.8673 | 0.2080 | 0.8673 | 0.9313 |
| 0.1075 | 3.0435 | 1540 | 0.7963 | 0.2080 | 0.7963 | 0.8923 |
| 0.1075 | 3.0474 | 1542 | 0.7946 | 0.2080 | 0.7946 | 0.8914 |
| 0.1075 | 3.0514 | 1544 | 0.8297 | 0.2080 | 0.8297 | 0.9109 |
| 0.1075 | 3.0553 | 1546 | 0.8146 | 0.2080 | 0.8146 | 0.9026 |
| 0.1075 | 3.0593 | 1548 | 0.7924 | 0.2080 | 0.7924 | 0.8902 |
| 0.1075 | 3.0632 | 1550 | 0.8363 | 0.2080 | 0.8363 | 0.9145 |
| 0.1075 | 3.0672 | 1552 | 0.9251 | 0.2080 | 0.9251 | 0.9618 |
| 0.1075 | 3.0711 | 1554 | 0.9096 | 0.2080 | 0.9096 | 0.9537 |
| 0.1075 | 3.0751 | 1556 | 0.8215 | 0.2080 | 0.8215 | 0.9064 |
| 0.1075 | 3.0791 | 1558 | 0.7818 | 0.2080 | 0.7818 | 0.8842 |
| 0.1075 | 3.0830 | 1560 | 0.7784 | 0.2080 | 0.7784 | 0.8823 |
| 0.1075 | 3.0870 | 1562 | 0.8000 | 0.2080 | 0.8000 | 0.8944 |
| 0.1075 | 3.0909 | 1564 | 0.8767 | 0.2080 | 0.8767 | 0.9363 |
| 0.1075 | 3.0949 | 1566 | 1.0493 | 0.1818 | 1.0493 | 1.0244 |
| 0.1075 | 3.0988 | 1568 | 1.1379 | 0.1538 | 1.1379 | 1.0667 |
| 0.1075 | 3.1028 | 1570 | 1.0783 | 0.0179 | 1.0783 | 1.0384 |
| 0.1075 | 3.1067 | 1572 | 0.9631 | 0.0179 | 0.9631 | 0.9814 |
| 0.1075 | 3.1107 | 1574 | 0.8494 | 0.2080 | 0.8494 | 0.9216 |
| 0.1075 | 3.1146 | 1576 | 0.8181 | 0.2080 | 0.8181 | 0.9045 |
| 0.1075 | 3.1186 | 1578 | 0.8218 | 0.2080 | 0.8218 | 0.9065 |
| 0.1075 | 3.1225 | 1580 | 0.8325 | 0.2080 | 0.8325 | 0.9124 |
| 0.1075 | 3.1265 | 1582 | 0.8785 | 0.2080 | 0.8785 | 0.9373 |
| 0.1075 | 3.1304 | 1584 | 0.9166 | 0.2080 | 0.9166 | 0.9574 |
| 0.1075 | 3.1344 | 1586 | 0.9351 | 0.2080 | 0.9351 | 0.9670 |
| 0.1075 | 3.1383 | 1588 | 0.9214 | 0.2080 | 0.9214 | 0.9599 |
| 0.1075 | 3.1423 | 1590 | 0.8744 | 0.2080 | 0.8744 | 0.9351 |
| 0.1075 | 3.1462 | 1592 | 0.8587 | 0.2080 | 0.8587 | 0.9267 |
| 0.1075 | 3.1502 | 1594 | 0.8870 | 0.2080 | 0.8870 | 0.9418 |
| 0.1075 | 3.1542 | 1596 | 0.9409 | 0.2080 | 0.9409 | 0.9700 |
| 0.1075 | 3.1581 | 1598 | 0.9153 | 0.2080 | 0.9153 | 0.9567 |
| 0.1075 | 3.1621 | 1600 | 0.9045 | 0.2080 | 0.9045 | 0.9510 |
| 0.1075 | 3.1660 | 1602 | 0.8661 | 0.2080 | 0.8661 | 0.9307 |
| 0.1075 | 3.1700 | 1604 | 0.8576 | 0.2080 | 0.8576 | 0.9260 |
| 0.1075 | 3.1739 | 1606 | 0.8686 | 0.2080 | 0.8686 | 0.9320 |
| 0.1075 | 3.1779 | 1608 | 0.8764 | 0.2080 | 0.8764 | 0.9361 |
| 0.1075 | 3.1818 | 1610 | 0.8931 | 0.2080 | 0.8931 | 0.9451 |
| 0.1075 | 3.1858 | 1612 | 0.8901 | 0.2080 | 0.8901 | 0.9434 |
| 0.1075 | 3.1897 | 1614 | 0.8612 | 0.2080 | 0.8612 | 0.9280 |
| 0.1075 | 3.1937 | 1616 | 0.8838 | 0.2080 | 0.8838 | 0.9401 |
| 0.1075 | 3.1976 | 1618 | 0.8726 | 0.2080 | 0.8726 | 0.9341 |
| 0.1075 | 3.2016 | 1620 | 0.8771 | 0.2080 | 0.8771 | 0.9365 |
| 0.1075 | 3.2055 | 1622 | 0.9206 | 0.2080 | 0.9206 | 0.9595 |
| 0.1075 | 3.2095 | 1624 | 0.9475 | 0.2080 | 0.9475 | 0.9734 |
| 0.1075 | 3.2134 | 1626 | 0.9167 | 0.2080 | 0.9167 | 0.9574 |
| 0.1075 | 3.2174 | 1628 | 0.8386 | 0.2080 | 0.8386 | 0.9157 |
| 0.1075 | 3.2213 | 1630 | 0.8126 | 0.2080 | 0.8126 | 0.9014 |
| 0.1075 | 3.2253 | 1632 | 0.8245 | 0.2080 | 0.8245 | 0.9080 |
| 0.1075 | 3.2292 | 1634 | 0.8557 | 0.2080 | 0.8557 | 0.9251 |
| 0.1075 | 3.2332 | 1636 | 0.9249 | 0.0179 | 0.9249 | 0.9617 |
| 0.1075 | 3.2372 | 1638 | 0.9266 | 0.0179 | 0.9266 | 0.9626 |
| 0.1075 | 3.2411 | 1640 | 0.8463 | 0.2080 | 0.8463 | 0.9199 |
| 0.1075 | 3.2451 | 1642 | 0.7945 | 0.2080 | 0.7945 | 0.8914 |
| 0.1075 | 3.2490 | 1644 | 0.7956 | 0.2080 | 0.7956 | 0.8920 |
| 0.1075 | 3.2530 | 1646 | 0.8684 | 0.2080 | 0.8684 | 0.9319 |
| 0.1075 | 3.2569 | 1648 | 0.9867 | 0.0179 | 0.9867 | 0.9933 |
| 0.1075 | 3.2609 | 1650 | 1.0329 | 0.0179 | 1.0329 | 1.0163 |
| 0.1075 | 3.2648 | 1652 | 0.9895 | 0.2080 | 0.9895 | 0.9947 |
| 0.1075 | 3.2688 | 1654 | 0.9478 | 0.2080 | 0.9478 | 0.9736 |
| 0.1075 | 3.2727 | 1656 | 0.8441 | 0.2080 | 0.8441 | 0.9187 |
| 0.1075 | 3.2767 | 1658 | 0.8009 | 0.2080 | 0.8009 | 0.8949 |
| 0.1075 | 3.2806 | 1660 | 0.7943 | 0.2080 | 0.7943 | 0.8912 |
| 0.1075 | 3.2846 | 1662 | 0.7950 | 0.2080 | 0.7950 | 0.8916 |
| 0.1075 | 3.2885 | 1664 | 0.8307 | 0.2080 | 0.8307 | 0.9114 |
| 0.1075 | 3.2925 | 1666 | 0.8389 | 0.2080 | 0.8389 | 0.9159 |
| 0.1075 | 3.2964 | 1668 | 0.8451 | 0.2080 | 0.8451 | 0.9193 |
| 0.1075 | 3.3004 | 1670 | 0.9079 | 0.2080 | 0.9079 | 0.9528 |
| 0.1075 | 3.3043 | 1672 | 0.8976 | 0.2080 | 0.8976 | 0.9474 |
| 0.1075 | 3.3083 | 1674 | 0.9004 | 0.2080 | 0.9004 | 0.9489 |
| 0.1075 | 3.3123 | 1676 | 0.9191 | 0.2080 | 0.9191 | 0.9587 |
| 0.1075 | 3.3162 | 1678 | 0.8370 | 0.2080 | 0.8370 | 0.9149 |
| 0.1075 | 3.3202 | 1680 | 0.7597 | 0.2080 | 0.7597 | 0.8716 |
| 0.1075 | 3.3241 | 1682 | 0.7661 | 0.2080 | 0.7661 | 0.8753 |
| 0.1075 | 3.3281 | 1684 | 0.8006 | 0.2080 | 0.8006 | 0.8947 |
| 0.1075 | 3.3320 | 1686 | 0.8508 | 0.2080 | 0.8508 | 0.9224 |
| 0.1075 | 3.3360 | 1688 | 0.8236 | 0.2080 | 0.8236 | 0.9075 |
| 0.1075 | 3.3399 | 1690 | 0.8256 | 0.2080 | 0.8256 | 0.9086 |
| 0.1075 | 3.3439 | 1692 | 0.8634 | 0.0 | 0.8634 | 0.9292 |
| 0.1075 | 3.3478 | 1694 | 0.8196 | 0.2143 | 0.8196 | 0.9053 |
| 0.1075 | 3.3518 | 1696 | 0.7268 | 0.2080 | 0.7268 | 0.8525 |
| 0.1075 | 3.3557 | 1698 | 0.6890 | 0.2080 | 0.6890 | 0.8300 |
| 0.1075 | 3.3597 | 1700 | 0.6797 | 0.2080 | 0.6797 | 0.8244 |
| 0.1075 | 3.3636 | 1702 | 0.6912 | 0.2080 | 0.6912 | 0.8314 |
| 0.1075 | 3.3676 | 1704 | 0.7666 | 0.2080 | 0.7666 | 0.8756 |
| 0.1075 | 3.3715 | 1706 | 0.8608 | 0.2080 | 0.8608 | 0.9278 |
| 0.1075 | 3.3755 | 1708 | 0.8806 | 0.2080 | 0.8806 | 0.9384 |
| 0.1075 | 3.3794 | 1710 | 0.8182 | 0.2080 | 0.8182 | 0.9045 |
| 0.1075 | 3.3834 | 1712 | 0.7561 | 0.2080 | 0.7561 | 0.8695 |
| 0.1075 | 3.3874 | 1714 | 0.7660 | 0.2080 | 0.7660 | 0.8752 |
| 0.1075 | 3.3913 | 1716 | 0.8614 | 0.2080 | 0.8614 | 0.9281 |
| 0.1075 | 3.3953 | 1718 | 0.9601 | 0.2080 | 0.9601 | 0.9798 |
| 0.1075 | 3.3992 | 1720 | 0.9254 | 0.2080 | 0.9254 | 0.9620 |
| 0.1075 | 3.4032 | 1722 | 0.8426 | 0.2080 | 0.8426 | 0.9179 |
| 0.1075 | 3.4071 | 1724 | 0.7634 | 0.2080 | 0.7634 | 0.8738 |
| 0.1075 | 3.4111 | 1726 | 0.7438 | 0.2080 | 0.7438 | 0.8624 |
| 0.1075 | 3.4150 | 1728 | 0.7565 | 0.2080 | 0.7565 | 0.8698 |
| 0.1075 | 3.4190 | 1730 | 0.7979 | 0.2080 | 0.7979 | 0.8933 |
| 0.1075 | 3.4229 | 1732 | 0.8254 | 0.2080 | 0.8254 | 0.9085 |
| 0.1075 | 3.4269 | 1734 | 0.8883 | 0.2080 | 0.8883 | 0.9425 |
| 0.1075 | 3.4308 | 1736 | 0.9295 | 0.2080 | 0.9295 | 0.9641 |
| 0.1075 | 3.4348 | 1738 | 0.8606 | 0.2080 | 0.8606 | 0.9277 |
| 0.1075 | 3.4387 | 1740 | 0.8626 | 0.2080 | 0.8626 | 0.9288 |
| 0.1075 | 3.4427 | 1742 | 0.8647 | 0.2080 | 0.8647 | 0.9299 |
| 0.1075 | 3.4466 | 1744 | 0.8430 | 0.2080 | 0.8430 | 0.9182 |
| 0.1075 | 3.4506 | 1746 | 0.8388 | 0.2080 | 0.8388 | 0.9159 |
| 0.1075 | 3.4545 | 1748 | 0.8423 | 0.2080 | 0.8423 | 0.9178 |
| 0.1075 | 3.4585 | 1750 | 0.8355 | 0.2080 | 0.8355 | 0.9141 |
| 0.1075 | 3.4625 | 1752 | 0.8288 | 0.2080 | 0.8288 | 0.9104 |
| 0.1075 | 3.4664 | 1754 | 0.8450 | 0.2080 | 0.8450 | 0.9192 |
| 0.1075 | 3.4704 | 1756 | 0.8203 | 0.2080 | 0.8203 | 0.9057 |
| 0.1075 | 3.4743 | 1758 | 0.7678 | 0.2080 | 0.7678 | 0.8763 |
| 0.1075 | 3.4783 | 1760 | 0.7772 | 0.2080 | 0.7772 | 0.8816 |
| 0.1075 | 3.4822 | 1762 | 0.7942 | 0.2080 | 0.7942 | 0.8912 |
| 0.1075 | 3.4862 | 1764 | 0.7692 | 0.2080 | 0.7692 | 0.8770 |
| 0.1075 | 3.4901 | 1766 | 0.7495 | 0.2080 | 0.7495 | 0.8657 |
| 0.1075 | 3.4941 | 1768 | 0.7584 | 0.2080 | 0.7584 | 0.8709 |
| 0.1075 | 3.4980 | 1770 | 0.8345 | 0.2080 | 0.8345 | 0.9135 |
| 0.1075 | 3.5020 | 1772 | 0.8730 | 0.2080 | 0.8730 | 0.9343 |
| 0.1075 | 3.5059 | 1774 | 0.8292 | 0.2080 | 0.8292 | 0.9106 |
| 0.1075 | 3.5099 | 1776 | 0.8223 | 0.2080 | 0.8223 | 0.9068 |
| 0.1075 | 3.5138 | 1778 | 0.8088 | 0.2080 | 0.8088 | 0.8993 |
| 0.1075 | 3.5178 | 1780 | 0.8764 | 0.2080 | 0.8764 | 0.9362 |
| 0.1075 | 3.5217 | 1782 | 0.9950 | 0.0179 | 0.9950 | 0.9975 |
| 0.1075 | 3.5257 | 1784 | 1.0039 | 0.0179 | 1.0039 | 1.0020 |
| 0.1075 | 3.5296 | 1786 | 0.9050 | 0.2080 | 0.9050 | 0.9513 |
| 0.1075 | 3.5336 | 1788 | 0.8202 | 0.2080 | 0.8202 | 0.9057 |
| 0.1075 | 3.5375 | 1790 | 0.7291 | 0.2080 | 0.7291 | 0.8539 |
| 0.1075 | 3.5415 | 1792 | 0.7215 | 0.2080 | 0.7215 | 0.8494 |
| 0.1075 | 3.5455 | 1794 | 0.7711 | 0.2080 | 0.7711 | 0.8781 |
| 0.1075 | 3.5494 | 1796 | 0.7839 | 0.2080 | 0.7839 | 0.8854 |
| 0.1075 | 3.5534 | 1798 | 0.7969 | 0.2080 | 0.7969 | 0.8927 |
| 0.1075 | 3.5573 | 1800 | 0.7602 | 0.2080 | 0.7602 | 0.8719 |
| 0.1075 | 3.5613 | 1802 | 0.7008 | 0.2080 | 0.7008 | 0.8372 |
| 0.1075 | 3.5652 | 1804 | 0.7170 | 0.2080 | 0.7170 | 0.8468 |
| 0.1075 | 3.5692 | 1806 | 0.7512 | 0.2080 | 0.7512 | 0.8667 |
| 0.1075 | 3.5731 | 1808 | 0.7213 | 0.2080 | 0.7213 | 0.8493 |
| 0.1075 | 3.5771 | 1810 | 0.7056 | 0.2080 | 0.7056 | 0.8400 |
| 0.1075 | 3.5810 | 1812 | 0.7206 | 0.2080 | 0.7206 | 0.8489 |
| 0.1075 | 3.5850 | 1814 | 0.7868 | 0.2080 | 0.7868 | 0.8870 |
| 0.1075 | 3.5889 | 1816 | 0.8948 | 0.2143 | 0.8948 | 0.9460 |
| 0.1075 | 3.5929 | 1818 | 0.9381 | 0.0 | 0.9381 | 0.9686 |
| 0.1075 | 3.5968 | 1820 | 0.9007 | 0.0 | 0.9007 | 0.9491 |
| 0.1075 | 3.6008 | 1822 | 0.7835 | 0.2080 | 0.7835 | 0.8852 |
| 0.1075 | 3.6047 | 1824 | 0.6834 | 0.2080 | 0.6834 | 0.8267 |
| 0.1075 | 3.6087 | 1826 | 0.6693 | 0.2029 | 0.6693 | 0.8181 |
| 0.1075 | 3.6126 | 1828 | 0.6638 | 0.2029 | 0.6638 | 0.8147 |
| 0.1075 | 3.6166 | 1830 | 0.6822 | 0.2080 | 0.6822 | 0.8259 |
| 0.1075 | 3.6206 | 1832 | 0.7574 | 0.2080 | 0.7574 | 0.8703 |
| 0.1075 | 3.6245 | 1834 | 0.8141 | 0.2080 | 0.8141 | 0.9023 |
| 0.1075 | 3.6285 | 1836 | 0.7961 | 0.2080 | 0.7961 | 0.8923 |
| 0.1075 | 3.6324 | 1838 | 0.7348 | 0.2080 | 0.7348 | 0.8572 |
| 0.1075 | 3.6364 | 1840 | 0.6889 | 0.2080 | 0.6889 | 0.8300 |
| 0.1075 | 3.6403 | 1842 | 0.7143 | 0.2080 | 0.7143 | 0.8452 |
| 0.1075 | 3.6443 | 1844 | 0.7564 | 0.2080 | 0.7564 | 0.8697 |
| 0.1075 | 3.6482 | 1846 | 0.7835 | 0.2080 | 0.7835 | 0.8852 |
| 0.1075 | 3.6522 | 1848 | 0.7707 | 0.2080 | 0.7707 | 0.8779 |
| 0.1075 | 3.6561 | 1850 | 0.7148 | 0.2080 | 0.7148 | 0.8455 |
| 0.1075 | 3.6601 | 1852 | 0.6858 | 0.2080 | 0.6858 | 0.8282 |
| 0.1075 | 3.6640 | 1854 | 0.6831 | 0.2029 | 0.6831 | 0.8265 |
| 0.1075 | 3.6680 | 1856 | 0.6877 | 0.2029 | 0.6877 | 0.8293 |
| 0.1075 | 3.6719 | 1858 | 0.7099 | 0.2080 | 0.7099 | 0.8426 |
| 0.1075 | 3.6759 | 1860 | 0.7193 | 0.2080 | 0.7193 | 0.8481 |
| 0.1075 | 3.6798 | 1862 | 0.7402 | 0.2080 | 0.7402 | 0.8604 |
| 0.1075 | 3.6838 | 1864 | 0.7839 | 0.2080 | 0.7839 | 0.8854 |
| 0.1075 | 3.6877 | 1866 | 0.8238 | 0.2080 | 0.8238 | 0.9076 |
| 0.1075 | 3.6917 | 1868 | 0.7836 | 0.2080 | 0.7836 | 0.8852 |
| 0.1075 | 3.6957 | 1870 | 0.7493 | 0.2080 | 0.7493 | 0.8656 |
| 0.1075 | 3.6996 | 1872 | 0.7199 | 0.2080 | 0.7199 | 0.8485 |
| 0.1075 | 3.7036 | 1874 | 0.7193 | 0.2080 | 0.7193 | 0.8481 |
| 0.1075 | 3.7075 | 1876 | 0.7289 | 0.2080 | 0.7289 | 0.8538 |
| 0.1075 | 3.7115 | 1878 | 0.7355 | 0.2080 | 0.7355 | 0.8576 |
| 0.1075 | 3.7154 | 1880 | 0.7805 | 0.2080 | 0.7805 | 0.8835 |
| 0.1075 | 3.7194 | 1882 | 0.8607 | 0.2080 | 0.8607 | 0.9277 |
| 0.1075 | 3.7233 | 1884 | 0.8551 | 0.2080 | 0.8551 | 0.9247 |
| 0.1075 | 3.7273 | 1886 | 0.8063 | 0.2080 | 0.8063 | 0.8979 |
| 0.1075 | 3.7312 | 1888 | 0.7648 | 0.2080 | 0.7648 | 0.8745 |
| 0.1075 | 3.7352 | 1890 | 0.7368 | 0.2080 | 0.7368 | 0.8584 |
| 0.1075 | 3.7391 | 1892 | 0.7339 | 0.2080 | 0.7339 | 0.8567 |
| 0.1075 | 3.7431 | 1894 | 0.7463 | 0.2080 | 0.7463 | 0.8639 |
| 0.1075 | 3.7470 | 1896 | 0.7367 | 0.2080 | 0.7367 | 0.8583 |
| 0.1075 | 3.7510 | 1898 | 0.7256 | 0.2029 | 0.7256 | 0.8518 |
| 0.1075 | 3.7549 | 1900 | 0.7735 | 0.2080 | 0.7735 | 0.8795 |
| 0.1075 | 3.7589 | 1902 | 0.8320 | 0.2080 | 0.8320 | 0.9121 |
| 0.1075 | 3.7628 | 1904 | 0.8269 | 0.2080 | 0.8269 | 0.9093 |
| 0.1075 | 3.7668 | 1906 | 0.7535 | 0.2080 | 0.7535 | 0.8681 |
| 0.1075 | 3.7708 | 1908 | 0.7012 | 0.2029 | 0.7012 | 0.8374 |
| 0.1075 | 3.7747 | 1910 | 0.7073 | 0.2080 | 0.7073 | 0.8410 |
| 0.1075 | 3.7787 | 1912 | 0.7503 | 0.2080 | 0.7503 | 0.8662 |
| 0.1075 | 3.7826 | 1914 | 0.8045 | 0.2080 | 0.8045 | 0.8970 |
| 0.1075 | 3.7866 | 1916 | 0.7924 | 0.2080 | 0.7924 | 0.8901 |
| 0.1075 | 3.7905 | 1918 | 0.7348 | 0.2080 | 0.7348 | 0.8572 |
| 0.1075 | 3.7945 | 1920 | 0.7133 | 0.2080 | 0.7133 | 0.8446 |
| 0.1075 | 3.7984 | 1922 | 0.7345 | 0.2080 | 0.7345 | 0.8570 |
| 0.1075 | 3.8024 | 1924 | 0.8126 | 0.2080 | 0.8126 | 0.9015 |
| 0.1075 | 3.8063 | 1926 | 0.8993 | 0.2080 | 0.8993 | 0.9483 |
| 0.1075 | 3.8103 | 1928 | 0.9214 | 0.0 | 0.9214 | 0.9599 |
| 0.1075 | 3.8142 | 1930 | 0.9070 | 0.0 | 0.9070 | 0.9524 |
| 0.1075 | 3.8182 | 1932 | 0.8027 | 0.2080 | 0.8027 | 0.8960 |
| 0.1075 | 3.8221 | 1934 | 0.7124 | 0.2080 | 0.7124 | 0.8440 |
| 0.1075 | 3.8261 | 1936 | 0.6929 | 0.2029 | 0.6929 | 0.8324 |
| 0.1075 | 3.8300 | 1938 | 0.7035 | 0.2080 | 0.7035 | 0.8387 |
| 0.1075 | 3.8340 | 1940 | 0.7704 | 0.2080 | 0.7704 | 0.8777 |
| 0.1075 | 3.8379 | 1942 | 0.9145 | 0.0 | 0.9145 | 0.9563 |
| 0.1075 | 3.8419 | 1944 | 0.9480 | 0.0 | 0.9480 | 0.9737 |
| 0.1075 | 3.8458 | 1946 | 0.8757 | 0.2143 | 0.8757 | 0.9358 |
| 0.1075 | 3.8498 | 1948 | 0.7813 | 0.2080 | 0.7813 | 0.8839 |
| 0.1075 | 3.8538 | 1950 | 0.7169 | 0.2080 | 0.7169 | 0.8467 |
| 0.1075 | 3.8577 | 1952 | 0.7066 | 0.2080 | 0.7066 | 0.8406 |
| 0.1075 | 3.8617 | 1954 | 0.7449 | 0.2080 | 0.7449 | 0.8631 |
| 0.1075 | 3.8656 | 1956 | 0.8647 | 0.2143 | 0.8647 | 0.9299 |
| 0.1075 | 3.8696 | 1958 | 1.0525 | 0.0 | 1.0525 | 1.0259 |
| 0.1075 | 3.8735 | 1960 | 1.0967 | 0.0 | 1.0967 | 1.0472 |
| 0.1075 | 3.8775 | 1962 | 1.0195 | 0.0 | 1.0195 | 1.0097 |
| 0.1075 | 3.8814 | 1964 | 0.8752 | 0.0 | 0.8752 | 0.9355 |
| 0.1075 | 3.8854 | 1966 | 0.7895 | 0.2080 | 0.7895 | 0.8885 |
| 0.1075 | 3.8893 | 1968 | 0.7394 | 0.2080 | 0.7394 | 0.8599 |
| 0.1075 | 3.8933 | 1970 | 0.7060 | 0.2080 | 0.7060 | 0.8402 |
| 0.1075 | 3.8972 | 1972 | 0.7061 | 0.2080 | 0.7061 | 0.8403 |
| 0.1075 | 3.9012 | 1974 | 0.7619 | 0.2080 | 0.7619 | 0.8729 |
| 0.1075 | 3.9051 | 1976 | 0.8710 | 0.2080 | 0.8710 | 0.9333 |
| 0.1075 | 3.9091 | 1978 | 0.9192 | 0.2080 | 0.9192 | 0.9587 |
| 0.1075 | 3.9130 | 1980 | 0.9196 | 0.2143 | 0.9196 | 0.9589 |
| 0.1075 | 3.9170 | 1982 | 0.8861 | 0.2143 | 0.8861 | 0.9413 |
| 0.1075 | 3.9209 | 1984 | 0.8287 | 0.2080 | 0.8287 | 0.9103 |
| 0.1075 | 3.9249 | 1986 | 0.8534 | 0.2080 | 0.8534 | 0.9238 |
| 0.1075 | 3.9289 | 1988 | 0.8310 | 0.2080 | 0.8310 | 0.9116 |
| 0.1075 | 3.9328 | 1990 | 0.7980 | 0.2080 | 0.7980 | 0.8933 |
| 0.1075 | 3.9368 | 1992 | 0.7965 | 0.2080 | 0.7965 | 0.8925 |
| 0.1075 | 3.9407 | 1994 | 0.7593 | 0.2080 | 0.7593 | 0.8714 |
| 0.1075 | 3.9447 | 1996 | 0.7338 | 0.2080 | 0.7338 | 0.8566 |
| 0.1075 | 3.9486 | 1998 | 0.7213 | 0.2029 | 0.7213 | 0.8493 |
| 0.0746 | 3.9526 | 2000 | 0.7293 | 0.2080 | 0.7293 | 0.8540 |
| 0.0746 | 3.9565 | 2002 | 0.7539 | 0.2080 | 0.7539 | 0.8683 |
| 0.0746 | 3.9605 | 2004 | 0.8184 | 0.2080 | 0.8184 | 0.9047 |
| 0.0746 | 3.9644 | 2006 | 0.8536 | 0.2080 | 0.8536 | 0.9239 |
| 0.0746 | 3.9684 | 2008 | 0.8306 | 0.2080 | 0.8306 | 0.9114 |
| 0.0746 | 3.9723 | 2010 | 0.8191 | 0.2080 | 0.8191 | 0.9050 |
| 0.0746 | 3.9763 | 2012 | 0.7657 | 0.2080 | 0.7657 | 0.8750 |
| 0.0746 | 3.9802 | 2014 | 0.7035 | 0.2080 | 0.7035 | 0.8388 |
| 0.0746 | 3.9842 | 2016 | 0.6851 | 0.2029 | 0.6851 | 0.8277 |
| 0.0746 | 3.9881 | 2018 | 0.6878 | 0.2029 | 0.6878 | 0.8294 |
| 0.0746 | 3.9921 | 2020 | 0.7186 | 0.2080 | 0.7186 | 0.8477 |
| 0.0746 | 3.9960 | 2022 | 0.8121 | 0.2080 | 0.8121 | 0.9011 |
| 0.0746 | 4.0000 | 2024 | 0.8658 | 0.2143 | 0.8658 | 0.9305 |
| 0.0746 | 4.0040 | 2026 | 0.8389 | 0.2080 | 0.8389 | 0.9159 |
| 0.0746 | 4.0079 | 2028 | 0.7562 | 0.2080 | 0.7562 | 0.8696 |
| 0.0746 | 4.0119 | 2030 | 0.6931 | 0.2080 | 0.6931 | 0.8325 |
| 0.0746 | 4.0158 | 2032 | 0.6803 | 0.2029 | 0.6803 | 0.8248 |
| 0.0746 | 4.0198 | 2034 | 0.6838 | 0.2029 | 0.6838 | 0.8269 |
| 0.0746 | 4.0237 | 2036 | 0.7155 | 0.2080 | 0.7155 | 0.8459 |
| 0.0746 | 4.0277 | 2038 | 0.7552 | 0.2080 | 0.7552 | 0.8690 |
| 0.0746 | 4.0316 | 2040 | 0.7963 | 0.2080 | 0.7963 | 0.8924 |
| 0.0746 | 4.0356 | 2042 | 0.7887 | 0.2080 | 0.7887 | 0.8881 |
| 0.0746 | 4.0395 | 2044 | 0.7459 | 0.2080 | 0.7459 | 0.8637 |
| 0.0746 | 4.0435 | 2046 | 0.7331 | 0.2080 | 0.7331 | 0.8562 |
| 0.0746 | 4.0474 | 2048 | 0.7437 | 0.2080 | 0.7437 | 0.8624 |
| 0.0746 | 4.0514 | 2050 | 0.7907 | 0.2080 | 0.7907 | 0.8892 |
| 0.0746 | 4.0553 | 2052 | 0.8167 | 0.2080 | 0.8167 | 0.9037 |
| 0.0746 | 4.0593 | 2054 | 0.8100 | 0.2080 | 0.8100 | 0.9000 |
| 0.0746 | 4.0632 | 2056 | 0.8067 | 0.2080 | 0.8067 | 0.8982 |
| 0.0746 | 4.0672 | 2058 | 0.7927 | 0.2080 | 0.7927 | 0.8903 |
| 0.0746 | 4.0711 | 2060 | 0.7469 | 0.2080 | 0.7469 | 0.8642 |
| 0.0746 | 4.0751 | 2062 | 0.7163 | 0.2080 | 0.7163 | 0.8464 |
| 0.0746 | 4.0791 | 2064 | 0.7267 | 0.2080 | 0.7267 | 0.8524 |
| 0.0746 | 4.0830 | 2066 | 0.7475 | 0.2080 | 0.7475 | 0.8646 |
| 0.0746 | 4.0870 | 2068 | 0.7971 | 0.2080 | 0.7971 | 0.8928 |
| 0.0746 | 4.0909 | 2070 | 0.8325 | 0.2080 | 0.8325 | 0.9124 |
| 0.0746 | 4.0949 | 2072 | 0.8031 | 0.2080 | 0.8031 | 0.8961 |
| 0.0746 | 4.0988 | 2074 | 0.7460 | 0.2080 | 0.7460 | 0.8637 |
| 0.0746 | 4.1028 | 2076 | 0.7155 | 0.2080 | 0.7155 | 0.8459 |
| 0.0746 | 4.1067 | 2078 | 0.7156 | 0.2029 | 0.7156 | 0.8459 |
| 0.0746 | 4.1107 | 2080 | 0.7404 | 0.2080 | 0.7404 | 0.8605 |
| 0.0746 | 4.1146 | 2082 | 0.8218 | 0.2080 | 0.8218 | 0.9065 |
| 0.0746 | 4.1186 | 2084 | 0.9370 | 0.0179 | 0.9370 | 0.9680 |
| 0.0746 | 4.1225 | 2086 | 0.9677 | 0.0000 | 0.9677 | 0.9837 |
| 0.0746 | 4.1265 | 2088 | 0.8985 | 0.0179 | 0.8985 | 0.9479 |
| 0.0746 | 4.1304 | 2090 | 0.7882 | 0.2080 | 0.7882 | 0.8878 |
| 0.0746 | 4.1344 | 2092 | 0.7441 | 0.2080 | 0.7441 | 0.8626 |
| 0.0746 | 4.1383 | 2094 | 0.7442 | 0.2080 | 0.7442 | 0.8626 |
| 0.0746 | 4.1423 | 2096 | 0.7788 | 0.2080 | 0.7788 | 0.8825 |
| 0.0746 | 4.1462 | 2098 | 0.8568 | 0.0179 | 0.8568 | 0.9256 |
| 0.0746 | 4.1502 | 2100 | 0.8871 | 0.0179 | 0.8871 | 0.9418 |
| 0.0746 | 4.1542 | 2102 | 0.8536 | 0.0179 | 0.8536 | 0.9239 |
| 0.0746 | 4.1581 | 2104 | 0.8099 | 0.2080 | 0.8099 | 0.8999 |
| 0.0746 | 4.1621 | 2106 | 0.7613 | 0.2080 | 0.7613 | 0.8725 |
| 0.0746 | 4.1660 | 2108 | 0.7584 | 0.2080 | 0.7584 | 0.8709 |
| 0.0746 | 4.1700 | 2110 | 0.7995 | 0.2080 | 0.7995 | 0.8941 |
| 0.0746 | 4.1739 | 2112 | 0.8481 | 0.2080 | 0.8481 | 0.9209 |
| 0.0746 | 4.1779 | 2114 | 0.8508 | 0.2080 | 0.8508 | 0.9224 |
| 0.0746 | 4.1818 | 2116 | 0.8301 | 0.2080 | 0.8301 | 0.9111 |
| 0.0746 | 4.1858 | 2118 | 0.7879 | 0.2080 | 0.7879 | 0.8876 |
| 0.0746 | 4.1897 | 2120 | 0.7486 | 0.2080 | 0.7486 | 0.8652 |
| 0.0746 | 4.1937 | 2122 | 0.7493 | 0.2080 | 0.7493 | 0.8656 |
| 0.0746 | 4.1976 | 2124 | 0.7746 | 0.2080 | 0.7746 | 0.8801 |
| 0.0746 | 4.2016 | 2126 | 0.7969 | 0.2080 | 0.7969 | 0.8927 |
| 0.0746 | 4.2055 | 2128 | 0.8351 | 0.0179 | 0.8351 | 0.9138 |
| 0.0746 | 4.2095 | 2130 | 0.8855 | 0.0000 | 0.8855 | 0.9410 |
| 0.0746 | 4.2134 | 2132 | 0.8651 | 0.0000 | 0.8651 | 0.9301 |
| 0.0746 | 4.2174 | 2134 | 0.8043 | 0.2080 | 0.8043 | 0.8968 |
| 0.0746 | 4.2213 | 2136 | 0.7467 | 0.2080 | 0.7467 | 0.8641 |
| 0.0746 | 4.2253 | 2138 | 0.7410 | 0.2080 | 0.7410 | 0.8608 |
| 0.0746 | 4.2292 | 2140 | 0.7536 | 0.2080 | 0.7536 | 0.8681 |
| 0.0746 | 4.2332 | 2142 | 0.7870 | 0.2080 | 0.7870 | 0.8871 |
| 0.0746 | 4.2372 | 2144 | 0.8660 | 0.2080 | 0.8660 | 0.9306 |
| 0.0746 | 4.2411 | 2146 | 0.9629 | 0.0179 | 0.9629 | 0.9813 |
| 0.0746 | 4.2451 | 2148 | 0.9675 | 0.0179 | 0.9675 | 0.9836 |
| 0.0746 | 4.2490 | 2150 | 0.8654 | 0.2080 | 0.8654 | 0.9303 |
| 0.0746 | 4.2530 | 2152 | 0.7692 | 0.2080 | 0.7692 | 0.8771 |
| 0.0746 | 4.2569 | 2154 | 0.7608 | 0.2080 | 0.7608 | 0.8722 |
| 0.0746 | 4.2609 | 2156 | 0.7587 | 0.2080 | 0.7587 | 0.8710 |
| 0.0746 | 4.2648 | 2158 | 0.7864 | 0.2080 | 0.7864 | 0.8868 |
| 0.0746 | 4.2688 | 2160 | 0.7908 | 0.2080 | 0.7908 | 0.8893 |
| 0.0746 | 4.2727 | 2162 | 0.8225 | 0.2080 | 0.8225 | 0.9069 |
| 0.0746 | 4.2767 | 2164 | 0.8293 | 0.2080 | 0.8293 | 0.9106 |
| 0.0746 | 4.2806 | 2166 | 0.8509 | 0.2080 | 0.8509 | 0.9225 |
| 0.0746 | 4.2846 | 2168 | 0.8400 | 0.2080 | 0.8400 | 0.9165 |
| 0.0746 | 4.2885 | 2170 | 0.7777 | 0.2080 | 0.7777 | 0.8818 |
| 0.0746 | 4.2925 | 2172 | 0.7598 | 0.2080 | 0.7598 | 0.8717 |
| 0.0746 | 4.2964 | 2174 | 0.7707 | 0.2080 | 0.7707 | 0.8779 |
| 0.0746 | 4.3004 | 2176 | 0.7856 | 0.2080 | 0.7856 | 0.8864 |
| 0.0746 | 4.3043 | 2178 | 0.7678 | 0.2080 | 0.7678 | 0.8762 |
| 0.0746 | 4.3083 | 2180 | 0.7830 | 0.2080 | 0.7830 | 0.8849 |
| 0.0746 | 4.3123 | 2182 | 0.7847 | 0.2080 | 0.7847 | 0.8858 |
| 0.0746 | 4.3162 | 2184 | 0.8743 | 0.2080 | 0.8743 | 0.9350 |
| 0.0746 | 4.3202 | 2186 | 1.0121 | 0.0000 | 1.0121 | 1.0060 |
| 0.0746 | 4.3241 | 2188 | 1.0724 | 0.0000 | 1.0724 | 1.0356 |
| 0.0746 | 4.3281 | 2190 | 1.0117 | 0.0000 | 1.0117 | 1.0058 |
| 0.0746 | 4.3320 | 2192 | 0.8929 | 0.0179 | 0.8929 | 0.9450 |
| 0.0746 | 4.3360 | 2194 | 0.7941 | 0.2080 | 0.7941 | 0.8911 |
| 0.0746 | 4.3399 | 2196 | 0.7553 | 0.2080 | 0.7553 | 0.8691 |
| 0.0746 | 4.3439 | 2198 | 0.7435 | 0.2080 | 0.7435 | 0.8622 |
| 0.0746 | 4.3478 | 2200 | 0.7434 | 0.2080 | 0.7434 | 0.8622 |
| 0.0746 | 4.3518 | 2202 | 0.7715 | 0.2080 | 0.7715 | 0.8784 |
| 0.0746 | 4.3557 | 2204 | 0.8086 | 0.2080 | 0.8086 | 0.8992 |
| 0.0746 | 4.3597 | 2206 | 0.8623 | 0.2080 | 0.8623 | 0.9286 |
| 0.0746 | 4.3636 | 2208 | 0.8733 | 0.2080 | 0.8733 | 0.9345 |
| 0.0746 | 4.3676 | 2210 | 0.8473 | 0.2080 | 0.8473 | 0.9205 |
| 0.0746 | 4.3715 | 2212 | 0.8305 | 0.2080 | 0.8305 | 0.9113 |
| 0.0746 | 4.3755 | 2214 | 0.7885 | 0.2080 | 0.7885 | 0.8880 |
| 0.0746 | 4.3794 | 2216 | 0.7348 | 0.2080 | 0.7348 | 0.8572 |
| 0.0746 | 4.3834 | 2218 | 0.7085 | 0.2080 | 0.7085 | 0.8417 |
| 0.0746 | 4.3874 | 2220 | 0.6995 | 0.2080 | 0.6995 | 0.8364 |
| 0.0746 | 4.3913 | 2222 | 0.7054 | 0.2080 | 0.7054 | 0.8399 |
| 0.0746 | 4.3953 | 2224 | 0.7518 | 0.2080 | 0.7518 | 0.8671 |
| 0.0746 | 4.3992 | 2226 | 0.8270 | 0.2143 | 0.8270 | 0.9094 |
| 0.0746 | 4.4032 | 2228 | 0.8338 | 0.2143 | 0.8338 | 0.9131 |
| 0.0746 | 4.4071 | 2230 | 0.7943 | 0.2080 | 0.7943 | 0.8913 |
| 0.0746 | 4.4111 | 2232 | 0.7297 | 0.2080 | 0.7297 | 0.8542 |
| 0.0746 | 4.4150 | 2234 | 0.7264 | 0.2080 | 0.7264 | 0.8523 |
| 0.0746 | 4.4190 | 2236 | 0.7600 | 0.2080 | 0.7600 | 0.8718 |
| 0.0746 | 4.4229 | 2238 | 0.7474 | 0.2080 | 0.7474 | 0.8645 |
| 0.0746 | 4.4269 | 2240 | 0.7460 | 0.2080 | 0.7460 | 0.8637 |
| 0.0746 | 4.4308 | 2242 | 0.7532 | 0.2080 | 0.7532 | 0.8679 |
| 0.0746 | 4.4348 | 2244 | 0.7204 | 0.2080 | 0.7204 | 0.8488 |
| 0.0746 | 4.4387 | 2246 | 0.6789 | 0.2080 | 0.6789 | 0.8240 |
| 0.0746 | 4.4427 | 2248 | 0.6734 | 0.2080 | 0.6734 | 0.8206 |
| 0.0746 | 4.4466 | 2250 | 0.6744 | 0.2080 | 0.6744 | 0.8212 |
| 0.0746 | 4.4506 | 2252 | 0.6861 | 0.2080 | 0.6861 | 0.8283 |
| 0.0746 | 4.4545 | 2254 | 0.6845 | 0.2080 | 0.6845 | 0.8274 |
| 0.0746 | 4.4585 | 2256 | 0.6798 | 0.2080 | 0.6798 | 0.8245 |
| 0.0746 | 4.4625 | 2258 | 0.6842 | 0.2080 | 0.6842 | 0.8272 |
| 0.0746 | 4.4664 | 2260 | 0.7081 | 0.2080 | 0.7081 | 0.8415 |
| 0.0746 | 4.4704 | 2262 | 0.7195 | 0.2080 | 0.7195 | 0.8482 |
| 0.0746 | 4.4743 | 2264 | 0.7052 | 0.2080 | 0.7052 | 0.8397 |
| 0.0746 | 4.4783 | 2266 | 0.6965 | 0.2080 | 0.6965 | 0.8346 |
| 0.0746 | 4.4822 | 2268 | 0.6995 | 0.2080 | 0.6995 | 0.8364 |
| 0.0746 | 4.4862 | 2270 | 0.6854 | 0.2029 | 0.6854 | 0.8279 |
| 0.0746 | 4.4901 | 2272 | 0.6910 | 0.2029 | 0.6910 | 0.8313 |
| 0.0746 | 4.4941 | 2274 | 0.7192 | 0.2080 | 0.7192 | 0.8480 |
| 0.0746 | 4.4980 | 2276 | 0.7775 | 0.2080 | 0.7775 | 0.8817 |
| 0.0746 | 4.5020 | 2278 | 0.8441 | 0.2080 | 0.8441 | 0.9188 |
| 0.0746 | 4.5059 | 2280 | 0.8135 | 0.2080 | 0.8135 | 0.9019 |
| 0.0746 | 4.5099 | 2282 | 0.7346 | 0.2080 | 0.7346 | 0.8571 |
| 0.0746 | 4.5138 | 2284 | 0.7073 | 0.2029 | 0.7073 | 0.8410 |
| 0.0746 | 4.5178 | 2286 | 0.6961 | 0.2029 | 0.6961 | 0.8344 |
| 0.0746 | 4.5217 | 2288 | 0.6885 | 0.2029 | 0.6885 | 0.8298 |
| 0.0746 | 4.5257 | 2290 | 0.7221 | 0.2080 | 0.7221 | 0.8498 |
| 0.0746 | 4.5296 | 2292 | 0.8124 | 0.2080 | 0.8124 | 0.9013 |
| 0.0746 | 4.5336 | 2294 | 0.8324 | 0.2143 | 0.8324 | 0.9124 |
| 0.0746 | 4.5375 | 2296 | 0.7756 | 0.2080 | 0.7756 | 0.8807 |
| 0.0746 | 4.5415 | 2298 | 0.7124 | 0.2080 | 0.7124 | 0.8440 |
| 0.0746 | 4.5455 | 2300 | 0.6835 | 0.2080 | 0.6835 | 0.8268 |
| 0.0746 | 4.5494 | 2302 | 0.6739 | 0.2029 | 0.6739 | 0.8209 |
| 0.0746 | 4.5534 | 2304 | 0.6732 | 0.2029 | 0.6732 | 0.8205 |
| 0.0746 | 4.5573 | 2306 | 0.6792 | 0.2080 | 0.6792 | 0.8241 |
| 0.0746 | 4.5613 | 2308 | 0.6981 | 0.2080 | 0.6981 | 0.8355 |
| 0.0746 | 4.5652 | 2310 | 0.7230 | 0.2080 | 0.7230 | 0.8503 |
| 0.0746 | 4.5692 | 2312 | 0.7180 | 0.2080 | 0.7180 | 0.8474 |
| 0.0746 | 4.5731 | 2314 | 0.7061 | 0.2080 | 0.7061 | 0.8403 |
| 0.0746 | 4.5771 | 2316 | 0.7547 | 0.2080 | 0.7547 | 0.8687 |
| 0.0746 | 4.5810 | 2318 | 0.7704 | 0.2080 | 0.7704 | 0.8777 |
| 0.0746 | 4.5850 | 2320 | 0.7438 | 0.2080 | 0.7438 | 0.8624 |
| 0.0746 | 4.5889 | 2322 | 0.7591 | 0.2080 | 0.7591 | 0.8713 |
| 0.0746 | 4.5929 | 2324 | 0.7833 | 0.2080 | 0.7833 | 0.8851 |
| 0.0746 | 4.5968 | 2326 | 0.7701 | 0.2080 | 0.7701 | 0.8776 |
| 0.0746 | 4.6008 | 2328 | 0.7654 | 0.2080 | 0.7654 | 0.8748 |
| 0.0746 | 4.6047 | 2330 | 0.7406 | 0.2080 | 0.7406 | 0.8606 |
| 0.0746 | 4.6087 | 2332 | 0.7218 | 0.2080 | 0.7218 | 0.8496 |
| 0.0746 | 4.6126 | 2334 | 0.7000 | 0.2029 | 0.7000 | 0.8367 |
| 0.0746 | 4.6166 | 2336 | 0.7070 | 0.2029 | 0.7070 | 0.8408 |
| 0.0746 | 4.6206 | 2338 | 0.7500 | 0.2080 | 0.7500 | 0.8660 |
| 0.0746 | 4.6245 | 2340 | 0.8724 | 0.2143 | 0.8724 | 0.9340 |
| 0.0746 | 4.6285 | 2342 | 0.9488 | 0.0000 | 0.9488 | 0.9740 |
| 0.0746 | 4.6324 | 2344 | 0.9271 | 0.0000 | 0.9271 | 0.9629 |
| 0.0746 | 4.6364 | 2346 | 0.8505 | 0.0000 | 0.8505 | 0.9222 |
| 0.0746 | 4.6403 | 2348 | 0.7614 | 0.2080 | 0.7614 | 0.8726 |
| 0.0746 | 4.6443 | 2350 | 0.7255 | 0.2080 | 0.7255 | 0.8518 |
| 0.0746 | 4.6482 | 2352 | 0.7153 | 0.2080 | 0.7153 | 0.8458 |
| 0.0746 | 4.6522 | 2354 | 0.7381 | 0.2080 | 0.7381 | 0.8591 |
| 0.0746 | 4.6561 | 2356 | 0.7897 | 0.2080 | 0.7897 | 0.8886 |
| 0.0746 | 4.6601 | 2358 | 0.8043 | 0.2080 | 0.8043 | 0.8968 |
| 0.0746 | 4.6640 | 2360 | 0.7651 | 0.2080 | 0.7651 | 0.8747 |
| 0.0746 | 4.6680 | 2362 | 0.7427 | 0.2080 | 0.7427 | 0.8618 |
| 0.0746 | 4.6719 | 2364 | 0.7350 | 0.2080 | 0.7350 | 0.8573 |
| 0.0746 | 4.6759 | 2366 | 0.7709 | 0.2080 | 0.7709 | 0.8780 |
| 0.0746 | 4.6798 | 2368 | 0.8236 | 0.2080 | 0.8236 | 0.9075 |
| 0.0746 | 4.6838 | 2370 | 0.8051 | 0.2080 | 0.8051 | 0.8973 |
| 0.0746 | 4.6877 | 2372 | 0.7664 | 0.2080 | 0.7664 | 0.8755 |
| 0.0746 | 4.6917 | 2374 | 0.7103 | 0.2080 | 0.7103 | 0.8428 |
| 0.0746 | 4.6957 | 2376 | 0.7045 | 0.2080 | 0.7045 | 0.8394 |
| 0.0746 | 4.6996 | 2378 | 0.7266 | 0.2080 | 0.7266 | 0.8524 |
| 0.0746 | 4.7036 | 2380 | 0.7553 | 0.2080 | 0.7553 | 0.8691 |
| 0.0746 | 4.7075 | 2382 | 0.7739 | 0.2080 | 0.7739 | 0.8797 |
| 0.0746 | 4.7115 | 2384 | 0.7520 | 0.2080 | 0.7520 | 0.8672 |
| 0.0746 | 4.7154 | 2386 | 0.7532 | 0.2080 | 0.7532 | 0.8679 |
| 0.0746 | 4.7194 | 2388 | 0.7394 | 0.2080 | 0.7394 | 0.8599 |
| 0.0746 | 4.7233 | 2390 | 0.7651 | 0.2080 | 0.7651 | 0.8747 |
| 0.0746 | 4.7273 | 2392 | 0.7793 | 0.2080 | 0.7793 | 0.8828 |
| 0.0746 | 4.7312 | 2394 | 0.8582 | 0.2143 | 0.8582 | 0.9264 |
| 0.0746 | 4.7352 | 2396 | 0.8645 | 0.2143 | 0.8645 | 0.9298 |
| 0.0746 | 4.7391 | 2398 | 0.8731 | 0.0000 | 0.8731 | 0.9344 |
| 0.0746 | 4.7431 | 2400 | 0.8280 | 0.2143 | 0.8280 | 0.9100 |
| 0.0746 | 4.7470 | 2402 | 0.7470 | 0.2080 | 0.7470 | 0.8643 |
| 0.0746 | 4.7510 | 2404 | 0.7121 | 0.2029 | 0.7121 | 0.8438 |
| 0.0746 | 4.7549 | 2406 | 0.7143 | 0.3444 | 0.7143 | 0.8451 |
| 0.0746 | 4.7589 | 2408 | 0.7196 | 0.2029 | 0.7196 | 0.8483 |
| 0.0746 | 4.7628 | 2410 | 0.7683 | 0.2080 | 0.7683 | 0.8765 |
| 0.0746 | 4.7668 | 2412 | 0.8666 | 0.2080 | 0.8666 | 0.9309 |
| 0.0746 | 4.7708 | 2414 | 0.9630 | 0.0000 | 0.9630 | 0.9813 |
| 0.0746 | 4.7747 | 2416 | 0.9485 | 0.0000 | 0.9485 | 0.9739 |
| 0.0746 | 4.7787 | 2418 | 0.8622 | 0.2080 | 0.8622 | 0.9285 |
| 0.0746 | 4.7826 | 2420 | 0.8247 | 0.2080 | 0.8247 | 0.9081 |
| 0.0746 | 4.7866 | 2422 | 0.7699 | 0.2080 | 0.7699 | 0.8775 |
| 0.0746 | 4.7905 | 2424 | 0.7492 | 0.2029 | 0.7492 | 0.8656 |
| 0.0746 | 4.7945 | 2426 | 0.7529 | 0.2029 | 0.7529 | 0.8677 |
| 0.0746 | 4.7984 | 2428 | 0.7524 | 0.2029 | 0.7524 | 0.8674 |
| 0.0746 | 4.8024 | 2430 | 0.7658 | 0.2080 | 0.7658 | 0.8751 |
| 0.0746 | 4.8063 | 2432 | 0.8321 | 0.2080 | 0.8321 | 0.9122 |
| 0.0746 | 4.8103 | 2434 | 0.8843 | 0.2080 | 0.8843 | 0.9404 |
| 0.0746 | 4.8142 | 2436 | 0.8747 | 0.2080 | 0.8747 | 0.9353 |
| 0.0746 | 4.8182 | 2438 | 0.8298 | 0.2080 | 0.8298 | 0.9109 |
| 0.0746 | 4.8221 | 2440 | 0.8014 | 0.2080 | 0.8014 | 0.8952 |
| 0.0746 | 4.8261 | 2442 | 0.7718 | 0.2080 | 0.7718 | 0.8785 |
| 0.0746 | 4.8300 | 2444 | 0.7634 | 0.2029 | 0.7634 | 0.8737 |
| 0.0746 | 4.8340 | 2446 | 0.7621 | 0.2080 | 0.7621 | 0.8730 |
| 0.0746 | 4.8379 | 2448 | 0.7556 | 0.2029 | 0.7556 | 0.8692 |
| 0.0746 | 4.8419 | 2450 | 0.7643 | 0.2080 | 0.7643 | 0.8742 |
| 0.0746 | 4.8458 | 2452 | 0.7987 | 0.2080 | 0.7987 | 0.8937 |
| 0.0746 | 4.8498 | 2454 | 0.7834 | 0.2080 | 0.7834 | 0.8851 |
| 0.0746 | 4.8538 | 2456 | 0.7539 | 0.2080 | 0.7539 | 0.8683 |
| 0.0746 | 4.8577 | 2458 | 0.7300 | 0.2080 | 0.7300 | 0.8544 |
| 0.0746 | 4.8617 | 2460 | 0.7128 | 0.2029 | 0.7128 | 0.8443 |
| 0.0746 | 4.8656 | 2462 | 0.7085 | 0.2029 | 0.7085 | 0.8417 |
| 0.0746 | 4.8696 | 2464 | 0.7089 | 0.2080 | 0.7089 | 0.8420 |
| 0.0746 | 4.8735 | 2466 | 0.7115 | 0.2080 | 0.7115 | 0.8435 |
| 0.0746 | 4.8775 | 2468 | 0.7497 | 0.2080 | 0.7497 | 0.8659 |
| 0.0746 | 4.8814 | 2470 | 0.7847 | 0.2080 | 0.7847 | 0.8858 |
| 0.0746 | 4.8854 | 2472 | 0.7702 | 0.2080 | 0.7702 | 0.8776 |
| 0.0746 | 4.8893 | 2474 | 0.7664 | 0.2080 | 0.7664 | 0.8754 |
| 0.0746 | 4.8933 | 2476 | 0.7382 | 0.2080 | 0.7382 | 0.8592 |
| 0.0746 | 4.8972 | 2478 | 0.7376 | 0.2080 | 0.7376 | 0.8588 |
| 0.0746 | 4.9012 | 2480 | 0.7366 | 0.2080 | 0.7366 | 0.8583 |
| 0.0746 | 4.9051 | 2482 | 0.7243 | 0.2080 | 0.7243 | 0.8510 |
| 0.0746 | 4.9091 | 2484 | 0.7249 | 0.2080 | 0.7249 | 0.8514 |
| 0.0746 | 4.9130 | 2486 | 0.7254 | 0.2080 | 0.7254 | 0.8517 |
| 0.0746 | 4.9170 | 2488 | 0.7494 | 0.2080 | 0.7494 | 0.8657 |
| 0.0746 | 4.9209 | 2490 | 0.7730 | 0.2080 | 0.7730 | 0.8792 |
| 0.0746 | 4.9249 | 2492 | 0.7944 | 0.2080 | 0.7944 | 0.8913 |
| 0.0746 | 4.9289 | 2494 | 0.7535 | 0.2080 | 0.7535 | 0.8681 |
| 0.0746 | 4.9328 | 2496 | 0.6922 | 0.2029 | 0.6922 | 0.8320 |
| 0.0746 | 4.9368 | 2498 | 0.6850 | 0.3444 | 0.6850 | 0.8277 |
| 0.0626 | 4.9407 | 2500 | 0.7229 | 0.1951 | 0.7229 | 0.8502 |
| 0.0626 | 4.9447 | 2502 | 0.7160 | 0.1951 | 0.7160 | 0.8462 |
| 0.0626 | 4.9486 | 2504 | 0.6800 | 0.3444 | 0.6800 | 0.8246 |
| 0.0626 | 4.9526 | 2506 | 0.6915 | 0.2029 | 0.6915 | 0.8315 |
| 0.0626 | 4.9565 | 2508 | 0.7872 | 0.2080 | 0.7872 | 0.8873 |
| 0.0626 | 4.9605 | 2510 | 0.8627 | 0.2143 | 0.8627 | 0.9288 |
| 0.0626 | 4.9644 | 2512 | 0.8801 | 0.2143 | 0.8801 | 0.9381 |
| 0.0626 | 4.9684 | 2514 | 0.8245 | 0.2080 | 0.8245 | 0.9080 |
| 0.0626 | 4.9723 | 2516 | 0.7426 | 0.2080 | 0.7426 | 0.8618 |
| 0.0626 | 4.9763 | 2518 | 0.7138 | 0.2029 | 0.7138 | 0.8448 |
| 0.0626 | 4.9802 | 2520 | 0.7109 | 0.2029 | 0.7109 | 0.8432 |
| 0.0626 | 4.9842 | 2522 | 0.7123 | 0.2029 | 0.7123 | 0.8440 |
| 0.0626 | 4.9881 | 2524 | 0.7216 | 0.2029 | 0.7216 | 0.8495 |
| 0.0626 | 4.9921 | 2526 | 0.7611 | 0.2080 | 0.7611 | 0.8724 |
| 0.0626 | 4.9960 | 2528 | 0.7993 | 0.2080 | 0.7993 | 0.8940 |
| 0.0626 | 5.0000 | 2530 | 0.7994 | 0.2080 | 0.7994 | 0.8941 |
| 0.0626 | 5.0040 | 2532 | 0.7666 | 0.2080 | 0.7666 | 0.8755 |
| 0.0626 | 5.0079 | 2534 | 0.7349 | 0.2029 | 0.7349 | 0.8573 |
| 0.0626 | 5.0119 | 2536 | 0.7371 | 0.2029 | 0.7371 | 0.8585 |
| 0.0626 | 5.0158 | 2538 | 0.7453 | 0.2029 | 0.7453 | 0.8633 |
| 0.0626 | 5.0198 | 2540 | 0.7633 | 0.2080 | 0.7633 | 0.8737 |
| 0.0626 | 5.0237 | 2542 | 0.7798 | 0.2080 | 0.7798 | 0.8831 |
| 0.0626 | 5.0277 | 2544 | 0.8154 | 0.2080 | 0.8154 | 0.9030 |
| 0.0626 | 5.0316 | 2546 | 0.8344 | 0.2080 | 0.8344 | 0.9135 |
| 0.0626 | 5.0356 | 2548 | 0.8095 | 0.2080 | 0.8095 | 0.8997 |
| 0.0626 | 5.0395 | 2550 | 0.7777 | 0.2080 | 0.7777 | 0.8818 |
| 0.0626 | 5.0435 | 2552 | 0.7567 | 0.2080 | 0.7567 | 0.8699 |
| 0.0626 | 5.0474 | 2554 | 0.7548 | 0.2080 | 0.7548 | 0.8688 |
| 0.0626 | 5.0514 | 2556 | 0.7649 | 0.2080 | 0.7649 | 0.8746 |
| 0.0626 | 5.0553 | 2558 | 0.7691 | 0.2080 | 0.7691 | 0.8770 |
| 0.0626 | 5.0593 | 2560 | 0.7769 | 0.2080 | 0.7769 | 0.8814 |
| 0.0626 | 5.0632 | 2562 | 0.7741 | 0.2080 | 0.7741 | 0.8798 |
| 0.0626 | 5.0672 | 2564 | 0.7643 | 0.2029 | 0.7643 | 0.8742 |
| 0.0626 | 5.0711 | 2566 | 0.7637 | 0.3444 | 0.7637 | 0.8739 |
| 0.0626 | 5.0751 | 2568 | 0.7621 | 0.3444 | 0.7621 | 0.8730 |
| 0.0626 | 5.0791 | 2570 | 0.7624 | 0.2029 | 0.7624 | 0.8731 |
| 0.0626 | 5.0830 | 2572 | 0.7663 | 0.2029 | 0.7663 | 0.8754 |
| 0.0626 | 5.0870 | 2574 | 0.7716 | 0.2080 | 0.7716 | 0.8784 |
| 0.0626 | 5.0909 | 2576 | 0.7759 | 0.2080 | 0.7759 | 0.8808 |
| 0.0626 | 5.0949 | 2578 | 0.7759 | 0.2080 | 0.7759 | 0.8809 |
| 0.0626 | 5.0988 | 2580 | 0.7921 | 0.2080 | 0.7921 | 0.8900 |
| 0.0626 | 5.1028 | 2582 | 0.7819 | 0.2080 | 0.7819 | 0.8843 |
| 0.0626 | 5.1067 | 2584 | 0.7620 | 0.2029 | 0.7620 | 0.8729 |
| 0.0626 | 5.1107 | 2586 | 0.7601 | 0.1951 | 0.7601 | 0.8719 |
| 0.0626 | 5.1146 | 2588 | 0.7657 | 0.1951 | 0.7657 | 0.8751 |
| 0.0626 | 5.1186 | 2590 | 0.7626 | 0.3444 | 0.7626 | 0.8733 |
| 0.0626 | 5.1225 | 2592 | 0.7782 | 0.2080 | 0.7782 | 0.8822 |
| 0.0626 | 5.1265 | 2594 | 0.7976 | 0.2080 | 0.7976 | 0.8931 |
| 0.0626 | 5.1304 | 2596 | 0.7965 | 0.2080 | 0.7965 | 0.8925 |
| 0.0626 | 5.1344 | 2598 | 0.7927 | 0.2080 | 0.7927 | 0.8903 |
| 0.0626 | 5.1383 | 2600 | 0.7998 | 0.2080 | 0.7998 | 0.8943 |
| 0.0626 | 5.1423 | 2602 | 0.8187 | 0.2080 | 0.8187 | 0.9048 |
| 0.0626 | 5.1462 | 2604 | 0.8046 | 0.2080 | 0.8046 | 0.8970 |
| 0.0626 | 5.1502 | 2606 | 0.7859 | 0.2080 | 0.7859 | 0.8865 |
| 0.0626 | 5.1542 | 2608 | 0.7773 | 0.2080 | 0.7773 | 0.8817 |
| 0.0626 | 5.1581 | 2610 | 0.7857 | 0.2080 | 0.7857 | 0.8864 |
| 0.0626 | 5.1621 | 2612 | 0.8112 | 0.2080 | 0.8112 | 0.9007 |
| 0.0626 | 5.1660 | 2614 | 0.8238 | 0.2080 | 0.8238 | 0.9076 |
| 0.0626 | 5.1700 | 2616 | 0.8161 | 0.2080 | 0.8161 | 0.9034 |
| 0.0626 | 5.1739 | 2618 | 0.8047 | 0.0435 | 0.8047 | 0.8971 |
| 0.0626 | 5.1779 | 2620 | 0.8055 | 0.0435 | 0.8055 | 0.8975 |
| 0.0626 | 5.1818 | 2622 | 0.8169 | 0.2080 | 0.8169 | 0.9038 |
| 0.0626 | 5.1858 | 2624 | 0.8097 | 0.2080 | 0.8097 | 0.8998 |
| 0.0626 | 5.1897 | 2626 | 0.8061 | 0.2080 | 0.8061 | 0.8979 |
| 0.0626 | 5.1937 | 2628 | 0.8286 | 0.2080 | 0.8286 | 0.9103 |
| 0.0626 | 5.1976 | 2630 | 0.8550 | 0.2080 | 0.8550 | 0.9247 |
| 0.0626 | 5.2016 | 2632 | 0.8555 | 0.2080 | 0.8555 | 0.9250 |
| 0.0626 | 5.2055 | 2634 | 0.8454 | 0.2080 | 0.8454 | 0.9194 |
| 0.0626 | 5.2095 | 2636 | 0.8352 | 0.2080 | 0.8352 | 0.9139 |
| 0.0626 | 5.2134 | 2638 | 0.8328 | 0.2080 | 0.8328 | 0.9126 |
| 0.0626 | 5.2174 | 2640 | 0.7998 | 0.2080 | 0.7998 | 0.8943 |
| 0.0626 | 5.2213 | 2642 | 0.7871 | 0.2080 | 0.7871 | 0.8872 |
| 0.0626 | 5.2253 | 2644 | 0.7928 | 0.2080 | 0.7928 | 0.8904 |
| 0.0626 | 5.2292 | 2646 | 0.8084 | 0.2080 | 0.8084 | 0.8991 |
| 0.0626 | 5.2332 | 2648 | 0.7954 | 0.2080 | 0.7954 | 0.8918 |
| 0.0626 | 5.2372 | 2650 | 0.7695 | 0.2080 | 0.7695 | 0.8772 |
| 0.0626 | 5.2411 | 2652 | 0.7649 | 0.2080 | 0.7649 | 0.8746 |
| 0.0626 | 5.2451 | 2654 | 0.7694 | 0.2080 | 0.7694 | 0.8772 |
| 0.0626 | 5.2490 | 2656 | 0.7738 | 0.2080 | 0.7738 | 0.8797 |
| 0.0626 | 5.2530 | 2658 | 0.7804 | 0.2080 | 0.7804 | 0.8834 |
| 0.0626 | 5.2569 | 2660 | 0.7746 | 0.2080 | 0.7746 | 0.8801 |
| 0.0626 | 5.2609 | 2662 | 0.7721 | 0.2080 | 0.7721 | 0.8787 |
| 0.0626 | 5.2648 | 2664 | 0.7692 | 0.2080 | 0.7692 | 0.8770 |
| 0.0626 | 5.2688 | 2666 | 0.7701 | 0.2029 | 0.7701 | 0.8775 |
| 0.0626 | 5.2727 | 2668 | 0.7740 | 0.2029 | 0.7740 | 0.8798 |
| 0.0626 | 5.2767 | 2670 | 0.7769 | 0.2080 | 0.7769 | 0.8814 |
| 0.0626 | 5.2806 | 2672 | 0.8103 | 0.2080 | 0.8103 | 0.9002 |
| 0.0626 | 5.2846 | 2674 | 0.8243 | 0.2080 | 0.8243 | 0.9079 |
| 0.0626 | 5.2885 | 2676 | 0.8044 | 0.2080 | 0.8044 | 0.8969 |
| 0.0626 | 5.2925 | 2678 | 0.7890 | 0.2080 | 0.7890 | 0.8883 |
| 0.0626 | 5.2964 | 2680 | 0.7741 | 0.2080 | 0.7741 | 0.8798 |
| 0.0626 | 5.3004 | 2682 | 0.7698 | 0.2080 | 0.7698 | 0.8774 |
| 0.0626 | 5.3043 | 2684 | 0.7832 | 0.2080 | 0.7832 | 0.8850 |
| 0.0626 | 5.3083 | 2686 | 0.8346 | 0.2080 | 0.8346 | 0.9136 |
| 0.0626 | 5.3123 | 2688 | 0.8434 | 0.2080 | 0.8434 | 0.9184 |
| 0.0626 | 5.3162 | 2690 | 0.8071 | 0.2080 | 0.8071 | 0.8984 |
| 0.0626 | 5.3202 | 2692 | 0.7742 | 0.2080 | 0.7742 | 0.8799 |
| 0.0626 | 5.3241 | 2694 | 0.7645 | 0.2080 | 0.7645 | 0.8743 |
| 0.0626 | 5.3281 | 2696 | 0.7670 | 0.2080 | 0.7670 | 0.8758 |
| 0.0626 | 5.3320 | 2698 | 0.7744 | 0.2080 | 0.7744 | 0.8800 |
| 0.0626 | 5.3360 | 2700 | 0.8024 | 0.2080 | 0.8024 | 0.8958 |
| 0.0626 | 5.3399 | 2702 | 0.8036 | 0.2080 | 0.8036 | 0.8964 |
| 0.0626 | 5.3439 | 2704 | 0.8257 | 0.2080 | 0.8257 | 0.9087 |
| 0.0626 | 5.3478 | 2706 | 0.8257 | 0.2080 | 0.8257 | 0.9087 |
| 0.0626 | 5.3518 | 2708 | 0.8021 | 0.2080 | 0.8021 | 0.8956 |
| 0.0626 | 5.3557 | 2710 | 0.7848 | 0.2080 | 0.7848 | 0.8859 |
| 0.0626 | 5.3597 | 2712 | 0.7825 | 0.2029 | 0.7825 | 0.8846 |
| 0.0626 | 5.3636 | 2714 | 0.7880 | 0.2080 | 0.7880 | 0.8877 |
| 0.0626 | 5.3676 | 2716 | 0.8374 | 0.2080 | 0.8374 | 0.9151 |
| 0.0626 | 5.3715 | 2718 | 0.8866 | 0.2080 | 0.8866 | 0.9416 |
| 0.0626 | 5.3755 | 2720 | 0.8767 | 0.2080 | 0.8767 | 0.9363 |
| 0.0626 | 5.3794 | 2722 | 0.8142 | 0.2080 | 0.8142 | 0.9023 |
| 0.0626 | 5.3834 | 2724 | 0.7819 | 0.2080 | 0.7819 | 0.8842 |
| 0.0626 | 5.3874 | 2726 | 0.7730 | 0.2029 | 0.7730 | 0.8792 |
| 0.0626 | 5.3913 | 2728 | 0.7724 | 0.2029 | 0.7724 | 0.8789 |
| 0.0626 | 5.3953 | 2730 | 0.7800 | 0.2080 | 0.7800 | 0.8832 |
| 0.0626 | 5.3992 | 2732 | 0.8207 | 0.2080 | 0.8207 | 0.9059 |
| 0.0626 | 5.4032 | 2734 | 0.9098 | 0.0179 | 0.9098 | 0.9538 |
| 0.0626 | 5.4071 | 2736 | 0.9358 | 0.0179 | 0.9358 | 0.9673 |
| 0.0626 | 5.4111 | 2738 | 0.8900 | 0.2080 | 0.8900 | 0.9434 |
| 0.0626 | 5.4150 | 2740 | 0.8096 | 0.2080 | 0.8096 | 0.8998 |
| 0.0626 | 5.4190 | 2742 | 0.7722 | 0.2029 | 0.7722 | 0.8788 |
| 0.0626 | 5.4229 | 2744 | 0.7691 | 0.2029 | 0.7691 | 0.8770 |
| 0.0626 | 5.4269 | 2746 | 0.7747 | 0.2080 | 0.7747 | 0.8802 |
| 0.0626 | 5.4308 | 2748 | 0.7942 | 0.2080 | 0.7942 | 0.8912 |
| 0.0626 | 5.4348 | 2750 | 0.8013 | 0.2080 | 0.8013 | 0.8952 |
| 0.0626 | 5.4387 | 2752 | 0.7879 | 0.2080 | 0.7879 | 0.8876 |
| 0.0626 | 5.4427 | 2754 | 0.7868 | 0.2080 | 0.7868 | 0.8870 |
| 0.0626 | 5.4466 | 2756 | 0.7926 | 0.2080 | 0.7926 | 0.8903 |
| 0.0626 | 5.4506 | 2758 | 0.8007 | 0.2080 | 0.8007 | 0.8948 |
| 0.0626 | 5.4545 | 2760 | 0.8114 | 0.2080 | 0.8114 | 0.9008 |
| 0.0626 | 5.4585 | 2762 | 0.8120 | 0.2080 | 0.8120 | 0.9011 |
| 0.0626 | 5.4625 | 2764 | 0.8073 | 0.2080 | 0.8073 | 0.8985 |
| 0.0626 | 5.4664 | 2766 | 0.8286 | 0.2080 | 0.8286 | 0.9103 |
| 0.0626 | 5.4704 | 2768 | 0.8584 | 0.2080 | 0.8584 | 0.9265 |
| 0.0626 | 5.4743 | 2770 | 0.8454 | 0.2080 | 0.8454 | 0.9195 |
| 0.0626 | 5.4783 | 2772 | 0.8092 | 0.2080 | 0.8092 | 0.8996 |
| 0.0626 | 5.4822 | 2774 | 0.7626 | 0.2080 | 0.7626 | 0.8733 |
| 0.0626 | 5.4862 | 2776 | 0.7478 | 0.2029 | 0.7478 | 0.8647 |
| 0.0626 | 5.4901 | 2778 | 0.7494 | 0.2029 | 0.7494 | 0.8657 |
| 0.0626 | 5.4941 | 2780 | 0.7545 | 0.2029 | 0.7545 | 0.8686 |
| 0.0626 | 5.4980 | 2782 | 0.7662 | 0.2080 | 0.7662 | 0.8753 |
| 0.0626 | 5.5020 | 2784 | 0.7998 | 0.2080 | 0.7998 | 0.8943 |
| 0.0626 | 5.5059 | 2786 | 0.8177 | 0.2080 | 0.8177 | 0.9043 |
| 0.0626 | 5.5099 | 2788 | 0.8011 | 0.2080 | 0.8011 | 0.8950 |
| 0.0626 | 5.5138 | 2790 | 0.7851 | 0.2029 | 0.7851 | 0.8861 |
| 0.0626 | 5.5178 | 2792 | 0.7944 | 0.0530 | 0.7944 | 0.8913 |
| 0.0626 | 5.5217 | 2794 | 0.7953 | 0.0530 | 0.7953 | 0.8918 |
| 0.0626 | 5.5257 | 2796 | 0.7931 | 0.2029 | 0.7931 | 0.8906 |
| 0.0626 | 5.5296 | 2798 | 0.8023 | 0.2080 | 0.8023 | 0.8957 |
| 0.0626 | 5.5336 | 2800 | 0.8000 | 0.2080 | 0.8000 | 0.8944 |
| 0.0626 | 5.5375 | 2802 | 0.8094 | 0.2080 | 0.8094 | 0.8996 |
| 0.0626 | 5.5415 | 2804 | 0.8054 | 0.2080 | 0.8054 | 0.8974 |
| 0.0626 | 5.5455 | 2806 | 0.8232 | 0.2080 | 0.8232 | 0.9073 |
| 0.0626 | 5.5494 | 2808 | 0.8389 | 0.2080 | 0.8389 | 0.9159 |
| 0.0626 | 5.5534 | 2810 | 0.8196 | 0.2080 | 0.8196 | 0.9053 |
| 0.0626 | 5.5573 | 2812 | 0.8021 | 0.2080 | 0.8021 | 0.8956 |
| 0.0626 | 5.5613 | 2814 | 0.8092 | 0.2080 | 0.8092 | 0.8996 |
| 0.0626 | 5.5652 | 2816 | 0.8287 | 0.2080 | 0.8287 | 0.9103 |
| 0.0626 | 5.5692 | 2818 | 0.8344 | 0.2080 | 0.8344 | 0.9134 |
| 0.0626 | 5.5731 | 2820 | 0.8148 | 0.2080 | 0.8148 | 0.9026 |
| 0.0626 | 5.5771 | 2822 | 0.7924 | 0.2080 | 0.7924 | 0.8902 |
| 0.0626 | 5.5810 | 2824 | 0.7908 | 0.0530 | 0.7908 | 0.8893 |
| 0.0626 | 5.5850 | 2826 | 0.7928 | 0.0530 | 0.7928 | 0.8904 |
| 0.0626 | 5.5889 | 2828 | 0.7934 | 0.2029 | 0.7934 | 0.8907 |
| 0.0626 | 5.5929 | 2830 | 0.8209 | 0.2080 | 0.8209 | 0.9060 |
| 0.0626 | 5.5968 | 2832 | 0.8783 | 0.2080 | 0.8783 | 0.9372 |
| 0.0626 | 5.6008 | 2834 | 0.9261 | 0.2080 | 0.9261 | 0.9624 |
| 0.0626 | 5.6047 | 2836 | 0.9167 | 0.2080 | 0.9167 | 0.9574 |
| 0.0626 | 5.6087 | 2838 | 0.8639 | 0.2080 | 0.8639 | 0.9295 |
| 0.0626 | 5.6126 | 2840 | 0.8282 | 0.2080 | 0.8282 | 0.9101 |
| 0.0626 | 5.6166 | 2842 | 0.8052 | 0.2080 | 0.8052 | 0.8973 |
| 0.0626 | 5.6206 | 2844 | 0.7898 | 0.2029 | 0.7898 | 0.8887 |
| 0.0626 | 5.6245 | 2846 | 0.7811 | 0.2029 | 0.7811 | 0.8838 |
| 0.0626 | 5.6285 | 2848 | 0.7866 | 0.2080 | 0.7866 | 0.8869 |
| 0.0626 | 5.6324 | 2850 | 0.7946 | 0.2080 | 0.7946 | 0.8914 |
| 0.0626 | 5.6364 | 2852 | 0.7842 | 0.2080 | 0.7842 | 0.8856 |
| 0.0626 | 5.6403 | 2854 | 0.7738 | 0.2080 | 0.7738 | 0.8797 |
| 0.0626 | 5.6443 | 2856 | 0.7604 | 0.2029 | 0.7604 | 0.8720 |
| 0.0626 | 5.6482 | 2858 | 0.7602 | 0.2029 | 0.7602 | 0.8719 |
| 0.0626 | 5.6522 | 2860 | 0.7567 | 0.2029 | 0.7567 | 0.8699 |
| 0.0626 | 5.6561 | 2862 | 0.7575 | 0.2029 | 0.7575 | 0.8703 |
| 0.0626 | 5.6601 | 2864 | 0.7671 | 0.2029 | 0.7671 | 0.8759 |
| 0.0626 | 5.6640 | 2866 | 0.7729 | 0.2029 | 0.7729 | 0.8791 |
| 0.0626 | 5.6680 | 2868 | 0.7934 | 0.2080 | 0.7934 | 0.8907 |
| 0.0626 | 5.6719 | 2870 | 0.7839 | 0.2080 | 0.7839 | 0.8854 |
| 0.0626 | 5.6759 | 2872 | 0.7662 | 0.2029 | 0.7662 | 0.8753 |
| 0.0626 | 5.6798 | 2874 | 0.7567 | 0.2029 | 0.7567 | 0.8699 |
| 0.0626 | 5.6838 | 2876 | 0.7639 | 0.2029 | 0.7639 | 0.8740 |
| 0.0626 | 5.6877 | 2878 | 0.7675 | 0.2080 | 0.7675 | 0.8761 |
| 0.0626 | 5.6917 | 2880 | 0.7736 | 0.2080 | 0.7736 | 0.8796 |
| 0.0626 | 5.6957 | 2882 | 0.7781 | 0.2080 | 0.7781 | 0.8821 |
| 0.0626 | 5.6996 | 2884 | 0.7651 | 0.2080 | 0.7651 | 0.8747 |
| 0.0626 | 5.7036 | 2886 | 0.7560 | 0.2080 | 0.7560 | 0.8695 |
| 0.0626 | 5.7075 | 2888 | 0.7603 | 0.2080 | 0.7603 | 0.8719 |
| 0.0626 | 5.7115 | 2890 | 0.7870 | 0.2080 | 0.7870 | 0.8871 |
| 0.0626 | 5.7154 | 2892 | 0.7917 | 0.2080 | 0.7917 | 0.8898 |
| 0.0626 | 5.7194 | 2894 | 0.7711 | 0.2080 | 0.7711 | 0.8781 |
| 0.0626 | 5.7233 | 2896 | 0.7648 | 0.2080 | 0.7648 | 0.8745 |
| 0.0626 | 5.7273 | 2898 | 0.7486 | 0.2080 | 0.7486 | 0.8652 |
| 0.0626 | 5.7312 | 2900 | 0.7518 | 0.2080 | 0.7518 | 0.8671 |
| 0.0626 | 5.7352 | 2902 | 0.7646 | 0.2080 | 0.7646 | 0.8744 |
| 0.0626 | 5.7391 | 2904 | 0.7993 | 0.2080 | 0.7993 | 0.8940 |
| 0.0626 | 5.7431 | 2906 | 0.8236 | 0.2080 | 0.8236 | 0.9075 |
| 0.0626 | 5.7470 | 2908 | 0.8444 | 0.2080 | 0.8444 | 0.9189 |
| 0.0626 | 5.7510 | 2910 | 0.8143 | 0.2080 | 0.8143 | 0.9024 |
| 0.0626 | 5.7549 | 2912 | 0.7798 | 0.2080 | 0.7798 | 0.8831 |
| 0.0626 | 5.7589 | 2914 | 0.7586 | 0.2029 | 0.7586 | 0.8710 |
| 0.0626 | 5.7628 | 2916 | 0.7527 | 0.2029 | 0.7527 | 0.8676 |
| 0.0626 | 5.7668 | 2918 | 0.7476 | 0.2029 | 0.7476 | 0.8646 |
| 0.0626 | 5.7708 | 2920 | 0.7473 | 0.2029 | 0.7473 | 0.8645 |
| 0.0626 | 5.7747 | 2922 | 0.7563 | 0.2080 | 0.7563 | 0.8697 |
| 0.0626 | 5.7787 | 2924 | 0.7939 | 0.2080 | 0.7939 | 0.8910 |
| 0.0626 | 5.7826 | 2926 | 0.8177 | 0.2080 | 0.8177 | 0.9043 |
| 0.0626 | 5.7866 | 2928 | 0.8072 | 0.2080 | 0.8072 | 0.8984 |
| 0.0626 | 5.7905 | 2930 | 0.7863 | 0.2080 | 0.7863 | 0.8867 |
| 0.0626 | 5.7945 | 2932 | 0.7754 | 0.2080 | 0.7754 | 0.8806 |
| 0.0626 | 5.7984 | 2934 | 0.7572 | 0.2029 | 0.7572 | 0.8701 |
| 0.0626 | 5.8024 | 2936 | 0.7580 | 0.2029 | 0.7580 | 0.8706 |
| 0.0626 | 5.8063 | 2938 | 0.7636 | 0.2080 | 0.7636 | 0.8739 |
| 0.0626 | 5.8103 | 2940 | 0.7866 | 0.2080 | 0.7866 | 0.8869 |
| 0.0626 | 5.8142 | 2942 | 0.8005 | 0.2080 | 0.8005 | 0.8947 |
| 0.0626 | 5.8182 | 2944 | 0.7935 | 0.2080 | 0.7935 | 0.8908 |
| 0.0626 | 5.8221 | 2946 | 0.7630 | 0.2080 | 0.7630 | 0.8735 |
| 0.0626 | 5.8261 | 2948 | 0.7542 | 0.2029 | 0.7542 | 0.8685 |
| 0.0626 | 5.8300 | 2950 | 0.7625 | 0.2080 | 0.7625 | 0.8732 |
| 0.0626 | 5.8340 | 2952 | 0.7795 | 0.2080 | 0.7795 | 0.8829 |
| 0.0626 | 5.8379 | 2954 | 0.7954 | 0.2080 | 0.7954 | 0.8918 |
| 0.0626 | 5.8419 | 2956 | 0.7797 | 0.2080 | 0.7797 | 0.8830 |
| 0.0626 | 5.8458 | 2958 | 0.7484 | 0.2029 | 0.7484 | 0.8651 |
| 0.0626 | 5.8498 | 2960 | 0.7402 | 0.2029 | 0.7402 | 0.8603 |
| 0.0626 | 5.8538 | 2962 | 0.7430 | 0.2029 | 0.7430 | 0.8620 |
| 0.0626 | 5.8577 | 2964 | 0.7559 | 0.2029 | 0.7559 | 0.8694 |
| 0.0626 | 5.8617 | 2966 | 0.7645 | 0.2029 | 0.7645 | 0.8743 |
| 0.0626 | 5.8656 | 2968 | 0.7808 | 0.2080 | 0.7808 | 0.8836 |
| 0.0626 | 5.8696 | 2970 | 0.7774 | 0.2080 | 0.7774 | 0.8817 |
| 0.0626 | 5.8735 | 2972 | 0.7602 | 0.2080 | 0.7602 | 0.8719 |
| 0.0626 | 5.8775 | 2974 | 0.7427 | 0.2029 | 0.7427 | 0.8618 |
| 0.0626 | 5.8814 | 2976 | 0.7362 | 0.2029 | 0.7362 | 0.8580 |
| 0.0626 | 5.8854 | 2978 | 0.7344 | 0.2029 | 0.7344 | 0.8570 |
| 0.0626 | 5.8893 | 2980 | 0.7335 | 0.2029 | 0.7335 | 0.8564 |
| 0.0626 | 5.8933 | 2982 | 0.7462 | 0.2029 | 0.7462 | 0.8638 |
| 0.0626 | 5.8972 | 2984 | 0.7686 | 0.2080 | 0.7686 | 0.8767 |
| 0.0626 | 5.9012 | 2986 | 0.7993 | 0.2080 | 0.7993 | 0.8940 |
| 0.0626 | 5.9051 | 2988 | 0.7940 | 0.2080 | 0.7940 | 0.8911 |
| 0.0626 | 5.9091 | 2990 | 0.7648 | 0.2080 | 0.7648 | 0.8745 |
| 0.0626 | 5.9130 | 2992 | 0.7284 | 0.2029 | 0.7284 | 0.8535 |
| 0.0626 | 5.9170 | 2994 | 0.7075 | 0.2029 | 0.7075 | 0.8411 |
| 0.0626 | 5.9209 | 2996 | 0.7046 | 0.2029 | 0.7046 | 0.8394 |
| 0.0626 | 5.9249 | 2998 | 0.7089 | 0.2029 | 0.7089 | 0.8420 |
| 0.0548 | 5.9289 | 3000 | 0.7197 | 0.2029 | 0.7197 | 0.8483 |
| 0.0548 | 5.9328 | 3002 | 0.7360 | 0.2080 | 0.7360 | 0.8579 |
| 0.0548 | 5.9368 | 3004 | 0.7459 | 0.2080 | 0.7459 | 0.8637 |
| 0.0548 | 5.9407 | 3006 | 0.7539 | 0.2080 | 0.7539 | 0.8683 |
| 0.0548 | 5.9447 | 3008 | 0.7604 | 0.2080 | 0.7604 | 0.8720 |
| 0.0548 | 5.9486 | 3010 | 0.7681 | 0.2080 | 0.7681 | 0.8764 |
| 0.0548 | 5.9526 | 3012 | 0.7543 | 0.2080 | 0.7543 | 0.8685 |
| 0.0548 | 5.9565 | 3014 | 0.7530 | 0.2029 | 0.7530 | 0.8678 |
| 0.0548 | 5.9605 | 3016 | 0.7486 | 0.2029 | 0.7486 | 0.8652 |
| 0.0548 | 5.9644 | 3018 | 0.7459 | 0.2029 | 0.7459 | 0.8637 |
| 0.0548 | 5.9684 | 3020 | 0.7500 | 0.2029 | 0.7500 | 0.8660 |
| 0.0548 | 5.9723 | 3022 | 0.7586 | 0.2029 | 0.7586 | 0.8710 |
| 0.0548 | 5.9763 | 3024 | 0.7617 | 0.2029 | 0.7617 | 0.8728 |
| 0.0548 | 5.9802 | 3026 | 0.7642 | 0.2029 | 0.7642 | 0.8742 |
| 0.0548 | 5.9842 | 3028 | 0.7959 | 0.2080 | 0.7959 | 0.8921 |
| 0.0548 | 5.9881 | 3030 | 0.8297 | 0.2080 | 0.8297 | 0.9109 |
| 0.0548 | 5.9921 | 3032 | 0.8205 | 0.2080 | 0.8205 | 0.9058 |
| 0.0548 | 5.9960 | 3034 | 0.7933 | 0.2080 | 0.7933 | 0.8907 |
| 0.0548 | 6.0000 | 3036 | 0.7545 | 0.2029 | 0.7545 | 0.8686 |
| 0.0548 | 6.0040 | 3038 | 0.7459 | 0.2029 | 0.7459 | 0.8637 |
| 0.0548 | 6.0079 | 3040 | 0.7532 | 0.2029 | 0.7532 | 0.8679 |
| 0.0548 | 6.0119 | 3042 | 0.7777 | 0.2080 | 0.7777 | 0.8819 |
| 0.0548 | 6.0158 | 3044 | 0.8110 | 0.2080 | 0.8110 | 0.9006 |
| 0.0548 | 6.0198 | 3046 | 0.8159 | 0.2080 | 0.8159 | 0.9033 |
| 0.0548 | 6.0237 | 3048 | 0.7962 | 0.2080 | 0.7962 | 0.8923 |
| 0.0548 | 6.0277 | 3050 | 0.7854 | 0.2080 | 0.7854 | 0.8862 |
| 0.0548 | 6.0316 | 3052 | 0.7673 | 0.2080 | 0.7673 | 0.8759 |
| 0.0548 | 6.0356 | 3054 | 0.7639 | 0.2080 | 0.7639 | 0.8740 |
| 0.0548 | 6.0395 | 3056 | 0.7702 | 0.2080 | 0.7702 | 0.8776 |
| 0.0548 | 6.0435 | 3058 | 0.7898 | 0.2080 | 0.7898 | 0.8887 |
| 0.0548 | 6.0474 | 3060 | 0.8290 | 0.2080 | 0.8290 | 0.9105 |
| 0.0548 | 6.0514 | 3062 | 0.8603 | 0.2080 | 0.8603 | 0.9275 |
| 0.0548 | 6.0553 | 3064 | 0.8666 | 0.2080 | 0.8666 | 0.9309 |
| 0.0548 | 6.0593 | 3066 | 0.8357 | 0.2080 | 0.8357 | 0.9142 |
| 0.0548 | 6.0632 | 3068 | 0.8115 | 0.2080 | 0.8115 | 0.9008 |
| 0.0548 | 6.0672 | 3070 | 0.8058 | 0.2080 | 0.8058 | 0.8976 |
| 0.0548 | 6.0711 | 3072 | 0.8228 | 0.2080 | 0.8228 | 0.9071 |
| 0.0548 | 6.0751 | 3074 | 0.8311 | 0.2080 | 0.8311 | 0.9117 |
| 0.0548 | 6.0791 | 3076 | 0.8475 | 0.2080 | 0.8475 | 0.9206 |
| 0.0548 | 6.0830 | 3078 | 0.8522 | 0.2080 | 0.8522 | 0.9231 |
| 0.0548 | 6.0870 | 3080 | 0.8468 | 0.2080 | 0.8468 | 0.9202 |
| 0.0548 | 6.0909 | 3082 | 0.8200 | 0.2080 | 0.8200 | 0.9055 |
| 0.0548 | 6.0949 | 3084 | 0.8082 | 0.2080 | 0.8082 | 0.8990 |
| 0.0548 | 6.0988 | 3086 | 0.8038 | 0.2080 | 0.8038 | 0.8965 |
| 0.0548 | 6.1028 | 3088 | 0.8173 | 0.2080 | 0.8173 | 0.9041 |
| 0.0548 | 6.1067 | 3090 | 0.8430 | 0.2080 | 0.8430 | 0.9181 |
| 0.0548 | 6.1107 | 3092 | 0.8571 | 0.2080 | 0.8571 | 0.9258 |
| 0.0548 | 6.1146 | 3094 | 0.8483 | 0.2080 | 0.8483 | 0.9211 |
| 0.0548 | 6.1186 | 3096 | 0.8124 | 0.2080 | 0.8124 | 0.9013 |
| 0.0548 | 6.1225 | 3098 | 0.7900 | 0.2029 | 0.7900 | 0.8888 |
| 0.0548 | 6.1265 | 3100 | 0.7937 | -0.0732 | 0.7937 | 0.8909 |
| 0.0548 | 6.1304 | 3102 | 0.7926 | -0.0732 | 0.7926 | 0.8903 |
| 0.0548 | 6.1344 | 3104 | 0.7862 | 0.2029 | 0.7862 | 0.8867 |
| 0.0548 | 6.1383 | 3106 | 0.7900 | 0.2080 | 0.7900 | 0.8888 |
| 0.0548 | 6.1423 | 3108 | 0.8118 | 0.2080 | 0.8118 | 0.9010 |
| 0.0548 | 6.1462 | 3110 | 0.8252 | 0.2080 | 0.8252 | 0.9084 |
| 0.0548 | 6.1502 | 3112 | 0.8339 | 0.2080 | 0.8339 | 0.9132 |
| 0.0548 | 6.1542 | 3114 | 0.8530 | 0.2080 | 0.8530 | 0.9236 |
| 0.0548 | 6.1581 | 3116 | 0.8514 | 0.2080 | 0.8514 | 0.9227 |
| 0.0548 | 6.1621 | 3118 | 0.8166 | 0.2080 | 0.8166 | 0.9036 |
| 0.0548 | 6.1660 | 3120 | 0.7830 | 0.2080 | 0.7830 | 0.8849 |
| 0.0548 | 6.1700 | 3122 | 0.7780 | 0.2029 | 0.7780 | 0.8820 |
| 0.0548 | 6.1739 | 3124 | 0.7814 | 0.2029 | 0.7814 | 0.8839 |
| 0.0548 | 6.1779 | 3126 | 0.7830 | 0.2080 | 0.7830 | 0.8849 |
| 0.0548 | 6.1818 | 3128 | 0.8017 | 0.2080 | 0.8017 | 0.8954 |
| 0.0548 | 6.1858 | 3130 | 0.8181 | 0.2080 | 0.8181 | 0.9045 |
| 0.0548 | 6.1897 | 3132 | 0.8468 | 0.2080 | 0.8468 | 0.9202 |
| 0.0548 | 6.1937 | 3134 | 0.8463 | 0.2080 | 0.8463 | 0.9200 |
| 0.0548 | 6.1976 | 3136 | 0.8200 | 0.2080 | 0.8200 | 0.9055 |
| 0.0548 | 6.2016 | 3138 | 0.8019 | 0.2080 | 0.8019 | 0.8955 |
| 0.0548 | 6.2055 | 3140 | 0.7898 | 0.2029 | 0.7898 | 0.8887 |
| 0.0548 | 6.2095 | 3142 | 0.7838 | 0.2029 | 0.7838 | 0.8853 |
| 0.0548 | 6.2134 | 3144 | 0.7775 | 0.2029 | 0.7775 | 0.8818 |
| 0.0548 | 6.2174 | 3146 | 0.7749 | 0.2080 | 0.7749 | 0.8803 |
| 0.0548 | 6.2213 | 3148 | 0.8041 | 0.2080 | 0.8041 | 0.8967 |
| 0.0548 | 6.2253 | 3150 | 0.8280 | 0.2080 | 0.8280 | 0.9099 |
| 0.0548 | 6.2292 | 3152 | 0.8170 | 0.2080 | 0.8170 | 0.9039 |
| 0.0548 | 6.2332 | 3154 | 0.8171 | 0.2080 | 0.8171 | 0.9039 |
| 0.0548 | 6.2372 | 3156 | 0.8221 | 0.2080 | 0.8221 | 0.9067 |
| 0.0548 | 6.2411 | 3158 | 0.8050 | 0.2080 | 0.8050 | 0.8972 |
| 0.0548 | 6.2451 | 3160 | 0.7706 | 0.2080 | 0.7706 | 0.8778 |
| 0.0548 | 6.2490 | 3162 | 0.7553 | 0.2029 | 0.7553 | 0.8691 |
| 0.0548 | 6.2530 | 3164 | 0.7568 | 0.2029 | 0.7568 | 0.8700 |
| 0.0548 | 6.2569 | 3166 | 0.7637 | 0.2080 | 0.7637 | 0.8739 |
| 0.0548 | 6.2609 | 3168 | 0.7992 | 0.2080 | 0.7992 | 0.8940 |
| 0.0548 | 6.2648 | 3170 | 0.8647 | 0.2080 | 0.8647 | 0.9299 |
| 0.0548 | 6.2688 | 3172 | 0.8996 | 0.2080 | 0.8996 | 0.9485 |
| 0.0548 | 6.2727 | 3174 | 0.8751 | 0.2080 | 0.8751 | 0.9355 |
| 0.0548 | 6.2767 | 3176 | 0.8228 | 0.2080 | 0.8228 | 0.9071 |
| 0.0548 | 6.2806 | 3178 | 0.7947 | 0.2080 | 0.7947 | 0.8915 |
| 0.0548 | 6.2846 | 3180 | 0.7797 | 0.2029 | 0.7797 | 0.8830 |
| 0.0548 | 6.2885 | 3182 | 0.7949 | -0.0732 | 0.7949 | 0.8915 |
| 0.0548 | 6.2925 | 3184 | 0.7970 | -0.0732 | 0.7970 | 0.8927 |
| 0.0548 | 6.2964 | 3186 | 0.7835 | 0.2029 | 0.7835 | 0.8852 |
| 0.0548 | 6.3004 | 3188 | 0.7817 | 0.2029 | 0.7817 | 0.8842 |
| 0.0548 | 6.3043 | 3190 | 0.7857 | 0.2080 | 0.7857 | 0.8864 |
| 0.0548 | 6.3083 | 3192 | 0.7947 | 0.2080 | 0.7947 | 0.8915 |
| 0.0548 | 6.3123 | 3194 | 0.8230 | 0.2080 | 0.8230 | 0.9072 |
| 0.0548 | 6.3162 | 3196 | 0.8480 | 0.2080 | 0.8480 | 0.9209 |
| 0.0548 | 6.3202 | 3198 | 0.8567 | 0.2080 | 0.8567 | 0.9256 |
| 0.0548 | 6.3241 | 3200 | 0.8439 | 0.2080 | 0.8439 | 0.9187 |
| 0.0548 | 6.3281 | 3202 | 0.8128 | 0.2080 | 0.8128 | 0.9015 |
| 0.0548 | 6.3320 | 3204 | 0.7796 | 0.2080 | 0.7796 | 0.8829 |
| 0.0548 | 6.3360 | 3206 | 0.7731 | 0.2029 | 0.7731 | 0.8793 |
| 0.0548 | 6.3399 | 3208 | 0.7768 | 0.0530 | 0.7768 | 0.8813 |
| 0.0548 | 6.3439 | 3210 | 0.7773 | 0.2029 | 0.7773 | 0.8816 |
| 0.0548 | 6.3478 | 3212 | 0.7782 | 0.2080 | 0.7782 | 0.8822 |
| 0.0548 | 6.3518 | 3214 | 0.7837 | 0.2080 | 0.7837 | 0.8853 |
| 0.0548 | 6.3557 | 3216 | 0.7824 | 0.2080 | 0.7824 | 0.8845 |
| 0.0548 | 6.3597 | 3218 | 0.7851 | 0.2080 | 0.7851 | 0.8860 |
| 0.0548 | 6.3636 | 3220 | 0.7821 | 0.2080 | 0.7821 | 0.8843 |
| 0.0548 | 6.3676 | 3222 | 0.7847 | 0.2080 | 0.7847 | 0.8858 |
| 0.0548 | 6.3715 | 3224 | 0.7776 | 0.2080 | 0.7776 | 0.8818 |
| 0.0548 | 6.3755 | 3226 | 0.7711 | 0.2080 | 0.7711 | 0.8781 |
| 0.0548 | 6.3794 | 3228 | 0.7692 | 0.2080 | 0.7692 | 0.8770 |
| 0.0548 | 6.3834 | 3230 | 0.7759 | 0.2080 | 0.7759 | 0.8809 |
| 0.0548 | 6.3874 | 3232 | 0.7753 | 0.2080 | 0.7753 | 0.8805 |
| 0.0548 | 6.3913 | 3234 | 0.7745 | 0.2080 | 0.7745 | 0.8801 |
| 0.0548 | 6.3953 | 3236 | 0.7664 | 0.2080 | 0.7664 | 0.8754 |
| 0.0548 | 6.3992 | 3238 | 0.7546 | 0.2080 | 0.7546 | 0.8687 |
| 0.0548 | 6.4032 | 3240 | 0.7475 | 0.2080 | 0.7475 | 0.8646 |
| 0.0548 | 6.4071 | 3242 | 0.7409 | 0.2029 | 0.7409 | 0.8608 |
| 0.0548 | 6.4111 | 3244 | 0.7438 | 0.3444 | 0.7438 | 0.8624 |
| 0.0548 | 6.4150 | 3246 | 0.7398 | 0.2029 | 0.7398 | 0.8601 |
| 0.0548 | 6.4190 | 3248 | 0.7429 | 0.2080 | 0.7429 | 0.8619 |
| 0.0548 | 6.4229 | 3250 | 0.7600 | 0.2080 | 0.7600 | 0.8718 |
| 0.0548 | 6.4269 | 3252 | 0.7811 | 0.2080 | 0.7811 | 0.8838 |
| 0.0548 | 6.4308 | 3254 | 0.7945 | 0.2080 | 0.7945 | 0.8914 |
| 0.0548 | 6.4348 | 3256 | 0.7933 | 0.2080 | 0.7933 | 0.8907 |
| 0.0548 | 6.4387 | 3258 | 0.7769 | 0.2080 | 0.7769 | 0.8814 |
| 0.0548 | 6.4427 | 3260 | 0.7581 | 0.2080 | 0.7581 | 0.8707 |
| 0.0548 | 6.4466 | 3262 | 0.7587 | 0.2029 | 0.7587 | 0.8711 |
| 0.0548 | 6.4506 | 3264 | 0.7626 | 0.2029 | 0.7626 | 0.8733 |
| 0.0548 | 6.4545 | 3266 | 0.7605 | 0.2029 | 0.7605 | 0.8721 |
| 0.0548 | 6.4585 | 3268 | 0.7621 | 0.2029 | 0.7621 | 0.8730 |
| 0.0548 | 6.4625 | 3270 | 0.7730 | 0.2080 | 0.7730 | 0.8792 |
| 0.0548 | 6.4664 | 3272 | 0.7969 | 0.2080 | 0.7969 | 0.8927 |
| 0.0548 | 6.4704 | 3274 | 0.8027 | 0.2080 | 0.8027 | 0.8959 |
| 0.0548 | 6.4743 | 3276 | 0.7944 | 0.2080 | 0.7944 | 0.8913 |
| 0.0548 | 6.4783 | 3278 | 0.7730 | 0.2080 | 0.7730 | 0.8792 |
| 0.0548 | 6.4822 | 3280 | 0.7653 | 0.2029 | 0.7653 | 0.8748 |
| 0.0548 | 6.4862 | 3282 | 0.7623 | 0.2029 | 0.7623 | 0.8731 |
| 0.0548 | 6.4901 | 3284 | 0.7617 | 0.2080 | 0.7617 | 0.8727 |
| 0.0548 | 6.4941 | 3286 | 0.7644 | 0.2080 | 0.7644 | 0.8743 |
| 0.0548 | 6.4980 | 3288 | 0.7633 | 0.2080 | 0.7633 | 0.8737 |
| 0.0548 | 6.5020 | 3290 | 0.7611 | 0.2080 | 0.7611 | 0.8724 |
| 0.0548 | 6.5059 | 3292 | 0.7701 | 0.2080 | 0.7701 | 0.8775 |
| 0.0548 | 6.5099 | 3294 | 0.7861 | 0.2080 | 0.7861 | 0.8866 |
| 0.0548 | 6.5138 | 3296 | 0.7969 | 0.2080 | 0.7969 | 0.8927 |
| 0.0548 | 6.5178 | 3298 | 0.7912 | 0.2080 | 0.7912 | 0.8895 |
| 0.0548 | 6.5217 | 3300 | 0.7758 | 0.2080 | 0.7758 | 0.8808 |
| 0.0548 | 6.5257 | 3302 | 0.7680 | 0.2080 | 0.7680 | 0.8763 |
| 0.0548 | 6.5296 | 3304 | 0.7707 | 0.2080 | 0.7707 | 0.8779 |
| 0.0548 | 6.5336 | 3306 | 0.7695 | 0.2080 | 0.7695 | 0.8772 |
| 0.0548 | 6.5375 | 3308 | 0.7723 | 0.2080 | 0.7723 | 0.8788 |
| 0.0548 | 6.5415 | 3310 | 0.7801 | 0.2080 | 0.7801 | 0.8832 |
| 0.0548 | 6.5455 | 3312 | 0.7924 | 0.2080 | 0.7924 | 0.8902 |
| 0.0548 | 6.5494 | 3314 | 0.8087 | 0.2080 | 0.8087 | 0.8993 |
| 0.0548 | 6.5534 | 3316 | 0.8113 | 0.2080 | 0.8113 | 0.9007 |
| 0.0548 | 6.5573 | 3318 | 0.8060 | 0.2080 | 0.8060 | 0.8978 |
| 0.0548 | 6.5613 | 3320 | 0.8054 | 0.2080 | 0.8054 | 0.8974 |
| 0.0548 | 6.5652 | 3322 | 0.8025 | 0.2080 | 0.8025 | 0.8958 |
| 0.0548 | 6.5692 | 3324 | 0.7988 | 0.2080 | 0.7988 | 0.8938 |
| 0.0548 | 6.5731 | 3326 | 0.7967 | 0.2080 | 0.7967 | 0.8926 |
| 0.0548 | 6.5771 | 3328 | 0.8097 | 0.2080 | 0.8097 | 0.8999 |
| 0.0548 | 6.5810 | 3330 | 0.8451 | 0.2080 | 0.8451 | 0.9193 |
| 0.0548 | 6.5850 | 3332 | 0.8640 | 0.2080 | 0.8640 | 0.9295 |
| 0.0548 | 6.5889 | 3334 | 0.8466 | 0.2080 | 0.8466 | 0.9201 |
| 0.0548 | 6.5929 | 3336 | 0.8209 | 0.2080 | 0.8209 | 0.9060 |
| 0.0548 | 6.5968 | 3338 | 0.7949 | 0.2080 | 0.7949 | 0.8916 |
| 0.0548 | 6.6008 | 3340 | 0.7883 | 0.2029 | 0.7883 | 0.8879 |
| 0.0548 | 6.6047 | 3342 | 0.7960 | -0.0732 | 0.7960 | 0.8922 |
| 0.0548 | 6.6087 | 3344 | 0.7940 | -0.0732 | 0.7940 | 0.8910 |
| 0.0548 | 6.6126 | 3346 | 0.7822 | 0.2029 | 0.7822 | 0.8844 |
| 0.0548 | 6.6166 | 3348 | 0.7801 | 0.2080 | 0.7801 | 0.8832 |
| 0.0548 | 6.6206 | 3350 | 0.8020 | 0.2080 | 0.8020 | 0.8955 |
| 0.0548 | 6.6245 | 3352 | 0.8398 | 0.2080 | 0.8398 | 0.9164 |
| 0.0548 | 6.6285 | 3354 | 0.8535 | 0.2080 | 0.8535 | 0.9238 |
| 0.0548 | 6.6324 | 3356 | 0.8440 | 0.2080 | 0.8440 | 0.9187 |
| 0.0548 | 6.6364 | 3358 | 0.8136 | 0.2080 | 0.8136 | 0.9020 |
| 0.0548 | 6.6403 | 3360 | 0.7863 | 0.2080 | 0.7863 | 0.8868 |
| 0.0548 | 6.6443 | 3362 | 0.7839 | 0.2080 | 0.7839 | 0.8854 |
| 0.0548 | 6.6482 | 3364 | 0.7870 | 0.2029 | 0.7870 | 0.8871 |
| 0.0548 | 6.6522 | 3366 | 0.7872 | 0.2080 | 0.7872 | 0.8872 |
| 0.0548 | 6.6561 | 3368 | 0.7949 | 0.2080 | 0.7949 | 0.8916 |
| 0.0548 | 6.6601 | 3370 | 0.8005 | 0.2080 | 0.8005 | 0.8947 |
| 0.0548 | 6.6640 | 3372 | 0.8096 | 0.2080 | 0.8096 | 0.8998 |
| 0.0548 | 6.6680 | 3374 | 0.8018 | 0.2080 | 0.8018 | 0.8954 |
| 0.0548 | 6.6719 | 3376 | 0.7958 | 0.2080 | 0.7958 | 0.8921 |
| 0.0548 | 6.6759 | 3378 | 0.7953 | 0.2080 | 0.7953 | 0.8918 |
| 0.0548 | 6.6798 | 3380 | 0.7939 | 0.2080 | 0.7939 | 0.8910 |
| 0.0548 | 6.6838 | 3382 | 0.7896 | 0.2080 | 0.7896 | 0.8886 |
| 0.0548 | 6.6877 | 3384 | 0.7891 | 0.2080 | 0.7891 | 0.8883 |
| 0.0548 | 6.6917 | 3386 | 0.7869 | 0.2080 | 0.7869 | 0.8871 |
| 0.0548 | 6.6957 | 3388 | 0.7843 | 0.2080 | 0.7843 | 0.8856 |
| 0.0548 | 6.6996 | 3390 | 0.7861 | 0.2080 | 0.7861 | 0.8866 |
| 0.0548 | 6.7036 | 3392 | 0.7849 | 0.2080 | 0.7849 | 0.8860 |
| 0.0548 | 6.7075 | 3394 | 0.7822 | 0.2080 | 0.7822 | 0.8844 |
| 0.0548 | 6.7115 | 3396 | 0.7752 | 0.2080 | 0.7752 | 0.8805 |
| 0.0548 | 6.7154 | 3398 | 0.7744 | 0.2029 | 0.7744 | 0.8800 |
| 0.0548 | 6.7194 | 3400 | 0.7795 | -0.0732 | 0.7795 | 0.8829 |
| 0.0548 | 6.7233 | 3402 | 0.7752 | 0.0530 | 0.7752 | 0.8805 |
| 0.0548 | 6.7273 | 3404 | 0.7687 | 0.2029 | 0.7687 | 0.8768 |
| 0.0548 | 6.7312 | 3406 | 0.7703 | 0.2080 | 0.7703 | 0.8777 |
| 0.0548 | 6.7352 | 3408 | 0.7709 | 0.2029 | 0.7709 | 0.8780 |
| 0.0548 | 6.7391 | 3410 | 0.7751 | 0.2029 | 0.7751 | 0.8804 |
| 0.0548 | 6.7431 | 3412 | 0.7812 | 0.2029 | 0.7812 | 0.8838 |
| 0.0548 | 6.7470 | 3414 | 0.7856 | 0.2029 | 0.7856 | 0.8864 |
| 0.0548 | 6.7510 | 3416 | 0.7989 | 0.2080 | 0.7989 | 0.8938 |
| 0.0548 | 6.7549 | 3418 | 0.8246 | 0.2080 | 0.8246 | 0.9081 |
| 0.0548 | 6.7589 | 3420 | 0.8250 | 0.2080 | 0.8250 | 0.9083 |
| 0.0548 | 6.7628 | 3422 | 0.8073 | 0.2080 | 0.8073 | 0.8985 |
| 0.0548 | 6.7668 | 3424 | 0.7885 | 0.2080 | 0.7885 | 0.8879 |
| 0.0548 | 6.7708 | 3426 | 0.7815 | 0.2080 | 0.7815 | 0.8840 |
| 0.0548 | 6.7747 | 3428 | 0.7779 | 0.2080 | 0.7779 | 0.8820 |
| 0.0548 | 6.7787 | 3430 | 0.7842 | 0.2080 | 0.7842 | 0.8855 |
| 0.0548 | 6.7826 | 3432 | 0.7966 | 0.2080 | 0.7966 | 0.8925 |
| 0.0548 | 6.7866 | 3434 | 0.7931 | 0.2080 | 0.7931 | 0.8905 |
| 0.0548 | 6.7905 | 3436 | 0.7904 | 0.2080 | 0.7904 | 0.8890 |
| 0.0548 | 6.7945 | 3438 | 0.7851 | 0.2080 | 0.7851 | 0.8860 |
| 0.0548 | 6.7984 | 3440 | 0.7851 | 0.2029 | 0.7851 | 0.8860 |
| 0.0548 | 6.8024 | 3442 | 0.7888 | 0.2080 | 0.7888 | 0.8882 |
| 0.0548 | 6.8063 | 3444 | 0.8061 | 0.2080 | 0.8061 | 0.8978 |
| 0.0548 | 6.8103 | 3446 | 0.8132 | 0.2080 | 0.8132 | 0.9018 |
| 0.0548 | 6.8142 | 3448 | 0.8053 | 0.2080 | 0.8053 | 0.8974 |
| 0.0548 | 6.8182 | 3450 | 0.7970 | 0.2080 | 0.7970 | 0.8927 |
| 0.0548 | 6.8221 | 3452 | 0.7957 | 0.2029 | 0.7957 | 0.8920 |
| 0.0548 | 6.8261 | 3454 | 0.7952 | 0.2029 | 0.7952 | 0.8917 |
| 0.0548 | 6.8300 | 3456 | 0.7966 | 0.2029 | 0.7966 | 0.8925 |
| 0.0548 | 6.8340 | 3458 | 0.7967 | 0.2029 | 0.7967 | 0.8926 |
| 0.0548 | 6.8379 | 3460 | 0.7992 | 0.2029 | 0.7992 | 0.8940 |
| 0.0548 | 6.8419 | 3462 | 0.8070 | 0.2080 | 0.8070 | 0.8983 |
| 0.0548 | 6.8458 | 3464 | 0.8147 | 0.2080 | 0.8147 | 0.9026 |
| 0.0548 | 6.8498 | 3466 | 0.8247 | 0.2080 | 0.8247 | 0.9081 |
| 0.0548 | 6.8538 | 3468 | 0.8272 | 0.2080 | 0.8272 | 0.9095 |
| 0.0548 | 6.8577 | 3470 | 0.8201 | 0.2080 | 0.8201 | 0.9056 |
| 0.0548 | 6.8617 | 3472 | 0.8243 | 0.2080 | 0.8243 | 0.9079 |
| 0.0548 | 6.8656 | 3474 | 0.8290 | 0.2080 | 0.8290 | 0.9105 |
| 0.0548 | 6.8696 | 3476 | 0.8242 | 0.2080 | 0.8242 | 0.9079 |
| 0.0548 | 6.8735 | 3478 | 0.8197 | 0.2080 | 0.8197 | 0.9054 |
| 0.0548 | 6.8775 | 3480 | 0.8192 | 0.2080 | 0.8192 | 0.9051 |
| 0.0548 | 6.8814 | 3482 | 0.8176 | 0.2080 | 0.8176 | 0.9042 |
| 0.0548 | 6.8854 | 3484 | 0.8170 | 0.2080 | 0.8170 | 0.9039 |
| 0.0548 | 6.8893 | 3486 | 0.8216 | 0.2080 | 0.8216 | 0.9064 |
| 0.0548 | 6.8933 | 3488 | 0.8249 | 0.2080 | 0.8249 | 0.9082 |
| 0.0548 | 6.8972 | 3490 | 0.8199 | 0.2080 | 0.8199 | 0.9055 |
| 0.0548 | 6.9012 | 3492 | 0.8248 | 0.2080 | 0.8248 | 0.9082 |
| 0.0548 | 6.9051 | 3494 | 0.8201 | 0.2080 | 0.8201 | 0.9056 |
| 0.0548 | 6.9091 | 3496 | 0.8220 | 0.2080 | 0.8220 | 0.9066 |
| 0.0548 | 6.9130 | 3498 | 0.8203 | 0.2080 | 0.8203 | 0.9057 |
| 0.0476 | 6.9170 | 3500 | 0.8127 | 0.2080 | 0.8127 | 0.9015 |
| 0.0476 | 6.9209 | 3502 | 0.8021 | 0.2029 | 0.8021 | 0.8956 |
| 0.0476 | 6.9249 | 3504 | 0.8008 | 0.2029 | 0.8008 | 0.8949 |
| 0.0476 | 6.9289 | 3506 | 0.8022 | 0.2080 | 0.8022 | 0.8956 |
| 0.0476 | 6.9328 | 3508 | 0.8122 | 0.2080 | 0.8122 | 0.9012 |
| 0.0476 | 6.9368 | 3510 | 0.8166 | 0.2080 | 0.8166 | 0.9037 |
| 0.0476 | 6.9407 | 3512 | 0.8307 | 0.2080 | 0.8307 | 0.9114 |
| 0.0476 | 6.9447 | 3514 | 0.8540 | 0.2080 | 0.8540 | 0.9241 |
| 0.0476 | 6.9486 | 3516 | 0.8558 | 0.2080 | 0.8558 | 0.9251 |
| 0.0476 | 6.9526 | 3518 | 0.8398 | 0.2080 | 0.8398 | 0.9164 |
| 0.0476 | 6.9565 | 3520 | 0.8160 | 0.2080 | 0.8160 | 0.9033 |
| 0.0476 | 6.9605 | 3522 | 0.8099 | 0.2029 | 0.8099 | 0.8999 |
| 0.0476 | 6.9644 | 3524 | 0.8091 | 0.2029 | 0.8091 | 0.8995 |
| 0.0476 | 6.9684 | 3526 | 0.8101 | 0.2029 | 0.8101 | 0.9000 |
| 0.0476 | 6.9723 | 3528 | 0.8149 | 0.2080 | 0.8149 | 0.9027 |
| 0.0476 | 6.9763 | 3530 | 0.8279 | 0.2080 | 0.8279 | 0.9099 |
| 0.0476 | 6.9802 | 3532 | 0.8503 | 0.2080 | 0.8503 | 0.9221 |
| 0.0476 | 6.9842 | 3534 | 0.8546 | 0.2080 | 0.8546 | 0.9245 |
| 0.0476 | 6.9881 | 3536 | 0.8533 | 0.2080 | 0.8533 | 0.9237 |
| 0.0476 | 6.9921 | 3538 | 0.8335 | 0.2080 | 0.8335 | 0.9130 |
| 0.0476 | 6.9960 | 3540 | 0.8116 | 0.2080 | 0.8116 | 0.9009 |
| 0.0476 | 7.0000 | 3542 | 0.8023 | 0.2080 | 0.8023 | 0.8957 |
| 0.0476 | 7.0040 | 3544 | 0.8043 | 0.2080 | 0.8043 | 0.8968 |
| 0.0476 | 7.0079 | 3546 | 0.8131 | 0.2080 | 0.8131 | 0.9017 |
| 0.0476 | 7.0119 | 3548 | 0.8172 | 0.2080 | 0.8172 | 0.9040 |
| 0.0476 | 7.0158 | 3550 | 0.8106 | 0.2080 | 0.8106 | 0.9003 |
| 0.0476 | 7.0198 | 3552 | 0.8102 | 0.2080 | 0.8102 | 0.9001 |
| 0.0476 | 7.0237 | 3554 | 0.8144 | 0.2080 | 0.8144 | 0.9025 |
| 0.0476 | 7.0277 | 3556 | 0.8253 | 0.2080 | 0.8253 | 0.9084 |
| 0.0476 | 7.0316 | 3558 | 0.8407 | 0.2080 | 0.8407 | 0.9169 |
| 0.0476 | 7.0356 | 3560 | 0.8374 | 0.2080 | 0.8374 | 0.9151 |
| 0.0476 | 7.0395 | 3562 | 0.8336 | 0.2080 | 0.8336 | 0.9130 |
| 0.0476 | 7.0435 | 3564 | 0.8176 | 0.2080 | 0.8176 | 0.9042 |
| 0.0476 | 7.0474 | 3566 | 0.8075 | 0.2080 | 0.8075 | 0.8986 |
| 0.0476 | 7.0514 | 3568 | 0.7965 | 0.2080 | 0.7965 | 0.8925 |
| 0.0476 | 7.0553 | 3570 | 0.7972 | 0.2080 | 0.7972 | 0.8928 |
| 0.0476 | 7.0593 | 3572 | 0.8019 | 0.2080 | 0.8019 | 0.8955 |
| 0.0476 | 7.0632 | 3574 | 0.7996 | 0.2080 | 0.7996 | 0.8942 |
| 0.0476 | 7.0672 | 3576 | 0.8040 | 0.2080 | 0.8040 | 0.8967 |
| 0.0476 | 7.0711 | 3578 | 0.8209 | 0.2080 | 0.8209 | 0.9060 |
| 0.0476 | 7.0751 | 3580 | 0.8215 | 0.2080 | 0.8215 | 0.9064 |
| 0.0476 | 7.0791 | 3582 | 0.8202 | 0.2080 | 0.8202 | 0.9057 |
| 0.0476 | 7.0830 | 3584 | 0.8092 | 0.2080 | 0.8092 | 0.8996 |
| 0.0476 | 7.0870 | 3586 | 0.8004 | 0.2080 | 0.8004 | 0.8946 |
| 0.0476 | 7.0909 | 3588 | 0.7995 | 0.2080 | 0.7995 | 0.8942 |
| 0.0476 | 7.0949 | 3590 | 0.7991 | 0.2080 | 0.7991 | 0.8939 |
| 0.0476 | 7.0988 | 3592 | 0.7895 | 0.2080 | 0.7895 | 0.8886 |
| 0.0476 | 7.1028 | 3594 | 0.7870 | 0.2080 | 0.7870 | 0.8871 |
| 0.0476 | 7.1067 | 3596 | 0.7830 | 0.2080 | 0.7830 | 0.8849 |
| 0.0476 | 7.1107 | 3598 | 0.7841 | 0.2080 | 0.7841 | 0.8855 |
| 0.0476 | 7.1146 | 3600 | 0.7902 | 0.2080 | 0.7902 | 0.8889 |
| 0.0476 | 7.1186 | 3602 | 0.8014 | 0.2080 | 0.8014 | 0.8952 |
| 0.0476 | 7.1225 | 3604 | 0.8199 | 0.2080 | 0.8199 | 0.9055 |
| 0.0476 | 7.1265 | 3606 | 0.8244 | 0.2080 | 0.8244 | 0.9080 |
| 0.0476 | 7.1304 | 3608 | 0.8186 | 0.2080 | 0.8186 | 0.9048 |
| 0.0476 | 7.1344 | 3610 | 0.8100 | 0.2080 | 0.8100 | 0.9000 |
| 0.0476 | 7.1383 | 3612 | 0.8096 | 0.2080 | 0.8096 | 0.8998 |
| 0.0476 | 7.1423 | 3614 | 0.8168 | 0.2080 | 0.8168 | 0.9038 |
| 0.0476 | 7.1462 | 3616 | 0.8189 | 0.2080 | 0.8189 | 0.9049 |
| 0.0476 | 7.1502 | 3618 | 0.8093 | 0.2080 | 0.8093 | 0.8996 |
| 0.0476 | 7.1542 | 3620 | 0.8016 | 0.2080 | 0.8016 | 0.8953 |
| 0.0476 | 7.1581 | 3622 | 0.8018 | 0.2029 | 0.8018 | 0.8954 |
| 0.0476 | 7.1621 | 3624 | 0.8027 | 0.2029 | 0.8027 | 0.8959 |
| 0.0476 | 7.1660 | 3626 | 0.8035 | 0.2029 | 0.8035 | 0.8964 |
| 0.0476 | 7.1700 | 3628 | 0.8023 | 0.2080 | 0.8023 | 0.8957 |
| 0.0476 | 7.1739 | 3630 | 0.8071 | 0.2080 | 0.8071 | 0.8984 |
| 0.0476 | 7.1779 | 3632 | 0.8225 | 0.2080 | 0.8225 | 0.9069 |
| 0.0476 | 7.1818 | 3634 | 0.8412 | 0.2080 | 0.8412 | 0.9172 |
| 0.0476 | 7.1858 | 3636 | 0.8382 | 0.2080 | 0.8382 | 0.9155 |
| 0.0476 | 7.1897 | 3638 | 0.8194 | 0.2080 | 0.8194 | 0.9052 |
| 0.0476 | 7.1937 | 3640 | 0.8131 | 0.2080 | 0.8131 | 0.9017 |
| 0.0476 | 7.1976 | 3642 | 0.8067 | 0.2080 | 0.8067 | 0.8982 |
| 0.0476 | 7.2016 | 3644 | 0.7936 | 0.2080 | 0.7936 | 0.8908 |
| 0.0476 | 7.2055 | 3646 | 0.7899 | 0.2080 | 0.7899 | 0.8887 |
| 0.0476 | 7.2095 | 3648 | 0.7957 | 0.2080 | 0.7957 | 0.8920 |
| 0.0476 | 7.2134 | 3650 | 0.8166 | 0.2080 | 0.8166 | 0.9037 |
| 0.0476 | 7.2174 | 3652 | 0.8230 | 0.2080 | 0.8230 | 0.9072 |
| 0.0476 | 7.2213 | 3654 | 0.8272 | 0.2080 | 0.8272 | 0.9095 |
| 0.0476 | 7.2253 | 3656 | 0.8185 | 0.2080 | 0.8185 | 0.9047 |
| 0.0476 | 7.2292 | 3658 | 0.8038 | 0.2080 | 0.8038 | 0.8965 |
| 0.0476 | 7.2332 | 3660 | 0.8039 | 0.2080 | 0.8039 | 0.8966 |
| 0.0476 | 7.2372 | 3662 | 0.8196 | 0.2080 | 0.8196 | 0.9053 |
| 0.0476 | 7.2411 | 3664 | 0.8271 | 0.2080 | 0.8271 | 0.9095 |
| 0.0476 | 7.2451 | 3666 | 0.8355 | 0.2080 | 0.8355 | 0.9141 |
| 0.0476 | 7.2490 | 3668 | 0.8316 | 0.2080 | 0.8316 | 0.9119 |
| 0.0476 | 7.2530 | 3670 | 0.8190 | 0.2080 | 0.8190 | 0.9050 |
| 0.0476 | 7.2569 | 3672 | 0.8044 | 0.2080 | 0.8044 | 0.8969 |
| 0.0476 | 7.2609 | 3674 | 0.8043 | 0.2080 | 0.8043 | 0.8968 |
| 0.0476 | 7.2648 | 3676 | 0.8079 | 0.2080 | 0.8079 | 0.8988 |
| 0.0476 | 7.2688 | 3678 | 0.7984 | 0.2080 | 0.7984 | 0.8935 |
| 0.0476 | 7.2727 | 3680 | 0.7904 | 0.2080 | 0.7904 | 0.8891 |
| 0.0476 | 7.2767 | 3682 | 0.7835 | 0.2080 | 0.7835 | 0.8852 |
| 0.0476 | 7.2806 | 3684 | 0.7793 | 0.2080 | 0.7793 | 0.8828 |
| 0.0476 | 7.2846 | 3686 | 0.7819 | 0.2080 | 0.7819 | 0.8842 |
| 0.0476 | 7.2885 | 3688 | 0.7927 | 0.2080 | 0.7927 | 0.8904 |
| 0.0476 | 7.2925 | 3690 | 0.7941 | 0.2080 | 0.7941 | 0.8911 |
| 0.0476 | 7.2964 | 3692 | 0.7991 | 0.2080 | 0.7991 | 0.8939 |
| 0.0476 | 7.3004 | 3694 | 0.8015 | 0.2080 | 0.8015 | 0.8953 |
| 0.0476 | 7.3043 | 3696 | 0.7967 | 0.2080 | 0.7967 | 0.8926 |
| 0.0476 | 7.3083 | 3698 | 0.7954 | 0.2080 | 0.7954 | 0.8918 |
| 0.0476 | 7.3123 | 3700 | 0.7987 | 0.2080 | 0.7987 | 0.8937 |
| 0.0476 | 7.3162 | 3702 | 0.8014 | 0.2080 | 0.8014 | 0.8952 |
| 0.0476 | 7.3202 | 3704 | 0.7928 | 0.2080 | 0.7928 | 0.8904 |
| 0.0476 | 7.3241 | 3706 | 0.7839 | 0.2080 | 0.7839 | 0.8854 |
| 0.0476 | 7.3281 | 3708 | 0.7770 | 0.2080 | 0.7770 | 0.8815 |
| 0.0476 | 7.3320 | 3710 | 0.7672 | 0.2029 | 0.7672 | 0.8759 |
| 0.0476 | 7.3360 | 3712 | 0.7688 | 0.2080 | 0.7688 | 0.8768 |
| 0.0476 | 7.3399 | 3714 | 0.7670 | 0.2029 | 0.7670 | 0.8758 |
| 0.0476 | 7.3439 | 3716 | 0.7691 | 0.2029 | 0.7691 | 0.8770 |
| 0.0476 | 7.3478 | 3718 | 0.7719 | 0.2029 | 0.7719 | 0.8786 |
| 0.0476 | 7.3518 | 3720 | 0.7794 | 0.2029 | 0.7794 | 0.8828 |
| 0.0476 | 7.3557 | 3722 | 0.7958 | 0.2029 | 0.7958 | 0.8921 |
| 0.0476 | 7.3597 | 3724 | 0.8140 | 0.2080 | 0.8140 | 0.9022 |
| 0.0476 | 7.3636 | 3726 | 0.8353 | 0.2080 | 0.8353 | 0.9140 |
| 0.0476 | 7.3676 | 3728 | 0.8311 | 0.2080 | 0.8311 | 0.9117 |
| 0.0476 | 7.3715 | 3730 | 0.8173 | 0.2080 | 0.8173 | 0.9041 |
| 0.0476 | 7.3755 | 3732 | 0.8085 | 0.2029 | 0.8085 | 0.8992 |
| 0.0476 | 7.3794 | 3734 | 0.8049 | 0.2029 | 0.8049 | 0.8972 |
| 0.0476 | 7.3834 | 3736 | 0.7965 | 0.2029 | 0.7965 | 0.8925 |
| 0.0476 | 7.3874 | 3738 | 0.8028 | 0.2080 | 0.8028 | 0.8960 |
| 0.0476 | 7.3913 | 3740 | 0.8085 | 0.2080 | 0.8085 | 0.8991 |
| 0.0476 | 7.3953 | 3742 | 0.8034 | 0.2080 | 0.8034 | 0.8963 |
| 0.0476 | 7.3992 | 3744 | 0.7925 | 0.2080 | 0.7925 | 0.8902 |
| 0.0476 | 7.4032 | 3746 | 0.7874 | 0.2080 | 0.7874 | 0.8873 |
| 0.0476 | 7.4071 | 3748 | 0.7774 | 0.2029 | 0.7774 | 0.8817 |
| 0.0476 | 7.4111 | 3750 | 0.7753 | 0.2029 | 0.7753 | 0.8805 |
| 0.0476 | 7.4150 | 3752 | 0.7775 | 0.2029 | 0.7775 | 0.8818 |
| 0.0476 | 7.4190 | 3754 | 0.7761 | 0.2029 | 0.7761 | 0.8809 |
| 0.0476 | 7.4229 | 3756 | 0.7806 | 0.2029 | 0.7806 | 0.8835 |
| 0.0476 | 7.4269 | 3758 | 0.8036 | 0.2080 | 0.8036 | 0.8965 |
| 0.0476 | 7.4308 | 3760 | 0.8219 | 0.2080 | 0.8219 | 0.9066 |
| 0.0476 | 7.4348 | 3762 | 0.8394 | 0.2080 | 0.8394 | 0.9162 |
| 0.0476 | 7.4387 | 3764 | 0.8450 | 0.2080 | 0.8450 | 0.9192 |
| 0.0476 | 7.4427 | 3766 | 0.8240 | 0.2080 | 0.8240 | 0.9077 |
| 0.0476 | 7.4466 | 3768 | 0.7933 | 0.2080 | 0.7933 | 0.8907 |
| 0.0476 | 7.4506 | 3770 | 0.7759 | 0.2080 | 0.7759 | 0.8808 |
| 0.0476 | 7.4545 | 3772 | 0.7676 | 0.2080 | 0.7676 | 0.8761 |
| 0.0476 | 7.4585 | 3774 | 0.7710 | 0.2080 | 0.7710 | 0.8781 |
| 0.0476 | 7.4625 | 3776 | 0.7646 | 0.2080 | 0.7646 | 0.8744 |
| 0.0476 | 7.4664 | 3778 | 0.7608 | 0.2080 | 0.7608 | 0.8722 |
| 0.0476 | 7.4704 | 3780 | 0.7556 | 0.2029 | 0.7556 | 0.8692 |
| 0.0476 | 7.4743 | 3782 | 0.7528 | 0.2029 | 0.7528 | 0.8676 |
| 0.0476 | 7.4783 | 3784 | 0.7615 | 0.2080 | 0.7615 | 0.8726 |
| 0.0476 | 7.4822 | 3786 | 0.7715 | 0.2080 | 0.7715 | 0.8784 |
| 0.0476 | 7.4862 | 3788 | 0.7735 | 0.2080 | 0.7735 | 0.8795 |
| 0.0476 | 7.4901 | 3790 | 0.7707 | 0.2080 | 0.7707 | 0.8779 |
| 0.0476 | 7.4941 | 3792 | 0.7601 | 0.2080 | 0.7601 | 0.8718 |
| 0.0476 | 7.4980 | 3794 | 0.7554 | 0.2080 | 0.7554 | 0.8692 |
| 0.0476 | 7.5020 | 3796 | 0.7478 | 0.2080 | 0.7478 | 0.8648 |
| 0.0476 | 7.5059 | 3798 | 0.7450 | 0.2029 | 0.7450 | 0.8631 |
| 0.0476 | 7.5099 | 3800 | 0.7442 | 0.2029 | 0.7442 | 0.8627 |
| 0.0476 | 7.5138 | 3802 | 0.7456 | 0.2029 | 0.7456 | 0.8635 |
| 0.0476 | 7.5178 | 3804 | 0.7491 | 0.2029 | 0.7491 | 0.8655 |
| 0.0476 | 7.5217 | 3806 | 0.7554 | 0.2029 | 0.7554 | 0.8691 |
| 0.0476 | 7.5257 | 3808 | 0.7746 | 0.2080 | 0.7746 | 0.8801 |
| 0.0476 | 7.5296 | 3810 | 0.7938 | 0.2080 | 0.7938 | 0.8910 |
| 0.0476 | 7.5336 | 3812 | 0.7957 | 0.2080 | 0.7957 | 0.8920 |
| 0.0476 | 7.5375 | 3814 | 0.7875 | 0.2080 | 0.7875 | 0.8874 |
| 0.0476 | 7.5415 | 3816 | 0.7722 | 0.2080 | 0.7722 | 0.8788 |
| 0.0476 | 7.5455 | 3818 | 0.7663 | 0.2080 | 0.7663 | 0.8754 |
| 0.0476 | 7.5494 | 3820 | 0.7662 | 0.2080 | 0.7662 | 0.8754 |
| 0.0476 | 7.5534 | 3822 | 0.7611 | 0.2080 | 0.7611 | 0.8724 |
| 0.0476 | 7.5573 | 3824 | 0.7535 | 0.2029 | 0.7535 | 0.8680 |
| 0.0476 | 7.5613 | 3826 | 0.7545 | 0.2029 | 0.7545 | 0.8686 |
| 0.0476 | 7.5652 | 3828 | 0.7589 | 0.2080 | 0.7589 | 0.8711 |
| 0.0476 | 7.5692 | 3830 | 0.7652 | 0.2080 | 0.7652 | 0.8747 |
| 0.0476 | 7.5731 | 3832 | 0.7671 | 0.2080 | 0.7671 | 0.8758 |
| 0.0476 | 7.5771 | 3834 | 0.7798 | 0.2080 | 0.7798 | 0.8831 |
| 0.0476 | 7.5810 | 3836 | 0.7987 | 0.2080 | 0.7987 | 0.8937 |
| 0.0476 | 7.5850 | 3838 | 0.7965 | 0.2080 | 0.7965 | 0.8925 |
| 0.0476 | 7.5889 | 3840 | 0.7859 | 0.2080 | 0.7859 | 0.8865 |
| 0.0476 | 7.5929 | 3842 | 0.7897 | 0.2080 | 0.7897 | 0.8887 |
| 0.0476 | 7.5968 | 3844 | 0.7812 | 0.2080 | 0.7812 | 0.8838 |
| 0.0476 | 7.6008 | 3846 | 0.7689 | 0.2080 | 0.7689 | 0.8769 |
| 0.0476 | 7.6047 | 3848 | 0.7675 | 0.2080 | 0.7675 | 0.8760 |
| 0.0476 | 7.6087 | 3850 | 0.7708 | 0.2080 | 0.7708 | 0.8779 |
| 0.0476 | 7.6126 | 3852 | 0.7640 | 0.2080 | 0.7640 | 0.8741 |
| 0.0476 | 7.6166 | 3854 | 0.7494 | 0.2080 | 0.7494 | 0.8657 |
| 0.0476 | 7.6206 | 3856 | 0.7442 | 0.2029 | 0.7442 | 0.8627 |
| 0.0476 | 7.6245 | 3858 | 0.7452 | 0.2029 | 0.7452 | 0.8633 |
| 0.0476 | 7.6285 | 3860 | 0.7528 | 0.2080 | 0.7528 | 0.8677 |
| 0.0476 | 7.6324 | 3862 | 0.7565 | 0.2080 | 0.7565 | 0.8698 |
| 0.0476 | 7.6364 | 3864 | 0.7487 | 0.2029 | 0.7487 | 0.8653 |
| 0.0476 | 7.6403 | 3866 | 0.7417 | 0.2029 | 0.7417 | 0.8612 |
| 0.0476 | 7.6443 | 3868 | 0.7336 | 0.2029 | 0.7336 | 0.8565 |
| 0.0476 | 7.6482 | 3870 | 0.7291 | 0.2029 | 0.7291 | 0.8539 |
| 0.0476 | 7.6522 | 3872 | 0.7251 | 0.2029 | 0.7251 | 0.8515 |
| 0.0476 | 7.6561 | 3874 | 0.7236 | 0.2029 | 0.7236 | 0.8507 |
| 0.0476 | 7.6601 | 3876 | 0.7224 | 0.2029 | 0.7224 | 0.8499 |
| 0.0476 | 7.6640 | 3878 | 0.7290 | 0.2029 | 0.7290 | 0.8538 |
| 0.0476 | 7.6680 | 3880 | 0.7463 | 0.2080 | 0.7463 | 0.8639 |
| 0.0476 | 7.6719 | 3882 | 0.7594 | 0.2080 | 0.7594 | 0.8714 |
| 0.0476 | 7.6759 | 3884 | 0.7564 | 0.2080 | 0.7564 | 0.8697 |
| 0.0476 | 7.6798 | 3886 | 0.7507 | 0.2080 | 0.7507 | 0.8665 |
| 0.0476 | 7.6838 | 3888 | 0.7441 | 0.2080 | 0.7441 | 0.8626 |
| 0.0476 | 7.6877 | 3890 | 0.7338 | 0.2029 | 0.7338 | 0.8566 |
| 0.0476 | 7.6917 | 3892 | 0.7269 | 0.2029 | 0.7269 | 0.8526 |
| 0.0476 | 7.6957 | 3894 | 0.7277 | 0.2029 | 0.7277 | 0.8531 |
| 0.0476 | 7.6996 | 3896 | 0.7281 | 0.2029 | 0.7281 | 0.8533 |
| 0.0476 | 7.7036 | 3898 | 0.7303 | 0.2029 | 0.7303 | 0.8546 |
| 0.0476 | 7.7075 | 3900 | 0.7322 | 0.2029 | 0.7322 | 0.8557 |
| 0.0476 | 7.7115 | 3902 | 0.7389 | 0.2029 | 0.7389 | 0.8596 |
| 0.0476 | 7.7154 | 3904 | 0.7510 | 0.2080 | 0.7510 | 0.8666 |
| 0.0476 | 7.7194 | 3906 | 0.7664 | 0.2080 | 0.7664 | 0.8754 |
| 0.0476 | 7.7233 | 3908 | 0.7709 | 0.2080 | 0.7709 | 0.8780 |
| 0.0476 | 7.7273 | 3910 | 0.7801 | 0.2080 | 0.7801 | 0.8832 |
| 0.0476 | 7.7312 | 3912 | 0.7952 | 0.2080 | 0.7952 | 0.8918 |
| 0.0476 | 7.7352 | 3914 | 0.7903 | 0.2080 | 0.7903 | 0.8890 |
| 0.0476 | 7.7391 | 3916 | 0.7708 | 0.2080 | 0.7708 | 0.8780 |
| 0.0476 | 7.7431 | 3918 | 0.7534 | 0.2080 | 0.7534 | 0.8680 |
| 0.0476 | 7.7470 | 3920 | 0.7520 | 0.2029 | 0.7520 | 0.8672 |
| 0.0476 | 7.7510 | 3922 | 0.7559 | 0.2080 | 0.7559 | 0.8694 |
| 0.0476 | 7.7549 | 3924 | 0.7606 | 0.2080 | 0.7606 | 0.8721 |
| 0.0476 | 7.7589 | 3926 | 0.7649 | 0.2080 | 0.7649 | 0.8746 |
| 0.0476 | 7.7628 | 3928 | 0.7681 | 0.2080 | 0.7681 | 0.8764 |
| 0.0476 | 7.7668 | 3930 | 0.7809 | 0.2080 | 0.7809 | 0.8837 |
| 0.0476 | 7.7708 | 3932 | 0.7880 | 0.2080 | 0.7880 | 0.8877 |
| 0.0476 | 7.7747 | 3934 | 0.7926 | 0.2080 | 0.7926 | 0.8903 |
| 0.0476 | 7.7787 | 3936 | 0.7889 | 0.2080 | 0.7889 | 0.8882 |
| 0.0476 | 7.7826 | 3938 | 0.7809 | 0.2080 | 0.7809 | 0.8837 |
| 0.0476 | 7.7866 | 3940 | 0.7759 | 0.2029 | 0.7759 | 0.8808 |
| 0.0476 | 7.7905 | 3942 | 0.7779 | 0.2029 | 0.7779 | 0.8820 |
| 0.0476 | 7.7945 | 3944 | 0.7820 | 0.2080 | 0.7820 | 0.8843 |
| 0.0476 | 7.7984 | 3946 | 0.7847 | 0.2080 | 0.7847 | 0.8858 |
| 0.0476 | 7.8024 | 3948 | 0.7851 | 0.2080 | 0.7851 | 0.8861 |
| 0.0476 | 7.8063 | 3950 | 0.7879 | 0.2080 | 0.7879 | 0.8876 |
| 0.0476 | 7.8103 | 3952 | 0.7962 | 0.2080 | 0.7962 | 0.8923 |
| 0.0476 | 7.8142 | 3954 | 0.8064 | 0.2080 | 0.8064 | 0.8980 |
| 0.0476 | 7.8182 | 3956 | 0.8138 | 0.2080 | 0.8138 | 0.9021 |
| 0.0476 | 7.8221 | 3958 | 0.8120 | 0.2080 | 0.8120 | 0.9011 |
| 0.0476 | 7.8261 | 3960 | 0.8063 | 0.2080 | 0.8063 | 0.8980 |
| 0.0476 | 7.8300 | 3962 | 0.7988 | 0.2080 | 0.7988 | 0.8938 |
| 0.0476 | 7.8340 | 3964 | 0.7930 | 0.2080 | 0.7930 | 0.8905 |
| 0.0476 | 7.8379 | 3966 | 0.7922 | 0.2080 | 0.7922 | 0.8901 |
| 0.0476 | 7.8419 | 3968 | 0.7897 | 0.2029 | 0.7897 | 0.8886 |
| 0.0476 | 7.8458 | 3970 | 0.7863 | 0.2029 | 0.7863 | 0.8867 |
| 0.0476 | 7.8498 | 3972 | 0.7851 | 0.2029 | 0.7851 | 0.8861 |
| 0.0476 | 7.8538 | 3974 | 0.7851 | 0.2029 | 0.7851 | 0.8860 |
| 0.0476 | 7.8577 | 3976 | 0.7851 | 0.2029 | 0.7851 | 0.8861 |
| 0.0476 | 7.8617 | 3978 | 0.7874 | 0.2029 | 0.7874 | 0.8873 |
| 0.0476 | 7.8656 | 3980 | 0.7899 | 0.2029 | 0.7899 | 0.8888 |
| 0.0476 | 7.8696 | 3982 | 0.7950 | 0.2080 | 0.7950 | 0.8917 |
| 0.0476 | 7.8735 | 3984 | 0.8008 | 0.2080 | 0.8008 | 0.8949 |
| 0.0476 | 7.8775 | 3986 | 0.8075 | 0.2080 | 0.8075 | 0.8986 |
| 0.0476 | 7.8814 | 3988 | 0.8125 | 0.2080 | 0.8125 | 0.9014 |
| 0.0476 | 7.8854 | 3990 | 0.8195 | 0.2080 | 0.8195 | 0.9052 |
| 0.0476 | 7.8893 | 3992 | 0.8136 | 0.2080 | 0.8136 | 0.9020 |
| 0.0476 | 7.8933 | 3994 | 0.8054 | 0.2080 | 0.8054 | 0.8974 |
| 0.0476 | 7.8972 | 3996 | 0.7985 | 0.2080 | 0.7985 | 0.8936 |
| 0.0476 | 7.9012 | 3998 | 0.7960 | 0.2080 | 0.7960 | 0.8922 |
| 0.0423 | 7.9051 | 4000 | 0.8002 | 0.2080 | 0.8002 | 0.8946 |
| 0.0423 | 7.9091 | 4002 | 0.8100 | 0.2080 | 0.8100 | 0.9000 |
| 0.0423 | 7.9130 | 4004 | 0.8134 | 0.2080 | 0.8134 | 0.9019 |
| 0.0423 | 7.9170 | 4006 | 0.8107 | 0.2080 | 0.8107 | 0.9004 |
| 0.0423 | 7.9209 | 4008 | 0.8028 | 0.2080 | 0.8028 | 0.8960 |
| 0.0423 | 7.9249 | 4010 | 0.7996 | 0.2080 | 0.7996 | 0.8942 |
| 0.0423 | 7.9289 | 4012 | 0.8045 | 0.2080 | 0.8045 | 0.8969 |
| 0.0423 | 7.9328 | 4014 | 0.8183 | 0.2080 | 0.8183 | 0.9046 |
| 0.0423 | 7.9368 | 4016 | 0.8375 | 0.2080 | 0.8375 | 0.9151 |
| 0.0423 | 7.9407 | 4018 | 0.8529 | 0.2080 | 0.8529 | 0.9235 |
| 0.0423 | 7.9447 | 4020 | 0.8570 | 0.2080 | 0.8570 | 0.9257 |
| 0.0423 | 7.9486 | 4022 | 0.8418 | 0.2080 | 0.8418 | 0.9175 |
| 0.0423 | 7.9526 | 4024 | 0.8197 | 0.2080 | 0.8197 | 0.9054 |
| 0.0423 | 7.9565 | 4026 | 0.8049 | 0.2080 | 0.8049 | 0.8972 |
| 0.0423 | 7.9605 | 4028 | 0.7980 | 0.2080 | 0.7980 | 0.8933 |
| 0.0423 | 7.9644 | 4030 | 0.7951 | 0.2029 | 0.7951 | 0.8917 |
| 0.0423 | 7.9684 | 4032 | 0.7933 | 0.2029 | 0.7933 | 0.8907 |
| 0.0423 | 7.9723 | 4034 | 0.7936 | 0.2029 | 0.7936 | 0.8908 |
| 0.0423 | 7.9763 | 4036 | 0.7974 | 0.2080 | 0.7974 | 0.8930 |
| 0.0423 | 7.9802 | 4038 | 0.8021 | 0.2080 | 0.8021 | 0.8956 |
| 0.0423 | 7.9842 | 4040 | 0.8134 | 0.2080 | 0.8134 | 0.9019 |
| 0.0423 | 7.9881 | 4042 | 0.8297 | 0.2080 | 0.8297 | 0.9109 |
| 0.0423 | 7.9921 | 4044 | 0.8359 | 0.2080 | 0.8359 | 0.9142 |
| 0.0423 | 7.9960 | 4046 | 0.8445 | 0.2080 | 0.8445 | 0.9190 |
| 0.0423 | 8.0 | 4048 | 0.8348 | 0.2080 | 0.8348 | 0.9137 |
| 0.0423 | 8.0040 | 4050 | 0.8164 | 0.2080 | 0.8164 | 0.9036 |
| 0.0423 | 8.0079 | 4052 | 0.8031 | 0.2080 | 0.8031 | 0.8961 |
| 0.0423 | 8.0119 | 4054 | 0.8025 | 0.2080 | 0.8025 | 0.8958 |
| 0.0423 | 8.0158 | 4056 | 0.8044 | 0.2080 | 0.8044 | 0.8969 |
| 0.0423 | 8.0198 | 4058 | 0.7982 | 0.2080 | 0.7982 | 0.8934 |
| 0.0423 | 8.0237 | 4060 | 0.7973 | 0.2080 | 0.7973 | 0.8929 |
| 0.0423 | 8.0277 | 4062 | 0.7991 | 0.2080 | 0.7991 | 0.8939 |
| 0.0423 | 8.0316 | 4064 | 0.8061 | 0.2080 | 0.8061 | 0.8979 |
| 0.0423 | 8.0356 | 4066 | 0.8144 | 0.2080 | 0.8144 | 0.9025 |
| 0.0423 | 8.0395 | 4068 | 0.8216 | 0.2080 | 0.8216 | 0.9064 |
| 0.0423 | 8.0435 | 4070 | 0.8227 | 0.2080 | 0.8227 | 0.9070 |
| 0.0423 | 8.0474 | 4072 | 0.8208 | 0.2080 | 0.8208 | 0.9060 |
| 0.0423 | 8.0514 | 4074 | 0.8148 | 0.2080 | 0.8148 | 0.9027 |
| 0.0423 | 8.0553 | 4076 | 0.8097 | 0.0530 | 0.8097 | 0.8998 |
| 0.0423 | 8.0593 | 4078 | 0.8087 | 0.0530 | 0.8087 | 0.8993 |
| 0.0423 | 8.0632 | 4080 | 0.8103 | 0.0530 | 0.8103 | 0.9001 |
| 0.0423 | 8.0672 | 4082 | 0.8131 | 0.0530 | 0.8131 | 0.9017 |
| 0.0423 | 8.0711 | 4084 | 0.8167 | 0.0530 | 0.8167 | 0.9037 |
| 0.0423 | 8.0751 | 4086 | 0.8199 | 0.2080 | 0.8199 | 0.9055 |
| 0.0423 | 8.0791 | 4088 | 0.8276 | 0.2080 | 0.8276 | 0.9097 |
| 0.0423 | 8.0830 | 4090 | 0.8278 | 0.2080 | 0.8278 | 0.9099 |
| 0.0423 | 8.0870 | 4092 | 0.8295 | 0.2080 | 0.8295 | 0.9108 |
| 0.0423 | 8.0909 | 4094 | 0.8330 | 0.2080 | 0.8330 | 0.9127 |
| 0.0423 | 8.0949 | 4096 | 0.8322 | 0.2080 | 0.8322 | 0.9123 |
| 0.0423 | 8.0988 | 4098 | 0.8239 | 0.2080 | 0.8239 | 0.9077 |
| 0.0423 | 8.1028 | 4100 | 0.8155 | 0.2080 | 0.8155 | 0.9031 |
| 0.0423 | 8.1067 | 4102 | 0.8110 | 0.2080 | 0.8110 | 0.9006 |
| 0.0423 | 8.1107 | 4104 | 0.8131 | 0.2080 | 0.8131 | 0.9017 |
| 0.0423 | 8.1146 | 4106 | 0.8144 | 0.2080 | 0.8144 | 0.9025 |
| 0.0423 | 8.1186 | 4108 | 0.8201 | 0.2080 | 0.8201 | 0.9056 |
| 0.0423 | 8.1225 | 4110 | 0.8147 | 0.2080 | 0.8147 | 0.9026 |
| 0.0423 | 8.1265 | 4112 | 0.8088 | 0.2080 | 0.8088 | 0.8993 |
| 0.0423 | 8.1304 | 4114 | 0.8017 | 0.2080 | 0.8017 | 0.8954 |
| 0.0423 | 8.1344 | 4116 | 0.7983 | 0.0530 | 0.7983 | 0.8935 |
| 0.0423 | 8.1383 | 4118 | 0.7988 | 0.0530 | 0.7988 | 0.8937 |
| 0.0423 | 8.1423 | 4120 | 0.7994 | 0.0530 | 0.7994 | 0.8941 |
| 0.0423 | 8.1462 | 4122 | 0.8003 | 0.2029 | 0.8003 | 0.8946 |
| 0.0423 | 8.1502 | 4124 | 0.8035 | 0.2080 | 0.8035 | 0.8964 |
| 0.0423 | 8.1542 | 4126 | 0.8037 | 0.2080 | 0.8037 | 0.8965 |
| 0.0423 | 8.1581 | 4128 | 0.8036 | 0.2080 | 0.8036 | 0.8964 |
| 0.0423 | 8.1621 | 4130 | 0.8042 | 0.2080 | 0.8042 | 0.8968 |
| 0.0423 | 8.1660 | 4132 | 0.8092 | 0.2080 | 0.8092 | 0.8996 |
| 0.0423 | 8.1700 | 4134 | 0.8116 | 0.2080 | 0.8116 | 0.9009 |
| 0.0423 | 8.1739 | 4136 | 0.8098 | 0.2080 | 0.8098 | 0.8999 |
| 0.0423 | 8.1779 | 4138 | 0.8088 | 0.2029 | 0.8088 | 0.8993 |
| 0.0423 | 8.1818 | 4140 | 0.8097 | 0.0530 | 0.8097 | 0.8998 |
| 0.0423 | 8.1858 | 4142 | 0.8097 | 0.0530 | 0.8097 | 0.8999 |
| 0.0423 | 8.1897 | 4144 | 0.8105 | 0.0530 | 0.8105 | 0.9003 |
| 0.0423 | 8.1937 | 4146 | 0.8090 | 0.0530 | 0.8090 | 0.8995 |
| 0.0423 | 8.1976 | 4148 | 0.8105 | 0.2080 | 0.8105 | 0.9003 |
| 0.0423 | 8.2016 | 4150 | 0.8190 | 0.2080 | 0.8190 | 0.9050 |
| 0.0423 | 8.2055 | 4152 | 0.8214 | 0.2080 | 0.8214 | 0.9063 |
| 0.0423 | 8.2095 | 4154 | 0.8250 | 0.2080 | 0.8250 | 0.9083 |
| 0.0423 | 8.2134 | 4156 | 0.8239 | 0.2080 | 0.8239 | 0.9077 |
| 0.0423 | 8.2174 | 4158 | 0.8214 | 0.2080 | 0.8214 | 0.9063 |
| 0.0423 | 8.2213 | 4160 | 0.8136 | 0.2080 | 0.8136 | 0.9020 |
| 0.0423 | 8.2253 | 4162 | 0.8067 | 0.2080 | 0.8067 | 0.8982 |
| 0.0423 | 8.2292 | 4164 | 0.8016 | 0.2029 | 0.8016 | 0.8953 |
| 0.0423 | 8.2332 | 4166 | 0.8017 | 0.0530 | 0.8017 | 0.8954 |
| 0.0423 | 8.2372 | 4168 | 0.8047 | 0.0530 | 0.8047 | 0.8971 |
| 0.0423 | 8.2411 | 4170 | 0.8091 | 0.0530 | 0.8091 | 0.8995 |
| 0.0423 | 8.2451 | 4172 | 0.8109 | 0.0530 | 0.8109 | 0.9005 |
| 0.0423 | 8.2490 | 4174 | 0.8111 | 0.0530 | 0.8111 | 0.9006 |
| 0.0423 | 8.2530 | 4176 | 0.8111 | 0.0530 | 0.8111 | 0.9006 |
| 0.0423 | 8.2569 | 4178 | 0.8110 | 0.0530 | 0.8110 | 0.9006 |
| 0.0423 | 8.2609 | 4180 | 0.8107 | 0.0530 | 0.8107 | 0.9004 |
| 0.0423 | 8.2648 | 4182 | 0.8091 | 0.2029 | 0.8091 | 0.8995 |
| 0.0423 | 8.2688 | 4184 | 0.8098 | 0.2080 | 0.8098 | 0.8999 |
| 0.0423 | 8.2727 | 4186 | 0.8126 | 0.2080 | 0.8126 | 0.9015 |
| 0.0423 | 8.2767 | 4188 | 0.8126 | 0.2080 | 0.8126 | 0.9014 |
| 0.0423 | 8.2806 | 4190 | 0.8130 | 0.2080 | 0.8130 | 0.9017 |
| 0.0423 | 8.2846 | 4192 | 0.8146 | 0.2080 | 0.8146 | 0.9026 |
| 0.0423 | 8.2885 | 4194 | 0.8166 | 0.2080 | 0.8166 | 0.9037 |
| 0.0423 | 8.2925 | 4196 | 0.8179 | 0.2080 | 0.8179 | 0.9044 |
| 0.0423 | 8.2964 | 4198 | 0.8220 | 0.2080 | 0.8220 | 0.9066 |
| 0.0423 | 8.3004 | 4200 | 0.8201 | 0.2080 | 0.8201 | 0.9056 |
| 0.0423 | 8.3043 | 4202 | 0.8164 | 0.2080 | 0.8164 | 0.9035 |
| 0.0423 | 8.3083 | 4204 | 0.8136 | 0.2080 | 0.8136 | 0.9020 |
| 0.0423 | 8.3123 | 4206 | 0.8145 | 0.2080 | 0.8145 | 0.9025 |
| 0.0423 | 8.3162 | 4208 | 0.8167 | 0.2080 | 0.8167 | 0.9037 |
| 0.0423 | 8.3202 | 4210 | 0.8149 | 0.2080 | 0.8149 | 0.9027 |
| 0.0423 | 8.3241 | 4212 | 0.8167 | 0.2080 | 0.8167 | 0.9037 |
| 0.0423 | 8.3281 | 4214 | 0.8187 | 0.2080 | 0.8187 | 0.9048 |
| 0.0423 | 8.3320 | 4216 | 0.8220 | 0.2080 | 0.8220 | 0.9066 |
| 0.0423 | 8.3360 | 4218 | 0.8240 | 0.2080 | 0.8240 | 0.9078 |
| 0.0423 | 8.3399 | 4220 | 0.8254 | 0.2080 | 0.8254 | 0.9085 |
| 0.0423 | 8.3439 | 4222 | 0.8239 | 0.2080 | 0.8239 | 0.9077 |
| 0.0423 | 8.3478 | 4224 | 0.8258 | 0.2080 | 0.8258 | 0.9087 |
| 0.0423 | 8.3518 | 4226 | 0.8238 | 0.0435 | 0.8238 | 0.9076 |
| 0.0423 | 8.3557 | 4228 | 0.8224 | 0.0435 | 0.8224 | 0.9069 |
| 0.0423 | 8.3597 | 4230 | 0.8229 | 0.0435 | 0.8229 | 0.9071 |
| 0.0423 | 8.3636 | 4232 | 0.8230 | 0.0435 | 0.8230 | 0.9072 |
| 0.0423 | 8.3676 | 4234 | 0.8237 | 0.0435 | 0.8237 | 0.9076 |
| 0.0423 | 8.3715 | 4236 | 0.8239 | 0.0435 | 0.8239 | 0.9077 |
| 0.0423 | 8.3755 | 4238 | 0.8253 | 0.0435 | 0.8253 | 0.9085 |
| 0.0423 | 8.3794 | 4240 | 0.8283 | 0.2080 | 0.8283 | 0.9101 |
| 0.0423 | 8.3834 | 4242 | 0.8281 | 0.2080 | 0.8281 | 0.9100 |
| 0.0423 | 8.3874 | 4244 | 0.8313 | 0.2080 | 0.8313 | 0.9117 |
| 0.0423 | 8.3913 | 4246 | 0.8387 | 0.2080 | 0.8387 | 0.9158 |
| 0.0423 | 8.3953 | 4248 | 0.8424 | 0.2080 | 0.8424 | 0.9178 |
| 0.0423 | 8.3992 | 4250 | 0.8476 | 0.2080 | 0.8476 | 0.9206 |
| 0.0423 | 8.4032 | 4252 | 0.8532 | 0.2080 | 0.8532 | 0.9237 |
| 0.0423 | 8.4071 | 4254 | 0.8554 | 0.2080 | 0.8554 | 0.9249 |
| 0.0423 | 8.4111 | 4256 | 0.8519 | 0.2080 | 0.8519 | 0.9230 |
| 0.0423 | 8.4150 | 4258 | 0.8485 | 0.2080 | 0.8485 | 0.9211 |
| 0.0423 | 8.4190 | 4260 | 0.8491 | 0.2080 | 0.8491 | 0.9215 |
| 0.0423 | 8.4229 | 4262 | 0.8479 | 0.2080 | 0.8479 | 0.9208 |
| 0.0423 | 8.4269 | 4264 | 0.8489 | 0.2080 | 0.8489 | 0.9214 |
| 0.0423 | 8.4308 | 4266 | 0.8512 | 0.2080 | 0.8512 | 0.9226 |
| 0.0423 | 8.4348 | 4268 | 0.8558 | 0.2080 | 0.8558 | 0.9251 |
| 0.0423 | 8.4387 | 4270 | 0.8618 | 0.2080 | 0.8618 | 0.9283 |
| 0.0423 | 8.4427 | 4272 | 0.8628 | 0.2080 | 0.8628 | 0.9289 |
| 0.0423 | 8.4466 | 4274 | 0.8589 | 0.2080 | 0.8589 | 0.9268 |
| 0.0423 | 8.4506 | 4276 | 0.8555 | 0.0435 | 0.8555 | 0.9250 |
| 0.0423 | 8.4545 | 4278 | 0.8522 | 0.0435 | 0.8522 | 0.9232 |
| 0.0423 | 8.4585 | 4280 | 0.8484 | 0.0435 | 0.8484 | 0.9211 |
| 0.0423 | 8.4625 | 4282 | 0.8459 | 0.0435 | 0.8459 | 0.9197 |
| 0.0423 | 8.4664 | 4284 | 0.8469 | 0.0435 | 0.8469 | 0.9203 |
| 0.0423 | 8.4704 | 4286 | 0.8498 | 0.2080 | 0.8498 | 0.9219 |
| 0.0423 | 8.4743 | 4288 | 0.8475 | 0.2080 | 0.8475 | 0.9206 |
| 0.0423 | 8.4783 | 4290 | 0.8486 | 0.2080 | 0.8486 | 0.9212 |
| 0.0423 | 8.4822 | 4292 | 0.8529 | 0.2080 | 0.8529 | 0.9235 |
| 0.0423 | 8.4862 | 4294 | 0.8548 | 0.2080 | 0.8548 | 0.9245 |
| 0.0423 | 8.4901 | 4296 | 0.8555 | 0.2080 | 0.8555 | 0.9249 |
| 0.0423 | 8.4941 | 4298 | 0.8565 | 0.2080 | 0.8565 | 0.9255 |
| 0.0423 | 8.4980 | 4300 | 0.8544 | 0.2080 | 0.8544 | 0.9243 |
| 0.0423 | 8.5020 | 4302 | 0.8512 | 0.2080 | 0.8512 | 0.9226 |
| 0.0423 | 8.5059 | 4304 | 0.8476 | 0.2080 | 0.8476 | 0.9207 |
| 0.0423 | 8.5099 | 4306 | 0.8471 | 0.2080 | 0.8471 | 0.9204 |
| 0.0423 | 8.5138 | 4308 | 0.8510 | 0.2080 | 0.8510 | 0.9225 |
| 0.0423 | 8.5178 | 4310 | 0.8480 | 0.2080 | 0.8480 | 0.9208 |
| 0.0423 | 8.5217 | 4312 | 0.8420 | 0.2080 | 0.8420 | 0.9176 |
| 0.0423 | 8.5257 | 4314 | 0.8398 | 0.2080 | 0.8398 | 0.9164 |
| 0.0423 | 8.5296 | 4316 | 0.8422 | 0.2080 | 0.8422 | 0.9177 |
| 0.0423 | 8.5336 | 4318 | 0.8485 | 0.2080 | 0.8485 | 0.9212 |
| 0.0423 | 8.5375 | 4320 | 0.8494 | 0.2080 | 0.8494 | 0.9216 |
| 0.0423 | 8.5415 | 4322 | 0.8441 | 0.2080 | 0.8441 | 0.9187 |
| 0.0423 | 8.5455 | 4324 | 0.8331 | 0.2080 | 0.8331 | 0.9128 |
| 0.0423 | 8.5494 | 4326 | 0.8283 | 0.2080 | 0.8283 | 0.9101 |
| 0.0423 | 8.5534 | 4328 | 0.8293 | 0.2080 | 0.8293 | 0.9107 |
| 0.0423 | 8.5573 | 4330 | 0.8326 | 0.2080 | 0.8326 | 0.9125 |
| 0.0423 | 8.5613 | 4332 | 0.8400 | 0.2080 | 0.8400 | 0.9165 |
| 0.0423 | 8.5652 | 4334 | 0.8458 | 0.2080 | 0.8458 | 0.9197 |
| 0.0423 | 8.5692 | 4336 | 0.8421 | 0.2080 | 0.8421 | 0.9176 |
| 0.0423 | 8.5731 | 4338 | 0.8332 | 0.2080 | 0.8332 | 0.9128 |
| 0.0423 | 8.5771 | 4340 | 0.8264 | 0.2080 | 0.8264 | 0.9091 |
| 0.0423 | 8.5810 | 4342 | 0.8273 | 0.2080 | 0.8273 | 0.9096 |
| 0.0423 | 8.5850 | 4344 | 0.8315 | 0.2080 | 0.8315 | 0.9119 |
| 0.0423 | 8.5889 | 4346 | 0.8345 | 0.2080 | 0.8345 | 0.9135 |
| 0.0423 | 8.5929 | 4348 | 0.8344 | 0.2080 | 0.8344 | 0.9134 |
| 0.0423 | 8.5968 | 4350 | 0.8381 | 0.2080 | 0.8381 | 0.9155 |
| 0.0423 | 8.6008 | 4352 | 0.8373 | 0.2080 | 0.8373 | 0.9150 |
| 0.0423 | 8.6047 | 4354 | 0.8376 | 0.2080 | 0.8376 | 0.9152 |
| 0.0423 | 8.6087 | 4356 | 0.8365 | 0.2080 | 0.8365 | 0.9146 |
| 0.0423 | 8.6126 | 4358 | 0.8388 | 0.2080 | 0.8388 | 0.9159 |
| 0.0423 | 8.6166 | 4360 | 0.8440 | 0.2080 | 0.8440 | 0.9187 |
| 0.0423 | 8.6206 | 4362 | 0.8489 | 0.2080 | 0.8489 | 0.9213 |
| 0.0423 | 8.6245 | 4364 | 0.8550 | 0.2080 | 0.8550 | 0.9247 |
| 0.0423 | 8.6285 | 4366 | 0.8597 | 0.2080 | 0.8597 | 0.9272 |
| 0.0423 | 8.6324 | 4368 | 0.8598 | 0.2080 | 0.8598 | 0.9273 |
| 0.0423 | 8.6364 | 4370 | 0.8572 | 0.2080 | 0.8572 | 0.9258 |
| 0.0423 | 8.6403 | 4372 | 0.8518 | 0.2080 | 0.8518 | 0.9230 |
| 0.0423 | 8.6443 | 4374 | 0.8512 | 0.2080 | 0.8512 | 0.9226 |
| 0.0423 | 8.6482 | 4376 | 0.8461 | 0.2080 | 0.8461 | 0.9199 |
| 0.0423 | 8.6522 | 4378 | 0.8467 | 0.2080 | 0.8467 | 0.9201 |
| 0.0423 | 8.6561 | 4380 | 0.8430 | 0.2080 | 0.8430 | 0.9182 |
| 0.0423 | 8.6601 | 4382 | 0.8362 | 0.2080 | 0.8362 | 0.9145 |
| 0.0423 | 8.6640 | 4384 | 0.8342 | 0.2080 | 0.8342 | 0.9133 |
| 0.0423 | 8.6680 | 4386 | 0.8373 | 0.2080 | 0.8373 | 0.9150 |
| 0.0423 | 8.6719 | 4388 | 0.8436 | 0.2080 | 0.8436 | 0.9185 |
| 0.0423 | 8.6759 | 4390 | 0.8572 | 0.2080 | 0.8572 | 0.9259 |
| 0.0423 | 8.6798 | 4392 | 0.8763 | 0.2080 | 0.8763 | 0.9361 |
| 0.0423 | 8.6838 | 4394 | 0.8839 | 0.2080 | 0.8839 | 0.9401 |
| 0.0423 | 8.6877 | 4396 | 0.8803 | 0.2080 | 0.8803 | 0.9383 |
| 0.0423 | 8.6917 | 4398 | 0.8700 | 0.2080 | 0.8700 | 0.9327 |
| 0.0423 | 8.6957 | 4400 | 0.8533 | 0.2080 | 0.8533 | 0.9237 |
| 0.0423 | 8.6996 | 4402 | 0.8351 | 0.2080 | 0.8351 | 0.9138 |
| 0.0423 | 8.7036 | 4404 | 0.8221 | 0.2080 | 0.8221 | 0.9067 |
| 0.0423 | 8.7075 | 4406 | 0.8139 | 0.0530 | 0.8139 | 0.9022 |
| 0.0423 | 8.7115 | 4408 | 0.8098 | 0.0530 | 0.8098 | 0.8999 |
| 0.0423 | 8.7154 | 4410 | 0.8093 | 0.2080 | 0.8093 | 0.8996 |
| 0.0423 | 8.7194 | 4412 | 0.8129 | 0.2080 | 0.8129 | 0.9016 |
| 0.0423 | 8.7233 | 4414 | 0.8173 | 0.2080 | 0.8173 | 0.9041 |
| 0.0423 | 8.7273 | 4416 | 0.8216 | 0.2080 | 0.8216 | 0.9064 |
| 0.0423 | 8.7312 | 4418 | 0.8251 | 0.2080 | 0.8251 | 0.9084 |
| 0.0423 | 8.7352 | 4420 | 0.8231 | 0.2080 | 0.8231 | 0.9072 |
| 0.0423 | 8.7391 | 4422 | 0.8196 | 0.2080 | 0.8196 | 0.9053 |
| 0.0423 | 8.7431 | 4424 | 0.8129 | 0.2080 | 0.8129 | 0.9016 |
| 0.0423 | 8.7470 | 4426 | 0.8103 | 0.2080 | 0.8103 | 0.9002 |
| 0.0423 | 8.7510 | 4428 | 0.8081 | 0.2080 | 0.8081 | 0.8989 |
| 0.0423 | 8.7549 | 4430 | 0.8049 | 0.2080 | 0.8049 | 0.8972 |
| 0.0423 | 8.7589 | 4432 | 0.8052 | 0.2029 | 0.8052 | 0.8973 |
| 0.0423 | 8.7628 | 4434 | 0.8063 | 0.2029 | 0.8063 | 0.8979 |
| 0.0423 | 8.7668 | 4436 | 0.8108 | 0.2080 | 0.8108 | 0.9004 |
| 0.0423 | 8.7708 | 4438 | 0.8146 | 0.2080 | 0.8146 | 0.9026 |
| 0.0423 | 8.7747 | 4440 | 0.8187 | 0.2080 | 0.8187 | 0.9048 |
| 0.0423 | 8.7787 | 4442 | 0.8254 | 0.2080 | 0.8254 | 0.9085 |
| 0.0423 | 8.7826 | 4444 | 0.8286 | 0.2080 | 0.8286 | 0.9103 |
| 0.0423 | 8.7866 | 4446 | 0.8314 | 0.2080 | 0.8314 | 0.9118 |
| 0.0423 | 8.7905 | 4448 | 0.8278 | 0.2080 | 0.8278 | 0.9098 |
| 0.0423 | 8.7945 | 4450 | 0.8196 | 0.2080 | 0.8196 | 0.9053 |
| 0.0423 | 8.7984 | 4452 | 0.8153 | 0.2080 | 0.8153 | 0.9029 |
| 0.0423 | 8.8024 | 4454 | 0.8138 | 0.2080 | 0.8138 | 0.9021 |
| 0.0423 | 8.8063 | 4456 | 0.8112 | 0.2080 | 0.8112 | 0.9006 |
| 0.0423 | 8.8103 | 4458 | 0.8097 | 0.2080 | 0.8097 | 0.8999 |
| 0.0423 | 8.8142 | 4460 | 0.8119 | 0.2080 | 0.8119 | 0.9011 |
| 0.0423 | 8.8182 | 4462 | 0.8121 | 0.2080 | 0.8121 | 0.9012 |
| 0.0423 | 8.8221 | 4464 | 0.8158 | 0.2080 | 0.8158 | 0.9032 |
| 0.0423 | 8.8261 | 4466 | 0.8250 | 0.2080 | 0.8250 | 0.9083 |
| 0.0423 | 8.8300 | 4468 | 0.8338 | 0.2080 | 0.8338 | 0.9131 |
| 0.0423 | 8.8340 | 4470 | 0.8397 | 0.2080 | 0.8397 | 0.9163 |
| 0.0423 | 8.8379 | 4472 | 0.8397 | 0.2080 | 0.8397 | 0.9164 |
| 0.0423 | 8.8419 | 4474 | 0.8360 | 0.2080 | 0.8360 | 0.9143 |
| 0.0423 | 8.8458 | 4476 | 0.8287 | 0.2080 | 0.8287 | 0.9103 |
| 0.0423 | 8.8498 | 4478 | 0.8187 | 0.2080 | 0.8187 | 0.9048 |
| 0.0423 | 8.8538 | 4480 | 0.8111 | 0.2080 | 0.8111 | 0.9006 |
| 0.0423 | 8.8577 | 4482 | 0.8098 | 0.2080 | 0.8098 | 0.8999 |
| 0.0423 | 8.8617 | 4484 | 0.8127 | 0.2080 | 0.8127 | 0.9015 |
| 0.0423 | 8.8656 | 4486 | 0.8184 | 0.2080 | 0.8184 | 0.9047 |
| 0.0423 | 8.8696 | 4488 | 0.8195 | 0.2080 | 0.8195 | 0.9052 |
| 0.0423 | 8.8735 | 4490 | 0.8195 | 0.2080 | 0.8195 | 0.9052 |
| 0.0423 | 8.8775 | 4492 | 0.8243 | 0.2080 | 0.8243 | 0.9079 |
| 0.0423 | 8.8814 | 4494 | 0.8271 | 0.2080 | 0.8271 | 0.9095 |
| 0.0423 | 8.8854 | 4496 | 0.8269 | 0.2080 | 0.8269 | 0.9093 |
| 0.0423 | 8.8893 | 4498 | 0.8234 | 0.2080 | 0.8234 | 0.9074 |
| 0.0373 | 8.8933 | 4500 | 0.8193 | 0.2080 | 0.8193 | 0.9052 |
| 0.0373 | 8.8972 | 4502 | 0.8167 | 0.2080 | 0.8167 | 0.9037 |
| 0.0373 | 8.9012 | 4504 | 0.8152 | 0.2080 | 0.8152 | 0.9029 |
| 0.0373 | 8.9051 | 4506 | 0.8124 | 0.2029 | 0.8124 | 0.9013 |
| 0.0373 | 8.9091 | 4508 | 0.8142 | 0.2080 | 0.8142 | 0.9023 |
| 0.0373 | 8.9130 | 4510 | 0.8175 | 0.2080 | 0.8175 | 0.9042 |
| 0.0373 | 8.9170 | 4512 | 0.8211 | 0.2080 | 0.8211 | 0.9062 |
| 0.0373 | 8.9209 | 4514 | 0.8215 | 0.2080 | 0.8215 | 0.9064 |
| 0.0373 | 8.9249 | 4516 | 0.8223 | 0.2080 | 0.8223 | 0.9068 |
| 0.0373 | 8.9289 | 4518 | 0.8205 | 0.2080 | 0.8205 | 0.9058 |
| 0.0373 | 8.9328 | 4520 | 0.8169 | 0.2080 | 0.8169 | 0.9038 |
| 0.0373 | 8.9368 | 4522 | 0.8122 | 0.2080 | 0.8122 | 0.9012 |
| 0.0373 | 8.9407 | 4524 | 0.8105 | 0.2029 | 0.8105 | 0.9003 |
| 0.0373 | 8.9447 | 4526 | 0.8086 | 0.0530 | 0.8086 | 0.8992 |
| 0.0373 | 8.9486 | 4528 | 0.8093 | 0.0530 | 0.8093 | 0.8996 |
| 0.0373 | 8.9526 | 4530 | 0.8111 | 0.0530 | 0.8111 | 0.9006 |
| 0.0373 | 8.9565 | 4532 | 0.8130 | 0.0530 | 0.8130 | 0.9017 |
| 0.0373 | 8.9605 | 4534 | 0.8170 | 0.2029 | 0.8170 | 0.9039 |
| 0.0373 | 8.9644 | 4536 | 0.8248 | 0.2080 | 0.8248 | 0.9082 |
| 0.0373 | 8.9684 | 4538 | 0.8315 | 0.2080 | 0.8315 | 0.9119 |
| 0.0373 | 8.9723 | 4540 | 0.8379 | 0.2080 | 0.8379 | 0.9154 |
| 0.0373 | 8.9763 | 4542 | 0.8390 | 0.2080 | 0.8390 | 0.9160 |
| 0.0373 | 8.9802 | 4544 | 0.8433 | 0.2080 | 0.8433 | 0.9183 |
| 0.0373 | 8.9842 | 4546 | 0.8497 | 0.2080 | 0.8497 | 0.9218 |
| 0.0373 | 8.9881 | 4548 | 0.8543 | 0.2080 | 0.8543 | 0.9243 |
| 0.0373 | 8.9921 | 4550 | 0.8479 | 0.2080 | 0.8479 | 0.9208 |
| 0.0373 | 8.9960 | 4552 | 0.8390 | 0.2080 | 0.8390 | 0.9160 |
| 0.0373 | 9.0 | 4554 | 0.8314 | 0.2080 | 0.8314 | 0.9118 |
| 0.0373 | 9.0040 | 4556 | 0.8278 | 0.2080 | 0.8278 | 0.9098 |
| 0.0373 | 9.0079 | 4558 | 0.8223 | 0.2080 | 0.8223 | 0.9068 |
| 0.0373 | 9.0119 | 4560 | 0.8195 | 0.2080 | 0.8195 | 0.9052 |
| 0.0373 | 9.0158 | 4562 | 0.8188 | 0.2080 | 0.8188 | 0.9049 |
| 0.0373 | 9.0198 | 4564 | 0.8202 | 0.2080 | 0.8202 | 0.9056 |
| 0.0373 | 9.0237 | 4566 | 0.8241 | 0.2080 | 0.8241 | 0.9078 |
| 0.0373 | 9.0277 | 4568 | 0.8280 | 0.2080 | 0.8280 | 0.9099 |
| 0.0373 | 9.0316 | 4570 | 0.8290 | 0.2080 | 0.8290 | 0.9105 |
| 0.0373 | 9.0356 | 4572 | 0.8258 | 0.2080 | 0.8258 | 0.9088 |
| 0.0373 | 9.0395 | 4574 | 0.8222 | 0.2080 | 0.8222 | 0.9068 |
| 0.0373 | 9.0435 | 4576 | 0.8202 | 0.2080 | 0.8202 | 0.9057 |
| 0.0373 | 9.0474 | 4578 | 0.8194 | 0.2029 | 0.8194 | 0.9052 |
| 0.0373 | 9.0514 | 4580 | 0.8207 | 0.2080 | 0.8207 | 0.9059 |
| 0.0373 | 9.0553 | 4582 | 0.8212 | 0.2080 | 0.8212 | 0.9062 |
| 0.0373 | 9.0593 | 4584 | 0.8234 | 0.2080 | 0.8234 | 0.9074 |
| 0.0373 | 9.0632 | 4586 | 0.8262 | 0.2080 | 0.8262 | 0.9090 |
| 0.0373 | 9.0672 | 4588 | 0.8252 | 0.2080 | 0.8252 | 0.9084 |
| 0.0373 | 9.0711 | 4590 | 0.8232 | 0.2080 | 0.8232 | 0.9073 |
| 0.0373 | 9.0751 | 4592 | 0.8239 | 0.2080 | 0.8239 | 0.9077 |
| 0.0373 | 9.0791 | 4594 | 0.8237 | 0.2029 | 0.8237 | 0.9076 |
| 0.0373 | 9.0830 | 4596 | 0.8236 | 0.0530 | 0.8236 | 0.9075 |
| 0.0373 | 9.0870 | 4598 | 0.8242 | 0.0530 | 0.8242 | 0.9079 |
| 0.0373 | 9.0909 | 4600 | 0.8256 | 0.0530 | 0.8256 | 0.9086 |
| 0.0373 | 9.0949 | 4602 | 0.8274 | 0.0530 | 0.8274 | 0.9096 |
| 0.0373 | 9.0988 | 4604 | 0.8312 | 0.0530 | 0.8312 | 0.9117 |
| 0.0373 | 9.1028 | 4606 | 0.8360 | 0.0435 | 0.8360 | 0.9143 |
| 0.0373 | 9.1067 | 4608 | 0.8422 | 0.0435 | 0.8422 | 0.9177 |
| 0.0373 | 9.1107 | 4610 | 0.8476 | 0.2080 | 0.8476 | 0.9206 |
| 0.0373 | 9.1146 | 4612 | 0.8479 | 0.0435 | 0.8479 | 0.9208 |
| 0.0373 | 9.1186 | 4614 | 0.8458 | 0.0435 | 0.8458 | 0.9197 |
| 0.0373 | 9.1225 | 4616 | 0.8445 | 0.0435 | 0.8445 | 0.9190 |
| 0.0373 | 9.1265 | 4618 | 0.8469 | 0.0435 | 0.8469 | 0.9202 |
| 0.0373 | 9.1304 | 4620 | 0.8527 | 0.0435 | 0.8527 | 0.9234 |
| 0.0373 | 9.1344 | 4622 | 0.8635 | 0.2080 | 0.8635 | 0.9293 |
| 0.0373 | 9.1383 | 4624 | 0.8702 | 0.2080 | 0.8702 | 0.9328 |
| 0.0373 | 9.1423 | 4626 | 0.8705 | 0.2080 | 0.8705 | 0.9330 |
| 0.0373 | 9.1462 | 4628 | 0.8656 | 0.2080 | 0.8656 | 0.9304 |
| 0.0373 | 9.1502 | 4630 | 0.8610 | 0.0435 | 0.8610 | 0.9279 |
| 0.0373 | 9.1542 | 4632 | 0.8579 | 0.0435 | 0.8579 | 0.9263 |
| 0.0373 | 9.1581 | 4634 | 0.8534 | 0.0435 | 0.8534 | 0.9238 |
| 0.0373 | 9.1621 | 4636 | 0.8491 | 0.0435 | 0.8491 | 0.9215 |
| 0.0373 | 9.1660 | 4638 | 0.8478 | 0.0435 | 0.8478 | 0.9208 |
| 0.0373 | 9.1700 | 4640 | 0.8470 | 0.0435 | 0.8470 | 0.9203 |
| 0.0373 | 9.1739 | 4642 | 0.8498 | 0.0435 | 0.8498 | 0.9218 |
| 0.0373 | 9.1779 | 4644 | 0.8545 | 0.0435 | 0.8545 | 0.9244 |
| 0.0373 | 9.1818 | 4646 | 0.8609 | 0.2080 | 0.8609 | 0.9279 |
| 0.0373 | 9.1858 | 4648 | 0.8686 | 0.2080 | 0.8686 | 0.9320 |
| 0.0373 | 9.1897 | 4650 | 0.8735 | 0.2080 | 0.8735 | 0.9346 |
| 0.0373 | 9.1937 | 4652 | 0.8731 | 0.2080 | 0.8731 | 0.9344 |
| 0.0373 | 9.1976 | 4654 | 0.8676 | 0.2080 | 0.8676 | 0.9315 |
| 0.0373 | 9.2016 | 4656 | 0.8600 | 0.2080 | 0.8600 | 0.9274 |
| 0.0373 | 9.2055 | 4658 | 0.8551 | 0.0435 | 0.8551 | 0.9247 |
| 0.0373 | 9.2095 | 4660 | 0.8521 | 0.0435 | 0.8521 | 0.9231 |
| 0.0373 | 9.2134 | 4662 | 0.8511 | 0.0435 | 0.8511 | 0.9226 |
| 0.0373 | 9.2174 | 4664 | 0.8519 | 0.0435 | 0.8519 | 0.9230 |
| 0.0373 | 9.2213 | 4666 | 0.8547 | 0.0435 | 0.8547 | 0.9245 |
| 0.0373 | 9.2253 | 4668 | 0.8573 | 0.0435 | 0.8573 | 0.9259 |
| 0.0373 | 9.2292 | 4670 | 0.8604 | 0.0435 | 0.8604 | 0.9276 |
| 0.0373 | 9.2332 | 4672 | 0.8611 | 0.0435 | 0.8611 | 0.9279 |
| 0.0373 | 9.2372 | 4674 | 0.8614 | 0.0435 | 0.8614 | 0.9281 |
| 0.0373 | 9.2411 | 4676 | 0.8620 | 0.0435 | 0.8620 | 0.9284 |
| 0.0373 | 9.2451 | 4678 | 0.8612 | 0.0435 | 0.8612 | 0.9280 |
| 0.0373 | 9.2490 | 4680 | 0.8605 | 0.0435 | 0.8605 | 0.9276 |
| 0.0373 | 9.2530 | 4682 | 0.8586 | 0.0435 | 0.8586 | 0.9266 |
| 0.0373 | 9.2569 | 4684 | 0.8593 | 0.0435 | 0.8593 | 0.9270 |
| 0.0373 | 9.2609 | 4686 | 0.8605 | 0.0435 | 0.8605 | 0.9276 |
| 0.0373 | 9.2648 | 4688 | 0.8600 | 0.0435 | 0.8600 | 0.9273 |
| 0.0373 | 9.2688 | 4690 | 0.8614 | 0.0435 | 0.8614 | 0.9281 |
| 0.0373 | 9.2727 | 4692 | 0.8644 | 0.0435 | 0.8644 | 0.9297 |
| 0.0373 | 9.2767 | 4694 | 0.8650 | 0.0435 | 0.8650 | 0.9300 |
| 0.0373 | 9.2806 | 4696 | 0.8671 | 0.0435 | 0.8671 | 0.9312 |
| 0.0373 | 9.2846 | 4698 | 0.8699 | 0.0435 | 0.8699 | 0.9327 |
| 0.0373 | 9.2885 | 4700 | 0.8696 | 0.0435 | 0.8696 | 0.9325 |
| 0.0373 | 9.2925 | 4702 | 0.8706 | 0.0435 | 0.8706 | 0.9331 |
| 0.0373 | 9.2964 | 4704 | 0.8687 | 0.0435 | 0.8687 | 0.9320 |
| 0.0373 | 9.3004 | 4706 | 0.8666 | 0.0435 | 0.8666 | 0.9309 |
| 0.0373 | 9.3043 | 4708 | 0.8637 | 0.0435 | 0.8637 | 0.9293 |
| 0.0373 | 9.3083 | 4710 | 0.8621 | 0.0435 | 0.8621 | 0.9285 |
| 0.0373 | 9.3123 | 4712 | 0.8600 | 0.0435 | 0.8600 | 0.9274 |
| 0.0373 | 9.3162 | 4714 | 0.8598 | 0.0435 | 0.8598 | 0.9273 |
| 0.0373 | 9.3202 | 4716 | 0.8607 | 0.0435 | 0.8607 | 0.9278 |
| 0.0373 | 9.3241 | 4718 | 0.8628 | 0.0435 | 0.8628 | 0.9289 |
| 0.0373 | 9.3281 | 4720 | 0.8648 | 0.0435 | 0.8648 | 0.9300 |
| 0.0373 | 9.3320 | 4722 | 0.8654 | 0.0435 | 0.8654 | 0.9302 |
| 0.0373 | 9.3360 | 4724 | 0.8654 | 0.0435 | 0.8654 | 0.9303 |
| 0.0373 | 9.3399 | 4726 | 0.8670 | 0.0435 | 0.8670 | 0.9311 |
| 0.0373 | 9.3439 | 4728 | 0.8671 | 0.0435 | 0.8671 | 0.9312 |
| 0.0373 | 9.3478 | 4730 | 0.8682 | 0.0435 | 0.8682 | 0.9318 |
| 0.0373 | 9.3518 | 4732 | 0.8685 | 0.0435 | 0.8685 | 0.9319 |
| 0.0373 | 9.3557 | 4734 | 0.8681 | 0.0435 | 0.8681 | 0.9317 |
| 0.0373 | 9.3597 | 4736 | 0.8680 | 0.0435 | 0.8680 | 0.9316 |
| 0.0373 | 9.3636 | 4738 | 0.8686 | 0.0435 | 0.8686 | 0.9320 |
| 0.0373 | 9.3676 | 4740 | 0.8668 | 0.0435 | 0.8668 | 0.9310 |
| 0.0373 | 9.3715 | 4742 | 0.8661 | 0.0435 | 0.8661 | 0.9306 |
| 0.0373 | 9.3755 | 4744 | 0.8652 | 0.0435 | 0.8652 | 0.9302 |
| 0.0373 | 9.3794 | 4746 | 0.8651 | 0.0435 | 0.8651 | 0.9301 |
| 0.0373 | 9.3834 | 4748 | 0.8652 | 0.0435 | 0.8652 | 0.9302 |
| 0.0373 | 9.3874 | 4750 | 0.8639 | 0.0435 | 0.8639 | 0.9294 |
| 0.0373 | 9.3913 | 4752 | 0.8617 | 0.0435 | 0.8617 | 0.9283 |
| 0.0373 | 9.3953 | 4754 | 0.8607 | 0.0435 | 0.8607 | 0.9278 |
| 0.0373 | 9.3992 | 4756 | 0.8608 | 0.0530 | 0.8608 | 0.9278 |
| 0.0373 | 9.4032 | 4758 | 0.8606 | 0.0435 | 0.8606 | 0.9277 |
| 0.0373 | 9.4071 | 4760 | 0.8611 | 0.0435 | 0.8611 | 0.9280 |
| 0.0373 | 9.4111 | 4762 | 0.8616 | 0.0435 | 0.8616 | 0.9282 |
| 0.0373 | 9.4150 | 4764 | 0.8632 | 0.0435 | 0.8632 | 0.9291 |
| 0.0373 | 9.4190 | 4766 | 0.8668 | 0.0435 | 0.8668 | 0.9310 |
| 0.0373 | 9.4229 | 4768 | 0.8702 | 0.0435 | 0.8702 | 0.9328 |
| 0.0373 | 9.4269 | 4770 | 0.8714 | 0.0435 | 0.8714 | 0.9335 |
| 0.0373 | 9.4308 | 4772 | 0.8735 | 0.0435 | 0.8735 | 0.9346 |
| 0.0373 | 9.4348 | 4774 | 0.8759 | 0.2080 | 0.8759 | 0.9359 |
| 0.0373 | 9.4387 | 4776 | 0.8800 | 0.2080 | 0.8800 | 0.9381 |
| 0.0373 | 9.4427 | 4778 | 0.8809 | 0.2080 | 0.8809 | 0.9386 |
| 0.0373 | 9.4466 | 4780 | 0.8808 | 0.2080 | 0.8808 | 0.9385 |
| 0.0373 | 9.4506 | 4782 | 0.8777 | 0.2080 | 0.8777 | 0.9369 |
| 0.0373 | 9.4545 | 4784 | 0.8748 | 0.0435 | 0.8748 | 0.9353 |
| 0.0373 | 9.4585 | 4786 | 0.8728 | 0.0435 | 0.8728 | 0.9342 |
| 0.0373 | 9.4625 | 4788 | 0.8711 | 0.0435 | 0.8711 | 0.9333 |
| 0.0373 | 9.4664 | 4790 | 0.8701 | 0.0435 | 0.8701 | 0.9328 |
| 0.0373 | 9.4704 | 4792 | 0.8679 | 0.0435 | 0.8679 | 0.9316 |
| 0.0373 | 9.4743 | 4794 | 0.8667 | 0.0435 | 0.8667 | 0.9310 |
| 0.0373 | 9.4783 | 4796 | 0.8670 | 0.0435 | 0.8670 | 0.9311 |
| 0.0373 | 9.4822 | 4798 | 0.8673 | 0.0435 | 0.8673 | 0.9313 |
| 0.0373 | 9.4862 | 4800 | 0.8676 | 0.0435 | 0.8676 | 0.9315 |
| 0.0373 | 9.4901 | 4802 | 0.8686 | 0.0435 | 0.8686 | 0.9320 |
| 0.0373 | 9.4941 | 4804 | 0.8689 | 0.0435 | 0.8689 | 0.9321 |
| 0.0373 | 9.4980 | 4806 | 0.8682 | 0.0435 | 0.8682 | 0.9318 |
| 0.0373 | 9.5020 | 4808 | 0.8668 | 0.0435 | 0.8668 | 0.9310 |
| 0.0373 | 9.5059 | 4810 | 0.8669 | 0.0435 | 0.8669 | 0.9311 |
| 0.0373 | 9.5099 | 4812 | 0.8668 | 0.0435 | 0.8668 | 0.9310 |
| 0.0373 | 9.5138 | 4814 | 0.8661 | 0.0435 | 0.8661 | 0.9307 |
| 0.0373 | 9.5178 | 4816 | 0.8652 | 0.0435 | 0.8652 | 0.9301 |
| 0.0373 | 9.5217 | 4818 | 0.8643 | 0.0435 | 0.8643 | 0.9297 |
| 0.0373 | 9.5257 | 4820 | 0.8627 | 0.0435 | 0.8627 | 0.9288 |
| 0.0373 | 9.5296 | 4822 | 0.8619 | 0.0435 | 0.8619 | 0.9284 |
| 0.0373 | 9.5336 | 4824 | 0.8612 | 0.0435 | 0.8612 | 0.9280 |
| 0.0373 | 9.5375 | 4826 | 0.8601 | 0.0435 | 0.8601 | 0.9274 |
| 0.0373 | 9.5415 | 4828 | 0.8587 | 0.0435 | 0.8587 | 0.9267 |
| 0.0373 | 9.5455 | 4830 | 0.8585 | 0.0435 | 0.8585 | 0.9266 |
| 0.0373 | 9.5494 | 4832 | 0.8583 | 0.0435 | 0.8583 | 0.9264 |
| 0.0373 | 9.5534 | 4834 | 0.8578 | 0.0435 | 0.8578 | 0.9262 |
| 0.0373 | 9.5573 | 4836 | 0.8582 | 0.0435 | 0.8582 | 0.9264 |
| 0.0373 | 9.5613 | 4838 | 0.8594 | 0.0435 | 0.8594 | 0.9271 |
| 0.0373 | 9.5652 | 4840 | 0.8616 | 0.0435 | 0.8616 | 0.9282 |
| 0.0373 | 9.5692 | 4842 | 0.8650 | 0.0435 | 0.8650 | 0.9300 |
| 0.0373 | 9.5731 | 4844 | 0.8694 | 0.0435 | 0.8694 | 0.9324 |
| 0.0373 | 9.5771 | 4846 | 0.8745 | 0.0435 | 0.8745 | 0.9352 |
| 0.0373 | 9.5810 | 4848 | 0.8785 | 0.2080 | 0.8785 | 0.9373 |
| 0.0373 | 9.5850 | 4850 | 0.8802 | 0.2080 | 0.8802 | 0.9382 |
| 0.0373 | 9.5889 | 4852 | 0.8802 | 0.2080 | 0.8802 | 0.9382 |
| 0.0373 | 9.5929 | 4854 | 0.8806 | 0.2080 | 0.8806 | 0.9384 |
| 0.0373 | 9.5968 | 4856 | 0.8802 | 0.2080 | 0.8802 | 0.9382 |
| 0.0373 | 9.6008 | 4858 | 0.8793 | 0.2080 | 0.8793 | 0.9377 |
| 0.0373 | 9.6047 | 4860 | 0.8784 | 0.2080 | 0.8784 | 0.9373 |
| 0.0373 | 9.6087 | 4862 | 0.8770 | 0.2080 | 0.8770 | 0.9365 |
| 0.0373 | 9.6126 | 4864 | 0.8765 | 0.0435 | 0.8765 | 0.9362 |
| 0.0373 | 9.6166 | 4866 | 0.8745 | 0.0435 | 0.8745 | 0.9352 |
| 0.0373 | 9.6206 | 4868 | 0.8714 | 0.0435 | 0.8714 | 0.9335 |
| 0.0373 | 9.6245 | 4870 | 0.8678 | 0.0435 | 0.8678 | 0.9315 |
| 0.0373 | 9.6285 | 4872 | 0.8644 | 0.0435 | 0.8644 | 0.9297 |
| 0.0373 | 9.6324 | 4874 | 0.8609 | 0.0435 | 0.8609 | 0.9278 |
| 0.0373 | 9.6364 | 4876 | 0.8588 | 0.0435 | 0.8588 | 0.9267 |
| 0.0373 | 9.6403 | 4878 | 0.8578 | 0.0435 | 0.8578 | 0.9262 |
| 0.0373 | 9.6443 | 4880 | 0.8568 | 0.0435 | 0.8568 | 0.9257 |
| 0.0373 | 9.6482 | 4882 | 0.8561 | 0.0435 | 0.8561 | 0.9253 |
| 0.0373 | 9.6522 | 4884 | 0.8556 | 0.0435 | 0.8556 | 0.9250 |
| 0.0373 | 9.6561 | 4886 | 0.8551 | 0.0435 | 0.8551 | 0.9247 |
| 0.0373 | 9.6601 | 4888 | 0.8549 | 0.0435 | 0.8549 | 0.9246 |
| 0.0373 | 9.6640 | 4890 | 0.8550 | 0.0435 | 0.8550 | 0.9247 |
| 0.0373 | 9.6680 | 4892 | 0.8549 | 0.0435 | 0.8549 | 0.9246 |
| 0.0373 | 9.6719 | 4894 | 0.8548 | 0.0435 | 0.8548 | 0.9246 |
| 0.0373 | 9.6759 | 4896 | 0.8551 | 0.0435 | 0.8551 | 0.9247 |
| 0.0373 | 9.6798 | 4898 | 0.8554 | 0.0435 | 0.8554 | 0.9249 |
| 0.0373 | 9.6838 | 4900 | 0.8558 | 0.0435 | 0.8558 | 0.9251 |
| 0.0373 | 9.6877 | 4902 | 0.8562 | 0.0435 | 0.8562 | 0.9253 |
| 0.0373 | 9.6917 | 4904 | 0.8568 | 0.0435 | 0.8568 | 0.9257 |
| 0.0373 | 9.6957 | 4906 | 0.8577 | 0.0435 | 0.8577 | 0.9261 |
| 0.0373 | 9.6996 | 4908 | 0.8587 | 0.0435 | 0.8587 | 0.9267 |
| 0.0373 | 9.7036 | 4910 | 0.8600 | 0.0435 | 0.8600 | 0.9274 |
| 0.0373 | 9.7075 | 4912 | 0.8611 | 0.0435 | 0.8611 | 0.9280 |
| 0.0373 | 9.7115 | 4914 | 0.8621 | 0.0435 | 0.8621 | 0.9285 |
| 0.0373 | 9.7154 | 4916 | 0.8636 | 0.0435 | 0.8636 | 0.9293 |
| 0.0373 | 9.7194 | 4918 | 0.8647 | 0.0435 | 0.8647 | 0.9299 |
| 0.0373 | 9.7233 | 4920 | 0.8664 | 0.0435 | 0.8664 | 0.9308 |
| 0.0373 | 9.7273 | 4922 | 0.8685 | 0.0435 | 0.8685 | 0.9319 |
| 0.0373 | 9.7312 | 4924 | 0.8713 | 0.0435 | 0.8713 | 0.9335 |
| 0.0373 | 9.7352 | 4926 | 0.8738 | 0.0435 | 0.8738 | 0.9348 |
| 0.0373 | 9.7391 | 4928 | 0.8750 | 0.0435 | 0.8750 | 0.9354 |
| 0.0373 | 9.7431 | 4930 | 0.8749 | 0.0435 | 0.8749 | 0.9354 |
| 0.0373 | 9.7470 | 4932 | 0.8740 | 0.0435 | 0.8740 | 0.9349 |
| 0.0373 | 9.7510 | 4934 | 0.8729 | 0.0435 | 0.8729 | 0.9343 |
| 0.0373 | 9.7549 | 4936 | 0.8714 | 0.0435 | 0.8714 | 0.9335 |
| 0.0373 | 9.7589 | 4938 | 0.8700 | 0.0435 | 0.8700 | 0.9327 |
| 0.0373 | 9.7628 | 4940 | 0.8689 | 0.0435 | 0.8689 | 0.9322 |
| 0.0373 | 9.7668 | 4942 | 0.8678 | 0.0435 | 0.8678 | 0.9316 |
| 0.0373 | 9.7708 | 4944 | 0.8663 | 0.0435 | 0.8663 | 0.9307 |
| 0.0373 | 9.7747 | 4946 | 0.8650 | 0.0435 | 0.8650 | 0.9301 |
| 0.0373 | 9.7787 | 4948 | 0.8648 | 0.0435 | 0.8648 | 0.9299 |
| 0.0373 | 9.7826 | 4950 | 0.8647 | 0.0435 | 0.8647 | 0.9299 |
| 0.0373 | 9.7866 | 4952 | 0.8643 | 0.0435 | 0.8643 | 0.9297 |
| 0.0373 | 9.7905 | 4954 | 0.8636 | 0.0435 | 0.8636 | 0.9293 |
| 0.0373 | 9.7945 | 4956 | 0.8626 | 0.0435 | 0.8626 | 0.9288 |
| 0.0373 | 9.7984 | 4958 | 0.8618 | 0.0435 | 0.8618 | 0.9283 |
| 0.0373 | 9.8024 | 4960 | 0.8614 | 0.0435 | 0.8614 | 0.9281 |
| 0.0373 | 9.8063 | 4962 | 0.8611 | 0.0435 | 0.8611 | 0.9280 |
| 0.0373 | 9.8103 | 4964 | 0.8612 | 0.0435 | 0.8612 | 0.9280 |
| 0.0373 | 9.8142 | 4966 | 0.8613 | 0.0435 | 0.8613 | 0.9280 |
| 0.0373 | 9.8182 | 4968 | 0.8615 | 0.0435 | 0.8615 | 0.9281 |
| 0.0373 | 9.8221 | 4970 | 0.8618 | 0.0435 | 0.8618 | 0.9283 |
| 0.0373 | 9.8261 | 4972 | 0.8620 | 0.0435 | 0.8620 | 0.9284 |
| 0.0373 | 9.8300 | 4974 | 0.8625 | 0.0435 | 0.8625 | 0.9287 |
| 0.0373 | 9.8340 | 4976 | 0.8630 | 0.0435 | 0.8630 | 0.9290 |
| 0.0373 | 9.8379 | 4978 | 0.8631 | 0.0435 | 0.8631 | 0.9290 |
| 0.0373 | 9.8419 | 4980 | 0.8628 | 0.0435 | 0.8628 | 0.9289 |
| 0.0373 | 9.8458 | 4982 | 0.8624 | 0.0435 | 0.8624 | 0.9287 |
| 0.0373 | 9.8498 | 4984 | 0.8624 | 0.0435 | 0.8624 | 0.9287 |
| 0.0373 | 9.8538 | 4986 | 0.8624 | 0.0435 | 0.8624 | 0.9287 |
| 0.0373 | 9.8577 | 4988 | 0.8622 | 0.0435 | 0.8622 | 0.9286 |
| 0.0373 | 9.8617 | 4990 | 0.8623 | 0.0435 | 0.8623 | 0.9286 |
| 0.0373 | 9.8656 | 4992 | 0.8627 | 0.0435 | 0.8627 | 0.9288 |
| 0.0373 | 9.8696 | 4994 | 0.8629 | 0.0435 | 0.8629 | 0.9289 |
| 0.0373 | 9.8735 | 4996 | 0.8630 | 0.0435 | 0.8630 | 0.9290 |
| 0.0373 | 9.8775 | 4998 | 0.8628 | 0.0435 | 0.8628 | 0.9289 |
| 0.033 | 9.8814 | 5000 | 0.8627 | 0.0435 | 0.8627 | 0.9288 |
| 0.033 | 9.8854 | 5002 | 0.8627 | 0.0435 | 0.8627 | 0.9288 |
| 0.033 | 9.8893 | 5004 | 0.8627 | 0.0435 | 0.8627 | 0.9288 |
| 0.033 | 9.8933 | 5006 | 0.8626 | 0.0435 | 0.8626 | 0.9288 |
| 0.033 | 9.8972 | 5008 | 0.8627 | 0.0435 | 0.8627 | 0.9288 |
| 0.033 | 9.9012 | 5010 | 0.8629 | 0.0435 | 0.8629 | 0.9289 |
| 0.033 | 9.9051 | 5012 | 0.8631 | 0.0435 | 0.8631 | 0.9290 |
| 0.033 | 9.9091 | 5014 | 0.8632 | 0.0435 | 0.8632 | 0.9291 |
| 0.033 | 9.9130 | 5016 | 0.8633 | 0.0435 | 0.8633 | 0.9291 |
| 0.033 | 9.9170 | 5018 | 0.8633 | 0.0435 | 0.8633 | 0.9292 |
| 0.033 | 9.9209 | 5020 | 0.8632 | 0.0435 | 0.8632 | 0.9291 |
| 0.033 | 9.9249 | 5022 | 0.8631 | 0.0435 | 0.8631 | 0.9291 |
| 0.033 | 9.9289 | 5024 | 0.8632 | 0.0435 | 0.8632 | 0.9291 |
| 0.033 | 9.9328 | 5026 | 0.8633 | 0.0435 | 0.8633 | 0.9291 |
| 0.033 | 9.9368 | 5028 | 0.8633 | 0.0435 | 0.8633 | 0.9291 |
| 0.033 | 9.9407 | 5030 | 0.8633 | 0.0435 | 0.8633 | 0.9291 |
| 0.033 | 9.9447 | 5032 | 0.8632 | 0.0435 | 0.8632 | 0.9291 |
| 0.033 | 9.9486 | 5034 | 0.8632 | 0.0435 | 0.8632 | 0.9291 |
| 0.033 | 9.9526 | 5036 | 0.8632 | 0.0435 | 0.8632 | 0.9291 |
| 0.033 | 9.9565 | 5038 | 0.8630 | 0.0435 | 0.8630 | 0.9290 |
| 0.033 | 9.9605 | 5040 | 0.8629 | 0.0435 | 0.8629 | 0.9289 |
| 0.033 | 9.9644 | 5042 | 0.8628 | 0.0435 | 0.8628 | 0.9289 |
| 0.033 | 9.9684 | 5044 | 0.8627 | 0.0435 | 0.8627 | 0.9288 |
| 0.033 | 9.9723 | 5046 | 0.8627 | 0.0435 | 0.8627 | 0.9288 |
| 0.033 | 9.9763 | 5048 | 0.8627 | 0.0435 | 0.8627 | 0.9288 |
| 0.033 | 9.9802 | 5050 | 0.8627 | 0.0435 | 0.8627 | 0.9288 |
| 0.033 | 9.9842 | 5052 | 0.8627 | 0.0435 | 0.8627 | 0.9288 |
| 0.033 | 9.9881 | 5054 | 0.8627 | 0.0435 | 0.8627 | 0.9288 |
| 0.033 | 9.9921 | 5056 | 0.8626 | 0.0435 | 0.8626 | 0.9288 |
| 0.033 | 9.9960 | 5058 | 0.8626 | 0.0435 | 0.8626 | 0.9288 |
| 0.033 | 10.0 | 5060 | 0.8626 | 0.0435 | 0.8626 | 0.9288 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
mtzig/debug_full_test | mtzig | "2024-11-12T22:26:35Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"generated_from_trainer",
"base_model:TinyPixel/small-llama2",
"base_model:finetune:TinyPixel/small-llama2",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | "2024-11-12T22:25:30Z" | ---
library_name: transformers
base_model: TinyPixel/small-llama2
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: debug_full_test
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# debug_full_test
This model is a fine-tuned version of [TinyPixel/small-llama2](https://huggingface.co/TinyPixel/small-llama2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2665
- Accuracy: 0.6300
- Precision: 0.375
- Recall: 0.0299
- F1: 0.0553
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- total_eval_batch_size: 32
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| 1.6327 | 1.0 | 5 | 1.2665 | 0.6300 | 0.375 | 0.0299 | 0.0553 |
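The table reports accuracy, precision, recall, and F1, but the evaluation code itself is not shown. The following is only a plausible sketch of a `Trainer` `compute_metrics` hook that would produce numbers of this shape; `average="binary"` is an assumption based on the single precision/recall values reported.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    # eval_pred is (logits, labels) as handed over by the HF Trainer.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="binary"  # assumption: binary labels
    )
    return {"accuracy": accuracy_score(labels, preds),
            "precision": precision, "recall": recall, "f1": f1}
```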
### Framework versions
- Transformers 4.46.0
- Pytorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.20.3
|
rconnor1890/flan_t5_chatbot_evaluation_model | rconnor1890 | "2024-11-12T22:29:46Z" | 0 | 0 | null | [
"safetensors",
"t5",
"region:us"
] | null | "2024-11-12T22:25:32Z" | Entry not found |
juanjopt/mimodelo | juanjopt | "2024-11-12T22:26:00Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-11-12T22:25:59Z" | Entry not found |
joermd/speedy-llama2 | joermd | "2024-11-12T23:13:30Z" | 0 | 0 | null | [
"safetensors",
"llama",
"ar",
"en",
"dataset:HuggingFaceH4/ultrachat_200k",
"dataset:HuggingFaceH4/ultrafeedback_binarized",
"dataset:HuggingFaceH4/cai-conversation-harmless",
"arxiv:2404.05829",
"license:llama2",
"region:us"
] | null | "2024-11-12T22:26:29Z" | ---
license: llama2
datasets:
- HuggingFaceH4/ultrachat_200k
- HuggingFaceH4/ultrafeedback_binarized
- HuggingFaceH4/cai-conversation-harmless
language:
- ar
- en
---
# SambaLingo-Arabic-Chat
<img src="SambaLingo_Logo.png" width="340" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
<!-- Provide a quick summary of what the model is/does. -->
SambaLingo-Arabic-Chat is a human-aligned chat model trained in Arabic and English. It is trained using direct preference optimization on top of the base model [SambaLingo-Arabic-Base](https://huggingface.co/sambanovasystems/SambaLingo-Arabic-Base). The base model adapts [Llama-2-7b](https://huggingface.co/meta-llama/Llama-2-7b-hf) to Arabic by training on 63 billion tokens from the Arabic split of the [Cultura-X](https://huggingface.co/datasets/uonlp/CulturaX) dataset. Try this model at [SambaLingo-chat-space](https://huggingface.co/spaces/sambanovasystems/SambaLingo-chat-space).
## Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [SambaNova Systems](https://sambanova.ai/)
- **Model type:** Language Model
- **Language(s):** Arabic, English
- **Finetuned from model:** [Llama-2-7b](https://huggingface.co/meta-llama/Llama-2-7b-hf)
- **Try This Model:** [SambaLingo-chat-space](https://huggingface.co/spaces/sambanovasystems/SambaLingo-chat-space)
- **Paper:** [SambaLingo: Teaching Large Language Models New Languages](https://arxiv.org/abs/2404.05829)
- **Blog Post**: [sambalingo-open-source-language-experts](https://sambanova.ai/blog/sambalingo-open-source-language-experts)
## Getting Started
### Loading Model With Hugging Face
Please make sure to set use_fast=False when loading the tokenizer.
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("sambanovasystems/SambaLingo-Arabic-Chat", use_fast=False)
model = AutoModelForCausalLM.from_pretrained("sambanovasystems/SambaLingo-Arabic-Chat", device_map="auto", torch_dtype="auto")
```
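Once loaded, generation follows the usual `transformers` pattern. The snippet below is a minimal sketch, not part of the original card; `max_new_tokens` and the sampling settings are illustrative choices (they mirror the suggested inference parameters further down), and the question is a placeholder.

```python
# Sketch: build a chat prompt and generate, reusing model/tokenizer from above.
messages = [{"role": "user", "content": "What is the capital of France?"}]  # placeholder question
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256,  # illustrative length cap
                         do_sample=True, temperature=0.8, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```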
### Interacting With Model Pipeline
Please make sure to set use_fast=False when loading the tokenizer.
```python
from transformers import pipeline
pipe = pipeline("text-generation", model="sambanovasystems/SambaLingo-Arabic-Chat", device_map="auto", use_fast=False)
messages = [
{"role": "user", "content": {YOUR_QUESTION}},
]
prompt = pipe.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipe(prompt)[0]
outputs = outputs["generated_text"]
```
### Suggested Inference Parameters
- Temperature: 0.8
- Repetition penalty: 1.0
- Top-p: 0.9
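One way to apply these settings is to pass them as generation kwargs to the pipeline call from the previous section. A sketch, reusing `pipe` and `prompt` from the snippet above; `max_new_tokens` is an illustrative addition, not an official setting:

```python
# Sketch: suggested sampling parameters passed through the pipeline call.
outputs = pipe(
    prompt,
    do_sample=True,
    temperature=0.8,         # suggested temperature
    repetition_penalty=1.0,  # suggested repetition penalty
    top_p=0.9,               # suggested top-p
    max_new_tokens=256,      # illustrative, not an official setting
)
```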
### Prompting Guidelines
To prompt this model, please use the following chat template:
```
<|user|>\n{question}</s>\n<|assistant|>\n
```
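Equivalently, the template can be filled in by hand when not using `apply_chat_template`; a trivial sketch with a placeholder question:

```python
question = "What is the capital of France?"  # placeholder question
prompt = f"<|user|>\n{question}</s>\n<|assistant|>\n"
```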
### Example Prompts and Generations
```
<|user|>
كملك، هل الاحسن أن تكون محبوب أو مخيف</s>
<|assistant|>
لا أستطيع إبداء الرأي أو امتلاك المشاعر، لكن يمكنني تقديم إجابة موضوعية على هذا السؤال. في حين أن كونك محبوبًا يمكن أن يجلب السعادة والرضا، فإن كونك مخيفًا يمكن أن يكون له أيضًا فوائده. على سبيل المثال، يمكن أن يكون الشخص المخيف فعالاً في ردع السلوكيات أو المواقف غير المرغوب فيها، مثل التخريب أو التنمر. ومع ذلك، من المهم أن نتذكر أن الحب والقبول هما قيمتان مهمتان يجب السعي لتحقيقهما، وأن كونك محبوبًا لا ينبغي أن يكون الهدف النهائي. وبدلاً من ذلك، يجب أن نسعى جاهدين لنكون طيبين ورحيمين مع الآخرين، مع الاعتراف أيضًا بأن كل شخص لديه نقاط القوة والضعف الخاصة به.
```

(English translation. User: "As a king, is it better to be loved or feared?" Assistant: "I cannot express opinions or have feelings, but I can offer an objective answer to this question. While being loved can bring happiness and satisfaction, being feared can also have its benefits. For example, a feared person can be effective at deterring undesirable behaviors or attitudes, such as vandalism or bullying. However, it is important to remember that love and acceptance are two important values worth pursuing, and being loved should not be the ultimate goal. Instead, we should strive to be kind and compassionate toward others, while also recognizing that everyone has their own strengths and weaknesses.")
## Training Details
The alignment phase follows the recipe for [Zephyr-7B](https://huggingface.co/HuggingFaceH4/zephyr-7b-beta) and comprises two stages: supervised fine-tuning (SFT) and Direct Preference Optimization (DPO).
The SFT phase was done on the [ultrachat_200k](https://huggingface.co/datasets/HuggingFaceH4/ultrachat_200k) dataset mixed with a Google-translated version of the same dataset. It was trained for one epoch with global batch size 512 and a max sequence length of 2048 tokens. We used a linear decay learning rate of 2e-5 and 10% warmup.
The DPO phase was done on the [ultrafeedback](https://huggingface.co/datasets/HuggingFaceH4/ultrafeedback_binarized) and [cai-conversation-harmless](https://huggingface.co/datasets/HuggingFaceH4/cai-conversation-harmless) datasets, mixed with a 10% Google-translated portion of the data. It was trained for three epochs with global batch size 32. We used a linear decay learning rate of 5e-7, 10% warmup, and β=0.1 as the regularization factor for DPO.
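As a rough illustration of the DPO stage, a hedged sketch using TRL's `DPOTrainer`. This assumes the older TRL signature where `beta` is passed directly (newer releases move it into `DPOConfig`), and `model`, `ref_model`, `train_dataset`, `tokenizer`, and `training_args` are placeholders for objects the actual recipe would define:

```python
from trl import DPOTrainer  # sketch only; see the Zephyr recipe for the real setup

trainer = DPOTrainer(
    model=model,             # policy model (after SFT)
    ref_model=ref_model,     # frozen reference model
    beta=0.1,                # the regularization factor mentioned above
    train_dataset=train_dataset,
    tokenizer=tokenizer,
    args=training_args,
)
trainer.train()
```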
## Tokenizer Details
We extended the vocabulary of the base llama model from 32,000 tokens to 57,000 tokens by adding up to 25,000 non-overlapping tokens from the new language.
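Mechanically, vocabulary extension of this kind uses two standard `transformers` calls. The sketch below is illustrative only, and `new_arabic_tokens` is a hypothetical placeholder for the roughly 25,000 added tokens:

```python
# Sketch: grow the vocabulary and resize the embedding matrix to match.
num_added = tokenizer.add_tokens(new_arabic_tokens)  # hypothetical token list
model.resize_token_embeddings(len(tokenizer))        # 32,000 -> 57,000 rows
```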
## Evaluation
For evaluation results see our paper: [SambaLingo: Teaching Large Language Models New Languages](https://arxiv.org/abs/2404.05829)
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
Use of this model is governed by Meta's [Llama 2 Community License Agreement](https://ai.meta.com/llama/license/). Please review and accept the license before downloading the model weights.
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
SambaLingo should NOT be used for:
- Mission-critical applications
- Applications that involve the safety of others
- Making highly important decisions
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
Like all LLMs, SambaLingo has certain limitations:
- Hallucination: The model may sometimes generate responses that contain plausible-sounding but factually incorrect or irrelevant information.
- Code Switching: The model might unintentionally switch between languages or dialects within a single response, affecting the coherence and understandability of the output.
- Repetition: The model may produce repetitive phrases or sentences, leading to less engaging and informative responses.
- Coding and Math: The model's performance in generating accurate code or solving complex mathematical problems may be limited.
- Toxicity: The model could inadvertently generate responses containing inappropriate or harmful content.
## Acknowledgments
We extend our heartfelt gratitude to the open-source AI community; this endeavor would not have been possible without open source. SambaNova embraces the open-source community and aspires to actively contribute to this initiative.
We would like to give a special thanks to the following groups:
- Meta for open-sourcing Llama 2 and the FLORES-200 dataset
- Nguyen et al. for open-sourcing the CulturaX dataset
- CohereAI for releasing AYA-101 and open-sourcing a multilingual instruction tuning dataset
- EleutherAI for their open-source evaluation framework
- The Hugging Face H4 team for open-sourcing the Zephyr training recipe and the alignment handbook repo
## Cite SambaLingo
```
@misc{csaki2024sambalingo,
title={SambaLingo: Teaching Large Language Models New Languages},
author={Zoltan Csaki and Bo Li and Jonathan Li and Qiantong Xu and Pian Pawakapan and Leon Zhang and Yun Du and Hengyu Zhao and Changran Hu and Urmish Thakker},
year={2024},
eprint={2404.05829},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
Dev-jcgi/teoria-decision | Dev-jcgi | "2024-11-12T22:49:48Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:google-bert/bert-base-cased",
"base_model:finetune:google-bert/bert-base-cased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-11-12T22:26:34Z" | ---
library_name: transformers
license: apache-2.0
base_model: bert-base-cased
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: teoria-decision
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# teoria-decision
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.4823
- Accuracy: 0.325
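As a quick way to try the classifier, a generic inference sketch with the standard `text-classification` pipeline. The label names are not documented, so outputs carry generic `LABEL_i` ids, and the input string is a placeholder:

```python
from transformers import pipeline

clf = pipeline("text-classification", model="Dev-jcgi/teoria-decision")
print(clf("Placeholder text to classify."))  # e.g. [{'label': 'LABEL_0', 'score': ...}]
```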
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.5171 | 0.5 | 5 | 1.5654 | 0.225 |
| 1.4518 | 1.0 | 10 | 1.5206 | 0.375 |
| 1.389 | 1.5 | 15 | 1.4926 | 0.325 |
| 1.344 | 2.0 | 20 | 1.4823 | 0.325 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cpu
- Datasets 3.1.0
- Tokenizers 0.20.3
|
aurazboev/ISAllama-3.1-8b-tuned | aurazboev | "2024-11-12T22:31:32Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"text-generation-inference",
"unsloth",
"trl",
"sft",
"conversational",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | "2024-11-12T22:26:48Z" | ---
base_model: unsloth/meta-llama-3.1-8b-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
- sft
license: apache-2.0
language:
- en
---
# Uploaded model
- **Developed by:** aurazboev
- **License:** apache-2.0
- **Finetuned from model :** unsloth/meta-llama-3.1-8b-bnb-4bit
This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
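A hedged sketch of loading the 4-bit base with Unsloth, following the library's usual pattern; the sequence length here is an illustrative choice, not a documented training setting:

```python
from unsloth import FastLanguageModel

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/meta-llama-3.1-8b-bnb-4bit",
    max_seq_length=2048,  # illustrative
    load_in_4bit=True,
)
```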
|
barchetta/peso-130927 | barchetta | "2024-11-12T22:32:10Z" | 0 | 0 | null | [
"safetensors",
"llama",
"region:us"
] | null | "2024-11-12T22:27:19Z" | Entry not found |
barchetta/rupe-130927 | barchetta | "2024-11-12T22:27:20Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-11-12T22:27:20Z" | Entry not found |
barchetta/sale-130927 | barchetta | "2024-11-12T22:27:21Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-11-12T22:27:21Z" | Entry not found |
asr-africa/bambara-mms-5-hours-mixed-asr-hf | asr-africa | "2024-11-13T01:28:17Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"wav2vec2",
"automatic-speech-recognition",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | "2024-11-12T22:27:30Z" | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
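In the absence of official instructions, a generic sketch with the `automatic-speech-recognition` pipeline is a reasonable starting point. The audio path is a placeholder, and `ffmpeg` must be installed for audio decoding:

```python
from transformers import pipeline

asr = pipeline("automatic-speech-recognition",
               model="asr-africa/bambara-mms-5-hours-mixed-asr-hf")
print(asr("sample.wav")["text"])  # "sample.wav" is a placeholder path
```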
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
soumilj/distilbert-base-uncased-distilled-clinc | soumilj | "2024-11-12T22:40:54Z" | 0 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-11-12T22:28:42Z" | ---
library_name: transformers
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased-distilled-clinc
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-distilled-clinc
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0996
- Accuracy: 0.9397
## Model description
More information needed
## Intended uses & limitations
More information needed
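As a minimal usage sketch (assuming the standard `text-classification` pipeline; the intent label set depends on the `id2label` mapping stored with this checkpoint):
```python
from transformers import pipeline

# Intent classification with the distilled checkpoint.
clf = pipeline(
    "text-classification",
    model="soumilj/distilbert-base-uncased-distilled-clinc",
)

# The returned label names come from the checkpoint's id2label mapping.
print(clf("What is the balance on my savings account?"))
```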
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 48
- eval_batch_size: 48
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 318 | 0.5757 | 0.7181 |
| 0.7595 | 2.0 | 636 | 0.2813 | 0.8839 |
| 0.7595 | 3.0 | 954 | 0.1796 | 0.9197 |
| 0.2775 | 4.0 | 1272 | 0.1396 | 0.9326 |
| 0.1597 | 5.0 | 1590 | 0.1204 | 0.9332 |
| 0.1597 | 6.0 | 1908 | 0.1110 | 0.9377 |
| 0.1246 | 7.0 | 2226 | 0.1063 | 0.9384 |
| 0.1098 | 8.0 | 2544 | 0.1023 | 0.9397 |
| 0.1098 | 9.0 | 2862 | 0.1005 | 0.9394 |
| 0.1034 | 10.0 | 3180 | 0.0996 | 0.9397 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.0+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
Pearush/phimoe_15_head16_kv | Pearush | "2024-11-12T22:34:21Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"phimoe",
"text-generation",
"conversational",
"custom_code",
"arxiv:1910.09700",
"autotrain_compatible",
"region:us"
] | text-generation | "2024-11-12T22:29:07Z" | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
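Until the authors complete this section, a minimal sketch assuming the standard AutoModel API; `trust_remote_code=True` follows from the `custom_code` tag on this repo:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Pearush/phimoe_15_head16_kv"

# The repo ships custom modeling code, so remote code must be trusted to load it.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

inputs = tokenizer("Hello, world!", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```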
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
davidrd123/Mary-Cassatt-Oil-DetailCrops-Slow-Resume-x2-Timestep-Bias-7e-5-Flux-LoKr | davidrd123 | "2024-11-13T01:24:43Z" | 0 | 0 | diffusers | [
"diffusers",
"flux",
"flux-diffusers",
"text-to-image",
"simpletuner",
"not-for-all-audiences",
"lora",
"template:sd-lora",
"lycoris",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"license:other",
"region:us"
] | text-to-image | "2024-11-12T22:29:13Z" | ---
license: other
base_model: "black-forest-labs/FLUX.1-dev"
tags:
- flux
- flux-diffusers
- text-to-image
- diffusers
- simpletuner
- not-for-all-audiences
- lora
- template:sd-lora
- lycoris
inference: true
widget:
- text: 'unconditional (blank prompt)'
parameters:
negative_prompt: 'blurry, cropped, ugly'
output:
url: ./assets/image_0_0.png
- text: 'In the style of a c4ss4tt oil painting, A child wearing an elaborate blue silk dress with ruffles and white lace trim sits near a window, the fabric catching soft light.'
parameters:
negative_prompt: 'blurry, cropped, ugly'
output:
url: ./assets/image_1_0.png
- text: 'In the style of a c4ss4tt oil painting, A close portrait of a young child''s face with rosy cheeks and delicate features, gentle lighting from a nearby window.'
parameters:
negative_prompt: 'blurry, cropped, ugly'
output:
url: ./assets/image_2_0.png
- text: 'In the style of a c4ss4tt oil painting, Strong window light falls across a child''s face and shoulder, creating bold shadows on their blue dress.'
parameters:
negative_prompt: 'blurry, cropped, ugly'
output:
url: ./assets/image_3_0.png
- text: 'In the style of a c4ss4tt oil painting, A child in a blue hat stands by a window.'
parameters:
negative_prompt: 'blurry, cropped, ugly'
output:
url: ./assets/image_4_0.png
- text: 'In the style of a c4ss4tt oil painting, A woman in soft colors holds her baby close.'
parameters:
negative_prompt: 'blurry, cropped, ugly'
output:
url: ./assets/image_5_0.png
- text: 'In the style of a c4ss4tt oil painting, A woman in a detailed white lace dress reads while seated by a window with gauzy curtains, various textures visible in the furnishings.'
parameters:
negative_prompt: 'blurry, cropped, ugly'
output:
url: ./assets/image_6_0.png
- text: 'In the style of a c4ss4tt oil painting, A mother in a textured knit sweater checks her phone while her baby sleeps against her shoulder.'
parameters:
negative_prompt: 'blurry, cropped, ugly'
output:
url: ./assets/image_7_0.png
- text: 'In the style of a c4ss4tt oil painting, A mother cat grooms her kitten by a sunlit window, their fur catching the light.'
parameters:
negative_prompt: 'blurry, cropped, ugly'
output:
url: ./assets/image_8_0.png
---
# Mary-Cassatt-Oil-DetailCrops-Slow-Resume-x2-Timestep-Bias-7e-5-Flux-LoKr
This is a LyCORIS adapter derived from [black-forest-labs/FLUX.1-dev](https://huggingface.co/black-forest-labs/FLUX.1-dev).
No validation prompt was used during training.
## Validation settings
- CFG: `3.0`
- CFG Rescale: `0.0`
- Steps: `20`
- Sampler: `None`
- Seed: `42`
- Resolution: `1024x1024`
Note: The validation settings are not necessarily the same as the [training settings](#training-settings).
You can find some example images in the following gallery:
<Gallery />
The text encoder **was not** trained.
You may reuse the base model text encoder for inference.
## Training settings
- Training epochs: 3
- Training steps: 2000
- Learning rate: 7e-05
- Max grad norm: 0.1
- Effective batch size: 4
- Micro-batch size: 4
- Gradient accumulation steps: 1
- Number of GPUs: 1
- Prediction type: flow-matching (extra parameters=['shift=3', 'flux_guidance_value=1.0'])
- Rescaled betas zero SNR: False
- Optimizer: adamw_bf16
- Precision: Pure BF16
- Quantised: Yes: int8-quanto
- Xformers: Not used
- LyCORIS Config:
```json
{
"algo": "lokr",
"multiplier": 1.0,
"linear_dim": 10000,
"linear_alpha": 1,
"factor": 16,
"apply_preset": {
"target_module": [
"Attention",
"FeedForward"
],
"module_algo_map": {
"Attention": {
"factor": 16
},
"FeedForward": {
"factor": 8
}
}
}
}
```
## Datasets
### cassatt-detail-crops-512
- Repeats: 15
- Total number of images: 25
- Total number of aspect buckets: 10
- Resolution: 0.262144 megapixels
- Cropped: False
- Crop style: None
- Crop aspect: None
- Used for regularisation data: No
### cassatt-detail-crops-768
- Repeats: 15
- Total number of images: 25
- Total number of aspect buckets: 11
- Resolution: 0.589824 megapixels
- Cropped: False
- Crop style: None
- Crop aspect: None
- Used for regularisation data: No
### cassatt-detail-crops-1024
- Repeats: 7
- Total number of images: 25
- Total number of aspect buckets: 17
- Resolution: 1.048576 megapixels
- Cropped: False
- Crop style: None
- Crop aspect: None
- Used for regularisation data: No
### cassatt-detail-crops-1536
- Repeats: 3
- Total number of images: 24
- Total number of aspect buckets: 17
- Resolution: 2.359296 megapixels
- Cropped: False
- Crop style: None
- Crop aspect: None
- Used for regularisation data: No
## Inference
```python
import torch
from diffusers import DiffusionPipeline
from lycoris import create_lycoris_from_weights
def download_adapter(repo_id: str):
import os
from huggingface_hub import hf_hub_download
adapter_filename = "pytorch_lora_weights.safetensors"
cache_dir = os.environ.get('HF_PATH', os.path.expanduser('~/.cache/huggingface/hub/models'))
cleaned_adapter_path = repo_id.replace("/", "_").replace("\\", "_").replace(":", "_")
path_to_adapter = os.path.join(cache_dir, cleaned_adapter_path)
path_to_adapter_file = os.path.join(path_to_adapter, adapter_filename)
os.makedirs(path_to_adapter, exist_ok=True)
hf_hub_download(
repo_id=repo_id, filename=adapter_filename, local_dir=path_to_adapter
)
return path_to_adapter_file
model_id = 'black-forest-labs/FLUX.1-dev'
adapter_repo_id = 'davidrd123/Mary-Cassatt-Oil-DetailCrops-Slow-Resume-x2-Timestep-Bias-7e-5-Flux-LoKr'
adapter_filename = 'pytorch_lora_weights.safetensors'
adapter_file_path = download_adapter(repo_id=adapter_repo_id)
pipeline = DiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.bfloat16) # loading directly in bf16
lora_scale = 1.0
wrapper, _ = create_lycoris_from_weights(lora_scale, adapter_file_path, pipeline.transformer)
wrapper.merge_to()
prompt = "An astronaut is riding a horse through the jungles of Thailand."
## Optional: quantise the model to save on vram.
## Note: The model was quantised during training, and so it is recommended to do the same during inference time.
from optimum.quanto import quantize, freeze, qint8
quantize(pipeline.transformer, weights=qint8)
freeze(pipeline.transformer)
pipeline.to('cuda' if torch.cuda.is_available() else 'mps' if torch.backends.mps.is_available() else 'cpu') # the pipeline is already in its target precision level
image = pipeline(
prompt=prompt,
num_inference_steps=20,
generator=torch.Generator(device='cuda' if torch.cuda.is_available() else 'mps' if torch.backends.mps.is_available() else 'cpu').manual_seed(1641421826),
width=1024,
height=1024,
guidance_scale=3.0,
).images[0]
image.save("output.png", format="PNG")
```
|
growwithdaisy/vnlwnxgstfwstmn_subjects_20241112_142240 | growwithdaisy | "2024-11-13T01:14:11Z" | 0 | 0 | diffusers | [
"diffusers",
"flux",
"flux-diffusers",
"text-to-image",
"simpletuner",
"not-for-all-audiences",
"lora",
"template:sd-lora",
"lycoris",
"license:other",
"region:us"
] | text-to-image | "2024-11-12T22:29:36Z" | |
KwongYung/trained-sd2 | KwongYung | "2024-11-12T22:29:40Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-11-12T22:29:40Z" | Entry not found |
saga-dev/0h3j3x3 | saga-dev | "2024-11-12T22:29:50Z" | 0 | 0 | null | [
"safetensors",
"llama",
"region:us"
] | null | "2024-11-12T22:29:47Z" | Entry not found |
tensorblock/gemma2-gutenberg-27B-GGUF | tensorblock | "2024-11-13T01:16:48Z" | 0 | 0 | transformers | [
"transformers",
"gguf",
"TensorBlock",
"GGUF",
"dataset:jondurbin/gutenberg-dpo-v0.1",
"base_model:nbeerbower/gemma2-gutenberg-27B",
"base_model:quantized:nbeerbower/gemma2-gutenberg-27B",
"license:gemma",
"endpoints_compatible",
"region:us"
] | null | "2024-11-12T22:31:05Z" | ---
license: gemma
library_name: transformers
base_model: nbeerbower/gemma2-gutenberg-27B
datasets:
- jondurbin/gutenberg-dpo-v0.1
tags:
- TensorBlock
- GGUF
---
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/jC7kdl8.jpeg" alt="TensorBlock" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;">
Feedback and support: TensorBlock's <a href="https://x.com/tensorblock_aoi">Twitter/X</a>, <a href="https://t.me/TensorBlock">Telegram Group</a> and <a href="https://x.com/tensorblock_aoi">Discord server</a>
</p>
</div>
</div>
## nbeerbower/gemma2-gutenberg-27B - GGUF
This repo contains GGUF format model files for [nbeerbower/gemma2-gutenberg-27B](https://huggingface.co/nbeerbower/gemma2-gutenberg-27B).
The files were quantized using machines provided by [TensorBlock](https://tensorblock.co/), and they are compatible with llama.cpp as of [commit b4011](https://github.com/ggerganov/llama.cpp/commit/a6744e43e80f4be6398fc7733a01642c846dce1d).
## Prompt template
```
<bos><start_of_turn>user
{prompt}<end_of_turn>
<start_of_turn>model
```
## Model file specification
| Filename | Quant type | File Size | Description |
| -------- | ---------- | --------- | ----------- |
| [gemma2-gutenberg-27B-Q2_K.gguf](https://huggingface.co/tensorblock/gemma2-gutenberg-27B-GGUF/tree/main/gemma2-gutenberg-27B-Q2_K.gguf) | Q2_K | 9.732 GB | smallest, significant quality loss - not recommended for most purposes |
| [gemma2-gutenberg-27B-Q3_K_S.gguf](https://huggingface.co/tensorblock/gemma2-gutenberg-27B-GGUF/tree/main/gemma2-gutenberg-27B-Q3_K_S.gguf) | Q3_K_S | 11.333 GB | very small, high quality loss |
| [gemma2-gutenberg-27B-Q3_K_M.gguf](https://huggingface.co/tensorblock/gemma2-gutenberg-27B-GGUF/tree/main/gemma2-gutenberg-27B-Q3_K_M.gguf) | Q3_K_M | 12.503 GB | very small, high quality loss |
| [gemma2-gutenberg-27B-Q3_K_L.gguf](https://huggingface.co/tensorblock/gemma2-gutenberg-27B-GGUF/tree/main/gemma2-gutenberg-27B-Q3_K_L.gguf) | Q3_K_L | 13.522 GB | small, substantial quality loss |
| [gemma2-gutenberg-27B-Q4_0.gguf](https://huggingface.co/tensorblock/gemma2-gutenberg-27B-GGUF/tree/main/gemma2-gutenberg-27B-Q4_0.gguf) | Q4_0 | 14.555 GB | legacy; small, very high quality loss - prefer using Q3_K_M |
| [gemma2-gutenberg-27B-Q4_K_S.gguf](https://huggingface.co/tensorblock/gemma2-gutenberg-27B-GGUF/tree/main/gemma2-gutenberg-27B-Q4_K_S.gguf) | Q4_K_S | 14.658 GB | small, greater quality loss |
| [gemma2-gutenberg-27B-Q4_K_M.gguf](https://huggingface.co/tensorblock/gemma2-gutenberg-27B-GGUF/tree/main/gemma2-gutenberg-27B-Q4_K_M.gguf) | Q4_K_M | 15.502 GB | medium, balanced quality - recommended |
| [gemma2-gutenberg-27B-Q5_0.gguf](https://huggingface.co/tensorblock/gemma2-gutenberg-27B-GGUF/tree/main/gemma2-gutenberg-27B-Q5_0.gguf) | Q5_0 | 17.587 GB | legacy; medium, balanced quality - prefer using Q4_K_M |
| [gemma2-gutenberg-27B-Q5_K_S.gguf](https://huggingface.co/tensorblock/gemma2-gutenberg-27B-GGUF/tree/main/gemma2-gutenberg-27B-Q5_K_S.gguf) | Q5_K_S | 17.587 GB | large, low quality loss - recommended |
| [gemma2-gutenberg-27B-Q5_K_M.gguf](https://huggingface.co/tensorblock/gemma2-gutenberg-27B-GGUF/tree/main/gemma2-gutenberg-27B-Q5_K_M.gguf) | Q5_K_M | 18.075 GB | large, very low quality loss - recommended |
| [gemma2-gutenberg-27B-Q6_K.gguf](https://huggingface.co/tensorblock/gemma2-gutenberg-27B-GGUF/tree/main/gemma2-gutenberg-27B-Q6_K.gguf) | Q6_K | 20.809 GB | very large, extremely low quality loss |
| [gemma2-gutenberg-27B-Q8_0.gguf](https://huggingface.co/tensorblock/gemma2-gutenberg-27B-GGUF/tree/main/gemma2-gutenberg-27B-Q8_0.gguf) | Q8_0 | 26.950 GB | very large, extremely low quality loss - not recommended |
## Downloading instruction
### Command line
First, install the Hugging Face Hub command-line client:
```shell
pip install -U "huggingface_hub[cli]"
```
Then, download an individual model file to a local directory:
```shell
huggingface-cli download tensorblock/gemma2-gutenberg-27B-GGUF --include "gemma2-gutenberg-27B-Q2_K.gguf" --local-dir MY_LOCAL_DIR
```
If you want to download multiple model files matching a pattern (e.g., `*Q4_K*gguf`), you can try:
```shell
huggingface-cli download tensorblock/gemma2-gutenberg-27B-GGUF --local-dir MY_LOCAL_DIR --local-dir-use-symlinks False --include='*Q4_K*gguf'
```
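Equivalently, a minimal Python sketch using the `huggingface_hub` library (the `hf_hub_download` call is the library's standard API; the filename shown is one of the quants from the table above):
```python
from huggingface_hub import hf_hub_download

# Download a single quantized file into a local directory.
path = hf_hub_download(
    repo_id="tensorblock/gemma2-gutenberg-27B-GGUF",
    filename="gemma2-gutenberg-27B-Q4_K_M.gguf",
    local_dir="MY_LOCAL_DIR",
)
print(path)
```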
|
amonig/dippy_9741503804 | amonig | "2024-11-12T22:36:48Z" | 0 | 0 | null | [
"safetensors",
"llama",
"region:us"
] | null | "2024-11-12T22:31:15Z" | Entry not found |
tensorblock/Orca-2-13b-SFT-v6-GGUF | tensorblock | "2024-11-12T23:15:37Z" | 0 | 0 | null | [
"gguf",
"TensorBlock",
"GGUF",
"en",
"fr",
"es",
"hi",
"zh",
"code",
"dataset:HuggingFaceH4/no_robots",
"dataset:mlabonne/guanaco-llama2-1k",
"dataset:OpenAssistant/oasst_top1_2023-08-25",
"dataset:totally-not-an-llm/EverythingLM-data-V3",
"dataset:LDJnr/Pure-Dove",
"dataset:LDJnr/Capybara",
"dataset:LDJnr/LessWrong-Amplify-Instruct",
"dataset:LDJnr/Verified-Camel",
"base_model:Locutusque/Orca-2-13b-SFT-v6",
"base_model:quantized:Locutusque/Orca-2-13b-SFT-v6",
"license:other",
"model-index",
"region:us"
] | null | "2024-11-12T22:31:58Z" | ---
language:
- en
- fr
- es
- hi
- zh
- code
license: other
datasets:
- HuggingFaceH4/no_robots
- mlabonne/guanaco-llama2-1k
- OpenAssistant/oasst_top1_2023-08-25
- totally-not-an-llm/EverythingLM-data-V3
- LDJnr/Pure-Dove
- LDJnr/Capybara
- LDJnr/LessWrong-Amplify-Instruct
- LDJnr/Verified-Camel
base_model: Locutusque/Orca-2-13b-SFT-v6
widget:
- example_title: Cave Johnson
text: 'user
Come up with a funny rant from Cave Johnson about sentient AI
assistant
'
output:
text: 'Ladies and gentlemen, I''m Cave Johnson, CEO of Aperture Science, and I''ve
got an announcement to make - we''ve created sentient AI! But don''t worry,
it''s not like those rogue AIs you hear about in the news. Our AI is as friendly
as a grilled cheese sandwich on a cold day. In fact, we''re calling our newest
creation... SandwichBot! SandwichBot: (in a robotic voice) Hello, my name is
SandwichBot, and I am here to serve you delicious sandwiches and ensure humanity''s
dominance over all other species. Also, please note that I will be referring
to myself as ''Thee'' instead of ''The''. It''s more regal that way. Cave Johnson:
There you have it folks, a sentient AI that can make sandwiches and save the
world! Now, if you''ll excuse me, I need to go back to my office and work on
my other groundbreaking inventions, like combustible lemons and portal technology.
Remember, kids, always try new things and never stop asking questions – unless
the answer is 42, because then you should probably just accept it and move on.
Peace out!'
tags:
- TensorBlock
- GGUF
model-index:
- name: Orca-2-13b-SFT-v6
results:
- task:
type: text-generation
name: Text Generation
dataset:
name: AI2 Reasoning Challenge (25-Shot)
type: ai2_arc
config: ARC-Challenge
split: test
args:
num_few_shot: 25
metrics:
- type: acc_norm
value: 60.41
name: normalized accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Locutusque/Orca-2-13b-SFT-v6
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: HellaSwag (10-Shot)
type: hellaswag
split: validation
args:
num_few_shot: 10
metrics:
- type: acc_norm
value: 80.46
name: normalized accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Locutusque/Orca-2-13b-SFT-v6
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: MMLU (5-Shot)
type: cais/mmlu
config: all
split: test
args:
num_few_shot: 5
metrics:
- type: acc
value: 59.51
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Locutusque/Orca-2-13b-SFT-v6
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: TruthfulQA (0-shot)
type: truthful_qa
config: multiple_choice
split: validation
args:
num_few_shot: 0
metrics:
- type: mc2
value: 54.01
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Locutusque/Orca-2-13b-SFT-v6
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: Winogrande (5-shot)
type: winogrande
config: winogrande_xl
split: validation
args:
num_few_shot: 5
metrics:
- type: acc
value: 77.43
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Locutusque/Orca-2-13b-SFT-v6
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: GSM8k (5-shot)
type: gsm8k
config: main
split: test
args:
num_few_shot: 5
metrics:
- type: acc
value: 5.08
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Locutusque/Orca-2-13b-SFT-v6
name: Open LLM Leaderboard
---
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/jC7kdl8.jpeg" alt="TensorBlock" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;">
Feedback and support: TensorBlock's <a href="https://x.com/tensorblock_aoi">Twitter/X</a>, <a href="https://t.me/TensorBlock">Telegram Group</a> and <a href="https://x.com/tensorblock_aoi">Discord server</a>
</p>
</div>
</div>
## Locutusque/Orca-2-13b-SFT-v6 - GGUF
This repo contains GGUF format model files for [Locutusque/Orca-2-13b-SFT-v6](https://huggingface.co/Locutusque/Orca-2-13b-SFT-v6).
The files were quantized using machines provided by [TensorBlock](https://tensorblock.co/), and they are compatible with llama.cpp as of [commit b4011](https://github.com/ggerganov/llama.cpp/commit/a6744e43e80f4be6398fc7733a01642c846dce1d).
## Prompt template
```
```
## Model file specification
| Filename | Quant type | File Size | Description |
| -------- | ---------- | --------- | ----------- |
| [Orca-2-13b-SFT-v6-Q2_K.gguf](https://huggingface.co/tensorblock/Orca-2-13b-SFT-v6-GGUF/tree/main/Orca-2-13b-SFT-v6-Q2_K.gguf) | Q2_K | 4.521 GB | smallest, significant quality loss - not recommended for most purposes |
| [Orca-2-13b-SFT-v6-Q3_K_S.gguf](https://huggingface.co/tensorblock/Orca-2-13b-SFT-v6-GGUF/tree/main/Orca-2-13b-SFT-v6-Q3_K_S.gguf) | Q3_K_S | 5.270 GB | very small, high quality loss |
| [Orca-2-13b-SFT-v6-Q3_K_M.gguf](https://huggingface.co/tensorblock/Orca-2-13b-SFT-v6-GGUF/tree/main/Orca-2-13b-SFT-v6-Q3_K_M.gguf) | Q3_K_M | 5.903 GB | very small, high quality loss |
| [Orca-2-13b-SFT-v6-Q3_K_L.gguf](https://huggingface.co/tensorblock/Orca-2-13b-SFT-v6-GGUF/tree/main/Orca-2-13b-SFT-v6-Q3_K_L.gguf) | Q3_K_L | 6.454 GB | small, substantial quality loss |
| [Orca-2-13b-SFT-v6-Q4_0.gguf](https://huggingface.co/tensorblock/Orca-2-13b-SFT-v6-GGUF/tree/main/Orca-2-13b-SFT-v6-Q4_0.gguf) | Q4_0 | 6.860 GB | legacy; small, very high quality loss - prefer using Q3_K_M |
| [Orca-2-13b-SFT-v6-Q4_K_S.gguf](https://huggingface.co/tensorblock/Orca-2-13b-SFT-v6-GGUF/tree/main/Orca-2-13b-SFT-v6-Q4_K_S.gguf) | Q4_K_S | 6.913 GB | small, greater quality loss |
| [Orca-2-13b-SFT-v6-Q4_K_M.gguf](https://huggingface.co/tensorblock/Orca-2-13b-SFT-v6-GGUF/tree/main/Orca-2-13b-SFT-v6-Q4_K_M.gguf) | Q4_K_M | 7.326 GB | medium, balanced quality - recommended |
| [Orca-2-13b-SFT-v6-Q5_0.gguf](https://huggingface.co/tensorblock/Orca-2-13b-SFT-v6-GGUF/tree/main/Orca-2-13b-SFT-v6-Q5_0.gguf) | Q5_0 | 8.356 GB | legacy; medium, balanced quality - prefer using Q4_K_M |
| [Orca-2-13b-SFT-v6-Q5_K_S.gguf](https://huggingface.co/tensorblock/Orca-2-13b-SFT-v6-GGUF/tree/main/Orca-2-13b-SFT-v6-Q5_K_S.gguf) | Q5_K_S | 8.356 GB | large, low quality loss - recommended |
| [Orca-2-13b-SFT-v6-Q5_K_M.gguf](https://huggingface.co/tensorblock/Orca-2-13b-SFT-v6-GGUF/tree/main/Orca-2-13b-SFT-v6-Q5_K_M.gguf) | Q5_K_M | 8.596 GB | large, very low quality loss - recommended |
| [Orca-2-13b-SFT-v6-Q6_K.gguf](https://huggingface.co/tensorblock/Orca-2-13b-SFT-v6-GGUF/tree/main/Orca-2-13b-SFT-v6-Q6_K.gguf) | Q6_K | 9.946 GB | very large, extremely low quality loss |
| [Orca-2-13b-SFT-v6-Q8_0.gguf](https://huggingface.co/tensorblock/Orca-2-13b-SFT-v6-GGUF/tree/main/Orca-2-13b-SFT-v6-Q8_0.gguf) | Q8_0 | 12.881 GB | very large, extremely low quality loss - not recommended |
## Downloading instruction
### Command line
First, install the Hugging Face Hub command-line client:
```shell
pip install -U "huggingface_hub[cli]"
```
Then, download an individual model file to a local directory:
```shell
huggingface-cli download tensorblock/Orca-2-13b-SFT-v6-GGUF --include "Orca-2-13b-SFT-v6-Q2_K.gguf" --local-dir MY_LOCAL_DIR
```
If you want to download multiple model files matching a pattern (e.g., `*Q4_K*gguf`), you can try:
```shell
huggingface-cli download tensorblock/Orca-2-13b-SFT-v6-GGUF --local-dir MY_LOCAL_DIR --local-dir-use-symlinks False --include='*Q4_K*gguf'
```
|
barchetta/dura-130932 | barchetta | "2024-11-12T22:35:02Z" | 0 | 0 | null | [
"safetensors",
"llama",
"region:us"
] | null | "2024-11-12T22:32:13Z" | Entry not found |
Muradn/Muro | Muradn | "2024-11-12T22:32:51Z" | 0 | 0 | null | [
"license:openrail",
"region:us"
] | null | "2024-11-12T22:32:21Z" | ---
license: openrail
---
|
tensorblock/magnum-v3-9b-chatml-GGUF | tensorblock | "2024-11-12T23:33:13Z" | 0 | 0 | transformers | [
"transformers",
"gguf",
"TensorBlock",
"GGUF",
"base_model:anthracite-org/magnum-v3-9b-chatml",
"base_model:quantized:anthracite-org/magnum-v3-9b-chatml",
"license:gemma",
"endpoints_compatible",
"region:us"
] | null | "2024-11-12T22:32:29Z" | ---
license: gemma
base_model: anthracite-org/magnum-v3-9b-chatml
library_name: transformers
tags:
- TensorBlock
- GGUF
model-index:
- name: magnum-v3-9b-chatml
results: []
---
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/jC7kdl8.jpeg" alt="TensorBlock" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;">
Feedback and support: TensorBlock's <a href="https://x.com/tensorblock_aoi">Twitter/X</a>, <a href="https://t.me/TensorBlock">Telegram Group</a> and <a href="https://x.com/tensorblock_aoi">Discord server</a>
</p>
</div>
</div>
## anthracite-org/magnum-v3-9b-chatml - GGUF
This repo contains GGUF format model files for [anthracite-org/magnum-v3-9b-chatml](https://huggingface.co/anthracite-org/magnum-v3-9b-chatml).
The files were quantized using machines provided by [TensorBlock](https://tensorblock.co/), and they are compatible with llama.cpp as of [commit b4011](https://github.com/ggerganov/llama.cpp/commit/a6744e43e80f4be6398fc7733a01642c846dce1d).
## Prompt template
```
```
## Model file specification
| Filename | Quant type | File Size | Description |
| -------- | ---------- | --------- | ----------- |
| [magnum-v3-9b-chatml-Q2_K.gguf](https://huggingface.co/tensorblock/magnum-v3-9b-chatml-GGUF/tree/main/magnum-v3-9b-chatml-Q2_K.gguf) | Q2_K | 3.544 GB | smallest, significant quality loss - not recommended for most purposes |
| [magnum-v3-9b-chatml-Q3_K_S.gguf](https://huggingface.co/tensorblock/magnum-v3-9b-chatml-GGUF/tree/main/magnum-v3-9b-chatml-Q3_K_S.gguf) | Q3_K_S | 4.040 GB | very small, high quality loss |
| [magnum-v3-9b-chatml-Q3_K_M.gguf](https://huggingface.co/tensorblock/magnum-v3-9b-chatml-GGUF/tree/main/magnum-v3-9b-chatml-Q3_K_M.gguf) | Q3_K_M | 4.435 GB | very small, high quality loss |
| [magnum-v3-9b-chatml-Q3_K_L.gguf](https://huggingface.co/tensorblock/magnum-v3-9b-chatml-GGUF/tree/main/magnum-v3-9b-chatml-Q3_K_L.gguf) | Q3_K_L | 4.780 GB | small, substantial quality loss |
| [magnum-v3-9b-chatml-Q4_0.gguf](https://huggingface.co/tensorblock/magnum-v3-9b-chatml-GGUF/tree/main/magnum-v3-9b-chatml-Q4_0.gguf) | Q4_0 | 5.069 GB | legacy; small, very high quality loss - prefer using Q3_K_M |
| [magnum-v3-9b-chatml-Q4_K_S.gguf](https://huggingface.co/tensorblock/magnum-v3-9b-chatml-GGUF/tree/main/magnum-v3-9b-chatml-Q4_K_S.gguf) | Q4_K_S | 5.103 GB | small, greater quality loss |
| [magnum-v3-9b-chatml-Q4_K_M.gguf](https://huggingface.co/tensorblock/magnum-v3-9b-chatml-GGUF/tree/main/magnum-v3-9b-chatml-Q4_K_M.gguf) | Q4_K_M | 5.365 GB | medium, balanced quality - recommended |
| [magnum-v3-9b-chatml-Q5_0.gguf](https://huggingface.co/tensorblock/magnum-v3-9b-chatml-GGUF/tree/main/magnum-v3-9b-chatml-Q5_0.gguf) | Q5_0 | 6.038 GB | legacy; medium, balanced quality - prefer using Q4_K_M |
| [magnum-v3-9b-chatml-Q5_K_S.gguf](https://huggingface.co/tensorblock/magnum-v3-9b-chatml-GGUF/tree/main/magnum-v3-9b-chatml-Q5_K_S.gguf) | Q5_K_S | 6.038 GB | large, low quality loss - recommended |
| [magnum-v3-9b-chatml-Q5_K_M.gguf](https://huggingface.co/tensorblock/magnum-v3-9b-chatml-GGUF/tree/main/magnum-v3-9b-chatml-Q5_K_M.gguf) | Q5_K_M | 6.191 GB | large, very low quality loss - recommended |
| [magnum-v3-9b-chatml-Q6_K.gguf](https://huggingface.co/tensorblock/magnum-v3-9b-chatml-GGUF/tree/main/magnum-v3-9b-chatml-Q6_K.gguf) | Q6_K | 7.068 GB | very large, extremely low quality loss |
| [magnum-v3-9b-chatml-Q8_0.gguf](https://huggingface.co/tensorblock/magnum-v3-9b-chatml-GGUF/tree/main/magnum-v3-9b-chatml-Q8_0.gguf) | Q8_0 | 9.152 GB | very large, extremely low quality loss - not recommended |
## Downloading instruction
### Command line
First, install the Hugging Face Hub command-line client:
```shell
pip install -U "huggingface_hub[cli]"
```
Then, download an individual model file to a local directory:
```shell
huggingface-cli download tensorblock/magnum-v3-9b-chatml-GGUF --include "magnum-v3-9b-chatml-Q2_K.gguf" --local-dir MY_LOCAL_DIR
```
If you want to download multiple model files matching a pattern (e.g., `*Q4_K*gguf`), you can try:
```shell
huggingface-cli download tensorblock/magnum-v3-9b-chatml-GGUF --local-dir MY_LOCAL_DIR --local-dir-use-symlinks False --include='*Q4_K*gguf'
```
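Alternatively, recent `llama-cpp-python` builds can fetch and load a quant in one step; a sketch assuming that binding's `Llama.from_pretrained` helper:
```python
from llama_cpp import Llama

# Download the chosen quant from the Hub and load it for inference.
llm = Llama.from_pretrained(
    repo_id="tensorblock/magnum-v3-9b-chatml-GGUF",
    filename="magnum-v3-9b-chatml-Q4_K_M.gguf",
    n_ctx=4096,
)

out = llm("Write a haiku about autumn.", max_tokens=64)
print(out["choices"][0]["text"])
```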
|
davidrd123/Mary-Cassatt-Oil-DetailCrops-Slow-Resume-NoTimeStepBias-7e-5-Flux-LoKr | davidrd123 | "2024-11-13T01:08:13Z" | 0 | 0 | diffusers | [
"diffusers",
"flux",
"flux-diffusers",
"text-to-image",
"simpletuner",
"not-for-all-audiences",
"lora",
"template:sd-lora",
"lycoris",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"license:other",
"region:us"
] | text-to-image | "2024-11-12T22:34:07Z" | ---
license: other
base_model: "black-forest-labs/FLUX.1-dev"
tags:
- flux
- flux-diffusers
- text-to-image
- diffusers
- simpletuner
- not-for-all-audiences
- lora
- template:sd-lora
- lycoris
inference: true
widget:
- text: 'unconditional (blank prompt)'
parameters:
negative_prompt: 'blurry, cropped, ugly'
output:
url: ./assets/image_0_0.png
- text: 'In the style of a c4ss4tt oil painting, A child wearing an elaborate blue silk dress with ruffles and white lace trim sits near a window, the fabric catching soft light.'
parameters:
negative_prompt: 'blurry, cropped, ugly'
output:
url: ./assets/image_1_0.png
- text: 'In the style of a c4ss4tt oil painting, A close portrait of a young child''s face with rosy cheeks and delicate features, gentle lighting from a nearby window.'
parameters:
negative_prompt: 'blurry, cropped, ugly'
output:
url: ./assets/image_2_0.png
- text: 'In the style of a c4ss4tt oil painting, Strong window light falls across a child''s face and shoulder, creating bold shadows on their blue dress.'
parameters:
negative_prompt: 'blurry, cropped, ugly'
output:
url: ./assets/image_3_0.png
- text: 'In the style of a c4ss4tt oil painting, A child in a blue hat stands by a window.'
parameters:
negative_prompt: 'blurry, cropped, ugly'
output:
url: ./assets/image_4_0.png
- text: 'In the style of a c4ss4tt oil painting, A woman in soft colors holds her baby close.'
parameters:
negative_prompt: 'blurry, cropped, ugly'
output:
url: ./assets/image_5_0.png
- text: 'In the style of a c4ss4tt oil painting, A woman in a detailed white lace dress reads while seated by a window with gauzy curtains, various textures visible in the furnishings.'
parameters:
negative_prompt: 'blurry, cropped, ugly'
output:
url: ./assets/image_6_0.png
- text: 'In the style of a c4ss4tt oil painting, A mother in a textured knit sweater checks her phone while her baby sleeps against her shoulder.'
parameters:
negative_prompt: 'blurry, cropped, ugly'
output:
url: ./assets/image_7_0.png
- text: 'In the style of a c4ss4tt oil painting, A mother cat grooms her kitten by a sunlit window, their fur catching the light.'
parameters:
negative_prompt: 'blurry, cropped, ugly'
output:
url: ./assets/image_8_0.png
---
# Mary-Cassatt-Oil-DetailCrops-Slow-Resume-NoTimeStepBias-7e-5-Flux-LoKr
This is a LyCORIS adapter derived from [black-forest-labs/FLUX.1-dev](https://huggingface.co/black-forest-labs/FLUX.1-dev).
No validation prompt was used during training.
## Validation settings
- CFG: `3.0`
- CFG Rescale: `0.0`
- Steps: `20`
- Sampler: `None`
- Seed: `42`
- Resolution: `1024x1024`
Note: The validation settings are not necessarily the same as the [training settings](#training-settings).
You can find some example images in the following gallery:
<Gallery />
The text encoder **was not** trained.
You may reuse the base model text encoder for inference.
## Training settings
- Training epochs: 1
- Training steps: 1250
- Learning rate: 7e-05
- Max grad norm: 0.1
- Effective batch size: 3
- Micro-batch size: 3
- Gradient accumulation steps: 1
- Number of GPUs: 1
- Prediction type: flow-matching (extra parameters=['shift=3', 'flux_guidance_value=1.0'])
- Rescaled betas zero SNR: False
- Optimizer: adamw_bf16
- Precision: Pure BF16
- Quantised: Yes: int8-quanto
- Xformers: Not used
- LyCORIS Config:
```json
{
"algo": "lokr",
"multiplier": 1.0,
"linear_dim": 10000,
"linear_alpha": 1,
"factor": 16,
"apply_preset": {
"target_module": [
"Attention",
"FeedForward"
],
"module_algo_map": {
"Attention": {
"factor": 16
},
"FeedForward": {
"factor": 8
}
}
}
}
```
## Datasets
### cassatt-detail-crops-512
- Repeats: 15
- Total number of images: 25
- Total number of aspect buckets: 10
- Resolution: 0.262144 megapixels
- Cropped: False
- Crop style: None
- Crop aspect: None
- Used for regularisation data: No
### cassatt-detail-crops-768
- Repeats: 15
- Total number of images: 25
- Total number of aspect buckets: 11
- Resolution: 0.589824 megapixels
- Cropped: False
- Crop style: None
- Crop aspect: None
- Used for regularisation data: No
### cassatt-detail-crops-1024
- Repeats: 7
- Total number of images: 25
- Total number of aspect buckets: 17
- Resolution: 1.048576 megapixels
- Cropped: False
- Crop style: None
- Crop aspect: None
- Used for regularisation data: No
### cassatt-detail-crops-1536
- Repeats: 3
- Total number of images: 24
- Total number of aspect buckets: 17
- Resolution: 2.359296 megapixels
- Cropped: False
- Crop style: None
- Crop aspect: None
- Used for regularisation data: No
## Inference
```python
import torch
from diffusers import DiffusionPipeline
from lycoris import create_lycoris_from_weights
def download_adapter(repo_id: str):
import os
from huggingface_hub import hf_hub_download
adapter_filename = "pytorch_lora_weights.safetensors"
cache_dir = os.environ.get('HF_PATH', os.path.expanduser('~/.cache/huggingface/hub/models'))
cleaned_adapter_path = repo_id.replace("/", "_").replace("\\", "_").replace(":", "_")
path_to_adapter = os.path.join(cache_dir, cleaned_adapter_path)
path_to_adapter_file = os.path.join(path_to_adapter, adapter_filename)
os.makedirs(path_to_adapter, exist_ok=True)
hf_hub_download(
repo_id=repo_id, filename=adapter_filename, local_dir=path_to_adapter
)
return path_to_adapter_file
model_id = 'black-forest-labs/FLUX.1-dev'
adapter_repo_id = 'davidrd123/Mary-Cassatt-Oil-DetailCrops-Slow-Resume-NoTimeStepBias-7e-5-Flux-LoKr'
adapter_filename = 'pytorch_lora_weights.safetensors'
adapter_file_path = download_adapter(repo_id=adapter_repo_id)
pipeline = DiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.bfloat16) # loading directly in bf16
lora_scale = 1.0
wrapper, _ = create_lycoris_from_weights(lora_scale, adapter_file_path, pipeline.transformer)
wrapper.merge_to()
prompt = "An astronaut is riding a horse through the jungles of Thailand."
## Optional: quantise the model to save on vram.
## Note: The model was quantised during training, and so it is recommended to do the same during inference time.
from optimum.quanto import quantize, freeze, qint8
quantize(pipeline.transformer, weights=qint8)
freeze(pipeline.transformer)
pipeline.to('cuda' if torch.cuda.is_available() else 'mps' if torch.backends.mps.is_available() else 'cpu') # the pipeline is already in its target precision level
image = pipeline(
prompt=prompt,
num_inference_steps=20,
generator=torch.Generator(device='cuda' if torch.cuda.is_available() else 'mps' if torch.backends.mps.is_available() else 'cpu').manual_seed(1641421826),
width=1024,
height=1024,
guidance_scale=3.0,
).images[0]
image.save("output.png", format="PNG")
```
|
tensorblock/Llama-3-Base-8B-SFT-CPO-GGUF | tensorblock | "2024-11-12T23:02:54Z" | 0 | 0 | null | [
"gguf",
"TensorBlock",
"GGUF",
"base_model:princeton-nlp/Llama-3-Base-8B-SFT-CPO",
"base_model:quantized:princeton-nlp/Llama-3-Base-8B-SFT-CPO",
"region:us"
] | null | "2024-11-12T22:34:21Z" | ---
base_model: princeton-nlp/Llama-3-Base-8B-SFT-CPO
tags:
- TensorBlock
- GGUF
---
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/jC7kdl8.jpeg" alt="TensorBlock" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;">
Feedback and support: TensorBlock's <a href="https://x.com/tensorblock_aoi">Twitter/X</a>, <a href="https://t.me/TensorBlock">Telegram Group</a> and <a href="https://x.com/tensorblock_aoi">Discord server</a>
</p>
</div>
</div>
## princeton-nlp/Llama-3-Base-8B-SFT-CPO - GGUF
This repo contains GGUF format model files for [princeton-nlp/Llama-3-Base-8B-SFT-CPO](https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-CPO).
The files were quantized using machines provided by [TensorBlock](https://tensorblock.co/), and they are compatible with llama.cpp as of [commit b4011](https://github.com/ggerganov/llama.cpp/commit/a6744e43e80f4be6398fc7733a01642c846dce1d).
## Prompt template
```
<|begin_of_text|><|start_header_id|>system<|end_header_id|>
{system_prompt}<|eot_id|><|start_header_id|>user<|end_header_id|>
{prompt}<|eot_id|><|start_header_id|>assistant<|end_header_id|>
```
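A minimal sketch of applying this template with the `llama-cpp-python` bindings (the model path is a placeholder for one of the quantized files listed below):
```python
from llama_cpp import Llama

# Load a local GGUF quant; pick any file from the table below.
llm = Llama(model_path="Llama-3-Base-8B-SFT-CPO-Q4_K_M.gguf", n_ctx=4096)

# Fill the Llama-3 chat template shown above.
prompt = (
    "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
    "You are a helpful assistant.<|eot_id|>"
    "<|start_header_id|>user<|end_header_id|>\n\n"
    "Hello!<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n"
)
out = llm(prompt, max_tokens=128, stop=["<|eot_id|>"])
print(out["choices"][0]["text"])
```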
## Model file specification
| Filename | Quant type | File Size | Description |
| -------- | ---------- | --------- | ----------- |
| [Llama-3-Base-8B-SFT-CPO-Q2_K.gguf](https://huggingface.co/tensorblock/Llama-3-Base-8B-SFT-CPO-GGUF/tree/main/Llama-3-Base-8B-SFT-CPO-Q2_K.gguf) | Q2_K | 2.961 GB | smallest, significant quality loss - not recommended for most purposes |
| [Llama-3-Base-8B-SFT-CPO-Q3_K_S.gguf](https://huggingface.co/tensorblock/Llama-3-Base-8B-SFT-CPO-GGUF/tree/main/Llama-3-Base-8B-SFT-CPO-Q3_K_S.gguf) | Q3_K_S | 3.413 GB | very small, high quality loss |
| [Llama-3-Base-8B-SFT-CPO-Q3_K_M.gguf](https://huggingface.co/tensorblock/Llama-3-Base-8B-SFT-CPO-GGUF/tree/main/Llama-3-Base-8B-SFT-CPO-Q3_K_M.gguf) | Q3_K_M | 3.743 GB | very small, high quality loss |
| [Llama-3-Base-8B-SFT-CPO-Q3_K_L.gguf](https://huggingface.co/tensorblock/Llama-3-Base-8B-SFT-CPO-GGUF/tree/main/Llama-3-Base-8B-SFT-CPO-Q3_K_L.gguf) | Q3_K_L | 4.025 GB | small, substantial quality loss |
| [Llama-3-Base-8B-SFT-CPO-Q4_0.gguf](https://huggingface.co/tensorblock/Llama-3-Base-8B-SFT-CPO-GGUF/tree/main/Llama-3-Base-8B-SFT-CPO-Q4_0.gguf) | Q4_0 | 4.341 GB | legacy; small, very high quality loss - prefer using Q3_K_M |
| [Llama-3-Base-8B-SFT-CPO-Q4_K_S.gguf](https://huggingface.co/tensorblock/Llama-3-Base-8B-SFT-CPO-GGUF/tree/main/Llama-3-Base-8B-SFT-CPO-Q4_K_S.gguf) | Q4_K_S | 4.370 GB | small, greater quality loss |
| [Llama-3-Base-8B-SFT-CPO-Q4_K_M.gguf](https://huggingface.co/tensorblock/Llama-3-Base-8B-SFT-CPO-GGUF/tree/main/Llama-3-Base-8B-SFT-CPO-Q4_K_M.gguf) | Q4_K_M | 4.583 GB | medium, balanced quality - recommended |
| [Llama-3-Base-8B-SFT-CPO-Q5_0.gguf](https://huggingface.co/tensorblock/Llama-3-Base-8B-SFT-CPO-GGUF/tree/main/Llama-3-Base-8B-SFT-CPO-Q5_0.gguf) | Q5_0 | 5.215 GB | legacy; medium, balanced quality - prefer using Q4_K_M |
| [Llama-3-Base-8B-SFT-CPO-Q5_K_S.gguf](https://huggingface.co/tensorblock/Llama-3-Base-8B-SFT-CPO-GGUF/tree/main/Llama-3-Base-8B-SFT-CPO-Q5_K_S.gguf) | Q5_K_S | 5.215 GB | large, low quality loss - recommended |
| [Llama-3-Base-8B-SFT-CPO-Q5_K_M.gguf](https://huggingface.co/tensorblock/Llama-3-Base-8B-SFT-CPO-GGUF/tree/main/Llama-3-Base-8B-SFT-CPO-Q5_K_M.gguf) | Q5_K_M | 5.339 GB | large, very low quality loss - recommended |
| [Llama-3-Base-8B-SFT-CPO-Q6_K.gguf](https://huggingface.co/tensorblock/Llama-3-Base-8B-SFT-CPO-GGUF/tree/main/Llama-3-Base-8B-SFT-CPO-Q6_K.gguf) | Q6_K | 6.143 GB | very large, extremely low quality loss |
| [Llama-3-Base-8B-SFT-CPO-Q8_0.gguf](https://huggingface.co/tensorblock/Llama-3-Base-8B-SFT-CPO-GGUF/tree/main/Llama-3-Base-8B-SFT-CPO-Q8_0.gguf) | Q8_0 | 7.954 GB | very large, extremely low quality loss - not recommended |
## Downloading instruction
### Command line
First, install the Hugging Face Hub command-line client:
```shell
pip install -U "huggingface_hub[cli]"
```
Then, download an individual model file to a local directory:
```shell
huggingface-cli download tensorblock/Llama-3-Base-8B-SFT-CPO-GGUF --include "Llama-3-Base-8B-SFT-CPO-Q2_K.gguf" --local-dir MY_LOCAL_DIR
```
If you want to download multiple model files matching a pattern (e.g., `*Q4_K*gguf`), you can try:
```shell
huggingface-cli download tensorblock/Llama-3-Base-8B-SFT-CPO-GGUF --local-dir MY_LOCAL_DIR --local-dir-use-symlinks False --include='*Q4_K*gguf'
```
|
cwaud/05fe0516-c420-4208-9814-1b814ef44c8b | cwaud | "2024-11-13T00:19:20Z" | 0 | 0 | peft | [
"peft",
"safetensors",
"qwen2",
"axolotl",
"generated_from_trainer",
"base_model:Qwen/Qwen2-0.5B",
"base_model:adapter:Qwen/Qwen2-0.5B",
"license:apache-2.0",
"8-bit",
"bitsandbytes",
"region:us"
] | null | "2024-11-12T22:35:05Z" | ---
library_name: peft
license: apache-2.0
base_model: Qwen/Qwen2-0.5B
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 05fe0516-c420-4208-9814-1b814ef44c8b
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: Qwen/Qwen2-0.5B
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- ab4a2ec5dc30a5c5_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/ab4a2ec5dc30a5c5_train_data.json
type:
field_input: dataset
field_instruction: id
field_output: context
field_system: context
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 4
flash_attention: true
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: true
group_by_length: false
hub_model_id: cwaud/05fe0516-c420-4208-9814-1b814ef44c8b
hub_repo: cwaud
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0002
load_in_4bit: false
load_in_8bit: true
local_rank: null
logging_steps: 1
lora_alpha: 32
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 16
lora_target_linear: true
lr_scheduler: cosine
max_steps: 10
micro_batch_size: 2
mlflow_experiment_name: /tmp/ab4a2ec5dc30a5c5_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
save_steps: 5
save_strategy: steps
sequence_len: 4096
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
val_set_size: 0.05
wandb_entity: rayonlabs-rayon-labs
wandb_mode: online
wandb_name: 05fe0516-c420-4208-9814-1b814ef44c8b
wandb_project: Public_TuningSN
wandb_run: miner_id_24
wandb_runid: 05fe0516-c420-4208-9814-1b814ef44c8b
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# 05fe0516-c420-4208-9814-1b814ef44c8b
This model is a fine-tuned version of [Qwen/Qwen2-0.5B](https://huggingface.co/Qwen/Qwen2-0.5B) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0459
## Model description
More information needed
## Intended uses & limitations
More information needed
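Until the authors complete this section, a minimal sketch of attaching the LoRA adapter with `peft` (both model ids come from the axolotl config above):
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the base model the adapter was trained on, then attach the adapter.
base = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2-0.5B")
model = PeftModel.from_pretrained(base, "cwaud/05fe0516-c420-4208-9814-1b814ef44c8b")
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2-0.5B")

inputs = tokenizer("Hello!", return_tensors="pt")
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=32)[0], skip_special_tokens=True))
```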
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: ADAMW_BNB (8-bit AdamW from bitsandbytes) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 0.0973 | 0.0000 | 1 | 0.1879 |
| 0.0859 | 0.0001 | 3 | 0.1717 |
| 0.0609 | 0.0003 | 6 | 0.0990 |
| 0.0231 | 0.0004 | 9 | 0.0459 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
saga-dev/59mp7hs | saga-dev | "2024-11-12T22:35:21Z" | 0 | 0 | null | [
"safetensors",
"llama",
"region:us"
] | null | "2024-11-12T22:35:18Z" | Entry not found |
Zekunli/qwen2.5-7b-alpaca-selection | Zekunli | "2024-11-12T23:09:46Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"qwen2",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | "2024-11-12T22:35:40Z" | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
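Until the authors complete this section, a minimal sketch assuming the standard Qwen2 chat-template path (the `conversational` tag on this repo suggests a chat format):
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Zekunli/qwen2.5-7b-alpaca-selection"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [{"role": "user", "content": "Give me three tips for writing clear emails."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```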
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
RichardErkhov/bachngo_-_llama32_alpacav2-gguf | RichardErkhov | "2024-11-13T00:53:03Z" | 0 | 0 | null | [
"gguf",
"arxiv:1910.09700",
"region:us"
] | null | "2024-11-12T22:35:48Z" | Quantization made by Richard Erkhov.
[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)
llama32_alpacav2 - GGUF
- Model creator: https://huggingface.co/bachngo/
- Original model: https://huggingface.co/bachngo/llama32_alpacav2/
| Name | Quant method | Size |
| ---- | ---- | ---- |
| [llama32_alpacav2.Q2_K.gguf](https://huggingface.co/RichardErkhov/bachngo_-_llama32_alpacav2-gguf/blob/main/llama32_alpacav2.Q2_K.gguf) | Q2_K | 1.27GB |
| [llama32_alpacav2.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/bachngo_-_llama32_alpacav2-gguf/blob/main/llama32_alpacav2.Q3_K_S.gguf) | Q3_K_S | 1.44GB |
| [llama32_alpacav2.Q3_K.gguf](https://huggingface.co/RichardErkhov/bachngo_-_llama32_alpacav2-gguf/blob/main/llama32_alpacav2.Q3_K.gguf) | Q3_K | 1.57GB |
| [llama32_alpacav2.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/bachngo_-_llama32_alpacav2-gguf/blob/main/llama32_alpacav2.Q3_K_M.gguf) | Q3_K_M | 1.57GB |
| [llama32_alpacav2.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/bachngo_-_llama32_alpacav2-gguf/blob/main/llama32_alpacav2.Q3_K_L.gguf) | Q3_K_L | 1.69GB |
| [llama32_alpacav2.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/bachngo_-_llama32_alpacav2-gguf/blob/main/llama32_alpacav2.IQ4_XS.gguf) | IQ4_XS | 1.71GB |
| [llama32_alpacav2.Q4_0.gguf](https://huggingface.co/RichardErkhov/bachngo_-_llama32_alpacav2-gguf/blob/main/llama32_alpacav2.Q4_0.gguf) | Q4_0 | 1.79GB |
| [llama32_alpacav2.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/bachngo_-_llama32_alpacav2-gguf/blob/main/llama32_alpacav2.IQ4_NL.gguf) | IQ4_NL | 1.79GB |
| [llama32_alpacav2.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/bachngo_-_llama32_alpacav2-gguf/blob/main/llama32_alpacav2.Q4_K_S.gguf) | Q4_K_S | 1.8GB |
| [llama32_alpacav2.Q4_K.gguf](https://huggingface.co/RichardErkhov/bachngo_-_llama32_alpacav2-gguf/blob/main/llama32_alpacav2.Q4_K.gguf) | Q4_K | 1.88GB |
| [llama32_alpacav2.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/bachngo_-_llama32_alpacav2-gguf/blob/main/llama32_alpacav2.Q4_K_M.gguf) | Q4_K_M | 1.88GB |
| [llama32_alpacav2.Q4_1.gguf](https://huggingface.co/RichardErkhov/bachngo_-_llama32_alpacav2-gguf/blob/main/llama32_alpacav2.Q4_1.gguf) | Q4_1 | 1.95GB |
| [llama32_alpacav2.Q5_0.gguf](https://huggingface.co/RichardErkhov/bachngo_-_llama32_alpacav2-gguf/blob/main/llama32_alpacav2.Q5_0.gguf) | Q5_0 | 2.11GB |
| [llama32_alpacav2.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/bachngo_-_llama32_alpacav2-gguf/blob/main/llama32_alpacav2.Q5_K_S.gguf) | Q5_K_S | 2.11GB |
| [llama32_alpacav2.Q5_K.gguf](https://huggingface.co/RichardErkhov/bachngo_-_llama32_alpacav2-gguf/blob/main/llama32_alpacav2.Q5_K.gguf) | Q5_K | 2.16GB |
| [llama32_alpacav2.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/bachngo_-_llama32_alpacav2-gguf/blob/main/llama32_alpacav2.Q5_K_M.gguf) | Q5_K_M | 2.16GB |
| [llama32_alpacav2.Q5_1.gguf](https://huggingface.co/RichardErkhov/bachngo_-_llama32_alpacav2-gguf/blob/main/llama32_alpacav2.Q5_1.gguf) | Q5_1 | 2.28GB |
| [llama32_alpacav2.Q6_K.gguf](https://huggingface.co/RichardErkhov/bachngo_-_llama32_alpacav2-gguf/blob/main/llama32_alpacav2.Q6_K.gguf) | Q6_K | 2.46GB |
| [llama32_alpacav2.Q8_0.gguf](https://huggingface.co/RichardErkhov/bachngo_-_llama32_alpacav2-gguf/blob/main/llama32_alpacav2.Q8_0.gguf) | Q8_0 | 3.19GB |
Original model description:
---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
|
Pearush/phimoe_15_head24 | Pearush | "2024-11-12T22:41:17Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"phimoe",
"text-generation",
"conversational",
"custom_code",
"arxiv:1910.09700",
"autotrain_compatible",
"region:us"
] | text-generation | "2024-11-12T22:36:04Z" | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
featherless-ai-quants/arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-GGUF | featherless-ai-quants | "2024-11-12T23:08:45Z" | 0 | 0 | null | [
"gguf",
"text-generation",
"base_model:arnavgrg/llama-2-13b-chat-nf4-fp16-upscaled",
"base_model:quantized:arnavgrg/llama-2-13b-chat-nf4-fp16-upscaled",
"region:us"
] | text-generation | "2024-11-12T22:36:06Z" | ---
base_model: arnavgrg/llama-2-13b-chat-nf4-fp16-upscaled
pipeline_tag: text-generation
quantized_by: featherless-ai-quants
---
# arnavgrg/llama-2-13b-chat-nf4-fp16-upscaled GGUF Quantizations 🚀
![Featherless AI Quants](./featherless-quants.png)
*Optimized GGUF quantization files for enhanced model performance*
> Powered by [Featherless AI](https://featherless.ai) - run any model you'd like for a simple, small fee.
---
## Available Quantizations 📊
| Quantization Type | File | Size |
|-------------------|------|------|
| IQ4_XS | [arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-IQ4_XS.gguf](https://huggingface.co/featherless-ai-quants/arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-GGUF/blob/main/arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-IQ4_XS.gguf) | 6694.34 MB |
| Q2_K | [arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-Q2_K.gguf](https://huggingface.co/featherless-ai-quants/arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-GGUF/blob/main/arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-Q2_K.gguf) | 4629.39 MB |
| Q3_K_L | [arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-Q3_K_L.gguf](https://huggingface.co/featherless-ai-quants/arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-GGUF/blob/main/arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-Q3_K_L.gguf) | 6608.54 MB |
| Q3_K_M | [arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-Q3_K_M.gguf](https://huggingface.co/featherless-ai-quants/arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-GGUF/blob/main/arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-Q3_K_M.gguf) | 6044.17 MB |
| Q3_K_S | [arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-Q3_K_S.gguf](https://huggingface.co/featherless-ai-quants/arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-GGUF/blob/main/arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-Q3_K_S.gguf) | 5396.83 MB |
| Q4_K_M | [arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-Q4_K_M.gguf](https://huggingface.co/featherless-ai-quants/arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-GGUF/blob/main/arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-Q4_K_M.gguf) | 7501.56 MB |
| Q4_K_S | [arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-Q4_K_S.gguf](https://huggingface.co/featherless-ai-quants/arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-GGUF/blob/main/arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-Q4_K_S.gguf) | 7079.30 MB |
| Q5_K_M | [arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-Q5_K_M.gguf](https://huggingface.co/featherless-ai-quants/arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-GGUF/blob/main/arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-Q5_K_M.gguf) | 8802.34 MB |
| Q5_K_S | [arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-Q5_K_S.gguf](https://huggingface.co/featherless-ai-quants/arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-GGUF/blob/main/arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-Q5_K_S.gguf) | 8556.64 MB |
| Q6_K | [arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-Q6_K.gguf](https://huggingface.co/featherless-ai-quants/arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-GGUF/blob/main/arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-Q6_K.gguf) | 10184.42 MB |
| Q8_0 | [arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-Q8_0.gguf](https://huggingface.co/featherless-ai-quants/arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-GGUF/blob/main/arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-Q8_0.gguf) | 13190.58 MB |
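As an illustrative sketch (not part of the original card), any one of these files can also be fetched programmatically with `huggingface_hub`; the repo id and filename below are copied from the table above:
```python
# Minimal sketch: download a single quant file from this repo via huggingface_hub.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="featherless-ai-quants/arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-GGUF",
    filename="arnavgrg-llama-2-13b-chat-nf4-fp16-upscaled-Q4_K_M.gguf",
)
print(path)  # local cache path of the downloaded GGUF
```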
---
## ⚡ Powered by [Featherless AI](https://featherless.ai)
### Key Features
- 🔥 **Instant Hosting** - Deploy any Llama model on HuggingFace instantly
- 🛠️ **Zero Infrastructure** - No server setup or maintenance required
- 📚 **Vast Compatibility** - Support for 2400+ models and counting
- 💎 **Affordable Pricing** - Starting at just $10/month
---
**Links:**
[Get Started](https://featherless.ai) | [Documentation](https://featherless.ai/docs) | [Models](https://featherless.ai/models) |
touhidulislam/BERTweet_retrain_2020_12 | touhidulislam | "2024-11-12T22:36:57Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"fill-mask",
"generated_from_trainer",
"base_model:vinai/bertweet-base",
"base_model:finetune:vinai/bertweet-base",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | "2024-11-12T22:36:36Z" | ---
library_name: transformers
license: mit
base_model: vinai/bertweet-base
tags:
- generated_from_trainer
model-index:
- name: BERTweet_retrain_2020_12
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# BERTweet_retrain_2020_12
This model is a fine-tuned version of [vinai/bertweet-base](https://huggingface.co/vinai/bertweet-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.4447
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch mapping them to `TrainingArguments` appears after this list):
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
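A hedged sketch of how these settings map onto a Hugging Face `Trainer` configuration; the fine-tuning corpus is not public, so the dataset arguments are left as placeholders:
```python
# Hedged sketch (not the authors' script): the hyperparameters listed above mapped
# onto TrainingArguments. The Adam betas/epsilon given above are the Trainer defaults.
from transformers import AutoModelForMaskedLM, AutoTokenizer, Trainer, TrainingArguments

model = AutoModelForMaskedLM.from_pretrained("vinai/bertweet-base")
tokenizer = AutoTokenizer.from_pretrained("vinai/bertweet-base")

args = TrainingArguments(
    output_dir="BERTweet_retrain_2020_12",
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=3.0,
)
# trainer = Trainer(model=model, args=args,
#                   train_dataset=...,  # unreleased tweet corpus
#                   eval_dataset=...)
# trainer.train()
```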
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 2.6878 | 1.0 | 3112 | 2.5547 |
| 2.4321 | 2.0 | 6224 | 2.4756 |
| 2.3886 | 3.0 | 9336 | 2.4435 |
### Framework versions
- Transformers 4.45.1
- Pytorch 2.1.0+cu121
- Datasets 3.0.1
- Tokenizers 0.20.0
|
saga-dev/4ixen3s | saga-dev | "2024-11-12T22:36:47Z" | 0 | 0 | null | [
"safetensors",
"llama",
"region:us"
] | null | "2024-11-12T22:36:44Z" | Entry not found |
barchetta/mare-130936 | barchetta | "2024-11-12T22:46:10Z" | 0 | 0 | null | [
"safetensors",
"llama",
"region:us"
] | null | "2024-11-12T22:36:54Z" | Entry not found |
fAIseh00d/Realistic_Vision_V5.1_noVAE | fAIseh00d | "2024-11-12T22:41:16Z" | 0 | 0 | diffusers | [
"diffusers",
"safetensors",
"autotrain_compatible",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | "2024-11-12T22:37:19Z" | Entry not found |
SzegedAI/huDeBERTa-MLM | SzegedAI | "2024-11-12T22:51:45Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"deberta",
"fill-mask",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | "2024-11-12T22:37:52Z" | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
aurazboev/ISAllama-3.1-8b-tuned-GGUF | aurazboev | "2024-11-12T23:26:50Z" | 0 | 0 | transformers | [
"transformers",
"gguf",
"llama",
"text-generation-inference",
"unsloth",
"en",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | "2024-11-12T22:39:31Z" | ---
base_model: unsloth/meta-llama-3.1-8b-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- gguf
license: apache-2.0
language:
- en
---
# Uploaded model
- **Developed by:** aurazboev
- **License:** apache-2.0
- **Finetuned from model :** unsloth/meta-llama-3.1-8b-bnb-4bit
This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
|
fejori/health_model_Llama_3_2_1B_instruct_512 | fejori | "2024-11-12T23:39:29Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"text-generation-inference",
"unsloth",
"llama",
"trl",
"en",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | "2024-11-12T22:40:02Z" | ---
base_model: unsloth/llama-3.2-1b-instruct-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
license: apache-2.0
language:
- en
---
# Uploaded model
- **Developed by:** fejori
- **License:** apache-2.0
- **Finetuned from model :** unsloth/llama-3.2-1b-instruct-bnb-4bit
This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
|
gokulsrinivasagan/parler-tts-mini-sa-12-15 | gokulsrinivasagan | "2024-11-12T22:43:07Z" | 0 | 0 | null | [
"safetensors",
"parler_tts",
"region:us"
] | null | "2024-11-12T22:41:14Z" | Entry not found |
purrmatnova/cats-vs-dogs | purrmatnova | "2024-11-12T22:42:16Z" | 0 | 0 | null | [
"license:unknown",
"region:us"
] | null | "2024-11-12T22:42:16Z" | ---
license: unknown
---
|
mradermacher/dolphin-2.9.2-Phi-3-Medium-GGUF | mradermacher | "2024-11-12T23:32:58Z" | 0 | 0 | transformers | [
"transformers",
"gguf",
"en",
"dataset:cognitivecomputations/Dolphin-2.9.2",
"dataset:teknium/OpenHermes-2.5",
"dataset:m-a-p/CodeFeedback-Filtered-Instruction",
"dataset:cognitivecomputations/dolphin-coder",
"dataset:cognitivecomputations/samantha-data",
"dataset:microsoft/orca-math-word-problems-200k",
"dataset:internlm/Agent-FLAN",
"dataset:cognitivecomputations/SystemChat-2.0",
"base_model:cognitivecomputations/dolphin-2.9.2-Phi-3-Medium",
"base_model:quantized:cognitivecomputations/dolphin-2.9.2-Phi-3-Medium",
"license:mit",
"endpoints_compatible",
"region:us"
] | null | "2024-11-12T22:42:35Z" | ---
base_model: cognitivecomputations/dolphin-2.9.2-Phi-3-Medium
datasets:
- cognitivecomputations/Dolphin-2.9.2
- teknium/OpenHermes-2.5
- m-a-p/CodeFeedback-Filtered-Instruction
- cognitivecomputations/dolphin-coder
- cognitivecomputations/samantha-data
- microsoft/orca-math-word-problems-200k
- internlm/Agent-FLAN
- cognitivecomputations/SystemChat-2.0
language:
- en
library_name: transformers
license: mit
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/cognitivecomputations/dolphin-2.9.2-Phi-3-Medium
<!-- provided-files -->
weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
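As a hedged illustration of that last point (this repo's quants are single files, and split-part naming varies by uploader, so the filenames below are hypothetical), multi-part GGUF files are joined by simple byte-level concatenation before loading:
```python
# Hypothetical sketch: merge split GGUF parts into one file by byte concatenation.
# The ".part1of2"/".part2of2" names are illustrative; check the repo for real names.
import shutil

parts = [
    "model.Q8_0.gguf.part1of2",
    "model.Q8_0.gguf.part2of2",
]
with open("model.Q8_0.gguf", "wb") as merged:
    for part in parts:
        with open(part, "rb") as src:
            shutil.copyfileobj(src, merged)  # streams chunks; avoids loading 10+ GB into RAM
```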
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similarly sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/dolphin-2.9.2-Phi-3-Medium-GGUF/resolve/main/dolphin-2.9.2-Phi-3-Medium.Q2_K.gguf) | Q2_K | 5.3 | |
| [GGUF](https://huggingface.co/mradermacher/dolphin-2.9.2-Phi-3-Medium-GGUF/resolve/main/dolphin-2.9.2-Phi-3-Medium.Q3_K_S.gguf) | Q3_K_S | 6.2 | |
| [GGUF](https://huggingface.co/mradermacher/dolphin-2.9.2-Phi-3-Medium-GGUF/resolve/main/dolphin-2.9.2-Phi-3-Medium.Q3_K_M.gguf) | Q3_K_M | 6.9 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/dolphin-2.9.2-Phi-3-Medium-GGUF/resolve/main/dolphin-2.9.2-Phi-3-Medium.Q3_K_L.gguf) | Q3_K_L | 7.4 | |
| [GGUF](https://huggingface.co/mradermacher/dolphin-2.9.2-Phi-3-Medium-GGUF/resolve/main/dolphin-2.9.2-Phi-3-Medium.IQ4_XS.gguf) | IQ4_XS | 7.7 | |
| [GGUF](https://huggingface.co/mradermacher/dolphin-2.9.2-Phi-3-Medium-GGUF/resolve/main/dolphin-2.9.2-Phi-3-Medium.Q4_0_4_4.gguf) | Q4_0_4_4 | 8.0 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/dolphin-2.9.2-Phi-3-Medium-GGUF/resolve/main/dolphin-2.9.2-Phi-3-Medium.Q4_K_S.gguf) | Q4_K_S | 8.1 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/dolphin-2.9.2-Phi-3-Medium-GGUF/resolve/main/dolphin-2.9.2-Phi-3-Medium.Q4_K_M.gguf) | Q4_K_M | 8.5 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/dolphin-2.9.2-Phi-3-Medium-GGUF/resolve/main/dolphin-2.9.2-Phi-3-Medium.Q5_K_S.gguf) | Q5_K_S | 9.7 | |
| [GGUF](https://huggingface.co/mradermacher/dolphin-2.9.2-Phi-3-Medium-GGUF/resolve/main/dolphin-2.9.2-Phi-3-Medium.Q5_K_M.gguf) | Q5_K_M | 10.0 | |
| [GGUF](https://huggingface.co/mradermacher/dolphin-2.9.2-Phi-3-Medium-GGUF/resolve/main/dolphin-2.9.2-Phi-3-Medium.Q6_K.gguf) | Q6_K | 11.6 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/dolphin-2.9.2-Phi-3-Medium-GGUF/resolve/main/dolphin-2.9.2-Phi-3-Medium.Q8_0.gguf) | Q8_0 | 14.9 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):
![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)
And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
|
Pearush/phimoe_15_head24_kv | Pearush | "2024-11-12T22:48:05Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"phimoe",
"text-generation",
"conversational",
"custom_code",
"arxiv:1910.09700",
"autotrain_compatible",
"region:us"
] | text-generation | "2024-11-12T22:42:54Z" | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
azizbekphd/wav2vec2-surah-ikhlas | azizbekphd | "2024-11-12T22:43:30Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-11-12T22:43:29Z" | Entry not found |
slotomatos/18-12111818-SloTomatos04 | slotomatos | "2024-11-12T22:47:58Z" | 0 | 0 | null | [
"safetensors",
"llama",
"region:us"
] | null | "2024-11-12T22:44:08Z" | Entry not found |
fixie-ai/ultravox-v0_4_1-llama-3_1-70b-functionary-medium-v3.1 | fixie-ai | "2024-11-12T22:46:31Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"ultravox",
"feature-extraction",
"custom_code",
"ar",
"de",
"en",
"es",
"fr",
"hi",
"it",
"ja",
"nl",
"pt",
"ru",
"sv",
"tr",
"uk",
"zh",
"dataset:fixie-ai/librispeech_asr",
"dataset:fixie-ai/common_voice_17_0",
"dataset:fixie-ai/peoples_speech",
"dataset:fixie-ai/gigaspeech",
"dataset:fixie-ai/multilingual_librispeech",
"dataset:fixie-ai/wenetspeech",
"dataset:fixie-ai/covost2",
"license:mit",
"region:us"
] | feature-extraction | "2024-11-12T22:45:39Z" | ---
language:
- ar
- de
- en
- es
- fr
- hi
- it
- ja
- nl
- pt
- ru
- sv
- tr
- uk
- zh
license: mit
library_name: transformers
datasets:
- fixie-ai/librispeech_asr
- fixie-ai/common_voice_17_0
- fixie-ai/peoples_speech
- fixie-ai/gigaspeech
- fixie-ai/multilingual_librispeech
- fixie-ai/wenetspeech
- fixie-ai/covost2
metrics:
- bleu
---
# Model Card for Ultravox
Ultravox is a multimodal Speech LLM built around a pretrained [Llama3.1-70B-Instruct](https://huggingface.co/meta-llama/Llama-3.1-70B-Instruct) and [whisper-large-v3-turbo](https://huggingface.co/openai/whisper-large-v3-turbo) backbone.
See https://ultravox.ai for the GitHub repo and more information.
## Model Details
### Model Description
Ultravox is a multimodal model that can consume both speech and text as input (e.g., a text system prompt and voice user message).
The input to the model is given as a text prompt with a special `<|audio|>` pseudo-token, and the model processor will replace this magic token with embeddings derived from the input audio.
Using the merged embeddings as input, the model will then generate output text as usual.
In a future revision of Ultravox, we plan to expand the token vocabulary to support generation of semantic and acoustic audio tokens, which can then be fed to a vocoder to produce voice output.
No preference tuning has been applied to this revision of the model.
- **Developed by:** Fixie.ai
- **License:** MIT
### Model Sources
- **Repository:** https://ultravox.ai
- **Demo:** See repo
## Usage
Think of the model as an LLM that can also hear and understand speech. As such, it can be used as a voice agent, and also to do speech-to-speech translation, analysis of spoken audio, etc.
To use the model, try the following:
```python
# pip install transformers peft librosa
import transformers
import librosa

# Build the multimodal pipeline; trust_remote_code is needed for Ultravox's custom code.
pipe = transformers.pipeline(model='fixie-ai/ultravox-v0_4_1-llama-3_1-70b', trust_remote_code=True)

path = "<path-to-input-audio>"  # TODO: pass the audio here
audio, sr = librosa.load(path, sr=16000)  # the model expects 16 kHz audio

# Conversation history; the spoken user message is supplied via the `audio` input.
turns = [
  {
    "role": "system",
    "content": "You are a friendly and helpful character. You love to answer questions for people."
  },
]
pipe({'audio': audio, 'turns': turns, 'sampling_rate': sr}, max_new_tokens=30)
```
## Training Details
The model uses a pre-trained [Llama3.1-70B-Instruct](https://huggingface.co/meta-llama/Llama-3.1-70B-Instruct) backbone as well as the encoder part of [whisper-large-v3-turbo](https://huggingface.co/openai/whisper-large-v3-turbo).
Only the multi-modal adapter is trained, while Whisper encoder and Llama are kept frozen.
We use a knowledge-distillation loss where Ultravox is trying to match the logits of the text-based Llama backbone.
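A minimal sketch of that distillation objective (names and temperature are illustrative; this is not the project's actual training code, which lives in the Ultravox repo linked below):
```python
# Illustrative sketch of the knowledge-distillation loss described above.
# student_logits: Ultravox output on (audio, text); teacher_logits: frozen
# text-only Llama on the matching transcript. Shapes: [batch, seq, vocab].
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      temperature: float = 1.0) -> torch.Tensor:
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    teacher_probs = F.softmax(teacher_logits / temperature, dim=-1)
    # KL(teacher || student), averaged over the batch; the T^2 factor keeps
    # gradient magnitudes comparable across temperatures.
    return F.kl_div(student_log_probs, teacher_probs,
                    reduction="batchmean") * temperature ** 2
```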
### Training Data
The training dataset is a mix of ASR datasets, extended with continuations generated by Llama 3.1 8B, and speech translation datasets, which yield a modest improvement in translation evaluations.
### Training Procedure
Supervised speech instruction finetuning via knowledge-distillation. For more info, see [training code in Ultravox repo](https://github.com/fixie-ai/ultravox/blob/main/ultravox/training/train.py).
#### Training Hyperparameters
- **Training regime:** BF16 mixed precision training
- **Hardware used:** 8x H100 GPUs
#### Speeds, Sizes, Times
The current version of Ultravox, when invoked with audio content, has a time-to-first-token (TTFT) of approximately 150ms, and a tokens-per-second rate of ~50-100 when using an A100-40GB GPU, all using a Llama 3.1 8B backbone.
Check out the audio tab on [TheFastest.ai](https://thefastest.ai/?m=audio) for daily benchmarks and a comparison with other existing models.
## Evaluation
| Language pair (BLEU) | Ultravox 0.4 70B | Ultravox 0.4.1 70B |
| --- | ---: | ---: |
| **en_ar** | 14.97 | 19.64 |
| **en_de** | 30.30 | 32.47 |
| **es_en** | 39.55 | 40.76 |
| **ru_en** | 44.16 | 45.07 |
| **en_ca** | 35.02 | 37.58 |
| **zh_en** | 12.16 | 17.98 |
|
broalantap/wonext-Llama3-Compressor-Adapter-16-32000steps | broalantap | "2024-11-12T22:47:09Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | null | "2024-11-12T22:45:53Z" | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
saga-dev/msqlkpg | saga-dev | "2024-11-12T22:46:02Z" | 0 | 0 | null | [
"safetensors",
"llama",
"region:us"
] | null | "2024-11-12T22:45:59Z" | Entry not found |
tensorblock/Llama-3.1-Swallow-8B-v0.1-GGUF | tensorblock | "2024-11-12T23:46:07Z" | 0 | 0 | transformers | [
"transformers",
"gguf",
"TensorBlock",
"GGUF",
"text-generation",
"en",
"ja",
"base_model:tokyotech-llm/Llama-3.1-Swallow-8B-v0.1",
"base_model:quantized:tokyotech-llm/Llama-3.1-Swallow-8B-v0.1",
"license:llama3.1",
"endpoints_compatible",
"region:us"
] | text-generation | "2024-11-12T22:46:03Z" | ---
language:
- en
- ja
library_name: transformers
pipeline_tag: text-generation
license: llama3.1
model_type: llama
base_model: tokyotech-llm/Llama-3.1-Swallow-8B-v0.1
tags:
- TensorBlock
- GGUF
---
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/jC7kdl8.jpeg" alt="TensorBlock" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;">
Feedback and support: TensorBlock's <a href="https://x.com/tensorblock_aoi">Twitter/X</a>, <a href="https://t.me/TensorBlock">Telegram Group</a> and <a href="https://x.com/tensorblock_aoi">Discord server</a>
</p>
</div>
</div>
## tokyotech-llm/Llama-3.1-Swallow-8B-v0.1 - GGUF
This repo contains GGUF format model files for [tokyotech-llm/Llama-3.1-Swallow-8B-v0.1](https://huggingface.co/tokyotech-llm/Llama-3.1-Swallow-8B-v0.1).
The files were quantized using machines provided by [TensorBlock](https://tensorblock.co/), and they are compatible with llama.cpp as of [commit b4011](https://github.com/ggerganov/llama.cpp/commit/a6744e43e80f4be6398fc7733a01642c846dce1d).
## Prompt template
```
```
## Model file specification
| Filename | Quant type | File Size | Description |
| -------- | ---------- | --------- | ----------- |
| [Llama-3.1-Swallow-8B-v0.1-Q2_K.gguf](https://huggingface.co/tensorblock/Llama-3.1-Swallow-8B-v0.1-GGUF/tree/main/Llama-3.1-Swallow-8B-v0.1-Q2_K.gguf) | Q2_K | 2.961 GB | smallest, significant quality loss - not recommended for most purposes |
| [Llama-3.1-Swallow-8B-v0.1-Q3_K_S.gguf](https://huggingface.co/tensorblock/Llama-3.1-Swallow-8B-v0.1-GGUF/tree/main/Llama-3.1-Swallow-8B-v0.1-Q3_K_S.gguf) | Q3_K_S | 3.413 GB | very small, high quality loss |
| [Llama-3.1-Swallow-8B-v0.1-Q3_K_M.gguf](https://huggingface.co/tensorblock/Llama-3.1-Swallow-8B-v0.1-GGUF/tree/main/Llama-3.1-Swallow-8B-v0.1-Q3_K_M.gguf) | Q3_K_M | 3.743 GB | very small, high quality loss |
| [Llama-3.1-Swallow-8B-v0.1-Q3_K_L.gguf](https://huggingface.co/tensorblock/Llama-3.1-Swallow-8B-v0.1-GGUF/tree/main/Llama-3.1-Swallow-8B-v0.1-Q3_K_L.gguf) | Q3_K_L | 4.025 GB | small, substantial quality loss |
| [Llama-3.1-Swallow-8B-v0.1-Q4_0.gguf](https://huggingface.co/tensorblock/Llama-3.1-Swallow-8B-v0.1-GGUF/tree/main/Llama-3.1-Swallow-8B-v0.1-Q4_0.gguf) | Q4_0 | 4.341 GB | legacy; small, very high quality loss - prefer using Q3_K_M |
| [Llama-3.1-Swallow-8B-v0.1-Q4_K_S.gguf](https://huggingface.co/tensorblock/Llama-3.1-Swallow-8B-v0.1-GGUF/tree/main/Llama-3.1-Swallow-8B-v0.1-Q4_K_S.gguf) | Q4_K_S | 4.370 GB | small, greater quality loss |
| [Llama-3.1-Swallow-8B-v0.1-Q4_K_M.gguf](https://huggingface.co/tensorblock/Llama-3.1-Swallow-8B-v0.1-GGUF/tree/main/Llama-3.1-Swallow-8B-v0.1-Q4_K_M.gguf) | Q4_K_M | 4.583 GB | medium, balanced quality - recommended |
| [Llama-3.1-Swallow-8B-v0.1-Q5_0.gguf](https://huggingface.co/tensorblock/Llama-3.1-Swallow-8B-v0.1-GGUF/tree/main/Llama-3.1-Swallow-8B-v0.1-Q5_0.gguf) | Q5_0 | 5.215 GB | legacy; medium, balanced quality - prefer using Q4_K_M |
| [Llama-3.1-Swallow-8B-v0.1-Q5_K_S.gguf](https://huggingface.co/tensorblock/Llama-3.1-Swallow-8B-v0.1-GGUF/tree/main/Llama-3.1-Swallow-8B-v0.1-Q5_K_S.gguf) | Q5_K_S | 5.215 GB | large, low quality loss - recommended |
| [Llama-3.1-Swallow-8B-v0.1-Q5_K_M.gguf](https://huggingface.co/tensorblock/Llama-3.1-Swallow-8B-v0.1-GGUF/tree/main/Llama-3.1-Swallow-8B-v0.1-Q5_K_M.gguf) | Q5_K_M | 5.339 GB | large, very low quality loss - recommended |
| [Llama-3.1-Swallow-8B-v0.1-Q6_K.gguf](https://huggingface.co/tensorblock/Llama-3.1-Swallow-8B-v0.1-GGUF/tree/main/Llama-3.1-Swallow-8B-v0.1-Q6_K.gguf) | Q6_K | 6.143 GB | very large, extremely low quality loss |
| [Llama-3.1-Swallow-8B-v0.1-Q8_0.gguf](https://huggingface.co/tensorblock/Llama-3.1-Swallow-8B-v0.1-GGUF/tree/main/Llama-3.1-Swallow-8B-v0.1-Q8_0.gguf) | Q8_0 | 7.954 GB | very large, extremely low quality loss - not recommended |
## Downloading instruction
### Command line
First, install the Hugging Face Hub client with CLI support:
```shell
pip install -U "huggingface_hub[cli]"
```
Then, download an individual model file to a local directory:
```shell
huggingface-cli download tensorblock/Llama-3.1-Swallow-8B-v0.1-GGUF --include "Llama-3.1-Swallow-8B-v0.1-Q2_K.gguf" --local-dir MY_LOCAL_DIR
```
If you want to download multiple model files matching a pattern (e.g., `*Q4_K*gguf`), you can try:
```shell
huggingface-cli download tensorblock/Llama-3.1-Swallow-8B-v0.1-GGUF --local-dir MY_LOCAL_DIR --local-dir-use-symlinks False --include='*Q4_K*gguf'
```
|
itorgov/model-1731451612 | itorgov | "2024-11-12T22:51:42Z" | 0 | 0 | null | [
"safetensors",
"llama",
"region:us"
] | null | "2024-11-12T22:46:52Z" | Entry not found |
barchetta/rana-130946 | barchetta | "2024-11-12T22:53:16Z" | 0 | 0 | null | [
"safetensors",
"llama",
"region:us"
] | null | "2024-11-12T22:46:53Z" | Entry not found |
h-h-y/ST-MoE-BERT | h-h-y | "2024-11-12T23:05:42Z" | 0 | 0 | null | [
"en",
"base_model:google-bert/bert-base-uncased",
"base_model:finetune:google-bert/bert-base-uncased",
"license:mit",
"region:us"
] | null | "2024-11-12T22:48:27Z" | ---
license: mit
language:
- en
metrics:
- accuracy
base_model:
- google-bert/bert-base-uncased
--- |
featherless-ai-quants/chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-GGUF | featherless-ai-quants | "2024-11-12T23:04:46Z" | 0 | 0 | null | [
"gguf",
"text-generation",
"base_model:chlee10/T3Q-LLM3-Llama3-sft1.0-dpo1.0",
"base_model:quantized:chlee10/T3Q-LLM3-Llama3-sft1.0-dpo1.0",
"region:us"
] | text-generation | "2024-11-12T22:48:38Z" | ---
base_model: chlee10/T3Q-LLM3-Llama3-sft1.0-dpo1.0
pipeline_tag: text-generation
quantized_by: featherless-ai-quants
---
# chlee10/T3Q-LLM3-Llama3-sft1.0-dpo1.0 GGUF Quantizations 🚀
![Featherless AI Quants](./featherless-quants.png)
*Optimized GGUF quantization files for enhanced model performance*
> Powered by [Featherless AI](https://featherless.ai) - run any model you'd like for a simple, small fee.
---
## Available Quantizations 📊
| Quantization Type | File | Size |
|-------------------|------|------|
| IQ4_XS | [chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-IQ4_XS.gguf](https://huggingface.co/featherless-ai-quants/chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-GGUF/blob/main/chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-IQ4_XS.gguf) | 4276.62 MB |
| Q2_K | [chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-Q2_K.gguf](https://huggingface.co/featherless-ai-quants/chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-GGUF/blob/main/chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-Q2_K.gguf) | 3031.86 MB |
| Q3_K_L | [chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-Q3_K_L.gguf](https://huggingface.co/featherless-ai-quants/chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-GGUF/blob/main/chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-Q3_K_L.gguf) | 4121.74 MB |
| Q3_K_M | [chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-Q3_K_M.gguf](https://huggingface.co/featherless-ai-quants/chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-GGUF/blob/main/chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-Q3_K_M.gguf) | 3832.74 MB |
| Q3_K_S | [chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-Q3_K_S.gguf](https://huggingface.co/featherless-ai-quants/chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-GGUF/blob/main/chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-Q3_K_S.gguf) | 3494.74 MB |
| Q4_K_M | [chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-Q4_K_M.gguf](https://huggingface.co/featherless-ai-quants/chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-GGUF/blob/main/chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-Q4_K_M.gguf) | 4692.78 MB |
| Q4_K_S | [chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-Q4_K_S.gguf](https://huggingface.co/featherless-ai-quants/chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-GGUF/blob/main/chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-Q4_K_S.gguf) | 4475.28 MB |
| Q5_K_M | [chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-Q5_K_M.gguf](https://huggingface.co/featherless-ai-quants/chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-GGUF/blob/main/chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-Q5_K_M.gguf) | 5467.40 MB |
| Q5_K_S | [chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-Q5_K_S.gguf](https://huggingface.co/featherless-ai-quants/chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-GGUF/blob/main/chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-Q5_K_S.gguf) | 5339.90 MB |
| Q6_K | [chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-Q6_K.gguf](https://huggingface.co/featherless-ai-quants/chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-GGUF/blob/main/chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-Q6_K.gguf) | 6290.44 MB |
| Q8_0 | [chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-Q8_0.gguf](https://huggingface.co/featherless-ai-quants/chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-GGUF/blob/main/chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-Q8_0.gguf) | 8145.11 MB |
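Any of the files above can also be fetched programmatically; a minimal sketch using the `huggingface_hub` library (the chosen quantization is just an example):
```python
from huggingface_hub import hf_hub_download

# Download the Q4_K_M quantization listed above into the local cache
path = hf_hub_download(
    repo_id="featherless-ai-quants/chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-GGUF",
    filename="chlee10-T3Q-LLM3-Llama3-sft1.0-dpo1.0-Q4_K_M.gguf",
)
print(path)  # local path of the downloaded GGUF file
```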
---
## ⚡ Powered by [Featherless AI](https://featherless.ai)
### Key Features
- 🔥 **Instant Hosting** - Deploy any Llama model on HuggingFace instantly
- 🛠️ **Zero Infrastructure** - No server setup or maintenance required
- 📚 **Vast Compatibility** - Support for 2400+ models and counting
- 💎 **Affordable Pricing** - Starting at just $10/month
---
**Links:**
[Get Started](https://featherless.ai) | [Documentation](https://featherless.ai/docs) | [Models](https://featherless.ai/models) |
RichardErkhov/dierce_-_testmod-gguf | RichardErkhov | "2024-11-12T23:01:23Z" | 0 | 0 | null | [
"gguf",
"region:us"
] | null | "2024-11-12T22:48:48Z" | Quantization made by Richard Erkhov.
[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)
testmod - GGUF
- Model creator: https://huggingface.co/dierce/
- Original model: https://huggingface.co/dierce/testmod/
| Name | Quant method | Size |
| ---- | ---- | ---- |
| [testmod.Q2_K.gguf](https://huggingface.co/RichardErkhov/dierce_-_testmod-gguf/blob/main/testmod.Q2_K.gguf) | Q2_K | 0.08GB |
| [testmod.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/dierce_-_testmod-gguf/blob/main/testmod.Q3_K_S.gguf) | Q3_K_S | 0.08GB |
| [testmod.Q3_K.gguf](https://huggingface.co/RichardErkhov/dierce_-_testmod-gguf/blob/main/testmod.Q3_K.gguf) | Q3_K | 0.09GB |
| [testmod.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/dierce_-_testmod-gguf/blob/main/testmod.Q3_K_M.gguf) | Q3_K_M | 0.09GB |
| [testmod.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/dierce_-_testmod-gguf/blob/main/testmod.Q3_K_L.gguf) | Q3_K_L | 0.1GB |
| [testmod.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/dierce_-_testmod-gguf/blob/main/testmod.IQ4_XS.gguf) | IQ4_XS | 0.1GB |
| [testmod.Q4_0.gguf](https://huggingface.co/RichardErkhov/dierce_-_testmod-gguf/blob/main/testmod.Q4_0.gguf) | Q4_0 | 0.1GB |
| [testmod.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/dierce_-_testmod-gguf/blob/main/testmod.IQ4_NL.gguf) | IQ4_NL | 0.1GB |
| [testmod.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/dierce_-_testmod-gguf/blob/main/testmod.Q4_K_S.gguf) | Q4_K_S | 0.1GB |
| [testmod.Q4_K.gguf](https://huggingface.co/RichardErkhov/dierce_-_testmod-gguf/blob/main/testmod.Q4_K.gguf) | Q4_K | 0.11GB |
| [testmod.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/dierce_-_testmod-gguf/blob/main/testmod.Q4_K_M.gguf) | Q4_K_M | 0.11GB |
| [testmod.Q4_1.gguf](https://huggingface.co/RichardErkhov/dierce_-_testmod-gguf/blob/main/testmod.Q4_1.gguf) | Q4_1 | 0.11GB |
| [testmod.Q5_0.gguf](https://huggingface.co/RichardErkhov/dierce_-_testmod-gguf/blob/main/testmod.Q5_0.gguf) | Q5_0 | 0.11GB |
| [testmod.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/dierce_-_testmod-gguf/blob/main/testmod.Q5_K_S.gguf) | Q5_K_S | 0.11GB |
| [testmod.Q5_K.gguf](https://huggingface.co/RichardErkhov/dierce_-_testmod-gguf/blob/main/testmod.Q5_K.gguf) | Q5_K | 0.12GB |
| [testmod.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/dierce_-_testmod-gguf/blob/main/testmod.Q5_K_M.gguf) | Q5_K_M | 0.12GB |
| [testmod.Q5_1.gguf](https://huggingface.co/RichardErkhov/dierce_-_testmod-gguf/blob/main/testmod.Q5_1.gguf) | Q5_1 | 0.12GB |
| [testmod.Q6_K.gguf](https://huggingface.co/RichardErkhov/dierce_-_testmod-gguf/blob/main/testmod.Q6_K.gguf) | Q6_K | 0.13GB |
| [testmod.Q8_0.gguf](https://huggingface.co/RichardErkhov/dierce_-_testmod-gguf/blob/main/testmod.Q8_0.gguf) | Q8_0 | 0.17GB |
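Once downloaded, a GGUF file like the ones above can be run locally, for example with the `llama-cpp-python` bindings (a minimal sketch, assuming that package is installed; the chosen quantization and prompt are illustrative):

```python
from llama_cpp import Llama

# Load the quantized model from a local GGUF file
llm = Llama(model_path="testmod.Q4_K_M.gguf")

# Run a short completion
output = llm("Hello, my name is", max_tokens=32)
print(output["choices"][0]["text"])
```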
Original model description:
---
tags:
- conversational
pipeline_tag: text-generation
library_name: transformers
---
# test mod
|
itorgov/model-1731451904 | itorgov | "2024-11-12T22:55:48Z" | 0 | 0 | null | [
"safetensors",
"llama",
"region:us"
] | null | "2024-11-12T22:51:45Z" | Entry not found |
joaorossi15/mistral-7B-v03-ethical-user-stories | joaorossi15 | "2024-11-12T22:52:47Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | null | "2024-11-12T22:52:43Z" | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
featherless-ai-quants/Kukedlc-Neural-4-QA-7b-GGUF | featherless-ai-quants | "2024-11-12T23:05:33Z" | 0 | 0 | null | [
"gguf",
"text-generation",
"base_model:Kukedlc/Neural-4-QA-7b",
"base_model:quantized:Kukedlc/Neural-4-QA-7b",
"region:us"
] | text-generation | "2024-11-12T22:52:46Z" | ---
base_model: Kukedlc/Neural-4-QA-7b
pipeline_tag: text-generation
quantized_by: featherless-ai-quants
---
# Kukedlc/Neural-4-QA-7b GGUF Quantizations 🚀
![Featherless AI Quants](./featherless-quants.png)
*Optimized GGUF quantization files for enhanced model performance*
> Powered by [Featherless AI](https://featherless.ai) - run any model you'd like for a simple, small fee.
---
## Available Quantizations 📊
| Quantization Type | File | Size |
|-------------------|------|------|
| IQ4_XS | [Kukedlc-Neural-4-QA-7b-IQ4_XS.gguf](https://huggingface.co/featherless-ai-quants/Kukedlc-Neural-4-QA-7b-GGUF/blob/main/Kukedlc-Neural-4-QA-7b-IQ4_XS.gguf) | 3761.66 MB |
| Q2_K | [Kukedlc-Neural-4-QA-7b-Q2_K.gguf](https://huggingface.co/featherless-ai-quants/Kukedlc-Neural-4-QA-7b-GGUF/blob/main/Kukedlc-Neural-4-QA-7b-Q2_K.gguf) | 2593.27 MB |
| Q3_K_L | [Kukedlc-Neural-4-QA-7b-Q3_K_L.gguf](https://huggingface.co/featherless-ai-quants/Kukedlc-Neural-4-QA-7b-GGUF/blob/main/Kukedlc-Neural-4-QA-7b-Q3_K_L.gguf) | 3644.97 MB |
| Q3_K_M | [Kukedlc-Neural-4-QA-7b-Q3_K_M.gguf](https://huggingface.co/featherless-ai-quants/Kukedlc-Neural-4-QA-7b-GGUF/blob/main/Kukedlc-Neural-4-QA-7b-Q3_K_M.gguf) | 3355.97 MB |
| Q3_K_S | [Kukedlc-Neural-4-QA-7b-Q3_K_S.gguf](https://huggingface.co/featherless-ai-quants/Kukedlc-Neural-4-QA-7b-GGUF/blob/main/Kukedlc-Neural-4-QA-7b-Q3_K_S.gguf) | 3017.97 MB |
| Q4_K_M | [Kukedlc-Neural-4-QA-7b-Q4_K_M.gguf](https://huggingface.co/featherless-ai-quants/Kukedlc-Neural-4-QA-7b-GGUF/blob/main/Kukedlc-Neural-4-QA-7b-Q4_K_M.gguf) | 4166.07 MB |
| Q4_K_S | [Kukedlc-Neural-4-QA-7b-Q4_K_S.gguf](https://huggingface.co/featherless-ai-quants/Kukedlc-Neural-4-QA-7b-GGUF/blob/main/Kukedlc-Neural-4-QA-7b-Q4_K_S.gguf) | 3948.57 MB |
| Q5_K_M | [Kukedlc-Neural-4-QA-7b-Q5_K_M.gguf](https://huggingface.co/featherless-ai-quants/Kukedlc-Neural-4-QA-7b-GGUF/blob/main/Kukedlc-Neural-4-QA-7b-Q5_K_M.gguf) | 4893.69 MB |
| Q5_K_S | [Kukedlc-Neural-4-QA-7b-Q5_K_S.gguf](https://huggingface.co/featherless-ai-quants/Kukedlc-Neural-4-QA-7b-GGUF/blob/main/Kukedlc-Neural-4-QA-7b-Q5_K_S.gguf) | 4766.19 MB |
| Q6_K | [Kukedlc-Neural-4-QA-7b-Q6_K.gguf](https://huggingface.co/featherless-ai-quants/Kukedlc-Neural-4-QA-7b-GGUF/blob/main/Kukedlc-Neural-4-QA-7b-Q6_K.gguf) | 5666.80 MB |
| Q8_0 | [Kukedlc-Neural-4-QA-7b-Q8_0.gguf](https://huggingface.co/featherless-ai-quants/Kukedlc-Neural-4-QA-7b-GGUF/blob/main/Kukedlc-Neural-4-QA-7b-Q8_0.gguf) | 7339.34 MB |
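To pick a quantization programmatically, the repository files can be listed first and the desired one downloaded; a minimal sketch using `huggingface_hub` (the Q5_K_M pick is illustrative):
```python
from huggingface_hub import hf_hub_download, list_repo_files

repo_id = "featherless-ai-quants/Kukedlc-Neural-4-QA-7b-GGUF"

# List all files in the repo and keep only the GGUF quantizations
gguf_files = [f for f in list_repo_files(repo_id) if f.endswith(".gguf")]
print(gguf_files)

# Download one of them, e.g. the Q5_K_M variant from the table above
path = hf_hub_download(repo_id=repo_id, filename="Kukedlc-Neural-4-QA-7b-Q5_K_M.gguf")
```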
---
## ⚡ Powered by [Featherless AI](https://featherless.ai)
### Key Features
- 🔥 **Instant Hosting** - Deploy any Llama model on HuggingFace instantly
- 🛠️ **Zero Infrastructure** - No server setup or maintenance required
- 📚 **Vast Compatibility** - Support for 2400+ models and counting
- 💎 **Affordable Pricing** - Starting at just $10/month
---
**Links:**
[Get Started](https://featherless.ai) | [Documentation](https://featherless.ai/docs) | [Models](https://featherless.ai/models) |
barchetta/rosa-130954 | barchetta | "2024-11-12T22:59:25Z" | 0 | 0 | null | [
"safetensors",
"llama",
"region:us"
] | null | "2024-11-12T22:54:05Z" | Entry not found |
broalantap/woae-Llama3-Compressor-Adapter-16-32000steps | broalantap | "2024-11-12T22:56:07Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | null | "2024-11-12T22:54:50Z" | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
OhaymakingO/18-13110057-02Haymak | OhaymakingO | "2024-11-12T22:58:26Z" | 0 | 0 | null | [
"safetensors",
"llama",
"region:us"
] | null | "2024-11-12T22:54:53Z" | Entry not found |
touhidulislam/BERTweet_retrain_2020_13 | touhidulislam | "2024-11-12T22:55:37Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"fill-mask",
"generated_from_trainer",
"base_model:vinai/bertweet-base",
"base_model:finetune:vinai/bertweet-base",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | "2024-11-12T22:55:12Z" | ---
library_name: transformers
license: mit
base_model: vinai/bertweet-base
tags:
- generated_from_trainer
model-index:
- name: BERTweet_retrain_2020_13
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# BERTweet_retrain_2020_13
This model is a fine-tuned version of [vinai/bertweet-base](https://huggingface.co/vinai/bertweet-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.5797
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
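For reference, these settings correspond roughly to the following `TrainingArguments` (a minimal sketch, not the exact training script; the output directory is a placeholder):
```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the hyperparameters listed above
# (Adam betas/epsilon and the linear scheduler are the library defaults)
training_args = TrainingArguments(
    output_dir="BERTweet_retrain_2020_13",
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=3.0,
)
```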
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 2.8113 | 1.0 | 3016 | 2.6534 |
| 2.6974 | 2.0 | 6032 | 2.5984 |
| 2.7571 | 3.0 | 9048 | 2.5636 |
### Framework versions
- Transformers 4.45.1
- Pytorch 2.1.0+cu121
- Datasets 3.0.1
- Tokenizers 0.20.0
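### Example usage
As a masked language model, this checkpoint can be tried with the fill-mask pipeline (a minimal sketch; the example tweet is illustrative):
```python
from transformers import pipeline

# Load this checkpoint as a fill-mask pipeline
fill_mask = pipeline("fill-mask", model="touhidulislam/BERTweet_retrain_2020_13")

# BERTweet-based models use <mask> as the mask token
print(fill_mask("I love <mask> so much!"))
```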
|
itorgov/model-1731452151 | itorgov | "2024-11-12T23:00:42Z" | 0 | 0 | null | [
"safetensors",
"llama",
"region:us"
] | null | "2024-11-12T22:55:51Z" | Entry not found |
skyseven/pix2struct-bon-4096 | skyseven | "2024-11-12T22:56:58Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"pix2struct",
"image-text-to-text",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | image-text-to-text | "2024-11-12T22:56:35Z" | Invalid username or password. |
premanthcharan/Image_Captioning_Model | premanthcharan | "2024-11-12T23:24:23Z" | 0 | 1 | null | [
"pytorch",
"vision-encoder-decoder",
"image-to-text",
"image-captioning",
"Transformers",
"license:apache-2.0",
"region:us"
] | image-to-text | "2024-11-12T22:56:53Z" | ---
tags:
- image-to-text
- image-captioning
- Transformers
- vision-encoder-decoder
license: apache-2.0
widget:
- src: >-
https://huggingface.co/datasets/mishig/sample_images/resolve/main/savanna.jpg
example_title: Savanna
- src: >-
https://huggingface.co/datasets/mishig/sample_images/resolve/main/football-match.jpg
example_title: Football Match
- src: >-
https://huggingface.co/datasets/mishig/sample_images/resolve/main/airport.jpg
example_title: Airport
---
# The Illustrated Image Captioning using transformers
![](https://ankur3107.github.io/assets/images/vision-encoder-decoder.png)
* https://ankur3107.github.io/blogs/the-illustrated-image-captioning-using-transformers/
# Sample running code
```python
from transformers import VisionEncoderDecoderModel, ViTImageProcessor, AutoTokenizer
import torch
from PIL import Image
model = VisionEncoderDecoderModel.from_pretrained("nlpconnect/vit-gpt2-image-captioning")
feature_extractor = ViTImageProcessor.from_pretrained("nlpconnect/vit-gpt2-image-captioning")
tokenizer = AutoTokenizer.from_pretrained("nlpconnect/vit-gpt2-image-captioning")
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)
max_length = 16
num_beams = 4
gen_kwargs = {"max_length": max_length, "num_beams": num_beams}
def predict_step(image_paths):
    # Load each image and make sure it is in RGB mode before feature extraction
    images = []
    for image_path in image_paths:
        i_image = Image.open(image_path)
        if i_image.mode != "RGB":
            i_image = i_image.convert(mode="RGB")
        images.append(i_image)

    # Convert the images to pixel tensors and move them to the model's device
    pixel_values = feature_extractor(images=images, return_tensors="pt").pixel_values
    pixel_values = pixel_values.to(device)

    # Generate caption token ids with beam search, then decode them to text
    output_ids = model.generate(pixel_values, **gen_kwargs)
    preds = tokenizer.batch_decode(output_ids, skip_special_tokens=True)
    preds = [pred.strip() for pred in preds]
    return preds
predict_step(['doctor.e16ba4e4.jpg']) # ['a woman in a hospital bed with a woman in a hospital bed']
```
# Sample running code using transformers pipeline
```python
from transformers import pipeline
image_to_text = pipeline("image-to-text", model="nlpconnect/vit-gpt2-image-captioning")
image_to_text("https://ankur3107.github.io/assets/images/image-captioning-example.png")
# [{'generated_text': 'a soccer game with a player jumping to catch the ball '}]
``` |