|
--- |
|
base_model: microsoft/deberta-v3-small |
|
library_name: sentence-transformers |
|
metrics: |
|
- pearson_cosine |
|
- spearman_cosine |
|
- pearson_manhattan |
|
- spearman_manhattan |
|
- pearson_euclidean |
|
- spearman_euclidean |
|
- pearson_dot |
|
- spearman_dot |
|
- pearson_max |
|
- spearman_max |
|
- cosine_accuracy |
|
- cosine_accuracy_threshold |
|
- cosine_f1 |
|
- cosine_f1_threshold |
|
- cosine_precision |
|
- cosine_recall |
|
- cosine_ap |
|
- dot_accuracy |
|
- dot_accuracy_threshold |
|
- dot_f1 |
|
- dot_f1_threshold |
|
- dot_precision |
|
- dot_recall |
|
- dot_ap |
|
- manhattan_accuracy |
|
- manhattan_accuracy_threshold |
|
- manhattan_f1 |
|
- manhattan_f1_threshold |
|
- manhattan_precision |
|
- manhattan_recall |
|
- manhattan_ap |
|
- euclidean_accuracy |
|
- euclidean_accuracy_threshold |
|
- euclidean_f1 |
|
- euclidean_f1_threshold |
|
- euclidean_precision |
|
- euclidean_recall |
|
- euclidean_ap |
|
- max_accuracy |
|
- max_accuracy_threshold |
|
- max_f1 |
|
- max_f1_threshold |
|
- max_precision |
|
- max_recall |
|
- max_ap |
|
pipeline_tag: sentence-similarity |
|
tags: |
|
- sentence-transformers |
|
- sentence-similarity |
|
- feature-extraction |
|
- generated_from_trainer |
|
- dataset_size:32500 |
|
- loss:GISTEmbedLoss |
|
widget: |
|
- source_sentence: A picture of a white gas range with figurines above. |
|
sentences: |
|
- A nerdy woman brushing her teeth with a friend nearby. |
|
- a white stove turned off with a digital clock |
|
- The plasma membrane also contains other molecules, primarily other lipids and |
|
proteins. The green molecules in Figure above , for example, are the lipid cholesterol. |
|
Molecules of cholesterol help the plasma membrane keep its shape. Many of the |
|
proteins in the plasma membrane assist other substances in crossing the membrane. |
|
- source_sentence: who makes the kentucky derby garland of roses |
|
sentences: |
|
- Accrington strengthened their position in the play-off places with a hard-fought |
|
win over struggling Dagenham. |
|
- "tidal energy can be used to produce electricity. Ocean thermal is energy derived\ |
|
\ from waves and also from tidal waves. \n Ocean thermal energy can be used to\ |
|
\ produce electricity." |
|
- Kentucky Derby Trophy The Kroger Company has been the official florist of the |
|
Kentucky Derby since 1987. After taking over the duties from the Kingsley Walker |
|
florist, Kroger began constructing the prestigious garland in one of its local |
|
stores for the public to view on Derby Eve. The preservation of the garland and |
|
crowds of spectators watching its construction are a testament to the prestige |
|
and mystique of the Garland of Roses. |
|
- source_sentence: what is the difference between a general sense and a special sense? |
|
sentences: |
|
- 'Ian Curtis ( of Touching from a distance) Ian Kevin Curtis was an English musician |
|
and singer-songwriter. He is best known as the lead singer and lyricist of the |
|
post-punk band Joy Division. Joy Division released its debut album, Unknown Pleasures, |
|
in 1979 and recorded its follow-up, Closer, in 1980. Curtis, who suffered from |
|
epilepsy and depression, committed suicide on 18 May 1980, on the eve of Joy Division''s |
|
first North American tour, resulting in the band''s dissolution and the subsequent |
|
formation of New Order. Curtis was known for his baritone voice, dance style, |
|
and songwriting filled with imagery of desolation, emptiness and alienation. In |
|
1995, Curtis''s widow Deborah published Touching from a Distance: Ian Curtis and |
|
Joy Division, a biography of the singer. His life and death Ian Kevin Curtis was |
|
an English musician and singer-songwriter. He is best known as the lead singer |
|
and lyricist of the post-punk band Joy Division. Joy Division released its debut |
|
album, Unknown Pleasures, in 1979 and recorded its follow-up, Closer, in 1980. |
|
Curtis, who suffered from epilepsy and depression, committed suicide on 18 May |
|
1980, on the eve of Joy Division''s first North American tour, resulting in the |
|
band''s dissolution and the subsequent formation of New Order. Curtis was known |
|
for his baritone voice, dance style, and songwriting filled with imagery of desolation, |
|
emptiness and alienation. In 1995, Curtis''s widow Deborah published Touching |
|
from a Distance: Ian Curtis and Joy Division, a biography of the singer. His life |
|
and death have been dramatised in the films 24 Hour Party People (2002) and Control |
|
(2007). ...more' |
|
- The human body has two basic types of senses, called special senses and general |
|
senses. Special senses have specialized sense organs that gather sensory information |
|
and change it into nerve impulses. ... General senses, in contrast, are all associated |
|
with the sense of touch. They lack special sense organs. |
|
- Captain Hook Barrie states in the novel that "Hook was not his true name. To reveal |
|
who he really was would even at this date set the country in a blaze", and relates |
|
that Peter Pan began their rivalry by feeding the pirate's hand to the crocodile. |
|
He is said to be "Blackbeard's bo'sun" and "the only man of whom Barbecue was |
|
afraid".[5] (In Robert Louis Stevenson's Treasure Island, one of the names Long |
|
John Silver goes by is Barbecue.)[6] |
|
- source_sentence: Retzius was born in Stockholm , son of the anatomist Anders Jahan |
|
Retzius ( and grandson of the naturalist and chemist Anders Retzius ) . |
|
sentences: |
|
- Retzius was born in Stockholm , the son of anatomist Anders Jahan Retzius ( and |
|
grandson of the naturalist and chemist Anders Retzius ) . |
|
- As of 14 March , over 156,000 cases of COVID-19 have been reported in around 140 |
|
countries and territories ; more than 5,800 people have died from the disease |
|
and around 75,000 have recovered . |
|
- A person sitting on a stool on the street. |
|
- source_sentence: who was the first person who made the violin |
|
sentences: |
|
- Alice in Chains Alice in Chains is an American rock band from Seattle, Washington, |
|
formed in 1987 by guitarist and vocalist Jerry Cantrell and drummer Sean Kinney,[1] |
|
who recruited bassist Mike Starr[1] and lead vocalist Layne Staley.[1][2][3] Starr |
|
was replaced by Mike Inez in 1993.[4] After Staley's death in 2002, William DuVall |
|
joined in 2006 as co-lead vocalist and rhythm guitarist. The band took its name |
|
from Staley's previous group, the glam metal band Alice N' Chains.[5][2] |
|
- as distance from an object decreases , that object will appear larger |
|
- Violin The first makers of violins probably borrowed from various developments |
|
of the Byzantine lira. These included the rebec;[13] the Arabic rebab; the vielle |
|
(also known as the fidel or viuola); and the lira da braccio[11][14] The violin |
|
in its present form emerged in early 16th-century northern Italy. The earliest |
|
pictures of violins, albeit with three strings, are seen in northern Italy around |
|
1530, at around the same time as the words "violino" and "vyollon" are seen in |
|
Italian and French documents. One of the earliest explicit descriptions of the |
|
instrument, including its tuning, is from the Epitome musical by Jambe de Fer, |
|
published in Lyon in 1556.[15] By this time, the violin had already begun to spread |
|
throughout Europe. |
|
model-index: |
|
- name: SentenceTransformer based on microsoft/deberta-v3-small |
|
results: |
|
- task: |
|
type: semantic-similarity |
|
name: Semantic Similarity |
|
dataset: |
|
name: sts test |
|
type: sts-test |
|
metrics: |
|
- type: pearson_cosine |
|
value: 0.4008630217130614 |
|
name: Pearson Cosine |
|
- type: spearman_cosine |
|
value: 0.45136395222060033 |
|
name: Spearman Cosine |
|
- type: pearson_manhattan |
|
value: 0.43690206761360884 |
|
name: Pearson Manhattan |
|
- type: spearman_manhattan |
|
value: 0.4609070244585321 |
|
name: Spearman Manhattan |
|
- type: pearson_euclidean |
|
value: 0.42558645063013534 |
|
name: Pearson Euclidean |
|
- type: spearman_euclidean |
|
value: 0.45137661682757413 |
|
name: Spearman Euclidean |
|
- type: pearson_dot |
|
value: 0.39680226288992326 |
|
name: Pearson Dot |
|
- type: spearman_dot |
|
value: 0.4467416839340055 |
|
name: Spearman Dot |
|
- type: pearson_max |
|
value: 0.43690206761360884 |
|
name: Pearson Max |
|
- type: spearman_max |
|
value: 0.4609070244585321 |
|
name: Spearman Max |
|
- task: |
|
type: binary-classification |
|
name: Binary Classification |
|
dataset: |
|
name: allNLI dev |
|
type: allNLI-dev |
|
metrics: |
|
- type: cosine_accuracy |
|
value: 0.685546875 |
|
name: Cosine Accuracy |
|
- type: cosine_accuracy_threshold |
|
value: 0.9454631209373474 |
|
name: Cosine Accuracy Threshold |
|
- type: cosine_f1 |
|
value: 0.529531568228106 |
|
name: Cosine F1 |
|
- type: cosine_f1_threshold |
|
value: 0.881622314453125 |
|
name: Cosine F1 Threshold |
|
- type: cosine_precision |
|
value: 0.4088050314465409 |
|
name: Cosine Precision |
|
- type: cosine_recall |
|
value: 0.7514450867052023 |
|
name: Cosine Recall |
|
- type: cosine_ap |
|
value: 0.4792378857888552 |
|
name: Cosine Ap |
|
- type: dot_accuracy |
|
value: 0.685546875 |
|
name: Dot Accuracy |
|
- type: dot_accuracy_threshold |
|
value: 726.0874633789062 |
|
name: Dot Accuracy Threshold |
|
- type: dot_f1 |
|
value: 0.5287356321839081 |
|
name: Dot F1 |
|
- type: dot_f1_threshold |
|
value: 667.7103271484375 |
|
name: Dot F1 Threshold |
|
- type: dot_precision |
|
value: 0.3954154727793696 |
|
name: Dot Precision |
|
- type: dot_recall |
|
value: 0.7976878612716763 |
|
name: Dot Recall |
|
- type: dot_ap |
|
value: 0.4773554395168431 |
|
name: Dot Ap |
|
- type: manhattan_accuracy |
|
value: 0.681640625 |
|
name: Manhattan Accuracy |
|
- type: manhattan_accuracy_threshold |
|
value: 179.65867614746094 |
|
name: Manhattan Accuracy Threshold |
|
- type: manhattan_f1 |
|
value: 0.5375 |
|
name: Manhattan F1 |
|
- type: manhattan_f1_threshold |
|
value: 272.7696838378906 |
|
name: Manhattan F1 Threshold |
|
- type: manhattan_precision |
|
value: 0.4201954397394137 |
|
name: Manhattan Precision |
|
- type: manhattan_recall |
|
value: 0.7456647398843931 |
|
name: Manhattan Recall |
|
- type: manhattan_ap |
|
value: 0.4774414549549294 |
|
name: Manhattan Ap |
|
- type: euclidean_accuracy |
|
value: 0.685546875 |
|
name: Euclidean Accuracy |
|
- type: euclidean_accuracy_threshold |
|
value: 9.15554428100586 |
|
name: Euclidean Accuracy Threshold |
|
- type: euclidean_f1 |
|
value: 0.5303643724696356 |
|
name: Euclidean F1 |
|
- type: euclidean_f1_threshold |
|
value: 13.481329917907715 |
|
name: Euclidean F1 Threshold |
|
- type: euclidean_precision |
|
value: 0.40809968847352024 |
|
name: Euclidean Precision |
|
- type: euclidean_recall |
|
value: 0.7572254335260116 |
|
name: Euclidean Recall |
|
- type: euclidean_ap |
|
value: 0.479012627080778 |
|
name: Euclidean Ap |
|
- type: max_accuracy |
|
value: 0.685546875 |
|
name: Max Accuracy |
|
- type: max_accuracy_threshold |
|
value: 726.0874633789062 |
|
name: Max Accuracy Threshold |
|
- type: max_f1 |
|
value: 0.5375 |
|
name: Max F1 |
|
- type: max_f1_threshold |
|
value: 667.7103271484375 |
|
name: Max F1 Threshold |
|
- type: max_precision |
|
value: 0.4201954397394137 |
|
name: Max Precision |
|
- type: max_recall |
|
value: 0.7976878612716763 |
|
name: Max Recall |
|
- type: max_ap |
|
value: 0.4792378857888552 |
|
name: Max Ap |
|
- task: |
|
type: binary-classification |
|
name: Binary Classification |
|
dataset: |
|
name: Qnli dev |
|
type: Qnli-dev |
|
metrics: |
|
- type: cosine_accuracy |
|
value: 0.666015625 |
|
name: Cosine Accuracy |
|
- type: cosine_accuracy_threshold |
|
value: 0.8375362157821655 |
|
name: Cosine Accuracy Threshold |
|
- type: cosine_f1 |
|
value: 0.663594470046083 |
|
name: Cosine F1 |
|
- type: cosine_f1_threshold |
|
value: 0.776394248008728 |
|
name: Cosine F1 Threshold |
|
- type: cosine_precision |
|
value: 0.5204819277108433 |
|
name: Cosine Precision |
|
- type: cosine_recall |
|
value: 0.9152542372881356 |
|
name: Cosine Recall |
|
- type: cosine_ap |
|
value: 0.6688267117653419 |
|
name: Cosine Ap |
|
- type: dot_accuracy |
|
value: 0.666015625 |
|
name: Dot Accuracy |
|
- type: dot_accuracy_threshold |
|
value: 642.4522705078125 |
|
name: Dot Accuracy Threshold |
|
- type: dot_f1 |
|
value: 0.6636085626911314 |
|
name: Dot F1 |
|
- type: dot_f1_threshold |
|
value: 591.8570556640625 |
|
name: Dot F1 Threshold |
|
- type: dot_precision |
|
value: 0.5191387559808612 |
|
name: Dot Precision |
|
- type: dot_recall |
|
value: 0.9194915254237288 |
|
name: Dot Recall |
|
- type: dot_ap |
|
value: 0.667969160193219 |
|
name: Dot Ap |
|
- type: manhattan_accuracy |
|
value: 0.666015625 |
|
name: Manhattan Accuracy |
|
- type: manhattan_accuracy_threshold |
|
value: 322.94683837890625 |
|
name: Manhattan Accuracy Threshold |
|
- type: manhattan_f1 |
|
value: 0.6762360446570973 |
|
name: Manhattan F1 |
|
- type: manhattan_f1_threshold |
|
value: 371.5932922363281 |
|
name: Manhattan F1 Threshold |
|
- type: manhattan_precision |
|
value: 0.5421994884910486 |
|
name: Manhattan Precision |
|
- type: manhattan_recall |
|
value: 0.8983050847457628 |
|
name: Manhattan Recall |
|
- type: manhattan_ap |
|
value: 0.6781736723001364 |
|
name: Manhattan Ap |
|
- type: euclidean_accuracy |
|
value: 0.66796875 |
|
name: Euclidean Accuracy |
|
- type: euclidean_accuracy_threshold |
|
value: 15.774551391601562 |
|
name: Euclidean Accuracy Threshold |
|
- type: euclidean_f1 |
|
value: 0.662557781201849 |
|
name: Euclidean F1 |
|
- type: euclidean_f1_threshold |
|
value: 18.4862060546875 |
|
name: Euclidean F1 Threshold |
|
- type: euclidean_precision |
|
value: 0.5205811138014528 |
|
name: Euclidean Precision |
|
- type: euclidean_recall |
|
value: 0.9110169491525424 |
|
name: Euclidean Recall |
|
- type: euclidean_ap |
|
value: 0.6688954739628344 |
|
name: Euclidean Ap |
|
- type: max_accuracy |
|
value: 0.66796875 |
|
name: Max Accuracy |
|
- type: max_accuracy_threshold |
|
value: 642.4522705078125 |
|
name: Max Accuracy Threshold |
|
- type: max_f1 |
|
value: 0.6762360446570973 |
|
name: Max F1 |
|
- type: max_f1_threshold |
|
value: 591.8570556640625 |
|
name: Max F1 Threshold |
|
- type: max_precision |
|
value: 0.5421994884910486 |
|
name: Max Precision |
|
- type: max_recall |
|
value: 0.9194915254237288 |
|
name: Max Recall |
|
- type: max_ap |
|
value: 0.6781736723001364 |
|
name: Max Ap |
|
--- |
|
|
|
# SentenceTransformer based on microsoft/deberta-v3-small |
|
|
|
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [microsoft/deberta-v3-small](https://huggingface.co/microsoft/deberta-v3-small). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. |
|
|
|
## Model Details |
|
|
|
### Model Description |
|
- **Model Type:** Sentence Transformer |
|
- **Base model:** [microsoft/deberta-v3-small](https://huggingface.co/microsoft/deberta-v3-small) <!-- at revision a36c739020e01763fe789b4b85e2df55d6180012 --> |
|
- **Maximum Sequence Length:** 512 tokens |
|
- **Output Dimensionality:** 768 dimensions
|
- **Similarity Function:** Cosine Similarity |
|
<!-- - **Training Dataset:** Unknown --> |
|
<!-- - **Language:** Unknown --> |
|
<!-- - **License:** Unknown --> |
|
|
|
### Model Sources |
|
|
|
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net) |
|
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) |
|
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) |
|
|
|
### Full Model Architecture |
|
|
|
``` |
|
SentenceTransformer( |
|
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: DebertaV2Model |
|
(1): AdvancedWeightedPooling( |
|
(linear_cls_pj): Linear(in_features=768, out_features=768, bias=True) |
|
(linear_cls_Qpj): Linear(in_features=768, out_features=768, bias=True) |
|
(linear_mean_pj): Linear(in_features=768, out_features=768, bias=True) |
|
(linear_attnOut): Linear(in_features=768, out_features=768, bias=True) |
|
(mha): MultiheadAttention( |
|
(out_proj): NonDynamicallyQuantizableLinear(in_features=768, out_features=768, bias=True) |
|
) |
|
(layernorm_output): LayerNorm((768,), eps=1e-05, elementwise_affine=True) |
|
(layernorm_weightedPooing): LayerNorm((768,), eps=1e-05, elementwise_affine=True) |
|
(layernorm_pjCls): LayerNorm((768,), eps=1e-05, elementwise_affine=True) |
|
(layernorm_pjMean): LayerNorm((768,), eps=1e-05, elementwise_affine=True) |
|
(layernorm_attnOut): LayerNorm((768,), eps=1e-05, elementwise_affine=True) |
|
) |
|
) |
|
``` |
|
|
|
## Usage |
|
|
|
### Direct Usage (Sentence Transformers) |
|
|
|
First install the Sentence Transformers library: |
|
|
|
```bash |
|
pip install -U sentence-transformers |
|
``` |
|
|
|
Then you can load this model and run inference. |
|
```python |
|
from sentence_transformers import SentenceTransformer |
|
|
|
# Download from the 🤗 Hub |
|
model = SentenceTransformer("bobox/DeBERTa3-s-CustomPoolin-toytest-step1-checkpoints-tmp") |
|
# Run inference |
|
sentences = [ |
|
'who was the first person who made the violin', |
|
'Violin The first makers of violins probably borrowed from various developments of the Byzantine lira. These included the rebec;[13] the Arabic rebab; the vielle (also known as the fidel or viuola); and the lira da braccio[11][14] The violin in its present form emerged in early 16th-century northern Italy. The earliest pictures of violins, albeit with three strings, are seen in northern Italy around 1530, at around the same time as the words "violino" and "vyollon" are seen in Italian and French documents. One of the earliest explicit descriptions of the instrument, including its tuning, is from the Epitome musical by Jambe de Fer, published in Lyon in 1556.[15] By this time, the violin had already begun to spread throughout Europe.', |
|
"Alice in Chains Alice in Chains is an American rock band from Seattle, Washington, formed in 1987 by guitarist and vocalist Jerry Cantrell and drummer Sean Kinney,[1] who recruited bassist Mike Starr[1] and lead vocalist Layne Staley.[1][2][3] Starr was replaced by Mike Inez in 1993.[4] After Staley's death in 2002, William DuVall joined in 2006 as co-lead vocalist and rhythm guitarist. The band took its name from Staley's previous group, the glam metal band Alice N' Chains.[5][2]", |
|
] |
|
embeddings = model.encode(sentences) |
|
print(embeddings.shape) |
|
# (3, 768)
|
|
|
# Get the similarity scores for the embeddings |
|
similarities = model.similarity(embeddings, embeddings) |
|
print(similarities.shape) |
|
# (3, 3)
|
``` |
|
|
|
<!-- |
|
### Direct Usage (Transformers) |
|
|
|
<details><summary>Click to see the direct usage in Transformers</summary> |
|
|
|
</details> |
|
--> |
|
|
|
<!-- |
|
### Downstream Usage (Sentence Transformers) |
|
|
|
You can finetune this model on your own dataset. |
|
|
|
<details><summary>Click to expand</summary> |
|
|
|
</details> |
|
--> |
|
|
|
<!-- |
|
### Out-of-Scope Use |
|
|
|
*List how the model may foreseeably be misused and address what users ought not to do with the model.* |
|
--> |
|
|
|
## Evaluation |
|
|
|
### Metrics |
|
|
|
#### Semantic Similarity |
|
* Dataset: `sts-test` |
|
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator) |
|
|
|
| Metric | Value | |
|
|:--------------------|:-----------| |
|
| pearson_cosine | 0.4009 | |
|
| **spearman_cosine** | **0.4514** | |
|
| pearson_manhattan | 0.4369 | |
|
| spearman_manhattan | 0.4609 | |
|
| pearson_euclidean | 0.4256 | |
|
| spearman_euclidean | 0.4514 | |
|
| pearson_dot | 0.3968 | |
|
| spearman_dot | 0.4467 | |
|
| pearson_max | 0.4369 | |
|
| spearman_max | 0.4609 | |
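
The numbers above can be reproduced with the `EmbeddingSimilarityEvaluator`. A minimal sketch, assuming you supply your own STS-style pairs with gold scores (the sentences and scores below are placeholders, not the actual sts-test data):

```python
from sentence_transformers import SentenceTransformer, SimilarityFunction
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("bobox/DeBERTa3-s-CustomPoolin-toytest-step1-checkpoints-tmp")

# Placeholder sentence pairs with gold similarity scores normalized to [0, 1]
evaluator = EmbeddingSimilarityEvaluator(
    sentences1=["A man is eating food.", "A plane is taking off."],
    sentences2=["A man is eating a meal.", "A bird is flying."],
    scores=[0.95, 0.10],
    main_similarity=SimilarityFunction.COSINE,
    name="sts-test",
)
results = evaluator(model)
print(results)  # Pearson/Spearman correlations for cosine, dot, Euclidean and Manhattan (a dict in recent versions)
```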
|
|
|
#### Binary Classification |
|
* Dataset: `allNLI-dev` |
|
* Evaluated with [<code>BinaryClassificationEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.BinaryClassificationEvaluator) |
|
|
|
| Metric | Value | |
|
|:-----------------------------|:-----------| |
|
| cosine_accuracy | 0.6855 | |
|
| cosine_accuracy_threshold | 0.9455 | |
|
| cosine_f1 | 0.5295 | |
|
| cosine_f1_threshold | 0.8816 | |
|
| cosine_precision | 0.4088 | |
|
| cosine_recall | 0.7514 | |
|
| cosine_ap | 0.4792 | |
|
| dot_accuracy | 0.6855 | |
|
| dot_accuracy_threshold | 726.0875 | |
|
| dot_f1 | 0.5287 | |
|
| dot_f1_threshold | 667.7103 | |
|
| dot_precision | 0.3954 | |
|
| dot_recall | 0.7977 | |
|
| dot_ap | 0.4774 | |
|
| manhattan_accuracy | 0.6816 | |
|
| manhattan_accuracy_threshold | 179.6587 | |
|
| manhattan_f1 | 0.5375 | |
|
| manhattan_f1_threshold | 272.7697 | |
|
| manhattan_precision | 0.4202 | |
|
| manhattan_recall | 0.7457 | |
|
| manhattan_ap | 0.4774 | |
|
| euclidean_accuracy | 0.6855 | |
|
| euclidean_accuracy_threshold | 9.1555 | |
|
| euclidean_f1 | 0.5304 | |
|
| euclidean_f1_threshold | 13.4813 | |
|
| euclidean_precision | 0.4081 | |
|
| euclidean_recall | 0.7572 | |
|
| euclidean_ap | 0.479 | |
|
| max_accuracy | 0.6855 | |
|
| max_accuracy_threshold | 726.0875 | |
|
| max_f1 | 0.5375 | |
|
| max_f1_threshold | 667.7103 | |
|
| max_precision | 0.4202 | |
|
| max_recall | 0.7977 | |
|
| **max_ap** | **0.4792** | |
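
These scores come from the `BinaryClassificationEvaluator`, which embeds each pair and searches for the best decision threshold per similarity function. A minimal sketch with placeholder pairs (the real allNLI-dev pairs are not included here):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import BinaryClassificationEvaluator

model = SentenceTransformer("bobox/DeBERTa3-s-CustomPoolin-toytest-step1-checkpoints-tmp")

# Placeholder pairs; label 1 = the sentences match/entail, 0 = they do not
evaluator = BinaryClassificationEvaluator(
    sentences1=["A man is eating food.", "A plane is taking off."],
    sentences2=["A man is eating a meal.", "A dog runs in the park."],
    labels=[1, 0],
    name="allNLI-dev",
)
results = evaluator(model)
print(results)  # accuracy, F1, precision, recall and AP per similarity function
```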
|
|
|
#### Binary Classification |
|
* Dataset: `Qnli-dev` |
|
* Evaluated with [<code>BinaryClassificationEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.BinaryClassificationEvaluator) |
|
|
|
| Metric | Value | |
|
|:-----------------------------|:-----------| |
|
| cosine_accuracy | 0.666 | |
|
| cosine_accuracy_threshold | 0.8375 | |
|
| cosine_f1 | 0.6636 | |
|
| cosine_f1_threshold | 0.7764 | |
|
| cosine_precision | 0.5205 | |
|
| cosine_recall | 0.9153 | |
|
| cosine_ap | 0.6688 | |
|
| dot_accuracy | 0.666 | |
|
| dot_accuracy_threshold | 642.4523 | |
|
| dot_f1 | 0.6636 | |
|
| dot_f1_threshold | 591.8571 | |
|
| dot_precision | 0.5191 | |
|
| dot_recall | 0.9195 | |
|
| dot_ap | 0.668 | |
|
| manhattan_accuracy | 0.666 | |
|
| manhattan_accuracy_threshold | 322.9468 | |
|
| manhattan_f1 | 0.6762 | |
|
| manhattan_f1_threshold | 371.5933 | |
|
| manhattan_precision | 0.5422 | |
|
| manhattan_recall | 0.8983 | |
|
| manhattan_ap | 0.6782 | |
|
| euclidean_accuracy | 0.668 | |
|
| euclidean_accuracy_threshold | 15.7746 | |
|
| euclidean_f1 | 0.6626 | |
|
| euclidean_f1_threshold | 18.4862 | |
|
| euclidean_precision | 0.5206 | |
|
| euclidean_recall | 0.911 | |
|
| euclidean_ap | 0.6689 | |
|
| max_accuracy | 0.668 | |
|
| max_accuracy_threshold | 642.4523 | |
|
| max_f1 | 0.6762 | |
|
| max_f1_threshold | 591.8571 | |
|
| max_precision | 0.5422 | |
|
| max_recall | 0.9195 | |
|
| **max_ap** | **0.6782** | |
|
|
|
<!-- |
|
## Bias, Risks and Limitations |
|
|
|
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* |
|
--> |
|
|
|
<!-- |
|
### Recommendations |
|
|
|
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* |
|
--> |
|
|
|
## Training Details |
|
|
|
### Training Dataset |
|
|
|
#### Unnamed Dataset |
|
|
|
|
|
* Size: 32,500 training samples |
|
* Columns: <code>sentence1</code> and <code>sentence2</code> |
|
* Approximate statistics based on the first 1000 samples: |
|
| | sentence1 | sentence2 | |
|
|:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| |
|
| type | string | string | |
|
| details | <ul><li>min: 4 tokens</li><li>mean: 29.3 tokens</li><li>max: 343 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 57.53 tokens</li><li>max: 512 tokens</li></ul> | |
|
* Samples: |
|
| sentence1 | sentence2 | |
|
|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| |
|
| <code>A Slippery Dick is what type of creature?</code> | <code>The Slippery Dick (Juvenile) - Whats That Fish! Description Also known as Sand-reef Wrasses and Slippery Dick Wrasse. Found singly or in pairs or in groups constantly circling around reefs, sea grass beds and sandy areas. Colours highly variable especially between juvenile to adult. They feed on hard shell invertebrates. Length - 18cm Depth - 2-12m Widespread Western Atlantic & Caribbean Most reef fish seen by divers during the day are grazers, that cruise around just above the surface of the coral or snoop into crevices looking for algae, worms and small crustaceans. Wrasses have small protruding teeth and graze the bottom taking in a variety of snails, worms, crabs, shrimps and eggs. Any hard coats or thick shells are then ground down by their pharyngeal jaws and the delicacies inside digested. From juvenile to adult wrasses dramatically alter their colour and body shapes. Wrasses are always on the go during the day, but are the first to go to bed and the last to rise. Small wrasses dive below the sand to sleep and larger wrasses wedge themselves in crevasses. Related creatures Heads up! Many creatures change during their life. Juvenile fish become adults and some change shape or their colour. Some species change sex and others just get older. The following creature(s) are known relatives of the Slippery Dick (Juvenile). Click the image(s) to explore further or hover over to get a better view! Slippery Dick</code> | |
|
| <code>e.	in solids the atoms are closely locked in position and can only vibrate, in liquids the atoms and molecules are more loosely connected and can collide with and move past one another, while in gases the atoms or molecules are free to move independently, colliding frequently.</code> | <code>Within a substance, atoms that collide frequently and move independently of one another are most likely in a gas</code> | |
|
| <code>In December 2015 , the film was ranked # 192 on IMDb .</code> | <code>As of December 2015 , it is the # 192 highest rated film on IMDb.</code> | |
|
* Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters: |
|
```json |
|
{'guide': SentenceTransformer( |
|
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel |
|
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) |
|
(2): Normalize() |
|
), 'temperature': 0.025} |
|
``` |
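
The guide shown in the dump above is a BERT-based embedder with CLS pooling and normalization; the card does not record which checkpoint was used. A minimal sketch of wiring up `GISTEmbedLoss` with a guide model, where the guide checkpoint below is only an assumed example:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import GISTEmbedLoss

# Model being trained. The actual run used the custom-pooling DeBERTa model described above;
# loading the plain base model here just keeps the sketch self-contained.
model = SentenceTransformer("microsoft/deberta-v3-small")

# Guide model whose similarities are used to mask likely false negatives within each batch.
# "sentence-transformers/all-mpnet-base-v2" is an example choice, not the checkpoint used for this card.
guide = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")

loss = GISTEmbedLoss(model=model, guide=guide, temperature=0.025)
```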
|
|
|
### Evaluation Dataset |
|
|
|
#### Unnamed Dataset |
|
|
|
|
|
* Size: 1,664 evaluation samples |
|
* Columns: <code>sentence1</code> and <code>sentence2</code> |
|
* Approximate statistics based on the first 1000 samples: |
|
| | sentence1 | sentence2 | |
|
|:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| |
|
| type | string | string | |
|
| details | <ul><li>min: 4 tokens</li><li>mean: 28.74 tokens</li><li>max: 330 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 56.55 tokens</li><li>max: 512 tokens</li></ul> | |
|
* Samples: |
|
| sentence1 | sentence2 | |
|
|:--------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| |
|
| <code>What component of an organism, made up of many cells, in turn makes up an organ?</code> | <code></code> | |
|
| <code>Diffusion Diffusion is a process where atoms or molecules move from areas of high concentration to areas of low concentration.</code> | <code>Diffusion is the process in which a substance naturally moves from an area of higher to lower concentration.</code> | |
|
| <code>In the 1966 movie The Good, The Bad And The Ugly, Clint Eastwood played the Good" and Lee van Cleef played "the Bad", but who played "the Ugly"?</code> | <code>View All Photos (10) Movie Info In the last and the best installment of his so-called "Dollars" trilogy of Sergio Leone-directed "spaghetti westerns," Clint Eastwood reprised the role of a taciturn, enigmatic loner. Here he searches for a cache of stolen gold against rivals the Bad (Lee Van Cleef), a ruthless bounty hunter, and the Ugly (Eli Wallach), a Mexican bandit. Though dubbed "the Good," Eastwood's character is not much better than his opponents -- he is just smarter and shoots faster. The film's title reveals its ironic attitude toward the canonized heroes of the classical western. "The real West was the world of violence, fear, and brutal instincts," claimed Leone. "In pursuit of profit there is no such thing as good and evil, generosity or deviousness; everything depends on chance, and not the best wins but the luckiest." Immensely entertaining and beautifully shot in Techniscope by Tonino Delli Colli, the movie is a virtually definitive "spaghetti western," rivaled only by Leone's own Once Upon a Time in the West (1968). The main musical theme by Ennio Morricone hit #1 on the British pop charts. Originally released in Italy at 177 minutes, the movie was later cut for its international release. ~ Yuri German, Rovi Rating:</code> | |
|
* Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters: |
|
```json |
|
{'guide': SentenceTransformer( |
|
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel |
|
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) |
|
(2): Normalize() |
|
), 'temperature': 0.025} |
|
``` |
|
|
|
### Training Hyperparameters |
|
#### Non-Default Hyperparameters |
|
|
|
- `eval_strategy`: steps |
|
- `per_device_train_batch_size`: 32 |
|
- `per_device_eval_batch_size`: 256 |
|
- `lr_scheduler_type`: cosine_with_min_lr |
|
- `lr_scheduler_kwargs`: {'num_cycles': 0.5, 'min_lr': 3.3333333333333337e-06} |
|
- `warmup_ratio`: 0.33 |
|
- `save_safetensors`: False |
|
- `fp16`: True |
|
- `push_to_hub`: True |
|
- `hub_model_id`: bobox/DeBERTa3-s-CustomPoolin-toytest-step1-checkpoints-tmp |
|
- `hub_strategy`: all_checkpoints |
|
- `batch_sampler`: no_duplicates |
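
The non-default values above map directly onto `SentenceTransformerTrainingArguments`. A minimal sketch of recreating them (the `output_dir` is a placeholder; the other values are taken from the list above):

```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="output",  # placeholder
    eval_strategy="steps",
    per_device_train_batch_size=32,
    per_device_eval_batch_size=256,
    lr_scheduler_type="cosine_with_min_lr",
    lr_scheduler_kwargs={"num_cycles": 0.5, "min_lr": 3.3333333333333337e-06},
    warmup_ratio=0.33,
    save_safetensors=False,
    fp16=True,
    push_to_hub=True,
    hub_model_id="bobox/DeBERTa3-s-CustomPoolin-toytest-step1-checkpoints-tmp",
    hub_strategy="all_checkpoints",
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)
```

These arguments are then passed to a `SentenceTransformerTrainer` together with the model, the training and evaluation datasets, the `GISTEmbedLoss`, and the evaluators described above.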
|
|
|
#### All Hyperparameters |
|
<details><summary>Click to expand</summary> |
|
|
|
- `overwrite_output_dir`: False |
|
- `do_predict`: False |
|
- `eval_strategy`: steps |
|
- `prediction_loss_only`: True |
|
- `per_device_train_batch_size`: 32 |
|
- `per_device_eval_batch_size`: 256 |
|
- `per_gpu_train_batch_size`: None |
|
- `per_gpu_eval_batch_size`: None |
|
- `gradient_accumulation_steps`: 1 |
|
- `eval_accumulation_steps`: None |
|
- `torch_empty_cache_steps`: None |
|
- `learning_rate`: 5e-05 |
|
- `weight_decay`: 0.0 |
|
- `adam_beta1`: 0.9 |
|
- `adam_beta2`: 0.999 |
|
- `adam_epsilon`: 1e-08 |
|
- `max_grad_norm`: 1.0 |
|
- `num_train_epochs`: 3 |
|
- `max_steps`: -1 |
|
- `lr_scheduler_type`: cosine_with_min_lr |
|
- `lr_scheduler_kwargs`: {'num_cycles': 0.5, 'min_lr': 3.3333333333333337e-06} |
|
- `warmup_ratio`: 0.33 |
|
- `warmup_steps`: 0 |
|
- `log_level`: passive |
|
- `log_level_replica`: warning |
|
- `log_on_each_node`: True |
|
- `logging_nan_inf_filter`: True |
|
- `save_safetensors`: False |
|
- `save_on_each_node`: False |
|
- `save_only_model`: False |
|
- `restore_callback_states_from_checkpoint`: False |
|
- `no_cuda`: False |
|
- `use_cpu`: False |
|
- `use_mps_device`: False |
|
- `seed`: 42 |
|
- `data_seed`: None |
|
- `jit_mode_eval`: False |
|
- `use_ipex`: False |
|
- `bf16`: False |
|
- `fp16`: True |
|
- `fp16_opt_level`: O1 |
|
- `half_precision_backend`: auto |
|
- `bf16_full_eval`: False |
|
- `fp16_full_eval`: False |
|
- `tf32`: None |
|
- `local_rank`: 0 |
|
- `ddp_backend`: None |
|
- `tpu_num_cores`: None |
|
- `tpu_metrics_debug`: False |
|
- `debug`: [] |
|
- `dataloader_drop_last`: False |
|
- `dataloader_num_workers`: 0 |
|
- `dataloader_prefetch_factor`: None |
|
- `past_index`: -1 |
|
- `disable_tqdm`: False |
|
- `remove_unused_columns`: True |
|
- `label_names`: None |
|
- `load_best_model_at_end`: False |
|
- `ignore_data_skip`: False |
|
- `fsdp`: [] |
|
- `fsdp_min_num_params`: 0 |
|
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} |
|
- `fsdp_transformer_layer_cls_to_wrap`: None |
|
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} |
|
- `deepspeed`: None |
|
- `label_smoothing_factor`: 0.0 |
|
- `optim`: adamw_torch |
|
- `optim_args`: None |
|
- `adafactor`: False |
|
- `group_by_length`: False |
|
- `length_column_name`: length |
|
- `ddp_find_unused_parameters`: None |
|
- `ddp_bucket_cap_mb`: None |
|
- `ddp_broadcast_buffers`: False |
|
- `dataloader_pin_memory`: True |
|
- `dataloader_persistent_workers`: False |
|
- `skip_memory_metrics`: True |
|
- `use_legacy_prediction_loop`: False |
|
- `push_to_hub`: True |
|
- `resume_from_checkpoint`: None |
|
- `hub_model_id`: bobox/DeBERTa3-s-CustomPoolin-toytest-step1-checkpoints-tmp |
|
- `hub_strategy`: all_checkpoints |
|
- `hub_private_repo`: False |
|
- `hub_always_push`: False |
|
- `gradient_checkpointing`: False |
|
- `gradient_checkpointing_kwargs`: None |
|
- `include_inputs_for_metrics`: False |
|
- `eval_do_concat_batches`: True |
|
- `fp16_backend`: auto |
|
- `push_to_hub_model_id`: None |
|
- `push_to_hub_organization`: None |
|
- `mp_parameters`: |
|
- `auto_find_batch_size`: False |
|
- `full_determinism`: False |
|
- `torchdynamo`: None |
|
- `ray_scope`: last |
|
- `ddp_timeout`: 1800 |
|
- `torch_compile`: False |
|
- `torch_compile_backend`: None |
|
- `torch_compile_mode`: None |
|
- `dispatch_batches`: None |
|
- `split_batches`: None |
|
- `include_tokens_per_second`: False |
|
- `include_num_input_tokens_seen`: False |
|
- `neftune_noise_alpha`: None |
|
- `optim_target_modules`: None |
|
- `batch_eval_metrics`: False |
|
- `eval_on_start`: False |
|
- `eval_use_gather_object`: False |
|
- `batch_sampler`: no_duplicates |
|
- `multi_dataset_batch_sampler`: proportional |
|
|
|
</details> |
|
|
|
### Training Logs |
|
<details><summary>Click to expand</summary> |
|
|
|
| Epoch | Step | Training Loss | Validation Loss | sts-test_spearman_cosine | allNLI-dev_max_ap | Qnli-dev_max_ap | |
|
|:------:|:----:|:-------------:|:---------------:|:------------------------:|:-----------------:|:---------------:| |
|
| 0.0010 | 1 | 4.9603 | - | - | - | - | |
|
| 0.0020 | 2 | 28.2529 | - | - | - | - | |
|
| 0.0030 | 3 | 27.6365 | - | - | - | - | |
|
| 0.0039 | 4 | 6.1387 | - | - | - | - | |
|
| 0.0049 | 5 | 5.5753 | - | - | - | - | |
|
| 0.0059 | 6 | 5.6951 | - | - | - | - | |
|
| 0.0069 | 7 | 6.3533 | - | - | - | - | |
|
| 0.0079 | 8 | 27.3848 | - | - | - | - | |
|
| 0.0089 | 9 | 3.8501 | - | - | - | - | |
|
| 0.0098 | 10 | 27.911 | - | - | - | - | |
|
| 0.0108 | 11 | 4.9042 | - | - | - | - | |
|
| 0.0118 | 12 | 6.8003 | - | - | - | - | |
|
| 0.0128 | 13 | 5.7317 | - | - | - | - | |
|
| 0.0138 | 14 | 20.261 | - | - | - | - | |
|
| 0.0148 | 15 | 27.9051 | - | - | - | - | |
|
| 0.0157 | 16 | 5.5959 | - | - | - | - | |
|
| 0.0167 | 17 | 5.8052 | - | - | - | - | |
|
| 0.0177 | 18 | 4.5088 | - | - | - | - | |
|
| 0.0187 | 19 | 7.3472 | - | - | - | - | |
|
| 0.0197 | 20 | 5.8668 | - | - | - | - | |
|
| 0.0207 | 21 | 6.4083 | - | - | - | - | |
|
| 0.0217 | 22 | 6.011 | - | - | - | - | |
|
| 0.0226 | 23 | 5.2394 | - | - | - | - | |
|
| 0.0236 | 24 | 4.2966 | - | - | - | - | |
|
| 0.0246 | 25 | 26.605 | - | - | - | - | |
|
| 0.0256 | 26 | 6.2067 | - | - | - | - | |
|
| 0.0266 | 27 | 6.0346 | - | - | - | - | |
|
| 0.0276 | 28 | 5.4676 | - | - | - | - | |
|
| 0.0285 | 29 | 6.4292 | - | - | - | - | |
|
| 0.0295 | 30 | 26.6452 | - | - | - | - | |
|
| 0.0305 | 31 | 18.8401 | - | - | - | - | |
|
| 0.0315 | 32 | 7.4531 | - | - | - | - | |
|
| 0.0325 | 33 | 4.8286 | - | - | - | - | |
|
| 0.0335 | 34 | 5.0078 | - | - | - | - | |
|
| 0.0344 | 35 | 5.4115 | - | - | - | - | |
|
| 0.0354 | 36 | 5.4196 | - | - | - | - | |
|
| 0.0364 | 37 | 4.5023 | - | - | - | - | |
|
| 0.0374 | 38 | 5.376 | - | - | - | - | |
|
| 0.0384 | 39 | 5.2303 | - | - | - | - | |
|
| 0.0394 | 40 | 5.6694 | - | - | - | - | |
|
| 0.0404 | 41 | 4.7825 | - | - | - | - | |
|
| 0.0413 | 42 | 4.6507 | - | - | - | - | |
|
| 0.0423 | 43 | 24.2072 | - | - | - | - | |
|
| 0.0433 | 44 | 4.9285 | - | - | - | - | |
|
| 0.0443 | 45 | 6.326 | - | - | - | - | |
|
| 0.0453 | 46 | 4.5724 | - | - | - | - | |
|
| 0.0463 | 47 | 4.754 | - | - | - | - | |
|
| 0.0472 | 48 | 5.5443 | - | - | - | - | |
|
| 0.0482 | 49 | 4.5764 | - | - | - | - | |
|
| 0.0492 | 50 | 5.1434 | - | - | - | - | |
|
| 0.0502 | 51 | 22.6991 | - | - | - | - | |
|
| 0.0512 | 52 | 5.4277 | - | - | - | - | |
|
| 0.0522 | 53 | 5.0178 | - | - | - | - | |
|
| 0.0531 | 54 | 4.8779 | - | - | - | - | |
|
| 0.0541 | 55 | 4.2884 | - | - | - | - | |
|
| 0.0551 | 56 | 16.0994 | - | - | - | - | |
|
| 0.0561 | 57 | 21.31 | - | - | - | - | |
|
| 0.0571 | 58 | 4.9721 | - | - | - | - | |
|
| 0.0581 | 59 | 5.143 | - | - | - | - | |
|
| 0.0591 | 60 | 3.5933 | - | - | - | - | |
|
| 0.0600 | 61 | 5.2559 | - | - | - | - | |
|
| 0.0610 | 62 | 4.0757 | - | - | - | - | |
|
| 0.0620 | 63 | 3.6612 | - | - | - | - | |
|
| 0.0630 | 64 | 4.7505 | - | - | - | - | |
|
| 0.0640 | 65 | 4.1979 | - | - | - | - | |
|
| 0.0650 | 66 | 3.9982 | - | - | - | - | |
|
| 0.0659 | 67 | 4.7065 | - | - | - | - | |
|
| 0.0669 | 68 | 5.3413 | - | - | - | - | |
|
| 0.0679 | 69 | 3.6964 | - | - | - | - | |
|
| 0.0689 | 70 | 17.8774 | - | - | - | - | |
|
| 0.0699 | 71 | 4.8154 | - | - | - | - | |
|
| 0.0709 | 72 | 4.8356 | - | - | - | - | |
|
| 0.0719 | 73 | 4.568 | - | - | - | - | |
|
| 0.0728 | 74 | 4.0898 | - | - | - | - | |
|
| 0.0738 | 75 | 3.4502 | - | - | - | - | |
|
| 0.0748 | 76 | 3.7733 | - | - | - | - | |
|
| 0.0758 | 77 | 4.5204 | - | - | - | - | |
|
| 0.0768 | 78 | 4.2526 | - | - | - | - | |
|
| 0.0778 | 79 | 4.4398 | - | - | - | - | |
|
| 0.0787 | 80 | 4.0988 | - | - | - | - | |
|
| 0.0797 | 81 | 3.9704 | - | - | - | - | |
|
| 0.0807 | 82 | 4.3343 | - | - | - | - | |
|
| 0.0817 | 83 | 4.2587 | - | - | - | - | |
|
| 0.0827 | 84 | 15.0149 | - | - | - | - | |
|
| 0.0837 | 85 | 14.6599 | - | - | - | - | |
|
| 0.0846 | 86 | 4.0623 | - | - | - | - | |
|
| 0.0856 | 87 | 3.7597 | - | - | - | - | |
|
| 0.0866 | 88 | 4.3433 | - | - | - | - | |
|
| 0.0876 | 89 | 4.0287 | - | - | - | - | |
|
| 0.0886 | 90 | 4.6257 | - | - | - | - | |
|
| 0.0896 | 91 | 13.4689 | - | - | - | - | |
|
| 0.0906 | 92 | 4.6583 | - | - | - | - | |
|
| 0.0915 | 93 | 4.2682 | - | - | - | - | |
|
| 0.0925 | 94 | 4.468 | - | - | - | - | |
|
| 0.0935 | 95 | 3.4333 | - | - | - | - | |
|
| 0.0945 | 96 | 12.7654 | - | - | - | - | |
|
| 0.0955 | 97 | 3.5577 | - | - | - | - | |
|
| 0.0965 | 98 | 12.5875 | - | - | - | - | |
|
| 0.0974 | 99 | 4.2206 | - | - | - | - | |
|
| 0.0984 | 100 | 3.5981 | - | - | - | - | |
|
| 0.0994 | 101 | 3.5575 | - | - | - | - | |
|
| 0.1004 | 102 | 4.0271 | - | - | - | - | |
|
| 0.1014 | 103 | 4.0803 | - | - | - | - | |
|
| 0.1024 | 104 | 4.0886 | - | - | - | - | |
|
| 0.1033 | 105 | 4.176 | - | - | - | - | |
|
| 0.1043 | 106 | 4.6653 | - | - | - | - | |
|
| 0.1053 | 107 | 4.3076 | - | - | - | - | |
|
| 0.1063 | 108 | 8.7282 | - | - | - | - | |
|
| 0.1073 | 109 | 3.4192 | - | - | - | - | |
|
| 0.1083 | 110 | 10.6027 | - | - | - | - | |
|
| 0.1093 | 111 | 4.0959 | - | - | - | - | |
|
| 0.1102 | 112 | 4.2785 | - | - | - | - | |
|
| 0.1112 | 113 | 3.9945 | - | - | - | - | |
|
| 0.1122 | 114 | 10.0652 | - | - | - | - | |
|
| 0.1132 | 115 | 3.8621 | - | - | - | - | |
|
| 0.1142 | 116 | 4.3975 | - | - | - | - | |
|
| 0.1152 | 117 | 9.7899 | - | - | - | - | |
|
| 0.1161 | 118 | 4.3812 | - | - | - | - | |
|
| 0.1171 | 119 | 3.8715 | - | - | - | - | |
|
| 0.1181 | 120 | 3.8327 | - | - | - | - | |
|
| 0.1191 | 121 | 3.5103 | - | - | - | - | |
|
| 0.1201 | 122 | 9.3158 | - | - | - | - | |
|
| 0.1211 | 123 | 3.7201 | - | - | - | - | |
|
| 0.1220 | 124 | 3.4311 | - | - | - | - | |
|
| 0.1230 | 125 | 3.7946 | - | - | - | - | |
|
| 0.1240 | 126 | 4.0456 | - | - | - | - | |
|
| 0.125 | 127 | 3.482 | - | - | - | - | |
|
| 0.1260 | 128 | 3.1901 | - | - | - | - | |
|
| 0.1270 | 129 | 3.414 | - | - | - | - | |
|
| 0.1280 | 130 | 3.4967 | - | - | - | - | |
|
| 0.1289 | 131 | 3.6594 | - | - | - | - | |
|
| 0.1299 | 132 | 8.066 | - | - | - | - | |
|
| 0.1309 | 133 | 3.7872 | - | - | - | - | |
|
| 0.1319 | 134 | 4.0023 | - | - | - | - | |
|
| 0.1329 | 135 | 3.7728 | - | - | - | - | |
|
| 0.1339 | 136 | 3.1893 | - | - | - | - | |
|
| 0.1348 | 137 | 3.3635 | - | - | - | - | |
|
| 0.1358 | 138 | 4.0195 | - | - | - | - | |
|
| 0.1368 | 139 | 4.1097 | - | - | - | - | |
|
| 0.1378 | 140 | 3.7903 | - | - | - | - | |
|
| 0.1388 | 141 | 3.5748 | - | - | - | - | |
|
| 0.1398 | 142 | 3.8104 | - | - | - | - | |
|
| 0.1407 | 143 | 8.0411 | - | - | - | - | |
|
| 0.1417 | 144 | 3.4819 | - | - | - | - | |
|
| 0.1427 | 145 | 3.452 | - | - | - | - | |
|
| 0.1437 | 146 | 3.5861 | - | - | - | - | |
|
| 0.1447 | 147 | 3.4324 | - | - | - | - | |
|
| 0.1457 | 148 | 3.521 | - | - | - | - | |
|
| 0.1467 | 149 | 3.8868 | - | - | - | - | |
|
| 0.1476 | 150 | 8.1191 | - | - | - | - | |
|
| 0.1486 | 151 | 3.6447 | - | - | - | - | |
|
| 0.1496 | 152 | 2.9436 | - | - | - | - | |
|
| 0.1506 | 153 | 8.1535 | 2.2032 | 0.2236 | 0.4009 | 0.5892 | |
|
| 0.1516 | 154 | 3.9619 | - | - | - | - | |
|
| 0.1526 | 155 | 3.1301 | - | - | - | - | |
|
| 0.1535 | 156 | 3.0478 | - | - | - | - | |
|
| 0.1545 | 157 | 3.2986 | - | - | - | - | |
|
| 0.1555 | 158 | 3.2847 | - | - | - | - | |
|
| 0.1565 | 159 | 3.6599 | - | - | - | - | |
|
| 0.1575 | 160 | 3.2238 | - | - | - | - | |
|
| 0.1585 | 161 | 2.8897 | - | - | - | - | |
|
| 0.1594 | 162 | 3.9443 | - | - | - | - | |
|
| 0.1604 | 163 | 3.3733 | - | - | - | - | |
|
| 0.1614 | 164 | 3.7444 | - | - | - | - | |
|
| 0.1624 | 165 | 3.4813 | - | - | - | - | |
|
| 0.1634 | 166 | 2.6865 | - | - | - | - | |
|
| 0.1644 | 167 | 2.7587 | - | - | - | - | |
|
| 0.1654 | 168 | 3.3628 | - | - | - | - | |
|
| 0.1663 | 169 | 3.0035 | - | - | - | - | |
|
| 0.1673 | 170 | 10.1591 | - | - | - | - | |
|
| 0.1683 | 171 | 3.5366 | - | - | - | - | |
|
| 0.1693 | 172 | 8.4047 | - | - | - | - | |
|
| 0.1703 | 173 | 3.8643 | - | - | - | - | |
|
| 0.1713 | 174 | 3.3529 | - | - | - | - | |
|
| 0.1722 | 175 | 3.7143 | - | - | - | - | |
|
| 0.1732 | 176 | 3.3323 | - | - | - | - | |
|
| 0.1742 | 177 | 3.1206 | - | - | - | - | |
|
| 0.1752 | 178 | 3.1348 | - | - | - | - | |
|
| 0.1762 | 179 | 7.6011 | - | - | - | - | |
|
| 0.1772 | 180 | 3.7025 | - | - | - | - | |
|
| 0.1781 | 181 | 10.5662 | - | - | - | - | |
|
| 0.1791 | 182 | 8.966 | - | - | - | - | |
|
| 0.1801 | 183 | 9.426 | - | - | - | - | |
|
| 0.1811 | 184 | 3.0025 | - | - | - | - | |
|
| 0.1821 | 185 | 7.0984 | - | - | - | - | |
|
| 0.1831 | 186 | 7.3808 | - | - | - | - | |
|
| 0.1841 | 187 | 2.8657 | - | - | - | - | |
|
| 0.1850 | 188 | 6.5636 | - | - | - | - | |
|
| 0.1860 | 189 | 3.4702 | - | - | - | - | |
|
| 0.1870 | 190 | 5.9302 | - | - | - | - | |
|
| 0.1880 | 191 | 3.2406 | - | - | - | - | |
|
| 0.1890 | 192 | 3.4459 | - | - | - | - | |
|
| 0.1900 | 193 | 5.269 | - | - | - | - | |
|
| 0.1909 | 194 | 4.8605 | - | - | - | - | |
|
| 0.1919 | 195 | 2.9891 | - | - | - | - | |
|
| 0.1929 | 196 | 3.6681 | - | - | - | - | |
|
| 0.1939 | 197 | 3.1589 | - | - | - | - | |
|
| 0.1949 | 198 | 3.1835 | - | - | - | - | |
|
| 0.1959 | 199 | 3.7561 | - | - | - | - | |
|
| 0.1969 | 200 | 4.0891 | - | - | - | - | |
|
| 0.1978 | 201 | 3.563 | - | - | - | - | |
|
| 0.1988 | 202 | 3.7433 | - | - | - | - | |
|
| 0.1998 | 203 | 3.3813 | - | - | - | - | |
|
| 0.2008 | 204 | 5.2311 | - | - | - | - | |
|
| 0.2018 | 205 | 3.3494 | - | - | - | - | |
|
| 0.2028 | 206 | 3.3533 | - | - | - | - | |
|
| 0.2037 | 207 | 3.688 | - | - | - | - | |
|
| 0.2047 | 208 | 3.5342 | - | - | - | - | |
|
| 0.2057 | 209 | 4.9381 | - | - | - | - | |
|
| 0.2067 | 210 | 3.1839 | - | - | - | - | |
|
| 0.2077 | 211 | 3.0465 | - | - | - | - | |
|
| 0.2087 | 212 | 3.1232 | - | - | - | - | |
|
| 0.2096 | 213 | 4.6297 | - | - | - | - | |
|
| 0.2106 | 214 | 2.9834 | - | - | - | - | |
|
| 0.2116 | 215 | 4.2231 | - | - | - | - | |
|
| 0.2126 | 216 | 3.1458 | - | - | - | - | |
|
| 0.2136 | 217 | 3.2525 | - | - | - | - | |
|
| 0.2146 | 218 | 3.5971 | - | - | - | - | |
|
| 0.2156 | 219 | 3.5616 | - | - | - | - | |
|
| 0.2165 | 220 | 3.2378 | - | - | - | - | |
|
| 0.2175 | 221 | 2.9075 | - | - | - | - | |
|
| 0.2185 | 222 | 3.0391 | - | - | - | - | |
|
| 0.2195 | 223 | 3.5573 | - | - | - | - | |
|
| 0.2205 | 224 | 3.2092 | - | - | - | - | |
|
| 0.2215 | 225 | 3.2646 | - | - | - | - | |
|
| 0.2224 | 226 | 3.0886 | - | - | - | - | |
|
| 0.2234 | 227 | 3.5241 | - | - | - | - | |
|
| 0.2244 | 228 | 3.0111 | - | - | - | - | |
|
| 0.2254 | 229 | 3.707 | - | - | - | - | |
|
| 0.2264 | 230 | 5.3822 | - | - | - | - | |
|
| 0.2274 | 231 | 3.2646 | - | - | - | - | |
|
| 0.2283 | 232 | 2.7021 | - | - | - | - | |
|
| 0.2293 | 233 | 3.5131 | - | - | - | - | |
|
| 0.2303 | 234 | 3.103 | - | - | - | - | |
|
| 0.2313 | 235 | 2.9535 | - | - | - | - | |
|
| 0.2323 | 236 | 2.9631 | - | - | - | - | |
|
| 0.2333 | 237 | 2.8068 | - | - | - | - | |
|
| 0.2343 | 238 | 3.4251 | - | - | - | - | |
|
| 0.2352 | 239 | 2.8495 | - | - | - | - | |
|
| 0.2362 | 240 | 2.9972 | - | - | - | - | |
|
| 0.2372 | 241 | 3.3509 | - | - | - | - | |
|
| 0.2382 | 242 | 2.9234 | - | - | - | - | |
|
| 0.2392 | 243 | 2.4086 | - | - | - | - | |
|
| 0.2402 | 244 | 3.1282 | - | - | - | - | |
|
| 0.2411 | 245 | 2.3352 | - | - | - | - | |
|
| 0.2421 | 246 | 2.4706 | - | - | - | - | |
|
| 0.2431 | 247 | 3.5449 | - | - | - | - | |
|
| 0.2441 | 248 | 2.8963 | - | - | - | - | |
|
| 0.2451 | 249 | 2.773 | - | - | - | - | |
|
| 0.2461 | 250 | 2.355 | - | - | - | - | |
|
| 0.2470 | 251 | 2.656 | - | - | - | - | |
|
| 0.2480 | 252 | 2.6221 | - | - | - | - | |
|
| 0.2490 | 253 | 8.6739 | - | - | - | - | |
|
| 0.25 | 254 | 10.8242 | - | - | - | - | |
|
| 0.2510 | 255 | 2.3408 | - | - | - | - | |
|
| 0.2520 | 256 | 2.1221 | - | - | - | - | |
|
| 0.2530 | 257 | 3.295 | - | - | - | - | |
|
| 0.2539 | 258 | 2.5896 | - | - | - | - | |
|
| 0.2549 | 259 | 2.1215 | - | - | - | - | |
|
| 0.2559 | 260 | 9.4851 | - | - | - | - | |
|
| 0.2569 | 261 | 2.1982 | - | - | - | - | |
|
| 0.2579 | 262 | 3.0568 | - | - | - | - | |
|
| 0.2589 | 263 | 2.6269 | - | - | - | - | |
|
| 0.2598 | 264 | 2.4792 | - | - | - | - | |
|
| 0.2608 | 265 | 1.9445 | - | - | - | - | |
|
| 0.2618 | 266 | 2.4061 | - | - | - | - | |
|
| 0.2628 | 267 | 8.3116 | - | - | - | - | |
|
| 0.2638 | 268 | 8.0804 | - | - | - | - | |
|
| 0.2648 | 269 | 2.1674 | - | - | - | - | |
|
| 0.2657 | 270 | 7.1975 | - | - | - | - | |
|
| 0.2667 | 271 | 5.9104 | - | - | - | - | |
|
| 0.2677 | 272 | 2.498 | - | - | - | - | |
|
| 0.2687 | 273 | 2.5249 | - | - | - | - | |
|
| 0.2697 | 274 | 2.7152 | - | - | - | - | |
|
| 0.2707 | 275 | 2.7904 | - | - | - | - | |
|
| 0.2717 | 276 | 2.7745 | - | - | - | - | |
|
| 0.2726 | 277 | 2.9741 | - | - | - | - | |
|
| 0.2736 | 278 | 1.8215 | - | - | - | - | |
|
| 0.2746 | 279 | 4.6844 | - | - | - | - | |
|
| 0.2756 | 280 | 2.8613 | - | - | - | - | |
|
| 0.2766 | 281 | 2.7147 | - | - | - | - | |
|
| 0.2776 | 282 | 2.814 | - | - | - | - | |
|
| 0.2785 | 283 | 2.3569 | - | - | - | - | |
|
| 0.2795 | 284 | 2.672 | - | - | - | - | |
|
| 0.2805 | 285 | 3.2052 | - | - | - | - | |
|
| 0.2815 | 286 | 2.8056 | - | - | - | - | |
|
| 0.2825 | 287 | 2.6268 | - | - | - | - | |
|
| 0.2835 | 288 | 2.5641 | - | - | - | - | |
|
| 0.2844 | 289 | 2.4475 | - | - | - | - | |
|
| 0.2854 | 290 | 2.7377 | - | - | - | - | |
|
| 0.2864 | 291 | 2.3831 | - | - | - | - | |
|
| 0.2874 | 292 | 8.8069 | - | - | - | - | |
|
| 0.2884 | 293 | 2.186 | - | - | - | - | |
|
| 0.2894 | 294 | 2.3389 | - | - | - | - | |
|
| 0.2904 | 295 | 1.9744 | - | - | - | - | |
|
| 0.2913 | 296 | 2.4491 | - | - | - | - | |
|
| 0.2923 | 297 | 2.5668 | - | - | - | - | |
|
| 0.2933 | 298 | 2.1939 | - | - | - | - | |
|
| 0.2943 | 299 | 2.2832 | - | - | - | - | |
|
| 0.2953 | 300 | 2.7508 | - | - | - | - | |
|
| 0.2963 | 301 | 2.5206 | - | - | - | - | |
|
| 0.2972 | 302 | 2.3522 | - | - | - | - | |
|
| 0.2982 | 303 | 2.7186 | - | - | - | - | |
|
| 0.2992 | 304 | 2.1369 | - | - | - | - | |
|
| 0.3002 | 305 | 9.7972 | - | - | - | - | |
|
| 0.3012 | 306 | 1.9378 | 1.5786 | 0.2924 | 0.4272 | 0.6159 | |
|
| 0.3022 | 307 | 2.5365 | - | - | - | - | |
|
| 0.3031 | 308 | 2.0346 | - | - | - | - | |
|
| 0.3041 | 309 | 2.0721 | - | - | - | - | |
|
| 0.3051 | 310 | 2.6966 | - | - | - | - | |
|
| 0.3061 | 311 | 2.6757 | - | - | - | - | |
|
| 0.3071 | 312 | 10.6395 | - | - | - | - | |
|
| 0.3081 | 313 | 2.8671 | - | - | - | - | |
|
| 0.3091 | 314 | 2.0144 | - | - | - | - | |
|
| 0.3100 | 315 | 9.9338 | - | - | - | - | |
|
| 0.3110 | 316 | 2.6167 | - | - | - | - | |
|
| 0.3120 | 317 | 2.1342 | - | - | - | - | |
|
| 0.3130 | 318 | 9.0369 | - | - | - | - | |
|
| 0.3140 | 319 | 2.0182 | - | - | - | - | |
|
| 0.3150 | 320 | 2.2189 | - | - | - | - | |
|
| 0.3159 | 321 | 1.9667 | - | - | - | - | |
|
| 0.3169 | 322 | 2.3371 | - | - | - | - | |
|
| 0.3179 | 323 | 6.9866 | - | - | - | - | |
|
| 0.3189 | 324 | 1.6119 | - | - | - | - | |
|
| 0.3199 | 325 | 1.8615 | - | - | - | - | |
|
| 0.3209 | 326 | 2.1708 | - | - | - | - | |
|
| 0.3219 | 327 | 2.0174 | - | - | - | - | |
|
| 0.3228 | 328 | 6.7891 | - | - | - | - | |
|
| 0.3238 | 329 | 2.155 | - | - | - | - | |
|
| 0.3248 | 330 | 2.4636 | - | - | - | - | |
|
| 0.3258 | 331 | 1.9844 | - | - | - | - | |
|
| 0.3268 | 332 | 1.9035 | - | - | - | - | |
|
| 0.3278 | 333 | 2.0729 | - | - | - | - | |
|
| 0.3287 | 334 | 1.5715 | - | - | - | - | |
|
| 0.3297 | 335 | 2.7211 | - | - | - | - | |
|
| 0.3307 | 336 | 2.0351 | - | - | - | - | |
|
| 0.3317 | 337 | 2.4049 | - | - | - | - | |
|
| 0.3327 | 338 | 2.3939 | - | - | - | - | |
|
| 0.3337 | 339 | 1.7353 | - | - | - | - | |
|
| 0.3346 | 340 | 1.8393 | - | - | - | - | |
|
| 0.3356 | 341 | 2.2874 | - | - | - | - | |
|
| 0.3366 | 342 | 1.8566 | - | - | - | - | |
|
| 0.3376 | 343 | 2.2676 | - | - | - | - | |
|
| 0.3386 | 344 | 1.7895 | - | - | - | - | |
|
| 0.3396 | 345 | 2.2506 | - | - | - | - | |
|
| 0.3406 | 346 | 1.5613 | - | - | - | - | |
|
| 0.3415 | 347 | 2.3531 | - | - | - | - | |
|
| 0.3425 | 348 | 1.99 | - | - | - | - | |
|
| 0.3435 | 349 | 12.0831 | - | - | - | - | |
|
| 0.3445 | 350 | 2.0959 | - | - | - | - | |
|
| 0.3455 | 351 | 2.0641 | - | - | - | - | |
|
| 0.3465 | 352 | 1.9197 | - | - | - | - | |
|
| 0.3474 | 353 | 1.9382 | - | - | - | - | |
|
| 0.3484 | 354 | 2.3819 | - | - | - | - | |
|
| 0.3494 | 355 | 1.6053 | - | - | - | - | |
|
| 0.3504 | 356 | 2.4719 | - | - | - | - | |
|
| 0.3514 | 357 | 1.5602 | - | - | - | - | |
|
| 0.3524 | 358 | 2.1675 | - | - | - | - | |
|
| 0.3533 | 359 | 11.5856 | - | - | - | - | |
|
| 0.3543 | 360 | 9.3718 | - | - | - | - | |
|
| 0.3553 | 361 | 1.8952 | - | - | - | - | |
|
| 0.3563 | 362 | 1.701 | - | - | - | - | |
|
| 0.3573 | 363 | 1.46 | - | - | - | - | |
|
| 0.3583 | 364 | 1.7913 | - | - | - | - | |
|
| 0.3593 | 365 | 9.1152 | - | - | - | - | |
|
| 0.3602 | 366 | 9.2681 | - | - | - | - | |
|
| 0.3612 | 367 | 2.2932 | - | - | - | - | |
|
| 0.3622 | 368 | 1.7176 | - | - | - | - | |
|
| 0.3632 | 369 | 2.2559 | - | - | - | - | |
|
| 0.3642 | 370 | 1.9846 | - | - | - | - | |
|
| 0.3652 | 371 | 1.8022 | - | - | - | - | |
|
| 0.3661 | 372 | 8.1128 | - | - | - | - | |
|
| 0.3671 | 373 | 6.929 | - | - | - | - | |
|
| 0.3681 | 374 | 1.9038 | - | - | - | - | |
|
| 0.3691 | 375 | 1.3899 | - | - | - | - | |
|
| 0.3701 | 376 | 1.5677 | - | - | - | - | |
|
| 0.3711 | 377 | 5.2357 | - | - | - | - | |
|
| 0.3720 | 378 | 2.2304 | - | - | - | - | |
|
| 0.3730 | 379 | 2.1727 | - | - | - | - | |
|
| 0.3740 | 380 | 2.2941 | - | - | - | - | |
|
| 0.375 | 381 | 2.2257 | - | - | - | - | |
|
| 0.3760 | 382 | 1.7489 | - | - | - | - | |
|
| 0.3770 | 383 | 1.5027 | - | - | - | - | |
|
| 0.3780 | 384 | 1.6917 | - | - | - | - | |
|
| 0.3789 | 385 | 5.7867 | - | - | - | - | |
|
| 0.3799 | 386 | 1.6871 | - | - | - | - | |
|
| 0.3809 | 387 | 1.5652 | - | - | - | - | |
|
| 0.3819 | 388 | 2.1691 | - | - | - | - | |
|
| 0.3829 | 389 | 1.869 | - | - | - | - | |
|
| 0.3839 | 390 | 2.1934 | - | - | - | - | |
|
| 0.3848 | 391 | 7.0152 | - | - | - | - | |
|
| 0.3858 | 392 | 2.0454 | - | - | - | - | |
|
| 0.3868 | 393 | 1.8098 | - | - | - | - | |
|
| 0.3878 | 394 | 5.7529 | - | - | - | - | |
|
| 0.3888 | 395 | 1.3949 | - | - | - | - | |
|
| 0.3898 | 396 | 1.5962 | - | - | - | - | |
|
| 0.3907 | 397 | 6.1436 | - | - | - | - | |
|
| 0.3917 | 398 | 5.2979 | - | - | - | - | |
|
| 0.3927 | 399 | 1.2422 | - | - | - | - | |
|
| 0.3937 | 400 | 2.1152 | - | - | - | - | |
|
| 0.3947 | 401 | 1.6679 | - | - | - | - | |
|
| 0.3957 | 402 | 4.2978 | - | - | - | - | |
|
| 0.3967 | 403 | 1.624 | - | - | - | - | |
|
| 0.3976 | 404 | 2.0267 | - | - | - | - | |
|
| 0.3986 | 405 | 1.3975 | - | - | - | - | |
|
| 0.3996 | 406 | 1.905 | - | - | - | - | |
|
| 0.4006 | 407 | 5.4419 | - | - | - | - | |
|
| 0.4016 | 408 | 2.0008 | - | - | - | - | |
|
| 0.4026 | 409 | 1.8387 | - | - | - | - | |
|
| 0.4035 | 410 | 2.2391 | - | - | - | - | |
|
| 0.4045 | 411 | 1.7153 | - | - | - | - | |
|
| 0.4055 | 412 | 2.1533 | - | - | - | - | |
|
| 0.4065 | 413 | 1.788 | - | - | - | - | |
|
| 0.4075 | 414 | 3.482 | - | - | - | - | |
|
| 0.4085 | 415 | 1.8376 | - | - | - | - | |
|
| 0.4094 | 416 | 4.8811 | - | - | - | - | |
|
| 0.4104 | 417 | 1.9421 | - | - | - | - | |
|
| 0.4114 | 418 | 1.4796 | - | - | - | - | |
|
| 0.4124 | 419 | 1.6209 | - | - | - | - | |
|
| 0.4134 | 420 | 1.8734 | - | - | - | - | |
|
| 0.4144 | 421 | 1.9444 | - | - | - | - | |
|
| 0.4154 | 422 | 1.9581 | - | - | - | - | |
|
| 0.4163 | 423 | 1.5175 | - | - | - | - | |
|
| 0.4173 | 424 | 1.2831 | - | - | - | - | |
|
| 0.4183 | 425 | 1.1355 | - | - | - | - | |
|
| 0.4193 | 426 | 1.864 | - | - | - | - | |
|
| 0.4203 | 427 | 5.1574 | - | - | - | - | |
|
| 0.4213 | 428 | 5.323 | - | - | - | - | |
|
| 0.4222 | 429 | 1.385 | - | - | - | - | |
|
| 0.4232 | 430 | 1.1691 | - | - | - | - | |
|
| 0.4242 | 431 | 1.8994 | - | - | - | - | |
|
| 0.4252 | 432 | 5.4254 | - | - | - | - | |
|
| 0.4262 | 433 | 1.9113 | - | - | - | - | |
|
| 0.4272 | 434 | 2.1108 | - | - | - | - | |
|
| 0.4281 | 435 | 1.7012 | - | - | - | - | |
|
| 0.4291 | 436 | 1.5722 | - | - | - | - | |
|
| 0.4301 | 437 | 1.5967 | - | - | - | - | |
|
| 0.4311 | 438 | 5.609 | - | - | - | - | |
|
| 0.4321 | 439 | 1.4444 | - | - | - | - | |
|
| 0.4331 | 440 | 5.3153 | - | - | - | - | |
|
| 0.4341 | 441 | 5.0934 | - | - | - | - | |
|
| 0.4350 | 442 | 1.3028 | - | - | - | - | |
|
| 0.4360 | 443 | 1.263 | - | - | - | - | |
|
| 0.4370 | 444 | 1.8462 | - | - | - | - | |
|
| 0.4380 | 445 | 2.1533 | - | - | - | - | |
|
| 0.4390 | 446 | 1.5467 | - | - | - | - | |
|
| 0.4400 | 447 | 1.4331 | - | - | - | - | |
|
| 0.4409 | 448 | 1.4416 | - | - | - | - | |
|
| 0.4419 | 449 | 1.5976 | - | - | - | - | |
|
| 0.4429 | 450 | 1.8723 | - | - | - | - | |
|
| 0.4439 | 451 | 1.1753 | - | - | - | - | |
|
| 0.4449 | 452 | 2.3205 | - | - | - | - | |
|
| 0.4459 | 453 | 1.6467 | - | - | - | - | |
|
| 0.4469 | 454 | 0.9322 | - | - | - | - | |
|
| 0.4478 | 455 | 1.958 | - | - | - | - | |
|
| 0.4488 | 456 | 1.8746 | - | - | - | - | |
|
| 0.4498 | 457 | 1.4546 | - | - | - | - | |
|
| 0.4508 | 458 | 0.9795 | - | - | - | - | |
|
| 0.4518 | 459 | 1.5458 | 1.2676 | 0.2751 | 0.4485 | 0.6433 | |
|
| 0.4528 | 460 | 1.6558 | - | - | - | - | |
|
| 0.4537 | 461 | 1.389 | - | - | - | - | |
|
| 0.4547 | 462 | 1.5608 | - | - | - | - | |
|
| 0.4557 | 463 | 1.6618 | - | - | - | - | |
|
| 0.4567 | 464 | 1.5122 | - | - | - | - | |
|
| 0.4577 | 465 | 1.3602 | - | - | - | - | |
|
| 0.4587 | 466 | 1.6714 | - | - | - | - | |
|
| 0.4596 | 467 | 1.0644 | - | - | - | - | |
|
| 0.4606 | 468 | 7.6421 | - | - | - | - | |
|
| 0.4616 | 469 | 1.2987 | - | - | - | - | |
|
| 0.4626 | 470 | 1.4231 | - | - | - | - | |
|
| 0.4636 | 471 | 7.7424 | - | - | - | - | |
|
| 0.4646 | 472 | 1.6811 | - | - | - | - | |
|
| 0.4656 | 473 | 1.1814 | - | - | - | - | |
|
| 0.4665 | 474 | 1.4486 | - | - | - | - | |
|
| 0.4675 | 475 | 1.3892 | - | - | - | - | |
|
| 0.4685 | 476 | 1.3681 | - | - | - | - | |
|
| 0.4695 | 477 | 1.3081 | - | - | - | - | |
|
| 0.4705 | 478 | 0.9102 | - | - | - | - | |
|
| 0.4715 | 479 | 1.0992 | - | - | - | - | |
|
| 0.4724 | 480 | 6.018 | - | - | - | - | |
|
| 0.4734 | 481 | 6.0908 | - | - | - | - | |
|
| 0.4744 | 482 | 1.2245 | - | - | - | - | |
|
| 0.4754 | 483 | 1.4825 | - | - | - | - | |
|
| 0.4764 | 484 | 1.8037 | - | - | - | - | |
|
| 0.4774 | 485 | 1.3611 | - | - | - | - | |
|
| 0.4783 | 486 | 1.7482 | - | - | - | - | |
|
| 0.4793 | 487 | 1.6385 | - | - | - | - | |
|
| 0.4803 | 488 | 1.3245 | - | - | - | - | |
|
| 0.4813 | 489 | 1.5638 | - | - | - | - | |
|
| 0.4823 | 490 | 1.566 | - | - | - | - | |
|
| 0.4833 | 491 | 1.9482 | - | - | - | - | |
|
| 0.4843 | 492 | 6.0859 | - | - | - | - | |
|
| 0.4852 | 493 | 5.8754 | - | - | - | - | |
|
| 0.4862 | 494 | 0.9964 | - | - | - | - | |
|
| 0.4872 | 495 | 1.5949 | - | - | - | - | |
|
| 0.4882 | 496 | 1.3167 | - | - | - | - | |
|
| 0.4892 | 497 | 3.9345 | - | - | - | - | |
|
| 0.4902 | 498 | 4.3886 | - | - | - | - | |
|
| 0.4911 | 499 | 1.6124 | - | - | - | - | |
|
| 0.4921 | 500 | 1.2145 | - | - | - | - | |
|
| 0.4931 | 501 | 3.5499 | - | - | - | - | |
|
| 0.4941 | 502 | 1.2999 | - | - | - | - | |
|
| 0.4951 | 503 | 1.2375 | - | - | - | - | |
|
| 0.4961 | 504 | 1.1606 | - | - | - | - | |
|
| 0.4970 | 505 | 1.4634 | - | - | - | - | |
|
| 0.4980 | 506 | 1.35 | - | - | - | - | |
|
| 0.4990 | 507 | 1.7187 | - | - | - | - | |
|
| 0.5000 | 508 | 1.5915 | - | - | - | - | |
|
| 0.5010 | 509 | 1.2357 | - | - | - | - | |
|
| 0.5020 | 510 | 3.4122 | - | - | - | - | |
|
| 0.5030 | 511 | 4.244 | - | - | - | - | |
|
| 0.5039 | 512 | 0.9151 | - | - | - | - | |
|
| 0.5049 | 513 | 1.4323 | - | - | - | - | |
|
| 0.5059 | 514 | 1.4824 | - | - | - | - | |
|
| 0.5069 | 515 | 1.339 | - | - | - | - | |
|
| 0.5079 | 516 | 4.1658 | - | - | - | - | |
|
| 0.5089 | 517 | 1.3062 | - | - | - | - | |
|
| 0.5098 | 518 | 1.2905 | - | - | - | - | |
|
| 0.5108 | 519 | 1.1487 | - | - | - | - | |
|
| 0.5118 | 520 | 2.8652 | - | - | - | - | |
|
| 0.5128 | 521 | 1.2634 | - | - | - | - | |
|
| 0.5138 | 522 | 1.6745 | - | - | - | - | |
|
| 0.5148 | 523 | 1.6548 | - | - | - | - | |
|
| 0.5157 | 524 | 2.4204 | - | - | - | - | |
|
| 0.5167 | 525 | 1.7201 | - | - | - | - | |
|
| 0.5177 | 526 | 1.761 | - | - | - | - | |
|
| 0.5187 | 527 | 2.7098 | - | - | - | - | |
|
| 0.5197 | 528 | 1.6425 | - | - | - | - | |
|
| 0.5207 | 529 | 1.2466 | - | - | - | - | |
|
| 0.5217 | 530 | 1.3339 | - | - | - | - | |
|
| 0.5226 | 531 | 1.2398 | - | - | - | - | |
|
| 0.5236 | 532 | 3.5325 | - | - | - | - | |
|
| 0.5246 | 533 | 1.1303 | - | - | - | - | |
|
| 0.5256 | 534 | 1.2601 | - | - | - | - | |
|
| 0.5266 | 535 | 1.5762 | - | - | - | - | |
|
| 0.5276 | 536 | 1.3992 | - | - | - | - | |
|
| 0.5285 | 537 | 1.7125 | - | - | - | - | |
|
| 0.5295 | 538 | 3.6759 | - | - | - | - | |
|
| 0.5305 | 539 | 1.5468 | - | - | - | - | |
|
| 0.5315 | 540 | 1.4316 | - | - | - | - | |
|
| 0.5325 | 541 | 1.2797 | - | - | - | - | |
|
| 0.5335 | 542 | 1.9122 | - | - | - | - | |
|
| 0.5344 | 543 | 2.0367 | - | - | - | - | |
|
| 0.5354 | 544 | 3.3029 | - | - | - | - | |
|
| 0.5364 | 545 | 3.9263 | - | - | - | - | |
|
| 0.5374 | 546 | 3.0101 | - | - | - | - | |
|
| 0.5384 | 547 | 3.3555 | - | - | - | - | |
|
| 0.5394 | 548 | 1.2068 | - | - | - | - | |
|
| 0.5404 | 549 | 1.1566 | - | - | - | - | |
|
| 0.5413 | 550 | 1.2773 | - | - | - | - | |
|
| 0.5423 | 551 | 1.4047 | - | - | - | - | |
|
| 0.5433 | 552 | 1.6048 | - | - | - | - | |
|
| 0.5443 | 553 | 1.217 | - | - | - | - | |
|
| 0.5453 | 554 | 1.8104 | - | - | - | - | |
|
| 0.5463 | 555 | 1.687 | - | - | - | - | |
|
| 0.5472 | 556 | 1.6702 | - | - | - | - | |
|
| 0.5482 | 557 | 1.7011 | - | - | - | - | |
|
| 0.5492 | 558 | 1.7341 | - | - | - | - | |
|
| 0.5502 | 559 | 1.5006 | - | - | - | - | |
|
| 0.5512 | 560 | 1.2778 | - | - | - | - | |
|
| 0.5522 | 561 | 1.5081 | - | - | - | - | |
|
| 0.5531 | 562 | 1.2398 | - | - | - | - | |
|
| 0.5541 | 563 | 1.1054 | - | - | - | - | |
|
| 0.5551 | 564 | 4.0185 | - | - | - | - | |
|
| 0.5561 | 565 | 1.0427 | - | - | - | - | |
|
| 0.5571 | 566 | 1.3934 | - | - | - | - | |
|
| 0.5581 | 567 | 1.2378 | - | - | - | - | |
|
| 0.5591 | 568 | 1.022 | - | - | - | - | |
|
| 0.5600 | 569 | 0.9001 | - | - | - | - | |
|
| 0.5610 | 570 | 1.3279 | - | - | - | - | |
|
| 0.5620 | 571 | 1.2889 | - | - | - | - | |
|
| 0.5630 | 572 | 0.9383 | - | - | - | - | |
|
| 0.5640 | 573 | 1.749 | - | - | - | - | |
|
| 0.5650 | 574 | 0.7669 | - | - | - | - | |
|
| 0.5659 | 575 | 0.9355 | - | - | - | - | |
|
| 0.5669 | 576 | 1.3596 | - | - | - | - | |
|
| 0.5679 | 577 | 5.5102 | - | - | - | - | |
|
| 0.5689 | 578 | 0.7984 | - | - | - | - | |
|
| 0.5699 | 579 | 0.8871 | - | - | - | - | |
|
| 0.5709 | 580 | 1.1151 | - | - | - | - | |
|
| 0.5719 | 581 | 0.9502 | - | - | - | - | |
|
| 0.5728 | 582 | 3.6492 | - | - | - | - | |
|
| 0.5738 | 583 | 3.4262 | - | - | - | - | |
|
| 0.5748 | 584 | 1.3362 | - | - | - | - | |
|
| 0.5758 | 585 | 0.9015 | - | - | - | - | |
|
| 0.5768 | 586 | 1.5884 | - | - | - | - | |
|
| 0.5778 | 587 | 1.109 | - | - | - | - | |
|
| 0.5787 | 588 | 1.041 | - | - | - | - | |
|
| 0.5797 | 589 | 1.4892 | - | - | - | - | |
|
| 0.5807 | 590 | 1.2623 | - | - | - | - | |
|
| 0.5817 | 591 | 1.5302 | - | - | - | - | |
|
| 0.5827 | 592 | 1.3517 | - | - | - | - | |
|
| 0.5837 | 593 | 0.6166 | - | - | - | - | |
|
| 0.5846 | 594 | 1.6761 | - | - | - | - | |
|
| 0.5856 | 595 | 1.1115 | - | - | - | - | |
|
| 0.5866 | 596 | 1.2945 | - | - | - | - | |
|
| 0.5876 | 597 | 1.4378 | - | - | - | - | |
|
| 0.5886 | 598 | 0.9928 | - | - | - | - | |
|
| 0.5896 | 599 | 0.9898 | - | - | - | - | |
|
| 0.5906 | 600 | 4.6887 | - | - | - | - | |
|
| 0.5915 | 601 | 1.2254 | - | - | - | - | |
|
| 0.5925 | 602 | 1.2707 | - | - | - | - | |
|
| 0.5935 | 603 | 1.8289 | - | - | - | - | |
|
| 0.5945 | 604 | 0.7801 | - | - | - | - | |
|
| 0.5955 | 605 | 0.9111 | - | - | - | - | |
|
| 0.5965 | 606 | 1.1405 | - | - | - | - | |
|
| 0.5974 | 607 | 1.0497 | - | - | - | - | |
|
| 0.5984 | 608 | 1.0792 | - | - | - | - | |
|
| 0.5994 | 609 | 0.9699 | - | - | - | - | |
|
| 0.6004 | 610 | 0.9398 | - | - | - | - | |
|
| 0.6014 | 611 | 1.5483 | - | - | - | - | |
|
| 0.6024 | 612 | 0.997 | 1.0047 | 0.3980 | 0.4554 | 0.6701 | |
|
| 0.6033 | 613 | 0.8358 | - | - | - | - | |
|
| 0.6043 | 614 | 1.211 | - | - | - | - | |
|
| 0.6053 | 615 | 6.7813 | - | - | - | - | |
|
| 0.6063 | 616 | 1.1229 | - | - | - | - | |
|
| 0.6073 | 617 | 1.0317 | - | - | - | - | |
|
| 0.6083 | 618 | 1.2123 | - | - | - | - | |
|
| 0.6093 | 619 | 1.4073 | - | - | - | - | |
|
| 0.6102 | 620 | 0.9951 | - | - | - | - | |
|
| 0.6112 | 621 | 1.3166 | - | - | - | - | |
|
| 0.6122 | 622 | 4.5204 | - | - | - | - | |
|
| 0.6132 | 623 | 0.6539 | - | - | - | - | |
|
| 0.6142 | 624 | 1.1959 | - | - | - | - | |
|
| 0.6152 | 625 | 4.2551 | - | - | - | - | |
|
| 0.6161 | 626 | 1.2459 | - | - | - | - | |
|
| 0.6171 | 627 | 1.3758 | - | - | - | - | |
|
| 0.6181 | 628 | 1.0524 | - | - | - | - | |
|
| 0.6191 | 629 | 1.5197 | - | - | - | - | |
|
| 0.6201 | 630 | 1.0201 | - | - | - | - | |
|
| 0.6211 | 631 | 0.9007 | - | - | - | - | |
|
| 0.6220 | 632 | 0.8418 | - | - | - | - | |
|
| 0.6230 | 633 | 1.4343 | - | - | - | - | |
|
| 0.6240 | 634 | 0.5292 | - | - | - | - | |
|
| 0.6250 | 635 | 0.8549 | - | - | - | - | |
|
| 0.6260 | 636 | 0.8703 | - | - | - | - | |
|
| 0.6270 | 637 | 0.9911 | - | - | - | - | |
|
| 0.6280 | 638 | 1.3342 | - | - | - | - | |
|
| 0.6289 | 639 | 1.1332 | - | - | - | - | |
|
| 0.6299 | 640 | 3.9965 | - | - | - | - | |
|
| 0.6309 | 641 | 0.7236 | - | - | - | - | |
|
| 0.6319 | 642 | 0.9079 | - | - | - | - | |
|
| 0.6329 | 643 | 1.0967 | - | - | - | - | |
|
| 0.6339 | 644 | 1.4183 | - | - | - | - | |
|
| 0.6348 | 645 | 1.3841 | - | - | - | - | |
|
| 0.6358 | 646 | 1.2982 | - | - | - | - | |
|
| 0.6368 | 647 | 0.9048 | - | - | - | - | |
|
| 0.6378 | 648 | 0.7918 | - | - | - | - | |
|
| 0.6388 | 649 | 0.3685 | - | - | - | - | |
|
| 0.6398 | 650 | 0.6949 | - | - | - | - | |
|
| 0.6407 | 651 | 5.1568 | - | - | - | - | |
|
| 0.6417 | 652 | 1.3943 | - | - | - | - | |
|
| 0.6427 | 653 | 0.8608 | - | - | - | - | |
|
| 0.6437 | 654 | 0.8197 | - | - | - | - | |
|
| 0.6447 | 655 | 0.822 | - | - | - | - | |
|
| 0.6457 | 656 | 3.2918 | - | - | - | - | |
|
| 0.6467 | 657 | 0.5596 | - | - | - | - | |
|
| 0.6476 | 658 | 4.1499 | - | - | - | - | |
|
| 0.6486 | 659 | 1.0279 | - | - | - | - | |
|
| 0.6496 | 660 | 1.1506 | - | - | - | - | |
|
| 0.6506 | 661 | 1.1673 | - | - | - | - | |
|
| 0.6516 | 662 | 0.96 | - | - | - | - | |
|
| 0.6526 | 663 | 3.5414 | - | - | - | - | |
|
| 0.6535 | 664 | 0.6599 | - | - | - | - | |
|
| 0.6545 | 665 | 3.5518 | - | - | - | - | |
|
| 0.6555 | 666 | 1.1906 | - | - | - | - | |
|
| 0.6565 | 667 | 2.1353 | - | - | - | - | |
|
| 0.6575 | 668 | 0.7083 | - | - | - | - | |
|
| 0.6585 | 669 | 2.9425 | - | - | - | - | |
|
| 0.6594 | 670 | 0.9433 | - | - | - | - | |
|
| 0.6604 | 671 | 1.8499 | - | - | - | - | |
|
| 0.6614 | 672 | 1.1614 | - | - | - | - | |
|
| 0.6624 | 673 | 1.0474 | - | - | - | - | |
|
| 0.6634 | 674 | 1.2895 | - | - | - | - | |
|
| 0.6644 | 675 | 0.9789 | - | - | - | - | |
|
| 0.6654 | 676 | 0.7719 | - | - | - | - | |
|
| 0.6663 | 677 | 1.2203 | - | - | - | - | |
|
| 0.6673 | 678 | 1.0516 | - | - | - | - | |
|
| 0.6683 | 679 | 2.5514 | - | - | - | - | |
|
| 0.6693 | 680 | 0.7346 | - | - | - | - | |
|
| 0.6703 | 681 | 1.0245 | - | - | - | - | |
|
| 0.6713 | 682 | 2.8005 | - | - | - | - | |
|
| 0.6722 | 683 | 1.3212 | - | - | - | - | |
|
| 0.6732 | 684 | 0.95 | - | - | - | - | |
|
| 0.6742 | 685 | 1.0483 | - | - | - | - | |
|
| 0.6752 | 686 | 0.8504 | - | - | - | - | |
|
| 0.6762 | 687 | 2.281 | - | - | - | - | |
|
| 0.6772 | 688 | 1.8153 | - | - | - | - | |
|
| 0.6781 | 689 | 1.3652 | - | - | - | - | |
|
| 0.6791 | 690 | 1.0949 | - | - | - | - | |
|
| 0.6801 | 691 | 1.2196 | - | - | - | - | |
|
| 0.6811 | 692 | 0.7995 | - | - | - | - | |
|
| 0.6821 | 693 | 1.5108 | - | - | - | - | |
|
| 0.6831 | 694 | 0.7933 | - | - | - | - | |
|
| 0.6841 | 695 | 1.2367 | - | - | - | - | |
|
| 0.6850 | 696 | 1.0352 | - | - | - | - | |
|
| 0.6860 | 697 | 1.1709 | - | - | - | - | |
|
| 0.6870 | 698 | 1.452 | - | - | - | - | |
|
| 0.6880 | 699 | 0.8497 | - | - | - | - | |
|
| 0.6890 | 700 | 2.8109 | - | - | - | - | |
|
| 0.6900 | 701 | 2.6196 | - | - | - | - | |
|
| 0.6909 | 702 | 1.4556 | - | - | - | - | |
|
| 0.6919 | 703 | 1.3494 | - | - | - | - | |
|
| 0.6929 | 704 | 1.6624 | - | - | - | - | |
|
| 0.6939 | 705 | 1.6169 | - | - | - | - | |
|
| 0.6949 | 706 | 0.5565 | - | - | - | - | |
|
| 0.6959 | 707 | 0.8594 | - | - | - | - | |
|
| 0.6969 | 708 | 0.8551 | - | - | - | - | |
|
| 0.6978 | 709 | 1.1693 | - | - | - | - | |
|
| 0.6988 | 710 | 1.0514 | - | - | - | - | |
|
| 0.6998 | 711 | 1.1862 | - | - | - | - | |
|
| 0.7008 | 712 | 0.8359 | - | - | - | - | |
|
| 0.7018 | 713 | 0.7692 | - | - | - | - | |
|
| 0.7028 | 714 | 1.815 | - | - | - | - | |
|
| 0.7037 | 715 | 1.0751 | - | - | - | - | |
|
| 0.7047 | 716 | 0.6526 | - | - | - | - | |
|
| 0.7057 | 717 | 1.1617 | - | - | - | - | |
|
| 0.7067 | 718 | 1.0783 | - | - | - | - | |
|
| 0.7077 | 719 | 0.7916 | - | - | - | - | |
|
| 0.7087 | 720 | 1.3039 | - | - | - | - | |
|
| 0.7096 | 721 | 1.1156 | - | - | - | - | |
|
| 0.7106 | 722 | 1.0529 | - | - | - | - | |
|
| 0.7116 | 723 | 0.8265 | - | - | - | - | |
|
| 0.7126 | 724 | 0.8019 | - | - | - | - | |
|
| 0.7136 | 725 | 0.6116 | - | - | - | - | |
|
| 0.7146 | 726 | 1.135 | - | - | - | - | |
|
| 0.7156 | 727 | 0.7692 | - | - | - | - | |
|
| 0.7165 | 728 | 2.3559 | - | - | - | - | |
|
| 0.7175 | 729 | 1.352 | - | - | - | - | |
|
| 0.7185 | 730 | 2.823 | - | - | - | - | |
|
| 0.7195 | 731 | 1.0067 | - | - | - | - | |
|
| 0.7205 | 732 | 0.9077 | - | - | - | - | |
|
| 0.7215 | 733 | 1.0933 | - | - | - | - | |
|
| 0.7224 | 734 | 0.8174 | - | - | - | - | |
|
| 0.7234 | 735 | 1.2212 | - | - | - | - | |
|
| 0.7244 | 736 | 1.1557 | - | - | - | - | |
|
| 0.7254 | 737 | 0.6191 | - | - | - | - | |
|
| 0.7264 | 738 | 1.7437 | - | - | - | - | |
|
| 0.7274 | 739 | 0.8977 | - | - | - | - | |
|
| 0.7283 | 740 | 1.0782 | - | - | - | - | |
|
| 0.7293 | 741 | 0.8985 | - | - | - | - | |
|
| 0.7303 | 742 | 1.4867 | - | - | - | - | |
|
| 0.7313 | 743 | 0.7497 | - | - | - | - | |
|
| 0.7323 | 744 | 0.6433 | - | - | - | - | |
|
| 0.7333 | 745 | 1.4175 | - | - | - | - | |
|
| 0.7343 | 746 | 1.1896 | - | - | - | - | |
|
| 0.7352 | 747 | 1.9867 | - | - | - | - | |
|
| 0.7362 | 748 | 0.8968 | - | - | - | - | |
|
| 0.7372 | 749 | 0.7265 | - | - | - | - | |
|
| 0.7382 | 750 | 0.9418 | - | - | - | - | |
|
| 0.7392 | 751 | 1.3717 | - | - | - | - | |
|
| 0.7402 | 752 | 2.1774 | - | - | - | - | |
|
| 0.7411 | 753 | 1.0854 | - | - | - | - | |
|
| 0.7421 | 754 | 0.9777 | - | - | - | - | |
|
| 0.7431 | 755 | 1.2721 | - | - | - | - | |
|
| 0.7441 | 756 | 0.7114 | - | - | - | - | |
|
| 0.7451 | 757 | 1.4036 | - | - | - | - | |
|
| 0.7461 | 758 | 1.1742 | - | - | - | - | |
|
| 0.7470 | 759 | 0.9351 | - | - | - | - | |
|
| 0.7480 | 760 | 0.5537 | - | - | - | - | |
|
| 0.7490 | 761 | 0.8688 | - | - | - | - | |
|
| 0.7500 | 762 | 3.0053 | - | - | - | - | |
|
| 0.7510 | 763 | 3.3743 | - | - | - | - | |
|
| 0.7520 | 764 | 1.9928 | - | - | - | - | |
|
| 0.7530 | 765 | 1.5118 | 0.9342 | 0.4514 | 0.4792 | 0.6782 | |
|
| 0.7539 | 766 | 1.1213 | - | - | - | - | |
|
| 0.7549 | 767 | 2.1312 | - | - | - | - | |
|
| 0.7559 | 768 | 1.3739 | - | - | - | - | |
|
| 0.7569 | 769 | 0.8819 | - | - | - | - | |
|
| 0.7579 | 770 | 0.9069 | - | - | - | - | |
|
| 0.7589 | 771 | 0.935 | - | - | - | - | |
|
| 0.7598 | 772 | 0.7874 | - | - | - | - | |
|
| 0.7608 | 773 | 1.9942 | - | - | - | - | |
|
| 0.7618 | 774 | 1.1711 | - | - | - | - | |
|
| 0.7628 | 775 | 0.8407 | - | - | - | - | |
|
| 0.7638 | 776 | 1.5171 | - | - | - | - | |
|
| 0.7648 | 777 | 0.5308 | - | - | - | - | |
|
| 0.7657 | 778 | 1.4107 | - | - | - | - | |
|
| 0.7667 | 779 | 1.1766 | - | - | - | - | |
|
| 0.7677 | 780 | 1.326 | - | - | - | - | |
|
| 0.7687 | 781 | 0.7371 | - | - | - | - | |
|
| 0.7697 | 782 | 1.0504 | - | - | - | - | |
|
| 0.7707 | 783 | 1.1458 | - | - | - | - | |
|
| 0.7717 | 784 | 0.7242 | - | - | - | - | |
|
| 0.7726 | 785 | 0.8113 | - | - | - | - | |
|
| 0.7736 | 786 | 1.3808 | - | - | - | - | |
|
| 0.7746 | 787 | 0.7584 | - | - | - | - | |
|
| 0.7756 | 788 | 1.226 | - | - | - | - | |
|
| 0.7766 | 789 | 1.0599 | - | - | - | - | |
|
| 0.7776 | 790 | 2.9348 | - | - | - | - | |
|
| 0.7785 | 791 | 1.0849 | - | - | - | - | |
|
| 0.7795 | 792 | 0.5362 | - | - | - | - | |
|
| 0.7805 | 793 | 1.3765 | - | - | - | - | |
|
| 0.7815 | 794 | 0.6824 | - | - | - | - | |
|
| 0.7825 | 795 | 0.6009 | - | - | - | - | |
|
| 0.7835 | 796 | 2.3853 | - | - | - | - | |
|
| 0.7844 | 797 | 1.0571 | - | - | - | - | |
|
| 0.7854 | 798 | 0.9172 | - | - | - | - | |
|
| 0.7864 | 799 | 0.7915 | - | - | - | - | |
|
| 0.7874 | 800 | 0.827 | - | - | - | - | |
|
| 0.7884 | 801 | 0.8465 | - | - | - | - | |
|
| 0.7894 | 802 | 2.3489 | - | - | - | - | |
|
| 0.7904 | 803 | 0.6506 | - | - | - | - | |
|
| 0.7913 | 804 | 0.8346 | - | - | - | - | |
|
| 0.7923 | 805 | 0.6249 | - | - | - | - | |
|
| 0.7933 | 806 | 1.0557 | - | - | - | - | |
|
| 0.7943 | 807 | 0.7552 | - | - | - | - | |
|
| 0.7953 | 808 | 1.281 | - | - | - | - | |
|
| 0.7963 | 809 | 0.7846 | - | - | - | - | |
|
| 0.7972 | 810 | 2.6403 | - | - | - | - | |
|
| 0.7982 | 811 | 0.3679 | - | - | - | - | |
|
| 0.7992 | 812 | 1.9118 | - | - | - | - | |
|
| 0.8002 | 813 | 2.5911 | - | - | - | - | |
|
| 0.8012 | 814 | 1.1783 | - | - | - | - | |
|
| 0.8022 | 815 | 0.9347 | - | - | - | - | |
|
| 0.8031 | 816 | 0.5311 | - | - | - | - | |
|
| 0.8041 | 817 | 0.7092 | - | - | - | - | |
|
| 0.8051 | 818 | 0.8384 | - | - | - | - | |
|
| 0.8061 | 819 | 0.514 | - | - | - | - | |
|
| 0.8071 | 820 | 0.3638 | - | - | - | - | |
|
| 0.8081 | 821 | 1.9376 | - | - | - | - | |
|
| 0.8091 | 822 | 0.9177 | - | - | - | - | |
|
| 0.8100 | 823 | 0.8293 | - | - | - | - | |
|
| 0.8110 | 824 | 0.7269 | - | - | - | - | |
|
| 0.8120 | 825 | 0.664 | - | - | - | - | |
|
| 0.8130 | 826 | 0.6205 | - | - | - | - | |
|
| 0.8140 | 827 | 0.6562 | - | - | - | - | |
|
| 0.8150 | 828 | 0.6576 | - | - | - | - | |
|
| 0.8159 | 829 | 0.9931 | - | - | - | - | |
|
| 0.8169 | 830 | 1.1707 | - | - | - | - | |
|
| 0.8179 | 831 | 0.8635 | - | - | - | - | |
|
| 0.8189 | 832 | 0.7274 | - | - | - | - | |
|
| 0.8199 | 833 | 1.6808 | - | - | - | - | |
|
| 0.8209 | 834 | 1.8309 | - | - | - | - | |
|
| 0.8219 | 835 | 0.6191 | - | - | - | - | |
|
| 0.8228 | 836 | 1.0789 | - | - | - | - | |
|
| 0.8238 | 837 | 1.1637 | - | - | - | - | |
|
| 0.8248 | 838 | 0.7813 | - | - | - | - | |
|
| 0.8258 | 839 | 1.0403 | - | - | - | - | |
|
| 0.8268 | 840 | 0.7656 | - | - | - | - | |
|
| 0.8278 | 841 | 0.9994 | - | - | - | - | |
|
| 0.8287 | 842 | 1.009 | - | - | - | - | |
|
| 0.8297 | 843 | 0.8585 | - | - | - | - | |
|
| 0.8307 | 844 | 0.8847 | - | - | - | - | |
|
| 0.8317 | 845 | 0.8321 | - | - | - | - | |
|
| 0.8327 | 846 | 1.2605 | - | - | - | - | |
|
| 0.8337 | 847 | 1.0609 | - | - | - | - | |
|
| 0.8346 | 848 | 2.0115 | - | - | - | - | |
|
| 0.8356 | 849 | 1.2952 | - | - | - | - | |
|
| 0.8366 | 850 | 0.6999 | - | - | - | - | |
|
| 0.8376 | 851 | 0.7006 | - | - | - | - | |
|
| 0.8386 | 852 | 0.927 | - | - | - | - | |
|
| 0.8396 | 853 | 1.2083 | - | - | - | - | |
|
| 0.8406 | 854 | 0.608 | - | - | - | - | |
|
| 0.8415 | 855 | 0.8478 | - | - | - | - | |
|
| 0.8425 | 856 | 1.5731 | - | - | - | - | |
|
| 0.8435 | 857 | 1.6353 | - | - | - | - | |
|
| 0.8445 | 858 | 0.7862 | - | - | - | - | |
|
| 0.8455 | 859 | 0.8909 | - | - | - | - | |
|
| 0.8465 | 860 | 1.1719 | - | - | - | - | |
|
| 0.8474 | 861 | 1.2722 | - | - | - | - | |
|
| 0.8484 | 862 | 1.0022 | - | - | - | - | |
|
| 0.8494 | 863 | 1.5307 | - | - | - | - | |
|
| 0.8504 | 864 | 1.0162 | - | - | - | - | |
|
| 0.8514 | 865 | 0.6827 | - | - | - | - | |
|
| 0.8524 | 866 | 0.7744 | - | - | - | - | |
|
| 0.8533 | 867 | 1.2011 | - | - | - | - | |
|
| 0.8543 | 868 | 0.9219 | - | - | - | - | |
|
| 0.8553 | 869 | 0.7636 | - | - | - | - | |
|
| 0.8563 | 870 | 1.5061 | - | - | - | - | |
|
| 0.8573 | 871 | 1.5569 | - | - | - | - | |
|
| 0.8583 | 872 | 0.5896 | - | - | - | - | |
|
| 0.8593 | 873 | 1.1918 | - | - | - | - | |
|
| 0.8602 | 874 | 0.8572 | - | - | - | - | |
|
| 0.8612 | 875 | 1.0421 | - | - | - | - | |
|
| 0.8622 | 876 | 2.4599 | - | - | - | - | |
|
| 0.8632 | 877 | 0.55 | - | - | - | - | |
|
| 0.8642 | 878 | 1.2829 | - | - | - | - | |
|
| 0.8652 | 879 | 0.7808 | - | - | - | - | |
|
| 0.8661 | 880 | 1.7712 | - | - | - | - | |
|
| 0.8671 | 881 | 0.7456 | - | - | - | - | |
|
| 0.8681 | 882 | 1.2805 | - | - | - | - | |
|
| 0.8691 | 883 | 2.1927 | - | - | - | - | |
|
| 0.8701 | 884 | 0.855 | - | - | - | - | |
|
| 0.8711 | 885 | 0.667 | - | - | - | - | |
|
| 0.8720 | 886 | 1.1097 | - | - | - | - | |
|
| 0.8730 | 887 | 1.8795 | - | - | - | - | |
|
| 0.8740 | 888 | 0.6767 | - | - | - | - | |
|
| 0.8750 | 889 | 0.7549 | - | - | - | - | |
|
| 0.8760 | 890 | 0.8616 | - | - | - | - | |
|
| 0.8770 | 891 | 1.9461 | - | - | - | - | |
|
| 0.8780 | 892 | 1.2694 | - | - | - | - | |
|
| 0.8789 | 893 | 1.825 | - | - | - | - | |
|
| 0.8799 | 894 | 0.9218 | - | - | - | - | |
|
| 0.8809 | 895 | 1.0297 | - | - | - | - | |
|
| 0.8819 | 896 | 0.609 | - | - | - | - | |
|
| 0.8829 | 897 | 0.9638 | - | - | - | - | |
|
| 0.8839 | 898 | 0.5521 | - | - | - | - | |
|
| 0.8848 | 899 | 1.3365 | - | - | - | - | |
|
| 0.8858 | 900 | 0.8443 | - | - | - | - | |
|
| 0.8868 | 901 | 0.7848 | - | - | - | - | |
|
| 0.8878 | 902 | 1.0733 | - | - | - | - | |
|
| 0.8888 | 903 | 0.5657 | - | - | - | - | |
|
| 0.8898 | 904 | 1.8081 | - | - | - | - | |
|
| 0.8907 | 905 | 0.8232 | - | - | - | - | |
|
| 0.8917 | 906 | 0.6159 | - | - | - | - | |
|
| 0.8927 | 907 | 0.9832 | - | - | - | - | |
|
| 0.8937 | 908 | 1.1375 | - | - | - | - | |
|
| 0.8947 | 909 | 1.4182 | - | - | - | - | |
|
| 0.8957 | 910 | 1.2287 | - | - | - | - | |
|
| 0.8967 | 911 | 1.0915 | - | - | - | - | |
|
| 0.8976 | 912 | 0.8116 | - | - | - | - | |
|
| 0.8986 | 913 | 0.6824 | - | - | - | - | |
|
| 0.8996 | 914 | 0.8888 | - | - | - | - | |
|
| 0.9006 | 915 | 0.5974 | - | - | - | - | |
|
|
|
</details> |
|
|
|
### Framework Versions |
|
- Python: 3.10.12 |
|
- Sentence Transformers: 3.2.1 |
|
- Transformers: 4.44.2 |
|
- PyTorch: 2.5.0+cu121 |
|
- Accelerate: 0.34.2 |
|
- Datasets: 3.0.2 |
|
- Tokenizers: 0.19.1 |
|
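The snippet below is a small optional sketch, not part of the generated card, for checking that a local environment roughly matches the versions listed above before loading the model; the CUDA suffix on the PyTorch build may differ on other machines.

```python
# Optional sanity check (illustrative sketch): compare locally installed
# library versions against the versions this model was trained with.
import torch
import transformers
import sentence_transformers

expected = {
    "Sentence Transformers": ("3.2.1", sentence_transformers.__version__),
    "Transformers": ("4.44.2", transformers.__version__),
    "PyTorch": ("2.5.0+cu121", torch.__version__),
}

for name, (wanted, installed) in expected.items():
    # Ignore the CUDA build suffix (e.g. "+cu121") when comparing PyTorch versions.
    status = "OK" if installed.startswith(wanted.split("+")[0]) else "MISMATCH"
    print(f"{name}: expected {wanted}, found {installed} -> {status}")
```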
|
|
## Citation |
|
|
|
### BibTeX |
|
|
|
#### Sentence Transformers |
|
```bibtex |
|
@inproceedings{reimers-2019-sentence-bert, |
|
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", |
|
author = "Reimers, Nils and Gurevych, Iryna", |
|
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", |
|
month = "11", |
|
year = "2019", |
|
publisher = "Association for Computational Linguistics", |
|
url = "https://arxiv.org/abs/1908.10084", |
|
} |
|
``` |
|
|
|
#### GISTEmbedLoss |
|
```bibtex |
|
@misc{solatorio2024gistembed, |
|
title={GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning}, |
|
author={Aivin V. Solatorio}, |
|
year={2024}, |
|
eprint={2402.16829}, |
|
archivePrefix={arXiv}, |
|
primaryClass={cs.LG} |
|
} |
|
``` |
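
For context, the sketch below shows how `GISTEmbedLoss` is typically constructed with the Sentence Transformers API; the guide model named here is an illustrative assumption, not necessarily the one used to train this model.

```python
# Illustrative sketch of setting up GISTEmbedLoss; the guide model is an
# assumed example, not necessarily the guide used for this particular run.
from sentence_transformers import SentenceTransformer, losses

model = SentenceTransformer("microsoft/deberta-v3-small")
guide = SentenceTransformer("all-MiniLM-L6-v2")  # guide model chosen for illustration

loss = losses.GISTEmbedLoss(model=model, guide=guide, temperature=0.01)
```

The guide model scores in-batch candidates so that misleading in-batch negatives can be filtered out before the contrastive loss is computed.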
|
|
|