---
base_model: BAAI/bge-base-en-v1.5
datasets: []
language:
- en
library_name: sentence-transformers
license: apache-2.0
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:6300
- loss:MatryoshkaLoss
- loss:MultipleNegativesRankingLoss
widget:
- source_sentence: The two patent families both expire in the United States in 2029.
sentences:
- What method is used to record amortization and costs for owned content that is
predominantly monetized on an individual basis?
- What year do the patent families related to DARZALEX expire in the United States?
- What was the primary reason for the net cash used in investing activities in 2022?
- source_sentence: In October 2020, Fortis Advisors LLC filed a complaint against
Ethicon Inc. and others in Delaware's Court of Chancery. The lawsuit alleges breach
of contract and fraud related to Ethicon's acquisition of Auris Health Inc. in
2019. The case underwent a partial dismissal in December 2021, and as of January
2024, the trial's decision is pending.
sentences:
- What types of payment rates are used for dialysis treatments and associated pharmaceuticals?
- What legal claims does Fortis Advisors LLC allege against Ethicon Inc. in the
lawsuit related to the acquisition of Auris Health Inc.?
- What were the key components of the acquisition deal between ICE and Black Knight
completed on September 5, 2023?
- source_sentence: Net cash provided by operating activities was $712.2 million and
$223.7 million for the year ended December 31, 2023 and 2022, respectively. The
increase was primarily driven by timing of payments to vendors and timing of the
receipt of payments from our customers, as well as an increase in interest income.
sentences:
- What caused the increase in net cash provided by operating activities between
2022 and 2023?
- How long did Joanne D. Smith serve as the Vice President - Marketing at Delta?
- How does the management experience of Mr. Robert G. Goldstein benefit the company?
- source_sentence: We believe that, to varying degrees, our trademarks, trade names,
copyrights, proprietary processes, trade secrets, trade dress, domain names and
similar intellectual property add significant value to our business
sentences:
- What were the net interest expense on pre-acquisition-related debt and the cost
associated with the extinguishment of senior notes for 2022 as part of non-GAAP
adjustments?
- How did the fluctuation in foreign currency exchange rates impact the consolidated
net operating revenues in 2023?
- What does the company believe adds significant value to its business regarding
intellectual property?
- source_sentence: The consolidated financial statements are incorporated by reference
in the Annual Report on Form 10-K, indicating they are treated as part of the
document for legal and reporting purposes.
sentences:
- What does it mean for financial statements to be incorporated by reference?
- What is contained within the pages 163-309 of the financial section?
- What were the key business segments of The Goldman Sachs Group, Inc. as reported
in their 2023 financial disclosures?
model-index:
- name: BGE base Financial Matryoshka
results:
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 768
type: dim_768
metrics:
- type: cosine_accuracy@1
value: 0.7014285714285714
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.8271428571428572
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.8714285714285714
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.9028571428571428
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.7014285714285714
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.2757142857142857
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.17428571428571427
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.09028571428571427
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.7014285714285714
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.8271428571428572
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.8714285714285714
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.9028571428571428
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.8043195367351605
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.7724552154195008
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.7766441682397275
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 512
type: dim_512
metrics:
- type: cosine_accuracy@1
value: 0.7
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.8328571428571429
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.8685714285714285
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.9042857142857142
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.7
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.2776190476190476
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.17371428571428568
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.09042857142857141
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.7
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.8328571428571429
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.8685714285714285
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.9042857142857142
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.804097602951568
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.771829365079365
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.7756860707173107
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 256
type: dim_256
metrics:
- type: cosine_accuracy@1
value: 0.7
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.8214285714285714
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.8557142857142858
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.89
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.7
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.27380952380952384
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.17114285714285712
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.08899999999999998
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.7
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.8214285714285714
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.8557142857142858
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.89
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.7977242461477416
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.7678412698412698
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.7726663884946474
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 128
type: dim_128
metrics:
- type: cosine_accuracy@1
value: 0.6785714285714286
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.8257142857142857
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.8528571428571429
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.8857142857142857
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.6785714285714286
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.2752380952380953
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.17057142857142857
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.08857142857142856
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.6785714285714286
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.8257142857142857
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.8528571428571429
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.8857142857142857
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.7864311013349103
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.754115079365079
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.7585731100549844
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 64
type: dim_64
metrics:
- type: cosine_accuracy@1
value: 0.6642857142857143
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.7828571428571428
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.8157142857142857
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.8642857142857143
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.6642857142857143
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.26095238095238094
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.16314285714285712
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.08642857142857142
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.6642857142857143
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.7828571428571428
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.8157142857142857
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.8642857142857143
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.7634746514041137
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.7313633786848066
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.7360563668571922
name: Cosine Map@100
---
# BGE base Financial Matryoshka
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5)
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Language:** en
- **License:** apache-2.0
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("Yohhei/bge-base-financial-matryoshka")
# Run inference
sentences = [
'The consolidated financial statements are incorporated by reference in the Annual Report on Form 10-K, indicating they are treated as part of the document for legal and reporting purposes.',
'What does it mean for financial statements to be incorporated by reference?',
'What is contained within the pages 163-309 of the financial section?',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
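Because the model was trained with a Matryoshka objective over the dimensions 768, 512, 256, 128, and 64, embeddings can also be truncated to a smaller size at load time with only a modest drop in retrieval quality (see the Evaluation section below). A minimal sketch using the `truncate_dim` option:
```python
from sentence_transformers import SentenceTransformer

# Load the model, truncating every output embedding to 256 dimensions
model = SentenceTransformer("Yohhei/bge-base-financial-matryoshka", truncate_dim=256)

sentences = [
    "What does it mean for financial statements to be incorporated by reference?",
    "The consolidated financial statements are incorporated by reference in the Annual Report on Form 10-K.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (2, 256)
```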
## Evaluation
### Metrics
#### Information Retrieval
* Dataset: `dim_768`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.7014 |
| cosine_accuracy@3 | 0.8271 |
| cosine_accuracy@5 | 0.8714 |
| cosine_accuracy@10 | 0.9029 |
| cosine_precision@1 | 0.7014 |
| cosine_precision@3 | 0.2757 |
| cosine_precision@5 | 0.1743 |
| cosine_precision@10 | 0.0903 |
| cosine_recall@1 | 0.7014 |
| cosine_recall@3 | 0.8271 |
| cosine_recall@5 | 0.8714 |
| cosine_recall@10 | 0.9029 |
| cosine_ndcg@10 | 0.8043 |
| cosine_mrr@10 | 0.7725 |
| **cosine_map@100** | **0.7766** |
#### Information Retrieval
* Dataset: `dim_512`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.7 |
| cosine_accuracy@3 | 0.8329 |
| cosine_accuracy@5 | 0.8686 |
| cosine_accuracy@10 | 0.9043 |
| cosine_precision@1 | 0.7 |
| cosine_precision@3 | 0.2776 |
| cosine_precision@5 | 0.1737 |
| cosine_precision@10 | 0.0904 |
| cosine_recall@1 | 0.7 |
| cosine_recall@3 | 0.8329 |
| cosine_recall@5 | 0.8686 |
| cosine_recall@10 | 0.9043 |
| cosine_ndcg@10 | 0.8041 |
| cosine_mrr@10 | 0.7718 |
| **cosine_map@100** | **0.7757** |
#### Information Retrieval
* Dataset: `dim_256`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.7 |
| cosine_accuracy@3 | 0.8214 |
| cosine_accuracy@5 | 0.8557 |
| cosine_accuracy@10 | 0.89 |
| cosine_precision@1 | 0.7 |
| cosine_precision@3 | 0.2738 |
| cosine_precision@5 | 0.1711 |
| cosine_precision@10 | 0.089 |
| cosine_recall@1 | 0.7 |
| cosine_recall@3 | 0.8214 |
| cosine_recall@5 | 0.8557 |
| cosine_recall@10 | 0.89 |
| cosine_ndcg@10 | 0.7977 |
| cosine_mrr@10 | 0.7678 |
| **cosine_map@100** | **0.7727** |
#### Information Retrieval
* Dataset: `dim_128`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.6786 |
| cosine_accuracy@3 | 0.8257 |
| cosine_accuracy@5 | 0.8529 |
| cosine_accuracy@10 | 0.8857 |
| cosine_precision@1 | 0.6786 |
| cosine_precision@3 | 0.2752 |
| cosine_precision@5 | 0.1706 |
| cosine_precision@10 | 0.0886 |
| cosine_recall@1 | 0.6786 |
| cosine_recall@3 | 0.8257 |
| cosine_recall@5 | 0.8529 |
| cosine_recall@10 | 0.8857 |
| cosine_ndcg@10 | 0.7864 |
| cosine_mrr@10 | 0.7541 |
| **cosine_map@100** | **0.7586** |
#### Information Retrieval
* Dataset: `dim_64`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.6643 |
| cosine_accuracy@3 | 0.7829 |
| cosine_accuracy@5 | 0.8157 |
| cosine_accuracy@10 | 0.8643 |
| cosine_precision@1 | 0.6643 |
| cosine_precision@3 | 0.261 |
| cosine_precision@5 | 0.1631 |
| cosine_precision@10 | 0.0864 |
| cosine_recall@1 | 0.6643 |
| cosine_recall@3 | 0.7829 |
| cosine_recall@5 | 0.8157 |
| cosine_recall@10 | 0.8643 |
| cosine_ndcg@10 | 0.7635 |
| cosine_mrr@10 | 0.7314 |
| **cosine_map@100** | **0.7361** |
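The tables above were produced with one `InformationRetrievalEvaluator` per Matryoshka dimension. A hedged sketch of how such an evaluation can be reproduced; the queries, corpus, and relevance mapping below are placeholders, not the actual evaluation split:
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("Yohhei/bge-base-financial-matryoshka")

# Placeholder evaluation data: query id -> text, corpus id -> text,
# and query id -> set of relevant corpus ids.
queries = {"q1": "What does it mean for financial statements to be incorporated by reference?"}
corpus = {"d1": "The consolidated financial statements are incorporated by reference in the Annual Report on Form 10-K."}
relevant_docs = {"q1": {"d1"}}

for dim in [768, 512, 256, 128, 64]:
    evaluator = InformationRetrievalEvaluator(
        queries=queries,
        corpus=corpus,
        relevant_docs=relevant_docs,
        name=f"dim_{dim}",
        truncate_dim=dim,  # score with embeddings truncated to this dimension
    )
    print(evaluator(model))  # accuracy/precision/recall/NDCG/MRR/MAP at various k
```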
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 6,300 training samples
* Columns: `positive` and `anchor`
* Approximate statistics based on the first 1000 samples:
|          | positive | anchor |
|:---------|:---------|:-------|
| type     | string   | string |
* Samples:
| positive | anchor |
|:---------|:-------|
| Highlights during fiscal year 2023 include the following: We generated $18,085 million of cash from operations. | What was the amount of cash generated from operations by the company in fiscal year 2023? |
| U.S. government and agency securities \| $ \| 7,950 \| \| $ \| (336 \| ) \| $ \| 45,273 \| $ \| (3,534 \| ) \| $ \| 53,223 \| $ \| (3,870 \| ) | How much were unrealized losses on U.S. government and agency securities for those held for 12 months or greater as of June 30, 2023? |
| For assets under development, assets are grouped and assessed for impairment by estimating the undiscounted cash flows, which include remaining construction costs, over the asset's remaining useful life. If cash flows do not exceed the carrying amount, impairment based on fair value versus carrying value is considered. | How is the impairment of assets assessed for projects still under development? |
* Loss: [MatryoshkaLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
```json
{
"loss": "MultipleNegativesRankingLoss",
"matryoshka_dims": [
768,
512,
256,
128,
64
],
"matryoshka_weights": [
1,
1,
1,
1,
1
],
"n_dims_per_step": -1
}
```
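In code, this configuration corresponds to wrapping a `MultipleNegativesRankingLoss` in a `MatryoshkaLoss` over the listed dimensions. A minimal sketch of the equivalent setup:
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("BAAI/bge-base-en-v1.5")

# Inner loss: in-batch negatives over (anchor, positive) pairs
inner_loss = MultipleNegativesRankingLoss(model)

# Outer loss: apply the inner loss at each Matryoshka dimension (equal weights by default)
loss = MatryoshkaLoss(model, inner_loss, matryoshka_dims=[768, 512, 256, 128, 64])
```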
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: epoch
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 16
- `gradient_accumulation_steps`: 16
- `learning_rate`: 2e-05
- `num_train_epochs`: 4
- `lr_scheduler_type`: cosine
- `warmup_ratio`: 0.1
- `bf16`: True
- `tf32`: True
- `load_best_model_at_end`: True
- `optim`: adamw_torch_fused
- `batch_sampler`: no_duplicates
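These settings map directly onto `SentenceTransformerTrainingArguments`. A hedged sketch of the corresponding configuration; the `output_dir` is a placeholder, and `save_strategy` is an assumption needed so that `load_best_model_at_end` has matching save and evaluation strategies:
```python
from sentence_transformers.training_args import BatchSamplers, SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="bge-base-financial-matryoshka",  # placeholder output directory
    eval_strategy="epoch",
    save_strategy="epoch",  # assumption: must match eval_strategy when load_best_model_at_end=True
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=16,
    learning_rate=2e-5,
    num_train_epochs=4,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    bf16=True,
    tf32=True,
    load_best_model_at_end=True,
    optim="adamw_torch_fused",
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)
```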
#### All Hyperparameters