SUJET AI bge-base Finance Matryoshka

This is a sentence-transformers model. It maps sentences and paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity
  • Model Size: ~109M parameters (F32, Safetensors)
  • Language: en
  • License: apache-2.0

Model Sources

  • Documentation: Sentence Transformers Documentation (https://sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: Sentence Transformers on Hugging Face (https://huggingface.co/models?library=sentence-transformers)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
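
The final Normalize() module L2-normalizes every embedding, so cosine similarity and dot-product similarity produce identical rankings. A minimal check of this property (a sketch, assuming the model loads as in the Usage section below):

from sentence_transformers import SentenceTransformer
import numpy as np

model = SentenceTransformer("Rubyando59/bge-base-financial-matryoshka")
emb = model.encode(["What factors could disrupt the supply chain?"])
print(np.linalg.norm(emb, axis=1))  # ~[1.0]: embeddings are unit-length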

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("Rubyando59/bge-base-financial-matryoshka")
# Run inference
sentences = [
    'obligations (whether due to financial difficulties or other reasons), or make adverse changes in the pricing or other \nmaterial terms of our arrangements with them. \nWe have experienced and/or may in the future experience supply shortages, price increases, quality issues, and/\nor longer lead times that could negatively affect our operations, driven by raw material, component availability, \nmanufacturing capacity, labor shortages, industry allocations, logistics capacity, inflation, foreign currency exchange \nrates, tariffs, sanctions and export controls, trade disputes and barriers, forced labor concerns, sustainability sourcing \nrequirements, geopolitical tensions, armed conflicts, natural disasters or pandemics, the effects of climate change \n(such as sea level rise, drought, flooding, heat waves, wildfires and resultant air quality effects and power shutdowns  \nassociated with wildfire prevention, and increased storm severity), power loss, and significant changes in the financial \nor business condition of our suppliers. Some of the components we use in our technical infrastructure and our device s \nare available from only one or limited sources, and we may not be able to find replacement vendors on favorable terms \nin the event of a supply chain disruption. A significant supply interruption that affects us or our vendors could delay \ncritical data center upgrades or expansions and delay consumer product availability . \nWe may enter into long-term contracts for materials and products that commit us to significant terms and \nconditions. We may face costs for materials and products that are not consumed due to market demand, technological \nchange, changed consumer preferences, quality, product recalls, and warranty issues. For instance, because certain of \nour hardware  supply contracts have volume-based pricing or minimum purchase requirements, if the volume of sales \nof our devices decreases or does not reach projected targets, we could face increased materials and manufacturing \ncosts or other financial liabilities that could make our products more costly per unit to manufacture and harm our \nfinancial condition and operating results. Furthermore, certain of our competitors may negotiate more favorable \ncontractual terms based on volume and other commitments that may provide them with competitive advantages and \nmay affect our supply. \nOur device s have had, and in the future may have, quality issues resulting from design, manufacturing, or \noperations. Sometimes, these issues may be caused by components we purchase from other manufacturers or \nsuppliers. If the quality of our products and services does not meet expectations or our products or services are \ndefective or require a recall, it could harm our reputation, financial condition, and operating results.  \nWe require our suppliers and business partners to comply with laws and, where applicable, our company policies \nand practices, such as the Google Supplier Code of Conduct, regarding workplace and employment practices, data \nsecurity, environmental compliance, and intellectual property licensing, but we do not control them or their practices. \nViolations of law or unethical business practices could result in supply chain disruptions, canceled orders, harm to key \nrelationships, and damage to our reputation. Their failure to procure necessary license rights to intellectual property \ncould affect our ability to sell our products or services and expose us to litigation or financial claims. \nInterruption to, interference with, or failure of our complex information technology and communications \nsystems could hurt our ability to effectively provide our products and services, which could harm  our \nreputation, financial condition, and operating results. \nThe availability of our products and services and fulfillment of our customer contracts depend on the continuing \noperation of our information technology and communications systems. Our systems are vulnerable to damage, \ninterference, or interruption from modifications or upgrades, terrorist attacks, state-sponsored attacks, natural disasters \nor pandemics, geopolitical tensions or armed conflicts, export controls and sanctions, the effects of climate change \n(such as sea level rise, drought, flooding, heat waves, wildfires and resultant air quality effects and power shutdowns  \nassociated with wildfire prevention, and increased storm severity), power loss, utility outages, telecommunications \nfailures, computer viruses, software bugs, ransomware attacks, supply-chain attacks, computer denial of service \nattacks, phishing schemes, or other attempts to harm or access our systems. Some of our data centers are located in \nareas with a high risk of major earthquakes or other natural disasters. Our data centers are also subject to break-ins, \nsabotage, and intentional acts of vandalism, and, in some cases, to potential disruptions resulting from problems \nexperienced by facility operators or disruptions as a result of geopolitical tensions and conflicts happening in the area. \nSome of our systems are not fully redundant, and disaster recovery planning cannot account for all eventualities. The \noccurrence of a natural disaster or pandemic, closure of a facility, or other unanticipated problems affecting our data \ncenters could result in lengthy interruptions in our service.',
    "What are the implications of increased logistics capacity costs on a company's overall financial performance?",
    "How might legal proceedings and regulatory scrutiny affect a company's financial condition and operating results?",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
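
Because this is a Matryoshka model, its embeddings can also be truncated to a smaller dimension (512, 256, 128, or 64, matching the evaluation tables below) for cheaper storage and faster search, at a modest cost in quality. A minimal sketch using the truncate_dim argument (supported by recent Sentence Transformers releases, including the 3.0.1 listed under Framework Versions):

from sentence_transformers import SentenceTransformer

# Load the model so that encode() returns truncated 256-dimensional embeddings
model = SentenceTransformer("Rubyando59/bge-base-financial-matryoshka", truncate_dim=256)
embeddings = model.encode([
    "We have experienced supply shortages and price increases.",
    "What factors could disrupt the supply chain?",
])
print(embeddings.shape)
# [2, 256]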

Evaluation

Metrics

Information Retrieval (dim_768)

Metric Value
cosine_accuracy@1 0.0154
cosine_accuracy@3 0.0466
cosine_accuracy@5 0.0699
cosine_accuracy@10 0.1308
cosine_precision@1 0.0154
cosine_precision@3 0.0155
cosine_precision@5 0.014
cosine_precision@10 0.0131
cosine_recall@1 0.0154
cosine_recall@3 0.0466
cosine_recall@5 0.0699
cosine_recall@10 0.1308
cosine_ndcg@10 0.0621
cosine_mrr@10 0.0416
cosine_map@100 0.0576

Information Retrieval (dim_512)

Metric Value
cosine_accuracy@1 0.015
cosine_accuracy@3 0.0453
cosine_accuracy@5 0.0671
cosine_accuracy@10 0.1276
cosine_precision@1 0.015
cosine_precision@3 0.0151
cosine_precision@5 0.0134
cosine_precision@10 0.0128
cosine_recall@1 0.015
cosine_recall@3 0.0453
cosine_recall@5 0.0671
cosine_recall@10 0.1276
cosine_ndcg@10 0.0604
cosine_mrr@10 0.0403
cosine_map@100 0.0561

Information Retrieval (dim_256)

Metric Value
cosine_accuracy@1 0.0122
cosine_accuracy@3 0.0406
cosine_accuracy@5 0.0627
cosine_accuracy@10 0.1173
cosine_precision@1 0.0122
cosine_precision@3 0.0135
cosine_precision@5 0.0125
cosine_precision@10 0.0117
cosine_recall@1 0.0122
cosine_recall@3 0.0406
cosine_recall@5 0.0627
cosine_recall@10 0.1173
cosine_ndcg@10 0.0548
cosine_mrr@10 0.0361
cosine_map@100 0.0507

Information Retrieval (dim_128)

Metric Value
cosine_accuracy@1 0.0102
cosine_accuracy@3 0.0354
cosine_accuracy@5 0.0512
cosine_accuracy@10 0.0973
cosine_precision@1 0.0102
cosine_precision@3 0.0118
cosine_precision@5 0.0102
cosine_precision@10 0.0097
cosine_recall@1 0.0102
cosine_recall@3 0.0354
cosine_recall@5 0.0512
cosine_recall@10 0.0973
cosine_ndcg@10 0.0456
cosine_mrr@10 0.0301
cosine_map@100 0.0427

Information Retrieval (dim_64)

Metric Value
cosine_accuracy@1 0.0059
cosine_accuracy@3 0.0213
cosine_accuracy@5 0.0337
cosine_accuracy@10 0.0674
cosine_precision@1 0.0059
cosine_precision@3 0.0071
cosine_precision@5 0.0067
cosine_precision@10 0.0067
cosine_recall@1 0.0059
cosine_recall@3 0.0213
cosine_recall@5 0.0337
cosine_recall@10 0.0674
cosine_ndcg@10 0.0304
cosine_mrr@10 0.0194
cosine_map@100 0.0290
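
These five tables correspond to the Matryoshka dimensions 768, 512, 256, 128, and 64 (the dimension labels follow from matching each table's cosine_map@100 to the per-dimension columns in the Training Logs below). They can be reproduced with Sentence Transformers' InformationRetrievalEvaluator; a minimal sketch, with hypothetical queries, corpus, and relevance mappings standing in for the actual evaluation split:

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

# Evaluate at a truncated dimension, e.g. 128
model = SentenceTransformer("Rubyando59/bge-base-financial-matryoshka", truncate_dim=128)

# Hypothetical evaluation data: id -> text
queries = {"q1": "What factors could disrupt the supply chain?"}
corpus = {"d1": "We may experience supply shortages, price increases, and longer lead times."}
relevant_docs = {"q1": {"d1"}}  # which corpus ids are relevant to each query

evaluator = InformationRetrievalEvaluator(
    queries=queries,
    corpus=corpus,
    relevant_docs=relevant_docs,
    name="dim_128",
)
results = evaluator(model)  # accuracy@k, precision@k, recall@k, ndcg@10, mrr@10, map@100
print(results)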

Training Details

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: epoch
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 16
  • gradient_accumulation_steps: 16
  • learning_rate: 2e-05
  • num_train_epochs: 10
  • lr_scheduler_type: cosine
  • warmup_ratio: 0.1
  • bf16: True
  • tf32: True
  • load_best_model_at_end: True
  • optim: adamw_torch_fused
  • batch_sampler: no_duplicates

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: epoch
  • prediction_loss_only: True
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 16
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 16
  • eval_accumulation_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 10
  • max_steps: -1
  • lr_scheduler_type: cosine
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: True
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: True
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch_fused
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional
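
Putting these together: the hyperparameters above, combined with the MatryoshkaLoss and MultipleNegativesRankingLoss citations at the end of this card, imply a training setup along the following lines. This is a sketch, not the author's exact script: the base checkpoint (BAAI/bge-base-en-v1.5) and the training data file are assumptions.

from datasets import load_dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss
from sentence_transformers.training_args import BatchSamplers, SentenceTransformerTrainingArguments

model = SentenceTransformer("BAAI/bge-base-en-v1.5")  # assumed base checkpoint

# MultipleNegativesRankingLoss over (anchor, positive) pairs, wrapped in
# MatryoshkaLoss so the first 768/512/256/128/64 dimensions each remain useful
inner_loss = MultipleNegativesRankingLoss(model)
loss = MatryoshkaLoss(model, inner_loss, matryoshka_dims=[768, 512, 256, 128, 64])

# Placeholder dataset with "anchor" and "positive" text columns
dataset = load_dataset("json", data_files="train.json", split="train").train_test_split(test_size=0.1)

args = SentenceTransformerTrainingArguments(
    output_dir="bge-base-financial-matryoshka",
    num_train_epochs=10,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=16,
    learning_rate=2e-5,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    bf16=True,
    tf32=True,
    optim="adamw_torch_fused",
    eval_strategy="epoch",
    save_strategy="epoch",  # not listed above; required for load_best_model_at_end
    load_best_model_at_end=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # no duplicate in-batch negatives
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["test"],
    loss=loss,
)
trainer.train()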

Training Logs

Epoch Step Training Loss dim_128_cosine_map@100 dim_256_cosine_map@100 dim_512_cosine_map@100 dim_64_cosine_map@100 dim_768_cosine_map@100
0.0516 10 6.6963 - - - - -
0.1033 20 7.634 - - - - -
0.1549 30 6.8573 - - - - -
0.2065 40 8.1731 - - - - -
0.2581 50 7.2853 - - - - -
0.3098 60 7.6009 - - - - -
0.3614 70 9.0776 - - - - -
0.4130 80 7.8738 - - - - -
0.4647 90 10.46 - - - - -
0.5163 100 10.7396 - - - - -
0.5679 110 10.3513 - - - - -
0.6196 120 10.654 - - - - -
0.6712 130 12.6157 - - - - -
0.7228 140 11.955 - - - - -
0.7744 150 13.2498 - - - - -
0.8261 160 11.2981 - - - - -
0.8777 170 13.8403 - - - - -
0.9293 180 9.4428 - - - - -
0.9810 190 8.1768 - - - - -
1.0016 194 - 0.0427 0.0507 0.0561 0.029 0.0576
1.0303 200 7.0981 - - - - -
1.0820 210 7.3113 - - - - -
1.1336 220 7.0259 - - - - -
1.1852 230 7.5874 - - - - -
1.2369 240 7.65 - - - - -
1.2885 250 7.2387 - - - - -
1.3401 260 9.001 - - - - -
1.3917 270 7.5975 - - - - -
1.4434 280 9.9568 - - - - -
1.4950 290 10.4123 - - - - -
1.5466 300 10.5535 - - - - -
1.5983 310 9.8199 - - - - -
1.6499 320 12.7258 - - - - -
1.7015 330 11.9423 - - - - -
1.7531 340 12.7364 - - - - -
1.8048 350 12.1926 - - - - -
1.8564 360 12.926 - - - - -
1.9080 370 11.8007 - - - - -
1.9597 380 8.7379 - - - - -
2.0010 388 - 0.0427 0.0507 0.0561 0.0290 0.0576
2.0090 390 7.1936 - - - - -
2.0607 400 6.7359 - - - - -
2.1123 410 7.4212 - - - - -
2.1639 420 7.346 - - - - -
2.2156 430 7.6784 - - - - -
2.2672 440 7.5079 - - - - -
2.3188 450 7.8875 - - - - -
2.3704 460 8.7154 - - - - -
2.4221 470 8.1278 - - - - -
2.4737 480 11.1214 - - - - -
2.5253 490 10.5293 - - - - -
2.5770 500 9.9882 - - - - -
2.6286 510 11.5283 - - - - -
2.6802 520 12.4337 - - - - -
2.7318 530 11.641 - - - - -
2.7835 540 13.3482 - - - - -
2.8351 550 11.7302 - - - - -
2.8867 560 13.7171 - - - - -
2.9384 570 8.9323 - - - - -
2.9900 580 7.4869 - - - - -
3.0003 582 - 0.0427 0.0507 0.0561 0.0290 0.0576
3.0394 590 6.9978 - - - - -
3.0910 600 7.33 - - - - -
3.1426 610 7.1879 - - - - -
3.1943 620 7.9204 - - - - -
3.2459 630 7.4435 - - - - -
3.2975 640 7.4079 - - - - -
3.3491 650 9.2445 - - - - -
3.4008 660 7.1794 - - - - -
3.4524 670 10.4496 - - - - -
3.5040 680 10.7556 - - - - -
3.5557 690 10.3543 - - - - -
3.6073 700 9.9478 - - - - -
3.6589 710 12.6559 - - - - -
3.7106 720 12.2463 - - - - -
3.7622 730 12.8381 - - - - -
3.8138 740 11.726 - - - - -
3.8654 750 13.4883 - - - - -
3.9171 760 10.7751 - - - - -
3.9687 770 8.5484 - - - - -
3.9997 776 - 0.0427 0.0507 0.0561 0.0290 0.0576
4.0181 780 7.1582 - - - - -
4.0697 790 7.0161 - - - - -
4.1213 800 7.11 - - - - -
4.1730 810 7.4557 - - - - -
4.2246 820 7.723 - - - - -
4.2762 830 7.2889 - - - - -
4.3278 840 8.3884 - - - - -
4.3795 850 8.1581 - - - - -
4.4311 860 9.1386 - - - - -
4.4827 870 10.706 - - - - -
4.5344 880 10.4258 - - - - -
4.5860 890 9.9659 - - - - -
4.6376 900 11.8535 - - - - -
4.6893 910 12.5578 - - - - -
4.7409 920 11.834 - - - - -
4.7925 930 12.5328 - - - - -
4.8441 940 12.6998 - - - - -
4.8958 950 12.9728 - - - - -
4.9474 960 8.9204 - - - - -
4.9990 970 7.3909 0.0427 0.0507 0.0561 0.0290 0.0576
5.0484 980 6.6683 - - - - -
5.1000 990 7.5538 - - - - -
5.1517 1000 6.9256 - - - - -
5.2033 1010 8.0908 - - - - -
5.2549 1020 7.254 - - - - -
5.3066 1030 7.6558 - - - - -
5.3582 1040 9.2184 - - - - -
5.4098 1050 7.5886 - - - - -
5.4614 1060 10.4976 - - - - -
5.5131 1070 10.785 - - - - -
5.5647 1080 10.2376 - - - - -
5.6163 1090 10.4871 - - - - -
5.6680 1100 12.6986 - - - - -
5.7196 1110 12.0688 - - - - -
5.7712 1120 13.1161 - - - - -
5.8228 1130 11.3866 - - - - -
5.8745 1140 13.7281 - - - - -
5.9261 1150 9.8432 - - - - -
5.9777 1160 8.2606 - - - - -
5.9984 1164 - 0.0427 0.0507 0.0561 0.0290 0.0576
6.0271 1170 7.0799 - - - - -
6.0787 1180 7.2981 - - - - -
6.1304 1190 7.0085 - - - - -
6.1820 1200 7.4587 - - - - -
6.2336 1210 7.8467 - - - - -
6.2853 1220 7.2008 - - - - -
6.3369 1230 8.8152 - - - - -
6.3885 1240 7.7205 - - - - -
6.4401 1250 9.9131 - - - - -
6.4918 1260 10.212 - - - - -
6.5434 1270 10.6791 - - - - -
6.5950 1280 9.8454 - - - - -
6.6467 1290 12.4647 - - - - -
6.6983 1300 11.8962 - - - - -
6.7499 1310 12.8014 - - - - -
6.8015 1320 12.1836 - - - - -
6.8532 1330 12.9114 - - - - -
6.9048 1340 12.1711 - - - - -
6.9564 1350 8.8125 - - - - -
6.9977 1358 - 0.0427 0.0507 0.0561 0.0290 0.0576
7.0058 1360 7.2281 - - - - -
7.0574 1370 6.6681 - - - - -
7.1091 1380 7.5282 - - - - -
7.1607 1390 7.1585 - - - - -
7.2123 1400 7.8507 - - - - -
7.2640 1410 7.4737 - - - - -
7.3156 1420 7.6963 - - - - -
7.3672 1430 8.8799 - - - - -
7.4188 1440 7.9977 - - - - -
7.4705 1450 10.9078 - - - - -
7.5221 1460 10.5731 - - - - -
7.5737 1470 10.1121 - - - - -
7.6254 1480 11.2426 - - - - -
7.6770 1490 12.4832 - - - - -
7.7286 1500 11.6954 - - - - -
7.7803 1510 13.4836 - - - - -
7.8319 1520 11.4752 - - - - -
7.8835 1530 13.8097 - - - - -
7.9351 1540 9.0087 - - - - -
7.9868 1550 7.709 - - - - -
8.0023 1553 - 0.0427 0.0507 0.0561 0.0290 0.0576
8.0361 1560 7.1515 - - - - -
8.0878 1570 7.2816 - - - - -
8.1394 1580 7.1392 - - - - -
8.1910 1590 7.7863 - - - - -
8.2427 1600 7.4939 - - - - -
8.2943 1610 7.3074 - - - - -
8.3459 1620 9.1739 - - - - -
8.3975 1630 7.3667 - - - - -
8.4492 1640 10.2528 - - - - -
8.5008 1650 10.6824 - - - - -
8.5524 1660 10.3765 - - - - -
8.6041 1670 9.853 - - - - -
8.6557 1680 12.8624 - - - - -
8.7073 1690 12.0849 - - - - -
8.7590 1700 12.7345 - - - - -
8.8106 1710 11.9884 - - - - -
8.8622 1720 13.2117 - - - - -
8.9138 1730 11.1261 - - - - -
8.9655 1740 8.5941 - - - - -
9.0016 1747 - 0.0427 0.0507 0.0561 0.0290 0.0576
9.0148 1750 7.2587 - - - - -
9.0665 1760 6.8577 - - - - -
9.1181 1770 7.2256 - - - - -
9.1697 1780 7.456 - - - - -
9.2214 1790 7.6563 - - - - -
9.2730 1800 7.3877 - - - - -
9.3246 1810 8.2009 - - - - -
9.3763 1820 8.5318 - - - - -
9.4279 1830 8.5052 - - - - -
9.4795 1840 10.9953 - - - - -
9.5311 1850 10.4012 - - - - -
9.5828 1860 10.0235 - - - - -
9.6344 1870 11.9031 - - - - -
9.6860 1880 12.5293 - - - - -
9.7377 1890 11.5157 - - - - -
9.7893 1900 12.8049 - - - - -
9.8409 1910 12.4659 - - - - -
9.8925 1920 13.1517 - - - - -
9.9442 1930 9.0604 0.0427 0.0507 0.0561 0.0290 0.0576
  • The bold row in the original table denotes the saved checkpoint (the best-scoring evaluation epoch, per load_best_model_at_end).

Framework Versions

  • Python: 3.10.13
  • Sentence Transformers: 3.0.1
  • Transformers: 4.42.3
  • PyTorch: 2.5.0.dev20240704+cu124
  • Accelerate: 0.32.1
  • Datasets: 2.20.0
  • Tokenizers: 0.19.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MatryoshkaLoss

@misc{kusupati2024matryoshka,
    title={Matryoshka Representation Learning}, 
    author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
    year={2024},
    eprint={2205.13147},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply}, 
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}