SentenceTransformer based on BAAI/bge-large-en-v1.5
This is a sentence-transformers model finetuned from BAAI/bge-large-en-v1.5 on the baconnier/finance2_dataset_private dataset. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: BAAI/bge-large-en-v1.5
- Maximum Sequence Length: 512 tokens
- Output Dimensionality: 1024 dimensions
- Similarity Function: Cosine Similarity
- Training Dataset: baconnier/finance2_dataset_private
Model Sources
- Documentation: Sentence Transformers Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sentence Transformers on Hugging Face
Full Model Architecture
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
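Because the final Normalize() module L2-normalizes the CLS-pooled output, dot product and cosine similarity give identical scores. A quick, optional sanity check (illustrative only, not part of the original card):
import numpy as np
from sentence_transformers import SentenceTransformer
model = SentenceTransformer("baconnier/Finance_embedding_large_en-V1.5")
emb = model.encode(["What is a wallflower stock?"])
# Normalize() makes every embedding unit-length, so dot product equals cosine similarity
print(emb.shape, np.linalg.norm(emb[0]))  # (1, 1024), ~1.0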
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("baconnier/Finance_embedding_large_en-V1.5")
# Run inference
sentences = [
'Should John consider the advice from his uncle about investing in cryptocurrency? Why or why not?',
"John's uncle is not a financial expert, and the cryptocurrency has experienced significant volatility, with prices fluctuating by 20% or more in a single day. Investing in such a volatile asset may not align with John's primary goal of maximizing his long-term wealth. Therefore, John should not consider his uncle's advice about investing in cryptocurrency.\nNo, John should not consider his uncle's advice about investing in cryptocurrency because of the high volatility and the fact that it may not align with his long-term wealth maximization goal.",
'The unit of trading is crucial for investors to consider when placing orders because it directly impacts the total cost and potential profit or loss of a trade. As the unit of trading sets the minimum quantity of shares that can be bought or sold, investors must ensure their orders are in multiples of this unit. For example, if the unit of trading is 100 shares and an investor wants to buy 50 shares, they would need to round up to 100 shares, which increases the total cost of the trade. Understanding the unit of trading helps investors plan their trades effectively and manage their risk.\nInvestors must consider the unit of trading when placing orders, as it determines the minimum quantity of shares to be bought or sold, directly affecting the total cost and potential profit or loss of the trade.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
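Since the model was trained on finance question/answer triplets, a natural downstream use is semantic search over finance passages. A minimal sketch, assuming a small made-up corpus (the passages and query below are illustrative and not taken from the training data):
from sentence_transformers import SentenceTransformer, util
model = SentenceTransformer("baconnier/Finance_embedding_large_en-V1.5")
# Illustrative corpus and query
corpus = [
    "The unit of trading sets the minimum quantity of shares that can be bought or sold.",
    "A wallflower stock is one that has fallen out of favor with investors.",
    "The Accumulated Benefit Obligation assumes the pension plan terminates immediately.",
]
query = "Why does the unit of trading matter when placing an order?"
corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)
# Rank the corpus by cosine similarity and keep the best match
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=1)
print(corpus[hits[0][0]["corpus_id"]], hits[0][0]["score"])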
Evaluation
Metrics
Triplet
- Dataset: Finance_Embedding_Metric
- Evaluated with: TripletEvaluator
Metric | Value |
---|---|
cosine_accuracy | 1.0 |
dot_accuracy | 0.0 |
manhattan_accuracy | 1.0 |
euclidean_accuracy | 1.0 |
max_accuracy | 1.0 |
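The accuracies above come from Sentence Transformers' TripletEvaluator. A minimal sketch of how such an evaluation could be rerun, assuming you have access to the private dataset and that it exposes a test split with anchor/positive/negative columns (the split name is an assumption):
from datasets import load_dataset
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import TripletEvaluator
model = SentenceTransformer("baconnier/Finance_embedding_large_en-V1.5")
# The dataset is private; the split name "test" is an assumption, adjust as needed
eval_ds = load_dataset("baconnier/finance2_dataset_private", split="test")
evaluator = TripletEvaluator(
    anchors=eval_ds["anchor"],
    positives=eval_ds["positive"],
    negatives=eval_ds["negative"],
    name="Finance_Embedding_Metric",
)
results = evaluator(model)
print(results)  # e.g. {'Finance_Embedding_Metric_cosine_accuracy': 1.0, ...}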
Training Details
Training Dataset
baconnier/finance2_dataset_private
- Dataset: baconnier/finance2_dataset_private at f384fe0
- Size: 36,223 training samples
- Columns: anchor, positive, and negative
- Approximate statistics based on the first 1000 samples:
  | anchor | positive | negative |
---|---|---|---|
type | string | string | string |
details | min: 9 tokens, mean: 25.02 tokens, max: 63 tokens | min: 22 tokens, mean: 152.04 tokens, max: 460 tokens | min: 22 tokens, mean: 145.22 tokens, max: 411 tokens |
- Samples (first three rows, shown as anchor / positive / negative):
  - Sample 1:
    - anchor: When was the Libyan Dinar (LYD) introduced, and what was the exchange rate with the previous currency?
    - positive: According to the context, the Libyan Dinar (LYD) was introduced in 1971, replacing the Libyan pound at a rate of 1 dinar = 1 pound. The Libyan Dinar (LYD) was introduced in 1971, replacing the Libyan pound at a rate of 1 dinar to 1 pound.
    - negative: To find the difference between John's new offer price and his average purchase price: 1. John's average purchase price for XYZ stock was $50 per share. 2. After fulfilling the client's order, John raised his offer price to $62. 3. $62 - $50 = $12. Therefore, the difference is $12 per share. The difference between John's new offer price of $62 and his average purchase price of $50 is $12 per share.
  - Sample 2:
    - anchor: How many fillér would you have if you exchanged 10 USD for Hungarian Forints at the given exchange rate?
    - positive: First, calculate the HUF equivalent of 10 USD using the exchange rate: 1 USD ≈ 339 HUF, so 10 USD ≈ 10 × 339 = 3,390 HUF. The context also states that 1 HUF = 100 fillér, so to find the number of fillér, multiply the HUF amount by 100: 3,390 HUF × 100 fillér/HUF = 339,000 fillér. At the given exchange rate, exchanging 10 USD would give you approximately 339,000 fillér.
    - negative: Given the client's current portfolio allocation of 60% stocks, 30% bonds, and 10% real estate, and their moderate risk tolerance, several additional asset classes and investment vehicles could be considered to further reduce unsystematic risk through diversification. First, the client could explore adding international stocks and bonds to their portfolio, as these assets can provide exposure to different economic cycles and market conditions, potentially reducing the overall portfolio risk. Second, commodities, such as gold or oil, could be added in a small allocation, as they tend to have low correlations with stocks and bonds and can act as a hedge against inflation. Third, the client could consider increasing their allocation to alternative investments, such as real estate investment trusts (REITs) or private equity, which can offer diversification benefits and potentially higher returns, although these investments may come with higher fees and lower liquidity. It is essential to carefully evaluate the specific risks and characteristics of each new asset class and ensure that the allocation to these investments aligns with the client's overall risk tolerance and long-term return goals. Additionally, regular portfolio reviews and rebalancing can help maintain the desired level of diversification and risk management over time. To further reduce unsystematic risk through diversification, the client could consider adding international stocks and bonds, commodities, and alternative investments like REITs or private equity to their portfolio, while carefully evaluating the specific risks and characteristics of each new asset class and ensuring the allocation aligns with their risk tolerance and long-term return goals.
  - Sample 3:
    - anchor: What is the total value of John's vintage car collection and his wife's jewelry collection combined?
    - positive: The passage states that John's vintage car collection is valued at $500,000 and his wife's jewelry collection is worth $200,000. To find the total value, we add these two amounts: $500,000 + $200,000 = $700,000. Therefore, the total value of John's vintage car collection and his wife's jewelry collection combined is $700,000. The total value of John's vintage car collection and his wife's jewelry collection combined is $700,000.
    - negative: To compare Acme Inc.'s last sale price to its opening price, I'll use the given information: opening price $100.50, last sale price $101.50. Subtracting the opening price from the last sale price: $101.50 - $100.50 = $1.00. This means the last sale price was $1.00 higher than the opening price. To calculate the percentage difference: ($101.50 - $100.50) / $100.50 * 100 = 0.995%. Rounded to two decimal places, the last sale price was 1.00% higher than the opening price. Acme Inc.'s last sale price was $1.00, or 1.00%, higher than its opening price.
- Loss: MultipleNegativesRankingLoss with these parameters: { "scale": 20.0, "similarity_fct": "cos_sim" }
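For reference, MultipleNegativesRankingLoss scores each anchor against its paired positive while treating the explicit negative column and the other in-batch texts as negatives. A minimal sketch of constructing the loss with the parameters listed above (scale=20.0 and cosine similarity are also the library defaults):
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MultipleNegativesRankingLoss
model = SentenceTransformer("BAAI/bge-large-en-v1.5")
# scale=20.0 with cosine similarity matches the parameters listed above
loss = MultipleNegativesRankingLoss(model, scale=20.0)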
Evaluation Dataset
baconnier/finance2_dataset_private
- Dataset: baconnier/finance2_dataset_private at f384fe0
- Size: 7,762 evaluation samples
- Columns: anchor, positive, and negative
- Approximate statistics based on the first 1000 samples:
  | anchor | positive | negative |
---|---|---|---|
type | string | string | string |
details | min: 9 tokens, mean: 25.52 tokens, max: 74 tokens | min: 22 tokens, mean: 153.66 tokens, max: 512 tokens | min: 26 tokens, mean: 150.85 tokens, max: 487 tokens |
- Samples (first three rows, shown as anchor / positive / negative):
  - Sample 1:
    - anchor: What factors have contributed to Acme Inc.'s stock becoming a wallflower?
    - positive: Several factors have contributed to Acme Inc.'s stock becoming a wallflower: 1. Declining sales: Acme Inc. has experienced a decline in sales, which has negatively impacted its financial performance. 2. Decreasing profit margins: Along with declining sales, Acme Inc.'s profit margins have also decreased, further affecting its bottom line. 3. Falling stock price: As a result of the declining sales and profit margins, Acme Inc.'s stock price has dropped significantly. 4. Low P/E ratio: The company's P/E ratio has decreased to 8, which is much lower than the industry average of 15. This low P/E ratio indicates that investors are not willing to pay a premium for Acme Inc.'s stock due to its poor financial performance. These factors have collectively led to Acme Inc.'s stock falling out of favor with investors, making it a wallflower stock. Acme Inc.'s stock has become a wallflower due to a combination of factors, including declining sales, decreasing profit margins, a falling stock price, and a low P/E ratio compared to the industry average, which have led to investors losing interest in the company's stock.
    - negative: The minimum investment for all of Acme's funds is $1,000. This means that investors need to invest at least $1,000 to participate in any of the funds offered by Acme Investments. The minimum investment for Acme Investments' funds is $1,000.
  - Sample 2:
    - anchor: How does the Accumulated Benefit Obligation (ABO) differ from the Projected Benefit Obligation (PBO) in terms of assumptions about future salary increases?
    - positive: The Accumulated Benefit Obligation (ABO) assumes that the pension plan will terminate immediately and does not take into account any future salary increases. In contrast, the Projected Benefit Obligation (PBO) includes assumptions about future salary increases when calculating the present value of an employee's pension benefits. The ABO does not consider future salary increases, assuming immediate plan termination, while the PBO incorporates assumptions about future salary increases in its calculations.
    - negative: The loan agreement included two specific covenants: 1) Acme Corporation had to maintain a debt-to-equity ratio below 2.5, and 2) Acme Corporation had to maintain a minimum cash balance of $1 million. The loan covenants required Acme Corporation to maintain a debt-to-equity ratio below 2.5 and a minimum cash balance of $1 million.
  - Sample 3:
    - anchor: What is the annual interest rate of the annuity, and how is it compounded?
    - positive: According to the context, the annuity has an annual interest rate of 3%. This interest is compounded monthly, meaning the 3% annual rate is divided by 12 (the number of months in a year) and applied to the account balance each month. This results in a slightly higher effective annual rate due to the compound growth. The annuity has an annual interest rate of 3%, which is compounded monthly, resulting in compound growth of the account balance.
    - negative: The partnership's total taxable income is $30,000. John's share of the partnership's taxable income is 25%. To calculate John's share in dollars, multiply the total taxable income by his percentage share: $30,000 × 0.25 = $7,500. To confirm that $7,500 represents 25% of the total taxable income, divide John's share by the total and multiply by 100: ($7,500 ÷ $30,000) × 100 = 25%. John's share of the partnership's annual taxable income is $7,500, which represents 25% of the partnership's total taxable income of $30,000.
- Loss: MultipleNegativesRankingLoss with these parameters: { "scale": 20.0, "similarity_fct": "cos_sim" }
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 16
- per_device_eval_batch_size: 16
- num_train_epochs: 1
- warmup_ratio: 0.1
- bf16: True
- batch_sampler: no_duplicates
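A minimal sketch of how these non-default values map onto a Sentence Transformers v3 training run; the output directory and split names are assumptions, and the private dataset requires access:
from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss
from sentence_transformers.training_args import BatchSamplers
model = SentenceTransformer("BAAI/bge-large-en-v1.5")
dataset = load_dataset("baconnier/finance2_dataset_private")  # private; requires access
args = SentenceTransformerTrainingArguments(
    output_dir="finance-embedding",             # illustrative output path
    num_train_epochs=1,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    warmup_ratio=0.1,
    bf16=True,
    eval_strategy="steps",
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # avoids duplicate texts within a batch
)
trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["test"],               # split names are assumptions
    loss=MultipleNegativesRankingLoss(model, scale=20.0),
)
trainer.train()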
All Hyperparameters
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 16
- per_device_eval_batch_size: 16
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- learning_rate: 5e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 1
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.1
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: True
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: False
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- batch_sampler: no_duplicates
- multi_dataset_batch_sampler: proportional
Training Logs
Epoch | Step | Training Loss | Validation Loss | Finance_Embedding_Metric_max_accuracy |
---|---|---|---|---|
0.0044 | 10 | 0.0533 | - | - |
0.0088 | 20 | 0.0359 | - | - |
0.0133 | 30 | 0.0119 | - | - |
0.0177 | 40 | 0.0102 | - | - |
0.0221 | 50 | 0.0048 | - | - |
0.0265 | 60 | 0.0053 | - | - |
0.0309 | 70 | 0.0036 | - | - |
0.0353 | 80 | 0.0036 | - | - |
0.0398 | 90 | 0.0064 | - | - |
0.0442 | 100 | 0.0016 | - | - |
0.0486 | 110 | 0.0026 | - | - |
0.0530 | 120 | 0.0044 | - | - |
0.0574 | 130 | 0.0034 | - | - |
0.0618 | 140 | 0.0045 | - | - |
0.0663 | 150 | 0.0014 | - | - |
0.0707 | 160 | 0.0025 | - | - |
0.0751 | 170 | 0.0023 | - | - |
0.0795 | 180 | 0.0011 | - | - |
0.0839 | 190 | 0.002 | - | - |
0.0883 | 200 | 0.0011 | - | - |
0.0928 | 210 | 0.0012 | - | - |
0.0972 | 220 | 0.002 | - | - |
0.1003 | 227 | - | 0.0013 | - |
0.1016 | 230 | 0.0041 | - | - |
0.1060 | 240 | 0.0034 | - | - |
0.1104 | 250 | 0.0103 | - | - |
0.1148 | 260 | 0.0089 | - | - |
0.1193 | 270 | 0.0018 | - | - |
0.1237 | 280 | 0.001 | - | - |
0.1281 | 290 | 0.0018 | - | - |
0.1325 | 300 | 0.0017 | - | - |
0.1369 | 310 | 0.0033 | - | - |
0.1413 | 320 | 0.0047 | - | - |
0.1458 | 330 | 0.0027 | - | - |
0.1502 | 340 | 0.0013 | - | - |
0.1546 | 350 | 0.0026 | - | - |
0.1590 | 360 | 0.0013 | - | - |
0.1634 | 370 | 0.0012 | - | - |
0.1678 | 380 | 0.002 | - | - |
0.1723 | 390 | 0.0029 | - | - |
0.1767 | 400 | 0.0012 | - | - |
0.1811 | 410 | 0.0013 | - | - |
0.1855 | 420 | 0.0025 | - | - |
0.1899 | 430 | 0.0019 | - | - |
0.1943 | 440 | 0.0018 | - | - |
0.1988 | 450 | 0.0019 | - | - |
0.2005 | 454 | - | 0.0020 | - |
0.2032 | 460 | 0.0017 | - | - |
0.2076 | 470 | 0.0021 | - | - |
0.2120 | 480 | 0.0044 | - | - |
0.2164 | 490 | 0.0008 | - | - |
0.2208 | 500 | 0.0026 | - | - |
0.2253 | 510 | 0.0016 | - | - |
0.2297 | 520 | 0.0057 | - | - |
0.2341 | 530 | 0.0018 | - | - |
0.2385 | 540 | 0.0019 | - | - |
0.2429 | 550 | 0.004 | - | - |
0.2473 | 560 | 0.0033 | - | - |
0.2518 | 570 | 0.0007 | - | - |
0.2562 | 580 | 0.0106 | - | - |
0.2606 | 590 | 0.0018 | - | - |
0.2650 | 600 | 0.0019 | - | - |
0.2694 | 610 | 0.0092 | - | - |
0.2739 | 620 | 0.003 | - | - |
0.2783 | 630 | 0.0015 | - | - |
0.2827 | 640 | 0.0017 | - | - |
0.2871 | 650 | 0.0073 | - | - |
0.2915 | 660 | 0.0008 | - | - |
0.2959 | 670 | 0.0009 | - | - |
0.3004 | 680 | 0.0006 | - | - |
0.3008 | 681 | - | 0.0018 | - |
0.3048 | 690 | 0.0006 | - | - |
0.3092 | 700 | 0.0006 | - | - |
0.3136 | 710 | 0.02 | - | - |
0.3180 | 720 | 0.0083 | - | - |
0.3224 | 730 | 0.0029 | - | - |
0.3269 | 740 | 0.002 | - | - |
0.3313 | 750 | 0.0012 | - | - |
0.3357 | 760 | 0.0018 | - | - |
0.3401 | 770 | 0.0015 | - | - |
0.3445 | 780 | 0.0014 | - | - |
0.3489 | 790 | 0.0012 | - | - |
0.3534 | 800 | 0.0006 | - | - |
0.3578 | 810 | 0.0011 | - | - |
0.3622 | 820 | 0.0007 | - | - |
0.3666 | 830 | 0.0005 | - | - |
0.3710 | 840 | 0.0029 | - | - |
0.3754 | 850 | 0.0014 | - | - |
0.3799 | 860 | 0.0025 | - | - |
0.3843 | 870 | 0.004 | - | - |
0.3887 | 880 | 0.0024 | - | - |
0.3931 | 890 | 0.0009 | - | - |
0.3975 | 900 | 0.0018 | - | - |
0.4011 | 908 | - | 0.0039 | - |
0.4019 | 910 | 0.0025 | - | - |
0.4064 | 920 | 0.001 | - | - |
0.4108 | 930 | 0.0032 | - | - |
0.4152 | 940 | 0.0009 | - | - |
0.4196 | 950 | 0.0018 | - | - |
0.4240 | 960 | 0.0004 | - | - |
0.4284 | 970 | 0.0016 | - | - |
0.4329 | 980 | 0.0009 | - | - |
0.4373 | 990 | 0.0015 | - | - |
0.4417 | 1000 | 0.0012 | - | - |
0.4461 | 1010 | 0.0006 | - | - |
0.4505 | 1020 | 0.0088 | - | - |
0.4549 | 1030 | 0.0013 | - | - |
0.4594 | 1040 | 0.0011 | - | - |
0.4638 | 1050 | 0.0016 | - | - |
0.4682 | 1060 | 0.0006 | - | - |
0.4726 | 1070 | 0.0015 | - | - |
0.4770 | 1080 | 0.0019 | - | - |
0.4814 | 1090 | 0.001 | - | - |
0.4859 | 1100 | 0.0007 | - | - |
0.4903 | 1110 | 0.0015 | - | - |
0.4947 | 1120 | 0.0015 | - | - |
0.4991 | 1130 | 0.0013 | - | - |
0.5013 | 1135 | - | 0.0019 | - |
0.5035 | 1140 | 0.0009 | - | - |
0.5080 | 1150 | 0.0024 | - | - |
0.5124 | 1160 | 0.0016 | - | - |
0.5168 | 1170 | 0.0008 | - | - |
0.5212 | 1180 | 0.0018 | - | - |
0.5256 | 1190 | 0.0085 | - | - |
0.5300 | 1200 | 0.0082 | - | - |
0.5345 | 1210 | 0.0034 | - | - |
0.5389 | 1220 | 0.001 | - | - |
0.5433 | 1230 | 0.0012 | - | - |
0.5477 | 1240 | 0.013 | - | - |
0.5521 | 1250 | 0.0007 | - | - |
0.5565 | 1260 | 0.002 | - | - |
0.5610 | 1270 | 0.0006 | - | - |
0.5654 | 1280 | 0.0009 | - | - |
0.5698 | 1290 | 0.0012 | - | - |
0.5742 | 1300 | 0.0009 | - | - |
0.5786 | 1310 | 0.001 | - | - |
0.5830 | 1320 | 0.0006 | - | - |
0.5875 | 1330 | 0.0008 | - | - |
0.5919 | 1340 | 0.001 | - | - |
0.5963 | 1350 | 0.0028 | - | - |
0.6007 | 1360 | 0.0006 | - | - |
0.6016 | 1362 | - | 0.0014 | - |
0.6051 | 1370 | 0.0007 | - | - |
0.6095 | 1380 | 0.0008 | - | - |
0.6140 | 1390 | 0.0003 | - | - |
0.6184 | 1400 | 0.0016 | - | - |
0.6228 | 1410 | 0.0017 | - | - |
0.6272 | 1420 | 0.0015 | - | - |
0.6316 | 1430 | 0.0005 | - | - |
0.6360 | 1440 | 0.0004 | - | - |
0.6405 | 1450 | 0.0054 | - | - |
0.6449 | 1460 | 0.0017 | - | - |
0.6493 | 1470 | 0.0024 | - | - |
0.6537 | 1480 | 0.0019 | - | - |
0.6581 | 1490 | 0.0007 | - | - |
0.6625 | 1500 | 0.001 | - | - |
0.6670 | 1510 | 0.0009 | - | - |
0.6714 | 1520 | 0.0011 | - | - |
0.6758 | 1530 | 0.0008 | - | - |
0.6802 | 1540 | 0.0004 | - | - |
0.6846 | 1550 | 0.0003 | - | - |
0.6890 | 1560 | 0.0006 | - | - |
0.6935 | 1570 | 0.0017 | - | - |
0.6979 | 1580 | 0.0025 | - | - |
0.7019 | 1589 | - | 0.0010 | - |
0.7023 | 1590 | 0.0009 | - | - |
0.7067 | 1600 | 0.0006 | - | - |
0.7111 | 1610 | 0.0008 | - | - |
0.7155 | 1620 | 0.0006 | - | - |
0.7200 | 1630 | 0.0004 | - | - |
0.7244 | 1640 | 0.0021 | - | - |
0.7288 | 1650 | 0.0005 | - | - |
0.7332 | 1660 | 0.0006 | - | - |
0.7376 | 1670 | 0.0006 | - | - |
0.7420 | 1680 | 0.0004 | - | - |
0.7465 | 1690 | 0.0003 | - | - |
0.7509 | 1700 | 0.0004 | - | - |
0.7553 | 1710 | 0.0004 | - | - |
0.7597 | 1720 | 0.0005 | - | - |
0.7641 | 1730 | 0.0052 | - | - |
0.7686 | 1740 | 0.0002 | - | - |
0.7730 | 1750 | 0.0011 | - | - |
0.7774 | 1760 | 0.0012 | - | - |
0.7818 | 1770 | 0.0012 | - | - |
0.7862 | 1780 | 0.0017 | - | - |
0.7906 | 1790 | 0.0011 | - | - |
0.7951 | 1800 | 0.0008 | - | - |
0.7995 | 1810 | 0.0007 | - | - |
0.8021 | 1816 | - | 0.0008 | - |
0.8039 | 1820 | 0.0015 | - | - |
0.8083 | 1830 | 0.0003 | - | - |
0.8127 | 1840 | 0.0003 | - | - |
0.8171 | 1850 | 0.0005 | - | - |
0.8216 | 1860 | 0.0033 | - | - |
0.8260 | 1870 | 0.0005 | - | - |
0.8304 | 1880 | 0.0003 | - | - |
0.8348 | 1890 | 0.0004 | - | - |
0.8392 | 1900 | 0.0002 | - | - |
0.8436 | 1910 | 0.0016 | - | - |
0.8481 | 1920 | 0.0119 | - | - |
0.8525 | 1930 | 0.001 | - | - |
0.8569 | 1940 | 0.0002 | - | - |
0.8613 | 1950 | 0.0012 | - | - |
0.8657 | 1960 | 0.0003 | - | - |
0.8701 | 1970 | 0.0004 | - | - |
0.8746 | 1980 | 0.001 | - | - |
0.8790 | 1990 | 0.0005 | - | - |
0.8834 | 2000 | 0.0243 | - | - |
0.8878 | 2010 | 0.0003 | - | - |
0.8922 | 2020 | 0.0005 | - | - |
0.8966 | 2030 | 0.0004 | - | - |
0.9011 | 2040 | 0.0003 | - | - |
0.9024 | 2043 | - | 0.0008 | - |
0.9055 | 2050 | 0.0017 | - | - |
0.9099 | 2060 | 0.0013 | - | - |
0.9143 | 2070 | 0.0007 | - | - |
0.9187 | 2080 | 0.004 | - | - |
0.9231 | 2090 | 0.0021 | - | - |
0.9276 | 2100 | 0.0003 | - | - |
0.9320 | 2110 | 0.0004 | - | - |
0.9364 | 2120 | 0.0008 | - | - |
0.9408 | 2130 | 0.0002 | - | - |
0.9452 | 2140 | 0.0009 | - | - |
0.9496 | 2150 | 0.0006 | - | - |
0.9541 | 2160 | 0.0004 | - | - |
0.9585 | 2170 | 0.0004 | - | - |
0.9629 | 2180 | 0.0008 | - | - |
0.9673 | 2190 | 0.0006 | - | - |
0.9717 | 2200 | 0.0004 | - | - |
0.9761 | 2210 | 0.0004 | - | - |
0.9806 | 2220 | 0.0006 | - | - |
0.9850 | 2230 | 0.0028 | - | - |
0.9894 | 2240 | 0.0038 | - | - |
0.9938 | 2250 | 0.0003 | - | - |
0.9982 | 2260 | 0.0003 | - | - |
1.0 | 2264 | - | - | 1.0 |
Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.0.1
- Transformers: 4.41.2
- PyTorch: 2.3.0+cu121
- Accelerate: 0.31.0
- Datasets: 2.19.2
- Tokenizers: 0.19.1
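To approximately reproduce this environment, the versions listed above can be pinned (the torch pin corresponds to the 2.3.0+cu121 build on a CUDA 12.1 machine):
pip install sentence-transformers==3.0.1 transformers==4.41.2 torch==2.3.0 accelerate==0.31.0 datasets==2.19.2 tokenizers==0.19.1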
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
MultipleNegativesRankingLoss
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}