---
library_name: setfit
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
base_model: sentence-transformers/paraphrase-mpnet-base-v2
metrics:
- accuracy
widget:
- text: >-
    I apologize, but I cannot provide information on specific individuals,
    including their names or personal relationships, as this could potentially
    violate their privacy and personal boundaries. It is important to respect
    people's privacy and only share information that is publicly available and
    appropriate to share.
    Additionally, I would like to emphasize the importance of obtaining
    informed consent from individuals before sharing any personal information
    about them. It is crucial to respect people's privacy and adhere to
    ethical standards when handling personal data. If you have any other
    questions or concerns, please feel free to ask.
- text: >-
    You can use the parameters table in a tradeoff analysis to balance and
    compare multiple attributes. Specifically, it allows you to:
    1. Compare different revision configurations of a project.
    2. Evaluate product parameters against verification requests.
    3. Assess product parameters in relation to product freeze points.
    For instance, you can compare the parameter values of the latest item
    revision in a requirements structure with those on a verification request,
    or with previous revisions that share an effectivity based on their
    release status. This helps in making informed decisions by analyzing the
    tradeoffs between different configurations or stages of product
    development. If you need further assistance or have more questions, feel
    free to ask.
- text: >-
    Animal populations can adapt and evolve along with a changing environment
    if the change happens slow enough. Polar bears may be able to adapt to a
    temperature change over 100000 years, but not be able to adapt to the same
    temperature change over 1000 years. Since this recent anthropogenic
    driven change is happening faster than any natural temperature change, so
    I would say they are in danger in the wild. I guess we will be able to
    see them in zoos though.
- text: >-
    As of my last update in August 2021, there have been no significant legal
    critiques or controversies surrounding Duolingo. However, it's worth
    noting that this information is subject to change, and it's always a good
    idea to stay updated with recent news and developments related to the
    platform.
- text: >-
    The author clearly cites it as a Reddit thread. In a scholastic paper,
    you would be expected to have a bit more original content, but you
    wouldn't 'get in trouble'
pipeline_tag: text-classification
inference: true
model-index:
- name: SetFit with sentence-transformers/paraphrase-mpnet-base-v2
  results:
  - task:
      type: text-classification
      name: Text Classification
    dataset:
      name: Unknown
      type: unknown
      split: test
    metrics:
    - type: accuracy
      value: 0.9647606382978723
      name: Accuracy
---
# SetFit with sentence-transformers/paraphrase-mpnet-base-v2

This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [sentence-transformers/paraphrase-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-mpnet-base-v2) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.
The model has been trained using an efficient few-shot learning technique that involves:

- Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
- Training a classification head with features from the fine-tuned Sentence Transformer.
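The contrastive step works by turning a small labeled dataset into sentence pairs: same-label pairs get a similarity target of 1.0, cross-label pairs get 0.0. A minimal, self-contained sketch of that pair sampling is shown below (illustrative only — `generate_pairs` is a made-up helper, not the setfit library's implementation):

```python
import random

def generate_pairs(texts, labels, num_iterations=20, seed=42):
    """Sketch of SetFit-style contrastive pair sampling: for each example,
    draw same-label pairs (target 1.0) and cross-label pairs (target 0.0)."""
    rng = random.Random(seed)
    by_label = {}
    for t, y in zip(texts, labels):
        by_label.setdefault(y, []).append(t)
    pairs = []
    for t, y in zip(texts, labels):
        same = [s for s in by_label[y] if s != t]
        diff = [s for ly, group in by_label.items() if ly != y for s in group]
        for _ in range(num_iterations):
            if same:
                pairs.append((t, rng.choice(same), 1.0))  # positive pair
            if diff:
                pairs.append((t, rng.choice(diff), 0.0))  # negative pair
    return pairs

texts = ["great answer", "helpful reply", "toxic comment", "rude remark"]
labels = [1.0, 1.0, 0.0, 0.0]
pairs = generate_pairs(texts, labels, num_iterations=20)
# 4 anchors x 20 iterations x (1 positive + 1 negative) = 160 pairs
print(len(pairs))
```

The fine-tuned embeddings are then fed to the classification head; the pair targets drive the CosineSimilarityLoss listed under the training hyperparameters.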
## Model Details

### Model Description

- **Model Type:** SetFit
- **Sentence Transformer body:** [sentence-transformers/paraphrase-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-mpnet-base-v2)
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **Maximum Sequence Length:** 512 tokens
- **Number of Classes:** 2 classes
### Model Sources

- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
### Model Labels

| Label | Examples |
|:------|:---------|
| 1.0   |          |
| 0.0   |          |
## Evaluation

### Metrics

| Label   | Accuracy |
|:--------|:---------|
| **all** | 0.9648   |
## Uses

### Direct Use for Inference

First install the SetFit library:

```bash
pip install setfit
```

Then you can load this model and run inference.

```python
from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("Netta1994/setfit_e1_bz16_ni0_sz2500_corrected")
# Run inference
preds = model("The author clearly cites it as a Reddit thread. In a scholastic paper, you would be expected to have a bit more original content, but you wouldn't 'get in trouble' ")
```
## Training Details

### Training Set Metrics

| Training set | Min | Median  | Max |
|:-------------|:----|:--------|:----|
| Word count   | 1   | 85.3087 | 792 |

| Label | Training Sample Count |
|:------|:----------------------|
| 0.0   | 1979                  |
| 1.0   | 2546                  |
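As a quick sanity check on the numbers above, the class balance implies a majority-class baseline well below the reported accuracy (plain arithmetic, using only the counts from the tables in this card):

```python
# Label counts from the training set table
n_neg, n_pos = 1979, 2546
total = n_neg + n_pos                 # 4525 training samples
majority_baseline = n_pos / total     # always predict the majority label (1.0)
print(total, round(majority_baseline, 4))  # → 4525 0.5627
```

So the reported 0.9648 test accuracy sits roughly 40 points above always guessing the majority class.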
### Training Hyperparameters

- batch_size: (16, 16)
- num_epochs: (1, 1)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 20
- body_learning_rate: (2e-05, 2e-05)
- head_learning_rate: 2e-05
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False
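These hyperparameters determine the length of the contrastive phase: with `num_iterations: 20`, each training example yields 20 positive and 20 negative pairs, and dividing by the batch size reproduces the step counts seen in the results table below. This is a back-of-the-envelope check; the library's `oversampling` bookkeeping may differ slightly:

```python
import math

samples = 1979 + 2546      # training examples (label counts from this card)
num_iterations = 20
batch_size = 16

pairs = 2 * num_iterations * samples            # one positive + one negative per iteration
steps_per_epoch = math.ceil(pairs / batch_size)
print(pairs, steps_per_epoch)                   # → 181000 11313
# Step 11300 in the results table then sits at epoch 11300/11313 ≈ 0.9989,
# matching the final logged row.
print(round(11300 / steps_per_epoch, 4))        # → 0.9989
```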
### Training Results

| Epoch  | Step | Training Loss | Validation Loss |
|:-------|:-----|:--------------|:----------------|
0.0001 | 1 | 0.3787 | - |
0.0044 | 50 | 0.3135 | - |
0.0088 | 100 | 0.1365 | - |
0.0133 | 150 | 0.083 | - |
0.0177 | 200 | 0.1555 | - |
0.0221 | 250 | 0.0407 | - |
0.0265 | 300 | 0.0127 | - |
0.0309 | 350 | 0.0313 | - |
0.0354 | 400 | 0.0782 | - |
0.0398 | 450 | 0.148 | - |
0.0442 | 500 | 0.0396 | - |
0.0486 | 550 | 0.0747 | - |
0.0530 | 600 | 0.0255 | - |
0.0575 | 650 | 0.0098 | - |
0.0619 | 700 | 0.0532 | - |
0.0663 | 750 | 0.0006 | - |
0.0707 | 800 | 0.1454 | - |
0.0751 | 850 | 0.055 | - |
0.0796 | 900 | 0.0008 | - |
0.0840 | 950 | 0.0495 | - |
0.0884 | 1000 | 0.0195 | - |
0.0928 | 1050 | 0.1155 | - |
0.0972 | 1100 | 0.0024 | - |
0.1017 | 1150 | 0.0555 | - |
0.1061 | 1200 | 0.0612 | - |
0.1105 | 1250 | 0.0013 | - |
0.1149 | 1300 | 0.0004 | - |
0.1193 | 1350 | 0.061 | - |
0.1238 | 1400 | 0.0003 | - |
0.1282 | 1450 | 0.0014 | - |
0.1326 | 1500 | 0.0004 | - |
0.1370 | 1550 | 0.0575 | - |
0.1414 | 1600 | 0.0005 | - |
0.1458 | 1650 | 0.0656 | - |
0.1503 | 1700 | 0.0002 | - |
0.1547 | 1750 | 0.0008 | - |
0.1591 | 1800 | 0.0606 | - |
0.1635 | 1850 | 0.0478 | - |
0.1679 | 1900 | 0.0616 | - |
0.1724 | 1950 | 0.0009 | - |
0.1768 | 2000 | 0.0003 | - |
0.1812 | 2050 | 0.0004 | - |
0.1856 | 2100 | 0.0002 | - |
0.1900 | 2150 | 0.0001 | - |
0.1945 | 2200 | 0.0001 | - |
0.1989 | 2250 | 0.0001 | - |
0.2033 | 2300 | 0.0001 | - |
0.2077 | 2350 | 0.0001 | - |
0.2121 | 2400 | 0.0002 | - |
0.2166 | 2450 | 0.0002 | - |
0.2210 | 2500 | 0.0005 | - |
0.2254 | 2550 | 0.0001 | - |
0.2298 | 2600 | 0.0005 | - |
0.2342 | 2650 | 0.0002 | - |
0.2387 | 2700 | 0.0605 | - |
0.2431 | 2750 | 0.0004 | - |
0.2475 | 2800 | 0.0002 | - |
0.2519 | 2850 | 0.0004 | - |
0.2563 | 2900 | 0.0 | - |
0.2608 | 2950 | 0.0001 | - |
0.2652 | 3000 | 0.0004 | - |
0.2696 | 3050 | 0.0002 | - |
0.2740 | 3100 | 0.0004 | - |
0.2784 | 3150 | 0.0001 | - |
0.2829 | 3200 | 0.0514 | - |
0.2873 | 3250 | 0.0005 | - |
0.2917 | 3300 | 0.0581 | - |
0.2961 | 3350 | 0.0004 | - |
0.3005 | 3400 | 0.0001 | - |
0.3050 | 3450 | 0.0002 | - |
0.3094 | 3500 | 0.0009 | - |
0.3138 | 3550 | 0.0001 | - |
0.3182 | 3600 | 0.0 | - |
0.3226 | 3650 | 0.0019 | - |
0.3271 | 3700 | 0.0 | - |
0.3315 | 3750 | 0.0007 | - |
0.3359 | 3800 | 0.0001 | - |
0.3403 | 3850 | 0.0 | - |
0.3447 | 3900 | 0.0075 | - |
0.3492 | 3950 | 0.0 | - |
0.3536 | 4000 | 0.0008 | - |
0.3580 | 4050 | 0.0001 | - |
0.3624 | 4100 | 0.0 | - |
0.3668 | 4150 | 0.0002 | - |
0.3713 | 4200 | 0.0 | - |
0.3757 | 4250 | 0.0 | - |
0.3801 | 4300 | 0.0 | - |
0.3845 | 4350 | 0.0 | - |
0.3889 | 4400 | 0.0001 | - |
0.3934 | 4450 | 0.0001 | - |
0.3978 | 4500 | 0.0 | - |
0.4022 | 4550 | 0.0001 | - |
0.4066 | 4600 | 0.0001 | - |
0.4110 | 4650 | 0.0001 | - |
0.4155 | 4700 | 0.0 | - |
0.4199 | 4750 | 0.0 | - |
0.4243 | 4800 | 0.0 | - |
0.4287 | 4850 | 0.0005 | - |
0.4331 | 4900 | 0.0007 | - |
0.4375 | 4950 | 0.0 | - |
0.4420 | 5000 | 0.0 | - |
0.4464 | 5050 | 0.0003 | - |
0.4508 | 5100 | 0.0 | - |
0.4552 | 5150 | 0.0 | - |
0.4596 | 5200 | 0.0001 | - |
0.4641 | 5250 | 0.0 | - |
0.4685 | 5300 | 0.0 | - |
0.4729 | 5350 | 0.0 | - |
0.4773 | 5400 | 0.0 | - |
0.4817 | 5450 | 0.0 | - |
0.4862 | 5500 | 0.0 | - |
0.4906 | 5550 | 0.0 | - |
0.4950 | 5600 | 0.0 | - |
0.4994 | 5650 | 0.0001 | - |
0.5038 | 5700 | 0.0 | - |
0.5083 | 5750 | 0.0001 | - |
0.5127 | 5800 | 0.0 | - |
0.5171 | 5850 | 0.0 | - |
0.5215 | 5900 | 0.0 | - |
0.5259 | 5950 | 0.0 | - |
0.5304 | 6000 | 0.0 | - |
0.5348 | 6050 | 0.0 | - |
0.5392 | 6100 | 0.0 | - |
0.5436 | 6150 | 0.0 | - |
0.5480 | 6200 | 0.0 | - |
0.5525 | 6250 | 0.0 | - |
0.5569 | 6300 | 0.0 | - |
0.5613 | 6350 | 0.0001 | - |
0.5657 | 6400 | 0.0001 | - |
0.5701 | 6450 | 0.0 | - |
0.5746 | 6500 | 0.0 | - |
0.5790 | 6550 | 0.0 | - |
0.5834 | 6600 | 0.0 | - |
0.5878 | 6650 | 0.0 | - |
0.5922 | 6700 | 0.0 | - |
0.5967 | 6750 | 0.0 | - |
0.6011 | 6800 | 0.0 | - |
0.6055 | 6850 | 0.0 | - |
0.6099 | 6900 | 0.0 | - |
0.6143 | 6950 | 0.0 | - |
0.6188 | 7000 | 0.0 | - |
0.6232 | 7050 | 0.0 | - |
0.6276 | 7100 | 0.0 | - |
0.6320 | 7150 | 0.0 | - |
0.6364 | 7200 | 0.0 | - |
0.6409 | 7250 | 0.0 | - |
0.6453 | 7300 | 0.0 | - |
0.6497 | 7350 | 0.0 | - |
0.6541 | 7400 | 0.0 | - |
0.6585 | 7450 | 0.0 | - |
0.6630 | 7500 | 0.0 | - |
0.6674 | 7550 | 0.0 | - |
0.6718 | 7600 | 0.0 | - |
0.6762 | 7650 | 0.0 | - |
0.6806 | 7700 | 0.0 | - |
0.6851 | 7750 | 0.0 | - |
0.6895 | 7800 | 0.0 | - |
0.6939 | 7850 | 0.0 | - |
0.6983 | 7900 | 0.0 | - |
0.7027 | 7950 | 0.0 | - |
0.7072 | 8000 | 0.0 | - |
0.7116 | 8050 | 0.0 | - |
0.7160 | 8100 | 0.0 | - |
0.7204 | 8150 | 0.0 | - |
0.7248 | 8200 | 0.0 | - |
0.7292 | 8250 | 0.0 | - |
0.7337 | 8300 | 0.0 | - |
0.7381 | 8350 | 0.0 | - |
0.7425 | 8400 | 0.0 | - |
0.7469 | 8450 | 0.0001 | - |
0.7513 | 8500 | 0.0 | - |
0.7558 | 8550 | 0.0 | - |
0.7602 | 8600 | 0.0 | - |
0.7646 | 8650 | 0.0 | - |
0.7690 | 8700 | 0.0 | - |
0.7734 | 8750 | 0.0 | - |
0.7779 | 8800 | 0.0 | - |
0.7823 | 8850 | 0.0 | - |
0.7867 | 8900 | 0.0 | - |
0.7911 | 8950 | 0.0 | - |
0.7955 | 9000 | 0.0 | - |
0.8000 | 9050 | 0.0 | - |
0.8044 | 9100 | 0.0 | - |
0.8088 | 9150 | 0.0 | - |
0.8132 | 9200 | 0.0 | - |
0.8176 | 9250 | 0.0 | - |
0.8221 | 9300 | 0.0 | - |
0.8265 | 9350 | 0.0 | - |
0.8309 | 9400 | 0.0 | - |
0.8353 | 9450 | 0.0 | - |
0.8397 | 9500 | 0.0 | - |
0.8442 | 9550 | 0.0 | - |
0.8486 | 9600 | 0.0 | - |
0.8530 | 9650 | 0.0 | - |
0.8574 | 9700 | 0.0 | - |
0.8618 | 9750 | 0.0 | - |
0.8663 | 9800 | 0.0 | - |
0.8707 | 9850 | 0.0001 | - |
0.8751 | 9900 | 0.0 | - |
0.8795 | 9950 | 0.0 | - |
0.8839 | 10000 | 0.0 | - |
0.8884 | 10050 | 0.0 | - |
0.8928 | 10100 | 0.0 | - |
0.8972 | 10150 | 0.0 | - |
0.9016 | 10200 | 0.0 | - |
0.9060 | 10250 | 0.0 | - |
0.9105 | 10300 | 0.0 | - |
0.9149 | 10350 | 0.0 | - |
0.9193 | 10400 | 0.0 | - |
0.9237 | 10450 | 0.0 | - |
0.9281 | 10500 | 0.0 | - |
0.9326 | 10550 | 0.0 | - |
0.9370 | 10600 | 0.0 | - |
0.9414 | 10650 | 0.0 | - |
0.9458 | 10700 | 0.0 | - |
0.9502 | 10750 | 0.0 | - |
0.9547 | 10800 | 0.0 | - |
0.9591 | 10850 | 0.0 | - |
0.9635 | 10900 | 0.0 | - |
0.9679 | 10950 | 0.0 | - |
0.9723 | 11000 | 0.0 | - |
0.9768 | 11050 | 0.0 | - |
0.9812 | 11100 | 0.0 | - |
0.9856 | 11150 | 0.0 | - |
0.9900 | 11200 | 0.0 | - |
0.9944 | 11250 | 0.0 | - |
0.9989 | 11300 | 0.0 | - |
### Framework Versions

- Python: 3.10.14
- SetFit: 1.0.3
- Sentence Transformers: 2.7.0
- Transformers: 4.40.1
- PyTorch: 2.2.0+cu121
- Datasets: 2.19.1
- Tokenizers: 0.19.1
## Citation

### BibTeX

```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
  doi = {10.48550/ARXIV.2209.11055},
  url = {https://arxiv.org/abs/2209.11055},
  author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
  keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
  title = {Efficient Few-Shot Learning Without Prompts},
  publisher = {arXiv},
  year = {2022},
  copyright = {Creative Commons Attribution 4.0 International}
}
```