---
library_name: setfit
tags:
  - setfit
  - sentence-transformers
  - text-classification
  - generated_from_setfit_trainer
metrics:
  - metric
widget:
  - text: >-
      A combined 20 million people per year die of smoking and hunger,  so
      authorities can't seem to feed people and they allow you to buy cigarettes
      but we are facing another lockdown for a virus that has a 99.5% survival
      rate!!! THINK PEOPLE. LOOK AT IT LOGICALLY WITH YOUR OWN EYES.
  - text: >-
      Scientists do not agree on the consequences of climate change, nor is
      there any consensus on that subject. The predictions on that from are just
      ascientific speculation. Bring on the warming."
  - text: >-
      If Tam is our "top doctor"....I am going back to leaches and voodoo...just
      as much science in that as the crap she spouts
  - text: "Can she skip school by herself and sit infront of parliament? \r\n Fake emotions and just a good actor."
  - text: my dad had huge ones..so they may be real..
pipeline_tag: text-classification
inference: false
base_model: sentence-transformers/paraphrase-mpnet-base-v2
model-index:
  - name: SetFit with sentence-transformers/paraphrase-mpnet-base-v2
    results:
      - task:
          type: text-classification
          name: Text Classification
        dataset:
          name: Unknown
          type: unknown
          split: test
        metrics:
          - type: metric
            value: 0.65694899973345
            name: Metric
---

# SetFit with sentence-transformers/paraphrase-mpnet-base-v2

This is a SetFit model that can be used for multi-label Text Classification. This SetFit model uses sentence-transformers/paraphrase-mpnet-base-v2 as the Sentence Transformer embedding model. A ClassifierChain instance is used for classification.
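For intuition, the classification head here is an ordinary scikit-learn estimator. A minimal sketch of a classifier-chain head, assuming a LogisticRegression base classifier (SetFit's default head); this is illustrative, not this repository's training code:

```python
from sklearn.linear_model import LogisticRegression
from sklearn.multioutput import ClassifierChain

# Illustrative sketch only: a ClassifierChain predicts each label in
# sequence, feeding earlier binary predictions into later ones, so
# correlations between the labels can be exploited.
head = ClassifierChain(LogisticRegression())
```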

The model has been trained using an efficient few-shot learning technique that involves:

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer.

## Model Details

### Model Description

| Property | Value |
|:---------|:------|
| **Model Type** | SetFit |
| **Sentence Transformer body** | [sentence-transformers/paraphrase-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-mpnet-base-v2) |
| **Classification head** | a ClassifierChain instance |

### Model Sources

- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)

## Evaluation

### Metrics

| Label | Metric |
|:------|:-------|
| all   | 0.6569 |

## Uses

### Direct Use for Inference

First install the SetFit library:

```bash
pip install setfit
```

Then you can load this model and run inference.

```python
from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("CrisisNarratives/setfit-13classes-multi_label")
# Run inference
preds = model("my dad had huge ones..so they may be real..")
```
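Because this is a multi-label model, `preds` is a row of binary indicators, one per class, rather than a single label id. The sketch below uses `predict_proba` to inspect per-label probabilities; the 0.5 cut-off is an assumption for illustration, not a threshold prescribed by this card:

```python
# Per-label probabilities for a batch of texts (shape: [n_texts, n_labels]).
probs = model.predict_proba(["my dad had huge ones..so they may be real.."])

# Hypothetical post-processing: keep indices of labels above an assumed 0.5 cut-off.
active_labels = [i for i, p in enumerate(probs[0]) if p > 0.5]
print(active_labels)
```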

## Training Details

### Training Set Metrics

| Training set | Min | Median  | Max  |
|:-------------|:----|:--------|:-----|
| Word count   | 1   | 25.8891 | 1681 |
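These are presumably whitespace-split word counts; a generic sketch for recomputing them over a corpus (the real training texts are not published with this card, so `train_texts` is a placeholder):

```python
import numpy as np

# Placeholder corpus: the actual training set is not distributed with this card.
train_texts = ["a placeholder sentence", "another one"]

word_counts = [len(text.split()) for text in train_texts]
print(min(word_counts), float(np.median(word_counts)), max(word_counts))
```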

### Training Hyperparameters

- batch_size: (16, 16)
- num_epochs: (3, 3)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 40
- body_learning_rate: (1.752e-05, 1.752e-05)
- head_learning_rate: 1.752e-05
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- seed: 30
- eval_max_steps: -1
- load_best_model_at_end: False
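These settings map one-to-one onto setfit's `TrainingArguments`. A minimal reproduction sketch, assuming setfit 1.0.x; the two-example dataset is a toy stand-in, since the real 13-class training data is not published here:

```python
from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Toy stand-in for the unpublished training set: each label is a
# binary vector with one entry per class (13 in the real model).
train_dataset = Dataset.from_dict({
    "text": ["example text a", "example text b"],
    "label": [[1, 0, 0], [0, 1, 1]],
})

# A classifier-chain head matches the ClassifierChain head this card describes.
model = SetFitModel.from_pretrained(
    "sentence-transformers/paraphrase-mpnet-base-v2",
    multi_target_strategy="classifier-chain",
)

# Values mirror the hyperparameter list above.
args = TrainingArguments(
    batch_size=(16, 16),
    num_epochs=(3, 3),
    sampling_strategy="oversampling",
    num_iterations=40,
    body_learning_rate=(1.752e-05, 1.752e-05),
    head_learning_rate=1.752e-05,
    warmup_proportion=0.1,
    seed=30,
    end_to_end=False,
    use_amp=False,
)

trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()
```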

### Training Results

| Epoch | Step | Training Loss | Validation Loss |
|:------|:-----|:--------------|:----------------|
| 0.0004 | 1 | 0.3059 | - |
| 0.0185 | 50 | 0.3597 | - |
| 0.0370 | 100 | 0.272 | - |
| 0.0555 | 150 | 0.2282 | - |
| 0.0739 | 200 | 0.2413 | - |
| 0.0924 | 250 | 0.2239 | - |
| 0.1109 | 300 | 0.2447 | - |
| 0.1294 | 350 | 0.1574 | - |
| 0.1479 | 400 | 0.1873 | - |
| 0.1664 | 450 | 0.1537 | - |
| 0.1848 | 500 | 0.1661 | - |
| 0.2033 | 550 | 0.1692 | - |
| 0.2218 | 600 | 0.1105 | - |
| 0.2403 | 650 | 0.1316 | - |
| 0.2588 | 700 | 0.1018 | - |
| 0.2773 | 750 | 0.1148 | - |
| 0.2957 | 800 | 0.0588 | - |
| 0.3142 | 850 | 0.2385 | - |
| 0.3327 | 900 | 0.0302 | - |
| 0.3512 | 950 | 0.0714 | - |
| 0.3697 | 1000 | 0.1587 | - |
| 0.3882 | 1050 | 0.1479 | - |
| 0.4067 | 1100 | 0.0897 | - |
| 0.4251 | 1150 | 0.064 | - |
| 0.4436 | 1200 | 0.0774 | - |
| 0.4621 | 1250 | 0.0318 | - |
| 0.4806 | 1300 | 0.1231 | - |
| 0.4991 | 1350 | 0.0983 | - |
| 0.5176 | 1400 | 0.1537 | - |
| 0.5360 | 1450 | 0.1382 | - |
| 0.5545 | 1500 | 0.1244 | - |
| 0.5730 | 1550 | 0.1169 | - |
| 0.5915 | 1600 | 0.0185 | - |
| 0.6100 | 1650 | 0.1368 | - |
| 0.6285 | 1700 | 0.0678 | - |
| 0.6470 | 1750 | 0.0827 | - |
| 0.6654 | 1800 | 0.028 | - |
| 0.6839 | 1850 | 0.0655 | - |
| 0.7024 | 1900 | 0.1099 | - |
| 0.7209 | 1950 | 0.0508 | - |
| 0.7394 | 2000 | 0.086 | - |
| 0.7579 | 2050 | 0.1087 | - |
| 0.7763 | 2100 | 0.0764 | - |
| 0.7948 | 2150 | 0.0646 | - |
| 0.8133 | 2200 | 0.0793 | - |
| 0.8318 | 2250 | 0.0678 | - |
| 0.8503 | 2300 | 0.0538 | - |
| 0.8688 | 2350 | 0.0495 | - |
| 0.8872 | 2400 | 0.0651 | - |
| 0.9057 | 2450 | 0.0966 | - |
| 0.9242 | 2500 | 0.1726 | - |
| 0.9427 | 2550 | 0.0491 | - |
| 0.9612 | 2600 | 0.043 | - |
| 0.9797 | 2650 | 0.0807 | - |
| 0.9982 | 2700 | 0.0905 | - |
| 1.0166 | 2750 | 0.0841 | - |
| 1.0351 | 2800 | 0.0735 | - |
| 1.0536 | 2850 | 0.0508 | - |
| 1.0721 | 2900 | 0.082 | - |
| 1.0906 | 2950 | 0.085 | - |
| 1.1091 | 3000 | 0.0412 | - |
| 1.1275 | 3050 | 0.0274 | - |
| 1.1460 | 3100 | 0.1012 | - |
| 1.1645 | 3150 | 0.0269 | - |
| 1.1830 | 3200 | 0.0377 | - |
| 1.2015 | 3250 | 0.0854 | - |
| 1.2200 | 3300 | 0.0854 | - |
| 1.2384 | 3350 | 0.0682 | - |
| 1.2569 | 3400 | 0.038 | - |
| 1.2754 | 3450 | 0.1073 | - |
| 1.2939 | 3500 | 0.0841 | - |
| 1.3124 | 3550 | 0.1024 | - |
| 1.3309 | 3600 | 0.0636 | - |
| 1.3494 | 3650 | 0.0821 | - |
| 1.3678 | 3700 | 0.0742 | - |
| 1.3863 | 3750 | 0.0504 | - |
| 1.4048 | 3800 | 0.1198 | - |
| 1.4233 | 3850 | 0.0233 | - |
| 1.4418 | 3900 | 0.0659 | - |
| 1.4603 | 3950 | 0.0252 | - |
| 1.4787 | 4000 | 0.0772 | - |
| 1.4972 | 4050 | 0.0466 | - |
| 1.5157 | 4100 | 0.0771 | - |
| 1.5342 | 4150 | 0.0489 | - |
| 1.5527 | 4200 | 0.0273 | - |
| 1.5712 | 4250 | 0.0335 | - |
| 1.5896 | 4300 | 0.0733 | - |
| 1.6081 | 4350 | 0.0323 | - |
| 1.6266 | 4400 | 0.0358 | - |
| 1.6451 | 4450 | 0.0252 | - |
| 1.6636 | 4500 | 0.078 | - |
| 1.6821 | 4550 | 0.0137 | - |
| 1.7006 | 4600 | 0.0858 | - |
| 1.7190 | 4650 | 0.0377 | - |
| 1.7375 | 4700 | 0.0607 | - |
| 1.7560 | 4750 | 0.0438 | - |
| 1.7745 | 4800 | 0.0501 | - |
| 1.7930 | 4850 | 0.0682 | - |
| 1.8115 | 4900 | 0.0571 | - |
| 1.8299 | 4950 | 0.0144 | - |
| 1.8484 | 5000 | 0.0518 | - |
| 1.8669 | 5050 | 0.0388 | - |
| 1.8854 | 5100 | 0.0685 | - |
| 1.9039 | 5150 | 0.0522 | - |
| 1.9224 | 5200 | 0.0518 | - |
| 1.9409 | 5250 | 0.0649 | - |
| 1.9593 | 5300 | 0.083 | - |
| 1.9778 | 5350 | 0.0652 | - |
| 1.9963 | 5400 | 0.0907 | - |
| 2.0148 | 5450 | 0.0767 | - |
| 2.0333 | 5500 | 0.0825 | - |
| 2.0518 | 5550 | 0.0818 | - |
| 2.0702 | 5600 | 0.0364 | - |
| 2.0887 | 5650 | 0.134 | - |
| 2.1072 | 5700 | 0.0379 | - |
| 2.1257 | 5750 | 0.1066 | - |
| 2.1442 | 5800 | 0.1288 | - |
| 2.1627 | 5850 | 0.0527 | - |
| 2.1811 | 5900 | 0.0343 | - |
| 2.1996 | 5950 | 0.0766 | - |
| 2.2181 | 6000 | 0.0862 | - |
| 2.2366 | 6050 | 0.0661 | - |
| 2.2551 | 6100 | 0.069 | - |
| 2.2736 | 6150 | 0.0429 | - |
| 2.2921 | 6200 | 0.0546 | - |
| 2.3105 | 6250 | 0.1237 | - |
| 2.3290 | 6300 | 0.0337 | - |
| 2.3475 | 6350 | 0.0616 | - |
| 2.3660 | 6400 | 0.0833 | - |
| 2.3845 | 6450 | 0.1074 | - |
| 2.4030 | 6500 | 0.0424 | - |
| 2.4214 | 6550 | 0.033 | - |
| 2.4399 | 6600 | 0.0933 | - |
| 2.4584 | 6650 | 0.0434 | - |
| 2.4769 | 6700 | 0.0328 | - |
| 2.4954 | 6750 | 0.0553 | - |
| 2.5139 | 6800 | 0.0557 | - |
| 2.5323 | 6850 | 0.0861 | - |
| 2.5508 | 6900 | 0.0294 | - |
| 2.5693 | 6950 | 0.0521 | - |
| 2.5878 | 7000 | 0.1529 | - |
| 2.6063 | 7050 | 0.055 | - |
| 2.6248 | 7100 | 0.0522 | - |
| 2.6433 | 7150 | 0.0715 | - |
| 2.6617 | 7200 | 0.0524 | - |
| 2.6802 | 7250 | 0.0469 | - |
| 2.6987 | 7300 | 0.1064 | - |
| 2.7172 | 7350 | 0.0485 | - |
| 2.7357 | 7400 | 0.0526 | - |
| 2.7542 | 7450 | 0.1063 | - |
| 2.7726 | 7500 | 0.0549 | - |
| 2.7911 | 7550 | 0.041 | - |
| 2.8096 | 7600 | 0.0312 | - |
| 2.8281 | 7650 | 0.0249 | - |
| 2.8466 | 7700 | 0.0807 | - |
| 2.8651 | 7750 | 0.0268 | - |
| 2.8835 | 7800 | 0.0306 | - |
| 2.9020 | 7850 | 0.0655 | - |
| 2.9205 | 7900 | 0.1469 | - |
| 2.9390 | 7950 | 0.0454 | - |
| 2.9575 | 8000 | 0.0754 | - |
| 2.9760 | 8050 | 0.0587 | - |
| 2.9945 | 8100 | 0.0452 | - |

## Framework Versions

- Python: 3.9.16
- SetFit: 1.0.1
- Sentence Transformers: 2.2.2
- Transformers: 4.35.0
- PyTorch: 2.1.0+cu121
- Datasets: 2.14.6
- Tokenizers: 0.14.1
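To approximate this environment, the listed versions can be pinned at install time. A sketch; the extra index URL assumes a CUDA 12.1 machine, matching the `+cu121` build above:

```bash
pip install setfit==1.0.1 sentence-transformers==2.2.2 transformers==4.35.0 \
    datasets==2.14.6 tokenizers==0.14.1
pip install torch==2.1.0 --index-url https://download.pytorch.org/whl/cu121
```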

## Citation

### BibTeX

```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}
```