
SetFit with sentence-transformers/paraphrase-mpnet-base-v2

This is a SetFit model that can be used for Text Classification. It uses sentence-transformers/paraphrase-mpnet-base-v2 as the Sentence Transformer embedding model, with a LogisticRegression instance as the classification head.

The model has been trained using an efficient few-shot learning technique that involves two steps (see the sketch after this list):

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer.
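
In code, assembling such a model is a one-liner; the following is a minimal sketch of the starting point rather than the exact recipe behind this card:

from setfit import SetFitModel

# Load the Sentence Transformer body; SetFit attaches its default
# scikit-learn LogisticRegression classification head automatically.
model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-mpnet-base-v2")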

Model Details

Model Description

  • Model Type: SetFit
  • Sentence Transformer body: sentence-transformers/paraphrase-mpnet-base-v2
  • Classification head: LogisticRegression
  • Number of Classes: 3 (critical, negative, neutral)

Model Sources

  • Repository: https://github.com/huggingface/setfit
  • Paper: https://arxiv.org/abs/2209.11055

Model Labels

Label Examples
critical
  • ' * Walk out from work and/or school * Picket Israeli embassies and consulates * Picket against companies that profit from Israel's occupation of Palestine (Lockheed Martin, Boeing, Raytheon, Northrop Grumman, General Dynamics, Elbit Systems) * Host speak outs * Wear kuffiyehs * Wear black armbands'
  • "(Nov. 2) Thread of demonstrations in solidarity with Palestinians, via @LexiAlex: U.S., U.K., U.S., U.K., South Africa, Australia, Canada, U.S. Cool and all but I don't think Raytheon cares there's blood on their hands."
  • '99% of computers have intel processors, 100% of which are made with Israeli tech, 99% of which are manufactured in israel lmao and that's just intel!'
neutral
  • " Intel secures $3.25B Israeli gov't grant to build $25B chip fab in Israel amid ongoing tensions : Read more"
  • 'Austin noted some defense contractors have required workers to take on additional shifts to keep up with production rates.'
  • '>"Germany's leading role in NATO matters at this critical moment for European security,"'
negative
  • 'Sister of Israeli hostage Elad Katzir says her brother was murdered in captivity, and his body was recovered in Gaza during a military rescue operation.'
  • '"I think this is something that goes beyond what you would normally consider politics, in the sense that it's been hard for anyone to keep up with the rest of the world, and ignore the fact that every single university in Gaza has been flattened, the fact that hospitals have been destroyed, the fact that 14,500 children have died." The event ran two and a half-hours, and not without dissent from a boisterous group of counter-protestors along the west side of the plaza, less organized, shouting "USA," "Take a shower," "Go back to Russia" and "Stop supporting terrorism," some literally wrapped in U.'
  • '"There's been panic everywhere - even here in Khan Younis where the bombing was less - as people try to reach family members in other areas to check they are safe, but the phones have been cut off." There was anger as well as fear from the families of the Gaza hostages.'

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit

Then you can load this model and run inference.

from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("msullivan/cemex_peers_stance")
# Run inference
preds = model("For now, let's remember a few pertinent points about a ceasefire in the Israel-Hamas war:")
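
The prediction should be one of the three labels listed above (critical, negative, or neutral); passing a list of strings instead of a single string returns one prediction per input.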

Training Details

Training Set Metrics

Training set   Min   Median    Max
Word count     7     29.3647   111

Label      Training Sample Count
critical   24
negative   26
neutral    35

Training Hyperparameters

  • batch_size: (16, 2)
  • num_epochs: (1, 16)
  • max_steps: -1
  • sampling_strategy: oversampling
  • body_learning_rate: (2e-05, 1e-05)
  • head_learning_rate: 0.01
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: False
  • warmup_proportion: 0.1
  • seed: 42
  • eval_max_steps: -1
  • load_best_model_at_end: False
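
The paired values correspond to the embedding fine-tuning phase and the classifier-training phase, respectively. As a hedged sketch (not the card author's actual script), these settings map onto SetFit's TrainingArguments roughly as follows; model and train_dataset are placeholders for a SetFitModel like the one sketched earlier and for the unpublished training split:

from sentence_transformers.losses import CosineSimilarityLoss
from setfit import Trainer, TrainingArguments

args = TrainingArguments(
    batch_size=(16, 2),                # (embedding phase, classifier phase)
    num_epochs=(1, 16),
    body_learning_rate=(2e-05, 1e-05),
    head_learning_rate=0.01,
    loss=CosineSimilarityLoss,
    sampling_strategy="oversampling",
    warmup_proportion=0.1,
    end_to_end=False,
    use_amp=False,
    seed=42,
)
# distance_metric and margin are left at their defaults (cosine_distance, 0.25);
# they are ignored when the loss is CosineSimilarityLoss.

trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()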

Training Results

Epoch    Step   Training Loss   Validation Loss
0.0034   1      0.3409          -
0.1684   50     0.1854          -
0.3367   100    0.0944          -
0.5051   150    0.035           -
0.6734   200    0.0021          -
0.8418   250    0.0011          -

Framework Versions

  • Python: 3.10.6
  • SetFit: 1.0.3
  • Sentence Transformers: 3.0.0
  • Transformers: 4.35.2
  • PyTorch: 2.2.0
  • Datasets: 2.14.4
  • Tokenizers: 0.15.2

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}