
SetFit with sentence-transformers/all-MiniLM-L6-v2

This is a SetFit model that can be used for Text Classification. It uses sentence-transformers/all-MiniLM-L6-v2 as the Sentence Transformer embedding model and a LogisticRegression instance as the classification head.
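
As a quick check of that structure, the two components can be inspected directly on the loaded checkpoint (attribute names as exposed by setfit 1.x):

from setfit import SetFitModel

model = SetFitModel.from_pretrained("Zlovoblachko/dimension2_w_thesis_setfit")
print(type(model.model_body))  # sentence_transformers.SentenceTransformer (all-MiniLM-L6-v2)
print(type(model.model_head))  # sklearn.linear_model.LogisticRegression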

The model has been trained using an efficient few-shot learning technique that involves the following two steps (a minimal training sketch follows the list):

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer.
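
A minimal sketch of this two-step procedure with the setfit Trainer API, assuming a tiny hypothetical dataset (it only illustrates the flow, not the data this model was trained on):

from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Hypothetical few-shot training data; replace with real labelled examples.
train_ds = Dataset.from_dict({
    "text": ["The essay states a clear thesis.", "No central claim is made."],
    "label": [1, 0],
})

# Loading a plain Sentence Transformer body attaches a LogisticRegression head by default.
model = SetFitModel.from_pretrained("sentence-transformers/all-MiniLM-L6-v2")

args = TrainingArguments(batch_size=16, num_epochs=1)
trainer = Trainer(model=model, args=args, train_dataset=train_ds)
trainer.train()  # step 1: contrastive fine-tuning of the body; step 2: fitting the head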

Model Details

Model Description

  • Model Type: SetFit
  • Sentence Transformer body: sentence-transformers/all-MiniLM-L6-v2
  • Classification head: a LogisticRegression instance

Model Sources

  • Repository: https://github.com/huggingface/setfit
  • Paper: Efficient Few-Shot Learning Without Prompts (https://arxiv.org/abs/2209.11055)

Evaluation

Metrics

Label F1
all 0.1290
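
The card reports a single F1 value; the evaluation split and averaging are not published, so the snippet below is only a hedged sketch of how such a score could be recomputed on your own labelled data (the texts, labels, and "weighted" averaging are assumptions):

from setfit import SetFitModel
from sklearn.metrics import f1_score

# Hypothetical held-out examples; use the same label space the model was trained with.
texts = ["The essay develops a single clear thesis.", "Ideas are listed without a central claim."]
gold = [1, 0]

model = SetFitModel.from_pretrained("Zlovoblachko/dimension2_w_thesis_setfit")
preds = model.predict(texts)
print(f1_score(gold, preds, average="weighted"))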

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit

Then you can load this model and run inference.

from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("Zlovoblachko/dimension2_w_thesis_setfit")
# Run inference
preds = model("I loved the spiderman movie!")
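
Continuing the example, predictions can also be made in batches, and the classification head exposes class probabilities; the input sentences below are illustrative only:

from setfit import SetFitModel

model = SetFitModel.from_pretrained("Zlovoblachko/dimension2_w_thesis_setfit")
texts = [
    "The essay states a clear thesis and defends it throughout.",
    "The paragraphs wander without a central claim.",
]
print(model.predict(texts))        # one predicted label per input
print(model.predict_proba(texts))  # per-class probabilities from the LogisticRegression head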

Training Details

Training Hyperparameters

  • batch_size: (16, 16)
  • num_epochs: (1, 1)
  • max_steps: -1
  • sampling_strategy: oversampling
  • body_learning_rate: (0.00031763046129120506, 0.00031763046129120506)
  • head_learning_rate: 0.01
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: False
  • warmup_proportion: 0.1
  • l2_weight: 0.01
  • seed: 42
  • eval_max_steps: -1
  • load_best_model_at_end: False
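
These values map directly onto setfit's TrainingArguments. A hedged reconstruction, not the author's actual training script (distance_metric is left at its cosine-distance default, matching the list above):

from sentence_transformers.losses import CosineSimilarityLoss
from setfit import TrainingArguments

args = TrainingArguments(
    batch_size=(16, 16),
    num_epochs=(1, 1),
    max_steps=-1,
    sampling_strategy="oversampling",
    body_learning_rate=(0.00031763046129120506, 0.00031763046129120506),
    head_learning_rate=0.01,
    loss=CosineSimilarityLoss,
    margin=0.25,
    end_to_end=False,
    use_amp=False,
    warmup_proportion=0.1,
    l2_weight=0.01,
    seed=42,
    eval_max_steps=-1,
    load_best_model_at_end=False,
)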

Training Results

Epoch     Step    Training Loss    Validation Loss
0.0007    1       0.3040           -
0.0347    50      0.2656           -
0.0694    100     0.2733           -
0.1042    150     0.2680           -
0.1389    200     0.2712           -
0.1736    250     0.2726           -
0.2083    300     0.2758           -
0.2431    350     0.2807           -
0.2778    400     0.2877           -
0.3125    450     0.2641           -
0.3472    500     0.2761           -
0.3819    550     0.2739           -
0.4167    600     0.2565           -
0.4514    650     0.2813           -
0.4861    700     0.2761           -
0.5208    750     0.2749           -
0.5556    800     0.2585           -
0.5903    850     0.2737           -
0.6250    900     0.2807           -
0.6597    950     0.2782           -
0.6944    1000    0.2736           -
0.7292    1050    0.2800           -
0.7639    1100    0.2821           -
0.7986    1150    0.2755           -
0.8333    1200    0.2743           -
0.8681    1250    0.2634           -
0.9028    1300    0.2779           -
0.9375    1350    0.2744           -
0.9722    1400    0.2816           -

Framework Versions

  • Python: 3.10.12
  • SetFit: 1.1.0
  • Sentence Transformers: 3.2.1
  • Transformers: 4.44.2
  • PyTorch: 2.5.0+cu121
  • Datasets: 3.0.2
  • Tokenizers: 0.19.1
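
To approximate this environment, the listed versions can be pinned; the CUDA 12.1 wheel index for PyTorch is an assumption based on the +cu121 build tag:

pip install "setfit==1.1.0" "sentence-transformers==3.2.1" "transformers==4.44.2" "datasets==3.0.2" "tokenizers==0.19.1"
pip install "torch==2.5.0" --index-url https://download.pytorch.org/whl/cu121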

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}