Model Description

This model is a fine-tuned ModernBERT-large for Natural Language Inference (NLI). It was trained on the MoritzLaurer/synthetic_zeroshot_mixtral_v0.1 dataset and is designed for zero-shot classification.

Model Overview

Performance Metrics

To be added.

  • Training Loss: Measures the model's fit to the training data.
  • Validation Loss: Measures the model's generalization to unseen data.
  • Accuracy: The percentage of correct predictions over all examples.
  • F1 Score: A balanced metric combining precision and recall (a computation sketch follows this list).
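
Once evaluation numbers are reported, accuracy and F1 can be reproduced from predicted and gold labels. The snippet below is a minimal sketch using scikit-learn; the label lists are purely illustrative and are not actual model outputs.

from sklearn.metrics import accuracy_score, f1_score

# Hypothetical gold labels and model predictions, for illustration only.
y_true = ["entertainment", "economy", "space", "entertainment"]
y_pred = ["entertainment", "economy", "entertainment", "entertainment"]

print("Accuracy:", accuracy_score(y_true, y_pred))
print("Macro F1:", f1_score(y_true, y_pred, average="macro"))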

Installation and Example Usage

pip install transformers torch datasets

from transformers import pipeline

classifier = pipeline("zero-shot-classification", "rob-field1/ModernBERT-large-zeroshot-v1")
sequence_to_classify = "I want to be an actor."
candidate_labels = ["space", "economy", "entertainment"]
output = classifier(sequence_to_classify, candidate_labels, multi_label=False)
print(output)
>>{'sequence': 'I want to be an actor.', 'labels': ['entertainment', 'space', 'economy'], 'scores': [0.9614731073379517, 0.028852475807070732, 0.009674412198364735]}
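
When more than one label can apply to the same text, the pipeline can also score each candidate label independently by setting multi_label=True. The call below reuses the classifier created above and is a usage sketch only; no actual model scores are shown.

# Score each candidate label on its own (the scores no longer sum to one).
multi_output = classifier(
    "The new movie earned record box-office revenue.",
    ["economy", "entertainment", "space"],
    multi_label=True,
)
print(multi_output["labels"], multi_output["scores"])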

Model Card

Training Details

  • Model: ModernBERT (Large variant)
  • Framework: PyTorch
  • Batch Size: 32
  • Learning Rate: 2e-5
  • Optimizer: AdamW
  • Hardware: RTX 4090
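
For reference, the hyperparameters above map onto the Hugging Face Trainer API roughly as in the sketch below. This is not the actual training script: the base checkpoint, dataset column names, binary NLI label set, and tiny in-memory dataset (standing in for MoritzLaurer/synthetic_zeroshot_mixtral_v0.1) are all assumptions made for illustration.

from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

base = "answerdotai/ModernBERT-large"  # assumed base checkpoint
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSequenceClassification.from_pretrained(
    base, num_labels=2  # entailment / not_entailment head (assumed)
)

# Tiny illustrative stand-in for the real dataset; column names and
# label encoding are assumptions for this sketch.
raw = Dataset.from_dict({
    "text": ["I want to be an actor.", "Oil prices rose sharply today."],
    "hypothesis": ["This example is about entertainment.", "This example is about space."],
    "labels": [0, 1],
})

def tokenize(batch):
    # Encode premise/hypothesis pairs as the NLI input.
    return tokenizer(batch["text"], batch["hypothesis"], truncation=True)

args = TrainingArguments(
    output_dir="ModernBERT-large-zeroshot-v1",
    per_device_train_batch_size=32,  # batch size from the list above
    learning_rate=2e-5,              # learning rate from the list above
    optim="adamw_torch",             # AdamW optimizer
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=raw.map(tokenize, batched=True),
    tokenizer=tokenizer,  # enables dynamic padding via the default collator
)
trainer.train()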

Acknowledgments

License

This model is licensed under the MIT License. See the LICENSE file for more details.
