
Model Card for vit_base_patch16_224_in21k_ft_cifar10

This model is a timm/vit_base_patch16_224.orig_in21k_ft_in1k (ViT-Base) fine-tuned on CIFAR-10.

  • Test Accuracy: 0.9896
  • License: MIT

How to Get Started with the Model

Use the code below to get started with the model.

import timm
import torch
from torch import nn

# Recreate the backbone architecture and replace the ImageNet head
# with a 10-class CIFAR-10 classification head.
model = timm.create_model("timm/vit_base_patch16_224.orig_in21k_ft_in1k", pretrained=False)
model.head = nn.Linear(model.head.in_features, 10)

# Download the fine-tuned weights from the Hugging Face Hub and load them.
model.load_state_dict(
    torch.hub.load_state_dict_from_url(
        "https://huggingface.co/edadaltocg/vit_base_patch16_224_in21k_ft_cifar10/resolve/main/pytorch_model.bin",
        map_location="cpu",
        file_name="vit_base_patch16_224_in21k_ft_cifar10.pth",
    )
)
model.eval()
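The snippet above only loads the weights. For inference, CIFAR-10 images (32×32) must be resized to the 224×224 input the ViT expects and normalized. A minimal sketch of such a step; the `predict` helper, the 0.5/0.5 normalization constants, and the interpolation choice are illustrative assumptions, not taken from the original card:

```python
import torch
from torch import nn

# Standard CIFAR-10 class ordering.
CIFAR10_CLASSES = [
    "airplane", "automobile", "bird", "cat", "deer",
    "dog", "frog", "horse", "ship", "truck",
]

def predict(model: nn.Module, image: torch.Tensor) -> str:
    """Classify one CIFAR-10 image.

    `image` is a (3, 32, 32) float tensor in [0, 1]. The ViT expects
    224x224 inputs, so we upsample and normalize before the forward pass.
    """
    # Assumed normalization (mean/std 0.5); check the timm model config
    # for the exact values used in training.
    mean = torch.tensor([0.5, 0.5, 0.5]).view(3, 1, 1)
    std = torch.tensor([0.5, 0.5, 0.5]).view(3, 1, 1)
    x = torch.nn.functional.interpolate(
        image.unsqueeze(0), size=(224, 224),
        mode="bilinear", align_corners=False,
    )
    x = (x - mean) / std
    model.eval()
    with torch.no_grad():
        logits = model(x)
    return CIFAR10_CLASSES[logits.argmax(dim=1).item()]
```

Any callable mapping a (1, 3, 224, 224) batch to (1, 10) logits works with this helper, so it can be smoke-tested without downloading the checkpoint.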

Training Data

The model was fine-tuned on the CIFAR-10 training split (50,000 images, 10 classes).

Training Hyperparameters

  • config: scripts/train_configs/ft_cifar10.json
  • model: vit_base_patch16_224_in21k_ft_cifar10
  • dataset: cifar10
  • batch_size: 64
  • epochs: 10
  • validation_frequency: 1
  • seed: 1
  • criterion: CrossEntropyLoss
  • criterion_kwargs: {}
  • optimizer: SGD
  • lr: 0.01
  • optimizer_kwargs: {'momentum': 0.9, 'weight_decay': 0.0}
  • scheduler: CosineAnnealingLR
  • scheduler_kwargs: {'T_max': 10}
  • debug: False
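The hyperparameters above translate directly into a standard PyTorch fine-tuning loop. A minimal sketch assuming those values (SGD with momentum 0.9, cosine annealing over the epoch count); the `fine_tune` function name and loop structure are illustrative, not the author's actual training script:

```python
import torch
from torch import nn
from torch.optim import SGD
from torch.optim.lr_scheduler import CosineAnnealingLR

def fine_tune(model, train_loader, epochs=10, lr=0.01, device="cpu"):
    """Fine-tune `model` with the hyperparameters listed in the card."""
    model.to(device).train()
    criterion = nn.CrossEntropyLoss()
    optimizer = SGD(model.parameters(), lr=lr, momentum=0.9, weight_decay=0.0)
    # CosineAnnealingLR with T_max equal to the number of epochs decays
    # the learning rate from `lr` toward 0 over the run.
    scheduler = CosineAnnealingLR(optimizer, T_max=epochs)
    for _ in range(epochs):
        for images, labels in train_loader:
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()
        scheduler.step()  # one scheduler step per epoch
    return model
```

Stepping the scheduler once per epoch (rather than per batch) matches `T_max: 10` with `epochs: 10`.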

Testing Data

The reported accuracy (0.9896) was measured on the CIFAR-10 test split (10,000 images).
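Test accuracy is simply the fraction of correctly classified examples over the test set. A minimal sketch of that computation (the `accuracy` helper is an illustrative assumption, not the card's evaluation code):

```python
import torch
from torch import nn

@torch.no_grad()
def accuracy(model: nn.Module, loader, device: str = "cpu") -> float:
    """Fraction of examples whose argmax prediction matches the label."""
    model.eval()
    correct = total = 0
    for images, labels in loader:
        preds = model(images.to(device)).argmax(dim=1).cpu()
        correct += (preds == labels).sum().item()
        total += labels.numel()
    return correct / total
```

Run over the CIFAR-10 test loader with the loaded model, this should reproduce a value close to the reported 0.9896.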


This model card was created by Eduardo Dadalto.
