---
language: en
license: mit
library_name: timm
tags:
- image-classification
- timm/vit_base_patch16_224.orig_in21k_ft_in1k
- cifar10
datasets:
- cifar10
metrics:
- accuracy
model-index:
- name: vit_base_patch16_224_in21k_ft_cifar10
  results:
  - task:
      type: image-classification
    dataset:
      name: CIFAR-10
      type: cifar10
    metrics:
    - type: accuracy
      value: 0.9896
---
# Model Card for vit_base_patch16_224_in21k_ft_cifar10
This model is a `timm/vit_base_patch16_224.orig_in21k_ft_in1k` backbone fine-tuned on cifar10.
- Test Accuracy: 0.9896
- License: MIT
## How to Get Started with the Model
Use the code below to get started with the model.
```python
import timm
import torch
from torch import nn

# Recreate the ViT-Base architecture and swap in a 10-class head for CIFAR-10
model = timm.create_model("timm/vit_base_patch16_224.orig_in21k_ft_in1k", pretrained=False)
model.head = nn.Linear(model.head.in_features, 10)

# Download and load the fine-tuned weights from the Hugging Face Hub
model.load_state_dict(
    torch.hub.load_state_dict_from_url(
        "https://huggingface.co/edadaltocg/vit_base_patch16_224_in21k_ft_cifar10/resolve/main/pytorch_model.bin",
        map_location="cpu",
        file_name="vit_base_patch16_224_in21k_ft_cifar10.pth",
    )
)
```
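Continuing from the snippet above (reusing the `model` it builds), the sketch below classifies a single CIFAR-10 test image. The 224x224 preprocessing resolved from timm's pretrained config and the `data` download directory are assumptions, not part of the original card.

```python
import timm
import torch
from torchvision.datasets import CIFAR10

# Preprocessing derived from the backbone's pretrained config
# (resize/crop to 224x224 plus ImageNet normalization).
config = timm.data.resolve_data_config({}, model=model)
transform = timm.data.create_transform(**config)

# Classify one image from the CIFAR-10 test split (assumed download dir: "data").
dataset = CIFAR10(root="data", train=False, download=True)
image, label = dataset[0]

model.eval()
with torch.inference_mode():
    logits = model(transform(image).unsqueeze(0))
    prediction = logits.argmax(dim=1).item()

print(f"predicted class: {prediction}, true class: {label}")
```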
## Training Data

Training data is the CIFAR-10 training split (50,000 32x32 images across 10 classes).
## Training Hyperparameters

- **config**: `scripts/train_configs/ft_cifar10.json`
- **model**: `vit_base_patch16_224_in21k_ft_cifar10`
- **dataset**: `cifar10`
- **batch_size**: `64`
- **epochs**: `10`
- **validation_frequency**: `1`
- **seed**: `1`
- **criterion**: `CrossEntropyLoss`
- **criterion_kwargs**: `{}`
- **optimizer**: `SGD`
- **lr**: `0.01`
- **optimizer_kwargs**: `{'momentum': 0.9, 'weight_decay': 0.0}`
- **scheduler**: `CosineAnnealingLR`
- **scheduler_kwargs**: `{'T_max': 10}`
- **debug**: `False`
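The training script referenced by `scripts/train_configs/ft_cifar10.json` is not included in this card. The following is a minimal sketch, assuming a standard PyTorch fine-tuning loop that matches the hyperparameters above (SGD, lr 0.01, momentum 0.9, cosine annealing over 10 epochs, cross-entropy loss, batch size 64, seed 1); the resizing and normalization choices are assumptions.

```python
import timm
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import transforms
from torchvision.datasets import CIFAR10

torch.manual_seed(1)  # seed: 1
device = "cuda" if torch.cuda.is_available() else "cpu"

# Backbone pretrained on ImageNet-21k/1k, fresh 10-class head for CIFAR-10.
model = timm.create_model("timm/vit_base_patch16_224.orig_in21k_ft_in1k", pretrained=True)
model.head = nn.Linear(model.head.in_features, 10)
model.to(device)

# Assumed preprocessing: upsample 32x32 CIFAR-10 images to the ViT's 224x224 input
# and normalize with the backbone's mean/std.
config = timm.data.resolve_data_config({}, model=model)
transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(config["mean"], config["std"]),
])
train_loader = DataLoader(
    CIFAR10(root="data", train=True, download=True, transform=transform),
    batch_size=64, shuffle=True, num_workers=4,
)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9, weight_decay=0.0)
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=10)

for epoch in range(10):
    model.train()
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    scheduler.step()  # one cosine step per epoch (T_max=10)
    print(f"epoch {epoch + 1}: last batch loss {loss.item():.4f}")
```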
## Testing Data

Testing data is the CIFAR-10 test split (10,000 images), on which the model reaches 0.9896 accuracy.
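Below is a sketch of how the reported accuracy could be reproduced, assuming the same 224x224 preprocessing as in the training sketch above; the exact evaluation script is not part of this card.

```python
import timm
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import transforms
from torchvision.datasets import CIFAR10

device = "cuda" if torch.cuda.is_available() else "cpu"

# Rebuild the fine-tuned model as in "How to Get Started with the Model".
model = timm.create_model("timm/vit_base_patch16_224.orig_in21k_ft_in1k", pretrained=False)
model.head = nn.Linear(model.head.in_features, 10)
model.load_state_dict(
    torch.hub.load_state_dict_from_url(
        "https://huggingface.co/edadaltocg/vit_base_patch16_224_in21k_ft_cifar10/resolve/main/pytorch_model.bin",
        map_location="cpu",
        file_name="vit_base_patch16_224_in21k_ft_cifar10.pth",
    )
)
model.to(device).eval()

# Assumed preprocessing: resize to 224x224 and normalize with the backbone's statistics.
config = timm.data.resolve_data_config({}, model=model)
transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(config["mean"], config["std"]),
])
test_loader = DataLoader(
    CIFAR10(root="data", train=False, download=True, transform=transform),
    batch_size=64, num_workers=4,
)

# Count correct predictions over the full test split.
correct = total = 0
with torch.inference_mode():
    for images, labels in test_loader:
        preds = model(images.to(device)).argmax(dim=1).cpu()
        correct += (preds == labels).sum().item()
        total += labels.size(0)

print(f"test accuracy: {correct / total:.4f}")  # reported: 0.9896
```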
This model card was created by Eduardo Dadalto.