| Model | Task | Params | Updated | Downloads | Likes |
|---|---|---|---|---|---|
| timm/vit_huge_patch14_clip_224.laion2b_ft_in12k_in1k | Image Classification | 0.6B | Jan 21 | 507 | 2 |
| timm/vit_huge_patch14_clip_224.laion2b_ft_in12k | Image Classification | 0.6B | Jan 21 | 282 | 1 |
| timm/vit_huge_patch14_clip_336.laion2b_ft_in12k_in1k | Image Classification | 0.6B | Jan 21 | 148 | 2 |
| timm/vit_large_patch14_clip_336.laion2b_ft_in12k_in1k | Image Classification | 0.3B | Jan 21 | 266 | |
| timm/vit_large_patch14_clip_224.openai_ft_in1k | Image Classification | 0.3B | Jan 21 | 554 | 1 |
| timm/vit_large_patch14_clip_224.openai_ft_in12k_in1k | Image Classification | 0.3B | Jan 21 | 2.07k | 38 |
| timm/vit_base_patch32_clip_224.laion2b_ft_in12k_in1k | Image Classification | 0.1B | Jan 21 | 17.6k | 2 |
| timm/vit_base_patch32_clip_384.laion2b_ft_in12k_in1k | Image Classification | 0.1B | Jan 21 | 156 | |
| timm/vit_base_patch32_clip_448.laion2b_ft_in12k_in1k | Image Classification | 0.1B | Jan 21 | 682k | 4 |
| timm/vit_base_patch16_clip_224.laion2b_ft_in1k | Image Classification | 0.1B | Jan 21 | 2.01k | 1 |
| timm/vit_base_patch16_clip_384.laion2b_ft_in1k | Image Classification | 0.1B | Jan 21 | 131 | 5 |
| timm/vit_base_patch32_clip_384.openai_ft_in12k_in1k | Image Classification | 0.1B | Jan 21 | 581 | |
| timm/vit_base_patch16_clip_384.laion2b_ft_in12k_in1k | Image Classification | 0.1B | Jan 21 | 912 | 4 |
| timm/vit_large_patch14_clip_336.openai_ft_in12k_in1k | Image Classification | 0.3B | Jan 21 | 167 | 1 |
| timm/vit_base_patch16_clip_224.laion2b_ft_in12k_in1k | Image Classification | 0.1B | Jan 21 | 2.1k | 2 |
| timm/vit_base_patch16_clip_224.openai_ft_in12k_in1k | Image Classification | 0.1B | Jan 21 | 3.73k | |
| timm/vit_base_patch16_clip_384.openai_ft_in12k_in1k | Image Classification | 0.1B | Jan 21 | 36 | 1 |
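
All of these checkpoints can be loaded by name through `timm`. Below is a minimal sketch, assuming a recent `timm` release (one that provides `timm.data.resolve_model_data_config`) and a local `example.jpg`; the model name is taken from the table above, and any other entry works the same way.

```python
import timm
import torch
from PIL import Image

# One of the checkpoints listed above (ImageNet-12k then ImageNet-1k fine-tuned).
model_name = "vit_base_patch16_clip_224.laion2b_ft_in12k_in1k"

# Download the weights from the Hugging Face Hub and build the model.
model = timm.create_model(model_name, pretrained=True)
model.eval()

# Recreate the preprocessing (resize, crop, normalization) the checkpoint expects.
data_config = timm.data.resolve_model_data_config(model)
transform = timm.data.create_transform(**data_config, is_training=False)

# Classify a single image; *_in1k checkpoints output ImageNet-1k class logits.
img = Image.open("example.jpg").convert("RGB")  # placeholder path
with torch.no_grad():
    logits = model(transform(img).unsqueeze(0))
top5_probs, top5_idx = logits.softmax(dim=-1).topk(5)
```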