nielsr HF Staff committed on
Commit 4693fba · verified · 1 Parent(s): 1e51959

Add github repo and update pipeline tag

This PR updates the pipeline tag to `image-text-to-text` and adds the GitHub repository link.

Files changed (1)
  1. README.md +9 -9
README.md CHANGED
@@ -1,15 +1,15 @@
 ---
+datasets:
+- mlfoundations/datacomp_pools
+library_name: open_clip
+license: apache-2.0
+pipeline_tag: image-text-to-text
 tags:
 - zero-shot-image-classification
 - clip
 - openMammut
 - datacomp
 library_tag: open_clip
-license: apache-2.0
-pipeline_tag: zero-shot-image-classification
-datasets:
-- mlfoundations/datacomp_pools
-library_name: open_clip
 ---
 
 # Model card for openMammut-ViT-L-14-DataComp-1.4B-s12.8B-b180K
@@ -105,6 +105,8 @@ More details in the ArXiv paper : [Scaling Laws for Robust Comparison of Open Fo
 
 # How to Get Started with the Model
 
+Research repository: https://github.com/LAION-AI/scaling-laws-for-comparison
+
 ATTENTION: currently, the [custom openCLIP fork](https://github.com/LAION-AI/open_clip_mammut) is required to work with the model.
 Integrating the openMaMMUT code into the main [openCLIP repository](https://github.com/mlfoundations/open_clip) is work in progress. Volunteers to help with the integration are highly welcome; join the [LAION discord](https://discord.gg/BZqhreFazY).
@@ -246,9 +248,7 @@ CLIP benchmark software
 ;origin=https://doi.org/10.5281/zenodo.15403102;vi
 sit=swh:1:snp:dd153b26f702d614346bf814f723d59fef3d
 77a2;anchor=swh:1:rel:cff2aeb98f42583b44fdab5374e9
-fa71793f2cff;path=CLIP\_benchmark-main
+fa71793f2cff;path=CLIP\\_benchmark-main
 },
 }
-```
-
-
+```
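The README keeps the `zero-shot-image-classification` tag: the model is a CLIP-style classifier scored by cosine similarity between image and text embeddings. Since running the model itself requires the custom open_clip fork mentioned in the diff, here is a self-contained background sketch of that scoring step only; the function name, the logit scale of 100, and the 768-dimensional random vectors standing in for real encoder outputs are all illustrative assumptions, not part of the model card.

```python
import numpy as np

def zero_shot_probs(image_emb, text_embs, logit_scale=100.0):
    """CLIP-style zero-shot scoring: cosine similarity between an
    L2-normalized image embedding and one L2-normalized text embedding
    per class prompt, scaled and softmax-normalized into probabilities."""
    img = image_emb / np.linalg.norm(image_emb)
    txt = text_embs / np.linalg.norm(text_embs, axis=1, keepdims=True)
    logits = logit_scale * (txt @ img)   # one logit per class prompt
    logits -= logits.max()               # shift for numerical stability
    probs = np.exp(logits)
    return probs / probs.sum()

# Stand-in embeddings; a real pipeline would produce these with the
# model's image and text encoders (e.g. via the open_clip fork).
rng = np.random.default_rng(0)
image_emb = rng.standard_normal(768)
text_embs = rng.standard_normal((3, 768))  # e.g. prompts for cat / dog / car
probs = zero_shot_probs(image_emb, text_embs)
print(probs)
```

In practice the embeddings come from the model's encoders, with each class turned into a text prompt such as "a photo of a {label}"; the scoring itself is exactly this normalized dot product followed by a softmax.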