Jinawei committed
Commit add7092 · 1 Parent(s): a3c84de

Update Readme.md

Files changed (1)
  1. Readme.md +1 -1
Readme.md CHANGED
@@ -6,7 +6,7 @@ tags:
 license: mit
 ---
 
-# AutoDisProxyT-COLA for Distilling Massive Neural Networks
+# AutoDisProxyT-MNLI for Distilling Massive Neural Networks
 
 AutoDisProxyT is a distilled task-agnostic transformer model that leverages task transfer for learning a small universal model that can be applied to arbitrary tasks and languages as outlined in the paper [Few-shot Task-agnostic Neural Architecture Search for
 Distilling Large Language Models](https://proceedings.neurips.cc/paper_files/paper/2022/file/b7c12689a89e98a61bcaa65285a41b7c-Paper-Conference.pdf).
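
For readers arriving at this commit, a minimal usage sketch follows. It assumes the distilled MNLI checkpoint is hosted on the Hugging Face Hub under this repository and loads through the standard `transformers` Auto* classes; the repository id, label order, and classification-head compatibility shown below are assumptions, not details confirmed by this commit.

```python
# Minimal sketch, assuming the checkpoint works with the standard Auto* API.
# "AutoDisProxyT-MNLI" is a placeholder repo id; replace it with the actual model id.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "AutoDisProxyT-MNLI"  # placeholder, not confirmed by this repo

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# MNLI-style premise/hypothesis pair
inputs = tokenizer(
    "A soccer game with multiple males playing.",
    "Some men are playing a sport.",
    return_tensors="pt",
)
with torch.no_grad():
    logits = model(**inputs).logits

# Predicted class index; the mapping to entailment/neutral/contradiction
# depends on the checkpoint's label configuration.
print(logits.argmax(dim=-1).item())
```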