Update README.md
README.md (CHANGED)
@@ -31,6 +31,7 @@ MetaFold Dataset is a point-cloud trajectory dataset designed for multi-category
 - **Paper:** [MetaFold: Language-Guided Multi-Category Garment Folding Framework via Trajectory Generation and Foundation Model](https://arxiv.org/pdf/2503.08372)
 - **Website:** [https://meta-fold.github.io/](https://meta-fold.github.io/)

+---

 ## Dataset Structure

@@ -91,7 +92,7 @@ The dataset categories is inherited from [ClothesNet](https://arxiv.org/pdf/2308
 1. Fold the left pant leg over the right pant leg.
 2. Fold from the waistband down toward the pant legs.

----
+<!-- --- -->

 ### Total Counts

@@ -101,7 +102,7 @@ The dataset categories is inherited from [ClothesNet](https://arxiv.org/pdf/2308
 - **Pants:** 277 garments
 - **Overall Total:** 1,210 garments

----
+<!-- --- -->

 ### Filename Convention

@@ -125,7 +126,6 @@ Each data file follows the pattern: `<Category>_<GarmentName>_<Foldstage>`
 - **FoldStep:** `action1` (second step: fold the right sleeve)
 - **Link:** [TNLC_010_action1](https://huggingface.co/datasets/chenhn02/MetaFold/tree/main/TNLC/TNLC_010_action1)

----

 ### Directory Content

@@ -136,6 +136,8 @@ Each Folding Trajectory directory contains:

 If you need the point cloud, simply extract the vertex coordinates from the mesh files.

+---
+
 ## Dataset Creation

 ### Curation Rationale

@@ -158,6 +160,7 @@ We use data from [ClothesNet](https://arxiv.org/pdf/2308.09987) and employ DiffC

 **BibTeX:**

+'''
 @misc{chen2025metafoldlanguageguidedmulticategorygarment,
 title={MetaFold: Language-Guided Multi-Category Garment Folding Framework via Trajectory Generation and Foundation Model},
 author={Haonan Chen and Junxiao Li and Ruihai Wu and Yiwei Liu and Yiwen Hou and Zhixuan Xu and Jingxiang Guo and Chongkai Gao and Zhenyu Wei and Shensi Xu and Jiaqi Huang and Lin Shao},
@@ -168,6 +171,8 @@ We use data from [ClothesNet](https://arxiv.org/pdf/2308.09987) and employ DiffC
 url={https://arxiv.org/abs/2503.08372},
 }

+'''
+
 <!-- **APA:**

 [More Information Needed] -->
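One of the hunks above quotes the README's filename convention, `<Category>_<GarmentName>_<Foldstage>`, with `TNLC_010_action1` as the worked example. Below is a minimal sketch of splitting such a directory name into its three fields; the helper name and the assumption that the category and fold stage tokens contain no underscores are illustrative, not part of the dataset documentation.

```python
from typing import NamedTuple


class FoldEntry(NamedTuple):
    category: str   # e.g. "TNLC"
    garment: str    # e.g. "010"
    fold_step: str  # e.g. "action1" (the second folding step)


def parse_entry(name: str) -> FoldEntry:
    # Split "<Category>_<GarmentName>_<Foldstage>" on underscores; take the
    # first token as the category, the last as the fold stage, and whatever
    # remains in between as the garment name.
    parts = name.split("_")
    if len(parts) < 3:
        raise ValueError(f"unexpected entry name: {name!r}")
    return FoldEntry(parts[0], "_".join(parts[1:-1]), parts[-1])


print(parse_entry("TNLC_010_action1"))
# FoldEntry(category='TNLC', garment='010', fold_step='action1')
```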
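The context of the fifth hunk repeats the README's note that the point cloud can be obtained by extracting the vertex coordinates from the mesh files. A minimal sketch of that step, assuming the meshes are in a standard format that a library such as `trimesh` can load; the file path below is a hypothetical example, not a guaranteed file name in the dataset.

```python
import numpy as np
import trimesh

# Load one per-step mesh file (hypothetical path and extension) without
# any remeshing or cleanup, so the vertex set is left untouched.
mesh = trimesh.load("TNLC/TNLC_010_action1/mesh_000.obj", process=False)

# The point cloud is simply the array of vertex coordinates, shape (N, 3).
points = np.asarray(mesh.vertices, dtype=np.float32)
print(points.shape)
```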