rsi committed on
Commit a81f4d5 · 1 Parent(s): bbece0e

update readme

Files changed (2)
  1. .gitattributes +3 -74
  2. README.md +453 -34
.gitattributes CHANGED
@@ -1,74 +1,3 @@
- *.gitattributes filter=lfs diff=lfs merge=lfs -text
- *.7z filter=lfs diff=lfs merge=lfs -text
- *.arrow filter=lfs diff=lfs merge=lfs -text
- *.bin filter=lfs diff=lfs merge=lfs -text
- *.bz2 filter=lfs diff=lfs merge=lfs -text
- *.ckpt filter=lfs diff=lfs merge=lfs -text
- *.ftz filter=lfs diff=lfs merge=lfs -text
- *.gz filter=lfs diff=lfs merge=lfs -text
- *.h5 filter=lfs diff=lfs merge=lfs -text
- *.joblib filter=lfs diff=lfs merge=lfs -text
- *.lfs.* filter=lfs diff=lfs merge=lfs -text
- *.lz4 filter=lfs diff=lfs merge=lfs -text
- *.mlmodel filter=lfs diff=lfs merge=lfs -text
- *.model filter=lfs diff=lfs merge=lfs -text
- *.msgpack filter=lfs diff=lfs merge=lfs -text
- *.npy filter=lfs diff=lfs merge=lfs -text
- *.npz filter=lfs diff=lfs merge=lfs -text
- *.onnx filter=lfs diff=lfs merge=lfs -text
- *.ot filter=lfs diff=lfs merge=lfs -text
- *.parquet filter=lfs diff=lfs merge=lfs -text
- *.pb filter=lfs diff=lfs merge=lfs -text
- *.pickle filter=lfs diff=lfs merge=lfs -text
- *.pkl filter=lfs diff=lfs merge=lfs -text
- *.pt filter=lfs diff=lfs merge=lfs -text
- *.pth filter=lfs diff=lfs merge=lfs -text
- *.rar filter=lfs diff=lfs merge=lfs -text
- *.safetensors filter=lfs diff=lfs merge=lfs -text
- saved_model/**/* filter=lfs diff=lfs merge=lfs -text
- *.tar.* filter=lfs diff=lfs merge=lfs -text
- *.tar filter=lfs diff=lfs merge=lfs -text
- *.tflite filter=lfs diff=lfs merge=lfs -text
- *.tgz filter=lfs diff=lfs merge=lfs -text
- *.wasm filter=lfs diff=lfs merge=lfs -text
- *.xz filter=lfs diff=lfs merge=lfs -text
- *.zip filter=lfs diff=lfs merge=lfs -text
- *.zst filter=lfs diff=lfs merge=lfs -text
- *tfevents* filter=lfs diff=lfs merge=lfs -text
- # Audio files - uncompressed
- *.pcm filter=lfs diff=lfs merge=lfs -text
- *.sam filter=lfs diff=lfs merge=lfs -text
- *.raw filter=lfs diff=lfs merge=lfs -text
- # Audio files - compressed
- *.aac filter=lfs diff=lfs merge=lfs -text
- *.flac filter=lfs diff=lfs merge=lfs -text
- *.mp3 filter=lfs diff=lfs merge=lfs -text
- *.ogg filter=lfs diff=lfs merge=lfs -text
- *.wav filter=lfs diff=lfs merge=lfs -text
- # Image files - uncompressed
- *.bmp filter=lfs diff=lfs merge=lfs -text
- *.gif filter=lfs diff=lfs merge=lfs -text
- *.png filter=lfs diff=lfs merge=lfs -text
- *.tiff filter=lfs diff=lfs merge=lfs -text
- *.tif filter=lfs diff=lfs merge=lfs -text
- # Image files - compressed
- *.jpg filter=lfs diff=lfs merge=lfs -text
- *.jpeg filter=lfs diff=lfs merge=lfs -text
- *.webp filter=lfs diff=lfs merge=lfs -text
- # LiDAR files - compressed
- *.copc.laz filter=lfs diff=lfs merge=lfs -text
- *.laz filter=lfs diff=lfs merge=lfs -text
- data/224/annotations/annotations_CH_train.json filter=lfs diff=lfs merge=lfs -text
- data/224/annotations/annotations_NY_test.json filter=lfs diff=lfs merge=lfs -text
- data/224/annotations/annotations_NY_train.json filter=lfs diff=lfs merge=lfs -text
- data/224/annotations/annotations_NZ_test.json filter=lfs diff=lfs merge=lfs -text
- data/224/annotations/annotations_NZ_train.json filter=lfs diff=lfs merge=lfs -text
- data/224/annotations/annotations_all_test.json filter=lfs diff=lfs merge=lfs -text
- data/224/annotations/annotations_all_train.json filter=lfs diff=lfs merge=lfs -text
- data/224/annotations/annotations_ffl_CH_test.json filter=lfs diff=lfs merge=lfs -text
- data/224/annotations/annotations_ffl_CH_train.json filter=lfs diff=lfs merge=lfs -text
- data/224/annotations/annotations_ffl_NY_test.json filter=lfs diff=lfs merge=lfs -text
- data/224/annotations/annotations_ffl_NY_train.json filter=lfs diff=lfs merge=lfs -text
- data/224/annotations/annotations_ffl_NZ_test.json filter=lfs diff=lfs merge=lfs -text
- data/224/annotations/annotations_ffl_NZ_train.json filter=lfs diff=lfs merge=lfs -text
- data/224/annotations/annotations_ffl_all_test.json filter=lfs diff=lfs merge=lfs -text
- data/224/annotations/annotations_ffl_all_train.json filter=lfs diff=lfs merge=lfs -text
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:fd385f9fbd4309426bc1ee75378875aa4ec9be603de9f2b9f8908c4e8acaec6a
+ size 2566

README.md CHANGED
@@ -21,11 +21,8 @@ tags:
  language:
  - en
  ---
-
  <div align="center">
  <h1 align="center">The P<sup>3</sup> dataset: Pixels, Points and Polygons <br> for Multimodal Building Vectorization</h1>
- <!-- <h3 align="center">Arxiv</h3> -->
- <!-- <h3 align="center"><a href="https://raphaelsulzer.de/">Raphael Sulzer<sup>1,2</sup></a><br></h3> -->
  <h3><align="center">Raphael Sulzer<sup>1,2</sup> &nbsp;&nbsp;&nbsp; Liuyun Duan<sup>1</sup>
  &nbsp;&nbsp;&nbsp; Nicolas Girard<sup>1</sup>&nbsp;&nbsp;&nbsp; Florent Lafarge<sup>2</sup></a></h3>
  <align="center"><sup>1</sup>LuxCarta Technology <br> <sup>2</sup>Centre Inria d'Université Côte d'Azur
@@ -33,36 +30,433 @@ language:
  <b>Figure 1</b>: A view of our dataset of Zurich, Switzerland
  </div>

- ## Abstract:

  <div align="justify">
- We present the P<sup>3</sup> dataset, a large-scale multimodal benchmark for building vectorization, constructed from aerial LiDAR point clouds, high-resolution aerial imagery, and vectorized 2D building outlines, collected across three continents. The dataset contains over 10 billion LiDAR points with decimeter-level accuracy and RGB images at a ground sampling distance of 25 cm. While many existing datasets primarily focus on the image modality, P<sup>3</sup> offers a complementary perspective by also incorporating dense 3D information. We demonstrate that LiDAR point clouds serve as a robust modality for predicting building polygons, both in hybrid and end-to-end learning frameworks. Moreover, fusing aerial LiDAR and imagery further improves accuracy and geometric quality of predicted polygons. The P<sup>3</sup> dataset is publicly available, along with code and pretrained weights of three state-of-the-art models for building polygon prediction at <a href="https://github.com/raphaelsulzer/PixelsPointsPolygons">github.com/raphaelsulzer/PixelsPointsPolygons</a>.
  </div>

  ## Highlights

- - A global, multimodal dataset of aerial images, aerial lidar point clouds and building polygons
- - A library for training and evaluating state-of-the-art deep learning methods on the dataset

  ## Dataset

- ### Download
-
- You can download the dataset at [huggingface.co/datasets/rsi/PixelsPointsPolygons](https://huggingface.co/datasets/rsi/PixelsPointsPolygons) .
-
-
  ### Overview

  <div align="left">
  <img src="./worldmap.jpg" width=60% height=50%>
  </div>

- <!-- ### Prepare custom tile size
- See [datasets preprocessing](data_preprocess) for instructions on preparing a dataset with different tile sizes. -->
 
 
 
 
  ## Code

@@ -72,9 +466,9 @@ See [datasets preprocessing](data_preprocess) for instructions on preparing a da
  git clone https://github.com/raphaelsulzer/PixelsPointsPolygons
  ```

- ### Requirements

- To create a conda environment named `ppp` and install the repository as a python package with all dependencies run
  ```
  bash install.sh
  ```
@@ -103,50 +497,75 @@ pip install .
  | Pix2Poly |\<pix2poly>| PointPillars (PP) + ViT | \<pp_vit> | | ✅ | 0.80 | 0.88 |
  | Pix2Poly |\<pix2poly>| PP+ViT \& ViT | \<fusion_vit> | ✅ |✅ | 0.78 | 0.85 | -->

- ### Configuration

- The project supports hydra configuration which allows to modify any parameter from the command line, such as the model and encoder types from the table above.
- To view all available options run
  ```
- python train.py --help
  ```

- ### Training

- Start training with the following command:

- ```
- torchrun --nproc_per_node=<num GPUs> train.py model=<model> encoder=<encoder> model.batch_size=<batch size> ...
  ```

- ### Prediction

- ```
- torchrun --nproc_per_node=<num GPUs> predict.py model=<model> checkpoint=best_val_iou ...
  ```

- ### Evaluation

  ```
- python evaluate.py model=<model> checkpoint=best_val_iou
  ```
- <!-- ## Trained models

- asd -->

- <!-- ## Results

- #TODO Put paper main results table here -->

  ## Citation

  If you find our work useful, please consider citing:
  ```bibtex
- ...
  ```

  ## Acknowledgements
 
  language:
  - en
  ---
  <div align="center">
  <h1 align="center">The P<sup>3</sup> dataset: Pixels, Points and Polygons <br> for Multimodal Building Vectorization</h1>
  <h3><align="center">Raphael Sulzer<sup>1,2</sup> &nbsp;&nbsp;&nbsp; Liuyun Duan<sup>1</sup>
  &nbsp;&nbsp;&nbsp; Nicolas Girard<sup>1</sup>&nbsp;&nbsp;&nbsp; Florent Lafarge<sup>2</sup></a></h3>
  <align="center"><sup>1</sup>LuxCarta Technology <br> <sup>2</sup>Centre Inria d'Université Côte d'Azur

  <b>Figure 1</b>: A view of our dataset of Zurich, Switzerland
  </div>

+ ## Abstract

  <div align="justify">
+ We present the P<sup>3</sup> dataset, a large-scale multimodal benchmark for building vectorization, constructed from aerial LiDAR point clouds, high-resolution aerial imagery, and vectorized 2D building outlines, collected across three continents. The dataset contains over 10 billion LiDAR points with decimeter-level accuracy and RGB images at a ground sampling distance of 25 cm. While many existing datasets primarily focus on the image modality, P<sup>3</sup> offers a complementary perspective by also incorporating dense 3D information. We demonstrate that LiDAR point clouds serve as a robust modality for predicting building polygons, both in hybrid and end-to-end learning frameworks. Moreover, fusing aerial LiDAR and imagery further improves accuracy and geometric quality of predicted polygons. The P<sup>3</sup> dataset is publicly available, along with code and pretrained weights of three state-of-the-art models for building polygon prediction at https://github.com/raphaelsulzer/PixelsPointsPolygons.
  </div>

  ## Highlights

+ - A global, multimodal dataset of aerial images, aerial LiDAR point clouds and building outline polygons, available at [huggingface.co/datasets/rsi/PixelsPointsPolygons](https://huggingface.co/datasets/rsi/PixelsPointsPolygons)
+ - A library for training and evaluating state-of-the-art deep learning methods on the dataset, available at [github.com/raphaelsulzer/PixelsPointsPolygons](https://github.com/raphaelsulzer/PixelsPointsPolygons)
+ - Pretrained model weights, available at [huggingface.co/rsi/PixelsPointsPolygons](https://huggingface.co/rsi/PixelsPointsPolygons)

  ## Dataset

  ### Overview

  <div align="left">
  <img src="./worldmap.jpg" width=60% height=50%>
  </div>

+ ### Download

+ ```
+ git lfs install
+ git clone https://huggingface.co/datasets/rsi/PixelsPointsPolygons $DATA_ROOT
+ ```
+
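If you prefer not to pull the full LFS history, the snippet below is a minimal sketch of the same download via the `huggingface_hub` Python client; it is not part of the official instructions and assumes `huggingface_hub` is installed and that `$DATA_ROOT` is the target directory.

```python
# Minimal sketch (not the official download path): fetch the dataset snapshot
# with huggingface_hub instead of git-lfs. Assumes `pip install huggingface_hub`.
import os
from huggingface_hub import snapshot_download

data_root = os.environ.get("DATA_ROOT", "./PixelsPointsPolygons")
snapshot_download(
    repo_id="rsi/PixelsPointsPolygons",
    repo_type="dataset",
    local_dir=data_root,
)
```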
+ ### Structure
+
+ <details>
+ <summary>📁 Click to expand folder structure</summary>
+
+ ```text
+ PixelsPointsPolygons/data/224
+ ├── annotations
+ │ ├── annotations_all_test.json
+ │ ├── annotations_all_train.json
+ │ └── annotations_all_val.json
+ │ ... (24 files total)
+ ├── images
+ │ ├── train
+ │ │ ├── CH
+ │ │ │ ├── 0
+ │ │ │ │ ├── image0_CH_train.tif
+ │ │ │ │ ├── image1000_CH_train.tif
+ │ │ │ │ └── image1001_CH_train.tif
+ │ │ │ │ ... (5000 files total)
+ │ │ │ ├── 5000
+ │ │ │ │ ├── image5000_CH_train.tif
+ │ │ │ │ ├── image5001_CH_train.tif
+ │ │ │ │ └── image5002_CH_train.tif
+ │ │ │ │ ... (5000 files total)
+ │ │ │ └── 10000
+ │ │ │ ├── image10000_CH_train.tif
+ │ │ │ ├── image10001_CH_train.tif
+ │ │ │ └── image10002_CH_train.tif
+ │ │ │ ... (5000 files total)
+ │ │ │ ... (11 dirs total)
+ │ │ ├── NY
+ │ │ │ ├── 0
+ │ │ │ │ ├── image0_NY_train.tif
+ │ │ │ │ ├── image1000_NY_train.tif
+ │ │ │ │ └── image1001_NY_train.tif
+ │ │ │ │ ... (5000 files total)
+ │ │ │ ├── 5000
+ │ │ │ │ ├── image5000_NY_train.tif
+ │ │ │ │ ├── image5001_NY_train.tif
+ │ │ │ │ └── image5002_NY_train.tif
+ │ │ │ │ ... (5000 files total)
+ │ │ │ └── 10000
+ │ │ │ ├── image10000_NY_train.tif
+ │ │ │ ├── image10001_NY_train.tif
+ │ │ │ └── image10002_NY_train.tif
+ │ │ │ ... (5000 files total)
+ │ │ │ ... (11 dirs total)
+ │ │ └── NZ
+ │ │ ├── 0
+ │ │ │ ├── image0_NZ_train.tif
+ │ │ │ ├── image1000_NZ_train.tif
+ │ │ │ └── image1001_NZ_train.tif
+ │ │ │ ... (5000 files total)
+ │ │ ├── 5000
+ │ │ │ ├── image5000_NZ_train.tif
+ │ │ │ ├── image5001_NZ_train.tif
+ │ │ │ └── image5002_NZ_train.tif
+ │ │ │ ... (5000 files total)
+ │ │ └── 10000
+ │ │ ├── image10000_NZ_train.tif
+ │ │ ├── image10001_NZ_train.tif
+ │ │ └── image10002_NZ_train.tif
+ │ │ ... (5000 files total)
+ │ │ ... (11 dirs total)
+ │ ├── val
+ │ │ ├── CH
+ │ │ │ └── 0
+ │ │ │ ├── image0_CH_val.tif
+ │ │ │ ├── image100_CH_val.tif
+ │ │ │ └── image101_CH_val.tif
+ │ │ │ ... (529 files total)
+ │ │ ├── NY
+ │ │ │ └── 0
+ │ │ │ ├── image0_NY_val.tif
+ │ │ │ ├── image100_NY_val.tif
+ │ │ │ └── image101_NY_val.tif
+ │ │ │ ... (529 files total)
+ │ │ └── NZ
+ │ │ └── 0
+ │ │ ├── image0_NZ_val.tif
+ │ │ ├── image100_NZ_val.tif
+ │ │ └── image101_NZ_val.tif
+ │ │ ... (529 files total)
+ │ └── test
+ │ ├── CH
+ │ │ ├── 0
+ │ │ │ ├── image0_CH_test.tif
+ │ │ │ ├── image1000_CH_test.tif
+ │ │ │ └── image1001_CH_test.tif
+ │ │ │ ... (5000 files total)
+ │ │ ├── 5000
+ │ │ │ ├── image5000_CH_test.tif
+ │ │ │ ├── image5001_CH_test.tif
+ │ │ │ └── image5002_CH_test.tif
+ │ │ │ ... (5000 files total)
+ │ │ └── 10000
+ │ │ ├── image10000_CH_test.tif
+ │ │ ├── image10001_CH_test.tif
+ │ │ └── image10002_CH_test.tif
+ │ │ ... (4400 files total)
+ │ ├── NY
+ │ │ ├── 0
+ │ │ │ ├── image0_NY_test.tif
+ │ │ │ ├── image1000_NY_test.tif
+ │ │ │ └── image1001_NY_test.tif
+ │ │ │ ... (5000 files total)
+ │ │ ├── 5000
+ │ │ │ ├── image5000_NY_test.tif
+ │ │ │ ├── image5001_NY_test.tif
+ │ │ │ └── image5002_NY_test.tif
+ │ │ │ ... (5000 files total)
+ │ │ └── 10000
+ │ │ ├── image10000_NY_test.tif
+ │ │ ├── image10001_NY_test.tif
+ │ │ └── image10002_NY_test.tif
+ │ │ ... (4400 files total)
+ │ └── NZ
+ │ ├── 0
+ │ │ ├── image0_NZ_test.tif
+ │ │ ├── image1000_NZ_test.tif
+ │ │ └── image1001_NZ_test.tif
+ │ │ ... (5000 files total)
+ │ ├── 5000
+ │ │ ├── image5000_NZ_test.tif
+ │ │ ├── image5001_NZ_test.tif
+ │ │ └── image5002_NZ_test.tif
+ │ │ ... (5000 files total)
+ │ └── 10000
+ │ ├── image10000_NZ_test.tif
+ │ ├── image10001_NZ_test.tif
+ │ └── image10002_NZ_test.tif
+ │ ... (4400 files total)
+ ├── lidar
+ │ ├── train
+ │ │ ├── CH
+ │ │ │ ├── 0
+ │ │ │ │ ├── lidar0_CH_train.copc.laz
+ │ │ │ │ ├── lidar1000_CH_train.copc.laz
+ │ │ │ │ └── lidar1001_CH_train.copc.laz
+ │ │ │ │ ... (5000 files total)
+ │ │ │ ├── 5000
+ │ │ │ │ ├── lidar5000_CH_train.copc.laz
+ │ │ │ │ ├── lidar5001_CH_train.copc.laz
+ │ │ │ │ └── lidar5002_CH_train.copc.laz
+ │ │ │ │ ... (5000 files total)
+ │ │ │ └── 10000
+ │ │ │ ├── lidar10000_CH_train.copc.laz
+ │ │ │ ├── lidar10001_CH_train.copc.laz
+ │ │ │ └── lidar10002_CH_train.copc.laz
+ │ │ │ ... (5000 files total)
+ │ │ │ ... (11 dirs total)
+ │ │ ├── NY
+ │ │ │ ├── 0
+ │ │ │ │ ├── lidar0_NY_train.copc.laz
+ │ │ │ │ ├── lidar10_NY_train.copc.laz
+ │ │ │ │ └── lidar1150_NY_train.copc.laz
+ │ │ │ │ ... (1071 files total)
+ │ │ │ ├── 5000
+ │ │ │ │ ├── lidar5060_NY_train.copc.laz
+ │ │ │ │ ├── lidar5061_NY_train.copc.laz
+ │ │ │ │ └── lidar5062_NY_train.copc.laz
+ │ │ │ │ ... (2235 files total)
+ │ │ │ └── 10000
+ │ │ │ ├── lidar10000_NY_train.copc.laz
+ │ │ │ ├── lidar10001_NY_train.copc.laz
+ │ │ │ └── lidar10002_NY_train.copc.laz
+ │ │ │ ... (4552 files total)
+ │ │ │ ... (11 dirs total)
+ │ │ └── NZ
+ │ │ ├── 0
+ │ │ │ ├── lidar0_NZ_train.copc.laz
+ │ │ │ ├── lidar1000_NZ_train.copc.laz
+ │ │ │ └── lidar1001_NZ_train.copc.laz
+ │ │ │ ... (5000 files total)
+ │ │ ├── 5000
+ │ │ │ ├── lidar5000_NZ_train.copc.laz
+ │ │ │ ├── lidar5001_NZ_train.copc.laz
+ │ │ │ └── lidar5002_NZ_train.copc.laz
+ │ │ │ ... (5000 files total)
+ │ │ └── 10000
+ │ │ ├── lidar10000_NZ_train.copc.laz
+ │ │ ├── lidar10001_NZ_train.copc.laz
+ │ │ └── lidar10002_NZ_train.copc.laz
+ │ │ ... (4999 files total)
+ │ │ ... (11 dirs total)
+ │ ├── val
+ │ │ ├── CH
+ │ │ │ └── 0
+ │ │ │ ├── lidar0_CH_val.copc.laz
+ │ │ │ ├── lidar100_CH_val.copc.laz
+ │ │ │ └── lidar101_CH_val.copc.laz
+ │ │ │ ... (529 files total)
+ │ │ ├── NY
+ │ │ │ └── 0
+ │ │ │ ├── lidar0_NY_val.copc.laz
+ │ │ │ ├── lidar100_NY_val.copc.laz
+ │ │ │ └── lidar101_NY_val.copc.laz
+ │ │ │ ... (529 files total)
+ │ │ └── NZ
+ │ │ └── 0
+ │ │ ├── lidar0_NZ_val.copc.laz
+ │ │ ├── lidar100_NZ_val.copc.laz
+ │ │ └── lidar101_NZ_val.copc.laz
+ │ │ ... (529 files total)
+ │ └── test
+ │ ├── CH
+ │ │ ├── 0
+ │ │ │ ├── lidar0_CH_test.copc.laz
+ │ │ │ ├── lidar1000_CH_test.copc.laz
+ │ │ │ └── lidar1001_CH_test.copc.laz
+ │ │ │ ... (5000 files total)
+ │ │ ├── 5000
+ │ │ │ ├── lidar5000_CH_test.copc.laz
+ │ │ │ ├── lidar5001_CH_test.copc.laz
+ │ │ │ └── lidar5002_CH_test.copc.laz
+ │ │ │ ... (5000 files total)
+ │ │ └── 10000
+ │ │ ├── lidar10000_CH_test.copc.laz
+ │ │ ├── lidar10001_CH_test.copc.laz
+ │ │ └── lidar10002_CH_test.copc.laz
+ │ │ ... (4400 files total)
+ │ ├── NY
+ │ │ ├── 0
+ │ │ │ ├── lidar0_NY_test.copc.laz
+ │ │ │ ├── lidar1000_NY_test.copc.laz
+ │ │ │ └── lidar1001_NY_test.copc.laz
+ │ │ │ ... (4964 files total)
+ │ │ ├── 5000
+ │ │ │ ├── lidar5000_NY_test.copc.laz
+ │ │ │ ├── lidar5001_NY_test.copc.laz
+ │ │ │ └── lidar5002_NY_test.copc.laz
+ │ │ │ ... (4953 files total)
+ │ │ └── 10000
+ │ │ ├── lidar10000_NY_test.copc.laz
+ │ │ ├── lidar10001_NY_test.copc.laz
+ │ │ └── lidar10002_NY_test.copc.laz
+ │ │ ... (4396 files total)
+ │ └── NZ
+ │ ├── 0
+ │ │ ├── lidar0_NZ_test.copc.laz
+ │ │ ├── lidar1000_NZ_test.copc.laz
+ │ │ └── lidar1001_NZ_test.copc.laz
+ │ │ ... (5000 files total)
+ │ ├── 5000
+ │ │ ├── lidar5000_NZ_test.copc.laz
+ │ │ ├── lidar5001_NZ_test.copc.laz
+ │ │ └── lidar5002_NZ_test.copc.laz
+ │ │ ... (5000 files total)
+ │ └── 10000
+ │ ├── lidar10000_NZ_test.copc.laz
+ │ ├── lidar10001_NZ_test.copc.laz
+ │ └── lidar10002_NZ_test.copc.laz
+ │ ... (4400 files total)
+ └── ffl
+ ├── train
+ │ ├── CH
+ │ │ ├── 0
+ │ │ │ ├── image0_CH_train.pt
+ │ │ │ ├── image1000_CH_train.pt
+ │ │ │ └── image1001_CH_train.pt
+ │ │ │ ... (5000 files total)
+ │ │ ├── 5000
+ │ │ │ ├── image5000_CH_train.pt
+ │ │ │ ├── image5001_CH_train.pt
+ │ │ │ └── image5002_CH_train.pt
+ │ │ │ ... (5000 files total)
+ │ │ └── 10000
+ │ │ ├── image10000_CH_train.pt
+ │ │ ├── image10001_CH_train.pt
+ │ │ └── image10002_CH_train.pt
+ │ │ ... (5000 files total)
+ │ │ ... (11 dirs total)
+ │ ├── NY
+ │ │ ├── 0
+ │ │ │ ├── image0_NY_train.pt
+ │ │ │ ├── image1000_NY_train.pt
+ │ │ │ └── image1001_NY_train.pt
+ │ │ │ ... (5000 files total)
+ │ │ ├── 5000
+ │ │ │ ├── image5000_NY_train.pt
+ │ │ │ ├── image5001_NY_train.pt
+ │ │ │ └── image5002_NY_train.pt
+ │ │ │ ... (5000 files total)
+ │ │ └── 10000
+ │ │ ├── image10000_NY_train.pt
+ │ │ ├── image10001_NY_train.pt
+ │ │ └── image10002_NY_train.pt
+ │ │ ... (5000 files total)
+ │ │ ... (11 dirs total)
+ │ ├── NZ
+ │ │ ├── 0
+ │ │ │ ├── image0_NZ_train.pt
+ │ │ │ ├── image1000_NZ_train.pt
+ │ │ │ └── image1001_NZ_train.pt
+ │ │ │ ... (5000 files total)
+ │ │ ├── 5000
+ │ │ │ ├── image5000_NZ_train.pt
+ │ │ │ ├── image5001_NZ_train.pt
+ │ │ │ └── image5002_NZ_train.pt
+ │ │ │ ... (5000 files total)
+ │ │ └── 10000
+ │ │ ├── image10000_NZ_train.pt
+ │ │ ├── image10001_NZ_train.pt
+ │ │ └── image10002_NZ_train.pt
+ │ │ ... (5000 files total)
+ │ │ ... (11 dirs total)
+ │ ├── processed-flag-all
+ │ ├── processed-flag-CH
+ │ └── processed-flag-NY
+ │ ... (8 files total)
+ ├── val
+ │ ├── CH
+ │ │ └── 0
+ │ │ ├── image0_CH_val.pt
+ │ │ ├── image100_CH_val.pt
+ │ │ └── image101_CH_val.pt
+ │ │ ... (529 files total)
+ │ ├── NY
+ │ │ └── 0
+ │ │ ├── image0_NY_val.pt
+ │ │ ├── image100_NY_val.pt
+ │ │ └── image101_NY_val.pt
+ │ │ ... (529 files total)
+ │ ├── NZ
+ │ │ └── 0
+ │ │ ├── image0_NZ_val.pt
+ │ │ ├── image100_NZ_val.pt
+ │ │ └── image101_NZ_val.pt
+ │ │ ... (529 files total)
+ │ ├── processed-flag-all
+ │ ├── processed-flag-CH
+ │ └── processed-flag-NY
+ │ ... (8 files total)
+ └── test
+ ├── CH
+ │ ├── 0
+ │ │ ├── image0_CH_test.pt
+ │ │ ├── image1000_CH_test.pt
+ │ │ └── image1001_CH_test.pt
+ │ │ ... (5000 files total)
+ │ ├── 5000
+ │ │ ├── image5000_CH_test.pt
+ │ │ ├── image5001_CH_test.pt
+ │ │ └── image5002_CH_test.pt
+ │ │ ... (5000 files total)
+ │ └── 10000
+ │ ├── image10000_CH_test.pt
+ │ ├── image10001_CH_test.pt
+ │ └── image10002_CH_test.pt
+ │ ... (4400 files total)
+ ├── NY
+ │ ├── 0
+ │ │ ├── image0_NY_test.pt
+ │ │ ├── image1000_NY_test.pt
+ │ │ └── image1001_NY_test.pt
+ │ │ ... (5000 files total)
+ │ ├── 5000
+ │ │ ├── image5000_NY_test.pt
+ │ │ ├── image5001_NY_test.pt
+ │ │ └── image5002_NY_test.pt
+ │ │ ... (5000 files total)
+ │ └── 10000
+ │ ├── image10000_NY_test.pt
+ │ ├── image10001_NY_test.pt
+ │ └── image10002_NY_test.pt
+ │ ... (4400 files total)
+ ├── NZ
+ │ ├── 0
+ │ │ ├── image0_NZ_test.pt
+ │ │ ├── image1000_NZ_test.pt
+ │ │ └── image1001_NZ_test.pt
+ │ │ ... (5000 files total)
+ │ ├── 5000
+ │ │ ├── image5000_NZ_test.pt
+ │ │ ├── image5001_NZ_test.pt
+ │ │ └── image5002_NZ_test.pt
+ │ │ ... (5000 files total)
+ │ └── 10000
+ │ ├── image10000_NZ_test.pt
+ │ ├── image10001_NZ_test.pt
+ │ └── image10002_NZ_test.pt
+ │ ... (4400 files total)
+ ├── processed-flag-all
+ ├── processed-flag-CH
+ └── processed-flag-NY
+ ... (8 files total)
+ ```

+ </details>
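As a quick sanity check of the layout above, the sketch below opens one annotation file and one LiDAR tile. It assumes the annotation JSONs use COCO-style `images`/`annotations` keys, as their naming suggests, and that `laspy` with a LAZ backend is installed (e.g. `pip install laspy[lazrs]`); adjust the paths to whichever split and country you downloaded.

```python
# Minimal sketch: peek into one annotation file and one LiDAR tile under $DATA_ROOT.
# The COCO-style keys and the laspy dependency are assumptions, not part of the repo docs.
import json
import os

import laspy

data_root = os.environ.get("DATA_ROOT", "./PixelsPointsPolygons")

ann_path = os.path.join(data_root, "data", "224", "annotations", "annotations_all_val.json")
with open(ann_path) as f:
    coco = json.load(f)
print(len(coco.get("images", [])), "images,", len(coco.get("annotations", [])), "annotations")

laz_path = os.path.join(data_root, "data", "224", "lidar", "val", "CH", "0", "lidar0_CH_val.copc.laz")
las = laspy.read(laz_path)
print(las.header.point_count, "LiDAR points in", os.path.basename(laz_path))
```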
+
+ ## Pretrained model weights
+
+ ### Download

+ ```
+ git lfs install
+ git clone https://huggingface.co/rsi/PixelsPointsPolygons $MODEL_ROOT
+ ```

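The weights can likewise be fetched with the `huggingface_hub` client instead of Git LFS; a minimal sketch, assuming `$MODEL_ROOT` is the target directory:

```python
# Minimal sketch: fetch the pretrained weights without git-lfs.
import os
from huggingface_hub import snapshot_download

model_root = os.environ.get("MODEL_ROOT", "./PixelsPointsPolygons-weights")
snapshot_download(repo_id="rsi/PixelsPointsPolygons", local_dir=model_root)  # default repo_type is "model"
```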
  ## Code

  git clone https://github.com/raphaelsulzer/PixelsPointsPolygons
  ```

+ ### Installation

+ To create a conda environment named `p3` and install the repository as a Python package with all dependencies, run
  ```
  bash install.sh
  ```
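If you already manage your own Python environment, the repository can instead be installed as a package from its root with `pip install .`.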
 
  | Pix2Poly |\<pix2poly>| PointPillars (PP) + ViT | \<pp_vit> | | ✅ | 0.80 | 0.88 |
  | Pix2Poly |\<pix2poly>| PP+ViT \& ViT | \<fusion_vit> | ✅ |✅ | 0.78 | 0.85 | -->

+ ### Setup
+
+ The project supports Hydra configuration, which lets you modify any parameter either from a `.yaml` file or directly from the command line.
+
+ To set up the project structure, we recommend specifying your `$DATA_ROOT` and `$MODEL_ROOT` in `config/host/default.yaml`.

+ To view all available configuration options, run
  ```
+ python scripts/train.py --help
  ```
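For example, `python scripts/train.py experiment=ffl_fusion model.batch_size=8` would select the `ffl_fusion` experiment and override its batch size from the command line; the override names follow the experiment files and parameters referenced elsewhere in this README, and the values are only illustrative.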

+ <!-- The most important parameters are described below:
+ <details>
+ <summary>CLI Parameters</summary>

+ ```text
+ ├── processed-flag-all
+ ├── processed-flag-CH
+ └── processed-flag-NY
+ ... (8 files total)
  ```

+ </details> -->

+ ### Predict a single tile
+
+ TODO

  ```
+ python scripts/predict_demo.py
+ ```
+
+ ### Reproduce paper results

+ To reproduce the results from the paper, you can run any of the following commands:

  ```
+ python scripts/modality_ablation.py
+ python scripts/lidar_density_ablation.py
+ python scripts/all_countries.py
  ```
 

+ ### Custom training, prediction and evaluation

+ We recommend first setting up a custom `$EXP_FILE` in `config/experiment`, following the structure of one of the existing experiment files, e.g. `ffl_fusion.yaml`. You can then run:

+ ```
+ # train your model (on multiple GPUs)
+ torchrun --nproc_per_node=$NUM_GPU scripts/train.py experiment=$EXP_FILE
+ # predict the test set with your model (on multiple GPUs)
+ torchrun --nproc_per_node=$NUM_GPU scripts/predict.py evaluation=test checkpoint=best_val_iou
+ # evaluate your prediction of the test set
+ python scripts/evaluate.py model=<model> evaluation=test checkpoint=best_val_iou
+ ```

+ You could also continue training from a provided pretrained model with

+ ```
+ # train your model (on a single GPU)
+ python scripts/train.py experiment=p2p_fusion checkpoint=latest
+ ```

  ## Citation

  If you find our work useful, please consider citing:
  ```bibtex
+ TODO
  ```
 
  ## Acknowledgements