Datasets:
Correct the compressed files so they contain the correct data (without redundant hierarchy)
- README.md +1 -27
- contributor_README.md +14 -2
- data/ADMM-ADAM/config0/config0.tar.gz +2 -2
- data/ADMM-ADAM/config1/config1.tar.gz +2 -2
- data/ADMM-ADAM/config2/config2.tar.gz +2 -2
- data/ADMM-ADAM/config3/config3.tar.gz +2 -2
- data/ADMM-ADAM/config4/config4.tar.gz +2 -2
- data/Origin/Origin.tar.gz +2 -2
- data/new_FastHyIn/config1/config1.tar.gz +2 -2
- data/new_FastHyIn/config2/config2.tar.gz +2 -2
- data/new_FastHyIn/config3/config3.tar.gz +2 -2
- data/new_FastHyIn/config4/config4.tar.gz +2 -2
- unzipping.sh +13 -26
- zipping.sh +17 -16
README.md
CHANGED
@@ -190,30 +190,4 @@
 
 If studies of the datasets have outlined other limitations of the dataset, such as annotation artifacts, please outline and cite them here.
 
-##
-
-### Dataset Curators
-
-List the people involved in collecting the dataset and their affiliation(s). If funding information is known, include it here.
-
-### Licensing Information
-
-Provide the license and link to the license webpage if available.
-
-### Citation Information
-
-Provide the [BibTex](http://www.bibtex.org/)-formatted reference for the dataset. For example:
-```
-@article{article_id,
-  author = {Author List},
-  title = {Dataset Paper Title},
-  journal = {Publication Venue},
-  year = {2525}
-}
-```
-
-If the dataset has a [DOI](https://www.doi.org/), please provide it here.
-
-### Contributions
-
-Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset.
+## Addition
contributor_README.md
CHANGED
@@ -13,13 +13,18 @@ git clone https://huggingface.co/datasets/OtoroLin/HyperForensics-plus-plus
@@ -13,13 +13,18 @@ git clone https://huggingface.co/datasets/OtoroLin/HyperForensics-plus-plus
 cd ./HyperForensics-plus-plus
 ```
 ## 2. Extract the Dataset
-
-Unzip the data.tar.gz file into a separate local directory. You can utilize the provided `zipping.sh` script. First, modify the `ROOT_DIR` variable to the `local_data` path
+Decompress the `.tar.gz` files into a separate local hierarchy with the provided `unzipping.sh` script. First, set the `ROOT_DIR` variable to the `local_data` path:
 ```bash
 # In unzipping.sh
 # Define the root directory of the whole dataset
 ROOT_DIR="/root/to/local_data"
 ```
+Because our dataset is highly redundant, we compress the files with the parallel **`pigz`** program instead of standard `gzip`. Make sure it is installed first:
+```bash
+sudo apt install pigz
+# If you are using miniconda
+conda install pigz
+```
 ### 2.1 Decompressing a specified method and configuration:
 ```bash
 ./unzipping.sh --method <method name> --config <config name>
@@ -61,6 +66,12 @@
 # Define the root directory of the whole dataset
 ROOT_DIR="/root/to/local_data"
 ```
+Because our dataset is highly redundant, we compress the files with the parallel **`pigz`** program instead of standard `gzip`. Make sure it is installed first:
+```bash
+sudo apt install pigz
+# If you are using miniconda
+conda install pigz
+```
 ### 4.1 Compressing a specified method and configuration:
 ```bash
 ./zipping.sh --method <method name> --config <config name>
@@ -86,6 +97,7 @@
 ./zipping.sh --all
 ```
 This will compress the entire local data hierarchy separately and automatically place the compressed files under the hierarchy of the dataset repo.
+> Note that method/config-specific compression is not implemented for **Origin**, since it seems unlikely anyone needs it.
 ## 5. Create a Pull Request
 Push your changes to create a pull request. Hugging Face uses `git push` to implement pull requests:
 ```bash
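The pigz-based compress/extract flow described in the contributor README can be sketched as a minimal round trip. This is an illustrative sketch, not the repo's scripts: the `/tmp/demo_*` paths and sample file are hypothetical, and it falls back to standard `gzip` when `pigz` is absent (the README itself requires `pigz`).

```shell
# Use pigz (parallel gzip) when available; otherwise fall back to gzip.
GZ="$(command -v pigz || command -v gzip)"

# Hypothetical stand-in for a method/config directory such as ADMM-ADAM/config0.
mkdir -p /tmp/demo_local/ADMM-ADAM/config0
echo "sample scene" > /tmp/demo_local/ADMM-ADAM/config0/scene.txt

# Compress the directory *contents* (-C dir .) so the archive carries
# no redundant leading hierarchy.
tar -c --use-compress-program="$GZ" -f /tmp/config0.tar.gz \
    -C /tmp/demo_local/ADMM-ADAM/config0 .

# Decompress into a fresh output directory, mirroring what unzipping.sh does.
mkdir -p /tmp/demo_out/ADMM-ADAM/config0
tar -x --use-compress-program="$GZ" -f /tmp/config0.tar.gz \
    --directory /tmp/demo_out/ADMM-ADAM/config0

cat /tmp/demo_out/ADMM-ADAM/config0/scene.txt
```

`--use-compress-program` hands the (de)compression stream to an external filter, which is how the scripts below swap single-threaded `gzip` for multi-threaded `pigz` without changing the archive format.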
data/ADMM-ADAM/config0/config0.tar.gz
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:f622697dbb7cd19ed84a43455f9bdbd522a5373adf39000c2cac1ed9e8e7ace2
+size 4641814961
data/ADMM-ADAM/config1/config1.tar.gz
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:86dce66756d06e2c5cfdb74c15e68595b19226a6102a41c4c8cd74247abd6c26
+size 4644982824
data/ADMM-ADAM/config2/config2.tar.gz
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:dfee8be8bd83c627dec9c205625e8dd8fe922a7083ccb90889894480c114404e
+size 4651115011
data/ADMM-ADAM/config3/config3.tar.gz
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:76a20b7582fcc7e33c7ac4ebf72b932b1130de26a206d1285a468d649d78cdf9
+size 4649205808
data/ADMM-ADAM/config4/config4.tar.gz
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:374e382bc5e5fbf193550e9276c7b44159bfe4f67f694cba968e9abd8e0358b0
+size 4654738674
data/Origin/Origin.tar.gz
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:ed7e4fbabcb2b86a794ce866a46f53a0961a7a13ce86b20b5b764b842813d94f
+size 4687052148
data/new_FastHyIn/config1/config1.tar.gz
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:e2884c78c04af3bd9b43aa491f40945b4afeb0ed7acdfd1c6c8edc1353d3a531
+size 18773745947
data/new_FastHyIn/config2/config2.tar.gz
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:11cf1a565368940fac38a290d54915e31e9cdb709373cacec1d07da40a37d696
+size 18756683784
data/new_FastHyIn/config3/config3.tar.gz
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:44ad4e07dbd12ee65ea22c39dfc60d9738ebff9d12cf2cb3ea3982708a8f152d
+size 18778169712
data/new_FastHyIn/config4/config4.tar.gz
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:ed4f4be0eb161aa1ce251cfe63b368310473f2b692bf1240d8e76ddff17e42ec
+size 18768394044
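Each `.tar.gz` above is stored as a Git LFS pointer file: `oid sha256:` is the SHA-256 digest of the real archive and `size` is its byte count, which lets a contributor verify a downloaded archive against its pointer. A minimal sketch of how such a pointer is derived (the sample file and path are hypothetical):

```shell
# Build an LFS-style pointer for a small sample file.
printf 'hello\n' > /tmp/blob.bin
OID="$(sha256sum /tmp/blob.bin | cut -d' ' -f1)"
SIZE="$(wc -c < /tmp/blob.bin | tr -d ' ')"
printf 'version https://git-lfs.github.com/spec/v1\noid sha256:%s\nsize %s\n' "$OID" "$SIZE"
# Prints:
# version https://git-lfs.github.com/spec/v1
# oid sha256:5891b5b522d5df086d0ff0b110fbd9d21bb4fc7163af34d08286a2e846f6be03
# size 6
```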
unzipping.sh
CHANGED
@@ -1,5 +1,11 @@
 #!/bin/bash
 
+# Ensure pigz is installed
+if ! command -v pigz &> /dev/null; then
+    echo "Error: pigz is not installed. Please install it and try again."
+    exit 1
+fi
+
 # Define the root directory of the whole dataset
 ROOT_DIR="/ssd2/TzuYu/data_test"
 
@@ -14,17 +20,19 @@ unzip_config() {
     local config=$2
     local output_path="$ROOT_DIR/$method/$config"
     local zipped_file="./data/$method/$config/$config.tar.gz"
-
+    echo "Decompressing $zipped_file to $output_path"
     # Ensure the output directory exists
     mkdir -p "$output_path"
 
     # Decompress the tar.gz file
     echo "Decompressing $zipped_file to $output_path"
-    tar -
+    tar -xv --use-compress-program=pigz -f "$zipped_file" --directory "$output_path"
     if [[ $? -ne 0 ]]; then
         echo "Error: Failed to decompress $zipped_file"
         exit 1
     fi
+    # Confirm completion
+    echo "Decompressed $zipped_file successfully to $output_path"
 }
 
 # Parse arguments
@@ -43,7 +51,7 @@ done
 # unzipping all folders
 if [[ "$ALL" == "true" ]]; then
     # Iterate through all methods and configs
-    for method_dir in
+    for method_dir in ./data/*; do
         if [[ -d "$method_dir" && "$(basename "$method_dir")" != "Origin" ]]; then
             method=$(basename "$method_dir")
             for config_dir in "$method_dir"/config*; do
@@ -61,7 +69,7 @@ if [[ "$ALL" == "true" ]]; then
 
     # decompress tar.gz file
     echo "Decompressing ./data/Origin/Origin.tar.gz to $ROOT_DIR/Origin"
-    tar -
+    tar -xv --use-compress-program=pigz -f "./data/Origin/Origin.tar.gz" --directory "$ROOT_DIR/Origin"
     if [[ $? -ne 0 ]]; then
         echo "Error: Failed to decompress ./data/Origin/Origin.tar.gz"
         exit 1
@@ -77,25 +85,4 @@ if [[ ! -n "$METHOD" || ! -n "$CONFIG" ]]; then
     echo "Error: Both --method and --config or --all must be specified."
     usage
 fi
-
-OUTPUT_PATH="$ROOT_DIR/$METHOD/config$CONFIG"
-
-# Extract method and config index from the directory path
-METHOD=$(basename "$(dirname "$OUTPUT_PATH")")
-CONFIG=$(basename "$OUTPUT_PATH")
-
-INPUT_TAR="./data/$METHOD/$CONFIG/$CONFIG.tar.gz"
-
-# Ensure the output directory exists
-mkdir -p "$OUTPUT_PATH"
-
-# Create the tar.gz file
-echo "Decompressing: $INPUT_TAR to $OUTPUT_PATH"
-tar -xvzf "$INPUT_TAR" -C "$OUTPUT_PATH"
-if [[ $? -ne 0 ]]; then
-    echo "Error: Failed to create tar.gz file."
-    exit 1
-fi
-
-# Confirm completion
-echo "Decompressing $INPUT_TAR successfully to $OUTPUT_PATH"
+unzip_config "$METHOD" "config$CONFIG"
zipping.sh
CHANGED
@@ -1,5 +1,11 @@
 #!/bin/bash
 
+# Ensure pigz is installed
+if ! command -v pigz &> /dev/null; then
+    echo "Error: pigz is not installed. Please install it and try again."
+    exit 1
+fi
+
 # Define the root directory of the whole dataset
 ROOT_DIR="/ssd2/TzuYu/data"
 
@@ -20,7 +26,7 @@ zip_config() {
 
     # Create the tar.gz file
     echo "Zipping $dir_path to $output_path"
-    tar -
+    tar -cv --use-compress-program=pigz -f "$output_path" -C "$dir_path" .
     if [[ $? -ne 0 ]]; then
         echo "Error: Failed to create tar.gz for $dir_path"
         exit 1
@@ -62,7 +68,7 @@ if [[ "$ALL" == "true" ]]; then
 
     # Create the tar.gz file
     echo "Zipping $ROOT_DIR/Origin to ./data/Origin/Origin.tar.gz"
-    tar -
+    tar -cv --use-compress-program=pigz -f "./data/Origin/Origin.tar.gz" -C "$ROOT_DIR/Origin" .
     if [[ $? -ne 0 ]]; then
         echo "Error: Failed to create tar.gz for $ROOT_DIR/Origin"
         exit 1
@@ -94,24 +100,19 @@ METHOD=$(basename "$(dirname "$DIR_PATH")")
 CONFIG=$(basename "$DIR_PATH")
 
 # Define the output tar file name
-# If
+# If method and config are specified, save into the data hierarchy
 # Otherwise, save to the repo root directory
 if [[ "$SPECIFIED" == "false" ]]; then
     OUTPUT_TAR="./$CONFIG.tar.gz"
+    echo "Zipping $DIR_PATH to $OUTPUT_TAR"
+    tar -cv --use-compress-program=pigz -f "$OUTPUT_TAR" -C "$DIR_PATH" .
+    if [[ $? -ne 0 ]]; then
+        echo "Error: Failed to create tar.gz for $DIR_PATH"
+        exit 1
+    fi
 else
-
-
-
-    # Ensure the output directory exists
-    mkdir -p "$(dirname "$OUTPUT_TAR")"
-
-    # Create the tar.gz file
-    echo "Zipping directory: $DIR_PATH"
-    tar -czvf "$OUTPUT_TAR" "$DIR_PATH"/*
-    if [[ $? -ne 0 ]]; then
-        echo "Error: Failed to create tar.gz file."
-        exit 1
-    fi
+    # Create the tar.gz file
+    zip_config "$METHOD" config"$CONFIG"
 fi
 
-# Confirm completion
 echo "Directory zipped successfully to: $OUTPUT_TAR"
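The switch from archiving `"$DIR_PATH"/*` to `-C "$DIR_PATH" .` is what removes the redundant hierarchy this commit describes. A small sketch contrasting the member paths inside the two archives (directories are hypothetical, and plain `gzip` via `-z` stands in for `pigz` so the demo is self-contained):

```shell
mkdir -p /tmp/hier_demo/data/cfg
echo "hi" > /tmp/hier_demo/data/cfg/f.txt

# Old style: member names keep the full directory prefix
# (emulates tar -czvf out.tar.gz "$DIR_PATH"/*).
tar -czf /tmp/old.tar.gz /tmp/hier_demo/data/cfg 2>/dev/null
tar -tzf /tmp/old.tar.gz    # members include tmp/hier_demo/data/cfg/f.txt

# New style: members are relative to the target directory.
tar -czf /tmp/new.tar.gz -C /tmp/hier_demo/data/cfg .
tar -tzf /tmp/new.tar.gz    # members are ./ and ./f.txt
```

Extracting the old archive into `$output_path` recreates the whole `tmp/hier_demo/data/cfg/` chain under it; the new archive drops its files directly into the extraction directory.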