---
license: apache-2.0
tags:
- code
---
Random Boolean networks, each sampled over 8192 timesteps, with network sizes ranging from 32 to 1024 nodes and connectivities (the number of inputs per function) drawn per item from random Gaussian mixtures.
features:
- `network_size` — number of nodes in the network
- `connectivity_modes` — number of Gaussians that connectivities are sampled from
- `connectivity_mu` — means of the Gaussian modes
- `connectivity_sigma` — standard deviations of the Gaussian modes
- `connectivity_mode_weights` — mixture weights of the Gaussian modes
- `initial_state` — the network's starting state
- `function_keys` — think left (input) side of the truth table
- `function_values` — think right (output) side of the truth table
- `function_inputs` — the node indices that feed into each function
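As a rough illustration of how the truth-table fields above fit together, here is a minimal sketch of evaluating one node's Boolean function. The exact encoding of `function_keys` is not specified by this card; the sketch assumes each key is an integer whose bits encode one combination of the input nodes' states, with `function_values` giving the corresponding output bit.

```python
# Hypothetical sketch: one node's Boolean function as a truth table.
# The bit-packed key encoding is an assumption, not confirmed by the dataset card.

def make_truth_table(keys, values):
    """Map each input combination (key) to its output bit (value)."""
    return dict(zip(keys, values))

def evaluate(table, inputs, state):
    """Evaluate a node: pack the states of its input nodes into a key, look it up."""
    key = 0
    for i, node in enumerate(inputs):
        key |= state[node] << i  # bit i holds the state of the i-th input node
    return table[key]

# Example: an AND gate over nodes 0 and 1
table = make_truth_table(keys=[0, 1, 2, 3], values=[0, 0, 0, 1])
print(evaluate(table, inputs=[0, 1], state=[1, 1]))  # -> 1
```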
For a tiny subset with 128 networks (for example, to test a pipeline), try:

```python
from datasets import load_dataset

dataset = load_dataset(
    "parquet",
    data_files=[
        "https://huggingface.co/datasets/midwestern-simulation/boolean-networks/resolve/main/chunk_gUKO6HBS_128_8192_32-1024.parquet"
        # filename pattern: "chunk_{id}_{num_samples}_{min_network_size}_{max_network_size}.parquet"
    ],
)
```
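Once a row is loaded, the per-node function fields can drive a synchronous update of the whole network. The sketch below is hypothetical: it assumes one `(keys, values, inputs)` triple per node, with bit-packed integer keys, which this card does not confirm.

```python
# Hypothetical sketch: one synchronous update step of a random Boolean network.
# Per-node field layout (keys, values, inputs) and key encoding are assumptions.

def step(state, functions):
    """Advance all nodes one timestep. functions: list of (keys, values, inputs)."""
    new_state = []
    for keys, values, inputs in functions:
        key = 0
        for i, node in enumerate(inputs):
            key |= state[node] << i  # pack input-node states into a lookup key
        new_state.append(values[keys.index(key)])
    return new_state

# Tiny 2-node example: node 0 = NOT(node 1), node 1 = copy of node 0
functions = [
    ([0, 1], [1, 0], [1]),  # node 0: NOT of node 1
    ([0, 1], [0, 1], [0]),  # node 1: identity of node 0
]
state = step([0, 0], functions)  # -> [1, 0]
```

Iterating this `step` from `initial_state` for 8192 timesteps would reproduce one sampled trajectory, under the field-layout assumptions above.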