# Population Transformer

Weights for the pretrained Population Transformer (paper, website, code), built on the pretrained BrainBERT STFT model (paper, code).

Trained on the Brain TreeBank dataset (paper, dataset).

Cite:

```bibtex
@misc{chau2024populationtransformer,
      title={Population Transformer: Learning Population-level Representations of Neural Activity},
      author={Geeling Chau and Christopher Wang and Sabera Talukder and Vighnesh Subramaniam and Saraswati Soedarmadji and Yisong Yue and Boris Katz and Andrei Barbu},
      year={2024},
      eprint={2406.03044},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2406.03044},
}
```