Dataset: Bradley/fineweb-sample-100BT_over-2048-tokens-subset-split-processed-l3tokenizer
Formats: parquet
Size: 1M - 10M
Libraries: Datasets, Dask, Croissant + 1
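Since the dataset is published as Parquet shards and lists the `datasets` library as supported, a minimal loading sketch is shown below. It assumes the default configuration and a single "train" split, which are not stated on this page; adjust the repo's actual split and config names as needed.

```python
# Minimal sketch: load the dataset with the Hugging Face `datasets` library.
# Assumption: the repo uses the default config and exposes a "train" split.
from datasets import load_dataset

ds = load_dataset(
    "Bradley/fineweb-sample-100BT_over-2048-tokens-subset-split-processed-l3tokenizer",
    split="train",
)

print(ds)      # number of rows and column names
print(ds[0])   # first example
```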
Commit History (branch: main)
Upload dataset (part 00002-of-00003) · 91fb401 · verified · Bradley committed on May 21
Upload dataset (part 00001-of-00003) · 5d2a18a · verified · Bradley committed on May 21
Upload dataset (part 00000-of-00003) · 0536d2c · verified · Bradley committed on May 21
initial commit · ee4dc59 · verified · Bradley committed on May 21