Dataset metadata — Modalities: Text · Formats: json · Size: < 1K · Libraries: Datasets, pandas

itazap (HF Staff) committed 6b657e3 (verified) · 1 parent: d2acd2a

Upload tokenizer_data_fixed.json

Files changed (1): tokenizer_data_fixed.json (+2433, −0, file added)
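The file below is a JSON array of tokenizer test fixtures (expected tokens and ids per tokenizer test class). A minimal sketch of consuming such a record in Python — one fixture from the file is inlined here so the sketch runs standalone; with the real file you would use `json.load` on `tokenizer_data_fixed.json` (or `pandas.read_json`, which the page lists) instead:

```python
import json

# One record from tokenizer_data_fixed.json, inlined so the sketch is self-contained.
record = {
    "TestClass": "BertTokenizationTest",
    "sp_base": False,
    "sequence": "UNwantéd,running",
    "tokens": ["un", "##want", "##ed", ",", "runn", "##ing"],
    "encoded": [9, 6, 7, 12, 10, 11],
    "encoded_special": [1, 9, 6, 7, 12, 10, 11, 2],
    "params": {},
    "params_encode": {},
}

# Stands in for: fixtures = json.load(open("tokenizer_data_fixed.json"))
fixtures = json.loads(json.dumps([record]))

for fx in fixtures:
    # Each id corresponds to one token.
    assert len(fx["encoded"]) == len(fx["tokens"])
    # For BERT-style fixtures, "encoded_special" wraps "encoded" with
    # [CLS]/[SEP] ids. (Not universal: e.g. the MBart record appends a
    # suffix instead, and the Splinter record stores NaN here.)
    assert fx["encoded_special"][1:-1] == fx["encoded"]
    print(f'{fx["TestClass"]}: {len(fx["tokens"])} tokens')
```

Note that the file is not strictly valid JSON as committed: the `SplinterTokenizationTest` record stores `NaN` for `encoded_special`, which standard JSON parsers reject unless non-finite values are allowed (Python's `json` accepts it by default).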
[
  {
    "TestClass": "AlbertTokenizationTest",
    "sp_base": false,
    "sequence": "I was born in 92000, and this is falsé.",
    "tokens": ["▁i", "▁was", "▁born", "▁in", "▁9", "2000", ",", "▁and", "▁this", "▁is", "▁false", "."],
    "encoded": [31, 23, 386, 19, 561, 3050, 15, 17, 48, 25, 4997, 9],
    "encoded_special": [2, 31, 23, 386, 19, 561, 3050, 15, 17, 48, 25, 4997, 9, 3],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "BertTokenizationTest",
    "sp_base": false,
    "sequence": "UNwantéd,running",
    "tokens": ["un", "##want", "##ed", ",", "runn", "##ing"],
    "encoded": [9, 6, 7, 12, 10, 11],
    "encoded_special": [1, 9, 6, 7, 12, 10, 11, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "BertTokenizationTest",
    "sp_base": false,
    "sequence": "UNwantéd,running",
    "tokens": ["un", "##want", "##ed", ",", "runn", "##ing"],
    "encoded": [9, 6, 7, 12, 10, 11],
    "encoded_special": [1, 9, 6, 7, 12, 10, 11, 2],
    "params": {
      "do_lower_case": true
    },
    "params_encode": {}
  },
  {
    "TestClass": "BigBirdPegasusTokenizationTest",
    "sp_base": true,
    "sequence": "This is a test",
    "tokens": ["▁This", "▁is", "▁a", "▁", "t", "est"],
    "encoded": [288, 46, 9, 3, 12, 390],
    "encoded_special": [288, 46, 9, 3, 12, 390, 1],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "BigBirdTokenizationTest",
    "sp_base": false,
    "sequence": "I was born in 92000, and this is falsé.",
    "tokens": ["▁I", "▁was", "▁b", "or", "n", "▁in", "▁", "9", "2", "0", "0", "0", ",", "▁and", "▁this", "▁is", "▁f", "al", "s", "é", "."],
    "encoded": [8, 21, 84, 55, 24, 19, 7, 0, 602, 347, 347, 347, 3, 12, 66, 46, 72, 80, 6, 0, 4],
    "encoded_special": [1002, 8, 21, 84, 55, 24, 19, 7, 0, 602, 347, 347, 347, 3, 12, 66, 46, 72, 80, 6, 0, 4, 1000],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "CLIPTokenizationTest",
    "sp_base": false,
    "sequence": "lower newer",
    "tokens": ["lo", "w", "er</w>", "n", "e", "w", "er</w>"],
    "encoded": [10, 2, 16, 9, 3, 2, 16],
    "encoded_special": [21, 10, 2, 16, 9, 3, 2, 16, 22],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "CamembertTokenizationTest",
    "sp_base": true,
    "sequence": "I was born in 92000, and this is falsé.",
    "tokens": ["▁I", "▁was", "▁b", "or", "n", "▁in", "▁", "9", "2", "0", "0", "0", ",", "▁and", "▁this", "▁is", "▁f", "al", "s", "é", "."],
    "encoded": [12, 25, 88, 59, 28, 23, 11, 3, 606, 351, 351, 351, 7, 16, 70, 50, 76, 84, 10, 3, 8],
    "encoded_special": [5, 12, 25, 88, 59, 28, 23, 11, 3, 606, 351, 351, 351, 7, 16, 70, 50, 76, 84, 10, 3, 8, 6],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "CodeGenTokenizationTest",
    "sp_base": false,
    "sequence": "lower newer",
    "tokens": ["Ġlow", "er", "Ġ", "n", "e", "w", "er"],
    "encoded": [14, 15, 10, 9, 3, 2, 15],
    "encoded_special": [14, 15, 10, 9, 3, 2, 15],
    "params": {
      "add_prefix_space": true
    },
    "params_encode": {}
  },
  {
    "TestClass": "DPRContextEncoderTokenizationTest",
    "sp_base": false,
    "sequence": "UNwantéd,running",
    "tokens": ["un", "##want", "##ed", ",", "runn", "##ing"],
    "encoded": [9, 6, 7, 12, 10, 11],
    "encoded_special": [1, 9, 6, 7, 12, 10, 11, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "DPRQuestionEncoderTokenizationTest",
    "sp_base": false,
    "sequence": "UNwantéd,running",
    "tokens": ["un", "##want", "##ed", ",", "runn", "##ing"],
    "encoded": [9, 6, 7, 12, 10, 11],
    "encoded_special": [1, 9, 6, 7, 12, 10, 11, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "DPRReaderTokenizationTest",
    "sp_base": false,
    "sequence": "UNwantéd,running",
    "tokens": ["un", "##want", "##ed", ",", "runn", "##ing"],
    "encoded": [9, 6, 7, 12, 10, 11],
    "encoded_special": [1, 9, 6, 7, 12, 10, 11, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "DebertaTokenizationTest",
    "sp_base": false,
    "sequence": "lower newer",
    "tokens": ["l", "o", "w", "er", "Ġ", "n", "e", "w", "er"],
    "encoded": [0, 1, 2, 15, 10, 9, 3, 2, 15],
    "encoded_special": [20, 0, 1, 2, 15, 10, 9, 3, 2, 15, 21],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "DebertaV2TokenizationTest",
    "sp_base": false,
    "sequence": "I was born in 92000, and this is falsé!",
    "tokens": ["▁", "I", "▁was", "▁born", "▁in", "▁9", "2000", ",", "▁and", "▁this", "▁is", "▁fal", "s", "é", "!"],
    "encoded": [13, 1, 23, 386, 19, 561, 3050, 15, 17, 48, 25, 8256, 18, 1, 187],
    "encoded_special": [2, 13, 1, 23, 386, 19, 561, 3050, 15, 17, 48, 25, 8256, 18, 1, 187, 3],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "DistilBertTokenizationTest",
    "sp_base": false,
    "sequence": "UNwantéd,running",
    "tokens": ["un", "##want", "##ed", ",", "runn", "##ing"],
    "encoded": [9, 6, 7, 12, 10, 11],
    "encoded_special": [1, 9, 6, 7, 12, 10, 11, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "ElectraTokenizationTest",
    "sp_base": false,
    "sequence": "UNwantéd,running",
    "tokens": ["un", "##want", "##ed", ",", "runn", "##ing"],
    "encoded": [9, 6, 7, 12, 10, 11],
    "encoded_special": [1, 9, 6, 7, 12, 10, 11, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "ElectraTokenizationTest",
    "sp_base": false,
    "sequence": "UNwantéd,running",
    "tokens": ["un", "##want", "##ed", ",", "runn", "##ing"],
    "encoded": [9, 6, 7, 12, 10, 11],
    "encoded_special": [1, 9, 6, 7, 12, 10, 11, 2],
    "params": {
      "do_lower_case": true
    },
    "params_encode": {}
  },
  {
    "TestClass": "FNetTokenizationTest",
    "sp_base": false,
    "sequence": "I was born in 92000, and this is falsé.",
    "tokens": ["▁", "I", "▁was", "▁born", "▁in", "▁9", "2000", ",", "▁and", "▁this", "▁is", "▁fal", "s", "é", "."],
    "encoded": [13, 1, 23, 386, 19, 561, 3050, 15, 17, 48, 25, 8256, 18, 1, 9],
    "encoded_special": [2, 13, 1, 23, 386, 19, 561, 3050, 15, 17, 48, 25, 8256, 18, 1, 9, 3],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "FunnelTokenizationTest",
    "sp_base": false,
    "sequence": "UNwantéd,running",
    "tokens": ["un", "##want", "##ed", ",", "runn", "##ing"],
    "encoded": [7, 4, 5, 10, 8, 9],
    "encoded_special": [1, 7, 4, 5, 10, 8, 9, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "GPT2TokenizationTest",
    "sp_base": false,
    "sequence": "lower newer",
    "tokens": ["Ġlow", "er", "Ġ", "n", "e", "w", "er"],
    "encoded": [14, 15, 10, 9, 3, 2, 15],
    "encoded_special": [14, 15, 10, 9, 3, 2, 15],
    "params": {
      "add_prefix_space": true
    },
    "params_encode": {}
  },
  {
    "TestClass": "HerbertTokenizationTest",
    "sp_base": false,
    "sequence": "lower,newer",
    "tokens": ["low", "er</w>", ",</w>", "n", "e", "w", "er</w>"],
    "encoded": [16, 17, 22, 11, 5, 4, 17],
    "encoded_special": [0, 16, 17, 22, 11, 5, 4, 17, 1],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "LayoutLMTokenizationTest",
    "sp_base": false,
    "sequence": "UNwantéd,running",
    "tokens": ["un", "##want", "##ed", ",", "runn", "##ing"],
    "encoded": [7, 4, 5, 10, 8, 9],
    "encoded_special": [1, 7, 4, 5, 10, 8, 9, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "LayoutLMv2TokenizationTest",
    "sp_base": false,
    "sequence": false,
    "tokens": false,
    "encoded": [10, 11, 12, 13],
    "encoded_special": [1, 10, 11, 12, 13, 2],
    "params": {},
    "params_encode": {
      "text": ["a", "weirdly", "test"],
      "boxes": [[423, 237, 440, 251], [427, 272, 441, 287], [419, 115, 437, 129]]
    }
  },
  {
    "TestClass": "LayoutLMv3TokenizationTest",
    "sp_base": false,
    "sequence": false,
    "tokens": false,
    "encoded": [14, 15, 10, 9, 3, 2, 15],
    "encoded_special": [20, 14, 15, 10, 9, 3, 2, 15, 21],
    "params": {},
    "params_encode": {
      "text": ["lower", "newer"],
      "boxes": [[423, 237, 440, 251], [427, 272, 441, 287]]
    }
  },
  {
    "TestClass": "LayoutXLMTokenizationTest",
    "sp_base": true,
    "sequence": false,
    "tokens": false,
    "encoded": [11, 113, 159, 17, 39, 171, 383],
    "encoded_special": [0, 11, 113, 159, 17, 39, 171, 383, 2],
    "params": {},
    "params_encode": {
      "text": ["a", "weirdly", "test"],
      "boxes": [[423, 237, 440, 251], [427, 272, 441, 287], [419, 115, 437, 129]]
    }
  },
  {
    "TestClass": "LongformerTokenizationTest",
    "sp_base": false,
    "sequence": "lower newer",
    "tokens": ["l", "o", "w", "er", "Ġ", "n", "e", "w", "er"],
    "encoded": [0, 1, 2, 15, 10, 9, 3, 2, 15],
    "encoded_special": [20, 0, 1, 2, 15, 10, 9, 3, 2, 15, 21],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "LxmertTokenizationTest",
    "sp_base": false,
    "sequence": "I was born in 92000, and this is falsé.",
    "tokens": ["[UNK]", "[UNK]", "[UNK]", "[UNK]", "[UNK]", ",", "[UNK]", "[UNK]", "[UNK]", "[UNK]", "[UNK]"],
    "encoded": [0, 0, 0, 0, 0, 10, 0, 0, 0, 0, 0],
    "encoded_special": [1, 0, 0, 0, 0, 0, 10, 0, 0, 0, 0, 0, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "MBart50TokenizationTest",
    "sp_base": true,
    "sequence": "the I to a and of in was it me that be he for with my not is s you",
    "tokens": ["▁the", "▁I", "▁to", "▁a", "▁and", "▁of", "▁in", "▁was", "▁it", "▁me", "▁that", "▁be", "▁he", "▁for", "▁with", "▁my", "▁not", "▁is", "▁s", "▁you"],
    "encoded": [6, 9, 10, 11, 13, 14, 20, 22, 26, 32, 35, 37, 38, 41, 42, 44, 45, 47, 48, 49],
    "encoded_special": [1004, 6, 9, 10, 11, 13, 14, 20, 22, 26, 32, 35, 37, 38, 41, 42, 44, 45, 47, 48, 49, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "MBartTokenizationTest",
    "sp_base": true,
    "sequence": "the I to a and of in was it me that be he for with my not is s you",
    "tokens": ["▁the", "▁I", "▁to", "▁a", "▁and", "▁of", "▁in", "▁was", "▁it", "▁me", "▁that", "▁be", "▁he", "▁for", "▁with", "▁my", "▁not", "▁is", "▁s", "▁you"],
    "encoded": [6, 9, 10, 11, 13, 14, 20, 22, 26, 32, 35, 37, 38, 41, 42, 44, 45, 47, 48, 49],
    "encoded_special": [6, 9, 10, 11, 13, 14, 20, 22, 26, 32, 35, 37, 38, 41, 42, 44, 45, 47, 48, 49, 2, 1004],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "MPNetTokenizerTest",
    "sp_base": false,
    "sequence": "UNwantéd,running",
    "tokens": ["un", "##want", "##ed", ",", "runn", "##ing"],
    "encoded": [9, 6, 7, 12, 10, 11],
    "encoded_special": [15, 9, 6, 7, 12, 10, 11, 16],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "MarkupLMTokenizationTest",
    "sp_base": false,
    "sequence": false,
    "tokens": false,
    "encoded": [21, 3, 0, 0, 1, 2, 1, 4, 0, 8],
    "encoded_special": [22, 21, 3, 0, 0, 1, 2, 1, 4, 0, 8, 23],
    "params": {},
    "params_encode": {
      "text": ["hello", "world"],
      "xpaths": [",/html/body/div/li[1]/div/span", ",/html/body/div/li[1]/div/span"]
    }
  },
  {
    "TestClass": "MobileBERTTokenizationTest",
    "sp_base": false,
    "sequence": "UNwantéd,running",
    "tokens": ["un", "##want", "##ed", ",", "runn", "##ing"],
    "encoded": [9, 6, 7, 12, 10, 11],
    "encoded_special": [1, 9, 6, 7, 12, 10, 11, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "NllbTokenizationTest",
    "sp_base": true,
    "sequence": "the I to a and of in was it me that be he for with my not is s you",
    "tokens": ["▁the", "▁I", "▁to", "▁a", "▁and", "▁of", "▁in", "▁was", "▁it", "▁me", "▁that", "▁be", "▁he", "▁for", "▁with", "▁my", "▁not", "▁is", "▁s", "▁you"],
    "encoded": [6, 9, 10, 11, 13, 14, 20, 22, 26, 32, 35, 37, 38, 41, 42, 44, 45, 47, 48, 49],
    "encoded_special": [1048, 6, 9, 10, 11, 13, 14, 20, 22, 26, 32, 35, 37, 38, 41, 42, 44, 45, 47, 48, 49, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "OpenAIGPTTokenizationTest",
    "sp_base": false,
    "sequence": "lower newer",
    "tokens": ["low", "er</w>", "n", "e", "w", "er</w>"],
    "encoded": [14, 15, 9, 3, 2, 15],
    "encoded_special": [14, 15, 9, 3, 2, 15],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "PegasusTokenizationTest",
    "sp_base": true,
    "sequence": "This is a test",
    "tokens": ["▁This", "▁is", "▁a", "▁", "t", "est"],
    "encoded": [391, 149, 112, 106, 115, 493],
    "encoded_special": [391, 149, 112, 106, 115, 493, 1],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "Qwen2TokenizationTest",
    "sp_base": false,
    "sequence": "lower lower newer 010;}\r\n<|endoftext|>ϓ",
    "tokens": ["l", "o", "w", "er", "Ġlow", "er", "Ġ", "n", "e", "w", "er", "Ġ", "0", "1", "0", ";}", "č", "Ċ", "<|endoftext|>", "Ïĵ"],
    "encoded": [75, 78, 86, 260, 259, 260, 220, 77, 68, 86, 260, 220, 15, 16, 15, 265, 201, 198, 270, 267],
    "encoded_special": [75, 78, 86, 260, 259, 260, 220, 77, 68, 86, 260, 220, 15, 16, 15, 265, 201, 198, 270, 267],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "ReformerTokenizationTest",
    "sp_base": false,
    "sequence": "I was born in 92000, and this is falsé.",
    "tokens": ["▁I", "▁was", "▁b", "or", "n", "▁in", "▁", "9", "2", "0", "0", "0", ",", "▁and", "▁this", "▁is", "▁f", "al", "s", "é", "."],
    "encoded": [8, 21, 84, 55, 24, 19, 7, 0, 602, 347, 347, 347, 3, 12, 66, 46, 72, 80, 6, 0, 4],
    "encoded_special": [8, 21, 84, 55, 24, 19, 7, 0, 602, 347, 347, 347, 3, 12, 66, 46, 72, 80, 6, 0, 4],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "RemBertTokenizationTest",
    "sp_base": true,
    "sequence": "this is a test",
    "tokens": ["▁this", "▁is", "▁a", "▁t", "est"],
    "encoded": [66, 46, 10, 170, 382],
    "encoded_special": [1000, 66, 46, 10, 170, 382, 1001],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "RobertaTokenizationTest",
    "sp_base": false,
    "sequence": "lower newer",
    "tokens": ["l", "o", "w", "er", "Ġ", "n", "e", "w", "er"],
    "encoded": [0, 1, 2, 15, 10, 9, 3, 2, 15],
    "encoded_special": [20, 0, 1, 2, 15, 10, 9, 3, 2, 15, 21],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "SeamlessM4TTokenizationTest",
    "sp_base": true,
    "sequence": "the I to a and of in was it me that be he for with my not is s you",
    "tokens": ["▁the", "▁I", "▁to", "▁a", "▁and", "▁of", "▁in", "▁was", "▁it", "▁me", "▁that", "▁be", "▁he", "▁for", "▁with", "▁my", "▁not", "▁is", "▁s", "▁you"],
    "encoded": [6, 9, 10, 11, 13, 14, 20, 22, 26, 32, 35, 37, 38, 41, 42, 44, 45, 47, 48, 49],
    "encoded_special": [3, 1, 6, 9, 10, 11, 13, 14, 20, 22, 26, 32, 35, 37, 38, 41, 42, 44, 45, 47, 48, 49, 3],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "SplinterTokenizationTest",
    "sp_base": false,
    "sequence": "I need to test this rigor",
    "tokens": ["[UNK]", "[UNK]", "[UNK]", "test", "this", "rigor"],
    "encoded": [3, 10, 10, 10, 16, 13, 21, 1],
    "encoded_special": NaN,
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "SqueezeBertTokenizationTest",
    "sp_base": false,
    "sequence": "UNwantéd,running",
    "tokens": ["un", "##want", "##ed", ",", "runn", "##ing"],
    "encoded": [9, 6, 7, 12, 10, 11],
    "encoded_special": [1, 9, 6, 7, 12, 10, 11, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "T5TokenizationTest",
    "sp_base": false,
    "sequence": "I was born in 92000, and this is falsé.",
    "tokens": ["▁I", "▁was", "▁b", "or", "n", "▁in", "▁", "9", "2", "0", "0", "0", ",", "▁and", "▁this", "▁is", "▁f", "al", "s", "é", "."],
    "encoded": [8, 21, 84, 55, 24, 19, 7, 0, 602, 347, 347, 347, 3, 12, 66, 46, 72, 80, 6, 0, 4],
    "encoded_special": [8, 21, 84, 55, 24, 19, 7, 0, 602, 347, 347, 347, 3, 12, 66, 46, 72, 80, 6, 0, 4, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "TestTokenizationBart",
    "sp_base": false,
    "sequence": "lower newer",
    "tokens": ["l", "o", "w", "er", "Ġ", "n", "e", "w", "er"],
    "encoded": [0, 1, 2, 15, 10, 9, 3, 2, 15],
    "encoded_special": [20, 0, 1, 2, 15, 10, 9, 3, 2, 15, 21],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "TestTokenizationLED",
    "sp_base": false,
    "sequence": "lower newer",
    "tokens": ["l", "o", "w", "er", "Ġ", "n", "e", "w", "er"],
    "encoded": [0, 1, 2, 15, 10, 9, 3, 2, 15],
    "encoded_special": [20, 0, 1, 2, 15, 10, 9, 3, 2, 15, 21],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "TestTokenizationMvp",
    "sp_base": false,
    "sequence": "lower newer",
    "tokens": ["l", "o", "w", "er", "Ġ", "n", "e", "w", "er"],
    "encoded": [0, 1, 2, 15, 10, 9, 3, 2, 15],
    "encoded_special": [20,
2062
+ 0,
2063
+ 1,
2064
+ 2,
2065
+ 15,
2066
+ 10,
2067
+ 9,
2068
+ 3,
2069
+ 2,
2070
+ 15,
2071
+ 21
2072
+ ],
2073
+ "params": {},
2074
+ "params_encode": {}
2075
+ },
2076
+ {
2077
+ "TestClass": "UdopTokenizationTest",
2078
+ "sp_base": true,
2079
+ "sequence": false,
2080
+ "tokens": false,
2081
+ "encoded": [
2082
+ 10,
2083
+ 112,
2084
+ 158,
2085
+ 16,
2086
+ 38,
2087
+ 170,
2088
+ 382,
2089
+ 37,
2090
+ 86,
2091
+ 20
2092
+ ],
2093
+ "encoded_special": [
2094
+ 10,
2095
+ 112,
2096
+ 158,
2097
+ 16,
2098
+ 38,
2099
+ 170,
2100
+ 382,
2101
+ 37,
2102
+ 86,
2103
+ 20,
2104
+ 2
2105
+ ],
2106
+ "params": {},
2107
+ "params_encode": {
2108
+ "text": [
2109
+ "a",
2110
+ "weirdly",
2111
+ "test",
2112
+ "hello"
2113
+ ],
2114
+ "boxes": [
2115
+ [
2116
+ 423,
2117
+ 237,
2118
+ 440,
2119
+ 251
2120
+ ],
2121
+ [
2122
+ 427,
2123
+ 272,
2124
+ 441,
2125
+ 287
2126
+ ],
2127
+ [
2128
+ 419,
2129
+ 115,
2130
+ 437,
2131
+ 129
2132
+ ],
2133
+ [
2134
+ 961,
2135
+ 885,
2136
+ 992,
2137
+ 912
2138
+ ]
2139
+ ]
2140
+ }
2141
+ },
2142
+ {
2143
+ "TestClass": "WhisperTokenizerTest",
2144
+ "sp_base": false,
2145
+ "sequence": "A BCDEFGHIJKLMNOPQRST",
2146
+ "tokens": [
2147
+ "A",
2148
+ "ĠBC",
2149
+ "DE",
2150
+ "F",
2151
+ "GH",
2152
+ "I",
2153
+ "J",
2154
+ "K",
2155
+ "L",
2156
+ "M",
2157
+ "N",
2158
+ "OP",
2159
+ "Q",
2160
+ "R",
2161
+ "ST"
2162
+ ],
2163
+ "encoded": [
2164
+ 32,
2165
+ 14359,
2166
+ 22296,
2167
+ 37,
2168
+ 4269,
2169
+ 40,
2170
+ 41,
2171
+ 42,
2172
+ 43,
2173
+ 44,
2174
+ 45,
2175
+ 12059,
2176
+ 48,
2177
+ 49,
2178
+ 6840
2179
+ ],
2180
+ "encoded_special": [
2181
+ 50258,
2182
+ 50363,
2183
+ 32,
2184
+ 14359,
2185
+ 22296,
2186
+ 37,
2187
+ 4269,
2188
+ 40,
2189
+ 41,
2190
+ 42,
2191
+ 43,
2192
+ 44,
2193
+ 45,
2194
+ 12059,
2195
+ 48,
2196
+ 49,
2197
+ 6840,
2198
+ 50257
2199
+ ],
2200
+ "params": {},
2201
+ "params_encode": {}
2202
+ },
2203
+ {
2204
+ "TestClass": "XGLMTokenizationTest",
2205
+ "sp_base": false,
2206
+ "sequence": "I was born in 92000, and this is falsé.",
2207
+ "tokens": [
2208
+ "▁I",
2209
+ "▁was",
2210
+ "▁b",
2211
+ "or",
2212
+ "n",
2213
+ "▁in",
2214
+ "▁",
2215
+ "9",
2216
+ "2",
2217
+ "0",
2218
+ "0",
2219
+ "0",
2220
+ ",",
2221
+ "▁and",
2222
+ "▁this",
2223
+ "▁is",
2224
+ "▁f",
2225
+ "al",
2226
+ "s",
2227
+ "é",
2228
+ "."
2229
+ ],
2230
+ "encoded": [
2231
+ 9,
2232
+ 22,
2233
+ 85,
2234
+ 56,
2235
+ 25,
2236
+ 20,
2237
+ 8,
2238
+ 3,
2239
+ 603,
2240
+ 348,
2241
+ 348,
2242
+ 348,
2243
+ 4,
2244
+ 13,
2245
+ 67,
2246
+ 47,
2247
+ 73,
2248
+ 81,
2249
+ 7,
2250
+ 3,
2251
+ 5
2252
+ ],
2253
+ "encoded_special": [
2254
+ 2,
2255
+ 9,
2256
+ 22,
2257
+ 85,
2258
+ 56,
2259
+ 25,
2260
+ 20,
2261
+ 8,
2262
+ 3,
2263
+ 603,
2264
+ 348,
2265
+ 348,
2266
+ 348,
2267
+ 4,
2268
+ 13,
2269
+ 67,
2270
+ 47,
2271
+ 73,
2272
+ 81,
2273
+ 7,
2274
+ 3,
2275
+ 5
2276
+ ],
2277
+ "params": {},
2278
+ "params_encode": {}
2279
+ },
2280
+ {
2281
+ "TestClass": "XLMRobertaTokenizationTest",
2282
+ "sp_base": false,
2283
+ "sequence": "I was born in 92000, and this is falsé.",
2284
+ "tokens": [
2285
+ "▁I",
2286
+ "▁was",
2287
+ "▁b",
2288
+ "or",
2289
+ "n",
2290
+ "▁in",
2291
+ "▁",
2292
+ "9",
2293
+ "2",
2294
+ "0",
2295
+ "0",
2296
+ "0",
2297
+ ",",
2298
+ "▁and",
2299
+ "▁this",
2300
+ "▁is",
2301
+ "▁f",
2302
+ "al",
2303
+ "s",
2304
+ "é",
2305
+ "."
2306
+ ],
2307
+ "encoded": [
2308
+ 9,
2309
+ 22,
2310
+ 85,
2311
+ 56,
2312
+ 25,
2313
+ 20,
2314
+ 8,
2315
+ 3,
2316
+ 603,
2317
+ 348,
2318
+ 348,
2319
+ 348,
2320
+ 4,
2321
+ 13,
2322
+ 67,
2323
+ 47,
2324
+ 73,
2325
+ 81,
2326
+ 7,
2327
+ 3,
2328
+ 5
2329
+ ],
2330
+ "encoded_special": [
2331
+ 0,
2332
+ 9,
2333
+ 22,
2334
+ 85,
2335
+ 56,
2336
+ 25,
2337
+ 20,
2338
+ 8,
2339
+ 3,
2340
+ 603,
2341
+ 348,
2342
+ 348,
2343
+ 348,
2344
+ 4,
2345
+ 13,
2346
+ 67,
2347
+ 47,
2348
+ 73,
2349
+ 81,
2350
+ 7,
2351
+ 3,
2352
+ 5,
2353
+ 2
2354
+ ],
2355
+ "params": {},
2356
+ "params_encode": {}
2357
+ },
2358
+ {
2359
+ "TestClass": "XLNetTokenizationTest",
2360
+ "sp_base": true,
2361
+ "sequence": "the I to a and of in was it me that be he for with my not is s you",
2362
+ "tokens": [
2363
+ "▁the",
2364
+ "▁I",
2365
+ "▁to",
2366
+ "▁a",
2367
+ "▁and",
2368
+ "▁of",
2369
+ "▁in",
2370
+ "▁was",
2371
+ "▁it",
2372
+ "▁me",
2373
+ "▁that",
2374
+ "▁be",
2375
+ "▁he",
2376
+ "▁for",
2377
+ "▁with",
2378
+ "▁my",
2379
+ "▁not",
2380
+ "▁is",
2381
+ "▁s",
2382
+ "▁you"
2383
+ ],
2384
+ "encoded": [
2385
+ 5,
2386
+ 8,
2387
+ 9,
2388
+ 10,
2389
+ 12,
2390
+ 13,
2391
+ 19,
2392
+ 21,
2393
+ 25,
2394
+ 31,
2395
+ 34,
2396
+ 36,
2397
+ 37,
2398
+ 40,
2399
+ 41,
2400
+ 43,
2401
+ 44,
2402
+ 46,
2403
+ 47,
2404
+ 48
2405
+ ],
2406
+ "encoded_special": [
2407
+ 5,
2408
+ 8,
2409
+ 9,
2410
+ 10,
2411
+ 12,
2412
+ 13,
2413
+ 19,
2414
+ 21,
2415
+ 25,
2416
+ 31,
2417
+ 34,
2418
+ 36,
2419
+ 37,
2420
+ 40,
2421
+ 41,
2422
+ 43,
2423
+ 44,
2424
+ 46,
2425
+ 47,
2426
+ 48,
2427
+ 1000,
2428
+ 1002
2429
+ ],
2430
+ "params": {},
2431
+ "params_encode": {}
2432
+ }
2433
+ ]