TJURL-Lab committed on
Commit 995e699 · verified · 1 Parent(s): 342d861

Upload 101 files

This view is limited to 50 files because it contains too many changes.
Files changed (50)
  1. temporal_perception/core/data.json +1118 -0
  2. temporal_perception/core/images/0.jpg +3 -0
  3. temporal_perception/core/images/1.jpg +3 -0
  4. temporal_perception/core/images/10.jpg +3 -0
  5. temporal_perception/core/images/11.jpg +3 -0
  6. temporal_perception/core/images/12.jpg +3 -0
  7. temporal_perception/core/images/13.jpg +3 -0
  8. temporal_perception/core/images/14.jpg +3 -0
  9. temporal_perception/core/images/15.jpg +3 -0
  10. temporal_perception/core/images/16.jpg +3 -0
  11. temporal_perception/core/images/17.jpg +3 -0
  12. temporal_perception/core/images/18.jpg +3 -0
  13. temporal_perception/core/images/19.jpg +3 -0
  14. temporal_perception/core/images/2.jpg +3 -0
  15. temporal_perception/core/images/20.jpg +3 -0
  16. temporal_perception/core/images/21.jpg +3 -0
  17. temporal_perception/core/images/22.jpg +3 -0
  18. temporal_perception/core/images/23.jpg +3 -0
  19. temporal_perception/core/images/24.jpg +3 -0
  20. temporal_perception/core/images/25.jpg +3 -0
  21. temporal_perception/core/images/26.jpg +3 -0
  22. temporal_perception/core/images/27.jpg +3 -0
  23. temporal_perception/core/images/28.jpg +3 -0
  24. temporal_perception/core/images/29.jpg +3 -0
  25. temporal_perception/core/images/3.jpg +3 -0
  26. temporal_perception/core/images/30.jpg +3 -0
  27. temporal_perception/core/images/31.jpg +3 -0
  28. temporal_perception/core/images/32.jpg +3 -0
  29. temporal_perception/core/images/33.jpg +3 -0
  30. temporal_perception/core/images/34.jpg +3 -0
  31. temporal_perception/core/images/35.jpg +3 -0
  32. temporal_perception/core/images/36.jpg +3 -0
  33. temporal_perception/core/images/37.jpg +3 -0
  34. temporal_perception/core/images/38.jpg +3 -0
  35. temporal_perception/core/images/39.jpg +3 -0
  36. temporal_perception/core/images/4.jpg +3 -0
  37. temporal_perception/core/images/40.jpg +3 -0
  38. temporal_perception/core/images/41.jpg +3 -0
  39. temporal_perception/core/images/42.jpg +3 -0
  40. temporal_perception/core/images/43.jpg +3 -0
  41. temporal_perception/core/images/44.jpg +3 -0
  42. temporal_perception/core/images/45.jpg +3 -0
  43. temporal_perception/core/images/46.jpg +3 -0
  44. temporal_perception/core/images/47.jpg +3 -0
  45. temporal_perception/core/images/48.jpg +3 -0
  46. temporal_perception/core/images/49.jpg +3 -0
  47. temporal_perception/core/images/5.jpg +3 -0
  48. temporal_perception/core/images/50.jpg +3 -0
  49. temporal_perception/core/images/51.jpg +3 -0
  50. temporal_perception/core/images/52.jpg +3 -0
temporal_perception/core/data.json ADDED
@@ -0,0 +1,1118 @@
+ {
+   "metadata": {
+     "dataset": "temporal_perception",
+     "split": "core",
+     "num_sample": 100,
+     "task_instruction": [
+       "",
+       "",
+       "",
+       "",
+       "",
+       ""
+     ],
+     "question_type": "open-ended"
+   },
+   "data": [
+     {
+       "sample_id": 0,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What will I do?",
+         "images_path": [
+           "0.jpg"
+         ]
+       },
+       "response": "Play baseball."
+     },
+     {
+       "sample_id": 1,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What will I do?",
+         "images_path": [
+           "1.jpg"
+         ]
+       },
+       "response": "Cut the grass."
+     },
+     {
+       "sample_id": 2,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "According to the thing I am holding in my hands, which direction will the car go, left or right?",
+         "images_path": [
+           "2.jpg"
+         ]
+       },
+       "response": "Left."
+     },
+     {
+       "sample_id": 3,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Which direction will I walk, upstairs or downstairs?",
+         "images_path": [
+           "3.jpg"
+         ]
+       },
+       "response": "Downstairs."
+     },
+     {
+       "sample_id": 4,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What will I do?",
+         "images_path": [
+           "4.jpg"
+         ]
+       },
+       "response": "Open the cabinet."
+     },
+     {
+       "sample_id": 5,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What will I do?",
+         "images_path": [
+           "5.jpg"
+         ]
+       },
+       "response": "Take out the mushrooms."
+     },
+     {
+       "sample_id": 6,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What will I do?",
+         "images_path": [
+           "6.jpg"
+         ]
+       },
+       "response": "Cook."
+     },
+     {
+       "sample_id": 7,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to do?",
+         "images_path": [
+           "7.jpg"
+         ]
+       },
+       "response": "Open the lid of the bucket."
+     },
+     {
+       "sample_id": 8,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Am I going upstairs or downstairs?",
+         "images_path": [
+           "8.jpg"
+         ]
+       },
+       "response": "Upstairs."
+     },
+     {
+       "sample_id": 9,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What will I put in the washing machine?",
+         "images_path": [
+           "9.jpg"
+         ]
+       },
+       "response": "Clothes."
+     },
+     {
+       "sample_id": 10,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What will I do next?",
+         "images_path": [
+           "10.jpg"
+         ]
+       },
+       "response": "Roll the dough."
+     },
+     {
+       "sample_id": 11,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What will I do next?",
+         "images_path": [
+           "11.jpg"
+         ]
+       },
+       "response": "Take out the cabbage."
+     },
+     {
+       "sample_id": 12,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What will I do next?",
+         "images_path": [
+           "12.jpg"
+         ]
+       },
+       "response": "Open the fridge."
+     },
+     {
+       "sample_id": 13,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What will I do next?",
+         "images_path": [
+           "13.jpg"
+         ]
+       },
+       "response": "Mop the floor."
+     },
+     {
+       "sample_id": 14,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What will I do next?",
+         "images_path": [
+           "14.jpg"
+         ]
+       },
+       "response": "Iron the shirt."
+     },
+     {
+       "sample_id": 15,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What will I do next?",
+         "images_path": [
+           "15.jpg"
+         ]
+       },
+       "response": "Cut plants."
+     },
+     {
+       "sample_id": 16,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What will I do next?",
+         "images_path": [
+           "16.jpg"
+         ]
+       },
+       "response": "Open the car door."
+     },
+     {
+       "sample_id": 17,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What will I do next?",
+         "images_path": [
+           "17.jpg"
+         ]
+       },
+       "response": "Haircut."
+     },
+     {
+       "sample_id": 18,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What will I do next?",
+         "images_path": [
+           "18.jpg"
+         ]
+       },
+       "response": "Paint the board."
+     },
+     {
+       "sample_id": 19,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What will I do with the bag?",
+         "images_path": [
+           "19.jpg"
+         ]
+       },
+       "response": "Fasten it."
+     },
+     {
+       "sample_id": 20,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What will I do next?",
+         "images_path": [
+           "20.jpg"
+         ]
+       },
+       "response": "Measure the board."
+     },
+     {
+       "sample_id": 21,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What will I do next?",
+         "images_path": [
+           "21.jpg"
+         ]
+       },
+       "response": "Wipe the cat."
+     },
+     {
+       "sample_id": 22,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What will I do next?",
+         "images_path": [
+           "22.jpg"
+         ]
+       },
+       "response": "Paint the wall."
+     },
+     {
+       "sample_id": 23,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What will I do next?",
+         "images_path": [
+           "23.jpg"
+         ]
+       },
+       "response": "Add water."
+     },
+     {
+       "sample_id": 24,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What will I do next?",
+         "images_path": [
+           "24.jpg"
+         ]
+       },
+       "response": "Serve rice."
+     },
+     {
+       "sample_id": 25,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What will I do next?",
+         "images_path": [
+           "25.jpg"
+         ]
+       },
+       "response": "Clean the desk."
+     },
+     {
+       "sample_id": 26,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What will I use next? The white radish or the dough?",
+         "images_path": [
+           "26.jpg"
+         ]
+       },
+       "response": "The white radish."
+     },
+     {
+       "sample_id": 27,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What will I do to the tire?",
+         "images_path": [
+           "27.jpg"
+         ]
+       },
+       "response": "Inflate the tire."
+     },
+     {
+       "sample_id": 28,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Will the thing in my hands be cut?",
+         "images_path": [
+           "28.jpg"
+         ]
+       },
+       "response": "Yes."
+     },
+     {
+       "sample_id": 29,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "How will the pepper in my hands be cut?",
+         "images_path": [
+           "29.jpg"
+         ]
+       },
+       "response": "Into two pieces."
+     },
+     {
+       "sample_id": 30,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What color will the thing I am painting be?",
+         "images_path": [
+           "30.jpg"
+         ]
+       },
+       "response": "Blue."
+     },
+     {
+       "sample_id": 31,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Which state will the thing that my right hand is touching be in? On or off?",
+         "images_path": [
+           "31.jpg"
+         ]
+       },
+       "response": "On."
+     },
+     {
+       "sample_id": 32,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "According to the thing I am controlling, will the water be boiled?",
+         "images_path": [
+           "32.jpg"
+         ]
+       },
+       "response": "Yes."
+     },
+     {
+       "sample_id": 33,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What will the device that I'm controlling turn the milk into?",
+         "images_path": [
+           "33.jpg"
+         ]
+       },
+       "response": "Milk foam."
+     },
+     {
+       "sample_id": 34,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to do?",
+         "images_path": [
+           "34.jpg"
+         ]
+       },
+       "response": "Cut a lemon from the branch."
+     },
+     {
+       "sample_id": 35,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Where will the thing I am holding in my left hand be hammered?",
+         "images_path": [
+           "35.jpg"
+         ]
+       },
+       "response": "Into the plank."
+     },
+     {
+       "sample_id": 36,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Will the meat held by the chopsticks in my left hand be placed in the pot or in a bowl?",
+         "images_path": [
+           "36.jpg"
+         ]
+       },
+       "response": "In the pot."
+     },
+     {
+       "sample_id": 37,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Am I going to add the thing I am holding in my left hand to the vegetables?",
+         "images_path": [
+           "37.jpg"
+         ]
+       },
+       "response": "Yes."
+     },
+     {
+       "sample_id": 38,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Will the thing I am holding in my left hand be lifted?",
+         "images_path": [
+           "38.jpg"
+         ]
+       },
+       "response": "No."
+     },
+     {
+       "sample_id": 39,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What will the thing I am holding in my left hand be full of?",
+         "images_path": [
+           "39.jpg"
+         ]
+       },
+       "response": "Water."
+     },
+     {
+       "sample_id": 40,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to do?",
+         "images_path": [
+           "40.jpg"
+         ]
+       },
+       "response": "Pour water."
+     },
+     {
+       "sample_id": 41,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to do?",
+         "images_path": [
+           "41.jpg"
+         ]
+       },
+       "response": "Peel the onion."
+     },
+     {
+       "sample_id": 42,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to do? Pour the noodles onto the plate or into the pot?",
+         "images_path": [
+           "42.jpg"
+         ]
+       },
+       "response": "Pour the noodles onto the plate."
+     },
+     {
+       "sample_id": 43,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to do?",
+         "images_path": [
+           "43.jpg"
+         ]
+       },
+       "response": "Wash clothes."
+     },
+     {
+       "sample_id": 44,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to pour into the small bowl? Soup or water?",
+         "images_path": [
+           "44.jpg"
+         ]
+       },
+       "response": "Soup."
+     },
+     {
+       "sample_id": 45,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Will the thing I am holding in my left hand be filled with water?",
+         "images_path": [
+           "45.jpg"
+         ]
+       },
+       "response": "Yes."
+     },
+     {
+       "sample_id": 46,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Where am I going to put the fruit? On the ground or in the basket?",
+         "images_path": [
+           "46.jpg"
+         ]
+       },
+       "response": "In the basket."
+     },
+     {
+       "sample_id": 47,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Will this piece of orange I am holding be eaten?",
+         "images_path": [
+           "47.jpg"
+         ]
+       },
+       "response": "Yes."
+     },
+     {
+       "sample_id": 48,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to do with the box? Open it or close it?",
+         "images_path": [
+           "48.jpg"
+         ]
+       },
+       "response": "Open it."
+     },
+     {
+       "sample_id": 49,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to do?",
+         "images_path": [
+           "49.jpg"
+         ]
+       },
+       "response": "Plant the tree."
+     },
+     {
+       "sample_id": 50,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Where is the thing I am holding more likely to end up? In the fridge or on the table?",
+         "images_path": [
+           "50.jpg"
+         ]
+       },
+       "response": "In the fridge."
+     },
+     {
+       "sample_id": 51,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Where will the thing in my right hand be next?",
+         "images_path": [
+           "51.jpg"
+         ]
+       },
+       "response": "Coffee machine."
+     },
+     {
+       "sample_id": 52,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Will the meat be mixed with the vegetables I am holding in my left hand?",
+         "images_path": [
+           "52.jpg"
+         ]
+       },
+       "response": "Yes."
+     },
+     {
+       "sample_id": 53,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What will I do?",
+         "images_path": [
+           "53.jpg"
+         ]
+       },
+       "response": "Clean the floor."
+     },
+     {
+       "sample_id": 54,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Am I going to put away these clothes?",
+         "images_path": [
+           "54.jpg"
+         ]
+       },
+       "response": "Yes."
+     },
+     {
+       "sample_id": 55,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Where will the thing in the bag I am holding be?",
+         "images_path": [
+           "55.jpg"
+         ]
+       },
+       "response": "In the washing machine."
+     },
+     {
+       "sample_id": 56,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to do?",
+         "images_path": [
+           "56.jpg"
+         ]
+       },
+       "response": "Stir the milk."
+     },
+     {
+       "sample_id": 57,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Will the thing I am holding in my hands be in the box or on the desk?",
+         "images_path": [
+           "57.jpg"
+         ]
+       },
+       "response": "On the desk."
+     },
+     {
+       "sample_id": 58,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to do?",
+         "images_path": [
+           "58.jpg"
+         ]
+       },
+       "response": "Pour the cement into the mixer."
+     },
+     {
+       "sample_id": 59,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Am I going to heat the food in the bowl?",
+         "images_path": [
+           "59.jpg"
+         ]
+       },
+       "response": "Yes."
+     },
+     {
+       "sample_id": 60,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What color am I going to paint the plank?",
+         "images_path": [
+           "60.jpg"
+         ]
+       },
+       "response": "White."
+     },
+     {
+       "sample_id": 61,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "How many pieces am I going to cut the wooden board into?",
+         "images_path": [
+           "61.jpg"
+         ]
+       },
+       "response": "Two."
+     },
+     {
+       "sample_id": 62,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Am I going to pour the drink into the pot?",
+         "images_path": [
+           "62.jpg"
+         ]
+       },
+       "response": "No."
+     },
+     {
+       "sample_id": 63,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What color am I going to paint the toy?",
+         "images_path": [
+           "63.jpg"
+         ]
+       },
+       "response": "Golden."
+     },
+     {
+       "sample_id": 64,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to do?",
+         "images_path": [
+           "64.jpg"
+         ]
+       },
+       "response": "Prune the leaves."
+     },
+     {
+       "sample_id": 65,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to do?",
+         "images_path": [
+           "65.jpg"
+         ]
+       },
+       "response": "Open the drawer."
+     },
+     {
+       "sample_id": 66,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to do?",
+         "images_path": [
+           "66.jpg"
+         ]
+       },
+       "response": "Remove the handle of the eggplant."
+     },
+     {
+       "sample_id": 67,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Will the grass on the ground in front of me be removed?",
+         "images_path": [
+           "67.jpg"
+         ]
+       },
+       "response": "Yes."
+     },
+     {
+       "sample_id": 68,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What will the water in the container that I am holding turn into?",
+         "images_path": [
+           "68.jpg"
+         ]
+       },
+       "response": "Ice."
+     },
+     {
+       "sample_id": 69,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Will the pants that I am ironing become hot or cold?",
+         "images_path": [
+           "69.jpg"
+         ]
+       },
+       "response": "Hot."
+     },
+     {
+       "sample_id": 70,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to do?",
+         "images_path": [
+           "70.jpg"
+         ]
+       },
+       "response": "Pour the ice into the cup."
+     },
+     {
+       "sample_id": 71,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Will the noodles in the pot closer to me become softer or harder?",
+         "images_path": [
+           "71.jpg"
+         ]
+       },
+       "response": "Softer."
+     },
+     {
+       "sample_id": 72,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Will the vegetables on the chopping board in my hand be put into the pot?",
+         "images_path": [
+           "72.jpg"
+         ]
+       },
+       "response": "Yes."
+     },
+     {
+       "sample_id": 73,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to do? Empty the pot or fill the pot?",
+         "images_path": [
+           "73.jpg"
+         ]
+       },
+       "response": "Empty the pot."
+     },
+     {
+       "sample_id": 74,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Will the braided fabric in my hand become larger or smaller?",
+         "images_path": [
+           "74.jpg"
+         ]
+       },
+       "response": "Larger."
+     },
+     {
+       "sample_id": 75,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to do?",
+         "images_path": [
+           "75.jpg"
+         ]
+       },
+       "response": "Put on my shoes."
+     },
+     {
+       "sample_id": 76,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Will the food on my fork be put into the pot or onto the plate?",
+         "images_path": [
+           "76.jpg"
+         ]
+       },
+       "response": "Plate."
+     },
+     {
+       "sample_id": 77,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Will the brick in my hand be put on the ground?",
+         "images_path": [
+           "77.jpg"
+         ]
+       },
+       "response": "No."
+     },
+     {
+       "sample_id": 78,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to do?",
+         "images_path": [
+           "78.jpg"
+         ]
+       },
+       "response": "Cut the paper."
+     },
+     {
+       "sample_id": 79,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Will the water in the cup I am holding increase or decrease?",
+         "images_path": [
+           "79.jpg"
+         ]
+       },
+       "response": "Increase."
+     },
+     {
+       "sample_id": 80,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Am I going to water the plants on my right?",
+         "images_path": [
+           "80.jpg"
+         ]
+       },
+       "response": "Yes."
+     },
+     {
+       "sample_id": 81,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to do to the thing my left hand is touching?",
+         "images_path": [
+           "81.jpg"
+         ]
+       },
+       "response": "Clean it."
+     },
+     {
+       "sample_id": 82,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to do?",
+         "images_path": [
+           "82.jpg"
+         ]
+       },
+       "response": "Wash the brush."
+     },
+     {
+       "sample_id": 83,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Will the painting my right hand is touching become drier or wetter?",
+         "images_path": [
+           "83.jpg"
+         ]
+       },
+       "response": "Drier."
+     },
+     {
+       "sample_id": 84,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Am I going to add more green to the painting?",
+         "images_path": [
+           "84.jpg"
+         ]
+       },
+       "response": "Yes."
+     },
+     {
+       "sample_id": 85,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I planning to clean?",
+         "images_path": [
+           "85.jpg"
+         ]
+       },
+       "response": "The car."
+     },
+     {
+       "sample_id": 86,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Will I shape the dough into cubes?",
+         "images_path": [
+           "86.jpg"
+         ]
+       },
+       "response": "No."
+     },
+     {
+       "sample_id": 87,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to do?",
+         "images_path": [
+           "87.jpg"
+         ]
+       },
+       "response": "Paint on the paper."
+     },
+     {
+       "sample_id": 88,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to do with the vegetables I am holding in my left hand?",
+         "images_path": [
+           "88.jpg"
+         ]
+       },
+       "response": "Cut them."
+     },
+     {
+       "sample_id": 89,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to do?",
+         "images_path": [
+           "89.jpg"
+         ]
+       },
+       "response": "Pick flowers."
+     },
+     {
+       "sample_id": 90,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Will I use the hammer to strike the object I am holding in my left hand?",
+         "images_path": [
+           "90.jpg"
+         ]
+       },
+       "response": "Yes."
+     },
+     {
+       "sample_id": 91,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to do?",
+         "images_path": [
+           "91.jpg"
+         ]
+       },
+       "response": "Lift the brown tool."
+     },
+     {
+       "sample_id": 92,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to use? The laptop or the iPad?",
+         "images_path": [
+           "92.jpg"
+         ]
+       },
+       "response": "The iPad."
+     },
+     {
+       "sample_id": 93,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to do?",
+         "images_path": [
+           "93.jpg"
+         ]
+       },
+       "response": "Connect the wire to the lamp."
+     },
+     {
+       "sample_id": 94,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to do?",
+         "images_path": [
+           "94.jpg"
+         ]
+       },
+       "response": "Mow the grass."
+     },
+     {
+       "sample_id": 95,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to do?",
+         "images_path": [
+           "95.jpg"
+         ]
+       },
+       "response": "Wash the car."
+     },
+     {
+       "sample_id": 96,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "What am I going to do?",
+         "images_path": [
+           "96.jpg"
+         ]
+       },
+       "response": "Open the tap."
+     },
+     {
+       "sample_id": 97,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Am I going to clean the board?",
+         "images_path": [
+           "97.jpg"
+         ]
+       },
+       "response": "Yes."
+     },
+     {
+       "sample_id": 98,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Am I going to put the thing in my left hand into the water or into the bucket?",
+         "images_path": [
+           "98.jpg"
+         ]
+       },
+       "response": "Into the bucket."
+     },
+     {
+       "sample_id": 99,
+       "task_instruction_id": 0,
+       "task_instance": {
+         "context": "Am I going to cut the branches?",
+         "images_path": [
+           "99.jpg"
+         ]
+       },
+       "response": "Yes."
+     }
+   ]
+ }
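The data.json above keeps the questions and answers inline and references images by relative name under images/. A minimal loading sketch, assuming only the layout shown in this commit; the helper name load_samples is ours, not part of the dataset:

```python
import json
from pathlib import Path

def load_samples(root):
    """Read temporal_perception/core/data.json and resolve image paths.

    Assumes the layout from this commit: data.json next to an images/
    directory, with each sample's "images_path" relative to images/.
    """
    root = Path(root)
    with open(root / "data.json", encoding="utf-8") as f:
        payload = json.load(f)
    samples = []
    for item in payload["data"]:
        inst = item["task_instance"]
        samples.append({
            "id": item["sample_id"],
            "question": inst["context"],
            "images": [root / "images" / p for p in inst["images_path"]],
            "answer": item["response"],
        })
    return samples
```

Called as `load_samples("temporal_perception/core")`, this would yield 100 dicts pairing each question with its image file and reference answer.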
temporal_perception/core/images/0.jpg ADDED

Git LFS Details

  • SHA256: cd68a266a8a44837e8362936b4da491bf5382c0529006c2a70dd87d8a275a834
  • Pointer size: 131 Bytes
  • Size of remote file: 181 kB
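Each image is stored via Git LFS, so the repository itself holds only a small pointer file recording the object's SHA-256 (the oid) and size; the listed SHA256 values let you check that a downloaded file is the object the commit recorded. A minimal integrity-check sketch (the function name sha256_of is ours):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file in 1 MiB chunks and return its hex SHA-256 digest,
    the same digest Git LFS records as the pointer's oid."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk_size):
            digest.update(block)
    return digest.hexdigest()
```

Comparing `sha256_of("images/0.jpg")` against the SHA256 shown above confirms the fetched file matches the LFS pointer.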
temporal_perception/core/images/1.jpg ADDED

Git LFS Details

  • SHA256: d9d7ffc30fb2931b5dbbb507164f0ea04b6a1c262f3ff259f5181040cbc5d35e
  • Pointer size: 131 Bytes
  • Size of remote file: 485 kB
temporal_perception/core/images/10.jpg ADDED

Git LFS Details

  • SHA256: 1efed1a96959c247dd31cf0641b5a4fd8e8f8be4bf141abe506d99cb56d2e326
  • Pointer size: 131 Bytes
  • Size of remote file: 111 kB
temporal_perception/core/images/11.jpg ADDED

Git LFS Details

  • SHA256: 6e4f172c1f8c9247b0711db50c92d040ffd3f944b5da36da80cf7b4ceaeddd6f
  • Pointer size: 131 Bytes
  • Size of remote file: 178 kB
temporal_perception/core/images/12.jpg ADDED

Git LFS Details

  • SHA256: 8578de3612d0c97f859387a3f79ccf19e7124ec769d23c2e8656a815b737f619
  • Pointer size: 131 Bytes
  • Size of remote file: 103 kB
temporal_perception/core/images/13.jpg ADDED

Git LFS Details

  • SHA256: 3e8ab802e09611cad9340c0f545098c7530fd631fc915bccaac5e2b0f9b0d62f
  • Pointer size: 130 Bytes
  • Size of remote file: 69.7 kB
temporal_perception/core/images/14.jpg ADDED

Git LFS Details

  • SHA256: 6a8ca89cd01fcfff2458cee2bf154f888a425f76d6461cc10f67e9190fa3ecbc
  • Pointer size: 131 Bytes
  • Size of remote file: 160 kB
temporal_perception/core/images/15.jpg ADDED

Git LFS Details

  • SHA256: 22c9f14cd86bff1a5b78635a172be7e7d4261a6bf46e1668265c548e18799a7a
  • Pointer size: 131 Bytes
  • Size of remote file: 314 kB
temporal_perception/core/images/16.jpg ADDED

Git LFS Details

  • SHA256: 5432eb36f6b67bdc81f190633ade3dcbad5351cb48c9ad2c9af198bdeb59236c
  • Pointer size: 131 Bytes
  • Size of remote file: 107 kB
temporal_perception/core/images/17.jpg ADDED

Git LFS Details

  • SHA256: 58c1e330701c20bedece8d2c5549a029222db91267a4e51285f9912832686496
  • Pointer size: 130 Bytes
  • Size of remote file: 99.8 kB
temporal_perception/core/images/18.jpg ADDED

Git LFS Details

  • SHA256: b9b8a7704c9b512832e995940e749461cebb92bf42b9b4df0eacf68e78e2c42c
  • Pointer size: 131 Bytes
  • Size of remote file: 269 kB
temporal_perception/core/images/19.jpg ADDED

Git LFS Details

  • SHA256: 8bf3843ac65db57593e33b1f94721981e0c61c19b68b5719ddff8de58e24047f
  • Pointer size: 131 Bytes
  • Size of remote file: 163 kB
temporal_perception/core/images/2.jpg ADDED

Git LFS Details

  • SHA256: c0d427f4bb1c87351eb9731ed6c00c5d69e12ed2f47de3303603d2e384d984c1
  • Pointer size: 131 Bytes
  • Size of remote file: 164 kB
temporal_perception/core/images/20.jpg ADDED

Git LFS Details

  • SHA256: c4cfb6327eb305884757aa061c8915289eac3d5cf38f583b8bc0798ecfbb66a9
  • Pointer size: 131 Bytes
  • Size of remote file: 117 kB
temporal_perception/core/images/21.jpg ADDED

Git LFS Details

  • SHA256: 0698b7cd0c5eac85f69b225d0295ff036723fedd0ef3567f2c584f16ab4ec8e4
  • Pointer size: 130 Bytes
  • Size of remote file: 88.5 kB
temporal_perception/core/images/22.jpg ADDED

Git LFS Details

  • SHA256: ba4c282f4e781ad344cdade86746dc4ae25108faf8671bae9562ad024b8e04d1
  • Pointer size: 130 Bytes
  • Size of remote file: 58.1 kB
temporal_perception/core/images/23.jpg ADDED

Git LFS Details

  • SHA256: 0d0ec0c622b3869b27c94b9d51547ecfbc5e399cce530a6a397902ef6d28e6b7
  • Pointer size: 131 Bytes
  • Size of remote file: 179 kB
temporal_perception/core/images/24.jpg ADDED

Git LFS Details

  • SHA256: e3255986594b754b4437c667e2f5a2da9847b09e0c030d130a03e3981e661ba2
  • Pointer size: 131 Bytes
  • Size of remote file: 143 kB
temporal_perception/core/images/25.jpg ADDED

Git LFS Details

  • SHA256: e429af0a7e45d864fde2397a6c73cb0565e9d5f8385ca89967b0e1e5d71c79ed
  • Pointer size: 131 Bytes
  • Size of remote file: 123 kB
temporal_perception/core/images/26.jpg ADDED

Git LFS Details

  • SHA256: 54297f0d67bdc79e04b6cb36b59daad5c5df286801570dd63a291133c848e445
  • Pointer size: 131 Bytes
  • Size of remote file: 158 kB
temporal_perception/core/images/27.jpg ADDED

Git LFS Details

  • SHA256: 2a12d3de328262a3f5ce16ef8a33911354664f5e5e44153cf625f198c0dfdc9d
  • Pointer size: 131 Bytes
  • Size of remote file: 151 kB
temporal_perception/core/images/28.jpg ADDED

Git LFS Details

  • SHA256: 2dd4df9158f420fa1d1cfb11e5a62e751f4a22e5e92a626e47f38713a0573f93
  • Pointer size: 131 Bytes
  • Size of remote file: 173 kB
temporal_perception/core/images/29.jpg ADDED

Git LFS Details

  • SHA256: f0a0defc79acd0e1d2b281a413a0ed7d6bd3c47f00f0c615cae00c60c58cc34f
  • Pointer size: 131 Bytes
  • Size of remote file: 203 kB
temporal_perception/core/images/3.jpg ADDED

Git LFS Details

  • SHA256: 421ed6dafd6c4a82dcab9a2af6e88395b99185e2e4a697eacdae444c4bccb67e
  • Pointer size: 130 Bytes
  • Size of remote file: 66.4 kB
temporal_perception/core/images/30.jpg ADDED

Git LFS Details

  • SHA256: dbaea3819dc0d65677fe513824bef4512a4f372057635ba9ae161f591b967f9e
  • Pointer size: 131 Bytes
  • Size of remote file: 230 kB
temporal_perception/core/images/31.jpg ADDED

Git LFS Details

  • SHA256: 210bbebfb583c76b54ed7070905219f5a9738aafebf9e48aa24e48c7b5ceb379
  • Pointer size: 130 Bytes
  • Size of remote file: 86.7 kB
temporal_perception/core/images/32.jpg ADDED

Git LFS Details

  • SHA256: cf6ce7766e3ffab487930ed402fe3d89d54823f836eb890eefe2ae6bc5c99756
  • Pointer size: 130 Bytes
  • Size of remote file: 80.1 kB
temporal_perception/core/images/33.jpg ADDED

Git LFS Details

  • SHA256: bbf77284bbf1e3ad2a373168b0349cd18917288f50c98341db8ee8fc1406a95c
  • Pointer size: 130 Bytes
  • Size of remote file: 72.1 kB
temporal_perception/core/images/34.jpg ADDED

Git LFS Details

  • SHA256: e77707f454c5b358d109d4adb23ea869e9427aee7ba49c8facce5e87d95f896c
  • Pointer size: 131 Bytes
  • Size of remote file: 142 kB
temporal_perception/core/images/35.jpg ADDED

Git LFS Details

  • SHA256: fd8129b59d4cff73e92e27353cd374befd6cb9b879429902d89417dc147d150b
  • Pointer size: 131 Bytes
  • Size of remote file: 489 kB
temporal_perception/core/images/36.jpg ADDED

Git LFS Details

  • SHA256: 33df52edd5b902b19119432ab794f835e2ad379664c2f3cd00db072187aadc26
  • Pointer size: 131 Bytes
  • Size of remote file: 167 kB
temporal_perception/core/images/37.jpg ADDED

Git LFS Details

  • SHA256: 48bb5127f767bdafae9ca31e035ca4499e81ea34a2e481d09991740f894d9c20
  • Pointer size: 131 Bytes
  • Size of remote file: 210 kB
temporal_perception/core/images/38.jpg ADDED

Git LFS Details

  • SHA256: c85a6ca848fb22f0dd8f9b516874015a27562a745756e7c9c28edb626d148683
  • Pointer size: 131 Bytes
  • Size of remote file: 164 kB
temporal_perception/core/images/39.jpg ADDED

Git LFS Details

  • SHA256: 30fd73b40094f9d2ab8ed663960281f782fc290ec47dfbff5869934284ddeefd
  • Pointer size: 131 Bytes
  • Size of remote file: 155 kB
temporal_perception/core/images/4.jpg ADDED

Git LFS Details

  • SHA256: 064ac7905df305dac3e14a8b1e627015636a63822d0bd0aa475066c57bde965c
  • Pointer size: 130 Bytes
  • Size of remote file: 76.5 kB
temporal_perception/core/images/40.jpg ADDED

Git LFS Details

  • SHA256: f580d9874844bd019dd3ba072f841e9d4542aa8091b62a31e5aca7e0b7b29d60
  • Pointer size: 131 Bytes
  • Size of remote file: 180 kB
temporal_perception/core/images/41.jpg ADDED

Git LFS Details

  • SHA256: 24d16ce3b2858095c12a4150165e15b29aab4c59bc32e2c68c55705174114eda
  • Pointer size: 131 Bytes
  • Size of remote file: 188 kB
temporal_perception/core/images/42.jpg ADDED

Git LFS Details

  • SHA256: 5130fbfd7a8d17b3160b7520c083cdb70a7124f5055b0aa3a94f1377bd837dfd
  • Pointer size: 131 Bytes
  • Size of remote file: 166 kB
temporal_perception/core/images/43.jpg ADDED

Git LFS Details

  • SHA256: 5a9189ac29c047ddb37d054568818e6a2809f03326dd379096eed3a86aa6199d
  • Pointer size: 131 Bytes
  • Size of remote file: 121 kB
temporal_perception/core/images/44.jpg ADDED

Git LFS Details

  • SHA256: 7c337fede349d0eaca0ec9e55df121c6531f0166663b3587ee6ce61b8867d4e9
  • Pointer size: 131 Bytes
  • Size of remote file: 206 kB
temporal_perception/core/images/45.jpg ADDED

Git LFS Details

  • SHA256: 742d4f0b18329870ec4aa462c215f2133b8d2c424f4bd3282260c19d23cb3eea
  • Pointer size: 131 Bytes
  • Size of remote file: 101 kB
temporal_perception/core/images/46.jpg ADDED

Git LFS Details

  • SHA256: 293ebba42172a5df3c4a93f4a89319a50522dc32cbbce0b8140cce44c1846c87
  • Pointer size: 131 Bytes
  • Size of remote file: 423 kB
temporal_perception/core/images/47.jpg ADDED

Git LFS Details

  • SHA256: 1388234c04c816d44626a37df2f1ab670c3d248fe9baeeb224570c032310cde9
  • Pointer size: 131 Bytes
  • Size of remote file: 162 kB
temporal_perception/core/images/48.jpg ADDED

Git LFS Details

  • SHA256: 0f717881bd9c0235d39ce57d947dd4a0d4058f91392bc1412c2a885d87770c80
  • Pointer size: 130 Bytes
  • Size of remote file: 87.9 kB
temporal_perception/core/images/49.jpg ADDED

Git LFS Details

  • SHA256: c29b1301b4040f90495c4faabd55107600357e46090a8199ad2f269cded150ac
  • Pointer size: 131 Bytes
  • Size of remote file: 607 kB
temporal_perception/core/images/5.jpg ADDED

Git LFS Details

  • SHA256: 7e8144970597f99080445842c56e35d6364e939eb4a329795cce585671bed9fc
  • Pointer size: 131 Bytes
  • Size of remote file: 142 kB
temporal_perception/core/images/50.jpg ADDED

Git LFS Details

  • SHA256: e387190c6669107a847ab05f1d9d6af33cfb775ae493272d3ab65865590a692c
  • Pointer size: 131 Bytes
  • Size of remote file: 119 kB
temporal_perception/core/images/51.jpg ADDED

Git LFS Details

  • SHA256: 05e7506eb4c71f941df405c7347f2faa52023742e6778ad05ecd15bcb8ea4b60
  • Pointer size: 131 Bytes
  • Size of remote file: 149 kB
temporal_perception/core/images/52.jpg ADDED

Git LFS Details

  • SHA256: d8a925b6248350fe7b68458e5f4ed8daa7a35d9e3b4daf0017abe51e5855df70
  • Pointer size: 130 Bytes
  • Size of remote file: 89 kB