Operational_Instructions/Live_Challenge_Day_and_Dry_Test_Instructions.md

2. Just before your session start time, your team leader will receive an email containing the Question file (500 questions) in jsonl format, where each line contains a Question json object – see Question [json schema](Question_File.json.schema) and [example](Question_File_Example.json)

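Since each line of the Question file is a standalone json object, it can be parsed line by line. A minimal sketch of such a loader is below; it assumes nothing about the object's fields (consult the Question json schema for those) and only checks that every non-empty line parses as JSON.

```python
import json

def load_questions(path):
    """Read a jsonl Question file: one JSON object per line."""
    questions = []
    with open(path, encoding="utf-8") as f:
        for line_no, line in enumerate(f, start=1):
            line = line.strip()
            if not line:
                continue  # tolerate a trailing blank line
            try:
                questions.append(json.loads(line))
            except json.JSONDecodeError as exc:
                raise ValueError(f"Line {line_no} is not valid JSON: {exc}")
    return questions
```

For the live session you would call `load_questions` on the emailed file and expect a list of 500 objects.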
3. You must generate and submit your Answer file in jsonl format, where each line contains an Answer json object – see Answer [json schema](Answer_File.json.schema) and [example](Answer_File_Example.json) – within 2 hours from your session start time\
3.1 Here is a simple [script](Live_Challenge_Day_and_Dry_Test_Instructions.md) for generating and verifying a valid Answer file\
**Remark:** Details about the exact Answer file upload procedure will be provided soon
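The Answer file follows the same one-json-object-per-line convention, so writing and sanity-checking it can be sketched as below. The field names used in the example data are placeholders, not the real schema – the authoritative field list is in the Answer json schema and example files referenced above.

```python
import json

def write_answer_file(path, answers):
    """Write Answer objects as jsonl: one compact JSON object per line."""
    with open(path, "w", encoding="utf-8") as f:
        for obj in answers:
            f.write(json.dumps(obj, ensure_ascii=False) + "\n")

def verify_answer_file(path, expected_count=500):
    """Check that every line parses as a JSON object and the count matches."""
    count = 0
    with open(path, encoding="utf-8") as f:
        for line_no, line in enumerate(f, start=1):
            obj = json.loads(line)  # raises if the line is not valid JSON
            if not isinstance(obj, dict):
                raise ValueError(f"Line {line_no}: expected a JSON object")
            count += 1
    if count != expected_count:
        raise ValueError(f"Expected {expected_count} answers, found {count}")
```

This checks only the jsonl framing and the answer count; validating each object against Answer_File.json.schema (e.g. with a JSON Schema validator) is a separate step.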

5. Please refer to the LiveRAG Challenge [Evaluation Guidelines](Evaluation_Guidelines_for_LiveRAG.md) for important information about the evaluation process

6. You must share your RAG system Git repository with us by end-of-day AoE May 13 via email at sigir2025-liverag-gen@tii.ae to enable result reproduction

7. Additional information:\
7.1 The automatic evaluation results leaderboard will be published on the HuggingFace Challenge page once the evaluation process is completed\
7.2 The top-performing teams will undergo manual evaluation to determine the final winners, who will be announced on July 17 during the SIGIR 2025 LiveRAG Workshop