Ubuntu committed
Commit e67edda · 1 Parent(s): fdd0e33

Initial commit

dataset_card.yaml ADDED
@@ -0,0 +1,26 @@
+ pretty_name: .NET Runtime
+ tags:
+ - raw-json
+ - parquet
+ - faiss-index
+ - text
+ - large-scale
+ - offline-processing
+ - github
+ - code
+ license: mit
+ language:
+ - en
+ size_categories:
+ - 100K<n<1M
+ task_categories:
+ - text-classification
+ - retrieval
+ - language-modeling
+ - code-generation
+ source_datasets: []
+ annotations_creators:
+ - machine-generated
+ - human-verified
+ dataset_description: |
+   This dataset contains processed and raw data from the dotnet/runtime GitHub repository, including PR metadata, commit diffs, and event timelines. It is suitable for fine-tuning language models, retrieval-augmented generation (RAG) pipelines, and other code-related tasks. The data is available as raw JSON, processed Parquet files, and a FAISS vector index.
scripts/README.md ADDED
@@ -0,0 +1,83 @@
+ # Scripts
+
+ This directory contains scripts for:
+ - **Model fine-tuning**: Generate datasets and fine-tune an LLM on GitHub PRs and commits.
+ - **RAG indexing**: Generate vector indexes (embeddings) from the repository's source files.
+ - **GitHub crawler**: Retrieve PR metadata, comments, reviews, and commit diffs from a public GitHub repository.
+
+ ## Directory structure
+
+ - `model/`: Python scripts for dataset generation, fine-tuning, and RAG vector indexing.
+ - `github/`: Node.js CLI tool for crawling GitHub repositories.
+ - `../data/`: Output directory for crawled data, generated datasets, and vector indexes.
+
+ ---
+
+ ## Dataset generation & RAG indexing
+
+ ### Overview
+
+ - **generate_dataset.py**: Processes raw PR metadata and commit diffs (from `../data/`) into prompt/completion training examples written as Parquet (`train.parquet`/`test.parquet`); see the sketch below for what one example looks like.
+ - **rag.py**: Builds a FAISS vector index (embeddings) over the repository's source files for retrieval-augmented generation.
+
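+ As a rough illustration (not part of the pipeline), each row pairs the PR context seen so far with the diff of the next commit. A minimal sketch for inspecting one example, assuming `generate_dataset.py` has already written `../data/dataset/train.parquet`:
+
+ ```python
+ # Inspect one generated training example (sketch; the path assumes the
+ # default output location used by generate_dataset.py).
+ import pandas as pd
+
+ df = pd.read_parquet("../data/dataset/train.parquet")
+ print(df.columns.tolist())              # prompt, completion, repo, pr_number, ...
+ print(df.iloc[0]["prompt"][:500])       # PR title/body, comments, reviews, last diff
+ print(df.iloc[0]["completion"][:500])   # target: the next commit's diff
+ ```
+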
+ ### Quick Start
+
+ 1. **Install dependencies**:
+    ```bash
+    pip3 install -r requirements.txt
+    ```
+ 2. **Prepare a `settings.json` file** in the `scripts/` directory (both the Python scripts and the crawler read it from there):
+    ```json
+    {
+      "system_instruction": "...",
+      "base_model": "microsoft/Phi-4-reasoning",
+      "max_context_size": 32768,
+      "embed_model": "all-MiniLM-L6-v2",
+      "repository": "https://github.com/dotnet/runtime"
+    }
+    ```
+ 3. **Data preparation & indexing**: run the dataset generator and RAG indexer from `model/` (a sketch for querying the resulting index follows below):
+    ```bash
+    python3 generate_dataset.py
+    python3 rag.py
+    ```
+
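+ Once `rag.py` has built the index, it can be queried. A minimal sketch, assuming the index was saved to `../data/rag/faiss_index` with the same embedding model as in `settings.json`; note that recent langchain versions require `allow_dangerous_deserialization=True` for pickle-backed local indexes (omit it on older versions):
+
+ ```python
+ # Query the FAISS index built by rag.py (sketch; wrapper mirrors the one
+ # used at index-build time so query and document embeddings match).
+ from langchain.embeddings.base import Embeddings
+ from langchain_community.vectorstores import FAISS
+ from sentence_transformers import SentenceTransformer
+
+ class STEmbeddings(Embeddings):
+     def __init__(self, name="all-MiniLM-L6-v2"):
+         self.model = SentenceTransformer(name)
+     def embed_documents(self, texts):
+         return self.model.encode(texts).tolist()
+     def embed_query(self, text):
+         return self.model.encode([text])[0].tolist()
+
+ store = FAISS.load_local("../data/rag/faiss_index", STEmbeddings(),
+                          allow_dangerous_deserialization=True)
+ for doc in store.similarity_search("How does the GC handle pinned objects?", k=3):
+     print(doc.metadata["source"])
+ ```
+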
+ ## GitHub Crawler
+
+ A CLI tool to retrieve PR metadata, comments, reviews, and commit diffs from a public GitHub repository. Progress is checkpointed to `state.json`, so an interrupted run resumes where it left off.
+
+ ### Quick Start
+
+ 1. **Install dependencies**:
+    ```bash
+    npm install
+    ```
+ 2. **Set your GitHub token**:
+    ```bash
+    export GITHUB_TOKEN=YOUR_TOKEN
+    ```
+ 3. **Run the crawler** from the `github/` directory; each PR snapshot lands in `../data/raw_sample/prs/` (see the sketch below):
+    ```bash
+    node main.js
+    ```
+
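+ Each PR is saved as a standalone JSON snapshot. A minimal sketch for inspecting one, assuming at least `pr-1.json` exists locally (substitute any PR number the crawler has fetched):
+
+ ```python
+ # Inspect one crawled PR snapshot (sketch; field names follow the
+ # GraphQL selection in main.js).
+ import json
+
+ with open("../data/raw_sample/prs/pr-1.json", encoding="utf-8") as f:
+     pr = json.load(f)
+
+ print(pr["title"], "-", pr["state"])
+ print("comments:", pr["comments"]["totalCount"])
+ print("commit shas:", [n["commit"]["oid"][:8] for n in pr["commits"]["nodes"]])
+ ```
+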
+ ## Expected Output
+
+ After running all three steps, you'll find:
+ ```
+ ../data/raw_sample/
+ ├── prs/
+ │   ├── pr-1.json
+ │   ├── pr-2.json
+ │   └── ...
+ └── diffs/
+     ├── <sha1>.diff
+     ├── <sha2>.diff
+     └── ...
+ ../data/dataset/
+     train.parquet
+     test.parquet
+ ../data/rag/
+     repo/            # local clone used for indexing
+     faiss_index/
+ ```
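+
+ For fine-tuning, the Parquet splits can be loaded directly with the `datasets` library (already listed in `model/requirements.txt`). A minimal sketch, assuming the default output paths above:
+
+ ```python
+ # Load the generated splits for fine-tuning (sketch).
+ from datasets import load_dataset
+
+ ds = load_dataset("parquet", data_files={
+     "train": "../data/dataset/train.parquet",
+     "test": "../data/dataset/test.parquet",
+ })
+ print(ds["train"].num_rows, ds["test"].num_rows)
+ ```
+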
scripts/github/main.js ADDED
@@ -0,0 +1,290 @@
+ import dotenv from 'dotenv';
+ import { Octokit } from '@octokit/rest';
+ import { graphql } from '@octokit/graphql';
+ import fs from 'fs/promises';
+ import fsSync from 'fs';
+ import path from 'path';
+ import pRetry from 'p-retry';
+ import { promisify } from 'util';
+ import cliProgress from 'cli-progress';
+ // JSON module import; newer Node versions may require `with { type: 'json' }`.
+ import settings from '../settings.json' assert { type: 'json' };
+ import fetch from 'node-fetch';
+
+ dotenv.config();
+ const TOKEN = process.env.GITHUB_TOKEN;
+ if (!TOKEN) throw new Error('Missing GITHUB_TOKEN');
+
+ const [owner, repo] = settings.repository.replace("https://github.com/", "").split('/');
+ // Relative to scripts/github/, i.e. scripts/data/raw_sample (see ../README.md).
+ const outDir = '../data/raw_sample';
+ // Crawl everything; lower this for a sample run.
+ const limit = Number.MAX_SAFE_INTEGER;
+
+ const octokit = new Octokit({ auth: TOKEN });
+ const gh = graphql.defaults({ headers: { authorization: `token ${TOKEN}` } });
+ const sleep = promisify(setTimeout);
+
+ function log(level, msg) {
+   console.log(`[${new Date().toISOString()}] [${level}] ${msg}`);
+ }
+
+ async function ensureDir(dir) {
+   await fs.mkdir(dir, { recursive: true });
+ }
+
+ // Returns the GraphQL rate limit status
+ async function getGraphqlRateLimit() {
+   const query = `query { rateLimit { limit cost remaining resetAt nodeCount } }`;
+   const resp = await gh(query);
+   const rl = resp.rateLimit;
+   return {
+     remaining: rl.remaining,
+     resetAt: new Date(rl.resetAt).getTime(),
+     window: new Date(rl.resetAt).getTime() - Date.now(),
+     cost: rl.cost || 1
+   };
+ }
+
+ // Returns the REST rate limit status
+ async function getRestRateLimit() {
+   const { data } = await octokit.rest.rateLimit.get();
+   const core = data.resources.core;
+   return {
+     remaining: core.remaining,
+     resetAt: core.reset * 1000,
+     window: core.reset * 1000 - Date.now()
+   };
+ }
+
+ function msToHMS(ms) {
+   const s = Math.ceil(ms / 1000);
+   const h = Math.floor(s / 3600);
+   const m = Math.floor((s % 3600) / 60);
+   const sec = s % 60;
+   return `${h}h ${m}m ${sec}s`;
+ }
+
+ // Blocks until the relevant rate limit window has enough budget left.
+ async function waitForRateLimitReset(mode = 'rest', expectedCost = 1) {
+   while (true) {
+     const rl = mode === 'graphql' ? await getGraphqlRateLimit() : await getRestRateLimit();
+     if (rl.remaining >= expectedCost) return;
+     const ms = Math.max(rl.resetAt - Date.now(), 0);
+     log('WARN', `Rate limit hit (${mode.toUpperCase()}); sleeping ${msToHMS(ms)}`);
+     await sleep(ms + 1000);
+   }
+ }
+
+ async function withSmartRetry(fn, mode = 'rest', expectedCost = 1) {
+   return pRetry(async () => {
+     await waitForRateLimitReset(mode, expectedCost);
+     return fn();
+   }, {
+     retries: 5,
+     factor: 2,
+     minTimeout: 1000,
+     maxTimeout: 30000,
+     onFailedAttempt: e => log('WARN', `Attempt #${e.attemptNumber} failed: ${e.status || e.code || e.message}`),
+     // p-retry >= 6.2 expects `shouldRetry` (a `retry` key would be ignored):
+     // only retry rate-limit hits, server errors, and transient network failures.
+     shouldRetry: e =>
+       (/rate limit/i.test(e.message)) ||
+       (e.status >= 500) ||
+       ['ECONNRESET', 'ENOTFOUND'].includes(e.code)
+   });
+ }
+
+ async function callGh(query, vars) {
+   try {
+     return await gh(query, vars);
+   } catch (err) {
+     if (/rate limit/i.test(err.message)) {
+       await waitForRateLimitReset('graphql');
+       return await gh(query, vars);
+     }
+     throw err;
+   }
+ }
+
+ // Crawl progress is checkpointed so an interrupted run can resume.
+ const STATE_FILE = path.join(outDir, 'state.json');
+ let state = { cursor: null, processedPRs: 0, prsCompleted: [], prsPending: [] };
+ try { state = JSON.parse(fsSync.readFileSync(STATE_FILE, 'utf-8')); } catch {}
+ function saveState() {
+   fsSync.writeFileSync(STATE_FILE, JSON.stringify(state, null, 2));
+ }
+
+ // Pages through one connection (comments, reviewThreads, commits) of a PR.
+ async function fetchConnection(owner, repo, prNumber, field, selection) {
+   let cursor = null, nodes = [];
+   do {
+     const resp = await withSmartRetry(() =>
+       callGh(
+         `query($owner:String!,$repo:String!,$pr:Int!,$first:Int!,$after:String){
+           repository(owner:$owner,name:$repo){
+             pullRequest(number:$pr){
+               ${field}(first:$first,after:$after){
+                 pageInfo{ hasNextPage endCursor }
+                 nodes{ ${selection} }
+               }
+             }
+           }
+         }`,
+         { owner, repo, pr: prNumber, first: 100, after: cursor }
+       )
+     , 'graphql', 100);
+     const conn = resp.repository.pullRequest[field];
+     nodes.push(...conn.nodes);
+     cursor = conn.pageInfo.hasNextPage ? conn.pageInfo.endCursor : null;
+   } while (cursor);
+   return nodes;
+ }
+
+ async function fetchAllPRs(owner, repo, limit) {
+   // Initialize state only on the first run; a saved cursor means we resume.
+   if (state.totalPRs === undefined) {
+     const repoInfo = await callGh(
+       `query($owner:String!,$repo:String!){repository(owner:$owner,name:$repo){pullRequests{totalCount}}}`,
+       { owner, repo }
+     );
+     state.totalPRs = limit ? Math.min(limit, repoInfo.repository.pullRequests.totalCount) : repoInfo.repository.pullRequests.totalCount;
+     state.cursor = null;
+     state.processedPRs = 0;
+     state.prsCompleted = [];
+     state.prsPending = [];
+     saveState();
+   }
+
+   const bar = new cliProgress.SingleBar({
+     format: 'Fetching PRs |{bar}| {value}/{total} PRs',
+     hideCursor: true
+   }, cliProgress.Presets.shades_classic);
+   bar.start(limit ? Math.min(limit, state.totalPRs) : state.totalPRs, state.processedPRs);
+
+   while (state.processedPRs < limit && (state.cursor || state.processedPRs === 0)) {
+     const resp = await withSmartRetry(() =>
+       callGh(
+         `query($owner:String!,$repo:String!,$first:Int!,$after:String){
+           repository(owner:$owner,name:$repo){
+             pullRequests(first:$first,after:$after,orderBy:{field:CREATED_AT,direction:DESC}){
+               pageInfo{ hasNextPage endCursor }
+               nodes{
+                 number title body createdAt closedAt mergedAt state author{login}
+                 labels(first:10){ nodes{ name }}
+                 headRefName additions deletions changedFiles
+                 comments(first:100) { totalCount nodes { body createdAt } }
+                 reviewThreads(first:100) { totalCount nodes { comments(first:20) { nodes { path diffHunk body createdAt }}}}
+                 commits(first:100) { totalCount nodes { commit { oid message committedDate }}}
+               }
+             }
+           }
+         }`,
+         { owner, repo, first: 100, after: state.cursor }
+       )
+     , 'graphql', 100);
+
+     const conn = resp.repository.pullRequests;
+     for (const pr of conn.nodes) {
+       const prFile = path.join(outDir, 'prs', `pr-${pr.number}.json`);
+       if (!fsSync.existsSync(prFile)) {
+         await fs.writeFile(prFile, JSON.stringify(pr, null, 2));
+         // If any connection was truncated at the first page, queue the PR
+         // for a follow-up pass that pages through the rest.
+         if (
+           pr.comments.totalCount === pr.comments.nodes.length &&
+           pr.reviewThreads.totalCount === pr.reviewThreads.nodes.length &&
+           pr.commits.totalCount === pr.commits.nodes.length
+         ) {
+           state.prsCompleted.push(pr.number);
+         } else {
+           state.prsPending.push(pr.number);
+         }
+       }
+     }
+
+     state.processedPRs += conn.nodes.length;
+     state.cursor = conn.pageInfo.hasNextPage ? conn.pageInfo.endCursor : null;
+     saveState();
+     bar.update(state.processedPRs);
+   }
+   bar.stop();
+ }
+
+ // Completes PRs whose comments/reviews/commits were truncated at 100 nodes.
+ async function fetchMissingDetails(pr) {
+   const outFile = path.join(outDir, 'prs', `pr-${pr}.json`);
+   const prData = JSON.parse(await fs.readFile(outFile, 'utf-8'));
+
+   const commentCount = prData.comments?.totalCount || 0;
+   const reviewCount = prData.reviewThreads?.totalCount || 0;
+   const commitCount = prData.commits?.totalCount || 0;
+
+   if ((prData.comments?.nodes?.length || 0) < commentCount) {
+     prData.comments.nodes = await fetchConnection(owner, repo, pr, 'comments', 'body createdAt');
+   }
+   if ((prData.reviewThreads?.nodes?.length || 0) < reviewCount) {
+     prData.reviewThreads.nodes = await fetchConnection(owner, repo, pr, 'reviewThreads', 'comments(first:100) { nodes { path diffHunk body createdAt } }');
+   }
+   if ((prData.commits?.nodes?.length || 0) < commitCount) {
+     prData.commits.nodes = await fetchConnection(owner, repo, pr, 'commits', 'commit { oid message committedDate }');
+   }
+
+   await fs.writeFile(outFile, JSON.stringify(prData, null, 2));
+ }
+
+ // Fetches the raw diff for a commit from github.com (not the REST API,
+ // but throttled alongside it to stay polite).
+ async function fetchCommitDiff(owner, repo, sha) {
+   await waitForRateLimitReset('rest');
+   const url = `https://github.com/${owner}/${repo}/commit/${sha}.diff`;
+   const res = await withSmartRetry(() => fetch(url), 'rest');
+   if (!res.ok) throw new Error(`Failed to fetch diff: ${res.status} ${res.statusText}`);
+   return await res.text();
+ }
+
+ async function main() {
+   // Create output directories
+   await ensureDir(path.join(outDir, 'prs'));
+   await ensureDir(path.join(outDir, 'diffs'));
+
+   // Phase 1: fetch PR snapshots
+   await fetchAllPRs(owner, repo, limit);
+   log('INFO', `Fetched ${state.totalPRs} PRs`);
+
+   // Phase 2: fetch missing PR data
+   let bar = new cliProgress.SingleBar({
+     format: 'Fetching PR details |{bar}| {value}/{total} PRs',
+     hideCursor: true
+   }, cliProgress.Presets.shades_classic);
+   bar.start(state.prsPending.length, 0);
+   while (state.prsPending.length > 0) {
+     const pr = state.prsPending.pop();
+     await fetchMissingDetails(pr);
+     state.prsCompleted.push(pr);
+     saveState();
+     bar.increment();
+   }
+   bar.stop();
+
+   // Phase 3: fetch commit diffs
+   bar = new cliProgress.SingleBar({
+     format: 'Fetching commit diffs |{bar}| {value}/{total} PRs',
+     hideCursor: true
+   }, cliProgress.Presets.shades_classic);
+   bar.start(state.prsCompleted.length, 0);
+   while (state.prsCompleted.length > 0) {
+     const pr = state.prsCompleted.pop();
+     const prFile = path.join(outDir, 'prs', `pr-${pr}.json`);
+     const prData = JSON.parse(await fs.readFile(prFile, 'utf-8'));
+     const commits = prData.commits?.nodes || [];
+
+     for (const entry of commits) {
+       const sha = entry.commit.oid;
+       const diffFile = path.join(outDir, 'diffs', `${sha}.diff`);
+       if (fsSync.existsSync(diffFile)) continue;
+
+       try {
+         const data = await fetchCommitDiff(owner, repo, sha);
+         await fs.writeFile(diffFile, data);
+       } catch (err) {
+         log('ERROR', `Failed to fetch diff for ${sha}: ${err.message}`);
+       }
+     }
+
+     saveState();
+     bar.increment();
+   }
+   bar.stop();
+
+   log('INFO', 'All phases complete.');
+ }
+
+ main();
scripts/github/package-lock.json ADDED
@@ -0,0 +1,593 @@
+ {
+   "name": "dotnet-agent",
+   "version": "1.0.0",
+   "lockfileVersion": 3,
+   "requires": true,
+   "packages": {
+     "": {
+       "name": "dotnet-agent",
+       "version": "1.0.0",
+       "license": "ISC",
+       "dependencies": {
+         "@octokit/graphql": "^8.2.2",
+         "@octokit/rest": "^21.1.1",
+         "cli-progress": "^3.12.0",
+         "delay": "^6.0.0",
+         "dotenv": "^16.5.0",
+         "node-fetch": "^3.3.2",
+         "p-limit": "^6.2.0",
+         "p-queue": "^8.1.0",
+         "p-retry": "^6.2.1",
+         "yargs": "^17.7.2"
+       }
+     },
+     "node_modules/@octokit/auth-token": {
+       "version": "5.1.2",
+       "resolved": "https://registry.npmjs.org/@octokit/auth-token/-/auth-token-5.1.2.tgz",
+       "integrity": "sha512-JcQDsBdg49Yky2w2ld20IHAlwr8d/d8N6NiOXbtuoPCqzbsiJgF633mVUw3x4mo0H5ypataQIX7SFu3yy44Mpw==",
+       "engines": {
+         "node": ">= 18"
+       }
+     },
+     "node_modules/@octokit/core": {
+       "version": "6.1.5",
+       "resolved": "https://registry.npmjs.org/@octokit/core/-/core-6.1.5.tgz",
+       "integrity": "sha512-vvmsN0r7rguA+FySiCsbaTTobSftpIDIpPW81trAmsv9TGxg3YCujAxRYp/Uy8xmDgYCzzgulG62H7KYUFmeIg==",
+       "dependencies": {
+         "@octokit/auth-token": "^5.0.0",
+         "@octokit/graphql": "^8.2.2",
+         "@octokit/request": "^9.2.3",
+         "@octokit/request-error": "^6.1.8",
+         "@octokit/types": "^14.0.0",
+         "before-after-hook": "^3.0.2",
+         "universal-user-agent": "^7.0.0"
+       },
+       "engines": {
+         "node": ">= 18"
+       }
+     },
+     "node_modules/@octokit/endpoint": {
+       "version": "10.1.4",
+       "resolved": "https://registry.npmjs.org/@octokit/endpoint/-/endpoint-10.1.4.tgz",
+       "integrity": "sha512-OlYOlZIsfEVZm5HCSR8aSg02T2lbUWOsCQoPKfTXJwDzcHQBrVBGdGXb89dv2Kw2ToZaRtudp8O3ZIYoaOjKlA==",
+       "dependencies": {
+         "@octokit/types": "^14.0.0",
+         "universal-user-agent": "^7.0.2"
+       },
+       "engines": {
+         "node": ">= 18"
+       }
+     },
+     "node_modules/@octokit/graphql": {
+       "version": "8.2.2",
+       "resolved": "https://registry.npmjs.org/@octokit/graphql/-/graphql-8.2.2.tgz",
+       "integrity": "sha512-Yi8hcoqsrXGdt0yObxbebHXFOiUA+2v3n53epuOg1QUgOB6c4XzvisBNVXJSl8RYA5KrDuSL2yq9Qmqe5N0ryA==",
+       "dependencies": {
+         "@octokit/request": "^9.2.3",
+         "@octokit/types": "^14.0.0",
+         "universal-user-agent": "^7.0.0"
+       },
+       "engines": {
+         "node": ">= 18"
+       }
+     },
+     "node_modules/@octokit/openapi-types": {
+       "version": "25.0.0",
+       "resolved": "https://registry.npmjs.org/@octokit/openapi-types/-/openapi-types-25.0.0.tgz",
+       "integrity": "sha512-FZvktFu7HfOIJf2BScLKIEYjDsw6RKc7rBJCdvCTfKsVnx2GEB/Nbzjr29DUdb7vQhlzS/j8qDzdditP0OC6aw=="
+     },
+     "node_modules/@octokit/plugin-paginate-rest": {
+       "version": "11.6.0",
+       "resolved": "https://registry.npmjs.org/@octokit/plugin-paginate-rest/-/plugin-paginate-rest-11.6.0.tgz",
+       "integrity": "sha512-n5KPteiF7pWKgBIBJSk8qzoZWcUkza2O6A0za97pMGVrGfPdltxrfmfF5GucHYvHGZD8BdaZmmHGz5cX/3gdpw==",
+       "dependencies": {
+         "@octokit/types": "^13.10.0"
+       },
+       "engines": {
+         "node": ">= 18"
+       },
+       "peerDependencies": {
+         "@octokit/core": ">=6"
+       }
+     },
+     "node_modules/@octokit/plugin-paginate-rest/node_modules/@octokit/openapi-types": {
+       "version": "24.2.0",
+       "resolved": "https://registry.npmjs.org/@octokit/openapi-types/-/openapi-types-24.2.0.tgz",
+       "integrity": "sha512-9sIH3nSUttelJSXUrmGzl7QUBFul0/mB8HRYl3fOlgHbIWG+WnYDXU3v/2zMtAvuzZ/ed00Ei6on975FhBfzrg=="
+     },
+     "node_modules/@octokit/plugin-paginate-rest/node_modules/@octokit/types": {
+       "version": "13.10.0",
+       "resolved": "https://registry.npmjs.org/@octokit/types/-/types-13.10.0.tgz",
+       "integrity": "sha512-ifLaO34EbbPj0Xgro4G5lP5asESjwHracYJvVaPIyXMuiuXLlhic3S47cBdTb+jfODkTE5YtGCLt3Ay3+J97sA==",
+       "dependencies": {
+         "@octokit/openapi-types": "^24.2.0"
+       }
+     },
+     "node_modules/@octokit/plugin-request-log": {
+       "version": "5.3.1",
+       "resolved": "https://registry.npmjs.org/@octokit/plugin-request-log/-/plugin-request-log-5.3.1.tgz",
+       "integrity": "sha512-n/lNeCtq+9ofhC15xzmJCNKP2BWTv8Ih2TTy+jatNCCq/gQP/V7rK3fjIfuz0pDWDALO/o/4QY4hyOF6TQQFUw==",
+       "engines": {
+         "node": ">= 18"
+       },
+       "peerDependencies": {
+         "@octokit/core": ">=6"
+       }
+     },
+     "node_modules/@octokit/plugin-rest-endpoint-methods": {
+       "version": "13.5.0",
+       "resolved": "https://registry.npmjs.org/@octokit/plugin-rest-endpoint-methods/-/plugin-rest-endpoint-methods-13.5.0.tgz",
+       "integrity": "sha512-9Pas60Iv9ejO3WlAX3maE1+38c5nqbJXV5GrncEfkndIpZrJ/WPMRd2xYDcPPEt5yzpxcjw9fWNoPhsSGzqKqw==",
+       "dependencies": {
+         "@octokit/types": "^13.10.0"
+       },
+       "engines": {
+         "node": ">= 18"
+       },
+       "peerDependencies": {
+         "@octokit/core": ">=6"
+       }
+     },
+     "node_modules/@octokit/plugin-rest-endpoint-methods/node_modules/@octokit/openapi-types": {
+       "version": "24.2.0",
+       "resolved": "https://registry.npmjs.org/@octokit/openapi-types/-/openapi-types-24.2.0.tgz",
+       "integrity": "sha512-9sIH3nSUttelJSXUrmGzl7QUBFul0/mB8HRYl3fOlgHbIWG+WnYDXU3v/2zMtAvuzZ/ed00Ei6on975FhBfzrg=="
+     },
+     "node_modules/@octokit/plugin-rest-endpoint-methods/node_modules/@octokit/types": {
+       "version": "13.10.0",
+       "resolved": "https://registry.npmjs.org/@octokit/types/-/types-13.10.0.tgz",
+       "integrity": "sha512-ifLaO34EbbPj0Xgro4G5lP5asESjwHracYJvVaPIyXMuiuXLlhic3S47cBdTb+jfODkTE5YtGCLt3Ay3+J97sA==",
+       "dependencies": {
+         "@octokit/openapi-types": "^24.2.0"
+       }
+     },
+     "node_modules/@octokit/request": {
+       "version": "9.2.3",
+       "resolved": "https://registry.npmjs.org/@octokit/request/-/request-9.2.3.tgz",
+       "integrity": "sha512-Ma+pZU8PXLOEYzsWf0cn/gY+ME57Wq8f49WTXA8FMHp2Ps9djKw//xYJ1je8Hm0pR2lU9FUGeJRWOtxq6olt4w==",
+       "dependencies": {
+         "@octokit/endpoint": "^10.1.4",
+         "@octokit/request-error": "^6.1.8",
+         "@octokit/types": "^14.0.0",
+         "fast-content-type-parse": "^2.0.0",
+         "universal-user-agent": "^7.0.2"
+       },
+       "engines": {
+         "node": ">= 18"
+       }
+     },
+     "node_modules/@octokit/request-error": {
+       "version": "6.1.8",
+       "resolved": "https://registry.npmjs.org/@octokit/request-error/-/request-error-6.1.8.tgz",
+       "integrity": "sha512-WEi/R0Jmq+IJKydWlKDmryPcmdYSVjL3ekaiEL1L9eo1sUnqMJ+grqmC9cjk7CA7+b2/T397tO5d8YLOH3qYpQ==",
+       "dependencies": {
+         "@octokit/types": "^14.0.0"
+       },
+       "engines": {
+         "node": ">= 18"
+       }
+     },
+     "node_modules/@octokit/rest": {
+       "version": "21.1.1",
+       "resolved": "https://registry.npmjs.org/@octokit/rest/-/rest-21.1.1.tgz",
+       "integrity": "sha512-sTQV7va0IUVZcntzy1q3QqPm/r8rWtDCqpRAmb8eXXnKkjoQEtFe3Nt5GTVsHft+R6jJoHeSiVLcgcvhtue/rg==",
+       "dependencies": {
+         "@octokit/core": "^6.1.4",
+         "@octokit/plugin-paginate-rest": "^11.4.2",
+         "@octokit/plugin-request-log": "^5.3.1",
+         "@octokit/plugin-rest-endpoint-methods": "^13.3.0"
+       },
+       "engines": {
+         "node": ">= 18"
+       }
+     },
+     "node_modules/@octokit/types": {
+       "version": "14.0.0",
+       "resolved": "https://registry.npmjs.org/@octokit/types/-/types-14.0.0.tgz",
+       "integrity": "sha512-VVmZP0lEhbo2O1pdq63gZFiGCKkm8PPp8AUOijlwPO6hojEVjspA0MWKP7E4hbvGxzFKNqKr6p0IYtOH/Wf/zA==",
+       "dependencies": {
+         "@octokit/openapi-types": "^25.0.0"
+       }
+     },
+     "node_modules/@types/retry": {
+       "version": "0.12.2",
+       "resolved": "https://registry.npmjs.org/@types/retry/-/retry-0.12.2.tgz",
+       "integrity": "sha512-XISRgDJ2Tc5q4TRqvgJtzsRkFYNJzZrhTdtMoGVBttwzzQJkPnS3WWTFc7kuDRoPtPakl+T+OfdEUjYJj7Jbow=="
+     },
+     "node_modules/ansi-regex": {
+       "version": "5.0.1",
+       "resolved": "https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.1.tgz",
+       "integrity": "sha512-quJQXlTSUGL2LH9SUXo8VwsY4soanhgo6LNSm84E1LBcE8s3O0wpdiRzyR9z/ZZJMlMWv37qOOb9pdJlMUEKFQ==",
+       "engines": {
+         "node": ">=8"
+       }
+     },
+     "node_modules/ansi-styles": {
+       "version": "4.3.0",
+       "resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-4.3.0.tgz",
+       "integrity": "sha512-zbB9rCJAT1rbjiVDb2hqKFHNYLxgtk8NURxZ3IZwD3F6NtxbXZQCnnSi1Lkx+IDohdPlFp222wVALIheZJQSEg==",
+       "dependencies": {
+         "color-convert": "^2.0.1"
+       },
+       "engines": {
+         "node": ">=8"
+       },
+       "funding": {
+         "url": "https://github.com/chalk/ansi-styles?sponsor=1"
+       }
+     },
+     "node_modules/before-after-hook": {
+       "version": "3.0.2",
+       "resolved": "https://registry.npmjs.org/before-after-hook/-/before-after-hook-3.0.2.tgz",
+       "integrity": "sha512-Nik3Sc0ncrMK4UUdXQmAnRtzmNQTAAXmXIopizwZ1W1t8QmfJj+zL4OA2I7XPTPW5z5TDqv4hRo/JzouDJnX3A=="
+     },
+     "node_modules/cli-progress": {
+       "version": "3.12.0",
+       "resolved": "https://registry.npmjs.org/cli-progress/-/cli-progress-3.12.0.tgz",
+       "integrity": "sha512-tRkV3HJ1ASwm19THiiLIXLO7Im7wlTuKnvkYaTkyoAPefqjNg7W7DHKUlGRxy9vxDvbyCYQkQozvptuMkGCg8A==",
+       "dependencies": {
+         "string-width": "^4.2.3"
+       },
+       "engines": {
+         "node": ">=4"
+       }
+     },
+     "node_modules/cliui": {
+       "version": "8.0.1",
+       "resolved": "https://registry.npmjs.org/cliui/-/cliui-8.0.1.tgz",
+       "integrity": "sha512-BSeNnyus75C4//NQ9gQt1/csTXyo/8Sb+afLAkzAptFuMsod9HFokGNudZpi/oQV73hnVK+sR+5PVRMd+Dr7YQ==",
+       "dependencies": {
+         "string-width": "^4.2.0",
+         "strip-ansi": "^6.0.1",
+         "wrap-ansi": "^7.0.0"
+       },
+       "engines": {
+         "node": ">=12"
+       }
+     },
+     "node_modules/color-convert": {
+       "version": "2.0.1",
+       "resolved": "https://registry.npmjs.org/color-convert/-/color-convert-2.0.1.tgz",
+       "integrity": "sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ==",
+       "dependencies": {
+         "color-name": "~1.1.4"
+       },
+       "engines": {
+         "node": ">=7.0.0"
+       }
+     },
+     "node_modules/color-name": {
+       "version": "1.1.4",
+       "resolved": "https://registry.npmjs.org/color-name/-/color-name-1.1.4.tgz",
+       "integrity": "sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA=="
+     },
+     "node_modules/data-uri-to-buffer": {
+       "version": "4.0.1",
+       "resolved": "https://registry.npmjs.org/data-uri-to-buffer/-/data-uri-to-buffer-4.0.1.tgz",
+       "integrity": "sha512-0R9ikRb668HB7QDxT1vkpuUBtqc53YyAwMwGeUFKRojY/NWKvdZ+9UYtRfGmhqNbRkTSVpMbmyhXipFFv2cb/A==",
+       "engines": {
+         "node": ">= 12"
+       }
+     },
+     "node_modules/delay": {
+       "version": "6.0.0",
+       "resolved": "https://registry.npmjs.org/delay/-/delay-6.0.0.tgz",
+       "integrity": "sha512-2NJozoOHQ4NuZuVIr5CWd0iiLVIRSDepakaovIN+9eIDHEhdCAEvSy2cuf1DCrPPQLvHmbqTHODlhHg8UCy4zw==",
+       "engines": {
+         "node": ">=16"
+       },
+       "funding": {
+         "url": "https://github.com/sponsors/sindresorhus"
+       }
+     },
+     "node_modules/dotenv": {
+       "version": "16.5.0",
+       "resolved": "https://registry.npmjs.org/dotenv/-/dotenv-16.5.0.tgz",
+       "integrity": "sha512-m/C+AwOAr9/W1UOIZUo232ejMNnJAJtYQjUbHoNTBNTJSvqzzDh7vnrei3o3r3m9blf6ZoDkvcw0VmozNRFJxg==",
+       "engines": {
+         "node": ">=12"
+       },
+       "funding": {
+         "url": "https://dotenvx.com"
+       }
+     },
+     "node_modules/emoji-regex": {
+       "version": "8.0.0",
+       "resolved": "https://registry.npmjs.org/emoji-regex/-/emoji-regex-8.0.0.tgz",
+       "integrity": "sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A=="
+     },
+     "node_modules/escalade": {
+       "version": "3.2.0",
+       "resolved": "https://registry.npmjs.org/escalade/-/escalade-3.2.0.tgz",
+       "integrity": "sha512-WUj2qlxaQtO4g6Pq5c29GTcWGDyd8itL8zTlipgECz3JesAiiOKotd8JU6otB3PACgG6xkJUyVhboMS+bje/jA==",
+       "engines": {
+         "node": ">=6"
+       }
+     },
+     "node_modules/eventemitter3": {
+       "version": "5.0.1",
+       "resolved": "https://registry.npmjs.org/eventemitter3/-/eventemitter3-5.0.1.tgz",
+       "integrity": "sha512-GWkBvjiSZK87ELrYOSESUYeVIc9mvLLf/nXalMOS5dYrgZq9o5OVkbZAVM06CVxYsCwH9BDZFPlQTlPA1j4ahA=="
+     },
+     "node_modules/fast-content-type-parse": {
+       "version": "2.0.1",
+       "resolved": "https://registry.npmjs.org/fast-content-type-parse/-/fast-content-type-parse-2.0.1.tgz",
+       "integrity": "sha512-nGqtvLrj5w0naR6tDPfB4cUmYCqouzyQiz6C5y/LtcDllJdrcc6WaWW6iXyIIOErTa/XRybj28aasdn4LkVk6Q==",
+       "funding": [
+         {
+           "type": "github",
+           "url": "https://github.com/sponsors/fastify"
+         },
+         {
+           "type": "opencollective",
+           "url": "https://opencollective.com/fastify"
+         }
+       ]
+     },
+     "node_modules/fetch-blob": {
+       "version": "3.2.0",
+       "resolved": "https://registry.npmjs.org/fetch-blob/-/fetch-blob-3.2.0.tgz",
+       "integrity": "sha512-7yAQpD2UMJzLi1Dqv7qFYnPbaPx7ZfFK6PiIxQ4PfkGPyNyl2Ugx+a/umUonmKqjhM4DnfbMvdX6otXq83soQQ==",
+       "funding": [
+         {
+           "type": "github",
+           "url": "https://github.com/sponsors/jimmywarting"
+         },
+         {
+           "type": "paypal",
+           "url": "https://paypal.me/jimmywarting"
+         }
+       ],
+       "dependencies": {
+         "node-domexception": "^1.0.0",
+         "web-streams-polyfill": "^3.0.3"
+       },
+       "engines": {
+         "node": "^12.20 || >= 14.13"
+       }
+     },
+     "node_modules/formdata-polyfill": {
+       "version": "4.0.10",
+       "resolved": "https://registry.npmjs.org/formdata-polyfill/-/formdata-polyfill-4.0.10.tgz",
+       "integrity": "sha512-buewHzMvYL29jdeQTVILecSaZKnt/RJWjoZCF5OW60Z67/GmSLBkOFM7qh1PI3zFNtJbaZL5eQu1vLfazOwj4g==",
+       "dependencies": {
+         "fetch-blob": "^3.1.2"
+       },
+       "engines": {
+         "node": ">=12.20.0"
+       }
+     },
+     "node_modules/get-caller-file": {
+       "version": "2.0.5",
+       "resolved": "https://registry.npmjs.org/get-caller-file/-/get-caller-file-2.0.5.tgz",
+       "integrity": "sha512-DyFP3BM/3YHTQOCUL/w0OZHR0lpKeGrxotcHWcqNEdnltqFwXVfhEBQ94eIo34AfQpo0rGki4cyIiftY06h2Fg==",
+       "engines": {
+         "node": "6.* || 8.* || >= 10.*"
+       }
+     },
+     "node_modules/is-fullwidth-code-point": {
+       "version": "3.0.0",
+       "resolved": "https://registry.npmjs.org/is-fullwidth-code-point/-/is-fullwidth-code-point-3.0.0.tgz",
+       "integrity": "sha512-zymm5+u+sCsSWyD9qNaejV3DFvhCKclKdizYaJUuHA83RLjb7nSuGnddCHGv0hk+KY7BMAlsWeK4Ueg6EV6XQg==",
+       "engines": {
+         "node": ">=8"
+       }
+     },
+     "node_modules/is-network-error": {
+       "version": "1.1.0",
+       "resolved": "https://registry.npmjs.org/is-network-error/-/is-network-error-1.1.0.tgz",
+       "integrity": "sha512-tUdRRAnhT+OtCZR/LxZelH/C7QtjtFrTu5tXCA8pl55eTUElUHT+GPYV8MBMBvea/j+NxQqVt3LbWMRir7Gx9g==",
+       "engines": {
+         "node": ">=16"
+       },
+       "funding": {
+         "url": "https://github.com/sponsors/sindresorhus"
+       }
+     },
+     "node_modules/node-domexception": {
+       "version": "1.0.0",
+       "resolved": "https://registry.npmjs.org/node-domexception/-/node-domexception-1.0.0.tgz",
+       "integrity": "sha512-/jKZoMpw0F8GRwl4/eLROPA3cfcXtLApP0QzLmUT/HuPCZWyB7IY9ZrMeKw2O/nFIqPQB3PVM9aYm0F312AXDQ==",
+       "deprecated": "Use your platform's native DOMException instead",
+       "funding": [
+         {
+           "type": "github",
+           "url": "https://github.com/sponsors/jimmywarting"
+         },
+         {
+           "type": "github",
+           "url": "https://paypal.me/jimmywarting"
+         }
+       ],
+       "engines": {
+         "node": ">=10.5.0"
+       }
+     },
+     "node_modules/node-fetch": {
+       "version": "3.3.2",
+       "resolved": "https://registry.npmjs.org/node-fetch/-/node-fetch-3.3.2.tgz",
+       "integrity": "sha512-dRB78srN/l6gqWulah9SrxeYnxeddIG30+GOqK/9OlLVyLg3HPnr6SqOWTWOXKRwC2eGYCkZ59NNuSgvSrpgOA==",
+       "dependencies": {
+         "data-uri-to-buffer": "^4.0.0",
+         "fetch-blob": "^3.1.4",
+         "formdata-polyfill": "^4.0.10"
+       },
+       "engines": {
+         "node": "^12.20.0 || ^14.13.1 || >=16.0.0"
+       },
+       "funding": {
+         "type": "opencollective",
+         "url": "https://opencollective.com/node-fetch"
+       }
+     },
+     "node_modules/p-limit": {
+       "version": "6.2.0",
+       "resolved": "https://registry.npmjs.org/p-limit/-/p-limit-6.2.0.tgz",
+       "integrity": "sha512-kuUqqHNUqoIWp/c467RI4X6mmyuojY5jGutNU0wVTmEOOfcuwLqyMVoAi9MKi2Ak+5i9+nhmrK4ufZE8069kHA==",
+       "dependencies": {
+         "yocto-queue": "^1.1.1"
+       },
+       "engines": {
+         "node": ">=18"
+       },
+       "funding": {
+         "url": "https://github.com/sponsors/sindresorhus"
+       }
+     },
+     "node_modules/p-queue": {
+       "version": "8.1.0",
+       "resolved": "https://registry.npmjs.org/p-queue/-/p-queue-8.1.0.tgz",
+       "integrity": "sha512-mxLDbbGIBEXTJL0zEx8JIylaj3xQ7Z/7eEVjcF9fJX4DBiH9oqe+oahYnlKKxm0Ci9TlWTyhSHgygxMxjIB2jw==",
+       "dependencies": {
+         "eventemitter3": "^5.0.1",
+         "p-timeout": "^6.1.2"
+       },
+       "engines": {
+         "node": ">=18"
+       },
+       "funding": {
+         "url": "https://github.com/sponsors/sindresorhus"
+       }
+     },
+     "node_modules/p-retry": {
+       "version": "6.2.1",
+       "resolved": "https://registry.npmjs.org/p-retry/-/p-retry-6.2.1.tgz",
+       "integrity": "sha512-hEt02O4hUct5wtwg4H4KcWgDdm+l1bOaEy/hWzd8xtXB9BqxTWBBhb+2ImAtH4Cv4rPjV76xN3Zumqk3k3AhhQ==",
+       "dependencies": {
+         "@types/retry": "0.12.2",
+         "is-network-error": "^1.0.0",
+         "retry": "^0.13.1"
+       },
+       "engines": {
+         "node": ">=16.17"
+       },
+       "funding": {
+         "url": "https://github.com/sponsors/sindresorhus"
+       }
+     },
+     "node_modules/p-timeout": {
+       "version": "6.1.4",
+       "resolved": "https://registry.npmjs.org/p-timeout/-/p-timeout-6.1.4.tgz",
+       "integrity": "sha512-MyIV3ZA/PmyBN/ud8vV9XzwTrNtR4jFrObymZYnZqMmW0zA8Z17vnT0rBgFE/TlohB+YCHqXMgZzb3Csp49vqg==",
+       "engines": {
+         "node": ">=14.16"
+       },
+       "funding": {
+         "url": "https://github.com/sponsors/sindresorhus"
+       }
+     },
+     "node_modules/require-directory": {
+       "version": "2.1.1",
+       "resolved": "https://registry.npmjs.org/require-directory/-/require-directory-2.1.1.tgz",
+       "integrity": "sha512-fGxEI7+wsG9xrvdjsrlmL22OMTTiHRwAMroiEeMgq8gzoLC/PQr7RsRDSTLUg/bZAZtF+TVIkHc6/4RIKrui+Q==",
+       "engines": {
+         "node": ">=0.10.0"
+       }
+     },
+     "node_modules/retry": {
+       "version": "0.13.1",
+       "resolved": "https://registry.npmjs.org/retry/-/retry-0.13.1.tgz",
+       "integrity": "sha512-XQBQ3I8W1Cge0Seh+6gjj03LbmRFWuoszgK9ooCpwYIrhhoO80pfq4cUkU5DkknwfOfFteRwlZ56PYOGYyFWdg==",
+       "engines": {
+         "node": ">= 4"
+       }
+     },
+     "node_modules/string-width": {
+       "version": "4.2.3",
+       "resolved": "https://registry.npmjs.org/string-width/-/string-width-4.2.3.tgz",
+       "integrity": "sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==",
+       "dependencies": {
+         "emoji-regex": "^8.0.0",
+         "is-fullwidth-code-point": "^3.0.0",
+         "strip-ansi": "^6.0.1"
+       },
+       "engines": {
+         "node": ">=8"
+       }
+     },
+     "node_modules/strip-ansi": {
+       "version": "6.0.1",
+       "resolved": "https://registry.npmjs.org/strip-ansi/-/strip-ansi-6.0.1.tgz",
+       "integrity": "sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A==",
+       "dependencies": {
+         "ansi-regex": "^5.0.1"
+       },
+       "engines": {
+         "node": ">=8"
+       }
+     },
+     "node_modules/universal-user-agent": {
+       "version": "7.0.3",
+       "resolved": "https://registry.npmjs.org/universal-user-agent/-/universal-user-agent-7.0.3.tgz",
+       "integrity": "sha512-TmnEAEAsBJVZM/AADELsK76llnwcf9vMKuPz8JflO1frO8Lchitr0fNaN9d+Ap0BjKtqWqd/J17qeDnXh8CL2A=="
+     },
+     "node_modules/web-streams-polyfill": {
+       "version": "3.3.3",
+       "resolved": "https://registry.npmjs.org/web-streams-polyfill/-/web-streams-polyfill-3.3.3.tgz",
+       "integrity": "sha512-d2JWLCivmZYTSIoge9MsgFCZrt571BikcWGYkjC1khllbTeDlGqZ2D8vD8E/lJa8WGWbb7Plm8/XJYV7IJHZZw==",
+       "engines": {
+         "node": ">= 8"
+       }
+     },
+     "node_modules/wrap-ansi": {
+       "version": "7.0.0",
+       "resolved": "https://registry.npmjs.org/wrap-ansi/-/wrap-ansi-7.0.0.tgz",
+       "integrity": "sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q==",
+       "dependencies": {
+         "ansi-styles": "^4.0.0",
+         "string-width": "^4.1.0",
+         "strip-ansi": "^6.0.0"
+       },
+       "engines": {
+         "node": ">=10"
+       },
+       "funding": {
+         "url": "https://github.com/chalk/wrap-ansi?sponsor=1"
+       }
+     },
+     "node_modules/y18n": {
+       "version": "5.0.8",
+       "resolved": "https://registry.npmjs.org/y18n/-/y18n-5.0.8.tgz",
+       "integrity": "sha512-0pfFzegeDWJHJIAmTLRP2DwHjdF5s7jo9tuztdQxAhINCdvS+3nGINqPd00AphqJR/0LhANUS6/+7SCb98YOfA==",
+       "engines": {
+         "node": ">=10"
+       }
+     },
+     "node_modules/yargs": {
+       "version": "17.7.2",
+       "resolved": "https://registry.npmjs.org/yargs/-/yargs-17.7.2.tgz",
+       "integrity": "sha512-7dSzzRQ++CKnNI/krKnYRV7JKKPUXMEh61soaHKg9mrWEhzFWhFnxPxGl+69cD1Ou63C13NUPCnmIcrvqCuM6w==",
+       "dependencies": {
+         "cliui": "^8.0.1",
+         "escalade": "^3.1.1",
+         "get-caller-file": "^2.0.5",
+         "require-directory": "^2.1.1",
+         "string-width": "^4.2.3",
+         "y18n": "^5.0.5",
+         "yargs-parser": "^21.1.1"
+       },
+       "engines": {
+         "node": ">=12"
+       }
+     },
+     "node_modules/yargs-parser": {
+       "version": "21.1.1",
+       "resolved": "https://registry.npmjs.org/yargs-parser/-/yargs-parser-21.1.1.tgz",
+       "integrity": "sha512-tVpsJW7DdjecAiFpbIB1e3qxIQsE6NoPc5/eTdrbbIC4h0LVsWhnoa3g+m2HclBIujHzsxZ4VJVA+GUuc2/LBw==",
+       "engines": {
+         "node": ">=12"
+       }
+     },
+     "node_modules/yocto-queue": {
+       "version": "1.2.1",
+       "resolved": "https://registry.npmjs.org/yocto-queue/-/yocto-queue-1.2.1.tgz",
+       "integrity": "sha512-AyeEbWOu/TAXdxlV9wmGcR0+yh2j3vYPGOECcIj2S7MkrLyC7ne+oye2BKTItt0ii2PHk4cDy+95+LshzbXnGg==",
+       "engines": {
+         "node": ">=12.20"
+       },
+       "funding": {
+         "url": "https://github.com/sponsors/sindresorhus"
+       }
+     }
+   }
+ }
scripts/github/package.json ADDED
@@ -0,0 +1,25 @@
+ {
+   "name": "dotnet-agent",
+   "version": "1.0.0",
+   "description": "A GitHub agent that can answer and propose code changes for .NET runtime issues and PRs using a fine-tuned language model. The agent leverages context from all public .NET GitHub repositories to provide accurate and actionable responses.",
+   "main": "index.js",
+   "type": "module",
+   "scripts": {
+     "test": "echo \"Error: no test specified\" && exit 1"
+   },
+   "keywords": [],
+   "author": "",
+   "license": "ISC",
+   "dependencies": {
+     "@octokit/graphql": "^8.2.2",
+     "@octokit/rest": "^21.1.1",
+     "cli-progress": "^3.12.0",
+     "delay": "^6.0.0",
+     "dotenv": "^16.5.0",
+     "node-fetch": "^3.3.2",
+     "p-limit": "^6.2.0",
+     "p-queue": "^8.1.0",
+     "p-retry": "^6.2.1",
+     "yargs": "^17.7.2"
+   }
+ }
scripts/model/generate_dataset.py ADDED
@@ -0,0 +1,148 @@
+ #!/usr/bin/env python3
+ import json
+ from pathlib import Path
+ import pyarrow as pa
+ import pyarrow.parquet as pq
+
+ def iter_commit_rows(snapshot_dir: Path, diff_dir: Path, repo: str):
+     # Generator yielding one training-example dict per commit that has a diff.
+     for snapshot_path in sorted(snapshot_dir.glob('pr-*.json')):
+         pr = json.loads(snapshot_path.read_text(encoding='utf-8'))
+         pr_number = pr.get('number')
+         if pr_number is None:
+             continue
+
+         # Attach each commit's diff (crawled separately, keyed by sha).
+         commits = pr.get('commits', {}).get('nodes', [])
+         for node in commits:
+             commit = node.get('commit', {})
+             oid = commit.get('oid')
+             if oid:
+                 diff_file = diff_dir / f"{oid}.diff"
+                 commit['diff'] = diff_file.read_text(encoding='utf-8').strip() if diff_file.exists() else ''
+
+         # Collect comment/review/commit events into one chronological timeline.
+         events = []
+         for c in pr.get('comments', {}).get('nodes', []):
+             ts = c.get('createdAt')
+             if ts:
+                 events.append(('comment', ts, c))
+         for rt in pr.get('reviewThreads', {}).get('nodes', []):
+             for r in rt.get('comments', {}).get('nodes', []):
+                 ts = r.get('createdAt')
+                 if ts:
+                     events.append(('review', ts, r))
+         for node in commits:
+             ts = node.get('commit', {}).get('committedDate')
+             if ts:
+                 events.append(('commit', ts, node))
+         events.sort(key=lambda e: e[1])
+
+         # GraphQL returns labels as {'nodes': [{'name': ...}]}; flatten to names.
+         labels = [n.get('name', '') for n in (pr.get('labels') or {}).get('nodes', [])]
+
+         history = []
+         for kind, ts, data in events:
+             history.append((kind, ts, data))
+             if kind != 'commit':
+                 continue
+
+             c = data['commit']
+             oid = c.get('oid')
+             diff_text = c.get('diff', '')
+             if not oid or not diff_text.strip():
+                 continue
+
+             # Build the prompt: PR context plus everything since the last commit.
+             prompt_parts = [
+                 f"Title: {pr.get('title', '')}",
+                 f"Body: {pr.get('body', '')}",
+             ]
+             if labels:
+                 prompt_parts.append("Labels: " + ", ".join(labels))
+
+             # Events since the previous commit, if there was one.
+             last_idx = next((i for i in range(len(history) - 2, -1, -1) if history[i][0] == 'commit'), None)
+             seg = history[last_idx + 1:-1] if last_idx is not None else history[:-1]
+             if last_idx is not None:
+                 prev = history[last_idx][2]['commit']
+                 prompt_parts.append(
+                     f"Last commit: {prev.get('message')}\nDiff:\n{prev.get('diff', '')}"
+                 )
+
+             for ekind, _, edata in seg:
+                 if ekind == 'comment':
+                     body = edata.get('body', '').strip()
+                     if body:
+                         prompt_parts.append(f"Comment: {body}")
+                 elif ekind == 'review':
+                     path = edata.get('path', '')
+                     review_body = edata.get('body', '').strip()
+                     hunk = (edata.get('diffHunk') or '').strip()
+                     prompt_parts.append(
+                         f"Review on {path}: {review_body}\nDiff:\n{hunk}"
+                     )
+
+             # The crawler only fetches oid/message/committedDate per commit,
+             # so attribute the example to the PR author.
+             author = pr.get('author') or {}
+             yield {
+                 'prompt': '\n'.join(prompt_parts),
+                 'completion': f"Diff:\n{diff_text}",
+                 'repo': repo,
+                 'pr_number': pr_number,
+                 'title': pr.get('title', ''),
+                 'body': pr.get('body', ''),
+                 'created_at': pr.get('createdAt', ''),
+                 'closed_at': pr.get('closedAt', ''),
+                 'merged_at': pr.get('mergedAt', ''),
+                 'author': author.get('login', ''),
+                 'state': pr.get('state', ''),
+                 'additions': pr.get('additions', 0),
+                 'deletions': pr.get('deletions', 0),
+                 'changed_files': pr.get('changedFiles', 0),
+                 'head_ref': pr.get('headRefName', ''),
+                 'labels': ", ".join(labels),
+                 'completion_commit': oid,
+             }
+
+ def main():
+     BASE_DIR = Path(__file__).resolve().parent
+     # Paths match the crawler's output layout (see ../README.md).
+     snapshot_dir = BASE_DIR.parent / 'data' / 'raw_sample' / 'prs'
+     diff_dir = BASE_DIR.parent / 'data' / 'raw_sample' / 'diffs'
+     dataset_dir = BASE_DIR.parent / 'data' / 'dataset'
+     dataset_dir.mkdir(parents=True, exist_ok=True)
+
+     # Define the Parquet schema once so rows can be streamed out incrementally.
+     schema = pa.schema([
+         ('prompt', pa.string()),
+         ('completion', pa.string()),
+         ('repo', pa.string()),
+         ('pr_number', pa.int64()),
+         ('title', pa.string()),
+         ('body', pa.string()),
+         ('created_at', pa.string()),
+         ('closed_at', pa.string()),
+         ('merged_at', pa.string()),
+         ('author', pa.string()),
+         ('state', pa.string()),
+         ('additions', pa.int64()),
+         ('deletions', pa.int64()),
+         ('changed_files', pa.int64()),
+         ('head_ref', pa.string()),
+         ('labels', pa.string()),
+         ('completion_commit', pa.string()),
+     ])
+
+     train_writer = pq.ParquetWriter(str(dataset_dir / 'train.parquet'), schema)
+     test_writer = pq.ParquetWriter(str(dataset_dir / 'test.parquet'), schema)
+
+     for row in iter_commit_rows(snapshot_dir, diff_dir, 'dotnet/runtime'):
+         table = pa.Table.from_pydict({k: [v] for k, v in row.items()}, schema)
+         # Route ~20% of rows to the test split, keyed on the commit sha.
+         # Use the hex sha directly rather than hash(), which is salted per
+         # process and would make the split non-reproducible.
+         if int(row['completion_commit'][:8], 16) % 5 == 0:
+             test_writer.write_table(table)
+         else:
+             train_writer.write_table(table)
+
+     train_writer.close()
+     test_writer.close()
+
+     print(f"Wrote train.parquet and test.parquet to {dataset_dir}")
+
+ if __name__ == '__main__':
+     main()
scripts/model/rag.py ADDED
@@ -0,0 +1,94 @@
+ #!/usr/bin/env python3
+ import sys
+ import subprocess
+ from pathlib import Path
+ from typing import List
+ import json
+ from tqdm import tqdm
+
+ from sentence_transformers import SentenceTransformer
+ from langchain.text_splitter import RecursiveCharacterTextSplitter
+ from langchain.schema import Document
+ from langchain.embeddings.base import Embeddings
+ from langchain_community.vectorstores import FAISS
+
+ def load_settings(path: Path):
+     if not path.exists():
+         print(f"Settings file not found: {path}", file=sys.stderr)
+         sys.exit(1)
+     return json.loads(path.read_text(encoding='utf-8'))
+
+ def clone_repo(repo_url: str, local_path: Path) -> None:
+     if not local_path.exists():
+         print(f"Cloning repo {repo_url} into {local_path}...")
+         subprocess.run(["git", "clone", repo_url, str(local_path)], check=True)
+     else:
+         print(f"Repository already exists at {local_path}")
+
+
+ def extract_repo_files(repo_path: Path) -> List[Document]:
+     # Index only C#/C/C++ sources; everything else is skipped.
+     docs: List[Document] = []
+     allowed_extensions = {'.cs', '.cpp', '.c', '.h', '.hpp'}
+     all_files = [p for p in repo_path.rglob('*') if p.is_file() and p.suffix in allowed_extensions]
+     for path in tqdm(all_files, desc="Reading repo files"):
+         try:
+             text = path.read_text(encoding='utf-8', errors='ignore')
+             docs.append(Document(page_content=text, metadata={'source': str(path)}))
+         except Exception as e:
+             print(f"Warning: could not read {path}: {e}", file=sys.stderr)
+     return docs
+
+
+ def build_embeddings_index(
+     repo_path: Path,
+     index_path: Path,
+     embed_model_name: str
+ ) -> None:
+     splitter = RecursiveCharacterTextSplitter(chunk_size=512, chunk_overlap=64)
+     raw_docs = extract_repo_files(repo_path)
+
+     chunks: List[Document] = []
+     for doc in raw_docs:
+         splits = splitter.split_text(doc.page_content)
+         for chunk_text in splits:
+             chunks.append(Document(page_content=chunk_text, metadata=doc.metadata))
+
+     embedder = SentenceTransformer(embed_model_name)
+
+     # Thin adapter so LangChain's FAISS store can call the
+     # sentence-transformers model; LangChain expects plain Python lists.
+     class STEmbeddings(Embeddings):
+         def embed_documents(self, texts: List[str]) -> List[List[float]]:
+             return embedder.encode(texts, show_progress_bar=True).tolist()
+         def embed_query(self, text: str) -> List[float]:
+             return embedder.encode([text])[0].tolist()
+
+     embedding = STEmbeddings()
+
+     if not index_path.exists():
+         print("Building FAISS index...")
+         vectorstore = FAISS.from_documents(chunks, embedding)
+         vectorstore.save_local(str(index_path))
+         print("FAISS index built and saved.")
+     else:
+         print(f"FAISS index already exists at {index_path}.")
+
+
+ def main():
+     # Configuration
+     BASE_DIR = Path(__file__).resolve().parent
+     SETTINGS_PATH = BASE_DIR.parent / 'settings.json'
+
+     # Load settings
+     settings = load_settings(SETTINGS_PATH)
+
+     EMBED_MODEL = settings['embed_model']
+     OUT_DIR = BASE_DIR.parent / 'data' / 'rag'
+     OUT_DIR.mkdir(parents=True, exist_ok=True)
+
+     repo_url = settings['repository']
+     local_repo = OUT_DIR / 'repo'
+     vector_index_path = OUT_DIR / 'faiss_index'
+
+     clone_repo(repo_url, local_repo)
+     build_embeddings_index(local_repo, vector_index_path, EMBED_MODEL)
+
+ if __name__ == '__main__':
+     main()
scripts/model/requirements.txt ADDED
@@ -0,0 +1,11 @@
+ transformers>=4.35.0
+ peft>=0.7.0
+ datasets>=2.16.0
+ pyarrow
+ bitsandbytes>=0.41.0
+ sentence-transformers
+ langchain
+ faiss-cpu
+ fastapi
+ uvicorn
+ langchain-community
+ tqdm
scripts/settings.json ADDED
@@ -0,0 +1,7 @@
+ {
+   "system_instruction": "You are Phi, a language model trained by Microsoft to help users. Your role as an assistant involves thoroughly exploring questions through a systematic thinking process before providing the final precise and accurate solutions. This requires engaging in a comprehensive cycle of analysis, summarizing, exploration, reassessment, reflection, backtracing, and iteration to develop well-considered thinking process. Please structure your response into two main sections: Thought and Solution using the specified format: <think> {Thought section} </think> {Solution section}. In the Thought section, detail your reasoning process in steps. Each step should include detailed considerations such as analysing questions, summarizing relevant findings, brainstorming new ideas, verifying the accuracy of the current steps, refining any errors, and revisiting previous steps. In the Solution section, based on various attempts, explorations, and reflections from the Thought section, systematically present the final solution that you deem correct. The Solution section should be logical, accurate, and concise and detail necessary steps needed to reach the conclusion. Now, try to solve the following question through the above guidelines:",
+   "base_model": "microsoft/Phi-4-reasoning",
+   "max_context_size": 32768,
+   "embed_model": "all-MiniLM-L6-v2",
+   "repository": "https://github.com/dotnet/runtime"
+ }