Direct any model without `claude-` in its name to the OpenAI backend.
The old logic (OpenAI backend only for `gpt-` names) directed any
OpenAI-compatible server configuration (e.g. a local Ollama model) to the
Claude backend, since Claude was picked by default for any model name
without `gpt-`. It is simpler to switch the default backend than to
additionally check whether the environment variable `OPENAI_BASE_URL` is set.
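
In effect the dispatch reduces to one rule: Anthropic only for explicit `claude-` names, OpenAI for everything else. A minimal sketch of that rule (the `select_backend` helper is illustrative and not a function in the codebase; it assumes the backend modules are importable as they appear in the diff below):

```python
from aide.backend import backend_anthropic, backend_openai


def select_backend(model: str):
    """Illustrative helper mirroring the new one-line dispatch rule.

    Only explicit `claude-` models go to the Anthropic backend; any other
    name (gpt-*, but also e.g. an Ollama-served model like "qwen2.5")
    falls through to the OpenAI backend by default.
    """
    return backend_anthropic.query if "claude-" in model else backend_openai.query
```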
I confirmed the fix with the following run:
- Use Ollama: `export OPENAI_BASE_URL="http://localhost:11434/v1"`
- Set dummy OpenAI key: `export OPENAI_API_KEY="ollama"`
- Run the README command with extra `agent` parameters:
`aide agent.code.model="qwen2.5" agent.feedback.model="qwen2.5" data_dir="example_tasks/house_prices" goal="Predict the sales price for each house" eval="Use the RMSE metric between the logarithm of the predicted and observed values."`
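
This setup works because the OpenAI Python client reads `OPENAI_BASE_URL` and `OPENAI_API_KEY` from the environment, so the two exports above are enough to point the OpenAI backend at Ollama. A standalone sanity check of that assumption, outside of aide:

```python
import os

from openai import OpenAI

# Same environment as the run above: Ollama's OpenAI-compatible endpoint
# and a placeholder key (Ollama ignores it, but the client requires one).
os.environ["OPENAI_BASE_URL"] = "http://localhost:11434/v1"
os.environ["OPENAI_API_KEY"] = "ollama"

# OpenAI() picks up both variables automatically.
client = OpenAI()
response = client.chat.completions.create(
    model="qwen2.5",
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)
```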
- `aide/backend/__init__.py` (+1 -1):
```diff
@@ -33,7 +33,7 @@ def query(
         "max_tokens": max_tokens,
     }
 
-    query_func = backend_openai.query if "gpt-" in model else backend_anthropic.query
+    query_func = backend_anthropic.query if "claude-" in model else backend_openai.query
     output, req_time, in_tok_count, out_tok_count, info = query_func(
         system_message=compile_prompt_to_md(system_message) if system_message else None,
         user_message=compile_prompt_to_md(user_message) if user_message else None,
```