README.md
<div align="center">

<img src="assets/proxy-lite.png" alt="Proxy Lite logo" width="600" height="auto" style="margin-bottom: 20px;" />

<h2>
A mini, open-weights version of our Proxy assistant.

By default, Proxy Lite will point to an endpoint set up on HuggingFace spaces.

We recommend hosting your own endpoint with vLLM; you can use the following command:

```bash
vllm serve --model convergence-ai/proxy-lite \
  --trust-remote-code \
  --enable-auto-tool-choice \
  --tool-call-parser hermes \
```
The tool arguments are **very important** for parsing the tool calls from the model appropriately.
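As an illustration of what the `--tool-call-parser hermes` flag handles, the hermes format wraps each tool call in `<tool_call>` tags around a JSON body. The sketch below extracts such a call from a model response; the tool name (`goto`) and its arguments are made up for this example and are not from the source:

```python
import json
import re

# Example model output in the hermes tool-call format; the "goto"
# tool and its arguments are hypothetical, for illustration only.
model_output = (
    "I will open the page now.\n"
    "<tool_call>\n"
    '{"name": "goto", "arguments": {"url": "https://example.com"}}\n'
    "</tool_call>"
)

# Pull out each <tool_call>...</tool_call> block and decode its JSON body.
calls = [
    json.loads(m.group(1))
    for m in re.finditer(
        r"<tool_call>\s*(\{.*?\})\s*</tool_call>", model_output, re.DOTALL
    )
]

print(calls[0]["name"])  # → goto
```

Without `--enable-auto-tool-choice` and a matching parser, the server would return this text verbatim instead of structured tool calls, which is why the flags matter.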
> **Important:** To run this, install vLLM and transformers with `uv sync --all-extras`. Qwen-2.5-VL support in `transformers` is not yet available in the latest release, so it is installed from source.

You can set the `api_base` to point to your local endpoint when calling Proxy Lite:
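The excerpt ends before the call itself is shown. As a minimal sketch (not the library's own API), assuming the vLLM server started above is reachable on its default port (8000) and exposes the usual OpenAI-compatible route, a request could be assembled like this — the host, port, and prompt are illustrative:

```python
import json

# Hypothetical api_base: adjust host/port to match your deployment
# (vLLM listens on port 8000 by default).
api_base = "http://localhost:8000/v1"

# vLLM exposes an OpenAI-compatible chat-completions route under api_base.
endpoint = f"{api_base}/chat/completions"

# The model name must match the one passed to `vllm serve`.
payload = {
    "model": "convergence-ai/proxy-lite",
    "messages": [
        {"role": "user", "content": "Open the Hugging Face homepage."}
    ],
}

request_body = json.dumps(payload)
print(endpoint)  # → http://localhost:8000/v1/chat/completions
```

Any OpenAI-compatible client can then POST `request_body` to `endpoint`; only `api_base` needs to change when moving between the hosted Space and a local server.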