Update README.md
README.md
CHANGED
@@ -19,15 +19,15 @@ By accessing this model, you are agreeing to the Llama 2 terms and conditions of
 ## Usage:
 Large language models, including CodeLlama-13B-QML, are not designed to be deployed in isolation but instead should be deployed as part of an overall AI system with additional safety guardrails as required. Developers are expected to deploy system safeguards when building AI systems.
 
-## How to run in ollama
+## How to run CodeLlama 13B-QML in ollama:
 #### Install ollama
 https://ollama.com/download
 
-instructions written
+These instructions are written for ollama version 0.5.7.
 
 #### Download model repository
 
-#### Open terminal and go to the repository
+#### Open the terminal and go to the repository
 
 #### Build model in ollama
 ```
@@ -40,9 +40,9 @@ e.g. ollama create customcodellama13bqml -f Modelfile
 ollama run <your-model-name>
 e.g. ollama run customcodellama13bqml
 ```
-You can start writing in the terminal or send curl requests
+You can start writing prompts in the terminal or send curl requests now.
 
-
+Here is a curl request example:
 ```
 curl -X POST http://localhost:11434/api/generate -d '{
 "model": "codellama13bqml",
@@ -66,7 +66,7 @@ If there is no suffix, please use:
 "<PRE>{prefix}<MID>"
 ```
 
 
 ## Model Version:
 v1.0
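The `ollama create … -f Modelfile` step in the hunk header expects a Modelfile in the downloaded repository; the repository ships its own, and its contents fall in the lines the diff elides. Purely as an illustration of the general shape of such a file (the weights filename and stop token below are assumptions, not taken from the repository), a minimal sketch might look like:

```
# Hypothetical weights filename -- use the actual .gguf file shipped in the repository
FROM ./codellama-13b-qml.gguf

# Assumed CodeLlama-style end-of-infill stop token; check the repository's own Modelfile
PARAMETER stop "<EOT>"
```

`ollama create <your-model-name> -f Modelfile` then registers the model under the name you choose.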
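The curl body in the diff is truncated between file lines 48 and 66, but the visible pieces (the `POST /api/generate` endpoint, the model name, and the `<PRE>{prefix}<MID>` no-suffix prompt form) are enough to sketch building the request in Python. Two assumptions here: the with-suffix template is `<PRE>{prefix}<SUF>{suffix}<MID>`, inferred by analogy with the no-suffix form the README shows, and the `build_payload` helper name is my own:

```python
import json

# Assumed with-suffix template, inferred from the README's no-suffix form "<PRE>{prefix}<MID>"
FIM_TEMPLATE = "<PRE>{prefix}<SUF>{suffix}<MID>"

def build_payload(prefix: str, suffix: str = "", model: str = "codellama13bqml") -> dict:
    """Build the JSON body for ollama's POST /api/generate endpoint."""
    if suffix:
        prompt = FIM_TEMPLATE.format(prefix=prefix, suffix=suffix)
    else:
        # README: 'If there is no suffix, please use: "<PRE>{prefix}<MID>"'
        prompt = "<PRE>{prefix}<MID>".format(prefix=prefix)
    return {"model": model, "prompt": prompt, "stream": False}

payload = build_payload(prefix="Rectangle {\n    color: ", suffix="\n}")
print(json.dumps(payload))

# To actually send it (needs a running ollama server on localhost:11434):
#   import urllib.request
#   req = urllib.request.Request("http://localhost:11434/api/generate",
#                                data=json.dumps(payload).encode("utf-8"),
#                                headers={"Content-Type": "application/json"})
#   print(urllib.request.urlopen(req).read().decode("utf-8"))
```

With `"stream": False` the server returns a single JSON object whose `response` field holds the completion for the masked region, rather than a stream of partial chunks.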