PeterSchneider committed · verified
Commit 7bbf9a0 · 1 Parent(s): 4347674

Update README.md

Files changed (1):
  1. README.md (+6 -6)
README.md CHANGED
@@ -20,22 +20,22 @@ By accessing this model, you are agreeing to the Llama 2 terms and conditions of
  Large language models, including CodeLlama-13B-QML, are not designed to be deployed in isolation but instead should be deployed as part of an overall AI system with additional safety guardrails as required. Developers are expected to deploy system safeguards when building AI systems.

  ## How to run CodeLlama 13B-QML in ollama:
- #### Install the ollama
+ #### 1. Install ollama
  https://ollama.com/download

  These instructions are written for ollama version 0.5.7.

- #### Download model repository
+ #### 2. Download model repository

- #### Open the terminal and go to the repository
+ #### 3. Open the terminal and go to the repository

- #### Build model in ollama
+ #### 4. Build model in ollama
  ```
  ollama create <your-model-name> -f Modelfile
  e.g. ollama create customcodellama13bqml -f Modelfile
  ```

- #### Run the model
+ #### 5. Run the model
  ```
  ollama run <your-model-name>
  e.g. ollama run customcodellama13bqml

@@ -56,7 +56,7 @@ curl -X POST http://localhost:11434/api/generate -d '{
  }'
  ```

- #### The prompt format:
+ The prompt format:
  ```
  "<SUF>{suffix}<PRE>{prefix}<MID>"
  ```
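For reference, the prompt format shown in the last hunk is what gets sent to the ollama generate endpoint referenced earlier in the README (curl -X POST http://localhost:11434/api/generate). Below is a minimal sketch of such a fill-in-the-middle request, assuming the model was built locally as customcodellama13bqml as in the example above; the QML prefix/suffix snippets and the "stream": false flag are illustrative and not part of this commit.

```
# Fill-in-the-middle request: <PRE> holds the code before the cursor, <SUF> the
# code after it, and the model returns the <MID> part that belongs in between.
curl -X POST http://localhost:11434/api/generate -d '{
  "model": "customcodellama13bqml",
  "prompt": "<SUF>\n    height: 200\n}\n<PRE>import QtQuick\n\nRectangle {\n    width: 200\n<MID>",
  "stream": false
}'
```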