## βš–οΈ License and Usage

This repository contains quantized GGUF variants of the CodeGemma 7B instruction-tuned model developed by Google.

* 🧠 **Model source:** [Google / Gemma](https://ai.google.dev/gemma/terms)
* πŸͺ„ **Quantized by:** c516a

### Terms of Use

These quantized models are:

* Provided under the same terms as the original Google Gemma models.
* Intended only for **non-commercial use**, **research**, and **experimentation**.
* Redistributed without modification to the underlying model weights, except for **format (GGUF)** and **quantization level**.

By using this repository or its contents, you agree to:

* Comply with the [Gemma License Terms](https://ai.google.dev/gemma/terms),
* Not use the model or its derivatives for any **commercial purposes** without a separate license from Google,
* Acknowledge Google as the original model creator.

> πŸ“’ **Disclaimer:** This repository is not affiliated with Google.

---

## πŸ“¦ Model Downloads

All quantized model files are hosted externally for convenience.
You can download them from:

πŸ‘‰ **[https://modelbakery.nincs.net/users/c516a/projects/quantized-codegemma-7b-it](https://modelbakery.nincs.net/users/c516a/projects/quantized-codegemma-7b-it)**

πŸ‘‰ or clone the repository directly: `git clone https://modelbakery.nincs.net/c516a/quantized-codegemma-7b-it.git`

### File list

Each `.gguf` file is accompanied by a `.txt` file of the same name that contains its direct download URL, for clarity.

Example:

* `codegemma-7b-it.Q4_K_M.gguf` (binary file)
* `codegemma-7b-it.Q4_K_M.gguf.txt` β†’ contains:

```
Download: https://modelbakery.nincs.net/c516a/projects/quantized-codegemma-7b-it/codegemma-7b-it.Q4_K_M.gguf
```
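
For scripted downloads, here is a minimal sketch using Python's `requests` library (an assumption on my part; `wget`, `curl`, or any other HTTP client works just as well). The URL is the one from the example `.txt` file above.

```python
# Minimal streaming download sketch (assumes the `requests` package is installed).
import requests

url = ("https://modelbakery.nincs.net/c516a/projects/"
       "quantized-codegemma-7b-it/codegemma-7b-it.Q4_K_M.gguf")

with requests.get(url, stream=True, timeout=60) as resp:
    resp.raise_for_status()
    with open("codegemma-7b-it.Q4_K_M.gguf", "wb") as f:
        # Stream in 8 MiB chunks so the multi-GB file never sits fully in memory.
        for chunk in resp.iter_content(chunk_size=8 * 1024 * 1024):
            f.write(chunk)
```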

---

## πŸ“˜ Notes

These models were quantized locally with `llama.cpp` and tested on an RTX 3050 / Ryzen 9 5950X / 64 GB RAM machine.
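
As a starting point, below is a minimal sketch of running one of the quantized files with the `llama-cpp-python` bindings (the `llama-cli` binary from `llama.cpp` works too). The model path, context size, GPU layer count, and prompt are illustrative; adjust them to your own setup.

```python
# Minimal inference sketch (pip install llama-cpp-python).
from llama_cpp import Llama

llm = Llama(
    model_path="codegemma-7b-it.Q4_K_M.gguf",  # path to the downloaded GGUF file
    n_ctx=4096,        # context window; lower it if memory is tight
    n_gpu_layers=20,   # layers to offload to the GPU (0 = CPU only)
)

out = llm.create_chat_completion(
    messages=[{"role": "user",
               "content": "Write a Python function that reverses a string."}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```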

If you find them useful, feel free to star the project or fork it to share improvements!