Osmosis-Structure 0.6B (GGUF)
Osmosis-Structure 0.6B is a lightweight language model packaged in the GGUF format for efficient inference. It is suitable for edge deployment, research, and low-resource environments.
Model Overview
- Model Size: 0.6 Billion parameters
- Quantization: Q4_K_M
- Format: GGUF
- Tokenizer: SentencePiece (`tokenizer.model`)
- Usage: Optimized for fast inference with low memory requirements
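
If you want to confirm these details on disk, the gguf Python package can read the file's metadata and tensor layout. The sketch below assumes the quantized file has already been downloaded to the working directory; the exact metadata keys printed depend on how the file was converted.

```python
# Inspect GGUF metadata and tensors (pip install gguf).
# Assumes the quantized file from this repo is in the working directory.
from gguf import GGUFReader

reader = GGUFReader("Osmosis-Structure-0.6B-Q4_K_M.gguf")

# Print a few metadata keys (e.g. general.architecture, general.name).
print("metadata keys:", list(reader.fields.keys())[:10])

# Print the first few tensors with their quantization type and shape.
print("tensor count:", len(reader.tensors))
for t in reader.tensors[:5]:
    print(t.name, t.tensor_type.name, list(t.shape))
```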
How to Use
This model is in GGUF format, which is supported by llama.cpp and other GGUF-compatible runtimes.

Example command using llama.cpp:

```bash
./main -m Osmosis-Structure-0.6B-Q4_K_M.gguf -p "Explain the structure of a water molecule."
```
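
The same file can also be loaded from Python through the llama-cpp-python bindings. This is a minimal sketch assuming the bindings are installed and the GGUF file sits in the working directory; the context size and sampling parameters are illustrative, not recommended values.

```python
# Local inference with the llama-cpp-python bindings
# (pip install llama-cpp-python). Paths and parameters are illustrative.
from llama_cpp import Llama

llm = Llama(
    model_path="Osmosis-Structure-0.6B-Q4_K_M.gguf",  # quantized file from this repo
    n_ctx=2048,  # context window; adjust to your memory budget
)

output = llm(
    "Explain the structure of a water molecule.",
    max_tokens=128,
    temperature=0.7,
)
print(output["choices"][0]["text"])
```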
Files Included
- Osmosis-Structure-0.6B-Q4_K_M.gguf
- tokenizer.model
- README.md
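
To fetch the weights programmatically instead of through the web UI, the huggingface_hub client can download the GGUF file directly. The repo id below matches this card; the returned path points into the local Hugging Face cache.

```python
# Download the quantized GGUF file (pip install huggingface_hub).
from huggingface_hub import hf_hub_download

model_path = hf_hub_download(
    repo_id="XythicK/Osmosis-Structure-0.6B-GGUF",
    filename="Osmosis-Structure-0.6B-Q4_K_M.gguf",
)
print(model_path)  # local path to pass to llama.cpp or llama-cpp-python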
License & Usage
Please review the included license file (if any) for usage terms. This model is provided for educational and research purposes.
Maintained by
Model hosted by XythicK
Powered by open-source magic
Base model: osmosis-ai/Osmosis-Structure-0.6B