---
title: PY LLM DEMO
emoji: π
colorFrom: indigo
colorTo: green
sdk: docker
pinned: false
license: apache-2.0
short_description: Using a public LLM to create your own Gen AI app
app_port: 7860
image: dharmendrarathore/first-py-app:latest
hf_token_secret: HUGGINGFACEHUB_API_TOKEN
---
# PY LLM DEMO

This Hugging Face Space hosts a FastAPI application serving a Qwen language model via a Docker container. The Docker image is pulled from Docker Hub.
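A minimal sketch of what such a FastAPI service could look like is shown below. This is illustrative only, not the Space's actual application code; the Qwen checkpoint name, generation parameters, and the `answer` response field are assumptions.

```python
# Illustrative sketch of a FastAPI endpoint serving a Qwen model.
# Assumptions: checkpoint name, generation settings, and response shape.
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "Qwen/Qwen2.5-0.5B-Instruct"  # assumed; the Space's exact Qwen variant is not stated here

app = FastAPI()
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)

class Query(BaseModel):
    question: str

@app.post("/api/generate")
def generate(query: Query):
    # Wrap the question in a chat prompt and generate a completion.
    messages = [{"role": "user", "content": query.question}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    )
    output = model.generate(input_ids, max_new_tokens=256)
    answer = tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True)
    return {"answer": answer}
```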
## API Endpoint Usage

To interact with the API, send a POST request with a JSON body containing a "question" field to the `/api/generate` endpoint.

**Example using `curl`:**
```bash
curl -X POST \
  -H "Content-Type: application/json" \
  -d '{"question": "How do large language models work?"}' \
  https://rathore11-py-llm-demo.hf.space/api/generate
```
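The same request can also be made from Python. A minimal sketch using the `requests` library is shown below; the endpoint URL and payload mirror the `curl` example above.

```python
import requests

# Endpoint and payload taken from the curl example above.
url = "https://rathore11-py-llm-demo.hf.space/api/generate"
payload = {"question": "How do large language models work?"}

# Generation can be slow on a CPU Space, so allow a generous timeout.
response = requests.post(url, json=payload, timeout=120)
response.raise_for_status()
print(response.json())
```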