Spaces: FallnAI / LLM-Inference (duplicated from huggingface/inference-playground)
LLM-Inference / src / lib / components / InferencePlayground · revision 288702f
5 contributors · History: 211 commits
Latest commit: mishig (HF Staff) · Fix streaming extra new lines · 0ece011 · 9 months ago
File · Size · Last commit · Last updated
InferencePlayground.svelte · 19.6 kB · Fix streaming extra new lines · 9 months ago
InferencePlaygroundCodeSnippets.svelte · 15.3 kB · view docs + close from top · 11 months ago
InferencePlaygroundConversation.svelte · 3.13 kB · view docs + close from top · 11 months ago
InferencePlaygroundConversationHeader.svelte · 2.87 kB · border dark · 11 months ago
InferencePlaygroundGenerationConfig.svelte · 4.49 kB · [system prompts] Support default system prompts · 10 months ago
InferencePlaygroundHFTokenModal.svelte · 4.57 kB · format · about 1 year ago
InferencePlaygroundMessage.svelte · 1.77 kB · lint fix · 11 months ago
InferencePlaygroundModelSelector.svelte · 2.25 kB · fix reactivity · 10 months ago
InferencePlaygroundModelSelectorModal.svelte · 6.36 kB · text align · 11 months ago
generationConfigSettings.ts · 1.01 kB · [Settings] max_tokens: { default: 2048 } · 10 months ago
inferencePlaygroundUtils.ts · 2.13 kB · order · 9 months ago
types.ts · 698 Bytes · Compare models · 11 months ago
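
The commit message on generationConfigSettings.ts ("[Settings] max_tokens: { default: 2048 }") suggests that file declares the playground's generation parameters as a keyed settings map. A minimal TypeScript sketch of that idea follows; only the max_tokens default of 2048 comes from the commit message, while the interface name, the other parameters, and their ranges are illustrative assumptions, not the file's actual contents.

```ts
// Hypothetical shape for generationConfigSettings.ts.
// Only max_tokens: { default: 2048 } is grounded in the commit message;
// everything else here is an assumed example.
interface GenerationKeySettings {
  default: number;
  step: number;
  min: number;
  max: number;
  label: string;
}

export const GENERATION_CONFIG_SETTINGS: Record<string, GenerationKeySettings> = {
  temperature: { default: 0.5, step: 0.1, min: 0, max: 2, label: "Temperature" },
  max_tokens: { default: 2048, step: 256, min: 0, max: 8192, label: "Max Tokens" },
  top_p: { default: 0.7, step: 0.1, min: 0, max: 1, label: "Top-P" },
};

// Derive a default generation config object from the settings map,
// e.g. { temperature: 0.5, max_tokens: 2048, top_p: 0.7 }.
export const defaultGenerationConfig = Object.fromEntries(
  Object.entries(GENERATION_CONFIG_SETTINGS).map(([key, s]) => [key, s.default])
);
```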