Space: FallnAI/LLM-Inference (duplicated from huggingface/inference-playground)
Status: Sleeping
LLM-Inference / src / lib / components / InferencePlayground (45.8 kB, revision c8ac2fd)
5 contributors, History: 154 commits
Latest commit: b52f201 "cleaner modelId URL param" by mishig (HF Staff), about 1 year ago
File                                          Size       Last commit message                                Updated
InferencePlayground.svelte                    12.5 kB    Rm last message on error if empty                  about 1 year ago
InferencePlaygroundCodeSnippets.svelte        9.69 kB    Escape single quotes in CURL snippets              about 1 year ago
InferencePlaygroundConversation.svelte        2.78 kB    ability to scroll when message is being generated  about 1 year ago
InferencePlaygroundGenerationConfig.svelte    2.08 kB    handle when /api/model err                         about 1 year ago
InferencePlaygroundHFTokenModal.svelte        4.57 kB    format                                             about 1 year ago
InferencePlaygroundMessage.svelte             1.66 kB    ability to scroll when message is being generated  about 1 year ago
InferencePlaygroundModelSelector.svelte       2.52 kB    cleaner modelId URL param                          about 1 year ago
InferencePlaygroundModelSelectorModal.svelte  6.17 kB    misc                                               about 1 year ago
generationConfigSettings.ts                   934 Bytes  default steps                                      about 1 year ago
inferencePlaygroundUtils.ts                   2.29 kB    typo maxTokens vs max_tokens                       about 1 year ago
types.ts                                      607 Bytes  System message as part of Conversation             about 1 year ago