Space: FallnAI/LLM-Inference (duplicated from huggingface/inference-playground)
Path: LLM-Inference/src/lib/components/InferencePlayground (at commit fcefed2)
5 contributors · History: 113 commits
Latest commit fcefed2 by mishig (HF Staff), 12 months ago: better link for "Create new token"
File                                          Size       Last commit message                                   Updated
InferencePlayground.svelte                    11.4 kB    Hide "API Quota" div for now                          12 months ago
InferencePlaygroundCodeSnippets.svelte        8.65 kB    padding                                               about 1 year ago
InferencePlaygroundConversation.svelte        1.61 kB    wip                                                   about 1 year ago
InferencePlaygroundGenerationConfig.svelte    2.08 kB    handle when /api/model err                            about 1 year ago
InferencePlaygroundHFTokenModal.svelte        4.31 kB    better link for "Create new token"                    12 months ago
InferencePlaygroundMessage.svelte             1.54 kB    order imports                                         about 1 year ago
InferencePlaygroundModelSelector.svelte       2.07 kB    quick fixes                                           about 1 year ago
InferencePlaygroundModelSelectorModal.svelte  3.62 kB    Model selector w-full                                 12 months ago
generationConfigSettings.ts                   933 Bytes  Rm advanced options for config                        about 1 year ago
inferencePlaygroundUtils.ts                   2.16 kB    make tokens count working for non-streaming as well   about 1 year ago
types.ts                                      607 Bytes  System message as part of Conversation                about 1 year ago
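
The listing above gives only file names, sizes, and commit messages. As a rough orientation, the sketch below shows the kind of shapes types.ts and generationConfigSettings.ts might declare, inferred solely from the file names and the "System message as part of Conversation" commit message; every interface, field name, and value here is an assumption, not the Space's actual source.

```typescript
// Hypothetical sketch only: the real types.ts / generationConfigSettings.ts in
// this Space may use different names and fields.

interface GenerationConfig {
	temperature: number; // sampling temperature
	max_tokens: number;  // maximum number of tokens to generate
	streaming: boolean;  // whether tokens are streamed as they are produced
}

interface ConversationMessage {
	role: "user" | "assistant";
	content: string;
}

interface Conversation {
	model: string;                   // model id selected in the playground
	systemMessage: string;           // per the "System message as part of Conversation" commit
	messages: ConversationMessage[];
	config: GenerationConfig;
}

// Purely illustrative example value:
const example: Conversation = {
	model: "meta-llama/Meta-Llama-3-8B-Instruct",
	systemMessage: "You are a helpful assistant.",
	messages: [{ role: "user", content: "Hello!" }],
	config: { temperature: 0.7, max_tokens: 512, streaming: true },
};
```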