import gradio as gr

# Load the model hosted on the Hugging Face Hub and launch the demo interface
gr.Interface.load("models/concise/LLaMa-V2-13B-Chat_Quantized_fp16").launch()