
lad / llama_diffusion_model.py

Commit History

Change LoRA size from 256 to 512, also back to bidirectional_masked
620a6cd (verified), committed by Ruurd

Changed back to bidirectional attention
e237f80 (verified), committed by Ruurd

Try out bidirectional_masked prediction
0daaccf, committed by Ruurd

input_size?
a5ca1bf, committed by Ruurd

input_size
f7efac8, committed by Ruurd

Updated model architecture
0af2920, committed by Ruurd

First commit
7252f98, committed by Ruurd