lad / llama_diffusion_model.py

Commit History

Set to unidirectional for debugging
57e6bce (verified) · Ruurd committed

Deal with float values
a721355 (verified) · Ruurd committed

Make attention mask float
8851563 (verified) · Ruurd committed
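
For context on these two commits: transformer attention layers in the Hugging Face style take an additive float mask (0.0 where attention is allowed, a large negative value where it is blocked) rather than a boolean one. A minimal sketch of that conversion, with all names illustrative rather than taken from the repo:

```python
import torch

def to_float_attention_mask(bool_mask: torch.Tensor,
                            dtype: torch.dtype = torch.float32) -> torch.Tensor:
    # Convert a boolean mask (True = may attend) into the additive float
    # form used by scaled-dot-product attention: 0.0 where attention is
    # allowed, the most negative representable value where it is blocked.
    float_mask = torch.zeros_like(bool_mask, dtype=dtype)
    float_mask.masked_fill_(~bool_mask, torch.finfo(dtype).min)
    return float_mask
```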

Update llama_diffusion_model.py
238c8f8 (verified) · Ruurd committed

Create safe fallback for models not yet initialized with masking_type
f2ca6a6 (verified) · Ruurd committed
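
A fallback like this is typically a one-line getattr guard, so that checkpoints saved before the masking_type field existed still load. A sketch under that assumption (the attribute name comes from the commit message; the default value is assumed):

```python
def get_masking_type(config, default: str = "bidirectional") -> str:
    # Older checkpoints were saved before `masking_type` existed on the
    # config, so fall back to a default instead of raising AttributeError.
    # The default here is an assumption, not taken from the repo.
    return getattr(config, "masking_type", default)
```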

Overhaul code for appropriate masking for full model instead of just attention layers
b43e862 (verified) · Ruurd committed

Fix "attention_weights referenced before assignment" bug
22370b2 (verified) · Ruurd committed

Implement improved attention masking for bidirectional_masked
1723639 (verified) · Ruurd committed
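
The log does not spell out what bidirectional_masked means; one plausible reading is full bidirectional attention in which masked (noised) positions are hidden as keys. A sketch under that assumption, also covering the unidirectional debugging mode from the top commit; all names are illustrative:

```python
from typing import Optional

import torch

def build_attention_mask(seq_len: int, masking_type: str,
                         is_masked: Optional[torch.Tensor] = None) -> torch.Tensor:
    # Returns a boolean [seq_len, seq_len] mask, True = may attend.
    # `is_masked` is a boolean [seq_len] vector flagging noised positions.
    if masking_type == "unidirectional":
        # Standard causal (lower-triangular) mask.
        return torch.tril(torch.ones(seq_len, seq_len, dtype=torch.bool))
    allow = torch.ones(seq_len, seq_len, dtype=torch.bool)  # bidirectional
    if masking_type == "bidirectional_masked" and is_masked is not None:
        # Assumed semantics: no token attends *to* masked positions,
        # but every token keeps attending to itself.
        allow[:, is_masked] = False
        allow |= torch.eye(seq_len, dtype=torch.bool)
    return allow
```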

Change LoRA size from 256 to 512, also back to bidirectional_masked
620a6cd (verified) · Ruurd committed
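
With the peft library (an assumption; the repo may wire LoRA up differently), the rank change in this commit would look roughly as follows. Alpha, dropout, and target modules are illustrative:

```python
from peft import LoraConfig, get_peft_model

lora_config = LoraConfig(
    r=512,                      # raised from 256 per this commit
    lora_alpha=512,             # assumed; often scaled with r
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # assumed
    lora_dropout=0.05,          # assumed
    task_type="CAUSAL_LM",
)
# model = get_peft_model(base_model, lora_config)
```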

Changed back to bidirectional attention
e237f80 (verified) · Ruurd committed

Try out bidirectional_masked prediction
0daaccf · Ruurd committed

input_size?
a5ca1bf · Ruurd committed

input_size
f7efac8 · Ruurd committed

Updated model architecture
0af2920 · Ruurd committed

First commit
7252f98 · Ruurd committed