Apply for community grant: Personal project (gpu)
Hello!
Is it possible to switch this Space to ZeroGPU? Everything should be ready to handle it.
Done!
@hysts Unfortunately, despite my efforts to make it run and infer as expected, it won't work properly with the ZeroGPU config.
If we could switch back to the L40S grant, which the model behaves perfectly with, that would be great!
Otherwise I'll shut it down until a new Spaces-compatible version of this kind of complicated workflow is available.
OK, I've just assigned L40S as a grant for now.
Out of curiosity, what was the issue?
@fffiloni
I looked into it a bit and found https://github.com/memoavatar/memo/issues/6. Looking at your commits, it looks like you tried it as well.
It seems that calling `.enable_xformers_memory_efficient_attention()` inside the function decorated with `@spaces.GPU` works, so you were so close to the solution. I opened this PR and force-merged it. Sorry if I'm overstepping.
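A minimal sketch of the pattern described above (the pipeline class and names here are illustrative stand-ins, not the actual memo code): on ZeroGPU the GPU is only attached while a `@spaces.GPU`-decorated function runs, so CUDA-dependent setup like `enable_xformers_memory_efficient_attention()` has to happen inside that function rather than at import time.

```python
try:
    import spaces  # available by default in Gradio Spaces
    gpu = spaces.GPU
except ImportError:
    # Fallback stub so this sketch runs locally (an assumption, not the real API)
    def gpu(fn):
        return fn

class FakePipeline:
    """Stand-in for a diffusers-style pipeline, just to show call placement."""
    def __init__(self):
        self.xformers_enabled = False

    def enable_xformers_memory_efficient_attention(self):
        self.xformers_enabled = True

    def __call__(self, prompt):
        return f"frames for {prompt!r} (xformers={self.xformers_enabled})"

# Model loading can still happen at import time, on CPU.
pipe = FakePipeline()

@gpu  # on ZeroGPU hardware this is @spaces.GPU
def infer(prompt):
    # GPU-only setup belongs here, inside the decorated function,
    # because this is the only scope where the CUDA device exists on ZeroGPU.
    pipe.enable_xformers_memory_efficient_attention()
    return pipe(prompt)
```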
Oh, BTW, unrelated, but you don't have to comment out `import spaces` or `@spaces.GPU`. The `spaces` package is installed by default in Gradio Spaces even if the hardware is not ZeroGPU, and the `@spaces.GPU` decorator does nothing in Spaces with normal GPUs. That way, users can simply duplicate a ZeroGPU Space and assign normal hardware when they run out of their ZeroGPU quota and want to keep using it on paid hardware.
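To illustrate why commenting out is unnecessary, here is a rough sketch of the behavior described above (the fallback stub is my own assumption for running outside Spaces, not part of the real package): on non-ZeroGPU hardware the decorator effectively leaves the function unchanged.

```python
try:
    import spaces  # installed by default in Gradio Spaces
    gpu = spaces.GPU
except ImportError:
    # Illustrative stub for local machines without the spaces package
    def gpu(fn):
        return fn

@gpu
def infer(prompt):
    # On normal GPU hardware the decorator is a no-op, so this function
    # behaves exactly as if it were undecorated.
    return f"generated image for: {prompt}"

result = infer("a cat")
```

So the same `app.py` can run unmodified on ZeroGPU, on a paid GPU, or locally.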
Ah cool, so I had to move the xformers-related calls inside the inference function to bind everything at that step.
I was close!
I even tried to mess with the Attention classes in `motion_module` but got weird results with `cross_attention`.
Thank you very much!