Apply for community grant: Personal project (gpu)

#4
by fffiloni - opened

Hello 👋

Is it possible to switch this Space to ZeroGPU? Everything should be ready to handle it 🤗

@hysts Unfortunately, despite my efforts to make it run and infer as expected, it won't work properly with the ZeroGPU config 😕

If we can switch back to the L40S grant, which the model behaves perfectly with, that would be great!
Otherwise I'll shut it down until a new Spaces-compatible version of this kind of complicated workflow is available 🤗

OK, I've just assigned L40S as a grant for now.
Out of curiosity, what was the issue?

@fffiloni I looked into it a bit and found https://github.com/memoavatar/memo/issues/6. Looking at your commits, it looks like you tried that as well.
It seems that calling .enable_xformers_memory_efficient_attention() inside the function decorated with @spaces.GPU works, so you were very close to the solution. I opened this PR and force-merged it. Sorry if I'm overstepping.
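A minimal, self-contained sketch of the fix described above: defer GPU-only setup (here, enabling xformers attention) into the function decorated with @spaces.GPU, where ZeroGPU has actually attached a CUDA device. All class and function names below are hypothetical stand-ins, not the real spaces or diffusers APIs.

```python
class FakePipeline:
    """Hypothetical stand-in for a diffusion pipeline."""
    def __init__(self):
        self.xformers_enabled = False
        self.cuda_available = False  # no GPU at import time on ZeroGPU

    def enable_xformers_memory_efficient_attention(self):
        # fails if called at module level, before a GPU is attached
        if not self.cuda_available:
            raise RuntimeError("no CUDA device: called outside @spaces.GPU")
        self.xformers_enabled = True


def fake_spaces_gpu(func):
    """Hypothetical stand-in for @spaces.GPU: lends a GPU for the call."""
    def wrapper(pipe, *args, **kwargs):
        pipe.cuda_available = True       # ZeroGPU attaches a device here...
        try:
            return func(pipe, *args, **kwargs)
        finally:
            pipe.cuda_available = False  # ...and reclaims it afterwards
    return wrapper


@fake_spaces_gpu
def infer(pipe, prompt):
    # the xformers call now happens while the GPU is attached
    pipe.enable_xformers_memory_efficient_attention()
    return f"generated video for: {prompt}"


pipe = FakePipeline()
print(infer(pipe, "hello"))  # succeeds: xformers enabled inside the GPU call
```

The same call at module level, before any @spaces.GPU function runs, would raise, which is consistent with the symptom in the linked issue.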

Oh, BTW, unrelated, but you don't have to comment out import spaces or @spaces.GPU. The spaces package is installed by default in Gradio Spaces even when the hardware is not ZeroGPU, and the @spaces.GPU decorator does nothing on Spaces with normal GPUs. That way, users can simply duplicate a ZeroGPU Space and assign normal hardware when they run out of ZeroGPU quota and want to keep using it on paid hardware.
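The point above can be sketched with a hypothetical pass-through decorator, assuming (as described above) that on non-ZeroGPU hardware the decorator simply returns the function unchanged. `gpu_noop` is an illustrative name, not the real spaces implementation.

```python
def gpu_noop(func=None, **kwargs):
    """Hypothetical stand-in for @spaces.GPU on normal GPU hardware:
    a pure pass-through, with or without arguments like duration=120."""
    if func is None:
        # used as @gpu_noop(duration=120): return a decorator
        return lambda f: f
    # used as bare @gpu_noop: return the function unchanged
    return func


@gpu_noop
def infer(prompt):
    return f"ran inference for: {prompt}"


print(infer("hello"))  # behaves exactly as if undecorated
```

So a duplicated ZeroGPU Space keeps its decorators and still runs unmodified on paid hardware.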

Ah cool, so I had to move the xformers-related calls inside the inference function to bind everything at that step.
I was close 😅 I even tried to mess with the Attention classes in motion_module but got weird results with cross_attention 😤

Thank you very much 🙏!

fffiloni changed discussion status to closed
