Describe the bug
diffusers/src/diffusers/models/attention.py, line 305 (commit 9f8c915):
_ = xformers.ops.memory_efficient_attention(

diffusers/src/diffusers/models/attention.py, line 468 (commit 9f8c915):
_ = xformers.ops.memory_efficient_attention(
I have users who have xformers pre-compiled, but it does not always work on their local machines. I run my own checks before calling set_use_memory_efficient_attention_xformers, yet this built-in test crashes the application even when I am turning xformers Off.
It is a strange test to have hidden inside the model; it should be done in an outside script, I think. Even if it is kept here, it should at least print a warning rather than crash the application.
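For context, the lines referenced above appear to run a small smoke test by calling xformers.ops.memory_efficient_attention on dummy tensors. A minimal sketch of the kind of non-fatal check I mean (the function name and warning text are my own illustration, not diffusers code):

```python
import warnings

import torch


def xformers_smoke_test() -> bool:
    """Try a tiny memory-efficient attention call; warn instead of raising if it fails."""
    try:
        import xformers.ops

        # Same kind of dummy call as the lines referenced above.
        _ = xformers.ops.memory_efficient_attention(
            torch.randn((1, 2, 40), device="cuda"),
            torch.randn((1, 2, 40), device="cuda"),
            torch.randn((1, 2, 40), device="cuda"),
        )
        return True
    except Exception as e:
        # Report the problem but let the application keep running without xformers.
        warnings.warn(
            f"xformers is installed but not usable on this machine ({e}); "
            "keeping the default attention implementation."
        )
        return False
```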
Reproduction
Pre-compile or install an xformers binary that does not work on your computer; it will always throw an exception, even if you set xformers = Off.
This test should live in another layer of the script, or at least only run when xformers = On (see the sketch below).
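As a sketch of what I mean, the application could run such a check itself and only touch the toggle when the user actually opts in (pipe stands for a loaded diffusers pipeline, the flag name is illustrative, and xformers_smoke_test is the helper sketched above):

```python
# Illustrative application-side guard: only probe xformers when the user opts in,
# so a broken xformers install can never crash the "Off" code path.
use_xformers = True  # e.g. read from the application's own settings

if use_xformers and xformers_smoke_test():
    pipe.unet.set_use_memory_efficient_attention_xformers(True)
# When use_xformers is False, nothing xformers-related is ever called.
```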
Logs
No response
System Info
Windows