Your current environment
As the title says, the standalone vllm-flash-attn package pins torch==2.4.0, while vllm 0.6.5 requires torch==2.5.1, so the two cannot be installed together.
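For reference, the pin can be checked straight from the PyPI metadata (assuming the published metadata matches the wheels uv resolves against); the requires_dist of the 2.6.x releases carries the hard torch==2.4.0 pin that the resolver reports below:

$ curl -s https://pypi.org/pypi/vllm-flash-attn/2.6.2/json | python3 -c "import json,sys; print(json.load(sys.stdin)['info']['requires_dist'])"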
How you are installing vllm
$ uv pip install vllm==0.6.5 vllm-flash-attn
  × No solution found when resolving dependencies:
  ╰─▶ Because vllm==0.6.5 depends on torch{platform_machine != 'aarch64'}==2.5.1 and
      vllm-flash-attn>=2.6.1 depends on torch==2.4.0, we can conclude that vllm==0.6.5 and
      all of:
          vllm-flash-attn==2.6.1
          vllm-flash-attn==2.6.2
      are incompatible. (1)
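A possible workaround, which is an assumption on my side rather than anything confirmed by the maintainers: recent vllm wheels already bundle their own flash-attention build under vllm.vllm_flash_attn, so the standalone vllm-flash-attn package from PyPI may not be needed at all. Installing vllm by itself should then resolve, since the conflicting pin comes only from the standalone package:

$ uv pip install vllm==0.6.5
$ python -c "import vllm.vllm_flash_attn"  # optional sanity check; assumes the bundled module keeps this name in 0.6.5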
Before submitting a new issue...