Closed
Labels
feature request (New feature or request)
Description
🚀 The feature, motivation and pitch
When I tried to deploy openbmb/ProSparse-MiniCPM-1B-sft, it raised an error:
File "/home/xxxxxx/.conda/envs/vllm_env/lib/python3.10/site-packages/vllm/model_executor/models/minicpm.py", line 290, in __init__
self.mlp = MiniCPMMLP(
File "/home/xxxxxx/.conda/envs/vllm_env/lib/python3.10/site-packages/vllm/model_executor/models/minicpm.py", line 166, in __init__
raise ValueError(f"Unsupported activation: {hidden_act}. "
ValueError: Unsupported activation: fatrelu. Only silu is supported for now.
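For context on what the model expects, a minimal sketch of the FATReLU activation named in the error, assuming the common definition (values at or above a fixed threshold pass through; everything below is zeroed) — the threshold value here is purely illustrative:

```python
def fatrelu(x: float, threshold: float = 0.01) -> float:
    # FATReLU sketch (assumed definition): pass values at or above
    # the threshold unchanged, zero out everything below it.
    # The threshold 0.01 is an illustrative placeholder, not the
    # model's actual configured value.
    return x if x >= threshold else 0.0
```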
Here is the script:
#!/bin/bash
set -e
source ~/.bashrc
conda activate vllm_env
port=${1:-"8010"}
devices=${2:-"6,7"}
model_name=${3:-"LOCAL_PATH/ProSparse-MiniCPM-1B-sft"}
export CUDA_VISIBLE_DEVICES="$devices"
vllm serve ${model_name} --port $port --trust-remote-code
Packages:
- torch 2.4.0
- vllm 0.5.4
Alternatives
No response
Additional context
No response