Commit 8f4359b

ch-wan authored and jimoosciuc committed
[deepep] fix: shared experts are not initialized when shared experts fusion is enabled (sgl-project#5072)
1 parent 9f0f616 · commit 8f4359b

File tree

1 file changed: +2 −2 lines

python/sglang/srt/server_args.py (2 additions, 2 deletions)

@@ -197,7 +197,7 @@ class ServerArgs:
     enable_flashmla: bool = False
     flashinfer_mla_disable_ragged: bool = False
     warmups: Optional[str] = None
-    n_share_experts_fusion: Optional[int] = None
+    n_share_experts_fusion: int = 0
     disable_shared_experts_fusion: bool = False

     # KV cache transfer
@@ -1134,7 +1134,7 @@ def add_cli_args(parser: argparse.ArgumentParser):
     parser.add_argument(
         "--n-share-experts-fusion",
         type=int,
-        default=None,
+        default=0,
         help="The number of shared_experts need to be replica to fuse with normal experts in deepseek v3/r1 "
         "we use tp_size by default.",
     )
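
The change replaces a `None` sentinel with `0` for `--n-share-experts-fusion`, so downstream code sees a plain `int` instead of `Optional[int]`. A minimal sketch of how such a default can behave (this is an illustration, not SGLang's actual resolution code; `resolve_n_share_experts_fusion` and `tp_size=8` are hypothetical):

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Mirrors the CLI argument after this commit: default is 0, not None.
    parser = argparse.ArgumentParser()
    parser.add_argument(
        "--n-share-experts-fusion",
        type=int,
        default=0,  # was `default=None` before this commit
    )
    return parser

def resolve_n_share_experts_fusion(n: int, tp_size: int) -> int:
    # Hypothetical resolution step: treat 0 as "auto", falling back to
    # tp_size (the help text says tp_size is used by default).
    return tp_size if n == 0 else n

# Flag omitted: value is 0, which resolves to tp_size.
args = build_parser().parse_args([])
print(resolve_n_share_experts_fusion(args.n_share_experts_fusion, tp_size=8))  # 8

# Flag set explicitly: the user's value wins.
args = build_parser().parse_args(["--n-share-experts-fusion", "2"])
print(resolve_n_share_experts_fusion(args.n_share_experts_fusion, tp_size=8))  # 2
```

Using `0` rather than `None` keeps the field's type a plain `int`, so code paths that read it (e.g. the dataclass default on `ServerArgs`) never have to branch on `None` before doing arithmetic or comparisons.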
