
Commit 96bf47c

BBuf authored and wenju.li committed
simplify fused_moe config logging (sgl-project#5801)
1 parent abe98a6 · commit 96bf47c

2 files changed (+9, -7 lines)

benchmark/kernels/fused_moe_triton/README.md

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-## Benchmark Kernels
+## Tuning Triton MoE Kernels
 
 This directory contains benchmarking tools for MoE (Mixture of Experts) kernels.

python/sglang/srt/layers/moe/fused_moe_triton/fused_moe.py

Lines changed: 8 additions & 6 deletions
@@ -940,19 +940,21 @@ def get_moe_configs(
     )
     if os.path.exists(config_file_path):
         with open(config_file_path) as f:
-            logger.info(
-                "Using configuration from %s for MoE layer. Please note that due to the large number of configs under fused_moe_triton/configs potentially not being tuned with the corresponding Triton version in your current environment, using the current configs may result in performance degradation. To achieve best performance, you can consider re-tuning the Triton fused MOE kernel in your current environment. For the tuning method, please refer to: https://github.com/sgl-project/sglang/blob/main/benchmark/kernels/fused_moe_triton/tuning_fused_moe_triton.py. ",
-                config_file_path,
-            )
+            # Please note that although we find the config files, performance might still be suboptimal.
+            # This is because the tuning environment might differ from your current environment.
+            # For example, updating the Triton version might cause all old configs to become suboptimal.
+            # To achieve the best performance, consider re-tuning the Triton fused MOE kernel in your environment.
+            # For the tuning method, refer to: https://github.com/sgl-project/sglang/tree/main/benchmark/kernels/fused_moe_triton
+            logger.info("Using MoE kernel config from %s.", config_file_path)
             # If a configuration has been found, return it
             return {int(key): val for key, val in json.load(f).items()}
 
     # If no optimized configuration is available, we will use the default
     # configuration
     logger.warning(
         (
-            "Using default MoE config. Performance might be sub-optimal! "
-            "Config file not found at %s, you can tune the config with https://github.com/sgl-project/sglang/blob/main/benchmark/kernels/fused_moe_triton/tuning_fused_moe_triton.py."
+            "Using default MoE kernel config. Performance might be sub-optimal! "
+            "Config file not found at %s, you can create them with https://github.com/sgl-project/sglang/tree/main/benchmark/kernels/fused_moe_triton"
         ),
         config_file_path,
    )
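
For context, below is a minimal standalone sketch of the config-lookup pattern this diff adjusts: look for a tuned per-batch-size JSON config, log which file is used, and fall back to a default with a warning otherwise. The helper name load_kernel_config and the DEFAULT_CONFIG values are illustrative assumptions, not part of sglang's actual API.

import json
import logging
import os

logger = logging.getLogger(__name__)

# Hypothetical fallback used when no tuned config file is found.
DEFAULT_CONFIG = {"BLOCK_SIZE_M": 64, "BLOCK_SIZE_N": 64, "BLOCK_SIZE_K": 32}


def load_kernel_config(config_file_path: str) -> dict:
    """Return tuned kernel configs keyed by int batch size, or a default."""
    if os.path.exists(config_file_path):
        with open(config_file_path) as f:
            logger.info("Using MoE kernel config from %s.", config_file_path)
            # Tuned configs are stored as JSON with string keys; convert to int.
            return {int(key): val for key, val in json.load(f).items()}
    logger.warning(
        "Using default MoE kernel config. Performance might be sub-optimal! "
        "Config file not found at %s.",
        config_file_path,
    )
    return {0: DEFAULT_CONFIG}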
