forked from vllm-project/vllm
Pull requests: HabanaAI/vllm-fork
#1518: Add multi-image prompt support for benchmark offline test (opened Jul 3, 2025 by Jianhong-Zhang)
#1514: [Misc] Allow AutoWeightsLoader to skip loading weights with specific substr in name (opened Jul 2, 2025 by kwisniewski98)
#1511: [deepseek_r1] refine _schedule_prefills for prompts with large length range (opened Jul 2, 2025 by yangulei)
#1509: [SW-233526] Fix MLA and deepseek modeling for 9.0.1 rebase (opened Jul 1, 2025 by xuechendi)
#1499: vllm hpu-extension for automatization of long context prompt (opened Jun 30, 2025 by iboiko-habana)
#1498: vllm hpu-extension for automatization of long context (opened Jun 30, 2025 by iboiko-habana)
#1488: use fused RoPE kernel in DeepseekScalingRotaryEmbedding (opened Jun 27, 2025 by yangulei)