Commit 985fb44

remove torch sdpa as input
Signed-off-by: Frida Hou <201670829+Fridah-nv@users.noreply.github.com>
1 parent: d708701, commit: 985fb44

File tree

2 files changed, +0 -2 lines changed


tensorrt_llm/_torch/auto_deploy/transform/library/attention.py

Lines changed: 0 additions & 1 deletion
@@ -529,7 +529,6 @@ def _apply(
 
         # List of SDPA operations to look for
         sdpa_ops = {
-            torch.ops.auto_deploy.torch_attention_sdpa,
             torch.ops.auto_deploy.torch_attention_grouped_sdpa,
         }
 

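For context, a minimal sketch of how a set like sdpa_ops is typically consumed when scanning a traced torch.fx graph for attention nodes. This is an assumption about the surrounding transform, not code from this commit, and find_sdpa_nodes is a hypothetical helper name; after this change the set would contain only the grouped SDPA op.

import torch
from torch.fx import GraphModule, Node


def find_sdpa_nodes(gm: GraphModule, sdpa_ops: set) -> list[Node]:
    """Collect call_function nodes whose target is one of the given SDPA ops."""
    return [
        node
        for node in gm.graph.nodes
        if node.op == "call_function" and node.target in sdpa_ops
    ]


# Assumed usage after this commit: only the grouped op remains in the set.
# sdpa_ops = {torch.ops.auto_deploy.torch_attention_grouped_sdpa}
# matches = find_sdpa_nodes(gm, sdpa_ops)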
tests/unittest/_torch/auto_deploy/unit/singlegpu/transformations/library/test_attention_matcher.py

Lines changed: 0 additions & 1 deletion
@@ -1157,7 +1157,6 @@ def forward(self, x: torch.Tensor) -> torch.Tensor:
 @pytest.mark.parametrize(
     "model_config",
     [
-        {"type": "standard", "use_grouped_sdpa": False, "name": "SDPA"},
         {"type": "standard", "use_grouped_sdpa": True, "name": "GroupedSDPA"},
         {"type": "already_bsnd", "name": "DirectBSND"},
     ],

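As a hedged illustration of the test side, this is roughly how the remaining model_config entries expand into parametrized cases once the "SDPA" entry is dropped. The test function name, the ids argument, and the placeholder body below are assumptions for illustration, not taken from the test file.

import pytest


@pytest.mark.parametrize(
    "model_config",
    [
        {"type": "standard", "use_grouped_sdpa": True, "name": "GroupedSDPA"},
        {"type": "already_bsnd", "name": "DirectBSND"},
    ],
    ids=lambda cfg: cfg["name"],  # assumption: readable test IDs from the name field
)
def test_attention_matcher_configs(model_config):
    # Placeholder body: the real test builds a model from model_config and runs
    # the attention matcher; here we only check the remaining configurations.
    assert model_config.get("use_grouped_sdpa", True) is True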