Actions: pytorch/benchmark

Showing runs from all workflows
9,080 workflow runs
TorchBench Optim Regression Detector on A100
TorchBench Optim Regression Detector on A100 #470: Scheduled
June 15, 2024 04:03 1h 5m 46s main
Add colfax_cutlass backend to flash_attention operator
TorchBench GPU model stability test #3656: Pull request #2296 synchronize by xuzhao9
June 15, 2024 03:15 1s xz9/fix-attn
Add colfax_cutlass backend to flash_attention operator
TorchBench PR Test #2831: Pull request #2296 synchronize by xuzhao9
June 15, 2024 03:15 2h 56m 36s xz9/fix-attn
Add colfax_cutlass backend to flash_attention operator
TorchBench PR Test on A10G #2333: Pull request #2296 synchronize by xuzhao9
June 15, 2024 03:15 2h 37m 50s xz9/fix-attn
Add colfax_cutlass backend to flash_attention operator
TorchBench PR Test #2830: Pull request #2296 synchronize by xuzhao9
June 15, 2024 03:03 12m 23s xz9/fix-attn
Add colfax_cutlass backend to flash_attention operator
TorchBench GPU model stability test #3655: Pull request #2296 synchronize by xuzhao9
June 15, 2024 03:03 2s xz9/fix-attn
Add colfax_cutlass backend to flash_attention operator
TorchBench PR Test on A10G #2332: Pull request #2296 synchronize by xuzhao9
June 15, 2024 03:03 12m 36s xz9/fix-attn
Add colfax_cutlass backend to flash_attention operator
TorchBench PR Test on A10G #2331: Pull request #2296 synchronize by xuzhao9
June 15, 2024 03:01 2m 51s xz9/fix-attn
Add colfax_cutlass backend to flash_attention operator
TorchBench PR Test #2829: Pull request #2296 synchronize by xuzhao9
June 15, 2024 03:01 2m 49s xz9/fix-attn
Add colfax_cutlass backend to flash_attention operator
TorchBench GPU model stability test #3654: Pull request #2296 synchronize by xuzhao9
June 15, 2024 03:01 2s xz9/fix-attn
Add colfax_cutlass backend to flash_attention operator
TorchBench PR Test #2828: Pull request #2296 synchronize by xuzhao9
June 15, 2024 02:59 2m 9s xz9/fix-attn
Add colfax_cutlass backend to flash_attention operator
TorchBench PR Test on A10G #2330: Pull request #2296 synchronize by xuzhao9
June 15, 2024 02:59 2m 11s xz9/fix-attn
Add colfax_cutlass backend to flash_attention operator
TorchBench GPU model stability test #3653: Pull request #2296 synchronize by xuzhao9
June 15, 2024 02:59 2s xz9/fix-attn
Add colfax_cutlass backend to flash_attention operator
TorchBench PR Test #2827: Pull request #2296 synchronize by xuzhao9
June 15, 2024 02:57 3m 0s xz9/fix-attn
Add colfax_cutlass backend to flash_attention operator
TorchBench PR Test on A10G #2329: Pull request #2296 synchronize by xuzhao9
June 15, 2024 02:57 2m 57s xz9/fix-attn
Add colfax_cutlass backend to flash_attention operator
TorchBench GPU model stability test #3652: Pull request #2296 synchronize by xuzhao9
June 15, 2024 02:57 2s xz9/fix-attn
Deploy the flash_attention operator CI on H100
TorchBench PR Test on A10G #2328: Commit 5831be0 pushed by facebook-github-bot
June 15, 2024 02:37 2h 37m 23s main
Deploy the flash_attention operator CI on H100
TorchBench PR Test #2826: Commit 5831be0 pushed by facebook-github-bot
June 15, 2024 02:37 2h 57m 9s main
Add colfax_cutlass backend to flash_attention operator
TorchBench PR Test on A10G #2327: Pull request #2296 synchronize by xuzhao9
June 15, 2024 01:53 1h 4m 9s xz9/fix-attn
Add colfax_cutlass backend to flash_attention operator
TorchBench GPU model stability test #3651: Pull request #2296 synchronize by xuzhao9
June 15, 2024 01:53 1s xz9/fix-attn
Add colfax_cutlass backend to flash_attention operator
TorchBench PR Test #2825: Pull request #2296 synchronize by xuzhao9
June 15, 2024 01:53 1h 3m 56s xz9/fix-attn
Add colfax_cutlass backend to flash_attention operator
TorchBench PR Test #2824: Pull request #2296 synchronize by xuzhao9
June 15, 2024 00:26 1h 27m 16s xz9/fix-attn
Add colfax_cutlass backend to flash_attention operator
TorchBench GPU model stability test #3650: Pull request #2296 synchronize by xuzhao9
June 15, 2024 00:26 2s xz9/fix-attn
Add colfax_cutlass backend to flash_attention operator
TorchBench PR Test on A10G #2326: Pull request #2296 synchronize by xuzhao9
June 15, 2024 00:26 1h 27m 30s xz9/fix-attn
Refactor code for sum Triton kernels (#2303)
TorchBench PR Test #2823: Commit 339ccfd pushed by facebook-github-bot
June 14, 2024 23:39 2h 54m 39s main