content/blog/2025-11-05-1762335811.md: 1 addition & 1 deletion
@@ -64,7 +64,7 @@ It may not even need to actually run the code, as long as it produces correct te
A simulator (with a library of lots of popular GPUs) might also help train ML schedulers by simulating a diverse range of operating constraints. Like a "GPU Dojo".
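The "GPU Dojo" idea can be sketched in miniature. Everything below is hypothetical and invented for illustration (the `simulate` function, the per-SM cost model, and the greedy policy are not from any real simulator): a crude model assigns kernels to streaming multiprocessors and reports the makespan, which is the kind of cheap feedback signal a learned scheduler could train against instead of real hardware.

```python
import random

# Hypothetical toy simulator: each "SM" runs its assigned kernels serially,
# and the workload's makespan is the latest per-SM finish time.
def simulate(num_sms, kernel_costs, policy):
    sm_busy_until = [0.0] * num_sms       # per-SM accumulated busy time
    for cost in kernel_costs:
        sm = policy(sm_busy_until)        # scheduler picks an SM for this kernel
        sm_busy_until[sm] += cost         # kernel occupies that SM for `cost`
    return max(sm_busy_until)             # makespan of the whole workload

# A baseline policy a trained scheduler would have to beat: least-loaded SM first.
greedy_policy = lambda busy: busy.index(min(busy))

random.seed(0)
workload = [random.uniform(0.1, 2.0) for _ in range(100)]  # random kernel costs
print(simulate(4, workload, greedy_policy))
```

Swapping in different SM counts or cost distributions is how such a harness could expose a scheduler to the "diverse range of operating constraints" mentioned above.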
-There have been attempts like [GPGPU-Sim](https://github.com/gpgpu-sim/gpgpu-sim_distribution) and [Accel-Sim](https://accel-sim.github.io/), and it is a hard problem. Again, this isn't a new idea. CPU compilers have hand-written models of various hardware targets.
+There have been attempts like [GPGPU-Sim](https://github.com/gpgpu-sim/gpgpu-sim_distribution), [Accel-Sim](https://accel-sim.github.io/), and [MGPUSIM](https://github.com/sarchlab/mgpusim), and it is a hard problem. Again, this isn't a new idea. CPU compilers have hand-written models of various hardware targets.
A simulator will never be perfect - ask any engineer in Formula 1. But I don't think any Formula 1 team today would get rid of its simulation software just because it can't model reality with 100% accuracy.