18 changes: 12 additions & 6 deletions _bibliography/papers.bib
@@ -1,18 +1,24 @@
 
 --- 2025
-@article{lasby2025reapexpertspruningprevails_preprint,
+@inproceedings{lasby2025reapexpertspruningprevails,
 title={REAP the Experts: Why Pruning Prevails for One-Shot MoE compression},
 author={Mike Lasby and Ivan Lazarevich and Nish Sinnadurai and Sean Lie and Yani Ioannou and Vithursan Thangarasa},
-year={2025},
+year={2026},
 EPRINTTYPE={arXiv},
 eprint={2510.13999},
 archivePrefix={arXiv},
 ARXIVID={2510.13999},
 primaryClass={cs.LG},
-month={10},
-year={2025},
-ABBR={arXiv preprint},
-journal = {arXiv preprint arXiv:2510.13999},
+month={1},
+year={2026},
+BOOKTITLE = {{International Conference on Learning Representations (ICLR)}},
+ABBR={ICLR},
+VENUE = {{Rio de Janeiro, Brazil}},
+EVENTDATE = {2026-04-23/2026-04-27},
+openreview = {https://openreview.net/forum?id=ukGxWd2aDG},
+pdf = {https://openreview.net/pdf?id=ukGxWd2aDG},
+code = {https://github.com/CerebrasResearch/reap},
+blog = {https://www.cerebras.ai/blog/reap},
 bibtex_show={true}
 }
 
7 changes: 7 additions & 0 deletions _news/announcement_27_mike_borealis.md
@@ -0,0 +1,7 @@
+---
+layout: post
+date: 2025-11-19 00:00:00-0700
+inline: true
+---
+
+Mike Lasby was announced as one of only 10 [RBC Borealis 2025 AI Fellows](https://rbcborealis.com/news/the-2024-2025-rbc-borealis-fellows-driving-the-future-of-ai/).
File renamed without changes.
8 changes: 8 additions & 0 deletions _news/announcement_29_reap_ICLR.md
@@ -0,0 +1,8 @@
+---
+layout: post
+date: 2026-01-26 00:00:00-0700
+inline: true
+---
+
+[Mike Lasby's](/labmembers/) collaborative work with Cerebras, "REAP the Experts: Why Pruning Prevails for One-Shot MoE Compression" {% cite lasby2025reapexpertspruningprevails %}, has been accepted at the [International Conference on Learning Representations (ICLR), 2026](https://iclr.cc/Conferences/2026).
+This work studies the one-shot compression of Sparse Mixture-of-Experts (SMoE) models, demonstrating that REAP (Router-weighted Expert Activation Pruning) retains more compressed-model quality than existing expert merging and pruning methods.
4 changes: 2 additions & 2 deletions _people/manuel_zamudiolopez.md
Original file line number Diff line number Diff line change
@@ -3,12 +3,12 @@ layout: page
 firstname: Manuel
 lastname: Zamudio Lopez
 pronouns: he/him
-description: PhD Student co-supervised with <a href="https://profiles.ucalgary.ca/hamidreza-zareipour">Dr. Hamidreza Zareipour</a> (Fall 2022 - Present)
+description: PhD Student co-supervised with <a href="https://profiles.ucalgary.ca/hamidreza-zareipour">Dr. Hamidreza Zareipour</a> (Fall 2022 - Winter 2026)
 img: assets/img/people/manuel_zamudiolopez.jpg
 redirect: https://www.linkedin.com/in/manuel-zamudio
 orcid_id: 0009-0009-7460-8178
 linkedin_username: manuel-zamudio
 scholar_userid: 4eW8V0YAAAAJ
-category: PhD Students
+category: Alumni
 show: true
 ---