Hi there! I'm Asher, a half-decent coder and senior at Brown University. I focus on the development & analysis of statistical and ML algorithms, especially those with mathematical guarantees, such as conformal prediction. The following are some of my most interesting repositories:
Active Projects
- Private project: As part of my Applied Mathematics Honors thesis at Brown, I'm working on a new architecture for language models that aims to improve reasoning while drastically reducing the compute needed at inference time. I'll be training the compute equivalent of ~11 1B-parameter models with the help of Brown's supercomputer, OSCAR. All work on this paper is currently private, but it'll be made public once I've published!
- In ML Papers, I'm building all the major ML/DL algorithms from scratch -- no PyTorch or scikit-learn. So far I've built feed-forward neural networks, decision trees, random forests, and GBDTs (a toy example of the style is sketched below). I mostly work on this when I'm bored and not actively building anything new.
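
To give a flavor of what "from scratch" means here, below is a minimal, hypothetical sketch (not the repo's actual code) of a one-hidden-layer network trained on XOR with nothing but NumPy:

```python
# Toy single-hidden-layer network trained on XOR with plain NumPy.
# Hypothetical sketch only -- not the implementation in ML Papers.
import numpy as np

rng = np.random.default_rng(0)

# XOR data: 4 points, 2 features, binary labels
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Parameters for a 2 -> 8 -> 1 network
W1 = rng.normal(size=(2, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(size=(8, 1)); b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(2000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)          # hidden activations
    p = sigmoid(h @ W2 + b2)          # predicted probabilities

    # Backward pass (binary cross-entropy loss)
    dlogits = (p - y) / len(X)        # gradient w.r.t. output pre-activation
    dW2 = h.T @ dlogits
    db2 = dlogits.sum(axis=0, keepdims=True)
    dh = dlogits @ W2.T * (1 - h**2)  # tanh derivative
    dW1 = X.T @ dh
    db1 = dh.sum(axis=0, keepdims=True)

    # Gradient descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(np.round(p, 3))  # should approach [0, 1, 1, 0]
```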
Finished Projects
- I recently published a paper in TMLR on K-Means Binning for GBDTs, which can be found on arXiv. The code for this paper is here. A toy illustration of the binning idea appears after this list.
- For the 2024 election, I founded 24cast.org, the very first open-source, ML-based election prediction model. You can find these repositories in the Brown Political Review organization. Of particular interest are the model itself and the code for election night, which includes a conformal prediction model designed to take in live election data and output mathematically valid, minimum-size confidence intervals using linear regression (a generic sketch of this kind of interval also appears after this list). There's plenty of other cool stuff in that repo too -- take a look around!
- At an event hosted by the Institute for Replication, I worked with PhD students and professors to replicate a paper on green policies and rightward political shifts in Europe. The code can be found here.
- Along with one other Brown student, I designed Time-Based Bayesian Optimization, a method for derivative-free optimization when evaluations are expensive. It was created for a school project and ultimately submitted to NeurIPS. While it wasn't accepted (another researcher had published the idea a few years earlier), it's still fun to look at! The paper can be found in the README.
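
For a rough feel of the general idea behind k-means binning, here's a generic illustration (not the paper's code or its exact algorithm): each continuous feature is discretized by running 1-D k-means on its values, and the GBDT is then trained on the bin indices.

```python
# Generic illustration of k-means feature binning before a GBDT.
# Uses scikit-learn for brevity; not the TMLR paper's implementation.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import KBinsDiscretizer

X, y = make_regression(n_samples=2000, n_features=10, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# strategy="kmeans" places bin edges via 1-D k-means on each feature
binner = KBinsDiscretizer(n_bins=32, encode="ordinal", strategy="kmeans")
X_train_binned = binner.fit_transform(X_train)
X_test_binned = binner.transform(X_test)

gbdt = GradientBoostingRegressor(random_state=0).fit(X_train_binned, y_train)
print("R^2 on k-means-binned features:", round(gbdt.score(X_test_binned, y_test), 3))
```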
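
Since conformal prediction comes up in a couple of the projects above, here's the core recipe in its simplest form: a generic split-conformal sketch around a linear model. This is illustrative only, not the 24cast.org election-night code.

```python
# Generic split-conformal regression intervals -- illustrative sketch only.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Toy data: y = 3x + noise
X = rng.uniform(-1, 1, size=(1000, 1))
y = 3 * X[:, 0] + rng.normal(scale=0.5, size=1000)

# Split into a proper training set and a calibration set
X_fit, X_cal = X[:600], X[600:]
y_fit, y_cal = y[:600], y[600:]

model = LinearRegression().fit(X_fit, y_fit)

# Nonconformity scores: absolute residuals on the calibration set
scores = np.abs(y_cal - model.predict(X_cal))

# Finite-sample-corrected quantile for 90% marginal coverage
alpha = 0.10
n = len(scores)
q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Intervals on new points: prediction +/- q, valid with coverage >= 1 - alpha
X_new = np.array([[0.0], [0.5], [-0.8]])
preds = model.predict(X_new)
print(np.column_stack([preds - q, preds + q]))
```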
These are just some of my projects, and I've got plenty more in the works! Feel free to reach out to me at asherlabovich@gmail.com with any questions.


