# high-dimensional-optimization

Here are 2 public repositories matching this topic...

A memory-efficient, gradient-free (zeroth-order, derivative-free) optimizer designed to mitigate the curse of dimensionality in black-box optimization and memory-constrained machine learning. It provides an O(log D) gradient-estimation approach that can train neural networks without ever computing analytical derivatives or running backpropagation.

  • Updated Apr 26, 2026
  • Python
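The repository's O(log D) estimator is not shown on this page, so as a point of reference, here is a minimal sketch of the classic zeroth-order baseline it improves on: simultaneous-perturbation (SPSA-style) gradient estimation, which approximates a gradient from just two function evaluations per step, with no analytical derivatives. The function names `spsa_gradient` and `minimize` are illustrative, not from the repo.

```python
import numpy as np

def spsa_gradient(f, x, c=1e-2, rng=None):
    """One zeroth-order gradient estimate from two evaluations of f,
    independent of the dimension D (classic SPSA baseline; this is
    NOT the repo's O(log D) scheme, which is not shown on this page)."""
    rng = np.random.default_rng() if rng is None else rng
    delta = rng.choice([-1.0, 1.0], size=x.shape)  # Rademacher perturbation
    # Central difference along the random direction; for +/-1 entries,
    # dividing by delta_i equals multiplying by delta_i.
    return (f(x + c * delta) - f(x - c * delta)) / (2 * c) * delta

def minimize(f, x0, lr=0.005, steps=500, seed=0):
    """Plain gradient descent driven only by zeroth-order estimates."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(steps):
        x -= lr * spsa_gradient(f, x, rng=rng)
    return x

# Example: minimize a quadratic in D=50 dimensions without derivatives.
f = lambda x: float(np.sum(x ** 2))
x_star = minimize(f, np.ones(50))
```

Note that the step size must shrink with dimension here (roughly lr < 1/D for this quadratic), which is exactly the scaling pain that more query-efficient high-dimensional estimators aim to reduce.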
