feat(gui): implement Activation Functions interactive plotter#645

Closed
fderuiter wants to merge 2 commits into main from feature/activation-functions-plotter-1061055521213772669

Conversation

@fderuiter
Owner

This PR addresses the missing "Activation Functions" plotter feature from the GUI todo list (todo_gui.md).

Architecture
Adhering to the "Glassblower" persona and clean architecture constraints:

  1. No Logic in GUI: The underlying mathematical functions (tanh, gelu, and their derivatives) were added to the math_explorer library (math_explorer/src/ai/deep_learning_theory/calculus.rs) rather than hardcoding them in the UI.
  2. Interface Segregation: A new module, activation_functions.rs, was created specifically for this UI tool.
  3. Strategy Pattern: The new tool implements the existing AiTool trait and is injected cleanly into the AiTab's tools vector, avoiding massive match/if-else modifications in the main application loop.
  4. Type Safety & Error Handling: The UI uses a strict ActivationFunction enum to manage selection state instead of loose strings. No unwrap() or expect() calls were used in the UI thread.
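The enum-based selection state from point 4 can be sketched as follows. This is a minimal standalone sketch: scalar stand-ins are used (the real tool delegates to math_explorer's vectorized functions), and everything beyond the `ActivationFunction` name and its four variants is an assumption, not the PR's actual code.

```rust
// Sketch of the strict selection-state enum described above. The `eval`
// helper is illustrative; the real tool calls the library functions.
#[derive(Clone, Copy, PartialEq)]
enum ActivationFunction {
    ReLU,
    Sigmoid,
    Tanh,
    Gelu,
}

impl ActivationFunction {
    fn eval(self, x: f64) -> f64 {
        match self {
            ActivationFunction::ReLU => x.max(0.0),
            ActivationFunction::Sigmoid => 1.0 / (1.0 + (-x).exp()),
            ActivationFunction::Tanh => x.tanh(),
            ActivationFunction::Gelu => {
                // Approximate GELU, matching the library formulation.
                let c = (2.0f64 / std::f64::consts::PI).sqrt();
                0.5 * x * (1.0 + (c * (x + 0.044715 * x.powi(3))).tanh())
            }
        }
    }
}

fn main() {
    // The UI matches on the selected variant instead of comparing strings,
    // so an invalid selection cannot be represented at all.
    let selected = ActivationFunction::Sigmoid;
    assert_eq!(selected.eval(0.0), 0.5);
    println!("ok");
}
```

Because the enum is `Copy` and `PartialEq`, it slots directly into egui selection widgets without any string parsing or `unwrap()`.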

Features

  • Interactive selection between ReLU, Sigmoid, Tanh, and GELU.
  • Toggle switch to overlay the mathematical derivative of the selected function as a dashed line.
  • Sliders to adjust the X-axis plotting domain interactively.

All code passes `cargo check`, `cargo fmt --check`, and the `cargo test` suite.


PR created automatically by Jules for task 1061055521213772669 started by @fderuiter

- Adds `tanh`, `tanh_prime`, `gelu`, and `gelu_prime` functions to `math_explorer/src/ai/deep_learning_theory/calculus.rs`.
- Creates `ActivationFunctionsTool` in `math_explorer_gui/src/tabs/ai/activation_functions.rs` using `egui_plot` to visualize the functions and their derivatives.
- Integrates the new tool into `AiTab` (`math_explorer_gui/src/tabs/ai/mod.rs`).
- Updates `todo_gui.md` marking the task as complete.
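A minimal scalar sketch of the tanh helpers from the first bullet (the library versions in `calculus.rs` map over nalgebra `DVector`s; plain slices are used here so the example stands alone, and the `_vals` names are illustrative):

```rust
// Scalar stand-ins for the `tanh` / `tanh_prime` additions; the real
// functions map over nalgebra vectors rather than slices.
fn tanh_vals(z: &[f64]) -> Vec<f64> {
    z.iter().map(|v| v.tanh()).collect()
}

// d/dz tanh(z) = 1 - tanh(z)^2
fn tanh_prime_vals(z: &[f64]) -> Vec<f64> {
    z.iter()
        .map(|v| {
            let t = v.tanh();
            1.0 - t * t
        })
        .collect()
}

fn main() {
    let z = [0.0_f64, 1.0];
    assert_eq!(tanh_vals(&z)[0], 0.0);
    assert!((tanh_prime_vals(&z)[0] - 1.0).abs() < 1e-12);
    println!("ok");
}
```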

Co-authored-by: google-labs-jules[bot] <161369871+google-labs-jules[bot]@users.noreply.github.com>
@google-labs-jules
Contributor

👋 Jules, reporting for duty! I'm here to lend a hand with this pull request.

When you start a review, I'll add a 👀 emoji to each comment to let you know I've read it. I'll focus on feedback directed at me and will do my best to stay out of conversations between you and other bots or reviewers to keep the noise down.

I'll push a commit with your requested changes shortly after. Please note there might be a delay between these steps, but rest assured I'm on the job!

For more direct control, you can switch me to Reactive Mode. When this mode is on, I will only act on comments where you specifically mention me with @jules. You can find this option in the Pull Request section of your global Jules UI settings. You can always switch back!

New to Jules? Learn more at jules.google/docs.


For security, I will only act on instructions from the user who triggered this task.

@fderuiter fderuiter marked this pull request as ready for review March 3, 2026 14:08
Copilot AI review requested due to automatic review settings March 3, 2026 14:08

Copilot AI left a comment


Pull request overview

Implements the missing “Activation Functions” interactive plotter in the AI GUI tab and adds the corresponding math primitives (tanh/GELU + derivatives) to the math_explorer library.

Changes:

  • Added ActivationFunctionsTool implementing AiTool with function selection, derivative overlay, and domain controls.
  • Added tanh, tanh_prime, gelu, and gelu_prime to the deep-learning calculus module for reuse outside the UI.
  • Minor formatting/todo-list updates (Prime Spiral loop formatting; GUI todo checkbox marked complete).

Reviewed changes

Copilot reviewed 5 out of 5 changed files in this pull request and generated 3 comments.

Summary per file:

  • todo_gui.md: Marks the Activation Functions plotter as completed in the GUI roadmap.
  • math_explorer_gui/src/tabs/number_theory/prime_spiral.rs: Formatting-only change to the prime iteration loop.
  • math_explorer_gui/src/tabs/ai/mod.rs: Registers the new Activation Functions tool module and adds it to the AI tab tool list.
  • math_explorer_gui/src/tabs/ai/activation_functions.rs: New interactive activation-function plotting tool (function + derivative).
  • math_explorer/src/ai/deep_learning_theory/calculus.rs: Adds tanh/GELU activation functions and their derivatives to the shared library.


Comment on lines +84 to +86
ui.label("Domain:");
ui.add(egui::Slider::new(&mut self.x_min, -10.0..=-1.0).text("Min X"));
ui.add(egui::Slider::new(&mut self.x_max, 1.0..=10.0).text("Max X"));

Copilot AI Mar 3, 2026


The X-axis domain sliders are constrained to x_min ∈ [-10, -1] and x_max ∈ [1, 10], which prevents exploring purely-positive/negative domains and also prevents zooming in to narrow windows around 0 (e.g., [-0.5, 0.5]). Consider letting both sliders span a shared range (e.g., -10..=10) and enforcing x_min < x_max via clamping/swapping (and optionally showing a validation message) so the domain is truly adjustable.
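One way to realize this suggestion, sketched with an illustrative `clamp_domain` helper (the helper name and the `MIN_SPAN` constant are assumptions, not part of the PR):

```rust
// After both sliders span the full -10..=10 range, enforce x_min < x_max
// by swapping out-of-order values and padding degenerate windows.
const MIN_SPAN: f64 = 0.1; // illustrative minimum window width

fn clamp_domain(x_min: &mut f64, x_max: &mut f64) {
    if *x_min > *x_max {
        std::mem::swap(x_min, x_max);
    }
    if *x_max - *x_min < MIN_SPAN {
        *x_max = *x_min + MIN_SPAN;
    }
}

fn main() {
    let (mut lo, mut hi) = (3.0, -2.0);
    clamp_domain(&mut lo, &mut hi);
    assert_eq!((lo, hi), (-2.0, 3.0));
    println!("ok");
}
```

In the tool this would run right after the two `ui.add(egui::Slider::new(...))` calls, so both sliders can share the full range and still produce a valid plotting window such as [-0.5, 0.5].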

Comment on lines +93 to +98
let step = (self.x_max - self.x_min) / (self.points_count as f64 - 1.0);
let x_vals: Vec<f64> = (0..self.points_count)
.map(|i| self.x_min + i as f64 * step)
.collect();
let x_vec = DVector::from_vec(x_vals.clone());


Copilot AI Mar 3, 2026


x_vals is cloned to build x_vec (DVector::from_vec(x_vals.clone())), which duplicates allocations/work every frame. Since x_vals is only used to build plot points, consider generating x coordinates once (e.g., directly as a DVector/iterator) and using that single source to both compute y-values and build PlotPoints, avoiding the extra clone.
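A sketch of the clone-free variant the comment asks for (nalgebra/egui_plot types are replaced by plain `Vec`s here, and `sample_xs` is an illustrative helper name):

```rust
// Build the x coordinates once and derive both the y-values and the plot
// points from that single Vec, instead of cloning it into a second container.
fn sample_xs(x_min: f64, x_max: f64, n: usize) -> Vec<f64> {
    let step = (x_max - x_min) / (n as f64 - 1.0);
    (0..n).map(|i| x_min + i as f64 * step).collect()
}

fn main() {
    let xs = sample_xs(-1.0, 1.0, 5);
    let ys: Vec<f64> = xs.iter().map(|x| x.tanh()).collect();
    // [x, y] pairs in the shape egui_plot's PlotPoints accepts.
    let points: Vec<[f64; 2]> = xs.iter().zip(&ys).map(|(&x, &y)| [x, y]).collect();
    assert_eq!(points.len(), 5);
    assert_eq!(points[0][0], -1.0);
    println!("ok");
}
```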

Comment on lines +84 to +99
/// GELU (Gaussian Error Linear Unit) activation function: f(z) = 0.5 * z * (1 + tanh(sqrt(2/pi) * (z + 0.044715 * z^3))).
/// Using the approximate formulation.
pub fn gelu(z: &Vector) -> Vector {
let sqrt_2_over_pi = (2.0f64 / std::f64::consts::PI).sqrt();
z.map(|v| 0.5 * v * (1.0 + (sqrt_2_over_pi * (v + 0.044715 * v.powi(3))).tanh()))
}

/// Derivative of GELU (approximate).
pub fn gelu_prime(z: &Vector) -> Vector {
let sqrt_2_over_pi = (2.0f64 / std::f64::consts::PI).sqrt();
z.map(|v| {
let x = sqrt_2_over_pi * (v + 0.044715 * v.powi(3));
let tanh_x = x.tanh();
let sech2_x = 1.0 - tanh_x * tanh_x;
0.5 * (1.0 + tanh_x)
+ 0.5 * v * sech2_x * sqrt_2_over_pi * (1.0 + 3.0 * 0.044715 * v.powi(2))
})
}

Copilot AI Mar 3, 2026


gelu/gelu_prime repeat the same constants (0.044715 and sqrt_2_over_pi) in multiple places. Consider extracting these into named consts (e.g., GELU_COEFF, SQRT_2_OVER_PI) or a shared helper to avoid magic numbers and keep the approximation/derivative consistent if the coefficients ever need to change.
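The refactor could look like the following sketch (constant names follow the ones suggested above; `sqrt_2_over_pi` is a function rather than a `const` because `f64::sqrt` cannot run in a const context, and scalar signatures stand in for the library's vectorized ones):

```rust
// Named constants shared by gelu and gelu_prime so the approximation and
// its derivative cannot drift apart if a coefficient changes.
const GELU_COEFF: f64 = 0.044715;

fn sqrt_2_over_pi() -> f64 {
    (2.0f64 / std::f64::consts::PI).sqrt()
}

fn gelu(v: f64) -> f64 {
    0.5 * v * (1.0 + (sqrt_2_over_pi() * (v + GELU_COEFF * v.powi(3))).tanh())
}

fn gelu_prime(v: f64) -> f64 {
    let x = sqrt_2_over_pi() * (v + GELU_COEFF * v.powi(3));
    let tanh_x = x.tanh();
    let sech2_x = 1.0 - tanh_x * tanh_x;
    0.5 * (1.0 + tanh_x)
        + 0.5 * v * sech2_x * sqrt_2_over_pi() * (1.0 + 3.0 * GELU_COEFF * v * v)
}

fn main() {
    assert_eq!(gelu(0.0), 0.0);
    // At z = 0 the derivative of the GELU approximation is exactly 0.5.
    assert!((gelu_prime(0.0) - 0.5).abs() < 1e-12);
    println!("ok");
}
```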

Owner Author

@copilot open a new pull request to apply changes based on this feedback

Co-authored-by: google-labs-jules[bot] <161369871+google-labs-jules[bot]@users.noreply.github.com>
@fderuiter fderuiter closed this Mar 13, 2026
@fderuiter fderuiter deleted the feature/activation-functions-plotter-1061055521213772669 branch March 13, 2026 18:40