feat: Implement Activation Functions Plotter in AI Tab#646

Merged
fderuiter merged 2 commits into main from
feature/ai-activation-functions-15589009739659043216
Mar 4, 2026
Conversation

@fderuiter
Owner

This PR implements the interactive plotter for Activation Functions in the AI Deep Learning Theory tab as specified in todo_gui.md.

Architectural Analysis
The AiTab already uses a Strategy Pattern (the AiTool trait), which allows new tools to be integrated cleanly without adding large if/else chains to app.rs. This adheres to the Single Responsibility and Open/Closed Principles.

Implementation Details:

  1. Math Library (math_explorer): Added tanh, tanh_prime, gelu, and gelu_prime functions to src/ai/deep_learning_theory/calculus.rs. This keeps the mathematical logic strictly isolated from the GUI.
  2. GUI Tool (math_explorer_gui): Created a new module tabs/ai/activation_functions.rs containing the ActivationFunctionsTool struct. This struct manages its own state (selected function, plot range) and handles plotting using egui_plot.
  3. Integration: Registered the new tool within tabs/ai/mod.rs by adding it to the AiTab tool list.
  4. Checklist: Updated todo_gui.md to mark the "Activation Functions" task as complete, effectively finishing the entire "3.1 Deep Learning Theory" section.

Quality & Safety:

  • No `unwrap()` or `expect()` calls are used.
  • The UI state is fully encapsulated within the ActivationFunctionsTool.
  • Math operations utilize the specific Vector struct from the AI module (math_explorer::ai::deep_learning_theory::linear_algebra::Vector).

PR created automatically by Jules for task 15589009739659043216 started by @fderuiter

- Added `tanh`, `tanh_prime`, `gelu`, and `gelu_prime` to `math_explorer` calculus module.
- Created `ActivationFunctionsTool` in `math_explorer_gui` using `egui_plot`.
- Integrated tool into `AiTab` adhering to Strategy Pattern.
- Checked off "Activation Functions" task in `todo_gui.md`.

Co-authored-by: google-labs-jules[bot] <161369871+google-labs-jules[bot]@users.noreply.github.com>
@google-labs-jules
Contributor

👋 Jules, reporting for duty! I'm here to lend a hand with this pull request.

When you start a review, I'll add a 👀 emoji to each comment to let you know I've read it. I'll focus on feedback directed at me and will do my best to stay out of conversations between you and other bots or reviewers to keep the noise down.

I'll push a commit with your requested changes shortly after. Please note there might be a delay between these steps, but rest assured I'm on the job!

For more direct control, you can switch me to Reactive Mode. When this mode is on, I will only act on comments where you specifically mention me with @jules. You can find this option in the Pull Request section of your global Jules UI settings. You can always switch back!

New to Jules? Learn more at jules.google/docs.


For security, I will only act on instructions from the user who triggered this task.

- Ran `cargo fmt` across all crates to ensure standard styling.
- Fixed `clippy::upper_case_acronyms` in `ActivationFunction` enum (`GELU` to `Gelu`).
- Fixed `clippy::needless_range_loop` violations in `neural_network_viz.rs`.

Co-authored-by: google-labs-jules[bot] <161369871+google-labs-jules[bot]@users.noreply.github.com>
@fderuiter fderuiter marked this pull request as ready for review March 4, 2026 17:01
Copilot AI review requested due to automatic review settings March 4, 2026 17:02
@fderuiter fderuiter merged commit 38ae655 into main Mar 4, 2026
1 check passed
@fderuiter fderuiter deleted the feature/ai-activation-functions-15589009739659043216 branch March 4, 2026 17:02

Copilot AI left a comment


Pull request overview

This PR adds an interactive “Activation Functions” plotter to the AI → Deep Learning Theory tab, backed by new math helpers in the math_explorer crate, and wires the tool into the existing AI tool selector.

Changes:

  • Added tanh/tanh_prime and gelu/gelu_prime to ai::deep_learning_theory::calculus.
  • Introduced a new ActivationFunctionsTool UI module that plots activation functions and their derivatives with egui_plot.
  • Registered the new tool in the AI tab and marked the corresponding roadmap item as implemented.

Reviewed changes

Copilot reviewed 6 out of 6 changed files in this pull request and generated 2 comments.

| File | Description |
| --- | --- |
| `todo_gui.md` | Marks “Activation Functions” (and the Deep Learning Theory section) as implemented. |
| `math_explorer_gui/src/tabs/number_theory/prime_spiral.rs` | Formatting-only change to the prime lookup loop. |
| `math_explorer_gui/src/tabs/neuroscience/neural_network_viz.rs` | Refactors weight initialization and synaptic current loops. |
| `math_explorer_gui/src/tabs/ai/mod.rs` | Registers the new Activation Functions tool in the AI tab tool list. |
| `math_explorer_gui/src/tabs/ai/activation_functions.rs` | New tool implementing the activation function + derivative plotter UI. |
| `math_explorer/src/ai/deep_learning_theory/calculus.rs` | Adds tanh/GELU activation functions and their derivatives to the math library. |


Comment on lines +42 to +47
```rust
if let Some(row) = weights.get_mut(next) {
    row[i] = 2.0; // Excitatory connection
}
if let Some(row) = weights.get_mut(prev) {
    row[i] = -1.0; // Inhibitory connection
}
```

Copilot AI Mar 4, 2026


The get_mut checks here are redundant because weights is initialized as a num_neurons x num_neurons matrix and next/prev are always within 0..num_neurons. Keeping the Option handling can silently skip setting connections if an invariant is broken, making issues harder to detect. Consider using direct indexing (or an explicit debug_assert! on dimensions) instead of get_mut.

Suggested change

```diff
-if let Some(row) = weights.get_mut(next) {
-    row[i] = 2.0; // Excitatory connection
-}
-if let Some(row) = weights.get_mut(prev) {
-    row[i] = -1.0; // Inhibitory connection
-}
+weights[next][i] = 2.0; // Excitatory connection
+weights[prev][i] = -1.0; // Inhibitory connection
```

Copilot uses AI. Check for mistakes.
Comment on lines +106 to +108
```rust
let step = (self.x_max - self.x_min) / (self.points as f64 - 1.0);
let mut x_vals = Vec::with_capacity(self.points);
for i in 0..self.points {
```

Copilot AI Mar 4, 2026


step is computed as (x_max - x_min) / (points - 1). If points is ever set to 0 or 1, this will produce an invalid step (division by zero / infinities) and can yield NaNs in the plotted data. Consider clamping points to at least 2 (or handling the points <= 1 case explicitly) before computing step.

Suggested change

```diff
-let step = (self.x_max - self.x_min) / (self.points as f64 - 1.0);
-let mut x_vals = Vec::with_capacity(self.points);
-for i in 0..self.points {
+let effective_points = self.points.max(2);
+let step = (self.x_max - self.x_min) / (effective_points as f64 - 1.0);
+let mut x_vals = Vec::with_capacity(effective_points);
+for i in 0..effective_points {
```
