feat: Implement Activation Functions Plotter in AI Tab #646
Conversation
- Added `tanh`, `tanh_prime`, `gelu`, and `gelu_prime` to the `math_explorer` calculus module.
- Created `ActivationFunctionsTool` in `math_explorer_gui` using `egui_plot`.
- Integrated the tool into `AiTab`, adhering to the Strategy Pattern.
- Checked off the "Activation Functions" task in `todo_gui.md`.

Co-authored-by: google-labs-jules[bot] <161369871+google-labs-jules[bot]@users.noreply.github.com>
👋 Jules, reporting for duty! I'm here to lend a hand with this pull request. When you start a review, I'll add a 👀 emoji to each comment to let you know I've read it. I'll focus on feedback directed at me and will do my best to stay out of conversations between you and other bots or reviewers to keep the noise down. I'll push a commit with your requested changes shortly after. Please note there might be a delay between these steps, but rest assured I'm on the job! For more direct control, you can switch me to Reactive Mode. When this mode is on, I will only act on comments where you specifically mention me. New to Jules? Learn more at jules.google/docs. For security, I will only act on instructions from the user who triggered this task.
- Ran `cargo fmt` across all crates to ensure standard styling.
- Fixed `clippy::upper_case_acronyms` in the `ActivationFunction` enum (`GELU` to `Gelu`).
- Fixed `clippy::needless_range_loop` violations in `neural_network_viz.rs`.

Co-authored-by: google-labs-jules[bot] <161369871+google-labs-jules[bot]@users.noreply.github.com>
Pull request overview
This PR adds an interactive “Activation Functions” plotter to the AI → Deep Learning Theory tab, backed by new math helpers in the math_explorer crate, and wires the tool into the existing AI tool selector.
Changes:
- Added `tanh`/`tanh_prime` and `gelu`/`gelu_prime` to `ai::deep_learning_theory::calculus`.
- Introduced a new `ActivationFunctionsTool` UI module that plots activation functions and their derivatives with `egui_plot`.
- Registered the new tool in the AI tab and marked the corresponding roadmap item as implemented.
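For context, a minimal sketch of what these four helpers could look like, using the common tanh approximation of GELU (the PR may use the exact erf-based form instead, so treat this as an illustration, not the actual crate code):

```rust
// Hypothetical sketch of the new calculus helpers; the real code in
// math_explorer/src/ai/deep_learning_theory/calculus.rs may differ.

pub fn tanh(x: f64) -> f64 {
    x.tanh()
}

// d/dx tanh(x) = 1 - tanh^2(x)
pub fn tanh_prime(x: f64) -> f64 {
    1.0 - x.tanh().powi(2)
}

// GELU via the tanh approximation:
// gelu(x) ~= 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
pub fn gelu(x: f64) -> f64 {
    let c = (2.0 / std::f64::consts::PI).sqrt();
    0.5 * x * (1.0 + (c * (x + 0.044715 * x.powi(3))).tanh())
}

// Derivative of the tanh approximation, via the product and chain rules.
pub fn gelu_prime(x: f64) -> f64 {
    let c = (2.0 / std::f64::consts::PI).sqrt();
    let inner = c * (x + 0.044715 * x.powi(3));
    let t = inner.tanh();
    0.5 * (1.0 + t) + 0.5 * x * (1.0 - t * t) * c * (1.0 + 3.0 * 0.044715 * x * x)
}
```

Keeping these as pure `f64 -> f64` functions is what lets the GUI crate plot them without any coupling to the math crate's internals.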
Reviewed changes
Copilot reviewed 6 out of 6 changed files in this pull request and generated 2 comments.
Show a summary per file
| File | Description |
|---|---|
| `todo_gui.md` | Marks "Activation Functions" (and the Deep Learning Theory section) as implemented. |
| `math_explorer_gui/src/tabs/number_theory/prime_spiral.rs` | Formatting-only change to the prime lookup loop. |
| `math_explorer_gui/src/tabs/neuroscience/neural_network_viz.rs` | Refactors weight initialization and synaptic current loops. |
| `math_explorer_gui/src/tabs/ai/mod.rs` | Registers the new Activation Functions tool in the AI tab tool list. |
| `math_explorer_gui/src/tabs/ai/activation_functions.rs` | New tool implementing the activation function + derivative plotter UI. |
| `math_explorer/src/ai/deep_learning_theory/calculus.rs` | Adds tanh/GELU activation functions and their derivatives to the math library. |
```rust
if let Some(row) = weights.get_mut(next) {
    row[i] = 2.0; // Excitatory connection
}
if let Some(row) = weights.get_mut(prev) {
    row[i] = -1.0; // Inhibitory connection
}
```
The `get_mut` checks here are redundant because `weights` is initialized as a `num_neurons x num_neurons` matrix and `next`/`prev` are always within `0..num_neurons`. Keeping the `Option` handling can silently skip setting connections if an invariant is broken, making issues harder to detect. Consider using direct indexing (or an explicit `debug_assert!` on dimensions) instead of `get_mut`.
Suggested change:

```diff
-if let Some(row) = weights.get_mut(next) {
-    row[i] = 2.0; // Excitatory connection
-}
-if let Some(row) = weights.get_mut(prev) {
-    row[i] = -1.0; // Inhibitory connection
-}
+weights[next][i] = 2.0; // Excitatory connection
+weights[prev][i] = -1.0; // Inhibitory connection
```
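The `debug_assert!` alternative from the review comment could look like this standalone sketch (the function name and signature are illustrative, not the PR's actual code):

```rust
// Hypothetical helper: direct indexing guarded by a debug_assert! on the
// matrix dimensions, so a broken invariant panics in debug builds instead
// of silently skipping the writes.
fn set_chain_connections(weights: &mut [Vec<f64>], i: usize, next: usize, prev: usize) {
    debug_assert!(
        next < weights.len() && prev < weights.len() && i < weights[next].len(),
        "weights must be a num_neurons x num_neurons matrix"
    );
    weights[next][i] = 2.0; // Excitatory connection
    weights[prev][i] = -1.0; // Inhibitory connection
}
```

In release builds the `debug_assert!` compiles away and out-of-bounds indexing still panics, so the invariant is never silently ignored.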
```rust
let step = (self.x_max - self.x_min) / (self.points as f64 - 1.0);
let mut x_vals = Vec::with_capacity(self.points);
for i in 0..self.points {
```
`step` is computed as `(x_max - x_min) / (points - 1)`. If `points` is ever set to 0 or 1, this will produce an invalid step (division by zero / infinities) and can yield NaNs in the plotted data. Consider clamping `points` to at least 2 (or handling the `points <= 1` case explicitly) before computing `step`.
Suggested change:

```diff
-let step = (self.x_max - self.x_min) / (self.points as f64 - 1.0);
-let mut x_vals = Vec::with_capacity(self.points);
-for i in 0..self.points {
+let effective_points = self.points.max(2);
+let step = (self.x_max - self.x_min) / (effective_points as f64 - 1.0);
+let mut x_vals = Vec::with_capacity(effective_points);
+for i in 0..effective_points {
```
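The clamping idea can be checked in isolation with a free function (names are illustrative, not the PR's actual code):

```rust
// Hypothetical sampling helper: clamping to at least 2 points guarantees a
// finite step, because (n - 1) can never be zero.
fn sample_x_values(x_min: f64, x_max: f64, points: usize) -> Vec<f64> {
    let n = points.max(2);
    let step = (x_max - x_min) / (n as f64 - 1.0);
    (0..n).map(|i| x_min + i as f64 * step).collect()
}
```

Without the `max(2)`, `points == 1` makes `step` infinite and `points == 0` makes it NaN, which then propagates into every plotted sample.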
This PR implements the interactive plotter for Activation Functions in the AI Deep Learning Theory tab as specified in `todo_gui.md`.

Architectural Analysis
The `AiTab` already utilizes a Strategy Pattern (the `AiTool` trait), which allows for clean integration of new tools without modifying `app.rs` with large `if/else` statements. This adheres strictly to the Single Responsibility Principle and the Open/Closed Principle.

Implementation Details:
- Math logic (`math_explorer`): Added `tanh`, `tanh_prime`, `gelu`, and `gelu_prime` functions to `src/ai/deep_learning_theory/calculus.rs`. This keeps the mathematical logic strictly isolated from the GUI.
- GUI tool (`math_explorer_gui`): Created a new module `tabs/ai/activation_functions.rs` containing the `ActivationFunctionsTool` struct. This struct manages its own state (selected function, plot range) and handles plotting using `egui_plot`.
- Integration: Registered the tool in `tabs/ai/mod.rs` by adding it to the `AiTab` tool list.
- Roadmap: Updated `todo_gui.md` to mark the "Activation Functions" task as complete, effectively finishing the entire "3.1 Deep Learning Theory" section.

Quality & Safety:
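A minimal sketch of the Strategy Pattern described above; the real `AiTool` trait in `math_explorer_gui` takes `egui` types, so the trait signature and struct contents here are simplified assumptions:

```rust
// Simplified sketch: the real ui method would take &mut egui::Ui.
trait AiTool {
    fn name(&self) -> &'static str;
    fn ui(&mut self);
}

struct ActivationFunctionsTool;

impl AiTool for ActivationFunctionsTool {
    fn name(&self) -> &'static str {
        "Activation Functions"
    }
    fn ui(&mut self) {
        // Plot the selected function and its derivative with egui_plot.
    }
}

// Adding a tool only means appending to this list; the AiTab dispatch code
// itself stays closed for modification (Open/Closed Principle).
fn ai_tools() -> Vec<Box<dyn AiTool>> {
    vec![Box::new(ActivationFunctionsTool)]
}
```

The tab then iterates over `ai_tools()` and calls `ui` on whichever strategy the user selected, with no per-tool branching in `app.rs`.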
- No `unwrap()` or `expect()` calls are used.
- State is encapsulated within `ActivationFunctionsTool`.
- Reuses the `Vector` struct from the AI module (`math_explorer::ai::deep_learning_theory::linear_algebra::Vector`).

PR created automatically by Jules for task 15589009739659043216 started by @fderuiter