Add grad clipping, loss history, config accessors, and HYBRID mode to TensorProgram #4

Draft
Copilot wants to merge 2 commits into main from copilot/proceed-next-steps

Conversation


Copilot AI commented Mar 7, 2026

This PR fills in four missing or incomplete pieces of the TensorProgram learning pipeline, all natural follow-ons to the gradient computation work in the previous PR.

Changes

TensorEquation.h / .cc

  • Config accessors — _max_iterations and _convergence_threshold were private with no public API; adds set_max_iterations(), max_iterations(), set_convergence_threshold(), and convergence_threshold()
  • Gradient clipping — adds set_grad_clip(double) / grad_clip(); when non-zero, each gradient vector is L2-norm-clamped in update_parameters() before the SGD step; extracted into a clip_gradient_vector() static helper to avoid duplication
  • Loss history — train() now clears and appends to _loss_history each epoch (loss recorded pre-update, the standard ML convention); exposed via loss_history() -> const std::vector<double>&
  • HYBRID mode — execute() previously treated HYBRID identically to CONTINUOUS; it now applies a sigmoid (when no explicit nonlinearity is set) and then a threshold, yielding binary outputs with gradient flow via the existing straight-through estimator in backward()
TensorProgram prog("learn");
prog.set_learning_rate(0.01);
prog.set_grad_clip(1.0);          // prevent exploding gradients
prog.set_max_iterations(50);      // limit fixpoint iterations
prog.set_convergence_threshold(1e-4);

prog.train({}, targets, 100);

// Inspect per-epoch loss
for (double loss : prog.loss_history())
    std::cout << loss << "\n";

Tests

10 new tests:

  • max_iterations / convergence_threshold accessors and their effect on forward_to_fixpoint()
  • grad_clip round-trip, and that clipping actually bounds weight deltas to ≤ lr × clip
  • loss_history length equals the epoch count, values are non-negative, and loss decreases over training
  • HYBRID mode produces binary {0,1} output, routes through sigmoid → threshold when the nonlinearity is NONE, and skips the extra sigmoid when an explicit nonlinearity is set



Copilot AI self-assigned this Mar 7, 2026
Copilot AI changed the title [WIP] Proceed with next steps Add grad clipping, loss history, config accessors, and HYBRID mode to TensorProgram Mar 7, 2026
