
feat: autoresearch harness for SNAG optimization #26

Merged
seanfromthepast merged 1 commit into main from autoresearch/pro/harness
Mar 16, 2026

Conversation

@realityinspector
Collaborator

Summary

Adds an autoresearch/ directory with a Karpathy-style optimization loop for Pro simulation parameters.

  • pro_autoresearch.py — main loop: mutate config → run → evaluate → keep/discard
  • config_space.py — 27 mutable dimensions across 6 mechanism clusters
  • metrics.py — Causal Resolution metric, dry-run synthetic metrics
  • pareto.py — Pareto frontier analysis
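The mutate → run → evaluate → keep/discard loop in pro_autoresearch.py can be sketched as a greedy hill-climb. This is an illustrative sketch only, assuming hypothetical `mutate`/`optimize` names and a dict-based config; the real module's interfaces may differ:

```python
import random

def mutate(config, dims, rng):
    # Perturb one randomly chosen dimension (hypothetical mutation rule:
    # scale the value by a factor in [0.8, 1.2]).
    new = dict(config)
    key = rng.choice(dims)
    new[key] = new[key] * rng.uniform(0.8, 1.2)
    return new

def optimize(evaluate, base_config, dims, iterations=10, seed=0):
    """Greedy keep/discard loop: keep a mutated config only if it scores
    at least as well as the current best; otherwise discard it."""
    rng = random.Random(seed)
    best, best_score = base_config, evaluate(base_config)
    for _ in range(iterations):
        candidate = mutate(best, dims, rng)
        score = evaluate(candidate)
        if score > best_score:   # keep
            best, best_score = candidate, score
        # else: discard the mutation and try again from `best`
    return best, best_score
```

With a seeded `random.Random`, a given `--iterations` run is reproducible, which is what makes the <1 s dry-run check in the test plan deterministic.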

Purely additive — no existing files modified. Safe to merge.

Merge intent

This is the base branch. Once merged, the 8 cluster branches can open PRs to add their result files.

DO NOT MERGE: feat/pro/pytorch-backend — modifies tensors.py, needs venv testing first.

Test plan

  • python3 -m autoresearch.pro_autoresearch --dry-run --iterations 10 completes in under 1 s
  • Verify no import conflicts with Pro's existing modules

Karpathy-style autoresearch loop that mutates Hydra config parameters
across 27 dimensions (6 mechanism clusters), runs simulations, extracts
quality metrics (Causal Resolution, coherence, plausibility), and
identifies Pareto-optimal configs via quality vs cost tradeoffs.
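A config is Pareto-optimal here when no other config is both higher-quality and cheaper. A minimal O(n²) sketch of that dominance check (illustrative only; pareto.py's actual algorithm and data shapes may differ):

```python
def pareto_frontier(points):
    """Return the (quality, cost) points not dominated by any other point.

    A point dominates another if its quality is >= and its cost is <=,
    with at least one strict inequality.
    """
    frontier = []
    for i, (q1, c1) in enumerate(points):
        dominated = any(
            (q2 >= q1 and c2 <= c1) and (q2 > q1 or c2 < c1)
            for j, (q2, c2) in enumerate(points)
            if j != i
        )
        if not dominated:
            frontier.append((q1, c1))
    return frontier
```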

Supports --dry-run mode with deterministic synthetic metrics for
testing the mutation/selection loop without API calls.
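One way to get deterministic synthetic metrics without API calls is to derive scores from a hash of the config, so the same config always yields the same numbers. This is a hypothetical sketch of the idea; metrics.py's actual dry-run scheme may differ:

```python
import hashlib
import json

def synthetic_metrics(config):
    """Deterministic stand-in metrics derived from a SHA-256 hash of the
    config, so dry runs are reproducible and require no API calls."""
    payload = json.dumps(config, sort_keys=True).encode()
    digest = hashlib.sha256(payload).digest()
    # Map the first three digest bytes onto [0, 1) scores.
    return {
        "causal_resolution": digest[0] / 256,
        "coherence": digest[1] / 256,
        "plausibility": digest[2] / 256,
    }
```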
seanfromthepast merged commit a927d6e into main on Mar 16, 2026
1 of 5 checks passed


2 participants