
Ch14: Add --explain mode to tutoring A/B analyzer #69

@nicholaskarlson

Description



Details

Summary

Add an --explain mode to the Chapter 14 tutoring A/B case study analyzer
(scripts/ch14_tutoring_ab.py).

When --explain is set, the script should print human-readable explanations
alongside the numerical results (effect size, confidence interval, p-value).

Goals

  • Add an --explain flag to the CLI (using the shared base_parser).
  • When --explain is true:
    • Print brief, plain-language commentary on:
      • the estimated mean difference,
      • the 95% confidence interval,
      • the p-value and what it implies.
  • Keep the default non-explain mode output unchanged (so existing workflows don’t break).
  • Add at least one small test or example to show the new flag in action.
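A rough sketch of the shape this could take (names like add_explain_flag and format_report are hypothetical; the real script should reuse the shared base_parser helper, and the three default output lines below are placeholders for whatever the script prints today):

```python
import argparse


def add_explain_flag(parser: argparse.ArgumentParser) -> argparse.ArgumentParser:
    """Attach --explain; in the real script this would extend base_parser."""
    parser.add_argument(
        "--explain",
        action="store_true",
        help="also print plain-language commentary on the results",
    )
    return parser


def format_report(mean_diff, ci_low, ci_high, p_value, explain=False):
    """Return the output lines; the default (non-explain) lines are unchanged."""
    lines = [
        f"mean difference: {mean_diff:.3f}",
        f"95% CI: [{ci_low:.3f}, {ci_high:.3f}]",
        f"p-value: {p_value:.4f}",
    ]
    if explain:
        # Plain-language commentary, only added when --explain is set.
        lines.append(
            f"On average the tutored group scored {mean_diff:.2f} points "
            "higher than the control group."
        )
        lines.append(
            "We are 95% confident the true difference lies between "
            f"{ci_low:.2f} and {ci_high:.2f}."
        )
        verdict = (
            "small enough that chance alone is an unlikely explanation"
            if p_value < 0.05
            else "large enough that chance alone remains a plausible explanation"
        )
        lines.append(f"The p-value of {p_value:.4f} is {verdict}.")
    return lines
```

Because the explain text is only appended, running the script without the flag produces byte-identical output to today, which keeps existing workflows intact.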

Hints

  • Look at how scripts/ch14_tutoring_ab.py currently uses the CLI helper.
  • Mirror the style of printouts used in Chapter 13/14 (short, clear, instructor-friendly).
  • It’s fine to start with very minimal natural-language text; we can refine wording in review.
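For the "small test" goal, something as minimal as checking that the flag parses would be enough to start with (build_parser here is a stand-in for however ch14_tutoring_ab.py actually constructs its parser):

```python
import argparse


def build_parser():
    # Stand-in for the script's real parser construction via base_parser.
    parser = argparse.ArgumentParser()
    parser.add_argument("--explain", action="store_true")
    return parser


def test_explain_defaults_to_false():
    # No flag given: explain mode must stay off so default output is unchanged.
    assert build_parser().parse_args([]).explain is False


def test_explain_flag_is_parsed():
    # Passing --explain turns the mode on.
    assert build_parser().parse_args(["--explain"]).explain is True
```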

Difficulty

Beginner-friendly: good first issue for someone comfortable with basic Python
and the command line.

Files to Touch

No response

Contributor Checklist

  • I have read CONTRIBUTING.md.
  • I can run make lint locally.
  • I can run make test locally.
  • I have checked for existing issues/PRs that might overlap.
