diff --git a/README.md b/README.md
index 56e8138..36c7716 100644
--- a/README.md
+++ b/README.md
@@ -44,18 +44,35 @@ effective logging for troubleshooting.
* [WPILib](https://docs.wpilib.org/en/stable/index.html) -- FIRST basic libraries
* [AdvantageKit](
- https://docs.advantagekit.org/getting-start ed/what-is-advantagekit/)
+ https://docs.advantagekit.org/getting-started/what-is-advantagekit/)
-- Logging
* [CTRE Phoenix6](
- https://v6.docs.ctr-electronics.com/en/stable/docs/api-reference/mechanisms/swerve/swerve-overview.html)
- / [YAGSL](https://docs.yagsl.com/) -- Swerve drive library
-* [PathPlanner](https://pathplanner.dev/home.html) / [Choreo](
- https://sleipnirgroup.github.io/Choreo/) -- Autonomous path planning
+ https://v6.docs.ctr-electronics.com/en/stable/docs/api-reference/mechanisms/swerve/swerve-overview.html) -- Swerve drive library
+* [PathPlanner](https://pathplanner.dev/home.html) -- Autonomous path planning
* [PhotonVision](https://docs.photonvision.org/en/latest/) / [Limelight](
https://docs.limelightvision.io/docs/docs-limelight/getting-started/summary)
-- Robot vision / tracking
* [Autopilot](https://therekrab.github.io/autopilot/index.html) -- Drive-to-Pose semi-autonomous movements
+
+## FRC Kickoff Workshop Slides
+
+### 2026 - REBUILT
+
+Google Drive links for our 30-minute 2026 Kickoff Workshops:
+
+* [AZ RBSI and Advantage Kit
+](https://docs.google.com/presentation/d/1KOfODbdGbk8L_G25i7iYnaahoKr_Tzg54LJYN4yax_4/edit?usp=sharing)
+* [Know Where You Are: PhotonVision for Alignment and Odometry
+](https://docs.google.com/presentation/d/1JWYmwpZYA2zBuNIj9kKBUC_O-i0d1-SW_6qsVxgPdCA/edit?usp=sharing)
+
+
+### 2025 - REEFSCAPE
+
+Google Drive link for our 2-hour 2025 Kickoff Workshop introducing Az-RBSI:
+
+* [AZ Liftoff RBSI](https://docs.google.com/presentation/d/1c8A5RlPeEvKcj9yC66Ffvh5Os6jWyZiACoSRjDDETUs/edit?usp=sharing)
+
+
## Further Reading
For tips on command-based programming, see this post:
diff --git a/doc/INSTALL.md b/doc/INSTALL.md
index a1a4518..c3c2372 100644
--- a/doc/INSTALL.md
+++ b/doc/INSTALL.md
@@ -1,5 +1,30 @@
# Az-RBSI Installation Instructions
+
+### Pre-Install Requirements
+Before you even think about Az-RBSI, you need these _minimum_ versions of the
+following components on your laptop and devices.
+
+* WPILib `v2026.2.1`
+* RoboRIO image `FRC_roboRIO_2026_v1.2` (comes with the FRC Game Tools from
+ National Instruments)
+* Driver Station `Version 26.0` (comes with the FRC Game Tools from National
+ Instruments)
+* CTRE Tuner X `26.2.4.0`, with all devices running firmware `26.0` or newer.
+  This includes all motors, the CANivore, the Pigeon 2.0, and all CANcoders!
+* REV Hardware Client `2.0`, with the PDH, all SPARK MAXes, and other devices
+  running firmware `26.1` or newer.
+* Vivid Hosting Radio firmware `2.0` or newer is required for competition this
+ year.
+* PhotonVision ([Orange Pi or other device](https://docs.photonvision.org/en/latest/docs/quick-start/quick-install.html))
+  **running `26.1` or newer** (make sure you are **not** accidentally running
+  `25.3`). We HIGHLY recommend downloading the image and re-imaging the SD card
+  in your co-processor instead of trying to upgrade it.
+
+It is highly recommended to update all your devices and to label each one with
+the CAN ID or IP address and firmware version it is running. This helps your
+team and the FRC field staff quickly identify issues.
+
+--------
+
+### Getting Az-RBSI
The Az-RBSI is available as a [Template Repository](
https://docs.github.com/en/repositories/creating-and-managing-repositories/creating-a-repository-from-a-template#creating-a-repository-from-a-template)
for teams to use for creating their own 2026 FRC robot code. These instructions
@@ -29,11 +54,13 @@ If you want to keep caught up on dependencies, you will need to ENABLE the
Dependency Graph selection under the "Advanced Security" tab of the repository
Settings.
+* If you are struggling with this step, you may need the mentor or teacher who
+  owns your GitHub org to do it for you.
+
--------
-### Software Requirements
+### Software Requirements (Update Everything to 2026!)
The Az-RBSI requires the [2026 WPILib Installer](
https://github.com/wpilibsuite/allwpilib/releases) (VSCode and associated
@@ -45,10 +72,7 @@ https://v6.docs.ctr-electronics.com/en/stable/docs/tuner/index.html). Take a
moment to update all software and firmware to the latest versions before
attempting to load your new robot project.
-Please note that you need these _minimum_ versions of the following components:
-* WPILib ` v2026.2.1`
-* RoboRIO image `FRC_roboRIO_2026_v1.2`
--------
diff --git a/doc/RBSI-GSG.md b/doc/RBSI-GSG.md
index 11d93d4..da7774e 100644
--- a/doc/RBSI-GSG.md
+++ b/doc/RBSI-GSG.md
@@ -91,3 +91,44 @@ Additionally, both [PhotonVision](https://docs.photonvision.org/en/latest/) and
[Limelight](
https://docs.limelightvision.io/docs/docs-limelight/getting-started/summary)
computer vision systems are supported in the present release.
+
+--------
+
+### Included 3D Prints
+
+To help teams with standardized enclosures for their PhotonVision Orange Pis
+and CTRE CANivores, we include three 3D print files as part of the "Assets"
+section of [each release](https://github.com/AZ-First/Az-RBSI/releases).
+
+* The [CANivore cable holder](https://github.com/AZ-First/Az-RBSI/releases/download/v26.0.0-rc2/Canivore.Cable.holder.STL)
+ is designed to hold a [right-angle USB-C cable](https://www.amazon.com/dp/B092ZS6SJG)
+ onto the CANivore in a way that won't get knocked loose if your robot hits
+ something. The entire assembly can be attached to the RoboRIO using
+ [double-sided mounting tape](https://www.amazon.com/dp/B00FUEN2GK).
+
+* The [Orange Pi Double Case](https://github.com/AZ-First/Az-RBSI/releases/download/v26.0.0-rc2/Orange.Pi.Double.case.STL)
+ and [Lid](https://github.com/AZ-First/Az-RBSI/releases/download/v26.0.0-rc2/Orange.Pi.Double.case.lid.STL)
+  are designed to hold one or two [Orange Pi 5](https://www.amazon.com/dp/B0BN17PWWB)
+  boards (not the **B**, **Pro**, or **Max** variants) and connect up to 4 cameras. If
+  only using one Orange Pi, mount it in the "upper" position for airflow. Also requires:
+
+ * 2x [128 GB micro SD card](https://www.amazon.com/dp/B0B7NTY2S6)
+ * 4x [M2.5x6mm+6mm Male-Female Hex Standoff](https://www.amazon.com/gp/product/B08F2F96HM) (under the bottom Pi)
+ * 4x [M2.5x25mm+6mm Male-Female Hex Standoff](https://www.amazon.com/gp/product/B08F2DBNZW) (between the two Pi's)
+ * 4x [M2.5x20mm Female-Female Hex Standoff](https://www.amazon.com/gp/product/B08F2HZN4R) (atop the upper Pi)
+ * 8x [M2.5x8mm Machine Screws](https://www.amazon.com/gp/product/B07MLB1627) (through case and into standoffs)
+ * 2x [Cooling Fan 40mm 5V DC + Grill](https://www.amazon.com/gp/product/B08R1CXGCJ) (attaches to side of case)
+  * 1x [Redux Robotics Zinc-V Regulator](https://shop.reduxrobotics.com/products/zinc-v) OR [Pololu 5V, 5.5A Step-Down Voltage Regulator](https://www.pololu.com/product/4091) OR [Pololu 5V, 3A Step-Up/Step-Down Voltage Regulator](https://www.pololu.com/product/4082) (power regulation for the Pis)
+  * 2x [90-Degree USB-C to USB-C Cable, 10 inch](https://www.amazon.com/dp/B0CG1PZMVG) for Zinc-V or [90-Degree USB-C to 2 Pin Bare Wire, 10 inch](https://www.amazon.com/gp/product/B0CY2J5H3K) for Pololu (for powering the Pis)
+ * 3x [M2x8mm Machine Screws](https://www.amazon.com/gp/product/B07M6RTWCC) (for attaching the Pololu to the lid)
+
+ **NOTE: Powering the case with a Pololu requires soldering the USB-C cables
+ and the 18/2 AWG wires to the Pololu.** This requires patience. Using the
+ Zinc-V requires no soldering, but introduces an extra USB-C connection.
+
+ See the [PhotonVision Wiring documentation
+ ](https://docs.photonvision.org/en/latest/docs/quick-start/wiring.html) for
+ more details.
+
+ Mounting the case to the robot requires 4x #10-32 nylock nuts (placed in the
+ hex-shaped mounts inside the case) and 4x #10-32 bolts.
diff --git a/src/main/java/frc/robot/Constants.java b/src/main/java/frc/robot/Constants.java
index 1ebb197..9e33d5d 100644
--- a/src/main/java/frc/robot/Constants.java
+++ b/src/main/java/frc/robot/Constants.java
@@ -432,14 +432,19 @@ public static class Cameras {
// Robot to camera transforms
// (ONLY USED FOR PHOTONVISION -- Limelight: configure in web UI instead)
- // Example Camera are mounted on the frame perimeter, 18" up from the floor, centered
- // side-to-side
+    // Example cameras are mounted in the back corners, 12" up from the floor, facing sideways
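+    // (Reminder: WPILib's robot-frame convention for Transform3d is +X toward the front of
+    //  the robot, +Y toward the left, +Z up, with yaw measured counterclockwise about +Z --
+    //  so a yaw of +PI/2 aims a camera out the left side of the robot.)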
public static Transform3d robotToCamera0 =
new Transform3d(
- Inches.of(13.0), Inches.of(0), Inches.of(18.0), new Rotation3d(0.0, 0.0, 0.0));
+ Inches.of(-13.0),
+ Inches.of(13.0),
+ Inches.of(12.0),
+ new Rotation3d(0.0, 0.0, Math.PI / 2));
public static Transform3d robotToCamera1 =
new Transform3d(
- Inches.of(-13.0), Inches.of(0), Inches.of(18.0), new Rotation3d(0.0, 0.0, Math.PI));
+ Inches.of(-13.0),
+ Inches.of(-13.0),
+ Inches.of(12.0),
+ new Rotation3d(0.0, 0.0, -Math.PI / 2));
// Standard deviation multipliers for each camera
// (Adjust to trust some cameras more than others)
@@ -456,7 +461,7 @@ public static class SimCameras {
public static final SimCameraProperties kSimCamera1Props =
new SimCameraProperties() {
{
- setCalibration(1280, 800, Rotation2d.fromDegrees(90));
+ setCalibration(1280, 800, Rotation2d.fromDegrees(120));
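+          // The Rotation2d passed to setCalibration() is the camera's diagonal field of view;
+          // 120 degrees here models a wide-angle lens (adjust to match your actual camera).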
setCalibError(0.25, 0.08);
setFPS(30);
setAvgLatencyMs(20);
@@ -467,7 +472,7 @@ public static class SimCameras {
public static final SimCameraProperties kSimCamera2Props =
new SimCameraProperties() {
{
- setCalibration(1280, 800, Rotation2d.fromDegrees(90));
+ setCalibration(1280, 800, Rotation2d.fromDegrees(120));
setCalibError(0.25, 0.08);
setFPS(30);
setAvgLatencyMs(20);
diff --git a/src/main/java/frc/robot/RobotContainer.java b/src/main/java/frc/robot/RobotContainer.java
index c8099d9..ce35545 100644
--- a/src/main/java/frc/robot/RobotContainer.java
+++ b/src/main/java/frc/robot/RobotContainer.java
@@ -62,7 +62,8 @@
import frc.robot.util.GetJoystickValue;
import frc.robot.util.LoggedTunableNumber;
import frc.robot.util.OverrideSwitches;
-import frc.robot.util.RBSIEnum;
+import frc.robot.util.RBSIEnum.AutoType;
+import frc.robot.util.RBSIEnum.Mode;
import frc.robot.util.RBSIPowerMonitor;
import java.util.Set;
import org.littletonrobotics.junction.networktables.LoggedDashboardChooser;
@@ -379,24 +380,27 @@ private void configureBindings() {
m_drivebase::stop,
m_drivebase));
- // Double-press the A button on Joystick3 to run the CameraSweepEvaluator
- // Use WPILib's built-in double-press binding
- joystick3
- .button(1)
- .multiPress(2, 0.2)
- .onTrue(
- Commands.runOnce(
- () -> {
- try {
- sweep.runFullSweep(
- Filesystem.getOperatingDirectory()
- .toPath()
- .resolve("camera_sweep.csv")
- .toString());
- } catch (Exception e) {
- e.printStackTrace();
- }
- }));
+ if (Constants.getMode() == Mode.SIM) {
+ // IN SIMULATION ONLY:
+ // Double-press the A button on Joystick3 to run the CameraSweepEvaluator
+ // Use WPILib's built-in double-press binding
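+      // (In simulation, Filesystem.getOperatingDirectory() is the directory the sim was
+      //  launched from, so camera_sweep.csv will land in the project folder.)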
+ joystick3
+ .button(1)
+ .multiPress(2, 0.2)
+ .onTrue(
+ Commands.runOnce(
+ () -> {
+ try {
+ sweep.runFullSweep(
+ Filesystem.getOperatingDirectory()
+ .toPath()
+ .resolve("camera_sweep.csv")
+ .toString());
+ } catch (Exception e) {
+ e.printStackTrace();
+ }
+ }));
+ }
}
/**
@@ -467,7 +471,7 @@ public Drive getDrivebase() {
*
NOTE: These are currently only accessible with Constants.AutoType.PATHPLANNER
*/
private void definesysIdRoutines() {
- if (Constants.getAutoType() == RBSIEnum.AutoType.PATHPLANNER) {
+ if (Constants.getAutoType() == AutoType.PATHPLANNER) {
// Drivebase characterization
autoChooserPathPlanner.addOption(
"Drive Wheel Radius Characterization",
diff --git a/src/main/java/frc/robot/subsystems/vision/CameraPlacementEvaluator.java b/src/main/java/frc/robot/subsystems/vision/CameraPlacementEvaluator.java
deleted file mode 100644
index aa907ee..0000000
--- a/src/main/java/frc/robot/subsystems/vision/CameraPlacementEvaluator.java
+++ /dev/null
@@ -1,90 +0,0 @@
-// Copyright (c) 2026 Az-FIRST
-// http://github.com/AZ-First
-//
-// This program is free software; you can redistribute it and/or
-// modify it under the terms of the GNU General Public License
-// version 3 as published by the Free Software Foundation or
-// available in the root directory of this project.
-//
-// This program is distributed in the hope that it will be useful,
-// but WITHOUT ANY WARRANTY; without even the implied warranty of
-// MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
-// GNU General Public License for more details.
-//
-// Copyright (c) FIRST and other WPILib contributors.
-// Open Source Software; you can modify and/or share it under the terms of
-// the WPILib BSD license file in the root directory of this project.
-
-package frc.robot.subsystems.vision;
-
-import edu.wpi.first.math.geometry.*;
-import edu.wpi.first.wpilibj.Timer;
-import frc.robot.Constants;
-import frc.robot.subsystems.drive.Drive;
-import java.util.List;
-import org.photonvision.simulation.PhotonCameraSim;
-import org.photonvision.simulation.SimCameraProperties;
-import org.photonvision.simulation.VisionSystemSim;
-import org.photonvision.simulation.VisionTargetSim;
-import org.photonvision.targeting.PhotonPipelineResult;
-
-public class CameraPlacementEvaluator {
-
- private final Drive driveSim;
- private final VisionSystemSim visionSim;
-
- public CameraPlacementEvaluator(Drive driveSim, VisionSystemSim visionSim) {
- this.driveSim = driveSim;
- this.visionSim = visionSim;
- }
-
- public double evaluateCameraPose(
- PhotonCameraSim simCam, Transform3d robotToCamera, SimCameraProperties props) {
-
- // Add or adjust camera transform in vision sim
- visionSim.adjustCamera(simCam, robotToCamera);
-
- // Test field poses
- Pose2d[] testPoses =
- new Pose2d[] {
- new Pose2d(1.0, 1.0, new Rotation2d(0)),
- new Pose2d(3.0, 1.0, new Rotation2d(Math.PI / 2)),
- new Pose2d(5.0, 3.0, new Rotation2d(Math.PI)),
- new Pose2d(2.0, 4.0, new Rotation2d(-Math.PI / 2)),
- };
-
- double totalScore = 0.0;
- double latencyMillis = 0.0;
-
- for (Pose2d pose : testPoses) {
-
- driveSim.resetPose(pose);
-
- for (int i = 0; i < 5; i++) {
- driveSim.simulationPeriodic();
- visionSim.update(driveSim.getPose());
- Timer.delay(Constants.loopPeriodSecs);
- }
-
- var maybeCamPose3d = visionSim.getCameraPose(simCam);
- if (maybeCamPose3d.isEmpty()) {
- continue;
- }
- Pose3d camFieldPose = maybeCamPose3d.get();
-
+  /**
+   * Composes an extra yaw/pitch rotation on top of a base camera rotation.
+   *
+   * <p>Conventions: yawDeg = rotation about +Z (up); pitchDeg = rotation about +Y.
+   *
+   * <p>Order: extra = yaw ⊗ pitch (yaw first, then pitch, both in the camera/base frame);
+   * combined = base ⊗ extra.
+   */
+ private static Rotation3d composeCameraExtra(Rotation3d base, double yawDeg, double pitchDeg) {
+ double yaw = Units.degreesToRadians(yawDeg);
+ double pitch = Units.degreesToRadians(pitchDeg);
+
+ Quaternion qBase = base.getQuaternion();
+
+ Quaternion qYaw = quatFromAxisAngle(0.0, 0.0, 1.0, yaw);
+ Quaternion qPitch = quatFromAxisAngle(0.0, 1.0, 0.0, pitch);
+
+ Quaternion qExtra = quatMul(qYaw, qPitch);
+ Quaternion qCombined = quatMul(qBase, qExtra);
+
+ return new Rotation3d(qCombined);
+ }
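+  // A hypothetical use of composeCameraExtra(): build a candidate camera transform by
+  // stacking a swept yaw/pitch on top of the fixed mount rotation from Constants, e.g.
+  //   Rotation3d candidateRot = composeCameraExtra(robotToCamera0.getRotation(), yawDeg, pitchDeg);
+  //   Transform3d candidate = new Transform3d(robotToCamera0.getTranslation(), candidateRot);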
+
/**
* Run a full sweep of candidate camera placements.
*
@@ -52,66 +114,120 @@ public CameraSweepEvaluator(
*/
public void runFullSweep(String outputCsvPath) throws IOException {
// Example field bounds (meters) -- tune these for your field size
- double[] fieldX = {0.0, 2.0, 4.0, 6.0, 8.0}; // 0–8m along field length
- double[] fieldY = {0.0, 1.0, 2.0, 3.0, 4.0}; // 0–4m along field width
+ double[] fieldX = {
+ 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0, 5.5, 6.0, 6.5, 7.0, 7.5, 8.0
+ };
+ double[] fieldY = {0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0, 5.5, 6.0, 6.5, 7.0, 7.5};
double[] robotZ = {0.0}; // Usually floor height
- double[] camYaw = {-30, 0, 30}; // Camera yaw
- double[] camPitch = {-10, 0, 10}; // Camera pitch
+
+ // NEW: Sweep robot yaw at each location (degrees)
+ double[] robotYawDeg = {-180, -135, -90, -45, 0, 45, 90, 135};
+
+ // Camera “extra” rotation sweep (applied on top of the mount rotation)
+ double[] camYawDeg = {-15, 0, 15};
+ double[] camPitchDeg = {-10, 0, 10};
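+    // With these arrays the sweep visits 16 x 15 x 1 x 8 = 1,920 robot poses; assuming both
+    // cameras sweep yaw and pitch independently, that is 3^4 = 81 rotation combinations per
+    // pose, or roughly 155,000 evaluations total -- expect a long run.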
+
+ // Pull fixed mount transforms from Constants once (and log/write them)
+ final Transform3d rToC0 = robotToCamera0;
+ final Transform3d rToC1 = robotToCamera1;
+
+ final Translation3d c0T = rToC0.getTranslation();
+ final Translation3d c1T = rToC1.getTranslation();
+ final Quaternion c0Q = rToC0.getRotation().getQuaternion();
+ final Quaternion c1Q = rToC1.getRotation().getQuaternion();
try (FileWriter writer = new FileWriter(outputCsvPath)) {
- writer.write("robotX,robotY,robotZ," + "cam1Yaw,cam1Pitch,cam2Yaw,cam2Pitch,score\n");
+ // Header includes robot yaw and BOTH robot->camera transforms (translation + quaternion)
+ writer.write(
+ "robotX,robotY,robotZ,robotYawDeg,"
+ + "cam1YawDeg,cam1PitchDeg,cam2YawDeg,cam2PitchDeg,score,"
+ + "rToC0_tx,rToC0_ty,rToC0_tz,rToC0_qw,rToC0_qx,rToC0_qy,rToC0_qz,"
+ + "rToC1_tx,rToC1_ty,rToC1_tz,rToC1_qw,rToC1_qx,rToC1_qy,rToC1_qz\n");
// Sweep robot over the field
for (double rx : fieldX) {
for (double ry : fieldY) {
for (double rz : robotZ) {
- Pose3d robotPose =
- new Pose3d(
- new Translation3d(rx, ry, rz),
- new Rotation3d(0, 0, 0) // robot heading = 0 for simplicity
- );
-
- // Sweep camera rotations
- for (double c1Yaw : camYaw) {
- for (double c1Pitch : camPitch) {
- Pose3d cam1Pose =
- new Pose3d(
- robotPose.getTranslation(),
- new Rotation3d(Math.toRadians(c1Pitch), 0, Math.toRadians(c1Yaw)));
-
- for (double c2Yaw : camYaw) {
- for (double c2Pitch : camPitch) {
- Pose3d cam2Pose =
- new Pose3d(
- robotPose.getTranslation(),
- new Rotation3d(Math.toRadians(c2Pitch), 0, Math.toRadians(c2Yaw)));
-
- // Get all vision targets
- Set