diff --git a/Assets/akit.png b/Assets/akit.png
new file mode 100644
index 0000000..9b177cb
Binary files /dev/null and b/Assets/akit.png differ
diff --git a/Assets/bb.jpg b/Assets/bb.jpg
new file mode 100644
index 0000000..a96cb6a
Binary files /dev/null and b/Assets/bb.jpg differ
diff --git a/Assets/servo.webp b/Assets/servo.webp
new file mode 100644
index 0000000..f71de31
Binary files /dev/null and b/Assets/servo.webp differ
diff --git a/Assets/x44.webp b/Assets/x44.webp
new file mode 100644
index 0000000..39ce5d1
Binary files /dev/null and b/Assets/x44.webp differ
diff --git a/Docs/2_Architecture/2.2_ElectronicsCrashCourse.md b/Docs/2_Architecture/2.2_ElectronicsCrashCourse.md
index 9a6e575..f8bd8af 100644
--- a/Docs/2_Architecture/2.2_ElectronicsCrashCourse.md
+++ b/Docs/2_Architecture/2.2_ElectronicsCrashCourse.md
@@ -4,14 +4,14 @@
### Electrical system diagram
-
+
This diagram shows many of the electrical components found on a typical FRC robot.
You don't need to memorize this layout, but it's handy to have as a reference.
### Battery
-
+
All of the power our robot uses comes from a single 12-volt car battery.
These can be found on our battery cart in the corner of the lab.
@@ -38,7 +38,7 @@ $V = IR$ is the equation which governs voltage sag, where $V$ is amount the volt
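The voltage sag equation can be sketched numerically (the current and internal resistance values below are made up for illustration):

```java
public class VoltageSag {
    // V = I * R: the voltage drop across the battery's internal resistance.
    static double sagVolts(double currentAmps, double internalResistanceOhms) {
        return currentAmps * internalResistanceOhms;
    }

    public static void main(String[] args) {
        // Hypothetical numbers: a 100 A draw through 0.02 ohms of internal
        // resistance sags the battery by 2 V, so a 12 V battery reads ~10 V.
        double sag = sagVolts(100.0, 0.02);
        System.out.println(12.0 - sag);
    }
}
```

This is why a heavily loaded (or tired) battery reads well below 12 V while the robot is driving.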
### Main Breaker
-
+
The main breaker is the power switch of the robot.
The red button will turn the robot off, and a small black lever on the side of the breaker will turn it on.
@@ -53,7 +53,7 @@ We tend to have a 3d-printed guard over the off switch to prevent accidental pre
### Power Distribution Hub (PDH)
-
+
The PDH takes the power from the battery and distributes it to the motors, sensors, and other components on the robot (almost like it's a hub for distributing power!).
We don't have to do anything with the PDH in code, but if a motor or sensor does not have power, it may have a bad connection to the PDH.
@@ -69,7 +69,7 @@ The PDH is also often at one end of our CAN network. What's CAN? Glad you asked
### The CAN Network
-
+
CAN is a wired communication protocol that allows the sensors and motors to communicate across our robot.
In FRC, CAN is transmitted over yellow and green pairs of cables.
@@ -86,7 +86,7 @@ This app also has features to test and configure devices on the CAN network.
### CANivore
-
+
The CANivore is a device that connects to the [Rio](#roborio-2) (in the next section) over USB.
It allows us to have a second CAN network with increased bandwidth.
@@ -97,7 +97,7 @@ In 2024 we exclusively used the CANivore network for motors and sensors.
### RoboRIO 2
-
+
The RoboRIO (rio) is the brain of the robot.
It is built around a computer running Linux with a large number of Input/Output (IO) ports.
@@ -114,7 +114,7 @@ These ports include:
- The large set of pins in the middle of the Rio is the MXP port.
MXP (and the SPI port in the top-right corner) is used for communication over serial interfaces such as the I²C protocol.
Unfortunately, there is an issue with I²C that can cause the code to lock up when it is used, so we avoid using the serial ports.
- We can get around this issue by using a coprocessor (computer other than the rio, like a raspberry pi) to convert the signal to a different protocal.
+ We can get around this issue by using a coprocessor (computer other than the rio, like a raspberry pi) to convert the signal to a different protocol.
Generally we avoid using I²C devices.
- A CAN network originates at the RIO.
- Several USB ports are available on the Rio.
@@ -126,13 +126,19 @@ The Rio also has an SD card as its memory.
When we set up a Rio for the season we need to image it using a tool that comes with the driver station.
The WPILib docs have [instructions](https://docs.wpilib.org/en/stable/docs/zero-to-robot/step-3/roborio2-imaging.html) on how to image the SD card.
+### Note on 2026-7+ Control System
+As per [this](https://community.firstinspires.org/introducing-the-future-mobile-robot-controller) blog post, FRC will use a different robot control system called SystemCore beginning with the 2027 season.
+This moves away from the RoboRIO and NI and instead will use a controller based on the Raspberry Pi CM5.
+As we have not been selected for beta testing at this time, we don't have a lot of information about this right now, but it's good to keep this on your radar.
+**(Leads should remember to update this part of the article in the fall of 2026!)**
+
### Vivid Hosting Radio
-
+
-The radio is essentially a Wi-Fi router that connects the robot to the driver station.
+The VH-109 radio is essentially a Wi-Fi router that lives on the robot and connects it to the driver station.
At tournaments we have to take the radio to get reprogrammed for that competition.
-This makes it able to connect to the field so that our robot gets enabled and disabled at the right times during matches, however it prevents us from connecting to the robot with a laptop wirelessly$`^1`$.
+This makes it able to connect to the access point (another radio on the field) so that our robot gets enabled and disabled at the right times during matches; however, it prevents us from connecting to the robot with a laptop wirelessly.
The radio has four ethernet ports and a pair of terminals for power wires.
One ethernet port connects to the rio and is labeled RIO.
One is usually reserved for tethering to a laptop and is labeled DS.
@@ -143,30 +149,22 @@ Network switches and other ethernet devices are plugged into the AUX1 and AUX2 p
After each competition we have to reimage the radio to allow it to connect to a laptop wirelessly again.
Refer to the [vivid hosting radio documentation](https://frc-radio.vivid-hosting.net/) for more information.
-The radio can also be connected to via a second radio acting as an access point.
-At time of writing we have not tried this, and best practices are still being figured out.
+We can also connect to the robot's VH-109 by joining a second network, broadcast by a second radio that is wired to the robot.
+That second radio can be a VH-113 access point (a larger radio designed for the field rather than for a robot) or a VH-109 configured to act like a VH-113.
+This is the setup we'll usually use at the Loom.
The radio can either be powered using Power-Over-Ethernet (PoE) or the power terminals.
-This radio model is new at time of writing$`^2`$, and best practices are still being figured out.
Be careful with checking for good, consistent radio power if you are having connection issues.
-Footnote $1$
-
-At Chezy Champs 2023 we tested a beta version of this radio, and were able to connect to it wirelessly in the pit on a second network.
-This did have stability issues.
-It is unknown if and when this capability will be re-enabled.
-
-Footnote $2$
-
An older radio known as the OM5P was in use until champs 2024, and you may encounter some on old/offseason robots.
It was much worse to deal with (more fragile and finicky) and we are lucky to be done with it.
It is pictured below.
-
+
### Motor Controllers
-
+
Motors are the primary form of movement on the robot, but on their own they are just expensive paperweights.
Motor controllers take signals from our code, often over CAN, and turn them into the voltage sent to the motor.
@@ -177,30 +175,37 @@ For instance they might be able to run PID loops much faster than our code is ab
Knowing what motor controller you are using and what features it has is very important when writing robot code.
Pictured above is the Spark Max built by REV Robotics, a common modern motor controller often used with the NEO and NEO 550 motors.
REV also produces a motor called the NEO Vortex which uses the Spark Flex, pictured below.
-
+
+
However, we avoid using REV motors due to poor software and historical mechanical issues.
Instead we use . . .
-### The Talon FX + Kraken X60
+### The Talon FX + Kraken X60/44
+
+
-
+The Kraken X60 ("Kraken") motor is the primary motor we use on the robot.
+Unlike many other motors, Krakens come with a built-in motor controller called the Talon FX.
+The Kraken also has a built-in encoder, a sensor that tracks the rotation and speed of the motor.
+This is a relative encoder, so we tend to pair them with CTRE CANcoders (which are absolute).
+More on encoders [below](#encoders).
+Documentation for the software to control Krakens can be found [here](https://pro.docs.ctr-electronics.com/en/stable/).
+There is also the Kraken X44, which is mostly the same as the X60 but a bit smaller and lighter.
+It looks the same to our code, since it also has a Talon FX controller.
-The Kraken X60 ("kraken") motor is the primary motor we use on the robot.
-Unlike many other motors, krakens come with a built in motor controller called the Talon FX.
-The kraken also has a built in encoder, or sensor that tracks the rotation and speed of the motor.
-Documentation for the software to control falcons can be found [here](https://pro.docs.ctr-electronics.com/en/stable/).
+
-We also use the Falcon 500 ("falcon") motor in some places on the robot.
-Slightly less powerful, slightly older, and likely out of stock for the forseeable future, falcons are slowly being phased out of our motor stock.
-Because falcons also have an integrated TalonFX, they behave exactly the same in code as krakens.
-A falcon is pictured below.
+We also use the Falcon 500 ("Falcon") motor in some places on the robot.
+Slightly less powerful, slightly older, and likely out of stock for the foreseeable future, Falcons are slowly being phased out of our motor stock.
+Because Falcons also have an integrated TalonFX, they behave exactly the same in code as Krakens.
+A Falcon is pictured below.
-
+
### Solenoids and Pneumatics
-
+
Pneumatics are useful for simple, linear, and repeatable motions where motors might not be ideal.
In the past we have used them primarily for extending intakes.
@@ -210,9 +215,17 @@ We usually use double solenoids, which have both powered extension and retractio
Single solenoids only supply air for extension, and rely on the piston having a spring or something similar to retract them.
Our mechanical team has moved somewhat away from pneumatics over weight and complexity concerns, but they may still appear on future robots.
+### Servos
+
+
+
+Servos are similar to motors in that they can produce rotational motion, but are designed for very precise rotation.
+Servos are somewhat more common in FTC than FRC, but are good for "single use" mechanisms or something that needs to go to a specific position.
+In 2025, we used 2 servos to open the funnel hatch panel in order to climb.
+
### Robot Status Light
-
+
The Robot Status Light (RSL) is an orange light we are required to have on our robot by competition rules.
When the robot is powered on it will glow solid orange.
@@ -226,24 +239,26 @@ We have additional LEDs on the robot that indicate the state of the robot, but t
### Encoders
-
-
+
+
Encoders are devices that measure rotation.
We use them to measure the angles of our swerve modules, the speed of our wheels, the position of our elevator, and much more.
-Modern motors have encoders built in.
-However, absolute encoders are only useful when they cannot rotate more than once, because they only return their position in a 0-360 degree range.
-Motors are almost always geared down such that the motor rotates multiple times per rotation of the mechanism it is attached to, making the absolute data not so absolute.
+Relative encoders measure relative change in position (e.g. the shaft has rotated 30 degrees since powering on), while absolute encoders measure the exact position of the shaft (e.g. the shaft is 30 degrees from some absolute zero position).
+Modern motors have relative encoders built in.
+
+Absolute encoders are useful when they're measuring something that cannot rotate more than once, because they only return their position in a 0-360 degree range.
+This is well suited to mechanisms like arms, that usually can't rotate past 360 degrees without causing problems.
+However, motors are almost always geared down such that the motor rotates multiple times per rotation of the mechanism it is attached to, making the absolute data not so absolute.
While this gearing is required to get enough torque from the motor, this is something to keep in mind when prototyping or looking at designs for mechanisms.
If a mechanism has an absolute encoder, the encoder should rotate one time or less over the mechanism's range.
-If it is infeasable to have an encoder that goes 1 or less rotations for a mechanism, you might want to take a look at the next section, [Limit Switches](#limit-switches)
+If it is infeasible to have an encoder that rotates one time or less for a mechanism, you might want to take a look at the next section, [Limit Switches](#limit-switches).
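The gearing problem described above can be sketched numerically (the 25:1 ratio here is made up for illustration):

```java
public class EncoderMath {
    // With a geared-down mechanism, the motor turns gearRatio times per
    // mechanism rotation, so mechanism travel is motor rotations / ratio.
    static double mechanismDegrees(double motorRotations, double gearRatio) {
        return motorRotations / gearRatio * 360.0;
    }

    // An absolute encoder only reports position modulo one rotation (0-360).
    static double absoluteReadingDegrees(double trueDegrees) {
        return ((trueDegrees % 360.0) + 360.0) % 360.0;
    }

    public static void main(String[] args) {
        // A 25:1 reduction: 25 motor rotations = one full mechanism rotation.
        System.out.println(mechanismDegrees(25.0, 25.0)); // prints 360.0
        // An absolute encoder on the *motor* shaft reads the same value every
        // rotation, so it cannot tell 10 degrees of travel from 370 degrees.
        System.out.println(absoluteReadingDegrees(370.0)); // prints 10.0
    }
}
```

This is why an absolute encoder is most useful when mounted directly on the mechanism, after the gearing.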
An example of an absolute encoder is the CTRE CANcoder (upper picture).
The CANcoder sends absolute position and velocity data over the CAN network and uses a magnet embedded in the mechanism to keep track of position.
-We use these to track the heading of our swerve modules.
-However, designing and manufacturing mechanisms to hold that magnet is inconvenient so we don't usually use CANcoders for other mechanisms.
+We use these to track the heading of our swerve modules, as well as mechanisms like arms.
-The Rev Through Bore encoder (lower picture) solves this mounting problem by connecting directly to hex shaft (the most common type of shaft in FRC).
+The Rev Through Bore encoder (lower picture) solves the mounting problem by connecting directly to hex shaft (the most common type of shaft in FRC).
However, the Through Bore does not communicate over CAN and requires wiring to the DIO ports.
We used one of these on the hood of our 2022 robot.
@@ -251,7 +266,7 @@ Some teams have seen success getting around the absolute encoder only having one
### Limit Switches
-
+
A limit switch is a simple sensor that tells us whether a switch is pressed or not.
They are one of the electrically simplest sensors we can use, and are quite useful as a way to reset the encoder on a mechanism ("Zero" the mechanism).
@@ -268,21 +283,34 @@ This tells us that we are at the hardstop in the same way that a limit switch wo
### IMU
-
+
An Inertial Measurement Unit (IMU or sometimes Gyro) is a sensor that measures its acceleration and rotation.
We primarily use them to measure the rotation of the robot (heading) so that the robot knows which way it is pointing on the field.
This lets us run field-relative swerve drive.
Pictured above is the Pigeon 2.0 IMU by CTRE, an IMU that connects over the CAN network.
+### Beambreaks/Light Sensors
+
+
+
+A beambreak has two parts: an emitter, which sends out a beam of infrared light, and a receiver, which is directly across from the emitter and receives that light.
+These are useful when we need to detect whether something, like a game piece, is present.
+If we mount the two parts on opposite sides of a mechanism, the game piece will block the light from reaching the other side and thus tell us that it's there.
+
+We might use an IR reflective sensor if we don't want to or can't mount a separate receiver; it combines the emitter and receiver on one side and instead detects the light bouncing back off an object in front of it.
+
### Cameras and Vision
-
+
Cameras and vision systems are an important part of advanced FRC software.
-Vision in FRC can be used to detect visual markers called apriltags on the field, game pieces, and even other robots.
-The pictured camera is a Limelight camera, a purchaseable vision solution that we used from 2021-2023.
-Limelights connect to the robot over ethernet.
-However they are generally not built for apriltag detection and pose estimation, which has pushed us towards other solutions.
-These generally involve more custom hardware, such as [arducam cameras](https://www.arducam.com/product/arducam-100fps-global-shutter-usb-camera-board-1mp-720p-ov9281-uvc-webcam-module-with-low-distortion-m12-lens-without-microphones-for-computer-laptop-android-device-and-raspberry-pi/), [orange pi processors](http://www.orangepi.org/), and [PhotonVision software](https://photonvision.org/).
-We used that hardware and software to success in 2024 for pose estimation with apriltags.
+Vision in FRC can be used to detect visual markers called AprilTags on the field, game pieces, and even other robots.
+The pictured camera is a Limelight, a purchasable vision solution that connects to the robot over Ethernet; we used Limelights from 2021-2023.
+
+However, we've moved towards custom hardware, such as [Arducam cameras](https://www.arducam.com/product/arducam-100fps-global-shutter-usb-camera-board-1mp-720p-ov9281-uvc-webcam-module-with-low-distortion-m12-lens-without-microphones-for-computer-laptop-android-device-and-raspberry-pi/) and [Orange Pi processors](http://www.orangepi.org/).
+See the [Vision](../3_Specifics/3.5_Vision.md) article for more details.
+The cameras plug into the USB ports on the Orange Pi.
+The Orange Pi connects to the radio over Ethernet.
+It's powered off the PDH with a buck/step down converter (which decreases voltage and increases current, since the Orange Pi only takes 5V and not 12).
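The step-down math in that last sentence can be sketched with made-up numbers (the 90% efficiency is hypothetical):

```java
public class BuckConverter {
    // An ideal converter conserves power: V_in * I_in = V_out * I_out.
    // Real converters lose a little to heat, modeled here as an efficiency factor.
    static double outputCurrentAmps(double vIn, double iIn, double vOut, double efficiency) {
        return vIn * iIn * efficiency / vOut;
    }

    public static void main(String[] args) {
        // Stepping 12 V down to 5 V at 90% efficiency: 1 A in becomes ~2.16 A out,
        // which is why "decreases voltage and increases current" holds.
        System.out.println(outputCurrentAmps(12.0, 1.0, 5.0, 0.9));
    }
}
```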
+If you're having issues, check the PhotonVision docs pages on [networking](https://docs.photonvision.org/en/latest/docs/quick-start/networking.html) and [wiring](https://docs.photonvision.org/en/latest/docs/quick-start/wiring.html).
\ No newline at end of file
diff --git a/Docs/2_Architecture/2.3_CommandBased.md b/Docs/2_Architecture/2.3_CommandBased.md
index 484a7c4..7de4cf1 100644
--- a/Docs/2_Architecture/2.3_CommandBased.md
+++ b/Docs/2_Architecture/2.3_CommandBased.md
@@ -4,17 +4,15 @@
Command based programming revolves around three concepts: **Subsystems**, **Commands**, and **Triggers**.
-A Subsystem is a set of hardware that forms one system on our robot, like the drivetrain, elevator, arm, or intake.
-Each subsystem contains some associated hardware (motors, pistons, sensors, etc.) They are the "nouns" of our robot, what it is.
-Each Subsystem is generally made to contain a broad set of hardware that will always operate as a unit.
+A Subsystem represents a system on our robot, like the drivetrain, elevator, arm, or intake, that will always operate as a unit.
+Each subsystem contains some associated hardware (motors, pistons, sensors, etc).
+They are the "nouns" of our robot: what it is.
Commands are the "verbs" of the robot, or what our robot does.
Each Subsystem can be used by only one Command at a time, but a Command may use many Subsystems.
-Commands can be composed together, so the `LineUp`, `Extend`, and `Outake` Commands might be put together to make a `Score` Command.
+Commands can be composed together, so the `LineUp`, `Extend`, and `Outtake` Commands might be put together to make a `Score` Command.
Because each Subsystem can only be used by one Command at once, we are safe from multiple pieces of code trying to command the same motor to different speeds, for example.
-Subsystems are ways to organize resources that can be used by one Command at a time.
-
Some hardware might not be stored in a Subsystem if multiple things can/should use it at the same time safely.
For example, a vision setup can be read from by many things, and might not need to be locked by Commands.
Therefore, it might not be stored in a Subsystem.
@@ -24,27 +22,25 @@ Therefore, we would wrap it in a Subsystem so that only one Command can use it a
A Trigger is something which can start a Command.
The classic form of this is a button on the driver's controller.
-Another common type is one which checks for when the robot enables.
-One non-obvious Trigger we used in 2024 was one which checked when we had detected a game piece in the robot, which we used to flash our LEDs and vibrate the driver controller.
-Triggers can be made of any function that returns a boolean which makes them very powerful.
+Another common type is one which checks if the robot is enabled.
+One non-obvious Trigger we used in 2024 was one which checked when we had detected a game piece in the robot, which then triggered the Commands to flash our LEDs and vibrate the driver controller.
+Triggers can be made of any function that returns a boolean, which makes them very powerful, and can be composed together with boolean operators.
+Triggers can also be bound to a certain activation event.
+For example, we might want something to happen while a condition is true (`whileTrue()`), when a condition changes from false to true (`onTrue()`), or once a condition has been true for some amount of time (`debounce()`).
+
Some large Commands are better represented by several Commands and some Triggers!
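Conceptually, a Trigger is just a wrapper around a function that returns a boolean. Here is a rough pure-Java sketch of the composition idea (this is illustrative only, not the actual WPILib `Trigger` API):

```java
import java.util.function.BooleanSupplier;

public class TriggerSketch {
    // A tiny stand-in for WPILib's Trigger: wraps a BooleanSupplier and
    // supports boolean composition.
    static class Trigger {
        private final BooleanSupplier condition;

        Trigger(BooleanSupplier condition) {
            this.condition = condition;
        }

        // Composed trigger that is true only when both inputs are true.
        Trigger and(Trigger other) {
            return new Trigger(() -> get() && other.get());
        }

        // Composed trigger that inverts this one.
        Trigger negate() {
            return new Trigger(() -> !get());
        }

        boolean get() {
            return condition.getAsBoolean();
        }
    }

    public static void main(String[] args) {
        Trigger hasGamePiece = new Trigger(() -> true); // e.g. a beambreak reading
        Trigger isEnabled = new Trigger(() -> true);    // e.g. robot enabled state
        // Composition: flash the LEDs only while enabled *and* holding a piece.
        Trigger shouldFlashLeds = hasGamePiece.and(isEnabled);
        System.out.println(shouldFlashLeds.get()); // prints "true"
    }
}
```

The real WPILib class works the same way under the hood, with scheduler integration and bindings like `onTrue()` layered on top.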
-# update with superstructure stuff later
+Subsystems, Commands, and Triggers are the building blocks of the robot's overall "superstructure".
+This will be covered in more detail in the [Superstructure article.](2.10_Superstructure.md)
### Resources
-- [WPILib intro to functional programming](https://docs.wpilib.org/en/stable/docs/software/basic-programming/functions-as-data.html).
- Read through this article on lambda expressions and functional programming if you haven't already.
-- [WPILib docs](https://docs.wpilib.org/en/stable/docs/software/commandbased/index.html).
- Read through these docs until you finish "Organizing Command-Based Robot Projects"
+- Read through [this article](https://docs.wpilib.org/en/stable/docs/software/basic-programming/functions-as-data.html) on lambda expressions and functional programming if you haven't already.
+- Read through [these docs](https://docs.wpilib.org/en/stable/docs/software/commandbased/index.html) until you finish "Organizing Command-Based Robot Projects"
OR watch [this video](https://drive.google.com/file/d/1ykFDfXVYk27aHlXYKTAqtj1U2T80Szdj/view?usp=drive_link).
- Presentation notes for the video are [here](CommandBasedPresentationNotes.md).
- If you watch the video, it is recommended to also read the [Subsystems](https://docs.wpilib.org/en/stable/docs/software/commandbased/subsystems.html), [Binding Commands to Triggers](https://docs.wpilib.org/en/stable/docs/software/commandbased/binding-commands-to-triggers.html), and [Organizing Command-Based Robot Projects](https://docs.wpilib.org/en/stable/docs/software/commandbased/organizing-command-based.html#) for addition details on using Command-Based.
-
- The important segment of all of this to remember is:
- > Commands represent actions the robot can take. Commands run when scheduled, until they are interrupted or their end condition is met. Commands are very recursively composable: commands can be composed to accomplish more-complicated tasks. See Commands for more info.
- >
- > Subsystems represent independently-controlled collections of robot hardware (such as motor controllers, sensors, pneumatic actuators, etc.) that operate together. Subsystems back the resource-management system of command-based: only one command can use a given subsystem at the same time. Subsystems allow users to “hide” the internal complexity of their actual hardware from the rest of their code - this both simplifies the rest of the robot code, and allows changes to the internal details of a subsystem’s hardware without also changing the rest of the robot code.
+ Presentation notes for the video are [here](2.4_CommandBasedPresentationNotes.md).
+
+- If you watch the video, it is recommended to also read the [Subsystems](https://docs.wpilib.org/en/stable/docs/software/commandbased/subsystems.html), [Binding Commands to Triggers](https://docs.wpilib.org/en/stable/docs/software/commandbased/binding-commands-to-triggers.html), and [Organizing Command-Based Robot Projects](https://docs.wpilib.org/en/stable/docs/software/commandbased/organizing-command-based.html#) for additional details on using Command-Based.
### Examples
@@ -53,12 +49,12 @@ Some large Commands are better represented by several Commands and some Triggers
### Exercises
-- Make basic KitBot code using the Command-Based skeleton. You can follow [this](KitbotExampleWalkthrough.md) tutorial.
+- Make basic KitBot code using the Command-Based skeleton. You can follow [this](2.5_KitbotExampleWalkthrough.md) tutorial.
### Notes
- We prefer making simple Commands with Command factories, or methods in a subsystem that return a Command.
These methods should be simple interactions like `setTargetExtensionMeters()` or `extendIntake()`.
- Then you can use decorators as described [here](https://docs.wpilib.org/en/stable/docs/software/commandbased/command-compositions.html) to compose the basic Commands into more complex sequences.
- Generally we make these compositions in `Robot` and `Superstructure` but you can also make single-Subsystem compositions within that Subsystem.
+ Then, you can use decorators as described [here](https://docs.wpilib.org/en/stable/docs/software/commandbased/command-compositions.html) to compose the basic Commands into more complex sequences.
+ Generally, we make these compositions in `Robot` and `Superstructure`, but you can also make single-Subsystem compositions within that Subsystem.
See our code from previous years for examples of this pattern, or talk to a software lead.
diff --git a/Docs/2_Architecture/2.6_AdvantageKit.md b/Docs/2_Architecture/2.6_AdvantageKit.md
index 70a16a4..34e8803 100644
--- a/Docs/2_Architecture/2.6_AdvantageKit.md
+++ b/Docs/2_Architecture/2.6_AdvantageKit.md
@@ -4,23 +4,29 @@
### What is logging?
-Logging is recording some or all of the state (think the current values of variables, inputs and outputs, and what Commands are running,) of the robot so that we can play it back later.
+Logging is recording some or all of the state (such as the current values of variables, inputs and outputs, and what Commands are running) of the robot so that we can play it back later.
### Why log?
-Logging helps with debugging by letting us see exactly what the robot was doing when it broke.
-This is especially useful at competition when we have limited time and testing ability to diagnose problems.
+Logging helps with debugging by letting us see exactly what was happening to the robot and what it was doing when it broke.
+This is especially useful at competition, when we have limited time and testing ability to diagnose problems.
For instance, at Houston 2023 we had an issue in our second quals match where our grabber stopped responding to input.
Using the logs of that match, we saw that the sensor readings of the grabber had stopped responding, which suggested that the CAN bus to the grabber had broken.
### Why AdvantageKit?
-AdvantageKit logs every input and output to and from the robot, so that we can perfectly recreate the state of the robot in the log or with a simulator.
-Logging everything means we never have to say "if only we had logged one more thing!" It also means that we can simulate how the code might have responded differently with changes.
-One of the examples 6328 uses is when they adjusted the way they tracked vision targets based on a match log that revealed a problem, then used the log replay to confirm that the change fixed the vision issue.
-AdvantageKit is a mature and developed framework for this type of logging, and has been used on Einstein by 6328 and 3476.
-The framework should continue to be maintained for the forseeable future and by using the framework instead of a custom solution we reduce our overhead for using a system like this.
-AdvantageKit is closely integrated with AdvantageScope, a log and sim viewer built by 6328
+AdvantageKit logs every input and output to and from the robot.
+This means we can perfectly recreate the state of the robot in the log or with a simulator.
+
+
+
+It also means that we can run the same inputs through modified code, and simulate how the robot would have responded.
+AdvantageKit calls this replay.
+One example from 6328: they used a log to diagnose an issue with the way they tracked vision targets, adjusted the code, then used replay to confirm that the change would have produced the correct outputs with the same inputs.
+
+AdvantageKit is a mature framework for this type of logging and should continue to be maintained for the foreseeable future.
+
+AdvantageKit is closely integrated with AdvantageScope, a log and sim viewer built by 6328.
### Drawbacks
@@ -28,12 +34,12 @@ Running this amount of logging has performance overhead on the rio, using valuab
Logging also requires a significant architecture change to our codebase by using an IO layer under each of our subsystems.
While this does require some additional effort to write subsystems, it also makes simulating subsystems easier so it is a worthwhile tradeoff.
-We have not yet done significant on-robot r&d with AdvantageKit and need to assess the performance impacts of using it.
+8033-specific usage of AdvantageKit features will be covered in more detail in the next couple of lessons.
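As a preview, the IO-layer pattern looks roughly like this (the names are illustrative, not our actual subsystem code):

```java
public class IoLayerSketch {
    // Inputs are plain fields so a logging framework can record and replay them.
    static class ElevatorIOInputs {
        double positionMeters;
        double velocityMetersPerSecond;
    }

    // The subsystem talks only to this interface; real hardware and the
    // simulator each provide their own implementation.
    interface ElevatorIO {
        void updateInputs(ElevatorIOInputs inputs);
        void setVolts(double volts);
    }

    // A trivial stand-in implementation, e.g. for tests or simulation.
    static class ElevatorIOFake implements ElevatorIO {
        double lastVolts = 0.0;

        public void updateInputs(ElevatorIOInputs inputs) {
            inputs.positionMeters = 1.0; // pretend sensor reading
        }

        public void setVolts(double volts) {
            lastVolts = volts;
        }
    }

    public static void main(String[] args) {
        ElevatorIOFake io = new ElevatorIOFake();
        ElevatorIOInputs inputs = new ElevatorIOInputs();
        io.updateInputs(inputs); // read "sensors" into the inputs object
        io.setVolts(3.0);        // command an "output"
        System.out.println(inputs.positionMeters + " " + io.lastVolts);
    }
}
```

Because the subsystem only sees the interface, swapping real hardware for a simulator is a one-line change.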
### Resources
+- [AdvantageKit docs](https://docs.advantagekit.org/)
- [AdvantageKit repository](https://github.com/Mechanical-Advantage/AdvantageKit)
- - See the README for this repo for docs
- [AdvantageScope log viewer](https://github.com/Mechanical-Advantage/AdvantageScope)
- [6328 logging talk](https://youtu.be/mmNJjKJG8mw)
@@ -41,14 +47,15 @@ We have not yet done significant on-robot r&d with AdvantageKit and need to asse
- [6328 2023 code](https://github.com/Mechanical-Advantage/RobotCode2023)
- [3476 2023 code](https://github.com/FRC3476/FRC-2023)
-- [8033 2023 AdvantageKit port](https://github.com/HighlanderRobotics/Charged-Up/tree/advantagekit) [LINK DOWN]
+- [8033 2025 code](https://github.com/HighlanderRobotics/Reefscape)
### Exercises
-- Install AdvantageKit into your kitbot project following [this tutorial](https://github.com/Mechanical-Advantage/AdvantageKit/blob/45d8067b336c7693e63ee01cdeff0e5ddf50b92d/docs/INSTALLATION.md).
- You do not need to modify the subsystem file yet, we will do that as part of the simulation tutorial.
+- Install AdvantageKit into your kitbot project following [this tutorial](https://docs.advantagekit.org/getting-started/installation/existing-projects).
+ You do not need to add the suggested block in the `Robot()` constructor.
+ We will do that as part of the simulation tutorial.
### Notes
-- See also the [Simulation](Simulation.md) article for more on the IO layer structure
-- _When we have log files, put a link to one here as an example_
+- See the [AdvantageKit Structure Reference](2.7_AKitStructureReference.md) article for more on the IO layer structure
+- [Here](https://drive.google.com/drive/folders/1qNMZ7aYOGI31dQNAwt7rhFo97NR7mxtr) are some logs from our 2023-2024 season
diff --git a/Docs/2_Architecture/2.9_KitbotExampleWalkthroughSim.md b/Docs/2_Architecture/2.9_KitbotExampleWalkthroughSim.md
index 190740d..a6667e4 100644
--- a/Docs/2_Architecture/2.9_KitbotExampleWalkthroughSim.md
+++ b/Docs/2_Architecture/2.9_KitbotExampleWalkthroughSim.md
@@ -18,7 +18,7 @@ The first step of moving our standard Command-based code to a loggable, simulate
Luckily AdvantageKit has a handy guide on how to install it on an existing code base.
Follow [this walkthrough](https://github.com/Mechanical-Advantage/AdvantageKit/blob/main/docs/INSTALLATION.md).
Follow all the steps in the doc through adding the `@AutoLog` annotation.
-You do not need to add the suggested block in `robotInit()`, instead use the one below.
+You do not need to add the suggested block in `Robot()`, instead use the one below.
```Java
Logger.recordMetadata("ProjectName", "KitbotExample"); // Set a metadata value
diff --git a/Examples/KitbotDemoFinal/src/main/java/frc/robot/subsystems/Drivetrain/DrivetrainIOReal.java b/Examples/KitbotDemoFinal/src/main/java/frc/robot/subsystems/Drivetrain/DrivetrainIOReal.java
new file mode 100644
index 0000000..7f33eae
--- /dev/null
+++ b/Examples/KitbotDemoFinal/src/main/java/frc/robot/subsystems/Drivetrain/DrivetrainIOReal.java
@@ -0,0 +1,61 @@
+// Copyright (c) FIRST and other WPILib contributors.
+// Open Source Software; you can modify and/or share it under the terms of
+// the WPILib BSD license file in the root directory of this project.
+
+package frc.robot.subsystems.Drivetrain;
+
+import com.ctre.phoenix6.controls.VoltageOut;
+import com.ctre.phoenix6.hardware.TalonFX;
+
+import edu.wpi.first.wpilibj.simulation.DifferentialDrivetrainSim;
+import edu.wpi.first.wpilibj.simulation.DifferentialDrivetrainSim.KitbotGearing;
+import edu.wpi.first.wpilibj.simulation.DifferentialDrivetrainSim.KitbotMotor;
+import edu.wpi.first.wpilibj.simulation.DifferentialDrivetrainSim.KitbotWheelSize;
+import edu.wpi.first.wpilibj.simulation.RoboRioSim;
+
+public class DrivetrainIOReal implements DrivetrainIO {
+ TalonFX leftFalcon = new TalonFX(Constants.drivetrainLeftFalconID);
+ TalonFX rightFalcon = new TalonFX(Constants.drivetrainRightFalconID);
+
+ VoltageOut leftVoltage = new VoltageOut(0);
+ VoltageOut rightVoltage = new VoltageOut(0);
+
+ DifferentialDrivetrainSim physicsSim = DifferentialDrivetrainSim.createKitbotSim(
+ KitbotMotor.kDoubleFalcon500PerSide,
+ KitbotGearing.k8p45,
+ KitbotWheelSize.kSixInch,
+ null);
+
+ @Override
+ public void updateInputs(DrivetrainIOInputs inputs) {
+ physicsSim.update(0.020);
+
+ var leftSimState = leftFalcon.getSimState();
+ leftSimState.setSupplyVoltage(RoboRioSim.getVInVoltage());
+
+ var rightSimState = rightFalcon.getSimState();
+ rightSimState.setSupplyVoltage(RoboRioSim.getVInVoltage());
+
+ physicsSim.setInputs(leftSimState.getMotorVoltage(), rightSimState.getMotorVoltage());
+
+ inputs.leftOutputVolts = leftSimState.getMotorVoltage();
+ inputs.rightOutputVolts = rightSimState.getMotorVoltage();
+
+ inputs.leftVelocityMetersPerSecond = physicsSim.getLeftVelocityMetersPerSecond();
+ inputs.rightVelocityMetersPerSecond = physicsSim.getRightVelocityMetersPerSecond();
+
+ inputs.leftPositionMeters = physicsSim.getLeftPositionMeters();
+ inputs.rightPositionMeters = physicsSim.getRightPositionMeters();
+
+ inputs.leftCurrentAmps = new double[] {leftSimState.getTorqueCurrent()};
+ inputs.leftTempCelsius = new double[0];
+ inputs.rightCurrentAmps = new double[] {rightSimState.getTorqueCurrent()};
+ inputs.rightTempCelsius = new double[0];
+ }
+
+ @Override
+ public void setVolts(double left, double right) {
+ leftFalcon.setControl(leftVoltage.withOutput(left));
+ rightFalcon.setControl(rightVoltage.withOutput(right));
+ }
+}