
Hi, I'm Tabby 👋

I'm a PhD researcher at Brown University advised by Professor Nora Ayanian. My work is on robotic cloth manipulation: building the hardware, perception, and learning systems needed to handle fabric reliably on real robots.


🧪 Policy Training — ACT for Cloth Folding

Training visuomotor fold policies with ACT (Action Chunking with Transformers). 267 demos, 15+ models trained, 40–60% deployment success. Currently comparing image encoders (ResNet vs DINOv2) and pretrained action models (OpenVLA, Octo, pi0).

multi-fold.mp4
single-fold.mp4

Multi-fold with cloth resets · Single-fold baseline (ResNet18, no pretrained action model)
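At deployment, ACT executes overlapping action chunks and blends the predictions that cover each timestep (temporal ensembling). A minimal NumPy sketch of that blending step, as my own simplification of the paper's exponential weighting scheme; function and parameter names are illustrative, not from the training code:

```python
import numpy as np

def temporal_ensemble(chunks, t, m=0.01):
    """Blend every chunk prediction that covers timestep t.

    chunks: list of (start_step, actions), actions shaped (horizon, action_dim).
    The oldest covering chunk gets the largest weight, following ACT's
    exponential scheme w_i = exp(-m * i) with i = 0 for the oldest prediction.
    """
    preds, weights = [], []
    for start, actions in sorted(chunks, key=lambda c: c[0]):
        offset = t - start
        if 0 <= offset < len(actions):          # this chunk covers timestep t
            preds.append(actions[offset])
            weights.append(np.exp(-m * len(weights)))
    w = np.array(weights) / np.sum(weights)      # normalize to a convex blend
    return (np.stack(preds) * w[:, None]).sum(axis=0)
```

With a small `m`, overlapping chunks contribute nearly equally, which smooths out jitter between consecutive policy queries.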


🧠 Learning Cloth Dynamics — From Paper to Real Fabric

Took PhysTwin and PGND, got them running on cloth data I collected myself, then extended PGND with a differentiable render loss (DINOv2 + SSIM) and live camera conditioning at rollout time. Three model variants, each adding a new supervisory signal on top of the last.

pgnd-comparison-all.mp4

Baseline vs. Visual PGND — all held-out episodes
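The differentiable render loss combines a structural image term with a feature-similarity term. A rough sketch of that combination, using a global single-window SSIM and a cosine distance between feature vectors as stand-ins for the full windowed SSIM and DINOv2 patch features; the weights and function names are illustrative:

```python
import numpy as np

def ssim_global(x, y, c1=0.01**2, c2=0.03**2):
    # Single-window SSIM over the whole image; real implementations
    # average SSIM over local Gaussian windows instead.
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2*mx*my + c1) * (2*cov + c2)) / ((mx**2 + my**2 + c1) * (vx + vy + c2))

def render_loss(rendered, target, feat_r, feat_t, w_ssim=0.5, w_feat=0.5):
    # Structural term: 1 - SSIM between rendered and observed images.
    l_ssim = 1.0 - ssim_global(rendered, target)
    # Feature term: cosine distance between image features
    # (a stand-in here for DINOv2 features of the two images).
    cos = feat_r @ feat_t / (np.linalg.norm(feat_r) * np.linalg.norm(feat_t))
    return w_ssim * l_ssim + w_feat * (1.0 - cos)
```

Both terms are differentiable in the rendered image, so the loss can be backpropagated through a differentiable renderer into the dynamics model's predicted state.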

cloth-dynamics-fold-l-over-r-v2.mp4
sew-unit-dual-pull-apart.mp4

PhysTwin novel actions: fold left-over-right · bimanual pull-apart, simulated with parameters learned from a single training trajectory
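Learning simulation parameters from a single trajectory amounts to searching for the parameters whose rollout best reproduces the one observation. A toy 1-D analogue of that idea (my own illustration, not PhysTwin's actual optimizer, which fits many physical parameters with more sophisticated search):

```python
import numpy as np

def simulate(k, damping=0.4, x0=1.0, steps=60, dt=0.05):
    """Toy 1-D spring-mass rollout (semi-implicit Euler) with stiffness k."""
    x, v, traj = x0, 0.0, []
    for _ in range(steps):
        v += (-k * x - damping * v) * dt
        x += v * dt
        traj.append(x)
    return np.array(traj)

def fit_stiffness(observed, candidates):
    """Pick the stiffness whose rollout best matches one observed trajectory."""
    errors = [np.mean((simulate(k) - observed) ** 2) for k in candidates]
    return candidates[int(np.argmin(errors))]
```

Once the parameters are fit, the same simulator can roll out novel actions, like the fold and pull-apart above, that never appeared in the training trajectory.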


🖐 Custom End-Effectors

Two custom end-effectors: silicone FSR grippers with embedded force sensors for contact-aware grasping, and a UMI-inspired handheld teleop gripper with ArUco markers and IMU for imitation-learning data collection.
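An FSR in a voltage divider turns grip force into a resistance drop the controller can read. A hypothetical conversion sketch — divider topology, supply voltage, ADC range, and contact threshold are illustrative values, not the gripper's actual configuration:

```python
def fsr_resistance(adc, vcc=3.3, r_fixed=10_000, adc_max=4095):
    """Convert a raw ADC reading from an FSR voltage divider
    (FSR on the high side, fixed resistor to ground) to FSR resistance in ohms."""
    v_out = adc / adc_max * vcc
    if v_out <= 0:
        return float("inf")          # no pressure: FSR is effectively open
    return r_fixed * (vcc - v_out) / v_out

def in_contact(adc, threshold_ohms=30_000):
    """Simple contact check: FSR resistance drops sharply under load."""
    return fsr_resistance(adc) < threshold_ohms
```

A thresholded contact bit like this is enough to stop closing the gripper once the fabric is pinched, rather than squeezing to a fixed position.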


🦾 Sew Unit — Bimanual Cloth Manipulation Platform

A bimanual robot platform I designed and built from scratch: custom aluminum extrusion frame, two inverted SO-101 arms, ROS2/MoveIt motion planning, and leader-follower teleoperation for data collection.

sew-unit-cad-spin.mp4

CAD model spin
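At its core, leader-follower teleoperation streams the leader arm's joint positions to the follower with smoothing and limit clamping. A minimal sketch of one update step — the real system runs through ROS2, and the gain and limits here are illustrative:

```python
import numpy as np

def follow_step(follower_q, leader_q, limits, alpha=0.3):
    """One leader-follower teleop update: low-pass filter the follower's
    joint positions toward the leader's, then clamp to joint limits.

    follower_q, leader_q: current joint positions (radians).
    limits: (lower, upper) joint-limit bounds.
    alpha: smoothing gain in (0, 1]; higher tracks the leader more aggressively.
    """
    target = (1 - alpha) * np.asarray(follower_q) + alpha * np.asarray(leader_q)
    lo, hi = limits
    return np.clip(target, lo, hi)
```

Run at a fixed rate, this gives compliant-feeling tracking while the clamp keeps a careless leader motion from driving the follower past its limits; the resulting joint streams double as demonstrations for policy training.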


Research repos

  • TabithaKO/PhysTwin — SO-101 cloth data pipeline, multi-camera perception, trajectory generation
  • TabithaKO/pgnd — visual PGND: mesh-constrained Gaussian Splatting + DINOv2 camera conditioning

📄 tabithako.github.io · 👗 Fashion
