Changes from all commits (48 commits)

- ec0b4fa: Update README.md (Junxiong-Chen, Sep 12, 2025)
- 9499e5a: Update README.md (Junxiong-Chen, Sep 12, 2025)
- fb17472: Update README.md (Junxiong-Chen, Sep 14, 2025)
- 9eb4e82: Update README.md (Junxiong-Chen, Sep 14, 2025)
- 310810e: Update README.md (Junxiong-Chen, Sep 14, 2025)
- 3c4fdd2: Update README.md (Junxiong-Chen, Sep 14, 2025)
- cd83df8: Update README.md (Junxiong-Chen, Sep 14, 2025)
- e45fc6f: Update README.md (Junxiong-Chen, Sep 14, 2025)
- fcc3bc8: Update README.md (Junxiong-Chen, Sep 14, 2025)
- d5bb999: Update README.md (Junxiong-Chen, Sep 14, 2025)
- d53a6af: Update README.md (Junxiong-Chen, Sep 14, 2025)
- e32e70c: Add files via upload (Junxiong-Chen, Sep 17, 2025)
- d3a2304: Update README.md (Junxiong-Chen, Sep 17, 2025)
- 4125624: Update README.md (Junxiong-Chen, Sep 17, 2025)
- df9c8ee: Update README.md (Junxiong-Chen, Sep 17, 2025)
- 7d57e3c: get lab3 updates (Junxiong-Chen, Sep 24, 2025)
- c74c190: Merge remote-tracking branch 'upstream/Fall2025' into Fall2025 (Junxiong-Chen, Sep 25, 2025)
- afbf4a5: git pushMerge remote-tracking branch 'origin/Fall2025' into Fall2025 (Junxiong-Chen, Sep 25, 2025)
- 3f264e1: Update README.md (Junxiong-Chen, Sep 28, 2025)
- 43ebf74: Update README.md (Junxiong-Chen, Sep 28, 2025)
- 304ee96: Update README.md (Junxiong-Chen, Sep 28, 2025)
- 0a46ecf: Update README.md (Junxiong-Chen, Oct 2, 2025)
- 6ce178f: Update README.md (Junxiong-Chen, Oct 5, 2025)
- 2449d1e: Update README.md (Junxiong-Chen, Oct 6, 2025)
- f774bdc: Update README.md (Junxiong-Chen, Oct 6, 2025)
- d386324: Update README.md (Junxiong-Chen, Oct 6, 2025)
- 10312b0: Update README.md (Junxiong-Chen, Oct 6, 2025)
- 3b51581: Update README.md (Junxiong-Chen, Oct 6, 2025)
- bf2223f: Update README.md (Junxiong-Chen, Oct 6, 2025)
- e629824: Update README.md (Junxiong-Chen, Oct 6, 2025)
- 642318a: Update README.md (Junxiong-Chen, Oct 6, 2025)
- 3d53b81: Update README.md (Junxiong-Chen, Oct 6, 2025)
- 4f6b936: Update README.md (Junxiong-Chen, Oct 6, 2025)
- 51e4c8d: Add files via upload (Junxiong-Chen, Oct 11, 2025)
- 5110823: Update README.md (Junxiong-Chen, Oct 11, 2025)
- b686be7: Update README.md (Junxiong-Chen, Oct 12, 2025)
- b6d0f2f: Update README.md (Junxiong-Chen, Oct 12, 2025)
- f118372: Update README.md (Junxiong-Chen, Oct 12, 2025)
- 9dc31a7: Update README.md (Junxiong-Chen, Oct 14, 2025)
- a5899e8: Update README.md (Junxiong-Chen, Oct 14, 2025)
- a5a9acb: Lab4 jade (#2) (Jadepypy, Oct 24, 2025)
- 336444e: Merge branch 'Fall2025' into Fall2025 (Junxiong-Chen, Oct 30, 2025)
- 3e82ab6: Lab5 (#4) (Jadepypy, Nov 5, 2025)
- 1cdc8b7: Improve formatting and clarity in README.md (Jadepypy, Nov 5, 2025)
- 87a19ad: Merge branch 'IRL-CT:Fall2025' into Fall2025 (Jadepypy, Nov 13, 2025)
- bb54551: Enhance README with project details and reflections (Jadepypy, Nov 13, 2025)
- 19c4cc3: update code (Nov 13, 2025)
- 608937b: add picture (Nov 13, 2025)
64 changes: 62 additions & 2 deletions Lab 1/README.md
@@ -2,7 +2,7 @@

# Staging Interaction

\*\***NAME OF COLLABORATOR HERE**\*\*


In the original stage production of Peter Pan, Tinker Bell was represented by a darting light created by a small handheld mirror off-stage, reflecting a little circle of light from a powerful lamp. Tinkerbell communicates her presence through this light to the other characters. See more info [here](https://en.wikipedia.org/wiki/Tinker_Bell).

@@ -64,33 +64,60 @@ To stage an interaction with your interactive device, think about:

_Setting:_ Where is this interaction happening? (e.g., a jungle, the kitchen) When is it happening?

In a workplace.

_Players:_ Who is involved in the interaction? Who else is there? If you reflect on the design of current day interactive devices like the Amazon Alexa, it’s clear they didn’t take into account people who had roommates, or the presence of children. Think through all the people who are in the setting.

A poor student with too much work to do.

_Activity:_ What is happening between the actors?

The device reminds the student to drink a certain amount of water.

_Goals:_ What are the goals of each player? (e.g., jumping to a tree, opening the fridge).

To drink enough water and to prepare enough water for the next round.

The interactive device can be anything *except* a computer, a tablet computer or a smart phone, but the main way it interacts needs to be using light.

\*\***Describe your setting, players, activity and goals here.**\*\*

When a poor student is too absorbed in schoolwork, they may forget to drink enough water. The device measures the remaining water in the bottle, periodically orders the student to drink, and reminds them to refill the bottle for future rounds.
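This part is wizarded, but here is a minimal sketch of the reminder behavior described above. Everything in it is an assumption for illustration: the interval, the refill threshold, and the `read_water_fraction()` helper (in the wizarded version a person plays that role).

```
import time

# Hypothetical helper: in the wizarded version a person supplies this number;
# a real build would read a sensor instead.
def read_water_fraction():
    return 0.4  # placeholder: fraction of the bottle that is still full

REMIND_EVERY_S = 30 * 60  # assumed reminder interval (30 minutes)
REFILL_BELOW = 0.2        # assumed "please refill" threshold

def reminder_loop():
    while True:
        level = read_water_fraction()
        if level < REFILL_BELOW:
            print("Light blinks red: refill the bottle.")
        else:
            print("Light pulses blue: take a sip of water.")
        time.sleep(REMIND_EVERY_S)
```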

Storyboards are a tool for visually exploring a user's interaction with a device. They are a fast and cheap method to understand user flow, and iterate on a design before attempting to build on it. Take some time to read through this explanation of [storyboarding in UX design](https://www.smashingmagazine.com/2017/10/storyboarding-ux-design/). Sketch seven storyboards of the interactions you are planning. **It does not need to be perfect**, but must get across the behavior of the interactive device and the other characters in the scene.

\*\***Include pictures of your storyboards here**\*\*

![4a2a7bd6bb5ce869a0873717aeff236f](https://github.com/user-attachments/assets/863c91d3-744b-441c-9d8b-4128a125dba2)




Present your ideas to the other people in your breakout room (or in small groups). You can just get feedback from one another or you can work together on the other parts of the lab.

\*\***Summarize feedback you got here.**\*\*
Jade: Sounds healthy
Karl: Cool idea


## Part B. Act out the Interaction


https://github.com/user-attachments/assets/6046a4f5-557a-473f-9951-0a82c72dda10


Try physically acting out the interaction you planned. For now, you can just pretend the device is doing the things you’ve scripted for it.



\*\***Are there things that seemed better on paper than acted out?**\*\*

It is more eye-catching than I thought.

\*\***Are there new ideas that occur to you or your collaborator that come up from the acting?**\*\*

Maybe the color should change in discrete stages rather than gradually, to emphasize the different periods.
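To make that staged-color idea concrete, here is a small sketch of mapping the remaining-water fraction to discrete color stages instead of a gradient; the stage boundaries and colors are placeholders, not final design choices.

```
# Hypothetical mapping from remaining-water fraction to a discrete color stage.
STAGES = [
    (0.66, (0, 170, 255)),   # more than 2/3 left: calm blue
    (0.33, (255, 200, 0)),   # between 1/3 and 2/3: warning yellow
    (0.0,  (255, 40, 40)),   # below 1/3: urgent red
]

def stage_color(fraction):
    """Return the RGB color for the stage containing `fraction` (0.0 to 1.0)."""
    for threshold, color in STAGES:
        if fraction >= threshold:
            return color
    return STAGES[-1][1]

print(stage_color(0.5))  # -> (255, 200, 0)
```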


## Part C. Prototype the device

@@ -104,12 +131,20 @@ If you run into technical issues with this tool, you can also use a light switch

\*\***Give us feedback on Tinkerbelle.**\*\*

Really cool! I have never tried to control one computing device with another!


## Part D. Wizard the device
Take a little time to set up the wizarding set-up that allows for someone to remotely control the device while someone acts with it. Hint: You can use Zoom to record videos, and you can pin someone’s video feed if that is the scene which you want to record.

\*\***Include your first attempts at recording the set-up video here.**\*\*



https://github.com/user-attachments/assets/295cf934-57ba-4f47-859c-d50ff5eb5c85



Now, change the goal within the same setting, and update the interaction with the paper prototype.

\*\***Show the follow-up work here.**\*\*
@@ -123,16 +158,30 @@ Think about the setting of the device: is the environment a place where the devi

\*\***Include sketches of what your devices might look like here.**\*\*

![1676fa1c4f0087b5c68cc6720bea3847](https://github.com/user-attachments/assets/85ab013b-9665-404a-a125-bd2a03416f80)
![a02b364798ecfefda73321d87ee4d666](https://github.com/user-attachments/assets/5ef2f0b5-946f-477a-839d-1292a179f2df)



\*\***What concerns or opportunities are influencing the way you've designed the device to look?**\*\*

The device can easily be exposed to water. I have two plans to address this. First, use a light sensor to measure the water level from outside a transparent bottle. Second, craft a bottle with a built-in force sensor that senses the weight of the water, which gives an estimate of the volume.
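As a rough sanity check on the second plan, here is a sketch of turning a force-sensor reading into a remaining-volume estimate. The tare weight, bottle capacity, and the `read_sensor_grams()` helper are assumptions for illustration, not measurements from a real prototype.

```
WATER_DENSITY_G_PER_ML = 1.0   # water is roughly 1 g per mL
EMPTY_BOTTLE_G = 150.0         # assumed tare weight of the empty bottle
FULL_VOLUME_ML = 750.0         # assumed capacity of the bottle

def read_sensor_grams():
    # Placeholder for a real force/load-cell reading.
    return 600.0

def remaining_fraction():
    water_grams = max(read_sensor_grams() - EMPTY_BOTTLE_G, 0.0)
    volume_ml = water_grams / WATER_DENSITY_G_PER_ML
    return min(volume_ml / FULL_VOLUME_ML, 1.0)

print(f"About {remaining_fraction():.0%} of the bottle is left.")  # -> About 60%
```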


## Part F. Record

\*\***Take a video of your prototyped interaction.**\*\*



https://github.com/user-attachments/assets/65811893-f958-42ce-81ed-7dbb2a9288cf




\*\***Please indicate who you collaborated with on this Lab.**\*\*
Be generous in acknowledging their contributions! Also recognize any other influences (e.g., from YouTube, GitHub, Twitter) that informed your design.

No one. I may find a helper later.


# Staging Interaction, Part 2
@@ -145,6 +194,9 @@ This describes the second week's work for this lab activity.
You will be assigned three partners from other groups. Go to their github pages, view their videos, and provide them with reactions, suggestions & feedback: explain to them what you saw happening in their video. Guess the scene and the goals of the character. Ask them about anything that wasn’t clear.

\*\***Summarize feedback from your partners here.**\*\*
Karl: What if I take alcohol instead?
Jade: You really love drinking water.


## Make it your own

@@ -154,3 +206,11 @@ Do last week’s assignment again, but this time:
3) We will be grading with an emphasis on creativity.

\*\***Document everything here. (Particularly, we would like to see the storyboard and video, although photos of the prototype are also great.)**\*\*
![55908a4ca247093ce42e62f5f445d605](https://github.com/user-attachments/assets/2303d6c5-1523-4fb0-a5a3-c4e3f9bec1b4)


https://github.com/user-attachments/assets/326198fe-f96e-4ca9-b4b9-cc287deb8fd8




167 changes: 166 additions & 1 deletion Lab 2/README.md
@@ -1,4 +1,4 @@
# Interactive Prototyping: The Clock of Pi
[screen_clock.py](https://github.com/user-attachments/files/22322685/screen_clock.py)
**NAMES OF COLLABORATORS HERE**

Does it feel like time is moving strangely during this semester?
@@ -199,6 +199,8 @@ Pro Tip: Using tools like [code-server](https://coder.com/docs/code-server/lates

2. Look at and give feedback on the Part G. for at least 2 other people in the class (and get 2 people to comment on your Part G!)

Karl: I used to do this before important appointments.
Jade: I will definitely buy this clock.
# Lab 2 Part 2

## Assignment that was formerly Lab 2 Part E.
@@ -210,17 +212,180 @@ Can you make time interactive? You can look in `screen_test.py` for examples for

Please sketch/diagram your clock idea. (Try using a [Verplank diagram](https://ccrma.stanford.edu/courses/250a-fall-2004/IDSketchbok.pdf))!

![44234b402c39a7e125e42abfa205d0d8](https://github.com/user-attachments/assets/7293c20d-6e9d-4b08-b5ca-399acbbf7eee)

I don't know why, but the Verplank link leads to a gambling website, so I used some traditional sketching skills instead.

**We strongly discourage and will reject the results of literal digital or analog clock display.**


\*\*\***A copy of your code should be in your Lab 2 Github repo.**\*\*\*


↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓CODE↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓
```
import time
import subprocess
import digitalio
import board
from PIL import Image, ImageDraw, ImageFont
import adafruit_rgb_display.st7789 as st7789

# Configuration for CS and DC pins (these are FeatherWing defaults on M0/M4):
cs_pin = digitalio.DigitalInOut(board.D5)
dc_pin = digitalio.DigitalInOut(board.D25)
reset_pin = None

# Config for display baudrate (default max is 24mhz):
BAUDRATE = 64000000

# Setup SPI bus using hardware SPI:
spi = board.SPI()

# Create the ST7789 display:
disp = st7789.ST7789(
    spi,
    cs=cs_pin,
    dc=dc_pin,
    rst=reset_pin,
    baudrate=BAUDRATE,
    width=135,
    height=240,
    x_offset=53,
    y_offset=40,
)

buttonA = digitalio.DigitalInOut(board.D23)  # GPIO23 (PIN 16)
buttonB = digitalio.DigitalInOut(board.D24)  # GPIO24 (PIN 18)
# Use internal pull-ups; buttons then read LOW when pressed.
buttonA.switch_to_input(pull=digitalio.Pull.UP)
buttonB.switch_to_input(pull=digitalio.Pull.UP)

diff = 0  # minute offset accumulated by pressing the buttons

# Create blank image for drawing.
# Make sure to create image with mode 'RGB' for full color.
height = disp.width  # we swap height/width to rotate it to landscape!
width = disp.height
image = Image.new("RGB", (width, height))
rotation = 90

# Get drawing object to draw on image.
draw = ImageDraw.Draw(image)

# Draw a black filled box to clear the image.
draw.rectangle((0, 0, width, height), outline=0, fill=(0, 0, 0))
disp.image(image, rotation)

# Draw some shapes.
# First define some constants to allow easy resizing of shapes.
padding = -2
top = padding
bottom = height - padding
# Move left to right keeping track of the current x position for drawing shapes.
x = 0

# Alternatively load a TTF font. Make sure the .ttf font file is in the
# same directory as the python script!
# Some other nice fonts to try: http://www.dafont.com/bitmap.php
time_font = ImageFont.truetype("/usr/share/fonts/truetype/dejavu/DejaVuSans.ttf", 32)
date_font = ImageFont.truetype("/usr/share/fonts/truetype/dejavu/DejaVuSans.ttf", 18)

# Turn on the backlight
backlight = digitalio.DigitalInOut(board.D22)
backlight.switch_to_output()
backlight.value = True

while True:
    # Draw a black filled box to clear the image.
    draw.rectangle((0, 0, width, height), outline=0, fill=0)

    # Lab 2 part D work: read the buttons (LOW means pressed because of the pull-ups).
    a_pressed = (buttonA.value == False)
    b_pressed = (buttonB.value == False)

    if a_pressed and b_pressed:
        diff = 0       # both buttons: reset the offset
    elif a_pressed:
        diff += 5      # button A: push the displayed time 5 minutes ahead
    elif b_pressed:
        diff -= 5      # button B: pull the displayed time 5 minutes back

    t = time.localtime()
    hour, min, sec = t.tm_hour, t.tm_min, t.tm_sec
    min = min + diff
    hour_adj = min // 60           # carry overflowed (or borrowed) minutes into hours
    hour = (hour + hour_adj) % 24  # wrap around midnight
    min = min % 60
    current_time = f"{hour:02d}:{min:02d}:{sec:02d}"
    current_date = time.strftime("%m-%d-%Y")
    alabel = "B:-5min"  # label drawn next to the bottom button (B)
    blabel = "A:+5min"  # label drawn next to the top button (A)

    if diff > 0:
        hint = f"Hurry! +{diff}"
    elif diff < 0:
        hint = f"Easy! -{-diff}"
    else:
        hint = "0"

    time_bbox = draw.textbbox((0, 0), current_time, font=time_font)
    time_width = time_bbox[2] - time_bbox[0]
    time_height = time_bbox[3] - time_bbox[1]

    date_bbox = draw.textbbox((0, 0), current_date, font=date_font)
    date_width = date_bbox[2] - date_bbox[0]
    date_height = date_bbox[3] - date_bbox[1]

    alabel_bbox = draw.textbbox((0, 0), alabel, font=date_font)
    alabel_width = alabel_bbox[2] - alabel_bbox[0]
    alabel_height = alabel_bbox[3] - alabel_bbox[1]

    blabel_bbox = draw.textbbox((0, 0), blabel, font=date_font)
    blabel_width = blabel_bbox[2] - blabel_bbox[0]
    blabel_height = blabel_bbox[3] - blabel_bbox[1]

    hint_bbox = draw.textbbox((0, 0), hint, font=date_font)
    hint_width = hint_bbox[2] - hint_bbox[0]
    hint_height = hint_bbox[3] - hint_bbox[1]

    # Centered time and date, button labels next to each button, offset hint in the corner.
    draw.text((width//2 - time_width//2, height//2 - time_height//2 - 20), current_time, font=time_font, fill="#FFFFFF")
    draw.text((width//2 - date_width//2, height//2 + 10), current_date, font=date_font, fill="#FFFFFF")
    draw.text((0, height - alabel_height), alabel, font=date_font, fill="#FFFFFF")
    draw.text((0, 0), blabel, font=date_font, fill="#FFFFFF")
    draw.text((width - hint_width, height - hint_height//2 - 10), hint, font=date_font, fill="#FFFFFF")

    # Display image.
    disp.image(image, rotation)
    time.sleep(0.1)
```


↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑CODE↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑
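As a quick sanity check on the offset math in the loop above (this is not part of the clock script itself): folding the minute offset into hours with floor division also handles negative offsets and midnight wrap-around.

```
def apply_offset(hour, minute, diff):
    # Same normalization as in the clock loop: fold the minute offset into hours.
    minute += diff
    hour = (hour + minute // 60) % 24  # floor division borrows correctly for negative offsets
    minute %= 60
    return hour, minute

print(apply_offset(9, 2, -5))   # -> (8, 57): borrowing an hour works
print(apply_offset(23, 58, 5))  # -> (0, 3): wraps past midnight
```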




## Assignment that was formerly Part F.
## Make a short video of your modified barebones PiClock

\*\*\***Take a video of your PiClock.**\*\*\*

↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓VIDEO↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓↓


https://github.com/user-attachments/assets/cef7d98a-c059-4e17-a64e-a1015bab4c73



[If the video cannot broadcast normally, please use the .mp4 file from this link](https://github.com/Junxiong-Chen/Interactive-Lab-Hub/blob/Fall2025/Lab%202/pull_updates/1.1.mp4)


↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑VIDEO↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑↑

After you edit and work on the scripts for Lab 2, the files should be uploaded back to your own GitHub repo! You can push to your personal GitHub repo by adding the files here, committing, and pushing.

Binary file added Lab 2/pull_updates/1.1.mp4
Binary file not shown.