diff --git a/PLAY-roster.htm b/PLAY-roster.htm
new file mode 100644
index 0000000..e802463
--- /dev/null
+++ b/PLAY-roster.htm
@@ -0,0 +1,336 @@
diff --git a/README.md b/README.md
index f12dedf..70c1b21 100644
--- a/README.md
+++ b/README.md
@@ -6,7 +6,8 @@ We acknowledge extensive use of this resource: Xie, Y., Allaire, J. J., & Grolem
## Rendering the site
-To render the website, run `rmarkdown::render_site()` from the project root directory.
+To render the website, run `rmarkdown::render_site()` in R from the project root directory.
+(If any required packages are missing, install them with `install.packages("package_name")`.)
The website files are copied to `docs/` where you can view them using a web browser or file editor.
## Deploying the site
diff --git a/_site.yml b/_site.yml
index e2bba38..9efc480 100644
--- a/_site.yml
+++ b/_site.yml
@@ -2,8 +2,8 @@ name: "PLAY Project"
navbar:
title: "PLAY"
left:
- - text: "home"
- href: index.html
+ - text: "about"
+ href: about.html
- text: "people"
href: people.html
- text: "protocol"
@@ -14,8 +14,6 @@ navbar:
href: data.html
- text: "news"
href: news.html
- - text: "GitHub"
- href: http://github.com/PLAY-behaviorome/
- text: "site info"
href: site-info.html
output_dir: "docs"
diff --git a/about.Rmd b/about.Rmd
index e860762..548416f 100644
--- a/about.Rmd
+++ b/about.Rmd
@@ -2,21 +2,98 @@
title: ""
---
-
+
-# Goals of the project
-Play is the primary context for infant learning and is foundational for all domains of healthy development—cognition, language, social interaction, motor action, and emotion. Before children begin formal schooling, play occupies nearly all of their waking day. In the first years of life, play provides an unparalleled window into typical and atypical patterns of development, and it is an ideal context for understanding development in children around the globe.
+
-Video uniquely captures the nuances and details of natural behavior and the surrounding context, and video can be used and reused by experts in multiple domains.
+# An overview of the PLAY data
-The Play & Learning Across A Year (PLAY) project will create a large-scale shared corpus of video of natural infant play from homes in varied locations across the U.S. Researchers with expertise in multiple domains of infant behavior (physical and motor development, communication and gesture, object exploration and play, emotion, gender, home environment, media) will contribute to the collection and coding of these videos, and will use the data to address questions about infant learning and development. The shared corpus will ultimately be shared with the entire research community.
+To answer questions about infant behaviors in their natural environments, the PLAY project will collect, code, and share 900 hours of video recorded in the homes of children at 12, 18, and 24 months of age, drawn from 30 sites across North America.
-
-
-
+Materials (videos, questionnaires, links to Databrary volumes) for the PLAY project are included in this site, where we document data collection protocols, workflows, coding strategies, and operational definitions.
+
+
-
+### Project-wide overview
+
+
+
+
+
+### Ending statements
-Materials (videos, questionnaires, links to Databrary) for the PLAY project are included in this site.
-We document data collection protocols, workflows, coding strategies, and operational definitions.
diff --git a/coding_setup.Rmd b/coding_setup.Rmd
index 055b759..2d77549 100644
--- a/coding_setup.Rmd
+++ b/coding_setup.Rmd
@@ -7,130 +7,53 @@ output:
toc_float: true
---
-# Coding Set Up
-
-This section describes how to transcribe & code the 1-hour natural play session.
+# Set Up for Transcribing & Coding 1-Hour Natural Play
## Getting Started
-Download the [development version of Datavyu](http://datavyu.org/download.html).
-Download the `PLAY_CodingTemplate.opf` file from the [PLAY Databrary Volume](https://nyu.databrary.org/volume/254/slot/14924/-?asset=73892).
-Name this file with the PLAY naming convention (e.g., PLAY_NYU001, … PLAY_NYU010, … PLAY_NYU030).
- - This template contains all of the primary variables that will be coded by each site: communication, gesture, object interaction, locomotion, and emotion.
-Download Ruby scripts for each coding variable as needed from the [PLAY Github repository](https://github.com/databrary/PLAY-Project-Datavyu-scripts).
+1. **Download** the [development version of Datavyu](http://datavyu.org/download.html).
+2. **Download** the *PLAY_CodingTemplate.opf* file from the [PLAY Databrary Volume](https://nyu.databrary.org/volume/254/slot/14924/-?asset=73892). Name this file with the PLAY naming convention (e.g., *PLAY_NYU001, … PLAY_NYU010, … PLAY_NYU030*).
+ - This template contains all of the primary variables that will be coded by each site: communication, gesture, object interaction, locomotion, and emotion.
+3. **Download** Ruby scripts for each coding variable as needed from the [PLAY Github repository](https://github.com/databrary/PLAY-Project-Datavyu-scripts).
## Get to Know Datavyu
-Familiarize yourself with Datavyu before you begin coding (resources on Datavyu.org, videos from past workshops, etc.).
-Refer to the [Datavyu User Guide](http://www.datavyu.org/user-guide/index.html).
-Take a look at our [Best Practices for Coding Behavioral Data From Video](http://www.datavyu.org/user-guide/best-practices.html) on the Datavyu site.
+1. Familiarize yourself with Datavyu before you begin coding (resources on Datavyu.org, videos from past workshops, etc.).
+ - Refer to the [Datavyu User Guide](http://www.datavyu.org/user-guide/index.html).
+ - Take a look at our [Best Practices for Coding Behavioral Data From Video](http://www.datavyu.org/user-guide/best-practices.html) on the Datavyu site.
## Coding in Passes
-The coding manual describes the [transcription ](transcription.html) process and codes for 5 content areas: [communication](communication.html), [gesture](gesture.html), [object interaction](objects.html), [locomotion](locomotion.html), and [emotion](emotion.html).
-Each content area includes two passes: one pass for the infant and one pass for the mother. For gesture, the baby and mom are coded together in a single pass.
-A pass entails scoring the relevant codes for 1-hour of video.
+* The coding manual describes the transcription process and codes for 5 content areas: communication, gesture, object interaction, locomotion, and emotion.
+* Each content area includes two passes: one pass for the infant and one pass for the mother. For gesture, the baby and mom are coded together in a single pass.
+* A pass entails scoring the relevant codes for 1 hour of video.
+
+Please visit our [GitHub Repository](https://github.com/databrary/PLAY-Project-Datavyu-scripts) for all of the scripts mentioned in this wiki.
- Please visit our GitHub Repository for all of the scripts mentioned in this wiki.
-
## Workflow for Coding Communication Passes
-After the file has been transcribed according to procedure in Transcription, run two additional scripts that will prepare new Communication columns for further coding.
-Run `splitmombaby_transcribe.rb`. This script pulls out mom and baby language from the transcribe column into two new columns: (1) momspeech and (2) babyvoc.
-Each column is automatically populated with cells from the respectively tagged utterances from the transcribe column (e.g., the script ports all utterances coded as ‘m’ to the momspeech column and 'b' to the babyvoc column).
-Each new cell in momspeech and babyvoc is a point cell created at the onset of each cell from the transcription.
-Run create_mombaby_utterancetype.rb.
-This script also creates two new columns: (1) momutterancetype and (2) babyutterancetype. For each cell in momspeech and babyvoc, a new cell is created in momutterancetype and babyutterancetype, respectively.
-The codes for these cells are blank, and the coder now scores mom and baby communication according to definitions in Communication Codes.
+1. After the file has been transcribed according to the procedure in [Transcription](https://dev1.ed-projects.nyu.edu/wikis/docuwiki/doku.php/transcription), run two additional scripts that prepare new [Communication](https://dev1.ed-projects.nyu.edu/wikis/docuwiki/doku.php/manual2) columns for further coding.
+2. **Run** *splitmombaby_transcribe.rb*. This script pulls mom and baby language out of the _*transcribe*_ column into two new columns: (1) _*momspeech*_ and (2) _*babyvoc*_. Each column is automatically populated with cells from the correspondingly tagged utterances in the _*transcribe*_ column (e.g., the script ports all utterances coded as 'm' to the _*momspeech*_ column and 'b' to the _*babyvoc*_ column). Each new cell in _*momspeech*_ and _*babyvoc*_ is a point cell created at the onset of the corresponding cell from the transcription.
+3. **Run** *create_mombaby_utterancetype.rb*. This script also creates two new columns: (1) _*momutterancetype*_ and (2) _*babyutterancetype*_. For each cell in _*momspeech*_ and _*babyvoc*_, a new cell is created in _*momutterancetype*_ and _*babyutterancetype*_, respectively. The codes for these cells are blank, and the coder now scores mom and baby communication according to the definitions in [Communication Codes](https://dev1.ed-projects.nyu.edu/wikis/docuwiki/doku.php/manual2).
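The splitting step above can be sketched in plain Ruby. This is a hypothetical, simplified illustration of the logic, not the real *splitmombaby_transcribe.rb*, which manipulates Datavyu columns through Datavyu's scripting API; the data layout and variable names here are assumptions for illustration.

```ruby
# Sketch: split speaker-tagged transcribe cells into two new "columns".
# Each transcribe cell is modeled as [onset_ms, speaker_tag, utterance].
transcribe = [
  [1000, "m", "look at the ball"],
  [2500, "b", "ba"],
  [4000, "m", "there you go"],
]

# Each new cell is a point cell (onset == offset) placed at the onset
# of the source cell, mirroring the behavior described above.
momspeech = transcribe.select { |_onset, tag, _text| tag == "m" }
                      .map { |onset, _tag, text| { onset: onset, offset: onset, text: text } }
babyvoc   = transcribe.select { |_onset, tag, _text| tag == "b" }
                      .map { |onset, _tag, text| { onset: onset, offset: onset, text: text } }
```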
## Workflow for Coding Gesture Pass
-Score baby and mom gesture together in a single pass according to definitions in Gesture Codes.
-After the gesture coding pass (for both mom and baby) has been done, run a script that will separate mom and baby gestures into two columns.
-Run `Split-MomBabyGesture.rb`. This script pulls out mom and baby gestures from the gesture column into two new columns: (1) babygesture and (2) momgesture. Each column is automatically populated with cells from the respectively tagged events from the gesture column (e.g., the script ports all gestures coded as ‘m’ to the momgesture column and 'b' to the babygesture column). Each new cell in babygesture and momgesture is a point cell created at the onset of each cell in the gesture column.
-
-## Workflow for [Object](objects.html), [Locomotion](locomotion.html), and [Emotion](emotion.html) Passes
-
-Choose whether to code baby or mom first within each pass for object, locomotion, or emotion.
-Score each pass according to definitions in Object Codes, Locomotion Codes, or Emotion Codes.
-Workflow for Inter-Observer Reliability on Communication, Gesture, Object, Locomotion, and Emotion Passes
-After the primary coder finishes a pass: babyutterancetype, momutterancetype, gesture (split into babygesture, momgesture), babyobject, momobject, babyloc, momloc, babyemotion, or momemotion run two scripts to set up the Datavyu spreadsheet for coding reliability.
-First, run a script called `insert-RelBlocks.rb`.
-This script randomly generates 3, 5-minute chunks within the first, second, and third 20-minute sections of the 1-hour video of free play. By quasi-randomly inserting reliability blocks from areas of the primary coder’s pass, this will ensure that the reliability coder sees each portion of the video for each child’s session. Thus, the idiosyncrasies of each child, fluctuations over the 1-hour session, and drift in the coder are spread over the session.
-Reliability on each coding pass is done on the same 3, 5-minute blocks for each pass.
-The scripting window in Datavyu will prompt when everything has been successfully completed. You should now have a brand new column in your spreadsheet named reliability_blocks.
-This script should only be run once so that reliability coding can be done within the same time frame for each coding domain for each session.
-Now, run another script appropriate for the pass reliability needs to be coded on: `CreateReliability-BabyUtterancetype.rb` or `CreateReliability-MomUtterancetype.rb` or `CreateReliability-Gesture.rb` or `CreateReliability-MomBaby-Loc.rb` or `CreateReliability-MomBaby-Object.rb` OR `CreateReliability-MomBaby-Emotion.rb`
-This script inserts new coding columns where your reliability coder will score the video while they are locked into the script-generated, 5-minute chunks in the reliability_blocks column.
-
-# Coding ID
-
-Datavyu ID Code for 1-Hour Natural Play
-
- Make sure you are currently logged in at Databrary to view embedded video examples in this wiki.
-
-## id
-
-`` `` `` `` `` `` `` `` `` `` ``
-
-### Operational Definitions
-
-``: Set every ID cell onset to 00:00:00:000 (hours : minutes : seconds : milliseconds).
-
-Set ID cell offset to the last frame in the 1-hour free play session.
-
-``: Site refers to the data collection site: New York University, Rutgers Newark, CUNY Staten Island, Penn State, etc.
-
-Get the site from the metadata information collected on the app.
-
-Format is three letters all caps: NYU, RTG, CSI, PSU.
-
-``: Participant number refers to the infant's participant number in the order that the data were collected.
-
-Participant numbers run consecutively from 001 within each site.
-
-Get the participant number from the metadata information collected on the app.
-
-Format for id number is three digits 001, 012, 021.
-
-``: Test date is the day of the home visit.
-
-Get the test date from the metadata information collected on the app.
-
-Format for test date is YYYY-MM-DD.
-
-``: Birth date is the day the baby was born.
-
-Get the birth date from the metadata information collected on the app.
-
-Format for birth date is YYYY-MM-DD.
-
-``: Entered as 12, 18, or 24.
-
-Get the age group from the metadata information collected on the app.
-
-``: Refers to infant's biological sex.
-
-Code `m` = male/boy; `f` = female/girl.
-
-Get the sex from the metadata information collected on the app.
-
-``: Study name.
-
-Code as 'PLAY'.
-
-``: Refers to infant's predominant language spoken during the session.
-
-Code with lowercase, full name of the language: 'english' or 'spanish'.
-
-``: Refers to infant's other language spoken during the session, if another language was spoken.
-
-Code with lowercase, full name of the language: 'english' or 'spanish'. If no other language was spoken as missing '.'
+1. Score baby and mom gesture together in a single pass according to definitions in [Gesture Codes](https://dev1.ed-projects.nyu.edu/wikis/docuwiki/doku.php/gesture).
+2. After the _*gesture*_ coding pass (for both mom and baby) has been done, run a script that will separate mom and baby gestures into two columns.
+3. **Run** *Split-MomBabyGesture.rb*. This script pulls mom and baby gestures out of the _*gesture*_ column into two new columns: (1) _*babygesture*_ and (2) _*momgesture*_. Each column is automatically populated with cells from the correspondingly tagged events in the _*gesture*_ column (e.g., the script ports all gestures coded as 'm' to the _*momgesture*_ column and 'b' to the _*babygesture*_ column). Each new cell in _*babygesture*_ and _*momgesture*_ is a point cell created at the onset of the corresponding cell in the _*gesture*_ column.
-``: Refers to mother's predominant language spoken during the session.
+## Workflow for Coding Object, Locomotion, and Emotion Passes
-Code with lowercase, full name of the language: 'english' or 'spanish'.
+1. Choose whether to code baby or mom first within each pass for object, locomotion, or emotion.
+2. Score each pass according to definitions in [Object Codes](https://dev1.ed-projects.nyu.edu/wikis/docuwiki/doku.php/manual3), [Locomotion Codes](https://dev1.ed-projects.nyu.edu/wikis/docuwiki/doku.php/manual4), or [Emotion Codes](https://dev1.ed-projects.nyu.edu/wikis/docuwiki/doku.php/emotion).
-``: Refers to mother's other language spoken during the session, if another language was spoken.
+## Workflow for Inter-Observer Reliability on Communication, Gesture, Object, Locomotion, and Emotion Passes
-Code with lowercase, full name of the language: 'english' or 'spanish'. If no other language was spoken as missing '.'
\ No newline at end of file
+1. After the primary coder finishes a pass (_*babyutterancetype*_, _*momutterancetype*_, _*gesture*_ (split into _*babygesture*_, _*momgesture*_), _*babyobject*_, _*momobject*_, _*babyloc*_, _*momloc*_, _*babyemotion*_, or _*momemotion*_), run two scripts to set up the Datavyu spreadsheet for coding reliability.
+2. First, **run** a script called *insert-RelBlocks.rb*.
+ - This script randomly generates three 5-minute chunks, one within each of the first, second, and third 20-minute sections of the 1-hour video of free play. Quasi-randomly placing reliability blocks across the primary coder’s pass ensures that the reliability coder sees each portion of the video for each child’s session. Thus, the idiosyncrasies of each child, fluctuations over the 1-hour session, and drift in the coder are spread over the session.
+ - Reliability on each coding pass is done on the same three 5-minute blocks.
+ - The scripting window in Datavyu will report when everything has completed successfully. You should now have a new column in your spreadsheet named _*reliability_blocks*_.
+ - This script should only be run **once** so that reliability coding can be done within the same time frame for each coding domain for each session.
+3. Now, **run** the script appropriate for the pass that needs reliability coding: *CreateReliability-BabyUtterancetype.rb*, *CreateReliability-MomUtterancetype.rb*, *CreateReliability-Gesture.rb*, *CreateReliability-MomBaby-Loc.rb*, *CreateReliability-MomBaby-Object.rb*, or *CreateReliability-MomBaby-Emotion.rb*.
+ - This script inserts new coding columns where your reliability coder will score the video, locked into the script-generated 5-minute chunks in the _*reliability_blocks*_ column.
\ No newline at end of file
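The block-placement step described for *insert-RelBlocks.rb* can be sketched in plain Ruby. This is a hypothetical simplification under the stated assumptions (one random 5-minute block fully inside each 20-minute third of the hour); the real script also creates the Datavyu column and cells via the scripting API.

```ruby
# Sketch: quasi-random reliability blocks — one random 5-minute block
# inside each 20-minute third of a 1-hour session. Times in milliseconds.
MINUTE  = 60 * 1000
BLOCK   = 5 * MINUTE    # length of each reliability block
SECTION = 20 * MINUTE   # each third of the 1-hour session

def reliability_blocks(rng = Random.new)
  (0...3).map do |i|
    section_start = i * SECTION
    # Latest onset that still keeps the whole block inside this section.
    onset = section_start + rng.rand(SECTION - BLOCK + 1)
    { onset: onset, offset: onset + BLOCK }
  end
end

blocks = reliability_blocks
```

Because the blocks are fixed once generated, running the generator a single time (as the manual insists) guarantees every coding domain is checked over the same time windows.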
diff --git a/communication.Rmd b/communication.Rmd
index aa3c0f6..8f072b2 100644
--- a/communication.Rmd
+++ b/communication.Rmd
@@ -12,9 +12,13 @@ knitr::opts_chunk$set(echo = FALSE)
source("R/write_video_clip_html.R")
```
-# Coding [communication](communication.html)
+# Datavyu [Communication Codes](communication.html) for 1-Hour Natural Play
-## `babyvoc`
+Make sure you are [currently logged in at Databrary](https://nyu.databrary.org/user/login) to view embedded video examples in this wiki.
+
+This section covers 4 main codes: `babyvoc`, `babyutterancetype`, `momspeech`, and `momutterancetype`.
+
+## `1. babyvoc`
``
@@ -22,11 +26,9 @@ source("R/write_video_clip_html.R")
Contains a transcript of all of the utterances/vocalizations of the baby.
-This column is automatically populated after the transcribe pass is completed using a Ruby script.
-All of the utterances tagged with 'b' in in transcribe are transferred here.
-The onset and offset are equal, and set to the onset from the transcribe column, which reflects a time as close as possible to the onset of that utterance.
+This column is automatically populated after the *transcribe* pass is completed, using a Ruby script. All of the utterances tagged with 'b' in *transcribe* are transferred here. The onset and offset are equal and set to the onset from the *transcribe* column, which reflects a time as close as possible to the onset of that utterance.
-## `babyutterancetype`
+## `2. babyutterancetype`
`` `` `` ``
@@ -69,187 +71,33 @@ The transcript will expedite this process. Double check and listen again as you
``
-Sentence = an utterance in which the speaker utters more than one word, producing a sentence or phrase (e.g., “Daddy's shoe” or “Go to the park”).
-
-Transcription: “ooh gimme that”
-
-
-
-
-
-Transcription: “i take this”
-
-
-
-
-
-Transcription: “goodbye sad face?”
-
-
-
-
+Sentence = an utterance in which the speaker utters more than one word, producing a sentence or phrase (e.g., “Daddy's shoe” or “Go to the park”). 3 videos ("ooh gimme that", "i take this", "goodbye sad face?")
``
-Word = an utterance in which the speaker utters a single word, such as “dolly” or “ball.”
-
-Transcription: “cars”
-
-
-
-
-
-Transcription: “basketball”
-
-
-
-
-
-Transcription: “truck”
-
-
-
-
+Word = an utterance in which the speaker utters a single word, such as “dolly” or “ball.” 3 videos ("cars", "basketball", "truck")
+
``
-Babble = an utterance in which the speaker utters a series of repeated canonical syllables, such as ba-ba-ba, or ga-ga-ga.
-
-Transcription: “b”
-
-
-
-
-
-Transcription: “b”
-
-
-
-
-
-Transcription: “b”
-
-
-
-
+Babble = an utterance in which the speaker utters a series of repeated canonical syllables, such as ba-ba-ba, or ga-ga-ga. 3 videos (all "b")
``
-Vowel = an utterance in which the speaker utters a vowel sound (e.g, /a/, /i:/).
-
-Transcription: “v”
-
-
-
-
-
-Transcription: “v”
-
-
-
-
-
-Transcription: “v”
-
-
-
-
-
+Vowel = an utterance in which the speaker utters a vowel sound (e.g, /a/, /i:/). 3 videos (all "v")
+
``
-Cry = an utterance in which the speaker is experiencing a period of prolonged distress.
-
-Transcription: “c”
-
-
-
-
-
-Transcription: “c”
-
-
-
-
-
-Transcription: “c”
-
-
-
-
+Cry = an utterance in which the speaker is experiencing a period of prolonged distress. 3 videos (all "c")
``
-Grunt = an utterance in which the speaker produces a low, short, inarticulate, guttural sound often used to express effort or exertion.
-Vegetative sounds, such as coughing and sneezing, should be captured using this code.
-
-
-Transcription: “c”
-
-
-
-
-
-
-Transcription: “c”
-
-
-
-
+Grunt = an utterance in which the speaker produces a low, short, inarticulate, guttural sound often used to express effort or exertion. Vegetative sounds, such as coughing and sneezing, should be captured using this code. 2 videos (both "g")
``
-Unintelligible = either what the baby said was not intelligible to the transcriber, or after listening you are not able to understand well enough what they say even with the transcript to properly code it.
+Unintelligible = either what the baby said was not intelligible to the transcriber, or after listening you are not able to understand well enough what they say even with the transcript to properly code it.
+
### How to Code
@@ -261,7 +109,7 @@ JUMP-BACK-BY 2 s so the utterance can be viewed in context.
Play in real time to code each utterance, which is coded in mutually exclusive categories. TAB to between each argument/prompt inserting period “.” until you reach the appropriate code.
Then insert periods to the end of the cell.
-## `momspeech`
+## `3. momspeech`
``
@@ -273,7 +121,7 @@ This column is automatically populated after the transcribe pass is completed us
All of the utterances tagged with 'm' in in transcribe are transferred here.
The onset and offset are equal, and set to the onset from the transcribe column, which reflects a time as close as possible to the onset of that utterance.
-## `momutterancetype`
+## `4. momutterancetype`
`` `` `` ``
@@ -318,185 +166,41 @@ What is coded is not solely based on the transcript. Listen to the audio, watch
``
-Imperative Look = an utterance in which the speaker directs a baby's attention (e.g., “Look here”, “See?”, or calls baby's name to alert attention).
-
-Transcript: “evelyn”
-
-
-
-
-
-Transcript: “look”
-
-
-
-
+Imperative Look = an utterance in which the speaker directs a baby's attention (e.g., “Look here”, “See?”, or calls baby's name to alert attention). 2 videos ("evelyn", "look")
``
-Imperative Act = an utterance in which the speaker directs a baby's action, such as asking baby to do something, or to play with an object. An example would be if a mother tells her baby “let's play with the ball”.
-
-Transcript: “turn the page”
-
-
-
-
-
-Transcript: “come here please”
-
-
-
-
-
-Transcript: “go get the basketball”
-
-
-
-
+Imperative Act = an utterance in which the speaker directs a baby's action, such as asking baby to do something, or to play with an object. An example would be if a mother tells her baby “let's play with the ball”. 3 videos ("turn the page", "come here please", "go get the basketball")
**NOTE**: The imperative look and imperative act can be collapsed if the breakdown takes too long to code/specify (although we don't think it will save time).
`
`
-Imperative Prohibit = an utterance in which the speaker prohibits a baby's behavior, such as asking baby to stop what they're doing.
-
-Transcript: “dont knock it over”
-
-
-
-
-
-Transcript: “dont be so rough”
-
-
-
-
-
-Transcript: “no no tv”
-
-
-
-
+Imperative Prohibit = an utterance in which the speaker prohibits a baby's behavior, such as asking baby to stop what they're doing. 3 videos (“dont knock it over”, “dont be so rough”, “no no tv”)
``
Interrogative = an utterance in which the speaker asks for information about objects or ongoing activities (e.g., “What is this called?”, “What color is this?).
- Questions that start with “Can you” or “Would you” (e.g., “Can you put that down”) should not be considered for interrogatives. Their function is to regulate the baby's behavior and should be coded as imperatives. Tag questions, in which the speaker adds a question at the end of a statement (“That's a blue truck, right?”) are not considered questions. These should be coded as declaratives.
-
-Transcript: “how does the pig say?”
-
-
-
-
-
-Transcript: “what is this?”
-
-
-
-
+ Questions that start with “Can you” or “Would you” (e.g., “Can you put that down”) should not be considered interrogatives. Their function is to regulate the baby's behavior, and they should be coded as imperatives. Tag questions, in which the speaker adds a question at the end of a statement (“That's a blue truck, right?”), are not considered questions. These should be coded as declaratives. 2 videos (“how does the pig say?”, “what is this?”)
``
-Declarative= an utterance in which the speaker provides information about objects, events or ongoing activities (e.g., “This is a fun toy”; “Red truck”; “You are stirring in the cup”.
-
-Transcript: “baby's clothes”
-
-
-
-
-
-Transcript: “thats a lemonade”
-
-
-
-
+Declarative = an utterance in which the speaker provides information about objects, events, or ongoing activities (e.g., “This is a fun toy”; “Red truck”; “You are stirring in the cup”). 2 videos (“baby's clothes”, “that's a lemonade”)
``
-Affirmations/Fillers = an utterance in which the speaker is recognizing another speaker's behavior and agreeing with it, or using words as conversational fillers.
-For instance, when the mother says “There you go” when the baby successfully completes a puzzle, or when she says “yeah”, or “uhuh”.
-
-Transcript: “wow”
-
-
-
-
-
-Transcript: “there you go”
-
-
-
-
+Affirmations/Fillers = an utterance in which the speaker is recognizing another speaker's behavior and agreeing with it, or using words as conversational fillers. For instance, when the mother says “There you go” when the baby successfully completes a puzzle, or when she says “yeah”, or “uhuh”. 2 videos (“wow”, “there you go”)
``
- Unintelligible = either what the mom said was not intelligible to the transcriber, or after listening you are not able to understand well enough what they say even with the transcript to properly code it.
-
-Transcript: “xxx”
-
-
-
-
-
-Transcript: “xxx”
-
-
-
-
+Unintelligible = either what the mom said was not intelligible to the transcriber, or after listening you are not able to understand well enough what they say even with the transcript to properly code it. 2 videos (both "xxx")
+
### How to Code
+
+
Set the JUMP-BACK-BY key for 2 s.
-Hit “FIND” on the controller to go to the onset of each utterance, which was populated in momspeech column.
-JUMP-BACK-BY 2 s so the utterance can be viewed in context.
-Play in real time to code each utterance, which is coded in mutually exclusive categories.
+Hit “FIND” on the controller to go to the onset of each utterance, which was populated in the _*momspeech*_ column. JUMP-BACK-BY 2 s so the utterance can be viewed in context.
-TAB between each argument/prompt inserting period ”.“ until you reach the appropriate code. Then insert periods to the end of the cell.
+Play in real time to code each utterance, which is coded in mutually exclusive categories. TAB between each argument/prompt, inserting a period “.” until you reach the appropriate code. Then insert periods to the end of the cell.
diff --git a/docs/PLAY-roster.fld/filelist.xml b/docs/PLAY-roster.fld/filelist.xml
new file mode 100644
index 0000000..db891cc
--- /dev/null
+++ b/docs/PLAY-roster.fld/filelist.xml
@@ -0,0 +1,8 @@
diff --git a/docs/PLAY-roster.fld/sheet001.htm b/docs/PLAY-roster.fld/sheet001.htm
new file mode 100644
index 0000000..3e3c608
--- /dev/null
+++ b/docs/PLAY-roster.fld/sheet001.htm
@@ -0,0 +1,1505 @@
The Play & Learning Across a Year (PLAY) project
-
PLAY is a collaborative research initiative by 65 researchers from 45 universities across the United States and Canada. PLAY focuses on recording and revealing the behaviors of infants and mothers during natural activity in their homes, providing an unprecedented corpus of data, and using an innovative, transparent approach to science. The data set will consist of fully transcribed and annotated videos, parent report questionnaires, video tours of the home, digital recordings of ambient noise, and detailed demographic information on 900+ infants and mothers from across the United States. This first-of-its-kind corpus will be shareable and searchable with data spanning domains from language to locomotion, gender to gesture, and object play to emotion.
+
The PLAY project is a collaborative research initiative by 65 researchers from 45 universities across the United States and Canada. It serves as a model system for doing developmental science with a “big data” approach. Natural free play represents the foundation of infant learning, but we know little about how infants play, how play unfolds in real time and across development, and how individual and group differences promote infant learning and development through play.
+
PLAY focuses on recording and revealing the behaviors of infants and mothers during natural activity in their homes, providing an unprecedented corpus of data, and using an innovative, transparent approach to science. The data set will consist of fully transcribed and annotated videos, parent report questionnaires, video tours of the home, digital recordings of ambient noise, and detailed demographic information on 900+ infants and mothers from across the United States. This first-of-its-kind corpus will be shareable and searchable with data spanning domains from language to locomotion, gender to gesture, and object play to emotion.
+
The aim of the project is to develop a new approach to developmental science that enables: (1) “big data” science for researchers who would not otherwise have access; (2) a communal, low-cost means of collecting and coding data that retains the autonomy of individual labs; and (3) a plan for leveraging diverse expertise to address a common goal.
Support
PLAY is supported by grants from the Office of the Director, National Institutes of Health (OD), the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD), the National Institute of Mental Health (NIMH), and the National Institute on Drug Abuse (NIDA) under R01HD094830-01, the LEGO Foundation, and the Alfred P. Sloan Foundation.
PLAY aims to set new standards for conducting open, transparent, and reproducible behavioral science by i) publishing the protocol, and ii) making extensive use of video exemplars to demonstrate phenomena and illustrate behavioral codes. For confidentiality reasons, access to video exemplars is restricted to researchers with authorized access to Databrary. To register for access, visit http://databrary.org/register.
+
+
+
Inclusion/Exclusion Criteria
+
Infants’ natural play in the home is characterized by tremendous variability, including variations in geographic location, climate, socioeconomic status (SES), maternal/paternal employment, childcare experiences, infants’ and mothers’ ages, language environment, physical layout and characteristics of the home, availability of media, toys for play, and so on. Researchers will be able to explore the effects of any or all such factors.
+
However, to ensure a sufficient sample size and based on conversations with the launch group, we will limit variability along several dimensions. To be included in the PLAY sample of 900 sessions, families must be two-parent households. All infants must be monolingual English or Spanish speakers or English/Spanish bilingual. All infants must be the firstborn child and 12, 18, or 24 months of age (plus or minus one week). All infants must be full-term (37-40 weeks) without known disabilities. The mother must act as the caregiver during visits, which will be scheduled at a time when only the mother and infant are present in the home.
My name is [CALLER NAME] and I’m calling from [LAB]. We have a study for [12 / 18 / 24]-month-olds and [CHILD] is the perfect age. Can I tell you about it?
+
What language(s) do you speak to [CHILD]?
+
→ If not ENGLISH or SPANISH: end call
+
To control for differences in communication, we are looking for families who speak mainly English or Spanish. Would it be alright if you are contacted for other studies in the future?
+
→ If yes: continue
+
For this study, we are interested in learning about babies’ natural, everyday experiences in their homes, such as the toys they play with and places they go. A researcher will visit you and [CHILD] in your home. You and [CHILD] will be video recorded for 1 hour as you go about your day. At the end of the visit we will ask questions about your family, your home, and [CHILD]’s skills and routines. We will also ask you to take us through your home as we do a video tour capturing the places [CHILD] gets to be throughout the day and things that [he/she] plays with.
+
The study will take about 2 hours. You will receive XXX for your participation. We will schedule a day and time that’s convenient for you and when [CHILD] is usually awake/alert and not during a typical meal time.
+
The data collected in this study are valuable and will be placed in a secure web-based library available only to researchers. The purpose is to share the data with experts in the field so that scientists can learn more about infant development.
+
Are you interested in participating?
+
→ If yes:
+
Is there a day and time that works best for you (when [CHILD] is awake/alert and not a typical meal time)?
+
→ If no:
+
Ok thank you. May we call you for other studies?
+
+
Voicemail
+
Hi, this message is for [MOM]. My name is [NAME] and I’m calling from [LAB]. I’m calling because we have a fun study for [12 / 18 / 24]-month-olds and [CHILD] is the perfect age. If you are interested in hearing more about the study, please give us a call back. Our phone number is [XXX-XXX-XXXX]. Thank you and we hope to hear from you soon!
+
+
+
+
Confirming the visit (2 days before actual visit, email the day before)
My name is [NAME] and I’m calling from [LAB] to confirm our visit on [DATE]. Before the visit, I’d like to ask you a few questions. It will only take 5 minutes of your time. Can we speak now?
+
→ If yes:
+
Just as a reminder, the data we collect from you now and during the visit will be shared on a web-based library available only to researchers like the professor who runs this lab.
Please note that presentation and format will differ in the app.
+
→ If no:
+
Can I call you back today or tomorrow [before the visit]?
+
Schedule call.
+
+
+
+
Preparing for Visit
+
+
Prepare paperwork
+
Write Participant ID on all paperwork (consents and questionnaires). Fill out locomotor milestone worksheet.
+
+
+
Pack equipment
+
+
Camera, SD card and extra battery
+
+
Mic
+
Laser Measure
+
Solitary play toy
+
Dyadic play toy
+
Yoga mat
+
Tablet with app for questionnaires (if mom speaks both English and Spanish, bring both versions of the MacArthur), study consent form, Databrary sharing release form, and decibel meter.
+
Answer choice sheet with response scales
+
Participant payment
+
Paper copies of all questionnaires, MCDI, and consent and Databrary forms in case of tablet failure
+
Tools for body dimensions (height and weight) - TBD
+
+
+
+
+
Home Visit
+
+
Introduction
+
Say to Mom:
+
Thanks for letting us come to your home. The visit has a few parts:
+
I’ll begin by video-recording you and [CHILD] as you go about your day. I will video-record you both for 1 hour. Then, I will ask [CHILD] to play with some toys both by him/herself and with you.
+
Afterwards, I will ask you some general questions about your family and home, and about [CHILD]’s skills and routines.
+
You will give me a tour of your home that I will record on video to get a sense of the places [CHILD] goes and things that he/she plays with.
+
Do you have any questions? Let’s start with reading and signing the consent.
+
+
+
Consent to Participate and Permission to Share
+
Ask the parent to review the form asking for consent to participate in the study. When finished, give the parent a moment to look over the form and sign it.
+
Ask the parent to review the form asking for permission to share videos and metadata. When finished, give the parent a moment to look over the form and sign it.
+
Here is the Databrary Release Language. Here are videos depicting how to ask for permission to share and a sample script.
+
+
+
Visit Protocol
+
+
1: 1-Hour Natural Play Video, Shoes, & Noise Measurement
For the next hour, do anything you would typically do if I weren’t here. Try to ignore me as much as possible and I will stay out of the way. I will also try not to respond to you and [CHILD] so that he/she is not distracted. You can go anywhere in your home. You can play together or not. The idea is to capture what your typical day is like.
+
Procedure:
+
Keep the camera on the child at all times. Specifically, ensure that the child’s whole body is visible on camera. If mom is in frame, capture as much of her body as possible without compromising the view of the child. Record in front of or to the side of the child as much as possible. Do not zoom in. Remain at as far a distance as possible (~3 to 5 m, hugging the wall) so that the child is not distracted by your presence. Try not to interact with the child or make eye contact. Just watch through the viewfinder of the camera.
If the child is wearing shoes, video-record the shoes after the session; take them off the child and video the bottom, side, and top views.
+
Procedure:
+
Zoom in with the camera and comment on shoe type, heel (if any), and other observations.
+
+
+
Decibel Meter
+
Open the app on your tablet and start running it just before you begin recording the free play video portion of the visit.
+
Procedure:
+
Open the application (it immediately starts recording noise levels upon startup). Place the device in the most central place in the home (e.g., the living room).
*For the next few minutes, we want to see how [CHILD] plays by him/herself. We ask you not to distract him/her or tell him/her how to play. If [CHILD] tries to get your attention or wants to play with you, you can say, “Go play.” It’s perfectly fine if he/she doesn’t play with the toy. Say to child: “Here [CHILD], play with this!”*
+
Camera:
+
Record solitary toy play so that the view captures the baby’s entire body and hands on the object. If the child moves around, follow the child and keep the face in frontal view.

Procedure:

Set the yoga mat down on the floor. Un-stack the cups and arrange them randomly, standing upright (out of the child’s view). Place the child in a sitting position on the yoga mat and start timing after you present the toy. Use the timer on the camera (let the timer run a bit longer than 2 min to avoid cutting the play time short; later we code only 2 min of engagement). After 2 minutes, say: “Great job!”
[Only give introduction to the sections that need introduction (i.e., ECBQ and MB-CDI)].
+
A Google Sheet with most of the questions in database format can be found here.
+
Procedure:
+
Set up the camera to record the questionnaires. You’ll need to change the camera battery to ensure sufficient power. Sit next to the mom so she is able to read along.
Now, we would like to see the space that [CHILD] gets to explore throughout the day. Please give me a tour of your home as I follow with a camera and take measurements of the spaces. As we walk around, please mention the things that [CHILD] plays with in each room. Please show me where you keep his/her clothes to give us an idea of the kinds of things he/she wears.
+
Procedure (Video):
+
Pause at the entrance of the room.
+
Name the room by its function (e.g., “This is where [CHILD] sleeps”). First, pan the camera SLOWLY from left to right. Then, pan the camera to the floor, name the different types of surfaces in the space (hardwood, plush carpet, thin rug, linoleum, tile, etc.), and then pan to the ceiling. Hold the camera in one hand while you take measurements of the room. Do NOT turn off the camera when walking to the next room. Walk SLOWLY.
+
+
+
Room Measurements with Laser Distance Measurer
+
+
Measure all rooms in the house. Room = any space used by someone on a regular basis, including: bedrooms, kitchens, bathrooms, and basements. Do not measure laundry rooms. Rooms don’t have to have windows. A room has to have a clear demarcation (e.g., a wall or an entry). If the room has a short divider (e.g., when a kitchen and a living room are divided by a counter), count as one big room and measure accordingly.
+
Procedure:
+
Turn the measure on by pressing the ON/DIST button. Make sure the laser is on. Place the base of the laser flat against the wall. Avoid moldings and door casings. Measure wall to wall, lengthwise and widthwise. If a room has an odd or asymmetrical shape (i.e., any shape other than a rectangle or square), measure the largest rectangular or square area of the room. Press ON/DIST again to take the measurement; repeat for length and width. Focus the camera on the laser measure and read the measurements out loud.
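The arithmetic behind these readings can be sketched in a few lines of Ruby. This is a hypothetical post-processing illustration, not part of the visit protocol: the room names and measurements are made up, and for odd-shaped rooms the length-by-width product covers only the largest rectangle, so the total is a lower bound on actual floor space.

```ruby
# Hypothetical sketch: compute per-room floor area (m^2) from the
# length and width readings called out on video with the laser measure.
rooms = {
  "living room" => [5.2, 3.8],  # [length_m, width_m]
  "kitchen"     => [3.0, 2.4],
  "bedroom"     => [4.1, 3.5],
}

# Area = length * width for each room, rounded to centimeter precision.
areas = rooms.transform_values { |(length, width)| (length * width).round(2) }

# Lower bound on total measured floor space across all rooms.
total_area = areas.values.sum.round(2)
```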
+
+
+
+
6: Body Dimensions
+
[TBD]
+
+
+
7: Visit Wrap-up
+
Complete the home measurement and housing checklist sections of the Home Questionnaire. When you arrive back at the lab, wash all toys and equipment thoroughly. Wipe down the yoga mat. Rinse the nesting cups in bleach-water. Do not submerge the shape sorter in water (or it will stop making noise).
+
+
+
8: Visit post-processing
+
Export questionnaire data from the tablet. Upload videos, questionnaire data, and home decibel data to Databrary.
diff --git a/gesture.Rmd b/gesture.Rmd
index c05308e..25e4960 100644
--- a/gesture.Rmd
+++ b/gesture.Rmd
@@ -7,7 +7,7 @@ output:
toc_float: true
---
-# Coding [gesture](gesture.html)
+# Datavyu [Gesture Codes](gesture.html) for 1-Hour Natural Play
## `gesture`
@@ -15,79 +15,76 @@ output:
### General Orientation
-Gestures are segmented, [durative](https://www.dictionary.com/browse/durative), event-based behaviors.
-Watch the video paying attention to the communicative gestures used by the parent and the child.
-When coding for gesture, focus on the mother’s or baby’s hands and head.
+Gestures are segmented, durative, event-based behaviors. Watch the video paying attention to the **communicative gestures** used by the parent and the child. When coding for gesture, focus on the mother’s or baby’s **hands** and **head**.
-Code mother and baby gesture simultaneously in one pass. Then based on the `` of the gesture, a script breaks apart mom and baby into separate babygesture and momgesture columns.
+Code mother and baby gesture simultaneously in **one pass**. Then based on the `` of the gesture, a script breaks apart mom and baby into separate _babygesture_ and _momgesture_ columns.
-Only onsets are coded to expedite coding; offsets could be coded later if duration of gesture or overlap with specific other domains is of interest.
+_Only onsets are coded_ to expedite coding; offsets could be coded later if duration of gesture or overlap with specific other domains is of interest.
### Value List
``
+
`m` = mom
+
`b` = baby
+
`h` = mom holding baby
+
``
+
`p` = point
+
`s` = show/hold up
+
`i` = iconic gesture
+
`c` = conventional gesture
+
### Operational Definitions
``
+
``: Code 'm' if the mom is the source of the gesture.
+
``: Code 'b' if the baby is the source of the gesture.
+
``
-Gesturing by either mom or baby to the investigator, or anyone else in the room, should not be coded.
-The following should NOT be coded as gestures: tapping baby to get his/her attention; pushing an object away; hugging and kissing; one partner moving the other's hand (e.g., to initiate contact, like proximity seeking); jerking the head to indicate “come here.”
+Gesturing by either mom or baby _to the investigator_ (or anyone else in the room) should not be coded. The following should _NOT_ be coded as gestures: tapping baby to get his/her attention; pushing an object away; hugging and kissing; one partner moving the other's hand (e.g., to initiate contact, like proximity seeking); jerking the head to indicate “come here.”
`
`
-Code 'p' when the baby or mom extends their index finder to indicate reference to objects, people, events, or locations in the environment.
+Code '**p**' when the baby or mom extends their index finger to indicate reference to objects, people, events, or locations in the environment.
-Onset is the frame when the finger is fully extended in space toward a referent, or when the point finger is extended and makes contact with the object.
+**Onset** is the frame when the finger is fully extended in space toward a referent, or when the point finger is extended and makes contact with the object.
Repetitive points should be coded as separate gesture events.
``
-Code 's' when the baby or mom holds up an object to present it as if to say: “look at this” or “do you want this” or “I want you to take this”.
+Code '**s**' when the baby or mom holds up an object to present it as if to say: “look at this” or “do you want this” or “I want you to take this”.
Given that it’s not possible to distinguish intention, when a participant shows, offers, or gives an object (e.g., baby actually hands toy to mom, offering toy to mom but mom doesn’t take) code as 's', to save decision-making time.
-Onset is the frame when the object is fully held up or out to show it. Repetitive instances of holding up or offering an object should be coded as separate gesture events.
+**Onset** is the frame when the object is fully held up or out to show it. Repetitive instances of holding up or offering an object should be coded as separate gesture events.
``
-Code 'i' when the baby or mom engages in an iconic gesture.
-They are called iconic because they represent an object, idea, or action that can't easily be referenced with a deictic (point/show) or conventional gesture.
-The movement of these gestures usually calls to mind something about the nature of the object, idea or action being referenced.
-For example, you could move your arms back and forth to represent running, or you could trace a square in the air with your finger, or flap your arms as if flying.
+Code '**i**' when the baby or mom engages in an iconic gesture. They are called iconic because they represent an object, idea, or action that can't easily be referenced with a deictic (point/show) or conventional gesture. The movement of these gestures usually calls to mind something about the nature of the object, idea or action being referenced. For example, you could move your arms back and forth to represent running, or you could trace a square in the air with your finger, or flap your arms as if flying.
-Onset is the frame when the baby or mom has clearly begun the iconic gesture, and the coder can clearly identify this a gesture but does fall into the conventional gesture category (see ``). Repetitive instances of an iconic gesture should be coded as separate gesture events.
+**Onset** is the frame when the baby or mom has clearly begun the iconic gesture, and the coder can clearly identify this as a gesture that does not fall into the conventional gesture category (see ``). Repetitive instances of an iconic gesture should be coded as separate gesture events.
``
-Code 'c' when the baby or mom engages in a conventional gesture. Conventional gestures are culturally-agreed-upon hand or head movements with a specific meaning, like nodding the head to mean “yes,” shaking the head to mean “no,” and moving the finger to lips to indicate “be quiet”.
-
-shaking head “no”
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14765/20618,23455/asset/76362/download?inline=true")
-```
+Code 'c' when the baby or mom engages in a conventional gesture. Conventional gestures are culturally-agreed-upon hand or head movements with a specific meaning, like nodding the head to mean “yes,” shaking the head to mean “no,” and moving the finger to lips to indicate “be quiet”. Two video examples illustrate this (shaking head “no”; holding out hand for “give me”).
-holding out hand for “give me”
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14765/20618,23455/asset/76362/download?inline=true")
-```
If a gesture is conventional, you should be able to understand its meaning just by seeing it in isolation, without knowing any of the context.
Some additional examples of conventional gestures include: waving, clapping, flipping arms out to the side to indicate “I don’t know” or “where is it”, come-here gesture (finger motions or palms), sit-down gesture (pats ground), pickup gesture (child holds up arms to be picked up), thumbs up, shrugs, naughties (wag finger), hug me (hold arms out asking for hug), etc.
-Onset is the frame when the baby or mom has clearly begun the conventional gesture, and the coder can clearly identify this a gesture but does fall into the iconic gesture category (see ``).
+**Onset** is the frame when the baby or mom has clearly begun the conventional gesture, and the coder can clearly identify this as a gesture that does not fall into the iconic gesture category (see ``).
Repetitive instances of a conventional gesture should be coded as separate gesture events.
### How to Code
@@ -96,18 +93,18 @@ Set “JUMP-BACK-BY” key to 2 s.
Gestures are best coded with the volume low or muted so that the language content does not confound the coding process.
-Watch in 1x speed until either mom or baby gestures. Focus on the mom’s and infant’s hands and head to identify instances of gestures.
+Watch in **1x speed** until either mom or baby gestures. Focus on the mom’s and infant’s hands and head to identify instances of gestures.
-Gestures are defined purely as they relate to the communicative nature of each action. The coder can establish whether something is communicative by looking at things like eye contact, conversational context, and the reaction of the person being spoken or gestured to. If the movement isn’t supposed to communicate anything, then it’s not a gesture. For example, a child might reach for an object and pick it up and look at it. This is an action, not a gesture. But, if the child points to the object to indicate its presence, or if the parent claps her hands to indicate “good job,” then these are gestures. (If there is significant ambiguity in whether a gesture is communicative, or how to code it, sound may be of assistance.)
+Gestures are defined purely as they relate to the communicative nature of each action. The coder can establish whether something is _communicative_ by looking at things like eye contact, conversational context, and the reaction of the person being spoken or gestured to. If the movement isn’t supposed to communicate anything, then it’s not a gesture. For example, a child might reach for an object and pick it up and look at it. This is an action, not a gesture. But, if the child points to the object to indicate its presence, or if the parent claps her hands to indicate “good job,” then these are gestures. (If there is significant ambiguity in whether a gesture is communicative, or how to code it, sound may be of assistance.)
-When the coder identifies a mom or baby gesturing, jump back 2 seconds and play the video again at ½ speed until the frame the gesture is clearly underway is found. Hit the = key (equal sign) to insert a point cell; so the current video frame becomes the onset and the offset.
+When the coder identifies a mom or baby gesturing, jump back **2 seconds** and play the video again at **½ speed** until the frame where the gesture is clearly underway is found. Hit the = key (equal sign) to insert a point cell, so that the current video frame becomes both the onset and the offset.
-Type 'm' or 'b' to indicate whether mom of the baby was the of the gesture. Hit the TAB key to advance the cursor to , then type 'p', 's', 'i', or 'c' to indicate the type of gesture.
+Type **'m'** or **'b'** to indicate whether the mom or the baby was the `` of the gesture. Hit the TAB key to advance the cursor to ``, then type **'p'**, **'s'**, **'i'**, or **'c'** to indicate the type of gesture.
-Splitting Mom and Baby Gestures
+#### Splitting Mom and Baby Gestures
It's faster to code mom and baby gesture together in one pass. But for consistency with the other coding passes, we want mom speech and baby gestures to be in two separate columns.
-Run the Split-MomBabyGesture.rb script to pull baby and mom gestures from the the gesture column into babygesture and momgesture columns.
+**Run the Split-MomBabyGesture.rb script** to pull baby and mom gestures from the _gesture_ column into _babygesture_ and _momgesture_ columns.
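The script itself is not reproduced here, but the split it performs can be sketched in plain Ruby over a simplified cell structure. This is a hypothetical illustration, not the Datavyu script: the real Split-MomBabyGesture.rb works on Datavyu columns through its Ruby scripting API, and handling of the 'h' (mom holding baby) code is not shown.

```ruby
# Hypothetical sketch of the split performed by Split-MomBabyGesture.rb.
# Each coded gesture is modeled as a hash with an onset (ms), a source
# code ('m' = mom, 'b' = baby), and a gesture type ('p', 's', 'i', 'c').
def split_gestures(gesture_cells)
  # Partition cells by the source code into two destination "columns".
  babygesture = gesture_cells.select { |c| c[:source] == 'b' }
  momgesture  = gesture_cells.select { |c| c[:source] == 'm' }

  # Point cells: offset is set equal to onset, since only onsets are coded.
  (babygesture + momgesture).each { |c| c[:offset] = c[:onset] }

  { babygesture: babygesture, momgesture: momgesture }
end

cells = [
  { onset: 1_200, source: 'b', type: 'p' },  # baby point
  { onset: 4_875, source: 'm', type: 'c' },  # mom conventional gesture
  { onset: 9_030, source: 'b', type: 's' },  # baby show/hold up
]

result = split_gestures(cells)
```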
## `babygesture`
@@ -117,7 +114,7 @@ Run the Split-MomBabyGesture.rb script to pull baby and mom gestures from the th
Contains gestures produced by the baby.
-This column is automatically populated after the gesture pass is completed, using a Ruby script. All of the gestures tagged with 'b' in in the gesture column are transferred here. The onset and offset are equal, and set to the onset from the gesture column, which reflects the time when the coder was sure the gesture had begun.
+This column is automatically populated after the _gesture_ pass is completed, using a Ruby script. All of the gestures tagged with 'b' in `` in the _gesture_ column are transferred here. The onset and offset are equal, and set to the onset from the gesture column, which reflects the time when the coder was sure the gesture had begun.
### Value List
@@ -139,9 +136,9 @@ This column is automatically populated after the gesture pass is completed, usin
Contains gestures produced by the mom.
-This column is automatically populated after the gesture pass is completed, using a Ruby script.
-All of the gestures tagged with 'm' in in the gesture column are transferred here.
-The onset and offset are equal, and set to the onset from the gesture column, which reflects the time when the coder was sure the gesture had begun.
+This column is automatically populated after the _gesture_ pass is completed, using a Ruby script.
+All of the gestures tagged with 'm' in `` in the gesture column are transferred here.
+The onset and offset are equal, and set to the onset from the _gesture_ column, which reflects the time when the coder was sure the gesture had begun.
### Value List
diff --git a/img/.DS_Store b/img/.DS_Store
new file mode 100644
index 0000000..5008ddf
Binary files /dev/null and b/img/.DS_Store differ
diff --git a/img/camera-sdcard-battery.png b/img/camera-sdcard-battery.png
new file mode 100644
index 0000000..1d2a4c0
Binary files /dev/null and b/img/camera-sdcard-battery.png differ
diff --git a/img/card.eps b/img/card.eps
new file mode 100644
index 0000000..0307f35
Binary files /dev/null and b/img/card.eps differ
diff --git a/img/case.eps b/img/case.eps
new file mode 100644
index 0000000..146c2bd
Binary files /dev/null and b/img/case.eps differ
diff --git a/img/dish-set.eps b/img/dish-set.eps
new file mode 100644
index 0000000..8afea31
Binary files /dev/null and b/img/dish-set.eps differ
diff --git a/img/dish-set2.eps b/img/dish-set2.eps
new file mode 100644
index 0000000..55b7178
Binary files /dev/null and b/img/dish-set2.eps differ
diff --git a/img/dish-sets.png b/img/dish-sets.png
new file mode 100644
index 0000000..2662e8e
Binary files /dev/null and b/img/dish-sets.png differ
diff --git a/img/dm-mic.eps b/img/dm-mic.eps
new file mode 100644
index 0000000..3e25d61
Binary files /dev/null and b/img/dm-mic.eps differ
diff --git a/img/dm-mic.png b/img/dm-mic.png
new file mode 100644
index 0000000..e4dbced
Binary files /dev/null and b/img/dm-mic.png differ
diff --git a/img/dog.eps b/img/dog.eps
new file mode 100644
index 0000000..711dcc9
Binary files /dev/null and b/img/dog.eps differ
diff --git a/img/dog.png b/img/dog.png
new file mode 100644
index 0000000..a0cb46e
Binary files /dev/null and b/img/dog.png differ
diff --git a/img/laser.eps b/img/laser.eps
new file mode 100644
index 0000000..9b5005f
Binary files /dev/null and b/img/laser.eps differ
diff --git a/img/laser.png b/img/laser.png
new file mode 100644
index 0000000..ddfbf87
Binary files /dev/null and b/img/laser.png differ
diff --git a/img/mic.png b/img/mic.png
new file mode 100644
index 0000000..fda17b6
Binary files /dev/null and b/img/mic.png differ
diff --git a/img/overview-coding.png b/img/overview-coding.png
new file mode 100644
index 0000000..f9cbd39
Binary files /dev/null and b/img/overview-coding.png differ
diff --git a/img/overview-collecting.png b/img/overview-collecting.png
new file mode 100644
index 0000000..f9cbd39
Binary files /dev/null and b/img/overview-collecting.png differ
diff --git a/img/overview-processing.png b/img/overview-processing.png
new file mode 100644
index 0000000..f9cbd39
Binary files /dev/null and b/img/overview-processing.png differ
diff --git a/img/overview-project.png b/img/overview-project.png
new file mode 100644
index 0000000..f9cbd39
Binary files /dev/null and b/img/overview-project.png differ
diff --git a/img/payment.png b/img/payment.png
new file mode 100644
index 0000000..0df728e
Binary files /dev/null and b/img/payment.png differ
diff --git a/img/people-cathie.png b/img/people-cathie.png
new file mode 100644
index 0000000..ef440e3
Binary files /dev/null and b/img/people-cathie.png differ
diff --git a/img/people-karen.png b/img/people-karen.png
new file mode 100644
index 0000000..75d1a68
Binary files /dev/null and b/img/people-karen.png differ
diff --git a/img/people-kasey.png b/img/people-kasey.png
new file mode 100644
index 0000000..53ed66f
Binary files /dev/null and b/img/people-kasey.png differ
diff --git a/img/people-melody.png b/img/people-melody.png
new file mode 100644
index 0000000..46f5560
Binary files /dev/null and b/img/people-melody.png differ
diff --git a/img/people-orit.png b/img/people-orit.png
new file mode 100644
index 0000000..150d9d3
Binary files /dev/null and b/img/people-orit.png differ
diff --git a/img/people-rick.png b/img/people-rick.png
new file mode 100644
index 0000000..500fde0
Binary files /dev/null and b/img/people-rick.png differ
diff --git a/img/people-swapnaa.png b/img/people-swapnaa.png
new file mode 100644
index 0000000..9b3dd02
Binary files /dev/null and b/img/people-swapnaa.png differ
diff --git a/img/questionnaires.png b/img/questionnaires.png
new file mode 100644
index 0000000..60e996a
Binary files /dev/null and b/img/questionnaires.png differ
diff --git a/img/tablet.eps b/img/tablet.eps
new file mode 100644
index 0000000..f6194eb
Binary files /dev/null and b/img/tablet.eps differ
diff --git a/img/tablet.png b/img/tablet.png
new file mode 100644
index 0000000..c648eb3
Binary files /dev/null and b/img/tablet.png differ
diff --git a/img/tote.eps b/img/tote.eps
new file mode 100644
index 0000000..ef8c8b0
Binary files /dev/null and b/img/tote.eps differ
diff --git a/img/tote.png b/img/tote.png
new file mode 100644
index 0000000..1b16440
Binary files /dev/null and b/img/tote.png differ
diff --git a/img/tripod.eps b/img/tripod.eps
new file mode 100644
index 0000000..5121a4c
Binary files /dev/null and b/img/tripod.eps differ
diff --git a/img/tripod.png b/img/tripod.png
new file mode 100644
index 0000000..6d4e936
Binary files /dev/null and b/img/tripod.png differ
diff --git a/img/yoga-mat.eps b/img/yoga-mat.eps
new file mode 100644
index 0000000..b86da05
Binary files /dev/null and b/img/yoga-mat.eps differ
diff --git a/img/yoga-mat.png b/img/yoga-mat.png
new file mode 100644
index 0000000..0765395
Binary files /dev/null and b/img/yoga-mat.png differ
diff --git a/index.Rmd b/index.Rmd
index 0293840..3e31213 100644
--- a/index.Rmd
+++ b/index.Rmd
@@ -4,16 +4,14 @@ title: ""
-Play is the primary context for infant learning and is foundational for all domains of healthy development—cognition, language, social interaction, motor action, and emotion. Before children begin formal schooling, play occupies nearly all of their waking day. In the first years of life, play provides an unparalleled window into typical and atypical patterns of development, and it is an ideal context for understanding development in children around the globe.
+# The Play & Learning Across a Year (PLAY) project
-Video uniquely captures the nuances and details of natural behavior and the surrounding context, and video can be used and reused by experts in multiple domains.
+The PLAY project is a collaborative research initiative by 65 researchers from 45 universities across the United States and Canada. It serves as a model system for doing developmental science with a “big data” approach. Natural free play represents the foundation of infant learning, but we know little about how infants play, how play unfolds in real time and across development, and how individual and group differences promote infant learning and development through play.
-# The Play & Learning Across a Year (PLAY) project
+PLAY focuses on recording and revealing the behaviors of infants and mothers during natural activity in their homes, providing an unprecedented corpus of data, and using an innovative, transparent approach to science. The data set will consist of fully transcribed and annotated videos, parent report questionnaires, video tours of the home, digital recordings of ambient noise, and detailed demographic information on 900+ infants and mothers from across the United States. This first-of-its-kind corpus will be shareable and searchable with data spanning domains from language to locomotion, gender to gesture, and object play to emotion.
+
+The aim of the project is to develop a new approach to developmental science that enables: (1) “big data” science for researchers who would not otherwise have access; (2) a communal, low-cost means of collecting and coding data that retains the autonomy of individual labs; and (3) a plan for leveraging diverse expertise to address a common goal.
-PLAY is a collaborative research initiative by 65 researchers from 45 universities across the United States and Canada.
-PLAY focuses on recording and revealing the behaviors of infants and mothers during natural activity in their homes, providing an unprecedented corpus of data, and using an innovative, transparent approach to science.
-The data set will consist of fully transcribed and annotated videos, parent report questionnaires, video tours of the home, digital recordings of ambient noise, and detailed demographic information on 900+ infants and mothers from across the United States.
-This first-of-its-kind corpus will be shareable and searchable with data spanning domains from language to locomotion, gender to gesture, and object play to emotion.
## Support
diff --git a/objects.Rmd b/objects.Rmd
index 23452a1..34fd9ed 100644
--- a/objects.Rmd
+++ b/objects.Rmd
@@ -7,7 +7,9 @@ output:
toc_float: true
---
-# Coding [objects](objects.html)
+# Datavyu [Object Codes](objects.html) for 1-Hour Natural Play
+
+Make sure you are [currently logged in at Databrary](https://nyu.databrary.org/user/login) to view embedded video examples in this wiki.
## `babyobject`
@@ -15,18 +17,7 @@ output:
### General Orientation
-This code captures the times that the baby is manually engaged with an object.
-
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14765/-/asset/62793/download?inline=true")
-```
-
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14765/-/asset/62777/download?inline=true")
-```
-
-Coders score only when object events occur, not when they don't occur.
-This is an event code, where gray spaces between cells mean that the baby is not engaged with an object.
+This code captures the times that the baby is manually engaged with an object. Coders score only when object events occur, not when they don't occur. This is an event code, where gray spaces between cells mean that the baby is not engaged with an object. (2 videos)
### Value List
@@ -38,53 +29,15 @@ This is an event code, where gray spaces between cells mean that the baby is not
``
-Object = is defined as any manipulable, moveable item that may be detached and moved through space (e.g., toys, household items, and smaller moveable elements of larger objects like beads on busy box, doorknob).
-
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14765/-/asset/62763/download?inline=true")
-```
-
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14765/-/asset/63765/download?inline=true")
-```
-
-Objects may include large objects (i.e., a stroller, adult furniture, door, etc.) when baby moves them, thus, manually engaging with them.
-If the object never moves (e.g., the baby has a hand on the stroller but does not displace it), then this is not coded as 'o.'
-
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14765/-/asset/63763/download?inline=true")
-```
+**Object** is defined as any manipulable, moveable item that may be detached and moved through space (e.g., toys, household items, and smaller moveable elements of larger objects like beads on a busy box or a doorknob). (2 videos) Objects may include large objects (i.e., a stroller, adult furniture, door, etc.) when baby moves them, thus manually engaging with them. If the object never moves (e.g., the baby has a hand on the stroller but does not displace it), then this is not coded as 'o.' (1 video) The displacement rule allows us to differentiate object engagement episodes from instances where baby is exploring a surface or resting hands on a surface for support. (1 video) The infant does not have to be looking at the object for the event to count as an object engagement (e.g., baby is carrying the object). (1 video)
-The displacement rule is so that we can differentiate object engagement episodes from instances where baby is exploring a surface or resting hands on a surface for support.
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14765/-/asset/63767/download?inline=true")
-```
+Riding on toys with wheels does not count as object, but this will be coded in _babyloc_ pass.
-The infant does not have to be looking at the object for the event to count as an object engagement (e.g., baby is carrying object).
+Code '**o**' if the baby is engaged with an object by making contact with the item with hand(s) and/or moving the item in space (e.g., carrying, pushing on the floor, etc.). (1 video)
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14765/-/asset/62775/download?inline=true")
-```
+**Onset** is the frame when baby first causes the object to move while making contact with any part of the hand(s), not feet. Contact could be from any part of the hand (fingers, palm, side of hand). Movement could include lifting, holding, pressing, grasping, shaking, banging, or any other type of displacement event. DO NOT code onset merely when the hand touches the object if the object is not displaced (e.g., if the child touches a pillow but then 1 minute later actually grasps and moves it, code onset at the movement, not when the hand touches the object). **Offset** is the frame when baby is no longer in manual contact with an object for at least 3 s, OR when the baby is in manual contact but the object is no longer being displaced (displacement includes holding, lifting) for at least 3 s. There is no minimum duration for baby to touch an object to be scored as 'o,' but if infant is touching multiple objects, the offset of the 'o' object cell is when baby is no longer in manual contact with the last object contacted for 3+ s. If the baby is in manual contact with an object in one hand and makes contact with another object with their second hand, count this as the same bout. (1 video)
-Riding on toys with wheels does not count as object, but this will be coded in babyloc pass.
-
-Code 'o' if the baby is engaged with an object by making contact with the item with hand(s) and/or moving the item in space (e.g., carrying, pushing on the floor, etc.)
-
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14765/-/asset/62783/download?inline=true")
-```
-
-Onset is the frame when baby first causes the object to move while making contact with any part of the hand(s), not feet.
-Contact could be from any part of the hand (fingers, palm, side of hand).
-Movement could including lifting, holding, pressing, grasping, shaking, banging, or any other type of displacement event. DO NOT code onset, just when the hand touches the object if the object is not displaced (e.g., if they child touches a pillow but then 1 minute later actually grasps and moves it, code onset at the movement not when the hand touches the object).
-Offset is the frame when baby is no longer in manual contact with an object for at least 3 s. OR when the baby is in manual contact but the object is no longer being displaced (displacement includes holding, lifting) for at least 3 s.
-There is no minimum duration for baby to touch an object to be scored as 'o,' but if infant is touching multiple objects, the offset of 'o' object cell is when baby is no longer in manual contact with the last object contacted for 3+ s.
-If the baby is in manual contact with an object in one hand and makes contact with another object with their second hand, count this as the same bout.
-
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14765/-/asset/62755/download?inline=true")
-```
### How to Code
@@ -92,20 +45,12 @@ Set “JUMP-BACK-BY” key to 3 s.
Enable cell highlighting.
-Watch in real time for the baby's hand(s).
-As soon as you see the hand(s) touch an object (as defined above), continue watching for a couple of seconds to see if the baby moves/manipulates the object.
-Then, hit #4-SHUTTLEBACK to get to the onset of the cell.
-The Onset is the first frame when the baby makes manual contact with the item.
-Set this onset by hitting ENTER to set a new cell with that onset time.
-Now, continue watching the object bout in real time and set the Offset when the baby breaks manual contact or stops moving object (e.g., stroller) for at least 3 s.
-Once you've determined that the bout has ended, set the offset by hitting #5-STOP and then #4-SHUTTLEFORWARD or #6-SHUTTLEBACK to the last frame where the baby is no longer in manual contact with the item and/or when the baby is no longer moving it.
-Then, hit #9-SETOFFSET.
+Watch in real time for the baby's hand(s). As soon as you see the hand(s) touch an object (as defined above), continue watching for a couple of seconds to see if the baby moves/manipulates the object. Then, hit #4-SHUTTLEBACK to get to the onset of the cell. The **Onset** is the first frame when the baby makes manual contact with the item. Set this onset by hitting ENTER to set a new cell with that onset time. Now, continue watching the object bout in real time and set the **Offset** when the baby breaks manual contact or stops moving the object (e.g., stroller) for at least 3 s. Once you've determined that the bout has ended, set the offset by hitting #5-STOP and then #4-SHUTTLEFORWARD or #6-SHUTTLEBACK to the last frame where the baby is no longer in manual contact with the item and/or when the baby is no longer moving it. Then, hit #9-SETOFFSET.
+
+Continue watching in real time for the next object bout. If the baby is holding an object while crawling or walking around, you can watch faster by SHUTTLING at 2x speed to find the end of the object engagement.
-Continue watching in real time for the next object bout.
-If the baby is holding an object while crawling or walking around, you can watch faster by SHUTTLING at 2x speed to find the end of the object engagement.
+To check whether a 3-s pause has occurred between object engagements, go to the offset of the previous object cell and watch until you reach the next instance of 'o'. Then, hit the 'JUMP-BACK-BY' key and check to see if the previous cell lights up. If it does, then the two cells are <3 s apart and should be combined into one bout of '**o**'.
-To check whether a 3-s pause has occurred between object engagements, go to the offset of the previous object cell and watch until you reach the next instance of 'o'.
-Then, hit the 'JUMP-BACK-BY' key and check to see if the previous cell lights up. If it does, then the two cells are <3 s apart and should be combined into one bout of 'o'.
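+
+The 3-s combining rule above amounts to merging candidate 'o' cells whose gaps are shorter than 3 s. As an illustrative sketch only (a hypothetical helper, not part of the PLAY codebase), with onsets and offsets in seconds:
+
+```{r}
+# Hypothetical sketch: combine 'o' cells separated by gaps of < 3 s into one
+# bout, mirroring the JUMP-BACK-BY check described above.
+merge_object_bouts <- function(onsets, offsets, min_gap = 3) {
+  stopifnot(length(onsets) == length(offsets), length(onsets) >= 1)
+  out_on <- onsets[1]
+  out_off <- offsets[1]
+  for (i in seq_along(onsets)[-1]) {
+    n <- length(out_off)
+    if (onsets[i] - out_off[n] < min_gap) {
+      out_off[n] <- offsets[i]          # gap < 3 s: extend the current bout
+    } else {
+      out_on <- c(out_on, onsets[i])    # gap >= 3 s: start a new bout
+      out_off <- c(out_off, offsets[i])
+    }
+  }
+  data.frame(onset = out_on, offset = out_off)
+}
+```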
## `momobject`
@@ -127,69 +72,38 @@ This is an event code, where gray space in between cells means that the mom is n
``
-Object = is defined as any manipulable, moveable item that may be detached and moved through space (e.g., toys, household items). Object can include parts of a stationary object (e.g., doorknob on door, clasp on drawer) that can be moved or manipulated.
+**Object** is defined as any manipulable, moveable item that may be detached and moved through space (e.g., toys, household items). Object can include parts of a stationary object (e.g., doorknob on door, clasp on drawer) that can be moved or manipulated. (5 videos) Object can include large objects that mom may move (e.g., chairs).
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14765/-/asset/62767/download?inline=true")
-```
+Code '**o**' if mom is engaged with an object by making contact with the item with her hand(s). **Onset** is the frame when mom first makes contact with hands. **Offset** is the frame when mom is no longer in manual contact with an object for at least 3 s. If the mom has multiple items in hand, the Onset of object is when a hand touches the first object in the multiple-object bout and the Offset is when the hand(s) release the last object.
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14765/-/asset/62781/download?inline=true")
-```
+In cases of larger objects (i.e., a stroller, a box, a chair, a table, etc.), the object engagement begins when the object starts to move. If the large object never moves (e.g., the mom has a hand on the stroller but does not displace it), then this is not coded as '**o**'. (1 video)
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14765/-/asset/62791/download?inline=true")
-```
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14765/-/asset/62781/download?inline=true")
-```
+If the mom is not in the camera view, code this with a '.' as missing data.
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14765/-/asset/62797/download?inline=true")
-```
-Object can include large objects that mom may move (chairs).
-Code 'o' if mom is engaged with an object by making contact with the item with her hand(s).
-Onset is the frame when mom first makes contact with hands. Offset is the frame when mom is no longer in manual contact with an object for at least 3 s.
-If the mom has multiple items in hand, the Onset of object is when a hand(s) touched the first object in the multiple-object-bout and the Offset is when the hand(s) release the last object.
-
-In cases of larger objects (i.e., a stroller, a box, a chair, a table, etc.), the object engagement begins when the object starts to move. If the large object never moves (e.g., the mom has a hand on the stroller but does not displace it), then this is not coded as 'o'.
+### How to Code
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14765/-/asset/63366/download?inline=true")
-```
-If the mom is not in the camera view, code this with a '.' as missing data.
-### How to Code
Set “JUMP-BACK-BY” key to 3 s.
Enable cell highlighting.
-Watch in real-time for the mom's hand(s).
-As soon as you see the hand(s) touch an object (as defined above), continue watching for a couple of seconds to see if the mom moves/manipulated the object (which would make this an instance of Object).
-Then, hit #4-SHUTTLEBACK to get to the onset of the cell.
-The Onset is the first frame when the mom makes manual contact with the item and moves it through space.
-Set this onset by hitting ENTER to set a new cell with that onset time.
-Now, continue watching the Object bout in real time and set the Offset when the mom breaks manual contact or stops moving the object for at least 3 s (i.e., Object bouts that are interrupted by gray space are more than 3 s apart.
+Watch in real time for the mom's hand(s). As soon as you see the hand(s) touch an object (as defined above), continue watching for a couple of seconds to see if the mom moves/manipulates the object (which would make this an instance of Object). Then, hit #4-SHUTTLEBACK to get to the onset of the cell. The **Onset** is the first frame when the mom makes manual contact with the item and moves it through space. Set this onset by hitting ENTER to set a new cell with that onset time. Now, continue watching the Object bout in real time and set the **Offset** when the mom breaks manual contact or stops moving the object for at least 3 s (i.e., Object bouts that are interrupted by gray space are more than 3 s apart).
-There is no necessary minimum duration for object engagement during the 'o' bout to be coded as Object.
-In other words, the mom can engage with an item or as little or as much time as they would like, however, the mom must make manual contact and move it through space to count.
+There is no necessary minimum duration for object engagement during the 'o' bout to be coded as Object. In other words, the mom can engage with an item for as little or as much time as she would like; however, the mom must make manual contact and move it through space for it to count.
-Once you've determined that the bout has ended, set the offset by hitting #5-STOP and then #4-SHUTTLEFORWARD or #6-SHUTTLEBACK to the last frame where the mom if no longer in manual contact with the item and/or when the mom is no longer moving it.
-Then, hit #9-SETOFFSET.
+Once you've determined that the bout has ended, set the offset by hitting #5-STOP and then #4-SHUTTLEFORWARD or #6-SHUTTLEBACK to the last frame where the mom is no longer in manual contact with the item and/or when the mom is no longer moving it. Then, hit #9-SETOFFSET.
-Continue watching in real time for the next object bout.
-If the mom is walking or crawling with an object, watch at 2x speed.
+Continue watching in real time for the next object bout. If the mom is walking or crawling with an object, watch at 2x speed.
-Do not agonize.
-If the mom goes in and out of the camera view, but you know she is still holding the same object and has not put it down, code it in the same bout of 'o'.
-Do not mark the “.” for every few seconds she is out of frame.
+Do not agonize. If the mom goes in and out of the camera view, but you know she is still holding the same object and has not put it down, code it in the same bout of '**o**'. Do not mark the “.” for every few seconds she is out of frame.
Code as Object event if mom's back is to the camera, but you see her arms moving and she overtly appears to be manipulating something—even if you can't see exactly what it is.
-Many times, onsets and offsets are coded when mom goes in and out of frame.
-In these instances, hit the _0_ key to set a continuous cell, whose onset is 1-ms after the previous cell.
+Many times, onsets and offsets are coded when mom goes in and out of frame. In these instances, hit the _0_ key to set a continuous cell, whose onset is 1 ms after the previous cell.
+
diff --git a/people.Rmd b/people.Rmd
index 2f3e077..9284f6c 100644
--- a/people.Rmd
+++ b/people.Rmd
@@ -2,35 +2,90 @@
title: ""
---
+
+
+
## Principal Investigators
+
+
+
+
**Karen E. Adolph, Ph.D**.
New York University
Principal Investigator
+
+
+
+
+
**Catherine Tamis-LeMonda, Ph.D.**
New York University
Co-Principal Investigator
+
+
+
+
+
**Rick O. Gilmore, Ph.D.**
The Pennsylvania State University
Co-Principal Investigator
+
diff --git a/protocol-preMar26.Rmd b/protocol-preMar26.Rmd
new file mode 100644
index 0000000..ae63931
--- /dev/null
+++ b/protocol-preMar26.Rmd
@@ -0,0 +1,575 @@
+---
+title: ""
+output:
+ html_document:
+ toc: true
+ toc_depth: 3
+ toc_float: true
+---
+
+
+
+```{r setup, include=FALSE}
+knitr::opts_chunk$set(echo = FALSE)
+source("R/write_video_clip_html.R")
+```
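+
+The helper sourced above lives in `R/write_video_clip_html.R`, which this changeset does not include. A plausible minimal sketch, assuming the helper simply prints an HTML5 `<video>` tag for a Databrary asset URL (the actual implementation may differ):
+
+```{r}
+# Hypothetical sketch of write_video_clip_html (the real file is not shown in
+# this diff): emit an HTML5 <video> tag so results='asis' chunks embed a clip.
+write_video_clip_html <- function(url, width = 640) {
+  cat(sprintf('<video controls width="%d" preload="metadata">\n', width),
+      sprintf('  <source src="%s" type="video/mp4">\n', url),
+      '  Your browser does not support html5 video.\n',
+      '</video>\n', sep = "")
+}
+```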
+
+# Overview
+
+PLAY aims to set new standards for conducting open, transparent, and reproducible behavioral science by i) publishing the protocol, and ii) making extensive use of video exemplars to demonstrate phenomena and illustrate behavioral codes.
+For confidentiality reasons, access to video exemplars is restricted to researchers with authorized access to [Databrary](http://databrary.org).
+To register for access, visit .
+
+# Inclusion/Exclusion Criteria
+
+Infants' natural play in the home is characterized by tremendous variability, including variations in geographic location, climate, socioeconomic status (SES), maternal/paternal employment, childcare experiences, infants’ and mothers’ ages, language environment, physical layout and characteristics of the home, availability of media, toys for play, and so on.
+Researchers will be able to explore the effects of any/all such factors.
+
+However, to ensure a sufficient sample size and based on conversations with the launch group, we will limit variability along several dimensions. To be included in the PLAY sample of 900 sessions, families must be two-parent households.
+All infants must be monolingual in English or Spanish, or English-Spanish bilingual.
+All infants must be firstborn children, 12, 18, or 24 months of age (plus/minus one week). All infants must be full-term (37-40 weeks) without known disabilities.
+The mother must act as the caregiver during visits, which will be scheduled at a time when only the mother and infant are present in the home.
+
+# Scheduling Visit
+
+## Initial recruiting call
+
+
+
+
+
+
+*Hi, may I speak with [MOM]?*
+
+*My name is [CALLER NAME] and I’m calling from [LAB]. We have a study for [12 / 18 / 24]-month-olds and [CHILD] is the perfect age. Can I tell you about it?*
+
+*What language(s) do you speak to [CHILD]?*
+
+→ If not ENGLISH or SPANISH: end call
+
+*To control for differences in communication, we are looking for families who speak mainly English or Spanish. Would it be alright if you are contacted for other studies in the future?*
+
+→ If yes: continue
+
+*For this study, we are interested in learning about babies’ natural, everyday experiences in their homes, such as the toys they play with and the places they go.*
+*For this study, a researcher will visit you and [CHILD] in your home.*
+*You and [CHILD] will be video recorded for 1 hour as you go about your day.*
+*At the end of the visit we will ask questions about your family, your home, and [CHILD]’s skills and routines. We will also ask you to take us through your home as we do a video tour capturing the places [CHILD] gets to be throughout the day and things that [he/she] plays with.*
+
+*The study will take about 2 hours. You will receive XXX for your participation.*
+*We will schedule a day and time that’s convenient for you and when [CHILD] is usually awake/alert and not during a typical meal time.*
+
+*The data collected in this study are valuable and will be placed in a secure web-based library available only to researchers.*
+*The purpose is to share the data with experts in the field so that scientists can learn more about infant development.*
+
+*Are you interested in participating?*
+
+→ If yes:
+
+*Is there a day and time that works best for you (when [CHILD] is awake/alert and not a typical meal time)?*
+
+→ If no:
+
+*Ok thank you. May we call you for other studies?*
+
+### Voicemail
+
+*Hi, this message is for [MOM]. My name is [NAME] and I’m calling from [LAB].*
+*I’m calling because we have a fun study for [12 / 18 / 24]-month-olds and [CHILD] is the perfect age.*
+*If you are interested in hearing more about the study, please give us a call back.*
+*Our phone number is [XXX-XXX-XXXX]. Thank you and we hope to hear from you soon!*
+
+## Confirming the visit (2 days before actual visit, email the day before)
+
+12-mo crawler
+
+
+
+
+
+12-mo walker
+
+
+
+
+
+18-mo
+
+
+
+
+
+24-mo
+
+
+
+
+
+*Hi, may I speak with [MOM]?*
+
+*My name is [NAME] and I’m calling from [LAB] to confirm our visit on [DATE].*
+*Before the visit, I’d like to ask you a few questions.*
+*It will only take 5 minutes of your time. Can we speak now?*
+
+→ If yes:
+
+*Just as a reminder, the data we collect from you now and during the visit will be shared on a web-based library only available to researchers like the professor who runs this lab.*
+
+List of questions on the [Phone Questionnaire](phone_questionnaire.html)
+
+Please note that presentation and format will differ in the app.
+
+→ If no:
+
+*Can I call you back today or tomorrow [before the visit]?*
+
+Schedule call.
+
+# Preparing for Visit
+
+## Prepare paperwork
+
+Write Participant ID on all paperwork (consents and questionnaires).
+Fill out locomotor milestone worksheet.
+
+## Pack equipment
+
+- Camera, SD card and extra battery
+- Mic
+- Laser Measure
+- Solitary play toy
+- Dyadic play toy
+- Yoga mat
+- Tablet with app for questionnaires (if mom speaks English and/or Spanish, bring both versions of MacArthur), study consent form, Databrary sharing release form, and decibel meter.
+- Answer choice sheet with response scales
+- Participant payment
+- Paper copies of all questionnaires, MCDI, and consent and Databrary forms in case of tablet failure
+- Tools for body dimensions (height and weight) - TBD
+
+# Home Visit
+
+## Introduction
+
+Say to Mom:
+
+*Thanks for letting us come to your home. The visit has a few parts:*
+
+*I’ll begin by video-recording you and [CHILD] as you go about your day. I will video-record you both for 1 hour. Then, I will ask [CHILD] to play with some toys both by him/herself and with you.*
+
+*Afterwards, I will ask you some general questions about your family and home, and about [CHILD]’s skills and routines.*
+
+*You will give me a tour of your home that I will record on video to get a sense of the places [CHILD] goes and things that he/she plays with.*
+
+*Do you have any questions? Let’s start with reading and signing the consent.*
+
+## Consent to Participate and Permission to Share
+
+Ask parent to review form asking for consent to participate in the study. When finished, give parent a moment to look over form and sign it.
+
+Ask parent to review form asking for permission to share videos and metadata.
+When finished, give parent a moment to look over the form and sign it.
+
+[Here](https://www.databrary.org/resources/templates/release-template.html) is the Databrary Release Language.
+[Here](https://www.databrary.org/resources/guide/investigators/release/asking/examples.html) are videos depicting how to ask for permission to share and a sample script.
+
+## Visit Protocol
+
+### 1: 1-Hour Natural Play Video, Shoes, & Noise Measurement
+
+#### 12-mo (crawler and walker)
+
+Crawler participant view
+
+
+
+
+
+Crawler experimenter view
+
+
+
+
+
+Walker participant view
+
+
+
+
+
+Walker experimenter view
+
+
+
+
+
+#### 18-mo
+
+Participant view
+
+
+
+
+
+Experimenter view
+
+
+
+
+
+#### 24-mo
+
+Participant view
+
+
+
+
+
+Experimenter view
+
+
+
+
+
+Instruction to mom:
+
+*For the next hour, do anything you would typically do if I weren’t here. Try to ignore me as much as possible and I will stay out of the way. I will also try not to respond to you and [CHILD] so that he/she is not distracted. You can go anywhere in your home. You can play together or not. The idea is to capture what your typical day is like.*
+
+Procedure:
+
+Keep camera on the child at all times.
+Specifically, ensure that the child’s whole body is visible on camera. If mom is in frame, capture as much of her body as possible without compromising view of the child.
+Record in front or to the side of the child as much as possible.
+Do not zoom in.
+Remain as far away as possible (~3 to 5 m, hugging the wall) so that the child is not distracted by your presence.
+Try not to interact with the child or make eye contact with the child. Just watch through the view finder of the camera.
+
+#### Shoes
+
+```{r, results='asis'}
+write_video_clip_html("https://nyu.databrary.org/slot/14765/0,6640/asset/65148/download?inline=true")
+```
+
+If child is wearing shoes, video-record the shoes after the session; take them off the child and video the bottom, side, and top views.
+
+Procedure:
+
+Zoom in with camera and comment on shoe type, heel (if any), and other observations.
+
+#### Decibel Meter
+
+Open the app on your tablet and start running it just before you begin recording the free play video portion of the visit.
+
+Procedure:
+
+Open application (the application immediately starts recording noise levels upon startup).
+Place device in the most central place in the home (e.g., living room).
+
+### 2: Solitary Play
+
+
+
+#### 12-mo crawler & walker
+
+12-mo crawler participant view
+```{r, results='asis'}
+write_video_clip_html("https://nyu.databrary.org/slot/14574/-/asset/61352/download?inline=true")
+```
+
+12-mo crawler experimenter view
+```{r, results='asis'}
+write_video_clip_html("https://nyu.databrary.org/slot/14574/-/asset/61358/download?inline=true")
+```
+
+12-mo walker participant view
+```{r, results='asis'}
+write_video_clip_html("https://nyu.databrary.org/slot/14167/-/asset/59918/download?inline=true")
+```
+
+12-mo walker experimenter view
+```{r, results='asis'}
+write_video_clip_html("https://nyu.databrary.org/slot/14167/-/asset/59928/download?inline=true")
+```
+
+#### 18-mo
+
+18-mo participant view
+```{r, results='asis'}
+write_video_clip_html("https://nyu.databrary.org/slot/14513/-/asset/61064/download?inline=true")
+```
+
+18-mo experimenter view
+```{r, results='asis'}
+write_video_clip_html("https://nyu.databrary.org/slot/14513/-/asset/61078/download?inline=true")
+```
+
+#### 24-mo
+
+24-mo participant view
+```{r, results='asis'}
+write_video_clip_html("https://nyu.databrary.org/slot/14514/-/asset/61052/download?inline=true")
+```
+
+24-mo experimenter view
+```{r, results='asis'}
+write_video_clip_html("https://nyu.databrary.org/slot/14514/-/asset/61060/download?inline=true")
+```
+
+
+Interviewer:
+
+*For the next few minutes, we want to see how [CHILD] plays by him/herself. We ask you not to distract him/her or tell him/her how to play. If [CHILD] tries to get your attention or wants to play with you, you can say, “Go play.” It’s perfectly fine if he/she doesn’t play with the toy.* Say to child: *Here [CHILD], play with this!*
+
+Camera:
+
+Record solitary toy play so that the view captures the baby’s entire body and hands on the object.
+If the child moves around, follow the child and keep the face in frontal view.
+
+Procedure:
+
+Set yoga mat down on the floor. Un-stack cups and arrange randomly, standing upright (out of child’s view).
+Place child in a sitting position on yoga mat and start timing after you present the toy.
+Use timer on the camera (let the timer run a bit longer than 2 min to avoid cutting the play time short; later we code only 2 min of engagement).
+After 2 minutes, say: *“Great job!”*
+
+### 3: Dyadic (Mother-Child) Play
+
+
+
+#### 12-mo crawler & walker
+
+12-mo crawler participant view
+```{r, results='asis'}
+write_video_clip_html("https://nyu.databrary.org/slot/14574/-/asset/61354/download?inline=true")
+```
+
+12-mo crawler experimenter view
+```{r, results='asis'}
+write_video_clip_html("https://nyu.databrary.org/slot/14574/-/asset/61358/download?inline=true")
+```
+
+12-mo walker participant view
+```{r, results='asis'}
+write_video_clip_html("https://nyu.databrary.org/slot/14167/-/asset/59920/download?inline=true")
+```
+
+12-mo walker experimenter view starts at 03:40
+```{r, results='asis'}
+write_video_clip_html("https://nyu.databrary.org/slot/14167/-/asset/59928/download?inline=true")
+```
+
+#### 18-mo
+
+18-mo participant view
+```{r, results='asis'}
+write_video_clip_html("https://nyu.databrary.org/slot/14513/-/asset/61066/download?inline=true")
+```
+
+18-mo experimenter view starts at 3:48
+```{r, results='asis'}
+write_video_clip_html("https://nyu.databrary.org/slot/14513/-/asset/61078/download?inline=true")
+```
+
+#### 24-mo
+
+24-mo participant view
+```{r, results='asis'}
+write_video_clip_html("https://nyu.databrary.org/slot/14514/-/asset/61050/download?inline=true")
+```
+
+24-mo experimenter view starts at 03:32
+```{r, results='asis'}
+write_video_clip_html("https://nyu.databrary.org/slot/14514/-/asset/61060/download?inline=true")
+```
+
+Instructions:
+
+*Please sit next to [CHILD]. I’ll give you a toy. Please play with [CHILD].*
+
+Procedure:
+
+Record so that the child’s and mother’s entire bodies and hands are captured.
+Use timer on camera to time engagement.
+After 3 minutes, say “Great job!”
+
+### 4: [Questionnaires](questionnaires.html)
+
+Please note that presentation and format will differ in the app.
+
+#### 12-mo
+
+12-mo crawler
+```{r, results='asis'}
+write_video_clip_html("https://nyu.databrary.org/slot/14514/-/asset/61050/download?inline=true")
+```
+
+12-mo walker
+```{r, results='asis'}
+write_video_clip_html("https://nyu.databrary.org/slot/14514/-/asset/61060/download?inline=true")
+```
+
+#### 18-mo
+
+```{r, results='asis'}
+write_video_clip_html("https://nyu.databrary.org/slot/14513/-/asset/61076/download?inline=true")
+```
+
+#### 24-mo
+
+```{r, results='asis'}
+write_video_clip_html("https://nyu.databrary.org/slot/14514/-/asset/61088/download?inline=true")
+```
+
+#### General Questionnaires
+
+Instructions:
+
+*I have some questions for you…*
+
+[Only give introduction to the sections that need introduction (i.e., ECBQ and MB-CDI)].
+
+A GoogleSheet with most of the questions in a database format can be found [here](https://docs.google.com/spreadsheets/d/1pVOM2naRS_STCXx4nkaRDLO6_V5kGhFaduRfwsv7cnI/edit?usp=sharing).
+
+Procedure:
+
+Set up camera to record the questionnaires.
+You'll need to change the battery on the camera to ensure sufficient power.
+Sit next to the mom so she is able to read along.
+
+- [Toys & Pets](toys_pets.html)
+- [HOME](home.html)
+- [Gender Socialization](gender.html)
+- [Locomotor milestones](locomotor_milestones.html)
+- [ECLS-B Health](eclsb_health.html)
+- [Typical Day](typical_day.html)
+- [Media time & use](media.html)
+
+#### MacArthur-Bates Communicative Development Inventory (MB-CDI)
+
+MB-CDI should be administered in the primary language of the mom.
+Specific instructions and procedure are included in the questionnaire.
+
+#### [ECBQ](ecbq.html)
+
+Read instructions on questionnaire.
+Give mom answer sheet with rating scale.
+
+### 5: House Walkthrough & Room Measurements
+
+#### Video House Walkthrough
+
+#### 12-mo
+
+Crawler
+```{r, results='asis'}
+write_video_clip_html("https://nyu.databrary.org/slot/14574/-/asset/61356/download?inline=true")
+```
+
+Walker
+```{r, results='asis'}
+write_video_clip_html("https://nyu.databrary.org/slot/14167/-/asset/59922/download?inline=true")
+```
+
+#### 18-mo
+
+```{r, results='asis'}
+write_video_clip_html("https://nyu.databrary.org/slot/14513/-/asset/61068/download?inline=true")
+```
+
+#### 24-mo
+
+```{r, results='asis'}
+write_video_clip_html("https://nyu.databrary.org/slot/14514/-/asset/61048/download?inline=true")
+```
+
+Instructions:
+
+*Now, we would like to see the space that [CHILD] gets to explore throughout the day. Please give me a tour of your home as I follow with a camera, and take measurements of the spaces.*
+*As we walk around, please mention the things that [CHILD] plays with in each room. Please show me where you keep his/her clothes to give us an idea of the kinds of things he/she wears.*
+
+Procedure (Video):
+
+Pause at the entrance of the room.
+
+Name the room by its function (e.g., “This is where [CHILD] sleeps”).
+First, pan the camera SLOWLY from Left to Right.
+Then, pan the camera to Floor, name the different types of surfaces in the space (hardwood, plush carpet, thin rug, linoleum, tile, etc.), and then pan to the Ceiling.
+Hold the camera in one hand while you take measurements of the room.
+Do NOT turn off the camera when walking to next room.
+Walk SLOWLY.
+
+#### Room Measurements with Laser Distance Measurer
+
+
+
+Measure all rooms in the house.
+Room = any space used by someone on a regular basis, including: bedrooms, kitchens, bathrooms, and basements.
+Do not measure laundry rooms.
+Rooms don’t have to have windows.
+A room has to have a clear demarcation (e.g., a wall or an entry).
+If the room has a short divider (e.g., when a kitchen and a living room are divided by a counter), count as one big room and measure accordingly.
+
+Procedure:
+
+Turn the measure on by pressing the ON/DIST button.
+Make sure the laser is on.
+Place the base of the laser flat on the wall.
+Avoid moldings and door castings.
+Measure wall to wall, lengthwise and widthwise.
+If a room has an odd or asymmetrical shape (i.e., any shape other than a rectangle or a square), measure the largest rectangle or square area of the room.
+Press ON/DIST again to take measurement.
+Repeat the above for length and width.
+Focus camera on laser measure and read measurements out loud.
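+
+When the readings are processed later, length and width combine into a floor-area estimate. A minimal R sketch (the `room_area_m2` helper is hypothetical, not part of the protocol's tooling):
+
+```{r, eval=FALSE}
+# Hypothetical helper: floor area from the two laser readings (in meters).
+# For odd-shaped rooms, use the length and width of the largest
+# rectangular area, as described above.
+room_area_m2 <- function(length_m, width_m) {
+  stopifnot(length_m > 0, width_m > 0)
+  length_m * width_m
+}
+room_area_m2(4.2, 3.5)  # 14.7 square meters
+```
+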
+
+### 6: Body Dimensions
+
+[TBD]
+
+### 7: Visit Wrap-up
+
+Complete the home measurement and housing checklist sections of the Home Questionnaire.
+When you arrive back at the lab, wash all toys and equipment thoroughly.
+Wipe down yoga mat.
+Rinse nesting cups in bleach-water.
+Do not submerge shape sorter in water (or it will stop making noise).
+
+### 8: Visit post-processing
+
+Export questionnaire data from tablet.
+Upload videos, questionnaires, and house decibel data to Databrary.
diff --git a/protocol.Rmd b/protocol.Rmd
index ae63931..b320903 100644
--- a/protocol.Rmd
+++ b/protocol.Rmd
@@ -20,556 +20,367 @@ PLAY aims to set new standards for conducting open, transparent, and reproducibl
For confidentiality reasons, access to video exemplars is restricted to researchers with authorized access to [Databrary](http://databrary.org).
To register for access, visit .
-# Inclusion/Exclusion Criteria
+Please ensure that you are [**currently logged in at Databrary**](https://nyu.databrary.org/user/login) to view embedded video examples in this wiki and gain access to phone and home questionnaires.
-Infants' natural play in the home is characterized by tremendous variability including variations in: geographic location, climate, socioeconomic status (SES), maternal/paternal employment, childcare experiences, infants’ and mothers’ ages, language environment, physical layout and characteristics of the home, availability of media, toys for play, and so on.
-Researchers will be able to explore the effects of any/all such factors.
+# Participant Inclusion/Exclusion Criteria
-However, to ensure a sufficient sample size and based on conversations with the launch group, we will limit variability along several dimensions. To be included in the PLAY sample of 900 sessions, families must be two-parent households.
-All infants must be English or Spanish monolingual or bilingual.
-All infants must be the firstborn child and 12, 18, or 24 months of age (plus/minus one week). All infants must be full-term (37-40 weeks) without known disabilities.
-The mother must act as the caregiver during visits, which will be scheduled at a time when only the mother and infant are present in the home.
+Infants' natural play in the home is characterized by tremendous variability including variations in: geographic location, climate, SES, maternal/paternal employment, childcare experiences, infants’ and mothers’ ages, language environment, physical layout and characteristics of the home, availability of media, toys for play, and so on. Researchers will be able to explore the effects of any/all such factors.
-# Scheduling Visit
-
-## Initial recruiting call
-
-
-
-
-
-
-*Hi, may I speak with [MOM]?*
-
-*My name is [CALLER NAME] and I’m calling from [LAB]. We have a study for [12 / 18 / 24]-month-olds and [CHILD] is the perfect age. Can I tell you about it?*
-
-*What language(s) do you speak to [CHILD]?*
-
-→ If not ENGLISH or SPANISH: end call
-
-*To control for differences in communication, we are looking for families who speak mainly English or Spanish. Would it be alright if you are contacted for other studies in the future?*
-
-→ If yes: continue
-
-*For this study, we are interested in learning about babies’ natural, everyday experiences in their homes–such as the toys they play with and places they go.*
-*For this study, a researcher will visit you and [CHILD] in your home.*
-*You and [CHILD] will be video recorded for 1 hour as you go about your day.*
-*At the end of the visit we will ask questions about your family, your home, and [CHILD]’s skills and routines. We will also ask you to take us through your home as we do a video tour capturing the places [CHILD] gets to be throughout the day and things that [he/she] plays with.*
-
-*The study will take about 2 hours. You will receive XXX for your participation.*
-*We will schedule a day and time that’s convenient for you and when [CHILD] is usually awake/alert and not during a typical meal time.*
-
-*The data collected in this study are valuable and will be placed in a secure web-based library available only to researchers.*
-*The purpose is to share the data with experts in the field so that scientists can learn more about infant development.*
-
-*Are you interested in participating?*
+However, to ensure a sufficient sample size and based on conversations with the launch group, we will limit variability along several dimensions. To be included in the PLAY sample of 900 sessions, participants must be:
-→ If yes:
+- From two-parent or single-parent households
+- The mother must act as the caregiver during the one-hour natural interaction, which will be scheduled at a time when only the mother and infant are present.
+- English or Spanish monolingual or bilingual (i.e., no other language exposure in the home)
+- The firstborn child (i.e., only child in the household)
+- 12, 18, or 24 months of age (+/- 1 week)
+- Born full-term (37-40 weeks) without known disabilities
-*Is there a day and time that works best for you (when [CHILD] is awake/alert and not a typical meal time)?*
-→ If no:
-
-*Ok thank you. May we call you for other studies?*
-
-### Voicemail
+# Scheduling Visit
-*Hi, this message is for [MOM]. My name is [NAME] and I’m calling from [LAB].*
-*I’m calling because we have a fun study for [12 / 18 / 24]-month-olds and [CHILD] is the perfect age.*
-*If you are interested in hearing more about the study, please give us a call back.*
-*Our phone number is [XXX-XXX-XXXX]. Thank you and we hope to hear from you soon!*
+*To schedule a visit, you will make two phone calls to each family: the initial recruiting call and the confirmation call (if the family agrees to participate). Depending on the mother's availability, you will complete the [participant paperwork](link to questionnaires) during either call.*
-## Confirming the visit (2 days before actual visit, email the day before)
+## Initial recruiting call
-12-mo crawler
+
+
+
+
+
+
+
+
+
+
-
-
+
+
+
+
+
-18-mo
-
-
-
-
-24-mo
-
-
-
-
-
-*Hi, may I speak with [MOM]?*
-
-*My name is [NAME] and I’m calling from [LAB] to confirm our visit on [DATE].*
-*Before the visit, I’d like to ask you a few questions.*
-*It will only take 5 minutes of your time. Can we speak now?*
-
-→ If yes:
-
-*Just as a reminder, the data we collect from you now and during the visit, will be shared on a web-based library only available to researchers like the professor who runs this lab.*
-
-List of questions on the [Phone Questionnaire](phone_questionnaire.html)
-
-Please note that presentation and format will differ in the app.
-
-→ If no:
-
-*Can I call you back today or tomorrow [before the visit]?*
-
-Schedule call.
-
-# Preparing for Visit
-
-## Prepare paperwork
-
-Write Participant ID on all paperwork (consents and questionnaires).
-Fill out locomotor milestone worksheet.
-
-## Pack equipment
-
-- Camera, SD card and extra battery
-- Mic
-- Laser Measure
-- Solitary play toy
-- Dyadic play toy
-- Yoga mat
-- Tablet with app for questionnaires (if mom speaks English and/or Spanish, bring both versions of MacArthur), study consent form, Databrary sharing release form, and decibel meter.
-- Answer choice sheet with response scales
-- Participant payment
-- Paper copies of all questionnaires, MCDI, and consent and Databrary forms in case of tablet failure
-- Tools for body dimensions (Height and Weight)- TBD
-
-# Home Visit
+Hi, may I speak with [MOM]?
-## Introduction
+My name is [CALLER NAME] and I’m calling from [LAB] about a home study where we're looking at how moms and babies interact with each other. We'd be visiting your home and you'd receive $25 at the end of the visit for your participation. Can we tell you about it?
-Say to Mom:
+First, I have some questions to see if you and [CHILD] qualify for the study.
-*Thanks for letting us come to your home. The visit has a few parts:*
+Does [CHILD] have any siblings?
-*I’ll begin by video-recording you and [CHILD] as you go about your day. I will video-record you both for 1 hour. Then, I will ask [CHILD] to play with some toys both by him/herself and with you.*
+- → *If yes: **end call**.* In this study, we are currently looking for only children. Would it be alright if we contacted you for other studies in the future?
+- → *If no: continue*
-*Afterwards, I will ask you some general questions about your family and home, and about [CHILD]’s skills and routines.*
+What language(s) do you speak to [CHILD]?
-*You will give me a tour of your home that I will record on video to get a sense of the places [CHILD] goes and things that he/she plays with.*
+- → *If not ENGLISH or SPANISH: **end call**.* To control for differences in communication, we are looking for families who speak mainly English or Spanish. Would it be alright if we contacted you for other studies in the future?
+- → *If yes: continue*
-*Do you have any questions? Let’s start with reading and signing the consent.*
+Was [CHILD] born on his/her due date? (If not: “How many weeks and days early/late was he/she?”)
-## Consent to Participate and Permission to Share
+- → *If more than 4 weeks early: **end call**.* In this study, we are currently looking for children born at term. Would it be alright if we contacted you for other studies in the future?
+- → *If born at term (37-41 weeks): continue*
-Ask parent to review form asking for consent to participate in the study. When finished, give parent a moment to look over form and sign it.
+For this study, we are interested in learning about babies’ natural, everyday experiences in their homes. A researcher will visit you and [CHILD] in your home for about 2.5 hours. You and [CHILD] will be video recorded for 1 hour as you go about your day, followed by questions on [CHILD]'s skills and routines. We will ask you to take us through your home as we record the environment. For your participation, we will compensate you with $25 at the end of the session.
-Ask parent to review form asking for permission to share videos and metadata.
-When finished, give parent a moment to look over the form and sign it.
+The data collected in this study are valuable and will be placed in a secure web-based library available only to researchers. The purpose is to share the data with experts in the field so that scientists can learn more about infant development.
-[Here](https://www.databrary.org/resources/templates/release-template.html) is the Databrary Release Language.
-[Here](https://www.databrary.org/resources/guide/investigators/release/asking/examples.html) are videos depicting how to ask for permission to share and a sample script.
+Does this sound like something you would be interested in participating in with [CHILD]?
-## Visit Protocol
+→ *If no to study or to sharing video on Databrary:* Okay, thank you. May we call you for other studies?
-### 1: 1-Hour Natural Play Video, Shoes, & Noise Measurement
+→ *If yes:* Great! Because we are interested in mother-infant routines, we'd like to find a time and date when we can observe just **you and [CHILD]** at home. It would be best to schedule a time when [CHILD] is usually awake and not during a typical mealtime. Is there a convenient day and time that works for you within these criteria?
-#### 12-mo (crawler and walker)
+→ *If the date they are available puts baby out of age range:* For this study, we are interested in studying specific age groups: 12-, 18-, and 24-month-olds. Would it be possible for us to contact you in XX months to see if [CHILD] can participate then?
-Crawler participant view
-
-
-
-
-
-Crawler experimenter view
-
-
-
-
+---
-Walker participant view
-
-
-
-
+- Before the study, we have a few questions that we'd like to ask. It should only take about 5 minutes. We can either ask you now or when we call to confirm the study. Which would you prefer?
-Walker experimenter view
-
-
-
-
+→ *If yes:*
-#### 18-mo
+Just as a reminder, the data we collect from you now and during the visit will be shared on a web-based library only available to researchers like the professor who runs this lab.
-Participant view
-
-
-
-
+- On your tablet, open Kobo toolbox and start a new questionnaire set
+- Fill out participant information at top of new session
+- “Save as Draft” after Phone Questionnaire and home visit questionnaires
+- Only hit “Submit” after filling out clean-up notes back in lab
-Experimenter view
-
-
-
-
+[List of questions on the Phone Questionnaire](https://nyu.databrary.org/volume/254/slot/15048/-/asset/63741)
+*Please note that presentation and format will differ in the app.*
-#### 24-mo
+→ *If no:* Proceed with wrapping up the call:
-Participant view
-
-
-
-
+---
-Experimenter view
-
-
-
-
+- So I have you and [CHILD] for our study on [DATE] at [TIME]. We'll be calling you the day before (if study is on Monday: the Friday before) your appointment to confirm that that time still works for you. Have a great day!
+- → *If the date they are available puts baby out of age range:* For this study, we are interested in studying specific age groups: 12-, 18-, and 24-month-olds. Would it be possible for us to contact you in XX months to see if [CHILD] can participate then?
+- → *If no to study or to sharing video on Databrary:* Okay, thank you. May we call you for other studies?
-Instruction to mom:
-*For the next hour, do anything you would typically do if I weren’t here. Try to ignore me as much as possible and I will stay out of the way. I will also try not to respond to you and [CHILD] so that he/she is not distracted. You can go anywhere in your home. You can play together or not. The idea is to capture what your typical day is like.*
-Procedure:
+### Voicemail
-Keep camera on the child at all times.
-Specifically, ensure that the child’s whole body is visible on camera. If mom is in frame, capture as much of her body as possible without compromising view of the child.
-Record in front or to the side of the child as much as possible.
-Do not zoom in.
-Remain at as far a distance as possible (~3 to 5 m, hugging the wall) so that the child is not distracted by your presence.
-Try not to interact with the child or make eye contact with the child. Just watch through the view finder of the camera.
+Hi, this message is for [MOM]. My name is [NAME] and I’m calling from [LAB]. I’m calling because we have a fun study for [12 / 18 / 24]-month-olds and [CHILD] is the perfect age. You would receive $25 for participating in the study. If you are interested in hearing more, please give us a call back. Our phone number is [XXX-XXX-XXXX]. Thank you and we hope to hear from you soon!
-#### Shoes
+### Confirming the visit (call 1 day before the actual visit; also email the day before)
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14765/0,6640/asset/65148/download?inline=true")
-```
+
-If child is wearing shoes, video-record the shoes after the session; take them off child and video the bottom, side, and top views.
+Hi, may I speak with [MOM]?
-Procedure:
+My name is [NAME] and I’m calling from [LAB] to confirm our visit with you and [CHILD] on [DATE]. Does this time and date still work for you?
-Zoom in with camera and comment on shoe type, heel (if any), and other observations.
+→ *If yes:*
-#### Decibel Meter
+---
-Open the app on your tablet and start running it just before you begin recording the free play video portion of the visit.
+→ *If phone questionnaire was not completed during initial phone call:* Before the visit, I’d like to ask you a few questions. It will only take 5 minutes of your time. Can we speak now?
-Procedure:
+→ *If yes:*
-Open application (the application immediately starts recording noise levels upon startup).
-Place device in the most central place in the home (e.g., living room)
+Just as a reminder, the data we collect from you now and during the visit will be shared on a web-based library only available to researchers like the professor who runs this lab.
-### 2: Solitary Play
+- On your tablet, open Kobo toolbox and start a new questionnaire set
+- Fill out participant information at top of new session
+- “Save as Draft” after Phone Questionnaire and home visit questionnaires
+- Only hit “Submit” after filling out clean-up notes back in lab
-
+[List of questions on the Phone Questionnaire](https://nyu.databrary.org/volume/254/slot/15048/-/asset/63741)
+*Please note that presentation and format will differ in the app.*
-#### 12-mo crawler & walker
+→ *If no:* Can I call you back today or tomorrow [before the visit]? *Schedule call.*
-12-mo crawler participant view
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14574/-/asset/61352/download?inline=true")
-```
+---
-12-mo crawler experimenter view
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14574/-/asset/61358/download?inline=true")
-```
+# Preparing for Visit
-12-mo walker participant view
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14167/-/asset/59918/download?inline=true")
-```
+## Pack
+
+- Camera, SD card and extra battery
+- Microphone with sponge cover
+- Fully loaded and charged tablet
+- Laser Measure
+- Decibel meter mic
+- Tripod for camera
+- Tote bag
+- Yoga mat
+- Dish set
+- Toy
+- Participant payment
+- Paper backups of all questionnaires and forms
+
-12-mo walker experimenter view
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14167/-/asset/59928/download?inline=true")
-```
+# Home Visit
-#### 18-mo
+## Introduction
-18-mo participant view
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14513/-/asset/61064/download?inline=true")
-```
+**Say to Mom:**
-18-mo experimenter view
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14513/-/asset/61078/download?inline=true")
-```
+“Thanks for letting us come to your home. The visit has a few parts:
-#### 24-mo
+I’ll begin by video-recording you and [CHILD] as you go about your day. I will video-record you both for 1 hour. Then, I will ask [CHILD] to play with some toys both by him/herself and with you.
-24-mo participant view
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14514/-/asset/61052/download?inline=true")
-```
+Afterwards, I will ask you some general questions about your family and home, and about [CHILD]’s skills and routines.
-24-mo experimenter view
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14514/-/asset/61060/download?inline=true")
-```
+You will give me a tour of your home that I will record on video to get a sense of the places [CHILD] goes and things that he/she plays with.
+Do you have any questions? Let’s start with reading and signing the consent.”
-Interviewer:
+## Consent to Participate and Permission to Share
-*For the next few minutes, we want to see how [CHILD] plays by him/herself. We ask you not to distract him/her or tell him/her how to play. If [CHILD] tries to get your attention or wants to play with you, you can say, “Go play. It’s perfectly fine if he/she doesn’t play with the toy. Say to child: Here [CHILD], play with this!*
+Ask parent to review form asking for consent to participate in the study. When finished, give parent a moment to look over form and sign it.
-Camera:
+Ask parent to review form asking for permission to share videos and metadata. When finished, give parent a moment to look over the form and sign it. Here is the [Databrary Release Language](https://databrary.org/access/policies/release-template.html). Here are [videos](https://databrary.org/access/guide/investigators/release/asking/examples.html) depicting how to ask for permission to share and a [sample script](https://databrary.org/access/guide/investigators/release/asking/script.html).
-Record solitary toy play so that view is on baby’s body entirely and hands on object.
-If child moves around, follow child and keep face in frontal view.
-Procedure:
-Set yoga mat down on the floor. Un-stack cups and arrange randomly, standing upright (out of child’s view).
-Place child in a sitting position on yoga mat and start timing after you present the toy.
-Use timer on the camera (let the timer run for a bit longer than 2 min to avoid cutting the play time short. Later we code only 2 min of engagement).
-After 2 minutes, say: *“Great job!”*
+## Visit Protocol
-### 3: Dyadic (Mother-Child) Play
+### 1: One-Hour Natural Play Video & Noise Measurement
+#### 1.1 One-Hour Natural Play Video
-
+
-#### 12-mo crawler & walker
+##### Instruction to mom:
-[VIDEO 12-mo crawler participant view
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14574/-/asset/61354/download?inline=true")
-```
+“For the next hour, do anything you would typically do if I weren’t here. Try to ignore me as much as possible and I will stay out of the way. I will also try not to respond to you and [CHILD] so that he/she is not distracted. You can go anywhere in your home. You can play together or not. The idea is to capture what your typical day is like.”
-12-mo crawler experimenter view
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14574/-/asset/61358/download?inline=true")
-```
+##### Procedure:
-12-mo walker participant view
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14167/-/asset/59920/download?inline=true")
-```
+- Keep camera on the child at all times. Specifically, ensure that the child’s whole body is visible on camera. If mom is in frame, capture as much of her body as possible without compromising view of the child.
+- Record in front or to the side of the child as much as possible.
+- Do not zoom in.
+- Remain at as far a distance as possible (~3 to 5 m, hugging the wall) so that the child is not distracted by your presence.
+- Try not to interact with the child or make eye contact with the child. Just watch through the view finder of the camera.
-12-mo walker experimenter view starts at 03:40
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14167/-/asset/59928/download?inline=true")
-```
-#### 18-mo
+#### 1.2 Shoes
-18-mo participant view
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14513/-/asset/61066/download?inline=true")
-```
+
-18-mo experimenter view starts at 3:48
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14513/-/asset/61078/download?inline=true")
-```
+If child is wearing shoes, video-record the shoes after the session; take them off child and video the bottom, side, and top views.
-#### 24-mo
+##### Procedure:
-24-mo participant view
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14514/-/asset/61050/download?inline=true")
-```
+Zoom in with camera and comment on shoe type, heel (if any), and other observations.
-24-mo experimenter view starts at 03:32
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14514/-/asset/61060/download?inline=true")
-```
+#### 1.3 Decibel Meter
-Instructions:
+Open the app on your tablet and start running it just before you begin recording the free play video portion of the visit. This process should be recorded on the video camera.
-*Please sit next to [CHILD]. I’ll give you a toy. Please play with [CHILD].*
+##### Procedure:
-Procedure:
+- Open application (the application immediately starts recording noise levels upon startup).
+- Place device in the most central place in the home (e.g., living room).
+- The microphone should be facing towards the room (e.g., away from walls) and propped up on the microphone stand so that it is **not** lying flat against the surface of the space.
-Record so that the child and mother’s entire body and hands are captured.
-Use timer on camera to time engagement.
-After 3 minutes, say “Great job!”
-### 4: [Questionnaires](questionnaires.html)
+### 2: Structured Five-Minute Mother-Child Play
-Please note that presentation and format will differ in the app.
+##### Instruction to mom:
-#### 12-mo
+“Please sit next to [CHILD]. I’ll give you a set of toys. Please play with [CHILD] however you like.”
-12-mo crawler
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14514/-/asset/61050/download?inline=true")
-```
+##### Procedure:
-12-mo walker
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14514/-/asset/61060/download?inline=true")
-```
+* Record so that the child's and mother's entire bodies and hands are captured.
+* Use timer on camera to time engagement.
+* After 5 minutes, say “Great job!”
-#### 18-mo
+### 3: Questionnaires
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14513/-/asset/61076/download?inline=true")
-```
+* [List of Questions on 12-mo Home Questionnaire](https://nyu.databrary.org/volume/254/slot/15048/-/asset/72494)
-#### 24-mo
+* [List of Questions on 18-mo Home Questionnaire](https://nyu.databrary.org/volume/254/slot/15048/-/asset/72492)
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14514/-/asset/61088/download?inline=true")
-```
+* [List of Questions on 24-mo Home Questionnaire](https://nyu.databrary.org/volume/254/slot/15048/-/asset/72493)
-#### General Questionnaires
+**Please note that presentation and format will differ in the app.**
-Instructions:
+
-*I have some questions for you…*
+
-[Only give introduction to the sections that need introduction (i.e., ECBQ and MB-CDI)].
+#### General Questionnaire
-A GoogleSheet with most of the questions in a database format can be found [here](https://docs.google.com/spreadsheets/d/1pVOM2naRS_STCXx4nkaRDLO6_V5kGhFaduRfwsv7cnI/edit?usp=sharing).
+##### Instructions:
-Procedure:
+“I have some questions for you…” [Only give introduction to the sections that need introduction (i.e., ECBQ and MacArthur)].
-Set up camera to record the questionnaires.
-You'll need to change the battery on the camera to ensure sufficient power.
-Sit next to the mom so she is able to read along.
+##### Procedure:
-- [Toys & Pets](toys_pets.html)
-- [HOME](home.html)
-- [Gender Socialization](gender.html)
-- [Locomotor milestones](locomotor_milestones.html)
-- [ECLS-B Health](eclsb_health.html)
-- [Typical Day](typical_day.html)
-- [Media time & use](media.html)
+- Set up camera to record the questionnaires.
+- You'll need to change the battery on the camera to ensure sufficient power.
+- Sit next to the mom so she is able to read along.
-#### MacArthur-Bates Communicative Development Inventory (MB-CDI)
+#### MacArthur
-MB-CDI should be administered in the primary language of the mom.
-Specific instructions and procedure are included in the questionnaire.
+* MacArthur should be administered in the primary language of the mom.
+* Specific instructions and procedure are included in the questionnaire.
-#### [ECBQ](ecbq.html)
+#### ECBQ
-Read instructions on questionnaire.
-Give mom answer sheet with rating scale.
+* Read instructions on questionnaire.
+* Give mom answer sheet with rating scale.
-### 5: House Walkthrough & Room Measurements
+### 4: House Walkthrough & Room Measurements
#### Video House Walkthrough
-#### 12-mo
+##### Instructions:
-Crawler
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14574/-/asset/61356/download?inline=true")
-```
+“Now, we would like to see the space that [CHILD] gets to explore throughout the day. Please give me a tour of your home as I follow with a camera, and take measurements of the spaces. As we walk around, please show me where you keep any objects — toys, books, sippy cups, anything like that — that [CHILD] might interact with regularly. Please show me where you keep his/her clothes to give us an idea of the kinds of things he/she wears.”
-Walker
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14167/-/asset/59922/download?inline=true")
-```
+##### Procedure (Video):
-#### 18-mo
+1. Experimenter should watch the recording through the camcorder screen to ensure that the view is not blurry or shaky. Move the camera slowly and walk slowly; a clear, steady view is necessary for detailed coding of the home environment.
+2. Ensure during the house walkthrough that the parent provides information on all of the following:
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14513/-/asset/61068/download?inline=true")
-```
+- *Children's Sleeping Arrangements.* If parent does not offer information during walkthrough, say “Please show me where [CHILD] typically sleeps.”
+- *Child’s Clothes.* If parent does not offer information during walkthrough, say “Please show me where you keep [CHILD]’s clothes.”
+- *Child’s Books.* If parent does not offer information during walkthrough, say “Please show me where you keep [CHILD]’s books.”
+- *Child’s Toys.* If parent does not offer information during walkthrough, say “Please show me where you keep [CHILD]’s toys.”
-#### 24-mo
+##### Instructions for House Walkthrough:
-```{r, results='asis'}
-write_video_clip_html("https://nyu.databrary.org/slot/14514/-/asset/61048/download?inline=true")
-```
-
-Instructions:
-
-*Now, we would like to see the space that [CHILD] gets to explore throughout the day. Please give me a tour of your home as I follow with a camera, and take measurements of the spaces.*
-*As we walk around, please mention the things that [CHILD] plays with in each room. Please show me where you keep his/her clothes to give us an idea of the kinds of things he/she wears.*
-
-Procedure (Video):
-
-Pause at the entrance of the room.
-
-Name the room by its function (e.g., “This is where [CHILD] sleeps”).
-First, pan the camera SLOWLY from Left to Right.
-Then, pan the camera to Floor, name the different types of surfaces in the space (hardwood, plush carpet, thin rug, linoleum, tile, etc.), and then pan to the Ceiling.
-Hold the camera in one hand while you take measurements of the room.
-Do NOT turn off the camera when walking to next room.
-Walk SLOWLY.
+1. Pause at the entrance of the room.
+2. Name the room by its function (e.g., “This is where [CHILD] sleeps”).
+3. First, get as much of the ***Entire Room*** in frame as possible. Keep the camera zoomed out and make sure to capture the ceiling and the floor of the room.
+4. Next, pan the camera SLOWLY from ***Left to Right***.
+5. Then, pan the camera to the ***Floor***, name the different types of surfaces in the space (hardwood, plush carpet, thin rug, linoleum, tile, etc.), and then pan to the ***Ceiling***.
+6. Hold the camera in one hand while you take measurements of the room.
+7. If the parent doesn't offer information, ask whether the child spends time in each room: “Does [CHILD] spend any time in this room?”
+8. If the parent doesn't offer information, ask about the child's objects in the room: “Do you keep anything for [CHILD] in this room? (If yes:) Would you mind showing me?”
+9. Experimenter should film the ***Location*** of the storage space (drawer, toy chest, cabinet) in clear context of the rest of the room. Then, SLOWLY and CLEARLY film the ***Contents*** of the storage space to show what is inside of it, zooming in if needed. (Overhead view for bed, crib, drawers, toy chest, etc.; zoomed-in side view for cabinet, closet, bookshelf, etc.)
+10. Do ***NOT*** turn off the camera when walking to the next room.
+11. Walk SLOWLY.
#### Room Measurements with Laser Distance Measurer
-
-
-Measure all rooms in the house.
-Room = any space used by someone on a regular basis, including: bedrooms, kitchens, bathrooms, and basements.
-Do not measure laundry rooms.
-Rooms don’t have to have windows.
-A room has to have a clear demarcation (e.g., a wall or an entry).
-If the room has a short divider (e.g., when a kitchen and a living room are divided by a counter), count as one big room and measure accordingly.
-
-Procedure:
-
-Turn measure on by pressing ON/DIST button.
-Make sure the laser is on.
-Place the base of the laser flat on the wall.
-Avoid moldings and door castings.
-Measure wall to wall, lengthwise and widthwise.
-If a room has an odd or asymmetrical shape (i.e., any shape other than a rectangle or a square), measure the largest rectangle or square area of the room.
-Press ON/DIST again to take measurement.
-Repeat the above for length and width.
-Focus camera on laser measure and read measurements out loud.
-
-### 6: Body Dimensions
+* Measure all rooms in the house.
+* Room = any space used by someone on a regular basis, including bedrooms, kitchens, bathrooms, and basements. Do not measure laundry rooms. Rooms don’t have to have windows.
+* A room has to have a clear demarcation (e.g., a wall or an entry).
+* If the room has a short divider (e.g., when a kitchen and a living room are divided by a counter), count as one big room and measure accordingly.
-[TBD]
+##### Procedure:
-### 7: Visit Wrap-up
+1. Turn the measurer on by pressing the ON/DIST button. Make sure the laser is on.
+2. Place the base of the laser flat on the wall. Avoid moldings, door castings, and reflective surfaces.
+3. Measure wall to wall, lengthwise and widthwise.
+4. If a room has an odd or asymmetrical shape (i.e., any shape other than a rectangle or a square), measure the largest rectangular or square area of the room.
+5. Press ON/DIST again to take the measurement.
+6. Repeat the above for length and width.
+7. Focus the camera on the laser measurer and read the measurements out loud.
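+
+The two wall-to-wall readings translate directly into floor area, which can be sanity-checked in the site's R workflow. A minimal sketch (the `room_area` helper is illustrative only, not part of the protocol; for odd-shaped rooms, pass the sides of the largest rectangle measured above):
+
+```r
+# Illustrative only: compute floor area (square meters) from the two
+# wall-to-wall laser readings. For odd-shaped rooms, use the sides of
+# the largest rectangular area, per the step above.
+room_area <- function(length_m, width_m) {
+  stopifnot(length_m > 0, width_m > 0)
+  length_m * width_m
+}
+
+room_area(4.2, 3.5)  # 14.7 square meters
+```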
-Complete home measurement, housing checklist sections of the Home Questionnaire.
-When you arrive back at the lab, wash all toys and equipment thoroughly.
-Wipe down yoga mat.
-Rinse nesting cups in bleach-water.
-Do not submerge shape sorter in water (or it will stop making noise).
-### 8: Visit post-processing
+### 5: Visit wrap-up
-Export questionnaire data from tablet.
-Upload videos, questionnaires, and house decibel data to Databrary.
+* Complete housing checklist and clean-up notes sections of the questionnaire battery.
+* When you arrive back at the lab, wash all toys and equipment thoroughly. Wipe down yoga mat. Rinse dish set in bleach-water.
+* Check over all questionnaire responses and hit “Submit”.
+* Upload videos and decibel data to Databrary.
diff --git a/site-info.Rmd b/site-info.Rmd
index 839e544..44be2fb 100644
--- a/site-info.Rmd
+++ b/site-info.Rmd
@@ -4,7 +4,7 @@ title: "About the site"
-This site is generated in [R Markdown](http://rmarkdown.rstudio.com/) using [RStudio](http://rstudio.com), version `r rstudioapi::versionInfo()$version`.
+This site is hosted on [GitHub](http://github.com/PLAY-behaviorome/) and is generated in [R Markdown](http://rmarkdown.rstudio.com/) using [RStudio](http://rstudio.com), version `r rstudioapi::versionInfo()$version`.
For information about using R Markdown to generate websites, see .
For formatting information, see .
diff --git a/transcription.Rmd b/transcription.Rmd
index 6ba1fa1..348237d 100644
--- a/transcription.Rmd
+++ b/transcription.Rmd
@@ -55,229 +55,66 @@ If the content of the utterance can be heard clearly by the coder, then type tra
``: Code 'b' if the baby is the source of the utterance. This code will be filled in automatically using quick keys.
``
+
transcribe utterance
-Type the complete utterance. Type everything in lower case, except for proper names (e.g., Mommy, I, Cheerios, Anna). Use apostrophes correctly for contractions and possessives (e.g., don't, where's, Daddy's, Lily's). Do not use “,” commas.
-
-Transcription: “snowmans dont drink coffee”
-
-
-
-
-
-
-Transcription: “un caballo”
-
-
-
-
-
-
-Transcription: “momma”
-
-
-
-
-
-
-Transcription: “woof woof”
-
-
-
-
-
-
-Put a question mark “?” at the end of any utterance that is a question.
-
-Transcription: “want cheerios?”
-
-
-
-
-
-
-Individual letters (e.g., mom spells out zoo as “z” “o” “o”) need to be marked with an @ (at symbol) so that they're not confused with actual words, for example z@ o@ o@.
-Use existing rules for utterances to decide if each letter is it's own utterance.
-
-Any utterance that is unintelligible or hard to decipher, code as “xxx”. This could be the full utterance: for example, the mom says multiple words but they are all unintelligible, so the entire code is “xxx”.
-Or part of the utterance is intelligible, but part is not: for example, the mom says “give me” and what she says to give is unintelligible, so code “give me xxx”.
-
-Transcription: “xxx”
-
-