In order for public school educators to combat the inequalities that continue to
persist, close the achievement gap, and meet federal and state accountability requirements, they must continue to immerse themselves in organizational learning. One way educators can do this is by using protocols to facilitate professional discussions of teacher practices and the resulting student data. The basic purpose of this study was to examine and describe how UES made changes during the school reform process that benefited students and how the school has been promoted throughout the school division and the state for its use of DDDM protocols to facilitate school reform. From 2012 to 2015, UES was recognized by the state and the school division for its overall performance and achievement during its school reform process. UES earned two of the highest designations that could be bestowed upon schools during that time: the school division designated it a School of Excellence, and, upon visiting UES, the governor of the state designated it a school to watch for exceeding the standards for ELLs. These recognitions created a buzz within the school division, suggesting that UES could serve as a model for other schools struggling with similar processes of school reform. The excitement generated by the recognitions from the state and division compelled the leaders of many schools in the division to adopt some variation of the UES model.
Chapter three describes the methodology for this single-case embedded common qualitative case study (Yin, 2014) of the DDDM protocols UES used as a tool to help the organization learn during the school reform process. It presents the research questions, an overview of the research process, and a description of and justification for the site selection. It then details the process used to collect and analyze the data, including the units of analysis, the procedures, and authenticity and trustworthiness. Last, the chapter briefly reviews potential researcher bias and the ethics of this research, the IRB approval process, and confidentiality.
Research Questions
Three research questions framed the design of this case study (Creswell, 2013;
Miles & Huberman, 1994; Yin, 2014). The overarching research question was: How did protocol-structured discussions contribute to organizational learning at UES between 2012 and 2015? Further, this study was guided by the following related questions:
1. How did protocol-structured discussions affect the school culture at UES between 2012 and 2015?
2. What aspects of protocol-structured discussions transferred to teacher practices at UES between 2012 and 2015?
Overview of the Research Process
The research at UES was guided by the use of a single-case embedded common
qualitative case study (Creswell, 2009, 2013; Merriam, 2009; Yin, 2014). The focus was on DDDM protocols as a tool to help organizations learn during the school reform process, particularly as they pertain to teacher collaboration on planning instructional unit guides and looking at student work. UES used protocol-structured discussions as an essential part of its school reform process. To examine the research questions, I conducted interviews with the former principal, the current principal, and the division school improvement planning coordinator; focus group interviews with the school improvement team and with two CLTs at the school; and document reviews and observations of CLT meetings. The observations allowed me to note the protocols being used at UES and to describe and understand how their use promoted organizational learning. The study also examined how each of the collaborative teams at UES reviewed student work and fostered interdisciplinary learning. Together, these data sources provided insight into the school's performance on the state assessments in reading and mathematics. Finally, to address issues of researcher bias, I maintained a reflexivity journal.
The Case Study Site: Description and Justification for Selection
The study took place at a fully accredited public elementary school. This school
was chosen because it was recognized by the state and the school division for its overall performance and achievement from 2012 to 2015. I first learned about the success story of UES at a monthly principal meeting in Joshua County Public Schools (JCPS; a pseudonym), where schools were recognized for their results on the state standardized assessments. The practice of recognizing schools for their performance on these assessments put Title I principals at an unfair disadvantage because most of the recognitions were bestowed upon non-Title I schools, perhaps discouraging principals who might otherwise want to lead Title I schools. UES was one of the few Title I schools recognized that day. One way to address this discouragement could be the use of various protocols to facilitate the meetings.
From 2012 to 2015, there were five schools in JCPS participating in some aspect
of the state school improvement planning (SIP) process. UES was the only one of the five to exit the state SIP process. In the 2014-2015 school year, UES had the highest mathematics performance in the entire school division on the Grade 3 through 5 standards of learning (SOL) tests. Comparable student performance data from 2012 to 2015 show that UES rose faster than the other four schools in the state SIP cohort (Appendix W). When reviewing the recognitions and designations, it became apparent that leaders of other schools might be looking into how to make this kind of progress in such a short period of time. In addition to the excitement generated by the success of UES, members of the school leadership team have presented their school reform practices at several local and state conferences. Because of the accolades bestowed upon UES, it was chosen as the site for this single-case embedded case study.
UES has a Head Start program, a Pre-school Initiative (PI) program, and kindergarten through fifth grades. It is located in a suburban and transient school division in the mid-Atlantic region of the United States and enrolls approximately 700 students. It is classified as a Title I school because 50% or more of the student body receives free or reduced-price lunch. Students represent different ethnicities and socioeconomic levels. The student population consists of 77% Hispanic students of any race, 10% African Americans, approximately 4% Asians, 7% Whites, and 3% students who identify as two or more races. Approximately 90% of the student population qualifies for free or reduced-price lunch. In addition, 66% of the student population is classified as limited English proficient, 6% are identified as gifted, and 13% are classified as special education. Last, 100% of the student population is classified as Title I because UES uses a school-wide Title I model instead of a targeted assistance model.
In a school-wide model, all students are classified as Title I and therefore can receive remedial services from the Title I teachers in reading and mathematics. In a targeted assistance Title I program, only students who are identified receive services from the Title I teachers. The school-wide Title I model allows for greater flexibility. The school mobility rate at UES is 31%. The average daily attendance is above 95%, with approximately 21% of students absent more than 10 days.
Creswell (2013) suggested selecting cases for a study through purposeful
sampling to discover and understand a phenomenon (Merriam, 2009; Patton, 2002; Yin, 2014). For the purpose of this study, rather than using purposeful sampling, I selected the site because of the buzz it created in the school division and the accolades it received, including recognition from the governor for meeting the needs of ELL students and a School of Excellence award for two consecutive years. Similar to purposeful sampling, this selection strategy allowed me to pick an information-rich case that would provide the necessary data for in-depth discussion and understanding of the phenomenon that occurred from 2012 through 2015 (Patton, 2002). This strategy was appropriate for the current study because UES was one of three schools in JCPS participating in the school reform process and was the only school to exit that process during 2012 through 2015. The research was limited to one school division, which was selected prior to the selection of the school. UES had not met the federal AMO targets for several years prior to the appointment of the new principal. During the period of 2012 to 2015, UES was the only school participating in the school reform process that consistently engaged staff in organizational learning through the use of protocols.
UES is known within the school division as a pioneer in leading the school reform
process through the use of protocol-structured discussions. The work at UES has become a model for other schools in the school division. Members of the school faculty have presented at several local and state conferences and have been recognized by high-ranking political officials within the state and local governments. As an elementary school principal, I have experience engaging my staff in protocol-structured discussions to look at student data and plan instructional unit guides; this work began many years ago when my own school was in the school reform process. Because of that experience, I was intrigued by the work being done at UES, as I employed a similar approach when I led a school reform process almost a decade ago using systems and processes for DDDM, RtI, and MTSS.
Research Participants
Participants in the study at UES included the former principal, current principal,
current assistant principal, Title I supervisor, former Title I professional development specialist, members of the third grade and fourth grade CLTs, and several ESOL, SPED, reading, and math specialists. The current principal of UES, Lindsay, took over for Howard at the beginning of the 2015-2016 school year. She has 17 years of experience in education, 13 of which were at UES. In addition, several members of the SIP team who were instrumental in the implementation of protocol-structured discussions at UES from 2012 to 2015 participated in the study. These participants’ experience in education ranged from 1 year to over 35 years. Table 2 provides further detail on each participant in the study.
Units of Analysis
Units of analysis are the people, programs, events, occurrences, or incidents to be studied in case study research (Patton, 2002). Table 3 aligns the research questions with the units of analysis. The main unit of analysis was UES, and the embedded units of analysis were the third and fourth grade CLTs. The overarching research question was designed to examine how protocol-structured discussions contributed to organizational learning at UES between 2012 and 2015; for this question, the embedded units of analysis were the third grade CLT, the fourth grade CLT, school-based leaders, and school division supervisors. The second and third research questions were designed to explore how protocol-structured discussions affected the culture at UES and which aspects of protocol-structured discussions transferred to teachers' classroom practice between 2012 and 2015. For these two questions, the CLTs, school-based leaders, and school division supervisors served as the units of analysis.
Research Design
This study was conducted using a qualitative case study approach (Creswell, 2009; Merriam, 2009; Yin, 2014). Creswell (2013) characterized a case study as a "qualitative approach in which the investigator explores a real-life, contemporary bounded system (a case) or multiple bounded systems (cases) over time, through detailed, in-depth data collection involving multiple sources of information" (p. 97). Case study research allows a researcher to investigate a current phenomenon in a real-life context (Yin, 2014). According to Yin (2014), there are four types of case study designs a researcher can utilize to conduct a qualitative study: (a) single-case (holistic), (b) single-case (embedded), (c) multiple-case (holistic), and (d) multiple-case (embedded). Holistic case studies have a single unit of analysis, whereas embedded case studies have multiple units of analysis (Yin, 2014). Yin posited five rationales for conducting a single-case study: critical, unusual, common, revelatory, and longitudinal (p. 51). According to Yin, "This means the need for a decision, prior to any data collection, on whether you are going to use a single case or multiple cases in your study" (p. 51).
A single-case common case study is one in which "the objective is to capture the circumstances and conditions of an everyday situation" (Yin, 2014, p. 52). This single-case embedded common qualitative case study (Yin, 2014) at UES was designed to find common themes that affected organizational learning. This case met the criteria for a single-case embedded common case study because the research questions were about one school during a specific period of time; however, I also looked closely at several CLTs within UES. The third and fourth grade CLTs served as the embedded units of analysis. A single-case embedded case study is especially valuable when a researcher seeks to represent a critical test of a significant theory and to document the lessons it might provide about social processes related to some theoretical interest (Yin, 2014). Last, a single-case embedded common case study is important because, unlike in multiple-case studies where more than one case is studied, there is minimal possibility of the data collected being diluted.
A challenge of a single-case embedded case study is that the analysis may focus on one subunit level and fail to return to the larger unit of analysis (Yin, 2014). In addition, getting the scope of the study correct is a challenge because a study may be too broad or too narrow (Creswell, 2013). When a topic is too narrow, the researcher must decide which bounded system to study (p. 101); for a broad topic, a researcher might use a multiple-case study. However, according to Creswell (2013), "The study of more than one case dilutes the overall analysis; the more cases an individual studies, the less the depth in any single case" (p. 101). The single-case embedded common case study methodology was chosen for this research because of its importance in establishing the meaning of the phenomenon at UES and because of its reliance on observable occurrences associated with the use of protocol-structured discussions from the viewpoints of the participants (Creswell, 2013; Yin, 2014).
The focus of the research at UES was on the systems and processes of individuals
and groups as they engaged in organizational learning through the use of DDDM protocols to facilitate school change. According to Merriam (2009), case study is suitable when there are many potential variables available to understand a phenomenon. In this study, there were a number of potential variables to understand the phenomenon of school change, including the leadership strategies used by division and school-based leaders, organizational structures, group processes, teacher practices, and student test scores.
The concept map in Figure 6 provides an overview of the methods and sources
used to collect the data for this research study. The data from the individual and focus group interviews with the third grade, fourth grade, and SIP teams provided further insight into the protocol-structured discussions at UES. In addition to the interviews, I conducted observations of the third and fourth grade CLTs and analyzed several documents to find common themes that contributed to the success of UES on the state assessments in reading and mathematics over 3 years. The findings of this study add to the body of research on protocol-structured discussions as a tool in organizational learning, specifically in the school reform process.
Data Collection Methods
During case study research, the data collected should come from a variety of
sources (Yin, 2014) and must be detailed and in-depth (Creswell, 2013). For this research study, I used a combination of interviews (individual and focus group), observations, and document reviews to collect the data, and I maintained a reflexivity journal. I analyzed documents describing programmatic offerings and reviewed the protocols CLTs employed at UES to engage in organizational learning, specifically protocols for improving interdisciplinary learning and for looking at student work. In addition, I conducted two observations of the third and fourth grade CLTs. The data collected from the interviews, observations, and document reviews were triangulated to create themes. The themes that emerged from the data served as the basis for the findings of the research and provided insight into areas for further research.
Interviews
The primary method of data collection was a series of interviews (Marshall &
Rossman, 2011; Merriam, 2009; Yin, 2014) with the leadership team, school-based leaders, teachers, the division SIP coordinator, and CLTs to gauge their perspectives on how protocol-structured discussions transformed the school reform process at UES and to provide insight into their effects on organizational learning. Interviews were essential in this case study because they provided rich descriptions of the significant events that happened at UES.
Interview questions were developed based on the research topic. All interviews were conducted in a semi-structured, open-ended format and lasted, on average, 35 to 45 minutes. I developed several guiding theme questions for the interviews with the former and current principals. The same process was followed for the interview questions asked of the division SIP coordinator, the third and fourth grade CLTs, and the school-based SIP team.
Patton (2002) outlined six kinds of interview questions a researcher can ask of
people when conducting interviews or focus groups: (a) experience and behavior, (b) opinion and value, (c) feeling, (d) knowledge, (e) sensory, and (f) background and demographic. Table 4 provides a brief synopsis of each question type and shows the relationship between Patton's six types of questions and the interview protocols used for the individual and focus group interviews.
Table 4 (continued)
Relationship Between Patton's Six Types of Questions and the Interview Protocol

Background. These types of questions provide standard background information and help the interviewer understand the interviewee in relation to other people.
Former principal: How did the use of protocols influence school culture? What evidence do you have to justify the influence on school culture?
Current principal: What changes, if any, have you made at UES pertaining to the use of protocol-structured discussions?
Division SIP coordinator: How has the principal of UES made the use of protocol-structured discussions an integral part of the school reform process? What supports are provided to schools during the school reform process? Are the supports the same at all schools?
Prior to beginning the research at UES, I tested the interview questions with
another administrator and a few teachers in the same division as UES; however, the pilot test did not include any teachers from the research site. The interview protocols for the former principal (Appendix A) and current principal (Appendix B) were as follows.
The questions for the former principal included:
1. Why did you decide to use protocol-structured discussions as a school reform strategy at UES?
2. How did the use of protocols influence school culture? What evidence do you have to justify the influence on school culture?
3. How has the use of protocols changed learning for teachers, students, and administrators at UES?
4. How would you describe the protocol UES uses to engage teachers in the planning of instructional units?
5. How would you describe the protocol UES uses to engage teachers in DDDM?
6. How does teachers' thinking change after they engage in protocol-structured discussions?
7. What aspects of protocol-structured discussions are transferred to teachers' classroom practice?
8. Is there anything else you would like to add?
The questions for the current principal included:
1. What changes, if any, have you made at UES pertaining to the use of protocol-structured discussions?
2. How has the use of protocols changed learning for teachers, students, and administrators at UES?
3. How would you describe the protocol your school uses to engage teachers in the planning of instructional units?
4. How would you describe the protocol your school uses to engage teachers in DDDM?
5. How does teachers' thinking change after they engage in protocol-structured discussions?
6. What aspects of protocol-structured discussions are transferred to teachers' classroom practice?
7. Is there anything else you would like to add?
The interview protocol for the division SIP coordinator (Appendix C) was as follows:
1. What supports are provided to schools during the school reform process? Are the supports the same at all schools?
2. UES received several recognitions by the state and division during 2012 to 2015. From a central office perspective, what do you think contributed to these recognitions?
3. How has the principal of UES made the use of protocol-structured discussions an integral part of the school reform process?
4. How would you describe the data and planning protocols UES uses to engage teachers in DDDM?
5. How does teachers' thinking change after they engage in protocol-structured discussions?
6. What aspects of protocol-structured discussions are transferred to teachers' classroom practice?
7. Is there anything else you would like to add?
One of the shortcomings of interviews in qualitative research pertains to trust
between the interviewer and interviewee. If trust is not established, the interviewee may not feel comfortable sharing information freely, which prevents the interviewer from collecting the data needed to conduct the study. Another shortcoming is that the interviewer might not ask the appropriate follow-up, elaborating questions to gather additional data (Marshall & Rossman, 2011, p. 145). Marshall and Rossman (2011) referred to such follow-up questions as "probes" and noted that conducting interviews is valuable because it conveys that the participants' viewpoints are useful. To address the shortcomings of interviews in qualitative research, I used multiple data sources to triangulate the information. I also used member checking (Creswell & Miller, 2000) to ensure validity, providing each participant in the study an opportunity to review the final interview transcripts before I triangulated the data collected at UES. Lincoln and Guba (1985) called member checking "the most crucial technique for establishing credibility" (p. 314). It improves the validity of the research because it solicits participants' views of the credibility of the findings and interpretations (Merriam, 1998).
Focus Groups
Focus group interviews with the CLTs and SIP team were another primary source
of data. Focus group interviews are commonly described as interviews with a group of people who have knowledge about a specific topic (Merriam, 2009). Conducting focus groups yields data in quantity quickly (Marshall & Rossman, 2011). The focus group format also allowed the participants to hear each other’s views (Patton, 2002) and provided opportunities for discourse as I collected descriptive data on the use of protocolstructured discussions at UES during the school reform process. In addition, the focus group interviews enhanced the data quality by affording participants the opportunity to provide checks and balances on each other as they engaged in a discussion (Patton, 2002), essentially “weeding out false or extreme views” (p. 386). Some of the limitations of focus group interviews include that the total number of questions that can be asked is restricted in the group setting, response time is restricted, confidentiality cannot be assured, and the format can prevent those with minority viewpoints from speaking freely (Patton, 2002). I worked collaboratively with the school principal to select participants
for the focus group interviews based on the criteria set forth in the research. Participants were contacted by email (Appendix S). Prior to conducting the focus group interviews, I read the focus group introduction, prompting, and closing protocol to each focus group (Appendix T). Each focus group interview lasted approximately 45 minutes to 1 hour. The total number of participants for the focus group interviews was determined by the composition of the CLTs. Table 5 outlines Patton's (2002) question types and the focus group interview protocol used to conduct the research at UES.

Table 5
Relationship Between Patton's Six Types of Questions and the Focus Group Protocol

Behaviors/experiences. These types of questions provide information about a person's experiences, actions, and activities in a manner the observer would see if present. Focus group question: What aspects of protocol-structured discussions are transferred to teachers' classroom practice?
Opinions/values. These types of questions provide information in a cognitive and interpretive manner about opinions, judgments, and values as opposed to actions and behaviors; they help the interviewer see what the interviewee is thinking. Focus group questions: How has the use of protocols changed learning for teachers, students, and administrators at UES? How does teachers' thinking change after they engage in protocol-structured discussions?
Knowledge. These types of questions provide information about a person's factual knowledge; they help the interviewer see what the interviewee knows. Focus group questions: What protocols do you use to plan instructional units? Why is this protocol important? How would you describe the protocol your school uses to engage teachers in DDDM? What aspects of protocol-structured discussions are transferred to teachers' classroom practice?
Background. These types of questions provide standard background information and help the interviewer understand the interviewee in relation to other people. Focus group questions: What is the structure of your CLT meetings? Who is included on the specific CLT?
According to Merriam (2009), having a facilitator who has experience with a
topic matters when conducting focus group interviews because such experience allows the interviewer to ask follow-up questions based on interviewees' answers and thus collect more in-depth data. Experience with the subject matter can also minimize subtle influences between the interviewer and interviewee; Yin (2014) described this dynamic as reflexivity, where the interviewer's perspective unknowingly influences the interviewee's responses (p. 112). My role as an elementary principal with experience in school reform processes at the state and local levels, as well as my understanding of DDDM and the use of protocols to facilitate organizational learning, made me well equipped to conduct the focus group interviews. Upon collecting the data, I looked for themes regarding the use of DDDM protocols at UES and triangulated them with the themes that emerged from the other data sources. One strength of focus groups is that they allow participants to speak freely (Marshall & Rossman, 2011, p. 145), which can surface views that contrast with those expressed in the individual interviews. The interview protocol for the focus groups (CLTs and SIP team) was as follows (Appendix D):
1. What is the structure of your CLT meetings? Who is included on the specific CLTs?
2. What protocols do you use to plan instructional units? Why is this protocol important?
3. How has the use of protocols changed learning for teachers, students, and administrators at UES?
4. How would you describe the protocol your school uses to engage teachers in DDDM?
5. How does teachers' thinking change after they engage in protocol-structured discussions?
6. What aspects of protocol-structured discussions are transferred to teachers' classroom practice?
7. Is there anything else you would like to add?
Document Reviews
Throughout this study, I collected and analyzed documents and artifacts
pertaining to UES (Appendix V). Document review includes an analysis of documents produced in the course of everyday events (Marshall & Rossman, 2011). One of the key advantages of using document reviews is that it affords researchers the opportunity to collect data without causing a major interruption to the learning environment. One weakness of document review lies in the interpretation of the written materials (Marshall & Rossman, 2011) because the researcher determines where the emphasis lies after the data have been collected (pp. 161-162).
Documents related to the topic provided important data and served a critical role
(Yin, 2014) in the data collection for this case study. Patton (2002) described this process as getting a behind-the-scenes look. Some of the contextual documents I collected during the study were state data profiles, school division profiles, the UES SIP plan, programmatic offerings, and the various protocols utilized by the CLTs.
Document reviews provided another data source to help triangulate what was said
in the individual interviews and focus groups. Marshall and Rossman (2011) described the process of triangulation as compensation for the weakness of another research approach. Merriam (1998) posited that it is essential for the researcher to determine the authenticity as well as origins when working with documents. In addition, purpose, accuracy, author, and context are essential to determining the quality of a document (Merriam, 1998). I used Merriam’s Document Analysis Authenticity Protocol (Appendix E) and the Comparative Analysis of Protocols for Improving Professional Practice and Protocols for Looking at Student Work (Table 2; Appendix F) to analyze the authenticity of the documents collected at UES. The same protocols were utilized to determine the authenticity of any related documents outside of UES. The document review protocol included the following questions:
1. What is the history of the document?
2. How did it come into my hands?
3. What guarantee is there that it is what it pretends to be?
4. Is the document complete, as originally constructed?
5. Has it been tampered with or edited?
6. If the document is genuine, under what circumstances and for what purposes was it produced?
7. Who was/is the author?
8. What was (s)he trying to accomplish? For whom was the document intended?
9. What were the maker’s sources of information? Does the document represent an eyewitness account, a secondhand account, a reconstruction of an event long prior to the writing, an interpretation?
10. What was or is the maker’s bias?
11. To what extent was the writer likely to want to tell the truth?
12. Do other documents exist that might shed additional light on the same story, event, project, program, context? If so, are they available, accessible? Who holds them? (pp. 112-122)
In addition to Merriam’s (1998) Document Analysis Authenticity Protocol, I used
Johnson and Christensen’s (2012) protocol entitled, “How to Judge the Quality of Internet Resources,” to determine the validity of the document and the information provided in all web-based documents. The components of Johnson and Christensen’s protocol are: (a) authority, or the author’s role and credentials; (b) accuracy, or the name of the institution that publishes the page along with contact information; (c) objectivity, or accurate and objective information with little or no advertising; (d) currency, which indicates the webpage and links are updated regularly; and (e) coverage, or the ability to view the information on the webpage without paying fees or installing additional software (p. 70).
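The five components of Johnson and Christensen's checklist lend themselves to a simple structured record. The sketch below is purely illustrative: the field names, the scoring rule, and the example source are my own framing of the checklist, not part of the published protocol. A source is flagged for follow-up if any criterion is unmet.

```python
from dataclasses import dataclass, fields

@dataclass
class WebSourceCheck:
    """One web-based document scored against the five quality
    criteria summarized from Johnson & Christensen (2012).
    True means the criterion is satisfied."""
    authority: bool    # author's role and credentials are stated
    accuracy: bool     # publishing institution and contact info given
    objectivity: bool  # accurate information, little or no advertising
    currency: bool     # page and links updated regularly
    coverage: bool     # viewable without fees or extra software

    def unmet(self):
        """Return the names of any criteria the source fails."""
        return [f.name for f in fields(self) if not getattr(self, f.name)]

# Hypothetical example: a state data profile page missing contact info.
profile = WebSourceCheck(authority=True, accuracy=False,
                         objectivity=True, currency=True, coverage=True)
print(profile.unmet())  # → ['accuracy']
```

A record like this makes the review auditable: each document's pass/fail pattern can be stored alongside the document itself.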
The information gathered from the document reviews was categorized based on
content and later utilized in triangulation of the data collected from the interviews with the principal, teachers, and other school-based leaders. Appendix G provides further detail on Johnson and Christensen’s (2012) website analysis protocol. Table 6 provides a comparative analysis of the various protocols I analyzed at UES (See Appendices H, I, and J).
Observations and the Reflexivity Journal

Observation is central to qualitative research (Marshall & Rossman, 2011).
Observations entail the systematic noting and recording of events, behaviors, and artifacts in the setting where the research is being conducted (Marshall & Rossman, 2011). While conducting the observations of the third and fourth grade CLTs, I documented my notes in a specific reflexivity format (See Appendix K). I focused on the interactions of the teachers with each other during the planning of a unit guide and the disaggregation of a given assessment. In addition to the focus on teacher interactions, I documented how the CLTs made use of certain protocols during the planning and DDDM processes. In order to remain non-judgmental, I maintained a reflexivity journal throughout the entire study. The reflexivity journal provided me a means of emphasizing the importance of self-awareness, political/cultural consciousness, and ownership of one’s perspective (Patton, 2002, p. 64). Patton (2002) asserted that “the qualitative analyst owns and is reflective about her or his own voice and perspective” (p. 494).
A researcher with a credible voice conveys to the participants authenticity and
trustworthiness (Patton, 2002). In order to accomplish this, I used the Field Notes Reflexivity Protocol (Patton, 2002). I focused on self-reflexivity, reflexivity about participants, and reflexivity about audience. While conducting this qualitative research study, it was important for me to continue to reflect on my own practice to observe and eliminate biases, values, and experiences (Creswell, 2013) to ensure the data collection process remained objective. Figure 7 and Appendix L contain a sample of the Field Notes Reflexivity Protocol I used during the study.
fourth grade, and SIP team, I conducted three observations at UES. The first observation was with the fourth grade CLT as they planned math unit 9. The second observation was with the third grade CLT as they engaged in a DDDM process to review student assessment results from math unit 6. My last observation was with the fourth grade CLT team. During this observation, I was able to see how the CLT disaggregated data from the math unit 9 they had planned during my first observation. All observations, interviews, and focus groups were conducted during the months of March and June of 2016. Table 7 provides an overview of the data sources and participant information.

Table 7

Overview of Data Sources and Participant Information
Data Sources      Position               Pseudonym   Embedded Case #
Observation #1    Principal              Lindsay     001
                  4th Grade Teacher      Chloe       001
                  4th Grade Teacher      Ana         001
                  4th Grade Teacher      Regina      001
                  4th Grade Teacher      Absent      001
                  LD Teacher             Kendall     001
                  ESOL Teacher           Juliana     001
                  Math Specialist        Tracy       001
Observation #2    Assistant Principal    Lauren      002
                  3rd Grade Teacher      Mitzy       002
                  3rd Grade Teacher      Louise      002
                  3rd Grade Teacher      Gladys      002
                  3rd Grade Teacher      Alyssa      002
                  3rd Grade Teacher      Katie       002
                  ESOL Teacher           Heather     002
                  ESOL Teacher           Jean        002
                  Math Specialist        Tracy       002
(continued)
Table 7 (continued)
Overview of Data Sources and Participant Information
Data Sources      Position               Pseudonym   Embedded Case #
Interview #1      SIP Coordinator        Jessica     003
Interview #2      Former Principal       Howard      004
Interview #3      Current Principal      Lindsay     005
Focus Group #1    4th Grade Teacher      Chloe       006
                  4th Grade Teacher      Ana         006
                  4th Grade Teacher      Absent      006
                  4th Grade Teacher      Absent      006
                  ESOL Teacher           Juliana     006
Focus Group #2    3rd Grade Teacher      Mitzy       007
                  3rd Grade Teacher      Absent      007
                  3rd Grade Teacher      Gladys      007
                  3rd Grade Teacher      Absent      007
                  3rd Grade Teacher      Katie       007
                  Math Specialist        Tracy       007
Focus Group #3    ESOL Teacher           Catherine   008
                  Math Specialist        Tracy       008
                  Reading Specialist     Joanne      008
                  LD Specialist          Kendall     008
                  Former Title 1 Specialist/Assistant Principal   Brittany   008
Observation #3    Principal              Absent      009
                  4th Grade Teacher      Chloe       009
                  4th Grade Teacher      Absent      009
                  4th Grade Teacher      Regina      009
                  4th Grade Teacher      Alexis      009
                  LD Teacher             Kendall     009
                  ESOL Teacher           Juliana     009
                  Math Specialist        Absent      009
Data Analysis Procedures
Merriam (1998) defined data analysis as “the process of making sense out of the
data . . . it involves consolidating, reducing, and interpreting what people have said and what the researcher has seen and read - it is the process of making meaning” (p. 178). Similarly, Yin (2014) defined data analysis as “consisting of examining, categorizing, tabulating, testing or otherwise recombining evidence, to produce empirically based findings” (p. 133). Qualitative analysis transforms data into findings (Patton, 2002). Table 8 outlines the relationship between the three research questions of the study and the data sources collected and analyzed to depict the phenomenon at UES.
UES served as the main unit of analysis, while the third and fourth grade CLTs
served as embedded units of analysis. The data from the state and local data profiles as well as the interviews with the SIP team, former principal, current principal, and the division SIP coordinator were used to define the main unit of analysis. Data from the focus group interviews, document reviews, and the observations of the CLTs were collected from the embedded units of analysis. Upon collecting the data from the main unit of analysis and the embedded units of analysis, I began the data analysis process.
According to Patton (2002), “The challenge in qualitative analysis lies in making
sense of massive amounts of data” (p. 432). In order to make sense of the data collected at UES, I needed to reduce the volume of the raw data by distinguishing what was trivial from what was significant, identify patterns, and develop a framework for communicating the meaning of the data (Patton, 2002). All individual and focus group interviews were digitally transcribed. The information gleaned from the individual interviews, focus groups, and any pertinent information from the CLT observation notes and document reviews was labeled and entered into NVivo 11 Pro. I created five internal folders in NVivo 11 Pro to store the appropriate data transcripts: (a) Document Reviews, (b) Focus Groups, (c) Interviews, (d) Observation Notes, and (e) Observation Transcripts. Once I received each transcription from the transcriber, I immediately stored it in the appropriate internal folder in NVivo 11 Pro. This tool assisted in determining whether any meaningful patterns were emerging and kept the data organized, allowing for easier manipulation to establish patterns and themes.
After transcribing the individual and focus group interviews, I used a process of
thematic emergence to identify themes in the data collected. Yin (2014) defined thematic
emergence as “searching [for] patterns, insights, or concepts that seem promising” (p. 135). In this research study, I used the constant comparative analysis method (Merriam, 1998, 2009) to determine themes in the data collected at UES. Constant comparison is commonly defined as a process whereby a researcher decides on the type of data to be collected, assigns categories, finds relationships between the categories, and then compares the findings to the data collected (Merriam, 1998).
I started the process of thematic emergence by coding the data in distinct
categories: (a) third and fourth grade CLTs, (b) principal and former principal, (c) SIP team, and (d) SIP coordinator. Thereafter, I began highlighting key observables in the data and creating nodes in NVivo 11 Pro. During this process, some key information stood out immediately; Patton (2002) described this as interocular significance, or important information that hits the researcher in the face while analyzing data. I initially created 22 different nodes (Appendix U) and then combined the nodes as several patterns began to develop, using a Word document to categorize all 22 nodes into four primary nodes. These primary nodes set the stage for the themes that derived from my research. Several other themes developed from the data pertaining to organizational learning. These themes were not directly referenced by participants; however, they surfaced indirectly in the participants’ responses to the research questions during triangulation. Some of the key observables were assigned to multiple nodes where applicable. Table 9 provides a detailed description of the four major nodes and the number of times they were referenced in the different data sources. I also utilized the research questions to establish the final findings discussed in Chapter 4.
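The mechanics of collapsing the initial nodes into primary nodes and tallying references can be sketched as a simple roll-up. The node names, mapping, and counts below are hypothetical placeholders, not the study's actual 22 codes; the sketch only illustrates how coded passages from multiple data sources roll up into primary nodes, with a single observable allowed to count toward more than one node.

```python
from collections import Counter

# Hypothetical mapping of initial nodes to primary nodes
# (placeholder names -- not the study's actual codes).
node_to_primary = {
    "unit planning": "CLT collaboration",
    "data disaggregation": "DDDM processes",
    "protocol use": "DDDM processes",
    "principal support": "Leadership",
}

# Each coded passage: (initial nodes it was assigned to, data source).
# A single observable may carry several nodes, as in the study.
coded_passages = [
    (["unit planning", "protocol use"], "observation"),
    (["data disaggregation"], "focus group"),
    (["principal support"], "interview"),
    (["protocol use"], "interview"),
]

# Tally how often each primary node is referenced across sources.
references = Counter()
for nodes, _source in coded_passages:
    for node in nodes:
        references[node_to_primary[node]] += 1

print(references)  # counts per primary node, analogous to Table 9
```

The resulting counts play the role of the reference totals reported per major node; in the study itself this bookkeeping was done in NVivo 11 Pro rather than by hand.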
school-based leaders used to engage in interdisciplinary learning and look at student work. All document reviews were linked to the node pertaining to the overall organizational structure of UES. The interviews and focus groups helped me understand the initial school change process. Observations of teacher meetings helped me to learn more about the sustainability aspect of school change. More specifically, I was able to see how protocol-structured discussions shaped teacher thinking and identify which aspects of those protocol-structured discussions carried over to classroom practices.
In Tables 10 through 13, I map the three research questions of the study to the
individual and focus group interview questions. As can be seen from the tables provided, the data collected from these interviews explicitly addressed each research question.
in qualitative research evolved from experimental sciences (Marshall & Rossman, 2011). In a qualitative research study, the main criteria for judging the soundness of the research are reliability, validity, objectivity, and generalizability (Creswell, 2013; Marshall & Rossman, 2011). To establish the soundness of qualitative research, Guba and Lincoln (2000) suggested four constructs: (a) credibility or truth-value, (b) dependability or consistency, (c) confirmability or neutrality, and (d) transferability or applicability. Similarly, Yin (2014) provided four criteria for judging the quality of qualitative research designs: (a) construct validity, (b) internal validity, (c) external validity, and (d) reliability (p. 45). While the work of Guba and Lincoln (2000) is well known in the field of qualitative research, I used Yin’s (2014) model for judging the quality of qualitative research as the basis for the current study. During this
study, I focused on triangulation of the data sources and the use of member checking to strengthen the credibility of the research. In addition to triangulation and member checking, I utilized my reflexivity journal to conduct ongoing self-reflection to clarify researcher bias, made use of peer review or debriefing, and incorporated rich, thick description of findings so any preconceived notions would not be a factor in the data collection process and analysis.
Triangulation is a “validity procedure where researchers search for convergence
among multiple and different sources of information to form themes or categories in a study” (Creswell & Miller, 2000, p. 126). Similarly, Patton (2002) defined triangulation as “comparing and cross-checking the consistency of information derived at different times and by different means within qualitative study” (p. 559). Triangulation strengthens a study through the use of different data sources or inquiry methods to test for consistency in the research findings (Patton, 2002). This research involved the use of triangulation in a number of places.
Triangulation was built into this research study because of the different
perspectives gathered from the school leaders, division SIP coordinator, focus groups, document reviews, and the reflexivity journal. Another unique aspect of the study was that it included the views of two different principals, two different grade level focus groups, a central office perspective, and the perspectives of several teachers and specialists as members of the SIP team. These multiple vantage points strengthened the triangulation. Each of the data sources related directly to the different themes that emerged from the research. Each of the participant groups provided similar insights into the phenomenon that occurred at UES from 2012 to 2015.
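Triangulation as a "search for convergence" across sources can be pictured as an intersection over data sources. The theme labels and source names below are illustrative placeholders, not findings from the study; the sketch shows only the logic of retaining a theme when it appears independently in multiple sources.

```python
# Hypothetical themes surfaced in each data source (placeholder labels).
themes_by_source = {
    "principal interviews": {"shared leadership", "protocol use", "trust"},
    "focus groups": {"protocol use", "trust", "data literacy"},
    "document review": {"protocol use", "data literacy"},
}

def triangulated(themes_by_source, min_sources=2):
    """Keep a theme only if at least `min_sources` independent
    data sources raise it -- one simple notion of convergence."""
    counts = {}
    for themes in themes_by_source.values():
        for theme in themes:
            counts[theme] = counts.get(theme, 0) + 1
    return {theme for theme, n in counts.items() if n >= min_sources}

print(sorted(triangulated(themes_by_source)))
```

In practice convergence is a matter of researcher judgment rather than a mechanical threshold, but the set logic captures why themes raised by only one source were treated more cautiously.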
The interviews with the former principal and the current principal provided a
unique view of the overall understanding and implementation of protocol-structured discussions during the school reform process from two different perspectives. They also provided some insight into how the current principal might or might not have made changes to the process upon assuming the post after the promotion of the former principal. Another area where triangulation existed in this research pertained to Research Questions 2 and 3, whereby teachers, CLTs, the SIP team, and other school-based leaders validated the work done by the principal in Research Question 1. Interviewing the teachers and conducting observations provided a good view of the implementation process. Finally, triangulation existed in this study by virtue of including an examination of student performance in teachers’ individual classrooms on state assessments as well as the overall school performance.
Member checking (Creswell, 2013; Merriam, 2009; Yin, 2014) was utilized as a
procedure to check the validity of the research by focusing specifically on the lens of the participants in the study. During member checking, I worked directly with the administrative team to review outcomes, check the data for accuracy, and develop a better understanding of the data collected. All participants were afforded the opportunity to review the transcriptions for accuracy and to raise further questions and/or concerns pertaining to the research.
In addition, I worked closely with two external peer reviewers throughout the
research. I provided copies of the de-identified data from the three focus groups and asked the first reviewer to provide specific feedback on what she believed were the big themes that emerged from the data. She sent me a copy of her notes via e-mail. After
reviewing her notes, we met for about an hour to compare and contrast our findings. I also provided this external peer reviewer copies of the de-identified data from the former principal, principal, and SIP coordinator interviews and asked her to identify the major themes in the data. We met a second time to discuss her findings. The second peer reviewer had the specific task of reviewing the data analysis component of the research. I shared a draft of the first two themes, followed by the other themes in the study, with him and asked for feedback pertaining to structure and content. Once I received the feedback, I made the necessary changes I felt were critical for the study. Dr. Patrizio, Dissertation Chair, and other members of the dissertation committee reviewed the coding themes from the information gathered from the individual and focus group interviews to ensure there was proper alignment with the research questions.
Validity
Creswell and Miller (2000) defined validity in qualitative research studies as the
accuracy and credibility of a researcher’s account to represent each participant’s reality. Credibility in qualitative inquiry depends on several factors. Patton (2002) outlined three main areas when looking at credibility in qualitative research: (a) rigorous methods for doing fieldwork, (b) the credibility of the researcher, and (c) the philosophical belief in the value of qualitative inquiry. Rigorous methods pertain to the data collection and analysis process. The credibility of the researcher is dependent on training, experience, previous track record conducting research, how a person presents him or herself, and status (Patton, 2002). Philosophical belief in the value of qualitative research pertains to the researcher’s valuing of naturalistic inquiry, qualitative methods, sampling strategies, holistic thinking, and inductive analysis (Patton, 2002).
In order to increase researcher credibility or validity in qualitative research,
Creswell (2013) outlined a series of strategies and recommendations to check the accuracy of a researcher’s findings: (a) prolonged engagement and persistent observation; (b) triangulation of data sources; (c) peer review or debriefing; (d) negative case analysis; (e) clarifying researcher bias; (f) use of member checking; (g) use of rich, thick descriptions of findings; and (h) external audits. This study encompassed five of the eight strategies Creswell suggested: triangulation of data sources, member checking, self-reflection, peer debriefing, and the use of rich, thick descriptions of findings.

Internal Validity
Internal validity provides the researcher an avenue to explore the relationships
between different events in the study (Yin, 2014). Miles and Huberman (1994) asserted that internal validity pertains to questions such as, “Do the findings of the study make sense? Are they credible to the people we study and to our readers? Do we have an authentic portrait of what we were looking at?” (p. 278). The data in the current study were collected from a variety of sources, including interviews, focus groups, document reviews, protocols used, and the maintenance of a reflexivity journal. These sources helped strengthen the study and revealed any inconsistencies in the data.
Triangulation of data sources is a validity approach qualitative researchers utilize
to determine the accuracy of findings in a research study (Creswell, 2013). It helps strengthen a qualitative case study through the use of multiple data sources or inquiry methods to test for consistency in the findings (Patton, 2002). For my research, I triangulated data collected from the individual interviews with the former principal, current principal, and SIP coordinator with data from the focus group interviews with the
school-based SIP team and the third and fourth grade CLTs to ensure there was internal validity. Similarly, the document reviews and the observations of the third and fourth grade CLTs provided data to ensure internal validity. These sources provided the necessary data to portray a precise picture of the best practices at UES.

External Validity
External validity is commonly described as how well the findings of a study
might be applied to other situations (Merriam, 2009). It provides for generalizations and opportunities for replication (Yin, 2014). This study had several limitations. One of the key limitations was that the study took place in a single Title I elementary school in a suburban school division. Another limitation was that the principal who began the process of using DDDM protocols as a tool during the school reform process was no longer the principal. The study was conducted to observe the phenomenon that occurred from 2012 to 2015. The current principal, who was appointed in 2015, was the former assistant principal and therefore had historical knowledge about the process and its inception. The new principal is considered to be an “insider.” Other limitations relate to the generalization of the results because the study was limited to one school that implemented protocol-structured discussions and DDDM processes. This does not mean that all schools that embark on a journey to infuse protocol-structured discussions into their school reform process or as part of the day-to-day operations of running a school will experience the same results. There are many other factors that lead to successful school reform processes. If another researcher decides to replicate the study at another elementary school, it is essential for that researcher to maintain vivid descriptions of the study to support readers’ interpretation of the findings (Yin, 2014). Based on my
capacity to document and convey the rich, detailed descriptions of the data collected from the individual and focus group interviews and observations, the transferability of the findings will enable readers of the research and other researchers to interpret the findings. In order to maintain these vivid descriptions, I kept a reflexivity journal (Creswell, 2013; Creswell & Miller, 2000; Marshall & Rossman, 2011) to document my role as a researcher.

Construct Validity
Construct validity ensures that the appropriate operational procedures are
developed and followed throughout the study. Yin (2014) defined construct validity as “identifying correct operational measures for the concepts being studied” (p. 46). Construct validity in case study research is strengthened when the researcher employs multiple data sources, establishes a chain of evidence, and has key informants review the draft report (Yin, 2014). I used triangulation (Creswell, 2013; Creswell & Miller, 2000; Merriam, 2009; Patton, 2002; Yin, 2014) to determine the consistency of findings at UES. My research encompassed student achievement data on standardized assessments; this data source strengthened the qualitative data that I collected, thereby enhancing the construct validity of the study. One threat to construct validity is researcher bias. To avoid researcher bias in this study, I looked at the data through a nonbiased lens to the best of my ability and used the research journal as an opportunity to surface and examine my biases.
Reliability
Reliability often refers to the stability of responses to multiple coders of data sets
(Creswell, 2013). The objective of reliability is to ensure that if another researcher
follows the same series of procedures used by a previous researcher and conducts the same study, he or she should arrive at the same findings (Johnson & Christensen, 2012; Yin, 2014). To avoid issues of reliability, I used case study protocols to address any issues with documentation (Yin, 2014). Throughout the duration of this research, I used the Document Analysis Authenticity Protocol (Merriam, 1998) and the Field Notes Reflexivity Protocol (Patton, 2002) to avoid issues of reliability. These protocols promoted transparency by preventing my personal experiences and professional judgments from unduly shaping the research. In addition to the two protocols mentioned above, the research at UES involved the use of statistical data from end-of-year state assessment results in reading and mathematics to provide another lens on the research being conducted. I made use of a professional transcriber to assist with the transcriptions of qualitative data from the individual and focus group interviews and made use of the reflexivity journal to document any potential biases.
Researcher Bias
I have several years of experience working on school reform practices at JCPS.
Because of my knowledge of the school reform processes in JCPS, I was consistently aware of this bias. Some of the advantages I have as a former participant in the school reform process in JCPS are: (a) I have first-hand knowledge of the struggles of UES prior to the appointment of the new principal, (b) I have knowledge of some of the systems and processes the former and current principal implemented since their appointments to the position, (c) I know some of the key personnel in the division to help me conduct the research, and (d) I have an excellent understanding of protocol-structured discussions as
it is a process I utilize at my school. The use of protocol-structured discussions to engage the staff at UES during the school reform process brought a certain affirmation to the work that I began several years ago. This type of learning continues to be an integral part of my leadership to this day. In order to mitigate my biases, I worked with the principal to identify the school-based leaders, CLTs, and SIP team members to interview. I utilized the reflexivity journal to ensure any potential blind spots did not affect the research at UES.
In addition to the school reform experience at JCPS, I have 3 years of experience
with school reform processes as a classroom teacher in another mid-Atlantic state. This knowledge allowed me to make connections and predictions and to probe for additional clarification in certain situations. As an elementary principal with 10 years of experience in JCPS, I hold some assumptions, values, and experiences of which I may not be aware; some of these biases did not surface until I actually began the research. If I were an outside researcher conducting the research at UES, or at another school where I had no prior experiences, my views would be very different. While conducting the research, I needed to remain conscious of the perspective from which I operated as compared to what an outside researcher would bring. Member checks and peer reviews were used throughout the study, further reducing any potential biases I brought to the work.
Despite the knowledge described above, it should be noted that I have never
worked at UES nor do I have any direct relationship with the student body and faculty. In order to ensure the bias described above did not interfere with my research, I took several approaches to address this concern: (a) triangulation of all data collected
(Creswell, 2013; Creswell & Miller, 2000; Yin, 2014); (b) collection of quantitative data on UES end-of-year summative assessments; (c) use of the Document Review Analysis Protocol (Merriam, 1998) to review all documents; (d) maintenance of a reflexivity journal (Creswell, 2013; Creswell & Miller, 2000; Marshall & Rossman, 2011; Patton, 2002); (e) member checks (Merriam, 2009) throughout the study; (f) peer debriefing; and (g) adherence to the research design described in this chapter to document the rich, thick descriptions of the findings.
Ethics, IRB Approval, and Confidentiality
During this research study, I followed the research guidelines set forth by the
Institutional Review Board (IRB) at Virginia Tech (VT; See Appendix M). I followed a specific protocol of informed consent to safeguard the rights and confidentiality of all participants in the study (See Appendix N). Creswell (2013) posited that the ethical researcher assures the confidentiality of participants by refraining from identifying the individuals participating in the study. This was achieved by complying with the informed consent process. The VT IRB approved the protocol I used for informed consent prior to the collection of any data during this study. I also followed all guidelines, approval procedures, and any protocols required by the school division’s research board in the Office of Accountability. To protect the identities of the division, schools, and individuals, I used pseudonyms throughout the research.
I notified each participant orally and in writing of the goals of the study as well as
the methods of data collection and analysis utilized in the study. All participants in the study were afforded the opportunity to ask questions and raise any concerns pertaining to the study. Participants had the opportunity to ask questions during the informed consent
process, and were also given my contact information and that of my dissertation advisor, Dr. Kami Patrizio, so they could ask any questions after the interview. I made available all written transcripts of the interviews conducted as well as provided member checking opportunities to gather feedback or any relevant comments or statements the participants felt were valuable to the study. I ensured that all participants knew of their rights to withdraw from the study (interviews, focus groups, or observation) at any point without prejudice or bias. My external peer editors only had access to the de-identified versions of the data and followed the guidelines set forth by the VT IRB and the school division’s IRB to protect participants’ confidentiality.
The data that were assembled were only accessible to my advisor, Dr. Kami
Patrizio, and myself. Digital copies of the data were stored electronically on a password-protected computer in a separate unidentified folder. Additionally, an electronic copy of the data was stored on an encrypted external hard drive with a password-protected login. The password-protected encrypted external hard drive was stored in a secure location under lock and key. A hard copy of the data, research paper, and all transcripts from interviews, observations, documents, and a copy of the reflexivity journal are being stored in a locked cabinet. I will erase all electronic copies of the data collected and shred hard copies of all other data 3 years after the initial collection process.
Data collection began once the VT IRB and the school division’s IRB approved
my research. The next step in the planning process was to create a letter to introduce myself as the researcher and explain the goals and procedures of the study at UES. Once my dissertation chair, Dr. Kami Patrizio, and the other members of my dissertation committee approved the letter, I presented it to the principal of UES. Upon her approval,
the letter was distributed to the other school-based leaders, CLTs, teachers, members of the SIP team, and the division SIP coordinator.
Summary
The goal in conducting this single-case embedded common qualitative case study
was to examine and describe how UES made changes during the school reform process that benefited students. This chapter encompassed the rationale for the methodology as well as the rationale for a qualitative case study. In addition, it contained a vivid description of the data collection process. The qualitative data collection methods included interviews, focus groups, observations, document reviews, reflexivity journal notes, and peer debriefing. The data collected were triangulated to identify themes from the embedded units (i.e., the third and fourth grade CLTs) and from UES as a whole, and to determine the findings that benefited students. This chapter also addressed issues pertaining to validity, reliability, and bias. The goal of this study was to provide further insight into the use of protocol-structured discussions as a tool to promote organizational learning during the school reform process and add to the body of work on protocol-structured discussions and the use of protocols as learning tools for educators.