IPI Process Overview
The Instructional Practices Inventory process is a set of strategies for profiling student cognitive engagement across six categories. The goal is for faculty who study the profiles to view the data as a fair and accurate representation of engagement for the school, so that they are comfortable collaboratively studying and problem solving the data and creating a sustained focus on student engagement that influences instructional design and, in turn, student learning. In the IPI process, teacher-leaders collect data about school-wide engagement and facilitate faculty collaborative study of the data. All faculty have the opportunity to reflect upon the data and deepen their understanding of how to most effectively engage students in their respective classrooms. The IPI is a teacher-led, teacher-empowering process. It does not focus on or profile individual teachers or classrooms, and it was not designed for, nor should it be used for, supervisory or evaluative purposes.
Professional journals, school improvement books, and sessions at professional conferences are replete with discussions about the importance of student engagement. Increasing the time students spend cognitively engaged in learning is associated with increased student achievement. Increasing the time students spend in higher-order, deeper thinking is also associated with increased student achievement and, perhaps more importantly, the development of lifelong thinking skills. These relationships with achievement are evident regardless of how student achievement is measured; increases are apparent even when the measure is a predominantly recall-format, high-stakes state assessment (Kumar, 1991; Goff and Ackerman, 1992; Yair, 2000; Hattie, 2009; Collins, 2009; Collins and Valentine, 2010, 2011; Valentine and Collins, 2009a, 2009b, 2011). Clearly, meaningful engagement in learning is a vital element of student academic success.
Likewise, it is well established that when faculty collaboratively study and problem solve together using information directly related to student achievement, professional growth evolves individually and collectively. The more frequently faculty collaboratively study and problem solve data directly related to classroom instruction and learning, the greater the impact on student achievement (Hattie, 2009; Collins, 2009; Valentine and Collins, 2009b; Valentine, 2010). Collaboration is a vital ingredient of cohesive professional development throughout the school (NSDC, 2001) and a basis for developing collective commitment, collective efficacy, and optimism (Valentine, 2009; Valentine, 2010).
Maximizing student achievement across a school’s student population requires more than an increase in engagement, an increase in higher-order, deeper learning, and meaningful faculty collaborative study; however, those are relevant factors in the equation of continuous improvement. The Instructional Practices Inventory (IPI) process addresses those factors, though the IPI is neither a silver bullet nor the answer to educators’ prayers for how to increase student achievement. The IPI is simply a straightforward process for accurately coding how students are engaged in learning, paired with viable strategies for collaboratively studying those data to inform instruction on a daily basis. The IPI process fosters organizational learning based upon evidence of student engagement and helps faculties develop the capacity to study the evidence and build the collective commitment and efficacy important to making a difference for all students (Collins, 2009; Valentine and Collins, 2009a; Valentine, 2009; Collins and Valentine, 2011). The IPI is a viable process for documenting and studying cognitive engagement so faculty can more effectively design and facilitate instruction across all grades and types of learning settings (Kachur et al., 2010; Gauen, 2009; Fredericks et al., 2011).
The IPI process was developed in 1995-96 by Jerry Valentine, Professor at the University of Missouri, and Bryan Painter, graduate research assistant to Dr. Valentine. Their goal was to create a tool that would document the degree of change in engagement and instruction during a two-year school improvement project with 10 elementary, 10 middle, and 10 high schools from across the state of Missouri. During the initial work with the IPI from 1996-1998, it was evident that faculties who had the opportunity to study and problem solve the meaning of the data made greater instructional strides than their counterparts who did not collaboratively study their data. From that point forward, the IPI evolved into a “process of data collection and collaborative study,” not just a tool for understanding the degree of change in engagement. From 1998 to 2002, the IPI was used to support school improvement in an additional set of Missouri schools, and it was also used to study engagement in a set of nationally recognized exemplary middle schools. By 2002, a set of protocols and standards had been established for the professional development of educators wishing to implement the process in their schools. Nearly 40,000 educators have completed the IPI Level I Workshop and are certified as IPI data collectors and discussion facilitators for their schools, and that number continues to grow steadily across the U.S. at the time of this writing.
The Instructional Practices Inventory process is a way to systematically profile student engagement during a specified timeframe, typically a school day, so faculty can study and apply the data as they facilitate classroom instruction. The IPI process can be formally defined as: “A process for creating an optimum profile of student engagement in learning that teachers will view as fair and accurate, thus becoming a basis for collaborative faculty study and subsequent refinement of how students are engaged in learning throughout the school.” The profile is described as “optimum” because data are not collected at times of transition, and because the higher-numbered category is selected whenever engagement is evenly split between two categories or the data collector does not have clear evidence to choose between two codes. If an error is to be made in coding, it will be made on the “high” side of the coding categories. Such protocols are established because the data are of no practical worth if faculty do not view them as valuable, and teachers certainly will not see value if they are concerned the data are unfair, biased, or inaccurate.
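To make the coding protocol concrete, here is a minimal sketch in Python of the “code high” rule and the construction of a daily engagement profile. The function names, sample codes, and toy data are illustrative assumptions for this sketch, not part of the official IPI materials or training protocols.

    from collections import Counter

    # The six IPI coding categories, referenced here only by number;
    # the official rubric wording is taught in the IPI Level I Workshop.
    CATEGORIES = range(1, 7)

    def code_observation(candidate_codes: list[int]) -> int:
        """Apply the 'code high' protocol: when engagement is evenly
        split between two categories, or the evidence between two codes
        is unclear, record the higher-numbered category."""
        return max(candidate_codes)

    def engagement_profile(codes: list[int]) -> dict[int, float]:
        """Summarize one data-collection day as the percentage of
        observations coded in each category."""
        counts = Counter(codes)
        return {cat: 100 * counts[cat] / len(codes) for cat in CATEGORIES}

    # An observer torn between categories 3 and 4 records the 4.
    assert code_observation([3, 4]) == 4

    # A toy day of codes; real collections involve far more observations.
    day_codes = [code_observation(c) for c in ([4], [4], [5, 6], [3], [2, 3], [5])]
    print(engagement_profile(day_codes))

Because ties resolve to the higher-numbered code by design, any coding error skews the profile toward the higher categories, consistent with the “optimum” framing above.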
Ample evidence documents the relationship between engagement and student achievement. Yair (2000) added to the early 21st-century discussion with a critical study noting that students were engaged only 54% of the school day and that engagement varied by ethnicity, grade level, and course content. He further noted that actively oriented learning occurred one-ninth as frequently as teacher lecture in secondary schools. In a study comparing highly successful, high-achieving middle level schools with relatively unsuccessful, low-achieving middle level schools, Valentine and his colleagues (Valentine, et al., 2004; Valentine, 2005) documented the engagement practices and the significant differences across the two types of school settings. The National Association of Secondary School Principals’ Breaking Ranks II (2004) and Breaking Ranks in the Middle (2006) both recommended the IPI as a process for collecting and studying student engagement.

At the University of Missouri, comprehensive and complex analyses from 1996 through 2009 described engagement across the six IPI categories throughout that period. We documented that higher-order/deeper engagement as measured by the IPI declined from a high of approximately 25% in the early 2000s to the current rate of 20%, with most of that drop occurring from 2003-2005. Student attentiveness to teacher-directed instruction accounts for approximately 40% of learning time, and student engagement with skill development and seatwork-like practice accounts for approximately another 40%. Student disengagement is typically somewhat less than 5%. These normative percentages differ, of course, between schools with higher and lower levels of student achievement, and they also differ by grade level.

We also found that the strongest relationships between engagement, as measured by the IPI categories, and student achievement on high-stakes assessments involve student disengagement (category 1) and teacher disengagement (category 2), along with higher-order engagement (categories 5 and 6). Further, our studies show that how schools collaboratively problem solve their IPI data is linked to the degree of change in engagement as well as changes in student achievement, and the frequency with which schools collect and study their data is related to changes in higher-order/deeper engagement and student achievement (Collins, 2009; Valentine and Collins, 2009a, 2009b, 2011; Valentine, 2010; Collins and Valentine, 2010, 2011).
When IPI data are collected for the purposes of school improvement, all teachers should have the opportunity to study the data and reflect upon their perceptions of effective learning/instruction. These reflections should include conversations about best practices and the value of the six categories, especially categories 3 through 6. Discussions about how to change the engagement profiles over time are essential if instructional design and teaching practices are to change. The IPI conversations should occur, as much as feasible, in a setting of trust and inquiry, where teachers can be open, not defensive, about the profile data. The data describe current practice and can be the basis for discussions leading to overall faculty, department, grade-level, and/or team growth. Teachers should be reminded that these data represent a “snapshot in time” of the entire school’s learning experiences. They should also be reminded that the six categories are “discrete,” not “continuous,” and that categories 3 through 6 are each of value at different times throughout a unit of study. The six categories are not a hierarchy, but rather six distinct ways to categorize student engagement.
The school’s first snapshot (profile) serves as baseline data, and subsequent observations provide a longitudinal perspective on engaged learning for the school. Each school should collect data multiple times each school year, studying the pattern or trend over time as well as the details and changes in each profile. While developing awareness of the school’s instructional experiences is important, purposeful professional development and continuous conversations are also necessary for significant change in teaching practices. To make a difference for students, the faculty IPI collaborative learning conversations must go beyond the profile numbers, with discussions that deepen knowledge and build commitment to enriching students’ learning experiences.
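As a minimal sketch of the longitudinal study described above, the snippet below compares a hypothetical baseline profile with later snapshots, tracking the combined share of categories 5 and 6 (the higher-order/deeper engagement discussed earlier). All profile numbers and labels are invented for illustration.

    # Hypothetical engagement profiles (percent of codes per category),
    # one per data-collection day; all values invented for illustration.
    snapshots = {
        "Sep (baseline)": {1: 4.0, 2: 3.0, 3: 42.0, 4: 36.0, 5: 10.0, 6: 5.0},
        "Dec":            {1: 3.0, 2: 2.0, 3: 40.0, 4: 37.0, 5: 12.0, 6: 6.0},
        "Mar":            {1: 2.0, 2: 2.0, 3: 38.0, 4: 37.0, 5: 14.0, 6: 7.0},
    }

    def higher_order_share(profile: dict[int, float]) -> float:
        """Combined share of categories 5 and 6, the higher-order/deeper
        engagement the studies above link most strongly to achievement."""
        return profile[5] + profile[6]

    baseline = higher_order_share(snapshots["Sep (baseline)"])
    for label, profile in snapshots.items():
        share = higher_order_share(profile)
        print(f"{label}: {share:.1f}% higher-order ({share - baseline:+.1f} vs. baseline)")

A trend summary like this is only the starting point; as noted above, the conversations about what drives the changes matter more than the numbers themselves.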
References:
Collins, J. (2009). Higher-order thinking in the high-stakes accountability era: Linking student engagement and test performance. Unpublished doctoral dissertation, University of Missouri.
Collins, J. and Valentine, J. (2010). Testing the Impact of Student Engagement on Standardized Achievement: An Empirical Study of the Influence of Classroom Engagement on Test Scores across School Types. University Council of Educational Administration, Annual Convention, October 30, 2010.
Collins, J. and Valentine, J. (2011). The Instructional Practices Inventory in Rural Settings: Testing the Student Engagement-Standardized Test Performance Relationship. American Educational Research Association, Annual Conference, April 10, 2011.
Gauen, K. (2009). The Impact of the Instructional Practices Inventory on an Illinois Middle School. Doctoral dissertation, Lindenwood University.
Fredericks, J., McCloskey, W., Meli, J., Mordica, J., Montrosse, B., and Mooney, K. (2011). Measuring Student Engagement in Upper Elementary through High School: A Description of 21 Instruments (Issues and Answers Report, REL 2011-No. 098). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Southeast. Retrieved from http://ies.ed.gov/ncee/edlabs.
Goff, M. and Ackerman, P. (1992). Personality-Intelligence Relations: Assessment of Typical Intelligence Engagement. Journal of Educational Psychology, 84(4), 537-552.
Hattie, J. (2009). Visible Learning: A Synthesis of Over 800 Meta-Analyses Relating to Student Achievement. New York, NY: Routledge Press.
Kachur, D., Stout, J., and Edwards, C. (2010). Classroom Walkthroughs to Improve Teaching and Learning. Larchmont, NY: Eye on Education.
Kumar, D. (1991). A Meta-Analysis of the Relationship between Science Instruction and Student Engagement. Educational Review, 43(1), 49-61.
National Association of Secondary School Principals (2004). Breaking Ranks II: Strategies for Leading High School Reform. Reston, VA: National Association of Secondary School Principals.
National Association of Secondary School Principals (2006). Breaking Ranks in the Middle: Strategies for Leading Middle Level Reform. Reston, VA: National Association of Secondary School Principals.
National Staff Development Council (2001). NSDC Standards for Professional Development. Retrieved April 14, 2010, from http://www.nsdc.org/standards/index.cfm
Painter, B. (1998). The Impact of Student Engagement on Student Achievement and Perceptions of Student-Teacher Relationships. Unpublished doctoral dissertation, University of Missouri.
Valentine, J., Clark, D., Hackmann, D., and Petzko, V. (2004). A National Study of Leadership in Middle Level Schools, Volume II: Leadership for Highly Successful Middle Level Schools. Reston, VA: National Association of Secondary School Principals.
Valentine, J. (2005). Statistical Differences for the Percentages of Student Engagement as Measured by IPI Categories between Very Successful and Very Unsuccessful Middle Schools. University of Missouri, Columbia, MO: Middle Level Leadership Center.
Valentine, J. (2009). The Instructional Practices Inventory: Using a Student Learning Assessment to Foster Organizational Learning. National Staff Development Council, Annual Convention, December 8, 2009, St. Louis, MO.
Valentine, J. (2010). Establishing a Faculty-wide Collaborative Study of Student Engagement. National Association of Secondary School Principals, Annual Conference, San Diego, CA, March 14, 2010.
Valentine, J. and Collins, J. (2009a). Improving Instruction by Profiling Student Engaged Learning and Creating Collaborative Teacher Learning Conversations. National Association of Secondary School Principals, Annual Conference, March 1, 2009.
Valentine, J. and Collins, J. (2009b). Analyzing the Relationships among the Instructional Practices Inventory, School Climate and Culture, and Organizational Leadership. American Educational Research Association, Annual Meeting, April 14, 2009, San Diego, CA.
Valentine, J. and Collins, J. (2011). Student Engagement and Achievement on High-Stakes Tests: A HLM Analysis across 68 Middle Schools. American Educational Research Association, Annual Conference, April 11, 2011.
Yair, G. (2000). Not Just About Time: Instructional Practices and Productive Time in School. Educational Administration Quarterly, 36(4), 485-512.