Centering Student Perspectives in School Improvement Efforts: Highlander Institute’s Student Experience Survey
Over the last twenty years, teachers have been tasked with collecting, analyzing, and using endless amounts of testing data in their classrooms. While this data can help us identify where students are struggling and understand patterns of progress, it has not translated into more equitable outcomes for students (Barshay, 2022).
How can we expand our thinking about student data to go beyond creating academic interventions? Can a holistic approach to measurement provide more meaningful insights and greater impact?
Centering students and families is a critical part of Highlander Institute's liberatory data approach. We are committed to reappropriating the power of data to focus on the student learning experience as a key measure of instructional success and efficacy.
During the 2020-2021 school year, our Research & Impact team developed the Student Experience Survey (SES) to quantify and analyze the learning experiences reported by different student groups (across grades 3-12). We know there is a strong evidence base connecting academic success and concepts such as belonging, academic mindset, cognitive skills, and engagement. Our team wanted to understand how student perspectives in these areas might differ based on lunch status, race, grade level, gender identity, or status as English Language Learners or Special Education students.
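Disaggregating survey responses by student group is the core analytic move described above. As a minimal, illustrative sketch (the column names and scores below are hypothetical, not drawn from the actual SES instrument), it amounts to grouping responses and comparing subgroup averages:

```python
# Hypothetical example: comparing mean survey scores across student groups.
# "group" and "belonging" are placeholder names, not actual SES fields.
import pandas as pd

responses = pd.DataFrame({
    "group": ["free_lunch", "free_lunch", "paid_lunch", "paid_lunch"],
    "belonging": [2.0, 3.0, 4.0, 5.0],  # e.g., scores on a 1-5 Likert scale
})

# Mean belonging score per group; gaps between groups flag areas for follow-up.
by_group = responses.groupby("group")["belonging"].mean()
print(by_group)
```

In practice the same comparison would be run for each survey domain and each reporting category (lunch status, race, grade level, gender identity, ELL and Special Education status), with appropriate statistical tests for significance.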
Initial survey administration at Highlander Institute partner schools yielded fascinating results:
- At a rural middle school, students receiving free lunch had significantly lower confidence in themselves as learners and a lower sense of belonging than their peers.
- At an urban-ring elementary school, Special Education students and students performing below grade level reported lower self-perceptions and less positive learning experiences than their peers.
- At an urban middle school, the academic mindset of English Language Learners and Special Education students was significantly lower than that of their peers.
Understanding student perceptions has provided new awareness about potential root causes of student underperformance and misbehavior. It has also framed important faculty conversations about compliance versus engagement; trust and belonging; and high expectations. One school leader shared this anecdote:
"When I was doing my daily rounds of classroom visits one morning, I witnessed an exchange between a student and a teacher. The student already understood the concept being taught, and asked the teacher for more challenging work. The teacher's response was that all students had to stay together on the same lesson. Frustrated, the student responded by disrespecting the teacher. Previously, that exchange would have prompted me to remove the student from class, call his parents, assign detention, and lecture him on his behavior. My new understanding enabled me to focus on the 'why'. Out in the hallway, I learned that feelings of boredom were a frequent occurrence for this student in the classroom."
Across our partner schools, survey results have mobilized teachers to understand the stories behind student perceptions, design intentional responses, and triangulate results with academic data. This process has improved the learning experience for students while supporting higher academic achievement. Consider the following use case:
At one partner school, students with IEPs had much lower rates of progress and achievement on interim MAP assessments than their peers. Student Experience Survey results also showed that students with IEPs felt significantly less comfortable than their peers sharing their thoughts and opinions. The Design Team (an inclusive school improvement team) shared the survey data with the student council. Additional student reflections supported a targeted search for research-based strategies to elevate student voice. The Design Team piloted "dialogue journals" as a key strategy to help all students feel more heard and valued, which increased student confidence, improved relationships with teachers, and increased engagement. On spring MAP assessments, the share of students with IEPs making exceptional growth rose 18 percentage points in ELA and 13 percentage points in Math.
Excited by the outcomes of our first year of survey administration, we convened partners to support us in validating the tool. Through the generous support of a 2022 award from Assessment for Good, and the help of a talented Research Advisory Committee (see Table 1), we designed initial process steps. Based on the results of stakeholder focus groups and statistical analyses, we addressed survey usability and accessibility. Revisions have made the second version shorter, clearer, and more user-friendly. An overview of survey domains and rationale can be found in Table 2 in the Appendix at the end of this post.
The next steps for our Research & Impact team include:
- Piloting the updated survey with over 1,000 students during the fall of 2022. Subsequent statistical analyses will continue to support the process of survey validation and test new reporting structures to make results easier to understand and act upon.
- Improving our companion tool to the Student Experience Survey. Used as an entry point to 1:1 coaching engagements, the Social Emotional Learning Self-Assessment (SELSA) asks teachers to respond to the Student Experience Survey as a student in their class, and then compares these scores to class averages and disaggregated student response patterns. This exercise celebrates areas of strength and creates a collaborative, supportive approach for responding to red flags.
- Building a Family Engagement Survey to understand the perspectives of parents and caregivers around awareness, belonging, and engagement at their child's school. This data will help schools understand the strength of current family engagement activities, and support the study of correlations between family engagement, student engagement, and student success.
We believe that human-centered tools can redefine the power of surveys to measure the student experience, instructional efficacy, and inclusivity. When connecting experience data with data on academic performance, teachers can develop a more holistic understanding of their students as learners, and better understand and address root causes of existing inequities.
We are always looking for new partners! If your school is interested in designing improvement efforts that are anchored by student perspectives and learning experiences, please get in touch to administer the SES in your school or district.
Appendix: Student Experience Survey Domains
| Student Experience Survey Domain | Description, Rationale, & Evidence Base |
| --- | --- |
| **Academic Mindset**<br>Student perceptions of their academic confidence | This domain asks students to rate themselves on key beliefs about their intelligence and ability. Perceptions of academic mindset deeply influence student behaviors as learners and impact their learning success.<br>Evidence base: Ames & Archer, 1988; Bandura & Schunk, 1981; Keith et al., 1993; Pintrich, 2000; Schunk & Hanson, 1985; Wentzel, 1991; Zimmerman, 1990; Paunesku et al., 2015; Claro et al., 2016; Dweck, Blackwell, & Trzesniewski, 2007 |
| **Awareness**<br>Student perceptions of identity affirmation | This domain asks students to reflect on teachers' understanding of student identities and sociocultural contexts. When teachers deepen their awareness of the students in their classrooms, they are more likely to integrate culturally responsive examples within the curriculum, affirm the academic traditions of different cultures, and raise their expectations of students. This leads to increased student engagement, motivation, and success. |
| **Belonging**<br>Student perceptions of trusting relationships | This domain asks students to consider the level of trust, care, and connection within a classroom. Students who feel a sense of acceptance, affirmation, and support in a learning environment are more likely to show higher levels of engagement, persistence, motivation, and achievement. |
| **Classroom Community**<br>Student perceptions of the academic community | This domain asks students to rate the level of cohesion and collaboration around academic expectations within a classroom. Establishing a culture of thinking and a supportive environment leads to higher expectations for students around persistence as well as engagement in challenging work. |
| **Shifting the Cognitive Load: Student Moves**<br>Student perceptions of their independence as learners | This domain asks students to consider their level of effort with respect to learning, improving, and challenging themselves. Their willingness to engage, persist, and work toward goals demonstrates their capacity to succeed at higher-order thinking tasks and increases their awareness of how they learn best. |
| **Shifting the Cognitive Load: Teacher Moves**<br>Student perceptions of the responsiveness of their teachers | This domain asks students to reflect on teacher actions that push their thinking and value their ideas. Responsive teachers are "warm demanders" who leverage trusting relationships to hold students accountable to high expectations and view teaching and learning as a collaborative effort that values all voices.<br>Evidence base: Hattie & Timperley, 2007; Van der Kleij et al., 2015; Zeiser, Scholz, & Cirks, 2018; Anderson et al., 2019; Martin, Burns, & Collie, 2017; Bondy et al., 2007; Ford et al., 2002; Trumbull & Pacheco, 2005; Gregory & Huang, 2013; Boser et al., 2014 |
| **Critical Reflection**<br>Student perceptions of how equity and social justice are explored | This domain asks students to consider how their study of different perspectives and experiences raises issues of power and fairness. When students find meaning and relevance within lessons, they are more likely to apply their cognitive skills, think critically about the world, and understand the levers of change available to them.<br>Evidence base: McWhirter & McWhirter, 2016; Diemer & Blustein, 2006; Cammarota, 2007; Lee et al., 2012; Chavous et al., 2003 (summarized in Sleeter, 2011); Seider, Graves, & Clark, 2020; Temple, 2005; Cabrera et al., 2014; Lesley, 2001; Barnhardt et al., 2000; Carter, 2008; Dee & Penner, 2016 |