We asked… They told.
The High School Survey of Student Engagement (HSSSE, commonly pronounced "Hessie") is a highly regarded survey measuring the academic, social, and emotional engagement of high school students across the United States. It is administered annually by the Center for Evaluation and Education Policy at Indiana University. Since the survey's inception, over 500,000 students nationally have participated in the HSSSE.
Among its purposes are:
1. To help schools explore, understand, and strengthen student engagement, and
2. To conduct rigorous research on issues of student engagement.
We administered the survey to all PDS high school students last spring. It takes between 20 and 30 minutes to complete and consists of questions designed to investigate levels of student engagement across three dimensions of life in school: academic, social, and emotional.
Many of the questions asked on the HSSSE align closely with our vision of a PDS graduate, so for us it is a key measure of how well we are living up to our mission. I will give some very specific examples of the questions in a later post.
The intention, as explained on the survey's bubble sheet itself, is that student responses will help the school "better understand (student needs) in order to create a school environment that is engaging, challenging and productive…."
We now have our results: a thick binder stuffed with data, details, comparisons, and charts. It's all a bit overwhelming. But as we begin to mine it for information we can use, several things become apparent, chief among them that, in terms of national comparisons, PDS students score very highly on engagement across all the dimensions. This is not a surprise, of course, but it is useful to have statistical confirmation of our anecdotal evidence and gut feeling.
That said, there is never room for complacency, and we will use these responses and this data to identify areas for ongoing focus. There are areas we want to take a close look at, and we need to ask students what the numbers might mean in terms of life at PDS.
So: What did we learn? What did our students tell us?
Here, as a very preliminary stab at sharing these complex results, is a summary of the three dimensions outlined above.
In reading the chart, take a look at the "Effect" column. Effect size indicates the practical significance of the mean difference between the groups being compared. In educational research, it is most common to find effect sizes between 0.10 and 0.40. By convention, an effect size of 0.20–0.49 is considered small, 0.50–0.79 medium, and 0.80 or above large, which signals a very significant difference.
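For readers who like to see the arithmetic behind the "Effect" column, here is a minimal sketch of how an effect size (Cohen's d) is typically calculated: the difference between two group means divided by their pooled standard deviation. The numbers below are purely hypothetical illustrations, not figures from our HSSSE binder.

```python
import math

# Hypothetical example only: these numbers are illustrative, not from the HSSSE report.
pds_mean, pds_sd, pds_n = 3.4, 0.6, 120        # imagined PDS average on one engagement dimension
natl_mean, natl_sd, natl_n = 3.0, 0.7, 5000    # imagined national comparison group

# Pooled standard deviation of the two groups
pooled_sd = math.sqrt(
    ((pds_n - 1) * pds_sd**2 + (natl_n - 1) * natl_sd**2) / (pds_n + natl_n - 2)
)

# Cohen's d: the difference between group means expressed in standard-deviation units
effect_size = (pds_mean - natl_mean) / pooled_sd
print(f"Effect size: {effect_size:.2f}")  # about 0.57, a "medium" effect by the rule of thumb above
```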
Now check the results in the three dimensions and take note of that effect size.
Your reactions and questions welcome.
"We were young and we were keen; Europe was in flames, and we were ready…
When I was in the emergency room last year having busted my elbow, a nurse…
Most of us have done it at some point or another - accidentally locked ourselves…
Thanks to the #1970 Club, I've spent the spare moments of the past week immersed…
The #1970 Club is starting tomorrow (October 14th) and I'm prepared with some reading and…
Comments
Hi Dierj: Pretty astonishing results. If a 0.8 standard deviation counts as a very significant difference, then what do we make of a 1.64 difference!
These results show what we all know: that dedication and heartfelt engagement with a task lead to the best possible results. Well done.