Using pupil surveys to identify potential areas to improve

Pupil voice has the potential to be a powerful tool for identifying areas of development for an individual teacher, department, year group or whole school. Once you’ve determined the right questions to ask, putting them into a Microsoft Form, collecting the data and then automatically feeding it into Power BI gives you easily accessible analysis through whichever lens is most appropriate. This article runs through the basics of the survey setup, the Power BI report pages and a framework for analysis and action. The following post will deal with the technical aspects of setting this all up.

The questions

Our aim was to get a grasp of the following areas of teaching practice:

  • The classroom environment
  • Feedback
  • Quality of instruction
  • Homework

We found a set of statements online that became our starting point. These were refined with input from all user groups and tested with some pupils to check we had a shared understanding of their meaning: some questions we modified, some we removed and some we kept. The survey needed to be short enough to be quick, but not so short that we missed information that could offer valuable insights.

Full pupil voice results for each pupil. Pupils complete a Microsoft Form and the results are automatically added to the Power BI app. Icons are applied via conditional formatting to make scanning the data easier.

Once the questions were defined, the next stage was to set up an online form to record the responses. We use Office 365, so Microsoft Forms was the go-to tool for us, though you can follow the same process with other applications.

The survey uses the Likert question type, with statements divided into sections. The statement responses are converted to numeric values in Power Query (‘always’ = 5 and ‘never’ = 1).
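As a rough sketch, the conversion can be done in Power Query with a lookup from response text to score. The column names, the intermediate response labels and the step names below are illustrative assumptions, not the exact ones from our model:

```powerquery
// Map Likert text responses to numeric scores (1–5).
// "Form_Responses", "Response" and "Score" are illustrative names;
// only 'always' = 5 and 'never' = 1 are fixed by our scheme.
let
    Source = Form_Responses,
    Mapping = [always = 5, #"most of the time" = 4, sometimes = 3, rarely = 2, never = 1],
    AddScore = Table.AddColumn(
        Source,
        "Score",
        each Record.FieldOrDefault(Mapping, Text.Lower([Response]), null),
        Int64.Type
    )
in
    AddScore
```

Defaulting unmatched responses to null (rather than erroring) makes it easy to spot any response text that slipped past the mapping.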

This survey is not anonymous for our pupils and we are clear about this with them. They have to log in with their school account to complete the form, which records their email address in the survey results. Each response can then be linked to a pupil’s profile in the data model. The additional detail this provides in analysis is significant: for example, we can see how an individual pupil’s experience of school differs between subjects, or whether responses differ with the ability level of pupils in a class, subject, year group or school.

The survey also needs to record the subject and teacher name. Teachers want to see their own results and departments want to be able to analyse results across all year groups. To keep the process safe for all involved, Row-Level Security (RLS) has been set up so that teachers can only see their own results, heads of department can see all results for their department but not broken down by teacher, and the management team can see results for all departments, again not broken down by teacher. For the process to be successful, it is absolutely essential that everyone trusts the system and that discussions about the results are conducted in a supportive manner.
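In Power BI, RLS roles of this kind are defined as DAX filter expressions on the relevant dimension tables. The sketch below shows the general shape; the table and column names are assumptions, not our actual model:

```dax
-- "Teacher" role: filter the Teacher dimension to the signed-in user.
-- USERPRINCIPALNAME() returns the email of the user viewing the report.
[TeacherEmail] = USERPRINCIPALNAME()

-- "Head of department" role: filter the Department dimension instead,
-- so results are visible department-wide but not per teacher.
[DepartmentHeadEmail] = USERPRINCIPALNAME()
```

Keeping the teacher-level breakdown out of the pages visible to the department and management roles is what makes the "not broken down by teacher" guarantee hold in practice.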

We also record the year group of the pupil at the time the survey is completed. This allows us to track changes for a year group across academic years. For example, suppose year 9 geographers give a low score for feedback on their work this academic year. The department investigates, decides it can improve and puts in place a range of strategies. Next academic year, the new year 9 pupils take the survey, and we can compare last year’s year 9 result with this year’s to answer the question ‘have we implemented our feedback strategies successfully?’.
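That comparison can be expressed as a pair of DAX measures, sliced by year group in the report. This is a minimal sketch assuming a numeric academic-year column; all table and column names here are hypothetical:

```dax
-- Average Likert score for whatever year group / subject is in context.
Avg Score = AVERAGE ( Responses[Score] )

-- Same measure, shifted back one academic year
-- (assumes Responses[AcademicYear] is a number such as 2024).
Avg Score Prior Year =
CALCULATE (
    [Avg Score],
    FILTER (
        ALL ( Responses[AcademicYear] ),
        Responses[AcademicYear] = SELECTEDVALUE ( Responses[AcademicYear] ) - 1
    )
)
```

With a slicer set to year 9 and the feedback statements, placing both measures side by side shows whether the new strategies moved the score for the following cohort.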

Teachers share the QR code for the survey with pupils so that they can complete it in lessons. A few hours later, once the dataflow and data model have refreshed, the results are automatically available to view in the Power BI app.

A framework for analysis and action

Having all this data is one thing; using it to drive improvement is quite another. To help, I have advocated the ‘guess what, here’s what, so what, now what’ framework, which I picked up from Jordan Benedict at Visualize Your Learning. Guess what: before looking at any data, we discuss as a department what we expect the weakest statement response to be. Here’s what: we then look at the data to see what the weakest statement actually is. So what: is this survey result valid and significant, and therefore in need of action? Now what: if the results do highlight areas that require improvement, what strategies can we implement?

With thoughtful interpretation of the results, alongside further investigation and discussion with colleagues, we can gain valuable insight from the process that can allow for highly personalised, specific and measurable areas for development. When these are supported through effective CPD and coaching, we are better able to respond to this feedback in a continuous cycle of improvement.

The following post (coming soon) will go through the more technical aspects of setting this up, including:

  • Getting the data from Microsoft Forms to a Power BI dataflow
  • Transforming the data so it’s in the right shape for analysis
  • Using a bridge table to link subjects, teachers and pupils to their respective dimension tables
  • Building the report pages
