Using PowerBI to create a dynamic mark book

As teachers, if we’re going to assess and record data, then we might as well make it as informative as possible. That’s why for the last few years I’ve been exploring different ways of capturing assessment data from across multiple teachers and connecting it with the rest of the data we hold about our pupils.

My vision is for teachers to be able to easily input their data and then have access to a dynamic mark book with interactive report pages that help to spot patterns and trends in pupil progress. I also want to reduce the duplication that often occurs when teachers maintain their own mark book while also having to add data to a department mark book.

The system I established in previous years involved setting up queries in each Excel file (one file per teacher) to append all the data before it could be added to our data model. This process took too much time and certainly wasn't scalable beyond a handful of teachers.

This academic year, I've figured out how to make this work without a single query in an Excel file, and so it is being offered as a tool colleagues can use in the senior school at Wellington. I will go through the technical aspects of this setup in another post; for now, the focus will be on the visuals the end user experiences.

Each teacher has an Excel file (saved in a SharePoint folder), and each sheet is pre-populated with the details of the pupils in each of their classes, along with columns describing the assessment being recorded. Once the assessment data is input, it is automatically added to the PowerBI data model via a dataflow. The result is that this data can now be analysed in relation to report grades, exam outcomes, baseline scores and pupil profiles.

Pupil overview page

The first page is designed to give an overview of a pupil's assessment data in either a single subject or a range of subjects. It includes their average scores in each subject and those scores broken down by topic, sub-topic, assessment type and strand. Teachers don't have to enter all of this information, but the more assessment detail is included, the more useful the tool becomes.

The pupil overview page breaks down a pupil's progress within a particular subject.

On the right-hand side is a table that shows details about each assessment (1). This includes a code indicating whether it was submitted late (L), was incomplete (I) or whether the pupil was asked to redo it (R). Teachers can also add comments (2) when inputting the data, such as areas to improve upon in the next assessment.

The line graph shows a pupil's progress over time (3). You can hover over each point and a tooltip will load with information about the assessment completed on that particular day.

The slicers at the top of the page are designed to make it easy for each teacher to narrow down the range of pupils shown in the pupil slicer on the left of the screen. Slicers are also included that show only high-stakes or low-stakes assessments (4). This provides the flexibility to record both summative test results and lower-stakes assignments such as retrieval quizzes, vocab tests and homework. Using the slicer, a teacher can then choose whether to show all the assessment data or just the high-stakes or low-stakes entries.

The final slicer allows a user to switch between the assessment percentage measure and the assessment level measure (5). The levels, which we use on our reporting system, run from 9-1 for IGCSE and 7-1 for IB. Adding this information improves the comparability between subjects.
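For those curious about how the switching works, it follows the slicer parameter table pattern from the tutorial linked under Additional resources. A minimal sketch in DAX, assuming hypothetical measure names such as [Avg Assessment %] and [Avg Assessment Level] (the actual names in our model may differ):

```
-- Disconnected table for the slicer (created with "Enter data" or as a
-- calculated table); it has no relationship to the assessment data.
Measure Selector =
DATATABLE ( "Measure", STRING, { { "Percentage" }, { "Level" } } )

-- Switching measure: reads the slicer selection and returns the matching
-- underlying measure, defaulting to percentage when nothing is selected.
Assessment Result =
SWITCH (
    SELECTEDVALUE ( 'Measure Selector'[Measure], "Percentage" ),
    "Level", [Avg Assessment Level],
    [Avg Assessment %]
)
```

The visuals then all use [Assessment Result], so a single slicer on 'Measure Selector'[Measure] flips every chart at once.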

Subject overview page

The subject overview report page is designed to analyse patterns within a department and allow comparisons with other departments.

The assessment vs. CAT4 scatter graph (top left) is a quick and easy way to see which pupils might be under-achieving, which require additional support or further challenge, and which should be celebrated. In this instance, I've added the new zoom slider feature so that you can zoom in and out of sections of the graph. Personalisation using a legend perspective has also been enabled so that the user can change the legend; in this case I've selected gender, but I could have selected year group, SEN or EAL status, teacher and so on.

The attainment differences chart (middle left) highlights each pupil's attainment in the selected subject versus their average across all subjects. A tooltip has also been added to show the data behind each pupil. Conditional formatting has been applied so it is easy to see who is doing better (green), about the same (amber) or worse (red) in your subject compared with their others.
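As a rough sketch of how the difference and its conditional formatting could be written in DAX, assuming a single flat Assessments table with Subject and Percentage columns (the real model, and the thresholds, may well differ):

```
-- Pupil's average in the subject(s) currently in filter context minus
-- their average across all subjects.
Attainment Difference =
VAR SubjectAvg = AVERAGE ( Assessments[Percentage] )
VAR AllSubjectsAvg =
    CALCULATE (
        AVERAGE ( Assessments[Percentage] ),
        REMOVEFILTERS ( Assessments[Subject] )
    )
RETURN
    SubjectAvg - AllSubjectsAvg

-- Colour measure for conditional formatting (set via the Format pane ->
-- fx -> Field value). The +/-5 percentage point thresholds are illustrative.
Attainment Difference Colour =
SWITCH (
    TRUE (),
    [Attainment Difference] >= 5, "#2E8B57",   -- green: better than other subjects
    [Attainment Difference] <= -5, "#C00000",  -- red: worse than other subjects
    "#FFBF00"                                  -- amber: broadly the same
)
```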

The assessment data table (6) simply shows assessment results by pupil, as on the pupil overview page, whilst the attainment over time pivot table (7) shows report grades over time.

Assessment analysis page

The final page that I would like to share is the assessment analysis page. Once again, this helps to identify potential under-achievement and those who require additional support, but it also helps to validate attainment grades against assessment data.

After applying the relevant slicers, we can start to analyse the data. The key questions to ask on this page are who is doing better or worse in assessments than their CAT4 data would suggest (top-left scatter graph), and to what extent our assessment data informs our attainment grades (bottom-left scatter graph).

At Wellington, assessment results are not the only information we use to determine attainment grades (report grades); however, we might reasonably expect that a pupil who does well in assessments will also get a high report grade, and vice versa (1). This should result in a reasonably strong correlation. Where it does not, there may be legitimate reasons, but it is certainly a question worth asking.

For example, the pupil identified in the tooltip (2) has an average percentage of 65.2% and an average attainment grade of 7.6. We could argue that this pupil has been slightly over-reported, since there are pupils with the same average percentage achieving a lower average attainment grade. Obviously, context is key here and the attainment grade might be entirely justifiable. However, where significant variations exist, they can call into question the efficacy of the reporting or the quality of the assessment.
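To put a number on how strong that relationship actually is, a Pearson correlation measure can sit alongside the scatter graph. A sketch, again assuming hypothetical measure names [Avg Assessment %] and [Avg Attainment Grade] and a Pupils table with a PupilID column:

```
Assessment vs Attainment Correlation =
VAR Points =
    FILTER (
        ADDCOLUMNS (
            VALUES ( Pupils[PupilID] ),
            "@x", [Avg Assessment %],
            "@y", [Avg Attainment Grade]
        ),
        NOT ISBLANK ( [@x] ) && NOT ISBLANK ( [@y] )
    )
VAR N     = COUNTROWS ( Points )
VAR SumX  = SUMX ( Points, [@x] )
VAR SumY  = SUMX ( Points, [@y] )
VAR SumXY = SUMX ( Points, [@x] * [@y] )
VAR SumX2 = SUMX ( Points, [@x] ^ 2 )
VAR SumY2 = SUMX ( Points, [@y] ^ 2 )
RETURN
    DIVIDE (
        N * SumXY - SumX * SumY,
        SQRT ( ( N * SumX2 - SumX ^ 2 ) * ( N * SumY2 - SumY ^ 2 ) )
    )
```

A value close to 1 suggests assessments and report grades broadly agree; a noticeably lower value is a prompt to dig into individual pupils.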

The pivot table shows the same data as the scatter graphs. The first column (3) shows the difference between the CAT4 rank and assessment rank. The first pupil has a CAT4 rank of 55 and an assessment rank of 1, resulting in a difference of +54. This means that the pupil is doing much better than we might expect based upon their CAT4 data.

The next column (4) highlights the difference between the assessment and attainment ranks. The second pupil has an assessment rank of 2 and an attainment rank of 9, giving an overall score of -7. This could suggest they are being under-reported. If we only used assessments to define attainment grades, then the assessment and attainment rank difference would be 0 for every pupil – each pupil's assessment rank would match their attainment rank.
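The rank columns themselves can be built with RANKX. A rough sketch, where the Pupils table and the base measure names ([Avg CAT4 Score] and so on) are assumptions rather than the real names in our model:

```
-- Rank pupils (1 = best) over whichever pupils the slicers leave in scope.
CAT4 Rank =
RANKX ( ALLSELECTED ( Pupils ), [Avg CAT4 Score], , DESC, DENSE )

Assessment Rank =
RANKX ( ALLSELECTED ( Pupils ), [Avg Assessment %], , DESC, DENSE )

Attainment Rank =
RANKX ( ALLSELECTED ( Pupils ), [Avg Attainment Grade], , DESC, DENSE )

-- Positive = doing better in assessments than CAT4 would predict
-- (e.g. CAT4 rank 55, assessment rank 1 gives +54).
CAT4 vs Assessment Rank Diff =
[CAT4 Rank] - [Assessment Rank]

-- Negative = report grade ranks below assessment performance, i.e.
-- possibly under-reported (e.g. assessment rank 2, attainment rank 9 gives -7).
Assessment vs Attainment Rank Diff =
[Assessment Rank] - [Attainment Rank]
```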

To conclude

As teachers we record a lot of assessment data, but some of its power can be lost when it is not linked to attainment and baseline data. By integrating it into PowerBI, it becomes easier to identify patterns and trends, ask informed questions and find opportunities to make improvements.

Additional resources

To learn how to create the assessment measure (level or percentage) that can change with the use of a slicer, check out this tutorial: https://radacad.com/change-the-column-or-measure-value-in-a-power-bi-visual-by-selection-of-the-slicer-parameter-table-pattern

Adding personalisation capabilities to charts is demonstrated in this video by Guy In A Cube.
