# Statutory Assessments Report

This report provides senior leaders and governors with a quick reference on school standards compared to national figures over the past three years. It fulfils a need currently not met by the DfE's Analyse School Performance (ASP) system or Ofsted's School Inspection Data Summary Report (IDSR).

ASP provides a three-year time series only for selected key stage 2 results. IDSR shows only the quintile position of a school's results over the past three years, not the underlying data.

The report includes data on the range of statutory assessments from EYFSP through to end of Key Stage Two SATs.

**Please note** that some school figures may vary from official statistics. This can happen if Insight doesn't have all data for all pupils. For example, data will be missing for a pupil who left school before you started using Insight.

Progress scores are not currently calculated, so must be manually entered.

In accordance with DfE guidance for pupils who change schools, Key Stage One pupils are filtered to those who were on roll as of 31 May in the year of their KS1 statutory assessments.

You can override any of the figures by clicking on the graphs, then entering the correct data from another source, such as ASP.

##### Report Settings

By default, the report shows the previous three years. On 1 July it automatically rolls over to display the current year (and the previous two); this places a tick in the **Show Current Academic Year** box at the top of the report. If you'd prefer to apply this rollover sooner, perhaps because your latest data is already in place, you can tick the box manually.

When the new academic year begins, the tick will be automatically removed (as the latest statutory data is now for the previous year).

##### National Data Sources

National figures come from the government's Explore our statistics and data website, for example the KS2 headlines page.

##### Interpreting Results

**Percentages:** The percentage point difference between the school's result and national data is evaluated in terms of how many pupils it represents.

This means the school's percentage can be lower than national but still considered in line, because the gap represents less than one whole pupil. This is especially useful for smaller schools.

To calculate, we multiply the number of pupils in the cohort by the percentage point difference, divide by 100, then discard the fractional part (rounding towards zero) to give a whole number of pupils.

For example, if the national result is 78% and the school result is 63%, the difference is -15 percentage points. If there are 28 pupils in the cohort, the pupil gap is 28 × -15 ÷ 100 = -4.2, which becomes -4 pupils. With just 5 pupils in the cohort, the pupil gap (5 × -15 ÷ 100 = -0.75) would be 0.
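The calculation above can be sketched in Python. This is an illustrative helper (the function name `pupil_gap` is ours, not part of Insight), showing how truncation towards zero turns small gaps into 0 pupils:

```python
def pupil_gap(cohort_size: int, school_pct: float, national_pct: float) -> int:
    """Convert a percentage point difference into a whole number of pupils.

    The fractional part is discarded (rounded towards zero), so a small
    cohort with a modest gap can come out as 0 pupils, i.e. in line
    with national.
    """
    diff = school_pct - national_pct          # percentage point difference
    return int(cohort_size * diff / 100)      # int() truncates towards zero

# Worked examples from the text:
print(pupil_gap(28, 63, 78))  # -4  (28 × -15 ÷ 100 = -4.2, truncated)
print(pupil_gap(5, 63, 78))   # 0   (5 × -15 ÷ 100 = -0.75, truncated)
```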

The bars are colour coded as follows:

**Average ELG and MTC scores:** To account for the narrow range of possible scores, the bars are colour coded as follows:

**Average scaled scores:** The evaluation here is based on how far the school's average scaled score sits from the national average, measured against the standard deviation, approximated as ±3 points.

The bars are colour coded as follows:

**Progress scores:** As with ASP and IDSR, a confidence interval is used to indicate whether progress scores are significantly above, below or in line with the national average.

Confidence intervals give an idea of uncertainty around a school's progress scores and they vary in width depending on the size of the school, with smaller schools having wider intervals.

There are three possible progress outcomes:

- Sig+: The confidence interval sits entirely above the zero line. Progress is above the national average.
- Sig-: The confidence interval sits entirely below the zero line. Progress is below the national average.
- OK: The confidence interval straddles the zero line. Progress is in line with the national average.
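The three outcomes above reduce to a simple comparison against zero. A minimal sketch (the function name `progress_outcome` is ours for illustration):

```python
def progress_outcome(ci_lower: float, ci_upper: float) -> str:
    """Classify a progress score from its confidence interval bounds."""
    if ci_lower > 0:
        return "Sig+"   # whole interval above zero: above the national average
    if ci_upper < 0:
        return "Sig-"   # whole interval below zero: below the national average
    return "OK"         # interval straddles zero: in line with national

print(progress_outcome(0.4, 2.1))    # Sig+
print(progress_outcome(-3.0, -0.5))  # Sig-
print(progress_outcome(-1.2, 0.8))   # OK
```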

##### What are the progress confidence intervals?

It is difficult to say with certainty how much of a school's progress score is down to the school (which might have scored higher with a different group of pupils) and how much is down to the pupils (for example, some may have performed well at any school). The confidence intervals reflect this uncertainty. If the confidence intervals for two schools overlap, we cannot say with certainty that their progress scores are significantly different.

Generally speaking, the greater the number of pupils, the narrower the confidence interval. For smaller schools the interval tends to be wider, since fewer pupils are included and the score is therefore more heavily influenced by the performance of individual pupils.

A school is above average if their progress score is above 0 and the whole confidence interval is above 0. Similarly, a school is below average if their progress score is below 0 and the whole confidence interval is below 0.
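To illustrate how cohort size affects interval width, here is a deliberately simplified 95% interval of ± 1.96 × SD ÷ √n. This is not the DfE's exact method (that is set out in the accountability guidance referenced below); it is only a sketch of why small cohorts produce wide intervals:

```python
import math

def confidence_interval(score: float, sd: float, n: int) -> tuple[float, float]:
    """Illustrative 95% confidence interval: score ± 1.96 × SD / √n.

    Simplified for demonstration; the DfE's exact formula differs.
    """
    half_width = 1.96 * sd / math.sqrt(n)
    return (score - half_width, score + half_width)

# The same progress score and spread, but different cohort sizes:
print(confidence_interval(1.0, 6.0, 15))   # wide interval, straddles zero: OK
print(confidence_interval(1.0, 6.0, 150))  # narrow interval, above zero: Sig+
```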

More detail on how the progress scores and confidence intervals are calculated can be found in the DfE Primary School Accountability measures guide.

##### Amending or adding school data

Clicking on any bar will display further details.

The national figure, shown as a dotted line on the charts, is given here along with the rounded number of pupils in your cohort above or below that value.

Where you have official results that differ from those in Insight, click in the Official Results box to enter them. When you click Save, the chart will reflect your amended values.

##### Adding Key Stage 2 Progress data

The Key Stage Two progress values can be found in Analyse School Performance (ASP), the DfE School Performance site (as in the example below) or your School Inspection Data Summary Report (IDSR).

Find the progress data required, then click the appropriate Progress Score column in the Insight Statutory Data report and enter the values.

Click **Save** and the chart will update with your entered values.