Last reviewed 4 April 2016

The inspection cycle changed in September 2015. Tony Powell, consulting educationist, explains that outstanding and exempt schools are still not inspected unless there is strong evidence that they have deteriorated.

Schools that require improvement or are in a category of concern continue with a high level of monitoring, the frequency of which is largely determined by the monitoring HMI.

Schools graded “good” at their previous inspection are now inspected on a three-year cycle, under which they receive a one-day inspection visit from a single HMI. To make this possible, Ofsted has changed its inspection methodology, including introducing the inspection dashboard. This article looks at some of the important features of the inspection dashboard.

Inspection methodology

The starting point for visits to good schools is a presumption that the school remains good, which justifies deploying only one HMI for one day. This means that the evidence gathered is much more limited. This is partly an extension of the principle of inspection in inverse proportion to success, and partly a matter of funding, since Ofsted must reduce its overall budget by £32 million by 2020.

HMI will pose two questions.

  1. Is this still a “good” school?

  2. Is safeguarding effective?

If, during the first day, the evidence suggests that the school is no longer good, the school is informed, the inspection is converted to a Section 5 inspection and additional inspectors are brought in for a second day.

Only a Section 5 inspection can change the school’s grade. Schools that wish to be upgraded from good to outstanding should take note of this and present supportive evidence from the very start of the inspection.

Inspection dashboard

The intention of the dashboard is to give inspectors and schools an overview of the data, highlighting the most important aspects of the school’s context and of outcomes for pupils. This supports a dialogue between inspectors and schools. Since the HMI has only limited time to prepare, the onus is on the school to analyse any issues raised by the inspection dashboard and present its own evidence to explain strengths and weaknesses.

The training package

The inspection dashboard is not difficult to interpret but it uses different conventions, so the starting point for schools is to familiarise themselves with these by using the training package from the RAISEonline library. The “Inspection Dashboard and PANDA” pack contains general and technical guidance and anonymised examples. The data manager should use the guidance materials to train senior leaders and governors to interpret the tables and respond to questions during an inspection.

Strengths and weaknesses

The first page of the dashboard lists strengths and weaknesses, which can then be analysed in the detailed charts and tables. Check the school’s strengths and weaknesses against the full list in the training package. This will identify the school’s performance more precisely; for example, a weakness may be identified for one group of pupils that does not apply to other groups.

HMI will use these statements to plan the inspection, so it is very important that the school analyses them fully and identifies causal factors. This applies to strengths as well as weaknesses, because inspectors are expected to identify any aspects of the school’s work that need to be improved or, indeed, disseminated. Although HMI will not use RAISEonline, the school should, because it contains additional and more detailed information on performance.

For example, where there is an identified weakness for a group, such as special needs, it is legitimate for the school to point out that the number of pupils was very small and that within this, the results were skewed by one pupil. However, where there is a general weakness or strength, such as in a subject, the school should identify the underlying causes in care and guidance, and/or curriculum, and/or teaching.

Inspectors will also want to know that the school is acting on its findings. Any general weaknesses should therefore appear as priorities in the improvement plan.

Floor standards

Also on the first page is a table showing the school’s performance against the national floor standards. If these are not met, it is very unlikely that the school will be presumed to be good and therefore be subject only to a short inspection. Schools will, of course, know this well in advance.

The data in context

If a “good” school receives a short inspection, the weaknesses identified in the inspection dashboard were, by definition, not serious enough to suggest the school is less than “good”; otherwise it would have received a full Section 5 inspection. Nevertheless, evidence gathered on the first day may raise concerns. In that case, the lead HMI will convert the inspection to a Section 5 and call in additional inspectors.

The data in the dashboard covers the past three years. During the inspection itself, however, the focus is on pupils currently in the school. Most weight will be given to the current progress of all groups of pupils, in all year groups and across the curriculum. If a group identified in the inspection dashboard is making poor progress at the time of the inspection, this suggests systemic weaknesses in educational provision and will certainly affect the judgement on leadership and management.

Academic outcomes

Progress is the most important performance measure and the first section after the title page. Inspectors will be looking for trends over time, differences between subjects and between prior attainment points, and especially differences between disadvantaged and non-disadvantaged pupils.

Schools should first note the percentage coverage, then the cohort size under each of the prior attainment bars. This is the total number of pupils within that group, not the number that made expected or better progress. Also check the “in-gap” number, a calculated figure showing the number of pupils represented by the gap between the school percentage and the national percentage. If this number is plus or minus one, it is interpreted as close to the national average.
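As an illustration only (this is a plausible reading of the figure, not Ofsted’s published formula), the in-gap number can be thought of as the percentage gap applied to the cohort size:

```python
def in_gap(school_pct, national_pct, cohort):
    """Approximate number of pupils represented by the gap between the
    school's percentage and the national percentage (illustrative only)."""
    gap = school_pct - national_pct
    return round(gap * cohort / 100)

# A six-point gap in a cohort of 30 represents roughly two pupils:
print(in_gap(85.0, 91.0, 30))   # -2

# The same gap in a cohort of 15 is within plus or minus one pupil,
# so it would be read as close to the national average:
print(in_gap(85.0, 91.0, 15))   # -1
```

This is why a small cohort matters: the same percentage gap can represent fewer pupils than the plus-or-minus-one threshold.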

Closing the gaps

Paragraph 178 of the Ofsted handbook states: “Inspectors will take particular account of the progress made by disadvantaged pupils by the end of the key stage compared with that made nationally by other pupils with similar starting points and the extent to which any gaps in this progress, and consequently in attainment, are closing.”

Value added

When analysing value added, note in particular the confidence interval, which is calculated from the size of the cohort: the larger the cohort, the smaller the interval. This means the interval may change over time as a school grows or shrinks, and in secondary schools there will be different intervals for different subjects.

Do not read too much into small differences. Unless the result lies entirely outside the confidence interval, it should be interpreted as “not significantly different from average”.
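A minimal sketch of this reading, using hypothetical numbers (the score, national average and interval widths below are illustrative, not RAISEonline’s actual values; the 1/√n scaling is the usual statistical behaviour, not the published calculation):

```python
import math

def significance(score, national_avg, half_width):
    """Read a value-added score against its confidence interval."""
    low, high = score - half_width, score + half_width
    if low > national_avg:
        return "significantly above average"
    if high < national_avg:
        return "significantly below average"
    return "not significantly different from average"

# The same score can read differently depending on cohort size, because a
# larger cohort gives a narrower interval (roughly proportional to 1/sqrt(n)).
base_half_width = 30.0          # hypothetical half-width for a cohort of 25
for cohort in (25, 100):
    half_width = base_half_width * math.sqrt(25 / cohort)
    print(cohort, significance(1020.0, 1000.0, half_width))
# cohort 25  -> interval 990-1050 straddles 1000: not significant
# cohort 100 -> interval 1005-1035 clears 1000: significantly above
```

The point for schools is the converse: in a small cohort, an apparently large difference may still sit inside the interval and carry no statistical weight.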

Personal development, behaviour and well-being

The inspection dashboard is also one of the starting points for inspecting personal development, behaviour and well-being. An obvious area for analysis is attendance, since consistently low attendance overall or for particular groups is one of the descriptors for an inadequate judgement. Inspectors will particularly follow up poor attendance over time for groups such as disadvantaged pupils or those with special educational needs.