Data can be used to tell the story of how well a provider is doing and to help set targets for improvement. Used properly, it is a key tool for improving performance. It is a crucial source of evidence for self-assessment reports, funding bodies and inspection.

How does the way you use data to support quality improvement compare with that of the most effective provision seen on inspection?

The following strengths and areas for improvement have been taken from recent inspection reports across the Ofsted Learning and Skills remit.

Common inspection strengths

  • Good collection and use of management information data
  • Very effective use of data to improve programmes
  • Good use of data to monitor and improve training
  • Good use of data to set targets for assessors and learners

Common inspection areas for improvement

  • Poor collection and use of management information data
  • Inadequate use of data to inform staff
  • Insufficiently developed management information systems to support quality improvement
  • Insufficient strategic use of data

If your last inspection identified a similar area for improvement, if you identified it in your own self-assessment, or if you simply want to avoid such a judgement, consider what inspectors judge to be key.

Particularly effective practice identified in inspections includes:

  • Managers recognising the importance of data in the everyday and strategic management of training (too many providers collect data to meet funding body requirements but are not aware of how to use it to inform quality improvement activities).
  • Analysing how learning programmes are delivered in order to know what data needs to be collected, and putting systems in place to do so (this includes the impact of marketing in attracting learners, the performance of all learners and of different groups of learners, and where learners go on leaving or finishing programmes).
  • Making data an everyday part of job roles so that those involved in delivering programmes are aware of how data can indicate how well things are going, for example learner attendance or progress in completing assessments. Providers who have focused on this have come up with innovative ways of checking performance during programmes, such as defining a model (typical) learner with average milestones indicated. Real learners can then be 'tracked' against this model learner to see how they are progressing (a minimal sketch of this approach follows this list).
  • Using data as an early indicator of emerging areas for improvement - often by comparing key performance indicators such as early drop-out of learners. Some providers look at retention at two months, six months and the end of the year to compare performance year on year (see the retention sketch after this list). They also look at other possible performance factors, depending on the inspection context, such as when key skills are achieved in work-based learning. Picking up problems early from data at a snapshot in time enables providers to take early preventative action. For example, rather than waiting for a final success rate of 30% on a particular apprenticeship programme to reveal poor performance, they recognise at six months that slow progress in assessments points to a potential problem, such as unsupportive employers, and do something about it.
  • Using past (historic) performance data to set improvement targets for staff such as assessors (managers know from this data who is performing well or poorly, and can put support in place to make improvements or identify good practice to share with others in discussions with staff).
  • Identifying, collecting and using data where a change has been made in some aspect of provision, to measure the impact of any quality improvements which have been introduced. When inspectors ask for examples of improvements, these providers are not only able to give them examples but can demonstrate that they have worked.
  • Not only taking into account the data needed by funding bodies and inspectorates when deciding which data to collect, but also analysing it themselves for quality improvement.
  • Collecting data in a consistent way to examine trends over a period of time, typically over a three-year period.
  • Using simple graphs to present and analyse trend data, as they are quicker to understand.
  • Looking at performance during management and quality improvement meetings.
  • Including full analysis of different groups of learners in reviews of programmes.
  • Taking action if data reveals developing problems.
  • Comparing data from different sources to gain insights into an organisation's performance (benchmarking). Comparisons can be made between different programmes or between different providers offering similar provision. For this to be successful data has to have been collected in a similar way.
  • Sharing relevant data with staff and encouraging them to analyse and use it.
  • Setting targets based on data that are understood and used by all involved in the delivery of training.
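
As an illustration of the 'model learner' approach described in the list above, the following minimal sketch (in Python, using pandas) compares each real learner's completed assessments against the milestones an average learner would be expected to have reached by the same week. The milestone figures, column names and flag threshold are all invented for illustration; they are not taken from any real provider or required system.

```python
# A minimal sketch of tracking real learners against a "model" learner.
# All names and figures are illustrative, not taken from any real provider.
import pandas as pd

# Model learner: assessments a typical learner is expected to have completed
# by a given week of the programme (week -> expected assessments completed).
model_milestones = pd.Series(
    {4: 1, 8: 2, 12: 4, 20: 6, 26: 8},
    name="expected_assessments",
)

# Real learners: current week on programme and assessments completed so far.
learners = pd.DataFrame(
    {
        "learner": ["A", "B", "C"],
        "week": [12, 12, 20],
        "assessments_completed": [4, 1, 5],
    }
)

# Look up what the model learner would have achieved by each learner's week.
learners["expected"] = learners["week"].map(model_milestones)
learners["behind_by"] = learners["expected"] - learners["assessments_completed"]
learners["flag"] = learners["behind_by"] >= 2  # flag anyone two or more behind

print(learners)
```

The threshold of two assessments behind is arbitrary; the point is that the comparison is mechanical, so it can be run for every learner at every progress review rather than waiting for end-of-programme results.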
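
The retention snapshots mentioned above (at two months, six months and the end of the year, compared year on year) lend themselves to the same treatment, and to the simple trend graphs recommended later in the list. A possible sketch, again in Python with invented figures:

```python
# A minimal sketch of comparing retention snapshots year on year and
# presenting them as a simple graph. Figures are invented for illustration.
import pandas as pd
import matplotlib.pyplot as plt

retention = pd.DataFrame(
    {
        "2 months": [0.95, 0.93, 0.88],
        "6 months": [0.86, 0.84, 0.74],
        "12 months": [0.78, 0.77, 0.70],
    },
    index=["2021/22", "2022/23", "2023/24"],
)

# A year-on-year drop at the six-month snapshot shows up long before the
# final success rate is known, so action can be taken in-year.
print(retention)

retention.plot(kind="bar", ylim=(0, 1))
plt.ylabel("Proportion retained")
plt.title("Retention snapshots by year (illustrative data)")
plt.tight_layout()
plt.show()
```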

Useful data may include some or all of the following:

  • How learners get to you is seen as a 'funnel' by the best providers: wider at the top (larger numbers of potential learners) where enquiries are generated, and narrower at the bottom (smaller numbers) where learners are enrolled and start programmes of learning. The straighter the walls of the funnel, the more enquiries result in enrolments (a short conversion-rate sketch follows this list):

    • enquiries (who has shown interest in a particular programme - indicating the success of marketing strategies and activities - broken down as far as possible by where they heard of the programme, then if possible by gender, ethnicity, disability, age and learning support needs).
    • interviews (who has applied for a particular programme - indicating the success of marketing strategies and activities - broken down as far as possible by where they heard of the programme, then by gender, ethnicity, disability, age and learning support needs).
    • enrolments (who has started a particular programme - broken down by gender, ethnicity, disability, age and learning support needs).
  • How learners perform:

    • attendance (broken down by different parts of the programme to indicate whether any parts are not popular - for example, key skills - so that reasons can be established and improvements made).
    • achievement or success rates classified by qualification (including full and partial achievement), mode of attendance and learners' previous qualifications (value-added data can indicate that learners are doing well compared with similar groups of learners), broken down by gender, ethnicity, disability, age and learning support needs to see under- and over-performance by particular groups (see the breakdown sketch after this list).
    • retention rates (broken down by gender, ethnicity, disability, age and learning support needs to see under and over-performance by particular groups).
    • reasons for learners leaving early, collected independently (providers have recognised that a learner is less likely to say "because of poor teaching" to the person who taught them, but will be more honest if asked by someone independent who wants to improve training for other learners).
    • destinations of leavers, whether early leavers or those who have completed. Some providers are able to demonstrate progression into related employment and onto more advanced related qualifications, giving another positive performance measure (again broken down by gender, ethnicity, disability, age and learning support needs to see under- and over-performance by particular groups).
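
As a rough illustration of the recruitment 'funnel' described in the first bullet above, the short sketch below (Python; the stage names and counts are invented) computes the conversion rate at each stage, so a widening gap between enquiries and enrolments stands out immediately.

```python
# A minimal sketch of the recruitment "funnel": conversion at each stage.
# Stage names and counts are illustrative only.
import pandas as pd

funnel = pd.DataFrame(
    {"count": [400, 180, 120]},
    index=["enquiries", "interviews", "enrolments"],
)

# Conversion from each stage to the next, and the overall
# enquiry-to-enrolment rate (the "straightness" of the funnel walls).
funnel["conversion_from_previous"] = funnel["count"] / funnel["count"].shift(1)
overall = funnel.loc["enrolments", "count"] / funnel.loc["enquiries", "count"]

print(funnel)
print(f"Overall enquiry-to-enrolment conversion: {overall:.0%}")
```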
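
Similarly, breaking success rates down by learner group, as suggested under 'how learners perform', is a simple grouped calculation once the data has been collected consistently. A possible sketch with hypothetical fields and records:

```python
# A minimal sketch of breaking success rates down by learner group.
# Column names and records are hypothetical.
import pandas as pd

records = pd.DataFrame(
    {
        "gender": ["F", "M", "F", "M", "F", "M", "F", "M"],
        "age_band": ["16-18", "16-18", "19+", "19+", "16-18", "19+", "19+", "16-18"],
        "achieved": [1, 0, 1, 1, 1, 0, 1, 0],
    }
)

# Success rate for each group, alongside group size so that small groups
# are not over-interpreted.
by_group = records.groupby(["gender", "age_band"])["achieved"].agg(
    success_rate="mean", learners="count"
)
print(by_group)
```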

The best providers always ask themselves: "if we don't use any of the data we collect, why are we collecting it?"

Healthcheck questions

Do you know who your learners are (ethnicity, gender, disability, age)?

Where have your learners come from, and how did they hear about you (where they live, last school attended, employer)?

Do you monitor retention, partial achievement, success and progression rates in each area of learning?

Do you do this for different groups of learners (gender, ethnicity, disability, age and learning support needs)?

Do you analyse the data routinely at different stages to identify problems (examples)? 

Do you compare your data with data from other sources (examples)?

How do you know whether learners progress to further education, training or employment on completion of their programme?

Why do learners leave early?

What actions are identified for quality improvement based on data?

Have you set quantifiable targets for improvement?

How is data used to measure the effectiveness of the actions you have taken?

What could you do next to improve your provision?

  • Read inspection reports to identify what the best providers are doing in your particular type of provision or area of learning (also check other types of provision as good practice is usually transferable between inspection contexts - adult and community learning, college, DWP, work-based, etc). As well as looking at providers with ‘outstanding’ aspects or monitoring visit reports with judgements of ‘significant progress’, look at providers who are similar to yourself in terms of remit, size and what they offer – Ofsted inspection reports
  • Get a clearer and richer understanding of what you need to do to improve – Learner-centred self-assessment
  • Use downloadable quality-improvement resources to develop your staff team and to focus on actions that will help to improve your provision – Actions for quality improvement
  • Adopt or adapt the best bits of other providers’ work that inspection has identified as being particularly effective – Ofsted good practice database examples
  • Measure just how effective your initial-assessment system is and if your quality-improvement initiatives are working – Data projects
  • Develop a blueprint for initial assessment of your learners – Initial assessment and support
  • Check whether your self-assessment report is fit for purpose – Self-assessment surgery projects
  • Use the guidance developed by Ofsted to know what to expect and to prepare for inspection: look at the Ofsted inspection handbook for your remit or the inspection toolkit (use the search box if necessary) – inspection handbooks and toolkit
  • Use the Excellence Gateway as a first ‘port of call’ when researching areas that you would like to improve. As well as the Ofsted-related area, simple word searches will bring you a variety of information about what others in the learning and skills sector are doing to improve their provision. This is particularly useful for any newer areas that you may wish to research.