Overview

A quality assurance system, developed in conjunction with the hub's providers, helps manage performance and allocate resources to the areas of greatest need. Each provider is allocated to a risk band that governs the frequency of its audit and quality assurance visits. The system has the commitment of all providers and is seen as fair, transparent and open.

How you can use this

Any hub or training provider that uses a third party to deliver training could adapt the banding system. Colleges, work-based learning providers and prisons could use it with subcontractors. Work-based learning providers could also use it for employers that provide training, and colleges could use it with franchised provision. It needs appropriate staffing to make it work, and all the parties involved need to support the idea. It makes a useful performance management tool.

Other aspects of Kent and Medway Hub's quality assurance system that work particularly well are the regular visits to centres by area managers to spread good practice and maintain contact with the 'parent body'. These could be applied to any training provider with geographically spread training, such as a local authority with a number of adult and community learning centres.

How it works

Until November 2002, Kent and Medway Hub had few hub-wide quality assurance processes or procedures. Setting up a formal hub quality assurance framework and structure was a priority for the newly appointed hub manager. Whatever was put in place needed to be open and transparent, and had to be applied to all the hub's providers in order to bring about a change in culture. The hub wanted to integrate the quality assurance and learning processes in a simple quality cycle, and decided to use an external consultant to develop the system. Although the development of the quality management system (QMS) was driven by the hub, the hub manager ensured that it actively involved all the providers, so that they had a sense of ownership of the system.

The result was a clear quality assurance framework, based on a simple three-tier model. This establishes very clear parameters, but allows individual learning providers to create their own operational models, fit for their specific purpose, as long as they comply with the hub's requirements. This pragmatic approach has enabled the hub to win over the hearts and minds of all those involved. Providers appear to welcome the culture shift, applying quality assurance procedures as part of their performance management. Key documents and procedures, such as the learner charter, the claims process and the self-assessment guidelines, are used by everyone.

Two area managers and a quality manager have been appointed, and they visit all the learning centres regularly. The area managers each cover half of Kent, and visit each centre once a month. This creates positive working relationships and helps to identify issues, problems and difficulties. It also allows them to provide advice and consultancy and share good practice. The relationship between the hub and the providers has become more active, helping to build a genuine culture of continuous improvement. The quality assurance manager makes separate visits, which can include training observations. The managers all have a thoughtful approach to their work. They talk to learners as equals, and are careful to avoid being perceived as authority figures.

Clear distinction is made between quality assurance and audit. Audit is the mechanistic process that has to be followed to satisfy the requirements of the hub, funding bodies, inspectorates and Ufi itself. Much of the audit process is quantitative, requiring providers to use prescribed paperwork and meet defined deadlines. Audit data, when processed, are an integral part of the hub's decision-making process, and are used at local and strategic levels. They have been essential in driving up standards, and in ensuring fairness, equity, and openness.

The hub has implemented an innovative 'traffic light' system that is used for both quality assurance and audit. In both, the system determines how often the hub will visit centres. This has a double benefit. The providers that comply with the hub's requirements and are doing well get fewer visits and can focus on providing and supporting learning opportunities, while the hub's resources are targeted at those that most need them. This helps to keep the hub's running costs down, giving the providers more resources to put into their facilities and services.

The quality indicator system uses the accuracy and timeliness of data returns, actual versus target enrolment and completion numbers, and other pre-planned criteria. There is clear guidance so that providers know exactly what constitutes good, satisfactory and unsatisfactory performance. A provider that is operating well is given a green light and only gets one full quality assurance visit a year. One that is performing adequately gets an amber light and one full quality assurance visit a quarter. If a provider's performance is causing concern, it gets a red light and a full quality assurance visit every month.
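The banding rules above amount to a simple lookup from light to visit frequency. A minimal sketch in Python may make that clearer; the hub's actual system is a paper-based process, not software, so the names here are illustrative assumptions, while the visit frequencies themselves come from the description above:

```python
# Hypothetical sketch of the quality indicator bands described above.
# The visit frequencies (green: yearly, amber: quarterly, red: monthly)
# are from the case study; the function and variable names are assumptions.

QA_VISITS_PER_YEAR = {
    "green": 1,   # performing well: one full QA visit a year
    "amber": 4,   # performing adequately: one full QA visit a quarter
    "red": 12,    # performance causing concern: a full QA visit every month
}

def qa_visit_frequency(light: str) -> int:
    """Return the number of full quality assurance visits per year for a band."""
    return QA_VISITS_PER_YEAR[light]
```

The point of the table is its transparency: a provider can see exactly how its band translates into visit frequency, which is part of why the system is accepted as fair.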

All quality assurance visits have formal, pre-agreed agendas, and any discussions that take place are fully minuted. Key action points are identified, and if two key action points are not improved significantly between visits the provider can be given a red light. If it remains on a red light for two months, its contract can be terminated, although the hub will take into account mitigating circumstances such as substantial staffing changes. There is a monthly review of providers' status, and an appeals procedure with the final decisions made at board level. Red lights can also be used in response to serious complaints.
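The monthly review applies the escalation rules above mechanically, which is what makes them feel even-handed. A sketch of that logic, under the same caveat that the data structures and names are assumptions while the thresholds (two unimproved key action points, two consecutive months on red) come from the text:

```python
# Hypothetical sketch of the monthly status review described above.
# Thresholds are from the case study; the function signature is an assumption.

def review_provider(light: str, unimproved_action_points: int,
                    months_on_red: int,
                    mitigating_circumstances: bool = False) -> tuple[str, str]:
    """Apply the escalation rules at a monthly provider status review."""
    # Two key action points not improved significantly between visits -> red.
    if unimproved_action_points >= 2:
        light = "red"
    # Two months on red can lead to contract termination, unless there are
    # mitigating circumstances (e.g. substantial staffing changes); final
    # decisions on appeal are made at board level.
    if light == "red" and months_on_red >= 2 and not mitigating_circumstances:
        return light, "contract termination may be considered (subject to appeal)"
    return light, "continue monitoring"
```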

The audit indicator system follows the same principle, but providers' compliance is checked by looking in detail at learners' records and individual learning programmes. If a provider gets a green light, 10 per cent of its learners' records are scrutinised quarterly. If it gets an amber light, 25 per cent are scrutinised every two months. If it gets a red light, all its records are checked once a month.
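The audit sampling rates can be sketched in the same way. The percentages and intervals below are taken from the description above; the names, and the choice to round the sample size up, are illustrative assumptions:

```python
import math

# Hypothetical sketch of the audit sampling rates described above.
# green: 10% of learner records quarterly; amber: 25% every two months;
# red: all records checked once a month.

AUDIT_SAMPLING = {
    "green": {"share_of_records": 0.10, "interval_months": 3},
    "amber": {"share_of_records": 0.25, "interval_months": 2},
    "red":   {"share_of_records": 1.00, "interval_months": 1},
}

def records_to_check(light: str, total_records: int) -> int:
    """Number of learner records to scrutinise at each audit check."""
    return math.ceil(AUDIT_SAMPLING[light]["share_of_records"] * total_records)
```

Note how the escalation is two-dimensional: a worse light increases both the share of records checked and the frequency of checks, so the audit effort concentrates sharply on the providers causing concern.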

As a result of this process, the hub receives very detailed information on the performance of its learning providers. Specific improvement targets are agreed, with the incentive that if they are achieved, the provider can 'move up' a light. This model is very effective and is supported by providers, who see it as fair and just. The hub considers it a key component in its improved performance.