Author
Lyn Richards

Pub Date: 11/2009
Pages: 256

Title: REMS

Authors: Clare Tagg (Tagg Oram Partnership, www.taggoram.co.uk), Somia Nasim and Peter Goff (The Qualifications & Curriculum Development Agency, www.qcda.org.uk)

Reporting the Project

Reporting and reviewing are key components of REMS, as they allow the team and stakeholders to assess the reliability, robustness, usefulness and comprehensiveness of the system. A thorough reporting strategy has been devised and implemented to ensure the analysis is consistently reported and shared with stakeholders.

The key aspects of the strategy are:

  • To produce quarterly reports that are concise and not over-burdensome for the group interested in the analysis area. These reports run 2–8 pages, in the style of a briefing paper, using plain language with technical information kept to a minimum.
  • Quarterly reports include a 'What's in REMS?' report and thematic reporting on emerging themes or core policy areas (see examples at www.taggoram.co.uk/research/cases/REMS).
  • Ad hoc reports, produced at colleagues' request or on urgent issues that require immediate attention and communication (see examples at www.taggoram.co.uk/research/cases/REMS).
  • Overall evaluation at key points for the government department: the system is designed to report on a continuous, quarterly basis, but it is also used at crucial programme, end-of-year, review and evaluation points.
  • Validation of the analysis: the findings in the reports are always validated by the strand manager or the relevant expert in QCDA. Once signed off, the reports are communicated to senior management on a quarterly basis.
  • Reporting online. Now that the reporting process is finalised, the REMS team are designing an online page for the QCDA intranet and website. This is being developed as another avenue for communicating what REMS is, project progress and evidence reports.
  • All analysis and reporting influences the direction and development of the database; therefore, each quarter the system is reviewed and evaluated, and any changes needed to improve it are made at that point.
  • The system is never static: producing a report is not an end point. Reporting may prompt more detailed analysis of an area highlighted in the report, or an update the following quarter. This continuous reanalysis is made easier by saving the NVivo queries used for the reporting.

Making use of REMS

As a result of this intensive, targeted reporting and communication, the team has generated considerable interest in the potential and use of REMS.

The success of the system has enabled the team to provide a service to external stakeholders, which is one of our major achievements. From the outset, the team deemed it important that REMS be used by both internal and external stakeholders. During the exploratory stage of the project, it became apparent that an analysis and reporting service would need to be provided to allow evidence reporting for stakeholders: it was clear that it would not be practical for users to work with the NVivo system directly, because of the difficulties of providing appropriate access to the database and the skills needed to get the best out of NVivo.

In addition to work for the Qualifications & Credit Framework, Adult Skills and Lifelong Learning & Skills programmes, the success of the 14-19 Reform REMS project has led to requests from other programme teams in QCDA to devise similar systems. This is an impressive achievement, but the extension to other areas has resulted in the following questions:

  • Do we incorporate all programme areas into a single REMS system?
  • Will the coding framework and other functionalities work for these different programme areas?
  • If the systems are kept separate, how do we handle evidence sources that relate to more than one area, or to all of them?

Work has started on other programmes, and the decision has been taken to use an individual system for each programme, but to design them with the potential to merge into one REMS master. This will allow us to review and evaluate across programme, core policy and thematic areas; to identify issues and good practice; and to ensure policy is evidence-based.
