
Quality and Risk Assessment Framework


DRAFT as at February 2000


This framework has been developed as a basis for assessing the risk of an output area or a collection not achieving expected quality or performance standards. It has been produced primarily as a guide for assessing the risk of quality problems, but can also point to where there is a need for additional investment in statistical and IT infrastructure to reduce risk or improve performance in areas of corporate concern.

Risk is judged across four broad dimensions:-

- relevance, expertise, and adaptability and responsiveness
- quality: accuracy, coherence, interpretability (analysis, presentation) and timeliness
- accessibility of data and metadata, and client service
- efficiency, respondent management, and management of risk and performance

None of these aspects can be considered in isolation from the others. In particular, a balance is required across quality, timeliness and cost.

The assessment has been developed in the context of a world in which user needs are constantly changing, so some emphasis is placed on having direction, being adaptive, and having the right infrastructure and practices to enable an output area to meet expectations both now and into the future. There is also some emphasis on the extent of corporate fit and the achievement of corporate goals such as integration of data and responsiveness to Maori.

The framework has been prepared to allow for self-assessment by an output area, with most assessments being simply 'OK' or 'not OK'. However, some of the assessments may be best made with input from an independent assessor. Regular assessments (say annually) should show improvements in underachieving areas, particularly those needing attention to reduce the risk of inadequate performance.

While the framework relates to an output area (which generally brings together data from more than one source), it can be applied with some tailoring to a specific collection.


1.1 Relevance

It is important that an output area produces the right statistical information: information that informs current government and public decision making, research and discussion on issues in the subject field. It is also important that there is a sense of direction, whereby the statistics are enhanced or further developed so that they remain relevant or improve in usefulness.

It is usually the case that not all statistical needs in a field can be met from within allocated budgets, so there must be an up-to-date understanding of priorities, with outputs produced to reflect those priorities. There must also be a balance between the frequency and detail given to high priority needs and to less important needs, so that a range of information needs is met.

Key indicators are:-

- good understanding of who are the key users and emerging new stakeholders
- information needs and associated quality standards are regularly assessed and defined through user consultation
- good understanding exists of policy directions and issues, and the context within which policy is developed
- information supports the Maori Statistics Framework and the government's Maori outcomes assessment policy
- highest priority needs are being met
- information is produced at the right frequency to allow timely monitoring of changes
- information provides an estimate of what the key users actually want to measure, rather than a proxy that leaves users to make their own adjustments
- questionnaires, definitions, classifications reflect contemporary needs and situations
- a balance exists in the frequency and detail of outputs across priority needs so that less important needs are met less frequently
- key user commitment and funding of lower priority needs when necessary

Other factors to consider are:-

- what are the key unmet needs, in priority order?
- what would go with 20% less budget, and what extra would be done with 20% more?

1.2 Expertise

The production of relevant and reliable statistical output requires expertise in the relevant subject field and of the underlying statistical frameworks, concepts etc. It also requires understanding of the data (both source and outputs) and of the basic statistical principles underlying production of the data. In some cases the expertise can be provided by specialised staff or contracted in.

Key indicators are:-

- staff with good knowledge and understanding of subject field, source data, relevant concepts and classifications, current and emerging issues, statistical elements of operations
- expertise is maintained and not vulnerable to key staff changes
- good, up-to-date and accessible documentation
- staff in touch with counterparts in other statistical agencies, key users and relevant experts
- good understanding of key uses of data, including by Maori
- good understanding of the concepts and measurement issues related to Maori statistics and small populations in general
- contributions made to relevant professional associations, international developments

1.3 Adaptability and responsiveness

This refers to the ability of input systems to be adapted to collect different information and the time taken to respond to new demands. This will usually mean exploiting the collection infrastructure in a planned way so that there is a program of core surveys and of 'supplementary' surveys conducted at marginal cost to meet more detailed or lower priority needs, as well as an ability to collect unplanned new information with short lead times.

Key indicators are:-

- source collection infrastructure supports a mix of core and 'supplementary' data collection
- flexible systems able to respond quickly to new information needs
- low cost adaptability of input collections and output systems to changing needs


Quality has many dimensions, all of which should be determined by the uses made of the data by the key users. From a user's perspective, quality of statistics is their 'fitness for use'. Relevance and accessibility are important characteristics, and these are covered as separate aspects of performance because they have wider implications. The key dimensions of quality to be assessed here are accuracy, coherence, interpretability and timeliness.

2.1 Accuracy

Accuracy of information produced by an output area relates to the degree to which the information correctly estimates what the statistical process(es) producing the information was designed to measure. The accuracy of an estimate covers both variability arising from measurement (sample and non-sample error) and any biases in the measures. Whether the statistical processes are measuring the right thing is covered under Relevance.

Key indicators are:-

- quality measures (eg sampling errors) and indicators (eg non-response rates) regularly produced and monitored
- mean square error meets standard for key user needs
- definitions of data consistent with both user and provider understanding
- assumptions on which key measures are based remain valid
- a revisions policy which balances the need to inform users of improved estimates with possible confusion from insignificant changes
- insignificant and consistent (size and direction) differences between preliminary and final estimates
- data rebased regularly
- sample redesigned or reselected regularly to maintain sample errors within quality standards
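The mean square error indicator above combines the two components of accuracy named in this section: variability arising from measurement and bias. As an illustrative sketch only (not part of the framework, with made-up numbers), the decomposition MSE = variance + bias² can be checked numerically:

```python
import statistics

def mse_decomposition(estimates, true_value):
    """Decompose the mean square error of repeated estimates into
    variance and squared bias: MSE = variance + bias**2."""
    mean_est = statistics.fmean(estimates)
    bias = mean_est - true_value          # systematic error of the estimator
    variance = statistics.pvariance(estimates)  # spread around the mean estimate
    mse = statistics.fmean((e - true_value) ** 2 for e in estimates)
    return {"bias": bias, "variance": variance, "mse": mse}

# Four hypothetical repeated estimates of a quantity whose true value is 10.0
result = mse_decomposition([9.0, 11.0, 10.0, 12.0], 10.0)
```

For these numbers the identity holds exactly: variance 1.25 plus squared bias 0.25 gives an MSE of 1.5. Monitoring both components separately shows whether accuracy problems come mainly from sampling variability or from bias.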

2.2 Coherence

This refers to the degree to which information is brought together, confronted and presented in a logical and comprehensive way, whether from a single source or, more likely, from different surveys or sources. Time series are a particularly common form of presentation where coherence is important. Other common examples are industries, areas, and population groups such as Maori. Common definitions, classifications and methodologies are important to achieving coherence, or at least to being able to adjust for differences when the data are presented.

Key indicators are:-

- all source data measured in accordance with standards (frame, statistical units, definitions, classifications, processes)
- data presented in a framework, along with other relevant data from other sources
- long term time series are available for repeated measures, with explanations or adjustments for breaks in the series
- input data from different sources are confronted and reconciled
- output data is consistent or reconcilable with other sources
- concordances available to allow data to be related to previous classification versions or related classifications
- consistency between aggregates and components
- key classifications are maintained so that comparability over time allows the comparisons users require, while still providing measures of contemporary and emerging events
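A concordance, mentioned in the indicators above, is simply a mapping that lets data coded to one classification version be expressed under another. A minimal sketch, with hypothetical classification codes invented for illustration (real concordances are published alongside classification revisions and can be many-to-many):

```python
from collections import defaultdict

# Hypothetical concordance from old classification codes to new ones.
# This sketch covers only the many-to-one case; many-to-many mappings
# additionally need split fractions.
CONCORDANCE = {"A01": "N1", "A02": "N1", "B01": "N2"}

def reclassify(totals_old):
    """Aggregate totals coded to the old classification onto the new one,
    so series compiled on different versions can be compared."""
    totals_new = defaultdict(float)
    for old_code, value in totals_old.items():
        totals_new[CONCORDANCE[old_code]] += value
    return dict(totals_new)
```

For example, `reclassify({"A01": 10.0, "A02": 5.0, "B01": 2.0})` combines the two old codes that map to `N1`, giving `{"N1": 15.0, "N2": 2.0}`.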

2.3 Interpretability

This refers to the ease with which a user may understand and properly interpret information produced by an output area.

Key indicators are:-

- analysis undertaken to find and present key findings of and relationships within data
- information of significance to Maori and Maori public policy is clearly presented
- seasonal and trend analysis and other standard adjustment techniques (eg population standardisation) are undertaken to enhance the usefulness of the data and reduce problems of interpretation and comparisons
- presentation standards used for tables and graphs
- information on methods, concepts etc is up-to-date and readily available to users
- information on quality achieved is readily available
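Trend estimation, one of the adjustment techniques listed above, typically starts from a centred moving average; production seasonal adjustment uses richer methods such as X-12-ARIMA. A minimal sketch for an even window on a quarterly series (the window size and data are illustrative only):

```python
def centred_moving_average(series, window=4):
    """2xN centred moving average for an even window: average two
    offset N-term windows so the result is centred on an observation.
    For quarterly data (window=4) this removes a stable seasonal
    pattern, leaving an estimate of the trend."""
    half = window // 2
    return [
        (sum(series[i - half:i + half]) + sum(series[i - half + 1:i + half + 1]))
        / (2 * window)
        for i in range(half, len(series) - half)
    ]

# A quarterly series with a pure seasonal pattern around a flat trend
quarterly = [1, 2, 3, 4] * 3
```

Applied to the repeating pattern above, every smoothed value is 2.5 (the series mean), showing the seasonal pattern fully removed; with a real series the output would track the underlying trend.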

2.4 Timeliness

This is primarily determined by the length of time between the reference period of the data and its availability, but must be considered along with other aspects of quality as there are usually trade-offs.

Key indicators are:-

- achieved timeliness meets key user needs
- the relative importance between timeliness and accuracy is understood, and this balance is met
- preliminary results produced to meet time constraints of key users
- release dates are published in advance and met


3.1 Accessibility of information

Accessibility refers to the degree to which the information required by users is known and is easily obtainable in the format required for use.

Key indicators are:-

- main findings are made widely available (eg through press release)
- information on data availability (published and unpublished) is readily known
- catalogues/directories are available for field of statistics
- information is available in formats and media required by users
- data are made available in all media at the same time
- release dates are announced in advance
- access by Maori is facilitated by dissemination of relevant results and in appropriate formats
- information is affordable
- new technology is being used to improve the presentation and accessibility of data

3.2 Client service

As well as making information readily accessible, support services need to be provided so that users get the information they need, when they want it and how they want it.

Key indicators are:-

- expert assistance readily available
- standards set for time taken to meet requests
- standards met for time taken to meet requests
- signals of user satisfaction and dissatisfaction with products and services
- support for user funded surveys to meet lower priority or special needs
- revenue generated, particularly trends (as an indicator of the extent of servicing provided)


4.1 Efficiency

Efficiency relates to the cost of production and focuses on the cost of inputs per output. For an output area, the cost of supplying the data, incurred directly by the organisation and indirectly by respondents, would usually be a critical input to the overall cost of production.

There would be an expectation of productivity improvements over time, so that either costs are declining or approved outputs are increasing (not 'output creep', where unapproved outputs are produced instead of declared cost reductions). Best practice methods and contestable supply of inputs are key to achieving efficient production.

Key indicators are:-

- extent of use of contestable inputs
- use of technology to achieve efficiency, timeliness or quality
- measures of efficiency monitored for all production processes
- cost parameters available and used in design of collections
- quality and costs for inputs set on the basis of their contribution to overall error and quality of outputs
- low cost systems maintenance required

4.2 Respondent management

The long term viability of any output depends on maintaining good relationships with the suppliers of the data used to produce it (both respondents and custodians of administrative records). This means ensuring that the demands made are reasonable, that steps are taken to spread the load across respondents, and that assurances given to respondents are honoured.

Key indicators are:-

- forms in source collections tested with respondents
- special collection needs of Maori addressed
- selection methods used to manage overlap and rotation of respondents
- administrative data used wherever possible
- good, up-to-date understanding exists of respondent information sources
- information provided to respondents on purpose of collection etc
- help/support system available
- measures of load produced regularly
- sound security of information
- confidentiality checking of releases

4.3 Management of risk and performance

Statistics New Zealand has many policies and standards which are expected to be followed by each output area and collection to achieve corporate goals. There are also expectations of practices to be followed to achieve cost-effective outputs produced to quality standards.

Key indicators are:-

- communication channels established and used regularly with key and other users
- survey infrastructure exploited to meet irregular and less frequent demands at marginal cost off core surveys
- implementation of SNZ Maori Responsiveness Plan
- standards set and regularly reviewed for key outputs
- indicators of quality (including timeliness) regularly measured and monitored
- problems and suggestions for improvement are logged and action tracked
- systematic feedback to collection design from errors and problems identified in analysis, editing, field etc
- documentation on standards, processes, etc up-to-date and accessible
- outputs systematically analysed and validated before release
- systems integrated with corporate infrastructure systems (CARS, BF, etc)
- information on methods, etc in SIM and available to users
- information on achieved quality reported to users
- continuous improvement philosophy to maintain relevance
- regular independent review
- a data custodian is designated
- data archived
- adherence to security and confidentiality policies
- release process ensures all releases are consistently available at release time
- metadata on SIM updated at time of release
- requirements of Statistics Act met


Other factors to consider are:-

- what are the priorities for improvement in performance?
- what are the major barriers to further improvement in performance (eg a new IT system needed to improve responsiveness to users or timeliness)?

John Cornish
Group Manager, Statistical Development
Statistics New Zealand

Copyright © United Nations, 2014