This Statement follows the Authority’s earlier Press Release of 16 August, in which we announced an independent inquiry into how an error came to be made in the Statistical Bulletin on Output in Construction published by the Office for National Statistics (ONS) on 12 August 2011, and how the episode was handled. The Authority’s review has been prepared on behalf of the Board of the UK Statistics Authority and comprises a brief account of the facts and a set of observations intended to help avoid similar occurrences in the future. These observations range more widely than the specific circumstances of the error.
On 12 August, ONS published a regular Statistical Bulletin containing the Quarter 2 (April to June 2011) construction output estimate, indicating quarter-on-quarter growth of 2.3 per cent. The figure that should have been published was growth of 0.5 per cent, unchanged from the initial estimate for Quarter 2 published on 26 July.
The incorrect increase in estimated growth to 2.3 per cent had a consequential effect on estimated growth of Gross Domestic Product (GDP). The first, incorrect, release stated that the revision to construction output was estimated to add 0.1 per cent to that quarter’s GDP growth. The second, corrected, estimate of 0.5 per cent had no effect on GDP, and the reference to GDP was removed from the second version of the release.
Following normal practice, the release of the Bulletin was subject to a ‘lock-in briefing’ for journalists on the morning of 12 August. After this briefing, at 0930 hours, a journalist who had attended it pointed to an inconsistency between two tables in the Bulletin. ONS investigated this and the inconsistency was confirmed as an error. At 1310 hours ONS announced publicly that there had been an arithmetical error in the final stages of preparation of the Output in Construction estimates, and that the series was being recalculated. A corrected Bulletin was issued at 1630 hours that afternoon.
The immediate cause of the error was an erroneous spreadsheet formula for the latest quarter: the cell summed March + April + May instead of April + May + June. The new line in the spreadsheet should have been entered manually rather than copied; because it was copied, the formula picked up the wrong months.
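The mechanics of this kind of copy error can be sketched in a few lines. The month names and figures below are invented purely for illustration; the point is that a formula copied from the previous row keeps that row’s offsets, so the three-month window it sums is shifted back by one month.

```python
# Purely illustrative figures (not actual construction output data).
monthly = {"Mar": 100.0, "Apr": 102.0, "May": 101.0, "Jun": 104.0}

def quarter_total(months):
    """Sum the three named months, as a spreadsheet SUM over a cell range would."""
    return sum(monthly[m] for m in months)

correct = quarter_total(["Apr", "May", "Jun"])  # the intended range
copied = quarter_total(["Mar", "Apr", "May"])   # the range the copied formula picked up

print(correct)  # 307.0
print(copied)   # 303.0
```

The two totals differ even though every monthly figure is individually correct, which is why the mistake only shows up when the derived tables are cross-checked against each other.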
The error was then missed in the quality assurance procedures. The error should have been picked up on two counts:
- The inconsistencies in the tables should have been identified by routine checks but these checks were not adequately specified.
- The size of the revision from the previous estimate should have sounded alarm bells but did not, partly because many analysts expected a bounce-back in the figures and partly because the quarter-to-quarter changes had been volatile in some previous quarters.
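The two kinds of check described above can be sketched as simple pre-publication routines. Everything here is hypothetical (the figures, tolerance and threshold values are invented), but it illustrates how a cross-table consistency check and a revision-size alert might be specified explicitly rather than left to ad hoc inspection.

```python
# Hypothetical draft-Bulletin figures, invented for illustration only.
table_a_total = 305.0                        # headline quarterly total in one table
table_b_components = [100.0, 102.0, 103.0]   # component breakdown in another table
previous_growth = 0.5                        # per cent, from the prior Bulletin
draft_growth = 2.3                           # per cent, in the draft Bulletin

def check_cross_table(total, components, tol=1e-6):
    """True if the headline total agrees with the sum of its components."""
    return abs(total - sum(components)) <= tol

def check_revision(old, new, threshold=1.0):
    """True if the growth revision stays within `threshold` percentage points."""
    return abs(new - old) <= threshold

assert check_cross_table(table_a_total, table_b_components)
if not check_revision(previous_growth, draft_growth):
    print("ALERT: revision exceeds tolerance; investigate before publication")
```

In this sketch the tables are internally consistent, but the 1.8-point revision trips the alert, which is exactly the kind of prompt the Authority suggests was missing.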
Construction statistics had been the subject of critical comment from analysts and journalists for some months. It had been suggested that in mid-2010 the series was overestimating output, and later in the year that it was underestimating output. The errors in the Bulletin were not related to any possible problems with the statistical series, except perhaps in the sense that the staff responsible for these statistics had been involved in a major programme of work to improve them in recent years, which included a large volume of retrospective revisions incorporated in the Q2 Bulletin. This may have contributed to an environment in which it was easier for an error to be made and not be detected prior to publication. In effect, management attention was elsewhere.
- It is clear from the background to the specific error in construction output statistics that quality assurance arrangements were not adequately specified. ONS needs to be confident that fully robust quality assurance procedures are universally in place across the department, with the process documented and formally signed off in each case. There are several dimensions to quality assurance. It is not just a matter of automated, or pre-specified manual, checks (although had enough checks of this kind been done, the specific error would have been spotted). It is also about a culture of inspecting draft publications with an expert and experienced eye, and the development of an appropriate management culture to accommodate and support the continuous improvement of quality assurance. The Statistics Authority notes that a best practice guide for quality assurance was circulated internally at ONS on 15 June 2011 and, had it been followed, would have prevented the error.
- This raises a question about the role of senior managers in the production of statistical outputs. To people outside a statistical office it may seem self-evident that senior officials should be intimately involved in producing the Bulletins that are the focus of external and media interest. But the reality is more complex. There are many demands on the time of senior staff and it is necessary and desirable in many cases to leave regular monthly and quarterly production to the less senior people who do the day to day work. Too much involvement of senior staff could be counter-productive, slowing and confusing the regular production processes. However, whatever is now decided, in the light of this sequence of events, to be the appropriate role of senior managers, that role needs to be fully enforced with effective management disciplines. In relation to quality assurance, managers may have to accept less freedom to define their own roles on the basis of their personal understanding of what is needed and the prevailing culture of the office.
- Part of the role of senior managers, particularly where they are responsible for high profile statistical outputs, is to be aware of the external environment, including what the news media and analysts are saying about the statistics, the assumptions of external experts about future trends, and the relevance of the statistics to government policy and to markets, commerce and political debate in the four UK administrations. This background intelligence is needed not only to shape the commentary and advice published with the statistics but also to understand the likely impact of the statistics, and the implications of any errors in them. Some of this external intelligence should also help in spotting errors (by cross-referencing the unpublished figures against external expectations), and in designing extra checks to prevent them happening. ONS should ensure that where such alert and responsive expertise is not fully evident, appropriate guidance, training and managerial support are provided.
- It is also clear that the use of out-dated spreadsheet technology was a significant contributory factor to the error that occurred. Further progress in moving away from this technology to more robust processes must be given high priority. Past projects to replace the use of spreadsheets have been less than wholly successful. The Statistics Authority will encourage and support fresh steps to overcome the obstacles. It is clear that due to the scale of this problem, eliminating the dependence on spreadsheets in regular statistical production will take significant time and resources, and priority must therefore be given to the areas that present the greatest risk. Construction statistics is evidently one of those areas.
- A further concern is that there appears to have been insufficient staff available, and a weakness in staff experience in the specific field of construction statistics, responsibility for which was transferred relatively recently to ONS. This is an issue for ONS management to tackle as a matter of urgency.

Handling of corrections
- The error in the statistics was confirmed at 0930 hours on the day of release. However, the statistics themselves were not withdrawn until about four hours later. This was because ONS, understandably, wished not to announce the error before it was confident about when a recalculated set of statistics could be published. In this instance, however, the decision to delay the announcement left users misinformed. The Code of Practice for Official Statistics requires producers to correct errors in statistical reports promptly. Given the market-sensitive nature of the construction output statistics, the magnitude of the error and the effect on estimates of GDP, the Authority would expect a different approach to be followed in future. Any substantive error should be announced as soon as is reasonably possible.
- It may be argued, as a general principle, that once a statistical Bulletin has been placed on the website, it should be possible for users to refer back to the document (both the statistics and commentary) in its original form, regardless of whether changes were subsequently needed; and that where there is an error that has to be corrected, the correction should be clearly marked, so that there is a clear audit trail. The replacement of the original content of a Bulletin, without this being immediately evident, could cause confusion and suspicion among users. The audit trail in this case is not clear, although this may have been affected by the launch of the new ONS website in late August.
- The corrected Bulletin contained some errors in presentation; for example, the graph showing the volume of new work in the public housing sector appears to be wrongly labelled. The pressure to produce the revised release quickly will clearly have been a contributory factor here. These errors should now be corrected.