INDEPENDENT TECHNICAL REVIEW INTO THE LABOUR FORCE SURVEY AND ABS RESPONSE
With the release of the September 2014 labour force estimates on 9 October 2014, the ABS announced that an independent technical review would be conducted, following unexpected volatility in the seasonally adjusted labour force estimates for July to September 2014. The review examined how the ABS produces the monthly labour force estimates, to ensure that high quality, credible estimates continue to be produced.
The review was undertaken in October and November 2014 by Paul McCarthy. Mr McCarthy worked in the ABS in a variety of senior management positions responsible for critical economic statistics and also worked for the OECD and World Bank on economic statistics. The review was guided by a reference committee consisting of senior officials from the Treasury, the Reserve Bank of Australia and the ABS. The final review report was provided to the Australian Statistician on 27 November 2014.
The key recommendation of the review (recommendation 2) was implemented with the October 2014 estimates released in Labour Force, Australia (cat. no. 6202.0) on 6 November 2014.
REPORT FROM THE INDEPENDENT TECHNICAL REVIEW
The Executive Summary section of Paul McCarthy’s independent technical review is in the Attachment. The full report from the technical review is available on request by emailing firstname.lastname@example.org.
RECOMMENDATIONS AND ABS RESPONSE
Each recommendation from the independent technical review is presented below along with the ABS response to the recommendation.
The ABS has accepted the findings of the review and has agreed to all the recommendations made.
Increased number of respondents using e-forms
Changes made to the LFS supplementary survey program
Lower response rates
The LFS computing system
ABS LFS Supplementary Surveys Program
Timing of release of LFS publications
Volatility in the LFS estimates
Making changes to the LFS
Quality Incident Response Plan
In July, August and September 2014, the ABS Labour Force Survey (LFS) seasonally adjusted employment and unemployment data were unusually volatile. As a result, in the September 2014 LFS results, the ABS set the seasonal factors for each of these three months to one (i.e. assumed no seasonality in these months).
Following the release of the September LFS estimates, the ABS announced that “it would commission a review with independent external input to develop an appropriate method for seasonally adjusting October 2014 and following months’ estimates”.
The scope of this independent review for the acting Australian Statistician’s consideration was to identify issues that impacted on the quality of recent LFS estimates, particularly any related to the changes that have been made to the LFS over the past 18 months or so, with emphasis on advising an appropriate course of action for changes required when compiling the October 2014 LFS estimates. Recommendations regarding any changes required to ensure the quality of LFS estimates in subsequent months were also to be included.
Recent changes to the LFS
Several significant changes have been implemented in the LFS program during the past 18 months:
2. Benchmarking the LFS to revised Estimated Resident Population estimates (20 years data were affected)
3. Changes to the questions on “job search”
4. Other changes to the LFS questionnaire
5. Changes to the LFS supplementary survey program
6. Introducing and expanding the use of e-forms
7. Lower response rates.
The effects of the first four are judged to be minimal, apart from the potential for some noise in the data at the time of their introduction.
The changes to the supplementary survey program have been the major reason for the volatile LFS estimates, particularly in July, August and September 2014 but also in some other months earlier in the LFS time series. The problem was that some supplementary surveys have an effect on the responses to the main LFS survey (e.g. the supplementary survey run in August for many years is associated with a lower level of employment than should be the case).
Recommendations regarding the October 2014 LFS estimates
The review found that the major source of volatility in the LFS estimates in recent months was the change in the supplementary survey program that is run in conjunction with the LFS. Other potential sources of volatility were the move to e-forms for the LFS and lower response rates, although these were considered to be marginal compared with the effects of the changed supplementary survey program.
The key recommendation regarding the October 2014 LFS estimates was provided in a preliminary report on 29 October 2014:
Outcomes of the revised seasonal adjustment methodology
Seasonal adjustment is the process of estimating and then removing from a time series those influences that are calendar related, i.e. those that tend to occur regularly once or more each year. They include the effect of periodic factors such as weather, holidays, social conventions, administrative practices and, in some indicators, the number of trading days in each month. Seasonally adjusted estimates are produced by estimating the seasonal component and removing it from the original (unadjusted) series.
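As a hedged illustration of this process, a minimal multiplicative decomposition can be sketched using the classical ratio-to-moving-average approach. This is not the ABS's production method (which uses the SEASABS system mentioned later in this report); the series used in any example would be synthetic, and the code below is a simplified sketch only.

```python
def seasonal_factors(series, period=12):
    """Estimate one seasonal factor per month via ratio-to-moving-average."""
    n = len(series)
    # A centred 2x12-term moving average approximates the trend-cycle.
    trend = [None] * n
    half = period // 2
    for i in range(half, n - half):
        window = series[i - half : i + half + 1]
        # End points get half weight so the even-period average is centred.
        trend[i] = (window[0] / 2 + sum(window[1:-1]) + window[-1] / 2) / period
    # The ratio of original to trend isolates the seasonal*irregular part.
    ratios = [[] for _ in range(period)]
    for i, t in enumerate(trend):
        if t:
            ratios[i % period].append(series[i] / t)
    # Averaging the ratios per calendar month smooths out the irregular.
    factors = [sum(r) / len(r) for r in ratios]
    # Normalise so the factors average to one over the year.
    mean = sum(factors) / period
    return [f / mean for f in factors]

def seasonally_adjust(series, period=12):
    """Divide the original series by its estimated seasonal factors."""
    factors = seasonal_factors(series, period)
    return [x / factors[i % period] for i, x in enumerate(series)]
```

On a synthetic monthly series with a stable seasonal pattern, dividing out the estimated factors leaves a series close to the underlying trend, which is the essence of the seasonally adjusted estimates described above.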
There is evidence that LFS supplementary surveys have a systematic influence on the LFS time series. Historically, the supplementary surveys program has been consistent in the sense that, in general, the same supplementary surveys have been conducted annually in the same month. The reasonably consistent timetable for major supplementary surveys means that any effects they might have on the LFS data have generally been identified as part of the seasonal adjustment process and so have been removed from the seasonally adjusted data.
Starting in February 2014, the timetable for supplementary surveys changed quite dramatically. As a result, the supplementary survey effect is no longer being removed as part of the ongoing seasonal adjustment process. There was no obvious effect on the seasonality of the LFS employment or unemployment in the first few months after the change in this timetable, although the February LFS estimates were considered to be a bit noisier than usual. However, the LFS estimates became more volatile from July through to September. The seasonality observed in total employment and unemployment in these months was attributable to actual seasonal (calendar) influences plus the (unknown) extent to which the seasonality was caused by the supplementary surveys. Not adjusting for this latter factor when the supplementary survey program was changed would increase the effects of the irregular component of the time series; in other words, it would increase the volatility of the series.
The effect of individual supplementary surveys on the original (unadjusted) LFS estimates was tested using regression analysis. Seven supplementary surveys were found to most significantly influence the employment estimates and those seven plus an additional six were identified as having a significant influence on the unemployment estimates.
Prior corrections to take account of the effects of these supplementary surveys were estimated at the national level and applied to each of the directly seasonally adjusted components of employed persons and unemployed persons. Using the prior corrections to explicitly estimate the effects of the LFS supplementary surveys is judged to have provided a more reliable set of seasonally adjusted estimates and will be used in the ongoing seasonal adjustment of LFS estimates.
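As a rough sketch of the prior-correction idea (not the ABS's actual regression specification, and with invented figures), the regression of a series on an intercept plus a single 0/1 survey-month dummy reduces to the difference in means between survey and non-survey months, which can then be removed before seasonal adjustment:

```python
# Hedged sketch of estimating a supplementary-survey effect as a prior
# correction. With one 0/1 dummy plus an intercept, the OLS estimate of the
# effect is the difference in means between survey and non-survey months.
# Function names and data are illustrative assumptions.

def survey_effect(series, survey_months):
    """OLS effect of a survey dummy: mean(survey) - mean(non-survey)."""
    on = [y for t, y in enumerate(series) if t in survey_months]
    off = [y for t, y in enumerate(series) if t not in survey_months]
    return sum(on) / len(on) - sum(off) / len(off)

def prior_correct(series, survey_months):
    """Remove the estimated survey effect before seasonal adjustment."""
    effect = survey_effect(series, survey_months)
    return [y - effect if t in survey_months else y
            for t, y in enumerate(series)]
```

For example, if an August supplementary survey depresses measured employment by a roughly constant amount each year, the estimated effect is negative and the prior correction adds that amount back in the affected months, so the seasonal adjustment no longer has to absorb it.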
As an interim measure, the new seasonal adjustment methods were applied to data from December 2013 onwards. The full length of the monthly LFS time series (i.e. from February 1978 onwards) will be adjusted using this method when the annual seasonal reanalysis of the LFS is completed in early 2015.
There are three potential influences on the LFS estimates that need to be closely monitored:
1. The new supplementary survey program could affect the LFS estimates to an unknown extent.
2. Introducing and expanding the use of e-forms could have an effect on the LFS estimates.
3. Lower response rates could potentially bias the results or, at least, add volatility due to increases in the relative standard errors.
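On the third point, the link between response rates and relative standard errors can be sketched in a back-of-envelope way: the standard error of an estimated proportion grows as the responding sample shrinks. The sample size below is an illustrative assumption, not the actual LFS sample, and this ignores the complex survey design, composite estimation and any non-response bias:

```python
import math

def se_proportion(p, selected_sample, response_rate):
    """Approximate standard error of an estimated proportion under simple
    random sampling, treating only responders as the effective sample.
    A simplification: the real LFS uses a complex, clustered design."""
    responders = selected_sample * response_rate
    return math.sqrt(p * (1.0 - p) / responders)
```

Under these simplifying assumptions, a fall in response from 96% to 93% inflates the standard error by a factor of the square root of 96/93, i.e. around 1.6% -- a modest but non-trivial addition to sampling noise.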
Changes caused by new supplementary surveys
The effects of discontinuing some supplementary surveys can be assessed using regression analysis and treated as prior corrections in the seasonal analysis of LFS data. However, there is no information available on the effects of introducing new supplementary surveys or of combining some surveys previously conducted separately. Therefore, the new program currently being introduced will potentially affect the LFS estimates to an unknown extent for at least the next three years. Any effects will show up as increased volatility in the seasonally adjusted LFS estimates in the months in which the new supplementary surveys are conducted and in the movements between those months and the adjoining ones.
One means of determining if there is a “supplementary survey effect” on the LFS would be to identify records that change their status in the month of a supplementary survey and then change back again in the following month. For example, a person classified as employed in the month before the supplementary survey, switching to unemployed in the supplementary survey month and then back to employed in the following month could be followed up to ascertain whether the changes were valid.
Alternatively, a more broadly based approach would be to check the history of numbers who have hopped from one category to another then back to the previous category across a major supplementary survey month. For example, for August where the observed supplementary survey effect is lower employment, the analysis would involve checking the numbers who were employed in July, changed to either unemployed or not in the labour force in August and then back to employed in September. Running a regression through the historical data would give the expected numbers of category hoppers, which could be compared with those in August this year to provide an indication of the changed impact from the new August survey. The seasonal factors for August could then be adjusted to take account of this assessment of the changed supplementary survey effect due to the new August survey. One problem is that the full extent of the category hoppers cannot be worked out until after the September LFS data are available. In August 2015, for example, an initial estimate of the effect could be made using 2014 data and then the seasonal factors for August 2015 could be revised once the full analysis can be carried out using the September LFS data. A similar analysis should be tested for each of the months in which a supplementary survey was run that has been shown to have an effect on the LFS results.
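The record-level check described above could be sketched as follows; the data format (person identifiers mapped to a labour force status of 'E' employed, 'U' unemployed or 'N' not in the labour force for each month) is an assumption for illustration, not the actual LFS record layout:

```python
# Illustrative sketch of counting "category hoppers" across a supplementary
# survey month: records employed before and after the survey month but not
# employed in it (the pattern associated with the August effect).

def count_hoppers(before, survey_month, after):
    """Count persons employed in the months either side of the survey
    month but unemployed or not in the labour force during it."""
    common = before.keys() & survey_month.keys() & after.keys()
    return sum(1 for pid in common
               if before[pid] == 'E'
               and survey_month[pid] in ('U', 'N')
               and after[pid] == 'E')
```

Comparing such a count for the current year against the number predicted by a regression through the historical series would, as described above, indicate whether the new supplementary survey has changed the size of the effect.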
Statistics Canada may have a process for handling this type of effect and it would be worthwhile to check if their procedures could be adopted.
Expanding the use of e-forms
The ABS introduced e-forms into the LFS in May 2013. The current response by e-form has been fairly stable for several months at about 20% but the aim is to increase this level significantly. During the phase-in of online forms it was noted that employed persons were more likely to use the e-form option than unemployed persons or persons not in the labour force. However, there was no indication that persons using e-forms would respond differently to the questions compared with those being interviewed by phone or in person.
Introducing the e-forms is likely to cause two types of effect: a coverage effect and a reporting effect. The coverage effect arises when the e-form captures a different respondent population from the existing paper form. The reporting effect is any tendency for respondents to provide different answers depending on whether they are responding to an interviewer or using an e-form. The e-forms were introduced using a “split sample” approach, which enabled an analysis to separate these two effects by comparing the distribution of labour force status between the e-form respondents and similar respondents from a control group who were not offered the e-form option. The analysis showed evidence of a coverage effect but no measurable reporting effect, although the strategy had known limitations in detecting a small impact and the e-form take-up rate was relatively low.
Lower response rates
The LFS response rates fell from about 96% to 93% between the beginning of 2013 and September 2014. Analysis of the final 2-4% of responses in earlier years indicated that these late responses were not significantly different from the earlier responses. As a result, it was concluded that a drop in the response rates from 95-96% to about 93-95% should not have a significant effect on the LFS estimates. However, the response rates have now fallen to the low end of that range.
An analysis of the unmatched part of the common sample in September 2014 showed that the sample missing in August due to the fall in response rates was a group that was less likely to be employed. Labour Branch analysis showed that those who did not respond in August were different from the usual “hard to gets”, who are more likely to be employed. The August non-respondents were more likely to be either unemployed or not in the labour force, which would cause a higher employment estimate and a lower unemployment estimate. (The response rates had dipped from about 93% in July to about 92% in August and then rose to about 93½% in September.)
Further evidence that the decline in response rates is potentially having an adverse impact on the survey results is that the preliminary estimates for August 2014 changed significantly in the few days between the preliminary estimates on Monday 1 September and the final estimates, which became available on Friday 5 September. However, part of this effect could be due to delays in coding records that had been received earlier.
There is no indication that the lower response rates are biasing the LFS results; rather, if they are having an effect then it is likely to show up as increased noise in the estimates. It should be noted that the Australian LFS response rates are still very high by international standards. For example, LFS response rates in Canada were around 93% prior to 2012 but have dropped to about 88-90% in the past couple of years. The response rates for the United Kingdom’s LFS are less than 80%.
The changes made to the LFS during the past 18 months or so placed a huge load on the LFS staff. Key personnel left part way through the process. Their replacements have performed well in difficult circumstances. However, LFS users are interested in the quality (believability!) of the LFS data rather than any problems being encountered by the ABS in handling the range of changes involved.

A more systematic and comprehensive approach (including regular reviews) to managing statistical risks around the introduction of changes should be adopted. Future changes to the LFS should proceed only after a formal risk assessment is carried out on all aspects of the proposed changes. All relevant areas of the ABS should be involved (Labour Branch, Methodology, Time Series Analysis, Technology Services (if required)) and they should formally sign off an assessment of the effects of the changes and how they will be managed. This is particularly important where the change is complex, where change occurs over a long period of time, where ‘outside factors’ cause plans to change or when there are changes in key personnel.
The LFS data are amongst the more important economic indicators produced by the ABS. Any changes to these data and the ways in which they are compiled need to be communicated to key stakeholders well in advance of the changes being introduced. It is important to provide details of the reasons for any changes together with an assessment of their effects on the estimates.
SUMMARY OF RECOMMENDATIONS
This section lists the recommendations in the report and the paragraph numbers of the text to which each relates (note that there are gaps in the paragraph numbers because not every paragraph leads to a recommendation).
Increased number of respondents using e-forms (paragraphs 26-31)
Recommendation 1: The labour force characteristics of those responding by e-forms should continue to be analysed to determine how they compare with those being interviewed by telephone or in person and whether or not a changing take-up of e-forms could have an effect on the LFS estimates.
Changes made to the LFS supplementary survey program (paragraphs 32-51)
Recommendation 2: The LFS estimates should be seasonally adjusted using prior corrections for the supplementary surveys that have been assessed as having a significant effect on the seasonality of the LFS estimates from now on. The methodology should be closely examined as part of the annual seasonal reanalysis of LFS data in early 2015 to determine whether or not the method should be refined further prior to it being applied to the full length of LFS monthly series from February 1978 onwards.
Potential problems (paragraphs 52, 53)
Recommendation 3: The ABS should warn users of LFS data of the possibility of instability in the seasonally adjusted LFS estimates in those months in which new supplementary surveys are being conducted from 2014 on, and that it will be at least three years before the effects of the new supplementary survey program on the LFS estimates can be measured with any degree of certainty. Steps should be taken to identify any such effects (e.g. by analysing individual records that change labour force status more than once) and to adjust the core LFS estimates, if necessary.
Lower response rates (paragraphs 54-57)
Recommendation 4: It is possible that the lower response rates may be having an effect on the estimates of employment and/or unemployment. The impact of lower response rates should be analysed further.
Trend estimates (paragraphs 64-68)
Recommendation 5: Expand the description of seasonal adjustment and trend in the 6202.0 Explanatory Notes (and in any related publications) by including the links between an original series and its seasonal, irregular and trend components (i.e. O = T * S * I) as an introduction to the more technical aspects already included in the notes.
Gross flows (paragraphs 69-71)
Recommendation 6: The ABS should highlight the fact that the changes in the numbers of persons employed, unemployed and not in the labour force from one month to the next are the net outcomes of considerably larger gross flows between these categories. One means of doing so would be to include some comments on the gross flows each month in the “Labour Force Commentary” section of Labour Force, Australia (cat. no. 6202.0).
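To illustrate the point of Recommendation 6, a small worked example (with invented flow numbers, in thousands of persons, not actual LFS gross flows data) shows how a modest net change can emerge from much larger gross flows:

```python
# Sketch of how small net monthly changes arise from much larger gross flows
# between employment ('E'), unemployment ('U') and not in the labour force
# ('N'). The flow figures ('000 persons) are invented for illustration.

def net_change(flows, category):
    """Net change for a category from a dict of (from, to) -> persons."""
    inflow = sum(n for (src, dst), n in flows.items()
                 if dst == category and src != category)
    outflow = sum(n for (src, dst), n in flows.items()
                  if src == category and dst != category)
    return inflow - outflow

flows = {('U', 'E'): 230, ('N', 'E'): 290,   # flows into employment
         ('E', 'U'): 210, ('E', 'N'): 295,   # flows out of employment
         ('U', 'N'): 180, ('N', 'U'): 175}
```

With these invented figures, gross movements into and out of employment total 1,025,000 persons, yet the net change in employment is only +15,000, which is the kind of contrast the suggested monthly commentary on gross flows would highlight.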
Bayesian techniques (paragraphs 72-74)
Recommendation 7: Bayesian techniques should be considered as a means of identifying and resolving potential problems in core LFS series.
Composite estimation (paragraphs 75-78)
Recommendation 8: An ongoing program should be established to systematically re-calculate the weights underlying composite estimation. The frequency of re-calculation should be based on an analysis of the effects of the changes in the weights between 2007 and the current time.
The LFS computing system (paragraphs 79-82)
Recommendation 9: The LFS system and associated collection systems need to be replaced so that proposed changes to the LFS can be formally assessed (e.g. through splitting the sample and comparing responses to new and old questionnaires). As an interim measure, a separate means of storing seasonal factors in the LFS system should be implemented so changed seasonal adjustment methods can be tested without impinging on the LFS production environment and that the seasonal factors from the SEASABS production system can be used selectively, if required.
ABS LFS Supplementary Surveys Program (paragraph 83)
Recommendation 10: Given the importance of the LFS data and the adverse effect of the associated supplementary survey program on these key economic indicators, the ABS should consider discontinuing the supplementary surveys, or conducting them in a manner that has a negligible impact on the LFS data.
Timing of release of LFS publications (paragraphs 84-87)
Recommendation 11: The LFS processing and publication cycle should be closely monitored to determine whether or not it is necessary to delay publishing LFS data until early in the week following the current timetable, either on an ongoing basis or at least when changes are being made to the LFS. In particular, delaying the publication should be considered if any significant change is to be implemented.
Stakeholder engagement (paragraphs 88, 89)
Recommendation 12: The ABS should involve key stakeholders in any proposals to change the LFS or the ways in which LFS estimates are compiled. Any proposed changes should be communicated in advance via a release on the ABS website providing details of what is happening and why.
Volatility in the LFS estimates (paragraph 103)
Recommendation 13: Statistics Canada has faced similar criticisms to the ABS regarding perceived volatility in its LFS estimates. One of its responses was to publish an article, “Interpreting monthly changes in employment from the labour force survey”, showing that recent history was, in fact, less volatile than the past. Although the recent volatility in the Australian LFS had a specific cause, the internal ABS analysis of the general volatility in the Australian LFS estimates should be expanded, formalised and published to help educate users about this issue.
Making changes to the LFS (paragraphs 104-106)
Recommendation 14: Wherever feasible, future changes to the LFS program should be introduced individually and spread out over time, rather than bunched as has been the case during the past 18 months or so, to the extent that their timing can be controlled (e.g. the timing of introducing Population Census benchmarks and benchmarking to ERP estimates is non-discretionary).
Change management (paragraphs 107-110)
Recommendation 15: If it does become necessary to make a number of changes to the LFS in a short time, specific risk management strategies need to be put in place to assess the potential impacts of each of the changes. A small working group consisting of senior Labour Branch/LFS staff and representatives of Methodology Division, Time Series Analysis Branch and Technology Services Division should evaluate the individual and cumulative impacts of the proposed changes and provide a formal risk management report to the relevant FAS and Deputy Statistician.
Quality Incident Response Plan (paragraphs 111-114)
Recommendation 16: As part of any future Quality Incident Response Plan (QIRP), the effects on LFS seasonality of changes in institutional arrangements (e.g. changes in the LFS questionnaire or in the supplementary survey program or in the incidence of response by e-forms) should be considered explicitly.