Methodological News, Mar 2022

Features important work and developments in ABS methodologies

Released
30/03/2022

This issue contains three articles:

  • Upgrading the SUTSE model for nowcasting LFS unemployment estimates
  • Safe use of auxiliary data to improve household survey sample designs
  • Addressing additional non-response due to COVID-19 restrictions

Upgrading the SUTSE model for nowcasting LFS unemployment estimates

The ABS has upgraded its methodology for nowcasting the estimates of Australian unemployed persons from the ABS Labour Force Survey (LFS) using the new JobSeeker administrative dataset from the Department of Social Services (DSS).

In March 2020, JobSeeker payments replaced a range of labour market related payments. These different payments had previously been aggregated together in an ABS Seemingly Unrelated Time Series Equations (SUTSE) model used to confront and quality assure survey estimates before they are released.

The upgraded SUTSE model now accounts for two coincident events: the major COVID-19-related impacts on the labour market, and the changes in the number of people receiving labour market payments following the policy changes from March 2020 onwards.

Because of the JobSeeker policy and data changes, and the COVID-19 impacts that occurred around the same time, the previous relationship between the ABS LFS unemployed person estimates and JobSeeker data was altered, and the previous model has been mis-specified since the start of the COVID-19 pandemic.

Our strategy was to repair the alignment and make the model more adaptive by utilising a Bayesian approach to

  • measure the different COVID-19 impacts on JobSeeker and LFS unemployed person data and adjust the corresponding data for temporal alignment within each series, and
  • adapt the uncertainty over the volatile periods at the beginning of the COVID-19 pandemic by allowing time-varying variances for both series (illustrated in the sketch below).
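
The sketch below is a minimal, illustrative analogue of the time-varying variance idea, not the ABS production model or its Bayesian estimation: a bivariate local-level state-space model in which the observation variances are inflated over a hypothetical volatile window, so the filter discounts observations from that period rather than chasing them. The data, series roles and intervention window are all simulated assumptions.

  import numpy as np

  def local_level_filter(y, obs_cov_fn, state_cov):
      # Kalman filter for a bivariate random-walk (local-level) model with a
      # time-varying observation covariance obs_cov_fn(t); y has shape (n, 2).
      n, k = y.shape
      a = np.zeros(k)                  # filtered level for each series
      P = np.eye(k) * 1e6              # diffuse initial state covariance
      filtered = np.zeros_like(y)
      for t in range(n):
          P = P + state_cov                    # prediction step (identity transition)
          S = P + obs_cov_fn(t)                # innovation covariance
          K = P @ np.linalg.inv(S)             # Kalman gain
          a = a + K @ (y[t] - a)               # update step
          P = (np.eye(k) - K) @ P
          filtered[t] = a
      return filtered

  # Simulated monthly series standing in for LFS unemployed persons and JobSeeker counts
  rng = np.random.default_rng(0)
  n = 60
  level = 700 + np.cumsum(rng.normal(0, 5, size=(n, 2)), axis=0)
  y = level + rng.normal(0, 10, size=(n, 2))

  state_cov = np.diag([25.0, 25.0])
  volatile_window = range(36, 48)              # hypothetical COVID-19 volatile period

  def obs_cov_fn(t):
      # Inflate measurement variances during the volatile window so the model
      # widens its uncertainty instead of tracking extreme observations.
      scale = 25.0 if t in volatile_window else 1.0
      return np.diag([100.0, 100.0]) * scale

  nowcast = local_level_filter(y, obs_cov_fn, state_cov)
  print(nowcast[-1])                           # filtered level for the latest period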

Using recent data, our nowcast simulations suggested that the upgraded SUTSE model outperforms the previous model. It will be implemented in the near future.

For more information, please contact Dr. Xichuan (Mark) Zhang. 

Safe use of auxiliary data to improve household survey sample designs

The ABS has been using the Address Register to select samples for household surveys since 2018.  This list-based approach contrasts with the area-based approach used prior to 2018. This change has enabled the ABS to link auxiliary data directly to the frame of household addresses for use in sample design, although this raises issues of social licence and appropriate protection for the privacy of personal information.

In general, greater benefits can be realised when taking a greater risk with privacy, so it is important to carefully balance benefit with risk. For example, consider a survey where we wish to increase the sample take of pensioner households, i.e. households where the primary source of income is a government pension or benefit. When designing the sample, we could consider two options that use Census data for over-sampling the population of pensioner households:

  • Option A - the number of people in each meshblock (MB) who receive a government pension or benefit, based on pensions data linked at the MB level (area-level)
  • Option B - the number of people at each address who receive a government pension or benefit, based on pensions data linked to the Address Register at the address level

Clearly, the data used in option B carries a greater privacy risk than the data used in option A. Some data items may be considered too sensitive to use at the address level, compared with the MB level, in the sample design or in sample selection.  Analysis by the ABS determined that, for a fixed number of households contacted, the area-level approach could increase the number of pensioner households achieved in the sample by 23% compared with not using pension data at all, while using the data at the address level could increase the number of pensioner households in the sample by 43%.  Although the gains from address-level data are stronger, it was decided to use the area-level approach because of the privacy risk involved.
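
As a rough illustration of how the area-level option could work in practice, the sketch below boosts the selection probability of meshblocks in proportion to a composite size measure that counts pensioner households more heavily. The counts, the boost factor and the Poisson-sampling style draw are assumptions for illustration, not the ABS design.

  import numpy as np

  rng = np.random.default_rng(1)
  n_mb = 1000
  households = rng.integers(20, 80, size=n_mb)        # households per meshblock
  pensioner_hh = rng.binomial(households, 0.25)       # pensioner households per MB (area-level count)

  # Composite size measure: pensioner households carry extra weight, so MBs with
  # more of them are more likely to be selected.
  boost = 2.0
  size_measure = (households - pensioner_hh) + boost * pensioner_hh

  n_sample_mb = 100
  sel_prob = n_sample_mb * size_measure / size_measure.sum()   # PPS selection probabilities
  selected = rng.random(n_mb) < sel_prob                       # Poisson-sampling style draw

  print(f"Meshblocks selected: {selected.sum()}")
  print(f"Expected pensioner households covered: {(sel_prob * pensioner_hh).sum():.0f}")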

One approach that reduces the privacy risk while retaining much of the benefit of address-level auxiliary information is to fit regression models and use their predicted values in the design, rather than using the auxiliary data directly.  The extra uncertainty in a modelled value, compared with a directly matched value, reduces the privacy risk.
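
A minimal sketch of this idea, with entirely hypothetical predictors and data: a logistic regression is fitted to predict pensioner status, and only the predicted probability, not the sensitive flag itself, is carried into the design as a size measure.

  import numpy as np
  from sklearn.linear_model import LogisticRegression

  rng = np.random.default_rng(2)
  n = 5000
  household_size = rng.integers(1, 7, size=n)
  mb_pension_rate = rng.uniform(0.05, 0.6, size=n)      # area-level rate attached to each address
  p_true = np.clip(0.1 + 0.7 * mb_pension_rate - 0.03 * household_size, 0.01, 0.95)
  is_pensioner_hh = rng.random(n) < p_true              # training outcome (e.g. from Census)

  X = np.column_stack([household_size, mb_pension_rate])
  model = LogisticRegression().fit(X, is_pensioner_hh)

  # The design carries the modelled probability rather than the sensitive flag;
  # the prediction error provides some protection for individual addresses.
  p_hat = model.predict_proba(X)[:, 1]
  size_measure = 1.0 + p_hat                            # e.g. boost likely pensioner households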

One application of this is for household surveys that select only a subsample of the usual residents from each household. Typically, these surveys select one adult and sometimes one child from each selected household. Sample designs for household surveys are usually based on equal probability sampling of addresses within strata defined by geographic units. This means that addresses in the same small area have previously needed to be in the same stratum, regardless of the number of people resident at the address. When only one or two people in the household are selected, an equal probability sample of addresses in an area leads to an unequal probability sample of persons in that area: people from lone person households have a higher chance of selection in these surveys, and people from larger households a lower chance.  This is corrected in the estimation weighting, but the variation in the size of the weights increases the variance of the estimates.
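
A small worked illustration of this effect, using hypothetical numbers: under an equal-probability sample of addresses with one adult selected per household, the person-level selection probability falls with the number of adults, and the weights spread out accordingly.

  import numpy as np

  address_prob = 1 / 50                                # equal-probability address sampling in an area
  adults_in_household = np.array([1, 2, 3, 4])

  person_prob = address_prob / adults_in_household     # one adult selected per responding household
  weights = 1 / person_prob
  print(weights)   # [ 50. 100. 150. 200.] - lone-person households get the smallest weight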

Integrating the Census data and the Address Register provides a source of information on the size of each address. This has been used to correct for some of the imbalance in the sample of people selected at the design stage, rather than correcting for this effect in weighting. The result is a reduction of 10%-20% in the variance of survey estimates; equivalently, the benefit allows a 10% reduction in sample size while maintaining the previous levels of accuracy.
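
The sketch below illustrates the design-stage correction under simplified assumptions: if the number of adults at each address is known from the frame, selecting addresses with probability proportional to that count gives every adult approximately the same selection probability, so the weights no longer vary with household size.

  import numpy as np

  rng = np.random.default_rng(3)
  adults = rng.integers(1, 5, size=2000)            # adults per address, from frame information
  n_sample = 200

  address_prob = n_sample * adults / adults.sum()   # selection proportional to number of adults
  person_prob = address_prob / adults               # one adult selected per address
  print(np.unique(np.round(person_prob, 4)))        # a single value: equal person-level probability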

The ABS will build on this work, for example by revisiting the over-sampling of pensioner households described above to see whether a regression model prediction can produce further benefits without introducing unnecessary privacy risk.

For more information, please contact Bruce Fraser. 

Addressing additional non-response due to COVID-19 restrictions

The ABS offers a range of response options for households selected in ABS surveys, and all of the large surveys offer a face-to-face interview option.  The ABS encourages the use of the more cost-efficient reporting modes of online forms and telephone interviewing where this suits the survey subject matter, but face-to-face interviewing remains an integral part of household survey data collection.  In particular, the households that are hardest to contact and hardest to convert into respondents are usually face-to-face interview respondents.

This means the COVID-19 pandemic had a significant impact on ABS household survey data collection.  The need to protect the health of ABS interviewers and the use of lockdowns to manage the spread of COVID-19 meant there were long periods when ABS interviewers could not conduct face-to-face interviews.  While the impact varied from survey to survey, we found that roughly 50% of households that would otherwise have returned a face-to-face response were able to be converted to an online or phone interview response, while the remaining 50% became non-response.  The impacts have been greater for surveys that collect sensitive personal information, for example experiences of assault, as the ABS practice is to predominantly use face-to-face interviewing for these surveys.

Over the past two years, a range of additional non-response adjustment techniques has been investigated to address these impacts.  These include:

  • matching to administrative data sources to provide additional sources for calibration
  • using past survey data to model the propensity to respond by face-to-face mode, in order to estimate response propensities for when the face-to-face mode is not available (sketched below)
  • where impacts have varied between states, using the results of high-response states to estimate the impacts in low-response states
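
As an illustration of the second technique, the sketch below fits a response-propensity model on a past collection cycle, when face-to-face interviewing was available, and uses the predicted propensities for an inverse-propensity style weight adjustment in a cycle where it was not. The predictors, data and cut-offs are hypothetical, not the ABS specification.

  import numpy as np
  from sklearn.linear_model import LogisticRegression

  rng = np.random.default_rng(4)
  n = 10000
  age_group = rng.integers(0, 5, size=n)                     # coarse demographic predictors
  remoteness = rng.integers(0, 3, size=n)
  responded_f2f = rng.random(n) < (0.6 - 0.05 * remoteness)  # past-cycle outcome (simulated)

  X = np.column_stack([age_group, remoteness])
  propensity_model = LogisticRegression().fit(X, responded_f2f)

  # For the COVID-affected cycle, the same predictors are available for the
  # selected sample; the modelled propensity feeds the non-response adjustment.
  p_hat = propensity_model.predict_proba(X)[:, 1]
  nr_adjustment = 1 / np.clip(p_hat, 0.05, None)             # inverse-propensity style adjustment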

In this analysis, a complicating factor has been that the COVID-19 pandemic affects the real-world characteristics we are trying to measure, as well as the likelihood of response.  For example, if after weighting a survey we observe that household income in a state heavily affected by restrictions has changed to a greater extent than in other states, is this non-response bias, or a real-world effect caused by the restrictions?

While the work has been challenging, we believe we have been able to successfully treat most non-response impacts due to the restrictions on face-to-face interviewing.

For more information, please contact Bruce Fraser. 

Contact us

Please email methodology@abs.gov.au to:

  • contact authors for further information
  • provide comments or feedback
  • be added to or removed from our electronic mailing list

Alternatively, you can post to:

Methodological News Editor
Methodology Division
Australian Bureau of Statistics
Locked Bag No. 10
Belconnen ACT 2617

The ABS Privacy Policy outlines how the ABS will handle any personal information that you provide to us.

Previous releases

Releases from June 2021 onwards can be accessed under research.

Issues up to March 2021 can be found under past releases.
