1504.0 - Methodological News, Jun 2013  
Released at 11:30 AM (CANBERRA TIME) 28/06/2013

Improving Survey Efficiency at the ABS with Responsive Design – A Simulation Study

The cost and effort of maintaining high response rates in ABS surveys have been increasing in recent years. How can the ABS adapt to an environment of declining response rates and rising enumeration costs?

Traditionally, response rates serve both as a measure of survey quality and as a target for achieving precision requirements. However, a high response rate does not guarantee low non-response bias. Responsive design can improve survey efficiency by:

- providing us with an informed understanding of the trade-off between survey cost and quality, using additional indicators of survey quality

- enabling us to adapt survey data collection strategies in response to improved survey performance intelligence.

Recent research (Shlomo et al. 2012) has investigated two survey quality indicators to complement response rates: representativity indicators (or R-indicators) and maximal absolute bias, an upper bound on the survey non-response bias. These indicators require the response probabilities of all sample units, which are predicted using a response model. Non-response is assumed to be missing at random with respect to the survey data items, conditional on the covariates used in the response model.
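Both indicators are simple functions of the estimated response propensities. A minimal sketch, assuming propensities have already been predicted by a response model (the function names and the illustrative values below are ours, not from the study): following Shlomo et al. (2012), the R-indicator is R(rho) = 1 - 2*S(rho), where S(rho) is the standard deviation of the propensities, and the maximal absolute standardised bias is S(rho)/rho_bar.

```python
import numpy as np

def r_indicator(propensities, weights=None):
    """R-indicator: R(rho) = 1 - 2 * S(rho), where S(rho) is the (weighted)
    standard deviation of the estimated response propensities.
    R = 1 indicates fully representative response; lower values are worse."""
    rho = np.asarray(propensities, dtype=float)
    w = np.ones_like(rho) if weights is None else np.asarray(weights, dtype=float)
    mean = np.average(rho, weights=w)
    var = np.average((rho - mean) ** 2, weights=w)
    return 1.0 - 2.0 * np.sqrt(var)

def maximal_absolute_bias(propensities, weights=None):
    """Upper bound on the standardised non-response bias of a
    response-rate-adjusted mean: B_max = S(rho) / rho_bar."""
    rho = np.asarray(propensities, dtype=float)
    w = np.ones_like(rho) if weights is None else np.asarray(weights, dtype=float)
    mean = np.average(rho, weights=w)
    var = np.average((rho - mean) ** 2, weights=w)
    return np.sqrt(var) / mean

# Illustrative propensities for five sample units
rho = [0.9, 0.8, 0.7, 0.6, 0.5]
print(round(r_indicator(rho), 3))           # 0.717
print(round(maximal_absolute_bias(rho), 3)) # 0.202
```

Note the connection between the two measures: when all units share the same propensity, S(rho) is zero, so R equals 1 and the bias bound vanishes.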

We conducted a simulation study to explore the effects of responsive design on an ABS household survey. The simulation study investigated an alternative non-response follow-up targeting strategy and its effect on response rates, the R-indicator and maximal absolute bias. Under simulation conditions, we can calculate survey non-response bias and root mean square error (RMSE) for a number of survey items and use them as ultimate quality measures for assessment.
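In a simulation, the true population value is known, so non-response bias and RMSE can be computed directly across replicates. A minimal sketch of these ultimate quality measures (the function name and example values are ours):

```python
import numpy as np

def bias_and_rmse(estimates, true_value):
    """Monte Carlo bias and RMSE of a survey estimator across simulation
    replicates: bias = mean(estimates) - truth,
    RMSE = sqrt(mean((estimates - truth)^2))."""
    est = np.asarray(estimates, dtype=float)
    bias = est.mean() - true_value
    rmse = np.sqrt(np.mean((est - true_value) ** 2))
    return bias, rmse

# Illustrative: estimates of a survey item from four simulation replicates,
# with a known true value of 11
b, r = bias_and_rmse([10, 12, 11, 13], true_value=11)
print(b, r)  # bias 0.5, RMSE sqrt(1.5)
```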

Our targeting strategy consisted of:

- enumerating all households for the first two calls or interviewer waves

- restricting interviewer follow-up to half the remaining non-responding households from wave three up to a maximum of ten waves

- exploring three follow-up strategies: targeting the easy respondents, the difficult respondents or a random half.
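The wave structure above can be sketched as a toy simulation. This is a minimal illustration under our own simplifying assumptions (a fixed per-call response propensity per household, "easy"/"difficult" defined by sorting on that propensity), not the actual survey procedures used in the study:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_waves(propensities, strategy="difficult", max_waves=10):
    """Toy version of the targeting design: waves 1-2 attempt every
    outstanding household; from wave 3 onwards only half of the remaining
    non-respondents are followed up, chosen by per-call response propensity
    ("easy" = highest, "difficult" = lowest) or at random.
    Returns the indices of responding households and total calls made."""
    rho = np.asarray(propensities, dtype=float)
    outstanding = np.arange(rho.size)
    responded, calls = [], 0
    for wave in range(1, max_waves + 1):
        if outstanding.size == 0:
            break
        if wave <= 2:
            targets = outstanding
        else:
            k = outstanding.size // 2
            order = np.argsort(rho[outstanding])
            if strategy == "easy":          # highest-propensity half
                targets = outstanding[order[-k:]]
            elif strategy == "difficult":   # lowest-propensity half
                targets = outstanding[order[:k]]
            else:                           # random half
                targets = rng.choice(outstanding, size=k, replace=False)
        calls += targets.size
        # Each targeted household responds with its own propensity
        answered = targets[rng.random(targets.size) < rho[targets]]
        responded.extend(answered.tolist())
        outstanding = np.setdiff1d(outstanding, answered)
    return responded, calls
```

Comparing the calls made and the bias of the resulting respondent set across the three strategy settings reproduces, in miniature, the cost-versus-quality trade-off the study examines.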

All three targeting strategies reduced survey costs and lowered response rates. However, restricting follow-up to the households least likely to respond produced both the lowest survey bias and the lowest response rate, and overall the difficult-respondent strategy achieved a lower RMSE than the other two. The maximal absolute bias was an informative bound under the difficult and random strategies, but too high under the easy strategy.

Further work will incorporate Census data as covariates to improve the response model and use follow-up strategies based on real survey procedures.

Shlomo, N., Skinner, C. & Schouten, B. (2012), 'Estimation of an indicator of the representativeness of survey response', Journal of Statistical Planning and Inference, vol. 142, pp. 201-211.

Further Information
For more information, please contact Lan Kelly (08 8237 7643, lan.kelly@abs.gov.au)