1504.0 - Methodological News, Sep 2003  
ARCHIVED ISSUE Released at 11:30 AM (CANBERRA TIME) 27/02/2004   

CONVERTING MAIL SURVEYS TO CATI: FORM DESIGN AND TESTING

Until now, the Australian Bureau of Statistics has made very limited use of Computer Assisted Telephone Interviewing (CATI) for business surveys. Organisational change, declining response rates and ongoing pressure to produce statistics faster have led to a new interest in this mode of data collection. Recently a CATI instrument for data capture during the very final stage of intensive follow-up (IFU) was developed for the Business Technology Survey. This instrument replaced the informal telephone data collection previously used in this survey.

The main forms design concerns were:

  • the questions in the paper form had to be reworded so they worked as an interview while minimising mode effects. In particular, there were numerous questions with long "select all that apply" lists that had to be converted to yes or no answers, while still obtaining data comparable to the paper form;
  • the explanatory notes in the paper form were too long to read out every time. Notes had to be prioritised, in some cases moved to the question, and respondents had to be encouraged to ask for the rest. Getting good quality data had to be balanced with finishing the interview before respondent fatigue set in;
  • the screen design and functionality of the CATI instrument, developed in Blaise, needed to ensure the interview went smoothly and measurement error was minimal. Some standards could be adapted from the household Computer Assisted Personal Interviewing (CAPI) instruments used by the ABS, also in Blaise; however, the users involved and the types of questions asked are quite different;
  • the survey had two financial questions that could require record checking and/or a different contact person from the other questions, which made the interview potentially awkward to conduct.
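The first concern above, converting a "select all that apply" list into a sequence of yes/no prompts while keeping the data comparable to the paper form, can be illustrated with a minimal sketch. This is not the actual Blaise instrument; the question stem, options and function names are invented for illustration.

```python
# Hypothetical sketch: expand one paper-form multi-select item into
# per-option yes/no CATI prompts, then map the interview responses back
# to the "tick all that apply" set the paper form would have produced.

ITEM_STEM = "Does this business use any of the following technologies?"
OPTIONS = ["Email", "Internet", "Web site", "EDI"]  # illustrative options only

def as_yes_no_prompts(stem, options):
    """Expand a multi-select item into one yes/no question per option."""
    return [f"{stem} ... {opt}? (yes/no)" for opt in options]

def to_multiselect(options, yes_no_answers):
    """Reconstruct the paper form's multi-select answer set from the
    sequence of yes/no responses, so both modes yield comparable data."""
    return {opt for opt, ans in zip(options, yes_no_answers) if ans == "yes"}

prompts = as_yes_no_prompts(ITEM_STEM, OPTIONS)   # four separate questions
answers = ["yes", "yes", "no", "no"]              # example interview responses
selected = to_multiselect(OPTIONS, answers)       # {"Email", "Internet"}
```

The point of mapping responses back to a set is that estimates from the CATI mode can be tabulated against historical paper-form data without changing downstream processing.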

A testing strategy was developed to address these concerns and was implemented in a very short period. An expert review process continued throughout the development. Three different scripts were developed to address the wording and notes concerns. These scripts were tested as paper telephone interviews on randomly assigned groups of live respondents who were excluded from the main survey. The results were compared across scripts and with results from previous surveys and testing of the paper form. The main findings were that each option in lists did not need to be reworded into a separate question and that some basic questions could be open-ended. A final script combining the best parts of the three was tested again to ensure the new questions and notes worked.
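The random assignment described above can be sketched as follows. This is an illustrative design only, under the assumption of equal-sized groups; the group sizes, seed and identifiers are invented.

```python
# Illustrative sketch of the testing design: randomly assign test
# respondents (excluded from the main survey) to one of three draft
# scripts so that results can be compared across scripts.

import random

SCRIPTS = ["script_A", "script_B", "script_C"]  # hypothetical labels

def assign_scripts(respondent_ids, seed=0):
    """Shuffle respondents, then deal them round-robin into one group
    per script, giving groups of near-equal size."""
    rng = random.Random(seed)   # fixed seed so the allocation is reproducible
    ids = list(respondent_ids)
    rng.shuffle(ids)
    return {script: ids[i::len(SCRIPTS)] for i, script in enumerate(SCRIPTS)}

groups = assign_scripts(range(30))
# Each script's group is interviewed with that wording; because assignment
# is random, differences in results can be attributed to the scripts.
```

Round-robin dealing after a shuffle keeps the groups within one respondent of each other in size, which simplifies comparing response quality across the three wordings.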

At the same time a rough prototype was developed in Blaise. An informal user review of two different layout options was conducted. Individual users went through both versions and their behaviour and comments were recorded. The main findings emphasised the need for the layout to follow a normal reading path down the screen. The final script was incorporated into the next prototype. An informal group run-through of the instrument was conducted, followed by formal usability testing. One by one, each operator who would be using the CATI instrument phoned a mock respondent and went through an entire interview. They were observed and debriefed. This allowed the words, layout, navigation and sequencing to be examined together. It was also excellent training in the instrument for the operators.

This process produced not only an effective new data collection instrument, but a group of users who had contributed to its development and were therefore satisfied with the final result.

For more information, please contact: Emma Farrell on (02) 6252 7316.

Email: emma.farrell@abs.gov.au.