4720.0 - National Aboriginal and Torres Strait Islander Social Survey: User Guide, 2014-15  
Latest issue released at 11:30 am (Canberra time) 27/05/2016

NON-SAMPLING ERROR


Lack of precision due to sampling variability should not be confused with inaccuracies that may occur for other reasons, such as errors in response and recording. Inaccuracies of this type are referred to as non-sampling error. This type of error is not specific to sample surveys and can occur in a census enumeration.

The major sources of non-sampling error are:
  • errors related to survey scope;
  • response errors;
  • undercoverage; and
  • errors in processing.

These sources of error are discussed in turn below.

Errors related to survey scope

Some dwellings may have been inadvertently included or excluded because, for example, the distinction between whether they were private or non-private dwellings may have been unclear. All efforts were made to overcome such situations by constant updating of dwelling listings both before and during the survey. Additionally, some people may have been inadvertently included or excluded because of difficulties in applying the scope rules concerning who was identified as usual residents, and concerning the treatment of some visitors. For more information on scope and coverage see the Methodology chapter.

Response errors

In this survey, response errors may have arisen from three main sources:
  • questionnaire design and methodology;
  • interviewing technique; and
  • inaccurate reporting by the respondent.

Errors may have been caused by misleading or ambiguous questions, inadequate or inconsistent definitions of terminology, or by poor overall survey design (eg context effects, where responses to a question are directly influenced by the preceding question/s). In order to overcome these types of issues, individual questions and the overall questionnaire were tested before the survey was enumerated. Testing included:
  • focus groups;
  • peer review; and
  • field testing (a dress rehearsal).

More information on pre- and field testing is provided in the Survey development chapter.

As a result of testing, modifications were made to:
  • question design, wording and sequencing;
  • the respondent booklet (prompt cards); and
  • survey procedures.

In considering modifications it was sometimes necessary to balance better response to a particular item/topic against increased interview time, effects on other parts of the survey and the need to minimise changes to ensure international comparability. Therefore, in some instances it was necessary to adopt a workable/acceptable approach rather than an optimum approach. Although changes would have had the effect of minimising response errors due to questionnaire design and content issues, some will have inevitably occurred in the final survey enumeration.

Response errors may also have occurred due to the length of the survey interview because of interviewer and/or respondent fatigue (ie loss of concentration). While efforts were made to minimise errors arising from deliberate misreporting or non-reporting, some instances will have inevitably occurred.

Accuracy of recall may also have led to response error, particularly in relation to the lifetime questions. Information in this survey is essentially 'as reported', and therefore may differ from information available from other sources or collected using different methodologies. Responses may be affected by imperfect recall or individual interpretation of survey questions, particularly when a person was asked to reflect on experiences in the 12 months prior to interview. The questionnaire was designed to strike a balance between minimising recall errors and ensuring the data was meaningful, representative (from both respondent and data use perspectives) and would yield sufficient observations to support reliable estimates. It is possible that the reference periods did not suit every person for every topic, and that difficulty with recall may have led to inaccurate reporting in some instances.

A further source of response error is lack of uniformity in interviewing standards. To ensure uniform interviewing practices and a high level of response accuracy, extensive interviewer training was provided. An advantage of using Computer Assisted Interviewing (CAI) technology to conduct survey interviews is that it potentially reduces non-sampling error. More information on interviews, interviewer training, the survey questionnaire and CAI is provided in the Survey development and Data collection chapters.

Response errors may also have occurred due to language or reading difficulties. In some instances, a proxy interview was conducted on behalf of a person who was unable to complete the questionnaire themselves due to language problems and for whom an interpreter could not be organised. A proxy interview was only conducted where another person in the household (aged 15 years or over) was considered suitable. The proxy may also have been a family member who did not live in the selected household, but lived nearby. A proxy arrangement was only undertaken with the agreement of the selected person, who was first made aware of the topics covered in the questionnaire. Aside from difficulties in understanding spoken English, there may have been difficulties in understanding written English. The 2014–15 NATSISS incorporated the extensive use of prompt cards, as pre-testing indicated that these could aid interpretation by selected persons. People were asked whether they would prefer to read the cards themselves or have them read out by the interviewer. It is possible that some of the terms or concepts used on the prompt cards were unfamiliar and may have been misinterpreted, or that a response was selected because of its position on the prompt card.

Some respondents may have provided responses that they felt were expected, rather than those that accurately reflected their own situation. Every effort has been made to minimise such issues through the development and use of culturally appropriate survey methodology. Non-uniformity of interviewers themselves is also a potential source of error, in that the impression made upon respondents by personal characteristics of individual interviewers such as age, sex, appearance and manner, may influence the answers obtained.

Undercoverage

Undercoverage is one potential source of non-sampling error and is the shortfall between the population represented by the achieved sample and the in-scope population. It can introduce bias into the survey estimates. However, the extent of any bias depends upon the magnitude of the undercoverage and the extent of the difference between the characteristics of those people in the coverage population and those of the in-scope population.

Potential sources of undercoverage include:
  • frame exclusions;
  • non-response;
  • non-identification as Aboriginal and Torres Strait Islander; and
  • issues arising in the field.

Information on the potential sources of undercoverage is provided in the Explanatory notes of the 2014–15 NATSISS summary publication.

Rates of undercoverage

Undercoverage rates can be estimated by calculating the difference between the sum of the initial weights of the sample and the population count, expressed as a proportion of the population count. If a survey had no undercoverage, the sum of the initial weights of the sample would equal the population count (ignoring small variations due to sampling error).
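The calculation above can be sketched as follows. This is an illustration only; the respondent counts, weights and population figure are invented and bear no relation to the actual NATSISS sample.

```python
# Illustration of the undercoverage-rate calculation: the shortfall between
# the population represented by the sample (the sum of its initial weights)
# and an independent in-scope population count, as a proportion of that count.

def undercoverage_rate(initial_weights, population_count):
    """Return the estimated undercoverage rate as a fraction of the
    in-scope population."""
    covered = sum(initial_weights)  # population represented by the sample
    return (population_count - covered) / population_count

# Invented example: 2,000 respondents, each with an initial weight of 125
# (i.e. each representing 125 persons), against an in-scope population
# count of 650,000.
weights = [125.0] * 2000
rate = undercoverage_rate(weights, 650_000)
print(f"{rate:.1%}")  # → 61.5%
```

With no undercoverage the weights would sum to the full 650,000 and the rate would be zero.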

The overall undercoverage rate for the 2014–15 NATSISS is approximately 62% of the in-scope population at the national level. This rate varies across the states and territories, as shown in the table below. This is a relatively large level of undercoverage when compared to other ABS surveys. For example, the estimated undercoverage rate for the Monthly Population Survey for private dwellings is on average 17%, and the non-response rate is around 7%. There was also an increase in undercoverage compared to the 2008 NATSISS, which had an estimated undercoverage rate of 53%.

Table 5.2.1. 2014–15 NATSISS undercoverage, by state/territory

                        NSW    Vic    QLD    SA     WA     Tas    NT     ACT    Australia
                        %      %      %      %      %      %      %      %      %
Rate of undercoverage   65.4   69.3   59.9   60.7   59.0   49.6   60.0   63.8   61.9

Potential bias due to undercoverage was addressed by the application of an adjustment to the initial weights. The weights were calibrated to population benchmarks to account for the undercoverage at the various calibration levels. More detailed information on weighting and benchmarking is provided in the Methodology chapter.
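The benchmark adjustment described above can be sketched as a simple post-stratification: within each calibration cell, initial weights are scaled so they sum to an independent population benchmark. The cell labels and figures below are invented for illustration and are not the actual NATSISS benchmark structure, which is described in the Methodology chapter.

```python
# A minimal sketch of calibrating weights to population benchmarks.
# Within each calibration cell, every weight is multiplied by the ratio of
# the cell's benchmark to the cell's current weighted total, so that the
# adjusted weights sum exactly to the benchmark.

def calibrate(records, benchmarks):
    """records: list of (cell, weight) pairs; benchmarks: cell -> population.
    Returns records with weights scaled to the benchmarks (post-stratification)."""
    cell_totals = {}
    for cell, weight in records:
        cell_totals[cell] = cell_totals.get(cell, 0.0) + weight
    return [(cell, weight * benchmarks[cell] / cell_totals[cell])
            for cell, weight in records]

# Invented cells of (state, age group); initial weights fall short of the
# benchmarks because of undercoverage.
records = [(("NSW", "15-24"), 100.0), (("NSW", "15-24"), 120.0),
           (("NSW", "25-44"), 150.0)]
benchmarks = {("NSW", "15-24"): 440.0, ("NSW", "25-44"): 300.0}
adjusted = calibrate(records, benchmarks)
# After calibration, weights within each cell sum exactly to its benchmark.
```

In practice the ABS uses more sophisticated calibration across several benchmark levels simultaneously; this sketch shows only the basic ratio adjustment that compensates for undercoverage.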

Errors in processing

Errors may also occur during data processing, between the initial collection of the data and the final compilation of statistics. These may be due to a failure of computer editing programs to detect errors in the data, or may arise during the manipulation of raw data to produce the final survey data files; for example, in the course of deriving new data items from raw survey data, or during the estimation procedures or weighting of the data file.

To minimise the likelihood of these errors, a number of quality assurance processes were employed, including:
  • computer editing—edits were devised to ensure that logical sequences were followed in the questionnaires, that necessary items were present and that specific values lay within certain ranges. These edits were designed to detect reporting and recording errors, incorrect relationships between data items or missing data items;
  • data file checks—at various stages during processing (such as after computer editing or after derivation of new data items) frequency counts and/or tabulations were obtained from the data file showing the distribution of persons for different characteristics. These were used as checks on the content of the data file, to identify unusual values which may have significantly affected estimates and illogical relationships not previously identified. Further checks were conducted to ensure consistency between related data items, and in the relevant populations; and
  • where possible, the data were checked to ensure consistency of the survey outputs against results of other ABS surveys, such as the 2008 NATSISS and the 2012–13 Australian Aboriginal and Torres Strait Islander Health Survey.
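The computer edits described above can be sketched as per-record checks. The field names, valid ranges and filter rules below are invented for illustration; the actual NATSISS edits are far more extensive.

```python
# Hedged sketch of three kinds of computer edit mentioned above: a range
# edit (value must lie within an allowed interval), a sequence edit (an item
# present only when its filter question applies) and a missing-item edit
# (a required item must be present). All field names and rules are invented.

def edit_record(record):
    """Return a list of edit failures for one survey record (a dict)."""
    failures = []
    # Range edit: age must lie within the survey's plausible range.
    if not (15 <= record.get("age", -1) <= 120):
        failures.append("age out of range")
    # Sequence edit: hours worked should only be reported by employed persons.
    if record.get("employed") == "no" and "hours_worked" in record:
        failures.append("hours_worked reported for non-employed person")
    # Missing-item edit: sex is a required item.
    if "sex" not in record:
        failures.append("sex missing")
    return failures

print(edit_record({"age": 200, "employed": "no", "hours_worked": 38}))
# → ['age out of range', 'hours_worked reported for non-employed person', 'sex missing']
```

Records that fail such edits are flagged for review or correction rather than passed silently into the final data file.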
