4720.0 - National Aboriginal and Torres Strait Islander Social Survey: Users' Guide, 2008  
ARCHIVED ISSUE Released at 11:30 AM (CANBERRA TIME) 26/02/2010   

SURVEY DESIGN


OVERVIEW

This chapter provides information on the following:
  • survey development;
  • data collection;
  • data processing;
  • survey methodology; and
  • comparison to the 2002 NATSISS.



SURVEY DEVELOPMENT

Questionnaire testing

Pre-testing

Pre-testing covers a range of testing techniques, the common feature being that the testing is conducted prior to taking a survey into the field (ie 'field test'). This phase of testing is critical for identifying problems for both respondents and interviewers, particularly regarding question content. The techniques used are designed to identify problems with the part of the questionnaire being tested and give some indication of the source of the error.

Techniques available during the pre-testing phase of questionnaire development included:
  • Focus groups - small groups of people who represent or who share similar characteristics with the target survey population. The ABS conducted NATSISS focus groups in June 2007 in both remote and non-remote locations in NSW, SA, WA and NT. Participants were presented with topics and questions which were intended to be included in the survey and were asked to provide feedback about the proposed topics and question wording.
  • Expert evaluation - a peer review process used to identify respondent semantic and task problems, assess content validity and translate concepts.

A major advantage of pre-testing is that small samples are used, and limited survey-specific documentation and training is required, as the testing is performed by people working on the survey. Consequently the process can allow several iterative tests of a set of questions in the time it would take to conduct a pilot test.

The broad objectives of the 2008 NATSISS pre-testing were to assess:
  • respondent understanding of proposed concepts and questions;
  • respondent reaction to potentially sensitive data items, and whether this affects reporting;
  • the construct validity of proposed questions; and
  • the availability of data.

Field testing

The next phase of survey development involved field testing the survey questionnaire and procedures.

Pilot Test

The Pilot Test is the first field test of the entire question set. Testing is designed to:
  • check the positioning and interlinking of various questions or modules;
  • test the mechanical accuracy of the questionnaire, including sequencing, populations and general layout;
  • undertake a full timing analysis;
  • assess the effectiveness of interviewer training and documentation; and
  • assess field, office management and procedural issues.

Pilot testing was conducted in November 2007 in both remote and non-remote areas in NSW, SA, WA and NT. A maximum of four people per household (two adults and two children) were randomly selected to be interviewed.

Dress Rehearsal

The Dress Rehearsal is the final test in the development cycle and mainly focuses on the procedural and timing aspects of the survey. Primarily, it is an operational test. Questionnaire design errors (eg sequencing errors) can be identified, investigated and corrected. Objectives of the Dress Rehearsal are to:
  • confirm the average interview time of the survey (for modules and the whole survey);
  • identify and rectify any issues that interviewers have with procedures, survey documentation, the survey questionnaire, or the Computer Assisted Interviewer Workload Management System;
  • identify and rectify any issues that respondents have with survey content and structure;
  • refine and add any necessary edits, edit notations or edit resolution instructions;
  • refine and improve the survey's documentation (Interviewers' Instructions, Office Instructions, flowcharts, etc); and
  • refine and improve interviewer training.

The dress rehearsal was conducted in April/May 2008 in both remote and non-remote locations in Victoria, SA, WA and NT.

DATA COLLECTION

The survey was conducted under the authority of the Census and Statistics Act 1905. The ABS sought the willing cooperation of households in this survey. For survey questions of a particularly sensitive nature (eg substance use) selected persons (or their proxies) may not have provided a response. More detailed information on allowed survey responses (including refusals) is provided in each of the topic-based chapters.

The confidentiality of all information provided by respondents is guaranteed: under the Census and Statistics Act 1905, the ABS cannot release identifiable information about households or individuals. All aspects of the survey's implementation were designed to conform to the Information Privacy Principles set out in the Privacy Act 1988, and the Privacy Commissioner was informed of the details of the proposed survey.

Trained ABS interviewers conducted personal interviews at selected private dwellings from August 2008 to April 2009. Interviews were predominantly conducted using a Computer-Assisted Interviewing (CAI) questionnaire. CAI involves the use of a notebook computer to record, store, manipulate and transmit the data collected during interviews. In remote areas, a paper back-up of the questionnaire was available, if needed, but generally was not used. In non-remote areas a self-enumerated paper form was used to collect information on substance use.

Prior to enumeration, ABS interviewers participated in cultural awareness training, which provided information specifically developed for surveys involving Aboriginal and Torres Strait Islander people. The training outlined the ABS protocol for conducting surveys in community areas and described cultural considerations for interviewers.


Interviews

Interviewers conducted a screening process to identify Indigenous households, that is, households where one or more household members were identified as being Aboriginal or Torres Strait Islander. Interviewers went to dwellings in selected areas and asked one usually resident household member (aged 18 years or over) whether anyone in the household was of Aboriginal or Torres Strait Islander origin. If the household spokesperson stated that one or more usual residents were Aboriginal or Torres Strait Islander, the household form was commenced.

The household form collected general characteristics of the household, from one usually resident household member aged 18 years or over. This information included:
  • age;
  • sex;
  • geographic information;
  • relationship in household;
  • whether anyone aged 15-24 years was a full-time student; and
  • Indigenous status.

Based on this demographic information, individuals were randomly selected for personal interview. For selected households in discrete remote Indigenous communities and outstations, up to one Indigenous person aged 15 years and over and up to one Indigenous child aged 0-14 years were randomly selected. For selected households in non-remote and remote non-community areas, up to two Indigenous persons aged 15 years and over and up to two Indigenous children aged 0-14 years were randomly selected.
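To illustrate the within-household selection rules described above, the following Python sketch randomly selects up to one Indigenous adult and one Indigenous child in remote community areas, or up to two of each elsewhere. The function and roster names are invented for illustration; this is not the ABS selection system.

  import random

  def select_respondents(adults, children, remote_community):
      # Remote community areas: up to 1 adult and 1 child.
      # Non-remote and remote non-community areas: up to 2 of each.
      n = 1 if remote_community else 2
      selected_adults = random.sample(adults, min(n, len(adults)))
      selected_children = random.sample(children, min(n, len(children)))
      return selected_adults, selected_children

  # Example: a non-remote household with three Indigenous adults and one child.
  adults, kids = select_respondents(["A", "B", "C"], ["D"], remote_community=False)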

An elected household spokesperson also answered some financial and housing items on behalf of other household members, including:
  • household facilities such as phone, bedrooms, computers;
  • household maintenance and working utilities;
  • dwelling tenure type including rental and mortgage arrangements;
  • household financial stress; and
  • income for non-selected adults.

If a usually resident household member aged 18 years or over was not available, interviewers made appointments to call-back to the household, as necessary.

Information on demographic and household characteristics is provided in the Population characteristics chapter. A full list of the topics included in both the personal interviews and the household form is provided in the Introduction.

In order to conduct a personal interview with the selected person (ie the respondent), interviewers made appointments to call back to the household, as necessary. In some cases appointments for call-backs were made by telephone; however, all interviews were conducted face-to-face. Due to the sensitive nature of the survey questions, it was suggested that interviews be conducted in private; however, interviews could be conducted in the presence of other household members, according to the wishes of the respondent. Interviews, including the household assessment, took on average 109 to 119 minutes per fully-responding household in remote and non-remote areas.

Personal interviews were conducted with selected Indigenous persons aged 15 years and over. Exceptions occurred where the selected person:
  • was unable to complete the survey due to injury or illness; or
  • did not have sufficient English skills and an interpreter was unable to be arranged.

In the above instances, a proxy interview may have been organised. Where the selected person was mourning the death of a family member (Sorry Business) a personal interview was not pursued.

Proxy interviews were used to collect information on selected Indigenous children aged 0-14 years. Wherever possible, the proxy was a parent or guardian. If no parent or guardian was available, then a close relative or other household member who had responsibility for the child provided responses.

Where consent for interview was not given by a parent or guardian of an Indigenous person aged 15-17 years, a personal interview was not conducted.


Questionnaire

The questionnaire was administered by experienced ABS interviewers, who had received specific training for the survey. The questionnaire was further supported by detailed interviewer instructions, covering general procedural issues as well as specific instructions relating to individual questions.

The questionnaire is not fully indicative of the range of information available from the survey, as additional items were created in processing the data. For example, ABS classifications were applied to raw data inputs to create labour force status. Additionally, some questions were asked solely for the purpose of enabling or clarifying other questions, and are not available in survey results.

Initial household information was collected from one usually resident household member aged 18 years and over using a Household Form. This was similar in design to the household form used by the ABS Monthly Population Surveys (MPS). From this information, one Indigenous adult and one Indigenous child (in remote community areas), or up to two Indigenous adults and up to two Indigenous children (in non-remote and remote non-community areas), were randomly selected to complete a personal interview. The personal interview consisted of a number of separate modules, collecting information on demographics; language and culture; social capital; life experiences; health; education; work; income and finances; housing and mobility; transport; information technology; and safety, crime and justice.


Computer assisted interviewing (CAI)

Interviews were conducted using a Computer Assisted Interviewing (CAI) questionnaire. CAI involves the use of a notebook computer to record, store, manipulate and transmit the data collected during interviews. This type of questionnaire offers important advantages over paper versions, including:
  • the ability to check the responses entered against previous responses, to reduce data entry errors by interviewers. The audit trail recorded in the questionnaire also provides valuable information about the operation of particular questions, and associated data quality issues.
  • the ability to use complex sequencing to define specific populations for questions, and ensure word substitutes used in the questions are appropriate to each respondent's characteristics and prior responses.
  • the ability to capture data electronically at the point of interview, removing the added cost, logistical, timing and quality issues around the transport, storage and security of paper forms, and the capture of information from paper forms into a computerised format.
  • the ability to deliver data in an electronic semi-processed form compatible with ABS data processing facilities (semi-processed in terms of data validation and some derivations which occur within the questionnaire itself). While both the input and output data still need to be separately specified to the processing system, input of the data in this form assists in that specification task and reduces the amount and complexity of some later processing tasks.
  • the provision for interviewers to record comments to help explain or clarify certain responses, or provide supplementary information to assist in office coding.

The questionnaire employed a number of different approaches to recording information at the interview:
  • questions where responses were classified by interviewers to one or more predetermined response categories. This approach was used for recording answers to more straightforward questions, where logically a limited range of responses was expected, or where the focus of interest was on a particular type or group of response (which were listed in the questionnaire, with the remainder being grouped together under ‘other’);
  • questions asked in the form of a running prompt, ie predetermined response categories read out to the respondent one at a time until the respondent indicated agreement to one or more of the categories (as appropriate to the topic) or until all the predetermined categories were exhausted; and
  • questions asked in association with prompt cards, ie where printed lists of possible answers were handed to the respondent, who was asked to select the most relevant response(s). By listing a set of possible responses (either in the form of a prompt card or a running prompt question), the prompt served to clarify the question or to present various alternatives, to refresh the respondent's memory, and to assist the respondent to select an appropriate response.

To ensure consistency of approach, interviewers were instructed to ask the interview questions as shown in the questionnaire. In certain areas of the questionnaire, interviewers were asked to use indirect and neutral prompts, at their discretion, where the response given was, for example, inappropriate to the question asked or lacked sufficient detail necessary for classification and coding.


Copies of the survey questionnaire

With the release of this publication, a paper copy of the 2008 NATSISS questionnaire has also been made available on the ABS website. The survey questionnaire is provided as a reference to the 2008 NATSISS and should not be used for administering survey interviews.

DATA PROCESSING

Data capture

Computer-based systems were used to collect and process data from the survey. The survey used computer-assisted interviewing (CAI) for data collection and the ABS Household Survey Facilities (HSF) system to process the survey.

The use of CAI ensured that respondents were correctly sequenced throughout the questionnaire. Inbuilt edits meant that some issues could be clarified with the respondent at the time of interview.

Interviewer workloads were electronically loaded on receipt in the ABS office in each state or territory. Checks were made to ensure the workloads were fully accounted for and that questionnaires for each household and respondent were completed. Problems with the questionnaire identified by interviewers were resolved, where possible, by using other information contained in the questionnaire or by referring to the comments provided by interviewers.


Coding

Computer-assisted coding was performed on responses to questions on:
  • family relationships;
  • geography;
  • language;
  • education;
  • occupation; and
  • industry.

Classifications

Family relationships

Coding is based on household information collected for all persons in each dwelling. All usual residents were grouped into family units and classified according to their relationship within the family.

Geography

Geography data were classified according to the Australian Standard Geographical Classification (ASGC), Jul 2008 (cat. no. 1216.0).

Language

Languages spoken were coded utilising the Australian Standard Classification of Languages (ASCL), 2005-06 (cat. no. 1267.0).

Education

Coding is based on the level and field of education as reported by respondents and recorded by interviewers.

Educational attainment data were classified according to the Australian Standard Classification of Education (ASCED), 2001 (cat. no. 1272.0).

Occupation

Data were classified according to the:
  • Australian and New Zealand Standard Classification of Occupations (ANZSCO), First Edition, 2006; and
  • Australian Standard Classification of Occupations (ASCO), Second Edition, 1997 (cat. no. 1220.0).

Industry

Data were classified according to the Australian and New Zealand Standard Industrial Classification (ANZSIC), 2006 and 1993 (cat. no. 1292.0).

ABS office coding

There were a number of open-ended questions which required coding by ABS office staff. An example of office coding related to these types of questions occurred when people were asked about the types of contact they had made with family or friends who did not live with them (eg landline phone, mobile phone, internet, mail, etc). People were asked to nominate one or more response categories from a provided list, or they could provide an 'other' type of contact, which was then specified in a text field. Where possible, the text responses were allocated a code, which was based on the original list of categories.
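As an illustration of this office coding step, the sketch below allocates free-text 'other' responses to codes from the original category list where possible. The keyword map and function are hypothetical; actual ABS coding rules are more extensive.

  CONTACT_CODES = {"landline": 1, "mobile": 2, "internet": 3, "mail": 4}

  def code_other_response(text):
      # Return a category code for a free-text response, or None if it
      # cannot be matched and must remain in a residual 'other' category.
      text = text.lower()
      for keyword, code in CONTACT_CODES.items():
          if keyword in text:
              return code
      return None

  assert code_other_response("called on the mobile phone") == 2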


Output processing

Information from the questionnaire, other than names and addresses, was stored on a computer output file in the form of data items. In some cases, items were formed from answers to individual questions, while in other cases data items were derived from answers to several questions.

During processing of the data, checks were performed on records to ensure that specific values lay within valid ranges and that relationships between items were within limits deemed acceptable for the purposes of this survey. These checks were also designed to detect errors which may have occurred during processing and to identify instances which, although not necessarily an error, were sufficiently unusual or close to agreed limits to warrant further examination.
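A minimal sketch of the two kinds of checks described above, a range check on a single item and a consistency check between related items, is shown below. The item names and limits are assumptions for illustration, not the survey's actual edit specifications.

  def validate(record):
      issues = []
      if not 0 <= record["age"] <= 120:                # range check
          issues.append("age outside valid range")
      if record["years_in_dwelling"] > record["age"]:  # consistency check
          issues.append("time in dwelling exceeds age")
      return issues

  # Flags the second check: 30 years in the dwelling is impossible at age 25.
  print(validate({"age": 25, "years_in_dwelling": 30}))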

Throughout processing, frequency counts and tables containing cross-classifications of selected data items were produced for checking purposes. The purpose of this analysis was to identify any problems in the input data which had not previously been identified, as well as errors in derivations or other inconsistencies between related items.

Output file

A multi-level hierarchical data file was produced. The contents of the person and household levels are briefly described below:
  • Person level - contains the majority of information on the respondent, including demographic and socio-economic characteristics (eg age, sex, marital status, education, labour force status, personal income), language and culture, health, life experiences, social capital, crime and safety, information technology, transport etc; and
  • Household level - contains information on the household and its members, collected in the household form, including family composition of household, household structure, number of persons/children in household, household income, dwelling tenure type, etc. The household level also contains some information on information technology and transport.

More information on the output file is provided in the Using the CURFs chapter.

Validation checks

The output data file was extensively validated through item-by-item examination of input and output frequencies, checking populations through derivations, internal consistency of items and across levels of the file, data confrontation, etc. Despite these checks, it is possible that some small errors remain undetected on the file.

As a result of the validation processes, some adjustments were made to data on a record-by-record basis. Changes were made with reference to other information provided by respondents, and only to correct clear errors that were not identified during the survey interview; for example, where the reported amount of time lived in the previous and/or current dwelling was inconsistent with a person's age. Adjustments may also have occurred as the result of an edit not being applied or being by-passed; for example, where the response to a question was recorded as 'don't know' but the question was subsequently answered. In cases where the interviewer did not or was unable to return to the original question, the details may have been recorded in a text file.

In general, unless data were 'impossible', they have not been corrected, and results are essentially 'as reported'. To treat 'improbable' responses (eg extremely high alcohol consumption or income), some outliers have been reclassified, primarily to 'not stated' values. Some of these adjustments were made record-by-record; for others, a global change was applied to all records where reported values lay outside acceptable limits.
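A minimal sketch of such a global treatment is shown below; the limit and the 'not stated' code are invented for illustration.

  NOT_STATED = -1             # hypothetical 'not stated' code
  MAX_WEEKLY_INCOME = 20_000  # hypothetical acceptable upper limit

  def treat_outlier(value):
      # Reclassify improbable values to 'not stated' rather than correcting them.
      return NOT_STATED if value > MAX_WEEKLY_INCOME else value

  incomes = [850, 1_200, 95_000]                 # third value is an outlier
  cleaned = [treat_outlier(v) for v in incomes]  # -> [850, 1200, -1]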

Decisions to apply treatments or adjustments to the data were made, as appropriate, by the ABS.

Data confrontation

In the final stages of processing, extensive analyses, including data confrontation, were undertaken to ensure the survey estimates conformed to known or expected patterns, and were broadly consistent with data from the 2002 NATSISS or from other ABS data sources, allowing for methodological and other factors which might impact comparability.

Comparisons of numerous demographic and socio-economic characteristics indicated that some of the 2008 NATSISS estimates did not align well with other ABS estimates due to potential undercoverage and other factors outlined elsewhere. As a result, additional benchmarks were incorporated into the survey's weighting strategy. See 'Weighting, benchmarking and estimation' for more information.

Detailed analyses were undertaken for each topic to check for consistency with data from other ABS and external sources. Checks undertaken include:
  • comparison of estimates between sources;
  • looking at changes over time where items were comparable with other sources; and
  • comparisons of estimates by geographic region.

For new survey topics, external sources were used for comparison. For example, administrative data from the Australian Institute of Health and Welfare (AIHW) were used to check the maternal health and child health topics.

Data available from the survey are essentially 'as reported' by respondents. The procedures and checks outlined above were designed primarily to minimise errors occurring during processing. In some cases it was possible to correct errors or inconsistencies in the data as originally recorded in the interview, through reference to other data in the record; in other cases this was not possible, and some errors and inconsistencies may remain on the data file.

SURVEY METHODOLOGY

Scope and coverage

Scope

The scope of the survey is all Indigenous people who were usual residents of private dwellings in Australia. Private dwellings are:
  • houses;
  • flats;
  • home units; or
  • any other structures used as private places of residence at the time of the survey.

Usual residents are people who usually live in a particular dwelling and regard it as their own or main home. People usually resident in non-private dwellings, such as hotels, motels, hostels, hospitals, nursing homes, or short-stay caravan parks were not in scope.

Further scope exclusions for this survey were:
  • Non-Indigenous persons;
  • Non-Australian diplomats, diplomatic staff and members of their household;
  • Members of non-Australian defence forces stationed in Australia and their dependents; and
  • Overseas visitors.

Coverage

The 2008 NATSISS was conducted in remote and non-remote areas in all states and territories of Australia, including discrete Indigenous communities. Coverage refers to the extent to which the defined scope is represented by the achieved sample; undercoverage is the shortfall between the population represented by the achieved sample and the in-scope population. Undercoverage can be planned or unplanned. In this survey, coverage exclusions were explicitly applied to some people who were part of the in-scope population, to manage enumeration costs. These people were not included in the sampling frame (based on where Indigenous households were identified in the 2006 Census of Population and Housing). These exclusions were:
  • Collection Districts (CDs) with no Indigenous households;
  • Some Mesh Blocks with no Indigenous households;
  • Some remote (and very remote) Indigenous communities with a small number of Indigenous households; and
  • Some CDs in remote (and very remote) areas with a small number of Indigenous households.

These coverage exclusions result in an estimated undercoverage of approximately 6% of the in-scope Indigenous persons in Australia. Although these areas were not enumerated, the final sample was weighted to population benchmarks to account for these exclusions. In addition, there were further unplanned exclusions which resulted in increased undercoverage. Further information on undercoverage is provided in the Interpretation of results chapter.

The projected resident Indigenous population at 31 December 2008, excluding those living in non-private dwellings, was 520,350. At the same date, there were an estimated 24,400 Indigenous people living in non-private dwellings throughout Australia and approximately 200 Indigenous people considered to be migratory or living offshore.

For this survey, the population benchmarks were projections of the most recently released Estimated Resident Indigenous Population (ERP) data, in this case, 30 June 2006. For information on the methodology used to produce the projected resident Indigenous population see Experimental Estimates and Projections, Aboriginal and Torres Strait Islander Australians, 1991 to 2021 (cat. no. 3238.0). To create the population benchmarks for the 2008 NATSISS reference period, the Indigenous ERP from 30 June 2006 was projected forward to 31 December 2008 using average annual growth rates observed between the 2001 and 2006 censuses.
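As a worked illustration of this projection arithmetic, the sketch below compounds an average annual growth rate over the 2.5 years from 30 June 2006 to 31 December 2008. The base population and growth rate are invented; the actual inputs are published in cat. no. 3238.0.

  base_erp = 500_000     # hypothetical Indigenous ERP at 30 June 2006
  annual_growth = 0.022  # hypothetical average annual growth rate, 2001-2006
  years = 2.5            # 30 June 2006 to 31 December 2008

  projected = base_erp * (1 + annual_growth) ** years
  print(f"Projected population at 31 December 2008: {projected:,.0f}")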

The coverage of this survey included persons who self-identified (in a face-to-face question) as being Indigenous, whereas the benchmarks (scope) are based on persons who identified as being Indigenous in the Census and the Post Enumeration Survey (PES), which use different identification mechanisms. Whether a person identifies as Indigenous can depend on the data collection methodology. For example, in the 2006 PES, which collects data via a face-to-face interview, around 10% of the respondents who had identified themselves as Indigenous in the 2006 Census, via a self-completion form, stated they were non-Indigenous.


Sample design

The 2008 NATSISS was designed to produce reliable estimates at the national level and for each state and territory. For selected states and territories (NSW, Qld, WA and NT) the sample for children aged 0-14 years and persons aged 15 years and over was allocated to produce estimates that have a relative standard error (RSE) of no greater than 25% for characteristics that at least 5% of these populations would possess. The survey was also designed to provide reliable estimates at the national level for children aged 0-3 years, and at the Victorian Inner City and Regional levels for children aged 0-14 years, with the same RSE requirements.

As with previous ABS Indigenous surveys, additional sample was collected in the Torres Strait Area, to ensure data of sufficient quality would be available for the Torres Strait Area and the remainder of Queensland.

Funding was received from the Council of Australian Governments (CoAG) and agreed through the Working Group on Indigenous Reform to enable the collection of data on Indigenous children aged 0-14 years. Funding was also received from the Victorian Government Department of Education and Early Childhood Development (DEECD) to enable additional sample to be included for Victoria.

Community and non-community samples

The sample design incorporated a random selection of:
  • discrete Indigenous communities (including any out-stations associated with them), known as the 'community sample'; and
  • dwellings in areas not covered by the community sample, referred to here as the 'non-community sample'.

The samples for community and non-community areas were designed separately, with each involving a multistage sampling process. These two designs were combined to ensure that all areas of the in-scope population had a chance of being selected in the survey. It was not possible for someone to be selected in both the community and non-community sample.

The sample design differed by community and non-community areas in:
  • Queensland (Qld);
  • South Australia (SA);
  • Western Australia (WA); and
  • the Northern Territory (NT).

The sample design was the same throughout the following states and territories, corresponding with the method used in non-community areas of Qld, SA, WA and the NT:
  • New South Wales (NSW);
  • Victoria (Vic);
  • Tasmania (Tas); and
  • the Australian Capital Territory (ACT).

Community sample

The community sample in remote areas was obtained from a random selection of discrete Indigenous communities and out-stations using a specially developed Indigenous Community Frame. This frame was constructed for operational purposes using counts from the 2006 Census of Population and Housing and information collected in the 2006 Community Housing and Infrastructure Needs Survey (CHINS). All communities on this frame were in remote (or very remote) areas of Qld, SA, WA and the NT. From this frame, 71 Community Sets (each containing one main community and zero or more out-stations) were selected for enumeration. The number of Community Sets selected in each applicable state/territory was:
  • 14 in Qld;
  • 5 in SA;
  • 11 in WA; and
  • 41 in the NT.

A random selection of dwellings was made within the selected communities and out-stations, with different selection procedures applied to the main communities and out-stations. Depending on the size of the main community, up to 37 in-scope dwellings were selected for enumeration. All in-scope dwellings in selected out-stations were enumerated. Within each selected dwelling, up to one Indigenous person (aged 15 years or over) and up to one Indigenous child (aged 0-14 years) were randomly selected to participate in the survey.

Non-community sample

In non-community areas, dwellings were selected using a stratified multistage area sample. For the first time, Mesh Block level information within Census Collection Districts (CDs) was used to assist in targeting Indigenous people. A Mesh Block is a geographic building block, smaller than a CD, consisting of approximately 50 households. A sample of CDs was randomly selected within each state and territory, with the likelihood of selection based on the number of Indigenous dwellings recorded in the area in the 2006 Census. All Mesh Blocks containing at least one Indigenous household within the selected CDs were screened. Mesh Blocks containing no Indigenous households were either excluded from coverage or randomly sampled for screening. This approach significantly reduced screening effort in areas of low Indigenous density, such as major capital cities. In remote and very remote areas, all households in selected CDs were screened.
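The sketch below illustrates one standard way of selecting areas with likelihood proportional to a size measure (systematic probability-proportional-to-size sampling). It shows the general technique only, not the ABS's actual design system, and the dwelling counts are invented.

  import random

  def pps_systematic_sample(units, sizes, n):
      # Select n units with probability proportional to size by walking
      # through the cumulative size totals at a fixed interval.
      total = sum(sizes)
      interval = total / n
      start = random.uniform(0, interval)
      points = [start + k * interval for k in range(n)]
      sample, cumulative, i = [], 0, 0
      for unit, size in zip(units, sizes):
          cumulative += size
          while i < n and points[i] < cumulative:
              sample.append(unit)
              i += 1
      return sample

  # CDs weighted by a 2006 Census count of Indigenous dwellings (invented).
  cds = ["CD01", "CD02", "CD03", "CD04", "CD05"]
  indigenous_dwellings = [40, 5, 25, 10, 20]
  print(pps_systematic_sample(cds, indigenous_dwellings, n=2))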

For each randomly selected dwelling, one usual resident aged 18 years or over, was asked whether anyone in the household was of Aboriginal or Torres Strait Islander origin. This screening question was used to identify Indigenous households, from which the sampling process of persons was undertaken. If a dwelling contained one or more Indigenous usual residents, random selection for participation in the survey occurred as follows:
  • up to two Indigenous persons (aged 15 years or over); and
  • up to two Indigenous children (aged 0-14 years).

The original sample allocation for the non-community component for each state/territory appears in the following table:

SAMPLE ALLOCATION, Non-community

                                     NSW    Vic    Qld     SA     WA   Tas     NT   ACT  Australia
Expected fully responding persons  1 722  2 317  1 447  1 183  1 637   980  1 198   430     10 914



In non-community areas, the number of households found to contain Indigenous usual residents after the initial screening process (described above) was significantly lower than expected. Therefore, additional CDs and Mesh Blocks were selected in Vic, Qld, SA, WA, Tas and the NT. Selected dwellings in these Mesh Blocks were enumerated in early 2009. The lower than expected number of households identified as containing Indigenous usual residents may have been due to the following reasons:
  • mobility (i.e. persons moving away from the Mesh Block in which they were enumerated for the 2006 Census into areas excluded from coverage); and/or
  • non-identification of Indigenous usual residents and non-response. Refer to the Interpretation of results chapter for more information on undercoverage.

Survey response

After screening households in non-community areas, approximately 2.5% were identified as having an Indigenous usual resident. Of these households, 83% responded to the survey. This response rate does not take into account the approximately 11% of households that could not be contacted, and for which the Indigenous status of usual residents therefore could not be established. In communities, 78% of in-scope households were fully responding.

Some survey respondents provided most of the required information, but were unable or unwilling to provide a response to certain data items. The records for these persons were retained in the sample and the missing values were recorded as 'don't know' or 'not stated'. No attempt was made to deduce or impute for these missing values.


Weighting, benchmarking and estimation

Weighting

Weighting is the process of adjusting results from a sample survey to infer results for the total in-scope population. To do this, a 'weight' is allocated to each sample unit corresponding to the level at which population statistics are produced, eg household or person level. The weight can be considered an indication of how many population units are represented by the sample unit. For the 2008 NATSISS, separate person and household weights were developed.

Selection weights

The first step in calculating weights for each person or household is to assign an initial weight, which is equal to the inverse of the probability of being selected in the survey. For example, if the probability of being selected in the survey was 1 in 45, then the person would have an initial weight of 45 (that is, they would represent 45 people).

After calculating the initial person weights, an adjustment was incorporated into the weighting to account for Indigenous persons not covered by the sample. The initial household weights were also similarly adjusted.

Analysis indicated that there was a higher level of undercoverage in areas with a smaller Indigenous population. As a result, the initial person weights were adjusted to meet population benchmarks based on the expected Indigenous population size of the area. That is, the weights in selected CDs with a low Indigenous population were adjusted so that, within demographic classes, they represented all CDs of a similar population size. This ensured that people in low population areas were represented by other people in low population areas, and likewise for high population areas. This adjustment was considered important as Indigenous persons in low population areas have different characteristics to Indigenous persons in high population areas. For more information on undercoverage see the Interpretation of results chapter.

Benchmarking

The person and household weights were separately calibrated to independent estimates of the population of interest, referred to as 'benchmarks'. Weights calibrated against population benchmarks ensure that the survey estimates conform to the independently estimated distributions of the population rather than to the distribution within the sample itself. Calibration to population benchmarks helps to compensate for over- or under-enumeration of particular categories which may occur due to either the random nature of sampling, non-response, non-identification or various other undercoverage factors. This process can reduce the sampling error of estimates and may reduce the level of undercoverage bias.

A standard approach in ABS household surveys is to calibrate to population benchmarks by state, part of state, age and sex. In terms of the effectiveness of 'correcting' for potential undercoverage bias, it is assumed that within the weighting classes the characteristics of the covered population (ie survey respondents) are similar to the uncovered population (ie people who were not surveyed), as determined by the benchmarking strategy. Where this assumption does not hold, biased estimates may result.
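As an illustration of calibrating weights to population benchmarks, the sketch below implements simple raking (iterative proportional fitting) over two margins. The margins, categories and totals are hypothetical; actual ABS benchmarking uses more margins and specialised estimation software.

  def rake(records, margins, iterations=20):
      # records: dicts with a 'weight' plus category fields;
      # margins: {field: {category: benchmark total}}.
      for _ in range(iterations):
          for field, targets in margins.items():
              totals = {c: 0.0 for c in targets}
              for r in records:                 # current weighted total
                  totals[r[field]] += r["weight"]
              for r in records:                 # scale to hit the benchmark
                  r["weight"] *= targets[r[field]] / totals[r[field]]
      return records

  people = [{"state": "NSW", "sex": "F", "weight": 40.0},
            {"state": "NSW", "sex": "M", "weight": 45.0},
            {"state": "Vic", "sex": "F", "weight": 50.0}]
  benchmarks = {"state": {"NSW": 90_000, "Vic": 60_000},
                "sex": {"F": 95_000, "M": 55_000}}
  rake(people, benchmarks)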

Person weights

For this survey, person weights were simultaneously calibrated to the following population benchmarks:
  • state by remoteness area;
  • state by age by sex;
  • Victoria by remoteness area by specific age groups by sex;
  • Torres Strait Islander status by Torres Strait Island Region by age; and
  • state by community/non-community (only Qld, SA, WA and the NT were split into community and non-community areas for this benchmark, as these are the jurisdictions in which the sample design differed between the two).

The 'state' population benchmarks consist of the six states and two territories:
  • New South Wales (NSW);
  • Victoria (Vic);
  • Queensland (QLD);
  • South Australia (SA);
  • Western Australia (WA);
  • Tasmania (Tas);
  • Northern Territory (NT); and
  • Australian Capital Territory (ACT).

'Remoteness area' was categorised into five groups, defined in the table below.

PERSON WEIGHTS, State by remoteness area

Grouping  Applicable remoteness area               Applicable state or territory
0         Inner city                               All except ACT
          Inner city and Inner regional area       ACT
1         Inner regional                           All except ACT
2         Outer regional                           NSW, Qld, SA, WA and NT
          Outer regional and Remote                Vic
          Outer regional, Remote and Very remote   Tas
3         Remote                                   Qld, SA, WA and NT
          Remote and Very remote                   NSW
4         Very remote                              Qld, SA, WA and NT



For the main population, the 'age' of respondents was categorised into 14 groups, defined in the table below.

MAIN POPULATION, State by age of survey respondents by sex

Grouping  Age range (years)  Applicable state or territory          Applicable sex
1         0-4                All states/territories                 All persons
2         5-9                All states/territories                 All persons
3         10-14              All states/territories                 All persons
4         15-19              Qld, SA, WA, Tas and the NT            All persons
          15-19              NSW, Vic and the ACT                   Females
          15-24              NSW, Vic and the ACT                   Males
5         20-24              Qld, SA, WA, Tas and the NT            All persons
          20-24              NSW, Vic and the ACT                   Females
6         25-29              All states/territories                 All persons
7         30-34              All states/territories                 All persons
8         35-39              All states/territories                 All persons
9         40-44              All states/territories                 All persons
10        45-49              NSW, Vic, Qld, SA, WA, Tas and the NT  All persons
          45-49              ACT                                    Females
          45-54              ACT                                    Males
11        50-54              NSW, Vic, Qld, SA, WA, Tas and the NT  All persons
          50-54              ACT                                    Females
12        55-59              NSW, Vic, Qld, SA, WA, Tas and the NT  All persons
          55 and over        ACT                                    All persons
13        60-64              NSW, Vic, Qld, SA, WA, Tas and the NT  All persons
14        65 and over        NSW, Vic, Qld, SA, WA, Tas and the NT  All persons



For the 'Victoria by remoteness area by specific age groups by sex' population benchmarks, the 'age' of respondents was grouped as per the main population age groups (see table above), except for people living in outer regional Victoria, for whom 'age' was grouped as shown in the table below.

VICTORIA OUTER REGIONAL AREA POPULATION, by Age of survey respondents by sex

Grouping  Age range (years)  Applicable sex
1         0-4                All persons
2         5-9                All persons
3         10-14              All persons
4         15-19              Females
          15-24              Males
          20-24              Females
5         25-29              All persons
6         30-34              All persons
7         35-39              All persons
8         40-44              All persons
9         45-49              All persons
10        50 and over        All persons



'Sex' consists of two categories:
  • male; and
  • female.

'Torres Strait Islander Status' consists of two categories:
  • non-Torres Strait Islander; and
  • Torres Strait Islander (includes people who were both Aboriginal and Torres Strait Islander).

'Torres Strait Island Region' consists of three categories:
  • Torres Strait area (QLD);
  • rest of QLD; and
  • rest of Australia.

'Community/non-community' consists of two categories which were defined by the survey design:
  • non-community; and
  • remote community.

Initially only the first four sets of benchmarks were included in the weighting process. Given there was a higher level of undercoverage than expected, extensive analysis was undertaken to ascertain whether further benchmark variables should be incorporated into the weighting strategy. Analysis indicated that there was greater undercoverage in non-community areas and that the inclusion of a community/non-community benchmark would improve the quality of the estimates, particularly for the NT. Further details on undercoverage are provided in the Interpretation of results chapter.

The survey was benchmarked to the estimated Indigenous resident population living in private dwellings at 31 December 2008. This estimated population is based on projections of the experimental estimates of the resident Indigenous population at 30 June 2006. More information on the calculation of projections is provided in Experimental Estimates and Projections, Aboriginal and Torres Strait Islander Australians, 1991 to 2021 (cat. no. 3238.0). As people in non-private dwellings (eg hotels) are excluded from the scope of the survey, they have also been excluded from the survey benchmarks. Therefore, the 2008 NATSISS estimates do not (and are not intended to) match estimates for the total resident Indigenous population obtained from other sources.

Household weights

The ABS does not produce Indigenous dwelling counts; therefore, household level benchmarks were not available for the 2008 NATSISS. Instead, the household weights for this survey were derived from the person level benchmarks. This was done by assigning the initial household weight (after the undercoverage adjustment) to all Indigenous persons in the household. These weights were then calibrated to the person level benchmarks, with the constraint that each person in the household must have the same final calibrated weight. The resulting weight was assigned as the final household weight. That is, the weights were calibrated in such a way that the household weights reproduce the known person level benchmarks when all people in a household are included. Consequently, the sum of the household weights provides an estimate of the number of Indigenous households only. This method was then analysed to ensure that person and household level estimates are as consistent as possible.

Estimation

Estimation is a technique used to produce information about a population of interest, based on a sample of units (ie persons or households) from that population. Each record in the 2008 NATSISS contains two weights:
  • a person weight; and
  • a household weight.

The weights indicate how many population units (ie persons or households) are represented by the sample unit. Replicate weights have also been included - 250 person replicate weights and 250 household replicate weights. The purpose of these replicate weights is to enable calculation of the Relative Standard Error (RSE) for each estimate produced from the survey.
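As an illustration of how replicate weights support this, the sketch below computes an RSE for a weighted count using the delete-a-group jackknife formula commonly applied to ABS replicate weights. The data and the number of replicate groups are invented (the survey provides 250 groups).

  import math

  def rse(values, weights, replicate_weights):
      # values: 0/1 indicator of the characteristic for each person;
      # weights: main person weights; replicate_weights: G weight lists.
      est = sum(v * w for v, w in zip(values, weights))
      G = len(replicate_weights)
      variance = (G - 1) / G * sum(
          (sum(v * w for v, w in zip(values, rw)) - est) ** 2
          for rw in replicate_weights)
      return 100 * math.sqrt(variance) / est  # RSE as a percentage

  # Three sample persons and two replicate groups.
  print(rse([1, 0, 1], [45, 50, 40], [[50, 45, 38], [41, 55, 43]]))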

Survey estimates of counts are obtained by summing the weights of persons or households with the characteristic of interest. Estimates of means, such as the mean age of persons, are obtained by summing the weights of persons in each category (eg individual ages), multiplying by the value for each category, aggregating the results across categories, then dividing by the sum of the weights for all persons.
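A worked sketch of these calculations, with invented records and weights:

  people = [{"age": 20, "weight": 45.0, "employed": True},
            {"age": 37, "weight": 50.0, "employed": False},
            {"age": 52, "weight": 40.0, "employed": True}]

  # Estimated count of persons with a characteristic: sum of their weights.
  employed_count = sum(p["weight"] for p in people if p["employed"])  # 85.0

  # Estimated mean age: weighted ages summed, divided by the sum of weights
  # (equivalent to aggregating weights within each age category first).
  mean_age = (sum(p["age"] * p["weight"] for p in people)
              / sum(p["weight"] for p in people))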

The majority of estimates contained in this publication are based on benchmarked person weights. However, the survey also contains some household estimates based on benchmarked household level weights.

COMPARISON TO THE 2002 NATSISS

The following table provides broad comparisons of the survey design for the 2008 and 2002 surveys.

COMPARISON OF THE 2008 AND 2002 NATSISS, by Survey design

Collection methodology
  2008 NATSISS: Computer-assisted interview (CAI). In remote areas a paper back-up of the questionnaire was available in case of technical difficulties.
  2002 NATSISS: Computer-assisted interview (CAI) for non-remote areas, and remote areas in NSW, Victoria and Tasmania; pen and paper interviews (PAPI) for remote areas not covered by CAI.

Prompt cards
  2008 NATSISS: Selected persons and proxies in remote and non-remote areas were given the option of using prompt cards. The respondent could read directly from the cards or the card may have been read aloud by the interviewer.
  2002 NATSISS: Prompt cards were used in non-remote areas only.

Substance use form
  2008 NATSISS: In non-remote areas, selected persons aged 15 years and over were provided with a self-enumerated substance use form.
  2002 NATSISS: As per the 2008 NATSISS.

Personal interviews
  2008 NATSISS: Personal interview for persons aged 15 years and over (where consent was not given for persons aged 15-17 years, an interview was not conducted).
  2002 NATSISS: As per the 2008 NATSISS.

Proxy interviews
  2008 NATSISS: A proxy interview was conducted for children aged 0-14 years, and may also have been conducted for people who were unable to complete the survey due to injury or illness and those who did not have sufficient English skills (an interpreter may have been used).
  2002 NATSISS: Not applicable.

Household information
  2008 NATSISS: A household spokesperson aged 18 years and over provided household information, including: community facilities, bedrooms, telephone/IT access, transport, household facilities, maintenance, tenure, financial stress and income.
  2002 NATSISS: A household spokesperson aged 18 years and over provided household information, including: dwelling, financial situation of household and income for people not selected for personal interview.

Scope
  2008 NATSISS: Indigenous persons (all ages); usual residents of private dwellings in remote and non-remote areas of Australia.
  2002 NATSISS: Indigenous persons aged 15 years and over; otherwise as per the 2008 NATSISS.

Sample design
  2008 NATSISS: Mesh Blocks and Collection Districts (CDs) used for selection and screening. For selected households in discrete remote Indigenous communities and outstations: one Indigenous person aged 15 years and over and one child aged 0-14 years. For selected households in non-remote and remote non-community areas: up to two Indigenous persons aged 15 years and over and up to two children aged 0-14 years.
  2002 NATSISS: CDs used for selection and screening. Up to three Indigenous persons aged 15 years and over per household.

Final sample
  2008 NATSISS: 13,307 persons (7,823 adults and 5,484 children).
  2002 NATSISS: 9,359 persons (aged 15 years and over).

Response rate
  2008 NATSISS: 82% of households in non-community and community areas.
  2002 NATSISS: Approximately 80% of households in non-community and community areas.

Enumeration period
  2008 NATSISS: August 2008 to April 2009.
  2002 NATSISS: August 2002 to April 2003.

Main output units
  2008 NATSISS: Persons; households.
  2002 NATSISS: As per the 2008 NATSISS.

Exclusions
  2008 NATSISS: Non-Indigenous persons; overseas visitors; non-Australian diplomats, diplomatic staff and members of their household; members of non-Australian defence forces stationed in Australia and their dependents.
  2002 NATSISS: Non-Indigenous persons; visitors.