Australian Bureau of Statistics

Newsletters - Methodological News - Issue 4, September 2001

A Quarterly Information Bulletin from the Methodology Division

September 2001



Measuring Australia's Progress (MAP) is a new ABS publication that will assess progress in Australia across the environmental, social and economic aspects of life. The first issue of MAP is scheduled for release in early 2002; the ABS expects to issue updates of the publication thereafter, perhaps annually. We envisage that MAP will contain 15 headline indicators that summarise the state of the economy, society and the environment, allowing readers to form their own view of Australian progress.

Our choice of indicators was made with reference to the following criteria:
  • Indicators should focus on the progress outcome rather than, say, the inputs or other influences that generated the outcome, or the government and other social responses to the outcome. Increased life expectancy is an outcome of progress in the area of health, but increased government expenditure on health is an input. While increased expenditure should improve health outcomes, it will not necessarily do so.
  • It was also judged important that movements in any indicator could be unambiguously associated with progress. That is, with all other things kept equal, all would agree that movement in an indicator in a particular direction was unambiguously good. All would agree that lower greenhouse gas emissions - if they could come without retarding economic growth and the like - would be unambiguously good. But we avoided indicators such as changes in divorce rates, because of their ambiguity. Does more divorce mean more broken marriages and unhappy people? Or does it mean fewer unhappy people trapped in bad relationships?
  • Data should be available at a national level and as a time series.

The project has been underway for over a year now and the ABS has received continual feedback from subject matter experts from both within and outside the ABS.

We recently conducted a fairly large consultation of key stakeholders. As a first step Jon Hall and other members of the project team met with many government departments including: Treasury, Environment Australia, Productivity Commission, Department of Education, Training and Youth Affairs, Department of Employment, Workplace Relations and Small Business, Department of Health and Aged Care and the Australian Institute of Health and Welfare.

During May, Jon presented seminars in all States and Territories that provided an outline of the proposed content and design of the publication. Jon spoke to some 250 people from outside the ABS, and about 450 consultation packs were sent out. In addition to the many comments received during the seminars, we received about 60 detailed written submissions. People were overwhelmingly supportive of the project's aims and MAP's approach to measuring progress, although there were criticisms of some of our suggested indicators and suggestions for improvements. The consultation closed at the end of June and enabled us to finalise both the headline and supplementary indicators. In the next few weeks we will release a response to the significant comments made during the consultation, indicating whether (and how) we plan to take them on board in MAP's first issue.
After the publication is released in April 2002, the ABS will host a workshop to discuss the publication and consider how it should be refined in the future.

For more information, please contact Cristy Williams on (02) 6252 5546.



Over the last 10 years, Subject Matter Areas (SMAs) and the Sample and Frame Maintenance Procedures (SFMP) team have made significant investments in standardised tools and procedures for business surveys. There is a general belief that this investment has improved how well SFMP is applied. However, there is a need for a mechanism that provides more consistent and regular measures of how well the SFMP rules are being applied and how well the tools are working.

The SFMP team proposed a quality assurance scheme that was endorsed by the Survey Integration Steering Committee (SISC) and was implemented in the March Quarter 2001 reference period. The quality assurance scheme serves two purposes:
  • to provide senior management and survey managers with objective data on the application of SFMP; and
  • to assist the SFMP team in future improvements to SFMP.

For the quality assurance scheme to be effective, remedial action is undertaken when standards fall below acceptable levels. Such action may include retraining staff, amending operational tools such as the questionnaire, or reviewing rules or procedures.

SMAs are responsible for their own quality assurance. The SFMP quality assurance scheme should be part of the total quality management associated with the processing of each economic survey. Each processing cycle, SMAs select a sample of units for checking. For subannuals, the sample size is 40 units per quarter and for annuals, the sample size is 160 units per annum. Results from the quality assurance are sent to the SFMP team upon completion. These results are aggregated and presented to SISC each quarter.

To provide some form of independent assessment, the SFMP team also conducts small audits of each survey on an annual basis. The audit consists of a subsample of the quality assurance sample already selected by the SMA. The main purposes of the audit are to ensure that the SMA is applying the correct methods of treatment and that there is sufficient documentation to justify decisions made in regard to SFMP.

For more information, please contact Rosslyn Starick on (03) 9615 7689.



For several years the ABS has been supplementing the business data collected in the Economic Activity Survey (EAS) with business income tax data collected by the Australian Taxation Office (ATO), in order to produce more reliable estimates for the National Accounts. While preliminary estimates are still produced using just the EAS data for all employing businesses, the final estimates are produced using EAS data for the relatively few large and complex employing businesses, supplemented with ATO data for the many small, simply structured employing businesses and for non-employing businesses. The ATO data are obtained through the Income Tax Survey (ITS), which collects information by matching selected businesses to their business income tax returns.

While the estimation for EAS and ITS has been fully integrated, the sample allocations for EAS and ITS have continued to be conducted sequentially. The previous sample allocation for EAS was designed to achieve equal industry level relative standard errors on the preliminary estimates, based on a fixed total sample size. About nine months later, the sample allocation for ITS was designed to achieve much lower, approximately equal industry level relative standard errors on the final estimates. The major problem with this strategy was that the allocations failed to achieve approximately equal industry level relative standard errors on the final estimates, primarily because the sample allocation for EAS, in particular the relatively small allocation within the complex employing businesses component, was already fixed at the time of the sample allocation for ITS.

Under the latest sample allocation strategy, the sample allocations for EAS and ITS have been conducted simultaneously. The allocations were designed to achieve specified industry level relative standard errors on both the preliminary and final estimates, under a fixed total cost constraint. The key to the success of the latest strategy was the availability of information on the fixed costs (i.e. overhead costs) and variable costs (i.e. cost per unit) of conducting EAS and ITS. This also provided National Accounts with the opportunity to review the relative importance of the preliminary and final estimates; it was decided to improve the accuracy of the final estimates and, in return, relax the accuracy requirements for the preliminary estimates.

The latest sample allocations for EAS and ITS met the revised industry level relative standard error targets at less than the fixed total cost, principally due to a substantial reduction in the total sample size for ITS. To improve the accuracy of the final estimates while relaxing the accuracy of the preliminary estimates, sample was shifted from the simple employing businesses component to the complex employing businesses component of the EAS. Hence there was no reduction in the total sample size for the EAS.
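The simultaneous strategy can be illustrated with a simplified sketch. The variance models and unit costs below are entirely hypothetical (the real EAS/ITS models are far more elaborate): variance is assumed inversely proportional to sample size in each component, the preliminary estimate uses the whole EAS sample, and the final estimate uses the complex EAS units plus ITS. A grid search then finds the cheapest allocation meeting relative standard error targets on both estimates at once:

```python
import math

def simultaneous_allocation(a_prelim, b_complex, b_its,
                            rse_prelim, rse_final,
                            cost_complex, cost_simple, cost_its,
                            max_complex=3000):
    """Cheapest (n_complex, n_simple, n_its) meeting both RSE targets.

    Hypothetical single-industry variance models (not the actual EAS/ITS models):
      RSE_prelim^2 = a_prelim / (n_complex + n_simple)      # EAS only
      RSE_final^2  = b_complex / n_complex + b_its / n_its  # EAS complex + ITS
    """
    n_eas_min = math.ceil(a_prelim / rse_prelim ** 2)
    best = None
    for n_c in range(1, max_complex + 1):
        n_s = max(0, n_eas_min - n_c)            # top up EAS to meet prelim target
        slack = rse_final ** 2 - b_complex / n_c
        if slack <= 0:
            continue                             # complex part alone exceeds target
        n_i = math.ceil(b_its / slack)           # smallest ITS sample that fits
        cost = cost_complex * n_c + cost_simple * n_s + cost_its * n_i
        if best is None or cost < best[0]:
            best = (cost, n_c, n_s, n_i)
    return best                                  # None if infeasible

# Hypothetical inputs: complex EAS units cost $200 each, simple $100, ITS $5.
cost, n_c, n_s, n_i = simultaneous_allocation(
    a_prelim=4.0, b_complex=2.0, b_its=1.0,
    rse_prelim=0.05, rse_final=0.05,
    cost_complex=200.0, cost_simple=100.0, cost_its=5.0)
```

With these illustrative numbers the search settles on an interior solution in which complex units serve both estimates, echoing the shift of sample towards the complex employing businesses component; a sequential allocation (fixing the EAS split first) cannot find such a trade-off.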

For more information, please contact: John Preston on (02) 6252 6970.



Late last year, the ABS adopted a knowledge management (KM) strategy to enhance the bureau's capacity to find, preserve and share knowledge. KM is emerging as an important issue for the ABS analysis program.

Our work is project based. When the ABS (through the Analysis Board) decides that a new analytical product should be prototyped or a new methodology should be developed, a project team is established; the team members are redeployed when the project is completed. During the past eighteen months, we have been wrestling with the following questions:
  • At the outset of a project, how can the team find and assimilate the knowledge it needs about the subject matter, datasets and techniques?
  • During a project, how can the team share its experiences with clients and other stakeholders? How can the team bring its problems to the attention of people (in the ABS or elsewhere) who might provide helpful insights?
  • At the end of a project, how can the knowledge generated by the team be preserved and made accessible to future analysts?

We have been experimenting with ways of capturing and sharing knowledge throughout a project's lifecycle. Many of our teams now create "big picture" documents (that outline the project's aims and plan of attack), "state of play" documents (that describe recent and impending tasks) and "road maps" (that guide the reader through the documentation of our analysis processes and outputs). Examples of these documents can be viewed on the Analysis Branch workgroup database. Some of our clients and collaborators are finding these documents very helpful; many other ABS staff are visiting our branch home page periodically to get an overview of the whole analysis program. Project teams also share their experiences through the Methodology Division and other seminar series. The branch also conducts fortnightly in-house workshops. These are not polished seminars - their purpose is to provoke lively, early discussion about projects-in-progress and to share problems and insights.

We are now turning our minds to two broader KM issues. First, how can we gather intelligence about the opportunities for analytical work that will be most fruitful for the ABS and its customers? We are scanning our ABS partners' strategic directions statements, forward work programs and information development plans to see where new analytical products or new methods might fit into the national statistical jigsaw. We are experimenting with graphical and tabular ways of presenting this information.

Second, how can ABS staff engaged in analytical work share their experiences? By interviewing staff who have led analytical projects during the past eighteen months, we are distilling their experience into a guidebook that describes a standard research sequence - from initiation steps (initial problem statement, literature survey and data census) through intermediate steps (exploratory and full-scale analyses, peer review by other analysts, and plausibility checks by subject matter experts) to wrap-up steps (negotiating the transition/ implementation plan, writing up the journal of project team experiences, and generating the nuggets of technical knowledge that will be useful to future analysts). A draft guidebook will be ready in November 2001.

For more information, please contact Ken Tallis on (02) 6252 7290.




The Monthly Population Survey (MPS), incorporating the Labour Force Survey (LFS) and supplementary questionnaires, covers some 60,000 persons Australia-wide each month. An area based multistage design is used, selecting Collection Districts (CDs) at the first stage, blocks at the second stage and dwellings at the third stage. The survey employs a workforce of some 620 interviewers and costs almost $40 million to run over a five year period. It is one of the largest surveys undertaken by the ABS and is of key importance to users, as it provides the labour force estimates that are input to the National Accounts. Every five years, shortly after the Population Census is conducted, the MPS is redesigned to improve the efficiency of the survey using the latest available cost and variance information.

The purpose of the sample redesign is to choose the design that gives the best trade-off between cost and variance. To do this, costs and variances are each expressed as a mathematical function of the sample design parameters: the number of CDs to select, and the number of dwellings selected per CD (referred to as the cluster size). These functions predict what the costs and variances will be for a given choice of design parameters, enabling us to minimise variance for a prescribed cost.
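For a simple two-stage design this trade-off has a well-known closed form. Assuming a variance model V(n, m) = S_b^2/n + S_w^2/(n*m) and a cost model C(n, m) = n*(c1 + c2*m), where n is the number of CDs, m the cluster size, c1 the per-CD cost and c2 the per-dwelling cost, the variance-minimising cluster size for a fixed budget is m* = sqrt(c1*S_w^2 / (c2*S_b^2)). The figures below are hypothetical, not actual MPS parameters:

```python
import math

def optimal_cluster_size(s2_between, s2_within, cost_per_cd, cost_per_dwelling):
    """Optimal dwellings per CD for a two-stage design.

    Variance model: V(n, m) = s2_between / n + s2_within / (n * m)
    Cost model:     C(n, m) = n * (cost_per_cd + cost_per_dwelling * m)
    Minimising V at fixed C gives m* = sqrt(c1 * S_w^2 / (c2 * S_b^2)).
    """
    return math.sqrt((cost_per_cd * s2_within) /
                     (cost_per_dwelling * s2_between))

def design_for_budget(budget, s2_between, s2_within,
                      cost_per_cd, cost_per_dwelling):
    """Given a fixed budget, return (n_cds, cluster_size, achieved_variance)."""
    m = optimal_cluster_size(s2_between, s2_within,
                             cost_per_cd, cost_per_dwelling)
    n = budget / (cost_per_cd + cost_per_dwelling * m)
    variance = s2_between / n + s2_within / (n * m)
    return n, m, variance

# Hypothetical inputs: between-CD and within-CD variance components,
# $60 to travel to and list a CD, $15 to interview a dwelling.
n, m, v = design_for_budget(budget=1_000_000,
                            s2_between=0.4, s2_within=1.6,
                            cost_per_cd=60.0, cost_per_dwelling=15.0)
```

The intuition matches the text: expensive-to-reach CDs (large c1) or homogeneous CDs (small S_b^2) push the optimum towards larger clusters, and the fitted cost and variance models supply the constants.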

New Features

The 2001 redesign is due to be completed in March 2002 and will be phased in during October 2002 to June 2003. The redesign includes two new features:
  • A more coordinated approach towards collecting and producing Indigenous statistics from household surveys. This will be achieved via a separate Indigenous Communities Framework (ICF) in addition to the Private Dwelling and Special Dwelling frameworks.
  • The incorporation of the concept of remoteness into the stratification of the less densely populated rural areas. This will be achieved using the Accessibility/Remoteness Index of Australia (ARIA), which has been included in the 2001 Australian Standard Geographical Classification (ASGC).


The introduction of ARIA replaces the sampled and sparse area types in the stratification, which had been based on SLA level population density. This has the advantage of keeping remote towns and their surrounding districts together in the same stratum. The use of ARIA also helps ensure that small islands off the north coast of Australia, despite their high population density, are treated as very remote rather than like larger towns.

Variance Model

A substantial increase in computing power has made it possible to produce a variance model based on fifty different designs, considering all possible samples under each design and emulating the block formation and selection practices used in the MPS. This has resulted in variance models with a high level of fit. In addition, variance models were obtained for the first time for smaller area types, such as the sampled, sparse and Indigenous area types.

The resulting variance model was further adjusted to reflect actual LFS accuracy levels for the middle of the design period. These adjustments allowed for the following differences between Census and MPS:
  • post-stratification estimation used in the MPS;
  • definitional and questionnaire differences between Census and LFS labour force data items;
  • relative changes in population and sample size;
  • sample loss/non-response;
  • changes in variance structure, such as a change in the ratio between within block and between block variance, during the life of the 1996 design; and
  • use of more final census counts as CD measures of size for the 2001 selections.

The Cost Model

The cost model was fitted to detailed cost data obtained in May 1999 which gave a more precise split by cost components and was the first available data to comprehensively represent the cost dynamics under the telephone interviewing approach. The cost model was subsequently adjusted for monthly seasonal changes and to reflect those Payment To Agents (PTA) costs not captured electronically, such as training, Commonwealth vehicles, superannuation and long service leave.

Optimisation Method

An optimisation method was developed for the 2001 redesign to minimise Australian level variances subject to a specified cost and specified relativities in state accuracy levels. This is a departure from previous allocations, in which state accuracies were controlled more indirectly by adjusting state sampling fractions. The new approach is transparent in that the desired relativities in state accuracy requirements are specified as input to the optimisation. The method also ensures equal probability sampling within each state.
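Specifying the relativities up front makes the allocation almost mechanical, which a simplified sketch can show. Assume (hypothetically; the real MPS models are richer) that each state's sampling error follows RSE_s = sqrt(A_s / n_s) for a state-specific constant A_s, that each selected dwelling costs c_s, and that the desired relativities k_s are given. Writing RSE_s = k_s * t for a common scale factor t, spending the whole budget pins down t, and hence every state's sample size, in closed form:

```python
import math

def state_allocation(states, budget):
    """Allocate sample to states for specified RSE relativities and total cost.

    states: dict name -> (A, k, c), where RSE = sqrt(A / n), k is the desired
    RSE relativity and c the cost per selected dwelling (all hypothetical).
    Setting RSE_s = k_s * t and exhausting the budget gives
    t^2 = sum_s(c_s * A_s / k_s^2) / budget, hence n_s = A_s / (k_s^2 * t^2).
    """
    t2 = sum(c * a / k ** 2 for a, k, c in states.values()) / budget
    return {name: a / (k ** 2 * t2) for name, (a, k, c) in states.items()}

# Hypothetical variance constants, relativities and per-dwelling costs:
# the small jurisdictions are allowed proportionally larger RSEs.
states = {
    "NSW": (120.0, 1.0, 30.0),
    "Vic": (90.0, 1.0, 30.0),
    "Tas": (15.0, 1.8, 35.0),
    "NT":  (10.0, 2.5, 60.0),
}
alloc = state_allocation(states, budget=900_000.0)
```

The transparency claimed in the text is visible here: changing a relativity k_s directly rescales that state's accuracy, rather than emerging indirectly from adjusted sampling fractions.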

A preliminary sample optimisation has been produced based on projected PTA expenditure for the 2001 design period which delivers slightly lower relative standard errors for employment and unemployment than those that were achieved during 1996. However, due to the increased emphasis on the PTA budget for this design, the optimisation will not be finalised until after the October Management Meeting where a decision will be made on the final budgetary allocation for the 2001 design MPS PTA expenses.

For more details, please contact Daniel Elazar in CO on (02) 6252 6962.



In the early days of the Statistical Clearing House (SCH), the Harmonisation Project was established to eliminate the duplication of information requests made to ABS survey managers by the SCH and other ABS internal review areas. Harmonisation of the SCH with the Collection Management System (CMS) was the main priority of this project, ensuring the information provided on the Commonwealth Business Surveys Register is the same as that provided on the CMS web-site. Fundamentally however, the SCH and CMS have different objectives.

The SCH reviews surveys involving 50 or more businesses run by, or on behalf of, any Commonwealth government agency. Surveys are assessed on the basis of answers supplied to the 57 review questions (referred to as the SCH template) and a set of review criteria. The review questions cover all aspects of the statistical cycle, to ensure that:
  • there is no adequate alternative source of information available and no reasonable, alternative means of obtaining the required information with less respondent burden;
  • the survey methodology is appropriate to meet the objectives of the survey;
  • a group of businesses or business associations have been consulted about the nature and objectives of the survey and data availability, and there is an assessment of respondent load; and
  • there are adequate systems (both computer and people-based) to ensure the survey is conducted and processed in a manner that will provide output of appropriate quality.

The CMS, on the other hand, enables ABS Subject Matter Areas (SMAs) to document their surveys within a corporate framework. As a result, the level of detail and the issues addressed in the CMS are determined by the SMA. The SCH reviews ABS surveys by loading CMS information into the SCH template, but this information is often inadequate to satisfy the SCH requirements without further explanation or clarification.

There has been substantial progress with harmonisation in recent times. CMS Release 5 saw the introduction of SCH questions embedded as CMS questions, the inclusion of nine SCH questions that were not previously on the CMS, and links from the CMS to the SCH processing system. The inclusion of SCH pop-up boxes in the appropriate CMS fields means that, when completing the CMS, the SMA can refer to the pop-up box, see what the SCH question is and provide information that will be a suitable response to the SCH. This is a step forward, but it still requires that the CMS entries are up to date. There is now a stronger corporate focus on keeping CMS entries current, with a number of CMS entries flagged as CMS publishable items to be released to the Web.

The SCH has been trialling the process of undertaking SCH reviews directly from the CMS for some repeat surveys and a new survey, and this process is working reasonably well. The SCH maximises the use of CMS provider load and quality measures as part of the post survey information processes. Again, for this process to work effectively it relies on the CMS being updated in a timely way by survey managers.

With the dissemination of CMS information, initially via the Statistical Directory in December 2001, publishing information about ABS surveys on the SCH web site will eventually no longer be necessary; links will instead be provided to the site of CMS dissemination. This is not possible in the short term, as there is only limited overlap between the two, but coverage will be expanded over time.

For more information, please contact Marietta Urh on (02) 6252 5565.

