Australian Bureau of Statistics

Newsletters - Methodological News - Issue 8, September 2002

A Quarterly Information Bulletin from the Methodology Division

September 2002



Measuring Australia's Progress (MAP), a new ABS publication, was launched by the Australian Statistician on 4 April 2002. MAP was developed by Analysis Branch with a great deal of assistance from colleagues in the ABS and elsewhere. It uses a set of indicators to help readers assess whether the economic, social and environmental aspects of life in Australia progressed over the 1990s.

Informing the discussion about national progress is one of the most important tasks that a statistical agency can take on. However, it is also likely to provoke vigorous debate, because there is no universal agreement regarding what dimensions of progress are most important, or what indicators best encapsulate those dimensions. MAP is a deliberately experimental publication, and the Statistician's foreword invites readers to comment on it.

Comments received since MAP was launched have been predominantly favourable. Many articles appeared in the press. Noted commentator Ross Gittins observed in the Sydney Morning Herald that "the Bureau of Statistics, for instance, issues some new stats most days, many of which get a lot of media attention, but rarely does it issue anything as remotely important as Measuring Australia's Progress".

At seminars around the country, audiences were very supportive. At least two States are now considering compiling similar measures at the State level.

However, the comments have not all been favourable. Some commentators have argued that MAP should be underpinned by a more overt conceptual framework. Some have expressed disappointment that the ABS has not presented indicators for some dimensions of progress (such as the quality of national, business and community governance). Some disagreed with the choice and balance of progress indicators.

In October 2002, the ABS will host a workshop in which government, academic, community and other representatives will be asked to review the publication and to offer their views on the future course of progress measurement. The ABS will then consider what further work it might undertake.

The whole first issue of Measuring Australia's Progress (Cat. no. 1370.0) can be downloaded from ABS@ and this site.

For more information, please contact Jon Hall on (02) 6252 7221.



The Qualifying Quality program began as a research and development project within the Statistical Consultancy and Training Section of Methodology Division (MD), with the broad task of enhancing strategies for presenting information about data quality to users. This work culminated in the May 2002 Management Meeting Paper 'Making Data Quality Visible - A Focus for 2002-03', which provides much of the underlying framework for this program plan, and the discussion paper 'Qualifying Quality - A Framework for Supporting Quality-Informed Decisions', which sets out much of the program's conceptual framework.

The program has since progressed to a broader ABS program under the direction of a Qualifying Quality Program Board, with representation from Economic Statistics Group (ESG), Population Statistics Group (PSG) and Information Management Division. The program will draw on the activities of multiple teams within the Methodology Division and the broader ABS, which may be specific to or overlap with the Qualifying Quality Program. This work will be coordinated by a small team, whose role it will be to monitor the status of the various activities, ensuring that the tasks both meet the objectives of the Qualifying Quality Program and link into each other as required.

The aim of the Qualifying Quality Program is to progress the outputs incrementally, drawing on existing infrastructure and opportunities arising from related projects. As such, a key role of the Program Board will be to consider the range of potential projects within the Qualifying Quality Program and provide guidance on the priorities, feasibilities and sequencing of the work program in the context of both the desired outcomes from the Qualifying Quality Program and the overall demands and priorities of the broader ABS work program.

The program proposed four objectives:
  • increase the information available externally about the quality of (ABS) data;
  • educate our users (and ourselves) about quality, and how knowledge of it should inform the uses of statistics;
  • publish and promote guidelines and frameworks about quality; and
  • use information about data quality to manage and improve our statistical processes.

One key output from the program to date has been the development of a training course, 'Making Quality-Informed Decisions', with the assistance of staff from the Learning and Development Section. The course introduces the concept of a data quality framework, using the framework originally developed by Statistics Canada, and provides participants with a set of techniques they can use to apply the framework in their work. In particular, the course considers the use of the framework in the context of:
  • refining the understanding of the data needs of end users;
  • describing the quality of existing data;
  • assessing the degree to which potential data sources meet these needs (both individually and in combination); and
  • managing the risks identified when considering the selected data sources through risk mitigation and contingency planning.

By the end of 2002, the course will have been run six times, with over one hundred ABS staff trained, including through courses delivered in the Melbourne and Adelaide offices. More courses are planned for 2003. In addition, a Statistical Impact Seminar overviewing the principles covered in the course and their applications within the ABS has been presented. The intention is to repeat this seminar in both the ESG and PSG seminar series.

Other initiatives within the Qualifying Quality program include: the extension of the framework provided in the course to the development of information development plans and the coordination of overall data strategies; and the use of quality measures in both quality assuring survey data and developing quality improvement strategies for surveys. Details of progress on these and other initiatives will be included in future issues of Methodological News.

For more information, please contact Bill Allen on (02) 6252 6302.



Several years had passed since the Forms Development Procedures and Design Standards Manual was last updated on the Corporate Manuals database, and some of the information in it had become out of date or otherwise inappropriate. A new version developed by the Forms Consultancy Group has recently been released on the Corporate Manuals database and the Statistical Clearing House web site.

The update had the following aims:
  • To separate the form design standards (the current focus) from the development and testing information. These two types of information were seen as having different purposes and different audiences, so they will be presented in separate manuals.
  • To update the factual, procedural-type information - correcting the software used, giving the appropriate contact details for different aspects of forms design, and so on.
  • To improve the standards themselves, including the addition of background references.

Most of the changes were quite minor, often involving a realignment of the standards to current survey practice. Some of the more significant changes are as follows:
  • Including a "Purpose of Collection" statement on the front page of self-administered business forms has become mandatory. The increasing need to improve response rates and provider motivation has led to a requirement to explain on the form what the survey is for and, preferably, what the potential benefits to the provider are.
  • Standards have been developed for the use of sensitive and subjective questions, such as those using rating scales. Traditionally an integral part of household surveys, these sorts of questions are increasingly being used in ABS business surveys due to a shift away from purely accounting-type questions towards more motivational and behavioural items.
  • More detail has been added on how the information on forms should be structured. This includes how "parts" should be used, where instructions should go, and details on sequencing and spacing. One of the drivers for these additions is the increased collection of sparse data leading to more complicated questionnaires.

Before its release the new Manual went through two productive rounds of stakeholder consultation. It was also peer reviewed by Professor Don Dillman, and very favourable comments were received.

For more information, please contact Emma Farrell on (02) 6252 7316.


Information Development Plans (IDPs) are emerging as a major source of ideas for fruitful analytical work.

For a given field of statistics (say, education or the information economy), the ABS and other interested agencies may negotiate an IDP that:
  • spells out what statistics would best support policy design, evaluation, other decision-making and research;
  • assesses the statistical potential of administrative and business databanks as well as direct collections run by the ABS;
  • identifies statistical gaps, overlaps and other deficiencies and proposes ways of addressing them; and
  • sets out roles and responsibilities for enhancing the national information base in the given field.

Recently, a team of participants in the ABS Strategic Management Program (SMP) developed a guidance document for ABS staff and others who have responsibility for creating IDPs. The team's report suggests a seven-step process:

1. Define your field of statistics by referring to the view of the world adopted by policy agencies or researchers, to statistical frameworks, to standard socioeconomic theory and so on.

2. Understand the debate, especially what questions are in the minds of key stakeholders.

3. Define the desired information set by identifying major domains of concern, dimensions, variables and linkages.

4. Assess the data pool by identifying data custodians, their databanks, data content and quality.

5. Define what information would add value, especially what gaps and overlaps should be addressed, and what solutions (such as direct collection, deriving statistics from by-product databanks, or analysis) are best suited to addressing the deficiencies.

6. Negotiate to create the information by discussing who among the interested organisations will undertake each aspect of the statistical development activity.

7. Evaluate the match between statistical needs and the available statistics, and the process for generating the IDP and inspiring statistical development activity.

The project team also drew attention to some cross-cutting issues, such as how staff who are working on IDPs can share their experiences, discoveries and good practices.

The team's guidance document will expand and evolve during the next few years as we gain experience with creating IDPs. Also, Analysis Branch is developing an offshoot document that describes what analytical support would best assist the IDP creation process, and how ideas for analytical projects can be distilled from the intelligence gathered during that process.

To contact the SMP project team about its report or to discuss the analysis program, please phone Ken Tallis on (02) 6252 7290.



Over the past 20 years the Australian labour market has experienced significant changes. The structure and operation of the labour market are substantially different today from those of 1981, owing to changes in demography, technology, industrial relations and education patterns. All of these factors will have had an effect on the unemployment rate.

Relatively high rates of unemployment throughout the 1980s and 90s, when compared with rates in the 1960s and 70s, have prompted extensive research into the causes and effects of unemployment. Most Australian studies of unemployment at the micro level use cross-sectional surveys as the basis of their analysis. A cross-sectional survey provides a snapshot of the labour market at a particular point in time, enabling comparisons across a number of dimensions such as gender, age, household type and education. However, comparing two such snapshots does not reveal whether the same people are unemployed in both periods, or whether some have found employment and been replaced by people newly unemployed.

Ideally, one would like panel (or longitudinal) data to answer these types of questions. Panel data track the same individuals over time. In the absence of panel data, cross-sectional data sets can be used to undertake cohort analysis. Cohort analysis cannot provide insights into labour market dynamics at the individual level, but it can at a group or cohort level. Particular cohorts may experience relatively higher levels of unemployment over time than the rest of the population. For example, cohorts which enter the labour force during an economic downturn may be scarred by being unable to find employment for an extended duration, and may have relatively high unemployment rates throughout the rest of their lives as a result.

This study employed cohort analysis to examine patterns in unemployment rates for the period 1981 to 2001, using quasi-panel data constructed from the Labour Force Surveys. Regression techniques were used to decompose the unemployment rates into age, year and cohort effects. The cohort effects for males and females were then examined more closely by looking at some of the driving forces that affect unemployment rates, such as labour supply (participation), education levels and conditions at entry into the labour market. The Analysis Branch project team (Ravi Ravindiran, Terry Rawnsley and Annette Jose) found that cohort effects were a significant determinant of unemployment for men, along with age and year effects, but cohort effects were insignificant in explaining unemployment patterns among women.

The results of this study expand our understanding of the methodology underlying cohort analysis and contribute to the body of analytical tools for analysing existing ABS datasets.
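The kind of age-year-cohort decomposition described above can be sketched as a dummy-variable regression. The sketch below uses purely illustrative synthetic data (the age groups, effect sizes and noise level are assumptions, not the ABS series, and the method is not necessarily the one the project team used). Note that because cohort = year - age exactly, the three sets of dummies are collinear - the classic age-period-cohort identification problem - so the least-squares solver returns one conventional (minimum-norm) normalisation of the effects.

```python
import numpy as np

# Hypothetical quasi-panel: unemployment rate by 5-year age group and survey year.
# The cohort identifier is defined as year minus age.
rng = np.random.default_rng(0)
ages = np.arange(15, 65, 5)
years = np.arange(1981, 2002, 5)

rows = []
for a in ages:
    for t in years:
        c = t - a  # cohort identifier
        # Synthetic "true" effects, for illustration only:
        # a U-shaped age effect, a cyclical year effect, a linear cohort effect.
        rate = 8 + 0.05 * (a - 40) ** 2 / 40 + 0.3 * np.sin(t) + 0.02 * (c - 1950)
        rows.append((a, t, c, rate + rng.normal(0, 0.2)))

a_idx = np.array([r[0] for r in rows])
t_idx = np.array([r[1] for r in rows])
c_idx = np.array([r[2] for r in rows])
u = np.array([r[3] for r in rows])

def dummies(x):
    """One-hot encode x, dropping the first level as the reference category."""
    levels = np.unique(x)
    return (x[:, None] == levels[None, 1:]).astype(float)

# Design matrix: intercept plus age, year and cohort dummies.  The exact
# linear dependence cohort = year - age makes X rank-deficient; lstsq
# handles this by returning the minimum-norm coefficient vector.
X = np.column_stack([np.ones(len(u)), dummies(a_idx), dummies(t_idx), dummies(c_idx)])
beta, *_ = np.linalg.lstsq(X, u, rcond=None)
fitted = X @ beta
r2 = 1 - ((u - fitted) ** 2).sum() / ((u - u.mean()) ** 2).sum()
print("R^2 of the age-year-cohort decomposition:", round(r2, 3))
```

In practice the identification problem is usually resolved by an explicit constraint (for example, requiring cohort effects to sum to zero with no linear trend) rather than by the solver's default normalisation, and significance testing of each block of effects would follow.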

For more information, please contact Ravi Ravindiran on (02) 6252 7039.


Each year, Analysis Branch holds an offsite conference to reflect on issues that affect the ABS analytical program. At our most recent conference, we focussed on four themes.

Collaborative work. How can we improve the effectiveness of our collaborations with colleagues in the ABS, other government agencies and universities? We discussed various models for collaborative work such as joint projects and cross-postings. During the past year or so, we have enjoyed a wider variety of collaborative work, including analyses of crime, disability, the influence of information technology on business performance, national progress and prices. Our analysis portfolio managers will negotiate more joint projects and cross-postings for 2003.

Quality of analytical work. How can we assure the quality of prototype analytical products and methodological advice? We discussed ways of defining and assessing the quality of analyses. Recently, we have been revamping our interim quality framework for "elaborately transformed statistics" to align with the standard ABS set of six quality attributes -- relevance, coherence, accessibility, interpretability, timeliness and accuracy. We also discussed processes for assuring quality. We rely mainly on peer review by colleagues in the ABS, other agencies and universities to assure the defensibility of our methods and the plausibility of our findings.

Research methods. How can we discover and share good practices for analytical investigations? We discussed ways of doing a literature review, conducting a data census and quality evaluation, selecting an appropriate analytical technique, communicating our findings, and preserving the knowledge gained during a project. As an offshoot of this discussion, we are developing new modules on research methods, quality-informed decisions and analytical writing for the Turning Economic Data into Information course.

Analytical careers. What pathways can be followed by an ABS officer who is interested in a career in socioeconomic analysis? We discussed the variety of "career anchors" that such a person might have -- such as wishing to become an expert in a particular style of analysis or subject field, or wishing to develop portable problem-solving skills. We also discussed the broad experience that an analyst can build by working in a variety of analysis groups, national statistical centres and survey areas. We are developing a document titled "Analytical Skills for ABS Staff" to supplement the landmark document "Statistical Skills for ABS Staff" issued in January 2001. A draft will be available for comment later this year.

For more information, please contact Ken Tallis on (02) 6252 7290.



Total Approach Management (TAM) is a framework, or philosophy, for designing, acquiring and processing inputs from businesses, and for monitoring and improving these processes. It is being developed as part of the ABS's Business Statistics Innovation Program, a program to re-engineer the way the ABS collects and processes data from businesses.

At a high level, TAM arises from the realisation that a number of initiatives affect providers, and that we need to fully consider their linkages and relative cost-effectiveness. A TAM framework can bring cohesion to a range of work aimed at improving provider management, data quality and operating efficiency.

At a lower level, elements of best practice for the different stages of collection development, design, testing, operations and evaluation are used in many collection areas and processes, but not within an overall framework, making best practices more difficult to systematically identify, quantify or generalise.

The TAM framework will help provide this cohesion.

While the outcomes, processes and infrastructure in scope of ‘Provider Management’ represent a key component, TAM includes more than managing provider load and contact in that:
  • it emphasises that the overall provider experience and resulting data quality are related to areas apart from direct provider contacts (e.g. instrument development and content, frames and selection). It also emphasises feedback from processing into the design and development of the acquisition process;
  • it includes a strong layer of cost-effectiveness assessment, process outcome monitoring and improvement, explicitly recognising the need to routinely evaluate the statistical effectiveness and cost-effectiveness of processes, practices and systems.

Outcomes sought from the TAM philosophy are to:
  • improve provider experience and expectations;
  • improve the quality of data provided;
  • improve the cost efficiency of data collection;
  • reduce provider load;
  • reduce data acquisition and processing costs; and
  • improve response rates and cooperation.

Further information about Total Approach Management can be obtained from Rob Burnside on (02) 6252 7816.

