Australian Bureau of Statistics

Newsletters - Methodological News - June 2003

A Quarterly Information Bulletin from the Methodology Division

June 2003


In several of our projects, we are confronting an analytical problem that is unfamiliar to us - that of characterising through-time patterns in some key datasets. The problem is best explained using our lifelong learning project as an example.

We are interested in understanding the learning pathways that Australians follow throughout their lifetimes. Starting with data collected in the 2001 Survey of Education and Training (SETIT), we have compiled "education event profiles" for about 24,000 people. Each profile records the dates at which key education events occurred, such as completing secondary school, a trade certificate, or a university degree. Recently, we have done some analysis of the probabilities of certain education transitions, such as the probability that someone who completed year 12 will obtain a first post-school qualification within one year, two years, three years and so on.
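As a simple illustration (with a hypothetical profile structure, field names and dates, not the actual SETIT data), transition probabilities of this kind can be estimated from event profiles along the following lines:

```python
from datetime import date

# Hypothetical education event profiles: person id -> {event: completion date}.
# The structure and field names are illustrative only, not the SETIT layout.
profiles = {
    1: {"year12": date(1995, 12, 1), "first_post_school_qual": date(1997, 6, 30)},
    2: {"year12": date(1996, 12, 1)},  # no post-school qualification observed
    3: {"year12": date(1994, 12, 1), "first_post_school_qual": date(1995, 9, 1)},
}

def transition_probability(profiles, origin, destination, within_years):
    """P(destination occurs within `within_years` of origin | origin occurred)."""
    at_risk = [p for p in profiles.values() if origin in p]
    made_transition = [
        p for p in at_risk
        if destination in p
        and (p[destination] - p[origin]).days <= within_years * 365.25
    ]
    return len(made_transition) / len(at_risk) if at_risk else 0.0

# Probability of a first post-school qualification within one, two, three years
for years in (1, 2, 3):
    print(years, transition_probability(profiles, "year12",
                                        "first_post_school_qual", years))
```

In practice the estimates would also need to allow for censoring (people surveyed before enough time has elapsed for the transition to occur), which this sketch ignores.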

The next phase of our lifelong learning project attempts to address three questions:
  • What are the typical "education pathways" that Australians follow? A pathway might encompass, say, the education events experienced, their sequence, their duration, and the intervals between completion of one experience and commencement of another. Each of these aspects of "pathways" can provide us with useful insights into the learning experiences of Australians.
  • How are education pathways evolving? Are some forms of education becoming more or less common, or are the sequences, durations or intervals of education events changing over time?
  • How do education pathways differ between one subpopulation and another? Are there differences according to sex, age, Indigenous status, ethnic background, and so on? Are changes in pathways more pronounced or more rapid for some subpopulations?

We would like to develop a taxonomy or classification of pathways - to group our 24,000 records into broad families that display similar patterns of educational experience. Having such a taxonomy would be helpful in several ways:
  • It would allow us to write soundly-based narratives about the dominant forms of lifelong learning in Australia.
  • It would provide a basis for analysing the evolution of education pathways, using either our first database (constructed from SETIT2001) or future, richer databases (constructed from past and future SETITs and auxiliary datasets).
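One possible starting point for such a taxonomy, sketched here with invented event codes and a deliberately naive grouping rule (the real work would involve richer profiles and formal clustering methods), is to measure the similarity between event sequences and group records whose sequences are close:

```python
# A minimal sketch of grouping education pathways into "families" by sequence
# similarity. Event codes, the distance threshold and the greedy grouping rule
# are all illustrative assumptions, not the method actually used.

def edit_distance(a, b):
    """Levenshtein distance between two event sequences."""
    prev = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        curr = [i]
        for j, y in enumerate(b, 1):
            curr.append(min(prev[j] + 1,          # delete
                            curr[j - 1] + 1,      # insert
                            prev[j - 1] + (x != y)))  # substitute
        prev = curr
    return prev[-1]

def group_pathways(pathways, threshold=1):
    """Greedy single pass: join the first family whose exemplar is close."""
    families = []  # list of (exemplar, members)
    for path in pathways:
        for exemplar, members in families:
            if edit_distance(path, exemplar) <= threshold:
                members.append(path)
                break
        else:
            families.append((path, [path]))
    return families

pathways = [
    ("yr12", "degree"),
    ("yr12", "degree", "postgrad"),
    ("yr10", "trade_cert"),
    ("yr12", "diploma"),
]
for exemplar, members in group_pathways(pathways):
    print(exemplar, len(members))
```

A full treatment would also weigh the durations of and intervals between events, not just their sequence.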

This style of analytical problem is also emerging in other parts of our work program - such as describing the evolution of prices and quantities in supermarket scanner datasets and describing the life-histories of businesses in longitudinal databases.

For more information, please contact Shiji Zhao on (02) 6252 6053.



The Queensland Local Government Business Statistics Centre (BSC), with advice from Methodology Division's Forms Consultancy Group, recently undertook a post enumeration survey (PES) of 22 respondents in the quarterly Local Government Finance Statistics collection. This collection is centralised in the Queensland office. A new electronic Microsoft Excel form was introduced for the September 2002 quarter, with electronic lodgement through the ABS secure deposit box. These changes resulted in significant improvements in response rates, timeliness, and data quality. In view of these changes, the PES had three broad objectives:
  • examine the quality of data reported to ABS by councils and identify areas where forms design or data quality aspects of the collection methodology can be improved;
  • evaluate the electronic data reporting (EDR) process and instrument with a view to improving and expanding the ABS' EDR policy; and
  • begin to implement some aims and principles of Total Approach Management (TAM) for a well-defined and relatively homogeneous provider group.
TAM is a framework or overarching vision for designing, acquiring and processing inputs from respondents, and for monitoring and improving these processes. This exercise progressed the three TAM aims:

Develop best practices for economic collection instrument evaluation:

The Local Government BSC sought to address issues through the PES based on the background of known problems and input from stakeholders. The visit protocols were then developed and field tested before the main field work took place. Conclusions were:
  • testers need a well-developed visit script and a good understanding of some accounting standards and procedures;
  • there is some respondent confusion between the data requested annually by the State Grants Commission and data reported to ABS;
  • respondents need to be notified of collection instrument changes in advance so they can prepare appropriately; and
  • while ABS seeks revisions to previous reporting periods, most respondents use industry accounting standards based on 'year-to-date' reporting, meaning that a revision to previously reported data is made by addition to or deduction from the current reporting quarter.
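The 'year-to-date' convention in the last point can be illustrated with hypothetical figures: each quarter's value is the difference between successive cumulative totals, so a correction to an earlier quarter is absorbed into the quarter in which it is reported rather than appearing as a revision to the earlier period:

```python
# Sketch of the 'year-to-date' reporting convention (figures are invented).

def quarterly_from_ytd(ytd):
    """Derive quarterly figures from cumulative year-to-date totals."""
    return [curr - prev for prev, curr in zip([0] + ytd[:-1], ytd)]

ytd = [100, 230, 350, 500]              # cumulative totals for Q1..Q4
print(quarterly_from_ytd(ytd))          # [100, 130, 120, 150]

# If Q1 is later found to have been 110, the council adds the extra 10 to its
# running total rather than restating Q1, so the correction shows up in the
# quarter in which it is made.
ytd_revised = [100, 230, 360, 510]      # the +10 absorbed from Q3 onward
print(quarterly_from_ytd(ytd_revised))  # [100, 130, 130, 150]
```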

Explore provider segmentation opportunities:

Local councils appeared relatively homogeneous, technically capable, and amenable to a specialised collection approach as a distinct provider group. Two aspects of a segmentation approach to collecting are:
  • to expand the existing secure electronic reporting channel to include a range of non-financial data on behalf of different ABS subject matter areas. Combining data requests would rationalise respondent load through reducing points of contact and number of individual collection instruments ABS dispatches to councils.
  • considering the possibility of increasing automated or semi-automated data extraction and reporting.

While councils were in favour of electronic reporting, the PES found that the financial areas that completed the current electronic Excel survey form were not involved in a wider reporting role. One reason for this is the inability to separate out Excel worksheets for distribution to other areas in council, such as environmental and human resources (specifically payroll) departments. Individual electronic instruments would still have to be despatched to other areas of interest to the ABS inside councils.

There is no immediate potential for automatic data extraction, as respondents were happy with their current manual transcription methods. The financial activity sought in the current collection necessitates an amount of judgment and allocation/classification of finance data. The major reason for manual allocations of data is councils' use of varying proprietary packages, implemented by individuals in different ways.

Continue work on the design, usability and application of electronic reporting:

The Excel 'e-form' and ABS Web-based secure deposit box were both very well received. Suggestions for improvement were:
  • provide further text areas to each worksheet that would aid in revisions to data and encourage comments pertinent to each section (because there is one worksheet for each section of the instrument);
  • give a unique receipt number along with an acknowledgment that data has been lodged on the secure Web-site;
  • provide 'year to date' running automatic totals; and
  • include an electronic version of comprehensive notes and classifications.

For further information, please contact Tracey Rowley on (02) 6252 5905 or Robert Hall on (07) 3222 6053.



The ABS is keen to work closely with researchers in universities, government agencies and other organisations. The benefits are many, including - obtaining professional review of ABS methods and products, encouraging greater use of our statistics and a better appreciation of how they are collected or constructed, gaining access to technical skills and knowledge, and improving our understanding of how data are used in research.
Methodology Division (MD) has a particular responsibility to build partnerships with non-ABS researchers in the fields of survey design, time series methods and other analyses. We use a variety of means to build such bridges, including the following:
  • ABS advisory groups. The Methodology Advisory Committee (MAC), for example, has drawn its membership from universities, sibling statistical agencies, other government bodies and the private sector. MAC meets twice a year to provide advice on techniques and research strategies useful to ABS survey designs and analyses. Non-ABS researchers are also members of other groups, including the Australian Statistical Advisory Council and the committees that advise on developments in economic and social statistics.
  • Collaborative projects. MD undertakes collaborative work with non-ABS researchers, often under the umbrella of Australian Research Council "linkage grants". Our collaborations include research into: analyses of productivity at the macro and micro level; construction of temporal and spatial price indexes; the interaction between survey design and time series analysis; and the analysis of business demographics.
  • Project boards and peer review. For almost all of our analysis projects, and for many other statistical projects, we recruit non-ABS researchers to our project boards (to guide our choice of analytical questions and research strategies) and our peer review panels (to vet our application of methods, our handling of data and the plausibility of our findings).

During the coming year or so, we will be experimenting with a variety of other means for engaging non-ABS researchers, such as hosting thematic workshops or Web-based discussion forums on emerging methodological issues.

For more information, please contact Ken Tallis : (02) 6252 7290.



March 2003 saw the successful launching of the first Statistical Services Branch (SSB) Survey Methodology Workshop, held at ABS House in Canberra.

All methodologists, including those located in the Regional Offices, were invited to attend. The workshop was viewed as a collective learning process, not an expert-to-expert workshop. To this end, many participants were invited to share methodological issues of current relevance.
A wide variety of sessions were presented. The sessions were arranged in terms of major themes as follows:

  • Strategic directions for methodological development.
  • Confidentiality and Data Access - This included an overview of basic strategies and methodological issues; and risk assessment for CURFs (Confidentialised Unit Record Files).
  • Sample frame and sample design - Presentations covered frame issues; synchronised selection; dependent source feedback; the maximal Brewer method of sample design; and multiple frames.
  • Regression estimation - The main focus of this theme was GREG (generalised regression) estimation. Other topics covered were the use of across stratum or within stratum estimation in Average Weekly Earnings; integrated weighting; and defence force personnel in labour force supplementary surveys.
  • Synthetic estimation - Overview; state estimation for the Economic Activity Survey and for the Mining and Utilities Survey.
  • Editing - Besides editing in general, this theme particularly focussed on significance editing.
  • Time Series Analysis (TSA) - As well as a general overview of TSA, presentations included a demonstration of SEASABS; use of REGARIMA and model choice techniques; analysis of Labour Force Survey supplementary effects; and a discussion of ways forward for TSA to better measure the real world.

The format of the workshop included presentations, discussions and demonstrations, where appropriate. The workshop provided an excellent technical training opportunity, particularly for Regional Office methodologists. The atmosphere of interaction and discussion amongst all attendees was a real positive of the workshop. As well as the scheduled morning and afternoon teas, there were two dinners, which were excellent times to meet/re-meet fellow methodologists. As the majority of the presentations were computer presentations, the slides and other related material were, in the main, documented and placed on an MD (Methodology Division) database. This will provide a ready source of information for attendees and also for future MD staff.

For further information please contact Paul Sutcliffe on (02) 6252 6759 or Geneal Andersen on (07) 3222 6209.



For the past several years, the ABS has been developing new estimates of the output of government services. In the past, the Australian national accounts measured such outputs in terms of the inputs used (labour, capital and intermediate goods and services). This procedure does not take account of productivity improvement.
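The difference between the two approaches can be illustrated with a simple cost-weighted (Laspeyres-type) volume index. The figures and the two-service breakdown are hypothetical; the actual ABS methodology is more elaborate:

```python
# Illustrative only: a cost-weighted output volume index for a government
# service, compared with the input-based measure it replaces.

def volume_index(base_quantities, current_quantities, base_unit_costs):
    """Current-period service quantities valued at base-period unit costs,
    relative to base-period quantities at the same costs."""
    base_value = sum(q * c for q, c in zip(base_quantities, base_unit_costs))
    curr_value = sum(q * c for q, c in zip(current_quantities, base_unit_costs))
    return curr_value / base_value

# Two hypothetical hospital services: treatments delivered, and their
# base-period unit costs.
base_qty = [1000, 500]
curr_qty = [1100, 550]       # 10% more services delivered
unit_cost = [200.0, 400.0]

print(volume_index(base_qty, curr_qty, unit_cost))  # 1.1
```

If total input costs were unchanged between the two periods, an input-based measure would record zero output growth, missing the 10% gain that the direct volume index captures.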

New output measures have already been developed for three major classes of services - health, education and justice (police, courts and prisons). Some of these new measures have been incorporated in the national accounts; others are being evaluated. These three services account for around four-fifths of output for the general government sector. The remaining one-fifth includes such services as defence, revenue collection, social security administration and policy advising.

The most recent project in this program has developed experimental output measures for revenue collection and social security administration. The work has been made possible by the efforts of various government agencies during recent years to enhance the management information on the resources they use and on the services they produce. In particular, there is now an agreed classification scheme for such services and a systematic assembly of statistics regarding service volumes and costs.

The time series of such statistics is still quite short, so, as we have done for some other industries, we shall monitor the movements in our experimental output series for some years, with a view to eventually incorporating them in the national accounts. Once they are incorporated, we shall have new output measures for around 90% of the general government sector. As yet there are no plans to develop output measures for the remaining services, such as defence or policy advising.

For more information, please contact Shiji Zhao on (02) 6252 6053.



Even the most polished and professional piece of analysis relies as much on the words around it as on the numbers themselves. Over the past few months a handful of analysts from Analysis Branch have begun developing a training course to improve written communication in the branch.

The course comprises three modules, each of which will last for about 2 hours:
  • Getting the fundamentals right: covering the importance of plain English, clear communication and motivating your writing.
  • Executive summaries: the importance of writing a good executive summary and how to do it.
  • Editing your own and other people's work: which brings a professional editor's perspective to the importance of good editing.

The course, which is strongly practical, will be structured around discussion and written exercises (some of which will be done in class, some of which will be set for homework). Although the course is designed primarily for analysts from the branch, it may be offered to others in the Bureau (if it is successful and if there is demand). And parts of it may find their way into Ken Tallis's TEDII (Turning Economic Data into Information) course.

A pilot course will be run in July.

For more information, please contact Jon Hall on (02) 6252 7221, or any of the analytical writing team: Annette Jose, Anil Kumar, John de Maio, Damian Mullaly, Cristy Williams.


In 2002, the highly topical issue of salinity led to the rapid development of a survey of Land Management Practices by farming businesses in Australia. The mail-out Salinity Survey was followed by a telephone Post Enumeration Survey (PES) to examine the performance of the form's design. The PES entailed interviewing 131 providers who had returned the paper form and asking them questions similar to those on the form, but structured differently, to see how well the two sets of answers matched. A less formal face-to-face PES was also conducted by visiting respondents.

The concerns with the paper form's design and content were:
  • the question wording explicitly mentioned a topic (salinity) which was potentially sensitive for the provider so the telephone PES asked more general questions to see if the provider would mention salinity voluntarily;
  • some questions on the paper form had long and complex structures comprising varying answer-field formats, and the question content covered multiple topics. The PES broke these questions into their components and the interviewer asked the provider for one answer at a time;
  • some questions on the form began with a YES/NO filter positioned before response categories and other information, which the provider may have needed to read before they ticked YES or NO. The PES restructured these questions so the provider was aware of all relevant information before they could give an answer to each item;
  • other questions on the form had long lists of answer categories where the provider may have forgotten the actual question wording after the first few categories. The PES rephrased these categories into full questions.

The providers for the telephone PES were selected from groups based on how they responded to the questions of most concern in the paper form. These included providers who apparently responded correctly, obviously responded incorrectly, and those who left questions blank. The results from the PES were analysed and compared with the original survey responses by the selected providers. Taking editing into consideration, the broad totals were reasonably similar across surveys, but differences at the individual provider level were quite marked.
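The pattern described above, broad totals agreeing while individual answers differ, can be illustrated with hypothetical yes/no responses:

```python
# Invented yes/no answers for ten providers; 1 = yes, 0 = no. Broad totals can
# match closely even when many individual answers disagree.

original = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]   # paper-form answers
pes      = [0, 1, 1, 1, 0, 0, 1, 0, 1, 1]   # PES answers for the same providers

total_diff = abs(sum(original) - sum(pes))
false_pos = sum(1 for o, p in zip(original, pes) if o == 1 and p == 0)  # form yes, PES no
false_neg = sum(1 for o, p in zip(original, pes) if o == 0 and p == 1)  # form no, PES yes

print(total_diff)            # 0 - broad totals agree exactly
print(false_pos, false_neg)  # 2 2 - yet four providers gave different answers
```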

The subject of salinity wasn't as sensitive for providers as anticipated, and those interviewed in the PES volunteered the relevant information freely. However, the results indicated the other design and content concerns were justified. False negatives and false positives were relatively common; in particular, the filter structure at the beginning of new and complex questions, which also covered multiple topics, appeared to be the cause of many of the problems.
Recommendations from the PES analysis include:
  • Avoid filters and have a "None of the above" category at the bottom of lists of answer options when asking conceptually complex questions, especially when they are new. Reading all the categories can assist the provider's memory and comprehension, so they include items not initially thought of at the start of the question.
  • Multiple-topic questions should be split into smaller, more consistent questions, so that each group of questions relates to a specific topic and has the same answer-field format.

The lessons learnt from this process will feed into the design of the next Salinity Survey as well as similar ABS forms.

For more information, please contact Glenn West on (02) 6252 6382, Emma Farrell on (02) 6252 7316, or Adam Sincock on (02) 6252 6766.

