Section V - Performance Information
The ABS goes to considerable lengths to ensure its data, analysis and interpretation are objective, and always publishes its statistics in ways that explain and inform, without advocating a particular position.
The ABS regularly reviews the methodologies used to produce statistics, providing the opportunity to make improvements and incorporate new approaches, where appropriate. For example, the methodology used for the Census Post Enumeration Survey was reviewed to ensure that the population estimates produced following the 2006 Census of Population and Housing are as accurate as possible. Information Paper: Measuring Net Undercount in the 2006 Population Census, 2007 (cat. no. 2940.0.55.001) was released in May 2007, describing the methodology used for this process, prior to the first releases of census data.
The ABS statistical system is open to outside scrutiny. Its methodologies are based on sound statistical principles and practices, and are disseminated widely. The Methodological Advisory Committee, which meets twice a year and consists of professional statisticians external to the ABS, provides peer review of methodological developments in the ABS. In addition, a range of research papers is published to explain statistical developments and research. Topics covered in the past year included: improvements to time series analysis methods; improving household survey designs and enhancing the use of information to monitor survey progress; assessing the quality of analytical products; and new analyses such as estimating average annual hours worked.
The ABS works to continuously improve the quality of its statistical processes. During 2006–07, the ABS reviewed its approach to quality-assuring the processing of statistical information, and has commenced implementing a range of measures aimed at reducing the incidence of statistical errors.
To ensure the production of high-quality statistics, quality monitoring is an integral part of the statistical process. The following aspects of quality are examined in this chapter: relevance, accuracy, timeliness, accessibility, interpretability and coherence.
The ABS aims to produce a large and diverse range of statistics, with a quality designed to meet the key needs of researchers, policy makers and other users within the Australian community. The ABS also strives to ensure that the users of its information are provided with readily accessible information on quality, so they can make informed decisions on the suitability of the statistics for their intended use. This information is available on the ABS website (http://www.abs.gov.au), and the ABS is currently looking into ways of making this information more visible to users of its statistics.
The changing environment, in particular the increased importance of the ABS website as the main channel for disseminating its statistics, has introduced both new challenges and new opportunities for the ABS to ensure that end users have relevant, accessible and high-quality information to guide their use of the statistics. The ABS endorses the principle that the quality of the data should be described clearly. Work has commenced on producing statements on the quality of statistical outputs written specifically for web-based dissemination. The statements will describe the quality of a statistical release using the six dimensions of the ABS data quality framework, as listed above.
The relevance of statistical information reflects the degree to which it meets the needs of its users. A key concern is whether the available information addresses the issues that are most important to researchers, policy makers and the broader Australian community. The outputs produced, the concepts and classifications used, and the scope of the collection can all affect the relevance of the data.
A detailed understanding of the users of statistical information and their requirements is an important part of the statistical process, and the ABS has a range of mechanisms in place to achieve this, including advisory groups such as the Australian Statistics Advisory Council. A range of other groups and mechanisms, which the ABS uses to communicate with the users of statistics, are described in Chapter 10 Engagement with users and producers of statistics.
For particular surveys, key stakeholders are identified and consulted before and during the survey development. Further, each survey is regularly evaluated to assess the degree to which it meets user requirements. Information Development Plans are reviewed regularly for each area of statistics, bridging the gaps between user requirements and statistical outputs.
Other specialised reviews are also conducted regularly, for example, future directions in regional statistics and priorities for information technology statistics.
The ABS continues to refine its products to ensure they remain relevant. For example, during 2006–07, the ABS commenced implementing a revised industry classification, Australian and New Zealand Standard Industrial Classification, 2006 (cat. no. 1292.0), which will enable ABS statistics to better reflect the real-world economy.
The accuracy of statistical information is the degree to which the information correctly describes the phenomena it was designed to measure. Most statistics produced by the ABS are obtained from a sample of households or businesses. This process results in some uncertainty as to the accuracy of the estimates published. For example, the estimate from the sample may not be the same as would have been obtained if information had been collected from the whole population. This is known as sampling error. There are also other sources of error that potentially cause inaccuracy, including the level of non-response, the magnitude of revisions made as additional information is received, and errors from other parts of the collection process (non-sampling error). The ABS aims to inform users about the accuracy of statistics and enable them to assess whether the accuracy of the data will be sufficient to meet their needs.
Information about accuracy
As users will want to use statistical information for different purposes, it is important to make information available to enable them to make their own assessment of its quality. Descriptions of accuracy, as well as extensive information on the statistical methods used in collections, are routinely provided in concepts, sources and methods publications, in the explanatory notes in publications, and through the Statistical Clearing House via the ABS website.
In addition, major changes to methodology are explained in feature articles or information papers, such as a feature article on a new methodology for estimating the number of persons in the labour force in Forthcoming Changes to Labour Force Statistics, May 2007 (cat. no. 6292.0), and a new methodology for compiling the established house price index in House Price Indexes: Eight Capital Cities, March 2007 (cat. no. 6146.0).
The ABS has made few significant mistakes in the statistics it has released. On the infrequent occasions when substantial processing errors are found, it is ABS policy to publish corrected data as soon as possible. Instances where an error has resulted in a release being reissued in 2006–07 include:
To maintain the high level of accuracy of ABS statistics, the ABS continues to implement reviews and risk-mitigation strategies so that processes are examined and any weaknesses identified are addressed.
Non-sampling error is a general term that describes all sources of error other than the error introduced by the sampling process. Sampling error can be measured by using the mathematical properties of the selected sample. Non-sampling error is much harder to measure.
Some sources of non-sampling error that are most relevant to statistical surveys include: non-response error; errors in identifying and contacting the population of interest for a survey; errors introduced by the questionnaire design, such as misunderstanding or inadvertently missing questions, or phrasing questions that predispose a respondent to answer in a particular way; and data capture, processing and coding errors.
The ABS minimises the impact of non-sampling errors by use of best practice procedures in questionnaire design, interview procedures, data validation and repair, and processing. Any significant changes to questionnaire wording or data collection methods are carefully trialled and evaluated before they are implemented.
The relative standard error (RSE) is a measure of the sampling error associated with an estimate. The magnitude of standard errors varies between collections and between data items within a collection due to factors such as the responding sample size and the nature of the data item. The RSE is a useful indicator for comparing the accuracy of estimates between surveys. Table 12.1 presents a summary view of the estimated RSEs for key statistics from a number of major ABS surveys. Further detailed information is included with each ABS publication, as well as in the concepts, sources and methods publications released by the ABS.
RSEs are affected by the size of the sample used, the sample design used for the survey, and by the underlying variability of the indicator in the population.
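As a rough illustration of these relationships, the following Python sketch computes the RSE of a sample mean under simple random sampling from a made-up skewed population. This is a textbook calculation under assumed data, not the ABS's actual survey methodology, which uses far more complex designs and variance estimators; it simply shows the RSE falling as the sample size grows.

```python
import math
import random

def estimate_with_rse(sample):
    """Return the sample mean and its relative standard error (RSE, %).

    Illustrative only: assumes a simple random sample, so the standard
    error of the mean is s / sqrt(n), and the RSE expresses that error
    as a percentage of the estimate itself.
    """
    n = len(sample)
    mean = sum(sample) / n
    variance = sum((x - mean) ** 2 for x in sample) / (n - 1)
    se = math.sqrt(variance / n)   # standard error of the sample mean
    return mean, 100 * se / mean   # RSE as a percentage of the estimate

random.seed(42)
# Hypothetical skewed population (e.g. business turnover is typically
# right-skewed); lognormal is a common stand-in for such data.
population = [random.lognormvariate(3, 1) for _ in range(100_000)]

# Larger samples from the same population yield smaller RSEs.
for n in (100, 1_000, 10_000):
    mean, rse = estimate_with_rse(random.sample(population, n))
    print(f"n={n:>6}: estimate={mean:8.2f}, RSE={rse:.2f}%")
```

Because the RSE scales with 1/sqrt(n), halving it requires roughly quadrupling the sample, which is why small jurisdictions carry larger RSEs than national totals.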
Sample size influences the level of accuracy that can be attained. For example, the accuracy of estimates from the Labour Force Survey varies between states and territories: achieving the same level of accuracy in each would require identical sample sizes, but sample sizes vary between jurisdictions; for example, the sample supporting the New South Wales estimates is larger than that supporting the Northern Territory estimates. Hence the estimated RSE for total employed persons in Australia is lower than that for any individual state or territory, and the estimated RSE for total employed persons in New South Wales is lower than the estimated RSE for total employed persons in the Northern Territory.
ABS sample designs for business surveys use groups of similar businesses (strata) as the basis for sample selection to improve the efficiency of estimation. Information such as employment size or annual sales can be used in this grouping. Many indicators, such as annual turnover or value of building work done, are closely related to the variables used in stratification, allowing these indicators to be estimated with relatively high accuracy. Other variables, such as capital expenditure or job vacancies, are not as closely related, and so cannot be estimated with the same accuracy.
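A stylised sketch of why stratification helps, using entirely made-up strata rather than real ABS survey data: when the grouping variable (here, an assumed employment-size band) is closely related to the indicator (here, a notional turnover), each stratum is internally homogeneous, and expanding each stratum's sample mean by the stratum size produces an accurate population total from a small sample.

```python
import random
import statistics

random.seed(1)

# Hypothetical strata of businesses grouped by employment size.
# Turnover varies far less *within* a size band than across bands,
# which is the condition under which stratification pays off.
strata = {
    "small":  [random.gauss(100, 10) for _ in range(8000)],
    "medium": [random.gauss(1000, 80) for _ in range(1800)],
    "large":  [random.gauss(20000, 1500) for _ in range(200)],
}

def stratified_total(sample_per_stratum):
    """Estimate the population total by expanding each stratum's sample
    mean by the stratum size (a textbook stratified estimator, not the
    ABS's actual methodology)."""
    total = 0.0
    for units in strata.values():
        sample = random.sample(units, sample_per_stratum)
        total += len(units) * statistics.mean(sample)
    return total

true_total = sum(sum(units) for units in strata.values())
est = stratified_total(50)  # only 150 sampled units out of 10,000
print(f"true total ~ {true_total:,.0f}, stratified estimate ~ {est:,.0f}")
```

An indicator weakly related to the size bands (capital expenditure, say) would vary widely within each stratum, so the same design would estimate it with a much larger RSE, as the surrounding text notes.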
As well as differences between surveys, the RSE can also change with time for any given survey. These changes may be due to changes in the way the survey is conducted, for example, changes in the sample size or the method of producing estimates, or changes in the population being studied, such as a change in the prevalence of a particular characteristic.
Greater accuracy was achieved for the labour force estimates in 2007 due to an improved estimation methodology. Analysis of historical data showed that, at the Australia level, seasonally adjusted employment estimates were 0.07 per cent lower on average and unemployment estimates were 1.6 per cent lower on average under the new estimator than under the previous estimator. For more information see Information Paper: Forthcoming Changes to Labour Force Statistics, May 2007 (cat. no. 6292.0).
The RSE for job vacancies is relatively large due to the underlying variability. That is, the number of job vacancies can vary considerably from business to business—and for any business it can vary considerably from month to month. Therefore, a very large sample would be required to measure job vacancies with high precision. In the table below, the estimated RSE for job vacancies for 2005–06 of 5.6 per cent is around the same as the estimated RSE for job vacancies for 2004–05 of 5.5 per cent.
Table 12.1: Relative standard errors (RSEs) for selected indicators(a)
REVISIONS TO DATA
One measurable component of statistical accuracy is revisions to data made after initial publication, resulting from additional information becoming available. Revisions are generally measured by their size and frequency over time.
Revisions are applied to statistical series to ensure that there is an appropriate balance between accuracy and timeliness in the release of the statistics. Revisions could be avoided, but this would mean that either the release of statistics would be substantially delayed, or that the statistics could not be improved by making use of any new or better sources of data that become available. The ABS aims to maximise the overall quality of the released statistics by publishing accurate statistics in a timely manner, while subsequently improving the accuracy through revisions as new data become available. It is also ABS policy to inform users of any significant revisions and, where appropriate, to revise past time series and advise users accordingly.
The tables below provide, for two key series, the mean revision and the mean absolute revision for the past seven years. The mean revision is the percentage difference between the first estimate published and that estimate one year later, averaged over the four quarters of the year. The mean absolute revision is the average of the absolute values of those quarterly revisions.
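The two measures can be sketched as follows. The quarterly growth figures here are hypothetical, chosen only to show that revisions which offset each other give a mean revision near zero while the mean absolute revision still registers their size:

```python
def revision_stats(first, later):
    """Mean revision and mean absolute revision, in percentage points,
    between first-published quarterly growth rates and the same
    quarters as re-estimated one year later (illustrative only)."""
    revisions = [l - f for f, l in zip(first, later)]
    mean_rev = sum(revisions) / len(revisions)
    mean_abs_rev = sum(abs(r) for r in revisions) / len(revisions)
    return mean_rev, mean_abs_rev

# Hypothetical quarterly growth rates (per cent) for one year:
first = [0.6, 0.9, 0.4, 0.7]   # as first published
later = [0.7, 0.8, 0.5, 0.6]   # as estimated one year later

mean_rev, mean_abs_rev = revision_stats(first, later)
# Offsetting revisions (+0.1, -0.1, +0.1, -0.1) give a mean revision
# near 0 but a mean absolute revision near 0.1 percentage points.
print(mean_rev, mean_abs_rev)
```

This is why the text below reports both measures: the mean revision reveals any systematic bias in first estimates, while the mean absolute revision reveals their typical size.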
Table 12.2 describes the revisions to quarterly gross domestic product (GDP). In particular, it shows the difference between the first estimate of GDP and that estimate one year later, in terms of the mean revision and the mean absolute revision expressed as percentage points. The figures continue to show that revisions to quarterly GDP in recent years remain relatively small, as measured by the mean absolute revision. A zero mean revision indicates that the revisions to quarterly GDP over the year have been offsetting. Despite the revisions to quarterly GDP being quite small, efforts to further improve the estimates are ongoing.
Table 12.2: Revisions to quarterly gross domestic product, percentage change(a)
(b) First three quarters of 2005–06 only
Mean absolute revisions to the quarterly current account transactions since 1999–2000 are shown in Table 12.3. The revisions to the current account deficit are expressed in percentage terms, rather than in the percentage points used for the revisions to GDP.
Table 12.3: Revisions to quarterly current account transactions(a)
(b) First three quarters of 2005–06 only
The timeliness of statistical information can be measured by the gap between the reference period (the period the data relate to) and the date of release of results. The ABS continues to adhere to pre-announced release dates and make improvements, where possible, to the timeliness achieved. Tables 12.4 and 12.5 present information on the timeliness for ABS monthly and quarterly tabular data for main economic indicator statistics, and other general releases. Table 12.6 reports on the timeliness of Confidentialised Unit Record Files (CURFs).
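Measured this way, timeliness reduces to a simple elapsed-day calculation between two dates. The dates below are hypothetical examples, not actual ABS release dates:

```python
from datetime import date

def elapsed_days(reference_period_end, release_date):
    """Timeliness measured as the number of days between the end of
    the reference period and the release of results."""
    return (release_date - reference_period_end).days

# Hypothetical example: a monthly series for June 2007 released in
# mid-July would have a timeliness of 12 elapsed days.
print(elapsed_days(date(2007, 6, 30), date(2007, 7, 12)))  # → 12
```

Averaging this figure over all releases of a collection in a year gives the kind of summary reported in Tables 12.4 to 12.6.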
The high standard of timely release of statistical tables was maintained in 2006–07. The greatest change since last year has been the improvement in timeliness for other general tabular statistics.
Table 12.4: Time between end of reference period and release of tabular data (average number of elapsed days)(a)
Table 12.5: Time between end of reference period and release of tabular data for selected publications
The timeliness of release of information depends on a number of factors, including the amount and complexity of information being collected, the source of the data (for example, whether directly collected or sourced from administrative records), and the amount of processing or validation of the information required before release.
For example, labour force statistics are released very quickly after the end of the reference month. Part of the explanation for this is that the data collection is completed before the end of the reference month. In contrast, for demographic statistics on Australia’s population, the quarterly changes to population statistics are based on a variety of administrative sources, such as registrations of births and deaths, passenger cards completed at Australia’s borders, and modelled estimates of interstate migration (using information from Medicare card registration address changes, delayed by three months as registration often takes place after the actual move). It takes around five months before estimates can be published due to the time needed to acquire and process the administrative data, particularly with the delay of three months for the Medicare card data.
The elapsed time between the end of the reference period and the supply of CURF data has improved significantly in recent years, as can be seen from the average number of elapsed days in Table 12.6. The count of CURF releases relates to survey topics: where both a basic and an expanded CURF are produced for the same survey, they are counted as a single release. More information on CURFs can be found in Chapter 13 Dissemination of statistics. As 2006–07 information is unavailable, Table 12.6 presents information for the previous year.
Table 12.6: Time between end of reference period and release of CURFs
The accessibility of statistical information refers to the ease with which it can be obtained. This includes the ease with which the existence of information can be ascertained, as well as the suitability of the form or medium through which the information can be accessed. The cost of the information may also be an aspect of accessibility for some users. More information on the accessibility of statistical information can be found in Chapter 13 Dissemination of statistics.
All statistics on the ABS website are now accessed free of charge. This policy was announced in December 2005. The change means that all publications, spreadsheets and census data on the website are now available without cost to any member of the public with internet access. However, people who require paper copies of publications, information on CD-ROM, or information more detailed than that published, will be charged under the ABS pricing policy.
A CURF is a product that allows approved researchers with a valid statistical purpose to access individual survey responses. The data files are confidentialised and access is carefully controlled to ensure that no individual or organisation can be identified. The ABS has worked to improve the accessibility of information available by increasing the number of CURFs released, with a milestone achieved on 22 June 2007— the release of the 100th ABS CURF (inclusive of both basic and expanded CURFs). The ABS has also continued work on improving the accessibility of CURFs through the ABS Remote Access Data Laboratory (RADL™).
In assisting users to access more customised census information, the ABS released the 2006 Census Table Builder, as noted in 2006 Census of Population and Housing—Product Brief, Table Builder (cat. no. 2065.0). Table Builder allows users to construct their own tables via an interactive web interface, using a database containing the 2006 Census Unit Record File. Users are able to select person, family and dwelling classifications. Table Builder will also be available via the ABS RADL™ portal.
The interpretability of statistical information reflects the availability of the supplementary information and metadata necessary to interpret and utilise it appropriately. This information normally covers the availability and clarity of metadata, including concepts, classifications and measures of accuracy. In addition, interpretability includes the appropriate presentation of the data in such a way that it aids in the correct interpretation of the data.
ABS releases are accompanied by extensive explanatory notes to aid the interpretation of statistical information. A range of material is also available on the ABS website detailing the methods, classifications, concepts and standards used by the ABS. During 2006–07, the Producer and International Trade Price Indexes: Concepts, Sources and Methods, 2006 (cat. no. 6429.0) was released for the first time as a replacement to the 1995 issue of Producer and Foreign Trade Price Indexes: Concepts, Sources and Methods, 1995 (cat. no. 6419.0).
The ABS is currently working to improve the metadata available for ABS collections. For more information see Chapter 15 Statistical standards and infrastructure.
A number of ABS publications combine, compare and contrast statistics from different sources to help users interpret how changes in one aspect of the economy or society can impact on other aspects. Examples include Measures of Australia’s Progress, 2006 (cat. no. 1370.0), Australian Economic Indicators, July 2007 (cat. no. 1350.0) and Australian Social Trends, 2006 (cat. no. 4102.0). For more information on these publications and other analytical work undertaken by the ABS to assist in the interpretation of statistics, see Chapter 14 Extended analysis of statistics.
The coherence of statistical information reflects the degree to which it can be successfully brought together with other statistical information within a broad analytic framework, now and over time. Coherence encompasses the internal consistency of a collection as well as its comparability, both over time and with other data sources. The use of standard concepts, classifications and target populations promotes coherence, as does the use of common methodology across surveys.
Coherence of ABS outputs requires the use of nationally and internationally agreed concepts and classifications. Standard concepts and classifications are not only used extensively within the ABS, but also promoted to other producers of statistical information within Australia. Information on statistical standards, concepts, classifications and methodologies are readily accessible through the ABS website. For more information see Chapter 15 Statistical standards and infrastructure.
The Statistical Clearing House (SCH) provides approval to conduct surveys that are directed to fifty or more businesses and that are conducted by, or on behalf of, any Australian Government agency, to ensure that surveys are necessary, well designed, and place minimal burden on business respondents. One of the criteria used by the SCH is the coherence of the statistical information that will be produced. In particular, surveys are assessed on their use of standard methodologies, concepts and classifications, their consistency with past or future surveys, and the extent to which outputs can be compared, or jointly used, with other sources of data. For more information about the SCH see Chapter 10 Engagement with users and producers of statistics.
Any changes that may impact on the coherence of ABS statistics are detailed in the explanatory notes that accompany each release. Significant changes may lead to series breaks in time series, or adjustments to past data. Occasionally situations occur that necessitate breaks being applied to the trend series. These breaks are necessary because of a change in the underlying level of the original series. An example of a recently applied series break in trend estimates can be found in Overseas Arrivals and Departures, Australia, November 2005 (cat. no. 3401.0) in short-term resident departures (Indonesia), to account for the decrease in movements resulting from the Bali bombing of 1 October 2005. This break not only applied to the estimates of the individual country (ie Indonesia) but as a consequence a break was required to the regional total series (eg Total South-East Asia) and to the total series.
Further, when the Telstra Corporation was effectively privatised on 20 November 2006 (changing from the public sector to the private sector for the purposes of ABS statistics), this change impacted the average weekly earnings by sector series, and a series break was made in Average Weekly Earnings, Australia, February 2007 (cat. no. 6302.0). For more information on the impacts of the privatisation of the Telstra Corporation see Future Treatment of Telstra in ABS Statistics, 2007 (cat. no. 8102.0).