1001.0 - Australian Bureau of Statistics -- Annual Report, 2007-08  
ARCHIVED ISSUE Released at 11:30 AM (CANBERRA TIME) 05/12/2008   
Chapter 11 - Quality and timeliness (Section V - Performance Information)

INTRODUCTION

The quality of the statistics produced by the ABS is critical to ensuring the ABS achieves its mission of assisting and encouraging informed decision-making. The ABS strives to maximise the quality of the information it produces, taking account of budgetary constraints and the load placed on survey respondents.

ASPECTS OF QUALITY

To ensure the production of high quality statistics, quality monitoring is an integral part of the process. The following aspects of quality make up the ABS data quality framework:

  • institutional environment

  • relevance—the degree to which information meets the needs of users

  • accuracy—the degree to which the information correctly describes the phenomena being measured

  • timeliness—the delay between the reference period and the release of the information

  • accessibility—the ease with which the information can be referenced

  • interpretability—the availability of supplementary information necessary to interpret the statistical information, and

  • coherence—the degree to which the information can be brought together with other information, and over time.


    Addressing the quality of a statistical product will always involve balancing these aspects. For example, initiatives that could improve the accuracy of the statistics may reduce their timeliness. The ABS must also work within its budget, and find the right balance between achieving high quality statistical information and collecting an appropriate range of statistics. The ABS aims to produce a large and diverse range of statistics, with a quality designed to meet the key needs of policy makers, researchers and other users within the Australian community.

    The ABS also strives to ensure users of its information are provided with readily accessible information on quality, so they can make informed decisions on the suitability of the statistics for their intended use. This information is available electronically on the ABS website <https://www.abs.gov.au>, through explanatory and technical notes, and more recently through Quality Declarations. Quality Declarations describe the quality of a statistical release using the dimensions of the ABS data quality framework, to assist users in determining the ‘fitness for purpose’ of the product being viewed. Quality Declarations were introduced on the ABS website from 25 October 2007, with the 2006 Census of Population and Housing products the first to include them.


    In recent years, the ABS has established a more formal ‘end-to-end’ framework for its survey design, data collection, data processing and dissemination activities. The focus is on ‘total quality management’, including quality assurance as well as quality inspection processes (for example, ‘Quality Gates’) strategically positioned and designed within end-to-end workflows. This ‘end-to-end’ focus allows effort on quality assurance and inspection to be targeted in a manner that provides the greatest overall benefit to the accuracy of the final outputs. For example, the ABS focuses the greatest attention on the quality of the ‘input’ data that will have the greatest impact on the final outputs.

    Improved facilities for capturing and analysing ‘management information’ about how various steps in the process have contributed to the quality of the final data are an important aid in this regard, allowing the cost and benefit of each current step to be better understood, along with the likely cost and benefit of various process tuning options. Common, well-managed data stores, and integrated systems working on a common basis, have reduced the risk of ‘human error’ during processing as well as the risk of many different stand-alone systems holding inconsistent data and/or processing it inconsistently. Finally, improved facilities within the ABS for analysing and reporting on data allow possible quality issues to be identified and investigated in a timely manner prior to publication, including:

  • how data has changed during the current processing cycle

  • how data for this cycle compares with data from a previous cycle of the same statistical activity (where applicable), and

  • how data for this cycle compares with other, related, sources of data.

    The changing environment, in particular the increased importance of the ABS website as the main channel for disseminating its statistics, has introduced new challenges and opportunities to improve the ways the ABS ensures end users have relevant and accessible quality information to guide their use of the statistics. The ABS endorses the principle that ‘the quality of the data should be described clearly and understandably’.

    INSTITUTIONAL ENVIRONMENT

    The ABS aims to produce high quality statistics that can be used with confidence. It also aims to exercise the highest professional standards in all aspects of its statistical operations, and recognises a quality culture is fundamental to maintaining the trust of the Australian community.

    The ABS goes to considerable lengths to ensure its data, analysis, and interpretations are objective, and always publishes its statistics in ways that explain and inform, without advocating a particular position.

    The ABS aims to maintain transparency in its operations and performance. Some of the ways in which this is achieved include:

  • advertisement of all scheduled release dates for publications up to 12 months in advance

  • use of daily press and media releases, to inform users of publications being released each day

  • a strict embargo policy, which is known to the public, that ensures impartiality for the release of all publications

  • publication of the ABS three-year Forward Work Program, which describes the ABS work program, including the resources to be used, outputs, clients and uses of statistical information, and the proposed developments over the next three years

  • release of information about statistical standards, frameworks, concepts, sources and methods in a range of information papers and other publications, and

  • inclusion of details of major revisions to published data in the explanatory notes of the relevant publication.


    The ABS regularly reviews the methodologies used to produce statistics, providing the opportunity to make improvements and incorporate new approaches, where appropriate. For example, a new sample design for the Labour Force Survey was phased in from November 2007. This new design makes use of the composite estimate methodology introduced in June 2007 and population information from the 2006 Census of Population and Housing, leading to a reduced respondent burden for statistics of comparable quality. Information Paper: Labour Force Survey Sample Design, 2007 (cat. no. 6290.0) was released in November 2007, describing the new sample design and the impact on quality.

    LABOUR FORCE SURVEY—CHANGES TO THE SAMPLE SIZE IN 2008

    As a 2008-09 savings initiative, the Australian Statistician announced that, from July 2008, the sample size of the Labour Force Survey would be reduced by 24% compared with the June 2008 sample implemented under the 2006 sample design (outlined in the information paper referred to above).

    The ABS is implementing this sample reduction in such a way that the sample can be easily increased again in the future, if the ABS funding position changes.

    The new sample, while smaller, will still be representative, with selections made in all parts of Australia. There will be increased volatility in the estimates, particularly the original and seasonally adjusted estimates, but this volatility will be random. Overall, the relative standard errors for estimates of employment and unemployment at the national, state and territory level are expected to be approximately 15% higher than those expected from the 2006 sample design.

    The ABS statistical system is open to outside scrutiny. Its methodologies are based on sound statistical principles and practices, and are disseminated widely. The Methodological Advisory Committee meets twice a year, and consists of professional statisticians external to the ABS, who provide peer review of methodological developments in the ABS. In addition, a range of research papers are published to explain statistical developments and research. Topics covered in 2007–08 include:

  • assessing the quality of linked datasets

  • imputation and estimation for the census

  • sample designs for surveys of Indigenous persons, and

  • comparing seasonal adjustment methodologies for quarterly series, when monthly data is also available.


    RELEVANCE

    The relevance of statistical information reflects the degree to which it meets the needs of the users of the information. Of concern is whether the available information addresses the issues most important to policy-makers, researchers, and to the broader Australian community. The outputs produced, the concepts and classifications used, and the scope of the collection can all affect the relevance of the data.

    A detailed understanding of the users of statistical information and their requirements is an important part of the statistical process, and the ABS has a range of mechanisms in place to achieve this, including its peak advisory group, the Australian Statistics Advisory Council. A range of other advisory groups and mechanisms, which the ABS uses to communicate with the users of statistics, are described in Chapter 9, Engagement with users and producers of statistics.

    For particular surveys, key stakeholders are identified and consulted before and during the survey development. Further, each survey is regularly evaluated to assess the degree to which it meets user requirements. Information Development Plans are reviewed regularly for each area of statistics, bridging the gaps between user requirements and statistical outputs.

    INFORMATION DEVELOPMENT PLANS

    Information development plans (IDPs) involve a review of the needs of users in a particular area of statistics, a review of the available sources of data, an assessment of the gaps and overlaps in information available, and recommendations on a future work program.

    More information on IDPs can be found in Chapter 9, Engagement with users and producers of statistics.

    The ABS continues to review and refine its products, to ensure they remain relevant. For example, changes to classifications used in import and export statistics are being implemented for these statistics to remain internationally relevant and comparable. The impact of the updated United Nations Standard International Trade Classification is discussed in Information Paper: Changes to International Trade Statistics, 2007–08 (cat. no. 5368.0.55.009).

    During 2007–08, the ABS has continued to roll out a revised classification of industry, Australian and New Zealand Standard Industrial Classification, 2006 (cat. no. 1292.0), to a number of ABS annual collections. The revised classification enables ABS statistics to better reflect the real-world economy. Further information on implementation of the Australian and New Zealand Standard Industrial Classification, 2006 can be found in Chapter 14, Statistical standards and infrastructure.

    ACCURACY

    The accuracy of statistical information is the degree to which the information correctly describes the phenomena it was designed to measure. Most statistics produced by the ABS are obtained from a sample of households or businesses. The estimate from the sample may not be the same as would have been obtained if information had been collected from the whole population—this is known as sampling error. There are also other sources of error that potentially cause inaccuracy, including the level of non-response, the magnitude of revisions made as additional information is received, and errors from other parts of the collection process (non-sampling error). The ABS aims to inform users about the accuracy of statistics, so they can assess whether the accuracy of the data will be sufficient to meet their needs.

    INFORMATION ABOUT ACCURACY

    As users will want to use statistical information for different purposes, it is important to make information available to enable them to make their own assessment of the quality. Descriptions of accuracy, as well as extensive information on the statistical methods used in collections, are routinely provided in concepts, sources and methods publications, the explanatory notes in publications, quality declarations attached to publications, and at the Statistical Clearing House (see the ABS website <https://www.abs.gov.au> and/or the National Statistical Service site <http://www.nss.gov.au>).

    In addition, major changes to methodology are explained in feature articles or information papers, such as Changes to Weights of the Price Indexes for the Output of the General Construction Industry, 2008 (cat. no. 6406.0), about changes to producer price indexes; and Experimental Estimates of Industry Multifactor Productivity, 2007 (cat. no. 5260.55.001), about new methods for industry-level multifactor productivity estimation.

    The ABS has made few significant errors in the statistics it has released. On the infrequent occasions when processing errors are found, it is ABS policy to publish corrected data as soon as possible. During 2007–08, there were some minor errors found in statistical releases, including:

  • Census of Population and Housing QuickStats: the electronic release contained an error occurring in counts for some categories of ‘Language spoken at home’ and ‘Industry of employment’ for Australian totals, and was reissued in April 2008.

  • Natural Resource Management on Australian Farms, 2004–05 (cat. no. 4620.0): the publication was reissued in December 2007 due to errors in the calculation of some percentages, and to move some agricultural units from Victoria to New South Wales.
    To maintain the high level of accuracy of ABS statistics, the ABS continues to implement reviews and risk mitigation strategies so that processes are examined and any weaknesses identified are addressed.

    NON-SAMPLING ERROR

    Non-sampling error is a general term that describes all sources of error other than the error introduced by the sampling process. Sampling error can be measured by using the mathematical properties of the selected sample. Non-sampling error is much harder to measure.

    The sources of non-sampling error most relevant to statistical surveys include: non-response error; errors in identifying and contacting the population of interest for a survey; errors introduced by the questionnaire design, such as respondents misunderstanding or inadvertently missing questions, or questions phrased in a way that predisposes a respondent to answer in a particular manner; and data capture, processing and coding errors.

    The ABS minimises the impact of non-sampling errors by use of best practice procedures in questionnaire design, interview procedures, data validation and repair, and processing. Any significant changes to questionnaire wording or data collection methods are carefully trialled and evaluated before they are implemented.

    The ABS continues to work to reduce the impact of non-sampling errors. For example, it has become more difficult to make contact with respondents in household surveys for a number of reasons, including higher workforce participation and the increased number of people living in secure apartment blocks. As part of the 2007 Survey of Mental Health and Wellbeing, a follow-up study is being conducted to investigate patterns of non-response, to further ABS understanding of non-response issues and to assist in improving response management in the future.

    SAMPLING ERROR

    The relative standard error (RSE) is a measure of the sampling error associated with an estimate. The magnitude of standard errors varies between collections and between data items within a collection due to factors such as the responding sample size and the nature of the data item. The RSE is a useful indicator for comparing the accuracy of estimates between surveys. Table 11.1 presents a summary view of the estimated RSEs for key statistics from a number of major ABS surveys. International comparisons of RSEs for selected indicators can be found in Table 11.2. Further detailed information is included with each ABS publication, as well as in the concepts, sources and methods publications released by the ABS.
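
    Stated generally (this formulation is standard and not specific to any one ABS collection), the RSE expresses the standard error of an estimate as a percentage of the estimate itself:

    \[ \mathrm{RSE}(\hat{y}) = \frac{\mathrm{SE}(\hat{y})}{\hat{y}} \times 100 \]

    where \(\hat{y}\) is the survey estimate and \(\mathrm{SE}(\hat{y})\) is its standard error. As an illustrative reading of Table 11.1, an RSE of 0.6% on an employment estimate of around 10 million persons would correspond to a standard error of roughly 60,000 persons.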

    RSEs are affected by the size of the sample used, the sample design used for the survey, and by the underlying variability of the indicator in the population.

    Sample size influences the level of accuracy that can be attained. For example, the accuracy of estimates from the Labour Force Survey varies between states and territories; broadly similar sample sizes would be required in every state and territory to achieve the same level of accuracy. In practice, sample sizes vary between states: for example, the sample supporting the New South Wales estimates is larger than the sample supporting the Northern Territory estimates. As a result, the estimated RSE for total employed persons in Australia is lower than that for any individual state or territory, and the estimated RSE for total employed persons in New South Wales is lower than that for the Northern Territory.
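
    As a rough guide (a simplification that ignores stratification and other design effects), the standard error of a survey estimate falls in proportion to the square root of the sample size \(n\):

    \[ \mathrm{RSE} \propto \frac{1}{\sqrt{n}} \]

    so, other things being equal, a sample roughly four times larger would be needed to halve the RSE. This is why the national estimate, which draws on the combined state and territory samples, carries a lower RSE than any individual state or territory estimate.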

    ABS sample designs for business surveys use groups of similar businesses (strata) as the basis for sample selection to improve the efficiency of estimation. Information such as employment size or annual sales can be used in this grouping. Many indicators, such as annual turnover or value of building work done, are closely related to the variables used in stratification, allowing these indicators to be estimated with relatively high accuracy. Other variables, such as capital expenditure or job vacancies, are not as closely related, and so cannot be estimated with the same accuracy.

    As well as differences between surveys, the RSE can also change with time for any given survey. These differences over time may be due to changes in the way the survey is conducted, for example, changes in the sample size or the method of producing estimates, or changes in the population being studied, such as a change in the prevalence of a particular characteristic.

    The RSE for job vacancies is relatively large due to the underlying variability. That is, the number of job vacancies can vary considerably from business to business, and for any business it can vary considerably from month to month. Therefore, a very large sample would be required to measure job vacancies with high precision.

    Table 11.1: Relative standard errors (RSEs) for selected indicators (a)

    Economic indicators
      • Retail Trade, Australia (cat. no. 8501.0): Total turnover for the retail industry, Australia (RSE 0.8%)
      • Private New Capital Expenditure and Expected Expenditure, Australia (cat. no. 5625.0): Actual private new capital expenditure, Australia (RSE 1.6%)
      • Business Indicators, Australia (cat. no. 5676.0): Company gross operating profit, Australia (RSE 1.6%)
      • Building Activity, Australia (cat. no. 8752.0): Value of building work done, Australia (RSE 0.7%)
      • Average Weekly Earnings, Australia (cat. no. 6302.0): Full-time adult ordinary time earnings, Australia (RSE 0.8%)
      • Job Vacancies, Australia (cat. no. 6354.0): Job vacancies, Australia (RSE 4.3%)

    Social indicators
      • Labour Force Survey (cat. no. 6202.0): Total number of persons employed (aged 15 and over), Australia (RSE 0.6%)
      • Time Use Survey (cat. no. 4153.0): Total time spent on employment-related activities (all persons) (RSE 1.2%)
      • Household Use of Information Technology (cat. no. 8146.0): Number of households accessing the Internet at home (RSE 0.6%)

    (a) RSEs are presented for surveys conducted in respect of the 2006–07 reference period.

    Table 11.2: Relative standard errors (RSEs) for selected indicators, with selected international comparisons

  • Retail Trade, Australia (cat. no. 8501.0): Total turnover for the retail industry, Australia
      Australia: 0.8%; New Zealand: 1.8% (a); United States of America: 0.4% (c); Canada: 0.7% (e)

  • Labour Force Survey (cat. no. 6202.0): Total number of persons employed (aged 15 and over), Australia
      Australia: 0.6%; New Zealand: 0.9% (b); United States of America: 1.9% (d); Canada: 0.2% (f)

    (a) Source: Statistics New Zealand, Retail Trade Survey, ISSN 1178–0355 (http://www.stats.govt.nz)
    (b) Source: Statistics New Zealand, Labour Market Statistics: 2007, ISSN 1177–8040 (http://www.stats.govt.nz)
    (c) Source: US Bureau of Census, Monthly Retail Trade Survey (http://www.census.gov/mrts/www/data/text/nrelys.txt)
    (d) Source: Bureau of Labor Statistics, Current Population Survey (http://www.bls.gov/bls/empsitquickguide.htm)
    (e) Source: Statistics Canada, Retail Trade, cat. no. 63–005–X (http://www.statcan.ca)
    (f) Source: Statistics Canada, Labour Force Information, cat. no. 71–001–X (http://www.statcan.ca)

    REVISIONS TO DATA

    One measurable component of statistical accuracy is revisions to data made after initial publication, resulting from additional information becoming available. Revisions are generally measured by their size and frequency over time.

    Revisions are applied to statistical series to ensure there is an appropriate balance between accuracy and timeliness in the release of the statistics. Revisions could be avoided, but this would mean that either the release of statistics would be substantially delayed, or the statistics could not be improved by making use of any new or better sources of data that become available. The ABS aims to maximise the overall quality of the released statistics by publishing accurate statistics in a timely manner and subsequently improving the accuracy through revisions as new data become available. It is also ABS policy to inform users of any significant revisions and, where appropriate, to revise past time series and advise users accordingly.

    One of the main causes of revisions to time series data in the past has been the application of filters to decompose the original series into its trend, seasonal and irregular components. These filters use data from both past and future time points, and so different filters must be used at the end of a series as the future time points are not available, leading to revisions as this information becomes available. Most ABS time series now use autoregressive integrated moving average (ARIMA) modelling methods, which improve the revision properties of seasonally adjusted and trend estimates. ARIMA modelling relies on the characteristics of the series being analysed to project future period data. The projected values are temporary, intermediate values, which are only used internally to improve the estimation of the seasonal factors. The projected data do not affect the original estimates and are discarded at the end of the seasonal adjustment process.
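
    A simplified sketch of this approach (the specific filters and model settings used by the ABS are not reproduced here): in a multiplicative decomposition, the original series \(O_t\) is expressed as

    \[ O_t = T_t \times S_t \times I_t, \qquad \text{seasonally adjusted value} = \frac{O_t}{\hat{S}_t}, \]

    where \(T_t\) is the trend, \(S_t\) the seasonal component and \(I_t\) the irregular component. The moving-average filters used to estimate \(\hat{S}_t\) are symmetric, so before adjustment the end of the series is extended with temporary ARIMA forecasts \(\hat{O}_{t+1}, \hat{O}_{t+2}, \ldots\); this allows near-symmetric filters to be applied to the most recent observations, and the estimated seasonal factors change less when the forecasts are later replaced by actual data, reducing revisions.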

    The tables below provide, for two key series, the mean revision and the mean absolute revision for the past eight years. The mean revision is the difference between the first estimate published and that estimate one year later, averaged over the four quarters of the year. The mean absolute revision is the average of the absolute values of these quarterly revisions.
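
    A minimal formalisation of these two measures (the notation is introduced here for illustration only): if \(r_q\) denotes the revision for quarter \(q\) of the reference year, that is, the first published estimate subtracted from the estimate published one year later, then

    \[ \text{mean revision} = \frac{1}{4}\sum_{q=1}^{4} r_q, \qquad \text{mean absolute revision} = \frac{1}{4}\sum_{q=1}^{4} \left| r_q \right|. \]

    For the GDP series in Table 11.3 the revisions \(r_q\) are differences between percentage growth rates, so the measures are expressed in percentage points; for the current account series in Table 11.4 they are taken as percentages of the first estimate.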

    Table 11.3 describes the revisions to quarterly gross domestic product (GDP). In particular, it shows the difference between the first estimate of GDP growth and that estimate one year later, in terms of the mean revision and the mean absolute revision, expressed as percentage points. The mean absolute revision figures show that revisions to quarterly GDP in recent years have remained relatively small. Mean revision figures of zero indicate that the revisions to quarterly GDP over the year have largely offset one another. Despite the revisions to quarterly GDP being quite small, efforts to further improve the estimates are ongoing.

    Table 11.3: Revisions to quarterly gross domestic product, percentage change (a)

    Difference between first estimate and estimate one year later
    Reference year     Mean absolute revision (% points)     Mean revision (% points)
    1999–00            0.1                                   -
    2000–01            0.2                                   -
    2001–02            0.2                                   -
    2002–03            0.1                                   -
    2003–04            0.2                                   0.2
    2004–05            0.1                                   -
    2005–06            0.2                                   -
    2006–07 (b)        0.2                                   -
    (a) Seasonally adjusted chain volume measure.
    (b) First three quarters of 2006–07 only.

    Mean absolute revisions to the quarterly current account transactions since 1999–2000 are shown in Table 11.4. The revisions to the current account deficit are expressed in percentage terms, rather than percentage points, as is the case with the revisions to GDP.

    Table 11.4: Revisions to quarterly current account transactions (a)

    Difference between first estimate and estimate one year later
    Reference year     Mean absolute revision (%)     Mean revision (%)
    1999–00            2.3                            0.3
    2000–01            3.4                            -0.3
    2001–02            2.7                            -0.1
    2002–03            1.8                            -0.5
    2003–04            1.2                            0.3
    2004–05            1.7                            0.3
    2005–06            1.1                            -0.6
    2006–07 (b)        1.6                            0.4
    (a) Seasonally adjusted data.
    (b) First three quarters of 2006–07 only.

    TIMELINESS

    The timeliness of statistical information can be measured by the gap between the reference period (the period the data relate to) and the date of release of results. The ABS continues to adhere to pre-announced release dates and make improvements, where possible, to the timeliness achieved. Tables 11.5 and 11.6 present information on the timeliness for ABS monthly and quarterly tabular data for main economic indicator statistics, and other general releases. Table 11.7 reports on the timeliness of confidentialised unit record files (CURFs).

    The high standard of timely release of statistical tables was maintained in 2007–08, with elapsed times between the end of the reference period and publication release similar to those in previous years.

    Table 11.5: Time between end of reference period and release of tabular data (average number of elapsed days) (a)

                       Main economic indicator tabular statistics    Other general tabular indicator statistics
    Year of Release    Monthly        Quarterly                      Monthly        Quarterly
    2001–02            29             51                             34             78
    2002–03            28             49                             33             74
    2003–04            29             51                             26             85
    2004–05            29             51                             25             75
    2005–06            30             52                             24             87
    2006–07            31             51                             22             83
    2007–08            31             50                             23             84
    (a) Where a publication or spreadsheet has been reissued, the reissue date is used in the calculation of the average.

    Table 11.6: Time between end of reference period and release of tabular data for selected publications

    Publication                                                                                 Frequency     Average number of elapsed days (a)
    Retail Trade, Australia (cat. no. 8501.0)                                                   Monthly       34
    Building Approvals, Australia (cat. no. 8731.0)                                             Monthly       35
    Labour Force, Australia (cat. no. 6202.0)                                                   Monthly       11
    Consumer Price Index, Australia (cat. no. 6401.0)                                           Quarterly     24
    Australian National Accounts: National Income, Expenditure and Product (cat. no. 5206.0)    Quarterly     66
    Australian Demographic Statistics (cat. no. 3101.0)                                         Quarterly     169
    (a) Average is taken over the most recent year’s releases.

    The timeliness of release of information depends on a number of factors, including the amount and complexity of information being collected, the source of the data (for example, whether directly collected or sourced from administrative records), and the amount of processing or validation of the information required before release. The timeliness can also vary over the year, particularly in March/April and December/January due to the concentration of public holidays at this time.

    For example, labour force statistics are released very quickly after the end of the reference month. Interviews are generally conducted in weeks 2 and 3 of a given month. Respondents are asked to report for a set ‘reference week’, i.e. the previous week. This means the data collection is completed before the end of the reference month, and labour force statistics can be released in a timely manner.

    In contrast, for demographic statistics on Australia’s population, the quarterly changes to population statistics are based on a variety of administrative sources, such as registrations of births and deaths, passenger cards completed at Australia’s borders, and modelled estimates of interstate migration (using information from Medicare card registration address changes, delayed by three months as registration often takes place after the actual move). It takes around five months before estimates can be published due to the time needed to acquire and process the administrative data, particularly with the delay of three months for the Medicare card data.
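
    For background, quarterly population estimates of this kind are typically compiled with a demographic balancing equation; a simplified, illustrative form (not the ABS's exact compilation formula) is

    \[ P_{t+1} = P_t + (B_t - D_t) + \mathrm{NOM}_t + \mathrm{NIM}_t, \]

    where \(P_t\) is the estimated resident population at the start of the quarter, \(B_t\) and \(D_t\) are births and deaths during the quarter, \(\mathrm{NOM}_t\) is net overseas migration (drawing on passenger card data) and \(\mathrm{NIM}_t\) is net interstate migration for state and territory estimates (modelled partly from Medicare address changes). Each of these administrative inputs arrives with its own lag, which is why around five months elapse before the quarterly estimates can be published.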

    The elapsed time between the end of the reference period and the supply of the CURF data has improved significantly in recent years, as can be seen from the average number of elapsed days in Table 11.7. Each CURF release relates to a particular survey topic and may include both basic and expanded CURFs, counted as a single release. More information on CURFs can be found in Chapter 12, Communication of statistics.

    Table 11.7: Time between end of reference period and release of CURFs

    Reference year     Number of CURFs released     Average number of elapsed days
    2002–03            5                            724
    2003–04            2                            548
    2004–05            7                            375
    2005–06            7                            353
    2006–07 (a)        5                            325
    (a) Further microdata from the 2006–07 reference year are still to be released, which would increase the number of 2006–07 microdata released and increase the average number of elapsed days.

    ACCESSIBILITY

    The accessibility of statistical information refers to the ease with which it can be referenced. This includes the ease with which the existence of information can be ascertained, as well as the suitability of the form or medium through which the information can be accessed. The cost of the information may also be an aspect of accessibility for some users. More information on the accessibility of statistical information can be found in Chapter 12, Communication of statistics.

    All statistics on the ABS website can now be accessed free of charge. This change means all publications, spreadsheets and census data on the website are now available without cost to any member of the public with Internet access. However, people who require paper copies of publications, information on CD-ROM, or information more detailed than that published, will be charged under the ABS pricing policy.

    CURFs are a product that allows approved researchers with a valid statistical purpose to access individual survey responses. The data files are confidentialised and access is carefully controlled to ensure that no individual or organisation can be identified. The ABS has worked to improve the accessibility of its information by increasing the number of CURFs released, with more than 110 CURFs released to date (inclusive of both basic and expanded CURFs). The ABS has also continued work on improving the accessibility of CURFs through the ABS Remote Access Data LaboratoryTM (RADLTM).

    The ABS has made a version of SEASABS (seasonal adjustment software) available for release, enabling users to undertake their own seasonal adjustment. In 2007–08, use of SEASABS by Australian Government and state/territory government agencies expanded, and the software was also rolled out in the private sector.

    INTERPRETABILITY

    The interpretability of statistical information reflects the availability of the supplementary information and metadata necessary to interpret and utilise it appropriately. This information normally covers the availability and clarity of metadata, including concepts, classifications and measures of accuracy. Interpretability also includes appropriate presentation of the data.

    ABS releases are accompanied by extensive explanatory notes to aid the interpretation of statistical information. A range of material is also available on the ABS website detailing the methods, classifications, concepts and standards used by the ABS. For the first time, in 2007–08, releases have also been accompanied by quality declarations to assist users in determining whether the information is suitable for their needs.

    A number of ABS publications combine, compare and contrast statistics from different sources, to help users interpret how changes in one aspect of the economy or society can impact on other aspects. Examples include Measures of Australia’s Progress, 2008 (cat. no. 1383.0.55.001), Australian Economic Indicators, May 2008 (cat. no. 1350.0) and Australian Social Trends, 2007 (cat. no. 4102.0). For more information on these publications and other analytical work undertaken by the ABS to assist in the interpretation of statistics, see Chapter 13, Extended analysis of statistics.

    COHERENCE

    The coherence of statistical information reflects the degree to which it can be successfully brought together with other statistical information, within a broad analytical framework and over time. Coherence encompasses the internal consistency of a collection as well as its comparability, both over time and with other data sources. The use of standard concepts, classifications and target populations promotes coherence, as does the use of common methodology across surveys. For example, estimates of interstate trade published in Qld Stats (cat. no. 1318.3) are moving to the same frame of businesses used by other economic surveys, leading to improved comparability with other ABS statistics.

    Coherence of ABS outputs requires the use of nationally and internationally agreed concepts and classifications. Standard concepts and classifications are used extensively within the ABS, and also promoted to other producers of statistical information in Australia. Information on statistical standards, concepts, classifications and methodologies is readily accessible through the ABS website. For more information see Chapter 14, Statistical standards and infrastructure.

    The Statistical Clearing House (SCH) provides approval to conduct surveys that are directed to 50 or more businesses and that are conducted by, or on behalf of, any Australian Government agency, to ensure that surveys are necessary, well designed, and place minimal burden on business respondents. One of the criteria used by the SCH is the coherence of the statistical information that will be produced. In particular, surveys are assessed on their use of standard methodologies, concepts and classifications, their consistency with past or future surveys, and the extent to which outputs can be compared, or jointly used, with other sources of data. For more information about the SCH see Chapter 9, Engagement with users and producers of statistics.

    In the 2006–07 reference year, with the survey conducted in late 2007, a number of changes to ABS annual economic surveys affected coherence. The Annual Integrated Collection now comprises a number of annual surveys conducted using a common methodology, resulting in increased comparability between the component surveys. However, the sample design change, which also encompasses the updated Australian and New Zealand Standard Industrial Classification, 2006 (cat. no. 1292.0), reduces coherence over time. Change measurement strategies are in place to assess the impact of the new methodology and classification (Information Paper: ANZSIC 2006 Implementation, 2006, ABS cat. no. 1295.0). Multiple cycles of the collection are being run and published under both the old and new methodologies, to ensure the change is transparent and treated appropriately in producing time series estimates.



