Coordination of Health Care Study: Use of Hospitals and Emergency Departments, Australia methodology

Reference period: 2015-16 financial year
Released: 21/11/2019

Explanatory notes

Introduction

1 This publication presents initial results from the third stage of the Coordination of Health Care Study (the Study) which links information on state and territory hospitalisations and emergency department presentations to the 2016 Survey of Health Care.

2 The Coordination of Health Care Study is funded by the Australian Institute of Health and Welfare (AIHW) and is jointly conducted by the Australian Bureau of Statistics (ABS) and the AIHW.

3 The first stage of the Study, the Survey of Health Care, was conducted throughout Australia in April-June 2016 and presented information on participants’ experiences with health care professionals (for example, general practitioners and specialists) and the broader health care system (for example, diagnostic tests, hospital admissions and emergency department visits). The scope of the Survey was people aged 45 years and over who had at least one general practitioner (GP) visit in the 12 months prior to selection in the Survey (that is, from 24 November 2014 to 24 November 2015). These people were chosen because they are more likely to have complex and chronic conditions, and have experiences with multiple health care providers. These people are collectively referred to as the ‘Study cohort’ in this publication.

4 For the second stage of the Study, consent was sought from participants for the release of their Medicare Benefits Schedule (MBS) and Pharmaceutical Benefits Scheme (PBS) claims information (for the period 1 January 2014 to 30 June 2018) to the ABS for the purpose of linkage to Survey results.

5 For the third stage of the Study, consent was sought from participants for the release of their records for admissions or attendances at hospitals and emergency departments (for the period 1 January 2014 to 30 June 2018) to the ABS for the purpose of linkage to Survey results.

6 Hospital data received for the Study include public hospitalisations in all states and territories and private hospitalisations in New South Wales, Victoria, Queensland and Western Australia. Data on private hospitalisations in South Australia, Tasmania, the Northern Territory and the Australian Capital Territory were not available. See Table 2 for numbers of hospital records received for the Study.

7 Emergency department data received for the Study relate to services provided at public hospitals that meet the criteria for inclusion in the NNAPEDCD (see paragraphs 54 to 57 for more information). Services provided at private hospital emergency departments are not included in the NNAPEDCD. Data on emergency department presentations in the ACT in 2015-16 were not available at the time of publication. See Table 4 for numbers of emergency department records received for the Study.

8 This publication presents estimates of people in the Study cohort by number of hospitalisations and emergency department presentations in 2015-16. Estimates are presented by selected socio-demographic characteristics.

Scope and coverage

9 The scope of the Study is people aged 45 years and over who had at least one GP visit in the 12 months prior to selection in the Survey of Health Care (that is, from 24 November 2014 to 24 November 2015). A GP visit means having a claim against any one of a defined set of MBS item numbers (see Appendix 1 – GP item numbers).

10 The scope of the Study is people in all states and territories. The scope includes:

  • people who were registered to receive Medicare benefits at any time prior to November 2015;
  • people who live in private and non-private dwellings;
  • visitors and diplomats from countries where there is a reciprocal Medicare arrangement; and
  • people who received services through Aboriginal Medical Services.


11 The scope excludes:

  • people who were not registered with Medicare;
  • people who did not have a GP visit in the period 24 November 2014 to 24 November 2015;
  • people whose only GP transactions were not billed through Medicare (for example, services provided by salaried doctors who do not bill Medicare); and
  • people who were in active military service and obtained all their medical services through the military.


12 The sample frame for the Study was the Medicare Enrolment Database (MEDB). The sample was selected from this frame by the Department of Human Services (DHS) in accordance with a stratification and allocation specified by the ABS.

13 Scope was determined by GP visits in the 12 months prior to selection (that is, from 24 November 2014 to 24 November 2015) rather than the 12 months prior to the enumeration period of the Survey of Health Care (April 2016 to June 2016). Some people who saw a GP at least once in the 12 months prior to enumeration were therefore not in scope, because they did not visit a GP between 24 November 2014 and 24 November 2015. Conversely, some people who were in scope because they visited a GP in the 12 months prior to selection did not visit a GP in the 12 months prior to enumeration.

Sample design

14 The Study sample was designed to support estimates at the Primary Health Network (PHN) area level. A stratified random sample was used where the strata were based on the following variables:

  • age groups (five-year groups from 45-79 years of age, then 80 years and over);
  • sex (male and female);
  • PHN area (31 PHNs plus an extra category for unknown PHN);
  • socio-economic category (people were divided into three socio-economic strata ‘low’, ‘medium’ and ‘high’ based on their postcode’s score on the Index of Relative Socio-Economic Advantage and Disadvantage, ‘low’ and ‘high’ being the bottom and top two deciles respectively); and
  • number of GP visits in the 12 months prior to selection (number of GP visits was split into users with 1-11 visits and users with 12 visits or more).


15 People on the frame were assigned a PHN based on the postcode of their postal address as recorded on the MEDB. A correspondence between postal areas and PHNs was used to do this. As not every postcode is included in the ABS postal area classification, there were around 100,000 people who could not be allocated a PHN. At the sample design stage, these people were allocated to an unknown PHN category.

16 There were 8.8 million people in scope on the MEDB. A total sample of around 124,000 people was selected by sorting within stratum by number of GP visits and then applying a skip using a random start.

17 The overall sample design also incorporated a requirement to oversample people who saw a GP 12 or more times, such that the resulting sample consisted of approximately equal numbers of people who saw a GP 12 or more times and people who saw a GP 1-11 times.
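
The selection mechanism described in paragraphs 14 to 17 (stratification, sorting within each stratum, then a fixed skip from a random start, with oversampling of high GP users) can be illustrated with a short sketch. This is an illustration only, using a made-up frame and made-up stratum allocations; the actual MEDB frame, strata and allocations used by the ABS are not reproduced here.

```python
import random

def systematic_sample(stratum_records, n_to_select, sort_key, seed=None):
    """Select n_to_select records from one stratum by sorting on sort_key
    and applying a fixed skip from a random start (systematic sampling)."""
    rng = random.Random(seed)
    ordered = sorted(stratum_records, key=sort_key)
    if n_to_select <= 0 or not ordered:
        return []
    skip = len(ordered) / n_to_select            # sampling interval
    start = rng.uniform(0, skip)                 # random start within the first interval
    positions = [int(start + i * skip) for i in range(n_to_select)]
    return [ordered[p] for p in positions]

# Hypothetical frame records: (person_id, age_group, sex, phn, seifa_band, gp_visits).
frame = [(i, "45-49", "F", "PHN01", "medium", random.randint(1, 30)) for i in range(10_000)]

# Strata are defined by the design variables; here a single illustrative stratum is
# split into the two GP-visit bands so that high users (12+) can be oversampled.
low_users = [r for r in frame if r[5] <= 11]
high_users = [r for r in frame if r[5] >= 12]

# Oversample high users so the selected sample is roughly half from each band.
sample = (systematic_sample(low_users, 50, sort_key=lambda r: r[5], seed=1)
          + systematic_sample(high_users, 50, sort_key=lambda r: r[5], seed=2))
print(len(sample))
```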

Response rates

18 There were around 124,000 people selected for the Study.

19 Of the people selected for the Study, 35,495 responded to the Survey of Health Care, giving a response rate of 28.6%. In this survey it is not possible to distinguish between non-response and sample loss. For example, a person may have been selected to participate but may not have received any survey materials due to an out-of-date address on the MEDB.

20 Of people who responded to the Survey, 25,502 people provided consent for the release of their hospital and emergency department information to the ABS. This represents a response rate of 20.6% from the 124,000 people initially selected for the Survey.

21 The table below contains response rates by the state or territory that the person was selected in. Persons selected in the unknown PHN category have unknown state for selection. In outputs from the Study, respondents are placed into the geographic regions (e.g. state, PHN) and SEIFA decile that correspond to their reported home postcode.

Table 1: Coordination of health care study response rates

                               NSW     Vic.     Qld      SA      WA    Tas.      NT     ACT  Unknown    Aust.
Approached sample (no.)     38,248  22,892  27,415   7,825  11,161   3,940   3,411   3,843    5,337  124,072

Survey of Health Care
 Responding sample (no.)    10,738   6,983   7,755   2,632   3,351   1,384     651   1,305      696   35,495
 Response rate (%)            28.1    30.5    28.3    33.6    30.0    35.1    19.1    34.0     13.0     28.6

Participants who consented to use of hospital/ED information(a)
 Responding sample (no.)     7,866   4,993   5,589   1,851   2,580   1,027     556   1,040      . .   25,502
 Response rate (%)            20.6    21.8    20.4    23.7    23.1    26.1    16.3    27.1      . .     20.6

Participants who consented to use of hospital/ED information and MBS claims information(a)
 Consenting sample (no.)     5,273   3,074   3,780   1,088   1,526     628     327     785      . .   16,481
 Response rate (%)            13.8    13.4    13.8    13.9    13.7    15.9     9.6    20.4      . .     13.3

Participants who consented to use of hospital/ED information and MBS and PBS claims information(a)
 Consenting sample (no.)     4,881   2,845   3,494     997   1,397     586     303     729      . .   15,232
 Response rate (%)            12.8    12.4    12.7    12.7    12.5    14.9     8.9    19.0      . .     12.3

a. For hospital/emergency department data included in the Study, persons with an unknown PHN category for selection were allocated to a state or territory based on their home postcode.

 

Data collection

22 Survey of Health Care data and MBS, PBS and hospital/ED consent information were collected by mail. To maximise response, a four-stage mail-out approach was used. The four stages consisted of:

  • a DHS cover letter, a Primary Approach Letter and a paper introducing respondents to the study, translated into 10 languages;
  • a DHS cover letter, the Survey of Health Care form, a Consent Form for Release of Hospital Data, a Consent Form for Release of Department of Human Services Data, a translation paper, a brochure and a reply paid envelope;
  • a DHS cover letter, a reminder/thank you postcard and a translation paper. This wave was despatched only to people who had not returned a survey form and had not contacted the ABS to refuse participation in the study as of 26 April 2016; and
  • a replication of stage 2, despatched only to those who had neither returned a survey form nor made contact with the ABS as of 26 April 2016.


23 In each stage of the mail-out, a cover letter from the DHS was included, explaining that the DHS had not provided the ABS with any personal details of the selected person.

24 People with low English proficiency, or who had a disability which prevented them from completing the survey on their own, were able to complete the survey over the phone. People with low English proficiency were offered the option of an interpreter from the Translation and Interpreting Service (TIS National) who could facilitate a phone call with the ABS and translate as an ABS officer provided information or collected the participant's data over the phone.

25 De-identified hospital (that is, admitted patient) and emergency department information was provided to the ABS for consenting participants through the following data linkage process:

  • ABS provided a study specific identifier and personal information for consenting participants to state and territory data linkage units, who undertook data linkage to match identifying information to admitted patient and emergency department records for the period 1 January 2014 to 30 June 2018. These de-identified linked admitted patient and emergency department records and study specific identifiers were provided to ABS directly or via AIHW depending on the state or territory:
     
    • admitted patient and emergency department records for South Australia and admitted patient records for Queensland were provided directly to the ABS; and
    • for all other state and territory data, the state or territory data custodian mapped de-identified admitted patient and emergency department records to National Hospital Morbidity Database (NHMD) and National Non-Admitted Patient Emergency Department Care Database (NNAPEDCD) records held by AIHW. AIHW then extracted the relevant admitted patient records from the NHMD and emergency department records from the NNAPEDCD, and provided these to ABS.
       

Weighting, benchmarks and estimation

Weighting

26 Weighting is the process of adjusting results from a sample survey to infer results for the total 'in scope' population. To do this, a 'weight' is allocated to each enumerated person. The weight is a value which indicates the number of people in the population represented by the sample person.

27 For information on calculation of weights used in the Survey of Health Care component of the Study, see paragraphs 18-22 of the Explanatory Notes of Survey of Health Care, Australia, 2016 (cat. no. 4343.0). Weights for MBS and PBS data were calculated in a similar manner to the Survey of Health Care component, with additional benchmarks as specified in paragraph 29 of the Explanatory Notes of Coordination of Health Care Study: Use of Health Services and Medicines, Australia, 2015-16 (cat. no. 4343.0.55.001).

28 For the third stage of the Study, three sets of weights were created corresponding to different combinations of consent provided by Study participants for the different collections included in the Study. These are:

  • hospital weight – applicable to the 25,502 Study participants who consented to the use of their hospital/ED information;
  • hospital-MBS weight – applicable to the 16,481 Study participants who consented to the use of their hospital/ED information and MBS claims information; and
  • hospital-MBS-PBS weight – applicable to the 15,232 Study participants who consented to the use of their hospital/ED information and MBS and PBS claims information.
     

Benchmarks

29 Weights are calibrated against population benchmarks such that estimates conform to the distribution of the MEDB population rather than the distribution within the sample itself. Calibration to population benchmarks helps to compensate for over or under-enumeration of particular categories of people/households which may occur due to either the random nature of sampling or non-response.

30 The Survey of Health Care includes weights benchmarked to counts of the in-scope population at November 2015 from the MEDB for PHNs (based on postal address postcode) by sex by 10 year age groups (to age 75 and over).

31 For MBS and PBS claims information integrated with the Survey of Health Care, additional benchmarks were included in calibration to improve estimates of the use of MBS services and PBS medicines. These benchmarks were the number of people who received GP, specialist, operation or other allied health services in 2015-16. See paragraph 29 of the Explanatory Notes of Coordination of Health Care Study: Use of Health Services and Medicines, Australia, 2015-16 (cat. no. 4343.0.55.001) for more information.

32 For the hospital weight described in paragraph 28, an additional benchmark is included in calibration to improve estimates of hospitalisations. This benchmark is the estimated total number of people in New South Wales, Victoria and Queensland with a private hospitalisation in 2015-16, derived from the hospital-MBS weight.
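
Calibration adjusts initial weights so that weighted counts reproduce known benchmark totals (paragraphs 29 to 32). The sketch below shows one simple calibration method, iterative proportional fitting (raking), applied to a made-up respondent file and made-up benchmarks; the ABS's actual calibration software, benchmark variables and weight structure are not reproduced here.

```python
import pandas as pd

def rake_weights(df, weight_col, benchmarks, max_iter=50, tol=1e-6):
    """Iteratively scale weights so weighted counts match each benchmark margin.

    benchmarks: dict mapping a column name to a dict of {category: benchmark total}.
    This is iterative proportional fitting (raking), one simple form of calibration.
    """
    w = df[weight_col].astype(float).copy()
    for _ in range(max_iter):
        max_shift = 0.0
        for col, targets in benchmarks.items():
            for category, target in targets.items():
                mask = df[col] == category
                current = w[mask].sum()
                if current > 0:
                    factor = target / current
                    w[mask] *= factor
                    max_shift = max(max_shift, abs(factor - 1))
        if max_shift < tol:
            break
    return w

# Hypothetical respondent file with design weights.
resp = pd.DataFrame({
    "sex":      ["M", "M", "F", "F", "F", "M"],
    "age_band": ["45-54", "55-64", "45-54", "65+", "55-64", "65+"],
    "weight":   [100.0, 120.0, 90.0, 110.0, 95.0, 105.0],
})

# Hypothetical benchmark totals for the in-scope population.
benchmarks = {
    "sex":      {"M": 400_000, "F": 420_000},
    "age_band": {"45-54": 260_000, "55-64": 280_000, "65+": 280_000},
}

resp["cal_weight"] = rake_weights(resp, "weight", benchmarks)
print(resp.groupby("sex")["cal_weight"].sum())   # approximately matches the sex benchmarks
```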

Estimation

33 Estimates of counts of people are obtained by summing the weights of people with the characteristic of interest. In the Study the different responding/consenting sample groups (see paragraphs 21 and 28) each weight up to the in scope population of 8.8 million people aged 45 years and over who had at least one GP visit in the 12 months between November 2014 and November 2015.

34 The Survey of Health Care weights sum the responding survey sample of 35,495 people to the 8.8 million in scope population.

35 The hospital weights sum the 25,502 people who consented to use of their hospital/ED information to the 8.8 million in scope population. Estimates for hospitalisations and emergency department presentations in this publication were derived using the hospital weight.

36 The hospital-MBS weights sum the 16,481 people who consented to use of their hospital/ED information and MBS claims information to the 8.8 million in scope population.

37 The hospital-MBS-PBS weights sum the 15,232 people who consented to use of their hospital/ED information and MBS and PBS claims information to the 8.8 million in scope population.
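
Estimation then reduces to summing weights over the people with the characteristic of interest (paragraphs 33 to 37). A minimal sketch on hypothetical weights:

```python
import pandas as pd

# Hypothetical consenting-participant file: one row per person, with the
# relevant weight and a flag for the characteristic of interest.
people = pd.DataFrame({
    "hospital_weight":   [310.5, 275.0, 402.8, 188.3, 520.1],
    "hospitalised_2016": [True, False, True, False, False],
})

# Estimated number of in-scope people with at least one hospitalisation:
# sum the weights of people with the characteristic.
estimate = people.loc[people["hospitalised_2016"], "hospital_weight"].sum()

# Estimated proportion: the ratio of two weighted counts.
proportion = estimate / people["hospital_weight"].sum()
print(round(estimate, 1), round(100 * proportion, 1))
```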

Reliability of estimates

38 All sample surveys are subject to error which can be broadly categorised as either:

  • sampling error; and
  • non-sampling error.


39 Sampling error is the difference between the estimate derived from a sample of people, and the value that would have been produced if all people in scope of the survey had been included. For more information refer to the Technical Note.

40 In this publication, estimates with an RSE of 25% to 50% are preceded by an asterisk (e.g. *3.4) to indicate that the estimate has a high level of sampling error relative to the size of the estimate, and should be used with caution. Estimates with an RSE of 50% or more are indicated by a double asterisk (e.g. **0.6) and are generally considered too unreliable for most purposes.

41 Margins of Error (MoE) are provided for proportions to assist users in assessing the reliability of these data. The proportion combined with the MoE defines a range which is expected to include the true population value with a given level of confidence. This is known as the confidence interval. Users should consider this range when making decisions based on the proportion. Proportions with a MoE of greater than 10 percentage points are preceded by a hash (e.g. #40.1) to indicate that the range in which the true population value is expected to lie is relatively wide.
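
The annotation rules in paragraphs 40 and 41 can be expressed as a simple check on each published cell. The thresholds below follow those paragraphs; the function names and output formatting are illustrative only.

```python
def flag_estimate(estimate, rse):
    """Annotate an estimate according to its relative standard error (RSE, %)."""
    if rse >= 50:
        return f"**{estimate}"   # generally too unreliable for most purposes
    if rse >= 25:
        return f"*{estimate}"    # high sampling error; use with caution
    return str(estimate)

def flag_proportion(proportion, moe):
    """Annotate a proportion whose 95% margin of error exceeds 10 percentage points."""
    return f"#{proportion}" if moe > 10 else str(proportion)

print(flag_estimate(3.4, 30))     # *3.4
print(flag_estimate(0.6, 55))     # **0.6
print(flag_proportion(40.1, 12))  # #40.1
```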

42 Non-sampling error may occur in any collection, whether it is based on a sample or a full count of the population such as a census. Sources of non-sampling error include: non-response; errors in reporting by respondents or recording of answers by interviewers; and errors in coding and processing data. Every effort was made to reduce the non-sampling error by: careful design and testing of the questionnaire; follow-up of respondents; and extensive editing and quality control procedures at all stages of data processing.

43 Non-response bias occurs where non-respondents may have different characteristics from those who did respond. While the Study is potentially affected by non-response bias, it is not possible to reliably quantify this. The magnitude of any bias depends on the rate of non-response and the extent of the differences in characteristics between those people who responded to the survey and those who did not. See Table 1.2 of the Explanatory Notes of Survey of Health Care, Australia, 2016 (cat. no. 4343.0) for an assessment of results from the Survey of Health Care component of the Study with other ABS collections.

Data quality

Hospital data

44 For New South Wales, Victoria, Western Australia, Tasmania, the Northern Territory and the Australian Capital Territory, hospital data (that is, admitted patient data) provided to the ABS for the Study were sourced from the AIHW National Hospital Morbidity Database (NHMD). For Queensland and South Australia, data were sourced from the Queensland Hospital Admitted Patient Data Collection (QHAPDC) and the South Australia Public Hospital Separations dataset respectively.

45 Hospital data in the Study include public hospitalisations in all states and territories and private hospitalisations in New South Wales, Victoria, Queensland and Western Australia. Data on private hospitalisations in South Australia, Tasmania, the Northern Territory and the Australian Capital Territory were not available. The following table presents the number of hospital records received for the Study at the time of publication.

Table 2 - Number of hospital records received for coordination of health care study, public and private hospitals

                                  1 January 2014 to 30 June 2017(a)           2015-16(b)
State/territory of hospital        Public   Private     Total       Public   Private     Total
New South Wales                    12 640    13 672    26 312        3 671     4 239     7 910
Victoria                            8 731     6 086    14 817        2 467     1 871     4 338
Queensland                          9 125    11 069    20 194        2 848     3 426     6 274
South Australia                     4 026        na     4 026        1 136        na     1 136
Western Australia                     (c)       (c)    11 035          (c)       (c)     3 183
Tasmania                            1 544        na     1 544          413        na       413
Northern Territory                    180        na       180           59        na        59
Australian Capital Territory        1 557        na     1 557          488        na       488
Australia                             . .       . .    79 665          . .       . .    23 801

na not available
a. Records received with a hospital separation date between 1 January 2014 and 30 June 2017. Records for 1 July 2017 to 30 June 2018 are scheduled to be received in early 2020.
b. Records received with a hospital separation date between 1 July 2015 and 30 June 2016. Data in this publication are based on these records.
c. Records received for Western Australia include hospitalisations occurring in public and private hospitals; however, information about whether each hospitalisation occurred in a public or private hospital was not available for the Study.
 


46 Based on AIHW Admitted Patient Care 2015-16 hospital statistics¹, private hospitalisations in South Australia, Tasmania, the Northern Territory and the Australian Capital Territory accounted for around 5% of all hospitalisations in 2015-16 for people aged 45 years and over. This implies that around 95% of all hospitalisations of people aged 45 years and over in Australia were in scope of the Study.

47 The following table presents, for people who consented to use of their hospital information for the Study, the proportion who had at least one hospitalisation in 2015-16 and the proportion who self-reported in the 2016 Survey of Health Care being admitted to hospital in the last 12 months, by state or territory of usual residence.

Table 3 - Persons aged 45 years and over who consented to use of hospital information(a), linked and self-reported estimates

                                                   Proportion of people (%)
State/territory of usual residence    Had at least one hospitalisation    Self-reported being admitted to
                                      in 2015-16(b)                       hospital in last 12 months(c)
New South Wales                                    26.1                                22.2
Victoria                                           21.1                                21.5
Queensland                                         29.0                                21.6
South Australia                                    13.7                                23.9
Western Australia                                  31.7                                23.1
Tasmania                                           14.4                                22.7
Northern Territory                                  4.0                                21.1
Australian Capital Territory                       13.1                                20.6
Australia                                          24.4                                22.1

a. Persons aged 45 years and over who had at least one GP visit in the 12 months between 24 November 2014 and 24 November 2015 and who consented to use of their hospital information for the Study (25,502 persons).
b. Excludes data on private hospitalisations in South Australia, Tasmania, the Northern Territory and the Australian Capital Territory. These data were not available for the Study.
c. As reported in the 2016 Survey of Health Care.
 


48 There are a number of issues that should be considered when interpreting the proportion of people in the Study cohort in each state or territory who had at least one hospitalisation in 2015-16:

  • data on hospitalisations were not available for private hospitals in South Australia, Tasmania, the Northern Territory and the Australian Capital Territory;
  • people may be admitted to hospital in a state or territory other than their state or territory of usual residence. However, most hospitalisations within a state or territory are for people resident in that state or territory;
  • differing levels of bias between states and territories in the sample of people in the Study cohort who consented to use of their hospital information may affect estimates. For example, if a particular group of people is more (or less) likely to have consented, and their patterns of hospitalisation differ from those of other groups (for example, older people are more likely to go to hospital than younger people), then this may affect results. While weighting methods address this issue, they are not able to correct for all possible instances of over or under representation; and
  • some groups of records may be more likely to link, or conversely less likely to link, than other groups of records. This may result in over representation of some groups and under representation of others. Information on linkage rates for the Study was insufficient to provide an assessment of the extent of under or over representation of groups within the Study cohort. While weighting methods address this, they are not able to correct for all possible instances of over or under representation.


49 While differences between states and territories could be expected to exist due to factors such as different age structures and models of care between states and territories, these are confounded by the issues noted above. Comparisons between states and territories of estimates and proportions of people in the Study cohort who had at least one hospitalisation in 2015-16 are not recommended.

50 Across states and territories there are some differences between the proportion of people who had at least one hospitalisation in 2015-16 and the proportion of people who self-reported in the 2016 Survey of Health Care being admitted to hospital in the last 12 months (see Table 3), which may partially be explained by the issues noted in paragraph 48.

51 For example, 13.7% of people in the Study cohort living in South Australia had at least one hospitalisation in 2015-16 while 23.9% self-reported that they had been admitted to hospital in the last 12 months. It is likely that this difference is due, in part, to the exclusion of private hospitalisations in South Australia from the Study, although it is not possible to conclusively say so. If information on private hospitalisations were available the difference in the two figures could be expected to be smaller.

52 By contrast, 26.1% of people in the Study cohort living in New South Wales had at least one hospitalisation in 2015-16, while 22.2% self-reported that they had been admitted to hospital in the last 12 months.

53 For the Northern Territory, 4.0% of people in the Study cohort had at least one hospitalisation in 2015-16 compared with 21.1% who self-reported that they had been admitted to hospital in the last 12 months. The magnitude of this difference is not likely to be explained by the exclusion of private hospitalisations in the Northern Territory from the Study. Hospital data for the Northern Territory in this publication should therefore be treated with considerable caution.

Emergency department data

54 For New South Wales, Victoria, Queensland, Western Australia, Tasmania, the Northern Territory and the Australian Capital Territory, emergency department data provided to the ABS for the Study were sourced from the AIHW National Non-admitted Patient Emergency Department Care Database (NNAPEDCD). For South Australia, data were sourced from the South Australia Public Hospital Emergency Department dataset.

55 The following table presents the number of emergency department records received for the Study at the time of publication.

Table 4 - Number of emergency department records received for coordination of health care study

State/territory of hospital         1 January 2014 to 30 June 2017(a)    2015-16(b)
New South Wales                                  12 011                     3 404
Victoria                                          4 696                     1 374
Queensland                                        5 756                     1 671
South Australia                                   1 827                       491
Western Australia                                 3 402                     1 022
Tasmania                                            977                       289
Northern Territory                                  224                        79
Australian Capital Territory(c)                   1 130                        na
Australia                                        30 023                     8 330

na not available
a. Records received with an emergency department presentation date of 1 January 2014 to 30 June 2017 inclusive. Records for 1 July 2017 to 30 June 2018 are scheduled to be received in early 2020.
b. Records received with an emergency department presentation date of 1 July 2015 to 30 June 2016 inclusive. Data in this publication are based on these records.
c. Data for the ACT excludes records for 2015-16 which were not available at the time of publication.
 


56 The NNAPEDCD provides information on the care provided for non-admitted patients registered for care in emergency departments in public hospitals where the emergency department meets the following criteria:

  • a purposely designed and equipped area with designated assessment, treatment, and resuscitation areas;
  • the ability to provide resuscitation, stabilisation, and initial management of all emergencies;
  • availability of medical staff in the hospital 24 hours a day; and
  • designated emergency department nursing staff 24 hours per day 7 days per week, and a designated emergency department nursing unit manager.
     

57 Services provided at private hospital emergency departments, or by public hospitals which do not have an emergency department that meets the criteria in paragraph 56, are not included in the NNAPEDCD. In 2013-14, coverage of the NNAPEDCD was about 88% of all public hospital emergency department occasions of service reported to the National Public Hospital Establishments Database (NPHED).

58 The following table presents, for people who consented to use of their emergency department information for the Study, the proportion who had at least one emergency department presentation in 2015-16 and the proportion who self-reported in the 2016 Survey of Health Care that they went to an emergency department in the last 12 months, by state or territory of usual residence.

Table 5 - Persons aged 45 years and over who consented to use of emergency department information(a), linked and self-reported estimates

                                                   Proportion of people (%)
State/territory of usual residence    Had at least one emergency            Self-reported having been to an emergency
                                      department presentation in 2015-16    department in last 12 months(b)
New South Wales                                    17.3                                18.0
Victoria                                           12.3                                17.2
Queensland                                         14.1                                20.0
South Australia                                    12.0                                19.2
Western Australia                                  16.1                                19.0
Tasmania                                           14.9                                18.1
Northern Territory                                  5.8                                22.5
Australian Capital Territory(c)                     2.2                                19.7
Australia                                          14.5                                18.4

a. Persons aged 45 years and over who had at least one GP visit in the 12 months between 24 November 2014 and 24 November 2015 and who consented to use of their emergency department information for the Study (25,502 persons).
b. As reported in the 2016 Survey of Health Care.
c. Data on emergency department presentations in the ACT in 2015-16 were not available at the time of publication. This figure therefore relates to people resident in the ACT who had at least one emergency department presentation in 2015-16 in a state or territory other than the ACT.
 


59 As with the hospital data presented in this publication, there are a number of issues that should be considered when interpreting the proportion of people in the Study cohort in each state or territory who had at least one emergency department presentation in 2015-16:

  • the NNAPEDCD has incomplete coverage of emergency or urgent care provided by hospitals as the scope is restricted to formal emergency departments (see paragraphs 56 and 57), and this coverage varies by state or territory;
  • differing levels of bias between states and territories in the sample of people in the Study cohort who consented to use of their emergency department information may affect estimates. For example, if a particular group of people is more (or less) likely to have consented, and their patterns of use of emergency departments differ from those of other groups (for example, older people are more likely to go to an emergency department than younger people), then this may affect results. While weighting methods address this issue, they are not able to correct for all possible instances of over or under representation;
  • some groups of records may be more likely to link, or conversely less likely to link, than other groups of records. This may result in over representation of some groups and under representation of others. Information on linkage rates for the Study was insufficient to provide an assessment of the extent of under or over representation of groups within the Study cohort. While weighting methods address this, they are not able to correct for all possible instances of over or under representation; and
  • data on emergency department presentations in the Australian Capital Territory in 2015-16 were not available at the time of publication.


60 While differences between states and territories could be expected to exist due to factors such as different age structures and models of care between states and territories, these are confounded by the issues noted above. Comparisons between states and territories of estimates and proportions of people in the Study cohort who had at least one emergency department presentation in 2015-16 are not recommended.

61 Across states and territories there are some differences between the proportion of people who had at least one emergency department presentation in 2015-16 and the proportion of people who self-reported in the 2016 Survey of Health Care that they had been to an emergency department in the last 12 months (see Table 5), which may partially be explained by the issues identified in paragraph 59.

62 For example, 12.3% of people in the Study cohort living in Victoria had at least one emergency department presentation in 2015-16, while 17.2% of people living in Victoria self-reported that they had been to an emergency department in the last 12 months.

63 For the Northern Territory, 5.8% of people in the Study cohort had at least one emergency department presentation in 2015-16, compared with 22.5% of people who self-reported that they had gone to an emergency department in the last 12 months. Given the higher proportions of people who had at least one emergency department presentation in 2015-16 in the other states (ranging from 12.0%-17.3%), and given their smaller differences with self-reported data, emergency department presentation data for the Northern Territory in this publication should be treated with considerable caution.

64 Data on emergency department presentations in the Australian Capital Territory in 2015-16 were not available at the time of publication. Emergency department data in this publication for the Australian Capital Territory therefore relate to people resident in the Australian Capital Territory who had at least one emergency department presentation in 2015-16 in a state or territory other than the Australian Capital Territory, and are not comparable with data for other states and territories.

Confidentiality

65 The Census and Statistics Act 1905 provides the authority for the ABS to collect statistical information, and requires that statistical output shall not be published or disseminated in a manner that is likely to enable the identification of a particular person or organisation. This requirement means that the ABS must ensure that statistical information about individual respondents cannot be derived from published data.

66 Perturbation is used in this publication to minimise the risk of identifying individuals in aggregate statistics. Perturbation involves a small random adjustment of the statistics and is considered the most satisfactory technique for avoiding the release of identifiable statistics while maximising the range of information that can be released. These adjustments have a negligible impact on the underlying pattern of the statistics. After perturbation, a given published cell value will be consistent across all tables. However, adding up cell values to derive a total will not necessarily give the same result as published totals.
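
Paragraph 66 describes perturbation only in general terms, and the ABS's actual confidentiality algorithm is not documented here. The sketch below is a generic illustration of the idea that a small, reproducible adjustment keyed to a cell means the same cell is perturbed consistently wherever it appears; it is not the method used for this publication.

```python
import hashlib

def perturb_cell(count, cell_key, max_adjust=2):
    """Apply a small, deterministic pseudo-random adjustment to a cell count.

    The adjustment is keyed to the cell's definition, so the same cell is
    perturbed identically in every table it appears in. This is a generic
    illustration, not the ABS's confidentiality algorithm.
    """
    digest = hashlib.sha256(cell_key.encode()).digest()
    adjustment = digest[0] % (2 * max_adjust + 1) - max_adjust  # in [-max_adjust, max_adjust]
    return max(0, count + adjustment)

# The same cell (same key) always receives the same adjustment across tables.
print(perturb_cell(143, "NSW|hospitalised|45-54"))
print(perturb_cell(143, "NSW|hospitalised|45-54"))
```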

Classifications

67 Geographic classifications were applied to the survey data based on the respondent’s reported home postcode, using correspondences between the geography of interest and ABS Postal Area geography.

68 Standard ABS Geographies were classified according to the Australian Statistical Geography Standard (ASGS): Volume 1 - Main Structure and Greater Capital City Statistical Areas, July 2016 (cat. no. 1270.0.55.001).

69 Remoteness Areas are classified according to the Australian Statistical Geography Standard (ASGS): Volume 5 - Remoteness Structure, July 2016 (cat. no. 1270.0.55.005).

70 Primary Health Networks (PHNs) are a classification developed by the Department of Health; see Primary Health Networks in the Glossary. The correspondence between PHN and ABS Postal Area geography was used to relate a person’s postcode as listed on the MEDB to a PHN.

71 Where a postcode crossed a PHN boundary, the entire postcode was allocated to the PHN containing the largest proportion of the postcode's population. An exception was made where a postcode crossed a state boundary; in this case, individuals were manually coded to the state they reported as their address.
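
The allocation described in paragraphs 70 and 71 is effectively a lookup from postal area to the PHN holding the largest share of that postal area's population. A minimal sketch, using made-up correspondence shares (real correspondences are published separately by the ABS):

```python
# Hypothetical postal-area-to-PHN correspondence: for each postcode, the share of
# its population in each PHN. The postcodes, PHN labels and shares are illustrative.
correspondence = {
    "2600": {"PHN-ACT": 0.97, "PHN-NSW-SouthEast": 0.03},
    "2620": {"PHN-NSW-SouthEast": 0.55, "PHN-ACT": 0.45},
}

def allocate_phn(postcode):
    """Allocate an entire postcode to the PHN with the largest population share."""
    shares = correspondence.get(postcode)
    if shares is None:
        return "Unknown PHN"   # postcode not in the postal area classification
    return max(shares, key=shares.get)

print(allocate_phn("2620"))    # PHN-NSW-SouthEast
print(allocate_phn("9999"))    # Unknown PHN
```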

Products and services

    Data cubes

    72 Data cubes containing tables in Excel spreadsheet format can be found on the ABS website (from the Data downloads). These present tables of estimates and proportions and their corresponding relative standard errors (RSEs) and margins of error (MoE).

    Customised data requests

    73 Special tabulations of the data are available on request. Subject to confidentiality and sampling variability constraints, tabulations can be produced from the survey incorporating data items, populations and geographic areas (including state and territory level data), tailored to individual requirements. These are provided in electronic form.

    74 For further information about these and related statistics, contact the National Information and Referral Service on 1300 135 070, or email client.services@abs.gov.au. The ABS Privacy Policy outlines how the ABS will handle any personal information that you provide to us.

    Related publications

    75 Other related publications which may be of interest to users include:

    • Survey of Health Care, Australia, 2016 (ABS cat. no. 4343.0) – released September 2017
    • Coordination of Health Care Study: Use of Health Services and Medicines, Australia, 2015-16 (ABS cat. no. 4343.0.55.001) – released December 2018
    • Survey of Health Care: selected findings for rural and remote Australians (AIHW Cat. no. PHE 220) – released April 2018
    • Coordination of health care – experiences with GP care among patients aged 45 and over, 2016 (AIHW Cat. no. CHC 2) – released July 2018
    • Coordination of health care: experiences of information sharing between providers for patients aged 45 and over, 2016 (AIHW Cat. no. CHC 3) – released July 2019
    • Coordination of health care: experiences of barriers to accessing health-care services among patients aged 45 and over, 2016 (AIHW Cat. no. CHC 4) – scheduled for release early 2020.
       

    Acknowledgements

    76 ABS surveys draw extensively on information provided by individuals, businesses, governments and other organisations. Their continued cooperation is very much appreciated and without it, the wide range of statistics published by the ABS would not be available. Information received by the ABS is treated in strict confidence as required by the Census and Statistics Act 1905.

    77 The Coordination of Health Care Study is funded by the AIHW, and jointly conducted by the ABS and AIHW. This publication was jointly prepared and released by the ABS and the AIHW.

    78 The ABS and AIHW also acknowledge and thank the DHS for its assistance in the sample selection and postage process of the study.

    79 This report would not have been possible without the valued cooperation and efforts of the following organisations. The ABS and AIHW acknowledge and thank them for their assistance with the Coordination of Health Care Study:

    • Centre for Health Record Linkage, NSW Ministry of Health and ACT Health
    • Statistical Services Branch - Queensland Health
    • Data Linkage Branch - Western Australian Department of Health
    • SA NT Datalink and SA Health and Northern Territory Department of Health
    • Tasmanian Data Linkage Unit and the Tasmanian Department of Health and Human Services
    • Centre for Victorian Data Linkage - Victorian Department of Health and Human Services, and the Victorian Agency for Health Information
    • Population Health Research Network.


    80 The Study uses variations of questions sourced from other national and international non-ABS surveys (see Table 1.3 of the Explanatory Notes of Survey of Health Care, Australia, 2016, cat. no. 4343.0 for more information). The ABS and AIHW would like to acknowledge the following organisations:

    • Harvard Medical School, Boston
    • Department of Health & Human Services, Victoria
    • The Commonwealth Fund, New York
    • Statistics Canada, Ottawa.
       

    Appendix - GP item numbers


    Type of MBS service: General practitioner

    Broad type of service: Unreferred Attendances: VR GP
    MBS items: 1:4, 13:14, 19:20, 23:26, 33, 35:40, 43:44, 47:51, 193, 195, 197, 199, 597, 599, 601:603, 2497:2501, 2503:2504, 2506:2507, 2509, 2517:2518, 2521:2522, 2525:2526, 2546:2547, 2552:2553, 2558:2559, 2574:2575, 2577:2578, 5000, 5003, 5007, 5010, 5020:5023, 5026:5028, 5040:5043, 5046, 5049, 5060, 5063:5064, 5067

    Broad type of service: Unreferred Attendances: Enhanced Primary Care
    MBS items: 700:710, 712:747, 749:750, 757:759, 762, 765, 768, 771:773, 775, 778:779, 900, 903, 2700:2702, 2710, 2712:2713, 2715, 2717, 2719, 6087

    Broad type of service: Unreferred Attendances: Other
    MBS items: 5:12, 15:18, 21:22, 27:32, 34, 41:42, 45:46, 52:84, 86:87, 89:93, 95:98, 101, 160:173, 444:449, 598, 600, 696:698, 980, 996:998, 2100, 2122, 2125:2126, 2137:2138, 2143, 2147, 2179, 2195:2199, 2220, 2598, 2600, 2603, 2606, 2610, 2613, 2616, 2620, 2622:2624, 2631:2633, 2635, 2664, 2666:2668, 2673:2675, 2677, 2704:2705, 2707:2708, 2721, 2723, 2725, 2727, 4001, 5200, 5203, 5207:5208, 5220:5228, 5240, 5243, 5247:5248, 5260:5263, 5265:5267, 17600

    Technical note - reliability of estimates

    1 Two types of error are possible in an estimate based on a sample survey: sampling error and non-sampling error. The sampling error is a measure of the variability that occurs by chance because a sample, rather than the entire population, is surveyed. Since the estimates in this publication are based on information obtained from a sample of persons in scope of the survey they are subject to sampling variability; that is, they may differ from the figures that would have been produced if all in-scope persons had been included in the survey. One measure of the likely difference is given by the standard error (SE). There are about two chances in three that a sample estimate will differ by less than one SE from the figure that would have been obtained if all persons had been included, and about 19 chances in 20 that the difference will be less than two SEs.

    2 Another measure of the likely difference is the relative standard error (RSE), which is obtained by expressing the SE as a percentage of the estimate. The RSE is a useful measure in that it provides an immediate indication of the percentage errors likely to have occurred due to sampling, and thus avoids the need to refer also to the size of the estimate.

    \(\Large{R S E \%=\left(\frac{S E}{E s t i m a t e}\right) \times 100}\)

    3 RSEs for published estimates are supplied in Excel data tables, available via the Data downloads section.

    4 The smaller the estimate, the higher the RSE. Very small estimates are subject to such high SEs (relative to the size of the estimate) as to detract seriously from their value for most reasonable uses. In the tables in this publication, only estimates with RSEs less than 25% are considered sufficiently reliable for most purposes. However, estimates with larger RSEs, between 25% and 50%, have been included and are preceded by an asterisk (e.g. *3.4) to indicate they are subject to high SEs and should be used with caution. Estimates with RSEs of 50% or more are preceded by a double asterisk (e.g. **0.6). Such estimates are considered unreliable for most purposes.

    5 The imprecision due to sampling variability, which is measured by the SE, should not be confused with inaccuracies that may occur because of imperfections in reporting by interviewers and respondents and errors made in coding and processing of data. Inaccuracies of this kind are referred to as the non-sampling error, and they may occur in any enumeration, whether it be in a full count or only a sample. In practice, the potential for non-sampling error adds to the uncertainty of the estimates caused by sampling variability. However, it is not possible to quantify the non-sampling error.

    Standard errors of proportions and percentages

    6 Proportions and percentages formed from the ratio of two estimates are also subject to sampling errors. The size of the error depends on the accuracy of both the numerator and the denominator. For proportions where the denominator is an estimate of the number of persons in a group and the numerator is the number of persons in a sub-group of the denominator group, the formula to approximate the RSE is given below. The formula is only valid when x is a subset of y.

    \(\large {RSE\left(\frac{x}{y}\right) = \sqrt{R S E(x)^{2}-R S E(y)^{2}}}\)

    Comparison of estimates

    7 Published estimates may also be used to calculate the difference between two survey estimates. Such an estimate is subject to sampling error. The sampling error of the difference between two estimates depends on their SEs and the relationship (correlation) between them. An approximate SE of the difference between two estimates (x-y) may be calculated by the following formula:

    \(\large{S E(x-y) = \sqrt{[S E(x)]^{2}+[S E(y)]^{2}}}\)

    8 While the above formula will be exact only for differences between separate and uncorrelated (unrelated) characteristics of sub-populations, it is expected that it will provide a reasonable approximation for all differences likely to be of interest in this publication.

    9 Another measure is the Margin of Error (MOE), which describes the distance from the population value that the sample estimate is likely to be within, and is specified at a given level of confidence. Confidence levels typically used are 90%, 95% and 99%. For example, at the 95% confidence level the MOE indicates that there are about 19 chances in 20 that the estimate will differ by less than the specified MOE from the population value (the figure that would have been obtained if all in-scope persons had been enumerated). The 95% MOE is calculated as 1.96 multiplied by the SE.

    10 The 95% MOE can also be calculated from the RSE by:

    \(\Large{{MOE}(y) \approx \frac{R S E(y) \times y}{100} \times 1.96}\)

    11 The MOEs in this publication are calculated at the 95% confidence level. This can easily be converted to a 90% confidence level by multiplying the MOE by:

    \(\LARGE{\frac{1.645}{1.96}}\)

    or to a 99% confidence level by multiplying by a factor of:

    \(\LARGE{\frac{2.576}{1.96}}\)

    12 A confidence interval expresses the sampling error as a range in which the population value is expected to lie at a given level of confidence. The confidence interval can easily be constructed from the MOE of the same level of confidence by taking the estimate plus or minus the MOE of the estimate.
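
    The relationships between the SE, RSE, MoE and confidence interval set out in paragraphs 2, 6 and 9 to 12 can be collected into a few helper functions. A sketch, with a hypothetical RSE used in the example call:

```python
import math

def se_from_rse(estimate, rse_pct):
    """Standard error recovered from the RSE: SE = RSE% x estimate / 100."""
    return rse_pct * estimate / 100

def rse_of_proportion(rse_x, rse_y):
    """Approximate RSE of a proportion x/y where x is a subset of y (paragraph 6)."""
    return math.sqrt(rse_x ** 2 - rse_y ** 2)

def moe(estimate, rse_pct, z=1.96):
    """Margin of error at the confidence level implied by z (1.96 gives 95%)."""
    return z * se_from_rse(estimate, rse_pct)

def confidence_interval(estimate, rse_pct, z=1.96):
    """Confidence interval: the estimate plus or minus its MoE."""
    m = moe(estimate, rse_pct, z)
    return estimate - m, estimate + m

# A proportion of 22.1% with a hypothetical RSE of 4% has a 95% confidence interval of:
print(confidence_interval(22.1, 4.0))     # roughly (20.4, 23.8)

# Convert the 95% MoE to a 90% MoE by scaling by 1.645/1.96:
print(moe(22.1, 4.0) * 1.645 / 1.96)
```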

    Significance testing

    13 For comparing estimates between surveys or between populations within a survey it is useful to determine whether apparent differences are 'real' differences between the corresponding population characteristics or simply the product of differences between the survey samples. One way to examine this is to determine whether the difference between the estimates is statistically significant. This is done by calculating the standard error of the difference between two estimates (x and y) and using it in the test statistic below:

    \(\LARGE{\frac{|x - y|}{SE(x - y)}}\)

    where

    \(\LARGE{S E(y)=\frac{R S E(y) \times y}{100}}\)

    14 If the value of the statistic is greater than 1.96 then we may say there is good evidence of a statistically significant difference at 95% confidence levels between the two populations with respect to that characteristic. Otherwise, it cannot be stated with confidence that there is a real difference between the populations.
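
    The test statistic in paragraphs 13 and 14 can be computed directly from two estimates and their RSEs. A sketch using hypothetical estimates and RSEs:

```python
import math

def se_from_rse(estimate, rse_pct):
    """Standard error recovered from the RSE."""
    return rse_pct * estimate / 100

def significance_statistic(x, rse_x, y, rse_y):
    """Test statistic |x - y| / SE(x - y), assuming x and y are uncorrelated."""
    se_diff = math.sqrt(se_from_rse(x, rse_x) ** 2 + se_from_rse(y, rse_y) ** 2)
    return abs(x - y) / se_diff

# Hypothetical comparison of two proportions and their RSEs.
stat = significance_statistic(26.1, 5.0, 21.1, 6.0)
print(round(stat, 2),
      "significant at the 95% level" if stat > 1.96 else "not significant")
```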

    Glossary


    Administrative data

    Information collected for administrative rather than statistical purposes. This type of information is often obtained from records or transactional data held by government agencies, businesses or non-profit organisations, which use the information for the administration of programs, policies or services.

    Admitted patient

    A patient who undergoes a hospital's admission process to receive treatment and/or care.

    Data integration

    Statistical data integration involves combining information from different administrative and/or statistical sources to provide new datasets for statistical and research purposes. Data integration can help policy makers and researchers gain a much better understanding of Australian families, communities, industry, and the economy. This better understanding can help to improve the development and delivery of government services in areas such as health, education, infrastructure, and other community services. Further information on data integration is available on the ABS website.

    De-identified data/records

    Data that have had any identifiers removed. May also be referred to as unidentified data. The Survey of Health Care, MBS, PBS, hospital and emergency department records used in the Coordination of Health Care Study were de-identified and do not include person name, address or Medicare number.

    Emergency department presentation

    An emergency department presentation is the arrival of a patient at an emergency department that results in clerical registration or triage. In this publication, emergency department presentations are those that meet criteria for inclusion in the National Non-Admitted Patient Emergency Department Care Database (NNAPEDCD). See paragraph 56 of the Explanatory Notes for more information.

    General practitioner

    A general practitioner (GP) is a doctor who has completed training in general practice. GPs are many Australians’ first point of contact for health issues, and play a crucial role in delivering coordinated care across a person’s life. People use GP services for a variety of reasons including short-term illnesses, preventive health practices and management of long-term health conditions. MBS items used to define general practitioner services in this publication are listed in Appendix 1 – MBS items.

    Hospitalisation

    A hospitalisation is a completed episode of admitted hospital care ending with discharge, death or transfer—or a portion of a hospital stay starting or ending in a change to another type of care (for example, from acute care to rehabilitation).

    Index of Relative Socio-Economic Disadvantage

    See Socio-Economic Indexes for Areas.

    Long-term health condition

    A long-term health condition is a health condition that is expected to last or has lasted 6 months or more and has been diagnosed by a health professional. Respondents to the Survey of Health Care were asked whether they had any of the following conditions:

    • diabetes
    • heart disease (including angina or past heart attack)
    • high blood pressure or hypertension
    • effects of a stroke
    • cancer (including melanoma but not other skin cancers)
    • asthma
    • chronic lung disease (including Chronic Obstructive Pulmonary Disease)
    • osteoporosis or low bone density
    • arthritis (including osteoarthritis, rheumatoid arthritis or lupus)
    • mental health condition (including anxiety disorder, depression or bipolar disorder)
    • Alzheimer’s disease or dementia
    • moderate or severe pain lasting longer than six months
    • other long-term health condition/long-term injury.
       

    Medicare Benefits Schedule

    The Department of Human Services collects data on the activity of all persons making claims through the Medicare Benefits Scheme and provides this information to the Department of Health. Information collected includes the type of service provided (MBS item number) and the benefit paid by Medicare for the service. The item numbers and benefits paid by Medicare are based on the Medicare Benefits Schedule (MBS) which is a listing of the Medicare services subsidised by the Australian Government. See Appendix 1 – MBS items for a mapping of MBS items used in this publication.

    Medicare Enrolment Database

    The Medicare Enrolment Database (MEDB) includes listings of people who are registered to receive Medicare benefits in Australia.

    Pharmaceutical Benefits Scheme

    The Department of Human Services provides data on prescriptions funded through the Pharmaceutical Benefits Scheme (PBS) to the Department of Health. The PBS lists all of the medicines available to be dispensed to patients at a Government-subsidised price.

    Primary Health Networks

    Primary Health Networks have been established with the key objectives of increasing the efficiency and effectiveness of medical services for patients, particularly those at risk of poor health outcomes, and improving coordination of care to ensure patients receive the right care in the right place at the right time. Each Primary Health Network has a corresponding geographic area. See Primary Health Networks on the Australian Government Department of Health website for more information.

    Remoteness Areas

    Broad geographical regions that share common characteristics of remoteness, based on physical distance from services. Remoteness Areas are defined by the Remoteness Structure of the ABS's Australian Statistical Geography Standard (ASGS), which divides Australia into six broad regions. The purpose of the Remoteness Structure is to provide a classification for the release of statistics that inform policy development.

    Self-assessed health

    A person's general assessment of their own health against a five point scale comprising excellent, very good, good, fair and poor.

    Socio-Economic Indexes for Areas

    Socio-Economic Indexes for Areas is a product developed by the ABS that ranks areas in Australia according to relative socio-economic advantage and disadvantage. There are four indexes, each focusing on a different aspect of socio-economic advantage and disadvantage, based on different subsets of information from the five-yearly Census.

    The 2011 Census-based Index of Relative Socio-Economic Advantage and Disadvantage was used in sample design for the 2016 Survey of Health Care.

    Data included in this publication use the 2016 Census-based Index of Relative Socio-Economic Disadvantage. A lower Index of Disadvantage quintile (e.g. quintile 1) indicates relatively greater disadvantage and a lack of advantage in general. A higher Index of Disadvantage quintile (e.g. quintile 5) indicates a relative lack of disadvantage and greater advantage in general.

    For more information see Census of Population and Housing: Socio-Economic Indexes for Areas, Australia, 2016.

    Usual GP

    A usual GP is defined as the GP that people go to for most of their health care.

    Usual place of care

    Usual place of care is defined as the place that people usually go if they are sick or need advice about their health. Examples of a usual place of care include a clinic with GPs only, a clinic with GPs and other health professionals, a community health centre, an Aboriginal medical service or, for some patients, a hospital emergency department.

    Abbreviations


    ABS        Australian Bureau of Statistics
    ACT        Australian Capital Territory
    AIHW       Australian Institute of Health and Welfare
    ASGS       Australian Statistical Geography Standard
    Aust.      Australia
    CHC        Coordination of Health Care
    DHS        Department of Human Services
    ED         Hospital emergency department
    GP         General practitioner
    MBS        Medicare Benefits Schedule
    MEDB       Medicare Enrolment Database
    MoE        Margin of Error
    NHMD       National Hospital Morbidity Database
    NHS        National Health Survey
    NNAPEDCD   National Non-admitted Patient Emergency Department Care Database
    NPHED      National Public Hospital Establishments Database
    NSW        New South Wales
    NT         Northern Territory
    PBS        Pharmaceutical Benefits Scheme
    PHN        Primary Health Network
    PHRN       Population Health Research Network
    QHAPDC     Queensland Hospital Admitted Patient Data Collection
    Qld        Queensland
    RSE        Relative standard error
    SA         South Australia
    SEIFA      Socio-Economic Indexes for Areas
    SHC        Survey of Health Care
    Tas.       Tasmania
    TIS        Translation and Interpreting Service
    Vic.       Victoria
    WA         Western Australia