Childhood Education and Care, Australia methodology

Latest release
Reference period: June 2017
Released: 23/04/2018

Explanatory notes

Introduction

1 The statistics in this publication were compiled from data collected in the Childhood Education and Care Survey (CEaCS) that was conducted throughout Australia in June 2017 as a supplement to the Australian Bureau of Statistics (ABS) monthly Labour Force Survey (LFS).

2 The CEaCS collected information on topics including:

  • usual care arrangements (types of care, duration and cost)
  • care arrangements used in the survey reference week (types of care, duration and cost)
  • attendance at a preschool or preschool program (usually or in the survey reference week)
  • need for additional formal care or preschool
  • early childhood education and learning activities.
     

3 The main aim of the survey was to provide estimates of:

  • care arrangements for children aged 0-12 years
  • attendance at educational institutions for children aged 0-12 years
  • informal learning activities for children aged 0-8 years
  • families' requirements for formal care or preschool
  • education, income and working patterns of parents of children aged 0-12 years.
     

4 The publication Labour Force, Australia (cat. no. 6202.0) contains information about survey design, sample redesign, scope, coverage and population benchmarks relevant to the monthly LFS, which also apply to supplementary surveys. It also contains definitions of demographic and labour force characteristics.

Scope

5 The scope of the 2017 CEaCS was restricted to Australian resident children aged 0-12 years and their families living in private dwellings and excluded:

  • any non-residents visiting Australia (diplomatic personnel of overseas governments and their families, members of non-Australian defence forces stationed in Australia, or non-residents otherwise visiting Australia)
  • all non-private dwellings (including hospitals, nursing homes and prisons, as well as boarding schools, residential colleges, hotels and motels)
  • Indigenous communities.


6 The survey also excluded members of the Australian permanent defence forces. One parent families where the parent was a member of the Australian permanent defence forces and couple families where both parents were members were out of scope. In couple families where one parent was a member of the Australian permanent defence forces, no employment information is available for that parent, but information on the family and the children was obtained from the other resident parent.

7 The survey was conducted in both urban and rural areas in all states and territories but excluded people living in Indigenous communities. The exclusion of people living in these communities is unlikely to affect state and territory estimates, except in the Northern Territory, where they account for approximately 15% of the total population aged 15–74 years.

Coverage

8 The survey coverage excludes persons absent from their usual residence for an extended period. One parent families where the parent was temporarily absent and couple families where both parents were temporarily absent were not enumerated. In couple families where one parent was temporarily absent, no employment information is available for that parent, but information on the family and the children was obtained from the other parent.

9 The estimates in this publication relate to persons covered by the survey scope. In the LFS, coverage rules are applied which aim to ensure that each person is associated with only one dwelling and hence has only one chance of selection in the survey. See Labour Force, Australia (cat. no. 6202.0) for more details.

Sample size

10 Approximately 88% of selected households fully responded to the CEaCS, resulting in 4,813 family records and 7,411 child records.

Data collection

11 Interviews were conducted between 11 and 24 June 2017 with parents of children aged 0-12 years either face-to-face or over the telephone, using computer assisted interviewing (CAI).

12 For interviews conducted between 11 and 17 June 2017, the reference week was 4 to 10 June 2017. For interviews conducted between 18 and 24 June 2017, the reference week was 11 to 17 June 2017.

13 Only households with at least one child aged 0-12 years were selected for CEaCS and, in multi-family households, only one family was selected to complete the survey. Detailed information about child care arrangements and early childhood education was collected for a maximum of two children aged 0-12 years per family. In families with more than two children aged 0-12 years, two children were randomly selected and the complete set of information was collected for these children. Summary information was collected for the other children in the family, including the number attending child care and/or preschool and the cost of this care (including any rebates such as the Child Care Benefit). Information pertaining to parents (such as income, education and employment) and family characteristics was also collected. All information was obtained from a parent or guardian of the selected child/ren who permanently resided in the household.

14 Supplementary surveys are not conducted on the full LFS sample. Since August 1994, the sample for supplementary surveys has been restricted to the first 7 of the 8 months during which a dwelling is enumerated in the LFS (i.e. seven-eighths of the LFS sample).

Estimation method

Weighting

15 Weighting is the process of adjusting results from a sample survey to infer results for the total population. To do this, a 'weight' is allocated to each sample unit, which for CEaCS can be either a child or a family. The weight is a value which indicates how many population units are represented by the sample unit. The first step in calculating weights for each unit is to assign an initial weight, which is the inverse of the probability of being selected in the survey.
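
As an illustration of this first step only, the sketch below assigns an initial weight as the inverse of the selection probability. The function and column names are hypothetical and do not correspond to actual CEaCS variables.

```python
# Minimal sketch, not ABS production code: the initial weight of a sample
# unit is the inverse of its probability of selection in the survey.
import pandas as pd

def assign_initial_weights(sample: pd.DataFrame,
                           prob_col: str = "selection_probability") -> pd.DataFrame:
    """Add an 'initial_weight' column equal to 1 / selection probability."""
    out = sample.copy()
    out["initial_weight"] = 1.0 / out[prob_col]
    return out

# A child selected with probability 0.002 (1 in 500) represents 500 children.
children = pd.DataFrame({"child_id": [1, 2],
                         "selection_probability": [0.002, 0.004]})
print(assign_initial_weights(children))  # initial weights: 500.0 and 250.0
```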

Population benchmarks

16 The initial weights were calibrated to align with independent estimates of the population of interest, referred to as 'benchmarks'. The population included in the benchmarks is the survey scope. Weights calibrated against population benchmarks ensure that the survey estimates conform to the independently estimated distribution of the population rather than the distribution within the sample itself. Calibration to population benchmarks helps to compensate for over or under-enumeration of particular categories of persons/households which may occur due to either the random nature of sampling or non-response.

17 For child estimates, CEaCS was benchmarked to the Estimated Resident Population (ERP) in each state and territory at 30 June 2017, by age and sex. For family estimates, CEaCS was benchmarked to independently calculated estimates of the total number of households in Australia with children aged under 13 years. For CEaCS, households with children under 13 years are a proxy for families. CEaCS estimates do not (and are not intended to) match estimates for the total Australian population obtained from other sources.
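
The sketch below illustrates one simple form of calibration, post-stratification, in which weights are scaled within each benchmark cell so that the weighted sample total matches the benchmark. It is an illustration only; the benchmark cells, column names and method shown are assumptions rather than the ABS's production weighting specification.

```python
# Illustrative sketch only: post-stratification to benchmark totals.
# 'benchmarks' is assumed to hold one row per cell with a 'benchmark_total'.
import pandas as pd

def calibrate_to_benchmarks(sample: pd.DataFrame,
                            benchmarks: pd.DataFrame,
                            cells=("state", "age_group", "sex")) -> pd.DataFrame:
    """Scale initial weights within each benchmark cell so the weighted
    sample total matches the independent population benchmark."""
    cells = list(cells)
    out = sample.copy()
    sample_totals = (out.groupby(cells)["initial_weight"]
                        .sum()
                        .rename("sample_total")
                        .reset_index())
    factors = benchmarks.merge(sample_totals, on=cells)
    factors["factor"] = factors["benchmark_total"] / factors["sample_total"]
    out = out.merge(factors[cells + ["factor"]], on=cells)
    out["final_weight"] = out["initial_weight"] * out["factor"]
    return out.drop(columns="factor")
```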

Estimation

18 Survey estimates of counts of children or families are obtained by summing the weights of children or families with the characteristic of interest.
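
For example, under the hypothetical column names used in the sketches above, a count estimate could be formed as follows.

```python
# Sketch: an estimated count is the sum of weights over records with the
# characteristic of interest. Column names are illustrative only.
import pandas as pd

def estimate_count(children: pd.DataFrame, characteristic: str) -> float:
    """Weighted count of children with the given boolean characteristic."""
    return children.loc[children[characteristic], "final_weight"].sum()

# e.g. estimated number of children who usually attend formal care:
# estimate_count(children, "usually_attends_formal_care")
```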

19 To minimise the risk of identifying individuals in aggregate statistics, a technique is used to randomly adjust cell values. This technique is called perturbation. Perturbation involves small random adjustment of the statistics and is considered the most satisfactory technique for avoiding the release of identifiable statistics while maximising the range of information that can be released. These adjustments have a negligible impact on the underlying pattern of the statistics. After perturbation, a given published cell value will be consistent across all tables. However, adding up cell values to derive a total will not necessarily give the same result as published totals. The introduction of perturbation in publications ensures that these statistics are consistent with statistics released via services such as Table Builder.
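
The ABS's perturbation algorithm is not described in this publication. The toy sketch below only illustrates the general idea of a small adjustment that is reproducible for a given cell, so the same cell value appears consistently across tables; the function name, hashing approach and adjustment range are all assumptions.

```python
# Toy illustration only -- NOT the ABS confidentiality algorithm. A small
# adjustment is derived deterministically from the records contributing to a
# cell, so the same cell receives the same adjustment in every table.
import hashlib

def perturb_cell(cell_value: float, contributing_ids, max_adjust: int = 2) -> float:
    """Apply a small, repeatable adjustment keyed on the contributing records."""
    key = ",".join(sorted(str(i) for i in contributing_ids))
    digest = int(hashlib.sha256(key.encode("utf-8")).hexdigest(), 16)
    adjustment = digest % (2 * max_adjust + 1) - max_adjust  # integer in [-2, 2]
    return max(cell_value + adjustment, 0.0)
```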

Reliability of estimates

20 All sample surveys are subject to error which can be broadly categorised as either sampling error or non-sampling error. For more information refer to the Technical Note.

Using the data

Changes between surveys

21 Questions about parental educational attainment and participation were removed from the CEaCS 2017 questionnaire as this information is now provided to CEaCS from the Labour Force Survey. In 2014, this information was collected by the CEaCS or taken from the May 2014 LFS supplementary survey, the Survey of Education and Work.

22 In 2017, less editing of income data outliers in the 'Weekly income of father / mother / parent(s)' data items was undertaken compared with earlier CEaCS surveys. This is likely to have resulted in a greater number of high income outliers in these items, which may be due to errors in self-reported data. Tables in this publication refer to categorised income items only, the quality of which is less likely to be affected by outliers. However, outliers may affect the reliability of means and medians calculated using continuous income items, so users should exercise caution when calculating and analysing such results through the TableBuilder product. See Microdata: Childhood Education and Care, Australia, June 2017 (cat. no. 4402.0.55.001) for more details.

23 In 2017, an error in the data item 'Principal source of income of parent(s)' was corrected. This data item from previous iterations of CEaCS should not be used or compared with the 2017 data.

24 In the 2017 CEaCS, parents were asked whether they would like their child to attend more preschool even if the child already attended the maximum allowance (15 hours per week). This group was not asked these questions in 2014. As a consequence, multiple items related to 'Requirements for (additional) formal care or preschool' have a different population compared with the same items provided in 2014. The affected data items are marked with a footnote in the Data Item List, on the Child Level tab. To compare data items from 2017 with their equivalent 2014 data items, a filter is required to limit the population to only those respondents whose child did not attend the full allocation of 15 hours of preschool. This can be done by using category '5. Is not attending full allocation of preschool' from the data item 'Whether child is currently attending full allocation of 15 hours per week of preschool' as a filter on 2017 data items. However, caution should be used when interpreting such comparisons, as the context in which these questions were asked is likely to have changed over time. The 'Universal Access to Early Childhood Education' policy, which provides for 15 hours a week of free preschool for all children, was only introduced in late 2013. Parents' responses to some questions in CEaCS may have been influenced by their unfamiliarity with the policy in 2014; by 2017 the policy was likely better understood by parents. For instance, they may have made more enquiries about the availability of additional preschool in 2014 than in 2017.

25 People living in Indigenous communities or in very remote parts of Australia were excluded from CEaCS in 2011, whereas in 2014 and 2017 only people living in Indigenous communities were excluded.

26 In the 2011 CEaCS some questions about school attendance were only asked of children aged 4 to 8 years. Since 2014, these questions have been asked of children aged 4 to 12 years in order to provide broader information about the age of starting school and educational programs attended prior to school.

27 As was done in 2011 and 2014, the 2017 CEaCS continued to ask about 'usual' use of each type of care before asking about 'last week'.

28 Questions relating to informal learning alone by the child or with someone else other than their parent or guardian were removed from the survey in 2014.

29 While the amount of rebate available on child care costs has remained the same since 2011, from 2014 onwards parents and guardians have been asked extra questions about the Child Care Rebate in order to improve the quality of data on child care costs. For more information see Child Care Benefit and the Child Care Rebate below.

30 After each Census, population estimates are normally revised back five years to the previous Census year. As announced in the June 2012 issue of Australian Demographic Statistics (cat. no. 3101.0), intercensal error between the 2006 and 2011 Censuses was larger than normal due to improved methodologies used in the 2011 Census Post Enumeration Survey. The intercensal error analysis indicated that previous population estimates for the base Census years were over-counted. An indicative estimate of the size of the over-count is that there should have been 240,000 fewer people at June 2006, 130,000 fewer in 2001 and 70,000 fewer in 1996. As a result, Estimated Resident Population estimates have been revised for the last 20 years rather than the usual five.

Consequently, estimates of particular populations derived since CEaCS 2014 may be lower than those published for previous years as the CEaCS estimates have not been revised. Therefore, comparisons of CEaCS 2014 and CEaCS 2017 estimates of the number of children or families with previous years should not be made. However, for comparable data items, comparison of rates or proportions between years is appropriate.

31 In July 2014, the LFS questionnaire underwent a number of developments. For further information see Information Paper: Questionnaire Used in the Labour Force Survey, July 2014.

32 Prior to the 2014 CEaCS, summary and time series results from the survey were available in Childhood Education and Care, Australia Datacubes, June 2011 (cat. no. 4402.0.55.003).

Interpreting the data

33 A small number of same-sex couple families completed the 2017 CEaCS. The ABS prefers to include the survey data of same-sex couples in official statistics in the same way as opposite-sex couples. However, the ABS must also minimise the risk of individuals and families being identified in published statistics and will apply methods of confidentialisation when there are small numbers of respondents with recognisable characteristics. Given the small number of same-sex couple families, they have been included in the 'Not applicable' category of all data items that relate specifically to the mother or the father (e.g. their Labour force status). This methodology also has flow-on effects for the calculation of other data items that rely on reported information specific to the mother or father (e.g. the Labour force status of the parent/s). Other data for same-sex couple families (e.g. whether children usually attend formal care) is included in the published statistics (not in the 'Not applicable' category). The ABS is currently reviewing the best methods of reporting information for same-sex couple families while still protecting their confidentiality, so a different approach may be used in future. More information on the treatment of aggregate data is available from ABS Confidentiality Series, Aug 2017 (cat. no. 1160.0).

Comparability with other data sources

34 Care needs to be taken when comparing the 2017 CEaCS data with other surveys or administrative data, as the CEaCS collects information on usual child care and preschool attendance patterns as well as attendance in the survey reference week (i.e. at a point in time). In addition, information in the CEaCS is collected in person or by telephone from parents and hence may differ from that which might be obtained from other sources (such as administrative data) or via other methodologies (such as a paper form). These factors should be considered when interpreting the estimates in this publication.

35 The ABS also publishes estimates of preschool attendance in Preschool Education, Australia, 2017 (cat. no. 4240.0), which reports administrative data from the National Early Childhood Education and Care Collection (NECECC). For the reasons outlined below, it is not recommended that comparisons be made between the preschool attendance counts reported in this CEaCS publication and the preschool attendance and enrolment counts presented in the Preschool Education, Australia publication.

  • The CEaCS is a sample survey that collects information by interview with a parent or guardian, who is asked about their child's attendance at preschool. In contrast, data used by the NECECC are mainly sourced from administrative collections, which are supplemented where necessary to improve the coverage of preschool program service providers that are not otherwise captured due to funding, regulation or licensing arrangements.
  • The scopes of the two collections differ. In particular, CEaCS excludes Indigenous communities.
  • The reference periods for the two collections differ. CEaCS was conducted from 11-24 June 2017, while NECECC was collected on 4 August 2017, with a recommended reference period of 31 July to 6 August 2017.
  • The CEaCS collects counts of children who 'usually attended' or who 'attended in the week prior to the survey'. In comparison, the NECECC collection counts children attending and enrolled during the reference period.
     

Due to these methodological and conceptual differences between the two collections, it is recommended that users carefully consider their data requirements to ensure they choose the most appropriate preschool estimates for their needs. For more information on the NECECC, refer to Preschool Education, Australia, 2017 (cat. no. 4240.0) and National Early Childhood Education and Care Collection: Concepts, Sources and Methods, 2013 (cat. no. 4240.0.55.001).

Child Care Benefit and the Child Care Rebate

36 The Child Care Benefit (CCB) was introduced in July 2000 and replaced Child Care Assistance. The CCB is available to parents or guardians, foster parents or grandparents with a child in their care who is attending a child care service approved by, or registered with, the government.

37 Families using approved child care services can choose to receive their CCB as reduced child care fees (benefit paid direct to provider), fortnightly or quarterly payments direct to parents or as a lump sum payment at the end of the financial year (benefit paid direct to parents).

38 Families who receive the CCB for approved care may also be eligible to receive the Child Care Rebate (CCR), which was introduced in July 2009 and has been used to calculate the cost of child care since the 2011 CEaCS. Prior to 2009, the CCR was known as the Child Care Tax Rebate (CCTR), which was used to calculate the cost of child care in the 2008 CEaCS. In 2008 the CCTR was 30% of out-of-pocket child care costs, while the CCR entitles eligible families to a rebate of up to 50% of out-of-pocket child care costs, up to an annual limit per child, after the CCB has been applied. For more information on different types of family assistance, see the Australian Government Department of Human Services.

39 In July 2011 there was a change to the way families can choose to receive the CCR. Families can now choose to receive their CCR as reduced child care fees (benefit paid direct to the provider), as well as fortnightly or quarterly payments direct to parents or a lump sum payment at the end of the financial year direct to parents (both previously available).

Cost of care

40 Within this publication, cost of care is reported as the net cost of care to the parents after the CCB and CCR have been deducted, and is estimated based on a number of variables in the CEaCS.

41 Families receive the CCB and CCR in different ways and, as a result, questions in the 2017 CEaCS asked families to report the cost of child care. This was followed up with subsequent questions asking whether the cost was before or after the CCB and CCR, to accurately calculate the out-of-pocket expense of child care to the family.

42 In a minority of cases, where parents had claimed or intended to claim the CCB as a lump sum, the amount of CCB has been estimated. The CCB was estimated based on information provided by the Department of Education. The value of the CCB can be calculated using information about: the type of care; the number of hours of care; the standard hourly rate; family income; number of children in the family using child care; whether the child attends school; and for long day care and family day care, whether the care is part-time or full-time.

43 In estimating the CCB for the small number of cases where parents were claiming a lump sum payment, it was assumed that:

  • if the parent intended to claim the CCB, then the care provider was eligible (i.e. an approved or registered child care service)
  • basic eligibility requirements for the benefit were met (e.g. residency and children's immunisation)
  • the parent provided their tax file number to the Family Assistance Office, which enabled them to obtain the CCB above minimum rates (depending upon the parents' income)
  • the parent met the CCB work, study, training test, which is required for CCB in relation to registered care and is required for CCB in excess of 24 hours care per week for approved care.


44 The CCR work, study, training test was applied if families received the CCB or were intending to claim the CCB for approved care. The reduction in child care fees was calculated on the out-of-pocket expenses incurred by families for approved care after the CCB. As assumptions were made about families' eligibility for the CCB and CCR, care should be taken when using estimates of cost of care data presented in this publication.

45 In the 2017 CEaCS, the income brackets used to calculate the CCB in the estimation of net cost of care have been indexed up from 2014.

Requirements for additional formal child care and preschool

46 The 2017 CEaCS continued to collect information on whether parents would like their child to attend more formal care or preschool than they were attending at the time of the survey. This includes instances in which children were already attending care or preschool and parents wanted them to attend more, as well as instances in which children did not attend any care or preschool and parents wanted them to attend. It does not include instances in which parents wanted to change service providers but not type or quantity of service.

47 These measures are not intended to provide an indicator of the number of additional formal care or preschool places required, the 'unmet demand' for formal care or preschool, or the number of children on waiting lists for formal care or preschool. This is because CEaCS cannot capture the likelihood that a parent will take steps to access the care or preschool they require, or place their child in this care or preschool. Various factors including cost, location and the perceived suitability or quality of the service will have an influence on whether parents take these steps.

Products and services

48 Some of the data cube table titles, and their order, have changed since the 2014 publication. The majority of the content of the tables remains unchanged. A concordance is provided in the spreadsheet "Table title concordance CEACS 2017 to 2014" located under the Data downloads section. All data cube tables present the corresponding Relative Standard Error (RSE) for estimates, while Margins of Error (MOEs) are provided for proportions.

49 For users who wish to undertake more detailed analysis of the data, the survey microdata will be released through the Table Builder product. For further details refer to the Microdata pages on the ABS website.

50 Special tabulations are available on request. Subject to confidentiality and sampling variability constraints, tabulations can be produced from the survey incorporating data items, populations and geographic areas selected to meet individual requirements. These can be provided in printed or electronic form. Note that detailed data can be subject to high relative standard errors which in some cases may result in data being confidentialised.

51 For further information about these or related statistics, contact the National Information and Referral Service on 1300 135 070. The ABS Privacy Policy outlines how the ABS will handle any personal information that you provide to us.

Acknowledgments

52 ABS publications draw extensively on information provided freely by individuals, businesses, governments and other organisations. Their continued cooperation is very much appreciated; without it, the wide range of statistics published by the ABS would not be available. Information received by the ABS is treated in strict confidence as required by the Census and Statistics Act 1905.

Previous surveys

53 The CEaCS has collected information on childcare arrangements in Australia since 1984 (previously as the Child Care Survey). In 2008, the CEaCS expanded to collect more information about early childhood education such as preschool and children's adjustment to school. For releases prior to 1993, please refer to the Historical Publications Index.

Next survey

54 The ABS plans to conduct this survey again in the future.

Related publications

55 Current publications and other products released by the ABS are available from the ABS website. The ABS also issues a daily upcoming release advice on the website that details products to be released in the week ahead.

Technical note - data quality

Reliability of the estimates

1 The estimates in this publication are based on information obtained from a sample survey. Any data collection may encounter factors, known as non-sampling error, which can impact on the reliability of the resulting statistics. In addition, estimates based on sample surveys are subject to sampling variability. That is, the estimates may differ from those that would have been produced had all persons in the population been included in the survey.

Non-sampling error

2 Non-sampling error may occur in any collection, whether it is based on a sample or a full count such as a census. Sources of non-sampling error include non-response, errors in reporting by respondents or recording of answers by interviewers and errors in coding and processing data. Every effort is made to reduce non-sampling error by careful design and testing of questionnaires, training and supervision of interviewers, and editing and quality control procedures at all stages of data processing.

Sampling error

3 Sampling error is the difference between the published estimates, derived from a sample of persons, and the value that would have been produced if the total population (as defined by the scope of the survey) had been included in the survey. The size of the sampling error depends on the following factors:

  • Sample design - the final design attempts to make the key survey results as representative as possible within cost and operational constraints
  • Sample size - the larger the sample on which the estimate is based, the smaller the associated sampling error
  • Population variability - the extent to which people differ on the particular characteristic being measured. The smaller the population variability of a particular characteristic, the more likely it is that the population will be well represented by the sample, and therefore, the smaller the sampling error. Conversely, the more variable the characteristic, the greater the sampling error.
     

Standard error and relative standard error

4 One measure of the sampling error is given by the standard error (SE), which indicates the extent to which an estimate might have varied by chance because only a sample of persons was included. There are about two chances in three (67%) that a sample estimate will differ by less than one SE from the number that would have been obtained if all persons had been surveyed, and about 19 chances in 20 (95%) that the difference will be less than two SEs.

5 Standard errors can be calculated using the estimates (counts or percentages) and the corresponding RSEs. See What is a Standard Error and Relative Standard Error, Reliability of estimates for Labour Force data for more details.

6 The relative standard error (RSE) is a measure of sampling variability that scales the SE to be relative to the size of the estimate. It is useful for comparing the accuracy of estimates that are of different sized populations. The RSE is obtained by expressing the SE as a percentage of the estimate.

\(RSE\% = \left(\frac{SE}{\text{estimate}}\right) \times 100\)

7 In CEaCS, RSEs for count estimates have been calculated using the Jackknife method of variance estimation. This involves the calculation of 60 'replicate' estimates based on 60 different sub-samples of the obtained sample. The variability of estimates obtained from these sub-samples is used to estimate the sample variability surrounding the count estimate.
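
As a sketch of how the replicate estimates are combined, assuming the 60 replicate estimates have already been produced from the replicate sub-samples, the delete-a-group Jackknife standard error and the corresponding RSE could be computed as follows.

```python
# Sketch, assuming replicate estimates are already available: delete-a-group
# Jackknife standard error and relative standard error for a count estimate.
import numpy as np

def jackknife_se(full_estimate: float, replicate_estimates) -> float:
    """Standard error from the variability of the replicate estimates."""
    reps = np.asarray(replicate_estimates, dtype=float)
    g = len(reps)                                  # number of replicate groups (60)
    variance = (g - 1) / g * np.sum((reps - full_estimate) ** 2)
    return float(np.sqrt(variance))

def rse_percent(full_estimate: float, replicate_estimates) -> float:
    """Relative standard error as a percentage of the estimate."""
    return jackknife_se(full_estimate, replicate_estimates) / full_estimate * 100
```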

Margin of error and confidence intervals

 8 Another useful measure is the margin of error (MOE), which shows the largest possible difference (due to sampling error) that could exist between the estimate and what would have been produced had all persons been included in the survey, at a given level of confidence. It is useful for understanding and comparing the accuracy of proportion estimates. Confidence levels can vary (e.g. typically 90%, 95% or 99%), but in this publication, all MOEs are provided for estimates at the 95% confidence level. At this level, there are 19 chances in 20 that the estimate will differ from the population value by less than the provided MOE. The 95% MOE is obtained by multiplying the SE by 1.96.

\(MOE = SE \times 1.96\)

9 The estimate combined with the MOE defines a range, known as a confidence interval. This range is likely to include the true population value with a given level of confidence. A confidence interval is calculated by taking the estimate plus or minus the MOE of that estimate. It is important to consider this range when using the estimates to make assertions about the population or to inform decisions. Because MOEs in this publication are provided at the 95% confidence level, a 95% confidence interval can be calculated around the estimate, as follows:

\(95\% \text{ Confidence Interval} = (\text{estimate} - MOE,\ \text{estimate} + MOE)\)
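
The following sketch applies the MOE and confidence interval formulas above.

```python
# Sketch of the 95% MOE and confidence interval calculations shown above.
def moe_95(se: float) -> float:
    """95% margin of error from a standard error."""
    return se * 1.96

def confidence_interval_95(estimate: float, se: float) -> tuple:
    """95% confidence interval: estimate plus or minus the 95% MOE."""
    moe = moe_95(se)
    return (estimate - moe, estimate + moe)

# e.g. a proportion estimate of 40.0% with an SE of 1.5 percentage points
# gives a 95% confidence interval of roughly (37.1%, 42.9%).
```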

Using the measures of sampling error with the estimates

10 This publication reports the relative standard error (RSE) for estimates of counts ('000) and the margin of error (MOE) for estimates of proportions (%). These measures are included in the datacubes available on the Data downloads section.

11 Estimates of proportions with a MOE greater than 10% are annotated to indicate they are subject to high sample variability and particular consideration should be given to the MOE when using these estimates. Depending on how the estimate is to be used, a MOE of greater than 10% may be considered too large to inform decisions. In addition, estimates with a corresponding standard 95% confidence interval that includes 0% or 100% are annotated to indicate they are usually considered unreliable for most purposes.

12 Only estimates with RSEs less than 25% are considered sufficiently reliable for most analytical purposes. Estimates with RSEs between 25% and 50% are annotated to indicate they are subject to high sample variability relative to the size of the estimate and should be used with caution. In addition, estimates with RSEs greater than 50% are annotated to indicate they are usually considered unreliable for most purposes.

13 Caution needs to be applied when performing statistical tests for estimates on rare populations where the RSE is above 25%. In these instances, the small sample is more vulnerable to non-sampling error and the distribution of the sampling error is not symmetric around the estimate.

Calculating measures of error and difference

 14 Proportions or percentages formed from the ratio of two count estimates are also subject to sampling errors. The size of the error depends on the accuracy of both the numerator and the denominator. A formula to approximate the RSE of a proportion is given below. This formula is only valid when the numerator (x) is a subset of the denominator (y):

\(RSE\left(\frac{x}{y}\right) \approx \sqrt{[RSE(x)]^{2} - [RSE(y)]^{2}}\)
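
A minimal sketch of this approximation:

```python
# Sketch of the approximation above; valid only when the numerator x is a
# subset of the denominator y. RSEs are expressed as percentages.
import math

def rse_of_proportion(rse_x: float, rse_y: float) -> float:
    """Approximate RSE (%) of the proportion x/y."""
    return math.sqrt(max(rse_x ** 2 - rse_y ** 2, 0.0))
```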

15 When calculating measures of error, it may be useful to convert RSE or MOE to SE. This allows the use of standard formulas involving the SE. The SE can be obtained from RSE or MOE using the following formulas:

\(SE = \frac{RSE\% \times \text{estimate}}{100}\)

\(SE = \frac{MOE}{1.96}\)

16 The RSE can also be used to directly calculate a MOE with a 95% confidence level:

\(MOE = \frac{RSE\% \times \text{estimate} \times 1.96}{100}\)
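
A sketch of these conversions:

```python
# Sketch of the conversions above between RSE, SE and MOE (95% level).
def se_from_rse(rse_pct: float, estimate: float) -> float:
    """Standard error from an RSE (%) and its estimate."""
    return rse_pct * estimate / 100

def se_from_moe(moe: float) -> float:
    """Standard error from a 95% margin of error."""
    return moe / 1.96

def moe_from_rse(rse_pct: float, estimate: float) -> float:
    """95% margin of error directly from an RSE (%) and its estimate."""
    return rse_pct * estimate * 1.96 / 100
```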

Differences

17 The difference between two survey estimates (counts or percentages) can also be calculated from published estimates. Such an estimate is also subject to sampling error. The sampling error of the difference between two estimates depends on their SEs and the relationship (correlation) between them. An approximate SE of the difference between two estimates (x - y) may be calculated by the following formula:

\(SE(x - y) \approx \sqrt{[SE(x)]^{2} + [SE(y)]^{2}}\)

18 While this formula will only be exact for differences between separate and uncorrelated characteristics or sub populations, it provides a good approximation for the differences likely to be of interest in this publication.

Significance testing

19 A statistical significance test for a comparison between estimates can be performed to determine whether it is likely that there is a difference between the corresponding population characteristics. The approximate standard error of the difference between two corresponding estimates (x - y) can be calculated using the formula shown above in the Differences section. The standard error is then used to calculate the following test statistic:

\(\frac{|x - y|}{SE(x - y)}\)

20 If the value of this test statistic is greater than 1.96 then there is evidence, with a 95% level of confidence, of a statistically significant difference in the two populations with respect to that characteristic. Otherwise, it cannot be stated with confidence that there is a real difference between the populations with respect to that characteristic.
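
A sketch combining the difference and significance test formulas above:

```python
# Sketch of the significance test described above: the approximate SE of a
# difference and the test statistic, compared against 1.96 at the 95% level.
import math

def se_of_difference(se_x: float, se_y: float) -> float:
    """Approximate standard error of the difference between two estimates."""
    return math.sqrt(se_x ** 2 + se_y ** 2)

def is_significant_95(x: float, y: float, se_x: float, se_y: float) -> bool:
    """True if the difference between estimates x and y is statistically
    significant at the 95% confidence level."""
    test_statistic = abs(x - y) / se_of_difference(se_x, se_y)
    return test_statistic > 1.96
```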

Glossary


Quality declaration - summary

Institutional environment

Relevance

Timeliness

Accuracy

Coherence

Interpretability

Accessibility

Abbreviations

