4228.0 - Programme for the International Assessment of Adult Competencies, Australia, 2011-2012 Quality Declaration 
ARCHIVED ISSUE Released at 11:30 AM (CANBERRA TIME) 15/02/2013   
EXPLANATORY NOTES


INTRODUCTION


1 This publication contains results from the Australian component of the Programme for the International Assessment of Adult Competencies (PIAAC) conducted in 24 countries around the world. The PIAAC survey was enumerated throughout Australia from October 2011 to March 2012 with funding provided by the Department of Education, Employment and Workplace Relations (DEEWR).

2 PIAAC is an international survey coordinated by the Organisation for Economic Co-operation and Development (OECD). The results from PIAAC will assist in answering questions concerning whether Australians have the literacy skills required for meeting the increasingly complex demands of everyday life and work. PIAAC provides information on skills and competencies for people aged 15-74 years in the three domains of:
  • literacy
  • numeracy
  • problem solving in technology-rich environments (PSTRE).
3 PIAAC is the third survey of international comparisons of adult literacy skills conducted in Australia. Its predecessors were the Adult Literacy and Life Skills Survey (ALLS) 2006 and Survey of Aspects of Literacy (SAL) 1996 (internationally known as the International Adult Literacy Survey (IALS)). PIAAC expands on these previous surveys by assessing skills in the domain of 'problem solving in technology-rich environments' and by asking questions specifically about skill use at work. The literacy and numeracy scores previously released in the ALLS and SAL publications are not comparable with PIAAC data for reasons which are listed in the Comparability of Time Series section below. The remodelled scores from ALLS and SAL are included in additional data cubes to be appended later in 2013.

4 Data from PIAAC, ALLS and SAL are used to inform policy matters including the Council of Australian Governments (COAG) National Agreement for Skills and Workforce Development.
5 To analyse the relationship between the assessed competencies and social and economic well-being, PIAAC collected information on topics including:
  • general demographic information including income
  • participation in education and training activities
  • participation in labour force activities
  • self-perception of literacy, numeracy and information communication technology (ICT) skill use at work and everyday life
  • self-perception of generic skills used at work
  • volunteering, trust and health
  • language background
  • parental background.
6 Preliminary data is presented for the literacy and numeracy skill domains only. Data cubes for final data will be appended to this product later in 2013. A full list of the data items from PIAAC will also be available from the Downloads tab of this publication later in 2013. Users can subscribe to receive Email Notifications to be advised when updates are available for this product. From the attached link, select 4. Social Statistics, sub-topic 42. Education, then select the product 4228.0 Programme for the International Assessment of Adult Competencies.
7 Twenty-four countries participated in the PIAAC survey internationally. This publication contains Australian data only. The OECD proposes to publish international results in October 2013, and these will be available from the OECD website at www.oecd.org.
SCOPE

8 The scope of the survey is restricted to people aged 15 to 74 years who were usual residents of private dwellings and excludes:
  • diplomatic personnel of overseas governments
  • members of non-Australian defence forces (and their dependants) stationed in Australia
  • overseas residents who have not lived in Australia, or do not intend to do so, for a period of 12 months or more
  • people living in very remote areas
  • people living in Census Collection Districts (CDs) which contained Discrete Indigenous Communities.
9 People living in CDs which contained Discrete Indigenous Communities were not enumerated for operational reasons.


COVERAGE

10 Households where all of the residents were less than 18 years of age were excluded from the survey because the initial screening questions needed to be answered by a responsible adult (who was aged 18 years or over).

11 If a child aged 15-17 years was selected, they could be interviewed with the consent of a parent or responsible adult.


DATA COLLECTION

12 Data was collected by trained ABS interviewers who conducted computer-assisted personal interviews.
13 A series of screening questions were asked of a responsible adult in a selected household to determine whether any of the residents of the household were in scope for the survey.

14 One resident of the household, who was in scope, was randomly selected to be interviewed. This respondent was administered a background questionnaire to obtain general information on topics including education and training, employment, income, and skill use in literacy, numeracy and ICT.

15 If a child aged 15-17 years was selected, the parent or responsible adult was asked questions about the household's income.
16 Where language difficulties arose, and if acceptable to the respondent, an interpreter could assist with the background questionnaire; however, the self-enumerated exercise was not completed in these cases.

Self-enumerated exercise

17 After the background questionnaire was completed, the respondent undertook a self-enumerated exercise. This contained tasks to assess their skills in literacy, numeracy or problem solving in technology-rich environments. The exercise tasks were based on activities that adults do in their daily lives, such as following instructions on labels, interpreting charts and graphs, measuring with a ruler, using email, searching the internet and navigating websites. Tasks were set at varying levels of difficulty.

18 The exercise could be completed at a separate time to the background questionnaire.

19 Respondents completed the exercise either on a notebook computer (with a mouse attached) or in paper booklets. All respondents first took a core exercise to assess their capacity to undertake the main exercise. Those who passed the core stage proceeded to the main exercise. Those who failed the core stage were directed to the Reading Components booklet, which was designed to measure basic reading skills. Refer to the appendix titled Pathways through the self-enumerated exercise for further information about this process.

20 All respondents were provided with a pencil, ruler, notepad and calculator to use during the exercise. There were no time limits, and the respondent was not allowed to receive any assistance from others.

21 The role of the interviewer during the self-enumerated exercise was to discreetly monitor the respondent's progress, and to encourage them to complete as many of the tasks as possible.

Observation module

22 When the interview was complete and the interviewer had left the household, the interviewer used the computer to answer a series of questions which collected information about the interview setting such as any events that might have interrupted or distracted the respondent during the exercise.

Scoring the exercise tasks

23 At the completion of the interview, if the respondent had a paper exercise booklet, it was forwarded to the ABS. The core and main exercise booklets were marked by trained scorers, and the responses from the Reading Components booklets were recorded. A sample of the booklets was independently re-scored by a different scorer to assess the accuracy of the scoring.

24 The responses from the computer-based and paper-based exercises were used to calculate scores for each of the skill domains completed by the respondent. The derivation of the scores was performed by Educational Testing Service (ETS) of Princeton, USA (which also performed this task for the ALLS and SAL surveys). Refer to the appendix titled Scores and skill levels for further information about the calculation of the scores.

Score imputation

25 In order to minimise respondent burden, respondents did not complete exercises in all three of the skill domains. Respondents completed exercise tasks in only one or two of these domains, depending on the assessment path they followed. To address this, PIAAC used multiple imputation methodology to obtain proficiency scores for each respondent for the skill domains for which the respondent was not required to do an exercise. Problem solving in technology-rich environments scores were not imputed for respondents who were sequenced to the paper-based Core booklet (i.e. they had no computer experience, they did not agree to do the exercise on the computer, or they did not pass the computer-based Core Stage 1).


SAMPLE DESIGN

26 PIAAC was designed to provide reliable estimates at the national level and for each state and territory.

27 Dwellings included in the survey in each state and territory were selected at random using a multi-stage area sample. This sample included only private dwellings from the geographic areas covered by the survey.

28 The initial sample for PIAAC consisted of 14,442 private dwellings. Of the 11,532 households that remained in the survey after sample loss, 8,446 (73%) were fully responding or provided sufficient detail for scores to be determined.


ESTIMATION METHOD

Weighting

29 Weighting is the process of adjusting results from a sample survey to infer results for the total population. To do this, a 'weight' is allocated to each enumerated person. The weight is a value which indicates how many people in the population are represented by the sample person.

30 The first step in calculating weights for each unit is to assign an initial weight, which is the inverse of the probability of the unit being selected in the survey. For example, if the probability of a person being selected in the survey was 1 in 300, then the person would have an initial weight of 300 (that is, they represent 300 people).
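The arithmetic in paragraph 30 amounts to taking a reciprocal. The sketch below (Python is used purely for illustration; it is not ABS production code) shows the calculation:

  # Initial design weight: the inverse of the selection probability
  # (paragraph 30).
  def initial_weight(selection_probability):
      if not 0 < selection_probability <= 1:
          raise ValueError("selection probability must be in (0, 1]")
      return 1.0 / selection_probability

  # A person selected with probability 1 in 300 represents 300 people.
  print(initial_weight(1 / 300))  # 300.0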

Non-response adjustment
31 Non-response adjustments were made to the initial person-level weights with the aim of representing those people in the population who did not respond to PIAAC. Two adjustment factors were applied (a simplified sketch follows the list):
  • a literacy-related non-response adjustment, which was aimed at ensuring survey estimates represented those people in the population who had a literacy or language related problem and could not respond to the survey (these people cannot be represented by survey respondents because their reason for not completing the survey is directly related to the survey outcome; however, they are part of the PIAAC target population)
  • a non-literacy-related non-response adjustment, which was aimed at ensuring survey estimates represented those people in the population who did not have a literacy or language related problem but did not respond to the survey for some other reason.
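As a simplified sketch of the adjustment just described, the following assumes a basic weighting-class approach in which respondent weights within each class are inflated to also represent that class's non-respondents. The class labels and data layout are invented for illustration; the actual PIAAC adjustment applied the two distinct factors listed above.

  # Illustrative weighting-class non-response adjustment: within each class,
  # respondent weights are scaled up by (total weight in class) /
  # (responding weight in class).
  from collections import defaultdict

  def nonresponse_adjust(units):
      # units: list of dicts with keys 'class', 'weight' and 'responded'.
      total = defaultdict(float)
      responding = defaultdict(float)
      for u in units:
          total[u["class"]] += u["weight"]
          if u["responded"]:
              responding[u["class"]] += u["weight"]
      return {i: u["weight"] * total[u["class"]] / responding[u["class"]]
              for i, u in enumerate(units) if u["responded"]}

  units = [
      {"class": "A", "weight": 300.0, "responded": True},
      {"class": "A", "weight": 300.0, "responded": False},
      {"class": "A", "weight": 300.0, "responded": True},
  ]
  print(nonresponse_adjust(units))  # {0: 450.0, 2: 450.0}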
Population benchmarks

32 After the non-response adjustment, the weights were adjusted to align with independent estimates of the population, referred to as 'benchmarks', in designated categories of sex by age by state by area of usual residence. This process is known as calibration. Weights calibrated against population benchmarks ensure that the survey estimates conform to the independently estimated distributions of the population described by the benchmarks, rather than to the distribution within the sample itself. Calibration to population benchmarks helps to compensate for over-enumeration or under-enumeration of particular categories of people which may occur due to either the random nature of sampling or non-response.
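The simplest form of calibration is post-stratification to a single benchmark dimension, sketched below. The actual PIAAC calibration aligned weights to several benchmark categories simultaneously; the cell labels and population counts here are invented for illustration.

  # Illustrative post-stratification: scale weights within each benchmark
  # cell so the weighted totals match the known population counts.
  def poststratify(weights, cells, benchmarks):
      cell_totals = {}
      for w, c in zip(weights, cells):
          cell_totals[c] = cell_totals.get(c, 0.0) + w
      return [w * benchmarks[c] / cell_totals[c]
              for w, c in zip(weights, cells)]

  weights = [450.0, 450.0, 400.0]
  cells = ["males 15-24", "males 15-24", "females 15-24"]
  benchmarks = {"males 15-24": 1000.0, "females 15-24": 380.0}
  print(poststratify(weights, cells, benchmarks))
  # [500.0, 500.0, 380.0] - weighted totals now match the benchmarks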

33 The survey was calibrated to the in-scope estimated resident population (ERP).

34 Further analysis was undertaken to ascertain whether benchmark variables, in addition to geography, age and sex, should be incorporated into the weighting strategy. Analysis showed that including only these variables in the weighting approach did not adequately compensate for undercoverage in the PIAAC sample for variables such as highest educational attainment and labour force status, when compared to other ABS surveys. As these variables were considered to have a possible association with adult literacy, additional benchmarks were incorporated into the weighting process.

35 The benchmarks used in the calibration of final weights for PIAAC were:
  • state by highest educational attainment
  • state by sex by age by labour force status
  • state by part of state by age by sex.
36 The education and labour force benchmarks were obtained from other ABS survey data. These benchmarks are considered 'pseudo-benchmarks' as they are not demographic counts and they have a non-negligible level of sample error associated with them. The 2011 Survey of Education and Work (people aged 16-64 years) was used to provide a pseudo-benchmark for educational attainment. The monthly Labour Force Survey (aggregated data from November 2011 to March 2012) provided the pseudo-benchmark for labour force status. The sample error associated with these pseudo-benchmarks was incorporated into the standard error estimation.

37 The process of weighting ensures that the survey estimates conform to person benchmarks by state, part of state, age and sex. These benchmarks are produced from estimates of the resident population derived independently of the survey. Therefore, the PIAAC estimates do not (and are not intended to) match estimates for the total Australian resident population (which include people and households living in non-private dwellings, such as hotels and boarding houses, and in very remote parts of Australia) obtained from other sources.

Estimation

38 Survey estimates of counts of people are obtained by summing the weights of people with the characteristic of interest.
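Paragraph 38 amounts to a single weighted sum, as in this illustrative sketch (invented values):

  # Estimated count = sum of the weights of sample people with the
  # characteristic of interest.
  people = [
      {"weight": 500.0, "at_level_1": True},
      {"weight": 380.0, "at_level_1": False},
      {"weight": 420.0, "at_level_1": True},
  ]
  print(sum(p["weight"] for p in people if p["at_level_1"]))  # 920.0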

39 Note that although the literacy-related non-respondent records (154 people) were given a weight, plausible values were not generated for this population. Estimates included in this publication do not include the literacy-related non-respondent population.


RELIABILITY OF ESTIMATES

40 All sample surveys are subject to error which can be broadly categorised as either sampling error or non-sampling error.

41 Sampling error is the difference between the published estimates, derived from a sample of people, and the value that would have been produced if all people in scope of the survey had been included.

42 Non-sampling error may occur in any collection, whether it is based on a sample or a full count such as a census. Sources of non-sampling error include non-response, errors in reporting by respondents or recording answers by interviewers, and errors in coding and processing data. Every effort was made to reduce the non-sampling error by careful design and testing of the questionnaire, training and supervision of interviewers, follow-up of respondents, and extensive editing and quality control procedures at all stages of data processing.

43 In contrast to most other ABS surveys, the PIAAC estimates also include significant imputation variability, due to the use of multiple possible assessment modules and the complex literacy scaling procedures. The effect of the plausible scoring methodology on the estimation can be reliably estimated and is included in the calculated RSEs. This is covered in more detail in the Data quality (Technical Note).
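As a rough illustration of how imputation variability can enter an error estimate, the sketch below combines sampling variance and between-plausible-value variance using the standard multiple-imputation rule. The published RSEs were produced by the ABS's own replicate-weight and plausible-value procedures, not by this simplified formula, and the numbers are invented.

  # Combine within-imputation (sampling) variance and between-imputation
  # variance for an estimate based on M plausible values, then express the
  # result as a relative standard error (per cent).
  def total_rse(estimates, sampling_variances):
      m = len(estimates)
      mean_est = sum(estimates) / m
      within = sum(sampling_variances) / m
      between = sum((e - mean_est) ** 2 for e in estimates) / (m - 1)
      variance = within + (1 + 1 / m) * between
      return 100 * variance ** 0.5 / mean_est

  estimates = [920.0, 905.0, 930.0, 915.0, 910.0]
  sampling_variances = [400.0, 420.0, 390.0, 410.0, 405.0]
  print(round(total_rse(estimates, sampling_variances), 2))  # about 2.48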

SEASONAL FACTORS

44 The estimates are based on information collected from October 2011 to March 2012, and due to seasonal factors they may not be representative of other time periods in the year. For example, employment is subject to seasonal variation through the year. Therefore, the PIAAC results for employment could have differed if the survey had been conducted over the whole year or in a different part of the year.


DATA QUALITY

45 Information recorded in this survey is essentially 'as reported' by respondents and hence may differ from that which might be obtained from other sources or via other methodologies. This factor should be considered when interpreting the estimates in this publication.

46 Information was collected on the respondents' perception of various topics such as their employment status, health status, skill use and aspects of their job. Perceptions are influenced by a number of factors and can change quickly. Care should therefore be taken when analysing or interpreting this data.

47 For each competency, proficiency is measured on a scale ranging from 0 to 500 points. To facilitate analysis, these continuous scores have been grouped into five skill levels for the literacy and numeracy skill domains, and three skill levels for the problem solving in technology-rich environments skill domain, with Level 1 being the lowest measured level. The relatively small proportion of respondents assessed at Level 5 in the literacy and numeracy skill domains often results in unreliable estimates of the number of people at this level. For this reason, whenever results are presented by skill level, Levels 4 and 5 are combined.

Missing values

48 For a number of PIAAC data items, some respondents were unwilling or unable to provide the required information. When this occurred, the missing response was recorded as either 'don't know', 'refused' or 'not stated or inferred'. These categories are not explicitly shown in the publication tables, but have been included in the totals with footnotes provided to note these inclusions. Proportions shown in the tables are based on totals which include these categories.

49 Listed below are data items where responses coded to the 'don't know' or 'refused' category were higher than 1%:

  • 'Current work earnings from wage or salary - annual gross pay' data item, 2.5% of people (approximately 424,000) had responses of 'don't know' or 'refused'
  • 'Current work earnings from business - last financial year' data item, 1.4% of people (approximately 234,000) had responses of 'don't know' or 'refused'
  • 'Level of highest educational qualification of mother or female guardian (ISCED)', 8.1% of people (1.4 million) had responses of 'don't know' or 'refused'
  • 'Level of highest educational qualification of father or male guardian (ISCED)', 9.7% of people (1.6 million) had responses of 'don't know' or 'refused'.
50 Aside from the items listed above, the proportions of responses of 'don't know' or 'refused' did not exceed 1% for any other data item, with the vast majority being less than 0.5%.

LEVEL OF EDUCATION

Level of highest educational attainment (ASCED)

51 Level of highest educational attainment was derived from information on highest year of school completed and level of highest non-school qualification. The derivation process determines which of the 'non-school' or 'school' attainments will be regarded as the highest. Usually the higher ranking attainment is self-evident, but in some cases some secondary education is regarded, for the purposes of obtaining a single measure, as higher than some certificate level attainments.

52 The following decision table is used to determine which of the responses to questions on highest year of school completed (coded to ASCED Broad Level 6) and level of highest non-school qualification (coded to ASCED Broad Level 5) is regarded as the highest. It is emphasised that this table was designed for the purpose of obtaining a single value for level of highest educational attainment and is not intended to convey any other ordinality.

Diagram: Highest attainment decision table


53 The decision table is also used to rank the information provided in a survey about the qualifications and attainments of a single individual. It does not represent any basis for comparison between differing qualifications. For example, a respondent whose highest year of school completed was Year 12, and whose level of highest non-school qualification was a Certificate III, would have those responses crosschecked on the decision table and would, as a result, have their level of highest educational attainment output as Certificate III. However, if the same respondent answered 'certificate' to the highest non-school qualification question, without any further detail, it would be crosschecked against Year 12 on the decision table as Certificate not further defined. The output would then be Year 12. The decision table, therefore, does not necessarily imply that one qualification is 'higher' than another. For more details, see Education Variables, 2002 (cat. no. 1246.0).

54 Once the ASCED coding was complete, a concordance was applied to obtain the data item 'Level of highest qualification completed - ISCED', which is an international standard classification of education.
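The two worked examples in paragraph 53 can be expressed as a lookup against the decision table, as sketched below. The two entries shown are taken from the examples above and form a hypothetical fragment only, not a reproduction of the full table published in Education Variables, 2002 (cat. no. 1246.0).

  # Hypothetical two-entry fragment of the attainment decision table:
  # (highest year of school, highest non-school qualification) -> output.
  DECISION_TABLE = {
      ("Year 12", "Certificate III"): "Certificate III",
      ("Year 12", "Certificate n.f.d."): "Year 12",
  }

  def highest_attainment(school, non_school):
      return DECISION_TABLE[(school, non_school)]

  print(highest_attainment("Year 12", "Certificate III"))     # Certificate III
  print(highest_attainment("Year 12", "Certificate n.f.d."))  # Year 12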

Current study level (ASCED) and Incomplete study level (ASCED)


55 Level of education of current study was derived using the decision table displayed above, taking into account level of education of school study in current year and level of education of non-school study in current year for people who are undertaking concurrent qualifications.

56 Once the ASCED coding was complete, a concordance was applied to obtain the data items 'Level of qualification currently studying for - ISCED' and 'Level of incomplete qualification - ISCED'.


LABOUR FORCE STATUS

57 The labour force status data presented in the preliminary tables of this publication use a concept of labour force defined for the international PIAAC survey. Additional tables to be appended to this publication later in 2013 contain labour force data which is more closely aligned with the Australian definitions used in the ABS Labour Force Survey. The definitions of the 'Employed' category in the international and Australian data items are essentially the same. However, there is a subtle difference in the concept of 'Unemployed', which in turn affects the estimates for 'Out of labour force'; the two definitions are set out, and contrasted in a short sketch, below.

Unemployed - International data item definition

58 People aged 15-74 years who were not employed, and:
  • had actively looked for full-time or part-time work at any time in the four weeks up to the end of the reference week and were available for work within 2 weeks, or
  • were due to start a job within 3 months and could have started within 2 weeks had the job been available then.
Unemployed - Australian data item definition

59 People aged 15-74 years who were not employed, were available for work in the reference week, and at any time in the four weeks up to the end of the reference week:
  • had actively looked for full-time or part-time work, or
  • were waiting to start a new job.
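The contrast between the two definitions can be sketched as follows. The field names are hypothetical (the real derivations use the full questionnaire logic); the example shows a person counted as unemployed under the international definition but not under the Australian one, because they were not available for work in the reference week itself.

  # Illustrative contrast of the two 'unemployed' definitions
  # (paragraphs 58 and 59); field names are invented.
  def unemployed_international(p):
      return (not p["employed"]) and (
          (p["looked_last_4_weeks"] and p["available_within_2_weeks"])
          or (p["starting_within_3_months"] and p["could_start_within_2_weeks"]))

  def unemployed_australian(p):
      return (not p["employed"]) and p["available_in_reference_week"] and (
          p["looked_last_4_weeks"] or p["waiting_to_start_new_job"])

  person = {
      "employed": False,
      "looked_last_4_weeks": True,
      "available_within_2_weeks": True,
      "available_in_reference_week": False,
      "starting_within_3_months": False,
      "could_start_within_2_weeks": False,
      "waiting_to_start_new_job": False,
  }
  print(unemployed_international(person), unemployed_australian(person))
  # True False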
DATA COMPARABILITY

Comparability of time series

60 As noted above (paragraph 3), data previously released in the ALLS and SAL publications are not directly comparable with PIAAC data. The reasons for this are:

  • The literacy and numeracy scores previously published for ALLS and SAL have been remodelled to make them consistent with PIAAC. These scores were originally based on a model with a response probability (RP) value of 0.8 but are now based on a model with an RP value of 0.67. The latter value was used in PIAAC to achieve consistency with the OECD survey Programme for International Student Assessment (PISA), in the description of what it means to be performing at a particular level of proficiency. The new RP value does not affect the score that was calculated for a respondent. However, it does affect the interpretation of the score. Therefore, users of this data should refer to the new skill level descriptions provided in the appendix Scores and skill levels when performing time-series comparisons.
  • The prose and document literacy scales from ALLS and SAL have been combined to produce a single literacy scale which is comparable to the PIAAC literacy scale.
  • The numeracy scores from ALLS have been recalculated using a model that incorporates the results of all countries that participated in ALLS. (The previous model was based only on countries that participated in the first round of ALLS.) This has resulted in some minor changes to the ALLS numeracy scores. SAL did not collect a numeracy domain which is comparable with ALLS and PIAAC.
61 These remodelled literacy scores (from ALLS and SAL) and numeracy scores (from ALLS) are included in additional data cubes to be appended later in 2013.

62 The problem solving in technology-rich environments competency is a new addition in PIAAC and is not comparable to the problem solving scale derived in ALLS.

63 PIAAC was not designed to assess health literacy, preventing any comparison with ALLS on that skill domain.

64 To ensure comparability with the previous surveys, 60% of the literacy and numeracy tasks used in the PIAAC exercise were previously used in the ALLS and SAL surveys. However, in PIAAC most respondents completed the exercises on a computer (70%) rather than in a paper-based exercise (30%). In ALLS and SAL, all respondents completed paper-based exercises.

65 PIAAC includes new questions for respondents who were employed or had recent work experience about:
  • the frequency of use of a number of generic skills used in the workplace, including communication, presentation and team-working skills
  • skill practices at work, specifically reading, writing, mathematics and ICT skill activities, which are considered important drivers of skills acquisition; these questions are designed to complement what is being measured in the exercise.
66 For each respondent in PIAAC, ten plausible values (scores) were generated for the domains measured (whereas for ALLS and SAL only five plausible values were generated). While simple population estimates for any domain can be produced by choosing at random only one of the ten plausible values, this publication uses an average of the ten values. For example, to report an estimate of the total number of people at Level 1 for literacy, the weighted estimate of the number of respondents at Level 1 was calculated for each of the ten plausible values individually. The ten weighted estimates were then summed, and the result divided by ten to obtain the estimate of the total number of people at Level 1 for literacy. The process was repeated for each skill level. Refer to the appendix titled Scores and skill levels for further information about the calculation of estimates using all ten plausible values in combination. A simplified sketch of this computation follows the list below.

67 Changes to the scope and coverage of PIAAC from ALLS and SAL are:
  • overseas residents who have lived in Australia, or intend to do so, for a period of 12 months or more are included in the scope for PIAAC and ALLS, but were excluded for SAL
  • people living in Collection Districts which contain Discrete Indigenous Communities were excluded from the scope of PIAAC, but were included for ALLS and SAL if they were not in a very remote area
  • households where all of the residents were less than 18 years of age were excluded from PIAAC coverage, but were included in ALLS and SAL.
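As flagged in paragraph 66, the plausible-value averaging can be sketched as follows (the data layout and values are invented for illustration):

  # Weighted count of people at a given skill level, computed separately for
  # each of the ten plausible values and then averaged (paragraph 66).
  def estimate_at_level(people, level, n_pv=10):
      # Each person carries 'weight' and 'pv': the skill level implied by
      # each of their n_pv plausible values.
      per_pv = [sum(p["weight"] for p in people if p["pv"][k] == level)
                for k in range(n_pv)]
      return sum(per_pv) / n_pv

  people = [
      {"weight": 500.0, "pv": [1] * 8 + [2] * 2},  # Level 1 in 8 of 10 draws
      {"weight": 400.0, "pv": [2] * 10},           # Level 2 in all draws
  ]
  print(estimate_at_level(people, level=1))  # 400.0 (= 8 x 500 / 10)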
68 The full and part literacy non-response records (154 people) were weighted but not given plausible scores for PIAAC. Other part non-response records (3 people) were weighted and given plausible scores for PIAAC. However, similar records were treated as non-respondents for ALLS and SAL.

Comparability with other ABS surveys

69 PIAAC collected data across a range of topics, some of which have been included in previous ABS surveys. Where possible, question modules from existing surveys were used in the PIAAC questionnaire to facilitate comparison with other surveys. However, given PIAAC is part of an international survey, there was a requirement to use internationally developed question modules to ensure the results are comparable with data from other countries involved in the survey.

70 Additionally, PIAAC is a sample survey and its results are subject to sampling error. As such, PIAAC results may differ from other sample surveys, which are also subject to sampling error. Users should take account of the RSEs on PIAAC estimates and those of other survey estimates where comparisons are made.

71 Differences in PIAAC estimates, when compared with the estimates of other surveys, may also result from:

  • differences in scope and/or coverage
  • different reference periods reflecting seasonal variations
  • non-seasonal events that may have impacted on one period but not another
  • underlying trends in the phenomena being measured.
72 Finally, differences can occur as a result of using different collection methodologies. This is often evident in comparisons of similar data items reported from different ABS collections where, after taking account of definition and scope differences and sampling error, residual differences remain. These differences often relate to the mode of collection, such as whether data is collected by an interviewer or self-enumerated by the respondent, and whether the data is collected from the respondent themselves or from a proxy respondent. Differences may also result from the context in which questions are asked, i.e. where in the interview the questions are asked and the nature of preceding questions. The impacts of different collection methodologies on data are difficult to quantify; as a result, every effort is made to minimise such differences.

CLASSIFICATIONS

73 Country of birth data are classified according to the Standard Australian Classification of Countries (SACC), Second Edition, 2008 (cat. no. 1269.0).

74 Geography data (State/territory) are classified according to the Australian Statistical Geography Standard (ASGS): Volume 1 - Main Structure and Greater Capital City Statistical Areas, July 2011 (cat. no. 1270.0.55.001).

75 Languages data are classified according to the Australian Standard Classification of Languages (ASCL), 2005-06 (cat. no. 1267.0). The survey questionnaire listed the 10 most frequently reported languages first spoken at home. Interviewers were instructed to mark the appropriate box, or if the reported language was not among those listed, to record the name of the language for subsequent coding.

76 Education data are classified according to the Australian Standard Classification of Education (ASCED), 2001 (cat. no. 1272.0). Coding was based on the level and field of education as reported by respondents and recorded by interviewers. From the ASCED coding, the level of education was also classified according to the International Standard Classification of Education (ISCED), 1997. For an example of a broad level concordance between these two classifications, see Australian Standard Classification of Education (ASCED), 2001 (cat. no. 1272.0).

77 Occupation data are classified according to the ANZSCO - Australian and New Zealand Standard Classification of Occupations, First Edition, 2006 (cat. no. 1220.0). From the ANZSCO coding, occupation was also classified according to the International Standard Classification of Occupations (ISCO), 2008.

78 Industry data are classified according to the Australian and New Zealand Standard Industrial Classification (ANZSIC), 2006 (Revision 1.0) (cat. no. 1292.0). From the ANZSIC, industry was also classified according to the International Standard Industrial Classification of All Economic Activities (ISIC), Rev.4, 2008.


PRODUCTS AND SERVICES

Data cubes

79 A data cube (spreadsheet) containing tables produced for this publication is available from the Downloads tab of the publication. Estimates, proportions and the corresponding relative standard errors (RSEs) are presented for each table.

80 Additional data cubes containing final data are to be appended to this product later in 2013. Users can subscribe to receive Email Notifications to be advised when updates are available for this product. From the attached link, select 4. Social Statistics, sub-topic 42. Education, then select the product 4228.0 Programme for the International Assessment of Adult Competencies.

Microdata

81 For users who wish to undertake more detailed analysis of the survey data, a basic confidentialised unit record data file (CURF) is available on CD-ROM from Microdata: Programme for the International Assessment of Adult Competencies (PIAAC), 2011-2012 (cat. no. 4228.0.30.001).

82 Additional microdata products are proposed to be released later in 2013. Users can subscribe to receive Email Notifications to be advised when updates are available. From the attached link, select 4. Social Statistics, sub-topic 42. Education, then select the product 4228.0.30.001 Microdata: Programme for the International Assessment of Adult Competencies.

83 Further information about microdata is available from the Microdata Entry Page on the ABS web site.


ACKNOWLEDGEMENTS

84 ABS publications draw extensively on information provided freely by individuals, businesses, governments and other organisations. Their continued cooperation is very much appreciated; without it, the wide range of statistics published by the ABS would not be available. Information received by the ABS is treated in strict confidence as required by the Census and Statistics Act 1905.


NEXT SURVEY

85 The OECD proposes to conduct the PIAAC survey internationally every ten years. The next PIAAC survey is therefore proposed to be conducted in 2021.


RELATED PUBLICATIONS

86 Refer to the Related Information tab for other ABS publications which may be of interest.

87 The OECD proposes to publish international results in October 2013, and these will be available from the OECD website at www.oecd.org.

88 The OECD publication titled 'Literacy, Numeracy and Problem Solving in Technology-Rich Environments - Framework for the OECD Survey of Adult Skills' provides further information about the PIAAC survey. This publication, as well as further background and conceptual information about the PIAAC survey, is available from the OECD website at www.oecd.org.

89 The Education and Training Topics @ a Glance page contains a wealth of information and useful references. This site can be accessed through the ABS website.