4906.0.55.003 - Personal Safety Survey, Australia: User Guide, 2012  
Previous ISSUE Released at 11:30 AM (CANBERRA TIME) 13/05/2014   
Data Collection


Personal face to face interviews were conducted with one randomly selected person aged 18 years or over who was a usual resident of the selected household. Interviews were conducted from February to December 2012. Interviews took, on average, around 30 minutes to complete.

While the survey was conducted under the authority of the Census and Statistics Act 1905, participation in the survey was not compulsory.

The Survey Advisory Group provided input in relation to special survey procedures.



Due to the sensitive nature of the information being collected, special procedures were used to ensure the safety of those participating and the reliability of the data provided.

While the standard ABS approach was generally followed, as with all surveys, specific field procedures were also applied, reflecting the sensitive nature and content of the survey. These procedures aimed to maximise response rates and to ensure the safety of both respondents and interviewers. They were also designed to help ensure the confidentiality of responses and the integrity of the data.

In considering the best method of advising respondents they had been selected to participate in the survey, it was decided that similar procedures would be adopted to those used in the PSS 2005 and WSS 1996. Rather than sending an official letter preceding the interview (a Primary Approach Letter), the interviewer would cold-call and explain the nature of the survey when they arrived at the selected household. This procedure was designed to ensure maximum chance of participation, should a respondent have been living in the same household as a perpetrator. Interviewers were given copies of an introductory letter which they could use to provide information about the survey. The letter detailed the official status of the survey, provided a deliberately vague outline of what information would be collected and assured respondents of the confidentiality of data collected. The letter did not detail the sensitive information to be collected.

At this first approach, it was predetermined (through the sample selection process) whether a male or female was to be interviewed at the selected dwelling. Predetermining the gender of the person to be interviewed allowed interviewers to tailor their approach, depending on who answered the door.

To determine whether an interview would be required, a series of screening questions were asked of the person initially answering the door, to determine the number of usual male/female residents aged 18 years and over. From this information the interviewer was able to determine whether the person they were talking to would be selected for interview or if further details of the usual residents would need to be collected to determine who would be selected for interview.
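The doorstep screening decision described above can be sketched in code. This is a purely illustrative outline, not the ABS's actual procedure; the function name, the dictionary of counts and the returned messages are all invented for this example:

```python
def screen_household(assigned_gender, counts):
    """Decide the next doorstep step from the screening answers.

    assigned_gender: 'male' or 'female', predetermined at sample selection.
    counts: reported number of usual residents aged 18 years and over of
            each gender, e.g. {'male': 1, 'female': 2}.
    """
    n = counts.get(assigned_gender, 0)
    if n == 0:
        # No in-scope resident of the assigned gender: no interview here.
        return "no interview required"
    if n == 1:
        # Exactly one in-scope resident: that person is the selected person,
        # so the interviewer can tell at once whether the person at the door
        # is the one to be interviewed.
        return "sole in-scope resident selected"
    # Several candidates: further details of the usual residents are needed
    # before one can be selected for interview.
    return "collect resident details for selection"
```

The sketch shows why the interviewer sometimes needed no further details (a single in-scope resident resolves the selection immediately) and sometimes had to collect a full picture of the household.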

While every effort was made to ask the screening questions (to determine who the selected person would be) and to talk directly to the selected person to gain their cooperation, this was not always possible. Sometimes the household spokesperson refused to answer the screening questions or to provide information about the usual residents, and sometimes the household spokesperson refused on behalf of the selected respondent. In light of these possible blocking points, interviewers were provided with specific training aimed at helping them gain cooperation for the survey, in order to achieve the highest response rates possible for a voluntary survey.

In order to maintain the confidentiality and integrity of the data collected, interviewers were instructed not to approach any dwellings known to them. It was considered that if the interviewer was known to the selected respondent, the respondent might not answer truthfully (as they may not feel comfortable revealing to the interviewer what has happened, or is currently happening, to them). Additionally, if upon collecting household information an interviewer discovered that someone within the household was known to them, they were instructed to return the record to the office for reallocation to another interviewer. To minimise the chances of this happening, workloads were not allocated to interviewers within their local area.

Further, due to the workload allocation processes currently used by the ABS, occasionally an interviewer would be assigned dwellings in the same workload which were within eyesight of one another (unavoidable in cul-de-sacs and blocks of units facing one another). The ABS considered this a risk to both data confidentiality (given that the length of time taken to conduct a PSS interview may indicate the complexity of someone's life experiences to a neighbour) and response rates (given that neighbours may discuss the content of the survey with one another, causing a household who had not yet completed their interview to refuse). Interviewers were instructed to conduct an interview at one dwelling, but any dwellings within eyesight of the first were to be returned to the office for reallocation to either another interviewer, or the same interviewer in a later workload (eg leaving months between interview times).


A specific requirement of the survey was that all interviews were conducted alone in a private setting, ensuring that other members of the household were not aware of the survey content or the responses given. This ensured the complete confidentiality of any information collected and the security of both the respondent and the interviewer, where the respondent may have been living in the same household as a perpetrator. If preferred by the respondent, the option of conducting the interview at an alternate location or by telephone interview was also available.

People were first advised of the general nature of the survey and asked if they wished to proceed with the interview. Less sensitive questions, such as demographic details and general feelings of safety, were asked first. This allowed people to become comfortable with the method of questioning, to build a level of rapport with the interviewer, and to become familiar with the survey content. Once the questions regarding a person's experience of violence were reached, respondents were informed of the sensitive nature of the upcoming questions and their permission to continue with the interview was sought. Only 533 people (a small proportion of records) chose not to proceed at this point in the 2012 PSS.

The questions asked during the interview may have caused emotional distress for some respondents. With this in mind, the ABS provided a card to each respondent which listed phone numbers and/or websites of selected national services/organisations providing sexual assault and domestic violence services, or general counselling information (such as Lifeline and 1800 RESPECT). This card was offered to all people at the conclusion of an interview, irrespective of whether they had reported an incident of violence.

Proxy Interviews

No proxy interviews were conducted. Interpreters or other family members were not used: this was to ensure the safety of those participating (where the respondent may have been living in the same household as a perpetrator) and the reliability of the data provided (where the respondent may not have felt comfortable revealing sensitive information through an interpreter/other family member, who may not have been aware of the respondent's past or current experiences).

To cater for instances where a respondent did not speak English, a small number of interviewers with foreign language skills were trained to conduct PSS interviews; these interviews were mostly conducted over the phone. Where a respondent required the assistance of another person to communicate with the interviewer, and an ABS interviewer who spoke their language was not available, interviews could not be conducted. It is therefore possible that the PSS under-represents people from a non-English speaking background. Similarly, because respondents who required the assistance of another person to communicate could not be interviewed, it is also likely that the PSS under-represents those with a profound or severe communication disability.


Information was collected by specially trained ABS interviewers. Experienced ABS interviewers were provided with detailed instruction manuals about the survey content and the procedures to be followed, and also attended a comprehensive two-day survey training program (for further information, see "Training" below).

To help ensure respondent comfort and well-being, as well as to encourage participation, the ABS used female interviewers for the PSS. It was considered that both men and women would be more likely to feel comfortable revealing sensitive information about their possible experiences of violence to a woman. This was based on collective advice from experts in the field during survey development, was in line with the successful procedures followed for the 2005 PSS, and was also supported by the 2012 PSS Survey Advisory Group. To cater for instances where this might not be the case, the ABS also trained a small number of male interviewers, in case a respondent preferred that their interview be conducted by a male. No requests for a male interviewer were made for the 2012 PSS.

The use of specially trained interviewers ensured that rapport could be established with respondents and that the relevant concepts and definitions could be explained as necessary.


Specialised Personal Safety Survey training was conducted for interviewers to ensure all interviewers used a standard approach.

The training program included sessions to familiarise the interviewers with:
    • the concepts addressed in the survey (definitions);
    • the specialised survey procedures developed for the survey (including sensitive approach methods to maximise response);
    • the Computer Assisted Interview (CAI) instrument;
    • administrative aspects of the survey; and
    • Sensitivity and Awareness Training.

The Sensitivity and Awareness Training session aimed to increase interviewers' awareness of the experiences of survivors of violence, and of their own responses to the topic. It also provided techniques to help interviewers deal with difficult or emotional interviews and to react professionally and appropriately to the topics addressed in the survey. The ABS engaged external consultants specialised in this field to deliver this component of the interviewer training.

A support network was put in place to provide support, stress management and coping strategies for interviewers while they were working on the survey. The main components of the network were access to counsellors, a contact person in the office and the provision of voluntary emotional debriefing sessions at the end of enumeration. Interviewers were strongly encouraged to use this network.


A Computer Assisted Interview (CAI) instrument was used for the 2012 PSS. It contained a household form and a personal questionnaire. The household form collected, from any responsible adult within the household, basic demographic data, such as sex, age, country of birth and details of the relationship between individuals in the household. The instrument then randomly selected an in-scope person of the assigned gender to be interviewed.
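The within-household selection step can be illustrated with a short sketch. This is illustrative only; the real CAI instrument's selection logic is not published, and the roster structure, field names and function name here are assumptions made for the example:

```python
import random

def select_respondent(roster, assigned_gender, rng=None):
    """Randomly select one in-scope usual resident of the assigned gender.

    roster: list of dicts with 'name', 'sex' and 'age' for each usual
            resident, as collected on the household form.
    Returns None when the dwelling has no in-scope person of that gender.
    """
    rng = rng or random.Random()
    # In-scope: aged 18 years or over and of the predetermined gender.
    in_scope = [p for p in roster
                if p["sex"] == assigned_gender and p["age"] >= 18]
    if not in_scope:
        return None
    # Each in-scope person has an equal chance of selection.
    return rng.choice(in_scope)
```

For example, in a household with one adult woman, one adult man and one girl aged 17, an assigned gender of female selects the adult woman with certainty, since she is the only in-scope candidate.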

The survey questionnaire was designed and thoroughly tested according to standard ABS procedures. Factors taken into consideration included:
    • the length and wording of questions;
    • the suitability of response categories;
    • the sensitivity of the subjects and issues to be covered;
    • the ability of people to recall events which occurred in the past;
    • minimising and simplifying instructions;
    • the logical sequence of the instrument;
    • the inclusion of edits; and
    • the length of interviews.

Careful consideration was also given to the structuring of the instrument, so that more sensitive topics were progressively introduced. Information was recorded by interviewers in a number of different ways, such as:
    • Predetermined response categories - This approach was used for recording answers where a limited range of responses was expected, or where the focus of interest was on a particular type or group of responses. Response categories were listed in the survey instrument and were expected to cover all given responses.
    • Running prompt - In these questions, predetermined response categories were read out to the respondent one at a time until the respondent indicated agreement to one or more of the categories (as appropriate to the topic) or until all predetermined categories were exhausted.
    • Prompt cards - Where appropriate, printed lists covering the range of possible answers to the question were shown to the respondent who was asked to select the most relevant response. By listing a set of possible responses (either in the form of a prompt card or running prompt question) the prompt served to clarify the question or to present various alternatives, to refresh the respondent's memory and at the same time assist the respondent to select an appropriate response. Prompt cards were utilised significantly more often in PSS 2012 than in previous surveys as it was determined through testing that this method of answering questions greatly reduced the emotional burden on both the respondent and the interviewer.
    • Open-ended question - A question was placed at the end of the instrument which allowed respondents to raise issues which had not been covered in the preceding questions. It was not coded in the survey processing, unless the comment mentioned an amendment to a response; the information from this question will be assessed for future survey development. An open-ended question was also located within the Emotional Abuse module for respondents to report any emotionally abusive behaviours they had experienced which they believed had not yet been covered in the interview. The information collected from this question was used to code whether or not the respondent had experienced Emotional Abuse (according to the definition used in the PSS), and will also be used for further survey development.
    • Responses for coding - This method was used for family, country of birth, education and qualifications and income questions. Responses were recorded by the interviewer and either automatically coded by the instrument or subsequently coded by office staff. For further detail regarding this coding see Data processing.

Testing of the questionnaire

As with all ABS surveys, the questionnaire was tested using experienced ABS interviewers and applying the procedures and methods planned for the final survey. A Pilot Test was conducted in Victoria during October and November 2010. Targeted interviews with known victims of violence were conducted at crisis support centres as part of the Pilot Test. This phase included cognitive testing, in which a series of questions was used to probe respondents on question meaning and comprehension, and to gather information on potential areas of content development. The main purpose was to ensure that the content of the survey was effectively tested on people who had experienced violence, as well as to obtain feedback about their reactions to the survey's content. A Dress Rehearsal was conducted across NSW, Victoria and QLD from April to June 2011.

The broad aims of the testing program were to:
    • test new and modified survey content, particularly the new content for emotional abuse;
    • ascertain respondent reactions and identify any sensitivities associated with the survey content;
    • assess the introduction of additional prompt cards and revised response categories to ensure they were appropriate;
    • test operational aspects of the survey instrument; and
    • assess the suitability of modified field procedures and the comprehensiveness of overall survey procedures and documentation.

As a result of the testing program, the survey instrument was progressively improved and the methodology and survey procedures refined.

To ensure consistency of approach, interviewers were instructed to ask the interview questions precisely as they appeared in the instrument.
