Changing Bodies, Changing Minds, Changing Worlds


Chapter 4. Quantitative Methodology

This chapter describes the methodological approach used to quantitatively assess activity, challenge, distress, and challenge management in cyberspace among a range of NZ young people. The chapter begins by describing the pilot study. It then outlines the main study, beginning with a description of the sample, including recruitment, response rates, and the demographic details of participants. The next section describes the procedures for the main study, and the final section describes the questionnaire and its measures. Note that Chapter 5 outlines the analytic approach for these quantitative data.

The Pilot

Recruitment

The participants for the pilot phase of the research were recruited via friendship networks of a family known to my supervisor. Participants received the same information sheets, parent information sheets, and consent forms as participants in the main study (see below).

Participants

The eight participants who piloted the survey comprised three young women and five young men. The women were aged 13, 15, and 16 years, and the men 13, 14, 14, 15, and 15 years. All identified as New Zealand Europeans.

Procedures

After construction and online testing of the pilot survey, participants were assembled at a private home after school hours to pilot the survey. Once participants had arrived and returned signed parental consent forms, I reiterated key points from the participant information sheets, emphasising that participation was voluntary and that they were not obliged to continue, or to answer any questions, if they did not want to. As no participants withdrew at this point, I went on to explain that the pilot’s purpose was to ensure that young people of different ages could understand the survey questions, and to see how long the survey took to complete. I emphasised that the draft survey might include items requiring further work. I then asked participants to fill out the survey and to write down the numbers of any questions that were confusing, did not make sense to them, or could be interpreted in more than one way. I noted the time when participants began the survey and asked them to indicate when they had finished, so that completion times could be recorded. Upon completion of the survey, and discussion of any problematic items, participants were invited to choose a pizza from the local pizza parlour (a small token of thanks for their participation). While participants waited for the pizza, and then ate dinner, I individually discussed with them any wording issues that had come up in the pilot. After dinner I followed up with the relevant participants to check that my suggested rewording was appropriate.
Following this process, three questions were reworded slightly, a response option that appeared six times throughout the survey was changed, the response options for one question were reduced to remove unnecessary redundancy, and one, now redundant, question was removed. The original surveys took between 18 and 47 minutes to complete, depending on young people’s experiences of cyberspace and their age (participants reporting more cyber-experience, and/or younger participants, required more time). The eight pilot surveys were not included in the final sample.

The Main Sample

Recruitment

Participants in the main study were recruited from five high schools around NZ (see Table 4). Participating schools were identified by me, or by colleagues and friends, and were selected to ensure that a diverse range of communities, and therefore a diverse range of participants, would be sampled. Table 4 demonstrates that these schools were diverse, sampling participants from around the country, as well as from metropolitan, non-metropolitan, co-educational, and single-sex schools with a range of decile ratings (4–10). Following University of Auckland Human Participants Ethics Committee (UAHPEC) approval for this phase of the research, I approached contacts at each school and invited them to ask their principal to consider the research. Following agreement, I talked with principals and sent a letter outlining the research (Appendix G) and a consent form for school participation (Appendix H). Participating schools then disseminated information sheets for students (e.g., Appendix I), information sheets for parents (e.g., Appendix J), and consent forms (e.g., Appendix J). Consent forms needed to be returned to enable a student’s participation. Note that, while these appendices represent the original research templates, schools were able to specify various aspects of how they would run the survey (discussed below).

Response rates

Each school determined how many students would be approached for participation. Schools also decided what (if any) UAHPEC-approved token reward (e.g., a ‘healthy’ biscuit, a ‘mufti’ day, a ‘bbq’ meal) would be provided to students who returned a signed consent form (regardless of whether the form allowed or denied their actual participation). In order to stimulate participation, schools could select the desired consent-form return rate that would result in students receiving the reward (a rate of 75% was suggested). However, all participating schools requested (and received) student rewards, even when this rate was not achieved. Table 4 describes the response rates for each school. Only Schools A and C administered the survey to the whole school; the other schools administered it to a subset of the school. For instance, School B requested 1,800 surveys, School D ran the survey with five Year 9 and 10 classes (an assumed sample size of approximately 135), and School E requested 300 surveys for 10 classes (two classes each of Year 9, 10, 11, 12, and 13 students). Where the actual population of students in these schools did not match the number of surveys disseminated, the response rate is an approximation based on the number of surveys requested by the school (these figures may therefore slightly underestimate response rates). Thus, the overall response rate was approximately 58.5% (N = 1,821). After incomplete or suspicious surveys (e.g., where most answers had been selected systematically [n = 148]) were removed, the final sample size was 1,673, representing an approximate response rate of 53.7%.
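The following is a minimal sketch of the response-rate arithmetic described above. The total number of surveys disseminated is not reported directly, so the denominator below is an assumption inferred from the approximate 58.5% rate and is included for illustration only.

```python
# Sketch of the response-rate arithmetic above. The denominator is an
# assumption inferred from the reported ~58.5% rate, not a figure from
# the thesis itself.
returned = 1821                 # surveys returned (N)
removed = 148                   # incomplete or 'suspicious' surveys excluded
final_n = returned - removed    # 1,673 usable surveys

disseminated_approx = 3113      # assumed: implied by 1,821 / 0.585

overall_rate = returned / disseminated_approx   # ~58.5%
usable_rate = final_n / disseminated_approx     # ~53.7%
print(f"overall: {overall_rate:.1%}, usable: {usable_rate:.1%}")
```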

Participants

Table 4 demonstrates that participants from School B made up 58.5% of the sample. This proportion, combined with the 416 participants from Schools A and C, meant that the majority of participants (83.3%) were recruited from mid-decile (decile 4–6) schools. Mid-decile schools may include a range of socio-economic backgrounds, including a number of students from lower socio-economic backgrounds. Thus, while no low-decile (1–3) schools participated, it is likely that a number of lower socio-economic status students were sampled.
A gender skew was present in the data, with 62.3% of participants who provided gender information (n = 1,668) reporting that they were female (n = 1,039). Participants’ ages ranged from 12 to 19 years, with a mean of 15.3 (SD = 1.44). As discussed in Chapter 5, dichotomous age categories of “younger” (12 to 14 years; n = 553) and “older” (15 to 19 years; n = 1,100) students were constructed. A 2×2 Pearson chi-square analysis conducted with SPSS (Version 18) found no significant difference in the gender proportions of these two age categories (χ²(1) = .018, p = .89), with gender proportions differing by only 0.4% across the age groups.
Participants were able to select any number of ethnicity descriptions (and/or describe “Other” ethnicities; see Appendix L), and 8.5% (n = 141) selected more than one ethnicity description. The largest group in the sample (38.9%, n = 651) identified as ‘NZ European or Pākehā’, followed by ‘Asian’ (23.0%, n = 385), ‘Indian’ (19.4%, n = 324), ‘Other Ethnicity’ (10.1%, n = 169), ‘Pasifika’ (9.1%, n = 152), ‘Māori’ (4.6%, n = 77), and ‘Other European’ (4.4%, n = 74). These figures indicate an ethnically diverse sample that under-samples census proportions of Māori (14.6%) and NZ European or Pākehā (77.6%) young people, whilst over-sampling ‘Asian’ (9.2%) and ‘Other’ (0.9%) young people (Ministry of Social Development). While these proportions do not necessarily mirror NZ society, they nonetheless reflect the diversity that this sampling framework sought to achieve (though, due to small sample sizes, particular caution is advised when considering inferences about Māori participants).
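As a minimal sketch (not the original SPSS syntax), the 2×2 Pearson chi-square reported above could be reproduced along the following lines; the cell counts shown are illustrative placeholders rather than the thesis data.

```python
from scipy.stats import chi2_contingency

# Rows: age group ("younger" 12-14, "older" 15-19); columns: gender (female, male).
# These counts are illustrative placeholders, not the thesis data.
observed = [
    [345, 208],  # hypothetical "younger" counts
    [694, 414],  # hypothetical "older" counts
]

# correction=False returns the uncorrected Pearson statistic, matching the
# "Pearson Chi-Square" row that SPSS reports for a 2x2 table.
chi2, p, dof, expected = chi2_contingency(observed, correction=False)
print(f"chi2({dof}) = {chi2:.3f}, p = {p:.3f}")
```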

Procedures

Recognising the importance of school staff for data collection, I travelled to each participating school and spoke at staff meetings about the study (as well as about young people’s use of cyberspace). Four of these staff meetings included a standard NetSafe presentation about young people’s experiences of challenge in cyberspace, and staff members were encouraged to attend by the provision of a free morning tea. During these presentations I also summarised the research aims and procedures, introduced the survey liaison person at the school, and answered any questions about the research. In these meetings I also described how the school had decided to run the survey and clarified teachers’ roles in that process. Schools A, D, and E asked teachers to disseminate information sheets and consent forms during form time, while Schools B and C disseminated information sheets and consent forms with their school newsletter. All schools asked form teachers to return students’ signed consent forms to the school liaison person. The school liaison person then collated the list of authorised students and returned it to teachers so that only consented students could participate in the research. Schools A and C also asked me to introduce the research to students at school assemblies.
The data collection procedure also differed between schools. Three schools decided to administer the survey through a department common to most students (e.g., English), while two administered it through a period common to all students (e.g., Sustained Silent Reading). Schools A, C, and D decided to use the online survey, and Schools B and E used pen-and-paper versions of the survey (Appendix M). While the survey was initially piloted and run via Survey Monkey, a website used to produce and host online surveys, School B’s computer network problems prevented use of the online survey; School B therefore delayed data collection for one week while I produced and printed paper versions of the survey. Upon completion of data collection, signed consent forms and surveys (where applicable) were couriered to me for secure storage and data entry. Data collection was staggered according to the times preferred by the various schools and took place from the third week of August 2007 until the third week of March 2008. Other than School E, all schools requested, and received, tailored reports on the data collected from their school. In two instances, I met in person with school staff to discuss these findings.


Survey Instrument and Measures

The questionnaire was explicitly based on the results and language of the focus group phase of the research.
Unless otherwise noted, the language and examples used in the following questions draw from the focus group data. The 62 main items in the survey aimed to assess activity, challenge, distressing challenge, and the management of distressing challenges in cyberspace. The items were identical in the online and paper surveys; however, the skip logic of the online survey (which automatically skipped redundant later questions based on responses to previous questions) was not part of the paper surveys. Instead, the paper surveys included instructions telling participants which sections they could skip, depending on their answer to a question (e.g., the survey did not seek details about the management of challenges that a participant had not experienced). Unless otherwise noted, all questions in the survey were answered by mouse-clicking check boxes or by ticking boxes with a pen. Due to low response rates, and/or because some items were deemed unimportant for the final research questions, 17 main items were not included in the analysis. This discussion addresses only those items that were used in the following quantitative analysis.
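As a hypothetical illustration (the actual Survey Monkey configuration is not reproduced here), the skip logic described above can be thought of as a simple branching rule: management items are presented only when the corresponding challenge has been reported.

```python
# Hypothetical sketch of the branching rule the online survey applied
# automatically (and that the paper survey conveyed with written skip
# instructions). Section names are illustrative only.
def next_section(experienced_challenge: bool) -> str:
    """Return the section to present next, given a challenge-screening answer."""
    if experienced_challenge:
        return "challenge_management_items"  # ask how the challenge was managed
    return "next_challenge_screen"           # skip the now-redundant items

print(next_section(True))   # -> challenge_management_items
print(next_section(False))  # -> next_challenge_screen
```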

Demographics

Three items assessed participant age (with options ranging from 12 to 19+ years), gender (female or male), and the term(s) used to describe participant ethnicity (e.g., see Section 4.1.2 above).

Activity

The activity items explored the range and frequency of activity in cyberspace. In order to collect responses within a defined period, whilst providing a sampling frame long enough to capture potentially rare events, participants were asked about their activity in cyberspace in the “past year (in the 12 months up till today)”. This sampling frame was also used for participants’ experiences of challenge. Other research, such as the Growing up with Media survey (Ybarra, Diener-West, & Leaf, 2007) and the YISS2 (Wolak et al., 2007b), has also used this time frame. Frequencies of such activity (and of most challenges) were measured with six response options: “No”; “Yes, everyday or nearly every day”; “Yes, two or three times a week”; “Yes, once or twice a month”; “Yes, one time every few months”; and “Yes, this happened only once in the year”. The use of these response options, and this time frame, across the survey facilitated comparison across and between the activity and challenge items. A very similar response framework was used in the Growing up with Media survey (Ybarra, Diener-West, et al., 2007, p. S44).
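For illustration, these response options could be coded as ordinal values for analysis along the following lines; the numeric codes below are an assumption made for this sketch and are not taken from the thesis.

```python
# Assumed ordinal coding of the verbatim frequency response options;
# the codes themselves are illustrative, not the thesis coding scheme.
FREQUENCY_CODES = {
    "No": 0,
    "Yes, this happened only once in the year": 1,
    "Yes, one time every few months": 2,
    "Yes, once or twice a month": 3,
    "Yes, two or three times a week": 4,
    "Yes, everyday or nearly every day": 5,
}

def code_response(label: str) -> int:
    """Map a verbatim response option to its ordinal frequency code."""
    return FREQUENCY_CODES[label]

print(code_response("Yes, once or twice a month"))  # -> 3
```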

Communicating on mobile phones and/or the Internet

The first main item consisted of two questions. The first collected information about whether participants had “communicated on mobile phones/cell phones (e.g., talking, texting, txting, pxting to others)”, and the second, whether they had “used the Internet to communicate online (e.g., messaging, chatting, commenting, in-game chat, talking, webcam, emailing, or posting messages to others, etc.)”. The range of examples in these (and subsequent) questions aimed to help participants recognise a range of potentially relevant experiences. This particular item assessed communication on mobile phones separately from Internet communication. This reflected findings from the focus groups, which suggested that the modality of communication was experienced differently by some young people (e.g., some young people reported that communication on mobile phones produced more time-management problems for them than communication on the Internet).

Researching information

The second main item assessed other cyber-activities and included nine sub-questions. These activities, thought to be less modality-sensitive than the communication activity, were assessed across both “the Internet and mobile phones”. Such activities included those identified in the first phase of the research, such as researching information, publishing content, banking, making new friends, finding a girlfriend/boyfriend, et cetera. The researching information activity was described as: “surf the Web for information (e.g., getting information on things that interest you and/or information for school projects)”. The wording used the terms surf, information, and school projects to highlight the variety of information (including leisure topics) that young people may research. The “things that interest you” clause aimed to include hobbies, as well as the research of other information (e.g., health-related research; see Ybarra & Suman, 2008, for recent USA survey results on this). The explanation around this item was broad, to encourage participants to identify more than homework-related research.

Publishing content

This item assessed whether students had used mobile phones or the Internet to put their own “personal stuff (like personal pictures, videos, art work, music, etc.) online (like on Bebo, YouTube, etc.)”. This question included a large number of examples to highlight not only the various types of content (e.g., not just photos of themselves and friends) that young people may place online, but also the various places where such content may be hosted (e.g., on social networking sites, on video-sharing sites, etc.).

Banking

Banking was included as an activity in the survey because it acted both as an indicator of the use of cyberspace to develop and participate in economic activity, and as a measure of information risk should a participant’s computer security be compromised. The example “looking at your bank account information online” was included in the survey as a basic example and precursor of other electronic banking activities (e.g., fund transfers).

Making new friends

This item assessed whether participants had made new friends in cyberspace. The example given was “like making new friends on Bebo or on Instant Messenger (etc.)” who they “had never met in ‘real’ life before”. Within the context of convergence, the wording of this question was written to emphasise the multiple modalities for such friendship-making, as well as to acknowledge that “real” life is not necessarily simply the offline world.

Communicating with new people

In addition to making new friends, the ability of cyberspace to enable communication with new people who may not necessarily be considered friends was assessed with a question that asked whether participants had “ever used the Internet or a mobile phone to chat, message, video, or web-cam (or communicate in another way) with anyone who you hadn’t first met face-to-face”. Additionally, to confirm that these persons were not already known in person by the participant, the following note accompanied the question: “(For example – like communicating with new friends on Bebo or on Instant Messenger (etc.) who you have NEVER met in ‘REAL’ life)”. The various forms of this activity, as well as popular locations for it, were included to highlight the variety of situations that may fit this activity.

Finding new boyfriends and girlfriends

The development of intimate relationships in cyberspace was assessed by asking if participants had got “a NEW girlfriend or boyfriend online or with a mobile”. This question sought to provide data to explicitly highlight the role of cyberspace in the production of new relationships, rather than its potential role in existing relationships (where the communication items at the beginning of this section may be more relevant).

Trading

Like the banking item, this question provided an indication of the degree to which cyberspace played a role in young people’s economic activity. In order to embrace the ability of cyberspace to facilitate the trading and bartering of virtual goods, as well as trading with real money, this question elaborated that such trading could include “buying, selling, or swapping real or virtual things online using Internet shops, trading sites (like TradeMe, etc.) or games (like Habbo Hotel, Runescape, etc.)”.

Gaming

Online gaming differs from offline gaming in its ability to easily facilitate interaction with ‘strangers’. In order to distinguish offline from online gaming, this item assessed “games that use the Internet”, such as online games that “may play inside a web browser (like Runescape, Club Penguin, and other games on Miniclips etc.) or use their own programmes to go online (like World of Warcraft, Second Life, etc.)”, or “some PlayStation, Nintendo, and Xbox games” that also “use the Internet”. These high-profile games, games’ sites, and gaming consoles were given as examples so that participants would be able to recognise similar, though less common, online gaming situations and answer this question accurately.

Consuming media

This question focused on media forms (other than games) that young people reported consuming during the focus group research. The place of popular media within adolescent development has been heavily emphasised in earlier chapters of this thesis. The question asked about listening to music, and whether participants looked at “video clips, movies, and photos (YouTube, iTunes, LimeWire, etc.)”.
