Growing Up With Media: Methodological Details

March 2013


This is the first in a series of 7 bulletins summarizing the methodology and findings of the Growing up with Media (GuwM) Study. GuwM is a longitudinal survey of 1,586 youth aged 10-15 years at baseline. Data were collected initially between August and September 2006, again between November 2007 and January 2008, and finally between August and November 2008. The survey protocol was reviewed and approved by the Centers for Disease Control and Prevention Institutional Review Board (IRB).

Introduction

Youth violence is a significant public health issue that negatively affects individuals, families, and communities.1, 2 According to the Centers for Disease Control and Prevention, youth violence encompasses a wide range of behaviors.3 While some behaviors can cause more emotional harm than physical harm (e.g., bullying, slapping), others can result in serious injury or death (e.g., assault with a deadly weapon, robbery). Estimated combined direct and indirect costs of youth violence in the United States exceed $158 billion every year.1 Research on the development of aggression and violence indicates that such behavior arises via a confluence of factors, including individual (e.g., genetics), family (e.g., poor parent-child relationships), school (e.g., poor academic performance), peer, and community (e.g., neighborhood violence) characteristics.4 Although exposure to media violence is not the single cause of youth violence, it probably is, as the American Academy of Pediatrics asserts, “the single most easily remediable contributing factor”.5 After an exhaustive review of available research, Anderson and colleagues concluded that “the scientific debate over whether media violence increases aggression and violence is essentially over” and underscored the need for studies identifying the “magnitude of media-violence effects on the most severe types of violence”.6 Research compiled over the last 50 years points to the conclusion that exposure to TV violence is one of many factors contributing to violent behavior among young people.7-9

Research also suggests links between violent video games and aggressive behavior.10, 11 The learning techniques used to teach and reinforce game behaviors are the same as those educators use to teach and reinforce positive academic lessons. Graphic content centered on killing other players and characters immerses the player in a world of death, blood, and violence. The graphics in today’s video games are also much more realistic than those used in earlier studies. Indeed, a meta-analysis found that more recently conducted studies report a larger correlation between playing violent video games and aggressive behavior than earlier research does.11 With rapid increases in access to these newer, more realistic games, there is a need to continue examining the link between violence and video games, including access to these games through the Internet.

Gaps remain in our understanding of the relationship between media violence and aggressive behavior despite the prolific research activity that has taken place.5, 12-17 First, despite substantial evidence of short-term effects of media violence on arousal, thoughts, and emotions,2, 7, 18 little research has considered the long-term link with seriously violent or criminal behavior,2, 19-21 particularly among youth (as opposed to adults). Second, adult studies suggest that men who frequently use pornography, especially violent pornography, may be more likely to perpetrate sexual aggression.22-24 How the findings of adult studies translate to child and adolescent perpetration of sexually aggressive behavior is largely unknown. In a review of the literature, Benedek and Brown conclude that additional research is necessary and that “perhaps the most ethical and safe route to studying the effects on children of exposure to televised pornography is epidemiological studies”.25 Third, the influence of exposure to violence in new media, particularly the Internet and cell phones, is largely unknown.

To address these issues, Growing up with Media, a national longitudinal study of 1,586 children, was funded through a cooperative agreement with the Centers for Disease Control and Prevention (U49 CE 000206). The aim of the study was to measure youth exposures to violent media and the subsequent expression of seriously violent and aggressive behavior.

Objectives

The primary objective of this study is to prospectively assess the role of violent media in youths’ involvement in violent behavior. More specifically, the research objectives as funded by the CDC are:

  • Objective 1. To examine the association between exposure to violent media and serious violent behavior, including victimization and perpetration resulting in injury.
  • Objective 2. To assess specific aspects of media (i.e., type and content) that are likely to contribute to risk for violence.
  • Objective 3. To identify individual and contextual factors that mediate or moderate the association between exposure to violent media and serious violent behavior, with particular attention to the potential moderating effects of gender and prior exposure to real-life violence.

Study Team

The GuwM longitudinal study is a collaborative effort. Dr. Michele Ybarra at Internet Solutions for Kids was the Principal Investigator. Drs. Philip Leaf and Marie Diener-West at the Johns Hopkins Bloomberg School of Public Health were co-investigators. Dr. Merle Hamburger at the Centers for Disease Control and Prevention (CDC) was the Project Officer. Dr. Dana Markow at Harris Interactive was the Project Manager.

Internet Solutions for Kids had primary responsibility for questionnaire design. Dr. Michele Ybarra, Dr. Philip Leaf, Dr. Marie Diener-West, and Dr. Merle Hamburger worked with Harris Interactive Inc. to provide support and guidance in crafting the final questionnaire. The Harris team was led by Dr. Dana Markow, Vice-President, Youth & Education Research. The data were collected by Harris Interactive.

Survey Development Procedures

Prior to the recruitment for the longitudinal survey, focus groups and a pilot survey were conducted to further develop the final survey instrument.

Focus groups
The focus groups were conducted in May 2005 in Rochester, New York. Two focus groups were conducted, one with boys and the other with girls, with 6 participants in each group. Participants were in grades 7-8 (13-14 years old) and were required to use the Internet weekly for something other than email (i.e., at least 2 hours in the past week).

Each focus group lasted about 1 ½ hours. Participants received $40 for their time.

The two goals of the focus groups were to:

  • Confirm the language of items that are potentially trend-specific (e.g., names of current violent computer games)
  • Explore issues related to logistics and panel retention (e.g., what reasons youth find compelling to participate in a project like this)

For each media content category, focus group participants provided the names of currently popular games. For example, the “Shooting games” category listed games such as Grand Theft Auto, Halo, and Death Rites 2 to help youth correctly categorize the types of games they play. It also became clear that some youth were completely unaware of certain types of violent websites (e.g., “death sites”). Based upon these results, the survey instrument was modified to provide several relevant examples for each media content category, and a third response option was added to this survey section: “No, I don’t know what this is” (as opposed to “No, I know what this is but I have never been to one”).

There was some concern (particularly among girls) that parents would not give them privacy to answer questions when taking the survey. Boys reported being less concerned about privacy; one specific reason given was that their parents already knew of any fights they had been involved in. Based on this feedback, the survey instrument was also modified to include questions for child participants about where they are completing the survey, whether they are alone in the room, and whether a parent is watching them complete the survey.

When exploring issues related to panel retention and reasons for youth to participate in a project like this, focus group participants said it would depend on the incentive: a monetary incentive was preferred over a gift card. Nonetheless, for practical reasons related to sending money through the mail, all youth participants received gift cards.

Pilot survey

We also conducted a pilot survey with 100 randomly identified households in April 2006. Households were identified by using random digit dialing (RDD), a sampling technique where computer-generated telephone numbers are called at random.26
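
To illustrate the general technique, the sketch below generates candidate telephone numbers by appending random four-digit suffixes to seed area code and exchange prefixes. The prefixes shown are hypothetical, and the actual sampling frame and vendor procedures are not described in this bulletin.

import random

def generate_rdd_numbers(seed_prefixes, n, rng=None):
    """Generate n distinct phone numbers for random digit dialing by
    appending random 4-digit suffixes to area code + exchange prefixes."""
    rng = rng or random.Random()
    numbers = set()
    while len(numbers) < n:
        prefix = rng.choice(seed_prefixes)   # e.g., "585-256" (hypothetical)
        suffix = rng.randrange(0, 10_000)    # random last four digits
        numbers.add(f"{prefix}-{suffix:04d}")
    return sorted(numbers)

# Hypothetical seed prefixes; a real frame would come from a telephone database.
print(generate_rdd_numbers(["585-256", "585-271", "212-555"], 5, random.Random(7)))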

The three goals of the pilot survey were to:

  1. Test the survey instrument
  2. Test the recruitment and survey completion methodology
  3. Confirm the hypothesized prevalence rates of eligible households

Data were collected only once for the pilot, with no follow-up data collection. Procedures for participating households included random digit dialing of households, online log-in, consent of the adult, subsequent online assent from the child, and completion of the surveys online. Participants received $25 for their time.

Response rate:
For the pilot survey, the response rate from the random digit dialing of households was 29%, based on the following calculation, where e is the estimated proportion of cases of unknown eligibility that are in fact eligible (a worked sketch follows the disposition table below):

RR = Interviews / [Interviews + (Refusals + Non-contacts + Other) + e(Unknown eligibility)]

RDD telephone sample disposition
14,095 Total Sample
103 Interview
103 Recruit (Qualified, agreed to online portion)
45 Eligible, Non-Interview
9 Qualified household, refused use of demographics or broke off at this point
36 Qualified household, refused online portion
8066 Unknown eligibility
5881 Not eligible
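
As a worked illustration, the sketch below computes the response rate from the disposition counts above. The bulletin does not state how e was estimated; the sketch assumes the common convention of estimating e from the observed eligibility rate among cases of known eligibility, which lands close to the reported 29%.

# Disposition counts from the table above.
interviews = 103
eligible_non_interview = 45
unknown_eligibility = 8066
not_eligible = 5881

# Assumption (not stated in the bulletin): estimate e as the observed
# eligibility rate among cases whose eligibility status is known.
known = interviews + eligible_non_interview + not_eligible
e = (interviews + eligible_non_interview) / known            # ~0.025

rr = interviews / (interviews + eligible_non_interview + e * unknown_eligibility)
print(f"e = {e:.3f}, RR = {rr:.1%}")                         # RR ~ 29.8%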


Of the 103 participants recruited through random digit dialing, 34% completed both the parent and child online surveys. Based on these results, we deemed random digit dialing an infeasible method for recruiting our targeted population of participants.

Pilot online survey disposition

  Total recruits                                            103
    Online completes (parent and child)                      35
    Partial completes (parents only)                          2
    Online refusals – parent (Q1000)                          2
    Online refusals – child (Q1500)                           0
    Online suspends                                          10
    Refused during telephone reminder (includes “no time”)   18
    No response                                              36

This was a pivotal point in the research project: with such a low response rate observed, the planned sampling methodology shifted from RDD recruitment to online recruitment through the Harris Poll Online (HPOL) panel.

Longitudinal Survey

Data source sampling method

The sample was obtained from the Harris Poll Online (HPOL) opt-in panel of millions of respondents. Online sampling and data collection were used to minimize costs and maximize the confidentiality of the information provided by respondents. Web-based data collection has been found to elicit greater self-disclosure on sensitive topics than telephone surveys.27, 28 Invitations for this study were emailed to a stratified random sample drawn from the Harris Poll Online database, initially targeting U.S. adults with a child under the age of 18 in the household. As quotas began to fill, the sample was then targeted to parents who have a child between the ages of 10 and 15.
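
The sketch below illustrates this kind of stratified, quota-driven invitation draw: panelists are grouped into strata, shuffled within each stratum, and invited until each stratum's quota fills. The strata and quota sizes shown are hypothetical, not the study's actual sampling plan.

import random
from collections import defaultdict

def draw_invitations(panel, quotas, rng):
    """Invite a stratified random sample: shuffle panelists within each
    stratum and invite members until that stratum's quota is filled."""
    by_stratum = defaultdict(list)
    for member in panel:
        by_stratum[member["stratum"]].append(member)
    invited = []
    for stratum, quota in quotas.items():
        candidates = by_stratum.get(stratum, [])
        rng.shuffle(candidates)
        invited.extend(candidates[:quota])
    return invited

rng = random.Random(1)
panel = [{"id": i, "stratum": rng.choice(["Northeast", "Midwest", "South", "West"])}
         for i in range(10_000)]
quotas = {"Northeast": 50, "Midwest": 60, "South": 80, "West": 60}  # hypothetical
print(len(draw_invitations(panel, quotas, rng)))  # 250 invitations sent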

The HPOL panel has been recruited through hundreds of sources using diverse recruitment methods in order to minimize selection bias, including:

  • Co-Registration Offers on Partner Websites
  • Targeted Emails Sent by Online Partners to their Audience
  • Graphical and Text Banner Placements on Partner Websites
  • Refer-a-Friend Program
  • Client Supplied Sample Opt-Ins
  • Trade Show Presentations
  • Targeted Postal Mail Invitations
  • TV Advertisements
  • Telephone Recruitment of Targeted Populations

In Wave 1, a stratified random sample of Harris Interactive’s online panel was invited through password-protected email invitations to participate in a survey about their experiences with various types of media.

Qualified respondents for Wave 1 were defined as:

  • U.S. adults (ages 18 or older)
  • Parents/guardians of a 10 to 15 year old child who lives in the household at least 50% of the time
  • Youth has Internet access somewhere (i.e., at home, another person’s house, school, library, or elsewhere)
  • Youth has accessed the Internet within the past 6 months
  • Respondent is familiar / most familiar with child’s daily activities
  • Parent/guardian and child give their informed consent/ assent to participate in the survey

Using the sample obtained from the HPOL opt-in panel, a U.S. representative sample of 1,591 pairs of parents and their children ages 10 to 15 was surveyed online in Wave 1. Five child-caretaker pairs from Wave 1 were removed because of data quality issues (e.g., duplicate child respondents in the same household), yielding a final sample size of 1,586. Follow-up data collection at Wave 2 and Wave 3 consisted of the parent-child pairs who completed the survey in Wave 1.

Control of the sample and incentives

To maintain the reliability and integrity of the sample, the following procedures were used for each Wave of the survey:

Password protection. Each invitation contained a password-protected link to the survey that was uniquely assigned to that email address. Password protection ensures that a respondent completes the survey only once.

Reminder invitations. To increase the number of respondents in the survey and to improve overall response rates at Wave 1, up to six reminder invitations were emailed after the initial invitation to those respondents who had not yet participated in the survey.

“Instant Results” of selected survey findings. To increase the number of respondents in the survey and to improve overall response rates, respondents were able to access results to pre-determined, selected questions after completing the survey.

Cash incentives. To increase the number of respondents in the survey and to improve overall response rates, parents were offered a $10 cash incentive and children a $15 Target gift card for completing each of the Wave 1 and Wave 2 surveys; parents were offered a $20 cash incentive and children a $25 Target gift card for completing the Wave 3 survey.

Telephone calls. To increase the number of respondents in the survey and to improve overall response rates, telephone calls were made to respondents who could not be reached by email (invalid address, email bounced back, etc.) or who did not complete the survey after the email reminders were sent.

Mailing. A few weeks after field start, a letter containing the URL link to the survey and password was sent to those respondents for whom a valid email address or phone number was unavailable or who had not yet completed the survey.

HIpoints℠.* To increase the number of respondents and to improve overall response rates, adults were awarded HIpoints (Harris Interactive points that can be exchanged through Harris for select merchandise) at Wave 1.


* Between Wave 1 and subsequent waves some respondents unsubscribed or were removed from Harris Interactive’s online panel. These respondents received only the cash incentive and not the HIpoints or HIstakes incentives.

Data quality control
Interviews were conducted using a self-administered online questionnaire via Harris’ proprietary, web-assisted interviewing software. Online questionnaires are programmed into the system with the following checks:

  1. Question and response series
  2. Skip patterns
  3. Question rotation
  4. Range checks
  5. Mathematical checks
  6. Consistency checks
  7. Special edit procedures

For questions with pre-coded responses, the system only permits answers within a specified range; for example, if a question has three possible answer choices (“Agree,” “Disagree,” “Not Sure”), the system will only accept coded responses to these choices.
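
A minimal sketch of the kind of range and consistency checks described above follows; it is an illustration in the spirit of these checks, not Harris Interactive's proprietary interviewing software, and the consistency threshold shown is hypothetical.

# Range check for a pre-coded question with three answer choices.
VALID_CODES = {1: "Agree", 2: "Disagree", 3: "Not sure"}

def range_check(code):
    """Accept only coded responses within the specified range."""
    if code not in VALID_CODES:
        raise ValueError(f"Code {code} out of range; valid: {sorted(VALID_CODES)}")
    return VALID_CODES[code]

def consistency_check(age, grade):
    """Flag logically inconsistent answer pairs, e.g., a 10-year-old
    reporting grade 12 (the plausible-grade window is hypothetical)."""
    if not (age - 7 <= grade <= age - 4):
        return f"Inconsistent: age {age} with grade {grade}"
    return None

print(range_check(2))               # "Disagree"
print(consistency_check(10, 12))    # flagged as inconsistent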

Data collection procedures

  • Wave 1 was conducted between August 24 – September 14, 2006.

Panelists were emailed survey invitations beginning on August 24, 2006. Seven reminders were sent to participants who had not yet participated in the survey. Reminders were sent across 3.5 weeks, the first being sent 3 days after the initial invitation.

After respondents’ eligibility for the survey was determined at Wave 1, respondents were given a brief description of the research, which also referenced the two additional surveys to be conducted at Wave 2 and Wave 3, as well as the incentive amount for completing each survey. Before continuing with the main survey, parents and their children were individually asked to read this consent/assent form and indicate their willingness to participate in the survey.

  • Wave 2 was conducted between November 2, 2007 – January 10, 2008.

Panelists were emailed survey invitations beginning on November 2, 2007. One reminder was sent 2 days after the initial invitation to those who had not yet participated in the survey.

Participants in the Wave 1 survey were contacted via an email invitation and asked to complete the second wave of the study.

Screening was conducted at the beginning of the survey to confirm that the appropriate respondents participated. Respondents entered their age and gender at the start of the survey, and their entries were compared with those collected in Wave 1. Respondents’ age in Wave 2 had to be within 2 years of the age entered in Wave 1 in order to enter the survey. In a few instances, further follow-up was needed to clarify some respondents’ age or gender.

  • Wave 3 was conducted between August 29 – November 26, 2008.

Panelists were emailed survey invitations beginning on August 29, 2008. One reminder was sent 2 days after the initial invitation to those who had not yet participated in the survey.

Participants in the Wave 1 survey were contacted via an email invitation and asked to complete the third wave of the study. Similar to Wave 2, screening was conducted at the beginning of the survey to confirm that the appropriate respondents participated.

Respondents entered their age and gender at the start of the survey. Respondents’ age in Wave 3 had to be within 3 years of the age entered in Wave 1 in order to enter the survey, as sketched below. In a few instances, further follow-up was needed to clarify some respondents’ age or gender.
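
The follow-up screening rules amount to a simple tolerance check against Wave 1 data. The sketch below encodes the tolerances stated above (within 2 years at Wave 2, within 3 years at Wave 3); the record structure and function name are hypothetical.

# Age tolerance, in years, relative to the age reported at Wave 1.
AGE_TOLERANCE = {2: 2, 3: 3}

def passes_screening(wave, age_now, gender_now, wave1_record):
    """Admit a returning respondent only if the reported age is within the
    wave's tolerance of the Wave 1 age and the gender entries match;
    mismatches are routed to manual follow-up rather than auto-rejected."""
    age_ok = abs(age_now - wave1_record["age"]) <= AGE_TOLERANCE[wave]
    gender_ok = gender_now == wave1_record["gender"]
    return age_ok and gender_ok

wave1 = {"age": 12, "gender": "F"}
print(passes_screening(2, 13, "F", wave1))  # True: aged one year since Wave 1
print(passes_screening(3, 18, "F", wave1))  # False: flagged for follow-up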

Across waves, respondents were asked to enter their contact information. These data were captured and stored in a separate survey instrument to ensure that personally identifiable information was not directly linked to survey responses.

Panel maintenance

In order to ensure the highest possible retention rate of Wave 1 participants, Harris Interactive engaged in several efforts during the intervening period between the Wave 1 and Wave 2 surveys (September 2006 – October 2007) and between the Wave 2 and Wave 3 surveys (January 2008 – August 2008). Wave 1 parent participants were contacted several times throughout the year to remind them of the study and allow them to update their contact information.

Mailing. After both Wave 1 and Wave 2, three mailings were sent to parents who participated.

  • Mailing #1 consisted of a thank you letter, 5”x7” participation certificate (color), and prepaid postcard to update any address/email/phone changes.
  • Mailing #2 included a letter reminding participants of the upcoming survey and a prepaid postcard to update any address/email/phone changes.
  • Mailing #3 consisted of an email alert reminding participants of the upcoming survey and providing an email address and phone number through which they could update any address/email/phone changes.**

                 Between Wave 1 & Wave 2     Between Wave 2 & Wave 3
  Mailing #1     January 2007                February 2008
  Mailing #2     June 2007                   May 2008
  Mailing #3     July 2007                   July 2008

** In Wave 2, respondents received an additional email informing them of a delay in the planned start of the survey.

Opportunities to update contact information. In addition to the prepaid postcard, respondents were also given the opportunity to update their contact information via a toll-free 800 number and an email address. Other inquiries by respondents were addressed by project staff at Harris Interactive during the interim period between surveys.
