Sample disposition
At Wave 1, 32,524 survey invitations were mailed to panelists. Of these, 9% (2,986) were undeliverable (e.g., the email address was no longer valid).
Wave 1 Invitations | |
32,524 | Total number of panelists sent invitations |
2,986 | Number of panelists whose invitations and/or reminders bounced back |
29,538 | Number of panelists sent invitations, excluding those with at least one bounceback |
At Wave 1, the 6,438 non-qualified respondents were screened out for the following reasons:
- Not qualified due to parent qualifications (age, gender, presence of a child in the household, and familiarity with the child);
- Not qualified due to lack of willingness to participate/consent; and
- Not qualified due to child qualifications (access to the Internet, and consistency with the child for whom the parent pre-qualified).
Wave 1 Completed Interviews | |
9,035 | Total number of respondents |
560 | Total number of suspended interviews (unknown qualification) |
6,438 | Total number of non-qualified respondents |
2,037 | Total number of qualified respondents (GUWM panelists and over-quota respondents) |
1,591 | Total number of qualified respondents (GUWM panelists) |
446 | Total number of over-quota respondents |
5 | Respondents removed due to data quality issues |
1,586 | Final number of Wave 1 participating households |
Response/retention rates
At Wave 1, the initial response rate for the Growing up with Media study was 31%, based on the following calculation***:
Response rate = (# qualified respondents + # non-qualified respondents + # suspended respondents) ÷ (total number of invitations sent − # invitations that bounced back as undeliverable)
*** The response rate was revised from 26% in March 2013 when an error in the initial response rate calculation was identified and corrected.
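As a check on the figures reported above, the components of this calculation can be taken directly from the Wave 1 disposition tables (a minimal sketch; all counts are those reported in this bulletin):

```python
# Wave 1 response rate, using the counts from the disposition tables above.
qualified = 2037       # qualified respondents (panelists + over-quota)
non_qualified = 6438   # respondents screened out
suspended = 560        # suspended interviews, qualification unknown
invitations = 32524    # total invitations mailed
bouncebacks = 2986     # invitations returned as undeliverable

response_rate = (qualified + non_qualified + suspended) / (invitations - bouncebacks)
print(f"{response_rate:.1%}")  # 30.6%, i.e., the 31% reported above after rounding
```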
At Wave 2, the online survey was completed by 1,204 pairs of parents/guardians and their children who had completed the Wave 1 survey. The follow-up rate from Wave 1 was 76%.
Wave 2 Completed Interviews | |
1,586 | Initial number of potential Wave 2 respondents |
1,204 | Respondents who completed the Wave 2 survey |
26 | Suspended interviews (unknown qualification) |
34 | Suspended interviews (qualified respondent) |
9 | Refusals (Parents) |
24 | Non-qualified respondents (whose age/gender did not match those recorded in Wave 1) |
287 | Non-responders |
2 | Respondents without a valid email, phone, or mailing address, who therefore did not receive invitations to Wave 2 |
At Wave 3, the online survey was completed by 1,157 pairs of parents/guardians and their children who had completed the Wave 1 survey. The follow-up rate from Wave 1 was 73%.
Wave 3 Completed Interviews | |
1,577 | Initial number of potential Wave 3 respondents (Wave 1 completers who did not refuse in Wave 2) |
1,157 | Respondents who completed the Wave 3 survey |
16 | Suspended interviews (unknown qualification) |
18 | Suspended interviews (qualified respondent) |
6 | Refusals (Parents) |
3 | Refusals (Youth) |
33 | Non-qualified respondents (whose age/gender did not match those recorded in Wave 1) |
336 | Non-responders |
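The follow-up rates quoted above follow directly from these counts; a quick check against the 1,586 Wave 1 participating households:

```python
# Follow-up rates relative to the 1,586 Wave 1 participating households.
wave1_households = 1586
wave2_completes = 1204
wave3_completes = 1157

print(f"Wave 2: {wave2_completes / wave1_households:.0%}")  # 76%
print(f"Wave 3: {wave3_completes / wave1_households:.0%}")  # 73%
```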
Weighting the data
Data for all waves were weighted to represent the population of U.S. parents of children who at Wave 1 were ages 10-15, had access to the Internet, and had accessed the Internet in the past 6 months. Parents were the target instead of the children because the former were the recruitment source. Variables used in weighting were age, gender, race/ethnicity, region, education, household income, and age/gender of child who took the survey.
The weighting algorithm also included a variable called a propensity score, to account for differences between those who are online versus those who are not, those who join online panels versus those who do not, and those who responded to this particular survey invitation versus those who did not.
In addition, a separate weight variable was calculated to adjust for respondents’ propensity to continue participating in the study after Wave 1, including differences in their propensity to engage in risky behaviors, as well as the demographic and online-propensity differences noted above.
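For readers unfamiliar with this type of adjustment, the sketch below illustrates raking (iterative proportional fitting), a common way of aligning a sample with known population margins. It is illustrative only: the study's weights were produced by Harris Interactive, the propensity-score adjustment is not shown, and the variable names and population targets below are hypothetical.

```python
# Minimal raking (iterative proportional fitting) sketch; illustrative only.
import numpy as np
import pandas as pd

def rake(df, targets, max_iter=50, tol=1e-6):
    """Adjust weights until weighted margins match the population targets.

    df      : DataFrame with one column per weighting variable
    targets : dict mapping column name -> {category: population proportion}
    """
    w = np.ones(len(df))
    for _ in range(max_iter):
        max_change = 0.0
        for col, margins in targets.items():
            vals = df[col].to_numpy()
            total = w.sum()
            for cat, target in margins.items():
                current = w[vals == cat].sum() / total
                if current > 0:
                    factor = target / current
                    w[vals == cat] *= factor
                    max_change = max(max_change, abs(factor - 1.0))
        if max_change < tol:
            break
    return w * len(df) / w.sum()  # scale so weights average to 1.0

# Hypothetical usage with two of the weighting variables named above:
rng = np.random.default_rng(0)
sample = pd.DataFrame({
    "child_sex": rng.choice(["Female", "Male"], 1000, p=[0.45, 0.55]),
    "region": rng.choice(["NE", "MW", "South", "West"], 1000),
})
targets = {
    "child_sex": {"Female": 0.49, "Male": 0.51},
    "region": {"NE": 0.17, "MW": 0.21, "South": 0.37, "West": 0.25},
}
sample["weight"] = rake(sample, targets)
```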
Editing and cleaning the data
The data processing staff at Harris Interactive performs machine edits and additional cleaning for the entire data set. Harris Interactive’s edit programs verify the skip instructions and other data checks written into the survey program. The edit programs list any errors by case and type; these are then resolved by senior Editing and Data Processing personnel, who inspect the original file and make appropriate corrections. Complete records are kept of all such procedures.
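A minimal illustration of this kind of skip-pattern check is shown below; the gate and follow-up item names are hypothetical, not the study's actual questionnaire items.

```python
# Flag cases where a follow-up item has data even though the gate question
# should have skipped the respondent past it (hypothetical item names).
import pandas as pd

def skip_logic_errors(df, gate="uses_internet", follow_up="hours_online"):
    """Return the case IDs that violate the gate -> follow-up skip pattern."""
    should_skip = df[gate] == "No"
    has_answer = df[follow_up].notna()
    return df.index[should_skip & has_answer].tolist()

# Example: the flagged case would then be reviewed and corrected by hand.
cases = pd.DataFrame({
    "uses_internet": ["Yes", "No", "No"],
    "hours_online": [3.0, None, 2.0],
})
print(skip_logic_errors(cases))  # [2]
```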
Birth date questions were added at Wave 2 and Wave 3. Based upon adult responses to these questions, data cleaning indicated that 18 youth were likely 9 years of age, and 12 youth were 16 years of age at Wave 1. To maximize the amount of data and because caregivers did not know the eligibility criteria and, therefore, were unlikely to have misreported their child’s age purposefully, these youth are included in the analyses.
During review of the Wave 4 updated participant contact information, two households were identified as duplicate records. Subsequent follow-up with the participant strongly suggested that the person may have been an adult posing as a child. To be conservative, both records were deleted.
Missing data and “refused” responses were imputed using best-set regression [29], which imputes data based on the best available subset of specified predictors. To reduce the likelihood of imputing truly non-responsive answers, participants were required to have valid data for at least 80% of the survey questions asked of all youth. Five respondents did not meet this criterion and were dropped from the Wave 1 sample for a final number of 1,581; 9 were dropped from the Wave 2 sample for a final number of 1,195; and 7 from the Wave 3 sample for a final number of 1,150.
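The 80% valid-data criterion can be illustrated as follows; the youth item list and refusal code are hypothetical, and the best-set regression imputation itself (run in Stata, per reference 29) is not shown.

```python
# Keep only respondents with usable answers on at least 80% of youth items.
import pandas as pd

def share_valid(row, youth_items, refused_code=-9):
    """Share of youth items with a usable (non-missing, non-refused) answer."""
    answered = sum(pd.notna(row[q]) and row[q] != refused_code for q in youth_items)
    return answered / len(youth_items)

def apply_80pct_rule(df, youth_items):
    """Drop respondents with valid data on fewer than 80% of the youth items."""
    rates = df.apply(share_valid, axis=1, youth_items=youth_items)
    return df[rates >= 0.80].copy()
```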
The final analytical sample
Demographic characteristics of participants are shown in the tables below. To demonstrate the effects of weighting on the data, we present the weighted data first, followed by the unweighted data. As can be seen in the tables below, differential dropout by demographic characteristics was not noted over time.
Youth and Household Demographic Characteristics (weighted) | Wave 1 (n=1,581) % (n) | Wave 2 (n=1,195) % (n) | Wave 3 (n=1,150) % (n) |
Sex | |||
Female | 51.1 (787) | 50.3 (591) | 50.8 (568) |
Male | 48.9 (794) | 49.7 (604) | 49.2 (582) |
Age (mean) | 12.6 | 13.7 | 14.5 |
Race | |||
White | 71.1 (1153) | 73.7 (899) | 72.4 (854) |
Black or African American | 13.7 (217) | 12.8 (147) | 14.1 (150) |
Mixed racial background | 8.6 (114) | 7.5 (80) | 7.9 (83) |
All other | 6.5 (97) | 6.0 (69) | 5.6 (63) |
Hispanic ethnicity | |||
Yes | 18.1 (205) | 16.4 (144) | 16.6 (136) |
No | 81.9 (1376) | 83.6 (1051) | 83.4 (1014) |
Youth and Household Demographic Characteristics (unweighted) | Wave 1: Included in the analytical sample (n=1,581) % (n) | Wave 1: Excluded from the analytical sample (n=5) % (n) | Wave 2: Included in the analytical sample (n=1,195) % (n) | Wave 2: Excluded from the analytical sample (n=9) % (n) | Wave 3: Included in the analytical sample (n=1,150) % (n) | Wave 3: Dropped from the analytical sample (n=7) % (n) |
Sex | ||||||
Female | 49.8 (787) | 60.0 (3) | 49.5 (591) | 77.8 (7) | 49.4 (568) | 57.1 (4) |
Male | 50.2 (794) | 40.0 (2) | 50.5 (604) | 22.2 (2) | 50.6 (582) | 42.9 (3) |
Age (mean) | 12.6 | 12.2 | 13.6 | 13.1 | 14.5 | 12.6 |
Race | ||||||
White | 73.0 (1153) | 20.0 (1) | 75.2 (899) | 33.3 (3) | 74.3 (854) | 28.6 (2) |
Black or African American | 13.7 (217) | 20.0 (1) | 12.3 (147) | 11.1 (1) | 13.0 (150) | 14.3 (1) |
Mixed racial background | 7.2 (114) | 20.0 (1) | 6.7 (80) | 11.1 (1) | 7.2 (83) | 14.3 (1) |
All other | 6.1 (97) | — | 5.8 (69) | — | 5.5 (63) | — |
Decline to answer a | NA | 40.0 (2) | NA | 44.4 (4) | NA | 42.9 (3) |
Hispanic ethnicity | ||||||
Yes | 13.0 (205) | 20.0 (1) | 12.0 (144) | — | 11.8 (136) | — |
No | 87.0 (1376) | 80.0 (4) | 88.0 (1051) | 77.8 (7) | 88.2 (1014) | 71.4 (5) |
Decline to answer a | NA | — | NA | 22.2 (2) | NA | 28.6 (2) |
Limitations of the study
There are several limitations to the study. Participants were randomly sampled from an online panel. While sophisticated sampling techniques are applied to adjust for self-selection into the online panel, and randomization of sampling reduces self-selection into the cohort, the sample may still be unrepresentative of some sub-populations. This is not unique to online panels, however. Technology, including answering machines, caller ID, and cell phone-only households, has greatly reduced the ability to recruit a representative sample using random-digit-dial techniques. School-based surveys reflect only youth who attend school; they do not include home-schooled children (3-5% of our sample, depending on wave) or those who are absent on the day of data collection.
Participants had to be able to read English. Findings may therefore not be representative of households where another language (e.g., Spanish) is the primary language. This is not an inconsequential point given recent shifts in the ethnic composition of U.S. youth noted in Census data.
Data were collected in the summer. Although we asked youth to report on their exposures in the past 12 months, it is possible that rates would have differed had data been collected during the school year.
It should be noted that this study recruited a sample of parents/caretakers and youth who have access to the Internet because exposure to violence online is a primary exposure of interest. While the proportion of adults and children who have access to the Internet is rapidly increasing, children who live with caregivers who access the Internet may differ from those who live with caregivers who do not. This sample may not be representative of households without access to the Internet.
References:
1. Centers for Disease Control and Prevention. Youth Violence Fact Sheet. 2008; http://www.cdc.gov/ncipc/pub-res/YVFactSheet.pdf. Accessed April 23, 2007.
2. U. S. Department of Health and Human Services. Risk Factors for Youth Violence. Youth Violence: A Report of the Surgeon General. Washington, D.C.: U.S. Department of Health and Human Services; 2002.
3. National Center for Injury Prevention and Control. Understanding Youth Violence Fact Sheet. Atlanta, GA: Centers for Disease Control and Prevention; 2008.
4. Huesmann L. Risk Factors for Youth Violence. Youth Violence: A Report of the Surgeon General. New York: Academic Press; 1998.
5. American Academy of Pediatrics. Policy Statement: Media Violence (RE9526); 1995.
6. Anderson CA, Berkowitz L, Donnerstein E, et al. The influence of media violence on youth. Psychological Science in the Public Interest. 2003;4(3):81-110.
7. Bushman B, Huesmann L. Effects of Televised Violence on Aggression. In: Singer D, Singer J, eds. Handbook of Children and the Media. Thousand Oaks, CA: Sage Publications; 2001:223-268.
8. Strasburger V, Wilson B. Media Violence. Children, Adolescents & the Media. Thousand Oaks: Sage Publications; 2002:73-116.
9. Potter W. On Media Violence. Thousand Oaks, CA: Sage Publications; 1999.
10. Anderson CA, Bushman BJ. Effects of Violent Video Games on Aggressive Behavior, Aggressive Cognition, Aggressive Affect, Physiological Arousal, and Prosocial Behavior: A Meta-Analytic Review of the Scientific Literature. Psychological Science. 2001;12:353-359.
11. Gentile DA, Anderson CA. Violent Video Games: The Newest Media Violence Hazard. Media violence and children: A complete guide for parents and professionals. Westport, CT US: Praeger Publishers/Greenwood Publishing Group; 2003:131-152.
12. Funk JB. Video games. Adolescent Medicine: State of the Art Reviews. 1993;4(3):589-598.
13. Montgomery K. Youth and digital media: a policy research agenda. Journal of Adolescent Health. 2000;27(2):61-68.
14. Cantor J. Media violence. Journal of Adolescent Health. 2000;27(2 Suppl):30-34.
15. Hogan M. Media matters for youth health. Journal of Adolescent Health. 2000;27(2 Suppl):73-76.
16. Rich M, Bar-on M. Child health in the information age: Media education of pediatricians. Pediatrics. 2001;107(1):156-162.
17. Willis E, Strasburger VC. Media violence. Pediatric Clinics of North America. 1998;45(2):319-331.
18. Huesmann LR, Moise-Titus J, Podolski C-L, Eron LD. Longitudinal relations between children’s exposure to TV violence and their aggressive and violent behavior in young adulthood: 1977-1992. Developmental Psychology. 2003;39(2):201-221.
19. Browne KD, Hamilton-Giachritsis C. The influence of violent media on children and adolescents: a public-health approach. Lancet. 2005;365(9460):702-710.
20. Olson CK. Media violence research and youth violence data: why do they conflict? Academic Psychiatry. 2004;28(2):144-150.
21. Huesmann LR, Taylor LD. The role of media violence in violent behavior. Annual Review of Public Health. 2006;27:393-415.
22. Linz D, Donnerstein E, Penrod S. The Effects of Multiple Exposures to Filmed Violence Against Women. Journal of Communication. 1984;34(3):130-147.
23. Donnerstein E, Linz D. Mass Media Sexual Violence and Male Viewers: Current Theory and Research. American Behavioral Scientist. 1986.
24. Demare D, Briere J, Lips H. Violent pornography and self-reported likelihood of sexual aggression. Journal of Research in Personality. 1988.
25. Benedek EP, Brown CF. No excuses: televised pornography harms children. Harvard Review of Psychiatry. 1999;7(4):236-240.
26. Schonlau M, Zapert K, Simon LP, et al. A comparison between response rates from a propensity-weighted Web survey and an identical RDD survey. Social Science Computer Review. 2004;22(1):128-138.
27. Joinson A. Causes and implications of disinhibition behaviors on the Internet. In: Gackenbach J, ed. Psychology and the Internet: Intrapersonal, Interpersonal, and Transpersonal Implications. New York, NY: Academic Press; 1998:43-58.
28. Joinson A. Social desirability, anonymity, and Internet-based questionnaires. Behavior Research Methods, Instruments, & Computers. 1999;31(3):433-438.
29. Stata Statistical Software [computer program]. Version 10.1. College Station, TX: Stata Corporation; 2008.
Other Bulletins in this Series:
§ Parent and Youth Media Use Patterns
§ Parent and Youth Reported Household Rules Characteristics
§ Exposure to Violence and Sex in Media
§ Youth Violence Victimization and Perpetration
§ Mental Health and Psychosocial Indicators
Selection of Other Publications:
Ybarra, M., Diener-West, M., Markow, D., Leaf, P., Hamburger, M., & Boxer, P. (2008). Linkages between internet and other media violence with seriously violent behavior by youth. Pediatrics, 122(5), 929-937.
Ybarra, M.L., Espelage, D., & Mitchell, K.J. (2007). The co-occurrence of internet harassment and unwanted sexual solicitation victimization and perpetration: Associations with psychosocial indicators. Journal of Adolescent Health, 41 (6,Suppl), S31-S41.
Mitchell, K.J. & Ybarra, M.L. (2009). Social networking sites: Finding a balance between their risks and benefits. Archives of Pediatrics & Adolescent Medicine, 163(1) 87-89.
Acknowledgements:
The GuwM Study was funded by a Cooperative Agreement with the Centers for Disease Control and Prevention (U49/CE000206; PI: Ybarra). Points of view or opinions in this bulletin are those of the authors and do not necessarily represent the official position or policies of the Centers for Disease Control and Prevention.
We would like to thank the entire Growing up with Media Study team: Internet Solutions for Kids, Harris Interactive, Johns Hopkins Bloomberg School of Public Health, and the CDC, who contributed to the planning and implementation of the study. Finally, we thank the families for their time and willingness to participate in this study.
This bulletin was prepared jointly by (in alphabetical order): Dr. Josephine Korchmaros, Dr. Kimberly Mitchell, Ms. Tonya Prescott, and Dr. Michele Ybarra.
Suggested citation: Center for Innovative Public Health Research (CiPHR). Growing up with Media: Methodological Details. San Clemente, CA: CiPHR.