Most educators and scholars rely on the methodological and analytical content they received during their doctoral training to build and sustain their research careers. Thus, when designing new investigations, faculty members often rely on the design, measurement, and analytical techniques they learned during their formative training. In the past, this strategy worked well for most investigators in social work and other social sciences. In recent years, however, an increase in the sophistication of quantitative techniques and a proliferation of the software used to analyze complex data mean that not only content but also research techniques mastered during doctoral training become outdated more quickly than in the past. In fact, methodological and analytical approaches to social research, which evolve constantly, render old methods less useful or even inappropriate in some cases. Therefore, keeping pace with relevant advances in quantitative methodology and analysis is critical to competing successfully for external funding and advancing knowledge about individual and social problems.
DESIGN AND ANALYTICAL ADVANCES
Many new and sophisticated design, measurement, and analysis strategies have been introduced in social research in the past decade. For example, recent methodological and analytical developments have made it possible to overcome challenges such as attrition in efficacy and effectiveness trials. Once an unfamiliar and uncertain process, multiple imputation for handling missing data has become common practice in longitudinal studies (Schafer & Graham, 2002). Such strategies are particularly common today in randomized trials assessing the efficacy or effectiveness of prevention programs or social interventions (Graham, Olchowski, & Gilreath, 2007). Similarly, innovative methods for dealing with selection effects, such as propensity score analyses, have been introduced in intervention research (Fraser, 2004). Yet another example is evident in advances in estimating statistical power, a thorny problem that has been made less complicated by specialized software programs (Raudenbush & Bryk, 2002).
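The logic of multiple imputation can be sketched briefly: impute the missing values several times, analyze each completed data set, and pool the results. The sketch below uses scikit-learn's IterativeImputer as one convenient stand-in for a full multiple-imputation routine; the data, the 15% missingness rate, and m = 5 imputations are invented for illustration.

```python
# Illustrative sketch of multiple imputation in the spirit of
# Schafer & Graham (2002); data and settings are invented.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
X[:, 2] += 0.5 * X[:, 0]                 # make the variables correlated
X_miss = X.copy()
mask = rng.random(X.shape) < 0.15        # ~15% values missing at random
X_miss[mask] = np.nan

m = 5                                    # number of imputed data sets
estimates = []
for i in range(m):
    # sample_posterior=True draws imputations with noise, so the m
    # completed data sets differ -- the key idea of multiple imputation
    imputer = IterativeImputer(sample_posterior=True, random_state=i)
    X_imp = imputer.fit_transform(X_miss)
    estimates.append(X_imp[:, 2].mean())  # analyze each completed data set

pooled = float(np.mean(estimates))        # pool the point estimates
```

In practice one would also pool the standard errors across imputations (Rubin's rules) rather than only the point estimates.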
Achieving mastery of new analytical techniques may be challenging for experienced investigators who did not receive training in similar approaches. For instance, once the exception, statistical methods known variously as multilevel, hierarchical, or mixed models are now used frequently to answer increasingly complicated research questions about the onset and persistence of selected behaviors and outcomes. Similarly, an increase in group-randomized trials (GRTs) has led to numerous analytical advances necessary to analyze data collected in school and community prevention experiments (for example, Jenson & Dieterich, 2007; Jenson, Dieterich, Rinner, Washington, & Burgoyne, 2006). That is, when randomization is done at the group level, treatment effects must be analyzed at the group level. For example, in a study in which schools are randomly assigned to treatment and control conditions, the denominator degrees of freedom for a test of the difference between the means of the control and experimental schools are based not on the number of students but on the number of schools in the study. Thus, appropriate statistical models in a GRT partition random and fixed effects at the student, classroom, and school levels. It is important to note that the use of such models has a significant effect on the sample size and power requirements of a study.
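The sample size consequence noted above can be made concrete with the standard design-effect formula for clustered designs, DEFF = 1 + (m - 1) × ICC. The numbers below (40 schools, 100 students per school, an intraclass correlation of .02) are invented for illustration, not drawn from any study discussed here.

```python
# Illustrative sketch: why clustering students within schools
# inflates sample size requirements in a GRT.
def design_effect(cluster_size: float, icc: float) -> float:
    """DEFF = 1 + (m - 1) * ICC, for average cluster size m."""
    return 1.0 + (cluster_size - 1.0) * icc

# Hypothetical design: 40 schools, 100 students each, ICC = .02.
n_schools, students_per_school, icc = 40, 100, 0.02
deff = design_effect(students_per_school, icc)            # 2.98
effective_n = n_schools * students_per_school / deff      # ~1342

# Denominator df for a two-arm school-level comparison depends on
# the number of schools, not the number of students.
df_denominator = n_schools - 2                            # 38
```

Even a small ICC of .02 nearly triples the variance here: 4,000 students carry roughly the information of about 1,342 independent observations.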
This brief discussion barely scratches the surface of recent developments in quantitative methodology. Additional issues and recommendations pertaining to design, sampling, power, and analysis are found in the recent social work literature (for example, Fraser, 2004; Jenson et al., 2006; LeCroy & Krysik, 2007; Nash, Kupper, & Fraser, 2004). Readers may also wish to examine important papers from other disciplines that summarize new approaches to preventing or handling complex methodological and analytical issues in social research (for example, Murray, Varnell, & Blitstein, 2004; Muthén, 2002; Raudenbush & Bryk, 2002; Singer & Willett, 2003).
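Propensity score analysis, mentioned above as a remedy for selection effects, can also be sketched briefly. The version below is 1:1 nearest-neighbor matching on an estimated propensity score; the simulated data, covariates, and matching rule are illustrative assumptions, not a description of any study cited in this editorial.

```python
# Illustrative sketch of nearest-neighbor propensity score matching;
# all data and variable names are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 500
covariates = rng.normal(size=(n, 2))     # e.g., age, prior service contacts
# Treatment assignment depends on the first covariate (selection effect).
treated = (rng.random(n) < 1 / (1 + np.exp(-covariates[:, 0]))).astype(int)

# Step 1: model the probability of treatment from observed covariates.
model = LogisticRegression().fit(covariates, treated)
pscore = model.predict_proba(covariates)[:, 1]

# Step 2: match each treated case to the untreated case with the
# closest propensity score (1:1 nearest neighbor, with replacement).
treated_idx = np.flatnonzero(treated == 1)
control_idx = np.flatnonzero(treated == 0)
matches = {
    int(t): int(control_idx[np.argmin(np.abs(pscore[control_idx] - pscore[t]))])
    for t in treated_idx
}
```

Outcome comparisons are then made within matched pairs, so treated and comparison cases are similar on the observed covariates that drove selection.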
SOFTWARE ADVANCES
Seasoned readers of this journal can look back over their careers and easily see the numerous advances made in software applications for quantitative data. Some readers may actually remember the practice of entering data via a punch card system and poring over large data printouts! Today, powerful software to analyze complex data operates in the framework of a very competitive market and in a constantly changing environment.
The Statistical Package for the Social Sciences (SPSS, 2008) has been the most widely used software application for analyzing quantitative data in the social sciences since its inception in 1968 (http://www.spss.com/). SPSS is supported at nearly all American universities and is familiar to most faculty and doctoral students in social work and related disciplines. In addition, Statistical Analysis Software (SAS; SAS Institute, 2001) has been used extensively by faculty members and students over the past 30 years.
Despite the dominance of standard programs like SPSS and SAS, a number of innovative software programs have emerged as competitors. For example, statistical packages such as HLM (Bryk, Raudenbush, Cheong, & Congdon, 2004) and MLwiN (Rasbash, Steele, Browne, & Prosser, 2004) offer excellent features for running complex multilevel models. Multilevel approaches are also available in software packages such as R (R Development Core Team, 2006), Stata (StataCorp, 2003), Mplus (Muthén & Muthén, 2007), and S-PLUS (Insightful, 2008; http://www.insightful.com/). Finally, Optimal Design (Raudenbush & Liu, 2001) is an example of a freeware program used to calculate power and sample size in randomized trials. The program is available at http://sitemaker.umich.edu/group-based/optimal_design_software.
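For the simplest case, an individually randomized two-arm trial, the kind of power calculation that dedicated programs automate can be approximated in a few lines with statsmodels. This sketch handles only that simple case, not the multilevel designs that Optimal Design targets, and the effect size, alpha, and power values are illustrative.

```python
# Illustrative sketch: per-group sample size for a two-arm trial.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
# n per group needed to detect a medium effect (d = 0.5) at
# alpha = .05 with 80% power, two-sided test, equal group sizes.
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(round(n_per_group))  # roughly 64 participants per group
```

For a group-randomized design, this figure would then need to be inflated by the design effect discussed above.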
It is important to note that online resources are available to researchers interested in using new analytical techniques and corresponding software. For example, most of the software programs listed above offer online consultation and ways to pose questions pertaining to analysis and software application. In addition, resources such as the Methodology Center at Penn State University provide online readings and tutorials on topics such as handling missing data, latent class analysis, and latent transition analysis, to name a few (http://methcenter.psu.edu). Similarly, the William T. Grant Foundation (http://www.wtgrantfoundation.org/index.htm) offers online resources aimed at helping investigators design and analyze data from longitudinal studies.
Readers should also be aware of educational opportunities that are available to faculty and students interested in learning advanced design and analytical techniques. These opportunities range from two-day training sessions sponsored by organizations such as the Institute for the Advancement of Social Work Research (2008) (http://www.iaswresearch.org) to intensive summer classes sponsored by the University of Michigan's Inter-University Consortium for Political and Social Research (ICPSR, 2008). A schedule for the ICPSR summer program in quantitative methods is available online at http://www.icpsr.umich.edu/training/summer/.
SUMMARY
Recent advances in methodology, analytical approaches, and software applications for analyzing complex data pose both opportunities and challenges to social work researchers. These advances have allowed investigators to ask sophisticated research questions, identify important relationships and effects, and discover new knowledge about the etiology, prevention, and treatment of individual and social problems. However, rapid methodological, analytical, and technological changes also challenge investigators to stay abreast of new developments that are critical to building and sustaining productive research careers. To meet these challenges, social work researchers must remain current with methodology in their substantive fields. Further, academic researchers should participate in, and develop, interdisciplinary teams whose members bring complementary and advanced conceptual, methodological, and analytical skills to multiyear investigations.
THE CURRENT ISSUE
This issue features new research addressing topics of mental illness and mental health, poverty, and child welfare. Hoe and Brekke examine the measurement invariance of the Brief Symptom Inventory (Derogatis & Melisaratos, 1983) among a sample of individuals with severe and persistent mental illness. Their findings shed new light on the utility of a frequently used instrument for African Americans, Caucasians, and Latinos. The relationship among caregiver mental health, neighborhood characteristics, and social network influences among caregiver-child dyads at risk for child maltreatment is the topic of a report by Lindsey and colleagues. The authors' findings point to the importance of the relationship between social networks of caregivers and children's mental health needs. Implications for implementing social network strategies in mental health services are discussed.
Wu and colleagues examine the effects of participation in Wisconsin's Temporary Assistance for Needy Families (TANF) program on employment and earnings over a six-year period. Their assessment of administrative data reveals that TANF participants fit three successful employment trajectories at follow-up; slightly more than one-half of these subjects sustained their progress in the long term. Koh and Testa use propensity score matching between children in kinship and nonkinship foster care to analyze differences in permanency outcomes between the two groups. Findings indicate that children in nonkinship foster homes have higher rates of placement disruption after matching. Interestingly, these differences disappear at the one-year follow-up. Implications of these and other results for preventing foster care disruption are offered. Finally, in this issue's Research Note, Pandey and Bright provide insightful interpretations and helpful suggestions regarding degrees of freedom.
REFERENCES
Bryk, A., Raudenbush, S., Cheong, Y. F., & Congdon, R. (2004). HLM: Version 6.0 [Computer software]. Chicago: Scientific Software International.
Derogatis, L. R., & Melisaratos, N. (1983). The Brief Symptom Inventory: An introductory report. Psychological Medicine, 13, 595-605.
Fraser, M. W. (2004). Intervention research in social work: Recent advances and continuing challenges. Research on Social Work Practice, 14, 210-222.
Graham, J. W., Olchowski, A. E., & Gilreath, T. D. (2007). How many imputations are really needed? Some practical clarifications of multiple imputation theory. Prevention Science, 8, 206-213.
Insightful, Inc. (2008). S-PLUS [Computer software]. Retrieved March 28, 2008, from http://www.insightful.com/
Institute for the Advancement of Social Work Research. (2008). Rigorous and relevant: Research methods workshops. Retrieved March 28, 2008, from http://www.iaswresearch.org
Inter-University Consortium for Political and Social Research, University of Michigan. (2008). Summer program in quantitative methods of social research. Retrieved April 3, 2008, from http://www.icpsr.umich.edu/training/summer/
Jenson, J. M., Dieterich, W. A., Rinner, J. R., Washington, F., & Burgoyne, K. (2006). Implementation and design issues in group-randomized prevention trials: Lessons from the Youth Matters public schools study. Children & Schools, 28, 207-218.
LeCroy, C. W., & Krysik, J. (2007). Understanding and interpreting effect size measures. Social Work Research, 31, 243-250.
Murray, D. M., Varnell, S. P., & Blitstein, J. L. (2004). Design and analysis of group-randomized trials: A review of recent methodological developments. American Journal of Public Health, 94, 423-432.
Muthén, B. O. (2002). Beyond SEM: General latent variable modeling. Behaviormetrika, 29, 81-117.
Muthén, L. K., & Muthén, B. O. (2007). Mplus user's guide (5th ed.). Los Angeles, CA: Muthén & Muthén.
Nash, J. K., Kupper, L. L., & Fraser, M.W. (2004). Using multilevel statistical models in social work intervention research. Journal of Social Service Research, 30, 35-54.
R Development Core Team. (2006). R: A language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing.
Rasbash, J., Steele, F., Browne, W., & Prosser, B. (2004). A user's guide to MLwiN, Version 2.0. London: Centre for Multilevel Modelling, Institute of Education, University of London.
Raudenbush, S., & Bryk, A. S. (2002). Hierarchical linear models: Applications and data analysis methods (2nd ed.). Thousand Oaks, CA: Sage Publications.
Raudenbush, S., & Liu, X. (2001). Effects of study duration, frequency of observation, and sample size on power in studies of treatment effects on polynomial change. Psychological Methods, 6, 387-401.
SAS Institute. (2001). SAS (Version 8) [Computer software]. Cary, NC: Author.
Schafer, J. L., & Graham, J. W. (2002). Missing data: Our view of the state of the art. Psychological Methods, 7, 147-177.
Singer, J. D., & Willett, J. B. (2003). Applied longitudinal data analysis: Modeling change and event occurrence. New York: Oxford University Press.
StataCorp. (2003). Stata statistical software: Release 8. College Station, TX: StataCorp LP.
Statistical Package for the Social Sciences. (2008). Retrieved March 31, 2008, from http://www.spss.com/
Jeffrey M. Jenson, PhD, is the Winn Professor for Children and Youth at Risk and director of the doctoral program, Graduate School of Social Work, University of Denver, 2148 South High Street, Denver, CO 80208; e-mail: [email protected].
Copyright National Association of Social Workers, Incorporated Jun 2008