Public health thrives on innovation. In our attempts to understand the causes of disease and to improve intervention delivery modalities and outcomes, we are in a constant state of application and innovative revision: we collect data, perform analyses, and implement actions on the basis of those results. Those new actions are assessed, and the process is repeated, as it has been repeated across human history; it is our nature. And more often than not, if the assessment of an action leads to a null finding (or, to our ancestors, a "dead end"), we prefer to transform, modify, and innovate rather than give up.
Innovation can be operationalized in a variety of ways, including innovation in experimental designs, in theoretical frameworks, and in statistical analyses. In this issue of the Journal, we present several examples of innovation in all three.
Nair et al. give us a primer on fractional factorial experimental designs, which help us unpack the causative agent or agents of multicomponent interventions. Fractional factorial designs are useful when we have a large number of potential causative "factors" of interest but only a limited (i.e., "fractional") number of experimental units. West et al. extend the discussion to designs that can support causal inference outside the context of the randomized controlled trial, while doing justice to the cautions and limitations that accompany such alternative designs. Berlin et al. call for research on the design of more-efficient, longer-term randomized controlled trials and on other data management and statistical techniques to help detect adverse drug events after clinical testing has concluded. After the recent discovery of serious long-term adverse effects among users of pharmaceuticals that had been declared "safe and effective" in clinical testing, this is a timely innovation.
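As an illustrative sketch only (not drawn from the Nair et al. primer), the "fractional" idea can be shown in a few lines of Python: a full 2^4 factorial design of four two-level intervention components requires 16 experimental conditions, but a half-fraction defined by the relation I = ABCD keeps only the 8 runs whose coded levels multiply to +1, while still permitting estimation of all main effects.

```python
from itertools import product

def half_fraction(n_factors=4):
    """Enumerate the half-fraction of a 2^n factorial design
    defined by the relation I = (product of all factors).
    Levels are coded -1 (component absent) and +1 (present)."""
    runs = []
    for levels in product((-1, 1), repeat=n_factors):
        prod = 1
        for x in levels:
            prod *= x
        if prod == 1:  # keep runs satisfying the defining relation
            runs.append(levels)
    return runs

runs = half_fraction()
print(len(runs))  # 8 conditions instead of the full 16
```

The practical payoff is exactly the one the primer describes: half as many experimental conditions to field, at the cost of aliasing some higher-order interactions.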
Aiello et al. employ meta-analysis, a powerful and often underutilized analytic tool in public health that takes advantage of many studies addressing the same hypothesis under different conditions and settings. They present a fairly definitive case for the effectiveness of hand washing against infectious diseases. Resnicow and Page push back on our reliance on linear theoretical models in public health and behavioral medicine, and challenge us to consider whether chaos theory provides a better framework.
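As a minimal sketch of the pooling step at the heart of a meta-analysis (the effect sizes and standard errors below are invented for illustration, not taken from Aiello et al.), a fixed-effect analysis weights each study's estimate by the inverse of its variance, so more precise studies count for more:

```python
import math

def pool_fixed_effect(effects, std_errors):
    """Fixed-effect, inverse-variance pooled estimate and its
    standard error, given per-study effects and standard errors."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Three hypothetical studies of the same intervention effect:
pooled, se = pool_fixed_effect([0.40, 0.55, 0.30], [0.10, 0.15, 0.20])
```

A random-effects model would add a between-study variance component to the weights; the inverse-variance principle is the same.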
Pals et al. take their expertise in the design and analysis of group randomized trials and extend it to individually randomized trials. Do their innovative approaches need to be applied to randomized controlled trials as well? Desai and Begg continue that conversation by offering much-needed practical advice on which regression approaches, and which important covariates, are appropriate when analyzing clustered data.
Other innovative analytic approaches in this issue of the Journal include the application of a variant of the capture-recapture method to estimate the number of homeless people (typically a very difficult population to enumerate); the use of cluster analysis, common in the analysis of microarray data, to clarify behavioral groupings among intravenous drug users; and the application of geographic information system methodology to help the authors "see" how various park characteristics influence physical activity.
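As an illustrative sketch of the basic idea behind capture-recapture (the counts are invented; the Journal article applies a more elaborate variant), two overlapping samples of a hard-to-count population can yield a size estimate via Chapman's bias-corrected form of the Lincoln-Petersen estimator:

```python
def chapman_estimate(n1, n2, m):
    """Chapman's bias-corrected Lincoln-Petersen estimate of
    population size, where n1 people are identified in the first
    sample, n2 in the second, and m appear in both."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Hypothetical example: 200 people counted in one canvass,
# 150 in a second, 30 identified in both.
est = chapman_estimate(n1=200, n2=150, m=30)
```

Intuitively, the rarer the overlap between the two samples, the larger the implied population, which is what makes the method attractive for populations, such as homeless people, that no single census can cover.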
The articles in this issue, although by no means exhaustive, allow us to celebrate the ways in which we innovate around designs, frameworks, and analyses to push the boundaries of not only what we know, but how we get there. Enjoy.
Roger D. Vaughan, DrPH
Associate Editor for Statistics and Evaluation, AJPH
doi: 10.2105/AJPH.2008.142802
Copyright American Public Health Association Aug 2008
