
Press Releases by Academic Medical Centers: Not So Academic?

Steven Woloshin, MD, MS; Lisa M. Schwartz, MD, MS; Samuel L. Casella, MPH; Abigail T. Kennedy, BA; and Robin J. Larson, MD, MPH
Article and Author Information

From the Veterans Affairs Outcomes Group, Department of Veterans Affairs Medical Center, White River Junction, Vermont, and The Dartmouth Institute for Health Policy and Clinical Practice and Dartmouth Medical School, Hanover, New Hampshire.


Note: Drs. Woloshin and Schwartz contributed equally to the creation of this manuscript. The order of their names is arbitrary.

Disclaimer: The views expressed herein do not necessarily represent the views of the Department of Veterans Affairs or the U.S. government.

Acknowledgment: The authors thank Renda Wiener, MD, MPH, for helpful comments on an earlier draft; the members of the Veterans Affairs Outcomes Group for their ongoing helpful feedback during all phases of the project; Deborah Kimbell, Andrew Nordhoff, and Jane D'Antonio for their feedback on the media relations interview script; and Jennifer Snide, MPH, and Katie Van Veen, BA, for technical assistance.

Grant Support: By the National Cancer Institute (grant R01 CA104721). Drs. Woloshin and Schwartz were also supported by Robert Wood Johnson Generalist Faculty Scholars Awards.

Potential Financial Conflicts of Interest: Honoraria: S. Woloshin (National Institutes of Health, for “Medicine in the Media”), L.M. Schwartz (National Institutes of Health, for “Medicine in the Media”).

Reproducible Research Statement: Study protocol: Available from Dr. Schwartz (e-mail, lisa.schwartz@dartmouth.edu). Statistical code and data set: Not available.

Requests for Single Reprints: Lisa M. Schwartz, MD, MS, Veterans Affairs Outcomes Group (111B), Department of Veterans Affairs, Veterans Affairs Medical Center, White River Junction, VT 05009; e-mail, lisa.schwartz@dartmouth.edu.

Current Author Addresses: Drs. Woloshin, Schwartz, and Larson: Veterans Affairs Outcomes Group (111B), Department of Veterans Affairs, Veterans Affairs Medical Center, White River Junction, VT 05009.

Ms. Kennedy: Dartmouth Medical School, 1 Medical Center Drive, Lebanon, NH 03756.

Mr. Casella: The Dartmouth Institute for Health Policy and Clinical Practice, Center for Education, 30 Lafayette Street, Lebanon, NH 03766.

Author Contributions: Conception and design: S. Woloshin, L.M. Schwartz, S.L. Casella.

Analysis and interpretation of the data: S. Woloshin, L.M. Schwartz, S.L. Casella, R.J. Larson.

Drafting of the article: S. Woloshin, L.M. Schwartz, S.L. Casella.

Critical revision of the article for important intellectual content: S. Woloshin, L.M. Schwartz, A.T. Kennedy.

Final approval of the article: S. Woloshin, L.M. Schwartz, A.T. Kennedy, R.J. Larson.

Statistical expertise: S. Woloshin, L.M. Schwartz.

Obtaining of funding: S. Woloshin, L.M. Schwartz.

Administrative, technical, or logistic support: L.M. Schwartz.

Collection and assembly of data: S. Woloshin, S.L. Casella, A.T. Kennedy, R.J. Larson.


Ann Intern Med. 2009;150(9):613-618. doi:10.7326/0003-4819-150-9-200905050-00007

Background: The news media are often criticized for exaggerated coverage of weak science. Press releases, a source of information for many journalists, might be a source of those exaggerations.

Objective: To characterize research press releases from academic medical centers.

Design: Content analysis.

Setting: Press releases from 10 medical centers at each extreme of U.S. News & World Report's rankings for medical research.

Measurements: Press release quality.

Results: Academic medical centers issued a mean of 49 press releases annually. Among 200 randomly selected releases analyzed in detail, 87 (44%) promoted animal or laboratory research, of which 64 (74%) explicitly claimed relevance to human health. Among 95 releases about primary human research, 22 (23%) omitted study size and 32 (34%) failed to quantify results. Among all 113 releases about human research, few (17%) promoted studies with the strongest designs (randomized trials or meta-analyses). Forty percent reported on the most limited human studies—those with uncontrolled interventions, small samples (<30 participants), surrogate primary outcomes, or unpublished data—yet 58% lacked the relevant cautions.

Limitation: The effects of press release quality on media coverage were not directly assessed.

Conclusion: Press releases from academic medical centers often promote research that has uncertain relevance to human health and do not provide key facts or acknowledge important limitations.

Primary Funding Source: National Cancer Institute.

Editors' Notes
Context

  • News reports often exaggerate the importance of medical research.

Contribution

  • The researchers reviewed press releases issued by academic medical centers. They found that many press releases overstated the importance of study findings while underemphasizing cautions that limited the findings' clinical relevance.

Caution

  • The researchers did not attempt to see how the press releases influenced actual news stories.

Implication

  • Academic center press releases often promote research with uncertain clinical relevance without emphasizing important cautions or limitations.

—The Editors

Medical journalism is often criticized for what reporters cover (for example, preliminary work) and how they cover it (for example, turning modest findings into miracles) (1-4). Critics often place blame squarely on the media, pointing out that few journalists are trained to critically read medical research or suggesting that sensationalism is deliberate: Whereas scientists want to promote the truth, the media just want to sell newspapers.

But exaggeration may begin with the journalists' sources. Researchers and their funders, and even medical journals, often court media attention through press releases. The strategy works: Press releases increase the chance of getting media coverage (5, 6) and shape subsequent reporting (7). An independent medical news rating organization found that more than one third of U.S. health news stories seemed to rely solely or largely on press releases (1).

Academic medical centers produce large volumes of research and attract press coverage through press releases. Because these centers set the standard for research and education in U.S. medicine, one might assume that their press releases are measured and unexaggerated. To test this assumption, we systematically examined press releases from academic medical centers.

We selected the 10 highest-ranked and 10 lowest-ranked of the academic medical centers covered in U.S. News & World Report's medical school research rankings (8) that issued at least 10 releases in 2005. In addition, we identified each medical school's affiliates by using an Association of American Medical Colleges database. The Appendix Table lists the centers and their affiliated press offices. We initially intended to compare press releases by research ranking, but because we found few differences, we report findings across the entire study sample, highlighting the few differences by rank where they exist.

Appendix Table. Highest- and Lower-Ranked Medical Schools (and Their Affiliated Press Offices) for Research That Issued at Least 10 Press Releases in 2005
Press Release Process

During 2006, a former medical school press officer conducted semistructured (15-minute) telephone interviews with “the person in charge” of media relations at the 20 centers. The interview script (Appendix) covered release policy (how is research chosen?), production (writing, review, and the researcher's role), and an overall assessment (perceived pressure to generate media coverage, and any praise or backlash experienced).

Press Release Content

We searched EurekAlert! (a press release database) for all “medical and health” releases issued by the 20 centers and their affiliates in 2005. The Figure summarizes the search results.

Figure. Study flow diagram. *Of the medical schools that issued at least 10 press releases in 2005.

Science Promoted

After excluding duplicate or nonresearch releases (such as those announcing grants), we determined study focus (animal or human) and publication status; if the study was published, we characterized the journal's academic prominence by using the Thomson Scientific Journal Citation Reports “impact factor.”

Content Analysis

We randomly selected 200 press releases (10 per center) and assessed the presentation of study facts, the inclusion of cautions, and the presence of exaggeration by using separate coding schemes for human and nonhuman studies (the Appendix includes both schemes). The schemes included 32 unique items (10 for human studies only, 4 for nonhuman studies only, and 18 common to both). Sixteen items involved simply extracting facts from the release (for example, study size); the other 16 required subjective judgments (for example, were there cautions about confounding for observational studies?). To confirm key study details (such as population, design, and size), we obtained the research reports (journal article or meeting abstract) referenced in the releases.
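As a rough illustration of this sampling step, the stratified draw of 10 releases per center can be sketched in a few lines of Python. This is a minimal sketch under assumed data structures (a list of release records tagged with a "center" field), not the authors' actual selection procedure:

```python
# Minimal sketch: draw a fixed number of press releases per center
# (stratified random sampling). Field and function names are illustrative.
import random

def sample_releases(releases, per_center=10, seed=2005):
    """releases: list of dicts, each with a 'center' key plus release metadata."""
    rng = random.Random(seed)  # fixed seed so the draw is reproducible
    by_center = {}
    for release in releases:
        by_center.setdefault(release["center"], []).append(release)
    sample = []
    for center in sorted(by_center):
        items = by_center[center]
        sample.extend(rng.sample(items, min(per_center, len(items))))
    return sample
```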

Coding Reliability and Analysis

Two research assistants who were blinded to the study's purpose independently coded releases. To measure reliability, the coders and investigators reviewed each code's definition and then reread the release to confirm (or change) their code. Errors due to definition or data entry problems were corrected before agreement was calculated. Intercoder agreement was “nearly perfect” (9) for both sets of items: for factual items, κ was 1.0 (range, 0.98 to 1.0), and for subjective items, κ was 0.97 (range, 0.79 to 1.0). Disagreements were resolved by 4 of the investigators. We used STATA, version 10 (StataCorp, College Station, Texas) for all analyses.
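The agreement statistic here is presumably Cohen's κ for two coders, which compares observed agreement with the agreement expected by chance. As a minimal sketch (not the study's STATA code; the input format is an assumption), the computation for a single categorical item looks like this:

```python
# Minimal sketch of Cohen's kappa for two coders on one categorical item.
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """codes_a, codes_b: equal-length lists of category labels from two coders."""
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    # Chance agreement: probability both coders independently pick the same category.
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in freq_a)
    if expected == 1:  # both coders used a single identical category throughout
        return 1.0
    return (observed - expected) / (1 - expected)

# Example: 9 of 10 releases coded "yes" by both coders, 1 coded "no" by both.
print(cohens_kappa(["yes"] * 9 + ["no"], ["yes"] * 9 + ["no"]))  # 1.0
```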

Role of the Funding Source

The project was funded by the National Cancer Institute and the Robert Wood Johnson Generalist Faculty Scholars Program. Neither source had any role in study design, conduct, or analysis or in the decision to seek publication.

Press Release Process

All centers said that investigators routinely request press releases and are regularly involved in editing and approving them (Table 1). Only 2 centers routinely involve independent reviewers. On average, centers employed 5 press release writers (the highest-ranked centers had more writers than the lower-ranked centers [mean, 6.6 vs. 3.7]). Three centers said that they trained writers in research methods and results presentation, but most expected writers to already have these skills and to hone them on the job. All 20 centers said that media coverage is an important measure of their success, and most report the number of “media hits” garnered to their administration.

Table 1. Press Release Process and Press Releases Issued by the 20 Academic Medical Centers
Press Releases Issued

Table 1 shows that the centers issued 989 medical research-related releases in 2005. The centers averaged 49 releases per year; the range was 13 (Brown Medical School) to 186 (Johns Hopkins University School of Medicine). Twelve percent of the releases promoted unpublished research from scientific meetings. Higher-ranked centers issued more releases than lower-ranked centers (743 vs. 246) and were less likely to promote unpublished research (9% vs. 20%).

Press Release Quality

Table 2 summarizes the measures of press release quality.

Table 2. Type of Research Promoted in and Quality of the 200 Press Releases Analyzed in Detail
Study Details and Cautions

Of the 95 releases about primary human research (excluding unstructured reviews and decision models), 77% provided study size and most (66%) quantified the main finding in some way; 47% used at least 1 absolute number, the most transparent way to represent results (10, 11). Few releases (12%) provided access to the full scientific report.
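To see why absolute numbers are the more transparent presentation, consider a hypothetical example (not taken from the study): the same finding can sound very different when stated only in relative terms.

```python
# Hypothetical illustration: the same result stated absolutely and relatively.
# A "50% relative reduction" can correspond to a 1-percentage-point absolute change.
baseline_risk, treated_risk = 0.02, 0.01                   # assumed event rates: 2% vs. 1%
absolute_reduction = baseline_risk - treated_risk           # 0.01 -> 1 percentage point
relative_reduction = absolute_reduction / baseline_risk     # 0.5  -> 50%
print(f"absolute: {absolute_reduction:.1%}, relative: {relative_reduction:.0%}")
```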

Two thirds of the 200 randomly selected releases reported study funding sources; 4% noted conflicts of interest (3 releases stated that none existed, and 4 stated that some did).

Of all 113 releases about human studies, 17% promoted published studies with the strongest designs (randomized trials or meta-analyses). Forty percent reported on inherently limited studies (for example, sample size <30, uncontrolled interventions, primary surrogate outcomes, or unpublished meeting reports). Fewer than half (42%) provided any relevant caveats. For example, a release titled “Lung-sparing treatment for cancer proving effective” (which concluded that treatment was “a safe and effective way to treat early stage lung cancer in medically inoperable patients”) lacked cautions about this uncontrolled study of 70 patients.

Among the 87 releases about animal or laboratory studies, most (64 of 87) explicitly claimed relevance to human health, yet 90% lacked caveats about extrapolating results to people. For example, a release about a study of ultrasonography reducing tumors in mice, titled “Researchers study the use of ultrasound for treatment of cancer,” claimed (without caveats) that “in the future, treatments with ultrasound either alone or with chemotherapeutic and antivascular agents could be used to treat cancers.”

Exaggeration

Twenty-nine percent of releases (58 of 200) were rated as exaggerating the finding's importance. Exaggeration was found more often in releases about animal studies than human studies (41% vs. 18%).

Almost all releases (195 of 200) included investigator quotes, 26% of which were judged to overstate research importance. For example, a release for a study of mice with skin cancer, titled “Scientists inhibit cancer gene. Potential therapy for up to 30 percent of human tumors,” quoted the investigator as saying that “the implication is that a drug therapy could be developed to reduce tumors caused by Ras without significant side effects.” Coders thought that the “implication” exaggerated the study findings, because neither treatment efficacy nor tolerability in humans was assessed.

Although 24% (47 of 200) of releases used the word “significant,” only 1 clearly distinguished statistical from clinical significance. All other cases were ambiguous, creating an opportunity for overinterpretation: for example, “Not-for-profit hospitals consistently had significantly higher scores than for-profit hospitals.”
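The distinction matters because, with a large enough sample, even a clinically trivial difference can be statistically significant. A hypothetical illustration (not data from the study), using a simple two-sample z-test:

```python
# Hypothetical illustration: a tiny difference becomes "statistically significant"
# once the sample is large, even though it may be clinically meaningless.
import math

def z_test_two_means(diff, sd, n_per_group):
    """Two-sided p-value for a difference in means (equal SDs and group sizes)."""
    se = sd * math.sqrt(2 / n_per_group)      # standard error of the difference
    z = diff / se
    return math.erfc(abs(z) / math.sqrt(2))   # two-sided p-value

# A 1-point difference on a 100-point hospital quality score (SD 15), 5000 per group:
print(z_test_two_means(1.0, 15.0, 5000))  # p ≈ 0.0009, yet the difference is negligible
```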

Press releases issued by 20 academic medical centers frequently promoted preliminary research or inherently limited human studies without providing basic details or cautions needed to judge the meaning, relevance, or validity of the science. Our findings are consistent with those of other analyses of pharmaceutical industry (12) and medical journal (13) press releases, which also revealed a tendency to overstate the importance and downplay (or ignore) the limitations of research.

Our study has several limitations. First, content analysis coding always involves subjectivity. The high level of agreement observed among coders, however, is reassuring. Second, our findings are based on 20 centers, which may raise concern about generalizability. Because only the top 52 (of 125) centers receive a U.S. News & World Report research ranking, our sample was drawn from relatively prominent institutions, so we believe that our findings represent a best-case scenario. Third, our coding scheme was not exhaustive. We focused on the study details and cautions we thought were of highest priority because press releases are typically 1 page or less.

Most important, because we did not analyze subsequent news coverage of press-released research, we cannot directly link problems with press releases (such as lack of cautions or numbers, or exaggeration) with those in news reports. Our findings would be stronger if we had shown integration of exaggerated information from releases into news stories. Nevertheless, press releases matter: They attract journalists' attention, and although the practice is generally discouraged, many health news stories—perhaps as many as one third—seem to rely largely or solely on the press release (1). Journalists worry that such reliance will increase, given newsroom cutbacks and greater demand for online news (7). The problems that we document are very similar to those seen in analyses of medical news: no quantification of main results (14-16) and no mention of intervention side effects (14-16), conflicts of interest (16), or study limitations (14). We believe that academic centers contribute to poor media coverage and are forgoing an opportunity to help journalists do better.

The quickest strategy for improvement would be for centers to issue fewer releases about preliminary research, especially unpublished scientific meeting presentations, because findings often change substantially—or fail to hold up—as studies mature (17). Forty percent of meeting abstracts (19) and 25% of abstracts that garner media attention (18) are never subsequently published as full reports in medical journals. Similarly, centers should limit releases about animal or laboratory research. Although such research is important, institutions should not imply clinical benefit when it does not exist (and may not for years, if ever): Two thirds of even highly cited animal studies fail to translate into successful human treatments (20).

When press releases are issued, they should include basic study facts and explicit cautions. For example, press releases should remind journalists that strong inferences cannot be drawn from uncontrolled studies, or that surrogate outcomes do not always translate into clinical outcomes. Although good press releases will probably help, quality reporting also requires good critical evaluation skills. Fortunately, journalists have opportunities to acquire these skills, through such programs as the Association of Health Care Journalists seminars; the Knight Science Journalism Medical Evidence Boot Camp at MIT; and “Medicine in the Media: The Challenge of Reporting on Medical Research,” a workshop sponsored by the National Institutes of Health, the Dartmouth Institute for Health Policy and Clinical Practice, and the Department of Veterans Affairs.

Investigators can also do better. They could forgo requesting releases for studies with obvious limitations and review releases before dissemination, taking care to temper their tone (particularly their own quotes, which we often found overly enthusiastic).

By issuing fewer but better press releases, academic centers could help reduce the chance that journalists and the public are misled about the importance or implications of medical research. Centers might get less press coverage, but they would better serve their mission: to improve the health of their communities and the larger society in which they reside.

References

1. Schwitzer G. How do US journalists cover treatments, tests, products, and procedures? An evaluation of 500 stories. PLoS Med. 2008;5:e95.
2. Smith D, Wilson A, Henry D; Media Doctor Study Group. Monitoring the quality of medical news reporting: early experience with media doctor. Med J Aust. 2005;183:190-3.
3. Wilkes MS, Kravitz RL. Medical researchers and the media. Attitudes toward public dissemination of research. JAMA. 1992;268:999-1003.
4. Woloshin S, Schwartz LM. Media reporting on research presented at scientific meetings: more caution needed. Med J Aust. 2006;184:576-80.
5. de Semir V, Ribas C, Revuelta G. Press releases of science journal articles and subsequent newspaper stories on the same topic. JAMA. 1998;280:294-5.
6. Bartlett C, Sterne J, Egger M. What is newsworthy? Longitudinal study of the reporting of medical research in two British newspapers. BMJ. 2002;325:81-4.
7. Russell C. Science reporting by press release. Columbia Journalism Review. 2008. Accessed at http://www.cjr.org/the_observatory/science_reporting_by_press_rel.php on 3 March 2009.
8. America's Best Graduate Schools: Plus a Directory of Business, Education, Engineering, Law, and Medical Schools. Washington, DC: U.S. News and World Report; 2005.
9. Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics. 1977;33:159-74.
10. Naylor CD, Chen E, Strauss B. Measured enthusiasm: does the method of reporting trial results alter perceptions of therapeutic effectiveness? Ann Intern Med. 1992;117:916-21.
11. Schwartz LM, Woloshin S, Black WC, Welch HG. The role of numeracy in understanding the benefit of screening mammography. Ann Intern Med. 1997;127:966-72.
12. Kuriya B, Schneid EC, Bell CM. Quality of pharmaceutical industry press releases based on original research. PLoS ONE. 2008;3:e2828.
13. Woloshin S, Schwartz LM. Press releases: translating research into news. JAMA. 2002;287:2856-8.
14. Cassels A, Hughes MA, Cole C, Mintzes B, Lexchin J, McCormack JP. Drugs in the news: an analysis of Canadian newspaper coverage of new prescription drugs. CMAJ. 2003;168:1133-7.
15. Høye S, Hjortdahl P. [“New wonder pill!”—what do Norwegian newspapers write]. Tidsskr Nor Laegeforen. 2002;122:1671-6.
16. Moynihan R, Bero L, Ross-Degnan D, Henry D, Lee K, Watkins J, et al. Coverage by the news media of the benefits and risks of medications. N Engl J Med. 2000;342:1645-50.
17. Toma M, McAlister FA, Bialy L, Adams D, Vandermeer B, Armstrong PW. Transition from meeting abstract to full-length journal article for randomized controlled trials. JAMA. 2006;295:1281-7.
18. Schwartz LM, Woloshin S, Baczek L. Media coverage of scientific meetings: too much, too soon? JAMA. 2002;287:2859-63.
19. Scherer RW, Langenberg P, von Elm E. Full publication of results initially presented in abstracts. Cochrane Database Syst Rev. 2007:MR000005.
20. Hackam DG, Redelmeier DA. Translation of research evidence from animals to humans [Letter]. JAMA. 2006;296:1731-2.
