Wednesday, September 29, 2010

PLoS Medicine: Seventy-Five Trials and Eleven Systematic Reviews a Day: How Will We Ever Keep Up?

An interesting perspective on the prospects for comparative effectiveness research to make a difference - or will it just contribute to the information overload?


3 comments:

  1. During my time as a practicing physical therapist, I found that one of the most frustrating aspects of my job was the amount of effort it took to stay up-to-date with the available literature. My training placed a heavy emphasis on evidence-based medicine, but after graduating I quickly found that this focus was more of an ideal than a practical route to quality improvement within the profession.

    Getting onto the internet and searching for research on a particular topic can be a challenge in and of itself, as there are a variety of search engines and databases one can investigate using an almost endless combination of keywords and alternate phrasings. Then, even if one is lucky enough to find an article relevant to his or her inquiry, the chances of being able to access the full text are less than promising. Here at URMC, we are very fortunate to have access to a variety of journals and databases, but a practicing clinician who is not affiliated with such a large institution can probably access only a fraction of this information. But for the sake of argument, even if one can find a pertinent article and access its full text, the results are very unlikely to hold any definitive answers to the original questions. We have had some great discussions about the incremental nature of scientific discovery in our recent classes, and while this is an undeniable aspect of research, it is nevertheless a frustrating barrier to incorporating new findings into practice for a clinician whose primary objective is to work with patients. We frequently hear clinicians bemoan the time-consuming burdens that limit the amount of time available to spend with patients, but the diffuse and restricted nature of our collective scientific knowledge could be viewed as an even greater obstacle, both consuming significant chunks of clinicians’ time and depriving patients of cutting-edge information that could be shaping their treatments.

    In theory, systematic reviews should help alleviate some of these problems, but as this article points out, they are not without issues. Furthermore, I would argue that even systematic reviews are not necessarily presented in a manner that is easy to digest, interpret, and implement. Long story short: research is not very user-friendly. It seems to me that if we truly hope to see an increase in evidence-based medicine, we should be examining not only how to better conduct systematic reviews or comparative effectiveness research, but also how to better disseminate this information to the clinicians who can actually use it.

    ReplyDelete
  2. This article effectively points out one of the important hurdles researchers face while conducting research: literature search and review. I had first-hand experience working on a Cochrane systematic review, and I remember the amount of time I spent on PubMed and MeSH searching for a handful of RCTs. Even after retrieving most of the trials done so far, only a couple of them could be included in the review, as most did not meet the criteria Cochrane sets for inclusion in a systematic review. We had to contact the authors of the original trials because they did not report all of their results. The process of getting hold of the relevant articles before even starting the review was a nightmare in itself.
    Cochrane has received a lot of complaints about the unreliability of search engines, which it has started to address by publishing the exact keywords to be entered into search engines to retrieve the relevant articles. However, some of the most widely used search engines are far from reliable. I cannot forget how the same set of keywords gave completely different search results on PubMed each time I searched for articles. Thus I feel that, in addition to publishing easily interpretable papers, it is also important to have standard, reproducible methods for retrieving such articles from search engines (one possible approach is sketched after the comments).

    ReplyDelete
  3. This article opens a discussion about how to deal with mountains of evidence in a world of information overload. How can we keep up in such a rapidly expanding world? Evidence exists, in the first place, to guide us toward wise decisions. But what should we do when it evolves to the point where people must struggle and spend much of their time filtering out the most relevant and valid information for their practice?

    I imagine our policy makers are now in such a dilemma to some extent, with more than 20,000 articles on trials, almost 60,000 on case reports, and 80,000 on reviews. However, judging from the figures presented in this article, it seems that people are gradually identifying an efficient way of dealing with this issue: analyzing the literature through systematic reviews. Unlike many other types of review, a systematic review employs objective and transparent approaches, whether quantitative or qualitative, to identify, appraise, and synthesize the literature. It does not necessarily involve statistical techniques such as meta-analysis, but it always entails a thorough literature search, critical appraisal, and reasonable interpretation of the results of rigorously selected papers. In doing so, systematic reviews strive to provide valid and reliable evidence for policy makers, freeing them from countless unfiltered and often contradictory results. Clearly, a high-quality systematic review significantly facilitates decision making by combining the most up-to-date and convincing evidence to help people critically appraise the advantages and disadvantages of each available option.

    However, it is unfortunate to see that the growth in systematic reviews is falling behind the exponential growth in trials and case studies. Admittedly, someone must first design and carry out the trials and interventions that address particular questions and propose possible solutions. But I suspect that even the most comprehensive multi-center clinical trials cannot confidently claim that their results can be applied directly and immediately in the clinic. Some may argue that at the initial stage of exploring any innovative technique, we need the courage to try; I agree. That is also why we treasure case studies, even though they are clearly weaker in terms of generalizability. But decision making, especially by policy makers, still calls for great effort in rigorously examining all the available evidence. There should be a staff of professionals who critically collect the fruits of those expensive projects in a scientific manner and formulate them into something our policy makers will not be reluctant to glance at, or even act on.

    I think the skill of systematically reviewing the literature is also one of the essential qualifications that future researchers should possess. I am not sure how many researchers will become policy makers in a given field. But it seems that the value of our work may be, to a great extent, correlated with how far our results are adopted by policy makers or by industry.

    ReplyDelete
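
The reproducibility concern raised in the second comment - the same keywords returning different results on different searches - is exactly what a documented search strategy is meant to address. As a rough illustration (not part of the original post), the sketch below shows one way a PubMed search could be recorded and rerun programmatically through NCBI's public E-utilities interface; the query string and parameters here are illustrative assumptions, not a prescribed Cochrane search strategy.

```python
"""
A minimal sketch of a reproducible PubMed search via NCBI E-utilities
(esearch). The point is that the full query string is written down once
and can be rerun by anyone, rather than typed ad hoc into a search box.
"""
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

# Hypothetical search strategy: a MeSH term plus a publication-type filter.
# A real review would document every term, synonym, and limit used.
query = '"Stroke"[MeSH Terms] AND "randomized controlled trial"[Publication Type]'

params = urllib.parse.urlencode({
    "db": "pubmed",   # search the PubMed database
    "term": query,    # the documented, reusable query string
    "retmax": "100",  # return up to 100 PubMed IDs
})

with urllib.request.urlopen(f"{EUTILS}?{params}") as response:
    xml_data = response.read()

# esearch returns XML; the matching PMIDs are listed under IdList/Id.
root = ET.fromstring(xml_data)
pmids = [id_elem.text for id_elem in root.findall(".//IdList/Id")]

print(f"Query matched {root.findtext('Count')} records; first PMIDs:")
print(pmids)
```

Saving the query string and the list of retrieved PMIDs alongside the review is one simple way to make the retrieval step auditable, even if the underlying index later changes.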