"S. L. Baker" - Maybe you aren't doomed to cancer or other diseases because of your genes; scientists find risk research is inaccurate
by S. L. Baker
http://www.naturalnews.com/032577_genetic_testing_disease.html
(NaturalNews) Countless people have undergone regular, expensive testing, and even had organs removed, because genetic testing told them they were sure to get cancer. Think of women who have had breasts amputated to head off a future breast cancer, for example. Others live with constant worry and a sense of impending doom because test results for "biomarkers" of specific diseases suggest they will sooner or later develop heart disease, dementia or some other potential killer.
But according to a new study, the research linking genes and other biomarkers to various maladies is vastly overstated. Bottom line: were you told you are more or less doomed to eventually get a certain disease and therefore need constant vigilance and testing? It turns out none of that may be true.
Research from John Ioannidis, MD, DSc, an expert in scientific study design at the Stanford University School of Medicine, shows clinicians may be making decisions for their patients based on inaccurate conclusions not supported by other, larger studies.
For instance, one widely cited study links the BRCA1 gene mutation with colon cancer; another links levels of C-reactive protein in the blood with cardiovascular disease. Still another claims an association between homocysteine levels and vascular disease. The trouble is, these conclusions appear to have been gross exaggerations.
In a statement to the press, Dr. Ioannidis claims these mistakes are "...the result of statistical vagaries coupled with human nature and the competitive nature of scientific publication."
His research paper is published in the June 1 issue of the Journal of the American Medical Association (JAMA).
"No research finding has no uncertainty; there are always fluctuations," Dr. Ioannidis stated. "This is not fraud or poor study design, it's just statistical expectation. Some results will be stronger, some will be weaker. But scientific journals and researchers like to publish big associations."
However, by publishing these popular "big associations," medical journals give such papers a great deal of publicity; the studies are then cited over and over in the mainstream medical community, with little if any critique of the findings. Soon they are treated as proof that a biomarker is linked to a disease.
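To see how ordinary statistical fluctuation, combined with a preference for publishing the biggest results, can exaggerate an association, consider a minimal Python sketch. All numbers below are invented for illustration and are not taken from the JAMA paper.

import math
import random

random.seed(0)

TRUE_LOG_RR = 0.2   # hypothetical true effect, a relative risk of about 1.22
STUDY_NOISE = 0.25  # hypothetical sampling error of a single small study
N_STUDIES = 50

# Fifty small studies all estimating the same modest true effect.
estimates = [random.gauss(TRUE_LOG_RR, STUDY_NOISE) for _ in range(N_STUDIES)]

# Suppose only the most impressive-looking results get published and cited.
headline_results = sorted(estimates, reverse=True)[:5]

print("true relative risk:             %.2f" % math.exp(TRUE_LOG_RR))
print("average across all 50 studies:  %.2f" % math.exp(sum(estimates) / N_STUDIES))
print("average of the 5 biggest hits:  %.2f" % math.exp(sum(headline_results) / 5))

In this toy setup, the five most striking results overshoot the true effect considerably, even though no one did anything wrong, which is exactly the "statistical expectation" Dr. Ioannidis describes.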
The result can be unneeded testing, treatment, and more. It can also cause patients undue anxiety, stress and fear for their future.
In all, Ioannidis and colleague Orestis Panagiotou, MD, of the University of Ioannina School of Medicine in Greece, examined 35 widely cited studies analyzing the relationships between biomarkers, such as the presence of specific genes or infections or the levels of certain blood proteins, and the likelihood of developing conditions such as cancer and heart disease.
They found that fewer than half of the biomarkers in these studies had statistically significant associations with disease risk in larger follow-up studies. What's more, only one in five of the originally selected biomarkers carried a relative risk greater than 1.37, practically no increase in risk at all (a relative risk of 1 means there is no difference between the two groups).
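As a rough illustration of what these relative risk figures mean, here is a short, hypothetical Python example; the counts are made up and are not from the study.

def relative_risk(events_exposed, n_exposed, events_unexposed, n_unexposed):
    # Risk in the biomarker-positive group divided by risk in the biomarker-negative group.
    return (events_exposed / n_exposed) / (events_unexposed / n_unexposed)

# Hypothetical cohort: 1,000 people with a biomarker, 1,000 without.
print(relative_risk(20, 1000, 20, 1000))  # 1.0  -> the biomarker makes no difference
print(relative_risk(27, 1000, 20, 1000))  # 1.35 -> the modest effects seen in follow-up studies
print(relative_risk(80, 1000, 20, 1000))  # 4.0  -> the kind of dramatic link the original reports implied

Even the 1.35 scenario is a far cry from the near-certainty that patients are often led to expect.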
In addition to statistical findings that don't hold up in additional, larger studies, Dr. Ioannidis said another problem is that researchers can impose their own biases on the data.
"Researchers tend to play with their data sets, and to analyze them in creative ways. We're certainly not pointing out any one investigator with this study; it's just the societal norm of science to operate in that fashion. But we need to follow the scientific method through to the end and demand replication and verification of results before accepting them as fact," he stated.
Dr. Ioannidis, the C.F. Rehnborg Professor in Disease Prevention at Stanford, outlined some of these same concerns years ago in a 2005 essay published in the journal PLoS Medicine entitled "Why most published research findings are false."