Thursday, September 24, 2009

Is there antifreeze in vaccines or not?

In "Toxic Myths About Vaccines," author David Gorski MD accuses anti-vaccinationists of outright lying about toxins in vaccines. He especially ridicules them for being "chemistry-challenged" on assertions regarding one particular toxin: antifreeze.
Here’s one example. The aforementioned Jenny McCarthy has been repeating that there is “antifreeze” in vaccines, as she did in the interview linked to earlier. That line is straight off of a number of antivaccination websites. (Amazingly Mr. Heckenlively managed to restrain himself from repeating the “antifreeze in vaccines” gambit. I can only hope that it is due to intellectual honesty, although I can’t rule out the possibility that he just didn’t know about it.) One website in particular links to an MSDS about Quaker State Antifreeze/Coolant, the principal ingredients of which are ethylene glycol and diethylene glycol. Guess what? There’s no ethylene or diethylene glycol in vaccines.
Not so fast, Dr. Gorski. There IS ethylene glycol in vaccines. It's called 2-Phenoxyethanol, and it is found in the childhood vaccines Infanrix, Daptacel, Pediarix, and Ipol, among others. You see, the other name for 2-Phenoxyethanol is ETHYLENE GLYCOL monophenyl ether.

The MSDS on car antifreeze, the regular ethylene glycol, says that the lethal oral dose that kills 50% of rats (the LD50) is 4700 mg/kg. The MSDS on 2-Phenoxyethanol, the vaccine ethylene glycol, says the lethal oral dose that kills 50% of rats is 1260 mg/kg. The lower the LD50, the less of the chemical it takes to kill, so comparing apples to apples, the vaccine ethylene glycol is nearly four times as toxic as car antifreeze--to rats, anyway.
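
To make the comparison concrete, here is a quick back-of-the-envelope calculation using only the two MSDS figures quoted above. It is a rough sketch of relative acute toxicity; it says nothing about the actual amounts found in vaccines, which are discussed below.

    # Quick LD50 comparison using only the two MSDS figures quoted above.
    # LD50 = oral dose, in mg per kg of body weight, lethal to 50% of rats.
    ld50_ethylene_glycol = 4700.0   # car antifreeze (ethylene glycol), mg/kg
    ld50_phenoxyethanol = 1260.0    # 2-phenoxyethanol (ethylene glycol monophenyl ether), mg/kg

    # A lower LD50 means less of the chemical is needed to kill, i.e. more acutely toxic.
    ratio = ld50_ethylene_glycol / ld50_phenoxyethanol
    print(f"By rat oral LD50, 2-phenoxyethanol is roughly {ratio:.1f} times as toxic as ethylene glycol")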

The debate shouldn't be on whether ethylene glycol exists in vaccines. It does, period. The debate should be on whether this type of ethylene glycol and this amount of ethylene glycol can cause the same adverse reactions as those normally associated with car antifreeze.

It is a situation where both sides are bending and polarizing the truth to suit their own agendas, while parents looking for honest, straightforward, objective information are screwed. Is antifreeze in vaccines? Not exactly--not the kind we put in our cars. Aha, then antifreeze is NOT in vaccines? Not exactly--a type of ethylene glycol whose toxicity is similar to (actually higher than) that of car antifreeze is found in very small amounts in a number of childhood vaccines.

So word to the wise, parents. Do your own research. How do you sort it out, when both sides are liberal with the truth-bending?

1. Look for precision. Science is precise. It is not a matter of whether A is simply true or false. Science defines A carefully, and then qualifies under what conditions A is true and under what conditions it is not. Anyone who gives you a simple "fact" is bending the truth, because reality is not simple.

2. Look for references. Someone says there is antifreeze in vaccines? What makes them say that? Someone says it is NOT in vaccines? Where have they looked? Follow the research trail that led them to that conclusion. (In this case, if they had looked under the right chemical names, they would have found it.)

3. Look for objectivity. Read the original research papers. Outline the "plot"--what did they do in the study? Now to tease out confirmation bias, blind yourself to the results. Switch the research findings so that the results come out the opposite of what you would like to believe. If the study finds no autism-vaccine connection, much to your relief, then pretend it did. If the study finds a strong autism-vaccine connection, as you knew it would, pretend it didn't find anything at all. Once the results are disagreeable, the flaws in the research design and methodology come leaping out like magic.

4. Trust no one but yourself. If you let other people do the thinking for you, then you'll just end up with other people's thoughts--and prejudices, and agendas. It's kind of obvious, but it needs to be said. This is what this blog is all about: think for yourself.

For further research:
Vaccine excipient table sorted by vaccine.

Tuesday, June 9, 2009

Hasty Generalizations

There is a very common fallacy called "hasty generalization." Basically, it says it is illogical to assume something happens all the time, everywhere, just because you've seen it once or a few times.

The fallacy is illustrated in this common joke.

An engineer, a scientist, and a mathematician were riding a train into Ireland. As they observed the passing vista, they saw a black sheep. The engineer commented, "Interesting, that the sheep in Ireland are black." The scientist corrected him, "Let's not generalize too hastily. At least one sheep in Ireland is black." The mathematician thoughtfully added, "No, at least one side of one sheep in Ireland is black."

We all generalize hastily without knowing that we are. But in some cases, the fallacy is rather obvious. Take the case of global warming. Physicist Freeman Dyson recently commented on such generalizations, drawn not only from regional warming data but also from inadequate computer models that do not represent the true dynamics of the planet.

Freeman Dyson Takes on the Climate Establishment

"Global Warming is Baloney"

Tennessee Burger King signs read, "Global Warming is Baloney."

Free speech in the world of contracts. Does a franchisee have the right to free speech when it embarrasses the brand name?

Sunday, June 7, 2009

I went to sleep in America....

Recently, John Cusack wrote an article criticising the direction of the Obama administration. He quoted Constitutional law professor Jonathan Turley:
"Well it can't get any worse: extreme executive privilege arguments in court, withholding of abuse photos, adoptions of indefinite detentions without trial, restarting military commissions, and blocking any torture investigation. Welcome to Bush 2.0..."
That is a nice summary of decision after decision coming out of our current administration, each of which has anyone who cares about the Constitution concerned, to say the least. Let's recap:

1. Sovereign immunity (April 09): The government can spy on you as much as they want, as long as they don't tell the public what they got on you.

2. Secrecy law (May 09): If the government did anything illegal between 9/11/01 and 1/22/09, they don't have to tell you, or show anyone the pictures they took.

3. Indefinite preventative detentions (May 21 09): If the government thinks you're dangerous, they can lock you up. Without a trial. Forever.

Now we have the latest:

4. Proposal to allow guilty pleas in capital cases--without a trial. If the government wants to sentence you to death, they can torture you--excuse me, interrogate you intensely--until you confess, and execute you based on that confession.

One comment I read in response to Glenn Greenwald's blog on this topic inspired the title of this post: "I went to bed in America and woke up in a soviet gulag."

I have one minor disagreement. I think we went to sleep in America, and none of us has woken up yet.

Wednesday, May 20, 2009

What is Pseudoscience?

The real purpose of the scientific method is to make sure Nature hasn't misled you into thinking you know something you don't actually know.
-- Robert M. Pirsig, Zen and the Art of Motorcycle Maintenance

When I googled "pseudoscience," I got a lot of pages describing pseudoscience as any group of ideas that is not accepted by the establishment and is not consistent with the established body of knowledge. In other words, most people use the word "pseudoscience" to name-call and disparage claims they do not believe and do not like. Sure, those pages talked a lot about how pseudoscience is not testable, but they did not define testability.

After all, there are different standards for tests. What is "testing" to one person is sloppy, inconclusive blather to another. In science, the quality of the data lies in how well others can independently "test" their validity and reproduce the same results. The more details provided to evaluate and reproduce the methodology, the higher the quality of the paper. The more you have to trust the authors' word for anything, the lower the quality. Real science does not require trust, or faith, in the methodology.

This motivated me to write my own page on how to identify pseudoscience. Rather than use the label to discredit non-conventional thinking, I would list testability criteria that apply to popularly accepted knowledge as well. In addition, I would emphasize that there is no specific cut-off in the pseudoscience/science spectrum. Instead, it is important to understand that the less scientifically rigorous a conclusion is, the more pseudoscientific it is.

How to Recognize Pseudoscience:

The bottom line in identifying pseudoscience is recognizing claims and conclusions that are not supported by the evidence provided. Just like commercial products, evidence comes in varying qualities. Pseudoscience is the infomercial or used-car-salesmanship of the science world, a lot of exaggerated, selective conclusions not supported by the quality of the product (or data).

Now all scientific papers have flaws. It is not that we expect perfection in the data. We just don't want any false advertising, if you will, that ignores the flaws.

As an example, I will use a study frequently cited by government and health authorities as further "scientific" evidence that thimerosal (mercury) in vaccines does not cause autism. In 2003, Madsen et al. published a retrospective study of autism rates in Denmark from 1971 to 2000, before and after the removal of thimerosal from vaccines. After the removal of thimerosal in 1992, autism rates actually skyrocketed. The authors concluded that thimerosal cannot be a causal factor in the development of autism. (Reference: Madsen et al., "Thimerosal and the Occurrence of Autism: Negative Ecological Evidence From Danish Population-Based Data," Pediatrics, Vol. 112, No. 3, September 2003, pp. 604-606.)

I chose a paper popularly cited by authorities to show that my pseudoscience criteria are not biased against novel ideas, but can apply to conventionally accepted claims as well. Real science is not attached to an ideological agenda either for or against the establishment; it calls out flaws wherever they occur and demands that conclusions match the data.

1. Vague definitions / Lack of transparency and critical details: Pseudoscience does not use precise, objective, and transparent definitions that can be independently evaluated for validity. Its definitions are often murky and lacking in detail, forcing readers to trust that the authors really measured what readers think they measured.

For example, in Madsen et al's paper, the authors defined autism as meeting criteria for ICD-8 code 299, "psychosis proto-infantilis." Why did they not use ICD-8 code 308, "infantile autism," like everyone else? When I emailed the lead author for clarification, he simply replied that Denmark had always used ICD-8 code 299 for autism, had skipped ICD-9 altogether, and that for the specific diagnostic criteria I would have to consult a Danish child psychiatrist. In short, before 1994, for 23 out of the 29 years in the study period, we have no clear definition of autism. Since Denmark departed from the ICD-8 coding used in other countries, the specific diagnostic criteria they used are critical. As it stands, we simply have to trust that what the Danes saw as "psychosis proto-infantilis" is the same animal we see as "autism."

2. Changing definitions. Pseudoscience lumps inconsistently defined measurements all together. At best, it is sloppy and unreliable. At worst, it constitutes a sleight-of-hand. Imagine someone tells you that Chemical A caused a study subject to lose weight. But you find out that the "before" weight measurement was taken on a different scale than the "after" measurement. In real science, one would use exactly the same scale for an honest comparison.

Denmark removed thimerosal in 1992. In 1994, Denmark changed its diagnosis of autism from ICD-8 299 "psychosis proto-infantilis" to ICD-10 F84 "childhood autism." Before 1995, Denmark's autism rates counted only inpatient autism cases, those severe enough to be admitted for hospitalization. After 1995, Denmark started counting both hospitalized cases and outpatient cases. So autism rates skyrocketed after thimerosal was removed. Did they really skyrocket, or did it look like that because the definition of autism changed to count a lot more kids? We will never know. The data don't say.
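
To see how a mid-study switch in measurement can manufacture an apparent change, here is a toy version of the weight-scale analogy above. All of the numbers are invented for illustration; they have nothing to do with the Danish data.

    # Toy illustration of the mismatched-scale problem. All numbers are invented.
    true_weight_before = 80.0   # kg, the subject's actual weight before Chemical A
    true_weight_after = 80.0    # kg, actual weight after: no real change at all

    before_scale_bias = +2.0    # the "before" scale reads 2 kg heavy
    after_scale_bias = -3.0     # the "after" scale reads 3 kg light

    measured_before = true_weight_before + before_scale_bias   # 82.0 kg
    measured_after = true_weight_after + after_scale_bias      # 77.0 kg

    # The naive comparison "shows" a 5 kg loss that never happened.
    print(f"Apparent weight change: {measured_after - measured_before:+.1f} kg")

The "effect" is purely an artifact of switching instruments mid-study, which is exactly the worry when the diagnostic criteria and the counting rules for autism both change in the middle of the observation period.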

3. No real or actual data provided / Insufficient or adjusted statistics. Pseudoscience presents data in graphs or as some statistical artifact such as person-years or relative risk. It shows only "adjusted" data, even when a straight-up presentation of the raw numbers would do, and it omits the details that would allow independent evaluation of how the data were "adjusted" and whether that adjustment was valid.

In Madsen et al's paper, we only see a graph of incidence rates from 1971 to 2000. Any 5th grader who has watched PBS's Cyberchase can tell you that graphs can imply statistical significance, or a sharp rise or fall, where there is actually none. Was the rise significant or just a random fluctuation? The data don't say.
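
For readers who want to check this kind of claim themselves, here is a minimal sketch of the sort of test the paper would need to report. The counts are invented placeholders, not the Danish data; the test is the standard exact comparison of two Poisson counts observed over equal person-time.

    # Sketch of a check for "real rise vs. random fluctuation" in case counts.
    # The counts below are invented placeholders, NOT the Danish data.
    from math import comb

    def poisson_rate_comparison(count_before, count_after):
        """Exact conditional test for two Poisson counts over equal person-time:
        under the null hypothesis of equal rates, count_after follows a
        Binomial(total, 0.5) distribution."""
        total = count_before + count_after
        # One-sided p-value: chance of seeing at least this many "after" cases under the null.
        return sum(comb(total, k) for k in range(count_after, total + 1)) / 2 ** total

    print(poisson_rate_comparison(20, 28))  # roughly 0.16: consistent with random fluctuation
    print(poisson_rate_comparison(20, 45))  # well below 0.05: unlikely to be chance alone

Without the raw counts (or the person-years behind the rates), a reader cannot run even this crude check, which is the point of this criterion.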

4. No control group / Poor control group. Pseudoscience infers causation from a change in one group, without comparing it to another group that hasn't experienced the change. In science, the comparison group is called the control group. Pseudoscience often has no control group, or designs a control group that is so different from the study group that you can't pinpoint what caused any difference in results.

In the study in question, there simply was no control group. As a correlational study, it did not involve experimentation, and therefore did not strictly follow the scientific method. All correlational and anecdotal studies are quasi- or pseudo-scientific right out of the gates, and need to be interpreted with many qualifiers and exceeding caution.

As it was, it would have been helpful to compare autism rates with those in other countries using similar diagnostic criteria, or compare autism rates between different groups in Denmark, such as autism rates between vaccinated children and unvaccinated children never exposed to thimerosal. Without any comparison at all, the study is not much more than a glorified anecdote of one country instead of one person. Without any controls, anecdotes of thousands of people are still anecdotes.
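
For contrast, here is a minimal sketch of what even the crudest controlled comparison would look like. The group sizes and case counts are hypothetical placeholders, purely to show the shape of the calculation.

    # Minimal sketch of the exposed-vs-unexposed comparison the study lacked.
    # All group sizes and case counts are hypothetical placeholders.
    def rate_per_10000(cases, group_size):
        return 10000 * cases / group_size

    exposed_cases, exposed_n = 60, 100_000      # children given thimerosal-containing vaccines
    unexposed_cases, unexposed_n = 55, 100_000  # children never exposed to thimerosal

    exposed_rate = rate_per_10000(exposed_cases, exposed_n)
    unexposed_rate = rate_per_10000(unexposed_cases, unexposed_n)

    print(f"Exposed:    {exposed_rate:.1f} autism cases per 10,000")
    print(f"Unexposed:  {unexposed_rate:.1f} autism cases per 10,000")
    print(f"Rate ratio: {exposed_rate / unexposed_rate:.2f}")

    # Only a comparison like this (with consistent diagnostic criteria in both
    # groups, and a confidence interval around the rate ratio) lets you attribute
    # a difference to the exposure rather than to the calendar.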

5. No or inadequate analysis of confounders. In science, confounders are factors that could have caused the result you see instead of, or in addition to, the factor you are studying (called the independent variable). Pseudoscience doesn't consider the impact of factors other than the independent variable.

To illustrate, in the Danish study, a possible confounder is that the amount of thimerosal Danes were exposed to was very low compared to that in other countries. It is possible that the amount of thimerosal is critical to whether it causes autism, and that the Danes were simply not affected by the low amount they got. Another possible confounder is that thimerosal is only one of several co-factors that play a role in causing autism, or one of many different causes. After all, autism spectrum disorder has many clinical presentations, and it is entirely possible that it has multiple and complicated etiologies. Just because autism rates rose despite the absence of thimerosal doesn't automatically mean thimerosal does not play a role. Just because obesity rates rise despite a shortage of ice cream doesn't mean ice cream doesn't make you fat.

6. No or inadequate analysis of flaws and weaknesses. Pseudoscience admits to no flaws or holes, or doesn't acknowledge as many as it should. Real science is self-critical, carefully placing the necessary limitations and qualifiers in its conclusions.

Although Madsen et al acknowledged that changing the definition of autism to include outpatient cases after 1994 might have inflated the rates somewhat, the authors downplayed the impact. They did not address the other vagaries of definition at all, nor any of the other flaws listed here, and they did not qualify or limit their conclusions in light of them.

7. Conclusions unwarranted by the data.  Pseudoscience likes to jump to conclusions. Real science is painfully slow, painfully tentative, and painfully precise about interpretation.

The only conclusion warranted by this study is that between 1994 and 2000, inpatient diagnoses of ICD-10 F84 childhood autism appear to have risen in Denmark despite no use of thimerosal in childhood vaccines, but it is unknown if this "rise" is significant or a result of random fluctuation. Certainly, no conclusions on causation or exoneration from causation can be inferred.

If you look at the data in this study against the rigorous standards of real science, they are too vague and inconsistent to support the conclusion that thimerosal cannot possibly cause autism. The study has all the outward trappings of "scientific" research, but none of the fundamental pillars of real science. It is the poster-child of pseudoscience.

Summary

Whenever you read a paper where you don't have enough information to independently evaluate and "test" the data and statistics for validity, a red flag should go up. Pseudoscience can be found everywhere, from widely accepted research findings supported by authorities to "alternative"/paranormal/conspiracy theories.

Wednesday, February 4, 2009

Blog on hiatus

I regret to announce that this fledgling blog is on hiatus from December 2008 to May 2009. I recently moved to an apartment in South America that has no phone or internet access. Despite my best intentions, I cannot do the kind of research necessary for blogging without continuous access to the internet. I plan to be back when I can.