Author: MarkH

  • Natural News' Mike Adams Adds Global Warming Denialism to HIV/AIDS denial, Anti-vax, Altie-med, Anti-GMO, Birther Crankery

    I still think that list is pretty incomplete (RationalWiki has more), but it’s interesting to see a potential internal ideological conflict as Adams sides with big business and the fossil fuel industry to suggest CO2 is the best gas ever. While he doesn’t appear to directly deny that CO2 is a greenhouse gas, he’s managed to merge his anti-government conspiratorial tendencies with his overriding naturalistic fantasy to decide that the government (and Al Gore) are conspiring to destroy our power infrastructure with carbon taxes and deny the world the benefit of 1,000 ppm CO2 in the atmosphere. His solution? Pump coal power exhaust into greenhouses growing food. I’m not kidding:

    This brings up an obvious answer for what to do with all the CO2 produced by power plants, office buildings and even fitness centers where people exhale vast quantities of CO2. The answer is to build adjacent greenhouses and pump the CO2 into the greenhouses.

    Every coal-fired power plant, in other words, should have a vast array of greenhouses surrounding it. Most of what you see emitted from power plant smokestacks is water vapor and CO2, both essential nutrients for rapid growth of food crops. By diverting carbon dioxide and water into greenhouses, the problem of emissions is instantly solved because the plants update the CO2 and use it for photosynthesis, thus “sequestering” the CO2 while rapidly growing food crops. It also happens to produce oxygen as a “waste product” which can be released into the atmosphere, (slightly) upping the oxygen level of the air we breathe.

    He seems to have forgotten about all the mercury, lead, cadmium, volatile organics, sulfur, etc., emitted by burning coal. I wonder how these different crank theories somehow manage to occupy the same brain, as his mercury paranoia appears temporarily overwhelmed by his anti-government conspiracism. I mean, he’s defending burning coal. It boggles the mind. I’m not exactly the biggest food purity buff, but even I find the idea of growing food in coal-fired exhaust somewhat, well, insane? Mad? Totally bonkers? What’s the right word for it? Maybe we need to create a new word for this level of craziness? Maybe we should name it after Adams, and call it Adamsian. You could say “Adamsian nuttery” to refer to a truly bizarre level of crankery. Unless it’s an April Fools’ Day prank, but then it was published on the 31st…nope, I think he’s just that nuts.

  • Anti-GMO writers show profound ignorance of basic biology and now Jane Goodall has joined their ranks

    It’s a sad day for the reality-based community: within the critiques of Jane Goodall’s new book ‘Seeds of Hope’ we find that, in addition to plagiarism and sloppiness with facts, she’s fallen for anti-GMO crank Jeffrey Smith’s nonsense.

    When asked by The Guardian whom she most despised, Goodall responded, “The agricultural company Monsanto, because I know too much about GM organisms and crops.” She might know too much, but what if what she knows is completely wrong?

    Many of the claims in Seeds of Hope can also be found in Genetic Roulette: The Documented Health Risks of Genetically Engineered Foods, a book by “consumer advocate” Jeffrey Smith. Goodall generously blurbed the book (“If you care about your health and that of your children, buy this book, become aware of the potential problems, and take action”) and in Seeds of Hope cites a “study” on GMO conducted by Smith’s “think tank,” the Institute for Responsible Technology.

    Like Goodall, Smith isn’t a genetic scientist. According to New Yorker writer Michael Specter, he “has no experience in genetics or agriculture, and has no scientific degree from any institution” but did study “business at the Maharishi International University, founded by the Maharishi Mahesh Yogi.” (In Seeds of Hope, Goodall also recommends a book on GM by Maharishi Institute executive vice president Steven M. Druker, who also has no scientific training). As Professor Bruce Chassy, an emeritus food scientist at the University of Illinois, told Specter, “His only professional experience prior to taking up his crusade against biotechnology is as a ballroom-dance teacher, yogic flying instructor, and political candidate for the Maharishi cult’s natural-law party.” Along with fellow food scientist Dr. David Tribe, Chassy runs an entire website devoted to debunking Smith’s pseudoscience.

    And it apparently escaped Goodall’s notice that Smith’s most recent book—the one that she fulsomely endorsed—features a foreword by British politician Michael Meacher, who, after being kicked out of Tony Blair’s government in 2003, has devoted a significant amount of time to furthering 9/11 conspiracy theories.

    Goodall is, of course, not the first scientist of fame and repute to fall for crankery and pseudoscience. From Linus Pauling to Luc Montagnier, even Nobel Prize-winning scientists have fallen for pseudoscientific theories. However, we should always be saddened when yet another famous scientist decides to go emeritus and abandon the reality-based community.

    There always seem to be a couple of different factors at play when this happens. For one, such scientists appear to have reached such a status that it becomes very difficult for others to criticize them. It’s like a state of ultra-tenure, in which you practically have to insult the intelligence of an entire continent before people will object to your misbehavior. The second common factor seems to be that they start operating in a field in which they lack expertise, but assume their expertise in other, unrelated fields should allow them to wade in. This appears to be the case with Goodall, as even someone with rudimentary knowledge of molecular biology should be able to see the gaping holes in the anti-GMO movement’s logic.

    For example, let’s start with the easy pickings at Natural News. A recent article by Jon Rapaport entitled “Brand new GMO food can rewire your body: more evil coming” is a perfect example of how the arguments made against GMO foods are based on a fundamentally unsound understanding of biology. The author writes:

    It’s already bad. Very bad. For the past 25 years, the biotech Dr. Frankensteins have been inserting DNA into food crops.

    The widespread dangers of this technique have been exposed. People all over the world, including many scientists and farmers, are up in arms about it.

    Countries have banned GMO crops or insisted on labeling.

    Now, though, the game is changing, and it’ll make things even more unpredictable. The threat is ominous and drastic, to say the least.

    GM Watch reports the latest GMO innovation: designed food plants that make new double-stranded (ds) RNA. What does the RNA do? It can silence a gene. It can activate a gene that was silent.

    If you imagine the gene structure as a board covered with light bulbs, in the course of living some genes light up (activation) and some genes go dark (silent) at different times. This new designed RNA can change that process. No one knows how.

    No one knows because no safety studies have been done. If you have genes lighting up and going dark in unpredictable ways, the functions of a plant or a body can change randomly.

    Pinball, roulette, use any metaphor you want to; this is playing with the fate of the human race. Walk around with designer-RNA in your body, and who knows what effects will follow.

    At this point, I think anyone familiar with the science of RNA interference (RNAi) has slapped themselves in the forehead; for anyone who wants a decent introduction, the Wikipedia entry does a pretty good job. It’s clear that the author is projecting his own ignorance of RNAi onto the rest of us. Briefly, until about 20 years ago, the so-called “central dogma of molecular biology” was a one-way road: DNA was transcribed into RNA, which was then translated into a functional protein. Even this is a pretty gross simplification, but it’s fair to say that prior to the discovery of RNAi, RNA was thought to be little more than a messenger in the cell, serving as an intermediary between the DNA code and the protein’s function. Yes, we knew that some RNA had enzymatic function, was incorporated into some proteins, etc., but it wasn’t seen so much as a regulatory molecule.

    Then, after a few intriguing findings in plants, Fire and Mello discovered that RNA itself could control the translation of other genes in C. elegans. Almost by accident, they found that if you inserted a double-stranded RNA molecule corresponding to an RNA transcript, that transcript would be degraded and the protein it encoded wouldn’t be expressed. It was a surprising finding. One would have expected the anti-sense strand of RNA to be the active agent, binding the sense strand, somehow inhibiting its entry into the ribosomal machinery, and ultimately interfering with translation. Instead, what they found was that double-stranded RNA had a function all of its own, with a previously unknown cellular machinery specifically purposed with processing dsRNA and inhibiting gene function through an entirely different mechanism. Subsequently we’ve also found that RNAi not only can directly regulate the levels of RNA transcripts, but can also regulate gene suppression and activation directly at promoter sequences on the DNA itself.
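
    If you like to think in code, here’s a purely illustrative Python sketch of the contrast (the sequences, the toy codon table, and the “guide” strand below are all made up for the example): the classic central dogma flow turns DNA into messenger RNA into protein, while RNAi adds a layer in which a short RNA complementary to a transcript marks it for degradation, so the protein never gets made.

    ```python
    # Toy illustration of the central dogma plus RNA interference (RNAi).
    # Sequences, gene, and guide strand are invented for illustration only.

    CODON_TABLE = {  # tiny subset of the real genetic code
        "AUG": "Met", "UUU": "Phe", "GGC": "Gly", "UAA": "STOP",
    }

    def transcribe(dna):
        """DNA coding strand -> messenger RNA (T becomes U)."""
        return dna.replace("T", "U")

    def translate(mrna):
        """mRNA -> protein, read three bases (one codon) at a time."""
        protein = []
        for i in range(0, len(mrna) - 2, 3):
            aa = CODON_TABLE.get(mrna[i:i + 3], "?")
            if aa == "STOP":
                break
            protein.append(aa)
        return "-".join(protein)

    def rnai_silenced(mrna, sirna_guide):
        """RNAi in cartoon form: if the short guide strand base-pairs with
        (is complementary to) a stretch of the transcript, the transcript
        is degraded and no protein is produced."""
        complement = {"A": "U", "U": "A", "G": "C", "C": "G"}
        target = "".join(complement[b] for b in reversed(sirna_guide))
        return target in mrna

    gene = "ATGTTTGGCTAA"       # hypothetical gene
    mrna = transcribe(gene)     # "AUGUUUGGCUAA"
    guide = "GCCAAACAU"         # guide strand complementary to part of the mRNA

    if rnai_silenced(mrna, guide):
        print("transcript degraded, no protein made")
    else:
        print("protein:", translate(mrna))
    ```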

    It’s amazing, decades after the discovery of RNA and understanding of its primary function, we discovered this new and incredibly complex layer of regulation of genetics by RNA molecules involved in everything from development to disease. But what does that mean for us? Should we be worried about gene-regulating RNA molecules in our food?

    Of course not! RNAi is an intrinsic function of most eukaryotes. Just about every food you’ve ever eaten in your entire life is chock-full of RNA molecules, including double-stranded inhibitory RNAs involved in the normal biological processes occurring within the cell. If other organisms could affect us by poisoning us with RNA, we wouldn’t last a minute. Weirdly, in GMO-paranoia world, whatever we consume has the potential to take over our bodies. The basic molecules of all life, which exist in everything we eat, take on new powers once handled by human scientists. The paper hinted at as evidence of this risk (but of course not actually cited by the author), which suggests miRNA may have “cross-kingdom” effects, is a great example of crank cherry-picking, as the evidence suggesting that finding may be an artifact is of course not mentioned. And we shouldn’t be surprised, as it would be a pretty extraordinary hole in our defenses if other organisms could so easily modify our gene expression.

    One of the great limitations of gene therapy has been that it’s extremely difficult to introduce genes, or to specifically regulate them, with external vectors. If it were as simple as just feeding us RNA, that would be something. For better or worse (likely better), your body is extremely resistant to other organisms tinkering with its DNA or cellular machinery.

    Ok, but then you say, “Hey, that’s Natural News, we know they’re morons.” Fine, then how about Clair Cummings in Common Dreams this week, panic-posting about the GMO threat to our water supply? It’s great evidence that being “progressive” is no insulation against being “anti-science”:

    Today is World Water Day. The United Nations has set aside one day a year to focus the world’s attention on the importance of fresh water. And rightly so, as we are way behind in our efforts to protect both the quantity and quality of the water our growing world needs today.

    And now, there is a new form of water pollution: recombinant genes that are conferring antibiotic resistance on the bacteria in the water.

    Researchers in China have found recombinant drug resistant DNA, molecules that are part of the manufacturing of genetically modified organisms, in every river they tested.

    Genetically engineered organisms are manufactured using antibiotic resistant genes. And these bacteria are now exchanging their genetic information with the wild bacteria in rivers. As the study points out, bacteria already present in urban water systems provides “advantageous breeding conditions for the(se) microbes.”

    Antibiotic resistance is perhaps the number one threat to public health today.

    Transgenic pollution is already common in agriculture. U.C. Berkeley Professor Ignacio Chapela was the first scientist to identify the presence of genetically engineered maize in local maize varieties in Mexico. He is an authority on transgenic gene flow. He says it is alarming that “DNA from transgenic organisms have escaped to become an integral component of the genome of free-living bacteria in rivers.” He adds that “the transgenic DNA studied so far in these bacteria will confer antibiotic resistance on other organisms, making many different species resistant to the antibiotics we use to protect ourselves from infections.”

    Our expensive attempts to filter and fight chemicals with other chemicals are only partially effective. Our attempts to regulate recombinant DNA technology has failed to prevent gene pollution. The only way to assure a sustainable source of clean water is to understand water for what it is: a living system of biotic communities, not a commodity. It is a living thing and as such it deserves our respect, as does the human right to have abundant fresh clean water for life.

    You heard it: now they’re making up a new category of pollution, “gene pollution”.

    Let’s go back to some of the basic science here, so again, we can display just how silly and uninformed these Chicken Littles are. When molecular biologists wish to produce large quantities of a DNA or protein, what they usually do is insert the sequence into an easy-to-grow organism like E. coli, yeast, or some other cell, and then have the biological machinery of those cells produce it for us. This is one of the simplest forms of genetic modification, and we use it for everything from making plasmid DNA in the lab to producing recombinant human insulin for diabetics. In order to make sure your organism is making your product of interest, you include a gene that encodes resistance to an antibiotic (in bacteria, most commonly ampicillin), so that when you grow your bug with that antibiotic in the mix, you can be sure the only cells growing are the ones that are working for you. Other resistance genes we use are often for antibiotics we don’t use in humans, like hygromycin or neomycin, which is nephrotoxic if injected (but also poorly absorbed).
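
    For the programmers in the audience, here’s a toy sketch of why the selection step works; the class, the numbers, and the gene labels are mine, purely for illustration, not any real lab software or protocol. Only cells that took up the plasmid, and therefore carry both the resistance gene and your gene of interest, survive in the presence of the antibiotic.

    ```python
    # Toy model of antibiotic selection in recombinant DNA work.
    # Names and numbers are illustrative, not from any real protocol.

    from dataclasses import dataclass, field

    @dataclass
    class Cell:
        genes: set = field(default_factory=set)

        def survives(self, antibiotic):
            # A cell grows in the presence of the antibiotic only if it
            # carries the matching resistance gene (e.g. beta-lactamase
            # for ampicillin).
            resistance = {"ampicillin": "bla (beta-lactamase)"}
            return resistance[antibiotic] in self.genes

    # Transformation is inefficient: most cells never take up the plasmid.
    plasmid = {"bla (beta-lactamase)", "gene_of_interest"}
    culture = [Cell(set(plasmid)) if i % 100 == 0 else Cell() for i in range(10_000)]

    survivors = [c for c in culture if c.survives("ampicillin")]
    print(f"{len(survivors)} of {len(culture)} cells survive on ampicillin")
    print("all survivors carry the insert:",
          all("gene_of_interest" in c.genes for c in survivors))
    ```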

    “That’s terrible!”, you say, “how could we teach so many bacteria to be resistant to antibiotics! Surely this will kill us all!”

    Um, no. For one, the resistance genes we use aren’t novel or made de novo by humans; they already existed before a single human was ever treated with an antibiotic. The first antibiotic discovered, penicillin, is a natural product. It’s an ancient agent in an ongoing war between microorganisms. The antidote for penicillin and related molecules was actually discovered at about the same time as penicillin itself. Beta-lactamase, which breaks open the structure of the penicillins and inhibits their antibiotic effects, was around long before humans figured out how to harness antibiotics for our own purposes. The gene, which we clone into plasmids to make our GMO bacteria work for us, came from nature too. Now if we were growing bacteria in vancomycin or linezolid, yeah, I’d be pissed, but that’s not what’s happening. And even though we still use older penicillins clinically, it’s with full knowledge that resistance has been around for decades, and they are used for infections that we know never become resistant to the drugs, like group B strep (or syphilis). The war for penicillin is over. We lost. Any bug that’s going to become resistant to penicillin already is.

    The antibiotic resistance that plagues our ICUs and hospitals doesn’t come from GMOs being taught to fight ampicillin, it comes from overuse of more powerful antibiotics in humans. The genes that are providing resistance to even beta-lactam resistant antibiotics like the carbapenems or methicillin are the result of a more classic form of genetic modification – natural selection.

    So what is the risk to humans from the DNA encoding a wimpy beta-lactamase or whatever being detected in water? Zilch. Nada. Zip.

    The paranoia over recombinant DNA has persisted for decades despite no rational basis for a threat to humans or other living things. The continued paranoia over rDNA is a sign that the GMO paranoids get their science from bad movies, not textbooks or serious knowledge of the risks and benefits of this technology. rDNA is why we have an unlimited supply of insulin, it’s how we have virtually all of our knowledge of molecular biology, it’s how we even have an understanding of how things like antibiotic resistance work. It’s been around since the 70s and how many times have you heard of it actually hurting a person?

    This is the state of the argument over genetically-modified organisms. To the uninitiated this stuff sounds like it might be kind of scary. But with any real understanding of the molecular mechanisms of these technologies, the plausibility of their risk drops to zero. Sadly, Goodall has not only shown a pretty poor level of scholarship with this new book, but also, has fallen in with cranks promoting implausible risks of this biotechnology. It’s unfortunate because she should be respected for her previous work as an environmentalist and a conservationist. This is what is so annoying about anti-GMO paranoia. It makes environmentalists look like idiots, as it distracts from actual threats to the environment with invented threats and irrational fears of biotech. I’m sure I’ll now be accused of being in the pocket of big ag, as I am in every thread on GMO, but I assure you, I have no financial interests, or any dealings with these companies ever. I’m irritated with the anti-GMO movement because it’s an embarrassment. It’s Luddism, and ignorance masquerading as environmentalism. It’s bad biology. It’s the progressive equivalent of creationism or global warming denial. It’s classic anti-science, and we shouldn’t tolerate it.

  • Fixing the Chargemaster Problem for the Uninsured

    For those disturbed by the evils of the hospital chargemaster as exposed by Brill’s piece in Time, Uwe E. Reinhardt’s proposed solution is a must-read.

    While hospitals are never going to voluntarily charge the uninsured the same rate they charge Medicare (and will probably be less forgiving the more they think they can get out of you), that’s no reason we can’t force them to with state law. Apparently that’s what Reinhardt had them do in Jersey:

    In the fall of 2007, Gov. Jon Corzine of New Jersey appointed me as chairman of his New Jersey Commission on Rationalizing Health Care Resources. On a ride to the airport at that time I learned that the driver and his family did not have health insurance. The driver’s 3-year-old boy had had pus coming out of a swollen eye the week before, and the bill for one test and the prescription of a cream at the emergency room of the local hospital came to more than $1,000.

    By circuitous routes I managed to get that bill reduced to $80; but I did not leave it at that. As chairman of the commission, I put hospital pricing for the uninsured on the commission’s agenda.

    After some deliberation, the commission recommended initially that the New Jersey government limit the maximum prices that hospitals can charge an uninsured state resident to what private insurers pay for the services in question. But because the price of any given service paid hospitals or doctors by a private insurer in New Jersey can vary by a factor of three or more across the state (see Chapter 6 of the commission’s final report), the commission eventually recommended as a more practical approach to peg the maximum allowable prices charged uninsured state residents to what Medicare pays (see Chapter 11 of the report).

    Five months after the commission filed its final report, Governor Corzine introduced and New Jersey’s State Assembly passed Assembly Bill No. 2609. It limits the maximum allowable price that can be charged to uninsured New Jersey residents with incomes up to 500 percent of the federal poverty level to what Medicare pays plus 15 percent, terms the governor’s office had negotiated with New Jersey’s hospital industry.

    Reinhardt also makes clear, as I did in my original piece, that the problem of excess cost is not the chargemaster or hospital profits, which are not so extraordinary (at least compared to excess drug costs, insurance administration, inefficiently delivered services, unnecessary services, etc.). But the injustice of the uninsured facing these inflated bills, which are designed to antagonize large payers like health insurance companies, should be addressed. You can’t bleed a radish, and hospitals should stop trying to when it comes to the uninsured. Since they won’t without government encouragement, such legislation should be considered at the state and national levels.
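
    Just to make the arithmetic of the New Jersey rule concrete, here’s a minimal sketch of the cap as Reinhardt describes it; the poverty-level figure is a placeholder, since the real threshold depends on household size and year, and the example prices are hypothetical.

    ```python
    # Rough sketch of the New Jersey cap described above: uninsured residents
    # with incomes up to 500% of the federal poverty level can be billed at
    # most the Medicare rate plus 15%. The FPL figure below is a placeholder.

    FEDERAL_POVERTY_LEVEL = 23_550   # hypothetical annual FPL for a family of four

    def max_allowable_charge(chargemaster_price, medicare_rate, household_income):
        if household_income <= 5 * FEDERAL_POVERTY_LEVEL:
            return min(chargemaster_price, medicare_rate * 1.15)
        return chargemaster_price  # the law doesn't protect higher earners

    # A hypothetical $1,000 chargemaster bill for a service Medicare pays $80 for:
    print(max_allowable_charge(1_000.00, 80.00, household_income=60_000))  # 92.0
    ```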

  • New homebirth statistics show it's way too dangerous, and Mike Shermer on liberal denialism

    Two links today for denialism blog readers; both are pretty thought-provoking. The first, from Amy Tuteur, is on the newly-released statistics on homebirth in Oregon. It seems her crusade to have the midwives share their mortality data is justified: when they were forced to release this data in Oregon, planned homebirth was about 7-10 times more likely to result in neonatal mortality than planned hospital birth.

    I’m sure Tuteur won’t mind me stealing her figure and showing it here (the original source of the data is Judith Rooks’ testimony):

    Oregon homebirth neonatal mortality statistics, from the Skeptical OB.

    Armed with data such as these, it needs to become a point of discussion for both obstetricians and midwives that out-of-hospital births carry a dramatically higher neonatal mortality, and that this is worse for midwives without nursing training (DEMs, or direct-entry midwives). It’s their body and their decision, but this information should be crucial to informing women as to whether or not they should take this risk. These figures also reflect only neonatal mortality; one can assume they speak to higher rates of morbidity as well, as longer distances and poorer recognition of fetal distress and complications will lead to worse outcomes even when the child survives. It should be noted this data is also consistent with nationwide CDC data on homebirth DEMs, and is actually better than the midwife data for some states like Colorado.
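
    For anyone curious how a “7-10 times more likely” figure is derived, the arithmetic is simple; the counts below are invented placeholders, not the Oregon numbers (those are in Rooks’ testimony and Tuteur’s chart).

    ```python
    # Neonatal mortality is usually expressed as deaths per 1,000 live births.
    # These counts are made-up placeholders, NOT the Oregon data.

    def mortality_rate_per_1000(deaths, births):
        return 1000 * deaths / births

    planned_hospital = mortality_rate_per_1000(deaths=120, births=200_000)  # 0.6/1000
    planned_homebirth = mortality_rate_per_1000(deaths=8, births=1_600)     # 5.0/1000

    print(f"rate ratio: {planned_homebirth / planned_hospital:.1f}x")  # ~8.3x
    ```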

    The second article worth pointing out today (even though it’s old) is from Michael Shermer in Scientific American on the liberal war on science. Regular readers know that I’m of the belief that there isn’t really a difference between left- and right-wing ideology in acceptance of science; each just rejects different findings that collide with its ideology.

    The left’s war on science begins with the stats cited above: 41 percent of Democrats are young Earth creationists, and 19 percent doubt that Earth is getting warmer. These numbers do not exactly bolster the common belief that liberals are the people of the science book. In addition, consider “cognitive creationists”—whom I define as those who accept the theory of evolution for the human body but not the brain. As Harvard University psychologist Steven Pinker documents in his 2002 book The Blank Slate (Viking), belief in the mind as a tabula rasa shaped almost entirely by culture has been mostly the mantra of liberal intellectuals, who in the 1980s and 1990s led an all-out assault against evolutionary psychology via such Orwellian-named far-left groups as Science for the People, for proffering the now uncontroversial idea that human thought and behavior are at least partially the result of our evolutionary past.

    There is more, and recent, antiscience fare from far-left progressives, documented in the 2012 book Science Left Behind (PublicAffairs) by science journalists Alex B. Berezow and Hank Campbell, who note that “if it is true that conservatives have declared a war on science, then progressives have declared Armageddon.” On energy issues, for example, the authors contend that progressive liberals tend to be antinuclear because of the waste-disposal problem, anti–fossil fuels because of global warming, antihydroelectric because dams disrupt river ecosystems, and anti–wind power because of avian fatalities. The underlying current is “everything natural is good” and “everything unnatural is bad.”

    Whereas conservatives obsess over the purity and sanctity of sex, the left’s sacred values seem fixated on the environment, leading to an almost religious fervor over the purity and sanctity of air, water and especially food.

    I’m worried that Shermer has confused liberal Luddism with denialism, and I would argue some anti-technology skepticism is healthy and warranted. While I agree that the anti-GMO movement does delve into denialist waters with regularity, these are not good examples he has chosen. One needs to be cautious with technology, and it’s a faith-based assumption that technology can solve all ills. I’m with Evgeny Morozov on this one: the assumption that there is (or should be) a technological fix for every problem has become almost a religious belief system. Appropriately including the potential perils of a technology in its cost-benefit analysis is not a sign of being anti-science. Even overblowing specific risks because of individual values isn’t really anti-science either. It might be anti-human to put birds before human needs as with wind turbines, but no one is denying that wind turbines generate electricity. And while liberals may be overestimating the risk of, say, nuclear waste generation over carbon waste generation (guess which is a planet-wide problem!), it doesn’t mean they don’t think nuclear power works or is real. They just have an arguably skewed risk perception, which is an established problem in cases of ideological conflict with science or technology. There is also reasonable debate to be had over the business practices of corporations (Monsanto in his example), which need and deserve strong citizen push-back and regulation to prevent anti-competitive or abusive behavior.

    Anti-science requires the specific rejection of data, the scientific method, or strongly-supported scientific theory due to an ideological conflict, not because one possesses superior data or new information. I don’t think Shermer actually listed very good examples of this among liberals. If you’re going to talk about GMO denialism, don’t complain about people fighting with Monsanto; talk about how anti-GMO advocates make up crazy claims about the foods themselves (see Natural News, for example), such as that they cause autism or cancer. And even then it’s difficult to say this is a completely liberal form of denialism, as Kahan’s work again shows a pretty even ideological split on GMO.

    I agree that liberals are susceptible to anti-science and the mechanism is the same – ideological conflict with scientific results. However, the liberal tendency towards skepticism of technology is healthy in moderation, and anti-corporatism is not automatically anti-science. In an essay that was striving to say we must be less ideological and more pragmatic, Shermer has wrongly lumped in technological skepticism, and anti-corporatism with science denial.

  • Bittman changes his tune on Sugar Study, while Mother Jones Doubles Down

    There’s been an interesting edit to Mark Bittman’s sugar post: he has changed his tune on the PLoS One sugar study, and now acknowledges that obesity is important too. That was big of him; it is, after all, the most important factor. Maybe my angry letter to the editor had an effect, but he’s grudgingly changed this statement:

    In other words, according to this study, obesity doesn’t cause diabetes: sugar does.

    To:

    In other words, according to this study, it’s not just obesity that can cause diabetes: sugar can cause it, too, irrespective of obesity. And obesity does not always lead to diabetes.

    The second sentence is totally unnecessary. Of course obesity doesn’t always cause diabetes, or heart attack, or whatever. Nor do cigarettes always cause lung cancer. Nor does sugar intake always lead to obesity or diabetes. But obesity is the primary cause of type 2 diabetes, just as cigarettes are the primary cause of lung cancer, and who knows what sugar is doing.

    Mother Jones, sadly, has decided to double down, calling the PLoS One study the “Best. Diet. Study. Ever.” It’s not, of course. It’s merely interesting and suggestive of an effect. It is not nearly proof of causation. They also laud the Mediterranean diet study (maybe it was supposed to be the Best. Study. Ever.?); however, they again show they’re not actually reading these papers, because if you read our coverage of the study you’d know they didn’t actually study the Mediterranean diet! In a case of the blind leading the blind, they quote Bittman’s misinformed piece on the Mediterranean diet study:

    Let’s cut to the chase: The diet that seems so valuable is our old friend the “Mediterranean” diet (not that many Mediterraneans actually eat this way). It’s as straightforward as it is un-American: low in red meat, low in sugar and hyperprocessed carbs, low in junk. High in just about everything else — healthful fat (especially olive oil), vegetables, fruits, legumes and what the people who designed the diet determined to be beneficial, or at least less-harmful, animal products; in this case fish, eggs and low-fat dairy.

    This is real food, delicious food, mostly easy-to-make food. You can eat this way without guilt and be happy and healthy. Unless you’re committed to a diet big on junk and red meat, or you don’t like to cook, there is little downside.

    Except for one critical fact. The subjects assigned to the Mediterranean diet did not have lower consumption of red meat, sugar and hyperprocessed carbs, or other junk! If you look at the supplementary data, you see that the subjects took the positive recommendations of the diet (olive oil, nuts, fish), and more or less ignored the negative recommendations (less meat, less spreadable fats/butter, less baked goods). If you look at figures like supplementary S6, the study groups did not change their diets in these categories relative to the controls, so the effects on their cardiovascular events relative to controls aren’t likely to be from the diet recommendations. When there were changes relative to baseline, even when statistically significant, the changes were tiny.

    The participants in this study actually had a very high fat intake, about 35-40% of calories across all groups. And while there was a statistically-significant decrease in cardiovascular events like stroke and heart attack in both study groups (Med + olive oil, Med + nuts), only one arm of the so-called Mediterranean diet (Med + Olive oil) had a non-significant decrease in mortality, while the other arm (Med + Nuts) had a similar curve compared to the “do nothing” control. My interpretation of this, and it’s fine to be critical of it, is that this isn’t that meaningful. If anything, the only variable correlating with decrease in mortality was excess olive oil consumption (> 4 tbsp/day), not the Mediterranean diet. Either that, or eating nuts cancels out the beneficial effects of the diet on mortality.

    This is why people always dump on nutrition science when it appears to change every 10 years. Results get overblown, and when the inevitable regression towards the mean occurs, we get blamed for it. The reality is, the press coverage of science is extremely poor, and there is not adequate critical analysis and presentation of results to their audience.

  • Don't Switch to the Mediterranean Diet Just Yet

    The New York Times made big news with reports that the New England Journal of Medicine study on the beneficial effects of the Mediterranean diet showed it could dramatically reduce the rates of heart attack and stroke. But this study has major issues that bear directly on whether or not physicians should make new recommendations about dietary intake of fats like olive oil, or whether patients should adopt the diet as a whole. Let’s talk about the trial.

    First of all, this is a randomized, controlled trial in which 7447 men and women between 55 and 80 years of age who had major risk factors for cardiovascular disease such as diabetes, obesity, smoking, hyperlipidemia, etc., were divided evenly between 3 groups: one which received recommendations on a “low fat” diet, and two in which there was extensive counseling on the Mediterranean diet combined with either a free supply of extra-virgin olive oil or, alternatively, a variety of nuts.

    The primary end point studied was the combined number of heart attacks, strokes, and deaths, and over the course of about 5 years of study 288 such events occurred. If you combine all three of these end points and evaluate their frequency between the groups, you find 96 occurred in the “Mediterranean diet with extra-virgin olive oil” group, or 3.8% of that group, 83 in the Mediterranean diet with nuts group, or 3.4% of that population, and 109 in the control group, or 4.4% of the controls.
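
    Before going further, it’s worth translating those percentages into absolute terms. Here’s a quick back-of-the-envelope calculation using only the numbers quoted above; note the “30% reduction” you’ll see quoted elsewhere is a relative figure from the paper’s adjusted analysis, which is why it looks more impressive than this crude difference.

    ```python
    # Crude absolute-risk arithmetic from the percentages quoted above
    # (combined heart attack / stroke / death over ~5 years of follow-up).

    control = 0.044          # 4.4% of the "low fat" control group
    med_olive_oil = 0.038    # 3.8% of the Mediterranean + olive oil group

    arr = control - med_olive_oil            # absolute risk reduction
    rrr = arr / control                      # relative risk reduction (crude)
    nnt = 1 / arr                            # number needed to treat

    print(f"ARR = {arr:.1%}, crude RRR = {rrr:.0%}, NNT = {nnt:.0f} over ~5 years")
    # -> ARR = 0.6%, crude RRR = 14%, NNT = 167 over ~5 years
    ```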

    But before anyone takes these results to heart, we have to recognize major flaws with the study design and the populations that comprised these three groups. First, the rate of primary events was surprisingly low for such a high-risk group, and because the study was stopped early, absurdly for “ethical reasons”, the number of events is quite low. For the life of me I can’t think of what that ethics committee was thinking. These results are not that dramatic. Further, the “low fat” diet was very ineffectually enforced or counseled, to the point that midway through the study the authors revised the protocol to include more counseling sessions. Evaluating the supplementary data, specifically table S7, you see this control group was in no way on a low fat diet. They were still consuming 37-39% of their calories from fat! “Low fat” should mean 10-15% of calories from fat, so basically, everyone ignored the diet. Further, all of the groups consumed a similar amount of total fat, mono- and poly-unsaturated fats, and even used olive oil as their main culinary fat. All groups consumed (see table S5) a similar amount of red meat (forbidden from all diets), butter, soda, baked goods, etc. The places where there seemed to be more dramatic differences were in olive oil consumption (about 50% of controls had > 4 tbsp a day, vs 80% of the “nuts” group and about 90% of the “extra-virgin olive oil” group), wine consumption (modest, at about 30% in the diet groups vs 25% in the “low fat” control), and nuts (crazy high at 90% in the nuts group, vs 40% and 20% in “olive oil” and “low fat”), as well as a modest elevation in the amount of fish, fruits and vegetables in the Mediterranean groups. Further, some of these differences, such as the consumption of alcohol, fruits and vegetables, were higher in the Mediterranean groups at baseline (notice no mean change in table S6), so the groups may have started out in a different place.

    What does this mean? First of all, we have to reject the notion that this study compared a Mediterranean diet to a “low fat” diet. This was a study of basically no diet intervention versus increasing your intake of fish, nuts and/or olive oil. Otherwise, there didn’t appear to be compliance with the negative recommendations of the Mediterranean diet: to decrease red meat intake, baked goods, dairy, etc. The participants basically took the recommended items and increased them in their diets, but didn’t exclude any of the “discouraged” items. This is very interesting, but to call it the “Mediterranean diet” is misleading. In reality, it’s diet supplementation with olive oil, nuts and fish.

    Second, the final results, while they sound impressive (30% reduction in combined primary end points!), are actually not as important as some of the less-emphasized findings. For this we have to evaluate the secondary endpoint, which happens to be the one we really care about – all-cause mortality. They could not show a difference in mortality! So while you might be less likely to have a heart attack or stroke, you’re no less likely to die. This is why I’m so confused they ended the study early. This is really the only end point that matters, and it was unchanged at the interval at which the ethics committee decided this study had to be stopped for efficacy. Why did they do this? The evidence is suggestive that with more participants, the Mediterranean diet + olive oil arm might have diverged a bit and shown a benefit compared to the do-nothing “low fat” control, but this didn’t reach significance.

    What have we learned? Compared to other Spanish folks between the ages of 55 and 80, all with cardiovascular risk factors, those that added olive oil, nuts, and fish to their diet had fewer cardiovascular events, but no difference in their mortality compared with people that did nothing to change their diet.

    Why did this make the front page of the New York Times? Let’s show a little bit more critical analysis of findings, and not just swallow the PR.

  • No, It's Not the Sugar – Bittman and MotherJones have overinterpreted another study

    Diet seems to be all over the New York Times this week, with an oversell of the benefits of the Mediterranean diet, and now Mark Bittman, everyone’s favorite food scold, declaring sugar is the culprit for rising diabetes. His article is based on this interesting new article in PLoS One and begins with this wildly-inaccurate summary:

    Sugar is indeed toxic. It may not be the only problem with the Standard American Diet, but it’s fast becoming clear that it’s the major one.

    A study published in the Feb. 27 issue of the journal PLoS One links increased consumption of sugar with increased rates of diabetes by examining the data on sugar availability and the rate of diabetes in 175 countries over the past decade. And after accounting for many other factors, the researchers found that increased sugar in a population’s food supply was linked to higher diabetes rates independent of rates of obesity.

    In other words, according to this study, obesity doesn’t cause diabetes: sugar does.

    No! Not even close. I hate to repeat his misstatement, because I’d hate to reinforce this as a new myth, but it’s critical to see his full mistake here. This is a wildly inaccurate summary of the authors’ findings, and one they don’t even endorse in their discussion. Bittman has actually just said “obesity doesn’t cause diabetes”, and now has proven himself a deluded fool.

    Let’s talk about this paper. This is what is called an “ecological study”, which means it studies populations as a whole, rather than individual patients. Using data from the United Nations Food and Agricultural Organization, the International Diabetes Federation, and various economic indicators from the World Bank, the authors compared populations of whole countries, in particular the prevalence of diabetes as correlated with other factors such as GDP, urbanization, age, obesity, and the availability of certain varieties of food like sugar, meat, fibers, cereals and oil. Using the rise, or fall, of diabetes prevalence over the last decade in various countries, they correlated this change with increasing availability of sugar, obesity, urbanization, aging populations, etc., and found a few interesting things. For one, increases in GDP, overweight, and obesity tracked significantly with increasing diabetes prevalence. But interestingly, when those factors were controlled for, increasing availability of sugar also tracked linearly with increasing diabetes prevalence, and the longer the duration of the exposure, the worse it got.
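
    For those unfamiliar with what this kind of analysis actually looks like, here’s a bare-bones sketch of a country-level regression in Python. The data are random placeholders, and the paper’s actual econometric approach on repeated cross-sectional data is considerably more involved, but it shows the basic move: regress country-level diabetes prevalence on sugar availability while adjusting for obesity and GDP.

    ```python
    # Minimal sketch of an ecological (country-level) regression: diabetes
    # prevalence regressed on sugar availability while adjusting for obesity
    # and GDP. All data below are random placeholders, not the study's data.

    import numpy as np

    rng = np.random.default_rng(0)
    n_countries = 175

    sugar = rng.normal(300, 80, n_countries)      # kcal/person/day available
    obesity = rng.normal(20, 8, n_countries)      # % of adults with BMI >= 30
    gdp = rng.normal(15, 10, n_countries)         # GDP per capita, $1000s
    diabetes = 2 + 0.15 * obesity + 0.005 * sugar + rng.normal(0, 1, n_countries)

    # Ordinary least squares with an intercept, via numpy.
    X = np.column_stack([np.ones(n_countries), sugar, obesity, gdp])
    coef, *_ = np.linalg.lstsq(X, diabetes, rcond=None)

    print("intercept, sugar, obesity, gdp coefficients:", np.round(coef, 3))
    # The sugar coefficient is the adjusted association: the change in diabetes
    # prevalence per extra kcal/person/day of sugar, holding obesity and GDP fixed.
    ```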

    However, this does not mean that “obesity doesn’t cause” diabetes; if anything, it’s further support for the exact opposite. While a correlative study can’t be a “smoking gun” for anything, the data in this paper support the idea that increasing modernization/GDP, obesity, and sugar availability are all correlated with higher diabetes prevalence. Even if the sugar relationship is causal, which is no guarantee, the increase in sugar availability could only explain 1/4 of the increase in diabetes prevalence. Obesity is still the main cause of diabetes, which can be demonstrated on an individual level: increases in weight result in loss of glycemic control, and subsequent weight loss results in a return to euglycemia. In particular, in studies of bariatric surgery, with both restrictive and bypass procedures, weight loss is accompanied by improvement in diabetes. The attempts of toxin paranoids like Bittman to reclassify sugar as a diabetes-causing agent, and to dismiss obesity as a cause, are highly premature.

    Mother Jones has a slightly more balanced read, but it still oversells the results.

    This is a correlation, of course, and correlation does not always equal causation. On the other hand, it’s an exceptionally strong correlation.

    Well, that’s another overstatement. Want to see a picture?

    http://www.plosone.org/article/fetchObject.action?uri=info:doi/10.1371/journal.pone.0057873.g002&representation=PNG_M

    Article Source: The Relationship of Sugar to Population-Level Diabetes Prevalence: An Econometric Analysis of Repeated Cross-Sectional Data

    Basu S, Yoffe P, Hills N, Lustig RH (2013) The Relationship of Sugar to Population-Level Diabetes Prevalence: An Econometric Analysis of Repeated Cross-Sectional Data. PLoS ONE 8(2): e57873. doi:10.1371/journal.pone.0057873

    Figure 2. Adjusted association of sugar availability (kcal/person/day) with diabetes prevalence (% adults 20–79 years old).

    I wonder what the R-squared is on that line fit. Now, consider a comparison with obesity rates by diabetes prevalence:

    Figure 1. Relationship between obesity and diabetes prevalence rates worldwide.

    Obesity prevalence is defined as the percentage of the population aged 15 to 100 years old with body mass index greater than or equal to 30 kg/meters squared, from the World Health Organization Global Infobase 2012 edition. Diabetes prevalence is defined as the percentage of the population aged 20 to 79 years old with diabetes, from the International Diabetes Federation Diabetes Atlas 2011 edition. Three-letter codes are ISO standard codes for country names.
    doi:10.1371/journal.pone.0057873.g001

    Hmm, they didn’t fit a line here, but I’d bet the fit would be better. Diabetes strongly correlates with BMI; this has been shown time and again using national survey data like NHANES or SHIELD. And before people start whining about BMI as an imperfect measure of obesity: it is perfectly appropriate for studies at a population level, and other metrics such as waist size, hip/waist ratios, etc., all show the same thing. Diabetes risk increases linearly with BMI, with as many as 30% of people with BMI > 40 having diabetes, and further, we know from cohort and interventional studies that weight loss results in decreased diabetes. Much of this data is correlative as well (with the exception of the weight-loss studies), and the study that would prove this for certain – dividing people into diets providing excess fat, vs sugar, vs mixed calories, vs controls, with resultant measurement of diabetes rates – would be unethical. Either way, declaring sugar the enemy is both incomplete and premature. While this paper provides interesting correlative evidence that increased sugar availability increases diabetes prevalence, it is still subject to the risk of confounding, it is correlative, and the link does not explain away other known causes of type 2 diabetes such as obesity. It is a warning, however, and we should dedicate more study towards determining if sugar consumption (rather than mere availability) is an independent risk factor for type 2 diabetes.
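
    Computing an R-squared like the one I wondered about above is straightforward once you have the plotted points; the arrays below are placeholders standing in for the per-country values, not the study’s data.

    ```python
    # R-squared for a simple least-squares line fit, e.g. diabetes prevalence
    # against obesity prevalence (or sugar availability). Placeholder data.

    import numpy as np

    x = np.array([10, 15, 20, 25, 30, 35, 40], dtype=float)  # e.g. % obese
    y = np.array([3.0, 4.5, 6.0, 7.0, 9.5, 10.0, 12.5])      # e.g. % diabetic

    slope, intercept = np.polyfit(x, y, 1)
    predicted = slope * x + intercept

    ss_res = np.sum((y - predicted) ** 2)          # residual sum of squares
    ss_tot = np.sum((y - y.mean()) ** 2)           # total sum of squares
    r_squared = 1 - ss_res / ss_tot

    print(f"slope={slope:.2f}, R^2={r_squared:.2f}")
    ```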

    Bittman has wildly overstated the case made by this article. He should retract his claims, and the title and false claims should be corrected by the editors. This is a terrible misrepresentation of what this study shows.

  • What happens when you study conspiracy theories? The conspiracy theorists make up conspiracy theories about you!

    I’ve known about this effect for a while, as I’ve been variously accused of being in the pocket of big pharma, big ag, big science, Democrats, Republicans, etc. Now Stephan Lewandowsky, in follow-up to his “NASA Faked the Moon Landings – Therefore (Climate) Science is a Hoax” paper, has used these conspiratorial responses to study how conspiracy theorists respond to being studied! It’s called “Recursive fury: Conspiracist ideation in the blogosphere in response to research on conspiracist ideation”.

    Here’s the abstract:

    Conspiracist ideation has been repeatedly implicated in the rejection of scientific propositions, although empirical evidence to date has been sparse. A recent study involving visitors to climate blogs found that conspiracist ideation was associated with the rejection of climate science and the rejection of other scientific propositions such as the link between lung cancer and smoking, and between HIV and AIDS (Lewandowsky, Oberauer, & Gignac, in press; LOG12 from here on). This article analyzes the response of the climate blogosphere to the publication of LOG12. We identify and trace the hypotheses that emerged in response to LOG12 and that questioned the validity of the paper’s conclusions. Using established criteria to identify conspiracist ideation, we show that many of the hypotheses exhibited conspiratorial content and counterfactual thinking. For example, whereas hypotheses were initially narrowly focused on LOG12, some ultimately grew in scope to include actors beyond the authors of LOG12, such as university executives, a media organization, and the Australian government. The overall pattern of the blogosphere’s response to LOG12 illustrates the possible role of conspiracist ideation in the rejection of science, although alternative scholarly interpretations may be advanced in the future.

    Awesome. It’s actually a great paper, from the introduction discussing Diethelm and McKee’s work on conspiratorial ideation (who cited us in their original paper), to the comparisons between censorship accusations by diverse anti-science movements from tobacco/cancer denial to HIV/AIDS denial; Lewandowsky et al. lay the groundwork for understanding this problem as a fundamental characteristic of all anti-science. They even cite a book chapter in which the authors make the link that conspiracies are specifically used to rhetorically challenge science when one lacks adequate data (Lahsen, M. (1999). The detection and attribution of conspiracies: the controversy over Chapter 8. In G. Marcus (Ed.), Paranoia within reason: a casebook on conspiracy as explanation (pp. 111-136). Chicago: University of Chicago Press.). I’ll have to look that one up, as that was our primary conclusion about denialism when we started writing about it in 2007.

    The authors then go on to the conspiracist reaction to their original paper:

    When the article by Lewandowsky et al. became available for download in July-August 2012, the climate denialist blogosphere responded with considerable intensity along several prongs: Complaints were made to the first author’s university alleging academic misconduct; several freedom-of-information requests were submitted to the author’s university for emails and documents relating to LOG12; multiple re-analyses of the LOG12 data were posted on blogs which purported to show that the effects reported by LOG12 did not exist; and a number of hypotheses were disseminated on the internet with arguably conspiracist content. This response is not altogether surprising in light of research which has shown that threats – in particular to people’s sense of control – can trigger targeted small-scale conspiracy theories (Whitson & Galinsky, 2008), especially those involving a specific opponent (Sullivan, Landau, & Rothschild, 2010).

    So what does any good scientist who is interested in the empirical study of conspiracy theories do in such a situation? Mine it for data!

    The remainder of this article reports a content analysis of the hypotheses generated by the blogosphere to counter LOG12. The extent and vehemence of contrarian activity provided a particularly informative testbed for an analysis of how conspiracist ideation contributes to the rejection of science among web denizens. Unlike previous analyses of web content, the present project was conducted “in real time” as the response to LOG12 unfolded, thus permitting a fine-grained temporal analysis of the emerging global conversation.

    Using Google Alerts and other strategies they tracked the response to their paper throughout the denialsphere, then evaluated the responses using 6 criteria to judge whether the author used conspiracist tendencies independent of actual content. These criteria were great, and as I read them I couldn’t help thinking they are really a beautiful summary of the aberrant thought processes of the conspiracist. They were: (1) assuming nefarious intent (NI) on the part of their opponent; (2) delusions of persecution, including Galileo comparisons (persecution/victimization or PV) – awesome; (3) a “nihilistic degree of skepticism”/paranoid ideation (NS); (4) an inability to believe in coincidence, or “not an accident” (NoA) thinking; (5) toleration of inconsistencies and contradictions in their own counter-hypotheses as long as they challenge the “official” version (Must-Be-Wrong or MbW); and (6) the incorporation of contrary evidence as further evidence of a conspiracy, thus “self-sealing” the hypothesis (SS). This is a really brilliant breakdown of the behavior, if you ask me, in particular number 6, for which they even provide the perfect example:

    Concerning climate denial, a case in point is the response to events surrounding the illegal hacking of personal emails by climate scientists, mainly at the University of East Anglia, in 2009. Selected content of those emails was used to support the theory that climate scientists conspired to conceal evidence against climate change or manipulated the data (see, e.g., Montford, 2010; Sussman, 2010). After the scientists in question were exonerated by 9 investigations in 2 countries, including various parliamentary and government committees in the U.S. and U.K., those exonerations were re-branded as a “whitewash” (see, e.g., U.S. Representative Rohrabacher’s speech in Congress on 8 December 2011), thereby broadening the presumed involvement of people and institutions in the alleged conspiracy. We refer to this “self-sealing” criterion by the short label SS.

    At denialism blog we’ve been describing these tactics for years; the Crank Howto in particular seems to incorporate most of them. That the authors recognized the persecution complex of the conspiracist is especially heartwarming.

    For the meat of the study, the authors then go through the evolution of reactions to their paper, and it’s fascinating. Starting with lots of allegations of “scamming” (must be wrong) and of a smear to make them look like nutters (persecution/victimization), the conspiracy theories then evolved to cover everything from whether or not the authors actually contacted skeptic blogs (amazingly, the blogs they did contact came out and appear to have lied about not being contacted), to persecutorial delusions about the authors blocking individual skeptics’ IP addresses from accessing the paper (and further conspiracies that when they are unblocked it’s just to make them look paranoid), to conspiracies about it all being a ploy by the Australian government (nefarious intent), and it gets crazier and crazier from there. One of the most fascinating aspects of the evolution of the response was how, predictably, as more information was made available, rather than quashing conspiracies, the conspiracy theorists would just broaden the nefarious actors to larger and larger circles of foes:

    Second, self-sealing reasoning also became apparent in the broadening of the scope of presumed malfeasance on several occasions. When ethics approvals became public in response to an FOI request, the presumption of malfeasance was broadened from the authors of LOG12 to include university executives and the university’s ethics committee. Similarly, the response of the blogosphere evolved from an initial tight focus on LOG12 into an increasingly broader scope. Ultimately, the LOG12 authors were associated with global activism, a $6 million media initiative, and government censorship of dissent, thereby arguably connecting the response to LOG12 to the grand over-arching theory that “climate change is a hoax.” Notably, even that grand “hoax” theory is occasionally thought to be subordinate to an even grander theory: one of the bloggers involved in the response to LOG12 (cf. Table 1) considers climate change to be only the second biggest scam in history. The top-ranking scam is seen to be modern currency, dismissed as “government money” because it is not linked to the gold standard

    And doesn’t that bring this back beautifully, full circle, to the authors’ original hypothesis in the first paper that free-market extremism is behind global warming denialism?

    Finally the authors discuss implications for science communication, and, unlike most people, I think they actually understand the problem. That is, you can’t fix this problem with more communication, and more data. The nature of the conspiracy theorist is that all additional data and all contradictory data will only be used to demonstrate further evidence of conspiracy, that the conspiracy is even larger, or that the data are fraudulent. The “self-sealing” nature of the conspiracy theory, as the authors describe it, makes it fundamentally immune to penetration by logic, reason, or additional information.

    Implications for science communication. A defining attribute of conspiracist ideation is its resistance to contrary evidence (e.g., Bale, 2007; Keeley, 1999; Sunstein & Vermeule, 2009). This attribute is particularly troubling for science communicators, because providing additional scientific information may only serve to reinforce the rejection of the evidence, rather than foster its acceptance. A number of such “backfire” effects have been identified, and they are beginning to be reasonably well understood (Lewandowsky, Ecker, et al., 2012). Although suggestions exist about how to rebut conspiracist ideation – e.g., by indirect means, such as affirmation of the competence and character of proponents of conspiracy theories, or affirmation of their other beliefs (e.g., Sunstein & Vermeule, 2009) – we argue against direct engagement for two principal reasons.
    First, much of science denial takes place in an epistemically closed system that is immune to falsifying evidence and counterarguments (Boudry & Braeckman, 2012; Kalichman, 2009). We therefore consider it highly unlikely that outreach efforts to those groups could be met with success. Second, and more important, despite the amount of attention and scrutiny directed towards LOG12 over several months, the publication of recursive hypotheses was limited to posts on only 24 websites, with only 13 blogs featuring more than one post (see Table 1). This indicates that the recursive theories, while intensely promoted by certain bloggers and commenters, were largely contained to the “echo chamber” of climate denial. Although LOG12 received considerable media coverage when it first appeared, the response by the blogosphere was ignored by the mainstream media. This confinement of recursive hypotheses to a small “echo chamber” reflects the wider phenomenon of radical climate denial, whose ability to generate the appearance of a widely held opinion on the internet is disproportionate to the smaller number of people who actually hold those views (e.g., Leviston, Walker, & Morwinski, 2012). This discrepancy is greatest for the small group of people who deny that the climate is changing (around 6% of respondents; Leviston et al., 2012). Members of this small group believe that their denial is shared by roughly half the population. Thus, although an understanding of science denial is essential given the importance of climate change and the demonstrable role of the blogosphere in delaying mitigative action, it is arguably best met by underscoring the breadth of consensus among scientists (Ding, Maibach, Zhao, Roser-Renouf, & Leiserowitz, 2011; Lewandowsky, Gignac, & Vaughan, 2012) rather than by direct engagement.

    Don’t argue with cranks. I can’t agree more. And historically this is what has worked with denialist groups. You don’t debate them as if they’re honest brokers; you treat them as the defective brains that they are, and eventually their influence dwindles and they’re reduced to a small community of losers sharing their delusions of grandeur and righteous indignation in some tiny corner of the internet.

    The key to preventing denialism isn’t arguing with those who have already formed fixed, irrational ideas. It can only happen with prevention – early education that emphasizes logic, scientific methods, rational thought, and non-ideological, pragmatic approaches to problem solving.

  • What is the cause of excess costs in health care Part 4 – Time's "Bitter Pill", CEO compensation and the Kafkaesque chargemaster

    Steven Brill’s extensive piece in Time has once again generated a good discussion of why Americans pay so much more for health care than other countries, and while I agree with most of his critiques, he seems to have gotten overly hung up on the hospital chargemaster.

    Readers of this blog know I’ve also discussed reform in health care and the diverse sources of excess cost – including price gouging on pharmaceuticals, defensive medicine, expensive end-of-life care, the high cost of primary care in the ER, etc. – and both Brill and I appear to have relied on the same source of data, the McKinsey report. We’ve also discussed the “hidden tax” of the uninsured, as well as some signs of reform (although of note, my belief that EMRs would reduce waste has proven wrong, as the software designers have built their programs to gouge Medicare on behalf of the doctor – increasing costs!).

    Brill has written a piece for Time that takes a different tack, however. Rather than starting from the data on sources of excess costs (although he does reference them), he starts with patients’ bills and then tries to figure out where all these expenses come from. This is a creative and innovative approach to the problem, except I cringed when I read it because I knew what he was going to find before the end of the first page. Medical bills are insane.

    The total cost, in advance, for Sean to get his treatment plan and initial doses of chemotherapy was $83,900.

    Why?

    The first of the 344 lines printed out across eight pages of his hospital bill — filled with indecipherable numerical codes and acronyms — seemed innocuous. But it set the tone for all that followed. It read, “1 ACETAMINOPHE TABS 325 MG.” The charge was only $1.50, but it was for a generic version of a Tylenol pill. You can buy 100 of them on Amazon for $1.49 even without a hospital’s purchasing power.

    Dozens of midpriced items were embedded with similarly aggressive markups, like $283.00 for a “CHEST, PA AND LAT 71020.” That’s a simple chest X-ray, for which MD Anderson is routinely paid $20.44 when it treats a patient on Medicare, the government health care program for the elderly.

    Every time a nurse drew blood, a “ROUTINE VENIPUNCTURE” charge of $36.00 appeared, accompanied by charges of $23 to $78 for each of a dozen or more lab analyses performed on the blood sample. In all, the charges for blood and other lab tests done on Recchi amounted to more than $15,000. Had Recchi been old enough for Medicare, MD Anderson would have been paid a few hundred dollars for all those tests. By law, Medicare’s payments approximate a hospital’s cost of providing a service, including overhead, equipment and salaries.

    This is the fascinating aspect of his article, but also the tragedy. Because Brill becomes so focused on what hospitals charge, and on the horrifying apparent markup of medicine, I worry that people will come away thinking that this is the cause of excess costs.

    The problem Brill is describing is that the costs itemized on hospital bills come from a mysterious document called the “chargemaster”. I’m pretty sure no one knows how the prices on it are calculated, and I know there is no rational basis for them. But as shocking as the bills generated from this document are, they are largely irrelevant to the excess costs of healthcare. The chargemaster is a red herring, and despite Brill’s protestations that it is some central evil in health care expenditure, it really isn’t.

    The tragic circumstances Brill describes, repeatedly, are what someone who is uninsured, underinsured, or paying out of pocket experiences when they receive their bill. It is covered with charges that seem insane – because they are – and lacking the knowledge and tools to deal with these ridiculous charges, the uninsured, understandably, believe they need to pay the full, ridiculous bill. But they don’t, and shouldn’t. An uninsured person receiving one of these bills is a little like a tourist who has just been dropped into a foreign bazaar, with no understanding of the rules of the market, discounts, haggling, or gouging, and no idea what other actors in the same market are paying. The reality is that hospital bills, and many other bills in medical care, are little more than an opening gambit in an irrational, and largely incomprehensible, cost war between providers and payers. Hospitals, knowing that insurers never pay a full hospital bill and will haggle and discount charges away or pay a percentage of “chargemaster” costs, push back by artificially inflating charges – as demonstrated throughout Brill’s article. When someone who doesn’t have the power or expertise of the insurance company receives their bill (hospitals of course can’t send different bills to different kinds of customers, right?), they end up paying more, or worse, think they have to pay the full bill, because they have fallen into this market without the tools or knowledge to navigate it.
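    To make the arithmetic of that opening gambit concrete, here is a minimal sketch with purely hypothetical numbers (the $200 target and the 25% settlement rate are my assumptions, not figures from Brill’s article): once the insurer is expected to pay only a fraction of billed charges, the list price has to be inflated so the discounted payment still covers the hospital’s target, and the uninsured patient is the only party who ever sees that list price.

```python
# Minimal sketch of the "opening gambit" dynamic described above.
# All numbers are hypothetical assumptions, not figures from Brill's article.

target_revenue = 200.00        # what the hospital actually needs to recoup for a service
insurer_pays_fraction = 0.25   # assume the insurer settles at ~25% of billed charges

# Work backwards: the list price must be high enough that the discounted
# payment still hits the hospital's target.
chargemaster_price = target_revenue / insurer_pays_fraction

insurer_payment = chargemaster_price * insurer_pays_fraction  # what the insurer remits
uninsured_bill = chargemaster_price                           # what the uninsured patient sees

print(f"Chargemaster (list) price: ${chargemaster_price:,.2f}")  # $800.00
print(f"Insurer actually pays:     ${insurer_payment:,.2f}")     # $200.00
print(f"Uninsured patient billed:  ${uninsured_bill:,.2f}")      # $800.00
```

    The only point of the sketch is that the list price and the expected payment are coupled: the smaller the fraction insurers actually pay, the higher the chargemaster number has to climb to compensate.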

    The proof of this is the medical-billing advocates Brill interviews for the article. For a fee, you can hire someone with expertise similar to the payers’ to fight back, and often reduce these bills to a tiny fraction of their original amount.

    Shocked by her bill from Stamford hospital and unable to pay it, Janice S. found a local woman on the Internet who is part of a growing cottage industry of people who call themselves medical-billing advocates. They help people read and understand their bills and try to reduce them. “The hospitals all know the bills are fiction, or at least only a place to start the discussion, so you bargain with them,” says Katalin Goencz, a former appeals coordinator in a hospital billing department who negotiated Janice S.’s bills from a home office in Stamford.

    Goencz is part of a trade group called the Alliance of Claim Assistant Professionals, which has about 40 members across the country. Another group, Medical Billing Advocates of America, has about 50 members. Each advocate seems to handle 40 to 70 cases a year for the uninsured and those disputing insurance claims. That would be about 5,000 patients a year out of what must be tens of millions of Americans facing these issues — which may help explain why 60% of the personal bankruptcy filings each year are related to medical bills.

    “I can pretty much always get it down 30% to 50% simply by saying the patient is ready to pay but will not pay $300 for a blood test or an X-ray,” says Goencz. “They hand out blood tests and X-rays in hospitals like bottled water, and they know it.”

    After weeks of back-and-forth phone calls, for which Goencz charged Janice S. $97 an hour, Stamford Hospital cut its bill in half. Most of the doctors did about the same, reducing Janice S.’s overall tab from $21,000 to about $11,000.

    Brill also seems very disturbed by two consistently capitalistic elements of hospitals: the high compensation of hospital CEOs and non-physician administrators, and the ability of “non-profit” hospitals to actually make a profit. The difference is, of course, that “profits” in this case go back into the hospital – into staff, equipment, buildings, etc. – rather than into shareholders’ pockets, so one could argue the money largely goes back into the community (disclosure – I work for a non-profit hospital). But what about CEO pay? Brill seems largely offended that they make so much, even suggesting at the end of the article that hospital profits be taxed at 75% (ensuring no one will ever invest in them again) and that CEO pay be limited to some multiple of a lower-level employee’s salary. Sounds wonderful, except it will only work if that model is applied to CEOs across the board. Brill cites hospital CEO compensation of between $1 million and $4 million. But the average CEO of a large American company made about $13 million as of 2011. Yes, CEOs are paid too much, but it’s a global problem, not just in healthcare. If anything, healthcare CEOs are paid a great deal less than their counterparts in other industries, and it seems strange to single out healthcare as the one service where CEO pay is somehow more sinful than in others, especially if you want hospitals and medical centers to be able to attract talented management. So am I bothered by CEO pay? Sure, but not as much in my industry as in, say, finance, where CEOs who aren’t making a profit, and are hurting shareholders, get rewarded with bonuses and golden parachutes. Brill seems to be upset that the industry is profitable and that its CEOs are well compensated. But this isn’t so much a source of excess costs in healthcare as it is a source of excess costs in capitalism, and I don’t anticipate that changing any time soon, nor do I think it’s ultimately a good idea to nationalize or socialize the industry so that it makes no profit.

    This is too bad, because after an extensive documentation of the absurdity of the hospital billing scheme, and his anger at CEO pay, Brill’s final recommendations hit many of the true sources of excess cost.

    So how can we fix it?

    Changing Our Choices
    We should tighten antitrust laws related to hospitals to keep them from becoming so dominant in a region that insurance companies are helpless in negotiating prices with them. The hospitals’ continuing consolidation of both lab work and doctors’ practices is one reason that trying to cut the deficit by simply lowering the fees Medicare and Medicaid pay to hospitals will not work. It will only cause the hospitals to shift the costs to non-Medicare patients in order to maintain profits — which they will be able to do because of their increasing leverage in their markets over insurers. Insurance premiums will therefore go up — which in turn will drive the deficit back up, because the subsidies on insurance premiums that Obamacare will soon offer to those who cannot afford them will have to go up.

    Agreed. In particular, I find the financial conflict of doctors owning labs and radiology equipment that they can then profit from by ordering tests very disturbing. It’s been shown in the literature that this arrangement results in physicians ordering more unnecessary tests, which demonstrates that it’s an unacceptable conflict of interest that only increases costs.

    Similarly, we should tax hospital profits at 75% and have a tax surcharge on all nondoctor hospital salaries that exceed, say, $750,000. Why are high profits at hospitals regarded as a given that we have to work around? Why shouldn’t those who are profiting the most from a market whose costs are victimizing everyone else chip in to help? If we recouped 75% of all hospital profits (from nonprofit as well as for-profit institutions), that would save over $80 billion a year before counting what we would save on tests that hospitals might not perform if their profit incentives were shaved.

    We could save lots of money if we forbade various industries from making a profit and taxed their incomes at 75%. But I don’t think this is a viable suggestion in a capitalist society, and I believe that if we did increase competition in insurance, pharmaceutical, and other healthcare markets we could decrease costs. Socialism is not the answer, although Brill makes a compelling argument for a public option, as Medicare’s administration and pricing are highlighted in the article as a model of efficiency compared to the private insurers. I’d have no problem with expanding Medicare as a payer. Next:

    We should outlaw the chargemaster. Everyone involved, except a patient who gets a bill based on one (or worse, gets sued on the basis of one), shrugs off chargemasters as a fiction. So why not require that they be rewritten to reflect a process that considers actual and thoroughly transparent costs? After all, hospitals are supposed to be government-sanctioned institutions accountable to the public. Hospitals love the chargemaster because it gives them a big number to put in front of rich uninsured patients (typically from outside the U.S.) or, as is more likely, to attach to lawsuits or give to bill collectors, establishing a place from which they can negotiate settlements. It’s also a great place from which to start negotiations with insurance companies, which also love the chargemaster because they can then make their customers feel good when they get an Explanation of Benefits that shows the terrific discounts their insurance company won for them.

    While I agree outlawing the chargemaster should be considered, I’m not really sure it will work. For one, yes, it is a fiction; it’s not an actual source of excess costs like so many of the other problems described in the McKinsey report. And hospitals are in a bind. They can’t charge what things actually cost, because insurance companies will still try to pay less. Hospitals know what their costs are, but it’s nearly impossible to truly itemize a patient’s stay: to account for the exact time every physician and nurse spent with them versus another patient, exactly how many disposable materials were used, as well as all the other costs hospitals need to balance, such as covering uninsured patients (most of whom never pay their bills), money-losing divisions and “mission” costs, and keeping an adequate nest egg at the end of the year to ensure enough capital to expand as needed to meet demand, buy new equipment, and hire new staff. My guess as to what the chargemaster is accomplishing (and it is just a guess) is that it’s a strategy that reliably returns a certain amount of profit on each hospitalization relative to a patient’s utilization of specific services, while providing plausible deniability for what is ultimately overcharging the insured to subsidize the total costs of running the operation. It has to be inflated to cover all the costs of overhead, supplies, etc., that just can’t be reliably quantified, and the fact that insurance companies will only pay a fraction of the final bill. The alternative would be to “bundle” costs so that hospitalizations for specific services cost a set amount, and hospitals then have an incentive to come in under that reasonable fixed price for the service. But then you run into the problem of penalizing the hospitals that treat needier, poorer, sicker, older populations that cost more to treat and will have poorer outcomes, readmissions, and complications, as the toy example below illustrates.
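    Here is a toy sketch of that bundled-payment tradeoff, using entirely made-up numbers (the $10,000 bundle and the per-patient costs are my assumptions, not anything from Brill or the McKinsey report): a flat price per episode rewards the hospital whose patients are cheap to treat and punishes the one whose patients are sicker, unless the bundle is risk-adjusted.

```python
# Toy illustration of the bundled-payment tradeoff: a flat price per episode
# rewards hospitals whose patients are cheap to treat and penalizes those whose
# patients are sicker and costlier. All numbers are made up for illustration.

BUNDLED_PRICE = 10_000  # hypothetical fixed payment per hospitalization for one service

def net_margin(actual_costs):
    """Hospital's total surplus (or loss) under a flat bundled payment."""
    return sum(BUNDLED_PRICE - cost for cost in actual_costs)

suburban_costs = [8_000, 8_500, 9_000]        # healthier, cheaper-to-treat patients
inner_city_costs = [11_000, 12_500, 14_000]   # sicker, older, poorer patients

print(net_margin(suburban_costs))    #  4500: surplus
print(net_margin(inner_city_costs))  # -7500: loss, despite identical payment per case
```

    Any real bundled-payment scheme would need some risk adjustment for patient mix, and that is exactly the part that is hard to get right.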

    It’s a pickle. You want hospitals to have some measure of profitability – they provide a necessary service, employment, and pride to the community. But if you create profit incentives that put them in conflict with the community – like avoiding poorer, sicker, older patients – they won’t provide the very service for which they exist.

    I don’t have a good solution to this problem. We need to own up to the costs of treating the difficult populations, rather than continuing to play the insurance shell game. Hospitals have to treat people who are sick no matter what because of laws like EMTALA. EMTALA is an unfunded mandate, passed during the Reagan years, that doesn’t specify how hospitals shall recoup their losses when someone can’t pay, just that they have to provide care for sick people no matter what. The unwillingness to pay for the poor and the uninsured pushes even their primary care into the ER, and they present with more acute problems that earlier access to primary care could have managed more cheaply, raising the costs of their care and foisting it on the hospitals – especially the non-profit, inner-city hospitals providing care to the most at-risk. They then have to figure out some way of spreading these costs back onto Medicare, the insured, and paying patients without attaching a rider to the bill that simply says, “paying for the sick and uninsured in your community – $1,200” (you then actually pay only $200 for this service, because everything is inflated). I suspect the chargemaster, in its irrational and frightening way, is accomplishing this task. It’s not pretty, but I’d love to hear suggestions to address the need of hospitals to provide universal health care without a funding source to do so.

    Then Brill gets to the things which I agree the McKinsey report shows are our real sources of excess costs:

    We should amend patent laws so that makers of wonder drugs would be limited in how they can exploit the monopoly our patent laws give them. Or we could simply set price limits or profit-margin caps on these drugs. Why are the drug profit margins treated as another given that we have to work around to get out of the $750 billion annual overspend, rather than a problem to be solved?

    Just bringing these overall profits down to those of the software industry would save billions of dollars. Reducing drugmakers’ prices to what they get in other developed countries would save over $90 billion a year. It could save Medicare — meaning the taxpayers — more than $25 billion a year, or $250 billion over 10 years. Depending on whether that $250 billion is compared with the Republican or Democratic deficit-cutting proposals, that’s a third or a half of the Medicare cuts now being talked about.

    We pay twice as much for our drugs as any other country, R&D is a BS excuse, and Medicare’s inability to bargain collectively is anti-capitalist and anti-market. How is it possible that in a capitalist society a buyer isn’t allowed to bargain for a bulk purchase? It’s just a wealth-redistribution scheme! And the proof is systems like the Veterans Administration, which is allowed to negotiate for lower prices, and does, typically paying about half as much for the same drugs.

    Similarly, we should tighten what Medicare pays for CT or MRI tests a lot more and even cap what insurance companies can pay for them. This is a huge contributor to our massive overspending on outpatient costs. And we should cap profits on lab tests done in-house by hospitals or doctors.

    I think that particular conflict of interest should be banned, but one must remember again that differing CT and MRI costs are largely a reflection of which hospitals have higher “mission” costs. It’s not the CT in the suburbs that costs Medicare more; it’s the CT in the inner city.

    Finally, we should embarrass Democrats into stopping their fight against medical-malpractice reform and instead provide safe-harbor defenses for doctors so they don’t have to order a CT scan whenever, as one hospital administrator put it, someone in the emergency room says the word head. Trial lawyers who make their bread and butter from civil suits have been the Democrats’ biggest financial backer for decades. Republicans are right when they argue that tort reform is overdue. Eliminating the rationale or excuse for all the extra doctor exams, lab tests and use of CT scans and MRIs could cut tens of billions of dollars a year while drastically cutting what hospitals and doctors spend on malpractice insurance and pass along to patients.

    Tort reform may help, but its effects will take decades to see. It’s almost impossible to adequately quantify the cost of “defensive” medicine, and how deeply it has instilled in generations of physicians the habit of overtesting, overtreating, and overspending on patients. Even if such changes were made, the paranoia runs deep within us. I agree it’s costly, but it will take more than just removing the threat of lawsuits to generate more responsible cost practices among physicians.

    Over the past few decades, we’ve enriched the labs, drug companies, medical device makers, hospital administrators and purveyors of CT scans, MRIs, canes and wheelchairs. Meanwhile, we’ve squeezed the doctors who don’t own their own clinics, don’t work as drug or device consultants or don’t otherwise game a system that is so gameable. And of course, we’ve squeezed everyone outside the system who gets stuck with the bills.

    We’ve created a secure, prosperous island in an economy that is suffering under the weight of the riches those on the island extract.

    And we’ve allowed those on the island and their lobbyists and allies to control the debate, diverting us from what Gerard Anderson, a health care economist at the Johns Hopkins Bloomberg School of Public Health, says is the obvious and only issue: “All the prices are too damn high.”

    If you throw in a ban on direct-to-consumer advertising of drugs, implementation of CPRS or a universal record standard for the EMR (not these Medicare-gouging EMRs that for-profit companies are designing), and better end-of-life planning, data, and management, I think we’d go a long way toward reducing costs. Brill’s final recommendations hit the mark, but I’m concerned his obsession with the admittedly Kafkaesque chargemaster is a distraction. That’s not the true source of the excess costs. Rather, I suspect it’s a way for hospitals to rationalize the redistribution of resources from Medicare and the insured to cover their broader mission.

    And one final petty complaint which, as a surgeon, I couldn’t resist:

    Steve H.’s bill for his day at Mercy contained all the usual and customary overcharges. One item was “MARKER SKIN REG TIP RULER” for $3. That’s the marking pen, presumably reusable, that marked the place on Steve H.’s back where the incision was to go.

    No! The pen is sterile! We have to mark the skin after prep – the alcohol-based preps we use will wash away most markings placed before sterile preparation, and iodine preps hide them pretty well too. Complaining about a lot of potentially “reusable” items is also just not going to fly in this modern world of MRSA and nosocomial infections.

  • Denialism From Forbes Courtesy of Heartland Hack James Taylor

    It’s fascinating when you catch a new bogus claim as it enters the denialsphere, bounces from site to site, and echoes about without any evidence of critical analysis or intelligence on the part of the denialists. A good example of this was an article by the Heartland Institute’s contributor to Forbes, James Taylor, falsely claiming only a minority of scientists endorse the IPCC position on the causes of global warming. This new nonsense meme gets repeated by crank extraordinaire Steve Milloy, bounces the next day to Morano’s denialist aggregation site, and before long I’m sure we’ll be seeing it on Watts’s site, Fox News, and, in a couple more weeks, in an argument with our conservative uncles.

    The claim is, of course, a deception (or possibly total incompetence) on the part of Heartland’s “senior fellow for environment policy” (I wonder if there is significance to the use of “environment” as opposed to “environmental”). Linking to this paper in the journal Organization Studies, Taylor makes the false claim that a mere 36% of scientists, when surveyed, hold the consensus view. Anyone want to guess at the deception? Cherry-picking! It was a survey largely of industry engineers and geoscientists in Alberta, home of the tar sands. In the study authors’ words:

    To address this, we reconstruct the frames of one group of experts who have not received much attention in previous research and yet play a central role in understanding industry responses – professional experts in petroleum and related industries. Not only are we interested in the positions they take towards climate change and in the recommendations for policy development and organizational decision-making that they derive from their framings, but also in how they construct and attempt to safeguard their expert status against others. To gain an understanding of the competing expert claims and to link them to issues of professional resistance and defensive institutional work, we combine insights from various disciplines and approaches: framing, professions literature, and institutional theory.

    This is pretty classic denialist cherry-picking and is one of the most common deceptive practices of denialists like Taylor. By suggesting a survey of industry geoscientists can be generalized to scientists as a whole, Taylor has demonstrated the intellectual dishonesty inherent in denialist argumentation. You might as well make claims about the consensus that tobacco causes lung cancer by surveying scientists at Altria corporate headquarters.

    The paper is actually quite interesting, and I’m glad I read it, as it is consistent with our thesis that ideological conflicts result in refusal to accept science that contradicts one’s overvalued ideas or personal interests. The authors surveyed a professional association of geoscientists in Alberta, Canada (APEGGA), most of whom work for the petroleum industry, and then performed a detailed analysis of their free-text responses on why they accept or reject climate science. What they found was that respondents’ answers conformed to five general “frames”. The most common response was that global warming is real and we need to act with regulation to address the problem (at 36%, the number quoted by Taylor to suggest there is no consensus); another 5% expressed doubt about the cause but agreed greenhouse gases needed to be regulated. The second most common responses were “it’s nature” or “it’s an eco-regulatory conspiracy”, and these responses showed a great deal of hostility in their language towards environmentalism, proponents of global warming, liberalism, etc. These came in at about 34% of responses and were more common from older white males in the higher tiers of the oil industry corporate structure. The most common remaining frame was a “fatalist” frame (17%), which could take or leave the science because, hey, we’re screwed no matter what we do.

    The authors weren’t attempting to validate the consensus with this study; rather, they were trying to understand how scientists working in industry justify their position on global warming, as they often reject the consensus view of climate science. When a true cross-section of climate scientists is sampled, agreement with the consensus is found among about 90% of scientists and 97% of those publishing in the field. A more appropriate summary of what these authors showed is that oil industry geoscientists and engineers most frequently express a view consistent with the consensus IPCC position and a need for regulation of greenhouse gases; a similar but slightly smaller number express hostility to the consensus view; and about half as many as that think we’re screwed no matter what we do.

    It all would have been short-circuited if Forbes had exercised any kind of appropriate editorial control over this crank James Taylor, or if the echo chamber had read some of the comments on the initial post before rebroadcasting a false claim far and wide. But then, that would require intellectual honesty and a desire to promote factual information. Does Forbes care that one of its bloggers is misrepresenting the literature in this way? Is this acceptable practice among their contributors? Is this the kind of publication they wish to be?

    Finally, I see the authors of the paper (whom I alerted to the Forbes article’s existence – they clearly were not contacted by Taylor for comment) have responded. From their comment:

    First and foremost, our study is not a representative survey. Although our data set is large and diverse enough for our research questions, it cannot be used for generalizations such as “respondents believe …” or “scientists don’t believe …” Our research reconstructs the frames the members of a professional association hold about the issue and the argumentative patterns and legitimation strategies these professionals use when articulating their assumptions. Our research does not investigate the distribution of these frames and, thus, does not allow for any conclusions in this direction. We do point this out several times in the paper, and it is important to highlight it again.

    In addition, even within the confines of our non-representative data set, the interpretation that a majority of the respondents believe that nature is the primary cause of global warming is simply not correct. To the contrary: the majority believes that humans do have their hands in climate change, even if many of them believe that humans are not the only cause. What is striking is how little support that the Kyoto Protocol had among our respondents. However, it is also not the case that all frames except “Support Kyoto” are against regulation – the “Regulation Activists” mobilize for a more encompassing and more strongly enforced regulation. Correct interpretations would be, for instance, that – among our respondents – more geoscientists are critical towards regulation (and especially the Kyoto Protocol) than non-geoscientists, or that more people in higher hierarchical positions in the industry oppose regulation than people in lower hierarchical positions.

    Incompetence or deception by Taylor? You tell me. Either way, this is the kind of shoddy, non-academic discourse we get from bogus ideological think tanks like Heartland. They should be embarrassed.

    Article Cited:
    Lianne M. Lefsrud and Renate E. Meyer, “Science or Science Fiction? Professionals’ Discursive Construction of Climate Change,” Organization Studies 33 (November 2012): 1477–1506, doi:10.1177/0170840612463317