Friday, February 6, 2015

Vaccine Talk: The Dose Makes the Poison, Part 1

A little over the top, perhaps?
I know it seems like a dramatic departure from the usual topics on this blog, but vaccines are something I'm very passionate about. Considering how relevant their discussion seems to be with this latest measles outbreak, I figured it might be time to put to rest a number of myths about vaccines. No, I'm not going to talk about how they don't cause autism, because they don't. Instead, I'm going to beat a horse that's slightly less dead: anti-vaxxers talking up a storm about how vaccines are loaded with "toxins" like formaldehyde and mercury. Unfortunately, what most of these people don't seem to understand is the simple pharmacological concept that the dose makes the poison. Surely you wouldn't slap a banana out of someone's hand because it contains potassium and potassium chloride is used in lethal injections, would you?

In any case, let's begin. This is not an exhaustive list by any means, but I'm going to break down the most common so-called toxins for starters. 

Formaldehyde: Typical anti-vaxxers will point out that this is in embalming fluid and nail polish remover. Gross, right? Well, yeah, kind of, but it's also in things like apples, pears, and milk. Did you know that when you drink fruit juice, your own body breaks some of it down into formaldehyde? Or that formaldehyde plays a role in your body's own synthesis of DNA and amino acids? The key here, again, is that the amount a kid would get from a trip to the doctor's office for shots is incredibly small and is metabolized within hours:
"At their 6 month visit (when they are, on average, 16.5 lbs or 7.5 kg), [babies can receive] HepB, DTaP, IPV and flu [vaccines], for a total of 307.5μg [of formaldehyde]. That is about 160 times less than the total amount their body naturally produces every single day. Compare that to the 428.4-1,516.4μg of formaldehyde in a single apple."
Some people might say, well, formaldehyde that occurs naturally in an apple is different from formaldehyde made in a lab. No, it's not. Two hydrogen atoms, a carbon, and an oxygen, either way. On a more practical level, the formaldehyde actually has a purpose in vaccines: it's used to inactivate the virus or bacterial toxin in the vaccine so your body can safely form antibodies against it.
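
To put the numbers from that quote side by side, here's a quick back-of-the-envelope script. It's a minimal sketch using only the figures quoted above; the daily-production number is simply backed out of the "160 times less" claim.

```python
# Back-of-the-envelope comparison of formaldehyde exposures,
# using only the figures quoted above (amounts in micrograms).

vaccine_dose_ug = 307.5           # HepB + DTaP + IPV + flu at the 6-month visit
apple_ug_range = (428.4, 1516.4)  # formaldehyde in a single apple

# The quote says the vaccine total is ~160x less than what a baby's body
# produces naturally every day, so back that daily figure out:
daily_natural_ug = vaccine_dose_ug * 160

print(f"From vaccines at the 6-month visit: {vaccine_dose_ug:,.1f} ug")
print(f"Produced naturally per day (implied): {daily_natural_ug:,.0f} ug")
print(f"In a single apple: {apple_ug_range[0]:,.1f}-{apple_ug_range[1]:,.1f} ug")
print(f"A full round of shots, measured in apples: at most "
      f"{vaccine_dose_ug / apple_ug_range[0]:.2f} of one apple")
```

In other words, a full round of 6-month shots delivers less formaldehyde than the smallest apple in that range, and a tiny fraction of what a baby's own metabolism churns out every day.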

Thimerosal: This is it, the big one. Mercury. Leaving aside that it is no longer in any vaccines except for certain seasonal flu shots, this used to be the major bogeyman of the anti-vax movement, with people insisting that it caused autism (which it doesn't). On some level, I suppose that would be understandable. After all, mercury is toxic to humans. However, not all mercury compounds are created equal. Methylmercury, the kind found in things like tuna, is the kind of mercury that usually comes to mind (those old-school glass thermometers, for the record, contain elemental mercury, which is a different beast again). It does accumulate in the body and is toxic to humans.

By contrast, thimerosal contains ethyl mercury, which does not accumulate in the body and is metabolized much more quickly (in a matter of days). Not to mention, of course, that if you eat a can of tuna every once in a while, you're getting about the same amount of methylmercury as you would ethyl mercury from a vaccine, with the added detriment of bioaccumulation. Thimerosal was (and in those flu shots, still is) used as a preservative, allowing multiple doses to be drawn from the same vial without risk of bacterial contamination--a major cost-saver. That may not matter much for us here in the U.S., but in poorer countries, where costs and reliable refrigeration matter far more, removing it makes vaccines more expensive and thus harder to get.

Aluminum: The new boogeyman, mercury's successor, etc. Aluminum is used as an adjuvant, meaning it amplifies the body's immune response so that it can properly form antibodies against the disease. Seems kind of important for vaccines to work, right? Not only is it important, but again, the dose makes the poison. Anti-vaxxers point out that aluminum has been associated with breast cancer and Alzheimer's Disease. Not only are those claims dubious at best, but the levels of aluminum in each shot are extremely low.

For example: the aluminum content in vaccines ranges from 0.125 mg/dose to 0.85 mg/dose. By contrast, infant formula contains 0.225 mg/liter, and buffered aspirin contains 10-20 mg/tablet. For perspective, the average adult takes in about 7-9 milligrams of aluminum per day. The average baby gets about 4.4 mg of aluminum from vaccines in the first 6 months of life; over that same period, an infant receives 7 mg of aluminum from breast milk alone. Who knew that breast-feeding caused breast cancer, right? You're beginning to get the picture.
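
If you'd like to check that arithmetic yourself, here's a minimal sketch using only the figures cited above:

```python
# Aluminum exposure comparison, using the figures cited above (all in mg).

per_dose_range = (0.125, 0.85)     # aluminum per vaccine dose
vaccines_first_6mo = 4.4           # total from all vaccines, birth to 6 months
breast_milk_first_6mo = 7.0        # from breast milk over the same period
adult_daily_intake = (7.0, 9.0)    # typical adult dietary intake per day
aspirin_per_tablet = (10.0, 20.0)  # buffered aspirin, per tablet

days = 183  # roughly six months
print(f"Vaccines, first 6 months: {vaccines_first_6mo} mg "
      f"(~{vaccines_first_6mo / days * 1000:.0f} ug/day averaged out)")
print(f"Breast milk, same period: {breast_milk_first_6mo} mg")
print(f"Adult diet, ONE day: {adult_daily_intake[0]}-{adult_daily_intake[1]} mg")
print(f"One aspirin tablet vs. the biggest vaccine dose: "
      f"{aspirin_per_tablet[0] / per_dose_range[1]:.0f}x more aluminum")
```

A single day of an ordinary adult diet delivers roughly twice the aluminum a baby gets from every vaccine in its first six months combined.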

So there you have it: the three biggest "toxins" you hear about from the anti-vaccine crowd. There are plenty more for me to run through, but I'll spread them over a number of blog posts so neither of us gets burnt out. The next time you read about these "toxins," just remember: the dose makes the poison.

Friday, November 29, 2013

Having Your Cake and Eating it Too: Health Care Edition


I noticed fairly recently that there's growing anger among many state policymakers and insurance commissioners about the narrowing provider networks under many of the plans in the ACA's exchanges.

Before I delve into the few points I wanted to make, I suppose a quick primer is in order on how most insurers work. An insurance company, say, Aetna or BCBS, contracts with a network of providers and hospitals and agrees to pay them a certain amount in reimbursement for various procedures. What the aforementioned malcontents are upset about is that many of the plans on the exchanges in some states feature narrower networks of hospitals and doctors in order to offer lower premiums. Naturally, the ACA is to blame for this, right?

Not really. I'd argue that this is more a feature of a competitive market than anything else. Insurers, wanting to offer more affordable plans on the exchanges, contract with doctors and hospitals that agree to accept lower payments for equivalent procedures than other, more expensive providers would. The savings, then, are passed on to consumers in the form of lower premiums.

The other important point here is that this kind of insurer, known as a Health Maintenance Organization, or HMO, has been around since 1973 and has been growing in prevalence for quite some time. The ACA simply accelerates that growth, because--guess what? People like lower prices. People like competition. And, quite frankly, this kind of network narrowing is a very good thing, because it plays a huge role in slowing the growth of health care costs (which, by the way, means more money you could be spending on other things). For example, HMOs became far more prevalent in the 1990s, but in the early 2000s there was a backlash in many states against the narrowing of provider networks. An MIT study comparing states that restricted these cost-cutting measures to those that did not found the following:
"I find that the backlash increased the U.S. health care share of GDP by 2 percentage points relative to a counterfactual with no backlash, which is slightly more than its entire increase during the backlash period."
You'll also note the spike in health spending in 2001, at the height of the backlash, which appears to corroborate this story:

Look, I can understand that some people might be upset about losing a doctor they like, but this isn't really news at all. Insurer networks change all the time. And people are going to have to decide whether they want a wider choice of doctors or whether they want to save money, just like any other trade-off a cost-conscious consumer has to make.

Policymakers--both left and right--should stop promising the moon to constituents on health care, because that's exactly how we ended up in such a mess. We can't design a health care system less staggeringly wasteful and inefficient than ours if we keep promising that you can always keep your doctor or your health plan, or if we try to force insurance companies to cover certain hospitals. It isn't realistic, and worse, it isn't good policy. It has to stop.

P.S. I think it should be noted that there are winners and losers under every single public policy, and acting surprised or expressing a sense of schadenfreude only further degrades the quality of policy discourse.

Monday, November 4, 2013

Breaking News: Incompetent ECB is Shocked At Its Own Incompetence

Forgive me, that was cutting, but this is just too much. Last Friday, there was an article in the Washington Post that began with the following sentence:
"On top of high unemployment and sluggish growth, the European Central Bank has a new headache: an unexpected drop in inflation."
Seriously? The drop in inflation was "unexpected"? On the one hand, you've got most Eurozone countries adopting one form of austerity or another--in the case of Greece or Spain, quite severe cuts, while in Germany the cuts are more modest. On the other hand, you've got a staggeringly incompetent central bank that has repeatedly refused to take any additional steps to ease monetary policy in the face of falling inflation and sky-high unemployment.

So how this comes as a surprise to anyone is really beyond me. In fact, it should have been all too predictable. Now all that remains is to see if the ECB is actually willing to do something about it. So far, European officials have lulled themselves into complacency since the most immediate crisis was staved off last year. But the real crisis is still very much alive and well, and they ignore it at their (and Europe's) peril.

Friday, November 1, 2013

Employer-Provided Health Insurance Isn't Really Provided by Employers


As many of you may have noticed, I've been on a bit of a tear lately attacking employer-provided health insurance. Anyway, I'm at it again today. A common refrain I hear about EPI goes something like this: "Why should I care if my employer-provided plan ultimately costs more than one I could get on the exchanges? My employer's the one footing most of the bill!"

This is demonstrably wrong. It helps to think about it like this: when you're a worker who gets paid in cash and gets insurance through your job, your total compensation is how much your employer spends on you--both on cash wages and on your insurance. Simply put, every dollar an employer puts toward paying "their part" of your premium is basically one less dollar they'd have paid you in wages. In fact, over the longer run, employees who don't get insurance from their employers end up getting paid higher wages overall (ever notice how contract workers get higher pay than salaried ones?). So even if your employer seems to be footing the bill for most of your insurance plan, they really aren't, and every time health care costs rise more rapidly, that ends up depressing your wages. For the visual learners among you, here's an interesting comparison to consider:



There's a bit of a lag in the data, and there are obviously many, many more factors at play here than just health care costs, but it certainly is interesting to think about and does seem to bear out the theory. As you can see, in the early 1990s when health care costs grew more slowly, wages grew somewhat more quickly. 
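
The accounting behind this is simple enough to sketch in a few lines. Here's a toy illustration of the trade-off; all the dollar figures and growth rates are hypothetical, chosen purely for illustration:

```python
# Toy model of the wages-vs-premiums trade-off: the employer budgets total
# compensation, so the faster premiums grow, the slower wages can grow.
# All figures are hypothetical, for illustration only.

total_comp = 60_000.0  # what the employer actually spends on you per year
premium = 12_000.0     # the employer's share of your health premium

for year in range(1, 6):
    total_comp *= 1.03  # total compensation grows ~3% a year
    premium *= 1.07     # health premiums grow ~7% a year
    wages = total_comp - premium
    print(f"Year {year}: total comp ${total_comp:,.0f}, "
          f"premium ${premium:,.0f}, cash wages ${wages:,.0f}")
```

Cash wages still grow in this toy example, just noticeably more slowly than total compensation does; crank premium growth up a few more points and they stagnate outright.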

In any case, this apparent one-for-one trade-off between EPI and wages is made even worse by the fact that employers sometimes offer plans that are far more expensive than many employees actually need. For example, a company with both younger and older employees might offer a generous (read: expensive) health plan that the younger employees don't actually need, but are basically coerced into taking if they want to be insured at all. That's doubly unfortunate: rising health costs eat into their wage growth, and the bite is all the bigger because the plan was overpriced for them to begin with. Needless to say, this is a lose-lose situation for a lot of people.

So have I convinced you yet?

Monday, October 28, 2013

Employer-Provided Health Insurance: A Uniquely American Blunder


Most Americans nowadays--myself included--get health insurance through their employers. At last count, about 60% of non-elderly Americans obtained their insurance this way, although that number has been falling for years and shows signs of dropping further--in no small part due to some of the provisions in Obamacare. Some of you have no doubt read stories about how companies like Walgreens and Trader Joe's are dropping employees from their health plans and pushing them into the health insurance exchanges. Contrary to popular belief, that's a very good thing. Indeed, health policy wonks from across the political spectrum--from Milton Friedman to Austin Frakt--are very much in favor of completely eliminating employer-provided health insurance (EPI).

And why shouldn't they be? No other industrialized country does it quite like this. If we're being honest, the system of EPI wasn't devised as some carefully-crafted public policy--it was an accident brought about by wage controls during World War II. What happened was that the National War Labor Board put wage controls in place, capping all wage increases. But there was a loophole: the caps didn't apply to employee benefits. So employers started competing for labor on the basis of health insurance plans instead of higher wages. The IRS then made the arrangement even more attractive in 1954, when it declared that employer-provided health benefits were not taxable. And the rest, as they say, is history, with EPI being subsidized by taxpayers to the tune of around $300 billion in 2012.

But as I said before, this system is uniquely American, and it is also uniquely problematic, for a wide variety of reasons. First and foremost, it makes opaque a system that, like all other things in a market economy, would benefit greatly from a healthy dose of transparency. Specifically, it masks who actually pays for an employee's health care, which makes employees less cost-conscious when consuming health care (you always spend more when it's other people's money, right?), which in turn drives up health spending. Furthermore, employers, rather than employees, are choosing health plans for large, diverse groups of people with varying health care needs, while the actual costs remain invisible to the people on those plans. As a result, the rising costs of health care due to this over-consumption and lack of cost-consciousness have created a situation in which wages are depressed in order to keep paying for health insurance that often includes coverage the employee doesn't even want or need. This graph should prove helpful in explaining my point:



In addition to depressing wages, the existence of EPI makes employees reluctant to quit their jobs ("job lock," in wonk parlance) to, say, start a new business or find a job that better suits their skills--because they'd lose their insurance. One study found that employee turnover was reduced by a whopping 25% because of this! On a more practical level, changing insurers every time you change jobs--and the change of health care providers that often follows--also makes for much poorer quality of care, because it makes care coordination and early diagnosis that much harder. It costs more, to boot. To be sure, the Affordable Care Act does largely ameliorate the former problem; the latter, not so much.

Given such significant downsides to America's system of health insurance, why does it persist? At a guess, I'd chalk it up to a sort of status quo bias, where any change from the present state of affairs is seen as a loss by both ordinary people and policymakers. As Milton Friedman once put it, we "regard it [employer provided insurance] as part of the natural order." Indeed, President Obama famously quipped, "If you like your insurance plan, you will keep it. No one will be able to take that from you." Michele Bachmann complained that Obamacare will cause millions to lose their employer-provided insurance coverage. She's right about that, but this is one loss I won't be weeping over.

Thursday, October 24, 2013

Out of Sight, Out of Mind


Austin Frakt over at The Incidental Economist had a good post today about how Ted Cruz's spokeswoman argued that, because the Senator was on his wife's employer-provided health plan, it didn't cost taxpayers any money. In his post, Frakt points out the obvious mistake--all employer-provided insurance is taxpayer-subsidized (to the tune of about $180 billion a year). Reading it, I realized that the spokeswoman's mistake reflects a larger problem: many people fail to realize (or, more often, deliberately obscure) that just because a program or policy imposes no direct cost on taxpayers doesn't mean it costs people no money.

Take, for example, the U.S. sugar quota, which strictly limits the amount of foreign sugar we're allowed to import. There's nothing sweet (I'm sorry, I couldn't resist) about this policy--in 2011, it cost U.S. consumers an estimated $3.86 billion. Yet the American Sugar Association has the gall to insist that "U.S. sugar policy ran at zero cost to American taxpayers from 2002 to 2012." Zero cost indeed. To reiterate my point, here's a graph of U.S. vs. world sugar prices:



In any case, you get my point: just because the U.S. government doesn't directly funnel taxpayer dollars to the sugar industry doesn't mean the quota costs Americans nothing. Policies like this are everywhere in the U.S., and in nearly every case they amount to a clumsy attempt at achieving some goal while avoiding a visible tax hike.
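
To see how a quota "costs" consumers billions without a dime of tax money changing hands, consider a stylized calculation. The price gap and consumption figures below are rough, illustrative assumptions rather than official statistics, but they land in the same ballpark as the estimate above:

```python
# Stylized estimate of a quota's hidden consumer cost: the U.S.-world price
# gap times domestic consumption. Both inputs are rough, illustrative
# assumptions, not official statistics.

us_price_per_lb = 0.40     # assumed U.S. wholesale sugar price ($/lb)
world_price_per_lb = 0.23  # assumed world price ($/lb)
us_consumption_lbs = 22e9  # assumed annual U.S. sugar consumption (lbs)

hidden_tax = (us_price_per_lb - world_price_per_lb) * us_consumption_lbs
print(f"Implicit 'sugar tax' on consumers: ${hidden_tax / 1e9:.2f} billion/year")
```

The "tax" never shows up in a federal budget line; it shows up at the grocery store instead.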

The list goes on and on with this sort of thing: instead of taxing gasoline, we insist on higher fuel economy standards for automakers. Instead of using tax dollars to provide health care to workers, we insist that employers must provide health insurance to their employees. But not the smaller companies, because that wouldn't be fair. The problem with a lot of these policies is that they end up having unintended consequences, and they are far less transparent and far more complicated in their mechanisms than a simple taxes-and-transfers scheme would be. Milton Friedman was right when he said that there's no such thing as a free lunch, and it holds true in nearly all things, from the sugar quota to the employer mandate. The sooner we realize that, the sooner we can stop trying to get something for nothing and start crafting better policies.

Monday, October 21, 2013

Everything You Need to Know About the Eurozone Crisis


With the onset of the global financial crisis of 2008, the U.S. economy took a steep nose-dive into recession. Similarly, the economy of the Eurozone, which was likewise dependent on a housing bubble, collapsed. However, the nature of Europe's monetary union gave the region's crisis a unique character. More specifically, the monetary union and unified currency played a large part in creating what has become known as the Eurozone Crisis, in which peripheral nations such as Greece, Ireland, Portugal, Spain, and Italy (the GIPSI nations) have found it increasingly difficult to finance their government deficits in global bond markets. While leaders on both sides of the Atlantic have sought to explain the Eurozone Crisis as a story of fiscal profligacy and overgrown welfare states, their narratives simply run counter to the available evidence.

Before attempting to explain the current crisis, though, I've always found it helpful to contextualize matters with a bit of history. In 2002, member states of the Eurozone gave up their independent currencies and adopted a single currency--the Euro. In so doing, these nations also effectively relinquished their independent monetary policies. The single currency fueled a false sense of security in the core Eurozone nations, leading French and German banks to lend large amounts of money to banks in the periphery--because hey, it was all the same currency, so nothing could go wrong, right? This, in turn, inflated a large property bubble, similar to the one we had here in the U.S. and formed concurrently with it, which drove a period of rapid growth in the GIPSI nations. That growth, financed primarily by the ballooning private-sector debt those massive core-to-periphery capital flows made possible, led wages and prices in the GIPSI nations to rise substantially faster than those in core European countries:



Note the large gap between Germany and the periphery nations. These higher wages and price levels made the peripheral economies deeply uncompetitive compared to Germany: their products became far more expensive on the world stage, driving their exports down. When the housing bubble popped, private spending sharply declined, and the inevitable cyclical (recession-induced) deficits arose as governments found themselves with far lower tax revenues and far higher expenditures, with no prospect of exporting their way to an economic recovery.

In response to this severe crisis, the European Central Bank (ECB), like most global central banks at the time, cut interest rates in an effort to increase liquidity in the markets and spur a recovery. While the downturn was fairly ubiquitous across the Eurozone, the subsequent recoveries spanned from fairly robust in Germany to nonexistent in the periphery. Germany, with its persistently large current account surpluses, traded its way to a vigorous recovery, with unemployment peaking just above 8% in 2009 and falling to 5.3% at last count. In stark contrast, the unemployment rates of the GIPSI nations remain well above any measure of full employment and show few signs of falling.


Nevertheless, the ECB, which was (in theory) supposed to conduct appropriate monetary policy for all of the Eurozone, raised interest rates in mid-2011, sending the GIPSI nations' bond yields soaring and pushing the sovereign debt crisis into a much more acute stage.

What the ECB's actions indicated to the bond markets was that it sought to conduct monetary policy appropriate not for the whole of the Eurozone, but for Germany alone. In order for a government to be able to borrow at low interest rates, as America can, bond markets must have confidence in that government's ability to pay back what it owes. What the ECB demonstrated is that Greece, Ireland, Portugal, Spain, and Italy lack a central bank willing to stand by their respective economies. As a result, the future economic prospects of these nations are correspondingly diminished in the eyes of the bond market. That's why they pay high interest rates on their debt. And it's precisely for this reason, for example, that the United Kingdom is able to borrow at a lower rate of interest than Spain even though it has both a larger debt-to-GDP ratio and a larger deficit.


Simply put, the bond market understands that the Bank of England will do what is necessary for Britain's economy to thrive and produce the tax revenues needed to pay off its debts. Spain, and indeed the rest of the GIPSI nations, by contrast, are left with a central bank that has demonstrated to bond markets that it is more concerned with Germany's economy than with their own.

Ultimately, the critical lesson that ought to be drawn from this ongoing crisis is that its causes are largely not fiscal in nature; they stem from the flawed structure of the European monetary union itself. Unfortunately, leaders in both the United States and across Europe have subscribed to the theory that fiscal profligacy and overly large welfare states on the part of the GIPSI nations are the root causes of this crisis.

An easy way to debunk this thesis is to compare average public social expenditures (the size of the welfare state) in 2007 with average budget deficits among Eurozone nations between 1999 and 2007. It quickly becomes clear, looking at the chart below, that the size of the GIPSI nations' welfare states is not a root cause of the crisis, as Germany's welfare state on the eve of the financial crisis in 2007 was larger than any of those found in the debtor nations:


The more widely cited theory among European policymakers is that the GIPSI nations engaged in excessive government spending in the years leading up to the current crisis. However, looking at the average budget deficits between 1999 and 2007, this theory just doesn't stack up:


To be sure, Greece did run large deficits, but all other nations showed few signs of profligate spending. Italian deficits were roughly equal to those in France, yet France is not facing Italy’s current borrowing costs. Further discrediting this theory is the fact that Spain and Ireland both ran budget surpluses during the years leading up to the crisis, with their large budget deficits arising as a result of the economic crisis, rather than being its cause. 

This misdiagnosis has led policymakers to demand fiscal austerity in the face of high unemployment as the price of numerous bailout packages. However, as indicated before, the true problem facing many of the GIPSI nations is that their labor costs and price levels grew wildly out of line with those of Germany and the other core nations, creating massive trade deficits:


Ordinarily, a nation would rectify this problem simply by having its central bank print money to devalue its currency, making its exports competitive again. Because of the monetary union, however, the GIPSI nations can't devalue, and the ECB's generally timid and ineffective policy responses have done little to compensate.

At the end of the day, the Eurozone faces three choices. First, the ECB can opt to pursue a policy of slightly above-average inflation, such that prices and wages in core countries like Germany rise to meet those in the periphery. This option would be the most likely to preserve the Eurozone and restore the periphery's competitiveness. It would also serve as economic stimulus for the entire Eurozone--a continent of growing economies is a lot more beneficial to Germany than one made up of nations in crisis. In turn, the GIPSI nations would be able to stage a more robust, export-led recovery, which would itself restore bond-market confidence and improve their budgetary outlooks. With robust recoveries underway, these nations could then turn to balancing their longer-term budgets, rather than the other way around.


The second choice would be a Eurozone break-up, or a peripheral nation's exit from the union. This option, though politically disastrous, would allow the exiting GIPSI nations to devalue their own currencies and restore competitiveness, but it would also likely trigger bank runs in the exiting nation and cause turmoil across financial markets for some time afterward. The likelihood of an exit is pretty hard to pin down, since you never really know how long a country is willing to put up with such unacceptably grueling economic conditions before doing something extreme.

The third option, the one currently being pursued by the Eurozone, is what economist Paul Krugman has dubbed "internal devaluation": nations adopt structural changes and austerity measures and slowly suffer through deflation (falling prices) until their exports regain competitiveness. Because wages are "sticky" downward (workers are far less willing to take pay cuts than pay rises), and prices are in turn slow to fall, this option would be tremendously slow and painful, with the debtor nations facing years of catastrophically high unemployment and all of its associated human costs.
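
To get a feel for why this third path is so much slower than the first, here's a crude sketch of how long it takes to close a competitiveness gap under each. The gap size and inflation rates are illustrative assumptions, not forecasts:

```python
import math

# Crude sketch: years needed to close a price/wage competitiveness gap
# between the periphery and Germany under the two rebalancing paths.
# The gap size and inflation rates are illustrative assumptions.

GAP = 1.25  # assume periphery prices/wages sit 25% above German levels

def years_to_close(periphery_infl: float, german_infl: float) -> float:
    """Years until German price levels catch up with the periphery's."""
    relative = (1 + german_infl) / (1 + periphery_infl)
    return math.log(GAP) / math.log(relative)

# Option 1: ECB tolerates above-average core inflation
# (Germany ~4%, periphery ~1%).
print(f"Higher core inflation: {years_to_close(0.01, 0.04):.0f} years")

# Option 3: internal devaluation. Because wages are sticky downward,
# assume periphery prices are merely flat while Germany runs ~1% inflation.
print(f"Internal devaluation:  {years_to_close(0.00, 0.01):.0f} years")
```

Under these (made-up but not crazy) numbers, the inflation route closes the gap in under a decade, while internal devaluation takes more than two, every year of which is spent at crisis-level unemployment.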

Indeed, while Mario Draghi prevented the Euro from completely falling apart when he pledged to buy unlimited quantities of GIPSI government bonds, there's a hell of a difference between staving off an acute crisis and staging a recovery--and he seems content to settle for the former. Even though Europe has faded from the limelight in recent months, it is very much still in crisis. For Spaniards and Greeks, 26% unemployment doesn't just fade away. For Italians, 12% unemployment isn't just something you forget about. Sooner or later, something's gotta give.