From the 16th century to the 19th, scurvy killed around 2 million sailors, more than warfare, shipwrecks and syphilis combined. It was an ugly, smelly death, too, beginning with rattling teeth and ending with a body so rotted out from the inside that its victims could literally be startled to death by a loud noise. Just as horrifying as the disease itself, though, is that for most of those 300 years, medical experts knew how to prevent it and simply failed to.
In the 1600s, some sea captains distributed lemons, limes and oranges to sailors, driven by the belief that a daily dose of citrus fruit would stave off scurvy’s progress. The British Navy, wary of the cost of expanding the treatment, turned to malt wort, a mashed and cooked byproduct of barley that had the advantage of being cheaper but the disadvantage of doing nothing whatsoever to cure scurvy. In 1747, a British doctor named James Lind conducted an experiment in which he gave one group of sailors citrus slices and the others vinegar or seawater or cider. The results couldn’t have been clearer. The crewmen who ate fruit improved so quickly that they were able to help care for the others as they languished. Lind published his findings, but he died before anyone got around to implementing them, nearly 50 years later.
This kind of myopia repeats throughout history. Seat belts were invented long before the automobile but weren’t mandatory in cars until the 1960s. The first confirmed death from asbestos exposure was recorded in 1906, but the U.S. didn’t start banning the substance until 1973. Every discovery in public health, no matter how significant, must compete with the traditions, assumptions and financial incentives of the society implementing it.
Which brings us to one of the largest gaps between science and practice in our own time. Years from now, we will look back in horror at the counterproductive ways we addressed the obesity epidemic and the barbaric ways we treated fat people—long after we knew there was a better path.
I have never written a story where so many of my sources cried during interviews, where they shook with anger describing their interactions with doctors and strangers and their own families.
About 40 years ago, Americans started getting much larger. According to the Centers for Disease Control and Prevention, nearly 80 percent of adults and about one-third of children now meet the clinical definition of overweight or obese. More Americans live with “extreme obesity” than with breast cancer, Parkinson’s, Alzheimer’s and HIV put together.
And the medical community’s primary response to this shift has been to blame fat people for being fat. Obesity, we are told, is a personal failing that strains our health care system, shrinks our GDP and saps our military strength. It is also an excuse to bully fat people in one sentence and then inform them in the next that you are doing it for their own good. That’s why the fear of becoming fat, or staying that way, drives Americans to spend more on dieting every year than we spend on video games or movies. Forty-five percent of adults say they’re preoccupied with their weight some or all of the time, an 11-point rise since 1990. Nearly half of 3- to 6-year-old girls say they worry about being fat.