Given that I’m up to my eyeballs in scientific literature at present, I thought this might be a good topic for a post. It’s funny how each day we are bombarded with articles in the news media and on social networks claiming that “studies report that people who eat chocolate are leaner than those who don’t” (my personal favorite) or that studies “demonstrate that a Mediterranean-style diet can reduce risk for chronic diseases, such as type 2 diabetes or heart disease.” BUT, as consumers of mass media, we don’t always get the full picture or know what’s behind the study. So, here’s a little food for thought as you read up on “the latest and greatest” – whether it’s in the New York Times, the Journal of the American Medical Association, Eating Well Magazine, or People Magazine (yes, I read all of these…).
1. What kind of study was it? The best sort of study is a randomized controlled trial (RCT), where every aspect of the research is controlled, down to what people eat, how much, and when. These are awfully expensive to conduct, especially for diet-related questions, so cohort studies (large groups of people followed for a significant period of time to see whether a disease or outcome develops) are also good choices. When you see something like “cross-sectional” or “ecologic” study in the news article, you may want to be slightly more wary of what the data are indicating.
3. Size. Was it a large study (e.g. hundreds or thousands of participants) or was it small (perhaps fewer than 30 participants)? Size, in a really well-run study, doesn’t always matter, but it’s usually pretty darn important. You really can’t learn a lot about something that you want to apply to an entire population from a study of just 10 people.
4. How did the researchers get the data on what people ate? For nutritional studies, this is often the “smoking gun.” Food intake and nutritional information can be collected in a number of ways, including actual monitoring (e.g. in the case of an RCT), via 24-hour food recalls (either phone/in-person interviews or survey-based), or via things like food diaries/food records and food frequency questionnaires. These latter methods are largely self-reported and can be fraught with error. After all, can you recall how many times you ate chicken in the last 6 months? Me neither. People also often under-report for various reasons (umm, that cupcake I just ate? Do I have to tell?!). Understanding what is collected, by whom, and how often is important to deciphering how reliable the results of the study truly may be.
5. How long was the study run? Most nutritional influences on health are not immediate (unless you’re looking at foodborne illness, ugh), so a fairly lengthy period of study is often best. It’s hard to see the benefits of Vitamin D supplementation on prevention of Alzheimer’s in a couple of weeks, for example. Depending on the topic under study, several weeks, months, or even years may be needed to draw valid conclusions that can then apply to future nutritional strategies.
6. Read closely for words like “association,” “correlation,” and “causal.” Things like “can lead to” are not bullet-proof by any means. But it’s also important to remember that causality, in the most scientific sense, is extremely difficult to demonstrate. What we eat also interacts with our genetics, our environment, and our other healthy (or not-so-healthy) habits – so it’s not always a clear 1:1 path from food to a disease (or lack thereof). For the most part, we will see things like, “Red meat consumption is associated with an increased risk of total, cardiovascular disease, and cancer mortality.” This doesn’t necessarily mean that red meat causes CVD, but in a number of well-conducted studies, a large number of people who died from CVD (or cancer, etc.) were also major consumers of red meat. Gives you something to consider, for sure.
7. Where you read it counts. If you read about a study in People Magazine, that doesn’t automatically mean it’s full of hot air (but really, this shouldn’t be your first source of nutrition info, for obvious reasons). If the study referenced in a news article was (a) conducted by well-known scientific researchers (ideally at a major university or government agency) and (b) originally published in a scientific journal, such as JAMA, Lancet, American Journal of Epidemiology, or other similarly lofty (and peer-reviewed) publications, you can generally be confident it’s of high caliber. Your best bet for solid nutrition data is this sort of scientific literature, or well-known news publications such as the New York Times, Washington Post, etc. Some fitness magazines also do a good job of covering research, but “sound bites” often prevail over complete coverage of the ins and outs of a study.
That said, there are some really crappy studies that get published, so here’s my final advice: DO READ carefully, DO ASK QUESTIONS, DO READ MORE, and yes, it’s OK to be a little cynical! But please, try to be a bit OPEN-MINDED too. We can all learn something new about the stuff we eat every day. And it might one day save your life.