A study just published in Annals of Internal Medicine reported that sedentary behaviour is associated with higher risks of heart disease, diabetes, and cancers of the breast, colon, womb and ovary.
This was a meta-analysis - that is, a study that pooled the data of 47 previously published epidemiological studies.
It replicated the results of two other similar meta-analyses, one published in 2012, the other last year.
But this meta-analysis also looked at what happened when people with sedentary work exercised as well, something that the two earlier meta-analyses did not.
What it found would usually be reported as: "Exercise did not completely neutralise the health hazards of sedentary behaviour."
But this sentence implies that sedentary behaviour actually causes health hazards which exercise reduces.
However, the study did not detect any such cause-and-effect relationship because no epidemiological study can ever detect such relationships.
To report this study accurately then, one should say: "Exercise reduced, but did not completely eliminate, the association between sedentary behaviour and health hazards."
However, such a sentence usually gets edited into the one above because most editors don't consider the term "association" important.
But it is: Dropping it makes the study say something it did not, since no epidemiological study can ever prove that A causes B.
Only an experimental study can do so.
An epidemiological study involves observing people in their natural habitats, say, workers at their sedentary jobs in the modern office.
By contrast, an experimental study is one where subjects who are as similar as possible in age, sex, race and so on are assigned at random either to one group that is given A or to another group that is not.
Since the two groups then differ only in their exposure to A - say, a new drug - any difference in the outcome B being studied must be caused by A.
Importantly, only such an experimental approach can show cause and effect. By contrast, epidemiological studies can only look for associations - between the lack of exercise and cancer, for example.
But such subjects are also "exposed" simultaneously to other things - genetics, say - that might impact cancer rates more than exercise.
Since epidemiological studies are not randomised, such factors other than the lack of exercise may well be driving the results.
Thus an epidemiological study can never tell whether A causes B, whether B causes A, or whether some third factor C influences both A and B.
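The confounding problem above can be sketched in a small simulation (the numbers and variable names here are illustrative assumptions, not drawn from any study): a hidden factor C - think of underlying health - makes exposure A more likely and outcome B less likely, so an observational comparison shows an "association" between A and B even though A has no effect at all; randomising A breaks its link with C, and the association vanishes.

```python
import random

random.seed(0)
N = 100_000  # large sample so the proportions settle down

# Observational world: a hidden confounder C (say, overall health)
# influences both exposure A (exercise) and outcome B (illness).
# Crucially, A is never allowed to affect B in this model.
def observational():
    rows = []
    for _ in range(N):
        c = random.random() < 0.5                   # hidden confounder
        a = random.random() < (0.7 if c else 0.3)   # C makes A more likely
        b = random.random() < (0.1 if c else 0.3)   # C lowers the risk of B
        rows.append((a, b))
    return rows

# Experimental world: A is assigned by coin flip, independent of C.
def randomised():
    rows = []
    for _ in range(N):
        c = random.random() < 0.5
        a = random.random() < 0.5                   # randomisation breaks the A-C link
        b = random.random() < (0.1 if c else 0.3)   # B still depends only on C
        rows.append((a, b))
    return rows

def risk(rows, exposed):
    """Share of subjects with outcome B among those with A == exposed."""
    group = [b for a, b in rows if a == exposed]
    return sum(group) / len(group)

obs = observational()
rct = randomised()

# Observational data: the exposed group looks markedly healthier,
# even though A does nothing - the confounder C does all the work.
print("observational:", risk(obs, True), "vs", risk(obs, False))

# Randomised data: the two groups now show roughly equal risk.
print("randomised:   ", risk(rct, True), "vs", risk(rct, False))
```

In this toy model the observational exposed group shows a lower risk purely because healthier subjects both exercise more and fall ill less, which is exactly the trap the article describes: the association is real, but the causation is not.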
Since they cannot prove causation, their results must be reported in the passive voice as correlations.
But most editors prefer the active voice.
This is why the correct sentence - hypothetically, "drinking two cups of coffee a day is associated with double the risk of melanoma" - may be edited into the incorrect one that reads: "Drinking two cups of coffee a day can double your risk of melanoma."
This editorial preference leads to news reports like the recent one in The Washington Post that "tweets can better predict heart disease rates than income, smoking and diabetes, study finds", which the observational study in question didn't and couldn't.
Researchers themselves must also take part of the blame for this sort of reporting.
They write their papers carefully, knowing that peer review won't ever stand for "coffee can cause melanoma" when only associations were observed.
But once their papers are published, some researchers throw all caution to the wind.
With an eye to garnering publicity for themselves - famous researchers secure more research funding - they are wont to exaggerate their findings to the media.
If so, why bother with such studies then? Well, they are pieces in a jigsaw puzzle.
For example, large epidemiological studies show that alcohol is associated with specific cancers.
They cannot show that alcohol causes cancer.
But experts writing in the World Health Organisation's 2014 World Cancer Report concluded that alcohol actually causes cancer, because these cancers were associated with all types of alcoholic beverage and because alcohol is proven to cause cancer in animals.
So, there will always be epidemiological studies.
How then should a journalist report an interesting one? He must treat its claims sceptically and, each time he reports one, he should also remind readers that it can't show causation.
Now, the present study, being a meta-analysis of epidemiological studies, couldn't and didn't ascertain that sedentary behaviour caused various health hazards or that regular exercise reduced their risks.
It only identified some interesting correlations.
It seems reasonable to assume that exercise makes us fitter and healthier.
But the study didn't and couldn't prove this as cause and effect.
After all, a healthy person is more likely than an ill one to exercise, so cause and effect may, in fact, be reversed.
Also, diet, genetics and other factors may come in between exercise and health.
Media stories about this study canvassed other experts who suggested that standing up at work might help fight off the hazards of a sedentary lifestyle.
We burn up more calories standing up than sitting down, they said.
And so on. Standing up at work might well be better than sitting, but there is no randomised experiment to prove this.
A rigorous one would require many observers shadowing many subjects five days a week at work for years.
But since standing carries no risks, unlike a new drug, say, such a study can't be justified financially.
So, by all means, stand to work if you wish but don't advocate it as somehow enhancing longevity.
And living longer isn't even the best reason to exercise. Being fitter might be: You would live better.
This article was first published in The Straits Times on Feb 15, 2015.