Don’t Get Meme’d! Ten ways authors mislead their audience

The past few days, I’ve seen any number of infographics/memes/macros that have used various techniques to mislead, obfuscate, and confuse issues as far-ranging as environmental science, genetic engineering, evolution, bills before Congress (in the USA), and military pensions (again, in the USA). Having come across them, and beginning with the premise that they have the intellectual rigor of the cover of “The National Enquirer,” I decided to take a closer look and see whether they nevertheless had anything more to them, anything in common. That gave birth to this post.

What follows are ten common ways that memes in particular, and people in general, deceive. Look out for these when reading the news, watching TV, and especially when watching advertisements or political commercials. If you do, I honestly think it will help you not be taken in by charlatans, hucksters, and snake-oil salesmen. Once I started to look for these techniques I saw them everywhere, even in things I said myself to others, and I imagine that if you do the same, your results will not be so very different from mine.

1. Coincidence as Causation

I think we’re all somewhat familiar with this one, but it remains one of the most common and most effective techniques used to mislead people. If I show you how, as one thing goes up, another thing also goes up, it seems obvious that the first thing caused the second, right? Well, no. Not at all. At best that’s correlation, and correlation alone doesn’t suggest that the first thing is more likely than not connected to the second; often it’s nothing more than coincidence. If coincidence were suggestive of causation, I could connect so many things to each other that the world would start to look like a very strange place indeed. The decline of pirates causes global warming, cell phones cause autism, fast food causes gun violence, and so on; you get the point. It takes more than a demonstration that two things happened in sequence or simultaneously to show that one thing caused the other. So watch out when someone says, “but studies show that giving more maternity leave decreases the murder rate!” and ask them why they think that, why you should think that, and whether they’ve really thought about it past deciding it’s something they’d really like to believe were true.
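
If you like to see this in numbers, here is a minimal sketch (in Python, assuming numpy is available; the quantities and figures are entirely invented for illustration) of how two completely unrelated things can look tightly linked just because both happen to drift upward over time.

```python
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(2000, 2020)

# Two made-up quantities that both happen to rise over the same period
ice_cream_sales = 100 + 5 * (years - 2000) + rng.normal(0, 3, len(years))
phone_ownership = 10 + 4 * (years - 2000) + rng.normal(0, 3, len(years))

# Pearson correlation between the two series
r = np.corrcoef(ice_cream_sales, phone_ownership)[0, 1]
print(f"correlation = {r:.2f}")  # typically comes out above 0.95
```

Neither invented quantity causes the other; they move together only because both follow a trend over time. A shared trend, or plain coincidence, is all it takes to make two curves look connected.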

2. Weasel Words

We all use weasel words. In fact, I just did right there. Weasel words are intentionally ambiguous and often hyperbolic words that make something seem bigger or more significant without actually demonstrating anything substantive about it. Take “countless” studies: there is no such thing, because studies can always be counted. Often when someone says that, “countless” turns out to be 2 or 3, or maybe 10. A “miracle” cure means nothing, because the word “miracle” doesn’t mean it’s necessarily better, only that the author thinks it’s the bee’s knees and wants you to open up your wallet for it. So like I said, we all use weasel words. The trick is figuring out exactly what someone means when they use them. When you see something “countless,” ask why; usually it’s because it simply hasn’t been counted. When you see a “miracle cure,” ask just how much better it is. When you see something that’s “statistically significant,” ask for the numbers. Maybe it is a miracle, maybe it is a huge number, maybe it really is significant, but until someone actually proves that, it remains unproven. Stating that it is true doesn’t make it so.

3. Argumentum ad Monsantum
Source: Scientific American

I first learned about this by reading the source article, so if you want to read more about it you should go there. AaM goes further than just GMOs, though, and I’m not going to focus on that issue right now (I have plenty of other posts about that if you want to read them). Argumentum ad Monsantum is simply arguing against a technology by saying that bad people (or people you believe to be bad) use that tech somewhere, so nobody should use it anywhere. Some common variations on that theme are Argumentum ad Liberarum, Argumentum ad Conservatum, Argumentum ad Al Gore-um, Argumentum ad Fox News-um, and so on. People we don’t like are allowed to use good things too, from time to time, and the things they use aren’t automatically tainted by their having used them. If those things can help us, or if they are demonstrably true, we should use them. If they’re not, we shouldn’t. At the end of the day, who uses a thing tells us nothing about how useful that thing is.

4. False Controversy
Supplemental material: Politico

Not long ago, on a show I watch (HBO’s “Newsroom”), they used a really good analogy that applies here; it appears in the linked article. In much the same spirit: when scientific findings that have been confirmed by 95% of scientists in a given field, and are considered central to that field, are disputed by a few, the news writes about that as a “controversy.” It isn’t. Controversies in science happen when the field is split down the middle, or close to it; 95/5 is not a split down the middle, it’s a split down the obstinate few. After all, nine out of ten dentists agree we should use toothpaste; the other one likes filling cavities all day and recommends we use Coca-Cola as mouthwash. That’s not a controversy. We shouldn’t be biased towards fairness when that requires us to be biased against truth.

5. Crafting the “other”

This is similar to #3, but here the technique is to make someone appear bad and then use that appearance to tarnish their ideas. It happens when you hear things like, “well-fed scientists, fat from their [bad sponsor], claim [thing we are expected to disagree with]” or “[Guy], well known [other political party], says [thing we are expected to disagree with].” By making someone appear as ‘other,’ as someone different, as being connected with people or ideas we don’t like, it becomes easier to dismiss their ideas. That doesn’t actually address the ideas, though. A bad person, even Hitler or Stalin or Pol Pot, can have a worthwhile idea. We should not be afraid of ideas, for ideas cannot kill us. If ideas are bad, reject them. When it is relevant, consider context and bias, but remember that even if a person is biased they are not necessarily wrong. Check their math, check their evidence, and make sure they didn’t leave anything out. If it all checks out, then even if they are the worst sort of person their ideas are sound, and ought to be given due consideration. And really, scientists are just people. Some are good, some are bad, most are decent. They are no different from anyone else in that.

6. Consensus as Conspiracy

When one person says something, it’s an opinion. When ten people say something, it’s a committee. When a hundred people say something, it’s a protest. When a group gathers together round an idea, it’s a movement. But when everyone votes for the same idea, it’s easy to see a conspiracy; that’s how kings and dictators get re-elected, after all. That suspicion seems reasonable enough for many issues we deal with every day – abortion, same-sex marriage, religion, capital punishment, taxes, and so on. If ten people look at any of those, ten answers will come back. Scientists are often of one mind, though, so it seems like there should be a conspiracy hiding in that consensus somewhere. But science doesn’t work like that. It observes and records, and consensus does not indicate that scientists have colluded, just that they are all observing the same thing. If ten people get up in the morning and stand in the sunlight, how many will say it made them colder? Not many. There’s no conspiracy in that, just the consensus that comes from immutable fact.

7. Lost in Translation

Way back when, Latin was the language of science. But at some point, science started to be conducted in the vernacular, in whatever language was spoken where the work was being done. So science in England was conducted in English, in German in Germany, and so on. Because science still aims to remove ambiguity whenever possible, to say exactly what it means and mean exactly what it says, it needed to redefine words to be more precise. So, jargon was born. One of the most problematic side effects of this development is that certain terms exist in both a scientific and a non-scientific context, and mean very different things in each. The best example is the word “theory.” A theory in common usage is an idea that has yet to be substantiated, a thought or opinion at the beginning of its life cycle. In a scientific context, it’s nearly the opposite. A theory in science is something that has been well substantiated, supported, widely accepted, and tested multiple ways with great success and no significant deviation – an idea at the end, or at least the middle, of its life cycle. Another frequent offender is “significant.” When people in general say “significant” they mean a lot, a great deal, or a substantial amount. When scientists say significant, they mean the effect is probably not zero – by convention, that a result that large would have had less than a 5% chance of turning up if there were no real effect at all. So instead of meaning that the change is great or substantial, they mean that it probably exists. Writers use this duo-denotation (two meanings in different contexts) to great effect when writing about science, by using a word out of context without translation or explanation. They make absurd claims like evolution being “just a theory,” when there is nothing small or tentative about a theory in science, and they exploit their readers by calling a finding “significant” when that significance places it just barely above being nothing at all.
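
To see why “significant” is not the same as “substantial,” here is a minimal sketch (again in Python with numpy; the effect size and sample size are made up purely for illustration) of a difference so small nobody would care about it in practice, yet it still comes out “statistically significant” simply because the sample is enormous.

```python
from math import erf, sqrt

import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000  # an enormous (made-up) sample size

# Two made-up groups whose true means differ by only 0.01 standard deviations
control = rng.normal(0.00, 1.0, n)
treated = rng.normal(0.01, 1.0, n)

diff = treated.mean() - control.mean()
se = np.sqrt(treated.var(ddof=1) / n + control.var(ddof=1) / n)
z = diff / se  # two-sample z statistic

# Two-sided p-value from the standard normal distribution
p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

print(f"difference = {diff:.4f}, z = {z:.1f}, p = {p:.1e}")
# The difference is trivially small (about 0.01), yet p is far below 0.05:
# "statistically significant" without being remotely substantial.
```

The takeaway: with enough data, almost any nonzero difference clears the significance bar, so when a writer leans on the word “significant,” always ask how big the effect actually is.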

8. Superficial Sensationalization
Supplemental: TED

When scientists seem to make contradictory claims based on equally valid research, this is most frequently at the heart of it. Scientists call it “going past the evidence,” and popular writers who cover scientific issues do it all the time. “Oxytocin is the moral molecule,” “TV is like heroin for the brain,” and “Sugar is as addictive as crack cocaine” are all guilty of this – of sensationalizing a finding in the interest of getting readers and money by trading away facts and integrity. When it comes down to it, what they really mean is something like “many people who expressed low levels of oxytocin in a recent study also tended to show more antisocial, violent, and immoral behavior,” or “prisoners tend to express lower than average levels of oxytocin,” or “watching television causes a pattern of neural activity, as measured through functional MRI, similar to that caused by consuming small quantities of heroin,” or “sugar causes an increase in certain neurotransmitters that are often correlated with addiction, similar to that caused by crack cocaine” – all of which sounds much more qualified, much more uncertain, and much less solid than the sensational claims made on the researchers’ behalf in popular media. The safe thing to do, then, is to check whether a news story links to the study it is based on. If it does, read the study and make up your own mind. If it doesn’t, stop reading, because whatever claim it makes can’t be checked against the evidence anyway. Even if it claims to have found the secret to everlasting life, if there’s no proof to back it up then it might as well be fiction.

9. The False Default State

There is a popular idea in the world, and like many popular and seductive ideas it is also very simple: natural is better. Natural foods are better than processed foods, natural fibers are better than plastics and polyesters, organic crops are better than genetically engineered varieties. This idea, in addition to being popular, is also correct just often enough to be memorable. It is a good example of a frequent trick used to draw people in: the use of an illusory default state, of referring to a golden past when things were better. You see it when politicians refer to that time in the past (say, the 1950s) when things were better, when people say “if only we’d just [x] like we used to, then things would be all better…,” and all the other times when people pretend that some single change would fix everything. This is false not because it is completely wrong, but because it is incompletely right. It is right some of the time, true, but it is also wrong plenty of other times. For instance, E. coli is natural. So is botulism. So is the Black Death. So is cancer. So are ulcers. So are cavities. Those have all been part of the human experience for as long as humans have been around. The 1950s were good in some ways, but if you were a woman or a black man or a disabled person, they were pretty rough compared to now. There was more crime per capita, certainly, so even if you weren’t part of those groups you were more likely to be a victim of something unfortunate and illegal. Ideas that begin with the premise that the world has fallen from grace, that it just needs to get back to fundamentals, that it just needs to focus on natural things, ignore the progress of human society and the advantages wrought by the advance of time, technology, and society in general. After all, if it were that easy to make everything good and pure and right again, don’t you think we’d have done that already?

10. Absence as Evidence

The final trick I’m going to cover harkens back to a frequent argument against evolution by natural selection, and it is also quite simple. The idea is that because there are gaps in the fossil record, or in any other array of evidence used to substantiate evolution, the theory itself is uncertain. Anything could be in those gaps, it is said – evidence of god, evidence of creation, of some other mechanism. Anything you can possibly imagine might be hiding in there, and because it hasn’t been nailed down for certain the whole theory is cast into doubt. In other words, the absence of evidence is treated as evidence unto itself. That is a seductive idea, because it allows us to believe whatever we want about the things that are still yet to be discovered in the world. We can think that angels are real, or UFOs, or mermaids, because we haven’t found conclusive evidence that they definitely don’t exist – the absence of evidence becomes evidence unto itself. If we really had looked everywhere and not found those things, if their presence were directly predicted by the theories we have and they definitely were not there, then their absence would indeed tell us something. But just as we do not usually assume that because we have not found our car keys they must therefore already be in the ignition, and so climb into the car and turn the key on faith, science cannot abide the idea that absence is evidence. Absence of evidence is reason for continuing research, definitely, but it is not reason to believe that mermaids are real, or that the moon is made of cheese.
