7 Out of 10 People are Sick of Surveys 5 May 2015
“There are lies, damned lies, and statistics” according to Benjamin Disraeli, as quoted by Mark Twain. Disraeli perhaps didn’t have in mind the endless slew of surveys and polls that characterise our attempts to understand society, but it seems equally fitting.
Countless polls are to be expected around election season, commissioned by think-tanks, newspapers and the political parties themselves. The points dropped or gained by various political parties after debates or policy announcements provide plenty of fodder for commentators and pundits.
Increasingly common too are surveys on religion and belief. There are those released by Pew, for example, some of which are meta-studies. There are surveys conducted via YouGov. There are those commissioned by news outlets, such as one on British Muslim attitudes conducted by ComRes and paid for by the BBC in early 2015. If you want a statistic on religion and society, you can probably find it somewhere online via these surveys.
It is understandable that we rely on surveys. They provide cold, hard facts, not just opinions. But these ‘facts’ aren’t always as reliable as they seem, and like any nugget of information, they can easily be misused.
For example, Stephen Jones at Newman University, who is researching religion and science, highlighted to me how polls can create conflict where there is none.
“Generally, polls on science and religion offer extremely limited options, which end up exaggerating the opposition between religious and secular people. The Horizon ‘War on Science’ poll is a case in point. It asked people whether they believe in: 1) an evolutionary process in which God plays no part; 2) creation of the world by God in the last 10,000 years; or 3) intelligent design, where God intervenes directly in evolutionary processes.
“Those who believe in classical evolution and God (which a lot of people do, though we don’t know how many) are not given any option at all, and are forced to become either anti-evolution creationists or atheists.”
You’d think a survey commissioned by the BBC would avoid such simple mistakes, but such problems are common.
A badly phrased question or answer can provide all the statistics necessary for someone to prove a point. After the Parisian shootings, several UK media outlets commissioned surveys on Muslim views. Sky’s survey led with headlines including “39% of Muslims say the police and MI5 are radicalising young people”, while the BBC’s survey focused on the finding that “1 in 4 Muslims have some sympathy with the motives” of the Parisian shooters. The problem with such findings is that there is little depth to them. What is actually meant by having sympathy with the motives of the Parisian shooters? Some believe the attack was a reprisal against cartoons of the Prophet, others view it as a violent lashing out against the structural inequalities of French society, and others still see the shootings as part of al-Qaeda’s recruiting tactics in the West. Given this complexity and debate, is a de facto statement about the motives of the shooters at all useful?
Returning to Stephen Jones, he points out the same ambiguity in other questions about Muslims.
“The question that really annoys me is about whether or not Muslims support ‘Sharia’. This is routinely used to imply that large numbers of Muslims in Britain support the replacement of British democracy with theocracy. The problem here, which religiously illiterate polling companies don’t recognise, is that for many Muslims Sharia means something akin to ‘God’s path’, and so they don’t want to reject the notion entirely. That doesn’t mean, though, they have any interest in the kind of political system advocated by groups like Hizb ut-Tahrir.”
The problem with surveys is that they seek to condense human experience and the world into discrete multiple choice questions. By nature, surveys are about breadth rather than depth, but without appropriate consideration of how these discrete choices are made, surveys can become meaningless or even at times, misrepresentative.
This isn’t to say all polls are useless, of course. There are those conducted by researchers who choose discrete answers after extensive fieldwork, to ensure they represent the full gamut of viewpoints on an issue – Linda Woodhead’s use of YouGov polls being a good example. The Professor of religious studies selects responses based not upon her own presumptions, or ‘common sense’, but upon fieldwork conducted by religious studies scholars across the country. The fieldwork is often in-depth and qualitative, and surveys are the natural next step to begin to assess how prevalent these views are.
There are then good surveys and bad surveys. The challenge is telling the difference, a task made easier by maintaining a healthy dose of scepticism towards them.
If you enjoyed this article, support us and receive more like it by subscribing to the hardcopy magazine.