In the next few posts I'm going to focus on different aspects of how problems are constructed, with each post looking at one of those aspects.
This post looks at the role of evidence.
Evidence is something we all intuitively think of as central to defining any problem. We rely on evidence to tell us about the nature of the problem, its causes and consequences, and what solutions might be effective in addressing it.
Often, when we think we don't know enough about a particular problem, we will decide that we need more evidence before we can really tackle it effectively. And when we want to persuade other people that a problem is important we turn to evidence in order to make our case.
Those ideas about evidence tend to assume that we define and address problems based on the evidence that we find. In fact, the process can often work the other way round. The problems we decide need addressing and the ways in which we define them can dictate what evidence we collect and how we interpret any evidence that already exists.
A good example of a problem where this occurs is the heated topic of immigration in the UK. And a good example of evidence-gathering in this area is a recent piece of research by two academics at UCL that analysed the fiscal contribution made to the UK economy by migrants over the past 15 to 20 years.
Their report made for some attention-grabbing headlines, which split broadly into two camps.
The Guardian and the Financial Times enthusiastically proclaimed that EU migrants arriving in the UK between 2001 and 2011 had contributed a net £20 billion to the UK economy.  The Telegraph and the Daily Mail trumpeted the shocking finding that non-EU migrants had cost the UK a net £120 billion in the period 1995-2011. 
Both of those figures were from the same report. The choice of which figure to run with in the headline (and what kind of emphasis to apply within the article) is broadly down to the political leanings of each newspaper. That is, they chose from the evidence available the figure that would fit best with their existing view of the problem. (Although that's obviously a simplification – the Independent referenced both figures in its headline). 
This is an obvious point, but an important one. Clearly we do not rely entirely on evidence for our problem constructions if we decide which bits to believe or emphasise based on our existing ideas.
Equally, the evidence itself is not innocent – it can, as in this report, be legitimately open to very different interpretations. Evidence in the social and political world is usually quite messy, consisting of partial or ambiguous findings that may not always fit neatly with the problem definitions we want to employ.
This means that we argue a lot about the significance, relevance and reliability of different pieces of evidence.
There are several ways in which we can argue about evidence; I've picked out a few here from the media coverage of the report by way of illustration.
First, you can challenge the evidence itself, i.e. whether the facts and figures involved are really correct or not. The Telegraph and the Daily Mail both cited comments by the Director of MigrationWatch (a generally anti-immigration think tank) disputing some of the figures in the report.
Second, you can challenge the authority of the source of the evidence – in this case the two UCL academics. The Telegraph noted that both MigrationWatch and Civitas had previously criticised research by the two authors. The Express noted an immigration-related prediction made in 2003 by one of the authors that subsequently turned out to be incorrect. 
Third, you can challenge how others are using or interpreting the evidence. The Telegraph said that the authors themselves 'emphasised their findings on the contribution of European migrants and gave less prominence to the findings on the costs of non-EEA immigration.' The Daily Mail said that the report 'sought to put an overwhelmingly positive gloss on the economic impact of mass immigration.'
Fourth, you can reframe the evidence – especially numerical evidence – by recasting it in a different light, suggesting that it does not in fact show what others might think it does. Andrew Green of MigrationWatch (cited by the Telegraph) did this quite skilfully by suggesting that the total amount contributed by EU migrants to the UK economy (the £20 billion figure) actually worked out at less than £1 per person per week over the period examined in the report.
Fifth, you can challenge some of the assumptions found in the evidence. The Daily Mail argued that 'Critics will say the report is backward looking – focusing on the taxes paid by the influx of Eastern Europeans when they are young, single and healthy – but not the future burden their families may place on schools, hospitals and the welfare state.' Channel 4 noted that the report hadn't taken into account illegal immigration. 
I've concentrated mostly on the right-wing press here because in this instance the report itself did emphasise the £20 billion figure – so critique of the report came largely from the right whereas the left-wing outlets were mostly just endorsing it. The techniques of evidence interpretation and framing are easier to spot and illustrate in criticism than they are in approval.
That's not to say that the right-leaning press were somehow distorting the information that their left-leaning counterparts were faithfully presenting. The latter had their frames and interpretations too. The Guardian, for example, argued that the report 'reveals that Britain is uniquely successful, even more than Germany, in attracting the most highly skilled and highly educated migrants in Europe.' That, again, is a very partial interpretation of the evidence.
Unfortunately, much of this sort of debate is dismissed as 'spin' – exactly what we'd expect of media outlets with particular political preferences. In fact, all of the techniques I've outlined above can be legitimate if used properly – although in practice they often are not (for example, personal criticism of one of the report's authors clearly has no bearing on the quality or otherwise of the report's findings).
There are a couple of important points to take from all this.
First, evidence is, to an extent, what we make of it.
The rider 'to an extent' is key, of course. I do not mean to suggest that all evidence is relative and all that matters is our interpretation. What I am arguing is that, when we come across evidence relating to a particular social problem, we tend to want to fit it into our existing preconceptions about that problem. In order to do so we might find ourselves employing one or more of the methods outlined above.
The second point is that we should be keenly aware of the ways in which evidence is debated and interpreted – including when we do it ourselves.
Such interpretation can be dangerous if it leads to a basic misrepresentation of the evidence. It can also be dangerous if, more subtly, it leads us to cherry-pick just those bits of evidence that happen to conform to our existing views. All that produces is unchanging opinions that struggle to respond to a changing world.
But looking at how others interpret evidence can also be very informative. A closer look at the interpretations they offer reveals a great deal about the preconceptions they have regarding the problem under discussion.
Finally, it is worth emphasising that, despite these issues, defining and addressing problems based on evidence is clearly better than just making things up as we go along. But evidence isn't the be-all and end-all, either. A better awareness of its uses and abuses, and of our own biases, can help us become more aware (and perhaps critical) of implicit problem constructions, as well as put forward our own.
This has important implications for choosing which charities to support. As I discussed in my last post, choosing which charity to support implies choosing which problem to address. Evidence plays an important part in how we choose which problems to address and what to do in order to address them.
To the extent that such evidence is ambiguous and uncertain we need to be keenly aware of its role in informing our decisions.
More profoundly, it is worth asking ourselves whether our existing preconceptions about particular problems are influencing the evidence we collect and our approach to interpreting it. It might just be that we need to rethink the problem before we rethink the evidence.