(In)Visibility in policy work: when more data isn’t always the answer

By Mary Sadid, NSUN Policy Officer

“No one does that research so we can’t prove what we know to funders.”

“We aren’t funded because we can’t collect that much data.”

Increasing visibility is a common goal in policy work: amplify voices, gather data, produce reports, evidence the issues, facilitate funding. As the above quotes from NSUN members show, data infrastructure is often lacking for minoritised/racialised groups, who may struggle to prove what they know. This can perpetuate a cycle of being ignored, un/underfunded and marginalised. A natural response from policy makers is to advocate for better data infrastructure and evaluation frameworks, but it is not always so straightforward: invisibility can be a better option than exposure to a hostile system through mishandled data.

As the backlash to NHS Digital’s attempted ‘data grab’ showed us, data can be disclosive of our identities, even when anonymity is part of the offer. The consequences of disclosure are not the same for everyone: for those of us, often racialised or with precarious migration status, who are over-policed and under-protected, incomplete attempts at anonymity can inadvertently draw unwanted, potentially endangering attention. This can be particularly acute for people who experience paranoia or distressing beliefs around being watched and who may have direct experience of surveillance, whether in criminal justice, mental health, or detention settings, or through government departments such as the Department for Work and Pensions (DWP). If, for example, your care staff wear body cams, and you experience paranoia or overwhelming beliefs around being watched, how do you disentangle the two?

A desire to avoid surveillance or being on the radar of the statutory sector may inform decisions from not registering with a GP to not claiming benefits, even if eligible to do so. However, if we decide that the problem lies solely with communities, and only address problems at this level, we risk perpetuating labels such as ‘hard to reach’ that let the state and systems off the hook. Even in the absence of paranoia or distressing beliefs, talking about “hesitancy” can situate rationality outside of the groups in question, framing their avoidance as irrational rather than as a reasoned response to risk.

In responding to such tensions around visibility, part of the challenge lies in identifying where data is ‘leaky’, and in how we can stop ourselves from putting others at risk. This requires listening to, and believing, the experiences of those who face intense scrutiny. Examples such as the increased interaction between the NHS and the Home Office since the Immigration Health Surcharge was introduced in 2015, NHS PREVENT referrals that lead to individuals being held in ‘pre-crime’ spaces, and information-sharing agreements between the DWP and Greater Manchester Police all reflect the increased harnessing of cross-agency communication for a punitive agenda.

Patterns likely vary from place to place, perhaps making it difficult to predict the likelihood of help-seeking going wrong. What does feel clear, however, is that assuming a benevolent or well-intentioned system can be an endangering and negligent view. This points to a need for a rights-based approach, and for clarity on what such an approach means and looks like, but also for knowledge of where an individual’s rights may not be upheld by the law or its intermediaries.

When talking about data and risk, informed consent becomes a challenge. If we don’t recognise the capacity of data sharing to place some of us at risk, can we ethically collect and share someone’s data? Unpacking this question requires recognising the power dynamics at play: a lack of data and funding is less a cause of marginalisation than a symptom of ongoing power disparities. Instead of collecting more data to attempt policy change or to obtain funding to work on a particular issue, we may need to challenge the ask and the evidentiary threshold itself. This means addressing the structure and histories of the systems at play, as well as our own places within these systems.

Much of the information we seek to quantify or evidence already exists, but it is often held in spaces where individuals and communities are discredited or treated as unreliable or inadequate narrators. Policy work can play a role in sanitising and packaging information such that it is acceptable to the policy maker, reinforcing hierarchies of knowledge, and creating distance from the realities on the ground.

26th July 2021