Who this is for.

You are in an internal meeting. A colleague is walking through a funding opportunity and its reporting requirements. You are doing the maths in your head, working out what it would cost your team to collect data for this funder for the next two years. You raise a concern; it is discussed but left hanging as a caveat for review by the trustees.

At the board meeting, a trustee asks a question about the data that you were not expecting: if we are successful with this funder and have to report this way, how will we know if our services are working? Good question. You do not have an answer on the spot, so you go away to prepare a paper for the next meeting.

Now, at your desk, you are facing a blank document. This piece will help you write that paper and work out what it needs to get a yes from the board.

The backstory.

Barnardo's was founded in 1866 and is the UK's largest children's charity by expenditure. In 2022, it reached 357,276 individuals through 794 services and partnerships. The organisation operates more than 800 frontline services, has over 700 charity shops, employs around 8,000 people, and relies on a volunteer base of between 15,000 and 22,000 individuals.

Barnardo's is not a small charity that has outgrown its systems; it is a large, long-established organisation with systems developed over decades to support relationships relevant at the time of their creation.

The Oracle ERP system was implemented in 1999, and business analytics capabilities were added in 2010, followed by self-service HR. By the time this data transformation story began, Barnardo's had more than 150 years of accumulated operational complexity and over 20 years of data systems layered on top of one another, resulting in a data infrastructure designed to accommodate an expanding array of external reporting relationships.

When Meera Naik took on the role of Head of Data Transformation, there were more than 20 database systems in use and over 15 different outcome measures operating across various services, each tailored to the requirements of different funders or commissioners - a reflection of the range of funding relationships and service types across the organisation.

Remi Martins-Tonks joined Barnardo's in 2017 and went on to lead the Data and Insight team, growing it from seven members to more than 25. He noted that the challenge he inherited was substantial: extracting real value from the data was becoming increasingly overwhelming due to the complexity of requests and the multitude of different systems involved. The complexity he described was not just about internal factors. It also included external relationships, such as with funders, that affected how the organisation measured its success.

Reality check.

The situation at Barnardo’s is not unusual. The 2024 Charity Digital Skills Report found that 31% of charities say they are poor at, or not engaging with, the collection, management and use of data. A further 34% express the same concerns regarding data-informed decision-making. However, 80% of charities in the same survey say digital initiatives are a priority. The disconnect between aspirations and capabilities is not about a lack of desire; rather, it is about what the data has been built to do.

The Arts Council is well known in the sector for reporting requirements so onerous that many charities simply do not apply to be National Portfolio Organisations. In other words, what looks like a funding problem is really a data problem: data architecture shaped by accountability relationships to the point where it can no longer serve the organisation. The board of a charity facing that choice is rarely asking whether the data is good, but whether reshaping operations to meet a new funder's requirements is worthwhile relative to the cost of demonstrating impact to other stakeholders.

Poor data quality seldom results from negligence; it generally develops from growth. Each funding relationship creates new demands for data, and services adjust to meet them. Over time, data collection comes to focus on satisfying others' needs. The systems become harder to use, staff begin to disengage, and data quality deteriorates; by the time the problem is noticed, the decline is well under way. This is how charities end up collecting more and learning less.

Regardless of your organisation's size, the core question remains the same: Are you collecting data that you can use, or are you simply forwarding data to others?

The real problem.

Most charity data structures aren’t failing. They have adapted. The problem is that adaptation, done repeatedly over years at the service level, eventually produces an organisation that cannot see itself clearly.

In Barnardo’s case, they did not lack data; they had an enormous amount of it. The issue lay in the direction the data travelled. Data flowed upward to funders, the board, and funding commissioners, but mostly stopped there. Each new funding relationship brought additional reporting demands, new metrics to track, and varied formats to integrate. As a result, the systems had to multiply to accommodate these relationships, until much of what remained was compliance work mistaken for insight.

Martins-Tonks explains: "Because of the complexity of requests and using different systems, getting real value from our data was becoming an overwhelming task." The phrase "complexity of requests" is revealing: it indicates that the data served inbound demand from multiple external relationships rather than the organisation's own need to understand itself.

Here is how this process works in practice: A service manager collects data because a funder requires it. This data is entered into a system, which then generates a report sent to senior management. The service manager never sees the report again and has no knowledge of its accuracy, how it was used, or whether it led to any changes. Over time, they may stop caring whether the data is accurate, and why would they? The data never comes back to them.

Consider the limitations that arise when data is created solely for external accountability. First, consistent comparisons between services become impossible because each is evaluated against different criteria set by funders. Additionally, identifying unmet needs is challenging because the data reflects only what commissioners are interested in, rather than the lived experiences of your communities. This situation complicates decision-making regarding investments or redesigns, as the evidence available has been gathered to answer someone else's questions. Finally, it becomes difficult to mitigate mission drift because you gradually lose the ability to measure what is truly important to you.

Barnardo's identified one of its three strategic data pillars as reducing the administrative burden on frontline staff, allowing them to concentrate on their core mission work. This acknowledgement highlights a reality that many charities hesitate to voice: the current methods of data collection are taking valuable time and energy away from those dedicated to mission work, which they never get back.

The deeper issue is the governance challenge that seldom gets addressed in board discussions. Over time, the data architecture has been influenced by a series of decisions: some made at the board level and others at the operational level. Each of these decisions seemed reasonable at the time, but collectively, they have contributed to the current situation of holding data that no longer serves the organisation that collects it.

This raises the question that most charities don’t answer: if you stripped out every metric your funders require, would what you have left tell you whether your services are working?

What they did.

Barnardo's started by addressing a pressing problem and recognising an opportunity for change, then moved to a more formal strategy. 

The invisible foundation

Before any visible progress could be seen, several years of unglamorous infrastructure repairs were required. This involved migrating from outdated on-premises systems to Microsoft Azure, establishing frameworks for data governance and quality, and ensuring compliance with GDPR and NHS regulations. Microsoft Surface tablets were deployed to field workers during this period, providing frontline staff with a consistent, mobile-compatible device for data entry.

This infrastructure created a unified system where previously there had been a patchwork of layers.

"See, Hear, Respond": When the data started flowing.

In 2020 and 2021, Barnardo's launched the "See, Hear, Respond" programme, a COVID-era initiative aimed at supporting children and young people who had become invisible to services during lockdown. Frontline staff collected and anonymised client information, entering it directly into Power BI (Microsoft's data visualisation and reporting tool), where the data became live, automatically synced, and available the same day.

The dashboard improved, but the bigger shift was in how Barnardo’s engaged with community organisations and triaged cases in real time. The programme uncovered an important insight: children needing more than 15 hours of support faced complex, overlapping challenges that the existing service structure could not address. As a result, a new team was established specifically to assist these young people. The delivery model also changed once it emerged that underserved young people were more reachable when Barnardo's proactively approached them rather than waiting for them to reach out.

Martins-Tonks highlighted that this information changed which grassroots community organisations they connected with and informed how cases were triaged. That is data doing operational work, not just accountability work.

The formal strategy: Three pillars and the bigger challenge

In early 2022, Barnardo's engaged a data consultancy to develop a formal data strategy. Over six weeks and through extensive stakeholder interviews, this process generated more than 100 findings and more than 20 strategic recommendations.

Three strategic pillars emerged from this work: to better understand the needs and experiences of the children and families they support through improved data collection and analysis; to innovate continuously, developing new approaches to services as the needs of the people they support change; and to reduce the administrative burden on frontline staff so they can focus on frontline work.

Many organisations in a similar situation might have stopped at the first two pillars. However, the inclusion of the third pillar recognised an often-overlooked cost that is not visible in financial accounts but evident in the daily experiences of the staff. Including it in the strategy demonstrated the organisation’s commitment to understanding and addressing the workload its employees face.

A key theme throughout this process was knowledge transfer, with the specific task of building internal capabilities to prevent supplier dependency. The Data and Insight team expanded from seven to over 25 members, and new apprenticeships in data analytics, data science, and machine learning were launched.

The cultural dimension

Martins-Tonks emphasised that while the technical work was important, the biggest challenge lay in embedding a data culture within the organisation, ensuring that staff felt curious and confident in using the data products being developed for them.

Pausing on this highlights the challenge Barnardo’s was facing. The scepticism it encountered was not typical resistance to change; it was rational. Staff had spent years entering data into systems that provided no return on their efforts. They had learned that data collection was a one-way process. Simply implementing a better system would not have resolved this. What was needed was a consistent demonstration of reciprocity: collect data, return something useful to the people entering it, and make decisions based on it. Rinse and repeat. Most charities will jump to fix this with change management principles, but it is a track-record problem: trust takes time and cannot be expedited through training sessions or communication campaigns. It must be rebuilt case by case, eroding scepticism a little more each time. Speed helps, as the "See, Hear, Respond" programme showed, where same-day data informed triage.

What shifted.

The most significant change was seen in the "See, Hear, Respond" programme, where data influenced operational decisions in real time instead of surfacing in reports weeks later. This resulted in changes to partner selection, triage processes, and service structure. A new team was created because the data indicated a need the existing structure was not meeting.

At the reporting level, the transition from monthly reports to same-day data enabled live operational decision-making. 

The growth of the team, from seven members to more than 25, is also telling. The organisation is investing in internal data capabilities as a long-term function, viewing it as an ongoing process rather than a temporary project.

What is more challenging to assess is the extent to which the third pillar was implemented at scale. While "See, Hear, Respond" demonstrated the third pillar's effectiveness for one programme, it remains unclear whether the reduction of the frontline administrative burden has been applied across all 800 services, as this information is not publicly available.

Why it worked.

The phasing was important. The "See, Hear, Respond" programme was implemented before the formal strategy was established. When Barnardo's participated in the consultant workshops in 2022, it was not seeking permission to envision a different relationship with data; it had already observed one in action. This proof of concept shifted the conversation from mere aspiration to solid evidence.

The third pillar was named before it was delivered. By including the reduction of the frontline administrative burden in the strategy document and in its mission-driven goals, the organisation created a level of accountability it previously lacked. Once something is documented as one of three commitments, it becomes much harder to deprioritise.

The principle of knowledge transfer was maintained. Every major engagement was designed to conclude with the development of internal capability, rather than creating ongoing dependency on external support. The investment prioritised people over platforms.

Cultural change was led from the front. Martins-Tonks described his approach as centred on learning through experience: delivering, reflecting, and adjusting accordingly. This modelled the behaviour he was encouraging the organisation to adopt.

Where this breaks.

The scale of Barnardo's work is not replicable for most charities. With a data team of 25 people, multiple major consultant engagements running in parallel, and a multi-year infrastructure programme, this level of support is not feasible for a charity with, say, 30 or 50 staff members. However, the underlying principles are transferable, even if the precise programme is not.

The dependency on leadership - the key person risk - is noteworthy and has implications for the transferability of these principles. The data strategy, driven by Martins-Tonks, was built around a leader capable of managing both the technical and cultural aspects of the work, with the seniority and relationships necessary to make it succeed. When that leader leaves, as leaders do, an urgent question emerges: what practices were truly embedded in the organisation, and what relied solely on that individual?

There is also the reporting trap to consider. Barnardo's moved from Oracle to Microsoft Dynamics 365 for its HR system. By 2024, the Dynamics deployment had significant functionality gaps, including issues with sickness reporting, absence management, and payroll data export. A new vendor engagement was necessary to address these problems. This structural pattern is not unique to Barnardo's; it is a common issue for any organisation that grows and evolves while its systems stand still.

Additionally, the issue of funder relationships does not have a straightforward solution. Barnardo's acknowledged this. In most cases, large charities have more leverage to challenge onerous reporting requirements than smaller ones. For a charity with ten or twenty staff members considering an Arts Council National Portfolio Organisation application, for example, the dilemma of whether to reshape their entire data infrastructure (if they even have one) to meet a single funder’s requirements remains just as difficult to navigate. 

The toolkit.

The work carried out by Barnardo's required resources that most charities don’t have. What follows is designed for a smaller organisation. The question it addresses is: what should a board paper proposing a data review include to secure approval?

Before you write the paper, anticipate the most likely objection. It will probably not concern the problem itself, as the board has already indicated a desire to address it. Instead, the challenge will be the scale of the response: why a comprehensive review rather than simply adjusting the existing data framework?

The honest answer is that the problem likely did not arise from a single decision and cannot be resolved with just one adjustment. If it is like Barnardo’s, the problem accumulated over many years through various decisions, some made at the board level and some operationally. Each decision, while individually reasonable, has contributed to the current situation. Simply improving within the existing structure adds yet another layer to an already complex problem. 

Start the paper by identifying the problem simply: the organisation is collecting data in ways that meet external requirements but no longer consistently help it understand, compare, and improve its own services. Then set out why this is a board issue, what decision you are asking the board to make, and what early progress would justify continuing beyond phase one.

For each item below: what to include, and why it makes a difference for small charities.

Current state audit.

Identify the data you currently collect, who it’s for, and whether it is returned to the individuals or teams that gathered it. You won’t need a consultant; just dedicate two hours with your team leads to create a list. This audit will help you determine whether your issues are technical, governance-related, or rooted in culture, and it will guide all subsequent actions.

Funder dependency map.

List every active reporting requirement, which funder it serves, and what it costs in staff time per quarter. Most boards have not been made aware of these hidden costs. Making this information visible can be one of the most valuable contributions you can include in the paper.
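A minimal sketch of how that staff-time calculation might look once you have gathered hours per requirement from your team leads. All funder names, hours, and the hourly cost rate below are illustrative placeholders, not figures from the case study; a spreadsheet does the same job.

```python
from collections import defaultdict

# (funder, reporting requirement, staff hours per quarter) - hypothetical figures
requirements = [
    ("Funder A", "quarterly outcomes return", 12),
    ("Funder A", "case-level data export", 8),
    ("Funder B", "annual impact report, spread per quarter", 5),
    ("Funder C", "bespoke spreadsheet template", 10),
]

HOURLY_STAFF_COST = 25  # assumed fully loaded cost in GBP per hour

# Total the hidden hours per funder
hours_by_funder = defaultdict(int)
for funder, _requirement, hours in requirements:
    hours_by_funder[funder] += hours

# Express each funder's reporting burden as a quarterly cost
for funder, hours in sorted(hours_by_funder.items()):
    print(f"{funder}: {hours} hours/quarter = £{hours * HOURLY_STAFF_COST}/quarter")

total_hours = sum(hours_by_funder.values())
print(f"Total: {total_hours} hours/quarter = £{total_hours * HOURLY_STAFF_COST}/quarter")
```

The point of the exercise is the per-funder totals: a board that has never seen reporting burden expressed as a quarterly cost line tends to read the rest of the paper differently.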

Cost of doing nothing.

Identify what becomes impossible if the current situation continues. For example, being able to consistently compare services, spot unmet needs early, make a coherent impact case across multiple funders, and protect against mission drift. This is the strategic argument for why this is a board issue and not just an operational task. 

Phasing and decision points.

Boards are unlikely to approve an open-ended programme, so structure the proposal in phases, including specific decision points: outline what will be reviewed in phase one, specify which decisions must be made before commencing phase two, and describe the exit strategy if the programme is not successful. 

The culture question.

The paper should focus on how frontline staff will experience the change, rather than just detailing what the new system will do. If employees have spent years inputting data into systems that provided no feedback, this represents an issue not just with the technology, but also with past experiences. It is important to address this; otherwise, the board may not grasp why the change management needs to take as long as it does.

What early success looks like.

This is about credible results within 6 to 12 months, for example, how a service uses near-live data to change how it works; how a reporting process is simplified or removed; or how an internal outcomes framework reduces bespoke reporting sprawl. Make it concrete enough that the board can see the impact of the intervention. 

Still brewing.

Martins-Tonks left Barnardo's in 2023 and currently leads data initiatives at the Money and Pensions Service. The data strategy he developed was described at the time of his departure as a journey still in progress. It's unclear who is now carrying this strategy forward and how much of what was designed has been maintained, as this information is not yet publicly available.

The third pillar, reducing the administrative burden on frontline staff, has shown the least evidence of large-scale delivery. The "See, Hear, Respond" programme demonstrated this principle, but it remains uncertain whether such improvements have been scaled across Barnardo’s 800 services.

The HR system issues serve as a reminder of the challenges faced. Barnardo's migrated from Oracle to Dynamics 365 and is now re-optimising its Dynamics deployment, as the core functionality drifted from the organisation's operational needs. The reporting issues have re-emerged and will continue to do so unless someone is specifically tasked with preventing them.

When the operations leader who drove these changes eventually leaves, and they will, will your board still understand why this intervention was necessary? Do you have a succession plan that addresses not just the role itself but also the institutional memory of why this work was initiated? Many charities have policies for the first question, but very few have one for the second.

Sources for this case study:

Amplifi: How becoming data-driven is helping Barnardo's achieve their strategic goal (2023) · Technology Magazine: Remi Martins-Tonks, Head of Data and Insight at Barnardo's (April 2023) · Microsoft Customer Stories: Barnardo's social impact through Microsoft Cloud (circa 2021–2022) · DataIQ: Meera Naik, Head of Data Transformation at Barnardo's (2021) · CORC: The journey of Barnardo's outcomes framework development · FourVision: Building a futureproof HR solution for Barnardo's (2024) · Charity Digital Skills Report 2024 · Quematics: Charity data quality framework (March 2026)

Note: the specific document titles used here reflect Coffee Break Ops’ interpretation of the framework's requirements. The source case study describes the approach and outcomes. Some document names have been inferred from common practice in equivalent implementations.

Every week, a real ops problem from a real charity:
What they did, what it reveals, and what you can take back to your desk.
Sign up to get it in your inbox.
