Client story

Helping a government entity gain meaningful insights from their data

We ran a data governance program to improve data integrity and generate insights to inform decision making.


Data governance
September 2020 to May 2022
720,000 data checks

With more than 4,000 sites and a mixed real estate portfolio, our client relied on multiple systems to manage their data. Without a single source of truth for data management, they were experiencing errors and inconsistencies, and their data could not be relied on to inform decision making.

The organisation did not have processes or standards in place to manage their data or maintain its integrity. There were no data dictionaries or glossaries, so stakeholders did not understand the impact that data errors or incomplete entries had on the systems, reports, KPIs and dashboards being produced. Address information was incomplete, and many duplicate property names made it difficult to identify the correct building. The result was inconsistent data across platforms.

The client required a data governance model that would provide granular information on their sites and ensure their data was accurate, thorough, consistent, reliable, and widely understood.


We implemented a multifaceted program to bring the client's data landscape under control and ensure databases are managed with specialist guidance in a well-defined, standardised manner. We developed policies and processes, maintained compliance, usage, and quality metrics, and delivered data standards, dictionaries, and other data governance best practice initiatives.

We also ran a data stewardship program which defined data governance roles and responsibilities. This enabled the business to identify and empower individuals to act as data stewards for each of our governed data sources, which in turn boosted stakeholders' ownership of, and accountability for, datasets.

To enable our client to track and monitor the data governance program, we: 

  • Implemented Lighthouse, a data quality management system that can inspect hundreds of millions of data points a day, to assess data quality across all relevant applications and systems, with results reported monthly.
  • Utilised Property Hub, a single data repository that enables the client to create and update property records across their portfolio and ensure data integrity across multiple applications. Monthly data audits are run in Property Hub to confirm data is being accurately maintained.
  • Used dashboard reports to visualise portfolio-level data across all their businesses.
  • Set up and complied with the data governance practices required by the organisation.

The program delivered measurable results:

  • The client's data quality (DQ) score increased to 99% over a 12-month period.
  • The number of data checks increased from 141,732 to 720,000 over a 12-month period.
  • A data glossary was published. This document identifies and defines the data attributes for the portfolio teams, so they can easily find the data assets they need and share an aligned understanding of their meaning.
  • Key stakeholders have improved their data literacy, and surveys are conducted to check stakeholder understanding and inform next actions.
  • Data is now reliably used to inform decision making.
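Conceptually, the kind of rule-based checking described above works by running each record through a set of data quality rules and scoring the pass rate. The sketch below is a minimal, hypothetical illustration of that idea — the rule names, fields, and records are invented for this example and do not reflect Lighthouse's actual implementation.

```python
# Hypothetical sketch of rule-based data quality checks.
# All rule names, fields, and records here are illustrative only.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    check: Callable[[dict], bool]  # returns True when a record passes

def make_rules(seen_names: set) -> list:
    """Two example rules: address completeness and unique property names."""
    def unique_name(rec: dict) -> bool:
        name = rec.get("property_name")
        if name in seen_names:
            return False  # duplicate name, like those the client struggled with
        seen_names.add(name)
        return True
    return [
        Rule("address_complete", lambda r: bool(r.get("address"))),
        Rule("property_name_unique", unique_name),
    ]

def dq_score(records: list) -> float:
    """Percentage of rule checks that pass across all records."""
    rules = make_rules(set())
    passed = total = 0
    for rec in records:
        for rule in rules:
            total += 1
            passed += rule.check(rec)
    return 100.0 * passed / total if total else 100.0

records = [
    {"property_name": "Depot A", "address": "1 Main St"},
    {"property_name": "Depot A", "address": ""},  # duplicate name, missing address
]
print(round(dq_score(records), 1))  # 2 of 4 checks pass -> 50.0
```

A production system would of course run many such rules per platform and aggregate scores per system, which is what the monthly dashboard reporting surfaces.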

Dashboard report showing:

  1. The client's total data quality score.
  2. The total number of exceptions across all platforms.
  3. The total number of data quality rules across multiple platforms.
  4. The data points checked against the data quality rules.
  5. The Lighthouse tool monitoring the five different systems, showing the number of exceptions and the score for each system.