Client story

We helped an organisation uncover meaningful insights from its data

A data governance program turned our client’s real estate information into a reliable body of data to support decision-making


Data governance


September 2020 to May 2022




720,000 data checks


Our government client has a real estate portfolio of more than 4,000 properties, each with its own story of how it is managed and maintained. Errors and inconsistencies in the information kept about these properties made the portfolio difficult to manage. Data was spread across multiple systems, and without a single source of truth the client struggled to make reliable decisions.

The core issue was that there were no processes or standards to manage the data or maintain its integrity, and the people populating and accessing the data didn’t understand the implications of this for systems, reports, key performance indicators and dashboards.

For example, address information was incomplete, resulting in inconsistent data between platforms. Duplicate property names also made it difficult to identify buildings. 
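Issues like these lend themselves to simple programmatic checks. As a minimal, hypothetical sketch (the field names and records are illustrative, not the client’s actual schema):

```python
from collections import Counter

# Hypothetical property records; field names are illustrative only.
properties = [
    {"id": 1, "name": "Harbour House", "address": "12 Quay St, Sydney NSW 2000"},
    {"id": 2, "name": "Harbour House", "address": ""},            # duplicate name, missing address
    {"id": 3, "name": "Civic Centre",  "address": "5 Park Ave"},  # no postcode
]

def find_incomplete_addresses(records):
    """Flag records whose address is empty or lacks a 4-digit postcode-like token."""
    flagged = []
    for r in records:
        addr = r["address"].strip()
        has_postcode = any(tok.isdigit() and len(tok) == 4 for tok in addr.split())
        if not addr or not has_postcode:
            flagged.append(r["id"])
    return flagged

def find_duplicate_names(records):
    """Return property names shared by more than one record."""
    counts = Counter(r["name"] for r in records)
    return sorted(name for name, count in counts.items() if count > 1)

print(find_incomplete_addresses(properties))  # [2, 3]
print(find_duplicate_names(properties))       # ['Harbour House']
```

Checks of this kind only surface the problems; a governance program like the one described here is what assigns someone the responsibility and the standards to fix them.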

To rectify these issues, we helped our client implement a data governance model that would provide granular information on sites and ensure data was accurate, thorough, consistent, reliable, and widely understood. 


We wanted to set up a program that would ensure the client’s databases could be managed in a well-defined, standardised way.

We implemented policies, processes, and compliance measures, along with other best practice initiatives, including usage and quality metrics, and data dictionaries.

We also ran a data stewardship program which defined roles and responsibilities around data governance. This enabled the business to identify and empower individuals to act as data stewards.

Several technology and automation solutions allowed our client to track and monitor its new data governance program. These included Lighthouse, which can assess the quality of hundreds of millions of data points per day across multiple applications and produce monthly reports.

Another, Property Hub, is a single data repository that enables the client to create and update property records across the portfolio, ensuring data integrity across multiple applications. The program also runs monthly data audits.

Dashboard reports helped our client visualise portfolio-level data across its business.
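Lighthouse’s internals aren’t public, but the kind of rule-based checking described above can be sketched roughly as follows. Everything here, including the rules, severity weights and records, is invented for illustration:

```python
# A data quality rule pairs a check function with a severity weight.
# This is an illustrative sketch, not Lighthouse's actual design.
rules = [
    ("address_present", lambda rec: bool(rec.get("address", "").strip()), 3),
    ("name_present",    lambda rec: bool(rec.get("name", "").strip()),    3),
    ("area_positive",   lambda rec: rec.get("floor_area_m2", 0) > 0,      1),
]

records = [
    {"name": "Depot A", "address": "1 Main St", "floor_area_m2": 450},
    {"name": "",        "address": "2 High St", "floor_area_m2": 0},
]

def run_checks(records, rules):
    """Apply every rule to every record; return (checks_run, exceptions)."""
    checks = exceptions = 0
    for rec in records:
        for rule_name, check, _severity in rules:
            checks += 1
            if not check(rec):
                exceptions += 1  # a failed check is an 'exception'
    return checks, exceptions

print(run_checks(records, rules))  # (6, 2)
```

Run at scale across multiple systems, counts like these are what feed the monthly reports and dashboards mentioned above.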


The number of data checks carried out across our client’s portfolio increased from 141,732 to 720,000 over 12 months, resulting in a 60% average increase in the data quality score across the client’s five systems (Corrigo, Portfolio, Spend, PH and Clarizen).

We also created and published a data glossary and rules register: a document that identifies the attributes of data and defines the information so it can be found easily. This has also helped key individuals improve their data literacy. Regular surveys check the individuals’ continued grasp of the data and inform next actions.

Our client is now reliably using its data to inform decision making.

The above image shows a dashboard with six measures. These are the percentage of data that is accurately recorded; the client’s data quality score (a weighted measure of successful data checks based on the severity of the rules imposed, out of a possible 500 points); the total number of exceptions (statistical errors reflecting the difference between a value obtained from a data collection process and the true value) across all the platforms that capture data; the total number of data quality rules across multiple platforms; ‘below threshold rules’ (rules performing below the accepted data quality range); and total rules.

The boxes on the bottom left indicate the number of data points checked against the data quality rules and rank the severity of inaccurate data.

The box on the bottom right is a report produced by the Lighthouse tool monitoring five systems, with hundreds of rules capturing thousands of data points. This area shows the number of exceptions and the score for each system.
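The weighted score described above can be illustrated with a minimal sketch. The 500-point ceiling comes from the dashboard description; the per-rule weights, results and arithmetic are assumptions for illustration only:

```python
# Hypothetical per-rule results: (severity_weight, checks_passed, checks_run).
# The 500-point maximum is from the dashboard description; how points are
# apportioned across rules is an assumption.
MAX_SCORE = 500

rule_results = [
    (3, 95, 100),   # high-severity rule: 95 of 100 checks passed
    (2, 80, 100),   # medium severity
    (1, 60, 100),   # low severity
]

def quality_score(results, max_score=MAX_SCORE):
    """Severity-weighted pass rate, scaled onto a max_score-point scale."""
    total_weight = sum(weight for weight, _, _ in results)
    weighted_pass = sum(weight * passed / run for weight, passed, run in results)
    return round(max_score * weighted_pass / total_weight)

print(quality_score(rule_results))  # 421
```

Weighting by severity means a failed high-severity check (for example, a missing address) pulls the score down more than a failed low-severity one, which matches the dashboard’s description of a score based on the severity of the rules imposed.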