Using the framework

These key areas of the data maturity framework will help you to begin your data journey.

The framework intentionally does not define which issues are most important, or set a critical path for development, as these will depend on your provider's circumstances.

We hope you will be able to navigate the framework and identify actions that will help you move forward. If you would like support with this, please contact your relationship manager.

Many universities and colleges share common priorities. Some examples of these can be found below, along with the key issues that will help you get the right foundations in place.

Data quality

Good data quality ensures people have confidence in data, and trust that it can be used to make informed and effective decisions. There is no set definition of good data quality - we consider it to mean data that is ‘fit for purpose’.

If the data fulfils its intended use, then it is deemed to be of sufficient quality. Data is typically assessed against seven key characteristics when determining whether it is ‘fit for purpose’:

  • Accessibility: how easily available the data is to those who need to access it.
  • Accuracy: a dataset that reflects the real world and is a true account of what it is intended to represent.
  • Completeness: a dataset that is full and contains all required information.
  • Consistency: where data values in one data set are consistent with values in another data set.
  • Timeliness: data that is available to end users within an agreed and useful timeframe.
  • Uniqueness: describes data where each entity appears only once in the dataset, with no duplicate records.
  • Validity: describes the degree to which the data is in the range and format expected.
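
In practice, several of these characteristics can be checked automatically. The Python sketch below runs completeness, uniqueness and validity checks over a toy student dataset; all field names and values are illustrative examples, not taken from the framework.

```python
def completeness(records, required_fields):
    """Share of records with a non-empty value for every required field."""
    ok = sum(
        1 for r in records
        if all(r.get(f) not in (None, "") for f in required_fields)
    )
    return ok / len(records)

def uniqueness(records, key_field):
    """True if each entity appears only once, judged by its key field."""
    keys = [r[key_field] for r in records]
    return len(keys) == len(set(keys))

def validity(records, field, allowed):
    """Share of records whose value for `field` is in the expected range."""
    return sum(1 for r in records if r.get(field) in allowed) / len(records)

# Toy dataset - field names and codes are hypothetical.
students = [
    {"student_id": "S1", "mode": "FT", "dob": "2004-09-01"},
    {"student_id": "S2", "mode": "PT", "dob": ""},
    {"student_id": "S2", "mode": "XX", "dob": "2003-02-14"},  # duplicate id, bad mode
]

print(completeness(students, ["student_id", "dob"]))  # 2 of 3 records complete
print(uniqueness(students, "student_id"))             # False: S2 appears twice
print(validity(students, "mode", {"FT", "PT"}))       # 2 of 3 values valid
```

Checks like these can be scheduled against core datasets and the results tracked over time, turning ‘fit for purpose’ into something measurable.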

These are the key areas we suggest you focus on to start your data quality improvement journey successfully.

Accountability - data roles

Level 1 - We don't have specific roles for data management or any concept of why we would need them beyond considering data protection issues.

Level 2 - We perform the most basic data management roles, mainly around data cleaning, though they are not formalised. 

Level 3 - We have defined some formal data management responsibilities in our key areas like student support teams.

Level 4 - We have several data specific roles which confer accountability across a variety of departments, for both operations and business change. 

Level 5 - Data citizenship is embraced by people in all parts of the organisation, who advocate and manage data as an organisational asset.

Accountability - resolution

Level 1 - There is no collaboration between cross-disciplinary teams to diagnose, troubleshoot or resolve data issues. 

Level 2 - There is some ad hoc collaboration between registry and other departments to fix serious problems.

Level 3 - We have a forum for sharing issues around quality and other data management issues which meets regularly and is respected. 

Level 4 - Multi-disciplinary teams work together to resolve data issues - either tactically or as part of a data improvement programme. 

Level 5 - Data issues are worked on collaboratively between all functions, and prioritised according to wider business initiatives and needs. 

Assurance - monitoring and measurement

Level 1 - We do not have an approach to improving data quality and there is a sense of apathy towards such initiatives across the business.

Level 2 - We have some good ideas, but nothing formal enough to call a plan. We focus on statutory compliance and tend to 'fix and forget'.

Level 3 - We have data quality targets and key performance indicators (KPIs), but these are not regularly assessed or challenged.

Level 4 - We set, monitor and maintain quality metrics for most of our data, and have automated some remedial work where possible.

Level 5 - We ensure all our data is rigorously maintained to the published levels of quality using well-understood metrics.

Assurance - issue management

Level 1 - We fix errors as they are reported to us. No analysis of why the failure occurred is done and we do not track issues after resolution.

Level 2 - We fix errors during routine checking, but we do not have time for root cause analysis unless it's a very serious issue. 

Level 3 - We track and record all data issues using automated error checking though we do not yet have standards in place for resolution.

Level 4 - We have a data quality plan that defines how we resolve issues. It includes performance measures that we monitor and routinely assess.

Level 5 - Our data quality plan is supported by automated error reporting which is monitored by senior management.

Business architecture - requirements

Level 1 - We continue with data collections for no obvious rationale other than 'we've always done this'.

Level 2 - We collect data we don't use or collect more than once. We don't know if something will break if we stop collecting it. 

Level 3 - There is some confusion around why some data is collected, but we understand our systems of record and our primary data feeds.

Level 4 - We have clearly mapped the flow of information across the organisation in order to understand the impact of change on data models.

Level 5 - We regularly review our data collection activities in line with our operational and strategic needs. Data collection is driven directly from these models. 

Business architecture - governance

Level 1 - There is no recognisable understanding from any business unit of how data is managed by IT.

Level 2 - It is understood that IT is responsible for the storage and backup of data and some of the tools to manage it.

Level 3 - IT is included in wider business processes around outputs, but not fully integrated with new requirements and change. 

Level 4 - IT has a peer relationship with the wider business. IT's contribution to the electronic management of data is understood, although not always respected. 

Level 5 - Operational and change activity is seamlessly integrated between IT and the wider business with all roles and responsibilities defined and understood. 

Data flows - validation

Level 1 - We have no consistency of data input - our systems do not support this and none of our people have access to any standards in relation to this.

Level 2 - We are starting to define how individual fields should be collected and what validation might be needed to support this.

Level 3 - We know how data should be collected to support our strategic goals. New system developments contain appropriate field level validation.

Level 4 - Our key systems contain field level validation. We need to improve our processes to ensure these are used effectively.

Level 5 - We have processes and standards to support data collection and entry. All our corporate systems are configured to support these standards.

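
As an illustration of the field-level validation these statements describe, the Python sketch below checks each field of a record against a simple rule at the point of entry. The field names, the simplified postcode pattern and the allowed values are hypothetical examples, not part of the framework.

```python
import re

# Hypothetical field rules: a regex or an allowed-value set per field.
FIELD_RULES = {
    "postcode": re.compile(r"^[A-Z]{1,2}\d[A-Z\d]? ?\d[A-Z]{2}$"),  # simplified UK-style pattern
    "study_mode": {"full-time", "part-time"},
}

def validate_record(record):
    """Return a list of (field, value) pairs that fail their rule."""
    failures = []
    for field, rule in FIELD_RULES.items():
        value = record.get(field, "")
        if isinstance(rule, set):
            ok = value in rule
        else:
            ok = bool(rule.match(value))
        if not ok:
            failures.append((field, value))
    return failures

print(validate_record({"postcode": "AB1 2CD", "study_mode": "full-time"}))  # []
print(validate_record({"postcode": "not a postcode", "study_mode": "evening"}))  # both fields fail
```

Embedding rules like these in corporate systems, rather than in ad hoc scripts, is what moves a provider from level 2 towards level 5.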
Data flows - automation

Level 1 - Data resides in our systems and onward use is through manual extraction on an ad hoc basis.

Level 2 - We have very limited scheduled movement of data between some of our systems, focused on enabling basic operations such as access permissions.

Level 3 - We have mapped out how automation of data flows could support more effective analytics, and are starting to plan out how to deliver this.

Level 4 - We have built automated data flows between some of our key systems but these do not yet support wider activities.

Level 5 - All appropriate data processes, from system admin to corporate priorities like reporting, are supported by automated, scheduled and maintained data flows.

Data models - data management

Level 1 - We have no idea how many datasets we have, where they are or what they are used for. Therefore, no master copy exists. 

Level 2 - Organisational and department data doesn’t reconcile, and changes are not supported by robust business process or data governance. 

Level 3 - We have an agreed single source for core datasets, even if this means we just know where the copies are.

Level 4 - An agreed single source is in place for core datasets and we signpost our teams to these sources to ensure their use is embedded.

Level 5 - Our data is created, integrated, consumed and purged with traceability to the master data model, and supported by rigorous business process.

Orchestration - maintenance

Level 1 - No one seems to understand how or when our data processing happens or what the impact will be on our reporting if something goes wrong. Fixes happen when a user spots a problem.

Level 2 - We know which processes are scheduled to run daily, weekly, monthly and annually. If something looks wrong, we fix it, though this often happens after reporting is refreshed.

Level 3 - Our ETL is monitored through ad hoc checks. Where there are issues, our scripts are updated so that our data remains accurate. We do not yet have a consistent approach across people or datasets.

Level 4 - We are creating a template to improve consistency in our ETL activities and we are developing a framework to ensure the impact of changes in the business or our systems is understood.

Level 5 - We have a well-documented and maintained schedule for ETL. We know who is responsible for our processes, their accuracy and relevance and issues are flagged and rectified before reporting is impacted.

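
One way to move from ad hoc checks towards a documented, monitored schedule is to record the schedule itself as data: each job names an owner, an agreed cadence and its last successful run, so overdue jobs can be flagged before reporting is refreshed. A minimal Python sketch, with entirely illustrative job names, owners and timestamps:

```python
from datetime import datetime, timedelta

# Hypothetical documented ETL schedule.
SCHEDULE = [
    {"job": "student_record_load", "owner": "Registry", "every_hours": 24,
     "last_success": datetime(2024, 5, 1, 2, 0)},
    {"job": "attendance_feed", "owner": "Timetabling", "every_hours": 24,
     "last_success": datetime(2024, 4, 28, 2, 0)},
]

def overdue_jobs(schedule, now):
    """Names of jobs whose last success is older than their agreed cadence."""
    return [
        j["job"] for j in schedule
        if now - j["last_success"] > timedelta(hours=j["every_hours"])
    ]

print(overdue_jobs(SCHEDULE, datetime(2024, 5, 1, 12, 0)))  # ['attendance_feed']
```

Because each entry also records an owner, a flagged job can be routed to the team responsible rather than waiting for a user to spot a problem.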
Learning analytics

Learning analytics provides insight that enables interventions to support individual students. For it to be successful, daily data inputs to learning analytics software need to be of sufficient quality and accurately formatted. This can be challenging for institutions if attendance monitoring systems are not embedded and if data transformation activity is still relatively manual. It is also important to ensure student support teams have the capabilities, and capacity, to analyse and act on the analysis produced.

These are the key areas we suggest you focus on to start your learning analytics journey successfully.

Find more support on learning analytics.

Executive engagement - investment

Level 1 - No data improvement proposals are sponsored or generally visible at a senior level.

Level 2 - We have some ideas that tend to be specific to big problems we're trying to fix. These initiatives can get priority if they are deemed important.

Level 3 - Senior management are aware of the value of data improvement, but support is limited to specific projects or issues, rather than an organisational priority.

Level 4 - Data improvement initiatives are supported by or run at a senior management level, although they sometimes lose support in the long term.

Level 5 - All data improvement initiatives are sponsored by senior management with strong support for providing the resources needed to undertake them.

Information lifecycle - constraints and ethics

Level 1 - We have not really considered ethics - we just use data as and when we need to.

Level 2 - Our data specialists understand policies and guidance, but we would not say there is much broader awareness of these.

Level 3 - Our teams understand that our data assets should be used responsibly but we rely on them to do this rather than support them.

Level 4 - We have agreed a plan to develop the controls and guidance needed to guarantee we never misuse data. It is currently focused on new projects.

Level 5 - We have defined acceptable usage cases and have appropriate controls in place to ensure our data is never used unethically, whether knowingly or unknowingly.

Accountability - resolution

Level 1 - There is no collaboration between cross-disciplinary teams to diagnose, troubleshoot or resolve data issues.

Level 2 - There is some ad hoc collaboration between registry and other departments to fix serious problems.

Level 3 - We have a forum for sharing issues around quality and other data management issues which meets regularly and is respected. 

Level 4 - Multi-disciplinary teams work together to resolve data issues - either tactically or as part of a data improvement programme.

Level 5 - Data issues are worked on collaboratively between all functions, and prioritised according to wider business initiatives and needs. 

Change impact - new systems

Level 1 - When new systems are proposed, understanding the impact on data is not seen as a priority and our data products have broken unexpectedly.

Level 2 - When new systems are proposed, we recognise there may be an impact on data, but this is often considered too late to be managed efficiently, leading to manual work.

Level 3 - When new systems are proposed, we know there will be an impact on data, and we get enough notice to find and manage downstream impacts.

Level 4 - We have full visibility of integrations so we understand the impact on our data from changes to systems. We can advise system owners of these impacts in advance.

Level 5 - We ensure data specialists join project initiatives at an early stage to advise on the impact of system and process changes on our data, which ensures there is no downstream impact.

Enterprise architecture - organisational agility

Level 1 - We usually feel on the back foot when the university or college needs something new. We have started system projects that have failed.

Level 2 - We deliver new system and process requirements successfully - but it generally takes a long time and is resource intensive, risking other activities.

Level 3 - Our structures and knowledge ensure we can react to new priorities in a timely fashion, though we would like to be more proactive.

Level 4 - Our strategy prioritises the ability to respond at pace to new demands. We have identified and resourced key areas that we need to tackle to achieve this.

Level 5 - Our target operating model includes provision to manage new requirements and we aim to pre-empt systems and process changes.

Technology and applications - business systems

Level 1 - We do not have clear sight of the systems environment and therefore the extent of data sitting on shadow or non-shadow IT.

Level 2 - We have corporate systems for our main areas of activity but we know there are localised solutions to support some other processes.

Level 3 - We know what systems our organisation needs to support its processes. A roadmap is in place to deliver those we do not yet have in place.

Level 4 - We have standard systems in place for all our processes. Any remaining localised solutions exist because we have strategically decided that is the best solution.

Level 5 - Our systems architecture forms part of our enterprise architecture. Our focus is on managing change so we are prepared for new requirements from the organisation.

Data assets - master data

Level 1 - There is no consistency across our systems in how our organisational units, business terms, or their attributes are described.

Level 2 - While there is little consistency across our systems, we have created ad hoc solutions to manage mappings for reporting.

Level 3 - There is some consistency across systems with key identifiers and we have a plan in place to expand this.

Level 4 - We are in the process of centralising the recording of master data from our remaining systems. We ensure information is held at the correct granularity.

Level 5 - We have a complete centralised view of the master data in all our business systems. It is well documented in an appropriate format and we consider the impact of changes before we make them.

Requirements traceability - business questions

Level 1 - We do not consider "business questions" in our reporting provision and tend to make assumptions about the reporting that is needed.

Level 2 - We ask for requirements before we create reports, but we do not consider this information holistically or store it effectively.

Level 3 - We hold our requirements information in a central repository and define key business questions to direct our reporting.

Level 4 - Business questions form the foundation of our data processing, and we have started to create reporting to share this information with our users.

Level 5 - We present a business question backlog as part of our service. Business questions are the starting point for our data orchestration activities.

Report providers and users - approach

Level 1 - It always feels like we are firefighting because we are struggling to keep up with the demands of operational and change activity.

Level 2 - We have a do-and-forget mentality - there is no defined business process for many repeatable activities.

Level 3 - Our most repeated activities are reasonably well resourced. We struggle to deal with change or new initiatives.

Level 4 - Our daily activities are well understood, staffed and processed efficiently. We can handle unexpected events and periods of additional work. 

Level 5 - Daily activities are largely automated and supported by simple business processes. It is very rare we need to intervene.

Data literacy - analysis

Level 1 - Decision making is largely intuitive and not supported by data.

Level 2 - Data is used to support decisions, but the quality is poor or unknown.

Level 3 - Data is trusted to support several key decisions for daily operations and planning purposes.

Level 4 - Data is trusted, accurate, timely and available for supporting operational and strategic decisions.

Level 5 - Data is presented in customisable, analytical output and supports us in sophisticated analysis.

Generative AI

In a fast-changing and dynamic landscape, the use of generative AI in reporting brings significant opportunity. It may deliver efficiencies, improve accessibility and change cultural attitudes to how data is perceived, used and acted upon. Tools that provide narrative descriptions of charts and data trends have the potential to transform automated reporting.

However, for AI to be reliable and add value, tools require a mature data architecture. Providers are likely to need investment and development in organisational capability and data maturity, along with considering the security and storage implications relevant to broader AI use.

These are the key areas we suggest you focus on to start your AI journey successfully.

For more information on AI, read our generative AI primer and use our AI maturity toolkit.

Executive accountability - oversight

Level 1 - Our senior teams have no oversight or responsibility for data and reporting activities.

Level 2 - As a senior team, we have started to take an interest in the data we hold and what we use it for, but our teams understand the detail.

Level 3 - Our senior team understand current legislation and take active steps to ensure we adhere to this as an organisation.

Level 4 - As a senior team, we consider both current legislation and best practice in data and analytics, and understand we have a role in their successful delivery.

Level 5 - We promote data as an asset that is everyone's responsibility, and as a senior team we accept accountability for the role data plays in successful delivery of key activities.

Executive accountability - security

Level 1 - We don't know whether our data is secure, and we don't know how to find out.

Level 2 - There are known risks in our data but we have not yet considered our organisational risk appetite to decide what we need to address.

Level 3 - We have recognised classifications in place for our core datasets, and are confident enough in this to be formally internally audited.

Level 4 - We have recognised classifications in place for all our datasets, and we have appropriate access policies in place across our organisation.

Level 5 - Our teams are collaborating to consider how new technologies will impact on our data and to ensure new tools can be rolled out and used without risk to our data assets.

Accountability - resolution

Level 1 - There is no collaboration between cross-disciplinary teams to diagnose, troubleshoot or resolve data issues.

Level 2 - There is some ad hoc collaboration between registry and other departments to fix serious problems.

Level 3 - We have a forum for sharing issues around quality and other data management issues which meets regularly and is respected.

Level 4 - Multi-disciplinary teams work together to resolve data issues - either tactically or as part of a data improvement programme.

Level 5 - Data issues are worked on collaboratively between all functions, and prioritised according to wider business initiatives and needs.

Technology and applications - servers and storage

Level 1 - We really struggle with our data - it's all over the place, on individual computers and in some cases in hardcopy formats.

Level 2 - We are starting to put some standards in place and most of our data is now in an agreed location. It is still hard to access for some users.

Level 3 - Our data is mostly stored in a secure, backed up environment. This is usually a local server with managed access.

Level 4 - All the data we need for our operations is secure and backed up. We are using cloud options for an increasing amount of our data storage where this is the right solution.

Level 5 - We are streamlining the data we store as part of our strategy to ensure we only hold the data we need, in a supported, securely accessed environment.

Data models - metadata

Level 1 - We don’t have any formal metadata available for any of our core or non-core datasets.

Level 2 - Where available, metadata is used to help understand impacts to datasets. No formal taxonomy or dictionary is in place.

Level 3 - Metadata is available (if not complete) for our core datasets, but generally developed and maintained within the student team.

Level 4 - Metadata is developed by named information asset owners and is available as part of a lineage and audit process.

Level 5 - Metadata is complete, rich, managed and maintained. The resulting lack of ambiguity enables high re-use and ensures we can develop new services in a timely fashion.

Data assets - data catalogue

Level 1 - We don't have any understanding or visibility of the data available to us or what our systems hold.

Level 2 - There is knowledge of the data in our key systems but it is dependent on a few individuals and there is little or no documentation.

Level 3 - There is knowledge of the data within our systems but it is siloed and we have no formal mechanism to collate it in one place.

Level 4 - We are starting to create a data catalogue containing each data asset and describing how our data can be leveraged for business use.

Level 5 - Our business has good knowledge of the data held in our systems, which is well documented, and we understand its potential in helping us address our strategic priorities.

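
A data catalogue does not need specialist tooling to begin with; it can start as a single structured list, one entry per data asset, recording where the asset lives, who owns it and what it is for. A minimal Python sketch (the asset, system and owner names are illustrative):

```python
# One entry per documented data asset - all values are hypothetical.
CATALOGUE = [
    {"asset": "student_record", "system": "student_records_system", "owner": "Registry",
     "purpose": "Statutory returns and internal reporting"},
    {"asset": "module_attendance", "system": "attendance_app", "owner": "Timetabling",
     "purpose": "Learning analytics and student support"},
]

def assets_owned_by(catalogue, owner):
    """Look up which documented assets a given team is accountable for."""
    return [entry["asset"] for entry in catalogue if entry["owner"] == owner]

print(assets_owned_by(CATALOGUE, "Registry"))  # ['student_record']
```

Even a list this simple moves knowledge of the data out of individual heads and into a shared, documented place, which is the step from level 2 to level 4.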
Data assets - master data

Level 1 - There is no consistency across our systems in how our organisational units, business terms, or their attributes are described.

Level 2 - While there is little consistency across our systems, we have created ad hoc solutions to manage mappings for reporting.

Level 3 - There is some consistency across systems with key identifiers and we have a plan in place to expand this.

Level 4 - We are in the process of centralising the recording of master data from our remaining systems. We ensure information is held at the correct granularity.

Level 5 - We have a complete centralised view of the master data in all our business systems. It is well documented in an appropriate format and we consider the impact of changes before we make them.

Data assets - reference data

Level 1 - We do not have any visibility of the reference data used in systems and onward reporting - what we see is inconsistent.

Level 2 - We are starting to put together some standard reference data tables for our key business areas, but they are not used widely.

Level 3 - We manage key reference data, including our organisational hierarchy, in a central function, though its use in reporting is limited, and we do not yet have appropriate governance processes in place.

Level 4 - We are developing a governance framework for how to manage our reference data at an organisation level.

Level 5 - We have agreed reference data sets across our organisation, with consensus on content, onward governance, and who is accountable for its purpose and change requests.

Data literacy - capability

Level 1 - We have very little visibility of individual skills and confidence in data work and interpretation and we don't really think about it as a priority.

Level 2 - We assume that managers can use data and although some can, there are definitely areas where this is not the case. We aren't that clear how to address this.

Level 3 - We are actively looking at job specifications to ensure our expectations are clear. We are also starting to ensure decision making responsibilities are clearer to help our teams understand where data may add value.

Level 4 - We are creating a skills development framework to align our people, the data we have, and decision-making responsibilities.

Level 5 - We view data capability as a core skill for most of our people. Everyone understands the data expectations in their job and has a development plan in place to improve where this is needed.
