Why gaps in the data from EPC audits leave as many questions as answers

Summary
Energy Performance Certificates (EPCs) play an important role in helping consumers understand the energy efficiency of properties and in providing advice on how to make improvements. They are also a legal requirement for demonstrating that rental properties meet minimum energy efficiency standards, and can be required to prove eligibility for some government grants and 'green' financial products.
Our research has found that consumers often question the accuracy of EPCs, and large-scale studies have highlighted similar issues. To get a better picture of the accuracy and reliability of EPCs, we took a closer look at the auditing process and made an Environmental Information request to access auditing data from the last four years [1].
We found that:
- 2% of EPCs are audited each year by the accreditation schemes, as required under the scheme rules agreed with the government. Approximately 70% of the audits are selected randomly and 30% are selected using risk-based 'smart' criteria.
- There are some gaps in the data that mean the government can’t say how many EPCs have significant errors, or how successful the smart auditing process is at identifying EPCs that are more likely to have mistakes.
- The government is not collecting information about the most common problems with EPCs. This information could be used to monitor accreditation schemes' efforts to drive improvements in the accuracy and reliability of EPCs.
In order to improve trust in EPCs, we recommend that the government require accreditation schemes to publish more detailed data on the auditing results and consider additional steps to support the auditing process and improve accuracy.
The role of Energy Performance Certificates
Energy Performance Certificates (EPCs) have an important role in supporting homeowners to reduce their energy bills and make the changes necessary to realise government ambitions for cutting carbon emissions and improving energy security. Some of the ways in which they are used include:
- Providing homebuyers, renters and landlords with information about the energy efficiency of properties to inform their decisions.
- Providing advice on how a property's energy efficiency can be improved.
- Demonstrating eligibility for some government grants and loans as well as ‘green’ mortgages, loans and incentives from banks and building societies.
- Measuring private landlords' progress towards meeting new minimum energy efficiency standards by 2030 if, as expected, these standards are introduced by the new Labour government.
However, there have been recurring concerns about the accuracy of EPCs based on studies that have looked at large numbers of EPCs [2] as well as smaller-scale samples. In a recent mystery shopping exercise conducted by Which?, 8 out of 11 participants had concerns about the accuracy of their EPC [3]. Concerns about inaccurate EPCs can have different causes: in our mystery shopping exercise we found evidence of poor assessments, of the EPC software generating poor results or advice, and of misunderstandings about the terminology used in an EPC.
To shed more light on this issue we used an Environmental Information request to access the EPC auditing data held by the Department for Levelling Up, Housing and Communities (DLUHC) [4], and then contacted three of the largest EPC accreditation schemes to ask whether they could provide any additional information.
How EPC audits work
Under the EPC Scheme Operating Rules, accreditation schemes have to audit at least 2% of all EPC assessments each year (not including audits of new assessors, audits resulting from complaints, or 'follow-on' audits). As part of this 2%, every assessor should have at least 0.5% of their assessments audited each year.
Assessments are selected using two methods: some at random and others on the basis of risk-based criteria. These risk-based or 'smart' audits target data, or combinations of data, that are more likely to result in a failed assessment. The smart audit rules are publicly available and are periodically updated. The 0.5% of audits done on every assessor are chosen randomly.
Over the last four years approximately 69% of audits have been chosen using the random method and 31% using the smart method. The results of the random audits can be extrapolated to estimate the total number of EPCs that would fail; the results of the smart audits cannot be used in this way because they specifically target EPCs that are more likely to fail.
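The schemes' actual selection systems are not public, so the following is only a rough sketch of how the published quotas and the approximate random/smart split could be applied to a year's lodgements. The record fields and the matches_smart_rules helper are assumptions for illustration, not the schemes' real data model.

```python
import random

OVERALL_QUOTA = 0.02        # at least 2% of all assessments audited each year
PER_ASSESSOR_QUOTA = 0.005  # at least 0.5% of each assessor's assessments, chosen at random
SMART_SHARE = 0.30          # roughly 30% of audits were risk-based in the data we obtained

def select_for_audit(lodgements, matches_smart_rules):
    """lodgements: list of dicts, each with an 'id' and an 'assessor_id'.
    matches_smart_rules: callable flagging records that hit a risk-based rule."""
    selected = set()

    # 1. Random minimum per assessor: 0.5% of their lodgements (at least one).
    by_assessor = {}
    for rec in lodgements:
        by_assessor.setdefault(rec["assessor_id"], []).append(rec)
    for records in by_assessor.values():
        n = min(len(records), max(1, round(PER_ASSESSOR_QUOTA * len(records))))
        selected.update(r["id"] for r in random.sample(records, n))

    # 2. Add records flagged by the risk-based 'smart' rules, up to ~30% of the target.
    target = round(OVERALL_QUOTA * len(lodgements))
    smart_ids = [r["id"] for r in lodgements
                 if r["id"] not in selected and matches_smart_rules(r)]
    selected.update(smart_ids[:round(SMART_SHARE * target)])

    # 3. Fill any remaining shortfall with further random picks.
    remaining = [r["id"] for r in lodgements if r["id"] not in selected]
    shortfall = max(0, target - len(selected))
    selected.update(random.sample(remaining, min(shortfall, len(remaining))))

    return selected
```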
The audits themselves involve a desktop review of the evidence provided by the assessor, including photographs and documentation. Measures are taken to ensure that photos from previous audits are not reused and that stock photos are not used. The audit does not involve a second assessment of the property, as this would be difficult to arrange with the property owner or tenant, who would have to provide access.
Consumers can make a complaint if they are unhappy with their assessment, but this is a relatively high bar. As many EPCs are purchased when homeowners are selling their home, they are unlikely to complain about errors if they don't think the errors will make a significant difference to the price they get. It is also possible that some consumers are not aware of errors, depending on their level of expertise and the nature of the error.
Audit results
Our Environmental Information request found that accreditation schemes are auditing around 2% of EPC assessments each year, as required by the Scheme Operating Rules.
The proportion of audited EPCs found to be defective over the last four years was 24%, but this includes audits that failed for reasons that do not have a significant impact on the EPC itself. For example, an assessment can fail because a photograph or a date stamp is missing. In these cases the issue can usually be rectified quickly, as long as the assessor can provide the missing or other supporting evidence.
The proportion of audited EPCs with significant errors has averaged 16% over the last four years. We have defined significant errors as those that required the EPC to be relodged in the system. Relodgement is required when the audit results in a recommendation change or a technical error, or when it produces a difference of 5 or more points in the score that determines the EPC rating.
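As a rough illustration, that test could be expressed as a simple check like the one below. The field names are assumptions for illustration; the thresholds follow the criteria set out above.

```python
def requires_relodgement(original, audited):
    """A minimal sketch of the relodgement test described above. The field names
    are assumptions; the thresholds follow the criteria set out in the text."""
    return (
        audited["technical_error"]                                     # a technical error was found
        or audited["recommendations"] != original["recommendations"]   # the recommendations changed
        or abs(audited["score"] - original["score"]) >= 5              # rating score moved by 5+ points
    )

# For example, a 4-point score shift alone would not force relodgement,
# but a changed recommendation or a technical error would.
```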
The proportion of EPCs with significant errors has not changed markedly over the last four years, though the number of administrative errors has increased slightly from 2020/2021 to 2022/2023.
The data we received from our Environmental Information request (displayed in the graph above) showed the total number of audits that failed. However, it does not reveal what percentage of the random audits and what percentage of the smart audits were defective. Unfortunately neither the government nor the schemes were able to provide this breakdown.
What the results show
Because neither the government nor the schemes are able to provide separate results for the number of random and smart audits that have errors, it is impossible to draw firm conclusions from the information they provide. For example:
- It is not possible to give an accurate figure for the number of EPCs that have significant errors each year, because neither the government nor the schemes provide information on the number of random audits that resulted in the EPC being relodged each year.
- It is also not possible to say for certain whether the number of significant errors is increasing or decreasing, because the smart auditing rules were changed in 2020 and again in 2022.
- The lack of data on the number of smart audits with significant errors each year also means that the government is unable to say how effective smart auditing is at identifying defective EPCs. If smart audits are identifying large numbers of defective EPCs, there would be an argument for increasing their number. For example, if it is assumed that only 10% of the random audits fail, then on a 69/31 random/smart split around 29% of the smart audits would be failing, which would be a strong case for doing more smart audits (the short calculation below sets out the arithmetic).
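The calculation behind that example is a weighted average rearranged: using the 16% four-year average rate of significant errors and the 69/31 split, an assumed failure rate for the random audits implies a failure rate for the smart audits. The 10% figure is an assumption, not something the published data tells us.

```python
RANDOM_SHARE, SMART_SHARE = 0.69, 0.31   # split of audits over the last four years
OVERALL_SIGNIFICANT_ERROR_RATE = 0.16    # four-year average across all audits

def implied_smart_failure_rate(assumed_random_rate):
    # overall = random_share * random_rate + smart_share * smart_rate
    return (OVERALL_SIGNIFICANT_ERROR_RATE
            - RANDOM_SHARE * assumed_random_rate) / SMART_SHARE

print(round(implied_smart_failure_rate(0.10), 2))  # 0.29, i.e. roughly 29%
```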
We were also disappointed that neither the government nor the schemes we asked could tell us more about the reasons why EPCs had to be relodged.
Conclusion
Given how much households, landlords, financial providers and others rely on EPCs, it is important that they can be trusted. However, the government is currently unable to say with certainty what proportion of EPCs contain significant errors, or whether this figure is rising or falling.
The inclusion of smart audits is a potentially positive aspect of the auditing process, as it should help the schemes identify EPCs with errors more efficiently. However, there is currently no transparency about how effectively the smart rules are working or the types of errors that auditors are finding.
It is important to note that the audit data doesn’t tell us anything about the auditing process itself. More work would be needed to understand how effective the current process is.
The existence of the auditing system will have some benefit as a deterrent against poor performance if retraining and action against persistently failing assessors are effective. The process can also be used to correct the individual EPCs that are found to be defective, but given the number of audits conducted (2%), this will not have a significant impact on the overall quality of EPCs unless the findings are used to improve training and enforcement.
Recommendations
- In order to improve trust in EPCs and to assess the accreditation schemes' efforts to improve their accuracy, the government should publish annual data on:
  - The total number of EPC assessments conducted each year, and how many of these were chosen for audit randomly and how many were chosen using the risk-based 'smart' rules.
  - The proportion of random audits that result in the EPC being relodged on the EPC register.
  - The proportion of smart audits that result in the EPC being relodged on the EPC register.
- Accreditation schemes should publish an analysis of the reasons why EPCs had to be relodged following an audit, and how these issues will be addressed in their training and enforcement processes.
- Although this research hasn't looked at the auditing process itself, the government and schemes should consider whether improvements could be made through the use of additional data, including customer feedback. This could help with identifying EPCs for smart auditing or with conducting the audits themselves, for example:
  - Improving the audits by introducing third-party data. If the property has a previous EPC, this could be used to identify whether the new EPC has unlikely results - for example, if an end-of-terrace house has become a detached house or if a previously insulated wall has become uninsulated (a simple version of this kind of consistency check is sketched after these recommendations). This method has been used by some researchers and it has identified potential errors. However, not all UK homes have a previous EPC, and the approach cannot account for changes made since the previous EPC was issued. EPCs could also be checked against other sources of data such as Land Registry data or the TrustMark database.
  - Using customer feedback. Enabling customers to leave feedback on assessments could help to identify assessors that are regularly receiving low scores as a result of inaccurate EPCs. It is important that the ability to leave feedback is prominently displayed and straightforward, as the current feedback and complaint mechanisms set a high bar for consumers. Feedback should only be provided by the consumer who purchased the EPC and given within a relatively short period of time - perhaps one month. In some cases a customer's concerns may be the result of a misunderstanding, but this emphasises the importance of clarity in the information the EPC provides and guidance on how to understand an EPC.
  - Using real performance data. The accuracy of EPCs could also be improved by using real performance data, rather than data modelled on the basis of assessors' measurements and observations. Using sensors, it is now possible to take much more accurate readings of the energy efficiency of a home. This would remove much of the uncertainty and give more accurate ratings. However, an assessment would still be required to understand the insulation, heating and lighting in a property, so that homeowners receive relevant advice about the improvements they could make.
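The consistency check described in the first bullet above could work along the lines of the minimal sketch below. The field names, values and fixed rule table are assumptions for illustration; a real check would draw on the full set of fields lodged on the EPC register.

```python
IMPLAUSIBLE_CHANGES = {
    # (field, previous value, new value) combinations that are physically unlikely
    ("property_type", "end-terrace", "detached"),
    ("wall_insulation", "insulated", "uninsulated"),
}

def flag_unlikely_changes(previous_epc, new_epc):
    """Return the fields whose change from the previous EPC looks implausible."""
    return [field for field, old_value, new_value in IMPLAUSIBLE_CHANGES
            if previous_epc.get(field) == old_value and new_epc.get(field) == new_value]

# A flagged field would not prove the new EPC is wrong - works may genuinely have
# been carried out - but it could prioritise the assessment for a 'smart' audit.
```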