Can modelling techniques be improved to correctly predict overheating in homes?
Head of Research and Development, Ben Abel
Since the formation of its specialist building physics team some 20 years ago, Hilson Moran has striven to improve the accuracy and robustness of its dynamic thermal modelling (DTM) techniques, so when the chance arose to compare our modelling against some 'real life' measured results, I jumped at it. At the 2018 CIBSE Technical Symposium, Ben Roberts, a PhD student from Loughborough University, presented work in which he had replicated the CIBSE TM59 overheating tests in a pair of semi-detached houses. This sparked an idea: model these houses in the dominant DTM software packages (TAS and IES) to see how closely the computer predictions came to the measured 'real' results.
The detailed results of this research have been published in a recent issue of CIBSE's BSERT journal and demonstrate that, although all the models correctly predict overheating trends in broad terms, we are potentially over-predicting the risk. The positive spin is that designs produced using the TM59 methodology are likely to be on the conservative side (i.e. higher-performing glass, more shading, bigger window/vent opening areas, etc.), which also builds in some resilience to climate change.
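For readers less familiar with TM59, its pass/fail logic for naturally ventilated homes can be sketched roughly as follows. This is a simplified illustration only, not the full methodology: it assumes the TM52 Category II adaptive comfort threshold for criterion (a) and the fixed 26 °C bedroom night-time limit for criterion (b), and all function and variable names are my own, not from any standard or library.

```python
# Rough sketch of the two TM59 criteria for naturally ventilated dwellings.
# Assumptions (mine, simplified): criterion (a) uses the TM52 Category II
# adaptive upper limit; criterion (b) caps bedroom night hours above 26 C
# at 32 h/year. Real assessments use hourly results from a full DTM run.

def tm52_threshold(t_rm):
    """Adaptive comfort upper limit (Cat II): Tmax = 0.33*Trm + 21.8 (degC),
    where Trm is the exponentially weighted running mean outdoor temperature."""
    return 0.33 * t_rm + 21.8

def criterion_a(op_temps, running_means, occupied):
    """(a) Living rooms/kitchens/bedrooms, May-September: hours where the
    operative temperature exceeds the adaptive threshold by >= 1 K while
    occupied must not exceed 3% of occupied hours."""
    occ_hours = sum(occupied)
    exceed = sum(
        1 for t, trm, occ in zip(op_temps, running_means, occupied)
        if occ and (t - tm52_threshold(trm)) >= 1.0
    )
    return occ_hours == 0 or exceed / occ_hours <= 0.03

def criterion_b(bedroom_night_temps):
    """(b) Bedrooms, 22:00-07:00: operative temperature may exceed 26 C
    for no more than 1% of annual night hours (about 32 h)."""
    return sum(1 for t in bedroom_night_temps if t > 26.0) <= 32
```

Because criterion (a) keys off a running mean of outdoor temperature, the pass/fail outcome is sensitive to the weather file and to how accurately the model predicts internal operative temperatures, which is exactly where the comparison against measured data becomes interesting.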
However, this leaves the question: what could we be doing better? Are the models themselves correct? This got me thinking: the underlying equations and methodology formulations were all developed in the 1980s and early 1990s, and they have generally not changed since. Additionally, they were chiefly constructed around generating heating and cooling demands.